Baczkowski, Blazej M; Johnstone, Tom; Walter, Henrik; Erk, Susanne; Veer, Ilya M
2017-06-01
We evaluated whether sliding-window analysis can reveal functionally relevant brain network dynamics during a well-established fear conditioning paradigm. To this end, we tested whether fMRI fluctuations in amygdala functional connectivity (FC) can be related to task-induced changes in physiological arousal and vigilance, as reflected in the skin conductance level (SCL). Thirty-two healthy individuals participated in the study. For the sliding-window analysis we used windows that were shifted by one volume at a time. Amygdala FC was calculated for each of these windows. Simultaneously acquired SCL time series were averaged over time frames corresponding to the sliding-window FC analysis and were subsequently regressed against the whole-brain seed-based amygdala sliding-window FC using the GLM. Surrogate time series were generated to test whether connectivity dynamics could have occurred by chance. In addition, results were contrasted against static amygdala FC and sliding-window FC of the primary visual cortex, which was chosen as a control seed, while a physio-physiological interaction (PPI) analysis was performed as cross-validation. During periods of increased SCL, the left amygdala became more strongly coupled with the bilateral insula and anterior cingulate cortex, core areas of the salience network. The sliding-window analysis yielded a connectivity pattern that was unlikely to have occurred by chance, was spatially distinct from static amygdala FC and from sliding-window FC of the primary visual cortex, but was highly comparable to that of the PPI analysis. We conclude that sliding-window analysis can reveal functionally relevant fluctuations in connectivity in the context of an externally cued task.
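The windowing-plus-GLM pipeline described above can be sketched on synthetic data; every signal, the window length, and the coupling model below are illustrative assumptions, not the study's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_vols, win_len = 300, 30          # fMRI volumes, window length in TRs (assumed)

# Synthetic seed (amygdala) and target time series plus a slow SCL trace.
scl = np.convolve(rng.standard_normal(n_vols), np.ones(20) / 20, mode="same")
shared = rng.standard_normal(n_vols)
seed = shared + 0.5 * rng.standard_normal(n_vols)
# By construction, coupling with the seed strengthens when SCL is high.
target = (0.2 + (scl - scl.min()) / np.ptp(scl)) * shared \
         + rng.standard_normal(n_vols)

# Sliding-window FC: one correlation per window, shifted by one volume.
n_win = n_vols - win_len + 1
fc = np.array([np.corrcoef(seed[i:i + win_len], target[i:i + win_len])[0, 1]
               for i in range(n_win)])
# Average SCL over the matching time frames.
scl_win = np.array([scl[i:i + win_len].mean() for i in range(n_win)])

# GLM: regress windowed SCL against the sliding-window FC (Fisher z).
z = np.arctanh(fc)
X = np.column_stack([np.ones(n_win), scl_win])
beta = np.linalg.lstsq(X, z, rcond=None)[0]
print(f"SCL beta on FC: {beta[1]:.3f}")   # positive: FC rises with SCL
```

Because the simulated coupling is tied to the SCL trace, the SCL regressor picks up a positive beta, mirroring the kind of effect the study tests for.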
Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu
2015-01-01
Background An important attribute of the traditional impact factor is its controversial 2-year citation window. Several scholars have proposed using different citation time windows for evaluating journals, but it has not been confirmed whether a longer citation time window is better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? To answer these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. Methods The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. The correlation between impact factors with different citation time windows and peer-reviewed scores was then analyzed. Results Although impact factor values with different citation time windows differed, they were highly correlated when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals.
Research Limitations Our study is based on ophthalmology journals and analyzes only the impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value We designed the survey questionnaire ourselves, specifically to assess the real influence of journals, and used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157
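The n-year impact factor variants compared above all follow the same formula, only with a different window. A minimal sketch with hypothetical citation counts (the journal numbers below are invented for illustration):

```python
# n-year impact factor: citations received in `year` to items published in
# the n preceding years, divided by the citable items of those years.
def impact_factor(cites_in_year, items_by_year, year, n):
    prev = range(year - n, year)
    cites = sum(cites_in_year.get(y, 0) for y in prev)
    items = sum(items_by_year.get(y, 0) for y in prev)
    return cites / items if items else 0.0

# Hypothetical journal: citations received in 2013, keyed by publication year.
cites_2013 = {2009: 30, 2010: 45, 2011: 60, 2012: 80}
items = {2009: 50, 2010: 50, 2011: 60, 2012: 60}

for n in (2, 3, 4, 5):
    print(f"{n}IF(2013) = {impact_factor(cites_2013, items, 2013, n):.3f}")
```

With these toy numbers, 2IF(2013) = 140/120 ≈ 1.167 while 4IF(2013) = 215/220 ≈ 0.977, showing how the window choice shifts the value even for one journal.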
On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2011-01-01
This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the s-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin, since it depends on the length of a time window and on parameters that vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
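For the LTI limiting case mentioned above, a time delay margin can be estimated classically from the gain-crossover frequency and phase margin of the loop transfer function (tau = PM/omega_c). This is a generic sketch, not one of the paper's three methods, and the loop below is an invented example:

```python
import numpy as np

# Classical delay-margin estimate for an LTI loop L(s): find the gain
# crossover, then tau_margin = phase_margin / crossover frequency.
# Illustrative loop transfer function: L(s) = 4 / (s (s + 2)).
w = np.linspace(0.01, 20.0, 200000)   # frequency grid (rad/s)
s = 1j * w
L = 4.0 / (s * (s + 2.0))

mag = np.abs(L)
i = np.argmin(np.abs(mag - 1.0))      # gain crossover: |L(jw)| = 1
wc = w[i]
pm = np.pi + np.angle(L[i])           # phase margin (rad)
tau = pm / wc                         # delay margin estimate (s)
print(f"crossover {wc:.3f} rad/s, phase margin {np.degrees(pm):.1f} deg, "
      f"delay margin {tau:.3f} s")
```

For this loop the crossover is near 1.57 rad/s with about 52 degrees of phase margin, giving a delay margin of roughly 0.58 s; any added input delay below that keeps the nominal loop stable.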
Self spectrum window method in Wigner-Ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross terms and auto-WVD terms in the integral kernel function are orthogonal. With the SSW algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results, and a satisfactory time-frequency distribution was obtained.
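A minimal discrete WVD sketch (numpy, with an assumed two-tone test signal) makes the cross term that methods such as SSW target visible:

```python
import numpy as np

def wvd(x):
    """Discrete Wigner-Ville distribution of a complex signal x (sketch)."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m = min(n, N - 1 - n)              # largest symmetric lag at sample n
        lags = np.arange(-m, m + 1)
        kern = x[n + lags] * np.conj(x[n - lags])
        # Place lags 0..m first, then -m..-1, so the DFT sees signed lags.
        padded = np.roll(np.pad(kern, (0, N - len(kern))), -m)
        W[n] = np.fft.fft(padded).real
    return W

# Two-tone signal: auto terms land at bins 2*f*N (the lag DFT doubles the
# frequency), and an oscillating cross term sits midway between them.
N = 128
t = np.arange(N)
x = np.exp(2j * np.pi * 0.125 * t) + np.exp(2j * np.pi * 0.25 * t)
W = wvd(x)
avg = W.mean(axis=0)                       # time-averaging cancels the cross term
top2 = set(np.argsort(avg)[-2:])
print("auto-term bins:", sorted(top2))
```

Averaging over time suppresses the cross term because it oscillates at the difference frequency, which is exactly the kind of orthogonality the SSW approach exploits in the kernel domain.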
Burriel-Valencia, Jordi; Martinez-Roman, Javier; Sapena-Bano, Angel
2018-01-01
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current’s spectrogram with a significant reduction of the required computational resources. PMID:29316650
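A sketch of a Slepian-window STFT using SciPy's DPSS implementation; the toy stator-current signal and the parameter choices (NW, window length) are assumptions for illustration, not the paper's validated setup:

```python
import numpy as np
from scipy.signal import spectrogram, windows

fs, dur = 1000, 2.0
t = np.arange(int(fs * dur)) / fs
# Toy stator-current model: 50 Hz supply plus a weak drifting sideband
# (illustrative only, not a validated broken-bar fault model).
x = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * (35 + 5 * t) * t)

M = 256
slepian = windows.dpss(M, NW=3)        # maximally energy-concentrated sequence
gauss = windows.gaussian(M, std=M / 8) # the usual gated Gaussian alternative

for name, win in [("slepian", slepian), ("gaussian", gauss)]:
    f, ts, S = spectrogram(x, fs, window=win, nperseg=M, noverlap=M - 16)
    print(name, "spectrogram shape:", S.shape)
```

`scipy.signal.windows.dpss` returns the zeroth Slepian sequence, which for a given length concentrates the most energy in the chosen bandwidth, the property the paper relies on to sharpen fault components in the spectrogram.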
Optimal Window and Lattice in Gabor Transform. Application to Audio Analysis.
Lachambre, Helene; Ricaud, Benjamin; Stempfel, Guillaume; Torrésani, Bruno; Wiesmeyr, Christoph; Onchis-Moaca, Darian
2015-01-01
This article deals with the use of optimal lattice and optimal window in Discrete Gabor Transform computation. In the case of a generalized Gaussian window, extending earlier contributions, we introduce an additional local window adaptation technique for non-stationary signals. We illustrate our approach and the earlier one by addressing three time-frequency analysis problems to show the improvements achieved by the use of optimal lattice and window: close frequencies distinction, frequency estimation and SNR estimation. The results are presented, when possible, with real world audio signals.
Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D
2018-06-08
Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. 
Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. Code for calculating IPSA is provided.
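Instantaneous phase synchrony of the kind compared above can be sketched with a Hilbert transform after narrow-band filtering; the two synthetic "regional" signals, the TR, and the synchrony metric below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 1 / 0.72                        # HCP-like sampling rate (TR = 0.72 s)
n = 1200
t = np.arange(n) / fs
rng = np.random.default_rng(1)
# Two synthetic regional signals sharing a slow common component.
common = np.sin(2 * np.pi * 0.05 * t)
x = common + 0.5 * rng.standard_normal(n)
y = common + 0.5 * rng.standard_normal(n)

# IPSA needs narrow-band signals: band-pass 0.03-0.07 Hz first.
b, a = butter(2, [0.03, 0.07], btype="band", fs=fs)
xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)

# Instantaneous phase synchrony: 1 at perfect locking, 0 at anti-phase.
dphi = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
ipsa = 1 - np.abs(np.sin(dphi / 2))
print(f"mean phase synchrony: {ipsa.mean():.3f}")
```

Unlike sliding-window correlation, this yields one synchrony value per time point, which is the single-time-point resolution the abstract highlights.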
NASA Astrophysics Data System (ADS)
Cheng, Z.; Chen, Y.; Liu, Y.; Liu, W.; Zhang, G.
2015-12-01
Among hydrocarbon reservoir detection techniques, the time-frequency analysis based approach is one of the most widely used because of its straightforward indication of low-frequency anomalies in the time-frequency maps: low-frequency bright spots usually indicate potential hydrocarbon reservoirs. The time-frequency analysis based approach is easy to implement and, more importantly, is usually of high fidelity in reservoir prediction compared with the state-of-the-art approaches, and thus is of great interest to petroleum geologists, geophysicists, and reservoir engineers. The S transform has been frequently used in obtaining the time-frequency maps because it controls the compromise between time and frequency resolution better than the alternatives, such as the short-time Fourier transform, Gabor transform, and continuous wavelet transform. The window function used in the majority of previous S transform applications is the symmetric Gaussian window. However, one problem with the symmetric Gaussian window is the degradation of time resolution in the time-frequency map due to the long front taper. In our study, a bi-Gaussian S transform that substitutes an asymmetric bi-Gaussian window for the symmetric Gaussian window is proposed to analyze multi-channel seismic data in order to predict hydrocarbon reservoirs. The bi-Gaussian window introduces asymmetry into the resultant time-frequency spectrum, with time resolution better in the front direction than in the back direction. To our knowledge, this is the first time since its invention in 2003 that the bi-Gaussian S transform has been used for analyzing multi-channel post-stack seismic data to predict hydrocarbon reservoirs. The superiority of the bi-Gaussian S transform over the traditional S transform is tested on a real land seismic data example.
The performance shows that the enhanced temporal resolution can help us depict more clearly the edge of the hydrocarbon reservoir, especially when the thickness of the reservoir is small (such as the thin beds).
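One plausible construction of the asymmetric window (the exact parameterization in the paper may differ) joins two half-Gaussians of different widths at the analysis point:

```python
import numpy as np

def bi_gaussian(n, t0, sf, sb):
    """Asymmetric window: std sf before the centre t0, sb after (sketch)."""
    t = np.arange(n)
    return np.where(t < t0,
                    np.exp(-0.5 * ((t - t0) / sf) ** 2),
                    np.exp(-0.5 * ((t - t0) / sb) ** 2))

# A short front taper (sf < sb) sharpens time resolution ahead of the
# analysis point, the property exploited by the bi-Gaussian S transform.
w = bi_gaussian(256, t0=128, sf=10.0, sb=40.0)
front, back = w[:128].sum(), w[128:].sum()
print(f"front/back window energy: {front:.1f} / {back:.1f}")
```

The deliberately unequal front and back energies are what break the time-symmetry of the resulting spectrum, trading back-direction smoothing for a crisper response to upcoming events such as thin-bed edges.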
Zhang, Zhengyi; Zhang, Gaoyan; Zhang, Yuanyuan; Liu, Hong; Xu, Junhai; Liu, Baolin
2017-12-01
This study aimed to investigate the functional connectivity in the brain during the cross-modal integration of polyphonic characters in Chinese audio-visual sentences. The visual sentences were all semantically reasonable, and the audible pronunciations of the polyphonic characters in the corresponding sentence contexts varied across four conditions. To measure functional connectivity, correlation, coherence and the phase synchronization index (PSI) were used, and multivariate pattern analysis was then performed to detect consensus functional connectivity patterns. These analyses were confined to the time windows of three event-related potential components, the P200, N400 and late positive shift (LPS), to investigate the dynamic changes of the connectivity patterns at different cognitive stages. We found that when differentiating the polyphonic characters with abnormal pronunciations from those with appropriate ones in audio-visual sentences, significant classification results were obtained based on the coherence in the time window of the P200 component, the correlation in the time window of the N400 component, and the coherence and PSI in the time window of the LPS component. Moreover, the spatial distributions in these time windows also differed, with the recruitment of frontal sites in the time window of the P200 component, the frontal-central-parietal regions in the time window of the N400 component and the central-parietal sites in the time window of the LPS component. These findings demonstrate that the functional interaction mechanisms differ at different stages of audio-visual integration of polyphonic characters.
Hamuro, Yoshitomo
2017-03-01
A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
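For a single amide exchanging by the EX2 mechanism, the deuterium uptake curve is commonly modeled as D(t) = 1 - exp(-k t); the fitting sketch below uses invented time points and rates, not the cytochrome c data, and shows why a wide time window matters (both tails of the curve must be sampled):

```python
import numpy as np
from scipy.optimize import curve_fit

# Single-amide EX2 uptake model (fractional deuteration).
def uptake(t, k):
    return 1.0 - np.exp(-k * t)

# Hypothetical exchange time points (s) spanning a wide window.
t = np.array([10, 30, 100, 300, 1000, 3000, 10000, 30000], float)
k_true = 2e-3
rng = np.random.default_rng(2)
d = uptake(t, k_true) + rng.normal(0, 0.01, t.size)   # noisy observations

(k_fit,), _ = curve_fit(uptake, t, d, p0=[1e-2])
print(f"fitted rate: {k_fit:.2e} s^-1 (true {k_true:.0e})")
```

With time points confined to a narrow window (say only 10-300 s), the slow or fast extremes of such curves sit on a flat segment and the rate is poorly constrained, which is the motivation for the wider window in the proposed protocol.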
Platform for Postprocessing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don
2008-01-01
Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple signal- and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW and its associated Advanced Signal Processing and Vision Toolkits. The software is usable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial-grade interface in which two main windows, the Waveform Window and the Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency-domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, a series of images, or a simple X-Y paired data set in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from the scan, or waves after deconvolution if the system wave response is provided. Two types of deconvolution, time-based subtraction or inverse filtering, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations.
The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, B-scan information, region-of-interest analysis, line profiles, and precision feature measurements).
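Inverse-filter deconvolution of the kind mentioned can be sketched in the frequency domain; the water-level regularization and the toy system response below are assumptions for illustration, not the NASA tool's documented algorithm:

```python
import numpy as np

# Frequency-domain inverse-filter deconvolution with simple water-level
# regularization to avoid dividing by near-zero spectral values.
def inverse_filter_deconvolve(measured, system_response, eps=1e-2):
    n = len(measured)
    M = np.fft.rfft(measured, n)
    H = np.fft.rfft(system_response, n)
    Hs = np.where(np.abs(H) < eps, eps, H)   # clamp tiny spectral magnitudes
    return np.fft.irfft(M / Hs, n)

# Toy example: recover an impulse pair blurred by a known wave response.
true = np.zeros(128)
true[30], true[60] = 1.0, 0.5
h = np.exp(-np.arange(128) / 4.0)            # assumed system wave response
measured = np.fft.irfft(np.fft.rfft(true) * np.fft.rfft(h), 128)
recovered = inverse_filter_deconvolve(measured, h)
print("recovered peak locations:", sorted(np.argsort(recovered)[-2:]))
```

Because the blur here is mild and known exactly, the inverse filter recovers the impulse pair at samples 30 and 60; with noisy measured waves, the water level `eps` would be raised to trade resolution for stability.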
Mars Reconnaissance Orbiter Uplink Analysis Tool
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline
2008-01-01
This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions can also use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
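The keep-out window idea can be illustrated with a simple interval-overlap check; the event times, padding, and function names below are invented for the sketch (the actual tool is PHP and reads mission event files):

```python
from datetime import datetime, timedelta

# Pad each orbiter event by a user-supplied margin and flag contact
# windows that overlap the padded keep-out interval (illustrative only).
def conflicts(mro_events, mer_windows, pad_minutes=5):
    pad = timedelta(minutes=pad_minutes)
    hits = []
    for ev in mro_events:
        lo, hi = ev - pad, ev + pad
        for start, end in mer_windows:
            if start < hi and end > lo:        # standard interval-overlap test
                hits.append((ev, start, end))
    return hits

mro = [datetime(2008, 3, 1, 10, 0)]
mer = [(datetime(2008, 3, 1, 9, 58), datetime(2008, 3, 1, 10, 20)),
       (datetime(2008, 3, 1, 11, 0), datetime(2008, 3, 1, 11, 30))]
print(len(conflicts(mro, mer)))   # only the first window hits the keep-out
```

Varying `pad_minutes` plays the role of the custom keep-out padding described above: a larger pad flags more contact windows as conflicting.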
Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn
2016-01-01
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
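The window-length effect reported above can be reproduced on a toy simulated network whose correlation flips sign at known transitions; all parameters (state duration, noise level, window lengths) are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, state_len = 400, 100
# Simulated node pair: the true correlation switches sign every state_len.
sign = np.repeat([1.0, -1.0, 1.0, -1.0], state_len)
base = rng.standard_normal(n)
x = base + 0.3 * rng.standard_normal(n)
y = sign * base + 0.3 * rng.standard_normal(n)

def swc(x, y, win, offset=1):
    """Sliding-window correlation with a rectangular window."""
    starts = range(0, len(x) - win + 1, offset)
    return np.array([np.corrcoef(x[s:s + win], y[s:s + win])[0, 1]
                     for s in starts])

# A window comparable to the state duration tracks the transitions;
# one spanning two states averages them out, blurring the dynamics.
short, long_ = swc(x, y, 50), swc(x, y, 200)
print(f"short-window range: [{short.min():.2f}, {short.max():.2f}]")
print(f"long-window range:  [{long_.min():.2f}, {long_.max():.2f}]")
```

The short window swings between strongly positive and strongly negative values at the state changes, while the long window hovers near zero, matching the finding that detection degrades when the window length departs from the state duration.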
Solving the chemical master equation using sliding windows
2010-01-01
Background The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
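A minimal sliding-window CME sketch for a pure-birth (Poisson) process, where the probability mass drifts upward and the window follows it; the window width, rates, recentering rule, and Euler integrator are all illustrative choices, far simpler than the paper's bounding construction:

```python
import numpy as np

# Pure-birth process: state n -> n+1 at rate lam. The exact solution is
# Poisson(lam * t), so the mass drifts upward and a fixed truncation fails.
lam = 10.0
dt, t_end = 1e-3, 15.0
width = 80                          # number of states kept in the window

p = np.zeros(width)
p[0] = 1.0                          # start with zero molecules
lo = 0                              # lower edge of the current window
for _ in range(round(t_end / dt)):
    dp = np.zeros_like(p)
    dp[:-1] -= lam * p[:-1]         # leave state n by a birth
    dp[1:] += lam * p[:-1]          # arrive in state n+1
    dp[-1] -= lam * p[-1]           # truncation: mass leaving the top is dropped
    p += dt * dp                    # explicit Euler step of the truncated CME
    # Slide the window upward once the mean passes its centre.
    mean = (lo + np.arange(width)) @ p
    if mean - lo > width / 2:
        p = np.roll(p, -1)
        p[-1] = 0.0
        lo += 1

mean = (lo + np.arange(width)) @ p / p.sum()
print(f"window [{lo}, {lo + width}), mean ≈ {mean:.1f} (exact {lam * t_end:.0f})")
```

Only `width` states are ever stored or integrated, yet the final window sits around the exact mean copy number lam*t_end = 150 with little lost probability mass, which is the essence of letting the window follow the moving distribution.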
Counter tube window and X-ray fluorescence analyzer study
NASA Technical Reports Server (NTRS)
Hertel, R.; Holm, M.
1973-01-01
A study was performed to determine the best design tube window and X-ray fluorescence analyzer for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument. This included formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument has several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere, and is capable of measuring dust layers less than 1 micron thick. In addition, wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detail computer analysis on the ground.
Air Traffic Complexity Measurement Environment (ACME): Software User's Guide
NASA Technical Reports Server (NTRS)
1996-01-01
A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
NASA Astrophysics Data System (ADS)
Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong
2018-01-01
An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is that with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatterer. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatterer is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies with the largest amplitude in the time-frequency spectrum at each moment for this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulated data and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. The proposed algorithm achieves similar image contrast and lower entropy than the ICBA.
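Two core steps of such an approach, picking the candidate range bin by mean amplitude and tracking the dominant frequency across slow time, can be sketched as follows. This is an illustration only: the paper's dominant-scatterer screening criterion and its window-length calculation are omitted, and the bare max-amplitude rule here is a simplifying assumption.

```python
import numpy as np

def select_range_bin(profiles):
    """Pick the range bin with the largest mean amplitude.

    profiles: 2-D complex array, shape (num_range_bins, num_pulses).
    (The paper first screens candidates for a dominant scatterer; here
    mean amplitude alone serves as a simplified proxy.)
    """
    mean_amp = np.abs(profiles).mean(axis=1)
    return int(np.argmax(mean_amp))

def peak_frequency_track(signal, win_len=32, hop=1):
    """Sequence of dominant (positive) frequency bins from a sliding-window FFT."""
    freqs = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = signal[start:start + win_len] * np.hanning(win_len)
        spec = np.abs(np.fft.fft(seg))
        freqs.append(int(np.argmax(spec[:win_len // 2])))  # positive freqs only
    return np.array(freqs)
```

In the paper, the stability of such a frequency track over slow time is what determines the length and center of the imaging window.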
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
Wigner-Ville distribution (WVD) is an important time-frequency analysis technology with a high energy distribution in seismic signal processing. However, it is interfered by many cross terms. To suppress the cross terms of the WVD and keep the concentration of its high energy distribution, an adaptive multi-directional filtering window in the ambiguity domain is proposed. This begins with the relationship of the Cohen distribution and the Gabor transform combining the greedy strategy and the rotational invariance property of the fractional Fourier transform in order to propose the multi-directional window, which extends the one-dimensional, one directional, optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also solves the problem of the OFrGT, called the local concentration phenomenon, when encountering a multi-component signal. Experiments on different types of both the signal models and the real seismic signals reveal that the proposed window can overcome the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to a seismic signal's spectral decomposition. The results show that the proposed method can explore the space distribution of a reservoir more precisely.
Smith, Lauren H; Hargrove, Levi J; Lock, Blair A; Kuiken, Todd A
2011-04-01
Pattern recognition-based control of myoelectric prostheses has shown great promise in research environments, but has not been optimized for use in a clinical setting. To explore the relationship between classification error, controller delay, and real-time controllability, 13 able-bodied subjects were trained to operate a virtual upper-limb prosthesis using pattern recognition of electromyogram (EMG) signals. Classification error and controller delay were varied by training different classifiers with a variety of analysis window lengths ranging from 50 to 550 ms and either two or four EMG input channels. Offline analysis showed that classification error decreased with longer window lengths (p < 0.01). Real-time controllability was evaluated with the target achievement control (TAC) test, which prompted users to maneuver the virtual prosthesis into various target postures. The results indicated that user performance improved with lower classification error (p < 0.01) and was reduced with longer controller delay (p < 0.01), as determined by the window length. Therefore, both of these effects should be considered when choosing a window length; it may be beneficial to increase the window length if this results in a reduced classification error, despite the corresponding increase in controller delay. For the system employed in this study, the optimal window length was found to be between 150 and 250 ms, which is within acceptable controller delays for conventional multistate amplitude controllers.
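The window-length trade-off can be illustrated with a toy feature extractor: longer analysis windows yield less variable mean-absolute-value (MAV) features, a rough proxy for lower classification error, while each extra millisecond of window adds controller delay. MAV is a standard myoelectric feature, but this variance-based proxy is our assumption, not the study's analysis.

```python
import numpy as np

def mav_features(emg, win_len):
    """Mean-absolute-value features over non-overlapping analysis windows."""
    n = (len(emg) // win_len) * win_len
    return np.abs(emg[:n]).reshape(-1, win_len).mean(axis=1)

def feature_std(emg, win_len):
    """Window-to-window feature variability: a crude proxy for how noisy
    the classifier's inputs are (lower usually means fewer errors)."""
    return float(mav_features(emg, win_len).std())
```

On a stationary signal, a 250 ms window gives markedly steadier features than a 50 ms window, which mirrors the paper's finding that classification error falls with window length even as delay grows.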
Computed Tomography Window Blending: Feasibility in Thoracic Trauma.
Mandell, Jacob C; Wortman, Jeremy R; Rocha, Tatiana C; Folio, Les R; Andriole, Katherine P; Khurana, Bharti
2018-02-07
This study aims to demonstrate the feasibility of processing computed tomography (CT) images with a custom window blending algorithm that combines soft-tissue, bone, and lung window settings into a single image; to compare the time for interpretation of chest CT for thoracic trauma with window blending and conventional window settings; and to assess diagnostic performance of both techniques. Adobe Photoshop was scripted to process axial DICOM images from retrospective contrast-enhanced chest CTs performed for trauma with a window-blending algorithm. Two emergency radiologists independently interpreted the axial images from 103 chest CTs with both blended and conventional windows. Interpretation time and diagnostic performance were compared with Wilcoxon signed-rank test and McNemar test, respectively. Agreement with Nexus CT Chest injury severity was assessed with the weighted kappa statistic. A total of 13,295 images were processed without error. Interpretation was faster with window blending, resulting in a 20.3% time saving (P < .001), with no difference in diagnostic performance, within the power of the study to detect a difference in sensitivity of 5% as determined by post hoc power analysis. The sensitivity of the window-blended cases was 82.7%, compared to 81.6% for conventional windows. The specificity of the window-blended cases was 93.1%, compared to 90.5% for conventional windows. All injuries of major clinical significance (per Nexus CT Chest criteria) were correctly identified in all reading sessions, and all negative cases were correctly classified. All readers demonstrated near-perfect agreement with injury severity classification with both window settings. In this pilot study utilizing retrospective data, window blending allows faster preliminary interpretation of axial chest CT performed for trauma, with no significant difference in diagnostic performance compared to conventional window settings. 
Future studies would be required to assess the utility of window blending in clinical practice. Copyright © 2018 The Association of University Radiologists. All rights reserved.
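The general idea of collapsing several CT display windows into one image can be sketched as follows. The authors' Photoshop-scripted blending algorithm is not public, so the equal-weight average of three standard trauma windows below is purely illustrative; the level/width values are common defaults, not taken from the paper.

```python
import numpy as np

def apply_window(hu, level, width):
    """Map Hounsfield-unit values to the [0, 1] display range for a window."""
    lo, hi = level - width / 2.0, level + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

def blend_windows(hu):
    """Equal-weight average of soft-tissue, bone, and lung renderings.

    Illustrative only: the study's blending weights and window settings
    are assumptions here.
    """
    soft = apply_window(hu, level=50, width=400)
    bone = apply_window(hu, level=300, width=1500)
    lung = apply_window(hu, level=-600, width=1500)
    return (soft + bone + lung) / 3.0
```

Because each windowing is monotone in HU, the blend preserves density ordering while keeping lung, soft-tissue, and bone contrast visible in a single image.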
Short time Fourier analysis of the electromyogram - Fast movements and constant contraction
NASA Technical Reports Server (NTRS)
Hannaford, Blake; Lehman, Steven
1986-01-01
Short-time Fourier analysis was applied to surface electromyograms (EMG) recorded during rapid movements and during isometric contractions at constant force. A portion of the data was selected for transformation by multiplying the signal by a Hamming window, and the discrete Fourier transform of the windowed segment was then computed. By shifting the window along the data record, a new spectrum was computed every 10 ms. The transformed data were displayed in spectrograms or 'voiceprints'. This short-time technique made it possible to see time-dependencies in the EMG that are normally averaged out in the Fourier analysis of these signals. Spectra of EMGs during isometric contractions at constant force vary in the short (10-20 ms) term. Short-time spectra from EMGs recorded during rapid movements were much less variable. The windowing technique picked out the typical 'three-burst pattern' in EMGs from both wrist and head movements. Spectra during the bursts were more consistent than those during isometric contractions. Furthermore, there was a consistent shift in spectral statistics over the course of the three bursts: both the center frequency and the variance of the spectral energy distribution grew from the first burst to the second burst in the same muscle. The analogy between EMGs and speech signals is extended to argue for future applicability of short-time spectral analysis of EMG.
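The windowing procedure described above, a Hamming window shifted in 10 ms steps with a DFT per position, can be sketched directly. The default window length below is an illustrative assumption, not a value from the paper.

```python
import numpy as np

def short_time_spectra(x, fs, win_ms=25.6, hop_ms=10.0):
    """Magnitude spectrogram via Hamming-windowed DFTs shifted hop_ms at a time.

    x: 1-D signal; fs: sampling rate in Hz.
    Returns an array of shape (num_frames, win//2 + 1).
    """
    win = int(round(fs * win_ms / 1000))
    hop = int(round(fs * hop_ms / 1000))
    w = np.hamming(win)
    frames = [np.abs(np.fft.rfft(x[i:i + win] * w))
              for i in range(0, len(x) - win + 1, hop)]
    return np.array(frames)
```

Plotting the rows of the returned array against time reproduces the 'voiceprint' display: a pure tone shows up as a single horizontal ridge, while EMG bursts appear as broadband columns whose spectral statistics can be tracked frame by frame.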
Cerquera, Alexander; Vollebregt, Madelon A; Arns, Martijn
2018-03-01
Nonlinear analysis of EEG recordings allows detection of characteristics that would likely be neglected by linear methods. This study aimed to determine a suitable epoch length for nonlinear analysis of EEG data based on its recurrence rate in EEG alpha activity (electrodes Fz, Oz, and Pz) from 28 healthy subjects and 64 subjects with major depressive disorder. Two nonlinear metrics, Lempel-Ziv complexity and the scaling index, were applied in sliding windows of 20 seconds shifted every 1 second and in nonoverlapping windows of 1 minute. In addition, linear spectral analysis was carried out for comparison with the nonlinear results. The analysis with sliding windows showed that the cortical dynamics underlying alpha activity had a recurrence period of around 40 seconds in both groups. In the analysis with nonoverlapping windows, long-term nonstationarities entailed changes over time in the nonlinear dynamics that became significantly different between epochs across time, which was not detected with the linear spectral analysis. The findings suggest that epoch lengths shorter than 40 seconds neglect information in EEG nonlinear studies. In turn, linear analysis did not detect characteristics arising from long-term nonstationarities in the EEG alpha waves of control subjects and patients with major depressive disorder. We recommend that application of nonlinear metrics to EEG time series, particularly of alpha activity, be carried out with epochs of around 60 seconds. In addition, this study aimed to demonstrate that long-term nonlinearities are inherent to cortical brain dynamics regardless of the presence or absence of a mental disorder.
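A minimal version of the sliding-window Lempel-Ziv analysis reads as follows. The median binarization rule is a common choice but an assumption here, and the scaling index is omitted; the LZ76 phrase count is the standard complexity measure.

```python
import numpy as np

def lz_complexity(s):
    """Number of phrases in the LZ76 parsing of a binary string."""
    phrases, i, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it has appeared in the prior text
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1
        i += k
    return phrases

def sliding_lz(x, fs, win_s=20.0, hop_s=1.0):
    """LZ complexity in sliding windows, binarized around each window's median."""
    win, hop = int(fs * win_s), int(fs * hop_s)
    out = []
    for start in range(0, len(x) - win + 1, hop):
        seg = x[start:start + win]
        bits = ''.join('1' if v > np.median(seg) else '0' for v in seg)
        out.append(lz_complexity(bits))
    return np.array(out)
```

The recurrence analysis in the study then examines how this complexity curve repeats itself over time, yielding the roughly 40-second recurrence period reported above.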
climwin: An R Toolbox for Climate Window Analysis.
Bailey, Liam D; van de Pol, Martijn
2016-01-01
When studying the impacts of climate change, there is a tendency to select climate data from a small set of arbitrary time periods or climate windows (e.g., spring temperature). However, these arbitrary windows may not encompass the strongest periods of climatic sensitivity and may lead to erroneous biological interpretations. Therefore, there is a need to consider a wider range of climate windows to better predict the impacts of future climate change. We introduce the R package climwin that provides a number of methods to test the effect of different climate windows on a chosen response variable and compare these windows to identify potential climate signals. climwin extracts the relevant data for each possible climate window and uses this data to fit a statistical model, the structure of which is chosen by the user. Models are then compared using an information criteria approach. This allows users to determine how well each window explains variation in the response variable and compare model support between windows. climwin also contains methods to detect type I and II errors, which are often a problem with this type of exploratory analysis. This article presents the statistical framework and technical details behind the climwin package and demonstrates the applicability of the method with a number of worked examples.
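climwin itself is an R package, but its core loop, fitting a model for every candidate window and ranking windows by model support, can be sketched in Python. Here a plain residual sum of squares stands in for the information-criterion comparison the package actually uses, and the exhaustive start/end scan is a simplified version of its window search.

```python
import numpy as np

def window_scan(climate, response, max_window=30):
    """Scan candidate windows [start, end) in days before each event,
    average the climate signal in each, regress the response on it,
    and return the best-scoring window.

    climate: array (n_events, n_days) of daily values before each event.
    response: array (n_events,) of the biological response.
    """
    n_events, n_days = climate.shape
    best = (None, np.inf)
    limit = min(max_window, n_days)
    for start in range(limit):
        for end in range(start + 1, limit + 1):
            x = climate[:, start:end].mean(axis=1)
            X = np.column_stack([np.ones(n_events), x])
            beta = np.linalg.lstsq(X, response, rcond=None)[0]
            resid = response - X @ beta
            score = float(resid @ resid)   # stand-in for AICc in climwin
            if score < best[1]:
                best = ((start, end), score)
    return best
```

As in climwin, the danger of such an exhaustive scan is overfitting, which is why the real package adds randomization tests for type I and II errors.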
Design and comparison of laser windows for high-power lasers
NASA Astrophysics Data System (ADS)
Niu, Yanxiong; Liu, Wenwen; Liu, Haixia; Wang, Caili; Niu, Haisha; Man, Da
2014-11-01
High-power laser systems are increasingly used in industry and military applications. It is necessary to develop a high-power laser system that can operate over long periods of time without appreciable degradation in performance. When a high-energy laser beam transmits through a laser window, energy absorption by the window material can cause permanent damage to the window. Therefore, when designing a high-power laser system, a suitable window material must be selected and the laser damage threshold of the window must be known. In this paper, a thermal analysis model of a high-power laser window is established, and the relationship between the laser intensity and the thermal-stress field distribution is studied by deriving formulas with the integral-transform method. The influence of window radius, thickness, and laser intensity on the temperature and stress field distributions is analyzed. The performance of K9 glass and fused silica glass is then compared, and the laser-induced damage mechanism is analyzed. Finally, the damage thresholds of laser windows are calculated. The results show that, compared with K9 glass, fused silica glass has a higher damage threshold due to its good thermodynamic properties. The presented theoretical analysis and simulation results are helpful for the design and selection of high-power laser windows.
Iconic Meaning in Music: An Event-Related Potential Study.
Cai, Liman; Huang, Ping; Luo, Qiuling; Huang, Hong; Mo, Lei
2015-01-01
Although there has been extensive research on the processing of the emotional meaning of music, little is known about other aspects of listeners' experience of music. The present study investigated the neural correlates of the iconic meaning of music. Event-related potentials (ERP) were recorded while a group of 20 music majors and a group of 20 non-music majors performed a lexical decision task in the context of implicit musical iconic meaning priming. ERP analysis revealed a significant N400 effect of congruency in time window 260-510 ms following the onset of the target word only in the group of music majors. Time-course analysis using 50 ms windows indicated significant N400 effects both within the time window 410-460 ms and 460-510 ms for music majors, whereas only a partial N400 effect during time window 410-460 ms was observed for non-music majors. There was also a trend for the N400 effects in the music major group to be stronger than those in the non-major group in the sub-windows of 310-360 ms and 410-460 ms. Especially in the sub-window of 410-460 ms, the topographical map of the difference waveforms between congruent and incongruent conditions revealed different N400 distribution between groups; the effect was concentrated in bilateral frontal areas for music majors, but in central-parietal areas for non-music majors. These results imply probable neural mechanism differences underlying automatic iconic meaning priming of music. Our findings suggest that processing of the iconic meaning of music can be accomplished automatically and that musical training may facilitate the understanding of the iconic meaning of music.
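The time-course analysis described above amounts to averaging the difference waveform in consecutive 50 ms sub-windows of the 260-510 ms range. A minimal sketch follows; the sampling rate, epoch layout, and dictionary return format are illustrative assumptions.

```python
import numpy as np

def window_means(erp, fs, t0_ms, win_ms=50.0, start_ms=260.0, stop_ms=510.0):
    """Mean amplitude of an ERP in consecutive 50 ms sub-windows.

    erp: 1-D array of voltages; fs: sampling rate in Hz;
    t0_ms: time of stimulus onset relative to the first sample.
    Returns {(window_start_ms, window_end_ms): mean_amplitude}.
    """
    out = {}
    t = start_ms
    while t + win_ms <= stop_ms + 1e-9:
        a = int(round((t0_ms + t) * fs / 1000))
        b = int(round((t0_ms + t + win_ms) * fs / 1000))
        out[(t, t + win_ms)] = float(erp[a:b].mean())
        t += win_ms
    return out
```

Applied to the congruent-minus-incongruent difference wave per subject, these window means are what enter the statistical comparison of N400 effects between the 410-460 ms and 460-510 ms sub-windows.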
Human Mars Mission: Launch Window from Earth Orbit. Pt. 1
NASA Technical Reports Server (NTRS)
Young, Archie
1999-01-01
The determination of orbital launch window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as the nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and a plane change and, therefore, a delta V penalty. Usually, because of the delta V penalty, the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: 1) One impulsive maneuver from a Highly Elliptical Orbit (HEO); 2) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO); 3) One impulsive maneuver from a Low Earth Orbit (LEO); 4) Two impulsive maneuvers from LEO; and 5) Three impulsive maneuvers from LEO.
The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. Also the formulation provides vector and geometrical data sufficient for use as a good starting point in detail trajectory analysis based on calculus of variations, steepest descent, or parameter optimization program techniques.
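The delta V penalty discussed above can be illustrated for the simplest case of a single impulsive plane change performed at orbital speed v, for which dv = 2 v sin(theta/2) when the orbit plane is rotated by theta. The sketch below only checks how far the departure asymptote lies out of the parking-orbit plane; it ignores the path-angle change and the timing effects treated in the report, so it is a lower-bound illustration, not the report's formulation.

```python
import numpy as np

def plane_change_penalty(orbit_normal, asymptote, v_circ):
    """Delta-V to rotate the parking-orbit plane to contain the departure
    asymptote, using the impulsive plane-change law dv = 2 v sin(theta/2).

    orbit_normal, asymptote: 3-vectors; v_circ: orbital speed (km/s).
    Coplanar departure requires the asymptote to be perpendicular to the
    orbit normal; the needed rotation is the asymptote's out-of-plane angle.
    """
    n = np.asarray(orbit_normal, float)
    s = np.asarray(asymptote, float)
    n = n / np.linalg.norm(n)
    s = s / np.linalg.norm(s)
    out_of_plane = np.arcsin(abs(float(n @ s)))
    return 2.0 * v_circ * np.sin(out_of_plane / 2.0)
```

When the asymptote already lies in the orbit plane the penalty vanishes, which is the coplanar-injection condition described in the abstract; as the asymptote drifts out of plane over the launch window, the penalty grows and eventually closes the window.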
Human Exploration Missions Study Launch Window from Earth Orbit
NASA Technical Reports Server (NTRS)
Young, Archie
2001-01-01
The determination of orbital launch window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as the nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and a plane change and, therefore, a Delta(V) penalty. Usually, because of the Delta(V) penalty, the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit.
These modes are: (1) One impulsive maneuver from a Low Earth Orbit (LEO), (2) Two impulsive maneuvers from LEO, (3) Three impulsive maneuvers from LEO, (4) One impulsive maneuver from a Highly Elliptical Orbit (HEO), and (5) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO). The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. Also the formulation provides vector and geometrical data sufficient for use as a good starting point in detailed trajectory analysis based on calculus of variations, steepest descent, or parameter optimization techniques.
An efficient pseudomedian filter for tiling microarrays.
Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B
2007-06-07
Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n^2 log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. We therefore implemented Monahan's HLQEST algorithm, which reduces the runtime complexity for computing the pseudomedian of n numbers from O(n^2 log n) to O(n log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by applying skip lists to maintain a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density.
This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at http://tiling.gersteinlab.org/pseudomedian/.
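The quantity being smoothed here is the Hodges-Lehmann pseudomedian, the median of all pairwise (Walsh) averages. The naive O(n^2 log n) form and a sliding-window smoother can be sketched as follows; HLQEST and the skip-list bookkeeping from the paper are deliberately not reproduced.

```python
import numpy as np
from itertools import combinations_with_replacement

def pseudomedian(x):
    """Hodges-Lehmann estimator: median of all pairwise means with i <= j.
    This is the naive O(n^2 log n) form; HLQEST achieves O(n log n)."""
    pairs = [(a + b) / 2.0 for a, b in combinations_with_replacement(x, 2)]
    return float(np.median(pairs))

def sliding_pseudomedian(signal, win):
    """Smooth a probe-level signal with a sliding pseudomedian window."""
    half = win // 2
    return np.array([pseudomedian(signal[max(0, i - half):i + half + 1])
                     for i in range(len(signal))])
```

The estimator's appeal for noisy probe intensities is its robustness: a single outlier probe barely moves the window's pseudomedian, unlike a windowed mean.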
NASA Technical Reports Server (NTRS)
Shih, Y. H.; Sergienko, A. V.; Rubin, M. H.
1993-01-01
A pair of correlated photons generated from parametric down conversion was sent to two independent Michelson interferometers. Second-order interference was studied by means of a coincidence measurement between the outputs of the two interferometers. The reported experiment and analysis studied this second-order interference phenomenon from the point of view of the Einstein-Podolsky-Rosen paradox. The experiment was done in two steps. The first step of the experiment used 50 psec and 3 nsec coincidence time windows simultaneously. The 50 psec window was able to distinguish a 1.5 cm optical path difference in the interferometers. The interference visibility was measured to be 38 percent and 21 percent for the 50 psec time window, and 22 percent and 7 percent for the 3 nsec time window, when the optical path differences of the interferometers were 2 cm and 4 cm, respectively. By comparing the visibilities between these two windows, the experiment showed the nonclassical effect resulting from an E.P.R. state. The second step of the experiment used a 20 psec coincidence time window, which was able to distinguish a 6 mm optical path difference in the interferometers. The interference visibility was measured to be 59 percent for an optical path difference of 7 mm. This is the first observation of visibility greater than 50 percent for a two-interferometer E.P.R. experiment, which demonstrates nonclassical correlation of space-time variables.
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
Knowledge of the dynamic loads acting on a structure is required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place a limited number of sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that, unlike earlier formulations, it does not require a finite element idealization of the structure and is therefore free from physical modelling errors. We consider three numerical examples to validate the accuracy of the proposed DPCA-based method. The effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. The studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
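The core of dynamic PCA is an ordinary PCA applied to a time-lagged (augmented) data matrix built from the measurements inside one moving window. A minimal sketch follows; the number of lags and the use of raw acceleration channels as columns are illustrative assumptions, and the force-reconstruction step of the paper is omitted.

```python
import numpy as np

def dpca_window(acc, n_lags=2):
    """Principal components of the lag-augmented data matrix for one window.

    acc: array (n_samples, n_sensors) of acceleration measurements.
    Returns eigenvalues (descending) and eigenvectors of the covariance
    of the dynamic (time-lagged) data matrix, as used in dynamic PCA.
    """
    n, m = acc.shape
    cols = [acc[l:n - n_lags + l] for l in range(n_lags + 1)]
    X = np.hstack(cols)                    # shape (n - n_lags, m * (n_lags + 1))
    X = X - X.mean(axis=0)                 # center within the window
    cov = X.T @ X / (X.shape[0] - 1)
    w, v = np.linalg.eigh(cov)
    return w[::-1], v[:, ::-1]             # descending eigenvalue order
```

In the online setting, this decomposition is recomputed (or updated) as the window slides, and the leading components carry the dynamic signature from which the input force history is reconstructed.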
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach involves applying a Fast Fourier Transform (FFT) to the signal multiplied with an appropriate window function of fixed resolution. Selecting an appropriate window size is difficult when no background information about the input signal is available. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution, resulting in high spectral resolution at low frequencies and high temporal resolution at high frequencies. A simple but effective framework for switching between the STFT and CQT is provided. The proposed method also allows the dynamic construction of a filter bank according to user-defined parameters, which helps reduce redundant entries in the filter bank. The proposed method not only improves spectrogram visualization but also reduces the computation cost, and it selects the appropriate window length in 87.71% of cases.
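A toy version of the narrow-band branch, choosing a window just long enough that the STFT's frequency resolution matches the signal's occupied bandwidth, can be sketched as follows. The 1% power threshold and the clipping bounds are assumptions; the paper's empirical spectrum-sensing model and the CQT branch are not reproduced here.

```python
import numpy as np

def adaptive_window_length(x, fs, min_len=64, max_len=4096):
    """Pick an STFT window from a crude occupied-bandwidth estimate:
    narrower bandwidth -> finer frequency resolution needed -> longer window.
    """
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    occupied = freqs[power > 0.01 * power.max()]     # bins above 1% of peak
    bw = max(float(occupied.max() - occupied.min()), fs / max_len)
    win = int(fs / bw)                               # resolution fs/win ~ bw
    return int(np.clip(win, min_len, max_len))
```

A narrow-band tone drives the rule to the longest allowed window (fine frequency resolution), while broadband noise drives it toward the shortest (fine time resolution), which is the qualitative behavior the adaptive model is designed to achieve.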
Removal of Noise from a Voice Signal by Synthesis
1973-05-01
…the cost of computing for 102.4 millisecond windows is about five times as great as the cost of computing for 25.6 millisecond windows. Hammett, in his work on an adaptive spectrum analysis vocoder, examined the selection of data window widths in detail [18]. The solution Hammett used optimized the trade-off between … The result is s(t) = sum_{i=1}^{n} R_i(t - i*T), where n is the number of impulse responses under consideration and s(t) is the resulting synthetic signal.
Adaptive synchrosqueezing based on a quilted short-time Fourier transform
NASA Astrophysics Data System (ADS)
Berrian, Alexander; Saito, Naoki
2017-08-01
In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
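The discrete reassignment rule, estimating each bin's instantaneous frequency from a finite difference of the STFT phase between consecutive frames, can be sketched as follows. This is illustrative Python on a complex exponential; the quilting, window-selection, and squeezing machinery of the paper is not included.

```python
import numpy as np

def stft(x, win, hop):
    """Complex STFT with a Hanning window (full FFT, complex input allowed)."""
    w = np.hanning(win)
    return np.array([np.fft.fft(x[i:i + win] * w)
                     for i in range(0, len(x) - win + 1, hop)])

def reassignment_frequencies(x, fs, win=256, hop=1):
    """Instantaneous-frequency estimate per bin from the phase increment
    between consecutive frames (the finite-difference reassignment rule)."""
    S = stft(x, win, hop)
    dphi = np.angle(S[1:] * np.conj(S[:-1]))   # wrapped phase increment per hop
    return dphi * fs / (2.0 * np.pi * hop)     # Hz
```

In an SST, energy at each (frame, bin) is then relocated to the frequency this estimate reports, which is what sharpens the ridges of multi-component signals; note the hop must be small enough that the phase increment stays within (-pi, pi].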
Zhang, Fei-Ruo; He, Li-Hua; Wu, Shan-Shan; Li, Jing-Yun; Ye, Kang-Pin; Wang, Sheng
2011-11-01
Work-related musculoskeletal disorders (WMSDs) are highly prevalent among sewing machine operators in the garment industry. Long work duration, sustained low-level work and precise hand work are the main risk factors for neck-shoulder disorders in sewing machine operators. Surface electromyography (sEMG) offers a valuable tool to determine muscle activity (internal exposure) and quantify muscular load (external exposure). During sustained and/or repetitive muscle contractions, typical sEMG signs of muscle fatigue can be observed: an increase in amplitude and a shift in the spectrum towards lower frequencies. In this paper, we measured and quantified the muscle load and muscular activity patterns of neck-shoulder muscles in female sewing machine operators during sustained sewing machine operation using sEMG. A total of 18 healthy women sewing machine operators volunteered to participate in this study. Before their daily sewing machine operating task, we measured the maximal voluntary contractions (MVC) and 20%MVC of the bilateral cervical erector spinae (CES) and upper trapezius (UT); the sEMG signals of the bilateral UT and CES were then monitored and recorded continuously during 200 minutes of sustained sewing machine operation, i.e., 20 time windows of 10 minutes each. After the 200 minutes of work, we retested 20%MVC of the four neck-shoulder muscles and recorded the sEMG signals. Linear analyses, including the amplitude probability distribution function (APDF), amplitude parameters such as root mean square (RMS), and the spectrum parameter median frequency (MF), were used to quantify the muscle load and muscular activity of the bilateral CES and UT.
During the 200 minutes of sewing machine operation, the median loads for the left cervical erector spinae (LCES), right cervical erector spinae (RCES), left upper trapezius (LUT) and right upper trapezius (RUT) were 6.78%MVE, 6.94%MVE, 6.47%MVE and 5.68%MVE, respectively. The work load of the right muscles was significantly higher than that of the left muscles (P < 0.05). sEMG analysis of the isometric contractions indicated that the amplitude values before operating were significantly higher than those after work (P < 0.01), and that the spectrum values of the bilateral CES and UT were significantly lower than those after work (P < 0.01). According to the sEMG data for the 20 time windows, the muscle activity patterns of the bilateral CES and UT changed dynamically as operating time passed: the maximal amplitudes of the LCES, RCES and LUT occurred in the 20th time window and that of the RUT in the 16th, while spectrum analysis showed that the lowest values occurred in the 7th, 16th and 20th time windows. Female sewing machine operators were exposed to high sustained static load on the bilateral neck-shoulder muscles; the left neck and shoulder muscles were held in more static positions; and the 7th, 16th and 20th time windows were periods of muscle fatigue at which ergonomic interventions could be targeted.
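The two sEMG summary measures used here, RMS amplitude and median frequency, are straightforward to compute per time window. A minimal NumPy sketch (the window length and absence of filtering are illustrative choices, not taken from the study):

```python
import numpy as np

def rms(x):
    """Root mean square amplitude of a signal window."""
    return np.sqrt(np.mean(np.square(x)))

def median_frequency(x, fs):
    """Frequency below which half of the total spectral power lies."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    cumulative = np.cumsum(power)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]

def windowed_features(signal, fs, window_s):
    """Split a recording into consecutive windows and return the
    per-window RMS and median-frequency tracks."""
    n = int(fs * window_s)
    windows = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    return ([rms(w) for w in windows],
            [median_frequency(w, fs) for w in windows])
```

With fatigue, the RMS track typically rises while the median-frequency track falls, which is the pattern the abstract describes.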
A comparison of the wavelet and short-time fourier transforms for Doppler spectral analysis.
Zhang, Yufeng; Guo, Zhenyu; Wang, Weilian; He, Side; Lee, Ting; Loew, Murray
2003-09-01
Doppler spectrum analysis provides a non-invasive means to measure blood flow velocity and to diagnose arterial occlusive disease. The time-frequency representation of the Doppler blood flow signal is normally computed by using the short-time Fourier transform (STFT). This transform requires stationarity of the signal during a finite time interval, and thus imposes some constraints on the representation estimate. In addition, the STFT has a fixed time-frequency window, making it inaccurate to analyze signals having relatively wide bandwidths that change rapidly with time. In the present study, wavelet transform (WT), having a flexible time-frequency window, was used to investigate its advantages and limitations for the analysis of the Doppler blood flow signal. Representations computed using the WT with a modified Morlet wavelet were investigated and compared with the theoretical representation and those computed using the STFT with a Gaussian window. The time and frequency resolutions of these two approaches were compared. Three indices, the normalized root-mean-squared errors of the minimum, the maximum and the mean frequency waveforms, were used to evaluate the performance of the WT. Results showed that the WT can not only be used as an alternative signal processing tool to the STFT for Doppler blood flow signals, but can also generate a time-frequency representation with better resolution than the STFT. In addition, the WT method can provide both satisfactory mean frequencies and maximum frequencies. This technique is expected to be useful for the analysis of Doppler blood flow signals to quantify arterial stenoses.
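The "flexible time-frequency window" of the wavelet transform can be sketched directly: a Morlet analysis at frequency f uses a Gaussian envelope whose width shrinks as f grows, unlike the fixed window of the STFT. A minimal NumPy implementation (the normalization and the w parameter are common textbook choices, not the paper's exact modified Morlet):

```python
import numpy as np

def morlet_scalogram(x, fs, freqs, w=6.0):
    """|CWT| with a complex Morlet wavelet of roughly w cycles.

    The envelope standard deviation is w / (2*pi*f) seconds, so the
    analysis window narrows with frequency -- the flexible
    time-frequency window contrasted with the fixed STFT window."""
    out = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        s = w * fs / (2.0 * np.pi * f)                # envelope std in samples
        half = min(int(4 * s), (len(x) - 1) // 2)     # truncate the support
        n = np.arange(-half, half + 1)
        wavelet = (np.exp(2j * np.pi * f * n / fs)
                   * np.exp(-n ** 2 / (2.0 * s * s)))
        wavelet /= np.abs(wavelet).sum()              # L1 norm: flat tone response
        # Correlation of x with the wavelet (convolve with reversed conjugate):
        out[i] = np.abs(np.convolve(x, np.conj(wavelet)[::-1], mode="same"))
    return out
```

The ridge of the scalogram (the frequency of maximum magnitude at each instant) then traces the dominant Doppler frequency over time.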
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
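The core of the method can be sketched as follows. Note two loudly labeled simplifications: the paper normalizes U to Z via Monte Carlo simulation, whereas this sketch uses the large-sample normal approximation as a stand-in, and ties in the data are not handled.

```python
import numpy as np

def mann_whitney_z(a, b):
    """U statistic of sample `a` versus `b`, normalized to Z with the
    large-sample normal approximation (no tie correction)."""
    combined = np.concatenate([a, b])
    ranks = np.argsort(np.argsort(combined)) + 1.0   # 1-based ranks
    u = ranks[:len(a)].sum() - len(a) * (len(a) + 1) / 2.0
    mean_u = len(a) * len(b) / 2.0
    sd_u = np.sqrt(len(a) * len(b) * (len(a) + len(b) + 1) / 12.0)
    return (u - mean_u) / sd_u

def running_mann_whitney_z(series, window):
    """Slide a window along the series and test the ranks inside each
    window against the rest of the data, yielding one Z per position."""
    z = []
    for start in range(len(series) - window + 1):
        inside = series[start:start + window]
        outside = np.concatenate([series[:start], series[start + window:]])
        z.append(mann_whitney_z(inside, outside))
    return np.array(z)
```

A sustained shift in the series shows up as a run of large-magnitude Z values over the windows that cover it.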
Bounded Linear Stability Analysis - A Time Delay Margin Estimation Approach for Adaptive Control
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.; Ishihara, Abraham K.; Krishnakumar, Kalmanje Srinivas; Bakhtiari-Nejad, Maryam
2009-01-01
This paper presents a method for estimating time delay margin for model-reference adaptive control of systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent the conventional model-reference adaptive law by a locally bounded linear approximation within a small time window using the comparison lemma. The locally bounded linear approximation of the combined adaptive system is cast in a form of an input-time-delay differential equation over a small time window. The time delay margin of this system represents a local stability measure and is computed analytically by a matrix measure method, which provides a simple analytical technique for estimating an upper bound of time delay margin. Based on simulation results for a scalar model-reference adaptive control system, both the bounded linear stability method and the matrix measure method are seen to provide a reasonably accurate and yet not too conservative time delay margin estimation.
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that can characterize the time series is computed for each window and a cross correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
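The pipeline described here is: compute a characterizing parameter per sliding window for each series, then cross-correlate the resulting parameter sequences. A hedged NumPy sketch, using np.std as a simple stand-in for the Hurst exponent or intermittency parameter the paper actually computes:

```python
import numpy as np

def windowed_parameter(series, window, step, param=np.std):
    """Evaluate `param` on each sliding window of the series (the
    paper uses the Hurst exponent or an intermittency parameter;
    np.std is an illustrative stand-in)."""
    return np.array([param(series[i:i + window])
                     for i in range(0, len(series) - window + 1, step)])

def parameter_cross_correlation(x, y, window, step, param=np.std):
    """Pearson correlation between the windowed-parameter series of
    two signals -- the parameter-oriented correlation measure."""
    px = windowed_parameter(x, window, step, param)
    py = windowed_parameter(y, window, step, param)
    return np.corrcoef(px, py)[0, 1]
```

Two series that share the same slowly varying volatility regime, for instance, can show a high parameter correlation even when their raw samples are uncorrelated.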
Study of wavefront error and polarization of a side mounted infrared window
NASA Astrophysics Data System (ADS)
Liu, Jiaguo; Li, Lin; Hu, Xinqi; Yu, Xin
2008-03-01
The wavefront error and polarization of a side-mounted infrared window made of ZnS are studied. Infrared windows suffer from temperature gradients and stress during launch. In general, the temperature gradient changes the refractive index of the material, whereas stress produces deformation and birefringence. In this paper, a thermal finite element analysis (FEA) of an IR window is presented. For this purpose, we employed the FEA program Ansys to obtain the time-varying temperature field. The deformation and stress of the window are derived from a structural FEA with the aerodynamic force and the previously obtained temperature field as the loads. The deformation, temperature field, stress field, ray tracing and Jones calculus are used to calculate the wavefront error and the change of polarization state.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold- and window-based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
Vibration measurement by temporal Fourier analyses of a digital hologram sequence.
Fu, Yu; Pedrini, Giancarlo; Osten, Wolfgang
2007-08-10
A method for whole-field noncontact measurement of displacement, velocity, and acceleration of a vibrating object based on image-plane digital holography is presented. A series of digital holograms of a vibrating object are captured by use of a high-speed CCD camera. The result of the reconstruction is a three-dimensional complex-valued matrix with noise. We apply Fourier analysis and windowed Fourier analysis in both the spatial and the temporal domains to extract the displacement, the velocity, and the acceleration. The instantaneous displacement is obtained by temporal unwrapping of the filtered phase map, whereas the velocity and acceleration are evaluated by Fourier analysis and by windowed Fourier analysis along the time axis. The combination of digital holography and temporal Fourier analyses allows for evaluation of the vibration, without a phase ambiguity problem, and smooth spatial distribution of instantaneous displacement, velocity, and acceleration of each instant are obtained. The comparison of Fourier analysis and windowed Fourier analysis in velocity and acceleration measurements is also presented.
Nielsen, Merete Willemoes; Søndergaard, Birthe; Kjøller, Mette; Hansen, Ebba Holme
2008-09-01
This study compared national self-reported data on medicine use and national prescription records at the individual level. Data from the nationally representative Danish health survey conducted in 2000 (n=16,688) were linked at the individual level to national prescription records covering 1999-2000. Kappa statistics and 95% confidence intervals were calculated. Applying the legend time method to medicine groups used mainly on a chronic basis revealed good to very good agreement between the two data sources, whereas medicines used as needed showed fair to moderate agreement. When a fixed-time window was applied for analysis, agreement was unchanged for medicines used mainly on a chronic basis, whereas agreement increased somewhat compared to the legend time method when analyzing medicines used as needed. Agreement between national self-reported data and national prescription records differed according to method of analysis and therapeutic group. A fixed-time window is an appropriate method of analysis for most therapeutic groups.
Human Mars Mission: Launch Window from Earth Orbit. Pt. 1
NASA Technical Reports Server (NTRS)
Young, Archie
1999-01-01
The determination of orbital window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the Earth parking orbit. The orientation of the escape hyperbola asymptote relative to Earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as the nominal situation, it will not generally represent the more probable injection geometry. In the general case, an escape injection maneuver performed at a time other than the coplanar time will involve both a path-angle and a plane change and, therefore, a DELTA V penalty. Usually, because of the DELTA V penalty, the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: (1) one impulsive maneuver from a Highly Elliptical Orbit (HEO); (2) two impulsive maneuvers from a HEO; (3) one impulsive maneuver from a Low Earth Orbit (LEO); (4) two impulsive maneuvers from LEO; (5) three impulsive maneuvers from LEO.
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, for the best sensitivity; (b) design-based binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
Launch COLA Gap Analysis for Protection of the International Space Station
NASA Astrophysics Data System (ADS)
Jenkin, Alan B.; McVey, John P.; Peterson, Glenn E.; Sorge, Marlon E.
2013-08-01
For launch missions in general, a collision avoidance (COLA) gap exists between the end of the time interval covered by standard launch COLA screening and the time that other spacecraft can clear a collision with the newly launched objects. To address this issue for the International Space Station (ISS), a COLA gap analysis process has been developed. The first part of the process, nodal separation analysis, identifies launch dates and launch window opportunities when the orbit traces of a launched object and the ISS could cross during the COLA gap. The second and newest part of the analysis process, Monte Carlo conjunction probability analysis, is performed closer to the launch dates of concern to reopen some of the launch window opportunities that would be closed by nodal separation analysis alone. Both parts of the process are described and demonstrated on sample missions.
Short segment search method for phylogenetic analysis using nested sliding windows
NASA Astrophysics Data System (ADS)
Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.
2017-10-01
To analyze phylogenetics in bioinformatics, the coding DNA sequence (CDS) segment is needed for maximal accuracy. However, analysis of the full CDS costs considerable time and money, so a short segment that represents the CDS is needed, such as the envelope protein segment or the non-structural 3 (NS3) segment. After sliding-window analysis was implemented, a short segment better than the envelope protein and NS3 segments was found. This paper discusses a mathematical method for analyzing sequences using nested sliding windows to find a short segment that is representative of the whole genome. The results show that our method can find a short segment that is about 6.57% more representative of the CDS segment, in terms of tree topology, than the envelope or NS3 segments.
NASA Astrophysics Data System (ADS)
Taira, T.; Kato, A.
2013-12-01
A high-resolution Vp/Vs ratio estimate is one of the key parameters for understanding spatial variations of composition and physical state within the Earth. Lin and Shearer (2007, BSSA) recently developed a methodology to obtain local Vp/Vs ratios in individual similar-earthquake clusters, based on P- and S-wave differential times. A waveform cross-correlation approach is typically employed to measure those differential times for pairs of seismograms from similar-earthquake clusters, in narrow time windows around the direct P and S waves. This approach effectively collects P- and S-wave differential times, but requires robust P- and S-wave time windows that are extracted from either manually or automatically picked P- and S-phases. We present another technique to estimate P- and S-wave differential times by exploiting the temporal behavior of the delayed time as a function of elapsed time on the seismograms with a moving-window cross-correlation analysis (e.g., Snieder, 2002, Phys. Rev. E; Niu et al., 2003, Nature). Our approach is based on the principle that the delayed time for the direct S wave differs from that for the direct P wave. Two seismograms aligned on the direct P waves from a pair of similar earthquakes yield delayed times near zero around the direct P wave. In contrast, delayed times obtained from time windows including the direct S wave have non-zero values. Our approach, in principle, is capable of measuring both P- and S-wave differential times from single-component seismograms. In an ideal case, the temporal evolution of the delayed time becomes a step function with its discontinuity at the onset of the direct S wave. The offset in the resulting step function is the S-wave differential time, relative to the P-wave differential time, as the two waveforms are aligned on the direct P wave.
We apply our moving-window cross-correlation technique to two different data sets, collected at: 1) the Wakayama district, Japan, and 2) The Geysers geothermal field, California. Both target areas are characterized by earthquake swarms that provide a number of similar-event clusters. We use the following automated procedure to systematically analyze the two data sets: 1) identification of the direct P arrivals with an Akaike Information Criterion based phase-picking algorithm introduced by Zhang and Thurber (2003, BSSA); 2) waveform alignment on the P wave with a waveform cross-correlation to obtain the P-wave differential time; 3) the moving-time-window analysis to estimate the S-wave differential time. Kato et al. (2010, GRL) estimated the Vp/Vs ratios for a few similar-earthquake clusters from the Wakayama data set with a conventional approach to obtaining differential times. We find that the resulting Vp/Vs ratios from our approach for the same earthquake clusters are comparable with those obtained by Kato et al. (2010, GRL). We show that the moving-window cross-correlation technique effectively measures both P- and S-wave differential times for seismograms in which clear P and S phases are not observed. We will show spatial distributions of Vp/Vs ratios in our two target areas.
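The step-function behavior of the delay curve can be illustrated with a windowed cross-correlation sketch (pure NumPy; the window length, step, and lag range are illustrative choices, not the study's):

```python
import numpy as np

def moving_window_delay(u, v, window, step, max_lag):
    """Delay of v relative to u, in samples, per sliding window:
    the integer lag maximizing the demeaned cross-correlation.

    For P-aligned seismograms of a similar-event pair, the delay is
    ~0 before the S arrival and jumps to the S-wave differential
    time after it, tracing the step function described above."""
    delays = []
    for start in range(0, len(u) - window + 1, step):
        a = u[start:start + window] - u[start:start + window].mean()

        def score(lag):
            lo = start + lag
            if lo < 0 or lo + window > len(v):
                return -np.inf          # lagged window falls off the record
            b = v[lo:lo + window]
            return float(np.dot(a, b - b.mean()))

        delays.append(max(range(-max_lag, max_lag + 1), key=score))
    return np.array(delays)
```

The S-wave differential time is then read off as the height of the step in the returned delay sequence.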
Wavelet-based multiscale window transform and energy and vorticity analysis
NASA Astrophysics Data System (ADS)
Liang, Xiang San
A new methodology, Multiscale Energy and Vorticity Analysis (MS-EVA), is developed to investigate sub-mesoscale, meso-scale, and large-scale dynamical interactions in geophysical fluid flows which are intermittent in space and time. The development begins with the construction of a wavelet-based functional analysis tool, the multiscale window transform (MWT), which is local, orthonormal, self-similar, and windowed on scale. The MWT is first built over the real line and then modified onto a finite domain. Its properties are explored, the most important being the property of marginalization, which brings together a quadratic quantity in physical space with its phase-space representation. Based on the MWT, the MS-EVA is developed. Energy and enstrophy equations for the large-, meso-, and sub-meso-scale windows are derived and their terms interpreted. The processes thus represented are classified into four categories: transport, transfer, conversion, and dissipation/diffusion. The separation of transport from transfer is made possible with the introduction of the concept of perfect transfer. By the property of marginalization, the classical energetic analysis proves to be a particular case of the MS-EVA. The MS-EVA is validated with classical instability problems. The validation is carried out in two steps. First, it is established that the barotropic and baroclinic instabilities are indicated by the spatial averages of certain transfer terms obtained from interaction analyses. Then calculations of these indicators are made with an Eady model and a Kuo model. The results agree precisely with what is expected from their analytical solutions, and the energetics reproduced reveal a consistent and important aspect of the unknown dynamic structures of instability processes. As an application, the MS-EVA is used to investigate the Iceland-Faeroe frontal (IFF) variability.
An MS-EVA-ready dataset is first generated through a forecasting study with the Harvard Ocean Prediction System, using the data gathered during the 1993 NRV Alliance cruise. The application starts with a determination of the scale window bounds, which characterize a double-peak structure in both the time wavelet spectrum and the space wavelet spectrum. The resulting energetics, when locally averaged, reveal a clear baroclinic instability around the cold tongue intrusion observed in the forecast. Moreover, an interaction analysis shows that the energy released by the instability indeed goes to the meso-scale window and fuels the growth of the intrusion. A sensitivity study shows that, in this case, the key to a successful application is a correct decomposition of the large-scale window from the meso-scale window.
Statistical tests for power-law cross-correlated processes
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically verified the Cauchy inequality -1≤ρDCCA(T,n)≤1. Here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
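The coefficient and its bounds are easy to check numerically. A sketch with non-overlapping boxes and linear detrending (the paper also treats overlapping windows, which this sketch omits):

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA(T, n):
    F^2_DCCA divided by the product of the two DFA fluctuation
    functions, with non-overlapping boxes of size n."""
    xp = np.cumsum(x - np.mean(x))        # integrated profiles
    yp = np.cumsum(y - np.mean(y))
    t = np.arange(n)
    cov = vx = vy = 0.0
    for b in range(len(x) // n):
        xs = xp[b * n:(b + 1) * n]
        ys = yp[b * n:(b + 1) * n]
        # Residuals after removing the local linear trend in each box:
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov += np.mean(rx * ry)
        vx += np.mean(rx * rx)
        vy += np.mean(ry * ry)
    return cov / np.sqrt(vx * vy)
```

By the Cauchy-Schwarz inequality over the concatenated residuals, the returned value necessarily lies in [-1, 1], which is the bound the abstract discusses.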
Ariza, Pedro; Solesio-Jofre, Elena; Martínez, Johann H.; Pineda-Pardo, José A.; Niso, Guiomar; Maestú, Fernando; Buldú, Javier M.
2015-01-01
In this study we used graph theory analysis to investigate age-related reorganization of functional networks during the active maintenance of information that is interrupted by external interference. Additionally, we sought to investigate network differences before and after averaging network parameters between both maintenance and interference windows. We compared young and older adults by measuring their magnetoencephalographic recordings during an interference-based working memory task restricted to successful recognitions. Data analysis focused on the topology/temporal evolution of functional networks during both the maintenance and interference windows. We observed that: (a) Older adults require higher synchronization between cortical brain sites in order to achieve a successful recognition, (b) The main differences between age groups arise during the interference window, (c) Older adults show reduced ability to reorganize network topology when interference is introduced, and (d) Averaging network parameters leads to a loss of sensitivity to detect age differences. PMID:26029079
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
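The adaptive-window idea can be sketched with a crude numeric analogue: pick the analysis window size that minimizes one-step-ahead error on the training data, then forecast with it. This stand-in uses a plain moving-average forecaster in place of the fuzzy rule base, so it illustrates only the window-adaptation step, not the ATVF model itself.

```python
import numpy as np

def best_window(train, candidates):
    """Choose the window size with the lowest one-step-ahead MSE on
    the training series (an analogue of ATVF's training-phase
    window adaptation; the real model forecasts with fuzzy rules)."""
    def mse(w):
        preds = [np.mean(train[i - w:i]) for i in range(w, len(train))]
        return np.mean((np.asarray(preds) - train[w:]) ** 2)
    return min(candidates, key=mse)

def forecast_next(series, window):
    """One-step forecast using the selected analysis window."""
    return np.mean(series[-window:])
```

On a strongly trending series the training error favors the shortest window, while on noisy mean-reverting data a longer averaging window wins; that dependence on the data is exactly why a fixed window size is limiting.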
Potential for Bias When Estimating Critical Windows for Air Pollution in Children's Health.
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-12-01
Evidence supports an association between maternal exposure to air pollution during pregnancy and children's health outcomes. Recent interest has focused on identifying critical windows of vulnerability. An analysis based on a distributed lag model (DLM) can yield estimates of a critical window that are different from those from an analysis that regresses the outcome on each of the 3 trimester-average exposures (TAEs). Using a simulation study, we assessed bias in estimates of critical windows obtained using 3 regression approaches: 1) 3 separate models to estimate the association with each of the 3 TAEs; 2) a single model to jointly estimate the association between the outcome and all 3 TAEs; and 3) a DLM. We used weekly fine-particulate-matter exposure data for 238 births in a birth cohort in and around Boston, Massachusetts, and a simulated outcome and time-varying exposure effect. Estimates using separate models for each TAE were biased and identified incorrect windows. This bias arose from seasonal trends in particulate matter that induced correlation between TAEs. Including all TAEs in a single model reduced bias. DLM produced unbiased estimates and added flexibility to identify windows. Analysis of body mass index z score and fat mass in the same cohort highlighted inconsistent estimates from the 3 methods. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
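The distributed lag approach can be sketched as an ordinary least-squares regression on all weekly exposures at once (unpenalized here; DLMs in this literature usually constrain the lag coefficients to vary smoothly over weeks). The simulated subjects, lags, and critical window below are illustrative, not the cohort's.

```python
import numpy as np

def distributed_lag_fit(exposure, outcome):
    """OLS distributed lag model: regress the outcome on the exposure
    at every lag simultaneously. `exposure` has shape
    (n_subjects, n_lags); the returned coefficient per lag traces
    the critical window of vulnerability."""
    X = np.column_stack([np.ones(exposure.shape[0]), exposure])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1:]                      # drop the intercept
```

Because all lags enter one model, correlation between adjacent weeks (e.g., from seasonal trends) is adjusted for, which is why this joint fit avoids the bias that separate per-trimester regressions incur.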
Phase locking route behind complex periodic windows in a forced oscillator
NASA Astrophysics Data System (ADS)
Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei
2013-09-01
Chaotic systems have complex reactions to an external driving force; even for low-dimensional oscillators, the routes to synchronization are diverse. We propose a stroboscope-based method for analyzing driven chaotic systems in their phase space. From two statistical quantities generated from the time series, we can determine the system state and the driving behavior simultaneously. We demonstrate our method on a driven bi-stable system, which shows complex periodic windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through a chaotic state can be found. Periodic windows can also be identified, and the circumstances under which they occur distinguished. The statistical results are supported by conditional Lyapunov exponent analysis, showing the method's power in analyzing unknown time series.
NASA Astrophysics Data System (ADS)
Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.
2017-04-01
A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance and extreme monthly values. A 20-year moving time window, shifted successively by one year, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonally windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using maximum likelihood estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly in the last decades. This increase did not occur simultaneously throughout the region; some locations exhibited continuous increases only in the minimum QT.
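The moving-window Gamma fit can be sketched with SciPy's MLE fitter (location pinned at zero, as is common for strictly positive discharge data; the bootstrap confidence step is omitted, and the window and step sizes mirror the ones stated above):

```python
import numpy as np
from scipy import stats

def windowed_gamma_params(monthly, window_years=20, step_years=1):
    """Fit a Gamma distribution by MLE (loc fixed at 0) to each
    moving window of a monthly streamflow series, returning the
    shape- and scale-parameter tracks across windows."""
    w, s = window_years * 12, step_years * 12
    shapes, scales = [], []
    for start in range(0, len(monthly) - w + 1, s):
        a, loc, scale = stats.gamma.fit(monthly[start:start + w], floc=0)
        shapes.append(a)
        scales.append(scale)
    return np.array(shapes), np.array(scales)
```

A trend test applied to the shape or scale track then reveals the kind of temporal shift in the streamflow distribution the study reports.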
2014-10-16
Time-frequency analysis, short-time Fourier transform, Wigner-Ville distribution, Fourier-Bessel transform, fractional Fourier transform. I...INTRODUCTION. The most widely used time-frequency transforms are the short-time Fourier transform (STFT) and the Wigner-Ville distribution (WVD). In the STFT, time and frequency resolutions are limited by the size of the window function used in calculating the STFT. For mono-component signals, the WVD gives the best time and frequency
Rapacchi, Stanislas; Wen, Han; Viallon, Magalie; Grenier, Denis; Kellman, Peter; Croisille, Pierre; Pai, Vinay M
2011-12-01
Diffusion-weighted imaging (DWI) using low b-values permits imaging of intravoxel incoherent motion in tissues. However, low b-value DWI of the human heart has been considered too challenging because of additional signal loss due to physiological motion, which reduces both signal intensity and the signal-to-noise ratio (SNR). We address these signal loss concerns by analyzing cardiac motion during a heartbeat to determine the time-window during which cardiac bulk motion is minimal. Using this information to optimize the acquisition of DWI data and combining it with a dedicated image processing approach has enabled us to develop a novel low b-value diffusion-weighted cardiac magnetic resonance imaging approach, which significantly reduces intravoxel incoherent motion measurement bias introduced by motion. Simulations from displacement encoded motion data sets permitted the delineation of an optimal time-window with minimal cardiac motion. A number of single-shot repetitions of low b-value DWI cardiac magnetic resonance imaging data were acquired during this time-window under free-breathing conditions with bulk physiological motion corrected for by using nonrigid registration. Principal component analysis (PCA) was performed on the registered images to improve the SNR, and temporal maximum intensity projection (TMIP) was applied to recover signal intensity from time-fluctuant motion-induced signal loss. This PCATMIP method was validated with experimental data, and its benefits were evaluated in volunteers before being applied to patients. Optimal time-window cardiac DWI in combination with PCATMIP postprocessing yielded significant benefits for signal recovery, contrast-to-noise ratio, and SNR in the presence of bulk motion for both numerical simulations and human volunteer studies. Analysis of mean apparent diffusion coefficient (ADC) maps showed homogeneous values among volunteers and good reproducibility between free-breathing and breath-hold acquisitions. 
The PCATMIP DWI approach also indicated its potential utility by detecting ADC variations in acute myocardial infarction patients. Studying cardiac motion may provide an appropriate strategy for minimizing the impact of bulk motion on cardiac DWI. Applying PCATMIP image processing improves low b-value DWI and enables reliable analysis of ADC in the myocardium. The use of a limited number of repetitions in a free-breathing mode also enables easier application in clinical conditions.
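A minimal sketch of the PCATMIP idea, PCA denoising across registered repetitions followed by a temporal maximum intensity projection, might look as follows; the number of retained components and the plain SVD-based PCA are simplifying assumptions, not the authors' implementation:

```python
import numpy as np

def pcatmip(frames, n_components=3):
    """PCA denoising across repeated single-shot images, then a temporal
    maximum intensity projection (TMIP). `frames` has shape (n_rep, ny, nx)
    and is assumed already co-registered. Simplified sketch."""
    n_rep, ny, nx = frames.shape
    X = frames.reshape(n_rep, -1)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    s[n_components:] = 0                      # keep leading temporal modes only
    denoised = (U * s) @ Vt + mean
    # Pixelwise maximum over repetitions recovers signal lost to bulk motion
    # in any single shot.
    return denoised.reshape(n_rep, ny, nx).max(axis=0)
```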
Software for Allocating Resources in the Deep Space Network
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John
2003-01-01
TIGRAS 2.0 is a computer program designed to satisfy a need for improved means of analyzing the tracking demands that interplanetary space-flight missions place upon the set of ground antenna resources of the Deep Space Network (DSN), and for allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel, connecting to various data sources (relational databases or files) depending on the stage of the analysis being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected via SOAP/XML to a Web services server that provides analysis services over the World Wide Web. TIGRAS 2.0 supports multiple windows and multiple panes in each window for users to view and use information, all in the same environment, eliminating repeated switching among various application programs and Web pages. TIGRAS 2.0 provides windows for viewing mission requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.
Maturation of the P3 and concurrent oscillatory processes during adolescence.
Mathes, Birgit; Khalaidovski, Ksenia; Wienke, Annika S; Schmiedt-Fehr, Christina; Basar-Eroglu, Canan
2016-07-01
During adolescence event-related modulations of the neural response may increase. For slow event-related components, such as the P3, this developmental change may be masked due to increased amplitude levels of ongoing delta and theta oscillations in adolescents. In a cross-sectional study design, EEG was measured in 51 participants between 13 and 24 years of age. A visual oddball paradigm was used to elicit the P3. Our analysis focused on fronto-parietal activations within the P3 time-window and the concurrent time-frequency characteristics in the delta (∼0.5-4 Hz) and theta (∼4-7 Hz) bands. The parietal P3 amplitude was similar across the investigated age range, while the amplitude at frontal regions increased with age. The pre-stimulus amplitudes of delta and theta oscillations declined with age, while post-stimulus amplitude enhancement and inter-trial phase coherence increased. These changes affected fronto-parietal electrode sites. The parietal P3 maximum seemed comparable for adolescents and young adults. Detailed analysis revealed that within the P3 time-window brain maturation during adolescence may lead to reduced spontaneous slow-wave oscillations, increased amplitude modulation and time precision of event-related oscillations, and altered P3 scalp topography. Time-frequency analyses may help to distinguish selective neurodevelopmental changes within the P3 time window. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Online frequency estimation with applications to engine and generator sets
NASA Astrophysics Data System (ADS)
Manngård, Mikael; Böling, Jari M.
2017-07-01
Frequency and spectral analysis based on the discrete Fourier transform is a fundamental task in signal processing and machine diagnostics. This paper presents computationally efficient methods for real-time estimation of stationary and time-varying frequency components in signals. A brief survey of the sliding time window discrete Fourier transform and the Goertzel filter is presented, and two filter banks, consisting of (i) sliding time window Goertzel filters and (ii) infinite impulse response narrow bandpass filters, are proposed for estimating instantaneous frequencies. The proposed methods show excellent results both in simulation studies and in a case study using angular speed measurements of the crankshaft of a marine diesel engine-generator set.
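The Goertzel filter surveyed in the paper evaluates a single DFT bin with one second-order recursion; a textbook sketch (not the authors' filter-bank code) is:

```python
import numpy as np

def goertzel(x, k):
    """Magnitude of the k-th DFT bin of x via the Goertzel recursion.
    Cheaper than a full FFT when only a few bins are needed."""
    n = len(x)
    coeff = 2 * np.cos(2 * np.pi * k / n)
    s_prev = s_prev2 = 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Standard end-of-block magnitude formula: |X[k]|^2.
    power = s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
    return np.sqrt(power)
```

In a sliding-window setting the recursion is simply rerun (or updated) on each new window of samples, one filter per monitored frequency.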
Ultra-high resolution water window x ray microscope optics design and analysis
NASA Technical Reports Server (NTRS)
Shealy, David L.; Wang, C.
1993-01-01
This project has been focused on the design and analysis of an ultra-high resolution water window soft-x-ray microscope. These activities were accomplished by completing the two tasks contained in the statement of work of this contract. The new results from this work confirm: (1) that in order to achieve resolutions better than three times the wavelength of the incident radiation, it will be necessary to use spherical mirror surfaces and graded multilayer coatings on the secondary, to accommodate the large variations of the angle of incidence over the secondary when operating the microscope at numerical apertures of 0.35 or greater; (2) that surface contour errors will have a significant effect on the optical performance of the microscope and must be controlled to a peak-to-valley variation of 50-100 Å and a frequency of 8 periods over the surface of a mirror; and (3) that tolerance analysis of the spherical Schwarzschild microscope has shown that water window operation will require 2-3 times tighter tolerances to achieve performance similar to operation with 130 Å radiation. These results are included in a manuscript provided in the appendix.
Nonuniform Effects of Reinstatement within the Time Window
ERIC Educational Resources Information Center
Galluccio, Llissa; Rovee-Collier, Carolyn
2006-01-01
A time window is a limited period after an event initially occurs in which additional information can be integrated with the memory of that event. It shuts when the memory is forgotten. The time window hypothesis holds that the impact of a manipulation at different points within the time window is nonuniform. In two operant conditioning…
Due-Window Assignment Scheduling with Variable Job Processing Times
Wu, Yu-Bin
2015-01-01
We consider a common due-window assignment scheduling problem with variable job processing times on a single machine, where the processing time of a job is a function of its position in the sequence (i.e., a learning effect) or of its starting time (i.e., a deteriorating effect). The problem is to determine the optimal due window and the processing sequence simultaneously so as to minimize a cost function that includes earliness, tardiness, due-window location, due-window size, and the weighted number of tardy jobs. We prove that the problem can be solved in polynomial time. PMID:25918745
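For illustration, a positional learning effect is often modeled as p_j(r) = p_j r^a with r the job's position and a ≤ 0; the exponent below is a hypothetical value, and this is a generic model rather than necessarily the one used in the paper:

```python
def actual_times(base_times, a=-0.3):
    """Actual processing times under a positional learning effect,
    p_j(r) = p_j * r**a (a <= 0): later positions process faster.
    The exponent is illustrative."""
    return [p * (r ** a) for r, p in enumerate(base_times, start=1)]
```

With a = -1, for example, a job in position 2 takes half its base time.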
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into a training cohort (n = 518) and a testing cohort (n = 127) according to the measurement time. Results: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
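A simplified sketch of LASSO-based wavenumber-window scoring follows; it uses fixed-width windows and synthetic spectra rather than the correlation-determined, variably-sized windows of the study, and the regularization strength is an arbitrary choice:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_spectra, n_wavenumbers, width = 200, 300, 20

# Synthetic "spectra": two informative bands plus noise (illustration only).
X = rng.normal(size=(n_spectra, n_wavenumbers))
y = (X[:, 40:60].mean(axis=1) - X[:, 200:220].mean(axis=1)
     + 0.1 * rng.normal(size=n_spectra))

lasso = Lasso(alpha=0.01).fit(X, y)

# Score each fixed-width wavenumber window by its selected-coefficient mass.
scores = [np.abs(lasso.coef_[i:i + width]).sum()
          for i in range(0, n_wavenumbers, width)]
top_windows = np.argsort(scores)[::-1][:2]   # indices of the two best windows
```

The windows covering the informative bands should receive the largest scores; a classifier (PLS, discriminant analysis, etc.) would then be trained only on the selected windows.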
Seismpol: a Visual-Basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition method (CMD), so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by decomposing a selected signal window into a series of narrow frequency bands. Significant results supported by well defined polarizations and source azimuth estimates for P and S phases are also obtained for short-period seismic events (local microearthquakes).
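The covariance-based polarization measures used by such programs can be sketched as follows; the rectilinearity formula and the (Z, N, E) component ordering are common conventions assumed here, not taken from Seismpol itself:

```python
import numpy as np

def polarization(window):
    """Covariance-based polarization attributes for one time window of
    three-component data, shape (n_samples, 3) ordered as (Z, N, E)."""
    c = np.cov(window, rowvar=False)           # 3x3 covariance matrix
    evals, evecs = np.linalg.eigh(c)           # eigenvalues in ascending order
    l1, l2, l3 = evals[::-1]
    # Rectilinearity: 1 for purely linear particle motion, 0 for isotropic.
    rectilinearity = 1.0 - (l2 + l3) / (2 * l1)
    # Azimuth of the dominant eigenvector, from its N and E components.
    principal = evecs[:, 2]
    azimuth = np.degrees(np.arctan2(principal[2], principal[1]))
    return rectilinearity, azimuth
```

Sliding this over the record (per narrow frequency band, after band-pass filtering) gives the continuous polarization picture described in the abstract.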
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, J.; Moon, T.J.; Howell, J.R.
This paper presents an analysis of the heat transfer occurring during an in-situ curing process in which infrared energy is applied to the surface of the polymer composite during winding. The material system is Hercules prepreg AS4/3501-6. Thermoset composites undergo an exothermic chemical reaction during the curing process. An Eulerian thermochemical model is developed for the heat transfer analysis of helical winding. The model incorporates heat generation due to the chemical reaction. Several assumptions are made leading to a two-dimensional thermochemical model. For simplicity, 360° heating around the mandrel is considered. In order to generate the appropriate process windows, the developed heat transfer model is combined with a simple winding time model. The process windows allow for a proper selection of process variables such as infrared energy input and winding velocity to give a desired end-product state. Steady-state temperatures are found for each combination of the process variables. A regression analysis is carried out to relate the process variables to the resulting steady-state temperatures. Using the regression equations, process windows for a wide range of cylinder diameters are found. A general procedure to find process windows for Hercules AS4/3501-6 prepreg tape is coded in a FORTRAN program.
Seismic facies analysis based on self-organizing map and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian
2015-01-01
Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time windows has an obvious effect on the validity of the classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its tolerance to noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
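A minimal 1D grid SOM, the clustering stage of the method, can be sketched in NumPy; the EMD de-noising step is omitted, and the learning-rate and neighborhood schedules are illustrative:

```python
import numpy as np

def train_som_1d(data, n_nodes=8, n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 1-D grid self-organizing map on waveform vectors.
    `data` has shape (n_waveforms, n_samples). Minimal sketch."""
    rng = np.random.default_rng(seed)
    nodes = data[rng.choice(len(data), n_nodes, replace=False)].astype(float)
    grid = np.arange(n_nodes)
    for it in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))      # best-matching unit
        frac = it / n_iter
        lr = lr0 * (1 - frac)                                # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5                    # shrinking neighborhood
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))  # neighborhood weights
        nodes += lr * h[:, None] * (x - nodes)
    return nodes

def classify(data, nodes):
    """Assign each waveform to its nearest SOM node (its facies class)."""
    d = ((data[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```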
International Ultraviolet Explorer (IUE) satellite mission analysis
NASA Technical Reports Server (NTRS)
Cook, R. A.; Griffin, J. H.
1975-01-01
The results are presented of the mission analysis performed by Computer Sciences Corporation (CSC) in support of the International Ultraviolet Explorer (IUE) satellite. The launch window is open for three separate periods (for a total time of 7 months) during the year extending from July 20, 1977, to July 20, 1978. The synchronous orbit shadow constraint limits the launch window to approximately 88 minutes per day. Apogee boost motor fuel was computed to be 455 pounds (206 kilograms) and on-station weight was 931 pounds (422 kilograms). The target orbit is elliptical synchronous, with eccentricity 0.272 and 24 hour period.
Near real-time vaccine safety surveillance with partially accrued data.
Greene, Sharon K; Kulldorff, Martin; Yin, Ruihua; Yih, W Katherine; Lieu, Tracy A; Weintraub, Eric S; Lee, Grace M
2011-06-01
The Vaccine Safety Datalink (VSD) Project conducts near real-time vaccine safety surveillance using sequential analytic methods. Timely surveillance is critical in identifying potential safety problems and preventing additional exposure before most vaccines are administered. For vaccines that are administered during a short period, such as influenza vaccines, timeliness can be improved by undertaking analyses while risk windows following vaccination are ongoing and by accommodating predictable and unpredictable data accrual delays. We describe practical solutions to these challenges, which were adopted by the VSD Project during pandemic and seasonal influenza vaccine safety surveillance in 2009/2010. Adjustments were made to two sequential analytic approaches. The Poisson-based approach compared the number of pre-defined adverse events observed following vaccination with the number expected using historical data. The expected number was adjusted for the proportion of the risk window elapsed and the proportion of inpatient data estimated to have accrued. The binomial-based approach used a self-controlled design, comparing the observed numbers of events in risk versus comparison windows. Events were included in analysis only if they occurred during a week that had already passed for both windows. Analyzing data before risk windows fully elapsed improved the timeliness of safety surveillance. Adjustments for data accrual lags were tailored to each data source and avoided biasing analyses away from detecting a potential safety problem, particularly early during surveillance. The timeliness of vaccine and drug safety surveillance can be improved by properly accounting for partially elapsed windows and data accrual delays. Copyright © 2011 John Wiley & Sons, Ltd.
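The Poisson-based adjustment described above, scaling the expected count by the elapsed fraction of the risk window and the estimated data-accrual fraction, can be sketched as follows; the function and its arguments are a hypothetical illustration, not the VSD Project's code:

```python
def adjusted_expected(historical_rate, n_vaccinated, risk_window_days,
                      days_elapsed, accrual_fraction):
    """Expected number of adverse events for a Poisson-based sequential test,
    scaled by (i) the proportion of the risk window already elapsed and
    (ii) the estimated proportion of data that has accrued. Illustrative."""
    window_fraction = min(days_elapsed / risk_window_days, 1.0)
    return historical_rate * n_vaccinated * window_fraction * accrual_fraction
```

The observed count is then compared against this deflated expectation, so an unfinished risk window or lagging data feed does not bias the test away from signaling.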
47 CFR 15.323 - Specific requirements for devices operating in the 1920-1930 MHz sub-band.
Code of Federal Regulations, 2010 CFR
2010-10-01
...] (c) Devices must incorporate a mechanism for monitoring the time and spectrum windows that its... transmission, devices must monitor the combined time and spectrum windows in which they intend to transmit for... windows without further monitoring. However, occupation of the same combined time and spectrum windows by...
Windowed and Wavelet Analysis of Marine Stratocumulus Cloud Inhomogeneity
NASA Technical Reports Server (NTRS)
Gollmer, Steven M.; Harshvardhan; Cahalan, Robert F.; Snider, Jack B.
1995-01-01
To improve radiative transfer calculations for inhomogeneous clouds, a consistent means of modeling inhomogeneity is needed. One current method of modeling cloud inhomogeneity is through the use of fractal parameters. This method is based on the supposition that cloud inhomogeneity over a large range of scales is related. An analysis technique named wavelet analysis provides a means of studying the multiscale nature of cloud inhomogeneity. In this paper, the authors discuss the analysis and modeling of cloud inhomogeneity through the use of wavelet analysis. Wavelet analysis as well as other windowed analysis techniques are used to study liquid water path (LWP) measurements obtained during the marine stratocumulus phase of the First ISCCP (International Satellite Cloud Climatology Project) Regional Experiment. Statistics obtained using analysis windows, which are translated to span the LWP dataset, are used to study the local (small scale) properties of the cloud field as well as their time dependence. The LWP data are transformed onto an orthogonal wavelet basis that represents the data as a number of time series. Each of these time series lies within a frequency band and has a mean frequency that is half the frequency of the previous band. Wavelet analysis combined with translated analysis windows reveals that the local standard deviation of each frequency band is correlated with the local standard deviation of the other frequency bands. The ratio between the standard deviations of adjacent frequency bands is 0.9 and remains constant with respect to time. This ratio, defined as the variance coupling parameter, is applicable to all of the frequency bands studied and appears to be related to the slope of the data's power spectrum. Similar analyses are performed on two cloud inhomogeneity models, which use fractal-based concepts to introduce inhomogeneity into a uniform cloud field.
The bounded cascade model does this by iteratively redistributing LWP at each scale using the value of the local mean. This model is reformulated into a wavelet multiresolution framework, thereby presenting a number of variants of the bounded cascade model. One variant introduced in this paper is the 'variance coupled model,' which redistributes LWP using the local standard deviation and the variance coupling parameter. While the bounded cascade model provides an elegant two- parameter model for generating cloud inhomogeneity, the multiresolution framework provides more flexibility at the expense of model complexity. Comparisons are made with the results from the LWP data analysis to demonstrate both the strengths and weaknesses of these models.
Minimal Window Duration for Accurate HRV Recording in Athletes.
Bourdillon, Nicolas; Schmitt, Laurent; Yazdani, Sasan; Vesin, Jean-Marc; Millet, Grégoire P
2017-01-01
Heart rate variability (HRV) is non-invasive and commonly used for monitoring responses to training loads, fitness, or overreaching in athletes. Yet, the recording duration for a series of RR-intervals varies from 1 to 15 min in the literature. The aim of the present work was to assess the minimum record duration to obtain reliable HRV results. RR-intervals from 159 orthostatic tests (7 min supine, SU, followed by 6 min standing, ST) were analyzed. Reference windows were 4 min in SU (min 3-7) and 4 min in ST (min 9-13). Those windows were subsequently divided and the analyses were repeated on eight different fractioned windows: the first min (0-1), the second min (1-2), the third min (2-3), the fourth min (3-4), the first 2 min (0-2), the last 2 min (2-4), the first 3 min (0-3), and the last 3 min (1-4). Correlation and Bland & Altman statistical analyses were systematically performed. The analysis window could be shortened to 0-2 instead of 0-4 for RMSSD only, whereas the 4-min window was necessary for LF and total power. Since there is a need for 1 min of baseline to obtain a steady signal prior the analysis window, we conclude that studies relying on RMSSD may shorten the windows to 3 min (= 1+2) in SU or seated position only and to 6 min (= 1+2 min SU plus 1+2 min ST) if there is an orthostatic test. Studies relying on time- and frequency-domain parameters need a minimum of 5 min (= 1+4) min SU or seated position only but require 10 min (= 1+4 min SU plus 1+4 min ST) for the orthostatic test.
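For reference, RMSSD and time-based window selection on an RR-interval series can be computed as follows (standard definitions, not the authors' analysis code):

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diff = np.diff(np.asarray(rr_ms, dtype=float))
    return np.sqrt(np.mean(diff ** 2))

def window_by_time(rr_ms, t_start_s, t_end_s):
    """Select RR intervals whose cumulative time falls in [t_start, t_end) s,
    e.g. minutes 3-7 of a supine recording."""
    t = np.cumsum(np.asarray(rr_ms, dtype=float)) / 1000.0
    mask = (t >= t_start_s) & (t < t_end_s)
    return np.asarray(rr_ms)[mask]
```

The paper's comparison amounts to computing such metrics on the reference 4-min window and on its sub-windows, then checking agreement (correlation, Bland-Altman).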
Current status of the real-time processing of complex radar signatures
NASA Astrophysics Data System (ADS)
Clay, E.
The real-time processing technique developed by ONERA to characterize radar signatures at the Brahms station is described. This technique is used for the real-time analysis of the RCS of airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys. Using this technique, it is also possible to optimize the experimental parameters, i.e., the analysis band, the microwave-network gain, and the electromagnetic window of the analysis.
Intensive care window: real-time monitoring and analysis in the intensive care environment.
Stylianides, Nikolas; Dikaiakos, Marios D; Gjermundrød, Harald; Panayi, George; Kyprianou, Theodoros
2011-01-01
This paper introduces a novel, open-source middleware framework for communication with medical devices and an application using the middleware named intensive care window (ICW). The middleware enables communication with intensive care unit bedside-installed medical devices over standard and proprietary communication protocol stacks. The ICW application facilitates the acquisition of vital signs and physiological parameters exported from patient-attached medical devices and sensors. Moreover, ICW provides runtime and post-analysis procedures for data annotation, data visualization, data query, and analysis. The ICW application can be deployed as a stand-alone solution or in conjunction with existing clinical information systems providing a holistic solution to inpatient medical condition monitoring, early diagnosis, and prognosis.
Windowed multitaper correlation analysis of multimodal brain monitoring parameters.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in the daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates cerebral perfusion and oxygen supply in the case of severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method so that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, represents a high-quality predictor of patient outcome.
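The windowing-plus-coherence step can be sketched with SciPy; the window length, step, frequency band, and Welch segment length below are illustrative choices, not the study's parameters:

```python
import numpy as np
from scipy.signal import coherence

def windowed_coherence(abp, icp, fs, win_s=300, step_s=60, band=(0.003, 0.05)):
    """Time-resolved ABP-ICP coherence in a low-frequency band, computed on
    sliding windows. A simplified stand-in for the paper's method."""
    win, step = int(win_s * fs), int(step_s * fs)
    out = []
    for start in range(0, len(abp) - win + 1, step):
        f, cxy = coherence(abp[start:start + win], icp[start:start + win],
                           fs=fs, nperseg=win // 4)
        sel = (f >= band[0]) & (f <= band[1])
        out.append(cxy[sel].mean())   # mean coherence in the band, per window
    return np.array(out)
```

Windows whose band-averaged coherence exceeds a statistically calibrated threshold would then be passed to the Hilbert phase-difference step.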
Weighted combination of LOD values splitted into frequency windows
NASA Astrophysics Data System (ADS)
Fernandez, L. I.; Gambis, D.; Arias, E. F.
In this analysis a one-day combined time series of LOD (length-of-day) estimates is presented. We use individual data series derived by 7 GPS and 3 SLR analysis centers, which routinely contributed to the IERS database over a recent 27-month period (Jul 1996 - Oct 1998). The result is compared to the multi-technique combined series C04 produced by the Central Bureau of the IERS, which is commonly used as a reference in the study of Earth rotation variations. The Frequency Windows Combined Series procedure yields a time series close to C04, but with an amplitude difference that might explain the periodic behavior evident in the differences between these two combined series. This method could be useful for generating a new time series to serve as a reference in studies of the high-frequency variations of the Earth's rotation.
NASA Astrophysics Data System (ADS)
Simonotto, Jennifer; Furman, Michael; Beaver, Thomas; Spano, Mark; Kavanagh, Katherine; Iden, Jason; Hu, Gang; Ditto, William
2004-03-01
Explanted porcine hearts were Langendorff-perfused, administered a voltage-sensitive fluorescent dye (Di-4-ANEPPS), and illuminated with an Nd:YAG laser (532 nm); the change in fluorescence resulting from electrical activity on the heart surface was recorded with an 80 x 80 pixel CCD camera at 1000 frames per second. The heart was put into fibrillation with rapid ventricular pacing and shocks were administered close to the defibrillation threshold. Defibrillation failure data were analyzed using synchronization, space-time volume plots, and recurrence quantification. Preliminary spatiotemporal synchronization results reveal a short window of time (about 1 second) after defibrillation failure in which the disordered electrical activity becomes ordered; this ordered period occurs 4-5 seconds after the defibrillation shock. Recurrence analysis of a single time series confirmed these results, thus opening the avenue for dynamic defibrillators that can detect an optimal window for cardioversion.
Single-machine common/slack due window assignment problems with linear decreasing processing times
NASA Astrophysics Data System (ADS)
Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia
2017-08-01
This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.
Pereira, Telma; Lemos, Luís; Cardoso, Sandra; Silva, Dina; Rodrigues, Ana; Santana, Isabel; de Mendonça, Alexandre; Guerreiro, Manuela; Madeira, Sara C
2017-07-19
Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process. This hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stages. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows, thus fine-tuning the prognosis regarding the time to conversion. In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4 and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, in which all patients are used to learn the models, referred to as the First Last approach. This makes it possible to move from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia in a specific time window?". The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Prognostic models using time windows have higher performance when predicting progression from MCI to dementia, when compared to the prognostic approach commonly used in the literature.
Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than sometime in the future and allowing clinicians to timely adjust treatments and clinical appointments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ojeda-Gonzalez, A.; Prestes, A.; Klausner, V.
Spatio-temporal entropy (STE) analysis is used as an alternative mathematical tool to identify possible magnetic cloud (MC) candidates. We analyze Interplanetary Magnetic Field (IMF) data using a time interval of only 10 days. We select a convenient data interval of 2500 records, moving forward in 200-record steps until the end of the time series. For every data segment, the STE is calculated at each step. During an MC event, the STE reaches values close to zero. This extremely low value of STE is due to features of the MC structure. However, not all of the magnetic components in MCs have STE values close to zero at the same time. For this reason, we create a standardization index (the so-called Interplanetary Entropy, IE, index). This index is a worthwhile effort to develop new tools to help diagnose ICME structures. The IE was calculated using a time window of one year (1999), and it has a success rate of 70% relative to other identifiers of MCs. The unsuccessful cases (30%) are caused by small and weak MCs. The results show that the IE methodology identified 9 of 13 MCs, and emitted nine false alarm cases. In 1999, a total of 788 windows of 2500 values existed, meaning that the percentage of false alarms was 1.14%, which can be considered a good result. In addition, four time windows, each of 10 days, are studied, in which the IE method was effective in finding MC candidates. As a novel result, two new MCs are identified in these time windows.
Least Squares Moving-Window Spectral Analysis.
Lee, Young Jong
2017-08-01
Least squares regression is proposed as a moving-window method for analysis of a series of spectra acquired as a function of an external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
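A minimal sketch of the LSMW idea, assuming a simple first-degree fit per window (the method's degree and window choices may differ): within each moving window of (perturbation, intensity) points, fit a straight line by least squares and take its slope as the derivative estimate. Unlike fixed Savitzky-Golay coefficients, this handles nonuniform perturbation spacing directly.

```python
def lsmw_slopes(x, y, window=5):
    """Derivative estimates from a least squares linear fit (y = a + b*x)
    in a moving window; returns the slope b at each interior point."""
    half = window // 2
    slopes = []
    for i in range(half, len(x) - half):
        xs = x[i - half:i + half + 1]
        ys = y[i - half:i + half + 1]
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(v * v for v in xs)
        sxy = sum(a * b for a, b in zip(xs, ys))
        slopes.append((n * sxy - sx * sy) / (n * sxx - sx * sx))
    return slopes

# Nonuniformly spaced perturbation axis; y = 3x + 1, so every fit recovers 3.
x = [0.0, 0.7, 1.1, 2.0, 2.4, 3.3, 4.0]
y = [3 * v + 1 for v in x]
print(lsmw_slopes(x, y))
```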
Time-localized wavelet multiple regression and correlation
NASA Astrophysics Data System (ADS)
Fernández-Macho, Javier
2018-02-01
This paper extends wavelet methodology to handle comovement dynamics of multivariate time series via moving weighted regression on wavelet coefficients. The concept of wavelet local multiple correlation is used to produce one single set of multiscale correlations along time, in contrast with the large number of wavelet correlation maps that need to be compared when using standard pairwise wavelet correlations with rolling windows. Also, the spectral properties of weight functions are investigated and it is argued that some common time windows, such as the usual rectangular rolling window, are not satisfactory on these grounds. The method is illustrated with a multiscale analysis of the comovements of Eurozone stock markets during this century. It is shown how the evolution of the correlation structure in these markets has been far from homogeneous both along time and across timescales featuring an acute divide across timescales at about the quarterly scale. At longer scales, evidence from the long-term correlation structure can be interpreted as stable perfect integration among Euro stock markets. On the other hand, at intramonth and intraweek scales, the short-term correlation structure has been clearly evolving along time, experiencing a sharp increase during financial crises which may be interpreted as evidence of financial 'contagion'.
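The contrast between a rectangular rolling window and a smoother weight function can be illustrated with a generic moving weighted correlation; this is a simplification for a single pair of series, not the paper's wavelet local multiple correlation:

```python
import math

def weighted_corr(x, y, w):
    """Pearson correlation with observation weights w."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = vx = vy = 0.0
    for wi, xi, yi in zip(w, x, y):
        cov += wi * (xi - mx) * (yi - my)
        vx += wi * (xi - mx) ** 2
        vy += wi * (yi - my) ** 2
    return cov / math.sqrt(vx * vy)

def moving_corr(x, y, weights):
    """Correlation along time in a moving window; `weights` lets the
    window taper instead of being plain rectangular."""
    width = len(weights)
    return [weighted_corr(x[i:i + width], y[i:i + width], weights)
            for i in range(len(x) - width + 1)]

x = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 1.0]
y = [2 * v for v in x]            # perfectly comoving series
taper = [0.5, 1.0, 1.0, 0.5]      # smoother than a rectangular window
print([round(c, 3) for c in moving_corr(x, y, taper)])
```

Tapered weights down-weight the window edges, which reduces the spectral leakage the paper attributes to the rectangular rolling window.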
Characterizing Detrended Fluctuation Analysis of multifractional Brownian motion
NASA Astrophysics Data System (ADS)
Setty, V. A.; Sharma, A. S.
2015-02-01
The Hurst exponent (H) is widely used to quantify long range dependence in time series data and is estimated using several well known techniques. Recognizing its ability to remove trends, Detrended Fluctuation Analysis (DFA) is used extensively to estimate the Hurst exponent in non-stationary data. Multifractional Brownian motion (mBm) broadly encompasses a set of models of non-stationary data exhibiting a time-varying Hurst exponent, H(t), as opposed to a constant H. Recently, there has been a growing interest in the time dependence of H(t), and sliding window techniques have been used to estimate a local time average of the exponent. This brought to the fore the ability of DFA to estimate scaling exponents in systems with time-varying H(t), such as mBm. This paper characterizes the performance of DFA on mBm data with linearly varying H(t) and further tests the robustness of the estimated time average with respect to data- and technique-related parameters. Our results serve as a benchmark for using DFA as a sliding window estimator to obtain H(t) from time series data.
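A sketch of the DFA fluctuation function at a single box size; full DFA estimates H as the slope of log F(box) versus log box over several box sizes, and a sliding-window H(t) estimate repeats that fit within each window:

```python
def dfa_fluctuation(series, box):
    """Detrended fluctuation F at one box size: integrate the
    mean-subtracted series, remove a least squares linear trend in each
    non-overlapping box, and return the RMS of the residuals."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for v in series:
        s += v - mean
        profile.append(s)
    n_boxes = len(profile) // box
    xs = list(range(box))
    sx = sum(xs)
    sxx = sum(v * v for v in xs)
    sq = 0.0
    for b in range(n_boxes):
        seg = profile[b * box:(b + 1) * box]
        sy = sum(seg)
        sxy = sum(a * c for a, c in zip(xs, seg))
        slope = (box * sxy - sx * sy) / (box * sxx - sx * sx)
        intercept = (sy - slope * sx) / box
        sq += sum((yv - (intercept + slope * xv)) ** 2
                  for xv, yv in zip(xs, seg))
    return (sq / (n_boxes * box)) ** 0.5

print(dfa_fluctuation([5.0] * 32, box=8))   # constant series -> 0.0
```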
Prevalence of Imaging Biomarkers to Guide the Planning of Acute Stroke Reperfusion Trials.
Jiang, Bin; Ball, Robyn L; Michel, Patrik; Jovin, Tudor; Desai, Manisha; Eskandari, Ashraf; Naqvi, Zack; Wintermark, Max
2017-06-01
Imaging biomarkers are increasingly used as selection criteria for stroke clinical trials. The goal of our study was to determine the prevalence of commonly studied imaging biomarkers in different time windows after acute ischemic stroke onset to better facilitate the design of stroke clinical trials using such biomarkers for patient selection. This retrospective study included 612 patients admitted with a clinical suspicion of acute ischemic stroke with symptom onset no more than 24 hours before completing baseline imaging. Patients with subacute/chronic/remote infarcts and hemorrhage were excluded from this study. Imaging biomarkers were extracted from baseline imaging, which included a noncontrast head computed tomography (CT), perfusion CT, and CT angiography. The prevalence of dichotomized versions of each of the imaging biomarkers in several time windows (time since symptom onset) was assessed and statistically modeled to assess time dependence (or lack thereof). We created tables showing the prevalence of the imaging biomarkers pertaining to the core, the penumbra and the arterial occlusion for different time windows. All continuous imaging features vary over time. The dichotomized imaging features that vary significantly over time include: noncontrast head computed tomography Alberta Stroke Program Early CT (ASPECT) score and dense artery sign, perfusion CT infarct volume, and CT angiography collateral score and visible clot. The dichotomized imaging features that did not vary significantly over time include the thresholded perfusion CT penumbra volumes. As part of the feasibility analysis in stroke clinical trials, this analysis and the resulting tables can help investigators determine sample size and the number needed to screen. © 2017 American Heart Association, Inc.
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the development of a solution methodology to compute an input schedule that yields the minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. This dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm also runs in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions. 
Only 25% of the tested problems exceeded the optimal values by more than 30%, and approximately 40% of the tested problems were solved optimally by the algorithms.
Bullich, Santiago; Barthel, Henryk; Koglin, Norman; Becker, Georg A; De Santi, Susan; Jovalekic, Aleksandar; Stephens, Andrew W; Sabri, Osama
2017-11-24
Accurate amyloid PET quantification is necessary for monitoring amyloid-beta accumulation and response to therapy. Currently, most studies are analyzed using the static standardized uptake value ratio (SUVR) approach because of its simplicity. However, this approach may be influenced by changes in cerebral blood flow (CBF) or radiotracer clearance. Full tracer kinetic models require arterial blood sampling and dynamic image acquisition. The objectives of this work were: (1) to validate a non-invasive kinetic modeling approach for 18F-florbetaben PET using an acquisition protocol with the best compromise between quantification accuracy and simplicity and (2) to assess the impact of CBF changes and radiotracer clearance on SUVRs and non-invasive kinetic modeling data in 18F-florbetaben PET. Methods: Data from twenty subjects (10 patients with probable Alzheimer's dementia / 10 healthy volunteers) were used to compare the binding potential (BPND) obtained from the full kinetic analysis to the SUVR and to non-invasive tracer kinetic methods (the simplified reference tissue model (SRTM) and the multilinear reference tissue model 2 (MRTM2)). Different approaches using shortened or interrupted acquisitions were compared to the results of the full acquisition (0-140 min). Simulations were carried out to assess the effect of CBF and radiotracer clearance changes on SUVRs and non-invasive kinetic modeling outputs. Results: A 0-30 and 120-140 min dual time-window acquisition protocol using appropriate interpolation of the missing time points provided the best compromise between patient comfort and quantification accuracy. Excellent agreement was found between BPND obtained using the full and dual time-window (2TW) acquisition protocols (BPND,2TW = 0.01 + 1.00·BPND,FULL, R² = 0.97 (MRTM2); BPND,2TW = 0.05 + 0.92·BPND,FULL, R² = 0.93 (SRTM)). Simulations showed a limited impact of CBF and radiotracer clearance changes on MRTM parameters and SUVRs. 
Conclusion: This study demonstrates accurate non-invasive kinetic modeling of 18F-florbetaben PET data using a dual time-window acquisition protocol, thus providing a good compromise between quantification accuracy, scan duration and patient burden. The influence of CBF and radiotracer clearance changes on amyloid-beta load estimates was small. For most clinical research applications, the SUVR approach is appropriate. However, for longitudinal studies in which maximum quantification accuracy is desired, this non-invasive dual time-window acquisition protocol and kinetic analysis are recommended. Copyright © 2017 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Sojoudi, Alireza; Goodyear, Bradley G
2016-12-01
Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
Carey, David L; Blanch, Peter; Ong, Kok-Leong; Crossley, Kay M; Crow, Justin; Morris, Meg E
2017-08-01
(1) To investigate whether a daily acute:chronic workload ratio informs injury risk in Australian football players; (2) to identify which combination of workload variable, acute and chronic time window best explains injury likelihood. Workload and injury data were collected from 53 athletes over 2 seasons in a professional Australian football club. Acute:chronic workload ratios were calculated daily for each athlete, and modelled against non-contact injury likelihood using a quadratic relationship. 6 workload variables, 8 acute time windows (2-9 days) and 7 chronic time windows (14-35 days) were considered (336 combinations). Each parameter combination was compared for injury likelihood fit (using R²). The ratio of moderate speed running workload (18-24 km/h) in the previous 3 days (acute time window) compared with the previous 21 days (chronic time window) best explained the injury likelihood in matches (R²=0.79) and in the immediate 2 or 5 days following matches (R²=0.76-0.82). The 3:21 acute:chronic workload ratio discriminated between high-risk and low-risk athletes (relative risk=1.98-2.43). Using the previous 6 days to calculate the acute workload time window yielded similar results. The choice of acute time window significantly influenced model performance and appeared to reflect the competition and training schedule. Daily workload ratios can inform injury risk in Australian football. Clinicians and conditioning coaches should consider the sport-specific schedule of competition and training when choosing acute and chronic time windows. For Australian football, the ratio of moderate speed running in a 3-day or 6-day acute time window and a 21-day chronic time window best explained injury risk. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
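The daily ratio can be sketched as follows, assuming simple rolling means over the acute and chronic windows (the 3-day and 21-day lengths follow the combination reported above):

```python
def acute_chronic_ratio(daily_loads, acute=3, chronic=21):
    """Acute:chronic workload ratio on the last day of `daily_loads`:
    mean load over the trailing acute window divided by the mean over
    the trailing chronic window (3:21 days here)."""
    acute_mean = sum(daily_loads[-acute:]) / acute
    chronic_mean = sum(daily_loads[-chronic:]) / chronic
    return acute_mean / chronic_mean

loads = [300.0] * 18 + [600.0] * 3     # a 3-day spike after a steady block
print(round(acute_chronic_ratio(loads), 2))   # -> 1.75
```

Values well above 1 flag a recent spike in load relative to what the athlete is chronically prepared for.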
Wavelet-based clustering of resting state MRI data in the rat.
Medda, Alessio; Hoffmann, Lukas; Magnuson, Matthew; Thompson, Garth; Pan, Wen-Ju; Keilholz, Shella
2016-01-01
While functional connectivity has typically been calculated over the entire length of the scan (5-10min), interest has been growing in dynamic analysis methods that can detect changes in connectivity on the order of cognitive processes (seconds). Previous work with sliding window correlation has shown that changes in functional connectivity can be observed on these time scales in the awake human and in anesthetized animals. This exciting advance creates a need for improved approaches to characterize dynamic functional networks in the brain. Previous studies were performed using sliding window analysis on regions of interest defined based on anatomy or obtained from traditional steady-state analysis methods. The parcellation of the brain may therefore be suboptimal, and the characteristics of the time-varying connectivity between regions are dependent upon the length of the sliding window chosen. This manuscript describes an algorithm based on wavelet decomposition that allows data-driven clustering of voxels into functional regions based on temporal and spectral properties. Previous work has shown that different networks have characteristic frequency fingerprints, and the use of wavelets ensures that both the frequency and the timing of the BOLD fluctuations are considered during the clustering process. The method was applied to resting state data acquired from anesthetized rats, and the resulting clusters agreed well with known anatomical areas. Clusters were highly reproducible across subjects. Wavelet cross-correlation values between clusters from a single scan were significantly higher than the values from randomly matched clusters that shared no temporal information, indicating that wavelet-based analysis is sensitive to the relationship between areas. Copyright © 2015 Elsevier Inc. All rights reserved.
Time-marching multi-grid seismic tomography
NASA Astrophysics Data System (ADS)
Tong, P.; Yang, D.; Liu, Q.
2016-12-01
From the classic ray-based traveltime tomography to the state-of-the-art full waveform inversion, because of the nonlinearity of seismic inverse problems, a good starting model is essential for preventing the convergence of the objective function toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows has been explored to build the starting models of later time windows; (2) seismic data of later time windows could provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multi-grid in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
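The nested time-marching/multi-grid loop can be shown schematically; `invert_on_mesh` is a placeholder for one tomographic inversion pass, not the actual solver:

```python
# Schematic of the workflow described above: the final model of each time
# window seeds the next window, and within a window the model is refined
# from coarse to fine meshes.

def invert_on_mesh(model, data, mesh):
    # placeholder update: record which data and mesh refined the model
    return model + [(data, mesh)]

def time_marching_multigrid(windows, meshes, starting_model=None):
    model = starting_model or []
    history = []
    for data in windows:            # temporal time-marching
        for mesh in meshes:         # spatial coarse-to-fine refinement
            model = invert_on_mesh(model, data, mesh)
        history.append(len(model))  # refinement steps applied so far
    return model, history

model, history = time_marching_multigrid(
    windows=["w1", "w2", "w3"], meshes=["coarse", "medium", "fine"])
print(history)
```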
Guo, Chao; Zhu, Yanrong; Weng, Yan; Wang, Shiquan; Guan, Yue; Wei, Guo; Yin, Ying; Xi, Miaomaio; Wen, Aidong
2014-01-01
Breviscapine injection is a standardized Chinese herbal medicine product extracted from Erigeron breviscapus (Vant.) Hand.-Mazz. It has been widely used for treating cardiovascular and cerebrovascular diseases. However, the therapeutic time window and the action mechanism of breviscapine are still unclear. The present study was designed to investigate the therapeutic time window and underlying therapeutic mechanism of breviscapine injection against cerebral ischemia/reperfusion injury. Sprague-Dawley rats were subjected to middle cerebral artery occlusion for 2h followed by 24h of reperfusion. Experiment part 1 was used to investigate the therapeutic time window of breviscapine. Rats were injected intravenously with 50mg/kg breviscapine at different time-points of reperfusion. After 24h of reperfusion, neurologic score, infarct volume, brain water content and serum level of neuron specific enolase (NSE) were measured in a masked fashion. Part 2 was used to explore the therapeutic mechanism of breviscapine. 4-Hydroxy-2-nonenal (4-HNE), 8-hydroxy-2'-deoxyguanosine (8-OHdG) and the antioxidant capacity of the ischemic cortex were measured by ELISA and the ferric-reducing antioxidant power (FRAP) assay, respectively. Immunofluorescence and western blot analysis were used to analyze the expression of nuclear factor erythroid 2-related factor 2 (Nrf2) and heme oxygenase-1 (HO-1). Part 1: breviscapine injection significantly ameliorated neurologic deficit, reduced infarct volume and water content, and suppressed the levels of NSE in a time-dependent manner. Part 2: breviscapine inhibited the increased levels of 4-HNE and 8-OHdG, and enhanced the antioxidant capacity of cortex tissue. Moreover, breviscapine markedly raised the expression of Nrf2 and HO-1 proteins after 24h of reperfusion. The therapeutic time window of breviscapine injection for cerebral ischemia/reperfusion injury seemed to be within 5h after reperfusion. 
Up-regulation of the Nrf2/HO-1 pathway might be involved in the therapeutic mechanism of breviscapine injection. © 2013 Elsevier Ireland Ltd. All rights reserved.
Ultrasound-guided identification of cardiac imaging windows.
Liu, Garry; Qi, Xiu-Ling; Robert, Normand; Dick, Alexander J; Wright, Graham A
2012-06-01
Currently, the use of cine magnetic resonance imaging (MRI) to identify cardiac quiescent periods relative to the electrocardiogram (ECG) signal is insufficient for producing submillimeter-resolution coronary MR angiography (MRA) images. In this work, the authors perform a time series comparison between tissue Doppler echocardiograms of the interventricular septum (IVS) and concurrent biplane x-ray angiograms. Our results indicate very close agreement between the diastasis gating windows identified by both the IVS and x-ray techniques. Seven cath lab patients undergoing diagnostic angiograms were simultaneously scanned during a breath hold by ultrasound and biplane x-ray for six to eight heartbeats. The heart rate of each patient was stable. Dye was injected into either the left or right-coronary vasculature. The IVS was imaged using color tissue Doppler in an apical four-chamber view. Diastasis was estimated on the IVS velocity curve. On the biplane angiograms, proximal, mid, and distal regions were identified on the coronary artery (CA). Frame by frame correlation was used to derive displacement, and then velocity, for each region. The quiescent periods for a CA and its subsegments were estimated based on velocity. Using Pearson's correlation coefficient and Bland-Altman analysis, the authors compared the start and end times of the diastasis windows as estimated from the IVS and CA velocities. The authors also estimated the vessel blur across the diastasis windows of multiple sequential heartbeats of each patient. In total, 17 heartbeats were analyzed. The range of heart rate observed across patients was 47-79 beats per minute (bpm) with a mean of 57 bpm. Significant correlations (R > 0.99; p < 0.01) were observed between the IVS and x-ray techniques for the identification of the start and end times of diastasis windows. The mean difference in the starting times between IVS and CA quiescent windows was -12.0 ms. 
The mean difference in end times between IVS and CA quiescent windows was -3.5 ms. In contrast, the correlations between the RR interval and both the start and duration of the x-ray gating windows were relatively weaker: R = 0.63 (p = 0.13) and R = 0.86 (p = 0.01), respectively. For IVS gating windows, the average estimated vessel blurs during single and multiple heartbeats were 0.5 and 0.66 mm, respectively. For x-ray gating windows, the corresponding values were 0.26 and 0.44 mm, respectively. In this study, the authors showed that IVS velocity can be used to identify periods of diastasis for coronary arteries. Despite variability in mid-diastolic rest positions over multiple steady-rate heartbeats, vessel blurring of 0.5-1 mm was found to be achievable using the IVS gating technique. The authors envision this leading to a new cardiac gating system that, compared with conventional ECG gating, provides better resolution and shorter scan times for coronary MRA. © 2012 American Association of Physicists in Medicine.
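Estimating a quiescent (diastasis) window from a velocity trace can be sketched as a search for the longest run below a velocity threshold; the threshold and toy data are assumptions, not the study's procedure:

```python
def quiescent_window(times, velocity, threshold=1.0):
    """Return (start, end) of the longest contiguous span where the
    absolute velocity stays below `threshold`; None if no sample does."""
    best, run_start = None, None
    for t, v in zip(times, velocity):
        if abs(v) < threshold:
            if run_start is None:
                run_start = t
            if best is None or t - run_start > best[1] - best[0]:
                best = (run_start, t)
        else:
            run_start = None
    return best

times = list(range(0, 100, 10))                       # ms within the cycle
vel = [5, 4, 0.5, 0.2, 0.3, 0.1, 3, 0.4, 0.2, 6]      # mm/s (toy values)
print(quiescent_window(times, vel))                   # -> (20, 50)
```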
NASA Astrophysics Data System (ADS)
Fraizier, E.; Antoine, P.; Godefroit, J.-L.; Lanier, G.; Roy, G.; Voltz, C.
Lithium fluoride (LiF) windows are extensively used in traditional shock wave experiments because of their transparency beyond 100 GPa along the [100] axis. A correct knowledge of the optical and mechanical properties of these windows is essential in order to analyze the experimental data and to determine the equation of state of a large variety of metals. With this in mind, the supplied windows are systematically characterized in order to determine the density, the thermal expansion and the crystalline orientation. Furthermore, an experimental campaign was conducted in order to characterize the window properties under shock loading at 300 K and in preheated conditions (450 K). This article describes the experiments, details the analysis and presents the results. Particle velocity measurements are carried out at the interface of a multiple-window stack using interferometer diagnostics (VISAR and IDL) at 532 nm wavelength. The shock velocity is calculated as a function of the time of flight through each window. The optical correction is calculated as the ratio of the apparent velocity gap to the particle velocity at the free surface. Going further, the Rankine-Hugoniot relations are applied to calculate the pressure and the density. Then, the results and uncertainties are presented and compared with literature data.
Komorowski, Dariusz; Pietraszek, Stanislaw
2016-01-01
This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters, such as the dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows for more clearly visible localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows using long time windows when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method makes it possible to obtain results for relatively long EGG records in a fairly short time, much faster than using the classical methods based on running spectrum analysis (RSA). In this study the authors demonstrate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown in the example of an analysis of four-channel EGG recordings, performed for a non-caloric meal.
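The FFT shortcut mentioned above rests on the circular convolution theorem: convolution in time equals pointwise multiplication of spectra. A minimal check of that identity (for a CWT, `h` would be the scaled wavelet at each analyzed scale):

```python
import numpy as np

def circ_conv_fft(x, h):
    """Circular convolution via pointwise multiplication of FFTs, the
    identity that lets a CWT be evaluated quickly at each scale."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

def circ_conv_direct(x, h):
    """Direct O(n^2) circular convolution, for comparison."""
    n = len(x)
    return np.array([sum(x[k] * h[(m - k) % n] for k in range(n))
                     for m in range(n)])

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
h = rng.standard_normal(64)
print(np.allclose(circ_conv_fft(x, h), circ_conv_direct(x, h)))  # True
```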
Using exposure windows to explore an elusive biomarker: blood manganese.
Baker, Marissa G; Stover, Bert; Simpson, Christopher D; Sheppard, Lianne; Seixas, Noah S
2016-05-01
We sought to understand the time course between exposure to manganese (Mn) and its uptake into the blood, to allow a more meaningful interpretation of exposure biomarker data, and to determine the utility of blood as a biomarker of Mn exposure. Welder trainees were monitored over the course of a five-quarter training program. Each quarter, trainees gave eight blood samples and had personal air monitoring four times. A mixed model was fit to obtain estimates of airborne exposure by welding type (fixed effect), adjusted for subject (random effect). Considering weekends and days absent as zero exposure, estimated exposures were summed over various exposure windows and related to measured blood manganese (MnB) using a mixed model. A relationship consistent with zero was found between MnB and modeled exposure over the preceding 1 or 7 days. For a 30-day window of preceding exposure, a 1 mg·days/m³ increase in air Mn is associated with a 0.57 ng/mL increase in MnB (95% CI -0.04, 1.19). Considering a 90-day exposure window and a cumulative exposure window, a 1 mg·days/m³ increase in air Mn is associated with a 0.26 (95% CI 0.005, 0.51) and 0.09 (95% CI 0.006, 0.17) ng/mL increase in MnB, respectively. From this analysis, MnB may begin to act as a biomarker of Mn exposure over longer time periods, or at higher levels of exposure. This novel study design allowed investigation of how MnB relates to different time windows of exposure, representing the most robust Mn exposure assessment in the biomarker literature.
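Summing estimated daily exposures over a trailing window, with weekends as zeros, can be sketched as follows; the daily values are illustrative:

```python
def window_exposure(daily_mn, window_days):
    """Cumulative air-Mn exposure (mg·days/m³) over the `window_days`
    preceding a blood draw; weekends and absences enter as zeros."""
    return sum(daily_mn[-window_days:])

# 91 days of estimated exposure: 5-day work weeks at 0.05 mg/m³, weekend zeros
daily = ([0.05] * 5 + [0.0] * 2) * 13
for w in (7, 30, 90):
    print(w, round(window_exposure(daily, w), 2))
```

Longer windows accumulate more exposure, which is why the regression slope per mg·days/m³ shrinks as the window lengthens.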
Toward Capturing Momentary Changes of Heart Rate Variability by a Dynamic Analysis Method
Zhang, Haoshi; Zhu, Mingxing; Zheng, Yue; Li, Guanglin
2015-01-01
The analysis of heart rate variability (HRV) has typically been performed on long-term electrocardiography (ECG) recordings (12-24 hours) and short-term recordings (2-5 minutes), which may not capture momentary changes in HRV. In this study, we present a new method to analyze momentary HRV (mHRV). The ECG recordings were segmented into a series of overlapped HRV analysis windows with a window length of 5 minutes and different time increments. The performance of the proposed method in delineating the dynamics of momentary HRV measurement was evaluated with four commonly used time courses of HRV measures on both synthetic time series and real ECG recordings from human subjects and dogs. Our results showed that a smaller time increment could capture more dynamical information on transient changes. Considering that a too-short increment such as 10 s would cause indented time courses of the four measures, a 1-min time increment (4-min overlap) was suggested for the analysis of mHRV in this study. ECG recordings from human subjects and dogs were used to further assess the effectiveness of the proposed method. The pilot study demonstrated that the proposed analysis of mHRV could provide a more accurate assessment of the dynamical changes in cardiac activity than the conventional measures of HRV (without time overlapping). The proposed method may provide an efficient means of delineating the dynamics of momentary HRV, and further investigation is warranted. PMID:26172953
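Overlapped-window analysis of an HRV measure can be sketched with RMSSD, a common time-domain index; beat-count windows stand in here for the study's 5-minute/1-minute time-based scheme:

```python
def rmssd(rr):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def momentary_hrv(rr, window=10, step=2):
    """RMSSD over overlapping windows of RR intervals advanced by `step`
    beats; smaller steps give finer time resolution, as in mHRV."""
    return [rmssd(rr[i:i + window])
            for i in range(0, len(rr) - window + 1, step)]

rr = [800, 810, 790, 805, 795] * 6        # 30 synthetic RR intervals (ms)
series = momentary_hrv(rr)
print(len(series))                        # 11 overlapping windows
```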
Sahi, Kamal; Jackson, Stuart; Wiebe, Edward; Armstrong, Gavin; Winters, Sean; Moore, Ronald; Low, Gavin
2014-02-01
To assess whether "liver window" settings improve the conspicuity of small renal cell carcinomas (RCC). Patients were analysed from our institution's pathology-confirmed RCC database that included the following: (1) stage T1a RCCs, (2) an unenhanced computed tomography (CT) abdomen performed ≤ 6 months before histologic diagnosis, and (3) age ≥ 17 years. Patients with multiple tumours, prior nephrectomy, von Hippel-Lindau disease, and polycystic kidney disease were excluded. The unenhanced CT was analysed, and the tumour locations were confirmed by using corresponding contrast-enhanced CT or magnetic resonance imaging studies. Representative single-slice axial, coronal, and sagittal unenhanced CT images were acquired in "soft tissue windows" (width, 400 Hounsfield units (HU); level, 40 HU) and liver windows (width, 150 HU; level, 88 HU). In addition, single-slice axial, coronal, and sagittal unenhanced CT images of nontumourous renal tissue (obtained from the same cases) were acquired in soft tissue windows and liver windows. These data sets were randomized, unpaired, and presented independently to 3 blinded radiologists for analysis. The presence or absence of suspicious findings for tumour was scored on a 5-point confidence scale. Eighty-three of 415 patients met the study criteria. Receiver operating characteristic (ROC) analysis, t test analysis, and kappa analysis were used. ROC analysis showed statistically superior diagnostic performance for liver windows compared with soft tissue windows (area under the curve of 0.923 vs 0.879; P = .0002). Kappa statistics showed "good" vs "moderate" agreement between readers for liver windows compared with soft tissue windows. Use of liver window settings improves the detection of small RCCs on unenhanced CT. Copyright © 2014 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.
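The two display settings differ only in how Hounsfield values map to grey levels: a narrower width stretches contrast over a smaller HU range. A minimal sketch of the standard window/level mapping (illustrative, not code from the study):

```python
def apply_ct_window(hu, width, level):
    """Map a Hounsfield value to an 8-bit grey level for a given
    window width/level, clipping values outside the window."""
    frac = (hu - (level - width / 2.0)) / width
    return round(min(max(frac, 0.0), 1.0) * 255)

# soft tissue window (W 400, L 40) vs liver window (W 150, L 88):
# the narrower liver window spreads the 13-163 HU range over all
# 256 grey levels, boosting contrast for subtly hypodense lesions
for hu in (20, 40, 70):
    print(hu, apply_ct_window(hu, 400, 40), apply_ct_window(hu, 150, 88))
```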
NASA Technical Reports Server (NTRS)
Roberts, Floyd E., III
1994-01-01
Software provides for control and acquisition of data from optical pyrometer. There are six individual programs in PYROLASER package. Provides quick and easy way to set up, control, and program standard Pyrolaser. Temperature and emissivity measurements either collected as if Pyrolaser were in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. Shell supplied to allow test-specific macros to be added to system easily. Written using LabVIEW software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.
Colonius, Hans; Diederich, Adele
2011-07-01
The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time in order to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule based on the prior probability of a common source, the likelihood of temporal disparities between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task, where subjects are asked to respond to signals from a target modality only. Invoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing how the efficiency of multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, they reflected the magnitude of the time-window widths predicted by our decision-theoretic framework, i.e., a larger time window is associated with a higher prior probability.
Review of correlation techniques
NASA Technical Reports Server (NTRS)
Bowhill, S. A.
1983-01-01
Correlation analysis in MST radar to determine the scattered power, Doppler frequency, and correlation time for a noisy signal is examined. It is assumed that coherent detection is employed, with two accurately balanced quadrature receiving channels, and that coherent integration is performed with a window length significantly less than the correlation time of the signal.
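The three quantities named above can all be read off the complex autocorrelation of the coherently detected I/Q signal. The sketch below uses the standard pulse-pair approach (power from R(0), Doppler from the phase of R at lag one, correlation time from the 1/e decay of |R|); it is an illustration of the general technique, not necessarily the exact procedure examined in the report:

```python
import cmath
import math

def pulse_pair(iq, dt):
    """Pulse-pair estimates from complex (I + jQ) samples taken every dt s."""
    n = len(iq)
    def R(lag):  # lag-covariance estimate of the complex signal
        return sum(iq[k + lag] * iq[k].conjugate()
                   for k in range(n - lag)) / (n - lag)
    power = R(0).real
    doppler_hz = cmath.phase(R(1)) / (2.0 * math.pi * dt)
    corr_time = None                 # None if |R| never decays below 1/e
    for lag in range(1, n):
        if abs(R(lag)) < power / math.e:
            corr_time = lag * dt
            break
    return power, doppler_hz, corr_time

# a pure 10-Hz complex exponential: unit power, 10-Hz Doppler, no decay
iq = [cmath.exp(2j * math.pi * 10.0 * k * 0.01) for k in range(50)]
```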
A study on characteristics of retrospective optimal interpolation with WRF testbed
NASA Astrophysics Data System (ADS)
Kim, S.; Noh, N.; Lim, G.
2012-12-01
This study presents the application of retrospective optimal interpolation (ROI) with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) scheme that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of the window. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The ROI method assimilates data at post-analysis times using a perturbation method (Errico and Raeder, 1999), without an adjoint model. In this study, the ROI method is applied to the WRF model to validate the algorithm and to investigate its capability. The computational cost of ROI can be reduced owing to the eigen-decomposition of the background error covariance. Using the background error covariance in eigen-space, a single-profile assimilation experiment is performed. The difference between forecast errors with and without assimilation clearly grows with time, indicating that assimilation improves the forecast. The characteristics and strengths/weaknesses of the ROI method are investigated through comparison experiments with other data assimilation methods.
Windowed Multitaper Correlation Analysis of Multimodal Brain Monitoring Parameters
Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in the daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently, a mathematical model was presented that simulates cerebral perfusion and oxygen supply in cases of severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the abovementioned windows. A statistical testing method is introduced that allows the parameters of the windowing method to be tuned so that a predefined accuracy is reached. With this method, data from fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of patient outcome. PMID:25821507
Bloomfield, Rachel C; Gillespie, Graeme R; Kerswell, Keven J; Butler, Kym L; Hemsworth, Paul H
2015-01-01
The window of the visitor viewing area adjacent to an animal platform in an orangutan enclosure was altered to produce three viewing treatments in a randomized controlled experiment. These treatments were window uncovered, left side of the window covered or right side of the window covered. Observations were conducted on the orangutans present on the platform, and on their location (left or right side), and orientation (towards or away from the window) while on the platform. The partial covering of the window had little effect on the proportion of time orangutans spent on the viewing platform, or on the direction they faced when on the platform. When the orangutans were facing towards the window, and the right side was uncovered, irrespective of whether the left side was covered, they spent about three quarters of the time on the right side, suggesting a preference for the right side of the platform. However, when the right side was covered and the left side uncovered, the animals facing towards the window spent only about a quarter of the time on the right side, that is, they spent more time on the uncovered side. The results suggest that the orangutans have a preference to position themselves to face the window of the visitor viewing area. © 2015 Wiley Periodicals, Inc.
Aggregation of Electric Current Consumption Features to Extract Maintenance KPIs
NASA Astrophysics Data System (ADS)
Simon, Victor; Johansson, Carl-Anders; Galar, Diego
2017-09-01
All electric powered machines offer the possibility of extracting information and calculating Key Performance Indicators (KPIs) from the electric current signal. Depending on the time window, sampling frequency, and type of analysis, different indicators from the micro to the macro level can be calculated for such aspects as maintenance, production, and energy consumption. On the micro level, the indicators are generally used for condition monitoring and diagnostics and are normally based on a short time window and a high sampling frequency. The macro indicators are normally based on a longer time window with a slower sampling frequency and are used as indicators for overall performance, cost, or consumption. The indicators can be calculated directly from the current signal but can also be based on a combination of information from the current signal and operational data such as rpm, position, etc. One or several of these indicators can be used for prediction and prognostics of a machine's future behavior. This paper uses this technique to calculate indicators for maintenance and energy optimization in electric powered machines and fleets of machines, especially machine tools.
NASA Astrophysics Data System (ADS)
Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun
2018-03-01
Aiming at reducing the estimation error of the sensor frequency response function (FRF) obtained with the commonly used window-based spectral estimation method, the error models of the interpolation and transient errors are derived in the form of non-parametric models. Window effects on these errors are analyzed, revealing that the commonly used Hanning window leads to a smaller interpolation error, which can be further largely eliminated by cubic spline interpolation when estimating the FRF from step response data, and that a window with a smaller front-end value suppresses more of the transient error. Accordingly, a new dual-cosine window, with non-zero discrete Fourier transform bins at -3, -1, 0, 1, and 3, is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation error suppression and better transient error suppression when estimating the FRF from the step response; specifically, it improves the asymptotic decay of the transient error from O(N⁻²) for the Hanning window method to O(N⁻⁴), while increasing the uncertainty only slightly (by about 0.4 dB). One direction of a wind tunnel strain gauge balance, a high-order, lightly damped, non-minimum-phase system, is then employed as an example to verify the new dual-cosine window-based spectral estimation method. The model simulation results show that the new dual-cosine window method outperforms the Hanning window method for FRF estimation and, compared with the Gans and LPM methods, has the advantages of simple computation, low time consumption, and short data requirements; the FRF calculated from measured balance data is consistent with the simulation results. Thus, the new dual-cosine window is effective and practical for FRF estimation.
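The defining property of the new window is its sparse DFT support. The sketch below builds a generic cosine-sum window whose DFT is nonzero only at bins 0, ±1, and ±3, and verifies that support numerically; the coefficient values are illustrative placeholders, since the paper's actual coefficients are not reproduced here:

```python
import cmath
import math

def dual_cosine_window(N, a0=0.5, a1=0.4, a3=0.1):
    """Cosine-sum window with DFT support only at bins 0, +/-1, +/-3.
    Coefficients are hypothetical, not the paper's."""
    return [a0 + a1 * math.cos(2 * math.pi * n / N)
               + a3 * math.cos(6 * math.pi * n / N) for n in range(N)]

def dft(x):
    """Direct DFT, used here only to inspect the window's spectrum."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

W = dft(dual_cosine_window(64))
support = sorted(k for k in range(64) if abs(W[k]) > 1e-6)
print(support)  # bins 0, 1, 3 and their aliases 61 (=-3), 63 (=-1)
```

By the same construction, the Hanning window (a3 = 0) has support only at bins 0 and ±1.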
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, statistical model, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and, thus, allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
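For reference, the conventional estimator critiqued above is simply the proportion of correct responses in a trailing window of trials; around an abrupt performance change it smears the transition across the whole window length. A minimal sketch with hypothetical data:

```python
def learning_curve(correct, window=10):
    """Proportion of correct responses (1/0 per trial) in a trailing
    window of trials -- the conventional moving-window estimate."""
    return [sum(correct[t - window + 1:t + 1]) / window
            for t in range(window - 1, len(correct))]

# an abrupt switch from always-wrong to always-correct at trial 20 is
# smeared over 10 trials by the moving-window estimate
curve = learning_curve([0] * 20 + [1] * 20, window=10)
```

The smearing is exactly the constant-performance assumption failing: within windows straddling the switch, performance is anything but constant.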
Foo, Lee Kien; McGree, James; Duffull, Stephen
2012-01-01
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Dobrynin, S. A.; Kolubaev, E. A.; Smolin, A. Yu.; Dmitriev, A. I.; Psakhie, S. G.
2010-07-01
Time-frequency analysis of sound waves detected by a microphone during the friction of Hadfield’s steel has been performed using wavelet transform and window Fourier transform methods. This approach reveals a relationship between the appearance of quasi-periodic intensity outbursts in the acoustic response signals and the processes responsible for the formation of wear products. It is shown that the time-frequency analysis of acoustic emission in a tribosystem can be applied, along with traditional approaches, to studying features in the wear and friction process.
Minimal Window Duration for Accurate HRV Recording in Athletes
Bourdillon, Nicolas; Schmitt, Laurent; Yazdani, Sasan; Vesin, Jean-Marc; Millet, Grégoire P.
2017-01-01
Heart rate variability (HRV) is non-invasive and commonly used for monitoring responses to training loads, fitness, or overreaching in athletes. Yet, the recording duration for a series of RR-intervals varies from 1 to 15 min in the literature. The aim of the present work was to assess the minimum record duration to obtain reliable HRV results. RR-intervals from 159 orthostatic tests (7 min supine, SU, followed by 6 min standing, ST) were analyzed. Reference windows were 4 min in SU (min 3–7) and 4 min in ST (min 9–13). Those windows were subsequently divided and the analyses were repeated on eight different fractioned windows: the first min (0–1), the second min (1–2), the third min (2–3), the fourth min (3–4), the first 2 min (0–2), the last 2 min (2–4), the first 3 min (0–3), and the last 3 min (1–4). Correlation and Bland & Altman statistical analyses were systematically performed. The analysis window could be shortened to 0–2 instead of 0–4 for RMSSD only, whereas the 4-min window was necessary for LF and total power. Since there is a need for 1 min of baseline to obtain a steady signal prior the analysis window, we conclude that studies relying on RMSSD may shorten the windows to 3 min (= 1+2) in SU or seated position only and to 6 min (= 1+2 min SU plus 1+2 min ST) if there is an orthostatic test. Studies relying on time- and frequency-domain parameters need a minimum of 5 min (= 1+4) min SU or seated position only but require 10 min (= 1+4 min SU plus 1+4 min ST) for the orthostatic test. PMID:28848382
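The Bland & Altman comparison used above reduces to the bias (mean difference) and 95% limits of agreement between a shortened-window estimate and the 4-min reference, computed over paired measurements of the same recordings. A minimal sketch (illustrative, not the study's code):

```python
import math

def bland_altman(short_win, reference):
    """Bias and 95% limits of agreement between paired measurements
    (e.g. a 2-min RMSSD estimate vs. the 4-min reference window)."""
    diffs = [a - b for a, b in zip(short_win, reference)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired RMSSD values (ms) from two window lengths
print(bland_altman([45.0, 38.0, 51.0], [44.0, 40.0, 50.0]))
```

A shortened window is acceptable when the bias is near zero and the limits of agreement are narrow relative to the measure's physiological range.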
Advanced water window x-ray microscope design and analysis
NASA Technical Reports Server (NTRS)
Shealy, D. L.; Wang, C.; Jiang, W.; Lin, J.
1992-01-01
The project was focused on the design and analysis of an advanced water window soft-x-ray microscope. The activities were accomplished by completing three tasks contained in the statement of work of this contract. The new results confirm that in order to achieve resolutions greater than three times the wavelength of the incident radiation, it will be necessary to use aspherical mirror surfaces and to use graded multilayer coatings on the secondary (to accommodate the large variations of the angle of incidence over the secondary when operating the microscope at numerical apertures of 0.35 or greater). The results are included in a manuscript which is enclosed in the Appendix.
Scott, Jonathan M.; Robinson, Stephen E.; Holroyd, Tom; Coppola, Richard; Sato, Susumu; Inati, Sara K.
2016-01-01
OBJECTIVE To describe and optimize an automated beamforming technique followed by identification of locations with excess kurtosis (g2) for efficient detection and localization of interictal spikes in medically refractory epilepsy patients. METHODS Synthetic Aperture Magnetometry with g2 averaged over a sliding time window (SAMepi) was performed in 7 focal epilepsy patients and 5 healthy volunteers. The effect of varied window lengths on detection of spiking activity was evaluated. RESULTS Sliding window lengths of 0.5–10 seconds performed similarly, with 0.5 and 1 second windows detecting spiking activity in one of the 3 virtual sensor locations with highest kurtosis. These locations were concordant with the region of eventual surgical resection in these 7 patients who remained seizure free at one year. Average g2 values increased with increasing sliding window length in all subjects. In healthy volunteers kurtosis values stabilized in datasets longer than two minutes. CONCLUSIONS SAMepi using g2 averaged over 1 second sliding time windows in datasets of at least 2 minutes duration reliably identified interictal spiking and the presumed seizure focus in these 7 patients. Screening the 5 locations with highest kurtosis values for spiking activity is an efficient and accurate technique for localizing interictal activity using MEG. SIGNIFICANCE SAMepi should be applied using the parameter values and procedure described for optimal detection and localization of interictal spikes. Use of this screening procedure could significantly improve the efficiency of MEG analysis if clinically validated. PMID:27760068
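The core quantity above, excess kurtosis g2 over a sliding window, is straightforward to compute: spiky (epileptiform) segments yield large positive g2, while stationary background stays near zero. A minimal sketch of the windowed statistic (illustrative, not the SAMepi implementation):

```python
def excess_kurtosis(x):
    """Sample excess kurtosis g2 (about 0 for Gaussian data, large
    and positive for spiky data)."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    if var == 0.0:
        return 0.0
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / var ** 2 - 3.0

def sliding_g2(signal, win, step):
    """g2 over sliding windows of `win` samples advanced by `step`."""
    return [excess_kurtosis(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, step)]
```

Averaging these windowed g2 values per virtual sensor, then ranking sensors by kurtosis, gives the screening list described above.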
Rapid update of discrete Fourier transform for real-time signal processing
NASA Astrophysics Data System (ADS)
Sherlock, Barry G.; Kakad, Yogendra P.
2001-10-01
In many identification and target recognition applications, the incoming signal will have properties that render it amenable to analysis or processing in the Fourier domain. In such applications, however, it is usually essential that the identification or target recognition be performed in real time. An important constraint upon real-time processing in the Fourier domain is the time taken to perform the Discrete Fourier Transform (DFT). Ideally, a new Fourier transform should be obtained after the arrival of every new data point. However, the Fast Fourier Transform (FFT) algorithm requires on the order of N log2 N operations, where N is the length of the transform, and this usually makes calculation of the transform for every new data point computationally prohibitive. In this paper, we develop an algorithm to update the existing DFT to represent the new data series that results when a new signal point is received. Updating the DFT in this way reduces the computational cost by a factor of log2 N, to order N operations per update. The algorithm can be modified to work in the presence of data window functions. This is a considerable advantage, because windowing is often necessary to reduce the edge effects that arise because the implicit periodicity of the Fourier transform is not exhibited by the real-world signal. Versions are developed in this paper for use with the boxcar window and with the split triangular, Hanning, Hamming, and Blackman windows. Generalization of these results to 2D is also presented.
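The update idea is compact: when the oldest sample leaves the window and a new one arrives, every DFT bin is corrected and phase-rotated in O(N). The sketch below shows the well-known sliding-DFT recurrence for the boxcar (unwindowed) case; the paper's windowed variants modify this further:

```python
import cmath

def sdft_update(X, oldest, newest):
    """O(N) sliding-DFT update for a length-N window shifted by one
    sample: X'[k] = (X[k] - oldest + newest) * exp(j*2*pi*k/N)."""
    N = len(X)
    return [(Xk - oldest + newest) * cmath.exp(2j * cmath.pi * k / N)
            for k, Xk in enumerate(X)]

def dft(x):
    """Direct DFT, used here only to check the update."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

X = dft([1.0, 2.0, 3.0, 4.0])
X_new = sdft_update(X, oldest=1.0, newest=5.0)  # window is now [2, 3, 4, 5]
```

One update costs N complex multiply-adds versus N log2 N for recomputing the FFT from scratch, which is what makes per-sample spectra feasible in real time.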
Detecting, anticipating, and predicting critical transitions in spatially extended systems.
Kwasniok, Frank
2018-03-01
A data-driven linear framework for detecting, anticipating, and predicting incipient bifurcations in spatially extended systems based on principal oscillation pattern (POP) analysis is discussed. The dynamics are assumed to be governed by a system of linear stochastic differential equations which is estimated from the data. The principal modes of the system together with corresponding decay or growth rates and oscillation frequencies are extracted as the eigenvectors and eigenvalues of the system matrix. The method can be applied to stationary datasets to identify the least stable modes and assess the proximity to instability; it can also be applied to nonstationary datasets using a sliding window approach to track the changing eigenvalues and eigenvectors of the system. As a further step, a genuinely nonstationary POP analysis is introduced. Here, the system matrix of the linear stochastic model is time-dependent, allowing for extrapolation and prediction of instabilities beyond the learning data window. The methods are demonstrated and explored using the one-dimensional Swift-Hohenberg equation as an example, focusing on the dynamics of stochastic fluctuations around the homogeneous stable state prior to the first bifurcation. The POP-based techniques are able to extract and track the least stable eigenvalues and eigenvectors of the system; the nonstationary POP analysis successfully predicts the timing of the first instability and the unstable mode well beyond the learning data window.
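In its simplest stationary form, POP analysis fits a first-order linear model x_{t+1} ≈ M x_t to the data by least squares and reads decay rates and oscillation frequencies off the eigenvalues of M. A two-dimensional sketch under that interpretation (illustrative, not the paper's code; the fit is M = C1·C0⁻¹ from the lag-0 and lag-1 covariances):

```python
import cmath
import math

def pop_2d(xs, ys, dt):
    """Least-squares fit of x_{t+1} = M x_t for a 2-D series; returns
    (decay_rate, frequency) pairs from the eigenvalues of M."""
    n = len(xs) - 1
    def cov(a, b, lag):  # <a_{t+lag} b_t> over the fitting range
        return sum(a[t + lag] * b[t] for t in range(n)) / n
    C0 = [[cov(xs, xs, 0), cov(xs, ys, 0)], [cov(ys, xs, 0), cov(ys, ys, 0)]]
    C1 = [[cov(xs, xs, 1), cov(xs, ys, 1)], [cov(ys, xs, 1), cov(ys, ys, 1)]]
    det = C0[0][0] * C0[1][1] - C0[0][1] * C0[1][0]
    C0inv = [[C0[1][1] / det, -C0[0][1] / det],
             [-C0[1][0] / det, C0[0][0] / det]]
    M = [[sum(C1[i][k] * C0inv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    tr = M[0][0] + M[1][1]                      # eigenvalues of 2x2 M
    dm = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = cmath.sqrt(tr * tr - 4 * dm)
    return [(math.log(abs(l)) / dt, cmath.phase(l) / (2 * math.pi * dt))
            for l in ((tr + disc) / 2, (tr - disc) / 2)]

# demo: a damped rotation x_{t+1} = r*R(theta) x_t has eigenvalues r*e^{+/-i*theta}
r, theta = 0.95, 0.3
xs, ys = [1.0], [0.0]
for _ in range(200):
    x, y = xs[-1], ys[-1]
    xs.append(r * (x * math.cos(theta) - y * math.sin(theta)))
    ys.append(r * (x * math.sin(theta) + y * math.cos(theta)))
modes = pop_2d(xs, ys, dt=1.0)
```

Applying such a fit inside a sliding window, as described above, tracks how the least stable eigenvalue approaches the unit circle as an instability nears.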
PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems
NASA Astrophysics Data System (ADS)
Bono, Andrea; Badiali, Lucio
2005-02-01
Personal WaveLab 1.0 aims to be the starting point for ex novo development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. Firstly, as a stand-alone application, it allows basic analysis of digital or digitised seismic waveforms. Secondly, thanks to its architectural characteristics, it can serve as the basis for the development of more complex, feature-rich applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring for Civil Protection purposes. This means that about 90 users have tested the application for more than a year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyiragongo Project in Congo and during the Stromboli emergency in the summer of 2002. The main appeals of the application package are ease of use, object-oriented design, good computational speed, minimal disk space requirements, and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is in constant development to answer the needs and suggestions of its users. Microsoft Visual Basic 6 source code, installation package, test data sets, and documentation are available at no cost.
A multimodal logistics service network design with time windows and environmental concerns
Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei
2017-01-01
The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to set up the abovementioned optimal model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained. PMID:28934272
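The cost structure in such a model can be sketched per route: each leg contributes a transport cost plus a CO2-tax term, and the total travel time must fall inside the customer's service time window. A toy evaluation function (parameter names and values are hypothetical, and the real model also optimizes transfer-node investment, which is omitted here):

```python
def evaluate_route(legs, tax_per_kg_co2, window):
    """Total cost of one multimodal route and its time-window feasibility.

    legs:   list of (mode, km, cost_per_km, hours, kg_co2_per_km)
    window: (earliest, latest) allowed delivery time in hours
    """
    cost, time = 0.0, 0.0
    for mode, km, cost_per_km, hours, kg_co2_per_km in legs:
        cost += km * cost_per_km + km * kg_co2_per_km * tax_per_kg_co2
        time += hours
    earliest, latest = window
    return cost, earliest <= time <= latest

# rail leg then truck leg, with a carbon tax of 10 per kg CO2
route = [("rail", 100.0, 1.0, 5.0, 0.02), ("truck", 20.0, 2.0, 1.0, 0.1)]
```

Raising the CO2 tax shifts the cheapest feasible combination toward low-emission modes, which is the trade-off the paper's sensitivity analysis explores.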
Swift J1658.2-4242: Possible pulsar periodicity detected
NASA Astrophysics Data System (ADS)
Kennea, J. A.
2018-02-01
We report on analysis of all the current Windowed Timing mode data taken on the newly discovered Galactic Transient, Swift J1658.2-4242 (GCN #22416, GCN #22417, GCN #22419, ATEL #11310, ATEL #11306, ATEL #11307).
Lottman, Kristin K; Kraguljac, Nina V; White, David M; Morgan, Charity J; Calhoun, Vince D; Butt, Allison; Lahti, Adrienne C
2017-01-01
Resting-state functional connectivity studies in schizophrenia evaluating average connectivity over the entire experiment have reported aberrant network integration, but findings are variable. Examining time-varying (dynamic) functional connectivity may help explain some inconsistencies. We assessed dynamic network connectivity using resting-state functional MRI in patients with schizophrenia while unmedicated (n = 34), after 1 week (n = 29) and 6 weeks (n = 24) of treatment with risperidone, and in matched controls at baseline (n = 35) and after 6 weeks (n = 19). After identifying 41 independent components (ICs) comprising resting-state networks, sliding window analysis was performed on IC timecourses using an optimal window size validated with linear support vector machines. Windowed correlation matrices were then clustered into three discrete connectivity states (a relatively sparsely connected state, a relatively abundantly connected state, and an intermediately connected state). In unmedicated patients, static connectivity was increased between five pairs of ICs and decreased between two pairs of ICs when compared to controls; dynamic connectivity showed increased coupling between the thalamus and the somatomotor network in one of the three states. State statistics indicated that, in comparison to controls, unmedicated patients had shorter mean dwell times and a smaller fraction of time spent in the sparsely connected state, and longer dwell times and a larger fraction of time spent in the intermediately connected state. Risperidone appeared to normalize mean dwell times after 6 weeks, but not fraction of time. Results suggest that static connectivity abnormalities in schizophrenia may partly be related to altered brain network temporal dynamics rather than consistent dysconnectivity within and between functional networks, and demonstrate the importance of implementing complementary data analysis techniques.
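The state statistics reported above (mean dwell time and fraction of time per state) follow directly from the sequence of state labels assigned to the sliding windows. A minimal sketch of that bookkeeping (illustrative, not the study's pipeline):

```python
def state_statistics(labels, window_step_s=1.0):
    """Mean dwell time (s) and fraction of time per connectivity state,
    given the state label of each successive sliding window."""
    runs = {}
    state, length = labels[0], 1
    for s in labels[1:] + [None]:          # sentinel flushes the last run
        if s == state:
            length += 1
        else:
            runs.setdefault(state, []).append(length)
            state, length = s, 1
    total = len(labels)
    return {st: (sum(r) / len(r) * window_step_s, sum(r) / total)
            for st, r in runs.items()}

# e.g. three consecutive windows in state 1 flanked by state 0
stats = state_statistics([0, 0, 1, 1, 1, 0])
```

Group differences in these per-subject statistics are what distinguish the dynamic analysis from the static connectivity comparison.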
Time-Resolved Data Acquisition for In Situ Subsurface Planetary Geochemistry
NASA Technical Reports Server (NTRS)
Bodnarik, Julia Gates; Burger, Dan M.; Burger, Arnold; Evans, Larry G.; Parsons, Ann M.; Starr, Richard D.; Stassun, Keivan G.
2012-01-01
The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface geochemistry of planetary bodies in situ. All previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments relied on a constant neutron source produced by galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high-intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system that times each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows that obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear induced reactions provides raw data sets containing the channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using analysis windows optimized after the data are acquired. Time windows can then be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results that can be derived from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
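With time-tagged events, a gate is just a filter applied after the fact, so the same event list can be re-binned with any time window. A minimal sketch of post-acquisition gating (the event tuples, bin count, and energy range are hypothetical):

```python
def gated_spectrum(events, gate_start_us, gate_stop_us, n_bins=8, e_max_mev=8.0):
    """Energy histogram of time-tagged events within a chosen time gate.

    events: iterable of (energy_MeV, time_since_pulse_us) tuples
    """
    spectrum = [0] * n_bins
    for energy, t in events:
        if gate_start_us <= t < gate_stop_us and 0.0 <= energy < e_max_mev:
            spectrum[int(energy / e_max_mev * n_bins)] += 1
    return spectrum

# the same events re-gated two ways after acquisition, e.g. to separate
# prompt from delayed gamma rays relative to the neutron pulse
events = [(1.1, 5.0), (1.2, 50.0), (7.9, 5.0)]
prompt = gated_spectrum(events, 0.0, 10.0)      # only events near the pulse
delayed = gated_spectrum(events, 10.0, 100.0)   # only the late event
```

Because the gates are chosen after acquisition, they can be tuned to maximize the statistical significance of each elemental signature, as described above.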
Time-resolved Neutron-gamma-ray Data Acquisition for in Situ Subsurface Planetary Geochemistry
NASA Technical Reports Server (NTRS)
Bodnarik, Julie G.; Burger, Dan Michael; Burger, A.; Evans, L. G.; Parsons, A. M.; Schweitzer, J. S.; Starr, R. D.; Stassun, K. G.
2013-01-01
The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface elemental composition of planetary bodies in situ. Previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments have relied on neutrons produced by galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high-intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system that times each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows that obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear induced reactions provides raw data sets containing the channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using optimized analysis windows chosen after the data are acquired. Time windows can thus be selected to produce energy spectra that yield the most statistically significant and accurate elemental composition results derivable from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
Iterated local search algorithm for solving the orienteering problem with soft time windows.
Aghezzaf, Brahim; Fahim, Hassan El
2016-01-01
In this paper we study the orienteering problem with time windows (OPTW) and the impact of relaxing the time windows on the profit collected by the vehicle. The orienteering problem with soft time windows (OPSTW) studied here relaxes the time windows through a late-service scheme that permits serving customers after their windows close, at a linearly increasing penalty. We solve this problem heuristically with a hybrid iterated local search. The computational study shows that the proposed approach achieves promising solutions on the OPTW test instances available in the literature; one new best solution is found. On the newly generated OPSTW test instances, the results show that the OPSTW collects more profit than the OPTW.
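The soft-time-window objective described above (late service allowed at a linear profit penalty) can be sketched as a route-scoring function of the kind a local-search heuristic would evaluate repeatedly; the coordinates, profits, windows, and penalty weight below are made-up illustrative data, not the paper's benchmark instances.

```python
# Hedged sketch of OPSTW route scoring: service may start after a customer's
# window closes, at a linear profit penalty; early arrivals wait.
nodes = {  # id: (x, y, profit, open, close, service_time)
    0: (0, 0, 0, 0, 100, 0),    # depot
    1: (2, 1, 10, 5, 15, 2),
    2: (5, 4, 20, 10, 25, 2),
    3: (1, 6, 15, 20, 30, 2),
}

def dist(a, b):
    (xa, ya), (xb, yb) = nodes[a][:2], nodes[b][:2]
    return ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5

def route_profit(route, penalty=1.0, horizon=100.0):
    """Collected profit minus linear lateness penalties; None if over horizon."""
    t, total = 0.0, 0.0
    for prev, cur in zip(route, route[1:]):
        t += dist(prev, cur)
        _, _, profit, t_open, t_close, service = nodes[cur]
        t = max(t, t_open)                                 # wait for the window
        total += profit - penalty * max(0.0, t - t_close)  # soft lateness
        t += service
    return total if t <= horizon else None

print(route_profit([0, 1, 2, 3, 0]))
```

An iterated local search would wrap this scorer with insertion, removal, and exchange moves plus a perturbation step; the scorer alone shows where the "soft" relaxation enters the objective.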
Window and Overlap Processing Effects on Power Estimates from Spectra
NASA Astrophysics Data System (ADS)
Trethewey, M. W.
2000-03-01
Fast Fourier transform (FFT) spectral processing is based on the assumption of stationary ergodic data. In engineering practice, the assumption is often violated and non-stationary data are processed. Data windows are commonly used to reduce leakage by decreasing the signal amplitudes near the boundaries of the discrete samples. With certain combinations of non-stationary signals and windows, the temporal weighting may attenuate important signal characteristics and adversely affect any subsequent processing. In other words, the window artificially reduces a significant section of the time signal. Consequently, spectra and overall power estimated from the affected samples are unreliable. FFT processing can be particularly problematic when the signal consists of randomly occurring transients superimposed on a more continuous signal. Overlap processing is commonly used in this situation to improve the estimates. However, the results again depend on the temporal character of the signal in relation to the window weighting. A worst-case scenario, a short-duration half-sine pulse, is used to illustrate the relationship between overlap percentage and resulting power estimates. The power estimates are shown to depend on the temporal behaviour of the square of overlapped window segments. An analysis shows that power estimates may be obtained to within 0.27 dB for the following window and overlap combinations: rectangular (0% overlap), Hanning (62.5% overlap), Hamming (60.35% overlap) and flat-top (82.25% overlap).
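The dependence of transient power estimates on the squared, overlapped window weighting can be probed with a short sketch: sum w² across overlapped segments and measure the ripple of the resulting weighting curve. A short pulse landing in a dip of that curve is under-weighted, so the ripple bounds the worst-case error. The window length and overlap grid are arbitrary illustrative choices.

```python
import numpy as np

N = 1024                      # window length in samples (illustrative)
hann = np.hanning(N)

def overlapped_power_weight(window, overlap):
    """dB ripple of the summed squared window weight under overlap processing."""
    hop = int(round(len(window) * (1.0 - overlap)))
    total = len(window) + 10 * hop             # enough hops for steady state
    w2 = np.zeros(total)
    for start in range(0, total - len(window) + 1, hop):
        w2[start:start + len(window)] += window ** 2
    mid = w2[len(window):total - len(window)]  # discard edge transients
    return 10.0 * np.log10(mid.max() / mid.min())

for ov in (0.5, 0.625, 0.75):
    print(f"Hanning, {ov:.1%} overlap: "
          f"ripple = {overlapped_power_weight(hann, ov):.2f} dB")
```

The sketch reproduces the qualitative point of the abstract: the Hanning window at 62.5% overlap gives a much flatter squared-weight curve, and hence tighter power estimates, than the same window at 50% overlap.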
Windows Terminal Servers Orchestration
NASA Astrophysics Data System (ADS)
Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim
2017-10-01
Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.
Inter-pulse high-resolution gamma-ray spectra using a 14 MeV pulsed neutron generator
Evans, L.G.; Trombka, J.I.; Jensen, D.H.; Stephenson, W.A.; Hoover, R.A.; Mikesell, J.L.; Tanner, A.B.; Senftle, F.E.
1984-01-01
A neutron generator pulsed at 100 s⁻¹ was suspended in an artificial borehole containing a 7.7 metric ton mixture of sand, aragonite, magnetite, sulfur, and salt. Two Ge(HP) gamma-ray detectors were used: one in a borehole sonde, and one at the outside wall of the sample tank opposite the neutron generator target. Gamma-ray spectra were collected by the outside detector during each of 10 discrete time windows during the 10 ms period following the onset of gamma-ray build-up after each neutron burst. The sample was measured first when dry and then when saturated with water. In the dry sample, gamma rays due to inelastic neutron scattering, neutron capture, and decay were counted during the first (150 μs) time window. Subsequently only capture and decay gamma rays were observed. In the wet sample, only neutron capture and decay gamma rays were observed. Neutron capture gamma rays dominated the spectrum during the period from 150 to 400 μs after the neutron burst in both samples, but decreased with time much more rapidly in the wet sample. A signal-to-noise ratio (S/N) analysis indicates that optimum conditions for neutron capture analysis occurred in the 350-800 μs window. A poor S/N in the first 100-150 μs is due to a large background continuum during the first time interval. Time gating can be used to enhance gamma-ray spectra, depending on the nuclides in the target material and the reactions needed to produce them, and should improve the sensitivity of in situ well logging. © 1984.
Early Warning for Large Magnitude Earthquakes: Is it feasible?
NASA Astrophysics Data System (ADS)
Zollo, A.; Colombelli, S.; Kanamori, H.
2011-12-01
The mega-thrust, Mw 9.0, 2011 Tohoku earthquake has re-opened the discussion among the scientific community about the effectiveness of Earthquake Early Warning (EEW) systems when applied to such large events. Many EEW systems are now under testing or development worldwide, and most of them are based on the real-time measurement of ground motion parameters in a few-second window after the P-wave arrival. Currently, we are using the initial peak displacement (Pd) and the predominant period (τc), among other parameters, to rapidly estimate the earthquake magnitude and damage potential. A well-known problem with the real-time estimation of magnitude is parameter saturation. Several authors have shown that the scaling laws between early warning parameters and magnitude are robust and effective up to magnitude 6.5-7; the correlation, however, has not yet been verified for larger events. The Tohoku earthquake occurred near the east coast of Honshu, Japan, on the subduction boundary between the Pacific and Okhotsk plates. The high-quality KiK-net and K-NET networks provided a large quantity of strong motion records of the mainshock, with wide azimuthal coverage both along the Japan coast and inland. More than 300 3-component accelerograms have been available, with epicentral distances ranging from about 100 km to more than 500 km. This earthquake thus presents an optimal case study for testing the physical bases of early warning and for investigating the feasibility of a real-time estimation of earthquake size and damage potential even for M > 7 earthquakes. In the present work we used the acceleration waveform data of the main shock for stations along the coast, up to 200 km epicentral distance. We measured the early warning parameters, Pd and τc, within different time windows, starting from 3 seconds and expanding the testing time window up to 30 seconds.
The aim is to verify the correlation of these parameters with peak ground velocity and magnitude, respectively, as a function of the length of the P-wave window. The entire rupture process of the Tohoku earthquake lasted more than 120 seconds, as shown by the source time functions obtained by several authors. When a 3-second window is used to measure Pd and τc, the result is an obvious underestimation of the event size and final PGV. However, as the time window increases up to 27-30 seconds, the measured values of Pd and τc become comparable with those expected for a magnitude M ≥ 8.5 earthquake, according to the τc vs. M and the PGV vs. Pd relationships obtained in a previous work. Since we did not observe any saturation effect for the predominant period and peak displacement measured within a 30-second P-wave window, we infer that, at least from a theoretical point of view, the estimation of earthquake damage potential through the early warning parameters is still feasible for large events, provided that a longer time window is used for parameter measurement. The off-line analysis of the Tohoku event records shows that reliable estimations of the damage potential could have been obtained 40-50 seconds after the origin time, by updating the measurements of the early warning parameters in progressively enlarged P-wave time windows from 3 to 30 seconds.
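The effect of progressively enlarging the P-wave window on Pd can be illustrated with a toy displacement trace whose amplitude keeps growing, standing in for a still-developing rupture. The sampling rate, envelope, and frequency below are assumptions for illustration, not Tohoku data.

```python
import numpy as np

fs = 100.0                                # samples per second (assumed)
t = np.arange(0.0, 120.0, 1.0 / fs)       # 120 s of signal after the P arrival
# Toy displacement trace: amplitude grows while the rupture is still developing.
disp = (1.0 - np.exp(-t / 40.0)) * np.sin(2.0 * np.pi * 0.5 * t)

pd = {}
for win_s in (3, 10, 30):                 # progressively enlarged P-wave windows
    n = int(win_s * fs)
    pd[win_s] = np.abs(disp[:n]).max()    # initial peak displacement Pd
    print(f"{win_s:2d} s window: Pd = {pd[win_s]:.3f}")
```

The 3-second window sees only the small beginning of the trace, so Pd (and any magnitude derived from it) is underestimated; enlarging the window lets the measurement catch up with the growing source, which is the mechanism the abstract describes.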
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
[Only fragments of the report text survive: the Down-Bottom-Up Domain Analysis Process (1990 version), an overview of FODA's domain analysis process, a question about FODA's use of the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984), the observation that domain analysis is still immature, and an illustration of the FODA process using the window management domain.]
Data Stream Mining Based Dynamic Link Anomaly Analysis Using Paired Sliding Time Window Data
2014-11-01
USDA-ARS?s Scientific Manuscript database
The Optimal Ranking Regime (ORR) method was used to identify intra- to multi-decadal (IMD) time windows containing significant ranking sequences in U.S. climate division temperature data. The simplicity of the ORR procedure’s output – a time series’ most significant non-overlapping periods of high o...
A note on windowing for the waveform relaxation
NASA Technical Reports Server (NTRS)
Zhang, Hong
1994-01-01
The technique of windowing has often been used in the implementation of waveform relaxation for solving ODEs or time-dependent PDEs. Its efficiency depends upon problem stiffness and operator splitting. Using model problems, estimates for the window length and convergence rate are derived. The effectiveness of windowing is then investigated for non-stiff and stiff cases respectively. It concludes that for the former, windowing is highly recommended when a large discrepancy exists between the convergence rate on a time interval and the ones on its subintervals. For the latter, windowing does not provide any computational advantage if machine features are disregarded. The discussion is supported by experimental results.
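A minimal sketch of windowed waveform relaxation on a model problem follows, assuming a Jacobi splitting and explicit Euler integration inside each window; the matrix, step size, sweep count, and window length are illustrative choices, not the note's model problems.

```python
import numpy as np

# Windowed Jacobi waveform relaxation for x' = A x. Splitting the interval
# into windows helps because the relaxation's contraction factor grows with
# window length.
A = np.array([[-2.0, 1.0], [1.0, -2.0]])
dt = 0.001
x0 = np.array([1.0, 0.0])

def relax_window(x_start, steps, sweeps=8):
    """Jacobi waveform relaxation with explicit Euler inside one window."""
    x = np.tile(x_start, (steps + 1, 1))          # initial waveform guess
    off = A - np.diag(A.diagonal())
    for _ in range(sweeps):
        xn = np.empty_like(x)
        xn[0] = x_start
        for k in range(steps):
            # diagonal term uses the new iterate, coupling uses the old one
            xn[k + 1] = xn[k] + dt * (A.diagonal() * xn[k] + off @ x[k])
        x = xn
    return x

# March four 1-second windows, restarting the relaxation from each window's end.
state, T_win = x0, 1.0
for _ in range(4):
    traj = relax_window(state, int(T_win / dt))
    state = traj[-1]

# Exact solution at t = 4 from the eigen-decomposition of A (modes -1 and -3).
exact = 0.5 * np.array([np.exp(-4.0) + np.exp(-12.0),
                        np.exp(-4.0) - np.exp(-12.0)])
print(state, exact)
```

Each window converges in a handful of sweeps because the coupling acts over only one second at a time; relaxing over the full four-second interval at once would need noticeably more sweeps for the same accuracy.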
SSVEP recognition using common feature analysis in brain-computer interface.
Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej
2015-04-15
Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) applications. Although the CCA method outperforms traditional power spectral density analysis through multi-channel detection, it additionally requires pre-constructed reference signals of sine-cosine waves. It is prone to overfitting when a short time window is used, since the reference signals include no features from the training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency in the same subject should share common features that bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method that exploits the latent common features as natural reference signals in using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and MCCA methods for SSVEP recognition when using a short time window (i.e., less than 1 s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.
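The baseline CCA recognizer that the abstract contrasts against can be sketched in a few lines: correlate a multichannel window with sine-cosine references at each candidate frequency and pick the best. The sampling rate, channel count, amplitudes, and candidate frequency set below are assumptions for the demonstration.

```python
import numpy as np

def max_canon_corr(X, Y):
    """Largest canonical correlation between two column-centered data blocks."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

fs, T = 250, 250                       # 1 s window at 250 Hz (assumed)
t = np.arange(T) / fs
rng = np.random.default_rng(2)
target = 12.0                          # true stimulus frequency in Hz
# 8-channel synthetic "EEG": a 12 Hz SSVEP component plus noise.
eeg = (np.sin(2 * np.pi * target * t)[:, None] * rng.uniform(0.5, 1.0, 8)
       + rng.standard_normal((T, 8)))

def refs(f, harmonics=2):
    """Sine-cosine reference signals at f and its harmonics."""
    return np.column_stack([fn(2 * np.pi * h * f * t)
                            for h in range(1, harmonics + 1)
                            for fn in (np.sin, np.cos)])

scores = {f: max_canon_corr(eeg, refs(f)) for f in (8.0, 10.0, 12.0, 15.0)}
print(max(scores, key=scores.get))     # frequency with the highest correlation
```

The proposed CFA method keeps this correlation framework but replaces the fixed sine-cosine references with features learned from the subject's own training trials, which is what reduces overfitting in short windows.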
Jia, Tongying; Yuan, Huiyun
2017-04-12
Many large-scale public hospitals have established branch hospitals in China. This study provides evidence for strategic decisions on the management and development of multi-branch hospitals by evaluating and comparing the operational efficiencies of different hospitals before and after their establishment of branch hospitals. DEA (Data Envelopment Analysis) window analysis was performed on a 7-year data pool from five public hospitals provided by health authorities and institutional surveys. The operational efficiencies of the sample hospitals measured in this study (including technical efficiency, pure technical efficiency and scale efficiency) showed an overall upward trend during this 7-year period; however, a temporary downturn occurred shortly after the establishment of branch hospitals. Pure technical efficiency contributed more to the improvement of technical efficiency than scale efficiency did. The establishment of branch hospitals did not lead to a long-term negative effect on hospital operational efficiencies. Our data indicate the importance of improving scale efficiency via the optimization of organizational management, as well as the advantage of a different form of branch establishment, namely merging and reorganization. This study brings insight into the practical application of DEA window analysis to the assessment of hospital operational efficiencies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Geun Young; Steemers, Koen
2010-07-15
This paper investigates occupant behaviour of window-use in night-time naturally ventilated offices on the basis of a pilot field study, conducted during the summers of 2006 and 2007 in Cambridge, UK, and then demonstrates the effects of employing night-time ventilation on indoor thermal conditions using predictive models of occupant window-use. A longitudinal field study shows that occupants make good use of night-time natural ventilation strategies when provided with openings that allow secure ventilation, and that there is a noticeable time-of-day effect in window-use patterns (i.e. increased probability of action on arrival and departure). We develop logistic models of window-use for night-time naturally ventilated offices, which are subsequently applied to a behaviour algorithm, including Markov chains and Monte Carlo methods. The simulations using the behaviour algorithm demonstrate a good agreement with the observational data of window-use, and reveal how building design and occupant behaviour collectively affect the thermal performance of offices. They illustrate that the provision of secure ventilation leads to more frequent use of the window, and thus contributes significantly to the achievement of a comfortable indoor environment during the daytime occupied period. For example, the maximum temperature for a night-time ventilated office is found to be 3 °C below the predicted value for a daytime-only ventilated office. (author)
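A hedged sketch of the kind of behaviour algorithm described above: a logistic window-opening probability driven by indoor temperature, embedded in a Markov chain and simulated by Monte Carlo. The coefficients, the closing probability, and the temperature series are illustrative assumptions, not the paper's fitted estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

def p_open(t_in, b0=-10.0, b1=0.4):
    """Logistic probability of opening the window at indoor temperature t_in."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * t_in)))

def simulate(hours, t_in, p_close=0.05):
    """Markov chain over hours: closed->open with p_open(T), open->closed
    with a small constant probability (illustrative)."""
    state = np.zeros(hours, dtype=bool)          # False = closed
    for h in range(1, hours):
        if state[h - 1]:
            state[h] = rng.random() >= p_close
        else:
            state[h] = rng.random() < p_open(t_in[h])
    return state

temps = 20 + 8 * np.sin(np.linspace(0, 2 * np.pi, 24 * 30))  # a month of hours
open_frac = simulate(temps.size, temps).mean()
print(f"fraction of hours with window open: {open_frac:.2f}")
```

Running many such Monte Carlo chains against a thermal model is how a behaviour algorithm of this type couples occupant action to predicted indoor temperature.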
NASA Astrophysics Data System (ADS)
Kang, Jae-sik; Oh, Eun-Joo; Bae, Min-Jung; Song, Doo-Sam
2017-12-01
Given that the Korean government is implementing the energy standards and labelling program for windows, window companies will be required to assign window ratings based on the experimental results for their products. Because this adds to the cost and time required for laboratory tests by window companies, a simulation system for the thermal performance of windows has been prepared to reduce these time and cost burdens. In Korea, the thermal performance of a window is usually calculated with WINDOW/THERM, complying with ISO 15099. For a single window, the simulation results are similar to experimental results. A double window is also calculated using the same method, but the calculation results for this type of window are unreliable. ISO 15099 does not provide a recommended method for calculating the thermal properties of the air cavity between the window sashes of a double window. This causes a difference between simulation and experimental results for the thermal performance of a double window. In this paper, the thermal properties of air cavities between window sashes in a double window are analyzed through computational fluid dynamics (CFD) simulations, with the results compared to calculation results certified by ISO 15099. The surface temperature of the air cavity analyzed by CFD is compared to the experimental temperatures. These results show that an appropriate calculation method for an air cavity between window sashes in a double window should be established to obtain reliable thermal performance results for a double window.
Process Flow Features as a Host-Based Event Knowledge Representation
2012-06-14
…an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from… [Remaining fragments are list-of-figures residue: Davies-Bouldin and Dunn index results for sliding windows of 5, 10, and 20 on Windows 7.]
Eye movement evidence for defocused attention in dysphoria--a perceptual span analysis.
Brzezicka, Aneta; Krejtz, Izabela; von Hecker, Ulrich; Laubrock, Jochen
2012-07-01
The defocused attention hypothesis (von Hecker and Meiser, 2005) assumes that negative mood broadens attention, whereas the analytical rumination hypothesis (Andrews and Thompson, 2009) suggests a narrowing of the attentional focus with depression. We tested these conflicting hypotheses by directly measuring the perceptual span in groups of dysphoric and control subjects using eye tracking. In the moving window paradigm, information outside of a variable-width gaze-contingent window was masked during the reading of sentences. In measures of sentence reading time and mean fixation duration, dysphoric subjects were more strongly affected than controls by a reduced window size. This difference supports the defocused attention hypothesis and seems hard to reconcile with a narrowing of attentional focus. Copyright © 2011 Elsevier B.V. All rights reserved.
Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Curlett, Brian P.
1994-01-01
XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields for entering each variable's value, along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.
NASA Astrophysics Data System (ADS)
Elias, E.; Rango, A.; James, D.; Maxwell, C.; Anderson, J.; Abatzoglou, J. T.
2016-12-01
Researchers evaluating climate projections across southwestern North America observed a decreasing precipitation trend. Aridification was most pronounced in the cold (non-monsoonal) season, whereas downward trends in precipitation were smaller in the warm (monsoonal) season. In this region, based upon a multimodel mean of 20 Coupled Model Intercomparison Project 5 models using a business-as-usual (Representative Concentration Pathway 8.5) trajectory, midcentury precipitation is projected to increase slightly during the monsoonal time period (July-September; 6%) and decrease slightly during the remainder of the year (October-June; -4%). We use observed long-term (1915-2015) monthly precipitation records from 16 weather stations to investigate how well measured trends corroborate climate model predictions during the monsoonal and non-monsoonal timeframe. Running trend analysis using the Mann-Kendall test for 15 to 101 year moving windows reveals that half the stations showed significant (p≤0.1), albeit small, increasing trends based on the longest term record. Trends based on shorter-term records reveal a period of significant precipitation decline at all stations representing the 1950s drought. Trends from 1930 to 2015 reveal significant annual, monsoonal and non-monsoonal increases in precipitation (Fig 1). The 1960 to 2015 time window shows no significant precipitation trends. The more recent time window (1980 to 2015) shows a slight, but not significant, increase in monsoonal precipitation and a larger, significant decline in non-monsoonal precipitation. GCM precipitation projections are consistent with more recent trends for the region. Running trends from the most recent time window (mid-1990s to 2015) at all stations show increasing monsoonal precipitation and decreasing Oct-Jun precipitation, with significant trends at 6 of 16 stations. 
Running trend analysis revealed that the long-term trends were not persistent throughout the series length, but depended on the period examined. Recent trends in Southwest precipitation are directionally consistent with anthropogenic climate change.
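The running Mann-Kendall trend analysis described above can be sketched on toy precipitation data as follows; the no-ties normal approximation is used, and the window length, step, and synthetic trend are assumptions, not the station records.

```python
import numpy as np

def mann_kendall_z(x):
    """Mann-Kendall trend test statistic (normal approximation, no ties)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var)
    if s < 0:
        return (s + 1) / np.sqrt(var)
    return 0.0

rng = np.random.default_rng(4)
years = np.arange(1915, 2016)
# Toy annual precipitation with a weak upward trend buried in noise.
precip = 300 + 0.3 * (years - 1915) + rng.normal(0, 15, years.size)

# Running trend analysis: slide a fixed-length window through the series.
window = 30
for start in range(0, years.size - window + 1, 10):
    z = mann_kendall_z(precip[start:start + window])
    print(f"{years[start]}-{years[start + window - 1]}: z = {z:+.2f}")
```

Comparing the windowed z values against the full-series value makes the abstract's central point visible: the sign and significance of a trend can depend strongly on the period examined.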
Seismic Canvas: Evolution as a Data Exploration and Analysis Tool
NASA Astrophysics Data System (ADS)
Kroeger, G. C.
2015-12-01
SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and miniSEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of the trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method is described for identifying pulsations in time series of magnetic field data that are simultaneously present in multiple channels of data at one or more sensor locations. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km to the south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations only observed in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
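The two-stage scheme above (store the training event's dominant covariance eigenvector, then slide a window and flag positions whose dominant eigenvector aligns with it) can be sketched on synthetic three-component data; the pulse shape, polarization direction, and noise level are assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(5)

def dominant_eigvec(X):
    """Dominant eigenvector of the time-domain covariance of X (T x channels)."""
    X = X - X.mean(0)
    _, v = np.linalg.eigh(X.T @ X)   # eigh returns ascending eigenvalues
    return v[:, -1]

# Synthetic 3-component magnetometer data with one polarized pulse embedded.
T, W = 5000, 200
pol = np.array([0.8, 0.5, 0.3])
pol /= np.linalg.norm(pol)
data = 0.1 * rng.standard_normal((T, 3))
pulse = np.sin(2 * np.pi * np.arange(W) / 40)
data[3000:3000 + W] += np.outer(pulse, pol)

# Stage 1: "training event" yields the stored dominant eigenvector.
train = np.outer(pulse, pol) + 0.1 * rng.standard_normal((W, 3))
train_vec = dominant_eigvec(train)

# Stage 2: slide a window and score eigenvector alignment (|cosine|, to
# absorb the sign ambiguity of eigenvectors).
scores = np.array([abs(dominant_eigvec(data[t:t + W]) @ train_vec)
                   for t in range(0, T - W, 50)])
print(50 * int(np.argmax(scores)))   # highest-scoring window start
```

In the real algorithm the score would be thresholded to flag matching windows across many channels and stations rather than reduced to a single argmax.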
Rivera, Diego; Lillo, Mario; Granda, Stalin
2014-12-01
The concept of time stability has been widely used in the design and assessment of soil moisture monitoring networks, as well as in hydrological studies, because it is a technique that allows the identification of particular locations that represent mean values of soil moisture in the field. In this work, we assess how time stability calculations change as new information is added, and how they are affected over shorter periods, subsampled from the original time series, containing different amounts of precipitation. To do so, we defined two experiments to explore time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data on the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points, each equipped with four sensors, in a 2-ha blueberry field) show that as information is added to the time series, there are changes in the location of the most stable point (MSP), and that, taking the moving 21-day windows, most of the variability of soil water content changes is associated with both the amount and intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations.
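Time-stability ranking is commonly computed from mean relative differences (MRD); a minimal sketch on synthetic data follows. The selection rule combining |MRD| with its temporal spread is one common convention, not necessarily the authors' exact criterion, and the site biases and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# 120 days of soil water content at 7 locations (synthetic; biases assumed).
n_days, n_loc = 120, 7
field_mean = 0.25 + 0.05 * rng.standard_normal(n_days)
offsets = rng.uniform(-0.04, 0.04, n_loc)      # persistent site biases
theta = (field_mean[:, None] + offsets
         + 0.005 * rng.standard_normal((n_days, n_loc)))

# Relative difference of each site from the spatial mean at each time step.
spatial_mean = theta.mean(axis=1, keepdims=True)
rel_diff = (theta - spatial_mean) / spatial_mean

mrd = rel_diff.mean(axis=0)     # mean relative difference per site
sdrd = rel_diff.std(axis=0)     # its temporal standard deviation

# A common selection rule: small |MRD| with small spread identifies the MSP.
most_stable = int(np.argmin(np.abs(mrd) + sdrd))
print(f"most stable location: {most_stable}")
```

Recomputing `mrd` over moving windows, as in the paper's second experiment, shows how the identity of the MSP can shift with the precipitation contained in each window.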
Model MTF for the mosaic window
NASA Astrophysics Data System (ADS)
Xing, Zhenchong; Hong, Yongfeng; Zhang, Bao
2017-10-01
An electro-optical targeting system, mounted either within an airframe or housed in a separate pod, requires a window to form an environmental barrier to the outside world. In current practice, such windows are usually mosaic or segmented. When scanning a target, internally gimbaled systems sweep over the window, which can affect the modulation transfer function (MTF) due to wave-front division and optical path differences arising from the thickness/wedge differences between panes. In this paper, a mathematical model of the MTF of the mosaic window is presented that allows an analysis of influencing factors; we show how the model may be integrated into ZEMAX® software for optical design. The model can be used to guide both the design and the tolerance analysis of optical systems that employ a mosaic window.
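One way to see how a phase step between panes degrades the MTF is to build a split pupil with an optical path difference (OPD) and take the normalized pupil autocorrelation. This numpy sketch is a generic Fourier-optics illustration, not the paper's model; the aperture geometry and OPD value are assumptions.

```python
import numpy as np

N = 256
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
# Aperture radius 0.5 leaves zero padding around the pupil, avoiding
# circular-correlation wraparound in the FFT-based autocorrelation.
pupil = (x**2 + y**2 <= 0.25).astype(complex)

opd_waves = 0.25                         # OPD between the two panes, in waves
phase = np.where(x > 0, 2 * np.pi * opd_waves, 0.0)
pupil_mosaic = pupil * np.exp(1j * phase)

def mtf(p):
    """MTF = normalized magnitude of the pupil autocorrelation (via FFTs)."""
    psf = np.abs(np.fft.fftshift(np.fft.fft2(p))) ** 2
    otf = np.abs(np.fft.fft2(psf))
    return np.fft.fftshift(otf) / otf.max()

m_ideal = mtf(pupil)
m_mosaic = mtf(pupil_mosaic)
drop = m_ideal[N // 2] - m_mosaic[N // 2]   # loss along one frequency axis
print(f"max MTF loss along axis: {drop.max():.3f}")
```

Varying `opd_waves` shows the basic behavior the paper's model captures: the MTF penalty grows with the wedge/thickness mismatch between panes and vanishes as the OPD approaches an integer number of waves.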
Is Latency to Test Deadline a Predictor of Student Test Performance?
ERIC Educational Resources Information Center
Landrum, R. Eric; Gurung, Regan A. R.
2013-01-01
When students are given a period or window of time to take an exam, is taking an exam earlier in the window (high latency to deadline) related to test scores? In Study 1, students (n = 236) were given windows of time to take online each of 13 quizzes and 4 exams. In Study 2, students (n = 251) similarly took 4 exams online within a test window. In…
Integrated Structural Analysis and Test Program
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2005-01-01
An integrated structural-analysis and structure-testing computer program is being developed in order to: automate repetitive processes in testing and analysis; accelerate pre-test analysis; accelerate reporting of tests; facilitate planning of tests; improve execution of tests; create a vibration, acoustics, and shock test database; and integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls; there is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program proceeds to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.
Nondestructive analysis of fuel pins
Stepan, I.E.; Allard, N.P.; Suter, C.R.
1972-11-03
Disclosure is made of a method and a correspondingly adapted facility for the nondestructive analysis of the concentration of fuel and poison in a nuclear reactor fuel pin. The concentrations of fuel and poison in successive sections along the entire length of the fuel pin are determined by measuring the reactivity of a thermal reactor as each successive small section of the fuel pin is exposed to the neutron flux of the reactor core and comparing the measured reactivity with the reactivities measured for standard fuel pins having various known concentrations. Only a small section of the length of the fuel pin is exposed to the neutron flux at any one time, while the remainder of the fuel pin is shielded from the neutron flux. In order to expose only a small section at any one time, a boron-10-lined dry traverse tube is passed through the test region within the core of a low-power thermal nuclear reactor which has a very high fuel sensitivity. A narrow window in the boron-10 lining is positioned at the core center line. The fuel pins are then systematically traversed through the tube past the narrow window such that successive small sections along the length of the fuel pin are exposed to the neutron flux which passes through the narrow window.
Weibull Analysis and Area Scaling for Infrared Window Materials (U)
2016-08-01
This report illustrates how the strength of an optical window scales inversely with the size of the window, in the absence of slow crack growth. Test data are given for aluminum oxynitride (ALON), calcium fluoride, chemical vapor... The report was reviewed for technical accuracy by Howard Poisl, Thomas M...
In Situ Optical Observation of High-Temperature Geological Processes With the Moissanite Cell
NASA Astrophysics Data System (ADS)
Walte, N.; Keppler, H.
2005-12-01
A major drawback of existing techniques in experimental earth and material sciences is the inability to observe ongoing high-temperature processes in situ during an experiment. Examples of important time-dependent processes include the textural development of rocks and oxide systems during melting and crystallization, solid-state and melt-present recrystallization and Ostwald ripening, and bubble nucleation and growth during degassing of glasses and melts. The investigation of these processes by post-mortem analysis of a quenched microstructure is time consuming and often unsatisfactory. Here, we introduce the moissanite cell, which allows optical in situ observation of long-term experiments at high temperatures. Moissanite is a transparent gem-quality type of SiC that is characterized by its hardness and superior chemical and thermal resistance. Two moissanite windows with a thickness and diameter of several millimeters are placed into sockets of fired pyrophyllite and fixed onto two opposite metal plates. The sockets are wrapped with heating wire and each window is connected to a thermocouple for temperature control. The sample is placed directly between the moissanite windows and the cell is assembled similarly to a large diamond anvil cell. In situ observation of the sample is done with a microscope through observation windows, and movies are recorded with an attached digital camera. Our experiments with the new cell show that temperatures above 1200°C can be maintained and observed in a sample for several days without damaging either the cell or the windows. Time-lapse movies of melting and crystallizing natural and synthetic rocks and of degassing glasses and melts will be presented to show the potential of the new technique for experimental earth and material science.
Dass, Jasmita; Gupta, Aastha; Mittal, Suchi; Saraf, Amrita; Langer, Sabina; Bhargava, Manorama
2017-06-01
Cation exchange-high performance liquid chromatography (CE-HPLC) is most commonly used to evaluate hemoglobin (Hb) variants, which elute in the Hb A2 window. This study aimed to assess the prevalence of an uncommon Hb variant, Hb D-Iran, and compare its red cell parameters and peak characteristics with those of Hb E, which commonly elutes in the Hb A2 window. Generally, we assess abnormal Hb using CE-HPLC as the primary technique along with alkaline and acid electrophoresis. All cases with Hb A2 window >9%, as assessed by CE-HPLC during 2009-2013, were selected. Twenty-nine cases with the Hb D-Iran variant were identified: 25 heterozygous, 2 homozygous, 1 compound heterozygous Hb D-Iran/β-thalassemia, and 1 Hb D-Iran/Hb D-Punjab. The overall prevalence of Hb D-Iran was 0.23%. Compared to patients with Hb E, those with Hb D-Iran had significantly higher Hb (12.1 vs. 11.3 g/dL, P = 0.03), MCV (82.4 vs. 76.4 fL, P = 0.0044), MCH (27.9 vs. 25.45 pg, P = 0.0006), and MCHC (33.9 vs. 33.3 g/dL, P = 0.0005). The amount of abnormal Hb (40.7 vs. 26.4%, P = 0.0001) was significantly higher, while the retention time (3.56 vs. 3.70 min, P = 0.0001) was significantly lower, in Hb D-Iran than in Hb E. The Hb D-Iran peak can easily be missed if the area and retention time of the Hb A2 window are not carefully analyzed. To distinguish between variants, careful analysis of peak area and retention time is sufficient in most cases and may be further confirmed by a second technique, alkaline electrophoresis.
NASA Astrophysics Data System (ADS)
Kim, Byung Soo; Lee, Woon-Seek; Koh, Shiegheun
2012-07-01
This article considers an inbound ordering and outbound dispatching problem for a single product in a third-party warehouse, where the demands are dynamic over a discrete and finite time horizon and, moreover, each demand has a time window in which it must be satisfied. Replenishing orders are shipped in containers, and the freight cost is proportional to the number of containers used. The problem is classified into two cases, i.e. the non-split demand case and the split demand case, and a mathematical model for each case is presented. An in-depth analysis of the models shows that they become very complicated and difficult to solve optimally as the problem size grows. Therefore, genetic algorithm (GA) based heuristic approaches are designed to solve the problems in a reasonable time. Finally, some computational experiments are conducted to validate and evaluate the algorithms.
Short-time fractional Fourier methods for the time-frequency representation of chirp signals.
Capus, Chris; Brown, Keith
2003-06-01
The fractional Fourier transform (FrFT) provides a valuable tool for the analysis of linear chirp signals. This paper develops two short-time FrFT variants which are suited to the analysis of multicomponent and nonlinear chirp signals. Outputs have similar properties to the short-time Fourier transform (STFT) but show improved time-frequency resolution. The FrFT is a parameterized transform with parameter, a, related to chirp rate. The two short-time implementations differ in how the value of a is chosen. In the first, a global optimization procedure selects one value of a with reference to the entire signal. In the second, a values are selected independently for each windowed section. Comparative variance measures based on the Gaussian function are given and are shown to be consistent with the uncertainty principle in fractional domains. For appropriately chosen FrFT orders, the derived fractional domain uncertainty relationship is minimized for Gaussian windowed linear chirp signals. The two short-time FrFT algorithms have complementary strengths demonstrated by time-frequency representations for a multicomponent bat chirp, a highly nonlinear quadratic chirp, and an output pulse from a finite-difference sonar model with dispersive change. These representations illustrate the improvements obtained in using FrFT based algorithms compared to the STFT.
NASA Astrophysics Data System (ADS)
Kazanskiy, Nikolay; Protsenko, Vladimir; Serafimovich, Pavel
2016-03-01
This research article presents an experiment implementing an image filtering task in the Apache Storm and IBM InfoSphere Streams stream data processing systems. The aim of the presented research is to show that these new technologies can be used effectively for sliding-window filtering of image sequences. The analysis of execution focused on two parameters: throughput and memory consumption. Profiling was performed on the CentOS operating system running on two virtual machines for each system. The experiment results showed that IBM InfoSphere Streams has an about 1.5 to 13.5 times lower memory footprint than Apache Storm, but can be about 2.0 to 2.5 times slower on real hardware.
Latency as a region contrast: Measuring ERP latency differences with Dynamic Time Warping.
Zoumpoulaki, A; Alsufyani, A; Filetti, M; Brammer, M; Bowman, H
2015-12-01
Methods for measuring onset latency contrasts are evaluated against a new method utilizing the dynamic time warping (DTW) algorithm. This new method allows latency to be measured across a region instead of at a single point. We use computer simulations to compare the methods' power and Type I error rates under different scenarios. We perform per-participant analysis for different signal-to-noise ratios and two sizes of window (broad vs. narrow). In addition, the methods are tested in combination with single-participant and jackknife average waveforms for different effect sizes, at the group level. DTW performs better than the other methods, being less sensitive to noise as well as to the placement and width of the selected window. © 2015 Society for Psychophysiological Research.
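The core of such a region-based contrast is the standard dynamic-programming DTW distance between two waveforms. A minimal sketch in plain Python/NumPy (the function name is hypothetical; this is the textbook algorithm, not the authors' exact implementation):

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the cheapest of the three admissible alignment steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

A shifted but otherwise identical pulse gets distance 0 under DTW (unlike the Euclidean distance), which is why the measure tolerates latency jitter within the analysis window.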
A 640-MHz 32-megachannel real-time polyphase-FFT spectrum analyzer
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Garyantes, M. F.; Grimm, M. J.; Charny, B.
1991-01-01
A polyphase fast Fourier transform (FFT) spectrum analyzer being designed for NASA's Search for Extraterrestrial Intelligence (SETI) Sky Survey at the Jet Propulsion Laboratory is described. By replacing the time domain multiplicative window preprocessing with polyphase filter processing, much of the processing loss of windowed FFTs can be eliminated. Polyphase coefficient memory costs are minimized by effective use of run length compression. Finite word length effects are analyzed, producing a balanced system with 8 bit inputs, 16 bit fixed point polyphase arithmetic, and 24 bit fixed point FFT arithmetic. Fixed point renormalization midway through the computation is seen to be naturally accommodated by the matrix FFT algorithm proposed. Simulation results validate the finite word length arithmetic analysis and the renormalization technique.
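The idea of replacing a multiplicative time-domain window with polyphase filter processing can be sketched in a few lines: the prototype filter taps are split across the input block, the weighted segments are folded (summed) branch-wise, and a single short FFT follows. This is an illustrative floating-point NumPy sketch, not the fixed-point FPGA design described above; the function name and argument layout are assumptions:

```python
import numpy as np

def polyphase_fft(block, taps, n_chan):
    """One output frame of an n_chan-channel polyphase-FFT analyzer.

    block: len(taps) input samples, where len(taps) is a multiple of n_chan.
    The tap-weighted segments are folded together before a single
    length-n_chan FFT, replacing the plain windowed FFT.
    """
    p = len(taps) // n_chan                        # taps per polyphase branch
    x = (block * taps).reshape(p, n_chan).sum(axis=0)
    return np.fft.fft(x)
```

With a single branch (taps of length n_chan, all ones) this reduces exactly to the plain unwindowed FFT; longer prototype filters sharpen the channel response and reduce the processing loss of windowed FFTs.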
2015-01-19
Proteus measurement and analysis software for data acquisition, storage, and evaluation on the MS WINDOWS platform, which enables multitasking with simultaneous evaluation and operation.
Letter-sound processing deficits in children with developmental dyslexia: An ERP study.
Moll, Kristina; Hasko, Sandra; Groth, Katharina; Bartling, Jürgen; Schulte-Körne, Gerd
2016-04-01
The time course during letter-sound processing was investigated in children with developmental dyslexia (DD) and typically developing (TD) children using electroencephalography. Thirty-eight children with DD and 25 TD children participated in a visual-auditory oddball paradigm. Event-related potentials (ERPs) elicited by standard and deviant stimuli in an early (100-190 ms) and late (560-750 ms) time window were analysed. In the early time window, ERPs elicited by the deviant stimulus were delayed and less left lateralized over fronto-temporal electrodes for children with DD compared to TD children. In the late time window, children with DD showed higher amplitudes extending more over right frontal electrodes. Longer latencies in the early time window and stronger right hemispheric activation in the late time window were associated with slower reading and naming speed. Additionally, stronger right hemispheric activation in the late time window correlated with poorer phonological awareness skills. Deficits in early stages of letter-sound processing influence later, more explicit cognitive processes during letter-sound processing. Identifying the neurophysiological correlates of letter-sound processing and their relation to reading-related skills provides insight into the degree of automaticity during letter-sound processing beyond behavioural measures of letter-sound knowledge. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
In situ laser annealing system for real-time surface kinetic analysis
NASA Astrophysics Data System (ADS)
Wang, Q.; Sun, Y.-M.; Zhao, W.; Campagna, J.; White, J. M.
2002-11-01
For real-time analysis during thermal annealing, a continuous wave CO2 infrared laser was coupled to a surface analysis system equipped for x-ray photoelectron spectroscopy (XPS) and ion scattering spectroscopy (ISS). The laser beam was directed into the vacuum chamber through a ZnSe window to the back side of the sample. With 10 W laser output, the sample temperature reached 563 K. The chamber remained below 10^-8 Torr during annealing and allowed XPS and ISS data to be gathered as a function of time at selected temperatures. As a test example, real-time Cu2O reduction at 563 K was investigated.
Liu, Shelley H; Bobb, Jennifer F; Lee, Kyu Ha; Gennings, Chris; Claus Henn, Birgit; Bellinger, David; Austin, Christine; Schnaas, Lourdes; Tellez-Rojo, Martha M; Hu, Howard; Wright, Robert O; Arora, Manish; Coull, Brent A
2018-07-01
The impact of neurotoxic chemical mixtures on children's health is a critical public health concern. It is well known that during early life, toxic exposures may impact cognitive function during critical time intervals of increased vulnerability, known as windows of susceptibility. Knowledge of time windows of susceptibility can help inform treatment and prevention strategies, as chemical mixtures may affect a developmental process that is operating at a specific life phase. There are several statistical challenges in estimating the health effects of time-varying exposures to multi-pollutant mixtures, such as multi-collinearity among the exposures both within time points and across time points, and complex exposure-response relationships. To address these concerns, we develop a flexible statistical method, called lagged kernel machine regression (LKMR). LKMR identifies critical exposure windows of chemical mixtures, and accounts for complex non-linear and non-additive effects of the mixture at any given exposure window. Specifically, LKMR estimates how the effects of a mixture of exposures change with the exposure time window using a Bayesian formulation of a grouped, fused lasso penalty within a kernel machine regression (KMR) framework. A simulation study demonstrates the performance of LKMR under realistic exposure-response scenarios, and demonstrates large gains over approaches that consider each time window separately, particularly when serial correlation among the time-varying exposures is high. Furthermore, LKMR demonstrates gains over another approach that inputs all time-specific chemical concentrations together into a single KMR. We apply LKMR to estimate associations between neurodevelopment and metal mixtures in Early Life Exposures in Mexico and Neurotoxicology, a prospective cohort study of child health in Mexico City.
Obtaining high-resolution velocity spectra using weighted semblance
NASA Astrophysics Data System (ADS)
Ebrahimi, Saleh; Kahoo, Amin Roshandel; Porsani, Milton J.; Kalateh, Ali Nejati
2017-02-01
Velocity analysis employs coherency measurement along a hyperbolic or non-hyperbolic trajectory within a time window to build velocity spectra. Accuracy and resolution are strictly related to the method of coherency measurement. Semblance, the most common coherence measure, has poor velocity resolution, which affects one's ability to distinguish and pick distinct peaks. Increasing the resolution of the semblance velocity spectra improves the accuracy of the velocities estimated for normal moveout correction and stacking. The low resolution of semblance spectra stems from its low sensitivity to velocity changes. In this paper, we present a new weighted semblance method that ensures high-resolution velocity spectra. To increase the resolution of the semblance spectra, we introduce into the semblance equation two weighting functions, one based on the ratio of the first to the second singular value of the time window and one based on the position of the seismic wavelet in the time window. We test the method on both synthetic and real field data to compare the resolution of the weighted and conventional semblance methods. Numerical examples with synthetic and real seismic data indicate that the proposed weighted semblance method provides higher resolution than conventional semblance and can separate reflectors that are mixed in the conventional semblance spectrum.
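Conventional semblance, the baseline the weighted method improves on, is the ratio of stacked-trace energy to total trace energy within the window. A minimal NumPy sketch (function name hypothetical):

```python
import numpy as np

def semblance(window):
    """Semblance of a (n_samples, n_traces) data window:
    energy of the stacked trace over total energy, in [0, 1]."""
    num = (window.sum(axis=1) ** 2).sum()          # energy after stacking
    den = window.shape[1] * (window ** 2).sum()    # total energy, scaled
    return num / den
```

Perfectly coherent traces give semblance 1, traces that cancel under stacking give 0; the broad plateau of values near the peak for slightly wrong velocities is exactly the low sensitivity the weighting functions are designed to sharpen.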
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, the manual tumor segmentation commonly used in the clinic is time-consuming and challenging, and none of the existing automated methods are highly robust, reliable, and efficient in clinical application. We developed an accurate and automated tumor segmentation method that provides reproducible and objective results close to those of manual segmentation. Exploiting the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and then horizontal sliding windows were applied: two windows, one in the left and one in the right part of the brain image, were moved simultaneously pixel by pixel while the correlation coefficient between them was calculated. The pair of windows with the minimal correlation coefficient was selected; the window with the larger average gray value marks the location of the tumor, and its pixel with the largest gray value serves as the tumor locating point. Finally, the segmentation threshold was set to the average gray value of the pixels in a square of side length 10 pixels centered at the locating point, and threshold segmentation and morphological operations were used to obtain the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. On average, the ratio of correct location was 93.4% for 575 slices containing tumor, the Dice similarity coefficient was 0.77 per scan, and the time spent on one scan was 40 seconds. A fully automated, simple, and efficient segmentation method for brain tumor is proposed and promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
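The symmetry-based localization step can be sketched as follows: mirrored windows slide over the two half-images, and the window pair with the lowest correlation coefficient flags where symmetry breaks down. This is a simplified single-axis illustration in NumPy (function name, window parameterization, and test values are all hypothetical), not the authors' full pipeline:

```python
import numpy as np

def least_symmetric_column(image, width):
    """Slide a window of given width simultaneously across the left half
    and the mirrored position in the right half; return the window start
    (in the left half) where the two sides correlate least."""
    h, w = image.shape
    mid = w // 2
    left = image[:, :mid]
    right = image[:, mid:][:, ::-1]              # mirror the right half
    best, best_r = 0, np.inf
    for s in range(mid - width + 1):
        a = left[:, s:s + width].ravel()
        b = right[:, s:s + width].ravel()
        r = np.corrcoef(a, b)[0, 1]              # Pearson correlation
        if r < best_r:
            best, best_r = s, r
    return best, best_r
```

In a symmetric image every window pair correlates perfectly; a one-sided bright region drags the correlation down only in the windows that cover it, which is what localizes the anomaly.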
Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing
NASA Astrophysics Data System (ADS)
LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.
2017-12-01
With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality studies and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area, and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues and natural events.
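A toy version of ranking anomalous time windows can be built from nothing more than a z-score of window means against the global statistics, in contrast to the paper's adaptive spatial-temporal model (function name and scoring rule are assumptions for illustration):

```python
import numpy as np

def rank_anomalous_windows(series, width):
    """Score each non-overlapping window by |window mean - global mean| / std
    and return window start indices, most anomalous first."""
    series = np.asarray(series, dtype=float)
    mu, sd = series.mean(), series.std()
    starts = range(0, len(series) - width + 1, width)
    scores = {s: abs(series[s:s + width].mean() - mu) / sd for s in starts}
    return sorted(scores, key=scores.get, reverse=True)
```

A real deployment would replace the global baseline with an evolving per-node model and cross-check neighboring nodes, but the window-scoring-and-ranking structure is the same.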
Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian
2018-01-01
Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user's credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method.
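The suspicious-time-window idea can be illustrated with a count-based sketch: bucket the rating timestamps of an item into fixed windows and flag windows whose rating volume far exceeds the per-window average, as a shilling burst would. This is a simplified stand-in for the detection structure described above (function name, window width, and threshold factor are hypothetical):

```python
from collections import Counter

def suspicious_windows(timestamps, width, factor=3.0):
    """Bucket rating timestamps into windows of `width`; return the start
    times of windows whose rating count exceeds `factor` times the mean
    per-window count, in ascending order."""
    buckets = Counter(t // width for t in timestamps)
    mean = sum(buckets.values()) / len(buckets)
    return sorted(w * width for w, c in buckets.items() if c > factor * mean)
```

Windows flagged this way would then be handed to the group-user credibility analysis to separate genuine popularity spikes from coordinated attacks.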
Multiscale field-aligned current analyzer
NASA Astrophysics Data System (ADS)
Bunescu, C.; Marghitu, O.; Constantinescu, D.; Narita, Y.; Vogt, J.; Blǎgǎu, A.
2015-11-01
The magnetosphere-ionosphere coupling is achieved, essentially, by a superposition of quasi-stationary and time-dependent field-aligned currents (FACs), over a broad range of spatial and temporal scales. The planarity of the FAC structures observed by satellite data and the orientation of the planar FAC sheets can be investigated by the well-established minimum variance analysis (MVA) of the magnetic perturbation. However, such investigations are often constrained to a predefined time window, i.e., to a specific scale of the FAC. The multiscale field-aligned current analyzer, introduced here, relies on performing MVA continuously and over a range of scales by varying the width of the analyzing window, appropriate for the complexity of the magnetic field signatures above the auroral oval. The proposed technique provides multiscale information on the planarity and orientation of the observed FACs. A new approach, based on the derivative of the largest eigenvalue of the magnetic variance matrix with respect to the length of the analysis window, makes possible the inference of the current structures' location (center) and scale (thickness). The capabilities of the FAC analyzer are explored analytically for the magnetic field profile of the Harris sheet and tested on synthetic FAC structures with uniform current density and infinite or finite geometry in the cross-section plane of the FAC. The method is illustrated with data observed by the Cluster spacecraft on crossing the nightside auroral region, and the results are cross checked with the optical observations from the Time History of Events and Macroscale Interactions during Substorms ground network.
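The building block of the analyzer is ordinary minimum variance analysis: eigen-decomposition of the magnetic variance matrix within the analysis window. A minimal NumPy sketch of a single-window MVA (the multiscale part, sweeping the window width and differentiating the largest eigenvalue, is omitted):

```python
import numpy as np

def mva(b):
    """Minimum variance analysis of an (n, 3) magnetic-field series.

    Returns the eigenvalues (ascending) and eigenvectors of the variance
    matrix; the minimum-variance eigenvector estimates the normal of a
    planar current sheet, and the eigenvalue ratios measure planarity.
    """
    b = np.asarray(b, dtype=float)
    d = b - b.mean(axis=0)              # remove the mean field
    m = d.T @ d / len(b)                # 3x3 magnetic variance matrix
    w, v = np.linalg.eigh(m)            # eigh: symmetric, ascending order
    return w, v
```

Repeating this for a range of window widths centered on each sample, and tracking how the largest eigenvalue grows with window length, is what lets the full analyzer infer the center and thickness of each FAC structure.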
Hardware Implementation of a Bilateral Subtraction Filter
NASA Technical Reports Server (NTRS)
Huertas, Andres; Watson, Robert; Villalpando, Carlos; Goldberg, Steven
2009-01-01
A bilateral subtraction filter has been implemented as a hardware module in the form of a field-programmable gate array (FPGA). In general, a bilateral subtraction filter is a key subsystem of a high-quality stereoscopic machine vision system that utilizes images that are large and/or dense. Bilateral subtraction filters have been implemented in software on general-purpose computers, but the processing speeds attainable in this way, even on computers containing the fastest processors, are insufficient for real-time applications. The present FPGA bilateral subtraction filter is intended to accelerate processing to real-time speed and to be a prototype of a link in a stereoscopic-machine-vision processing chain, now under development, that would process large and/or dense images in real time and would be implemented in an FPGA. In terms that are necessarily oversimplified for the sake of brevity, a bilateral subtraction filter is a smoothing, edge-preserving filter for suppressing low-frequency noise. The filter operation amounts to replacing the value for each pixel with a weighted average of the values of that pixel and the neighboring pixels in a predefined neighborhood or window (e.g., a 9×9 window). The filter weights depend partly on pixel values and partly on the window size. The present FPGA implementation of a bilateral subtraction filter utilizes a 9×9 window. This implementation was designed to take advantage of the ability to do many of the component computations in parallel pipelines to enable processing of image data at the rate at which they are generated. The filter can be considered to be divided into the following parts (see figure): (a) an image pixel pipeline with a 9×9-pixel window generator; (b) an array of processing elements; (c) an adder tree; (d) a smoothing-and-delaying unit; and (e) a subtraction unit. After each 9×9 window is created, the affected pixel data are fed to the processing elements.
Each processing element is fed the pixel value for its position in the window as well as the pixel value for the central pixel of the window. The absolute difference between these two pixel values is calculated and used as an address in a lookup table. Each processing element has a lookup table, unique for its position in the window, containing the weight coefficients of the Gaussian function for that position. The pixel value is multiplied by the weight, and the outputs of the processing element are the weight and the pixel-value/weight product. The products and weights are fed to the adder tree. The sum of the products and the sum of the weights are fed to the divider, which computes the sum of the products divided by the sum of the weights. The output of the divider is denoted the bilateral smoothed image. The smoothing function is a simple weighted average computed over a 3×3 subwindow centered in the 9×9 window. After smoothing, the image is delayed by the additional amount of time needed to match the processing time for computing the bilateral smoothed image. The bilateral smoothed image is then subtracted from the 3×3 smoothed image to produce the final output. The prototype filter as implemented in a commercially available FPGA processes one pixel per clock cycle. Operation at a clock speed of 66 MHz has been demonstrated, and results of a static timing analysis have been interpreted as suggesting that the clock speed could be increased to as much as 100 MHz.
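The filter's arithmetic can be sketched in software as a small NumPy stand-in for the FPGA pipeline: a 9×9 bilateral smooth (spatial Gaussian times an intensity-difference Gaussian, the latter playing the role of the lookup tables) subtracted from a 3×3 smooth of the same image. Function name, parameter values, and border handling are illustrative, and a plain 3×3 box mean replaces the weighted 3×3 average:

```python
import numpy as np

def bilateral_subtraction(img, radius=4, sigma_s=3.0, sigma_r=10.0):
    """3x3 box smooth minus (2*radius+1)^2 bilateral smooth, per pixel.
    Borders are left at zero for brevity."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))   # fixed weights
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            win = img[y - radius:y + radius + 1,
                      x - radius:x + radius + 1].astype(float)
            # range weights from the difference to the central pixel
            weights = spatial * np.exp(-(win - img[y, x])**2 / (2 * sigma_r**2))
            bilateral = (weights * win).sum() / weights.sum()
            box3 = img[y - 1:y + 2, x - 1:x + 2].mean()
            out[y, x] = box3 - bilateral
    return out
```

On a constant image both smooths agree and the output is zero everywhere, which matches the filter's role as a band-limited detail extractor: the subtraction keeps mid-frequency structure while the edge-preserving bilateral term suppresses low-frequency shading.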
A test of multiple correlation temporal window characteristic of non-Markov processes
NASA Astrophysics Data System (ADS)
Arecchi, F. T.; Farini, A.; Megna, N.
2016-03-01
We introduce a sensitive test of memory effects in successive events. The test consists of a combination K of binary correlations at successive times. For uncorrelated events, as in a Markov process, K decays monotonically from K = 1. For a monotonically fading memory, K < 1 always. Here we report evidence of a K > 1 temporal window in cognitive tasks consisting of the visual identification of the front face of the Necker cube after a previous presentation of the same. We speculate that memory effects provide a temporal window with K > 1, and this experiment could be a first step towards a better comprehension of this phenomenon. The K > 1 behaviour is maximal at an inter-measurement time τ of around 2 s, with inter-subject differences. The K > 1 behaviour persists over a time window of 1 s around τ; outside this window the K < 1 behaviour is recovered. The universal occurrence of a K > 1 window in pairs of successive perceptions suggests that, unlike a single visual stimulus eliciting a suitable response, a pair of stimuli shortly separated in time displays mutual correlations.
Li, Rongxia; Stewart, Brock; Weintraub, Eric
2016-01-01
The self-controlled case series (SCCS) and self-controlled risk interval (SCRI) designs have recently become widely used in the field of post-licensure vaccine safety monitoring to detect potential elevated risks of adverse events following vaccinations. The SCRI design can be viewed as a subset of the SCCS method in that a reduced comparison time window is used for the analysis. Compared to the SCCS method, the SCRI design has less statistical power due to fewer events occurring in the shorter control interval. In this study, we derived the asymptotic relative efficiency (ARE) between these two methods to quantify this loss in power in the SCRI design. The equation is formulated as [Formula: see text] (a: control window-length ratio between SCRI and SCCS designs; b: ratio of risk window length and control window length in the SCCS design; and [Formula: see text]: relative risk of exposed window to control window). According to this equation, the relative efficiency declines as the ratio of control-period length between SCRI and SCCS methods decreases, or with an increase in the relative risk [Formula: see text]. We provide an example utilizing data from the Vaccine Safety Datalink (VSD) to study the potential elevated risk of febrile seizure following seasonal influenza vaccine in the 2010-2011 season.
Muon catalyzed fusion beam window mechanical strength testing and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ware, A.G.; Zabriskie, J.M.
A thin aluminum window (0.127 mm (0.005 inch) thick x 146 mm (5 3/4 inch) diameter) of 2024-T6 alloy was modeled and analyzed using the ABAQUS non-linear finite element analysis code. A group of windows was fabricated, heat-treated, and subsequently tested. Testing included both ultimate burst pressure and fatigue. Fatigue testing cycles involved "oil-canning" behavior representing vacuum purge and reversal to pressure. Test results are compared to predictions and the mode of failure is discussed. Operational requirements for the actual beam windows, based on the above analysis and correlational testing, are also discussed.
WINDOWS: a program for the analysis of spectral data from foil activation measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stallmann, F.W.; Eastham, J.F.; Kam, F.B.K.
The computer program WINDOWS, together with its subroutines, is described for the analysis of neutron spectral data from foil activation measurements. In particular, it performs the unfolding of the neutron differential spectrum and provides estimated windows and detector contributions, upper and lower bounds for an integral response, and group fluxes obtained from neutron transport calculations.
Predonation screening of candidate donors and prevention of window period donations.
Lieshout-Krikke, Ryanne W; Zaaijer, Hans L; van de Laar, Thijs J W
2015-02-01
Infectious window period donations slip through routine donor screening procedures. To explore the potential value of predonation screening of candidate donors, we compared the proportion of incident transfusion-transmissible infections in candidate donors, in first-time donors, and in repeat donors. A retrospective analysis was performed of all incident hepatitis B virus (HBV), hepatitis C virus (HCV), and human immunodeficiency virus (HIV) infections in candidate, first-time, and repeat donors in the Netherlands during the period 2009 to 2013. In total, 176,716 candidate donors, 144,226 first-time donations, and 4,143,455 repeat donations were screened for HBV, HCV, and HIV infection. Acute HBV infection was identified in the predonation sample of six candidate donors. One first-time donor, testing HIV-negative at predonation screening, tested positive for anti-HIV and HIV RNA in the first donation 29 days later. Among repeat donations we identified 15, one, and six incident HBV, HCV and HIV infections, respectively. The proportion of incident infections among candidate donors/first-time donations/repeat donations was for HBV, 3.40/0/0.36; for HCV, 0/0/0.02; and for HIV 0/0.69/0.14 per 100,000, respectively. Predonation screening of candidate donors very likely causes a loss of donations, but it might prevent undetected window period donations. Further studies are necessary to determine the value of predonation screening as an additional safety measure. © 2014 AABB.
NASA Astrophysics Data System (ADS)
Kramer, J. L. A. M.; Ullings, A. H.; Vis, R. D.
1993-05-01
A real-time data acquisition system for microprobe analysis has been developed at the Free University of Amsterdam. The system is composed of two parts: a front-end real-time and a back-end monitoring system. The front-end consists of a VMEbus based system which reads out a CAMAC crate. The back-end is implemented on a Sun work station running the UNIX operating system. This separation allows the integration of a minimal, and consequently very fast, real-time executive within the sophisticated possibilities of advanced UNIX work stations.
Runtime Speculative Software-Only Fault Tolerance
2012-06-01
reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing
Yurtkuran, Alkın; Emel, Erdal
2014-01-01
The traveling salesman problem with time windows (TSPTW) is a variant of the traveling salesman problem in which each customer must be visited within a given time window. In this paper, we propose an electromagnetism-like algorithm (EMA) that uses a new constraint handling technique to minimize the travel cost in TSPTW problems. The EMA utilizes the attraction-repulsion mechanism between charged particles in a multidimensional space for global optimization. This paper investigates the problem-specific constraint handling capability of the EMA framework using a new variable bounding strategy, in which real-coded particles' boundary constraints, associated with the corresponding time windows of customers, are introduced and combined with a penalty approach to eliminate infeasibilities regarding time-window violations. The performance of the proposed algorithm and the effectiveness of the constraint handling technique have been studied extensively, comparing them to those of state-of-the-art metaheuristics using several sets of benchmark problems reported in the literature. The results of the numerical experiments show that the EMA generates feasible and near-optimal results within shorter computational times than the test algorithms.
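The constraint handling described above hinges on scoring candidate tours against customer time windows. A minimal sketch of such a feasibility-plus-penalty evaluation follows; the function name, instance layout, and penalty weight are illustrative assumptions, not the paper's EMA implementation (which additionally uses variable bounding on the particle encoding):

```python
def route_cost_with_penalty(route, travel, windows, service=0.0, penalty=1000.0):
    """Evaluate a TSPTW route: travel cost plus a penalty per time-window
    violation. Early arrivals wait until the window opens; late arrivals
    are counted as violations. route: visit order (node ids);
    travel: dict[(i, j)] -> travel time; windows: dict[node] -> (earliest, latest).
    All names and the penalty weight are illustrative."""
    t, cost, violations = 0.0, 0.0, 0
    for i, j in zip(route, route[1:]):
        cost += travel[(i, j)]
        t += travel[(i, j)]
        earliest, latest = windows[j]
        if t < earliest:       # arrived early: wait for the window to open
            t = earliest
        elif t > latest:       # arrived late: time-window violation
            violations += 1
        t += service
    return cost + penalty * violations, violations
```

A feasible tour returns its plain travel cost with zero violations, while an infeasible one is pushed away from the incumbent by the penalty term, which is the role the penalty approach plays in the EMA search.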
75 FR 11841 - Repowering Assistance Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... application window. SUMMARY: RBS is announcing a new application window to submit applications for the...-time application window for remaining FY 2009 funds. Paperwork Reduction Act In accordance with the... allocate all of the FY 2009 authorized funds. Therefore, the Agency is opening a new application window to...
Millisecond timing on PCs and Macs.
MacInnes, W J; Taylor, T L
2001-05-01
A real-time, object-oriented solution for displaying stimuli on Windows 95/98, MacOS, and Linux platforms is presented. The program, written in C++, utilizes a special-purpose window class (GLWindow), OpenGL, and 32-bit graphics acceleration; it avoids display timing uncertainty by substituting the new window class for the default window code on each system. We report the outcome of tests for real-time capability across PC and Mac platforms running a variety of operating systems. The test program, which can be used as a shell for programming real-time experiments and testing specific processors, is available at http://www.cs.dal.ca/~macinnwj. We aim to give researchers a sense of the usefulness of our program, highlight the ability of many multitasking environments to achieve real time, and caution users about systems that may not achieve real time, even under optimal conditions.
Exclusive queueing model including the choice of service windows
NASA Astrophysics Data System (ADS)
Tanaka, Masahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro
2018-01-01
In a queueing system involving multiple service windows, choice behavior is a significant concern. This paper incorporates the choice of service windows into a queueing model with a floor represented by discrete cells. We devised a logit-based choice algorithm in which agents consider the numbers of agents at, and the distances to, all service windows. Simulations were conducted with various parameters of agent choice preference for these two elements and for different floor configurations, including the floor length and the number of service windows. We investigated the model from the viewpoint of transit times and entrance block rates. The influences of the parameters on these factors were surveyed in detail, and we determined that there are optimum floor lengths that minimize the transit times. In addition, we observed that the transit times were determined almost entirely by the entrance block rates. The results of the presented model are relevant to understanding queueing systems that include the choice of service windows and can be employed to optimize facility design and floor management.
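The logit-based choice rule described above, with attraction falling as queue length and distance grow, can be sketched as follows. The coefficients and function names are assumptions for illustration, not parameters from the paper:

```python
import math
import random

def choose_window(queues, dists, beta_n=1.0, beta_d=0.5, rng=random):
    """Logit (softmax) choice among service windows: utility decreases with
    the number of queued agents (beta_n) and the distance to the window
    (beta_d). Returns the sampled window index and the choice probabilities.
    Coefficients are illustrative preference parameters."""
    utils = [-beta_n * n - beta_d * d for n, d in zip(queues, dists)]
    m = max(utils)
    weights = [math.exp(u - m) for u in utils]   # numerically stable softmax
    total = sum(weights)
    probs = [w / total for w in weights]
    r, acc = rng.random(), 0.0                   # sample from the distribution
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i, probs
    return len(probs) - 1, probs
```

With equal distances, the window with the shorter queue gets the higher choice probability, which is the qualitative behavior the model's preference parameters trade off against distance.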
DOT National Transportation Integrated Search
2009-04-01
This paper studies approximations to the average length of Vehicle Routing Problems (VRP). The approximations are valuable for strategic and : planning analysis of transportation and logistics problems. The research focus is on VRP with varying numbe...
Correlates of avian building strikes at a glass façade museum surrounded by avian habitat
NASA Astrophysics Data System (ADS)
Kahle, L.; Flannery, M.; Dumbacher, J. P.
2013-12-01
Bird window collisions are the second largest anthropogenic cause of bird deaths in the world. Effective mitigation requires an understanding of which birds are most likely to strike, when, and why. Here, we examine five years of avian window strike data from the California Academy of Sciences, a relatively new museum with a significant glass façade situated in Golden Gate Park, San Francisco. We examine correlates of window-killed birds, including age, sex, season, and migratory or sedentary tendencies of the birds. We also examine correlates of window kills such as the presence of habitat surrounding the building and overall window area. We found that males are almost three times more likely than females to mortally strike windows, and immature birds are three times more abundant than adults in our window kill dataset. Among seasons, strikes did not differ notably in spring, summer, and fall; however, they were notably reduced in winter. There was no statistical effect of building orientation (north, south, east, or west), and the presence of avian habitat directly adjacent to windows had a minor effect. We also report ongoing studies examining various efforts to reduce window kills (primarily external decals and large electronic window blinds). We hope that improving our understanding of the causes of window strikes will help us strategically reduce them.
Climate Exposure of US National Parks in a New Era of Change
Monahan, William B.; Fisichelli, Nicholas A.
2014-01-01
US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (<5th percentile or >95th percentile) climates relative to their 1901–2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measurable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change. PMID:24988483
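The core exposure test, ranking the mean over a recent moving time window within the full historical record, can be sketched roughly as below. This is an illustrative simplification of the paper's HRV analysis; the function name and the convention of ranking against all same-width rolling means are assumptions:

```python
def recent_percentile(series, window=30):
    """Percentile rank of the most recent `window`-year mean among all
    rolling means of the same width over the full record (e.g. 1901-2012),
    mirroring the test of whether recent climate is extreme (<5th or >95th
    percentile) relative to historical variability. Sketch only."""
    recent = sum(series[-window:]) / window
    means = [sum(series[i:i + window]) / window
             for i in range(len(series) - window + 1)]
    rank = sum(m <= recent for m in means)
    return 100.0 * rank / len(means)
```

Re-running this with window sizes of 10, 20, and 30 years is one simple way to probe the sensitivity to the moving time window that the abstract reports.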
Forensic Analysis of Windows Hosts Using UNIX-based Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cory Altheide
2004-07-19
Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.
Hosein, Riad B M; Mehta, Chetan; Stickley, John; Mcguirk, Simon P; Jones, Timothy J; Brawn, William J; Barron, David J
2007-11-01
A small sub-group of patients with hypoplastic left heart syndrome (HLHS) have normal-sized ascending aorta and arch. An alternative to the Norwood I procedure in these patients is the creation of an aorto-pulmonary (AP) window with a distal pulmonary artery band (PAB). We reviewed our experience with this technique and compared outcomes to the Norwood procedure for HLHS. All patients treated for HLHS in a single institution between 1992 and 2005 were analysed. This identified 13 patients treated with AP window and PAB compared to 333 patients undergoing stage I Norwood procedure. An unrestrictive AP window was created and the main PA was banded. Patient records and echocardiograms were analysed. Median follow-up was 10 (IQR 0-655) days and 100% complete. There were seven early deaths (54%) in the AP window group and two conversions to Norwood circulation. This was a significantly worse outcome than for the Norwood procedure over the same period, which had an early mortality of 29% (p=0.03). Kaplan-Meier actuarial analysis demonstrated a continued survival benefit of the Norwood group at 6 months (p=0.0005). Deaths were due to either low cardiac output syndrome (n=4) or sudden unheralded arrest (n=3). This occurred despite aortic cross-clamp and circulatory arrest times being significantly lower in the AP window group compared to the Norwood group (35+/-27 vs 55+/-16 min, p<0.01 and 16+/-29 vs 55+/-20 min, p<0.01, respectively). No differences in arterial saturations or systolic blood pressure existed between the groups, but diastolic blood pressure was significantly lower in the AP window group at 27+/-10 mmHg compared to 42+/-8 mmHg in the Norwood group (p=0.01) with evidence of flow reversal in the descending aorta. Differences in diastolic blood pressure between groups were abolished after conversion to stage II. Despite favourable anatomy and shorter ischaemic times, the AP window/PAB technique has a poor outcome compared to the Norwood procedure for HLHS. 
Low diastolic blood pressure with reversal of descending aortic flow in diastole was a feature of the AP window/PAB circulation. We recommend the Norwood procedure for these sub-types. This may have implications for newer 'hybrid' procedures for HLHS which create a similar palliative circulation.
Sojourning with the Homogeneous Poisson Process.
Liu, Piaomu; Peña, Edsel A
2016-01-01
In this pedagogical article, distributional properties, some surprising, pertaining to the homogeneous Poisson process (HPP), when observed over a possibly random window, are presented. Properties of the gap-time that covered the termination time and the correlations among gap-times of the observed events are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window, are also presented. We envision that through the results in this paper, a better appreciation of the subtleties involved in the modeling and analysis of recurrent events data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
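As a companion to the pedagogical discussion, a homogeneous Poisson process observed over a fixed window can be simulated with the standard construction of accumulating exponential gap times; names and parameters below are illustrative:

```python
import random

def hpp_events(rate, t_end, rng):
    """Simulate a homogeneous Poisson process on the observation window
    [0, t_end] by summing Exponential(rate) gap times; returns the event
    times that fall inside the window. The final gap overshoots t_end,
    which is the 'gap-time covering the termination time' discussed in
    the article."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            return events
        events.append(t)
```

By the defining property of the HPP, the number of returned events is Poisson-distributed with mean rate × t_end, which a quick Monte Carlo check confirms.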
Wang, Bing; Baby, Varghese; Tong, Wilson; Xu, Lei; Friedman, Michelle; Runser, Robert; Glesk, Ivan; Prucnal, Paul
2002-01-14
A novel optical switch based on cascading two terahertz optical asymmetric demultiplexers (TOADs) is presented. By utilizing the sharp edge of the asymmetric TOAD switching window profile, two TOAD switching windows are overlapped to produce a narrower aggregate switching window that is not limited by the pulse propagation time in the SOA of the TOAD. Simulations of the cascaded TOAD switching window show relatively constant window amplitude for different window sizes. Experimental results on cascading two TOADs, each with a switching window of 8 ps but with the SOA on opposite sides of the fiber loop, show a minimum switching window of 2.7 ps.
McBride, J.H.; Hatcher, R.D.; Stephenson, W.J.; Hooper, R.J.
2005-01-01
The southern Appalachian Pine Mountain window exposes 1.1 Ga Grenvillian basement and its metasedimentary Paleozoic(?) cover through the allochthonous Inner Piedmont. The issue of whether the crustal block inside the window was either transported above the master Appalachian (late Alleghanian) décollement or is an autochthonous block that was overridden by the décollement has been debated for some time. New detrital zircon geochronologic data from the cover rocks inside the window suggest this crustal block was derived from Gondwana but docked with Laurentia before the Alleghanian event. Reprocessed deep seismic reflection data from west-central Georgia (pre- and poststack noise reduction, amplitude variation analysis, and prestack depth migration) indicate that a significant band of subhorizontal reflections occurs almost continuously beneath the window, collinear with the originally recognized décollement reflections north of the window. A marked variation in the décollement image, from strong and coherent north of the window to more diffuse directly beneath the window, is likely a partial consequence of the different geology between the Inner Piedmont and the window. The more diffuse image beneath the window may also result from imaging problems related to changes in topography and fold of cover (i.e., signal-to-noise ratio). Two alternative tectonic models for the Pine Mountain window can partially account for the observed variation in the décollement reflectivity. (1) The Pine Mountain block could be truncated below by a relatively smooth continuation of the décollement. The window would thus expose an allochthonous basement duplex or horse-block thrust upward from the south along the Late Proterozoic rifted continental margin. (2) The window represents localized exhumation of autochthonous basement and cover along a zone of distributed intrabasement shearing directly beneath the window.
Either model is viable if only reflector geometry is considered; model (1) is favored if both geometry and kinematics of Blue Ridge-Piedmont thrust sheet emplacement are incorporated. In either model, the southern margin of the window merges to the west with the Iapetan early Alleghanian Central Piedmont suture, which juxtaposes North American-affinity Piedmont rocks to the north and exotic Panafrican rocks of the Carolina (Avalon) terrane to the south. Immediately south of the window, this suture dips southward and merges in the lower crust with the late Alleghanian suture joining the Appalachians with Gondwana. © 2005 Geological Society of America.
Bai, Jinbing; Harper, Felicity W K; Penner, Louis A; Swanson, Kristen; Santacroce, Sheila J
2017-11-01
To study the relationship between parental verbal and nonverbal caring behaviors and child distress during cancer-related port access placement using correlational and time-window sequential analyses. Longitudinal, observational design. Children's Hospital of Michigan and St. Jude Children's Research Hospital. 43 child-parent dyads, each with two or three video recordings of the child undergoing cancer-related port placement. Two trained raters coded parent interaction behaviors and child distress using the Parent Caring Response Scoring System and Karmanos Child Coping and Distress Scale, respectively. Mixed modeling with generalized estimating equations examined the associations between parent interaction behaviors and parent distress, child distress, and child cooperation reported by multiple raters. Time-window sequential analyses were performed to investigate the temporal relationships in parent-child interactions within a five-second window. Parent caring behaviors, child distress, and child cooperation. Parent caring interaction behaviors were significantly correlated with parent distress, child distress, and child cooperation during repeated cancer port accessing. Sequential analyses showed that children were significantly less likely to display behavioral and verbal distress following parent caring behaviors than at any other time. If a child is already distressed, parent verbal and nonverbal caring behaviors can significantly reduce child behavioral and verbal distress. Parent caring behaviors, particularly the rarely studied nonverbal behaviors (e.g., eye contact, distance close to touch, supporting/allowing), can reduce the child's distress during cancer port accessing procedures. Studying parent-child interactions during painful cancer-related procedures can provide evidence to develop nursing interventions to support parents in caring for their child during painful procedures.
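The kind of time-window sequential analysis described, asking whether a child distress event follows a parent caring behavior within a fixed lag (here five seconds), can be sketched as a toy contingency computation. The function name and event-timestamp layout are illustrative, not the study's coding scheme or statistics:

```python
def lagged_contingency(parent_events, child_events, window=5.0):
    """Time-window sequential analysis sketch: for each parent caring
    behavior (timestamps in seconds), check whether a child distress event
    occurs within the following `window` seconds. Returns the proportion
    of parent behaviors followed by distress, to be compared against the
    base rate of distress at other times."""
    if not parent_events:
        return 0.0
    followed = sum(
        any(p < c <= p + window for c in child_events)
        for p in parent_events
    )
    return followed / len(parent_events)
```

A finding like the study's would correspond to this proportion being significantly lower than the unconditional distress rate, i.e. distress is less likely in the five seconds after a caring behavior.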
High-Reliability Waveguide Vacuum/Pressure Window
NASA Technical Reports Server (NTRS)
Britcliffe, Michael J.; Hanson, Theodore R.; Long, Ezra M.; Montanez, Steven
2013-01-01
The NASA Deep Space Network (DSN) uses commercial waveguide windows on the output waveguide of Ka-band (32 GHz) low-noise amplifiers. Mechanical failure of these windows resulted in an unacceptable loss in tracking time. To address this issue, a new Ka-band WR-28 waveguide window has been designed, fabricated, and tested. The window uses a slab of low-loss, low-dielectric-constant foam that is bonded into a 1/2-wave-thick waveguide/flange. The foam is a commercially available, rigid, closed-cell polymethacrylimide. It has excellent electrical properties with a dielectric constant of 1.04 and a loss tangent of 0.01. It is relatively strong, with a tensile strength of 1 MPa. The material is virtually impermeable to helium. The finished window exhibits a leak rate of less than 3×10^-3 cu cm/s with helium. The material is also chemically resistant and can be cleaned with acetone. The window is constructed by fabricating a window body by brazing a short length of WR-28 copper waveguide into a standard rectangular flange, and machining the resulting part to a thickness of 4.6 mm. The foam is machined to a rectangular shape with dimensions of 7.06 x 3.53 mm. The foam is bonded into the body with a two-part epoxy. After curing, the excess glue and foam are knife-trimmed by hand. The finished window has a loss of less than 0.08 dB (2%) and a return loss of greater than 25 dB at 32 GHz. This meets the requirements for the DSN application. The window is usable for most applications over the entire 26-to-40-GHz waveguide band. The window return loss can be tuned to a required frequency by varying the thickness of the window slightly. Most standard waveguide windows use a thin membrane of material bonded into a recess in a waveguide flange, or sandwiched between two flanges with a polymer seal. Designs using the recessed window are prone to mechanical failure over time due to constraints on the dimensions of the recess that allow the bond to fail.
Designs using the sandwich method are often permeable to helium, which prohibits the use of helium leak detection. At the time of this reporting, 40 windows have been produced. Twelve are in operation with a combined operating time of over 30,000 hours without a failure.
FRB180301: AstroSat CZTI upper limits
NASA Astrophysics Data System (ADS)
Anumarlapudi, A.; Aarthy, E.; Arvind, B.; Bhalerao, V.; Bhattacharya, D.; Rao, A. R.; Vadawale, S.
2018-03-01
We carried out offline analysis of data from AstroSat CZTI in a 100-second window centred on the FRB180301 (Parkes discovery: Savchenko, V. et al., ATel #11376) trigger time, 2018-03-11 at 04:11:54.80 UTC, to look for any coincident hard X-ray flash.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... maintenance window for the Gulf individual fishing quota (IFQ) programs, and removing obsolete codified text..., etc.), extends the IFQ maintenance window an additional 8 hours to allow for more time to conduct end... maintenance window. All electronic IFQ transactions must be completed by December 31 at 6 p.m. eastern time...
Oh, Se An; Yea, Ji Woon; Kim, Sung Kyu
2016-01-01
Respiratory-gated radiation therapy (RGRT) is used to minimize the radiation dose to normal tissue in lung-cancer patients. Although determining the gating window in the respiratory phase of patients is important in RGRT, it is not easy. Our aim was to determine the optimal gating window when using a visible guiding system for RGRT. Between April and October 2014, the breathing signals of 23 lung-cancer patients were recorded with a real-time position management (RPM) respiratory gating system (Varian, USA). We performed statistical analysis of the breathing signals to find the optimal gating window for guided breathing in RGRT. When we compared breathing signals before and after breathing training, 19 of the 23 patients showed statistically significant differences (p < 0.05). The standard deviation of the respiration signals after breathing training was lowest for phases of 30%-70%. The results showed that the optimal gating window for RGRT is the 40% window spanning phases 30%-70%, with respect to breathing repeatability after respiration training with the visible guiding system. RGRT was performed with the RPM system to confirm the usefulness of the visible guiding system. The RPM system and our visible guiding system improve respiratory regularity, which in turn should improve the accuracy and efficiency of RGRT.
Q estimation of seismic data using the generalized S-transform
NASA Astrophysics Data System (ADS)
Hao, Yaju; Wen, Xiaotao; Zhang, Bo; He, Zhenhua; Zhang, Rui; Zhang, Jinming
2016-12-01
Quality factor, Q, is a parameter that characterizes the energy dissipation during seismic wave propagation. The reservoir pore is one of the main factors that affect the value of Q. Especially, when pore space is filled with oil or gas, the rock usually exhibits a relative low Q value. Such a low Q value has been used as a direct hydrocarbon indicator by many researchers. The conventional Q estimation method based on spectral ratio suffers from the problem of waveform tuning; hence, many researchers have introduced time-frequency analysis techniques to tackle this problem. Unfortunately, the window functions adopted in time-frequency analysis algorithms such as continuous wavelet transform (CWT) and S-transform (ST) contaminate the amplitude spectra because the seismic signal is multiplied by the window functions during time-frequency decomposition. The basic assumption of the spectral ratio method is that there is a linear relationship between natural logarithmic spectral ratio and frequency. However, this assumption does not hold if we take the influence of window functions into consideration. In this paper, we first employ a recently developed two-parameter generalized S-transform (GST) to obtain the time-frequency spectra of seismic traces. We then deduce the non-linear relationship between natural logarithmic spectral ratio and frequency. Finally, we obtain a linear relationship between natural logarithmic spectral ratio and a newly defined parameter γ by ignoring the negligible second order term. The gradient of this linear relationship is 1/Q. Here, the parameter γ is a function of frequency and source wavelet. Numerical examples for VSP and post-stack reflection data confirm that our algorithm is capable of yielding accurate results. The Q-value results estimated from field data acquired in western China show reasonable comparison with oil-producing well location.
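The linear relationship that the conventional spectral-ratio method assumes, the natural logarithm of the spectral ratio being linear in frequency with gradient proportional to 1/Q, leads to a simple least-squares estimator. This sketches that baseline method (the paper's starting point), not its GST-based correction; names are illustrative:

```python
import math

def estimate_q(freqs, amp1, amp2, dt):
    """Classic spectral-ratio Q estimate: for travel-time difference dt,
    ln(A2/A1) = -pi * f * dt / Q + const, so fitting a least-squares line
    through (f, ln(A2/A1)) gives slope = -pi * dt / Q, hence
    Q = -pi * dt / slope. Inputs are matched amplitude spectra at two
    arrival times; names are illustrative."""
    ys = [math.log(a2 / a1) for a1, a2 in zip(amp1, amp2)]
    n = len(freqs)
    fbar = sum(freqs) / n
    ybar = sum(ys) / n
    slope = (sum((f - fbar) * (y - ybar) for f, y in zip(freqs, ys))
             / sum((f - fbar) ** 2 for f in freqs))
    return -math.pi * dt / slope
```

The paper's point is that window functions used in time-frequency decomposition break the linearity this estimator assumes, which is what its redefined variable γ restores.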
Haest, Birgen; Hüppop, Ommo; Bairlein, Franz
2018-04-01
Many migrant bird species that breed in the Northern Hemisphere show advancing spring arrival dates. The North Atlantic Oscillation (NAO) index is one of the climatic variables most often investigated and shown to be correlated with these changes in spring arrival. Although the NAO is often claimed to be a good predictor of, or even to have a marked effect on, interannual changes in the spring migration phenology of Northern Hemisphere breeding birds, reported relations between spring migration phenology and the NAO vary widely, ranging from no association, through weak, to strong. Several factors, such as geographic location, migration phase, and the NAO index time window, have been suggested to partly explain these observed differences. A combination of a literature meta-analysis with a meta-analysis and sliding-time-window analysis of a dataset of 23 short- and long-distance migrants from the constant-effort trapping garden at Helgoland, Germany, however, paints a completely different picture. We found a statistically significant overall effect size of the NAO on spring migration phenology (coefficient = -0.14, SE = 0.054), but on average this explains only 0%-6% of the variance in spring migration phenology across all species. As such, the value and biological meaning of the NAO as a general predictor or explanatory variable for climate change effects on bird migration phenology seem highly questionable. We found little to no definite support for previously suggested factors, such as geographic location, migration phenology phase, or the NAO time window, to explain the heterogeneity in the reported correlations. We did, however, find compelling evidence that the failure to account for trends in both time series has led to strongly inflated (spurious) correlations in many studies (coefficient = -0.13, SE = 0.019). © 2017 John Wiley & Sons Ltd.
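The detrending issue raised in the last sentence can be illustrated with a toy calculation (invented data, not the Helgoland dataset): two series that share only a linear trend appear strongly correlated until both are detrended.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2015)
trend = 0.05 * (years - years[0])

# Two series that share only a linear trend: independent year-to-year
# variation on top of the same long-term drift.
nao_like = trend + rng.normal(0.0, 0.5, years.size)
arrival_like = trend + rng.normal(0.0, 0.5, years.size)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

def detrend(y, x):
    """Remove the least-squares linear trend from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

raw_r = corr(nao_like, arrival_like)
detrended_r = corr(detrend(nao_like, years), detrend(arrival_like, years))
```

The raw correlation is substantial even though the interannual variation is independent by construction; after detrending it collapses toward zero, which is the inflation mechanism the meta-analysis points to.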
Spectral analysis for GNSS coordinate time series using chirp Fourier transform
NASA Astrophysics Data System (ADS)
Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan
2017-12-01
Spectral analysis of global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, the Lomb-Scargle spectrum, the evolutionary power spectrum and the wavelet power spectrum are used to find periodic characteristics in time series. Among these, the chirp Fourier transform (CFT), which has less stringent requirements, is tested here with synthetic and actual GNSS coordinate time series, demonstrating the accuracy and efficiency of the method. With the series length restricted only to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results on ideal synthetic data show that CFT is accurate and efficient, while the results on actual data show that CFT can be used to derive periodic information from GNSS coordinate time series.
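As a rough illustration of extracting a periodic signature from a GNSS-like coordinate series, the sketch below uses a plain FFT periodogram as a stand-in (the paper's CFT is not implemented here) to recover a synthetic annual period; all data are simulated.

```python
import numpy as np

# Synthetic daily GNSS coordinate series: annual cycle plus noise (invented).
n_days = 2048                      # even length, as CFT-style methods require
t = np.arange(n_days)
annual = 365.25                    # days
rng = np.random.default_rng(1)
series = 3.0 * np.sin(2 * np.pi * t / annual) + rng.normal(0.0, 1.0, n_days)

# Plain FFT periodogram as a stand-in for the chirp Fourier transform.
spectrum = np.abs(np.fft.rfft(series - series.mean()))
freqs = np.fft.rfftfreq(n_days, d=1.0)     # cycles per day
peak_idx = int(np.argmax(spectrum[1:])) + 1
peak_period = 1.0 / freqs[peak_idx]        # days
```

With a ~5.5-year record the frequency resolution is coarse, so the recovered period lands on the FFT bin nearest one year rather than exactly 365.25 days; that resolution limit is one motivation for more flexible transforms.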
The impact of short term synaptic depression and stochastic vesicle dynamics on neuronal variability
Reich, Steven
2014-01-01
Neuronal variability plays a central role in neural coding and impacts the dynamics of neuronal networks. Unreliability of synaptic transmission is a major source of neural variability: synaptic neurotransmitter vesicles are released probabilistically in response to presynaptic action potentials and are recovered stochastically in time. The dynamics of this process of vesicle release and recovery interacts with variability in the arrival times of presynaptic spikes to shape the variability of the postsynaptic response. We use continuous-time Markov chain methods to analyze a model of short-term synaptic depression with stochastic vesicle dynamics coupled with three different models of presynaptic spiking: one in which the timing of presynaptic action potentials is modeled as a Poisson process, one in which action potentials occur more regularly than a Poisson process (sub-Poisson), and one in which they occur more irregularly (super-Poisson). We use this analysis to investigate how variability in a presynaptic spike train is transformed by short-term depression and stochastic vesicle dynamics to determine the variability of the postsynaptic response. We find that sub-Poisson presynaptic spiking increases the average rate at which vesicles are released, that the number of vesicles released over a time window is more variable for smaller time windows than for larger ones, and that fast presynaptic spiking gives rise to Poisson-like variability of the postsynaptic response even when presynaptic spike times are non-Poisson. Our results complement and extend previously reported theoretical results and provide possible explanations for some trends observed in recorded data. PMID:23354693
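A minimal stochastic simulation in the spirit of this model (with invented parameter values: five release sites, release probability 0.5, exponential recovery) reproduces the reported window-size effect, with the count Fano factor higher for small windows than for large ones.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_release(spike_times, n_sites=5, p_release=0.5, tau_recover=0.5):
    """Stochastic vesicle model (hypothetical parameters): each occupied site
    releases with probability p_release per presynaptic spike, then recovers
    after an exponentially distributed delay."""
    ready_at = np.zeros(n_sites)          # time at which each site is refilled
    releases = []
    for t in spike_times:
        for i in range(n_sites):
            if t >= ready_at[i] and rng.random() < p_release:
                releases.append(t)
                ready_at[i] = t + rng.exponential(tau_recover)
    return np.array(releases)

def fano(events, window, duration):
    """Variance-to-mean ratio of event counts in consecutive windows."""
    edges = np.arange(0.0, duration + window, window)
    counts, _ = np.histogram(events, edges)
    return counts.var() / counts.mean()

# Poisson presynaptic train at 20 Hz for 200 s.
duration, rate = 200.0, 20.0
spikes = np.cumsum(rng.exponential(1.0 / rate, int(2 * rate * duration)))
spikes = spikes[spikes < duration]
releases = simulate_release(spikes)

fano_small = fano(releases, 0.1, duration)
fano_large = fano(releases, 2.0, duration)
```

Over long windows, depression caps the sustainable release rate and regularizes the counts, pushing the Fano factor below the short-window value, consistent with the abstract's statement.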
Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected every day around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed fully automatic. The goal of intelligent window selection is to select windows automatically with a learned engine built upon the huge number of existing windows generated through the adjoint tomography project. We have formulated automatic window selection as a classification problem: every candidate misfit-calculation window is classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, we extract five features from each window: its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value; more features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem: we use backpropagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors, and the mini-batch stochastic gradient method to iteratively optimize the MPNN.
Numerical tests show that with a careful selection of the training data and a sufficient amount of it, we are able to train a robust neural network capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g., FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
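The classification setup can be sketched with scikit-learn. Everything below is synthetic: the five features mimic those listed in the text, but the labelling rule is an invented stand-in for FLEXWIN-style decisions, and the network size is arbitrary.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 2000

# Five synthetic features per candidate window, mimicking those in the text:
# cross-correlation value, time lag, amplitude ratio, window length, min STA/LTA.
cc = rng.uniform(0.0, 1.0, n)
lag = rng.normal(0.0, 5.0, n)
amp_ratio = rng.lognormal(0.0, 0.5, n)
length = rng.uniform(10.0, 200.0, n)
sta_lta = rng.uniform(0.0, 5.0, n)
X = StandardScaler().fit_transform(
    np.column_stack([cc, lag, amp_ratio, length, sta_lta]))

# Invented labelling rule (an assumption, NOT FLEXWIN's criteria): a window is
# usable when correlation is high, lag is small and amplitudes roughly match.
y = ((cc > 0.6) & (np.abs(lag) < 5.0)
     & (np.abs(np.log(amp_ratio)) < 0.7)).astype(int)

# Small multilayer perceptron trained on 1500 windows, tested on 500.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X[:1500], y[:1500])
accuracy = clf.score(X[1500:], y[1500:])
```

With labels that are a deterministic function of the features, a small MLP learns the rule well; in practice the labels come from human-curated selections and the achievable accuracy reflects their consistency.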
Non-conventional optomechanical choppers: analysis and design of novel prototypes
NASA Astrophysics Data System (ADS)
Duma, Virgil-Florin; Demian, Dorin; Csukas, Eduard Sebastian; Pop, Nicolina; Cira, Octavian
2017-10-01
Optical choppers are widely used in laser systems for light modulation and/or attenuation. In their most used and well-known configuration, they are built as a rotating wheel with windows, which transforms a continuous-wave laser beam into a series of pulses with a certain frequency and profile. We briefly present the analysis and design we have completed for classical chopper wheels (i.e., with windows with linear margins) for both top-hat and Gaussian laser beams. Further on, novel chopper wheel configurations, with outward or inward semi-circular (or other non-linearly shaped) window margins, are presented; for these we derived analytic functions and ran simulations, for both top-hat and Gaussian beams, to deduce their transmission functions (i.e., the time profile of the laser pulses generated by the device). The emphasis of the presentation is placed on the novel choppers with shafts (patent pending); their transmission functions are presented for top-hat laser beams. Finally, an example of such a chopper is considered with regard to the Finite Element Analysis (FEA) that has to be performed for its rotating shaft. Both the mechanical stress and the deformations in the shaft have to be taken into account, especially at high rotational speeds of the mobile element.
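For a classical window with a linear (straight) edge crossing a Gaussian beam, the transmission function is an error-function ramp, since the transmitted power is the Gaussian intensity integrated beyond the edge. A small sketch (beam radius and edge positions are arbitrary):

```python
import numpy as np
from math import erf, sqrt

def gaussian_edge_transmission(edge_pos, w):
    """Fraction of a Gaussian beam (1/e^2 radius w) transmitted past a
    straight chopper-window edge located edge_pos from the beam centre.
    Integrating the Gaussian intensity profile from edge_pos to infinity
    gives an erf-shaped ramp: 1 far before the edge, 0 far after it."""
    return 0.5 * (1.0 - erf(sqrt(2.0) * edge_pos / w))

# Time profile of one pulse edge: sample the ramp as the edge sweeps across.
positions = np.linspace(-3.0, 3.0, 7)
profile = [gaussian_edge_transmission(float(x), 1.0) for x in positions]
```

For a top-hat beam the same geometry instead gives a piecewise-linear ramp; the curved window margins discussed in the paper reshape this edge profile further.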
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. 
We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
Extreme multi-basin fluvial flows and their relationship to extra-tropical cyclones
NASA Astrophysics Data System (ADS)
De Luca, Paolo; Hillier, John K.; Wilby, Robert L.; Quinn, Nevil W.; Harrigan, Shaun
2017-04-01
Fluvial floods are typically investigated as 'events' at the single basin scale, thereby implicitly assuming that severe flooding impacts each catchment independently from those nearby. A statistical analysis of the spatio-temporal characteristics of extreme flows in Great Britain (GB), during 1975-2014, is presented. These observations deepen understanding of the processes leading to multi-basin floods and present helpful insights for contingency planning and emergency responders. The largest multi-basin peak flow events within different time windows were identified by counting the number of coincident annual maximum river peak flows (AMAX) across 261 non-nested catchments, using search windows of 1 to 19 days. This showed that up to 107 basins reached their AMAX within the same plateauing 13-day window, draining a total area equivalent to ˜46% of the overall basins considered, an equivalent fraction of ˜27% of Great Britain. Such episodes are typically associated with persistent cyclonic atmospheric circulation and saturated ground, combined with short hydrological response times (<48 h) from large contributing basins. The most spatially extensive episodes also tend to coincide with the most severe gales (i.e. extra-tropical cyclones) on a ±0-13 day time-scale. The analysis suggests that multi-basin peak flow events can be characterised by concurrent peak flow AMAX and that the most extreme are driven by very severe gales (VSG). This has implications for emergency response, including planning for combined flood-wind impacts (on, for example, power and communication systems), meaning that emergency preparedness needs to be reorganised in order to face this peril.
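The AMAX-coincidence count described above can be sketched as follows (toy day-of-year data for 50 hypothetical basins, not the GB dataset):

```python
import numpy as np

def max_coincident(amax_days, window_days):
    """Largest number of basins whose annual-maximum (AMAX) days fall inside
    any single search window of the given length."""
    days = np.sort(np.asarray(amax_days))
    best = 0
    for start in days:
        in_window = (days >= start) & (days < start + window_days)
        best = max(best, int(np.sum(in_window)))
    return best

rng = np.random.default_rng(4)
# 50 hypothetical basins: 30 hit by one storm cluster (days 10-16), 20 scattered.
amax = np.concatenate([rng.integers(10, 17, 30), rng.integers(30, 360, 20)])
counts = {w: max_coincident(amax, w) for w in (1, 7, 13, 19)}
```

As in the paper's analysis, the count grows with window length and plateaus once the window spans the whole storm cluster; the plateau width is what identifies a physically coherent multi-basin event.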
Zhang, Fei-ruo; Wang, Sheng; He, Li-hua; Zhang, Ying; Wu, Shan-shan; Li, Jing-yun; Hu, Guang-yi; Ye, Kang-ping
2011-03-01
To study work-related neck and shoulder muscle fatigue in female sewing machine operators. Eighteen healthy female sewing machine operators without musculoskeletal disorders, working in a Beijing garment factory, volunteered to participate in this study. The maximal voluntary contraction (MVC) and 20% MVC of the bilateral upper trapezius and cervical erector spinae were tested before sewing operations; the operators were then monitored over 20 consecutive time windows (1 time window = 10 min) of sewing machine operation while surface electromyography (sEMG) signals were recorded, and the 20% MVC was tested again after monitoring. Amplitude analysis was used to process the recorded EMG signals. During work, the median load for the left cervical erector spinae (LCES), right cervical erector spinae (RCES), left upper trapezius (LUT) and right upper trapezius (RUT) was 6.78 ± 1.05, 6.94 ± 1.12, 5.68 ± 2.56 and 6.47 ± 3.22, respectively; the work load on the right side was higher than on the left. Static load analysis indicated that the value of RMS(20%MVC) before work was higher than that after work, and the change was larger for the right CES and UT. The largest 20%MVE of the bilateral CES occurred in the 20th time window, and that of the bilateral UT in the 16th. The work load of female sewing machine operators is a sustained "static" load, the load on the right neck-shoulder region is higher than on the left, and the right neck-shoulder muscles fatigue more easily and more severely once fatigued.
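The amplitude (RMS) analysis over consecutive time windows can be sketched as below. The signal, sampling rate and reference RMS are invented, and the ten-minute windows are compressed to one second each to keep the example fast:

```python
import numpy as np

def rms_per_window(emg, fs, window_s):
    """RMS amplitude of the signal in consecutive non-overlapping windows."""
    n = int(fs * window_s)
    n_windows = emg.size // n
    chunks = emg[: n_windows * n].reshape(n_windows, n)
    return np.sqrt(np.mean(chunks ** 2, axis=1))

rng = np.random.default_rng(5)
fs = 1000.0                      # Hz (hypothetical sampling rate)
ref_rms = 1.0                    # RMS measured at the 20% MVC reference test
# Toy sEMG whose amplitude grows across 20 windows (fatigue-like drift).
amplitudes = 0.05 * (1.0 + 0.1 * np.arange(20))
emg = np.concatenate([a * rng.normal(0.0, 1.0, int(fs)) for a in amplitudes])
mve = 100.0 * rms_per_window(emg, fs, 1.0) / ref_rms    # normalized, in %MVE
peak_window = int(np.argmax(mve)) + 1                   # 1-based window index
```

Normalizing each window's RMS to the submaximal reference contraction is what lets %MVE values be compared across muscles and subjects, and the window index of the peak identifies when the load culminated.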
The signal extraction of fetal heart rate based on wavelet transform and BP neural network
NASA Astrophysics Data System (ADS)
Yang, Xiao Hong; Zhang, Bang-Cheng; Fu, Hu Dai
2005-04-01
This paper briefly introduces the collection and recognition of biomedical signals and designs a method to collect FM signals. A detailed discussion of the system hardware, structure and functions is given. Under LabWindows/CVI, the hardware and driver are made compatible, so the equipment works properly. The paper adopts multithreading technology for real-time analysis, making effective use of CPU idle time, speeding up program response, and improving execution efficiency: one thread collects data while the other analyzes it, broadening the scope for real-time signal analysis. The wavelet transform removes the main interference in the FM signal, and a time window is added for recognition with a BP neural network. Finally, the results of the collected signals and the BP network are discussed. FM signals from 8 pregnant women were collected successfully using the sensor. The recognition accuracy of the BP network is about 83.3% using the above measures.
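The wavelet interference-removal step can be illustrated with a one-level Haar decomposition and soft thresholding, a minimal stand-in for whichever wavelet and levels the authors actually used; the signal and noise levels here are invented.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar decomposition with soft-thresholded detail coefficients
    (a minimal stand-in for wavelet-based interference removal)."""
    x = signal[: signal.size // 2 * 2]            # even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass half
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass half
    # Soft threshold: shrink details toward zero, killing small (noisy) ones.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)                        # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
clean = np.sin(2 * np.pi * 3.0 * t)               # slow component of interest
noisy = clean + rng.normal(0.0, 0.3, t.size)      # high-frequency interference
denoised = haar_denoise(noisy, threshold=0.4)
err_noisy = float(np.mean((noisy - clean) ** 2))
err_denoised = float(np.mean((denoised - clean) ** 2))
```

Because the slow component barely contributes to the detail coefficients, thresholding them removes mostly interference, roughly halving the mean squared error in this toy setup.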
The window of opportunity: decision theory and the timing of prognostic tests for newborn infants.
Wilkinson, Dominic
2009-11-01
In many forms of severe acute brain injury there is an early phase when prognosis is uncertain, followed later by physiological recovery and the possibility of more certain predictions of future impairment. There may be a window of opportunity for withdrawal of life support early, but if decisions are delayed there is the risk that the patient will survive with severe impairment. In this paper I focus on the example of neonatal encephalopathy and the question of the timing of prognostic tests and decisions to continue or to withdraw life-sustaining treatment. Should testing be performed early or late, and how should parents decide what to do given the conflicting values at stake? I apply decision theory to the problem, using sensitivity analysis to assess how different features of the tests or different values would affect a decision to perform early or late prognostic testing. I draw some general conclusions from this model for decisions about the timing of testing in neonatal encephalopathy. Finally I consider possible solutions to the problem posed by the window of opportunity. Decision theory highlights the costs of uncertainty. This may prompt further research into improving prognostic tests. But it may also prompt us to reconsider our current attitudes towards the palliative care of newborn infants predicted to be severely impaired.
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving-window fast Fourier transform cross-correlation algorithm is presented, which performs nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving-window procedure. The shifts matrix in retention time is estimated by fast Fourier transform cross-correlation within a moving window; the refined shift of each scan point is then obtained as the mode of the corresponding column of the shifts matrix. This version is simple, yet more effective and robust than the previously published moving-window fast Fourier transform cross-correlation method. It handles nonlinear retention time shift robustly provided a proper window size is selected; the window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving-window fast Fourier transform cross-correlation method and with recursive alignment by fast Fourier transform on chromatographic datasets. The pattern recognition results for a gas chromatography-mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing with this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
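The core of the method, estimating a shift per moving window by FFT cross-correlation and taking the mode over windows, can be sketched as follows (toy Gaussian-peak chromatograms with a known 4-sample shift, not the published MWFFT2 code):

```python
import numpy as np

def estimate_shift(ref, seg):
    """Integer shift that best aligns seg onto ref, from the peak of the
    zero-padded FFT cross-correlation (positive: seg arrives later)."""
    n = len(ref)
    xc = np.fft.ifft(np.fft.fft(ref, 2 * n)
                     * np.conj(np.fft.fft(seg, 2 * n))).real
    lag = int(np.argmax(xc))
    if lag >= n:                      # map circular index to signed lag
        lag -= 2 * n
    return -lag

def moving_window_shifts(ref, seg, window):
    """Shift estimate for each non-overlapping window; the mode over windows
    gives a robust local shift, as in the method described above."""
    return [estimate_shift(ref[i:i + window], seg[i:i + window])
            for i in range(0, len(ref) - window + 1, window)]

# Toy chromatograms: Gaussian peaks, second trace delayed by 4 samples.
t = np.arange(600)
ref = sum(np.exp(-0.5 * ((t - c) / 6.0) ** 2) for c in (100, 250, 400, 520))
seg = sum(np.exp(-0.5 * ((t - c - 4) / 6.0) ** 2) for c in (100, 250, 400, 520))
shifts = moving_window_shifts(ref, seg, window=150)
mode_shift = max(set(shifts), key=shifts.count)
```

Taking the mode rather than the mean makes the per-scan shift robust to windows where the correlation peak is spurious (e.g., windows without peaks), which is the refinement the paper emphasizes.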
Free-breathing 3D Cardiac MRI Using Iterative Image-Based Respiratory Motion Correction
Moghari, Mehdi H.; Roujol, Sébastien; Chan, Raymond H.; Hong, Susie N.; Bello, Natalie; Henningsson, Markus; Ngo, Long H.; Goddu, Beth; Goepfert, Lois; Kissinger, Kraig V.; Manning, Warren J.; Nezafat, Reza
2012-01-01
Respiratory motion compensation using diaphragmatic navigator (NAV) gating with a 5 mm gating window is conventionally used for free-breathing cardiac MRI. Due to the narrow gating window, scan efficiency is low, resulting in long scan times, especially for patients with irregular breathing patterns. In this work, a new retrospective motion compensation algorithm is presented that reduces the scan time for free-breathing cardiac MRI by increasing the gating window to 15 mm without compromising image quality. The proposed algorithm iteratively corrects for respiratory-induced cardiac motion by optimizing the sharpness of the heart. To evaluate this technique, two coronary MRI datasets with 1.3 mm3 resolution were acquired from 11 healthy subjects (7 females, 25±9 years): one using a NAV with a 5 mm gating window acquired in 12.0±2.0 minutes, and one with a 15 mm gating window acquired in 7.1±1.0 minutes. The images acquired with a 15 mm gating window were corrected using the proposed algorithm and compared to the uncorrected images acquired with the 5 mm and 15 mm gating windows. The image quality score, sharpness, and length of the three major coronary arteries were equivalent between the corrected images and the images acquired with a 5 mm gating window (p-value>0.05), while the scan time was reduced by a factor of 1.7. PMID:23132549
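The idea of correcting motion by optimizing an image-sharpness cost can be sketched in miniature. This toy version shifts image rows to minimize total variation, which is only an analogue of the paper's heart-sharpness objective; the image and shifts are synthetic.

```python
import numpy as np

def total_variation(img):
    """Sum of absolute differences; lower values mean edges line up."""
    return float(np.abs(np.diff(img, axis=0)).sum()
                 + np.abs(np.diff(img, axis=1)).sum())

def correct_rows(img, max_shift=4, passes=3):
    """Toy analogue of iterative image-based motion correction: greedily
    shift each row to minimize a sharpness cost, sweeping a few times."""
    out = img.copy()
    for _ in range(passes):
        for r in range(img.shape[0]):
            trial = out.copy()
            costs = []
            for s in range(-max_shift, max_shift + 1):
                trial[r] = np.roll(img[r], s)
                costs.append((total_variation(trial), s))
            best_s = min(costs)[1]
            out[r] = np.roll(img[r], best_s)
    return out

# Sharp reference: a bright square; corrupt it with random per-row shifts.
rng = np.random.default_rng(7)
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
row_shifts = rng.integers(-3, 4, 32)
corrupt = np.vstack([np.roll(img[r], s) for r, s in enumerate(row_shifts)])
fixed = correct_rows(corrupt)
```

Each greedy sweep realigns rows to their neighbours, so the cost drops monotonically; the actual algorithm optimizes over respiratory bins of k-space data rather than image rows, but the iterate-on-sharpness principle is the same.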
Xie, Li-Hong; Tang, Jie; Miao, Wen-Jie; Tang, Xiang-Long; Li, Heng; Tang, An-Zhou
2018-06-01
We evaluated the risk of cochlear implantation through the round window membrane in the facial recess through a preoperative analysis of the angle between the facial nerve-round window line and the cranial midline using high-resolution temporal bone CT. Temporal bone CT films of 176 patients with profound sensorineural hearing loss at our hospital from 2013 to 2015 were reviewed, and the preoperative temporal bone CT scans were retrospectively analysed. The vertical distance (d value) from the leading edge of the facial nerve to the posterior wall of the external auditory canal, and the angle (α value) between the line from the leading edge of the facial nerve to the midpoint of the round window membrane and the median sagittal line in the round window membrane plane, were measured. Based on intraoperative observation, the round window membrane was divided into completely exposed (group A), partially exposed (group B), and unexposed (group C) groups, and statistical analysis was performed. The α value could be effectively measured for all 176 patients (62.60 ± 7.12), and the d value could be effectively measured for 95 cases (5.53 ± 1.00). An analysis of the correlation between the α and d values of these 95 cases found a negative correlation. For the 176 cases, one-way analysis of variance (ANOVA) showed that the differences among the groups were significant [P = 0.000 (< 0.05)]. The angle (α value) between the line connecting the leading edge of the facial nerve to the midpoint of the round window and the median sagittal line measured in preoperative CT scans was associated with the difficulty of intraoperatively exposing the round window membrane; when the α value exceeded a certain degree, exposing the round window membrane was more difficult. In such cases, the surgeon should fully expose the round window membrane during surgery, which could decrease the likelihood of complications.
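The two statistical steps, a correlation between α and d and a one-way ANOVA across exposure groups, can be sketched with SciPy on simulated values (means and spreads loosely echo the reported numbers; none of this is patient data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Simulated angle alpha (degrees) and depth d (mm) with a built-in negative
# relationship, loosely echoing the reported 62.60 +/- 7.12 and 5.53 +/- 1.00.
alpha = rng.normal(62.6, 7.1, 95)
d = 5.5 - 0.08 * (alpha - 62.6) + rng.normal(0.0, 0.5, 95)
r, p_corr = stats.pearsonr(alpha, d)

# One-way ANOVA across three exposure groups with different mean angles
# (group sizes and means invented for illustration).
group_a = rng.normal(58.0, 5.0, 60)   # membrane fully exposed
group_b = rng.normal(63.0, 5.0, 60)   # partially exposed
group_c = rng.normal(69.0, 5.0, 56)   # unexposed
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)
```

A negative Pearson r and a significant F statistic are the two findings the abstract reports; the simulation merely shows the shape of the computation.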
Recognition of the Multi Specularity Objects using the Eigen-Window,
1996-02-29
analysis to each eigen-window [21]. The basic idea is that, even if some of the windows are occluded, the remaining windows are still effective and can...
NASA Astrophysics Data System (ADS)
Liu, Tzu-Chi; Wu, Hau-Tieng; Chen, Ya-Hui; Chen, Ya-Han; Fang, Te-Yung; Wang, Pa-Chun; Liu, Yi-Wen
2018-05-01
The presence of click-evoked (CE) otoacoustic emissions (OAEs) has been clinically accepted as an indicator of normal cochlear processing of sounds. For treatment and diagnostic purposes, however, clinicians do not typically pay attention to the detailed spectrum and waveform of CEOAEs. A possible reason is the lack of noise-robust signal processing tools to estimate physiologically meaningful time-frequency properties of CEOAEs, such as the latency of spectral components. In this ongoing study, we applied a modern tool called concentration of frequency and time (ConceFT, [1]) to analyze CEOAE waveforms. Randomly combined orthogonal functions are used as windowing functions for time-frequency analysis. The resulting spectrograms are subject to nonlinear time-frequency reassignment so as to enhance the concentration of time-varying sinusoidal components, and the results after reassignment can be further averaged across the random choices of windows. CEOAE waveforms are acquired by a linear averaging paradigm, and longitudinal data are currently being collected from patients with Ménière's disease (MD) and a control group of normal-hearing subjects. When a CEOAE is present, the ConceFT plots show traces of decreasing but fluctuating instantaneous frequency against time. For comparison purposes, the same processing methods are also applied to CEOAE data from cochlear mechanics simulation.
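The windowing idea behind ConceFT (before the reassignment step, which is omitted here) can be sketched by averaging spectrograms taken with random unit-norm combinations of orthogonal DPSS tapers; the test signal is a synthetic noisy chirp, not a CEOAE.

```python
import numpy as np
from scipy.signal.windows import dpss

def stft_power(x, win, hop):
    """Power spectrogram with an arbitrary analysis window."""
    n = len(win)
    frames = np.array([x[i:i + n] * win
                       for i in range(0, len(x) - n + 1, hop)])
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

rng = np.random.default_rng(9)
fs = 1000.0
t = np.arange(2000) / fs
# Synthetic noisy chirp as a stand-in for an emission waveform.
x = np.sin(2 * np.pi * (100.0 * t + 40.0 * t ** 2)) + rng.normal(0.0, 1.0, t.size)

# Orthogonal DPSS tapers; average spectrograms over random unit-norm
# combinations of them (the ConceFT windowing idea, reassignment omitted).
tapers = dpss(256, NW=4, Kmax=4)
acc, n_combo = None, 20
for _ in range(n_combo):
    c = rng.normal(size=tapers.shape[0])
    win = (c @ tapers) / np.linalg.norm(c)
    s = stft_power(x, win, hop=64)
    acc = s if acc is None else acc + s
avg_spec = acc / n_combo
single = stft_power(x, tapers[0], hop=64)

# In a noise-only band (above the chirp's frequency range), averaging over
# window choices should shrink the relative spread of spectrogram values.
band = slice(100, None)
cv_avg = avg_spec[:, band].std() / avg_spec[:, band].mean()
cv_single = single[:, band].std() / single[:, band].mean()
```

The variance reduction in the noise floor is what makes the time-frequency traces of a weak emission easier to follow; ConceFT adds nonlinear reassignment on top to sharpen the instantaneous-frequency ridges.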
NASA Technical Reports Server (NTRS)
Ko, William L.; Gong, Leslie
2000-01-01
To visually record the initial free-flight event of the Hyper-X research flight vehicle immediately after separation from the Pegasus® booster rocket, a video camera was mounted on the bulkhead of the adapter through which Hyper-X rides on Pegasus. The video camera was shielded by a protective camera window made of heat-resistant quartz material. When Hyper-X separates from Pegasus, this camera window will be suddenly exposed to Mach 7 stagnation thermal shock and dynamic pressure loading (aerothermal loading). To examine the structural integrity, a thermoelastic analysis was performed, and the stress distributions in the camera windows were calculated. The critical stress point, where the tensile stress reaches its maximum value for each camera window, was identified, and the maximum tensile stress at that critical point was found to be considerably lower than the tensile failure stress of the camera window material.
Optimal ranking regime analysis of TreeFlow dendrohydrological reconstructions
USDA-ARS?s Scientific Manuscript database
The Optimal Ranking Regime (ORR) method was used to identify 6-100 year time windows containing significant ranking sequences in 55 western U.S. streamflow reconstructions, and reconstructions of the level of the Great Salt Lake and San Francisco Bay salinity during 1500-2007. The method’s ability t...
Dong, Jie; Wang, Dawei; Ma, Zhenshen; Deng, Guodong; Wang, Lanhua; Zhang, Jiandong
2017-01-01
The aim of the study was to evaluate the 3.0 T magnetic resonance (MR) perfusion imaging scanning time window following contrast injection for differentiating benign and malignant breast lesions, and to determine the optimum scanning time window for increased scanner usage efficiency and reduced diagnostic adverse risk factors. A total of 52 women with breast abnormalities were selected for conventional MR imaging and T1 dynamic-enhanced imaging. Quantitative parameters [volume transfer constant (Ktrans), rate constant (Kep) and extravascular extracellular volume fraction (Ve)] were calculated at phases 10, 20, 30, 40 and 50, which represented time windows at 5, 10, 15, 20 and 25 min, respectively, following injection of contrast agent. The association of the parameters at different phases with benign and malignant tumor diagnosis was analyzed. MR perfusion imaging was verified as an effective modality in the diagnosis of breast malignancies and the best scanning time window was identified: i) Values of Ktrans and Kep at all phases were statistically significant in differentiating benign and malignant tumors (P<0.05), while the value of Ve had statistical significance only at phase 10, and not at any other phase (P>0.05); ii) values of Ve in benign tumors increased with phase number, but showed no obvious changes across phases in malignant tumors; iii) the optimum scanning time window of breast perfusion imaging with 3.0 T MR was between phases 10 and 30 (i.e., between 5 and 15 min after contrast agent injection). The variation trend of Ve values at different phases may serve as a diagnostic reference for differentiating benign and malignant breast abnormalities. The most efficient scanning time window was indicated to be 5 min after contrast injection, based on the observation that the Ve value had statistical significance in diagnosis only at phase 10. 
However, the optimal scanning time window is from 5 to 15 min following the injection of contrast agent, since the variation trend of Ve can serve as a diagnostic reference. PMID:28450944
Timing anthropogenic stressors to mitigate their impact on marine ecosystem resilience.
Wu, Paul Pao-Yen; Mengersen, Kerrie; McMahon, Kathryn; Kendrick, Gary A; Chartrand, Kathryn; York, Paul H; Rasheed, Michael A; Caley, M Julian
2017-11-02
Better mitigation of anthropogenic stressors on marine ecosystems is urgently needed to address increasing biodiversity losses worldwide. We explore opportunities for stressor mitigation using whole-of-systems modelling of ecological resilience, accounting for complex interactions between stressors, their timing and duration, background environmental conditions and biological processes. We then search for ecological windows, times when stressors minimally impact ecological resilience, defined here as risk, recovery and resistance. We show for 28 globally distributed seagrass meadows that stressor scheduling that exploits ecological windows for dredging campaigns can achieve up to a fourfold reduction in recovery time and 35% reduction in extinction risk. Although the timing and length of windows vary among sites to some degree, global trends indicate favourable windows in autumn and winter. Our results demonstrate that resilience is dynamic with respect to space, time and stressors, varying most strongly with: (i) the life history of the seagrass genus and (ii) the duration and timing of the impacting stress.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
.... Actual pile driving time during this work window will depend on a number of factors, such as sediments... period beginning in November 2010, and ending in February 2011. This work window was selected to coincide.... The work window also coincides with the USFWS' required construction work window to avoid the peak...
ERIC Educational Resources Information Center
Roman, Harry T.
2010-01-01
Skyscrapers sure do have a lot of windows, and these windows are cleaned and checked regularly. All this takes time, money, and puts workers at potential risk. Might there be a better way to do it? In this article, the author discusses a window-washing challenge and describes how students can tackle this task, pick up the challenge, and creatively…
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
To satisfy real-time and generality requirements, a laser target simulator for a semi-physical simulation system based on the RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper-lower computer simulation platform architecture used in most current real-time systems, this system has better maintainability and portability. The system runs on the Windows platform, using the Windows RTX real-time extension subsystem together with a reflective memory network to ensure real-time performance and to complete real-time tasks such as calculating the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, LabWindows/CVI is used to build a graphical interface and to complete non-real-time tasks in the simulation process, such as man-machine interaction and the display and storage of simulation data, which run under a Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is accomplished. The experimental results show that the system has strong real-time performance, high stability, and high simulation accuracy, together with good human-computer interaction.
Rapidity window dependences of higher order cumulants and diffusion master equation
NASA Astrophysics Data System (ADS)
Kitazawa, Masakiyo
2015-10-01
We study the rapidity window dependences of higher order cumulants of conserved charges observed in relativistic heavy ion collisions. The time evolution and the rapidity window dependence of the non-Gaussian fluctuations are described by the diffusion master equation. Analytic formulas for the time evolution of cumulants in a rapidity window are obtained for arbitrary initial conditions. We discuss that the rapidity window dependences of the non-Gaussian cumulants have characteristic structures reflecting the non-equilibrium property of fluctuations, which can be observed in relativistic heavy ion collisions with present detectors. It is argued that a variety of information on the thermal and transport properties of the hot medium can be revealed experimentally by studying the rapidity window dependences, especially by the combined use of the higher order cumulants. Formulas of higher order cumulants for a probability distribution composed of sub-probabilities, which are useful for various studies of non-Gaussian cumulants, are also presented.
Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju
2017-01-01
Quantifying the meditation experience is a subjective and complex issue because it is confounded by many factors, such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: an artificial neural network and a support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for the detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and the shifting interval of the window (the time interval by which the analyzed data are shifted), the effect of immediate analysis for the 2 methods is compared. This analysis system is applied to 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior and to senior. After an exhaustive calculation and cross-validation across all variables, an accuracy rate >98% is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In short, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds for the classification of meditative state. Our proposed classifier of the meditation experience provides a rapid evaluation system to distinguish meditation experience and a beneficial application of artificial intelligence techniques to big-data analysis. PMID:28422856
NASA Astrophysics Data System (ADS)
Žaknić-Ćatović, Ana; Gough, William A.
2018-04-01
A climatological observing window (COW) is defined as a time frame over which continuous or extreme air temperature measurements are collected. A 24-h time interval, ending at 00UTC or shifted to end at 06UTC, has been associated with difficulties in characterizing daily temperature extrema. A fixed 24-h COW used to obtain the temperature minima leads to potential misidentification, because the time discretization interval fragments a single "nighttime" into two subsequent nighttime periods. The correct identification of air temperature extrema is achievable using a COW that identifies the daily minimum over a single nighttime period and the maximum over a single daytime period, as determined by sunrise and sunset. Owing to a common absence of hourly air temperature observations, the accuracy of the mean temperature estimate depends on the accuracy of the determination of diurnal air temperature extrema. Qualitative and quantitative criteria were used to examine the impact of the COW on detecting daily air temperature extrema. The timing of the 24-h observing window occasionally affects the determination of daily extrema through a mischaracterization of the diurnal minima and, by extension, can lead to errors in determining daily mean temperature. Hourly air temperature data for the period from 1987 to 2014, obtained from the Toronto Buttonville Municipal Airport weather station, were used in the analysis of COW impacts on the detection of daily temperature extrema and the calculation of annual temperature averages based on such extrema.
Optimization of ramp area aircraft push back time windows in the presence of uncertainty
NASA Astrophysics Data System (ADS)
Coupe, William Jeremy
It is well known that airport surface traffic congestion at major airports is responsible for increased taxi-out times, fuel burn and excess emissions and there is potential to mitigate these negative consequences through optimizing airport surface traffic operations. Due to a highly congested voice communication channel between pilots and air traffic controllers and a data communication channel that is used only for limited functions, one of the most viable near-term strategies for improvement of the surface traffic is issuing a push back advisory to each departing aircraft. This dissertation focuses on the optimization of a push back time window for each departing aircraft. The optimization takes into account both spatial and temporal uncertainties of ramp area aircraft trajectories. The uncertainties are described by a stochastic kinematic model of aircraft trajectories, which is used to infer distributions of combinations of push back times that lead to conflict among trajectories from different gates. The model is validated and the distributions are included in the push back time window optimization. Under the assumption of a fixed taxiway spot schedule, the computed push back time windows can be integrated with a higher level taxiway scheduler to optimize the flow of traffic from the gate to the departure runway queue. To enable real-time decision making the computational time of the push back time window optimization is critical and is analyzed throughout.
Measuring kinetics of complex single ion channel data using mean-variance histograms.
Patlak, J B
1993-07-01
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state, open channel noise, and fast flickers to other states were present, as were a substantial number of subconductance states. 
"Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
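The windowed mean-variance construction described above can be prototyped in a few lines. The sketch below is a minimal illustration on a synthetic two-level record, not the author's implementation; the window length, noise level, current levels, and variance threshold are illustrative assumptions.

```python
import numpy as np

def mean_variance_histogram(trace, n=10, bins=50):
    """Slide a window of n consecutive samples over a digitized current
    trace and assemble the (mean, variance) pairs into a 2-D histogram."""
    win = np.lib.stride_tricks.sliding_window_view(trace, n)
    means = win.mean(axis=1)
    variances = win.var(axis=1)
    hist, mean_edges, var_edges = np.histogram2d(means, variances, bins=bins)
    return hist, mean_edges, var_edges, means, variances

# Synthetic two-level "channel" record: closed at 0 pA, open at 1 pA, plus noise.
rng = np.random.default_rng(0)
levels = np.repeat([0.0, 1.0, 0.0, 1.0, 0.0], 400)   # dwell segments
trace = levels + rng.normal(0.0, 0.05, size=levels.size)

hist, me, ve, means, variances = mean_variance_histogram(trace, n=10)
# Windows that sit entirely within one level have low variance; their means
# cluster near the defined current levels (0 and 1 pA). Windows spanning a
# transition show high variance and are excluded by the threshold.
low_var = means[variances < 0.01]
```

Counting events in each low-variance region as a function of window width would then recover the dwell-time kinetics, as described in the abstract.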
Osadchii, Oleg E.
2014-01-01
Normal hearts exhibit a positive time difference between the end of ventricular contraction and the end of QT interval, which is referred to as the electromechanical (EM) window. Drug-induced prolongation of repolarization may lead to the negative EM window, which was proposed to be a novel proarrhythmic marker. This study examined whether abnormal changes in the EM window may account for arrhythmogenic effects produced by hypokalemia. Left ventricular pressure, electrocardiogram, and epicardial monophasic action potentials were recorded in perfused hearts from guinea-pig and rabbit. Hypokalemia (2.5 mM K+) was found to prolong repolarization, reduce the EM window, and promote tachyarrhythmia. Nevertheless, during both regular pacing and extrasystolic excitation, the increased QT interval invariably remained shorter than the duration of mechanical systole, thus yielding positive EM window values. Hypokalemia-induced arrhythmogenicity was associated with slowed ventricular conduction, and shortened effective refractory periods, which translated to a reduced excitation wavelength index. Hypokalemia also evoked non-uniform prolongation of action potential duration in distinct epicardial regions, which resulted in increased spatial variability in the repolarization time. These findings suggest that arrhythmogenic effects of hypokalemia are not accounted for by the negative EM window, and are rather attributed to abnormal changes in ventricular conduction times, refractoriness, excitation wavelength, and spatial repolarization gradients. PMID:25141124
Integrating speech in time depends on temporal expectancies and attention.
Scharinger, Mathias; Steinberg, Johanna; Tavano, Alessandro
2017-08-01
Sensory information that unfolds in time, such as in speech perception, relies on efficient chunking mechanisms in order to yield optimally-sized units for further processing. Whether two successive acoustic events receive a one-unit or a two-unit interpretation seems to depend on the fit between their temporal extent and a stipulated temporal window of integration. However, there is ongoing debate on how flexible this temporal window of integration should be, especially for the processing of speech sounds. Furthermore, there is no direct evidence of whether attention may modulate the temporal constraints on the integration window. For this reason, we here examine how different word durations, which lead to different temporal separations of sound onsets, interact with attention. In an electroencephalography (EEG) study, participants actively and passively listened to words in which word-final consonants were occasionally omitted. Words either had a natural duration or were artificially prolonged in order to increase the separation of speech sound onsets. Omission responses to incomplete speech input, originating in left temporal cortex, decreased when the critical speech sound was separated from previous sounds by more than 250 msec, i.e., when the separation was larger than the stipulated temporal window of integration (125-150 msec). Attention, on the other hand, only increased omission responses for stimuli with natural durations. We complemented the event-related potential (ERP) analyses with a frequency-domain analysis at the stimulus presentation rate. Notably, the power at the stimulation frequency showed the same duration and attention effects as the omission responses. We interpret these findings against the background of existing research on temporal integration windows and further suggest that our findings may be accounted for within the framework of predictive coding. Copyright © 2017 Elsevier Ltd. All rights reserved.
Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M
2013-08-30
In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
Pilatti, Fernanda Kokowicz; Ramlov, Fernanda; Schmidt, Eder Carlos; Costa, Christopher; Oliveira, Eva Regina de; Bauer, Claudia M; Rocha, Miguel; Bouzon, Zenilda Laurita; Maraschin, Marcelo
2017-01-30
Fossil fuels, e.g. gasoline and diesel oil, account for a substantial share of the pollution that affects marine ecosystems. Environmental metabolomics is an emerging field that may help unravel the effect of these xenobiotics on seaweeds and provide methodologies for biomonitoring coastal ecosystems. In the present study, FTIR and multivariate analysis were used to discriminate metabolic profiles of Ulva lactuca after in vitro exposure to diesel oil and gasoline, in combinations of concentrations (0.001%, 0.01%, 0.1%, and 1.0% v/v) and times of exposure (30 min, 1 h, 12 h, and 24 h). PCA and HCA performed on the entire mid-infrared spectral window were able to discriminate diesel oil-exposed thalli from the gasoline-exposed ones. HCA performed on the spectral window related to protein absorbance (1700-1500 cm⁻¹) enabled the best discrimination between gasoline-exposed samples regarding the time of exposure, and between diesel oil-exposed samples according to the concentration. The results indicate that the combination of FTIR with multivariate analysis is a simple and efficient methodology for metabolic profiling with potential use for biomonitoring strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Presentation Extensions of the SOAP
NASA Technical Reports Server (NTRS)
Carnright, Robert; Stodden, David; Coggi, John
2009-01-01
A set of extensions of the Satellite Orbit Analysis Program (SOAP) enables simultaneous and/or sequential presentation of information from multiple sources. SOAP is used in the aerospace community as a means of collaborative visualization and analysis of data on planned spacecraft missions. The following definitions of terms also describe the display modalities of SOAP as now extended: a) "View" signifies an animated three-dimensional (3D) scene, two-dimensional still image, plot of numerical data, or any other visible display derived from a computational simulation or other data source; b) "Viewport" signifies a rectangular portion of a computer-display window containing a view; c) "Palette" signifies a collection of one or more viewports configured for simultaneous (split-screen) display in the same window; d) "Slide" signifies a palette with a beginning and ending time and an animation time step; and e) "Presentation" signifies a prescribed sequence of slides. For example, multiple 3D views from different locations can be crafted for simultaneous display and combined with numerical plots and other representations of data for both qualitative and quantitative analysis. The resulting sets of views can be temporally sequenced to convey visual impressions of a sequence of events for a planned mission.
Double Fourier analysis for Emotion Identification in Voiced Speech
NASA Astrophysics Data System (ADS)
Sierra-Sosa, D.; Bastidas, M.; Ortiz P., D.; Quintero, O. L.
2016-04-01
We propose a novel analysis alternative, based on two Fourier transforms, for emotion recognition from speech. Fourier analysis makes it possible to display and synthesize different signals in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier transform with Gaussian windows; this spectrogram portrays frequency-related features, such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The time-frequency representation from the spectrogram is then treated as an image and processed through a two-dimensional Fourier transform in order to perform a spatial Fourier analysis of it. Finally, features related to emotions in voiced speech are extracted and presented.
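The two-stage analysis (a Gaussian-window short-time Fourier transform, followed by a two-dimensional Fourier transform of the resulting time-frequency image) can be sketched as follows. This is a minimal illustration under assumed parameters (frame length, hop, window width, and a pure tone standing in for voiced speech), not the authors' feature-extraction pipeline.

```python
import numpy as np

def gaussian_stft(x, frame=256, hop=64, sigma=32):
    """Short-time Fourier transform with a Gaussian window: returns the
    magnitude spectrogram, shape (frequency bins, frames)."""
    n = np.arange(frame)
    window = np.exp(-0.5 * ((n - frame / 2) / sigma) ** 2)
    starts = range(0, len(x) - frame + 1, hop)
    frames = np.stack([x[s:s + frame] * window for s in starts], axis=1)
    return np.abs(np.fft.rfft(frames, axis=0))

fs = 8000
t = np.arange(fs) / fs                      # 1 s of signal
x = np.sin(2 * np.pi * 440 * t)             # stand-in for a voiced sound

spec = gaussian_stft(x)                     # stage 1: time-frequency image
spatial = np.abs(np.fft.fft2(spec))         # stage 2: 2-D Fourier analysis of the image
```

In the paper's scheme, emotion-related features would then be extracted from the spatial spectrum `spatial`.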
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-07-01
Epidemiological research supports an association between maternal exposure to air pollution during pregnancy and adverse children's health outcomes. Advances in exposure assessment and statistics allow for estimation of both critical windows of vulnerability and exposure effect heterogeneity. Simultaneous estimation of windows of vulnerability and effect heterogeneity can be accomplished by fitting a distributed lag model (DLM) stratified by subgroup. However, this can provide an incomplete picture of how effects vary across subgroups because it does not allow for subgroups to have the same window but different within-window effects or to have different windows but the same within-window effect. Because the timing of some developmental processes are common across subpopulations of infants while for others the timing differs across subgroups, both scenarios are important to consider when evaluating health risks of prenatal exposures. We propose a new approach that partitions the DLM into a constrained functional predictor that estimates windows of vulnerability and a scalar effect representing the within-window effect directly. The proposed method allows for heterogeneity in only the window, only the within-window effect, or both. In a simulation study we show that a model assuming a shared component across groups results in lower bias and mean squared error for the estimated windows and effects when that component is in fact constant across groups. We apply the proposed method to estimate windows of vulnerability in the association between prenatal exposures to fine particulate matter and each of birth weight and asthma incidence, and estimate how these associations vary by sex and maternal obesity status in a Boston-area prospective pre-birth cohort study. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
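As a toy illustration of the distributed lag idea (not the authors' constrained functional estimator), an unconstrained DLM fit by ordinary least squares can recover a simulated window of vulnerability. The window placement, effect size, and noise level below are assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, lags = 1000, 30                      # subjects, weekly exposure lags
window = np.zeros(lags)
window[8:13] = 1.0                      # true window of vulnerability: lags 8-12

exposure = rng.normal(size=(n, lags))   # lagged exposure matrix, one row per subject
outcome = exposure @ window + rng.normal(0.0, 0.5, size=n)

# Unconstrained distributed lag model: intercept plus one coefficient per lag.
X = np.column_stack([np.ones(n), exposure])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
lag_effects = beta[1:]                  # estimated lag-specific effects
```

The estimated `lag_effects` should be near 1 inside the true window and near 0 outside it; the paper's contribution is to constrain such fits so that the window and the within-window effect can vary (or be shared) across subgroups.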
Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A
2018-03-01
This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, drawing on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches in which disturbances of different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remainder were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real-time. Copyright © 2017 Elsevier B.V. All rights reserved.
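The MSPC charts named above (Hotelling's T² and the squared prediction error, SPE) can be sketched from a PCA model as follows. The data here are synthetic stand-ins for NIR spectra, and the factor structure, noise levels, and disturbance are assumptions of the example, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Normal operation" training spectra: 100 samples x 50 wavelengths,
# driven by 3 latent factors (stand-ins for NIR spectra of good batches).
loadings_true = rng.normal(size=(3, 50))
train = rng.normal(size=(100, 3)) @ loadings_true + 0.1 * rng.normal(size=(100, 50))

mean = train.mean(axis=0)
Xc = train - mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
P = Vt[:k].T                             # PCA loadings
var = (s[:k] ** 2) / (len(train) - 1)    # score variances for T^2 scaling

def t2_spe(x):
    """Hotelling's T^2 and squared prediction error (SPE) for one new sample."""
    t = (x - mean) @ P                   # scores in the PCA subspace
    residual = (x - mean) - t @ P.T      # part not explained by the model
    return np.sum(t ** 2 / var), np.sum(residual ** 2)

normal = rng.normal(size=3) @ loadings_true + 0.1 * rng.normal(size=50)
fault = normal + 2.0 * rng.normal(size=50)   # disturbance off the PCA subspace

t2_ok, spe_ok = t2_spe(normal)
t2_bad, spe_bad = t2_spe(fault)
```

In a monitoring chart, control limits would be set from the training batches, and samples whose T² or SPE exceed those limits flag a disturbance; the faulty sample above shows a much larger SPE than the normal one.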
Analysis techniques for residual acceleration data
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.
1990-01-01
Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides more information than a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
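The coordinate transformation and windowed Fourier analysis described above can be sketched as follows; the rotation axis, sampling rate, and g-jitter frequency are illustrative assumptions, not flight data.

```python
import numpy as np

def rotation_z(theta):
    """Transformation matrix rotating vectors about the z axis
    (e.g., spacecraft frame into an experiment frame)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

fs = 100.0                                # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic residual acceleration: a 2 Hz g-jitter along the spacecraft x axis.
accel = np.stack([1e-4 * np.sin(2 * np.pi * 2.0 * t),
                  np.zeros_like(t),
                  np.zeros_like(t)])      # shape (3, n)

# Transform the time series into experiment coordinates rotated by 30 degrees.
accel_exp = rotation_z(np.radians(30.0)) @ accel

# Fourier analysis of one data window along the experiment x axis.
window = accel_exp[0, :512]
spectrum = np.abs(np.fft.rfft(window))
freqs = np.fft.rfftfreq(512, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

The dominant frequency recovered from the rotated data matches the injected 2 Hz jitter, illustrating that the transformation preserves the spectral content while re-expressing it along the experiment's axes.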
NASA Astrophysics Data System (ADS)
Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang
2017-03-01
Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite-difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimal combination of window functions was designed for the finite-difference operator, based on a truncated approximation of the spatial convolution series in pseudo-spectral space, to normalize the outcomes of existing window functions for different orders. The proposed combined window functions not only inherit the characteristics of the various window functions, providing better truncation results, but also allow the truncation error of the finite-difference operator to be controlled manually and visually by adjusting the combinations and analyzing the characteristics of the main and side lobes of the amplitude response. The error level and elastic forward modeling under the proposed combined system were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified binomial window and conventional finite differences. Numerical simulation verifies the reliability of the proposed method.
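The underlying construction (truncating the ideal pseudo-spectral first-derivative operator with a window function to obtain finite-difference coefficients) can be sketched in one dimension. A single Hann window is used here for illustration rather than the paper's combined window, and the stencil width and test wavenumber are assumptions of the example.

```python
import numpy as np

M = 8                                        # stencil half-width
m = np.arange(1, M + 1)
ideal = (-1.0) ** (m + 1) / m                # ideal (pseudo-spectral) derivative coefficients
hann = 0.5 * (1 + np.cos(np.pi * m / M))     # window tapering the truncated operator
c = ideal * hann                             # windowed finite-difference coefficients

def derivative(f):
    """First derivative on a unit grid using the antisymmetric windowed stencil,
    evaluated at the interior points i = M .. len(f)-M-1."""
    df = np.zeros(f.size - 2 * M)
    for j, cm in enumerate(c, start=1):
        df += cm * (f[M + j:f.size - M + j] - f[M - j:f.size - M - j])
    return df

x = np.arange(400.0)
df = derivative(np.sin(0.2 * x))             # numerical derivative of sin(0.2 x)
```

Without the window, the truncated series rings badly (Gibbs oscillations in the amplitude response); the taper trades a slightly narrower usable wavenumber band for strongly suppressed side lobes, which is the dispersion-control knob the combined windows generalize.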
Alternative Fuels Data Center: Hydrogen Drive
A frequency-based window width optimized two-dimensional S-Transform profilometry
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao
2017-11-01
A new scheme is proposed as a frequency-based window-width-optimized two-dimensional S-transform profilometry, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, which uses a Gaussian window with width proportional to the reciprocal local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis as well as the characteristics of the modified Gaussian window. Simulations are used to evaluate the proposed scheme; the results show that the new scheme has better noise reduction ability and can extract the phase distribution more precisely than the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction of a complex plastic cat mask to show its effectiveness.
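The role of the exponent p can be seen in a one-dimensional sketch of the S-transform computed in the frequency domain; the paper's two-dimensional version applies the same idea along x and y with exponents pu and pv. The signal and parameters below are illustrative, not the paper's fringe data.

```python
import numpy as np

def s_transform(x, p=1.0):
    """Discrete S-transform with window width proportional to 1/|f|**p.
    p=1 gives the standard S-transform; other p values rescale the
    Gaussian window width with frequency, as in the optimized scheme."""
    N = x.size
    X = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N           # frequency offsets in bins
    rows = []
    for n in range(1, N // 2 + 1):      # skip the zero-frequency row
        gauss = np.exp(-2 * np.pi ** 2 * m ** 2 / n ** (2 * p))
        rows.append(np.fft.ifft(np.roll(X, -n) * gauss))
    return np.array(rows)               # shape (N//2, N); row n-1 <-> frequency bin n

N = 256
t = np.arange(N)
x = np.cos(2 * np.pi * 40 * t / N)      # tone at frequency bin 40
S = np.abs(s_transform(x))
```

For each time column, the magnitude ridge of `S` sits at the local frequency of the signal; lowering p widens the window at high frequencies, trading frequency resolution for noise robustness, which is the tuning the profilometry scheme exploits.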
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
The effect of low ceiling on the external combustion of the cabin fire
NASA Astrophysics Data System (ADS)
Su, Shichuan; Chen, Changyun; Wang, Liang; Wei, Chengyin; Cui, Haibing; Guo, Chengyu
2018-06-01
External combustion is a phenomenon in which flame flares out of a window and burns outside. Because of the particular structure of a ship's cabin, external combustion poses a great danger. In this paper, fires in three kinds of low-ceiling ship cabins are numerically simulated and analyzed using the large-eddy simulation technique. Through the analysis of temperature, flue gas velocity, heat flux density, and related quantities, the external combustion behavior during fire development is characterized. The results show that when external combustion occurs, the amount of fuel escaping decreases with roof height, while the temperature above the window increases with ceiling height. The heat flux in the external combustion flame is provided mainly by radiation, with convection contributing only a small part. In the plume region there is a period during which the convective heat flux exceeds the radiative heat flux, and this period lengthens as the ceiling height increases. Regardless of ceiling height, external combustion will seriously damage the structure of the ship after a certain period of time. The velocity distributions for the three ceiling heights are similar, but the affected area grows with ceiling height.
Varela, P; Silva, A; da Silva, F; da Graça, S; Manso, M E; Conway, G D
2010-10-01
The spectrogram is one of the best-known time-frequency distributions suitable to analyze signals whose energy varies both in time and frequency. In reflectometry, it has been used to obtain the frequency content of FM-CW signals for density profile inversion and also to study plasma density fluctuations from swept and fixed frequency data. Being implemented via the short-time Fourier transform, the spectrogram is limited in resolution, and for that reason several methods have been developed to overcome this problem. Among those, we focus on the reassigned spectrogram technique that is both easily automated and computationally efficient requiring only the calculation of two additional spectrograms. In each time-frequency window, the technique reallocates the spectrogram coordinates to the region that most contributes to the signal energy. The application to ASDEX Upgrade reflectometry data results in better energy concentration and improved localization of the spectral content of the reflected signals. When combined with the automatic (data driven) window length spectrogram, this technique provides improved profile accuracy, in particular, in regions where frequency content varies most rapidly such as the edge pedestal shoulder.
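The reassignment step indeed needs only auxiliary spectrograms alongside the ordinary one; for frequency reassignment a single extra STFT with the time-derivative of the window suffices. The sketch below follows Auger and Flandrin's method with a Gaussian window; the sampling rate, frame length, and test tone are illustrative assumptions, not reflectometry data.

```python
import numpy as np

def reassigned_frequencies(x, fs, frame=128, hop=32, sigma=16.0):
    """Gaussian-window spectrogram plus reassigned frequencies, computed
    from one auxiliary STFT whose window is the time-derivative of h."""
    n = np.arange(frame) - frame / 2
    h = np.exp(-0.5 * (n / sigma) ** 2)        # Gaussian analysis window
    dh = -(n / sigma ** 2) * h * fs            # dh/dt in 1/s (t = n/fs)
    starts = range(0, len(x) - frame + 1, hop)
    frames = np.stack([x[s:s + frame] for s in starts])
    Xh = np.fft.rfft(frames * h, axis=1)
    Xdh = np.fft.rfft(frames * dh, axis=1)
    freqs = np.fft.rfftfreq(frame, d=1 / fs)
    # Frequency reassignment: f_hat = f - Im(Xdh * conj(Xh)) / (2*pi*|Xh|^2)
    with np.errstate(divide="ignore", invalid="ignore"):
        f_hat = freqs - np.imag(Xdh * np.conj(Xh)) / (2 * np.pi * np.abs(Xh) ** 2)
    return np.abs(Xh) ** 2, freqs, f_hat

fs = 1000.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 125.0 * t)              # test tone at 125 Hz
power, freqs, f_hat = reassigned_frequencies(x, fs)
```

For each frame, the reassigned frequency at the energy peak collapses onto the tone's true frequency, which is the sharpening effect that improves profile localization in the reflectometry application (time reassignment uses a second auxiliary STFT with a time-weighted window, analogously).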
NASA Astrophysics Data System (ADS)
Kim, Shin-Woo; Noh, Nam-Kyu; Lim, Gyu-Ho
2013-04-01
This study presents the introduction of retrospective optimal interpolation (ROI) and its application with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of the window. The assimilation window of the ROI algorithm is gradually increased, similar to that of quasi-static variational assimilation (QSVA; Pires et al., 1996). Unlike the QSVA method, however, ROI assimilates data at post-analysis times using the perturbation method (Verlaan and Heemink, 1997), without an adjoint model. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The computational cost of ROI can be reduced through eigen-decomposition of the background error covariance, which concentrates the ROI analyses on the error variances of the governing eigenmodes by transforming the control variables into eigenspace. A total energy norm is used for the normalization of each control variable. In this study, the ROI method is applied to the WRF model in an Observing System Simulation Experiment (OSSE) to validate the algorithm and investigate its capability. Horizontal wind, pressure, potential temperature, and water vapor mixing ratio are used as control variables and observations. First, a single-profile assimilation experiment is performed. Subsequently, OSSEs are performed using a virtual observing system consisting of synop, ship, and sonde data. The difference between forecast errors with and without assimilation clearly grows with time, indicating that assimilation by ROI improves the forecast.
The characteristics and strengths/weaknesses of the ROI method are also investigated in experiments against the 3D-Var (3-dimensional variational) and 4D-Var (4-dimensional variational) methods. At the initial time, ROI produces a larger forecast error than 4D-Var. However, the difference between the two results decreases gradually with time, and ROI shows a clearly better result (i.e., smaller forecast error) than 4D-Var after a 9-hour forecast.
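The OI analysis step that ROI applies repeatedly as its window grows can be written compactly. This is a textbook sketch of the standard OI update, with toy matrix sizes and values rather than WRF fields:

```python
import numpy as np

def oi_update(xb, B, y, H, R):
    """One optimal-interpolation analysis step:
    x_a = x_b + K (y - H x_b),  with gain K = B H^T (H B H^T + R)^{-1}.
    ROI values the state at the initial time of its gradually
    lengthening window, applying such updates as new data arrive."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
    xa = xb + K @ (y - H @ xb)                     # analysis state
    A = (np.eye(len(xb)) - K @ H) @ B              # analysis error covariance
    return xa, A
```

With equal background and observation error variances the analysis falls halfway between background and observation, and the analysis error variance is halved.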
Zheng, Xiangrong; Zhang, Weishe; Lu, Chan; Norbäck, Dan; Deng, Qihong
2018-05-01
It is well known that exposure to thermal stress during pregnancy can lead to an increased incidence of premature birth. However, little is known regarding the window(s) of susceptibility during the course of a pregnancy. We attempted to identify possible windows of susceptibility in a cohort study of 3604 children in Changsha, a city with hot-summer and cold-winter climatic characteristics. We examined the association between preterm birth (PTB) and ambient temperature during different time windows of pregnancy: conception month, the three trimesters, birth month, and the entire pregnancy. We found a U-shaped relation between the prevalence of PTB and mean ambient temperature during pregnancy. Both high and low temperatures were associated with PTB risk, with adjusted ORs (95% CI) of 2.57 (1.98-3.33) per 0.5 °C increase in the high temperature range (>18.2 °C) and 2.39 (1.93-2.95) per 0.5 °C decrease in the low temperature range (<18.2 °C). Specifically, PTB was significantly associated with ambient temperature and extreme heat/cold days during the conception month and the third trimester. Sensitivity analysis indicated that female fetuses were more susceptible to the risk of ambient temperature. Our study indicates that the risk of preterm birth due to high or low temperature may exist as early as the conception month. Copyright © 2018 Elsevier Ltd. All rights reserved.
Single-agent parallel window search
NASA Technical Reports Server (NTRS)
Powley, Curt; Korf, Richard E.
1991-01-01
Parallel window search is applied to single-agent problems by having different processes simultaneously perform iterations of Iterative-Deepening-A* (IDA*) on the same problem but with different cost thresholds. This approach is limited by the time to perform the goal iteration. To overcome this disadvantage, the authors consider node ordering. They discuss how global node ordering by minimum h among nodes with equal f = g + h values can reduce the time complexity of serial IDA* by reducing the time to perform the iterations prior to the goal iteration. Finally, the two ideas of parallel window search and node ordering are combined to eliminate the weaknesses of each approach while retaining the strengths. The resulting approach, called simply parallel window search, can be used to find a near-optimal solution quickly, improve the solution until it is optimal, and then finally guarantee optimality, depending on the amount of time available.
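The idea can be illustrated on a toy grid problem: each worker runs one IDA* iteration (a cost-bounded depth-first search) with its own threshold, and the smallest successful threshold yields the optimal solution. The grid, walls, and Manhattan heuristic below are invented for illustration, not taken from the paper.

```python
from concurrent.futures import ThreadPoolExecutor

GOAL = (3, 3)
WALLS = {(1, 1), (1, 2), (2, 2)}

def h(s):
    # admissible Manhattan-distance heuristic to the goal cell
    return abs(s[0] - GOAL[0]) + abs(s[1] - GOAL[1])

def neighbors(s):
    r, c = s
    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
        if 0 <= nr < 4 and 0 <= nc < 4 and (nr, nc) not in WALLS:
            yield (nr, nc)

def bounded_dfs(s, g, bound, path):
    """One IDA* iteration: depth-first search pruned where f = g + h > bound."""
    if g + h(s) > bound:
        return None
    if s == GOAL:
        return list(path)
    for n in neighbors(s):
        if n not in path:                # no cycles along the current path
            path.append(n)
            found = bounded_dfs(n, g + 1, bound, path)
            path.pop()
            if found is not None:
                return found
    return None

def parallel_window_search(start, thresholds):
    """Run the iterations for several cost thresholds simultaneously;
    the smallest threshold that succeeds gives the optimal solution."""
    with ThreadPoolExecutor() as ex:
        results = list(ex.map(lambda b: bounded_dfs(start, 0, b, [start]),
                              thresholds))
    for b, r in zip(thresholds, results):
        if r is not None:
            return b, r
    return None, None
```

On this grid, a search started at (0, 0) finds an optimal 6-step path as soon as one worker's threshold reaches the true cost, while larger-threshold workers may return longer, near-optimal paths sooner.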
ERIC Educational Resources Information Center
Zhang, Yili; Smolen, Paul; Baxter, Douglas A.; Byrne, John H.
2010-01-01
Memory consolidation and reconsolidation require kinase activation and protein synthesis. Blocking either process during or shortly after training or recall disrupts memory stabilization, which suggests the existence of a critical time window during which these processes are necessary. Using a computational model of kinase synthesis and…
FRB180311: AstroSat CZTI upper limits and correction to FRB180301 upper limits
NASA Astrophysics Data System (ADS)
Anumarlapudi, A.; Aarthy, E.; Arvind, B.; Bhalerao, V.; Bhattacharya, D.; Rao, A. R.; Vadawale, S.
2018-03-01
We carried out offline analysis of data from Astrosat CZTI in a 200 second window centred on the FRB 180311 (Parkes discovery - Oslowski, S. et al., ATEL #11396) trigger time, 2018-03-11 04:11:54.80 UTC, to look for any coincident hard X-ray flash.
Levenson, M.
1960-10-25
A cave window is described. It is constructed of thick glass panes arranged so that interior panes have smaller windowpane areas and exterior panes have larger areas. Exterior panes on the radiation exposure side are remotely replaceable when darkened excessively. Metal shutters minimize exposure time to extend window life.
Introduction and application of the multiscale coefficient of variation analysis.
Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh
2017-10-01
Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
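A minimal NumPy reading of the MSCV described above (the published analysis and its MATLAB code may weight or window differently; this sketch uses non-overlapping windows and a mean absolute distance):

```python
import numpy as np

def mscv(x, window_sizes):
    """Multiscale coefficient of variation sketch: for each window size,
    the mean absolute distance between local CV estimates and the
    overall CV across all time samples."""
    x = np.asarray(x, dtype=float)
    cv_global = x.std() / x.mean()
    out = {}
    for w in window_sizes:
        local = np.array([x[s:s + w].std() / x[s:s + w].mean()
                          for s in range(0, len(x) - w + 1, w)])
        out[w] = float(np.mean(np.abs(local - cv_global)))
    return out
```

A series whose variability is uniform across windows yields a small MSCV, while a series with bursty variance has local CVs that depart from the global CV, yielding a larger value.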
How to determine life expectancy change of air pollution mortality: a time series study
2011-01-01
Background Information on life expectancy (LE) change is of great concern for policy makers, as evidenced by discussions of the "harvesting" (or "mortality displacement") issue, i.e. how large an LE loss corresponds to the mortality results of time series (TS) studies. Whereas the loss of LE attributable to chronic air pollution exposure can be determined from cohort studies using life table methods, conventional TS studies have identified only deaths due to acute exposure during the immediate past (typically the preceding one to five days), and they provide no information about the LE loss per death. Methods We show how to obtain information on the population-average LE loss by extending the observation window (largest "lag") of TS to include a sufficient number of "impact coefficients" for past exposures ("lags"). We test several methods for determining these coefficients. Once all of the coefficients have been determined, the LE change is calculated as the time integral of the relative risk change after a permanent step change in exposure. Results The method is illustrated with results for daily data of non-accidental mortality in Hong Kong for 1985-2005, regressed against PM10 and SO2 with observation windows of up to 5 years. The majority of the coefficients are statistically significant. The magnitude of the SO2 coefficients is comparable to those for PM10. But a window of 5 years is not sufficient, and the results for LE change are only a lower bound; this bound is consistent with what is implied by other studies of long-term impacts. Conclusions A TS analysis can determine the LE loss, but if the observation window is shorter than the relevant exposures one obtains only a lower bound. PMID:21450107
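The Methods step "LE change as the time integral of the relative-risk change after a permanent step change in exposure" can be made concrete with a numerical sketch. The lag coefficients and the exposure step below are hypothetical placeholders, not the Hong Kong estimates:

```python
import numpy as np

# Hypothetical impact coefficients beta_k: relative mortality change per
# unit exposure at lag k (one-year lags). Real values come from the TS
# regression over the extended observation window.
beta = np.array([0.004, 0.003, 0.002, 0.001, 0.0005])
delta_x = 10.0   # hypothetical permanent step change in exposure
dt = 1.0         # lag spacing in years

# After the step, the relative-risk change at time t is the sum of all
# coefficients with lag <= t; the LE change is its time integral.
# Truncating the window (here at 5 years) yields only a lower bound,
# as the abstract notes.
rr_change = np.cumsum(beta) * delta_x
le_loss_years = float(np.sum(rr_change) * dt)
```

With these placeholder numbers the cumulative relative-risk change integrates to about 0.4 years of LE loss; the point of the sketch is the structure of the calculation, not the value.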
Developmental time windows for axon growth influence neuronal network topology.
Lim, Sol; Kaiser, Marcus
2015-04-01
Early brain connectivity development consists of multiple stages: birth of neurons, their migration, and the subsequent growth of axons and dendrites. Each stage occurs within a certain period of time depending on the type of neuron and cortical layer. Forming synapses between neurons either by growing axons starting at similar times for all neurons (much-overlapped time windows) or at different time points (less-overlapped) may affect the topological and spatial properties of neuronal networks. Here, we explore the extreme cases of axon formation during early development: either starting at the same time for all neurons (parallel, i.e., maximally overlapped time windows) or occurring for each neuron separately, one neuron after another (serial, i.e., no overlap in time windows). For both cases, the number of potential and established synapses remained comparable. Topological and spatial properties, however, differed: neurons that started axon growth early in serial growth achieved higher out-degrees, higher local efficiency, and longer axon lengths, while neurons showed more homogeneous connectivity patterns under parallel growth. Second, connection probability decreased more rapidly with distance between neurons for parallel growth than for serial growth. Third, bidirectional connections were more numerous for parallel growth. Finally, we tested our predictions with C. elegans data. Together, this indicates that time windows for axon growth influence the topological and spatial properties of neuronal networks, opening up the possibility of estimating developmental mechanisms a posteriori from the network properties of a developed network.
Li, Qinglin; Zhao, Meng; Wang, Xiaodan
2018-01-01
To compare the differences between the Kidney Disease Improving Global Outcomes (KDIGO) criteria of the 48-hour window and the 7-day window in the diagnosis of acute kidney injury (AKI) in very elderly patients, as well as the relationship between the 48-hour and 7-day diagnostic windows and 90-day mortality. We retrospectively enrolled very elderly patients (≥75 years old) from the geriatrics department of the Chinese PLA General Hospital between January 2007 and December 2015. AKI patients were divided into 48-hour and 7-day groups by their diagnostic criteria, and into survivor and nonsurvivor groups by their outcomes within 90 days after diagnosis of AKI. In total, 652 patients were included in the final analysis. The median age of the cohort was 87 (84-91) years, and the majority (623, 95.6%) were male. Of the 652 AKI patients, 334 (51.2%) were diagnosed by the 48-hour window and 318 (48.8%) by the 7-day window. The 90-day mortality was 42.5% in patients with 48-hour window AKI and 24.2% in patients with 7-day window AKI. Kaplan-Meier curves showed that 90-day mortality was lower in the 7-day window AKI group than in the 48-hour window AKI group (log-rank P<0.001). Multivariate analysis with a Cox model revealed that diagnosis by the 48-hour window (HR=1.818; 95% CI: 1.256-2.631; P=0.002) was associated with higher 90-day mortality. The 90-day mortality was higher in 48-hour window AKI than in 7-day window AKI in very elderly patients. The 48-hour KDIGO window definition may be less sensitive, but it is significantly better correlated with subsequent mortality and is therefore still appropriate for clinical use. Finding early, sensitive biomarkers of kidney damage is a future direction of research.
Do windows or natural views affect outcomes or costs among patients in ICUs?
Kohn, Rachel; Harhay, Michael O; Cooney, Elizabeth; Small, Dylan S; Halpern, Scott D
2013-07-01
To determine whether potential exposure to natural light via windows or to more pleasing views through windows affects outcomes or costs among critically ill patients. Retrospective cohort study. An academic hospital in Philadelphia, PA. Six thousand one hundred thirty-eight patients admitted to a 24-bed medical ICU and 6,631 patients admitted to a 24-bed surgical ICU from July 1, 2006, to June 30, 2010. Assignment to medical ICU rooms with vs. without windows and to surgical ICU rooms with natural vs. industrial views based on bed availability. In primary analyses adjusting for patient characteristics, medical ICU patients admitted to rooms with (n = 4,093) versus without (n = 2,243) windows did not differ in rates of ICU (p = 0.25) or in-hospital (p = 0.94) mortality, ICU readmissions (p = 0.37), or delirium (p = 0.56). Surgical ICU patients admitted to rooms with natural (n = 3,072) versus industrial (n = 3,588) views experienced slightly shorter ICU lengths of stay and slightly lower variable costs. Instrumental variable analyses based on initial bed assignment and exposure time did not show any differences in any outcomes in either the medical ICU or surgical ICU cohorts, and none of the differences noted in primary analyses remained statistically significant when adjusting for multiple comparisons. In a prespecified subgroup analysis among patients with ICU length of stay greater than 72 hours, MICU windows were associated with reduced ICU (p = 0.02) and hospital mortality (p = 0.04); these results did not meet criteria for significance after adjustment for multiple comparisons. ICU rooms with windows or natural views do not improve outcomes or reduce costs of in-hospital care for general populations of medical and surgical ICU patients. Future work is needed to determine whether targeting light from windows directly toward patients influences outcomes and to explore these effects in patients at high risk for adverse outcomes.
2013-09-30
... accuracy of the analysis. Root mean square difference (RMSD) is much smaller for RIP than for either Simple Ocean Data Assimilation or Incremental ... Analysis Update globally for temperature as well as salinity. Regionally the same results were found, with only one exception in which the salinity RMSD ... short-term forecast using a numerical model with the observations taken within the forecast time window. The resulting state is the so-called "analysis" ...
Sunlight Responsive Thermochromic Window System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millett, F,A; Byker,H, J
2006-10-27
Pleotint has embarked on a novel approach with our Sunlight Responsive Thermochromic, SRT™, windows. We are integrating dynamic sunlight control, high insulation values, and low solar heat gain together in a high-performance window. The Pleotint SRT window is dynamic because it reversibly changes light transmission based on thermochromics activated directly by the heating effect of sunlight. We can achieve a window package with a low solar heat gain coefficient (SHGC), a low U value, and high insulation. At the same time our windows provide good daylighting. Our innovative window design offers architects and building designers the opportunity to choose their desired energy performance, excellent sound reduction, a self-cleaning external pane, or resistance to wind load, blasts, bullets, or hurricanes. SRT windows would provide energy savings estimated at up to 30% over traditional window systems. Glass fabricators will be able to use existing equipment to make the SRT window while adding value and flexibility to the basic design. Glazing installers will have the ability to fit the windows with traditional methods, without wires, power supplies, and controllers. SRT windows can be retrofit into existing buildings.
Effects of temporal variability in ground data collection on classification accuracy
Hoch, G.A.; Cully, J.F.
1999-01-01
This research tested whether the timing of ground data collection can significantly impact the accuracy of land cover classification. Ft. Riley Military Reservation, Kansas, USA was used to test this hypothesis. The U.S. Army's Land Condition Trend Analysis (LCTA) data, collected annually at military bases, were used to ground truth disturbance patterns. Ground data collected over an entire growing season and data collected one year after the imagery had a kappa statistic of 0.33. When using ground data from only within two weeks of image acquisition, the kappa statistic improved to 0.55. Potential sources of this discrepancy are identified. These data demonstrate that there can be significant amounts of land cover change within a narrow time window on military reservations. To accurately conduct land cover classification at military reservations, ground data need to be collected in as narrow a window of time as possible and be closely synchronized with the date of the satellite imagery.
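The kappa statistics quoted above come from confusion matrices between mapped and ground-truth classes; Cohen's kappa itself is straightforward to compute (the matrix below is invented for illustration, not the Ft. Riley data):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                            # observed agreement
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa of 0 means agreement no better than chance; values of 0.33 versus 0.55 therefore mark a substantial gain in chance-corrected accuracy from the narrower collection window.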
Threshold network of a financial market using the P-value of correlation coefficients
NASA Astrophysics Data System (ADS)
Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun
2015-06-01
Threshold methods in financial networks are important tools for extracting information about the state of a market. Previously, absolute thresholds on correlation coefficients have been used; however, these bear no relation to the length of the time window. We assign a threshold value that depends on the size of the time window by using the P-value concept of statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We find that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information extracted depends on the length of the time window used when constructing the TN. We apply the same technique to Standard and Poor's 500 (S&P500) and observe similar results.
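The window-dependent threshold can be sketched as follows: for a chosen significance level p, the critical correlation follows from the t-statistic of a Pearson coefficient with n - 2 degrees of freedom, so longer windows admit smaller correlations as significant. The level p = 0.01 and the toy return series are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy import stats

def pvalue_threshold(n, p=0.01):
    """Critical |correlation| at significance level p with n samples;
    the threshold shrinks as the time window grows."""
    t_c = stats.t.ppf(1 - p / 2, df=n - 2)
    return t_c / np.sqrt(n - 2 + t_c ** 2)

def threshold_network(returns, p=0.01):
    """Adjacency matrix of the threshold network: link assets whose
    correlation is significant at level p for this window length."""
    n, _ = returns.shape                        # n time points, m assets
    c = np.corrcoef(returns, rowvar=False)
    r_c = pvalue_threshold(n, p)
    adj = (np.abs(c) >= r_c) & ~np.eye(c.shape[0], dtype=bool)
    return adj, r_c
```

Because the critical correlation shrinks with n, a single absolute threshold carries different statistical meaning for different window lengths, which is the effect the p-value construction corrects for.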
Effect of the time window on the heat-conduction information filtering model
NASA Astrophysics Data System (ADS)
Guo, Qiang; Song, Wen-Jun; Hou, Lei; Zhang, Yi-Lu; Liu, Jian-Guo
2014-05-01
Recommendation systems have been proposed to filter out the potential tastes and preferences of users online; however, the effect of the time window on their performance is not well understood, even though it is critical for saving memory and decreasing computational complexity. In this paper, by gradually expanding the time window, we investigate the impact of the time window on the heat-conduction information filtering model with ten similarity measures. The experimental results on the benchmark dataset Netflix indicate that, by using only approximately the most recent 11.11% of rating records, the accuracy could be improved by an average of 33.16% and the diversity by 30.62%. In addition, the recommendation performance on the MovieLens dataset could be preserved by considering only approximately the most recent 10.91% of records. While improving recommendation performance, our findings possess significant practical value by largely reducing computational time and shortening data storage space.
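A minimal sketch of heat-conduction (HeatS-style) scoring on a binary user-item matrix, plus the time-window restriction studied above. The toy matrix, the scoring details, and the fraction handling are illustrative assumptions, not the paper's Netflix pipeline:

```python
import numpy as np

def heat_conduction_scores(A, target_user):
    """Heat-conduction recommender sketch on a user-item matrix A (1 = rated).
    Items already rated get initial 'temperature' 1; heat then diffuses
    items -> users -> items by degree-normalized averaging."""
    k_items = A.sum(axis=0)             # item degrees
    k_users = A.sum(axis=1)             # user degrees
    f = A[target_user].astype(float)    # initial resource on items
    # item -> user: each user takes the average temperature of rated items
    u = (A * f).sum(axis=1) / np.maximum(k_users, 1)
    # user -> item: each item averages over the users who rated it
    scores = (A.T * u).sum(axis=1) / np.maximum(k_items, 1)
    scores[f > 0] = -np.inf             # don't re-recommend known items
    return scores

def recent_window(timestamps, records, frac=0.11):
    """Keep only the most recent fraction of rating records before building
    A; frac = 0.11 mirrors the ~11% finding reported above."""
    cut = np.quantile(timestamps, 1 - frac)
    return records[timestamps >= cut]
```

Restricting the training records with `recent_window` before scoring is the time-window manipulation whose effect on accuracy and diversity the abstract quantifies.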
Du, Yifeng; Kemper, Timothy; Qiu, Jiange; Jiang, Jianxiong
2016-01-01
Neuroinflammation is a common feature in nearly all neurological and some psychiatric disorders. Resembling its extraneural counterpart, neuroinflammation can be both beneficial and detrimental depending on the responding molecules. The overall effect of inflammation on disease progression is highly dependent on the extent of inflammatory mediator production and the duration of inflammatory induction. The time-dependent aspect of inflammatory responses suggests that the therapeutic time window for quelling neuroinflammation might vary with molecular targets and injury types. Therefore, it is important to define the therapeutic time window for anti-inflammatory therapeutics, as contradicting or negative results might arise when different treatment regimens are utilized even in similar animal models. Herein, we discuss a few critical factors that can help define the therapeutic time window and optimize treatment paradigm for suppressing the cyclooxygenase-2/prostaglandin-mediated inflammation after status epilepticus. These determinants should also be relevant to other anti-inflammatory therapeutic strategies for the CNS diseases. PMID:26689339
Ackermann, M.; Arcavi, I.; Baldini, L.; ...
2015-07-09
Supernovae (SNe) exploding in a dense circumstellar medium (CSM) are hypothesized to accelerate cosmic rays in collisionless shocks and emit GeV γ-rays and TeV neutrinos on a timescale of several months. We perform the first systematic search for γ-ray emission in Fermi Large Area Telescope data in the energy range from 100 MeV to 300 GeV from the ensemble of 147 SNe Type IIn exploding in a dense CSM. Here, we search for a γ-ray excess at each SN location in a one-year time window. In order to enhance a possible weak signal, we simultaneously study the closest and optically brightest sources of our sample in a joint-likelihood analysis in three different time windows (1 year, 6 months, and 3 months). For the most promising source of the sample, SN 2010jl (PTF 10aaxf), we repeat the analysis with an extended time window lasting 4.5 years. We do not find a significant excess in γ-rays for any individual source nor for the combined sources, and provide model-independent flux upper limits for both cases. Additionally, we derive limits on the γ-ray luminosity and the γ-ray-to-optical luminosity ratio as a function of the index of the proton injection spectrum, assuming a generic γ-ray production model. Furthermore, we present detailed flux predictions based on multi-wavelength observations and the corresponding flux upper limit at a 95% confidence level (CL) for the source SN 2010jl (PTF 10aaxf).
Smart windows with functions of reflective display and indoor temperature-control
NASA Astrophysics Data System (ADS)
Lee, I.-Hui; Chao, Yu-Ching; Hsu, Chih-Cheng; Chang, Liang-Chao; Chiu, Tien-Lung; Lee, Jiunn-Yih; Kao, Fu-Jen; Lee, Chih-Kung; Lee, Jiun-Haw
2010-02-01
In this paper, a switchable window based on a cholesteric liquid crystal (CLC) was demonstrated. Under different applied voltages, incoming light at visible and infrared wavelengths could be modulated separately. A mixture of CLC with a nematic liquid crystal and a chiral dopant selectively reflected infrared light without bias, which effectively reduced the indoor temperature under sunlight illumination. In this state, transmission in the visible range remained high and the window looked transparent. When the voltage was increased to 15 V, the CLC changed to the focal conic state and could be used as a reflective display, a privacy window, or a screen for a projector. Under a high voltage (30 V), the homeotropic state was achieved, in which both infrared and visible light were transmitted and the device acted as a normal window; this permits the infrared spectrum of winter sunlight to enter the room and reduce the heating requirement. Such a device can be used as a switchable window in smart buildings, greenhouses, and windshields.
Wagner, Pablo; Ortiz, Cristian; Vela, Omar; Arias, Paul; Zanolli, Diego; Wagner, Emilio
2016-09-01
Tibialis posterior (TP) tendon transfer through the interosseous membrane is commonly performed in Charcot-Marie-Tooth disease. To avoid entrapment of this tendon, no clear recommendation on the interosseous membrane (IOM) incision size has been made. The aim was to analyze the TP size at the transfer level and thereby determine the most adequate IOM window size to avoid muscle entrapment. Eleven lower extremity magnetic resonance images were analyzed. TP muscle measurements were made in axial views, obtaining the medial-lateral and antero-posterior diameters at various distances from the medial malleolus tip. The distance from the posterior to the anterior compartment was also measured. These measurements were applied to a mathematical model to predict the IOM window size necessary to allow ample TP passage in an oblique direction. The average tendon diameter (confidence interval) at 15 cm proximal to the medial malleolus tip was 19.47 mm (17.47-21.48). The distance from the deep posterior compartment to the anterior compartment was 10.97 mm (9.03-12.90). Using a mathematical model, the estimated IOM window size ranges from 4.2 to 4.9 cm. The IOM window size is of utmost importance in trans-membrane TP transfers: if it is equal to or smaller than the oblique diameter of the transposed tendon, a high risk of entrapment exists. A membrane window of 5 cm, or 2.5 times the tendon diameter, should be created in order to theoretically diminish this complication. Copyright © 2015 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
The dynamics underlying the regeneration and stalling of Hurricane Harvey
NASA Astrophysics Data System (ADS)
Liang, X. S.
2017-12-01
Explosive regeneration and stalling turned Hurricane Harvey from a little-noticed storm into an extremely destructive behemoth in late August 2017, incurring an estimated economic loss of 70-200 billion USD. In this study, we use a recently developed analysis tool, the multiscale window transform (MWT), and the MWT-based theory of canonical transfer to investigate the dynamics underlying this regeneration and stalling. The atmospheric fields are reconstructed onto three scale ranges, or windows: the large-scale, tropical cyclone-scale, and cumulus convection-scale windows. The intertwined cyclone-scale nonlinear energy process is uniquely separated into a transport of energy within the cyclone window and an interscale transfer, through reconstructing the "atomic" energy fluxes on the multiple scale windows. The resulting transfer bears a Lie bracket form, reminiscent of the Poisson bracket in Hamiltonian mechanics, and is hence referred to as canonical. It is found that within the Gulf of Mexico, Harvey gained much energy from the cumulus convection window through an inverse energy cascade, leading to its explosive growth. Meanwhile, a barotropic instability (positive canonical transfer) center of the mean circulation in the lower and middle troposphere lay quasi-steadily over Houston from August 22 through early September. The northwestward-propagating Harvey met that center and then stalled for two days near the coastline, dropping torrential and unprecedented amounts of rainfall and causing catastrophic flooding. It moved out of the instability center by the end of August and then dissipated quickly over the following days.
The Seismic Tool-Kit (STK): open-source software for seismology and signal processing.
NASA Astrophysics Data System (ADS)
Reymond, Dominique
2016-04-01
We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The estimation of the spectral density of the signal is performed via the Fourier transform, with visualization of the Power Spectral Density (PSD) on a linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, computation of eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and it has also been partially implemented under MS-Windows.
Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
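The PSD estimation that the STK interface performs via the Fourier transform can be sketched with a hand-rolled Welch average (Hann window, 50% overlap; the segment length is an illustrative choice, and this is a generic sketch rather than STK's C/C++ implementation):

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """One-sided power spectral density by averaging periodograms of
    Hann-windowed, half-overlapping segments (Welch's method)."""
    win = np.hanning(nperseg)
    step = nperseg // 2
    segs = [x[i:i + nperseg] * win
            for i in range(0, len(x) - nperseg + 1, step)]
    spec = np.mean([np.abs(np.fft.rfft(s)) ** 2 for s in segs], axis=0)
    psd = spec / (fs * np.sum(win ** 2))   # density normalization
    psd[1:-1] *= 2                         # fold in negative frequencies
    return np.fft.rfftfreq(nperseg, 1.0 / fs), psd
```

For a pure tone the estimate peaks at the tone frequency; plotting `psd` on a log scale reproduces the kind of PSD view described above.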
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Lovell, Brian; White, Langford
1988-01-01
Time-Frequency analysis based on the Wigner-Ville Distribution (WVD) is shown to be optimal for a class of signals where the variation of instantaneous frequency is the dominant characteristic. Spectral resolution and instantaneous frequency tracking is substantially improved by using a Modified WVD (MWVD) based on an Autoregressive spectral estimator. Enhanced signal-to-noise ratio may be achieved by using 2D windowing in the Time-Frequency domain. The WVD provides a tool for deriving descriptors of signals which highlight their FM characteristics. These descriptors may be used for pattern recognition and data clustering using the methods presented in this paper.
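A direct discrete implementation of the WVD for an analytic signal makes the construction concrete: at each time index, Fourier-transform the instantaneous autocorrelation z[n+m] z*[n-m] over the lag m. This is a plain WVD sketch (no autoregressive MWVD estimator, no 2D windowing), with the frequency axis in cycles per sample at k/(2N):

```python
import numpy as np

def wigner_ville(z):
    """Discrete Wigner-Ville distribution of an analytic signal z:
    W[n, k] = FFT over lag m of z[n+m] * conj(z[n-m])."""
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)            # lags available at this time
        r = np.zeros(N, dtype=complex)
        for m in range(-mmax, mmax + 1):
            r[m % N] = z[n + m] * np.conj(z[n - m])
        W[n] = np.real(np.fft.fft(r))
    return W                                # bin k ~ frequency k / (2N)
```

For a pure tone the energy concentrates on the instantaneous frequency; for an FM signal, the ridge of W tracks the frequency law, which is the property exploited by the FM descriptors discussed above.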
Digital PIV (DPIV) Software Analysis System
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs run on an IBM PC/AT host computer under either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows a simple user interface and output to the windows environment.
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
Optimization of simultaneous tritium–radiocarbon internal gas proportional counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonicalzi, R. M.; Aalseth, C. E.; Day, A. R.
Specific environmental applications can benefit from dual tritium and radiocarbon measurements in a single compound. Assuming typical environmental levels, it is often the low tritium activity relative to the higher radiocarbon activity that limits the dual measurement. In this paper, we explore the parameter space for a combined tritium and radiocarbon measurement using a methane sample mixed with an argon fill gas in low-background proportional counters of a specific design. We present an optimized methane percentage, detector fill pressure, and analysis energy windows to maximize measurement sensitivity while minimizing count time. The final optimized method uses a 9-atm fill of P35 (35% methane, 65% argon) and a tritium analysis window from 1.5 to 10.3 keV, which stops short of the tritium beta decay endpoint energy of 18.6 keV. This method optimizes tritium counting efficiency while minimizing radiocarbon beta decay interference.
Metformin and the Risk of Cancer
Suissa, Samy; Azoulay, Laurent
2012-01-01
OBJECTIVE Time-related biases in observational studies of drug effects have been described extensively in different therapeutic areas but less so in diabetes. Immortal time bias, time-window bias, and time-lag bias all tend to greatly exaggerate the benefits observed with a drug. RESEARCH DESIGN AND METHODS These time-related biases are described and shown to be prominent in observational studies that have associated metformin with impressive reductions in the incidence of and mortality from cancer. As a consequence, metformin received much attention as a potential anticancer agent, and these observational studies sparked the conduct of randomized, controlled trials of metformin as a cancer treatment. However, the spectacular effects reported in these studies are compatible with time-related biases. RESULTS We found that 13 observational studies suffered from immortal time bias; 9 studies had not considered time-window bias, whereas other studies did not consider inherent time-lagging issues when comparing the first-line treatment metformin with second- or third-line treatments. These studies, subject to time-related biases that are avoidable with proper study design and data analysis, led to illusory, extraordinarily large effects, with reductions in cancer risk with metformin ranging from 20 to 94%. Three studies that avoided these biases reported no effect of metformin use on cancer incidence. CONCLUSIONS Although observational studies are important to better understand the effects of drugs, their proper design and analysis are essential to avoid major time-related biases. With respect to metformin, the scientific evidence for its potential beneficial effects on cancer would need to be critically reassessed before embarking on further long and expensive trials. PMID:23173135
Sone, M
1998-10-01
The inner layer of the round window membrane is composed of mesothelial cells and this mesothelial cell layer extends to the scala tympani. This study describes the histopathologic findings of temporal bone analysis from a patient with bilateral perilymphatic fistula of the round window membrane. The left ear showed proliferation of mesothelial cells in the scala tympani of the basal turn adjoining the round window membrane. This cell proliferation is thought to be a reaction to the rupture of the round window membrane.
Window acoustic study for advanced turboprop aircraft
NASA Technical Reports Server (NTRS)
Prydz, R. A.; Balena, F. J.
1984-01-01
An acoustic analysis was performed to establish window designs for advanced turboprop powered aircraft. The window transmission loss requirements were based on A-weighted interior noise goals of 80 and 75 dBA. The analytical results showed that a triple pane window consisting of two glass outer panes and an inner pane of acrylic would provide the required transmission loss and meet the sidewall space limits. Two window test articles were fabricated for laboratory evaluation and verification of the predicted transmission loss. Procedures for performing laboratory tests are presented.
Statistical analysis of data and modeling of Nanodust measured by STEREO/WAVES at 1AU
NASA Astrophysics Data System (ADS)
Belheouane, S.; Zaslavsky, A.; Meyer-Vernet, N.; Issautier, K.; Czechowski, A.; Mann, I.; Le Chat, G.; Zouganelis, I.; Maksimovic, M.
2012-12-01
We study the flux of dust particles of nanometer size measured at 1 AU by the S/WAVES instrument aboard the twin STEREO spacecraft. When they impact the spacecraft at very high speed, these nanodust particles, first detected by Meyer-Vernet et al. (2009), generate plasma clouds and produce voltage pulses measured by the electric antennas. The Time Domain Sampler (TDS) of the radio and plasma instrument produces temporal windows containing several pulses. We perform a statistical study of the distribution of pulse amplitudes and arrival times in the measuring window during the 2007-2012 period. We interpret the results using simulations of the dynamics of nanodust in the solar wind based on the model of Czechowski and Mann (2010). We also investigate the variations of nanodust fluxes while STEREO rotates about the sunward axis (roll); this reveals that some directions are privileged.
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of the quality of the simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic implemented processes allowed the acquisition of three signal channels for commanding the simulation. Finally, communication between the different applications proved efficient, although it depends on the amount of data to be sent.
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-06-16
Traditional artillery guidance can significantly improve the attack accuracy and overall combat efficiency of projectiles, which makes them more adaptable to the information warfare of the future. Obviously, the accurate measurement of the artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of MR-sensor spin rate measurement, the mathematical relationship between the frequency of the MR sensor output and the projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of the MR sensor output, whose frequency varies with time, this paper proposes a Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on a rolling Blackman window (BCZT), which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of a 155 mm artillery projectile. After extracting the spin rate, the impact that launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment.
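The CZT lets the spectrum be evaluated on a fine grid restricted to the band where the spin line is expected, which is the core of the BCZT idea. A minimal sketch, not the authors' code: it requires SciPy >= 1.8 for `scipy.signal.czt`, and the band, frame length, and tone frequency below are invented:

```python
import numpy as np
from scipy.signal import czt  # SciPy >= 1.8

def bczt_freq(x, fs, f1, f2, m=400):
    """Peak frequency in [f1, f2) of a Blackman-windowed frame, via the chirp z-transform."""
    xw = x * np.blackman(len(x))          # the 'B' in BCZT: Blackman window
    step = (f2 - f1) / m
    a = np.exp(2j * np.pi * f1 / fs)      # starting point on the unit circle
    w = np.exp(-2j * np.pi * step / fs)   # ratio between successive evaluation points
    X = czt(xw, m=m, w=w, a=a)            # evaluates the DTFT at f1 + k*step
    return f1 + step * np.argmax(np.abs(X))

# Hypothetical frame: a 203 Hz "spin line" sampled at 1 kHz for half a second.
fs = 1000.0
t = np.arange(0.0, 0.5, 1.0 / fs)
f_hat = bczt_freq(np.sin(2 * np.pi * 203.0 * t), fs, f1=180.0, f2=220.0)
```

Sliding this estimator along the record (the rolling window) yields the spin-rate-versus-time curve the abstract describes.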
Double-Windows-Based Motion Recognition in Multi-Floor Buildings Assisted by a Built-In Barometer.
Liu, Maolin; Li, Huaiyu; Wang, Yuan; Li, Fei; Chen, Xiuwan
2018-04-01
Accelerometers, gyroscopes and magnetometers in smartphones are often used to recognize human motions. Since it is difficult to distinguish between vertical motions and horizontal motions in the data provided by these built-in sensors, vertical motion recognition accuracy is relatively low. The emergence of a built-in barometer in smartphones improves the accuracy of motion recognition in the vertical direction. However, there is a lack of quantitative analysis and modelling of the barometer signals, which is the basis of the barometer's application to motion recognition, and a problem of imbalanced data also exists. This work focuses on using the barometers inside smartphones for vertical motion recognition in multi-floor buildings through modelling and feature extraction of pressure signals. A novel double-windows pressure feature extraction method, which adopts two sliding time windows of different lengths, is proposed to balance recognition accuracy and response time. Then, a random forest classifier correlation rule is further designed to weaken the impact of imbalanced data on recognition accuracy. The results demonstrate that the recognition accuracy can reach 95.05% when the pressure features and the improved random forest classifier are adopted. Specifically, the recognition accuracy of the stair and elevator motions is significantly improved, with enhanced response time. The proposed approach proves effective and accurate, providing a robust strategy for increasing the accuracy of vertical motion recognition.
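A minimal sketch of the two-window idea (my own illustration under invented thresholds, not the paper's feature set or classifier; the rough 0.12 hPa-per-meter pressure lapse near sea level is used to synthesize a one-floor climb):

```python
import numpy as np

def double_window_features(p, fs, short_s=2.0, long_s=10.0):
    """Pressure deltas over a short window (fast response) and a long window (stable trend)."""
    ns, nl = int(short_s * fs), int(long_s * fs)
    feats = []
    for i in range(nl, len(p)):
        feats.append((p[i] - p[i - ns], p[i] - p[i - nl]))
    return np.array(feats)

def classify(short_delta, long_delta, thresh=0.1):
    """Toy rule on the long-window delta; pressure falls when moving upward."""
    if long_delta < -thresh:
        return "up"
    if long_delta > thresh:
        return "down"
    return "still"

# Synthetic trace at 1 Hz: stand still for 10 s, then climb 3 m (one floor) over 20 s.
t = np.arange(40.0)
height = np.clip((t - 10.0) / 20.0, 0.0, 1.0) * 3.0
p = 1013.25 - 0.12 * height                  # ~0.12 hPa per meter of ascent
feats = double_window_features(p, fs=1)
```

In the paper, both deltas (among other features) feed a random forest; the short window buys response time, the long one accuracy.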
Ionospheric gravity wave measurements with the USU dynasonde
NASA Technical Reports Server (NTRS)
Berkey, Frank T.; Deng, Jun Yuan
1992-01-01
A method for the measurement of ionospheric gravity waves (GWs) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which include functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphics display. Concepts from system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well-known Least Mean Square (LMS) algorithm to form a stochastic optimal estimate of the echo trace, which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating noise and interference. Experimental results show that the proposed method functions as designed. Case studies that extract GWs from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected, and the processed results are illustrated graphically. The method was also developed with real-time implementation in mind.
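The LMS algorithm at the heart of the adaptive scaler is standard. A self-contained sketch (a generic system-identification demo on synthetic data, not the Dynasonde ionogram tracker; the 4-tap system is invented):

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.01):
    """LMS adaptive FIR filter: learn weights w so that w applied to x tracks d."""
    w = np.zeros(n_taps)
    y, e = np.zeros(len(x)), np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]        # most recent samples first
        y[n] = w @ u                     # filter output (the running estimate)
        e[n] = d[n] - y[n]               # instantaneous error
        w += 2 * mu * e[n] * u           # stochastic-gradient weight update
    return y, e, w

# Identify an unknown 4-tap system from its input/output data.
rng = np.random.default_rng(1)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.normal(size=5000)
d = np.zeros_like(x)
for n in range(4, len(x)):
    d[n] = h_true @ x[n - 4:n][::-1]
_, _, w_hat = lms(x, d)
```

The converged weights track the unknown system; in the ionogram application the same update instead tracks the echo-trace position, steering the moving window.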
Kahle, Logan Q; Flannery, Maureen E; Dumbacher, John P
2016-01-01
Bird-window collisions are a major and poorly-understood generator of bird mortality. In North America, studies of this topic tend to be focused east of the Mississippi River, resulting in a paucity of data from the Western flyways. Additionally, few available data can critically evaluate factors such as time of day, sex and age bias, and effect of window pane size on collisions. We collected and analyzed 5 years of window strike data from a 3-story building in a large urban park in San Francisco, California. To evaluate our window collision data in context, we collected weekly data on local bird abundance in the adjacent parkland. Our study asks two overarching questions: first, what aspects of a bird's biology might make it more likely to fatally strike windows; and second, what characteristics of a building's design contribute to bird-window collisions. We used a dataset of 308 fatal bird strikes to examine the relationships of strikes relative to age, sex, time of day, time of year, and a variety of other factors, including mitigation efforts. We found that actively migrating birds may not be major contributors to collisions as has been found elsewhere. We found that males and young birds were both significantly overrepresented relative to their abundance in the habitat surrounding the building. We also analyzed the effect of external window shades as mitigation, finding that an overall reduction in large panes, whether covered or in some way broken up with mullions, effectively reduced window collisions. We conclude that effective mitigation or design will be required in all seasons, but that breeding seasons and migratory seasons are most critical, especially for low-rise buildings and other sites away from urban migrant traps. Finally, strikes occur throughout the day, but mitigation may be most effective in the morning and midday.
PMID:26731417
Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows
ERIC Educational Resources Information Center
Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.
2018-01-01
Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…
Time-series analysis of foreign exchange rates using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Inoue, Masayoshi
2013-08-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.
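A minimal sketch of the computation described (the word length and window size below are my own choices, not necessarily those of the paper):

```python
import numpy as np
from collections import Counter

def pattern_entropy(x, word=3, window=50):
    """Shannon entropy (bits) of binary up/down words inside a sliding window."""
    sym = (np.diff(x) > 0).astype(int)                     # 1 = rise, 0 = fall
    words = [tuple(sym[i:i + word]) for i in range(len(sym) - word + 1)]
    ent = []
    for i in range(len(words) - window + 1):
        counts = Counter(words[i:i + window])
        p = np.array(list(counts.values())) / window
        ent.append(float(-np.sum(p * np.log2(p))))
    return np.array(ent)

# A monotone series has a single pattern (zero entropy); a random walk is near-maximal.
rng = np.random.default_rng(0)
ent_trend = pattern_entropy(np.arange(500.0))
ent_walk = pattern_entropy(np.cumsum(rng.normal(size=500)))
```

High entropy marks windows where the up/down patterns are disordered, which is the instability signature the study tracks around turning points of the dollar-yen rate.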
Communicating likelihoods and probabilities in forecasts of volcanic eruptions
NASA Astrophysics Data System (ADS)
Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas
2014-02-01
The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision-making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as lower than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time-window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time-window forecasts.
These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in an underestimate of the likelihood of an event occurring ‘today’ leading to potentially inappropriate action choices. We thus present some initial guidelines for communicating such eruption forecasts.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
THE EPA REMOTE SENSING ARCHIVE: A VALUABLE WINDOW INTO THE PAST FOR ENVIRONMENTAL ANALYSIS TODAY
Often environmental issues need to have a historical perspective, to look back into the past. Remotely sensed imagery is one way to see the land and what happened in a previous time. The EPA is often responsible to look into the past to facilitate a better future for the environm...
Peng, Sijia; Wang, Wenjuan; Chen, Chunlai
2018-05-10
Fluorescence correlation spectroscopy is a powerful single-molecule tool that is able to capture kinetic processes occurring at the nanosecond time scale. However, the upper limit of its time window is restricted by the dwell time of the molecule of interest in the confocal detection volume, which is usually around submilliseconds for a freely diffusing biomolecule. Here, we present a simple and easy-to-implement method, named surface transient binding-based fluorescence correlation spectroscopy (STB-FCS), which extends the upper limit of the time window to seconds. We further demonstrated that STB-FCS enables capture of both intramolecular and intermolecular kinetic processes whose time scales cross several orders of magnitude.
Region of interest and windowing-based progressive medical image delivery using JPEG2000
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Mukhopadhyay, Sudipta; Wheeler, Frederick W.; Avila, Ricardo S.
2003-05-01
An important telemedicine application is the perusal of CT scans (in digital format) from a central server housed in a healthcare enterprise, across a bandwidth-constrained network, by radiologists situated at remote locations for medical diagnostic purposes. It is generally expected that a viewing station respond to an image request by displaying the image within 1-2 seconds. Owing to limited bandwidth, it may not be possible to deliver the complete image in such a short period of time with traditional techniques. In this paper, we investigate progressive image delivery solutions using JPEG 2000. An estimate of the time taken at different network bandwidths is performed to compare their relative merits. We further make use of the fact that most medical images are 12-16 bit, but are ultimately converted to an 8-bit image via windowing for display on the monitor. We propose a windowing progressive RoI technique to exploit this and investigate JPEG 2000 RoI-based compression after applying a favorite or default window setting to the original image. Subsequent requests for different RoIs and window settings would then be processed at the server. For the windowing progressive RoI mode, we report a 50% reduction in transmission time.
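The windowing step itself is the standard center/width mapping used for medical display. A sketch with hypothetical values (this is only the display mapping, not the paper's JPEG 2000 pipeline):

```python
import numpy as np

def apply_window(img, center, width):
    """Map a 12-16 bit image to 8 bits: the window [center - width/2, center + width/2]
    is stretched to [0, 255] and everything outside it is clipped."""
    lo = center - width / 2.0
    out = (img.astype(np.float64) - lo) / width
    return (np.clip(out, 0.0, 1.0) * 255.0).round().astype(np.uint8)

# Toy 16-bit image windowed with center=1000, width=2000 (raw range [0, 2000]).
img16 = np.array([[0, 1000], [2000, 4000]], dtype=np.uint16)
img8 = apply_window(img16, center=1000, width=2000)
```

Serving the RoI already windowed to 8 bits, as proposed, reduces the data volume before compression even begins, consistent with the reported transmission-time savings.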
Tapiainen, V; Hartikainen, S; Taipale, H; Tiihonen, J; Tolppanen, A-M
2017-06-01
Studies investigating psychiatric disorders as Alzheimer's disease (AD) risk factors have yielded heterogeneous findings. Differences in time windows between the exposure and outcome could be one explanation. We examined whether (1) mental and behavioral disorders in general or (2) specific mental and behavioral disorder categories increase the risk of AD, and (3) how the width of the time window between the exposure and outcome affects the results. A nationwide nested case-control study of all Finnish clinically verified AD cases alive in 2005 and their age-, sex- and region-of-residence-matched controls (27,948 case-control pairs). History of hospital-treated mental and behavioral disorders was available since 1972. Altogether 6.9% (n=1932) of the AD cases and 6.4% (n=1784) of the controls had a history of any mental and behavioral disorder. Having any mental and behavioral disorder (adjusted OR=1.07, 95% CI=1.00-1.16) or depression/other mood disorder (adjusted OR=1.17, 95% CI=1.05-1.30) was associated with a higher risk of AD with a 5-year time window but not with a 10-year time window (adjusted OR, 95% CI: 0.99, 0.91-1.08 for any disorder and 1.08, 0.96-1.23 for depression). The associations between mental and behavioral disorders and AD were modest and dependent on the time window. Therefore, some of the disorders may represent misdiagnosed prodromal symptoms of AD, which underlines the importance of proper differential diagnostics among older persons. These findings also highlight the importance of an appropriate time window in psychiatric and neuroepidemiological research. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Boswell, Paul G.; Abate-Pella, Daniel; Hewitt, Joshua T.
2015-01-01
Compound identification by liquid chromatography-mass spectrometry (LC-MS) is a tedious process, mainly because authentic standards must be run on a user’s system to be able to confidently reject a potential identity from its retention time and mass spectral properties. Instead, it would be preferable to use shared retention time/index data to narrow down the identity, but shared data cannot be used to reject candidates with an absolute level of confidence because the data are strongly affected by differences between HPLC systems and experimental conditions. However, a technique called “retention projection” was recently shown to account for many of the differences. In this manuscript, we discuss an approach to calculate appropriate retention time tolerance windows for projected retention times, potentially making it possible to exclude candidates with an absolute level of confidence, without needing to have authentic standards of each candidate on hand. In a range of multi-segment gradients and flow rates run among seven different labs, the new approach calculated tolerance windows that were significantly more appropriate for each retention projection than global tolerance windows calculated for retention projections or linear retention indices. Though there were still some small differences between the labs that evidently were not taken into account, the calculated tolerance windows only needed to be relaxed by 50% to make them appropriate for all labs. Even then, 42% of the tolerance windows calculated in this study without standards were narrower than those required by WADA for positive identification, where standards must be run contemporaneously. PMID:26292624
A fast algorithm for vertex-frequency representations of signals on graphs
Jestrović, Iva; Coyle, James L.; Sejdić, Ervin
2016-01-01
The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the fast graph S-transform. Also, the results showed that noise has no effect on the results of the algorithm for the fast windowed graph Fourier transform or on the graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
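The building block behind both transforms is the graph Fourier transform: projecting a vertex signal onto the eigenvectors of the graph Laplacian, whose eigenvalues play the role of frequencies. A minimal sketch (my illustration on a toy path graph; the authors' contribution is precisely a fast algorithm that avoids the full eigendecomposition used here):

```python
import numpy as np

def graph_fourier(L, x):
    """Graph Fourier transform of vertex signal x for graph Laplacian L."""
    lam, U = np.linalg.eigh(L)     # eigenvalues act as graph frequencies
    return lam, U, U.T @ x         # spectrum: projections onto eigenvectors

# Path graph on 8 vertices; a constant signal is 'DC': all energy at eigenvalue 0.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U, xhat = graph_fourier(L, np.ones(n))
```

The windowed graph Fourier transform and graph S-transform localize this projection with translated, modulated windows on the graph before taking the spectrum.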
El-Deftar, Moteaa M; Speers, Naomi; Eggins, Stephen; Foster, Simon; Robertson, James; Lennard, Chris
2014-08-01
A commercially available laser-induced breakdown spectroscopy (LIBS) instrument was evaluated for the determination of elemental composition of twenty Australian window glass samples, consisting of 14 laminated samples and 6 non-laminated samples (or not otherwise specified) collected from broken windows at crime scenes. In this study, the LIBS figures of merit were assessed in terms of accuracy, limits of detection and precision using three standard reference materials (NIST 610, 612, and 1831). The discrimination potential of LIBS was compared to that obtained using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), X-ray microfluorescence spectroscopy (μXRF) and scanning electron microscopy energy dispersive X-ray spectrometry (SEM-EDX) for the analysis of architectural window glass samples collected from crime scenes in the Canberra region, Australia. Pairwise comparisons were performed using a three-sigma rule, two-way ANOVA and Tukey's HSD test at 95% confidence limit in order to investigate the discrimination power for window glass analysis. The results show that the elemental analysis of glass by LIBS provides a discrimination power greater than 97% (>98% when combined with refractive index data), which was comparable to the discrimination powers obtained by LA-ICP-MS and μXRF. These results indicate that LIBS is a feasible alternative to the more expensive LA-ICP-MS and μXRF options for the routine forensic analysis of window glass samples. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
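The three-sigma rule used for pairwise discrimination can be sketched as an interval-overlap test on replicate measurements (an illustration only; the element columns and concentration values below are invented, and the study's ANOVA/Tukey comparisons are not shown):

```python
import numpy as np

def indistinguishable(a, b, k=3.0):
    """True if the mean +/- k*sigma intervals of two glass samples overlap
    for every measured element (rows = replicates, columns = elements)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    lo_a, hi_a = a.mean(0) - k * a.std(0, ddof=1), a.mean(0) + k * a.std(0, ddof=1)
    lo_b, hi_b = b.mean(0) - k * b.std(0, ddof=1), b.mean(0) + k * b.std(0, ddof=1)
    return bool(np.all((lo_a <= hi_b) & (lo_b <= hi_a)))

# Two fragments from the same pane overlap on every element; a different pane does not.
same_a = [[10.0, 5.00], [10.1, 5.05], [9.9, 4.95]]
same_b = [[10.05, 5.00], [9.95, 5.10], [10.0, 4.90]]
diff_c = [[12.0, 5.00], [12.1, 5.05], [11.9, 4.95]]
```

A pair is "discriminated" as soon as any element fails the overlap test; the discrimination power quoted in the abstract is the fraction of discriminated pairs.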
An Efficient Implementation For Real Time Applications Of The Wigner-Ville Distribution
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Black, Peter; Whitehouse, Harper J.
1986-03-01
The Wigner-Ville Distribution (WVD) is a valuable tool for time-frequency signal analysis. In order to implement the WVD in real time, an efficient algorithm and architecture have been developed which may be implemented with commercial components. The algorithm successively computes the analytic signal corresponding to the input signal, forms a weighted kernel function, and analyses the kernel via a Discrete Fourier Transform (DFT). To evaluate the analytic signal required by the algorithm, it is shown that the time-domain definition, implemented as a finite impulse response (FIR) filter, is practical and more efficient than the frequency-domain definition of the analytic signal. The windowed resolution of the WVD in the frequency domain is shown to be similar to the resolution of a windowed Fourier transform. A real-time signal processor has been designed for evaluation of the WVD analysis system. The system is easily paralleled and can be configured to meet a variety of frequency and time resolutions. The arithmetic unit is based on a pair of high-speed VLSI floating-point multiplier and adder chips. Dual operand buses and an independent result bus maximize data transfer rates. The system is horizontally microprogrammed and utilizes a full instruction pipeline. Each microinstruction specifies two operand addresses, a result location, the type of arithmetic, and the memory configuration. Input and output are via shared memory blocks, with front-end processors to handle data transfers during the non-access periods of the analyzer.
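The first step of the algorithm, forming the analytic signal, can be sketched as follows. Note this is an illustration only: scipy's `hilbert` uses the frequency-domain construction, whereas the paper argues that a time-domain FIR Hilbert filter is more efficient for real-time hardware; the tone and rates are invented:

```python
import numpy as np
from scipy.signal import hilbert

# Analytic signal z = x + j*H{x} of a 50 Hz tone sampled at 1 kHz.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.0 * t)
z = hilbert(x)

envelope = np.abs(z)                                            # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)  # instantaneous frequency
```

The WVD kernel z[t+tau]*conj(z[t-tau]) is then formed from z and passed to a DFT, as the abstract describes.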
Marck, C
1988-01-01
DNA Strider is a new integrated DNA and protein sequence analysis program, written in C for the Macintosh Plus, SE and II computers. It has been designed to be easy to learn and use, as well as a fast and efficient tool for day-to-day sequence analysis work. The program consists of a multi-window sequence editor and various DNA and protein analysis functions. The editor supports 4 different types of sequences (DNA, degenerate DNA, RNA and one-letter-coded protein) and can simultaneously handle 6 sequences of any type of up to 32.5 kb each. Negative numbering of the bases is allowed for DNA sequences. All classical restriction and translation analysis functions are present and can be performed in any order on any open sequence or part of a sequence. The main feature of the program is that the same analysis function can be repeated several times on different sequences, thus generating multiple windows on the screen. Many graphic capabilities have been incorporated, such as the graphic restriction map, the hydrophobicity profile and the CAI plot (codon adaptation index according to Sharp and Li). The restriction site search uses a newly designed fast hexamer look-ahead algorithm; typical runtime for a search of all sites with a library of 130 restriction endonucleases is 1 second per 10,000 bases. The circular graphic restriction map of the pBR322 plasmid can therefore be computed from its sequence and displayed on the Macintosh Plus screen within 2 seconds, and its multiline restriction map obtained in a scrolling window within 5 seconds. PMID:2832831
Real-Time Detection of Dust Devils from Pressure Readings
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri
2009-01-01
A method for real-time detection of dust devils at a given location is based on identifying the abrupt, temporary decreases in atmospheric pressure that are characteristic of dust devils as they travel through that location. The method was conceived for use in a study of dust devils on the Martian surface, where bandwidth limitations encourage the transmission of only those blocks of data that are most likely to contain information about features of interest, such as dust devils. The method, which is a form of intelligent data compression, could readily be adapted for the same purpose in scientific investigation of dust devils on Earth. In this method, the readings of an atmospheric-pressure sensor are repeatedly digitized, recorded, and processed by an algorithm that looks for extreme deviations from a continually updated model of the current pressure environment. The central question in formulating the algorithm is how to model normal observations and what minimum deviation magnitude should be considered sufficiently anomalous to indicate the presence of a dust devil. There is no single, simple answer to this question: any answer necessarily entails a compromise between false detections and misses. For the original Mars application, the answer was sought through analysis of sliding time windows of digitized pressure readings. Windows of 5-, 10-, and 15-minute durations were considered. The windows were advanced in increments of 30 seconds. Increments of other sizes can also be used, but computational cost increases as the increment decreases and analysis is performed more frequently. Pressure models were defined using a polynomial fit to the data within the windows. For example, the figure depicts pressure readings from a 10-minute window wherein the model was defined by a third-degree polynomial fit to the readings, and dust devils were identified as negative deviations larger than both 3 standard deviations (from the mean) and 0.05 mbar in magnitude.
An algorithm embodying the detection scheme of this example was found to yield a miss rate of just 8 percent and a false-detection rate of 57 percent when evaluated on historical pressure-sensor data collected by the Mars Pathfinder lander. Since dust devils occur infrequently over the course of a mission, prioritizing observations that contain successful detections could greatly conserve bandwidth allocated to a given mission. This technique can be used on future Mars landers and rovers, such as Mars Phoenix and the Mars Science Laboratory.
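The sliding-window detection scheme can be sketched as follows. The parameter values (third-degree fit, 3-standard-deviation and 0.05 mbar thresholds, 30-sample advance) follow the example in the text, but the function, the assumed 1 Hz sampling rate, and the pressure trace are hypothetical:

```python
import numpy as np

def detect_dust_devils(pressure, win=600, deg=3, k=3.0, min_drop=0.05):
    """Flag samples whose pressure falls below a sliding polynomial
    background model by more than k standard deviations AND more than
    min_drop (mbar). win=600 samples ~ 10 minutes at 1 Hz."""
    p = np.asarray(pressure, dtype=float)
    flags = np.zeros(len(p), dtype=bool)
    x = np.arange(win)
    for start in range(0, len(p) - win + 1, 30):    # advance the window
        seg = p[start:start + win]
        model = np.polyval(np.polyfit(x, seg, deg), x)
        resid = seg - model
        thresh = max(k * resid.std(), min_drop)
        flags[start:start + win] |= resid < -thresh
    return flags

# Synthetic trace: slowly drifting background plus a sharp 0.3 mbar drop
rng = np.random.default_rng(0)
t = np.arange(1800)
p = 7.0 + 0.0001 * t + rng.normal(0, 0.005, t.size)
p[900:905] -= 0.3                                   # simulated dust devil
flags = detect_dust_devils(p)
```

Because a detection must exceed both the statistical and the absolute threshold, slow drifts absorbed by the polynomial fit and ordinary sensor noise are not flagged.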
Purged window apparatus. [On-line spectroscopic analysis of gas flow systems
Ballard, E.O.
1982-04-05
A purged window apparatus is described which uses tangentially injected, heated purge gases in the vicinity of electromagnetic-radiation-transmitting windows, together with a tapered external mounting tube that accelerates these gases to produce a vortex flow on the window surface and a turbulent flow throughout the mounting tube. This flow prevents backstreaming of the flowing gases under investigation in a chamber to which a plurality of similar purged apparatus is attached. As a result, spectroscopic analyses can be undertaken for lengthy periods without interrupting the flow to clean or replace the windows due to contamination.
Opto-mechanical design of optical window for aero-optics effect simulation instruments
NASA Astrophysics Data System (ADS)
Wang, Guo-ming; Dong, Dengfeng; Zhou, Weihu; Ming, Xing; Zhang, Yan
2016-10-01
A complete theory for the opto-mechanical design of aircraft optical windows is established in this paper, making the design process more rigorous. The design comprises three steps. First, a universal model of the aerodynamic environment is established based on Computational Fluid Dynamics, from which the pneumatic pressure distribution and temperature data on the optical window surface are obtained for flight at 5-30 km altitude, 0.5-3 Ma speed and 0-30° angle of attack. The temperature and pressure distributions corresponding to the maximum constraint are selected as the initial external conditions on the optical window surface. Second, the optical window and its mechanical structure are designed so as to meet the requirements of security and tightness. Third, rigorous analysis and evaluation of the designed opto-mechanical structure are carried out in two parts. First, a Fluid-Solid-Heat coupled model based on finite element analysis yields the deformation of the glass and supporting structure, from which the feasibility of the designed optical window and ancillary structure can be assessed. Second, the deformed optical surface is fitted with Zernike polynomials, allowing the impact of the window deformation on the imaging quality of the spectral camera to be evaluated.
Tynkkynen, Liina-Kaisa; Lehto, Juhani
2009-01-01
Background We studied the prerequisites for Public-Private Partnership (PPP) in the context of the Finnish health care system and more specifically in the field of ophthalmology. PPP can be defined as a more or less permanent cooperation between public and private actors, through which the joint products or services are developed and in which the risks, costs and profits are shared. The Finnish eye care services system is heterogeneous with several different providers and can be regarded as sub-optimal in terms of overall resource use. What is more, the public sector is suffering from a shortage of ophthalmologists, which further decreases its possibilities to meet the present needs. As ophthalmology has traditionally been a medical specialty with a substantial private sector involvement in service provision, PPP could be a feasible policy to be used in the field. We thus ask the following research question: Is there, and to what extent, an open window of opportunity for PPP? Methods In addition to the previously published literature, the research data consisted of 17 thematic interviews with public and private experts in the field of ophthalmology. The analysis was conducted in two stages. First, a literature-based content analysis was used to explore the prerequisites for PPP. Second, Kingdon's (1995) multiple streams theory was used to study the opening of the window of opportunity for PPP. Results Public and private parties reported similar problems in the current situation but defined them differently. Also, there is no consensus on policy alternatives. Public opinion seems to be somewhat uncertain as to the attitudes towards private service providers. The analysis thus showed that although there are prerequisites for PPP, the time has not yet come for a Public-Private Partnership. Conclusion Should the window open fully, the emergence of policy entrepreneurs and an opportunity for a win-win situation between public and private organizations are required. 
PMID:19900293
Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin
2018-03-01
The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows and windowed read-count data is generated for the entire genome, from which genomic signals are detected (e.g., copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution, leveraging its ability for simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has a quadratic computational complexity and therefore suffers from slow running time when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method by using a randomized algorithm, and we demonstrate the utility of our approach in detecting copy number variants (CNVs) in a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and the variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs. We named the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we concluded that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy developed here could be applied to other GLM+NB based read-count analyses, e.g., ChIP-seq data analysis, to substantially improve their computational efficiency while preserving the analytic power.
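The randomized-estimator idea can be sketched as follows. As a simplification, a Poisson GLM fitted by IRLS stands in here for the negative-binomial GLM (the NB adds an overdispersion parameter, but the subsampling logic is the same); all names, covariates and parameter values are hypothetical:

```python
import numpy as np

def poisson_irls(X, y, iters=15):
    """Fit a log-link Poisson GLM by IRLS (a simpler stand-in for the
    negative-binomial GLM used in GLM+NB read-count methods)."""
    beta = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)[0]  # warm start
    for _ in range(iters):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu          # IRLS working response
        W = mu                                # IRLS weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

def randomized_estimator(X, y, frac=0.1, seed=1):
    """RGE-style speed-up: solve the GLM on a uniform random subsample of
    windows instead of the full genome-wide count vector."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(y), size=int(frac * len(y)), replace=False)
    return poisson_irls(X[idx], y[idx])

# Simulate windowed read counts driven by a GC-content-like covariate
rng = np.random.default_rng(0)
n = 20000
gc = rng.uniform(0.3, 0.7, n)
X = np.column_stack([np.ones(n), gc])
y = rng.poisson(np.exp(X @ np.array([1.0, 2.0])))
beta_full = poisson_irls(X, y)                # fit on all windows
beta_sub = randomized_estimator(X, y)         # fit on a 10% subsample
```

The subsampled fit recovers nearly the same coefficients at a fraction of the cost; the variance penalty shrinks as the sampling fraction grows, which mirrors the consistency/variance trade-off the authors analyze for RGE.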
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-07
... defines a remotely managed Post Office (RMPO) as a Post Office that offers part-time window service hours... Administrative Post Office. The final rule also defines a part-time Post Office (PTPO) as a Post Office that offers part-time window service hours, is staffed by a Postal Service employee, and reports to a district...
Dynamic Granger-Geweke causality modeling with application to interictal spike propagation
Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.
2010-01-01
A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280
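The moving-window causality computation can be sketched with a minimal lag-1 Granger measure. This simplification fixes the AR model order and window length rather than jointly optimizing them with the SURE criterion as the authors do, and the coupled signals are synthetic:

```python
import numpy as np

def granger_1lag(x, y):
    """Granger causality x -> y inside one analysis window: log ratio of
    residual variances of the restricted (y's own past) vs. the full
    (y's and x's past) lag-1 regressions."""
    Y = y[1:]
    own = np.column_stack([np.ones(len(Y)), y[:-1]])
    full = np.column_stack([own, x[:-1]])
    r_own = Y - own @ np.linalg.lstsq(own, Y, rcond=None)[0]
    r_full = Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]
    return np.log(r_own.var() / r_full.var())

def sliding_granger(x, y, win=200, step=50):
    """Dynamic causality: recompute the measure in a moving window."""
    return np.array([granger_1lag(x[s:s + win], y[s:s + win])
                     for s in range(0, len(x) - win + 1, step)])

# x drives y with a one-sample delay only in the second half of the record
rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
y = 0.2 * rng.normal(size=n)
y[n // 2 + 1:] += 0.9 * x[n // 2:-1]
gc = sliding_granger(x, y)
early = gc[:10].mean()     # windows before the coupling switches on
late = gc[-10:].mean()     # windows after it switches on
```

The measure stays near zero before the coupling begins and jumps once the driving appears, which is the kind of time-resolved directionality the moving-window analysis is designed to expose.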
Independent Orbiter Assessment (IOA): Analysis of the purge, vent and drain subsystem
NASA Technical Reports Server (NTRS)
Bynum, M. C., III
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter PV and D (Purge, Vent and Drain) Subsystem hardware. The PV and D Subsystem controls the environment of unpressurized compartments and window cavities, senses hazardous gases, and purges Orbiter/ET Disconnect. The subsystem is divided into six systems: Purge System (controls the environment of unpressurized structural compartments); Vent System (controls the pressure of unpressurized compartments); Drain System (removes water from unpressurized compartments); Hazardous Gas Detection System (HGDS) (monitors hazardous gas concentrations); Window Cavity Conditioning System (WCCS) (maintains clear windows and provides pressure control of the window cavities); and External Tank/Orbiter Disconnect Purge System (prevents cryo-pumping/icing of disconnect hardware). Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Four of the sixty-two failure modes analyzed were determined as single failures which could result in the loss of crew or vehicle. A possible loss of mission could result if any of twelve single failures occurred. Two of the criticality 1/1 failures are in the Window Cavity Conditioning System (WCCS) outer window cavity, where leakage and/or restricted flow will cause failure to depressurize/repressurize the window cavity. 
Two criticality 1/1 failures represent leakage and/or restricted flow in the Orbiter/ET disconnect purge network, which prevents cryopumping/icing of disconnect hardware.
Enhanced analysis of real-time PCR data by using a variable efficiency model: FPK-PCR
Lievens, Antoon; Van Aelst, S.; Van den Bulcke, M.; Goetghebeur, E.
2012-01-01
Current methodology in real-time Polymerase chain reaction (PCR) analysis performs well provided PCR efficiency remains constant over reactions. Yet, small changes in efficiency can lead to large quantification errors. Particularly in biological samples, the possible presence of inhibitors forms a challenge. We present a new approach to single-reaction efficiency calculation, called Full Process Kinetics-PCR (FPK-PCR). It combines a kinetically more realistic model with flexible adaptation to the full range of data. By reconstructing the entire chain of cycle efficiencies, rather than restricting the focus to a 'window of application', one extracts additional information and eliminates a level of arbitrariness. The maximal efficiency estimates returned by the model are comparable in accuracy and precision to both the gold standard of serial dilution and other single-reaction efficiency methods. The cycle-to-cycle changes in efficiency, as described by the FPK-PCR procedure, stay considerably closer to the data than those from other S-shaped models. The assessment of individual cycle efficiencies returns more information than other single-efficiency methods. It allows in-depth interpretation of real-time PCR data and reconstruction of the fluorescence data, providing quality control. Finally, by implementing a global efficiency model, reproducibility is improved, as the selection of a window of application is avoided. PMID:22102586
Recalibration of the Multisensory Temporal Window of Integration Results from Changing Task Demands
Mégevand, Pierre; Molholm, Sophie; Nayak, Ashabari; Foxe, John J.
2013-01-01
The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands. PMID:23951203
Association between the electromyographic fatigue threshold and ventilatory threshold.
Camata, T V; Lacerda, T R; Altimari, L R; Bortolloti, H; Fontes, E B; Dantas, J L; Nakamura, F Y; Abrão, T; Chacon-Mikahil, M P T; Moraes, A C
2009-01-01
The objective of this study is to verify the coincidence between the occurrence of the electromyographic fatigue threshold (EMGth) and the ventilatory threshold (Vth) in an incremental test on a cycle simulator, as well as to compare the calculation of the RMS of the EMG signal using different time windows. Thirteen male cyclists (73.7 +/- 12.4 kg and 174.3 +/- 6.2 cm) performed a ramp incremental test (IT) on a cycle simulator until voluntary exhaustion. Before the start of each IT, active bipolar electrodes were placed over the superficial muscles of the quadriceps femoris (QF) of the right leg: rectus femoris (RF), vastus medialis (VM) and vastus lateralis (VL). The paired Student's t-test, Pearson's correlation coefficient and the method described by Bland and Altman for determining the level of agreement were used for statistical analysis. The significance level adopted was P < 0.05. Although no significant differences were found between Vth and the EMGth calculated from windows of 2, 5, 10, 30 and 60 seconds in the studied muscles, it is suggested that windows of 5 and 10 seconds seem most appropriate for the calculation of the RMS curve and the determination of EMGth by visual inspection.
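Computing the RMS curve over windows of different lengths can be sketched as follows; the data are synthetic and the sampling rate and window lengths are illustrative assumptions:

```python
import numpy as np

def rms_curve(emg, fs, win_s):
    """RMS of an EMG signal over consecutive non-overlapping windows of
    win_s seconds, as used to build the RMS curve for threshold detection."""
    n = int(win_s * fs)
    usable = (len(emg) // n) * n
    segs = np.asarray(emg[:usable], dtype=float).reshape(-1, n)
    return np.sqrt((segs ** 2).mean(axis=1))

# Synthetic "fatiguing" EMG: zero-mean noise whose amplitude grows with time
rng = np.random.default_rng(3)
fs = 1000                                   # assumed sampling rate, Hz
t = np.arange(0, 60 * fs)
emg = (1.0 + t / t[-1]) * rng.normal(size=t.size)   # amplitude 1 -> 2
rms5 = rms_curve(emg, fs, 5)                # 5-second windows
rms10 = rms_curve(emg, fs, 10)              # 10-second windows
```

Shorter windows trace the amplitude rise in finer steps but with noisier values; the trade-off between temporal detail and estimate stability is what makes the 5- and 10-second windows a reasonable middle ground.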
Delay Differential Equation Models of Normal and Diseased Electrocardiograms
NASA Astrophysics Data System (ADS)
Lainscsek, Claudia; Sejnowski, Terrence J.
Time series analysis with nonlinear delay differential equations (DDEs) is a powerful tool, since it reveals spectral as well as nonlinear properties of the underlying dynamical system. Here, global DDE models are used to analyze electrocardiography recordings (ECGs) in order to capture distinguishing features of different heart conditions, such as normal heart beat, congestive heart failure, and atrial fibrillation. To capture these distinguishing features, the number of terms and delays in the model, as well as the order of nonlinearity of the DDE model, have to be selected. The DDE structure selection is done in a supervised way, by selecting the DDE that best separates the different data types. We analyzed 24 h of data from 15 young healthy subjects in normal sinus rhythm (NSR), from 15 congestive heart failure (CHF) patients, and from 15 subjects suffering from atrial fibrillation (AF), selected from the Physionet database. For the analysis presented here we used 5 min non-overlapping data windows on the raw data, without any artifact removal. For classification performance we used the Cohen kappa coefficient, computed directly from the confusion matrix. The overall classification performance for the three groups was around 72-99% on the 5 min windows for the different approaches. For 2 h data windows, the classification performance for all three groups was above 95%.
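The Cohen kappa coefficient computed directly from a confusion matrix, as used for the classification performance above, can be sketched as follows; the matrix values are hypothetical, not the study's results:

```python
import numpy as np

def cohen_kappa(cm):
    """Cohen's kappa from a confusion matrix: chance-corrected agreement
    between predicted and true class labels."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                                   # observed
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2   # by chance
    return (po - pe) / (1.0 - pe)

# Hypothetical 3-class confusion matrix (rows: true NSR / CHF / AF)
cm = np.array([[90,  5,  5],
               [ 4, 92,  4],
               [ 6,  3, 91]])
kappa = cohen_kappa(cm)
```

Unlike raw accuracy, kappa discounts the agreement expected from the class marginals alone, which matters when group sizes are unbalanced.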
Experimental study on the crack detection with optimized spatial wavelet analysis and windowing
NASA Astrophysics Data System (ADS)
Ghanbari Mardasi, Amir; Wu, Nan; Wu, Christine
2018-05-01
In this paper, highly sensitive crack detection is experimentally realized on a beam under deflection by optimizing spatial wavelet analysis. A crack in the beam structure induces a perturbation/slope singularity in the deflection profile. The spatial wavelet transform works as a magnifier, amplifying the small perturbation signal at the crack location so that the damage can be detected and localized. The profile of a deflected aluminum cantilever beam is obtained for both intact and cracked beams by a high-resolution laser profile sensor. The Gabor wavelet transform is applied to the difference between the intact and cracked data sets. To improve detection sensitivity, the scale factor of the spatial wavelet transform and the number of transform repetitions are optimized. Furthermore, to detect possible cracks close to the measurement boundaries, the wavelet transform edge effect, which induces large wavelet coefficient values around the measurement boundaries, is efficiently reduced by introducing different windowing functions. The results show that a small crack with a depth of less than 10% of the beam height can be localized with a clear perturbation. Moreover, the perturbation caused by a crack 0.85 mm away from one end of the measurement range, which would otherwise be covered by the wavelet transform edge effect, emerges when proper window functions are applied.
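The magnifier effect of the spatial wavelet transform can be sketched as follows: a minimal numpy illustration with a synthetic deflection-difference profile and an assumed Gabor-type wavelet. For simplicity the edge-effect region is excluded outright, whereas the paper suppresses it with windowing functions:

```python
import numpy as np

def gabor_coefficients(signal, scale, dx):
    """Spatial wavelet coefficients at a single scale, using a Gabor-type
    mother wavelet (Gaussian envelope times a cosine carrier)."""
    u = np.arange(-4 * scale, 4 * scale + dx, dx)
    wavelet = np.exp(-u**2 / (2 * scale**2)) * np.cos(5.0 * u / scale)
    wavelet /= np.sqrt(scale)
    return np.convolve(signal, wavelet, mode="same") * dx

# Synthetic deflection difference (cracked minus intact): a smooth trend
# plus a small slope discontinuity (kink) at the assumed crack position.
xs = np.linspace(0.0, 0.5, 2001)        # beam coordinate, metres
dx = xs[1] - xs[0]
crack_at = 0.30
diff = 1e-4 * xs + 5e-4 * np.clip(xs - crack_at, 0.0, None)
coeffs = np.abs(gabor_coefficients(diff, scale=0.01, dx=dx))
# exclude the boundary region dominated by the transform's edge effect
interior = slice(200, -200)
located = float(xs[interior][np.argmax(coeffs[interior])])
```

Because the wavelet is nearly zero-mean and orthogonal to a linear trend, the smooth deflection contributes almost nothing, while the slope singularity produces a localized coefficient peak at the crack position.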
Imaging windows for long-term intravital imaging: General overview and technical insights.
Alieva, Maria; Ritsma, Laila; Giedt, Randy J; Weissleder, Ralph; van Rheenen, Jacco
2014-01-01
Intravital microscopy is increasingly used to visualize and quantitate dynamic biological processes at the (sub)cellular level in live animals. By visualizing tissues through imaging windows, individual cells (e.g., cancer, host, or stem cells) can be tracked and studied over a time-span of days to months. Several imaging windows have been developed to access tissues including the brain, superficial fascia, mammary glands, liver, kidney, pancreas, and small intestine among others. Here, we review the development of imaging windows and compare the most commonly used long-term imaging windows for cancer biology: the cranial imaging window, the dorsal skin fold chamber, the mammary imaging window, and the abdominal imaging window. Moreover, we provide technical details, considerations, and trouble-shooting tips on the surgical procedures and microscopy setups for each imaging window and explain different strategies to assure imaging of the same area over multiple imaging sessions. This review aims to be a useful resource for establishing the long-term intravital imaging procedure. PMID:28243510
Public Health Professionals as Policy Entrepreneurs: Arkansas's Childhood Obesity Policy Experience
Craig, Rebekah L.; Felix, Holly C.; Phillips, Martha M.
2010-01-01
In response to a nationwide rise in obesity, several states have passed legislation to improve school health environments. Among these was Arkansas's Act 1220 of 2003, the most comprehensive school-based childhood obesity legislation at that time. We used the Multiple Streams Framework to analyze factors that brought childhood obesity to the forefront of the Arkansas legislative agenda and resulted in the passage of Act 1220. When 3 streams (problem, policy, and political) are combined, a policy window is opened and policy entrepreneurs may advance their goals. We documented factors that produced a policy window and allowed entrepreneurs to enact comprehensive legislation. This historical analysis and the Multiple Streams Framework may serve as a roadmap for leaders seeking to influence health policy. PMID:20864715
Data in support of energy performance of double-glazed windows.
Shakouri, Mahmoud; Banihashemi, Saeed
2016-06-01
This paper provides the data used in a research project to propose a new simplified window rating system based on saved annual energy ("Developing an empirical predictive energy-rating model for windows by using Artificial Neural Network" (Shakouri Hassanabadi and Banihashemi Namini, 2012) [1], "Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates" (Banihashemi et al., 2015) [2]). A full factorial simulation study was conducted to evaluate the performance of 26 different types of windows in a four-story residential building. In order to generalize the results, the selected windows were tested in four climates (cold, tropical, temperate, and hot and arid) and four main orientations (North, West, South and East). The accompanying datasets include the annual saved cooling and heating energy in different climates and orientations using the selected windows. Moreover, a complete dataset is provided that includes the specifications of the 26 windows, climate data, month, and orientation of the window. This dataset can be used to build predictive models for energy efficiency assessment of double-glazed windows.
NASA Astrophysics Data System (ADS)
Ganguly, S.; Lubetzky, E.; Martinelli, F.
2015-05-01
The East process is a 1D kinetically constrained interacting particle system, introduced in the physics literature in the early 1990s to model liquid-glass transitions. Spectral gap estimates of Aldous and Diaconis in 2002 imply that its mixing time on L sites has order L. We complement that result and show cutoff with an O(√L)-window. The main ingredient is an analysis of the front of the process (its rightmost zero in the setup where zeros facilitate updates to their right). One expects the front to advance as a biased random walk, whose normal fluctuations would imply cutoff with an O(√L)-window. The law of the process behind the front plays a crucial role: Blondel showed that it converges to an invariant measure ν, on which very little is known. Here we obtain quantitative bounds on the speed of convergence to ν, finding that it is exponentially fast. We then derive that the increments of the front behave as a stationary mixing sequence of random variables, and a Stein-method based argument of Bolthausen (1982) implies a CLT for the location of the front, yielding the cutoff result. Finally, we supplement these results by a study of analogous kinetically constrained models on trees, again establishing cutoff, yet this time with an O(1)-window.
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high-quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension: windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) identify important manipulatable process factors and their potential working levels; ii) perform fractional factorial experiments on the process using orthogonal array designs; iii) analyze the resulting data to determine the optimum operating levels of the factors, considering both the process mean and the process variance; iv) conduct an additional experiment to verify that the new factor levels indeed give an improvement.
Photorefractive-based adaptive optical windows
NASA Astrophysics Data System (ADS)
Liu, Yuexin; Yang, Yi; Wang, Bo; Fu, John Y.; Yin, Shizhuo; Guo, Ruyan; Yu, Francis T.
2004-10-01
Optical windows have been widely used in optical spectrographic processing systems. In this paper, various window profiles, such as rectangular, triangular, Hamming, Hanning, and Blackman, are investigated in detail with regard to their effect on the generated spectrograms, such as the joint time-frequency resolution Δt·Δω and the sidelobe amplitude attenuation. All of these windows can be synthesized in a photorefractive crystal by the angular-multiplexing holographic technique, which renders the system more adaptive. Experimental results are provided.
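The sidelobe-attenuation comparison among window profiles can be sketched numerically; the FFT length and window size below are arbitrary illustrative choices:

```python
import numpy as np

def peak_sidelobe_db(window, nfft=4096):
    """Highest sidelobe of a window's magnitude spectrum, in dB relative
    to the mainlobe peak."""
    W = np.abs(np.fft.rfft(window, nfft))
    W /= W.max()
    i = 1
    while W[i] <= W[i - 1]:      # walk down the mainlobe to its first dip
        i += 1
    return 20.0 * np.log10(W[i:].max())

n = 64
lobes = {name: peak_sidelobe_db(w)
         for name, w in [("rect", np.ones(n)),
                         ("hamming", np.hamming(n)),
                         ("blackman", np.blackman(n))]}
```

The ordering illustrates the usual trade-off: the rectangular window has the narrowest mainlobe (best Δt·Δω concentration) but sidelobes near -13 dB, while Hamming and Blackman trade mainlobe width for progressively deeper sidelobe suppression.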
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
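The package itself is not shown in this record; a minimal sketch of the core idea, local linear fits to moving windows with the slope coefficient kept as a feature (the function names are illustrative, and the quadratic variant and interval summaries are omitted):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def window_slopes(series, width):
    """Slope of a local linear fit in each moving window of the series,
    shifted one sample at a time."""
    return [linear_fit(range(width), series[i:i + width])[0]
            for i in range(len(series) - width + 1)]
```

A summary step would then aggregate these per-window coefficients (e.g. their mean or spread) over each user-defined time interval.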
Simplified Least Squares Shadowing sensitivity analysis for chaotic ODEs and PDEs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chater, Mario, E-mail: chaterm@mit.edu; Ni, Angxiu, E-mail: niangxiu@mit.edu; Wang, Qiqi, E-mail: qiqi@mit.edu
This paper develops a variant of the Least Squares Shadowing (LSS) method, which has successfully computed the derivative for several chaotic ODEs and PDEs. The development in this paper aims to simplify the Least Squares Shadowing method by improving how time dilation is treated. Instead of adding an explicit time dilation term as in the original method, the new variant uses windowing, which can be more efficient and simpler to implement, especially for PDEs.
Precision Departure Release Capability (PDRC) Final Report
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.
Precision Departure Release Capability (PDRC): NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Thomas J.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations.
Numerical and experimental validation for the thermal transmittance of windows with cellular shades
Hart, Robert
2018-02-21
Some highly energy efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux up to 20% when AR = 0.5. CFD analysis also showed that wave-type surfaces of cellular shades have less than 2% impact on heat flux through the cavities and less than 5% for natural convection of room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.
Measuring kinetics of complex single ion channel data using mean-variance histograms.
Patlak, J B
1993-01-01
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state, open channel noise, and fast flickers to other states were present, as were a substantial number of subconductance states. 
"Standard" half-amplitude threshold analysis of these data produces open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed. PMID:7690261
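The window statistics underlying the histogram are straightforward; a minimal sketch of the mean-variance pair construction described above (variable names are illustrative; the 2D binning, curve fitting, and event counting versus window width are omitted):

```python
def mean_variance_pairs(trace, N):
    """Mean current and current variance within each N-sample window,
    with the window slid one sample at a time over the digitized trace.
    Defined current levels (open, closed, sublevel) show up as clusters
    of pairs with low variance."""
    pairs = []
    for i in range(len(trace) - N + 1):
        w = trace[i:i + N]
        m = sum(w) / N
        v = sum((x - m) ** 2 for x in w) / N
        pairs.append((m, v))
    return pairs
```

Assembling these pairs into a two-dimensional histogram, and repeating for a range of N, yields the window-width dependence from which the dwell-time constants are estimated.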
Noise normalization and windowing functions for VALIDAR in wind parameter estimation
NASA Astrophysics Data System (ADS)
Beyon, Jeffrey Y.; Koch, Grady J.; Li, Zhiwen
2006-05-01
The wind parameter estimates from a state-of-the-art 2-μm coherent lidar system located at NASA Langley, Virginia, named VALIDAR (validation lidar), were compared after normalizing the noise by its estimated power spectra via the periodogram and the linear predictive coding (LPC) scheme. The power spectra and the Doppler shift estimates were the main parameter estimates for comparison. Different types of windowing functions were implemented in the VALIDAR data processing algorithm and their impact on the wind parameter estimates was observed. Time- and frequency-independent windowing functions such as Rectangular, Hanning, and Kaiser-Bessel, and a time- and frequency-dependent apodized windowing function, were compared. A brief overview of ongoing nonlinear algorithm development for Doppler shift correction follows.
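Two generic building blocks named here, the periodogram and noise normalization, can be sketched compactly. The VALIDAR-specific algorithms, the LPC estimator, and the apodized windows are not reproduced; the median noise-floor estimate below is an assumption for illustration:

```python
import cmath
import math

def periodogram(x):
    """Magnitude-squared DFT divided by N (direct O(N^2) form,
    adequate for a sketch)."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) ** 2 / N
            for k in range(N)]

def noise_normalize(psd):
    """Divide each bin by a robust noise-floor estimate
    (here, the median bin power)."""
    floor = sorted(psd)[len(psd) // 2]
    return [p / floor for p in psd]
```

A narrowband Doppler return then stands out as a peak in the normalized spectrum regardless of the absolute noise level.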
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
Blending process, which is an essential part of the pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. With the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported in applications to the blending process, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend the plain MBSD from the off-line scenario to the online scenario, and it was used to determine the endpoint of the blending process of Chinese medicine dispensing granules. By tuning the window size online, the status changes of the materials in the blending process were reflected in the calculation of the standard deviation in real time. The proposed method was tested separately in the blending processes of dextrin and three other extracts of traditional Chinese medicine. All of the results showed that, compared with the traditional MBSD method, progressively increasing the window size could more clearly reflect the status changes of the materials in the blending process, making the method suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
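One plausible reading of the progressively increasing MBSD window can be sketched as a minimal online endpoint detector. The threshold rule, growth schedule, and parameter names below are assumptions for illustration, not the authors' published algorithm:

```python
def online_endpoint(stream, threshold, start=3, step=1, max_block=8):
    """Declare the blending endpoint when the standard deviation of the
    most recent `block` observations first drops below `threshold`,
    while `block` is progressively widened from `start` to `max_block`
    as more observations arrive."""
    history = []
    block = start
    for t, x in enumerate(stream):
        history.append(x)
        if len(history) >= block:
            window = history[-block:]
            m = sum(window) / block
            sd = (sum((v - m) ** 2 for v in window) / block) ** 0.5
            if sd < threshold:
                return t  # index at which homogeneity is declared
            if block < max_block:
                block += step  # grow the block for the next check
    return None  # endpoint not reached within the stream
```

In practice the stream values would be spectrum-derived quantities (e.g. NIR scores) acquired during blending; a low and stable standard deviation over a wide block indicates a homogeneous blend.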
Helicopter TEM parameters analysis and system optimization based on time constant
NASA Astrophysics Data System (ADS)
Xiao, Pan; Wu, Xin; Shi, Zongyang; Li, Jutao; Liu, Lihua; Fang, Guangyou
2018-03-01
Helicopter transient electromagnetic (TEM) method is a kind of common geophysical prospecting method, widely used in mineral detection, underground water exploration and environment investigation. In order to develop an efficient helicopter TEM system, it is necessary to analyze and optimize the system parameters. In this paper, a simple and quantitative method is proposed to analyze the system parameters, such as waveform, power, base frequency, measured field and sampling time. A wire loop model is used to define a comprehensive 'time constant domain' that spans a range of time constants, analogous to a range of conductances, after which the characteristics of the system parameters in this domain are obtained. It is found that the distortion caused by the transmitting base frequency is less than 5% when the ratio of the transmitting period to the target time constant is greater than 6. When the sampling time window is less than the target time constant, the distortion caused by the sampling time window is less than 5%. According to this method, a helicopter TEM system, called CASHTEM, is designed, and a flight test has been carried out in a known mining area. The test results show that the system has good detection performance, verifying the effectiveness of the method.
An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files
Chan, Anthony; Gropp, William; Lusk, Ewing
2008-01-01
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
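The paper's hierarchical on-disk format is not reproduced here, but the access-cost property it targets, locating a time window in time independent of total file size and roughly proportional to the number of events inside it, can be illustrated with a sorted in-memory index (a simplification: the real format must also handle states that span the window boundary):

```python
import bisect

def events_in_window(timestamps, t0, t1):
    """Return the time-sorted events whose timestamps fall in [t0, t1]:
    O(log n) to locate the window boundaries via binary search,
    O(k) to enumerate the k events inside it."""
    lo = bisect.bisect_left(timestamps, t0)
    hi = bisect.bisect_right(timestamps, t1)
    return timestamps[lo:hi]
```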
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivas, Eric Richard
2016-02-26
A conjugate heat transfer and thermal structural analysis was completed, with the objective of determining the following: Lead bismuth eutectic (LBE) peak temperature, free convective velocity patterns in the LBE, peak beam window temperature, and thermal stress/deformation in the window.
NASA Astrophysics Data System (ADS)
Behm, M.
2016-12-01
The ALP2002 experiment was a large 3D active seismic experiment to reveal the crustal structure of the Eastern Alps and their neighboring tectonic provinces in Central Europe. The deployment comprised 993 autonomous Reftek Texan recorders equipped with vertical component geophones. The average station spacing was 4.5 km, and the recording instruments were distributed along 14 interlocking profiles with a total line length of 4313 km. During 5 days, 40 explosions in boreholes were fired and recorded within pre-programmed time windows. The limited memory capacity of the recorders allowed for just a few additional backup time windows. Despite the short total recording time of only 17.5 hours within 5 days, one of these backup time windows contains the recording of the reflection from the Earth's inner core (PKiKP phase) originating from an earthquake 130 km offshore Papua New Guinea. This magnitude 5.7 teleseismic event occurred at a depth of 33 km and its epicentral distance to the deployment area is 121.5°. Although the 1C geophones with a natural frequency of 4.5 Hz are not designed to capture the complete characteristics of low-frequency earthquake waveforms, the high-frequency part of the PKiKP wavelet is clearly recorded on all 993 stations. Thus the dataset represents a unique opportunity to study regional and local crustal structures from the analysis of teleseismic events, in particular since the results from the active source data provide calibration and validation. Arrival time analysis is facilitated by the sub-vertical emergence angle of the PKiKP phase. Time corrections for the near surface (< 10 km depth) and the upper mantle structure (50 - 400 km depth) are obtained from previously established seismic 3D models and allow focusing the interpretation on the lower crust and crust-mantle transition.
Further, a recently developed blind deconvolution approach is applied to the data for imaging the crustal structure from surface reflections of the PKiKP phase.
NASA Astrophysics Data System (ADS)
Deininger, Michael; Lippold, Jörg; Abele, Florian; McDermott, Frank
2016-04-01
Speleothems are considered a valuable continental climate archive. Their δ18O records provide information on past changes in atmospheric circulation, accompanied by changes in surface air temperature and precipitation. Over the last decades, European speleothem studies have assembled a European speleothem network (including numerous speleothem δ18O records) that now allows past climate variability to be pictured not only in time but also in space. In particular, the climate variability of the last 4.5 ka was investigated by these studies. This allows the comparison of the speleothem-based reconstructed palaeoclimate with the timings of the rise and fall of ancient civilisations in this period, including the Dark Ages. Here we evaluate a compilation of 10 speleothem δ18O records covering the last 4.5 ka using a Monte Carlo based Principal Component Analysis (MC-PCA) that accounts for uncertainties in individual speleothem age models and for the different and varying temporal resolutions of each speleothem δ18O record. Our MC-PCA approach allows not only the identification of temporally coherent changes in δ18O records, i.e. the common signal in all investigated speleothem δ18O records, but also facilitates their depiction and evaluation spatially. The speleothem δ18O records span almost the entire European continent, ranging from the western margin of the continent to Northern Turkey and from Northern Italy to Norway. For the MC-PCA analysis, the 4.5 ka period is divided into eight 1-ka-long time windows, each overlapping the subsequent time window by 500 years, to allow a comparison of the spatio-temporal evolution of the common signal. For every single time window we derive a common mode of climate variability of all speleothem δ18O records as well as its spatial extent. This allows us to compare the rise and fall of ancient civilisations, like the Hittite and the Roman Empire, with our reconstructed spatio-temporal record.
Popularity and Novelty Dynamics in Evolving Networks.
Abbas, Khushnood; Shang, Mingsheng; Abbasi, Alireza; Luo, Xin; Xu, Jian Jun; Zhang, Yu-Xia
2018-04-20
Network science plays a big role in the representation of real-world phenomena such as the user-item bipartite networks found in e-commerce or social media platforms. It provides researchers with tools and techniques to solve complex real-world problems. Identifying and predicting the future popularity and importance of items in e-commerce or social media platforms is a challenging task. Some items gain popularity repeatedly over time while some become popular and novel only once. This work aims to identify the key factors: popularity and novelty. To do so, we consider two types of novelty predictions: items appearing in the popular ranking list for the first time; and items which were not in the popular list in the past time window, but might have been popular before the recent past time window. In order to identify the popular items, a careful consideration of macro-level analysis is needed. In this work we propose a model which exploits item-level information over a span of time to rank the importance of items. We considered the ageing or decay effect along with the recent link gain of the items. We test our proposed model on four real-world datasets using four information retrieval based metrics.
Galias, Zbigniew
2017-05-01
An efficient method to find positions of periodic windows for the quadratic map f(x)=ax(1-x) and a heuristic algorithm to locate the majority of wide periodic windows are proposed. Accurate rigorous bounds of positions of all periodic windows with periods below 37 and the majority of wide periodic windows with longer periods are found. Based on these results, we prove that the measure of the set of regular parameters in the interval [3,4] is above 0.613960137. The properties of periodic windows are studied numerically. The results of the analysis are used to estimate that the true value of the measure of the set of regular parameters is close to 0.6139603.
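The rigorous interval-arithmetic machinery behind these bounds is beyond a short sketch, but the object under study is easy to probe numerically: inside a periodic window the attractor of f(x) = ax(1-x) is a short stable cycle. A non-rigorous detector (iteration counts and tolerance are illustrative choices, not the paper's method):

```python
def attractor_period(a, transients=10000, max_period=64, tol=1e-9):
    """Iterate f(x) = a*x*(1-x) from the critical point x = 0.5 past the
    transients, then return the smallest p with |f^p(x) - x| < tol,
    or None if no short stable cycle is detected."""
    x = 0.5
    for _ in range(transients):
        x = a * x * (1 - x)
    orbit = [x]
    for _ in range(max_period):
        x = a * x * (1 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None
```

For example, a = 3.83 lies inside the well-known period-3 window, while a = 3.2 lies in the period-2 regime; a rigorous treatment as in the paper instead bounds the window edges with validated arithmetic.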
Precision Departure Release Capability (PDRC) Technology Description
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.
Precision Departure Release Capability (PDRC) Concept of Operations
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates.
NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.
Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi
2008-05-01
We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
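A minimal sketch of the entropy measure as described: the signal is reduced to binary symbols and the Shannon entropy of short symbol patterns is computed in a sliding window. The abstract does not give the binarization threshold or the pattern length, so the series mean and m = 3 are assumptions:

```python
import math
from collections import Counter

def pattern_entropy(signal, window, m=3):
    """Shannon entropy (bits) of length-m binary patterns within each
    sliding window. Binarization threshold: the whole-series mean
    (an assumption; the paper's exact choice is not stated here)."""
    mean = sum(signal) / len(signal)
    bits = [1 if x > mean else 0 for x in signal]
    entropies = []
    for i in range(len(bits) - window + 1):
        w = bits[i:i + window]
        counts = Counter(tuple(w[j:j + m]) for j in range(window - m + 1))
        total = sum(counts.values())
        entropies.append(-sum(c / total * math.log2(c / total)
                              for c in counts.values()))
    return entropies
```

A richer symbolic dynamics (many distinct patterns, as in wakefulness) yields higher entropy than a regular or flat one (as in deep sleep), matching the reported correlation with consciousness level.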
High-impact resistance optical sensor windows
NASA Astrophysics Data System (ADS)
Askinazi, Joel; Ceccorulli, Mark L.; Goldman, Lee
2011-06-01
Recent field experience with optical sensor windows on both ground and airborne platforms has shown a significant increase in window fracturing from foreign object debris (FOD) impacts and as a by-product of asymmetrical warfare. Common optical sensor window materials such as borosilicate glass do not typically have high impact resistance. Emerging advanced optical window materials such as aluminum oxynitride offer the potential for a significant improvement in FOD impact resistance due to their superior surface hardness, fracture toughness, and strength properties. To confirm the potential impact-resistance improvement achievable with these emerging materials, Goodrich ISR Systems in collaboration with Surmet Corporation undertook a set of comparative FOD impact tests of optical sensor windows made from borosilicate glass and from aluminum oxynitride. It was demonstrated that the aluminum oxynitride windows could withstand up to three times the FOD impact velocity (as compared with borosilicate glass) before fracture would occur. These highly encouraging test results confirm the utility of this highly viable window solution for new ground and airborne multispectral window applications as well as for retrofit to current production windows. We believe that this solution can go a long way toward significantly reducing the frequency and life-cycle cost of window replacement.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Process Validation Table (PVT) Widget Class (Class is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network registration services for Information Sharing Protocol (ISP) graphical-user-interface (GUI) computer programs. Heretofore, ISP PVT programming tasks have required many method calls to identify, query, and interpret the connections and messages exchanged between a client and a PVT server. Normally, programmers have utilized direct access to UNIX socket libraries to implement the PVT protocol queries, necessitating the use of many lines of source code to perform frequent tasks. Now, the X-Windows PVT Widget Class encapsulates ISP client server network registration management tasks within the framework of an X Windows widget. Use of the widget framework enables an X Windows GUI program to interact with PVT services in an abstract way and in the same manner as that of other graphical widgets, making it easier to program PVT clients. Wrapping the PVT services inside the widget framework enables a programmer to treat a PVT server interface as though it were a GUI. Moreover, an alternate subclass could implement another service in a widget of the same type. This program was written by Matthew R. Barry of United Space Alliance for Johnson Space Center. For further information, contact the Johnson Technology Transfer Office at (281) 483-3809. MSC-23582
Shuttle Data Center File-Processing Tool in Java
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data.
The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
Application of MEMS-based x-ray optics as tuneable nanosecond choppers
NASA Astrophysics Data System (ADS)
Chen, Pice; Walko, Donald A.; Jung, Il Woong; Li, Zhilong; Gao, Ya; Shenoy, Gopal K.; Lopez, Daniel; Wang, Jin
2017-08-01
Time-resolved synchrotron x-ray measurements often rely on a mechanical chopper to isolate a set of x-ray pulses. We have started the development of micro-electromechanical systems (MEMS)-based x-ray optics as an alternative method to manipulate x-ray beams. In the application of x-ray pulse isolation, we recently achieved a pulse-picking time window of half a nanosecond, which is more than 100 times faster than mechanical choppers can achieve. The MEMS device consists of a comb-drive silicon micromirror designed to efficiently diffract an x-ray beam during oscillation. The MEMS devices were operated in Bragg geometry, and their oscillation was synchronized to the x-ray pulses with a frequency matching subharmonics of the cycling frequency of the x-ray pulses. The microscale structure of the silicon mirror, in terms of its curvature and crystalline quality, ensures a narrow angular spread of the Bragg reflection. Along with a discussion of the factors that determine the diffractive time window, this report shows our approach to narrowing the time window to half a nanosecond. The short diffractive time window will allow us to select a single x-ray pulse out of a train of pulses from synchrotron radiation facilities.
Windows of sensitivity to toxic chemicals in the motor effects development.
Ingber, Susan Z; Pohl, Hana R
2016-02-01
Many chemicals currently used are known to elicit nervous system effects. In addition, approximately 2000 new chemicals introduced annually have not yet undergone neurotoxicity testing. This review concentrated on motor development effects associated with exposure to environmental neurotoxicants to help identify critical windows of exposure and begin to assess data needs based on a subset of chemicals thoroughly reviewed by the Agency for Toxic Substances and Disease Registry (ATSDR) in Toxicological Profiles and Addenda. Multiple windows of sensitivity were identified that differed based on the maturity level of the neurological system at the time of exposure, as well as dose and exposure duration. Similar but distinct windows were found for both motor activity (GD 8-17 [rats], GD 12-14 and PND 3-10 [mice]) and motor function performance (insufficient data for rats, GD 12-17 [mice]). Identifying specific windows of sensitivity in animal studies was hampered by study designs oriented towards detection of neurotoxicity that occurred at any time throughout the developmental process. In conclusion, while this investigation identified some critical exposure windows for motor development effects, it demonstrates a need for more acute duration exposure studies based on neurodevelopmental windows, particularly during the exposure periods identified in this review. Published by Elsevier Inc.
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (< = 12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection < = 12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based IIR and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods.
Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
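The window-based estimator above rests on rearranging 'Prevalence = Incidence x Duration'. A minimal sketch of that arithmetic, with a hypothetical helper name and no correction for selection bias (which the paper's full estimator addresses):

```python
def window_based_iir(n_ruled_incident, n_total, window_days):
    """Annualized incident infection rate (IIR) from a TRI window.

    The prevalence of test-recent results among notifications, divided
    by the window duration expressed in years, gives the incidence
    ('Prevalence = Incidence x Duration' rearranged).  Simplified
    illustration only; no adjustment for false-incident results.
    """
    prevalence = n_ruled_incident / n_total
    duration_years = window_days / 365.25
    return prevalence / duration_years
```

For example, 50 recent-classified results among 748 notifications with a 100-day window gives an IIR of about 0.24 per year.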
Geometric subspace methods and time-delay embedding for EEG artifact removal and classification.
Anderson, Charles W; Knight, James N; O'Connor, Tim; Kirby, Michael J; Sokolov, Artem
2006-06-01
Generalized singular-value decomposition is used to separate multichannel electroencephalogram (EEG) into components found by optimizing a signal-to-noise quotient. These components are used to filter out artifacts. Short-time principal components analysis of time-delay embedded EEG is used to represent windowed EEG data to classify EEG according to which mental task is being performed. Examples are presented of the filtering of various artifacts and results are shown of classification of EEG from five mental tasks using committees of decision trees.
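The windowed representation described above can be sketched as short-time PCA of a time-delay embedded signal. The embedding dimension, delay, and use of the leading-component variance fraction as the window feature are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: row t is [x[t], x[t+tau], ..., x[t+(dim-1)tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def windowed_pca_energy(x, dim=8, tau=2, n_components=2):
    """Fraction of variance captured by the leading principal components
    of the delay-embedded window -- a sketch of short-time PCA on
    time-delay embedded EEG.  Near 1 for narrowband (oscillatory)
    segments, lower for broadband noise."""
    emb = delay_embed(np.asarray(x, float), dim, tau)
    emb = emb - emb.mean(axis=0)
    s = np.linalg.svd(emb, compute_uv=False)   # singular values
    var = s ** 2
    return var[:n_components].sum() / var.sum()
```

A pure sinusoid embeds onto a rank-2 ellipse (two components capture essentially all variance), whereas white noise spreads variance across all delay coordinates; such features can then feed a classifier such as the committees of decision trees mentioned above.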
2017-05-01
Analyzing these factors enables a planner to develop an axis-of-advance that a vessel can easily maintain, as well as to reduce the travel time from...operational risk by testing the feasibility of the navigability of an area; 2) determining the capacity and timing of that operation; 3) defining the...conditions at this location dictate that only a narrow window of time is available for conducting surface ship-to-shore operations. The vessel
Kirschman, Lucas J; Crespi, Erica J; Warne, Robin W
2018-01-01
Ubiquitous environmental stressors are often thought to alter animal susceptibility to pathogens and contribute to disease emergence. However, duration of exposure to a stressor is likely critical, because while chronic stress is often immunosuppressive, acute stress can temporarily enhance immune function. Furthermore, host susceptibility to stress and disease often varies with ontogeny, increasing during critical developmental windows. How the duration and timing of exposure to stressors interact to shape critical windows and influence disease processes is not well tested. We used ranavirus and larval amphibians as a model system to investigate how physiological stress and pathogenic infection shape development and disease dynamics in vertebrates. Based on a resource allocation model, we designed experiments to test how exposure to stressors may induce resource trade-offs that shape critical windows and disease processes, because the neuroendocrine stress axis coordinates developmental remodelling, immune function and energy allocation in larval amphibians. We used wood frog larvae (Lithobates sylvaticus) to investigate how chronic and acute exposure to corticosterone, the dominant amphibian glucocorticoid hormone, mediates development and immune function via splenocyte immunohistochemistry analysis in association with ranavirus infection. Corticosterone treatments affected immune function, as both chronic and acute exposure suppressed splenocyte proliferation, although viral replication rate increased only in the chronic corticosterone treatment. Time to metamorphosis and survival depended on both corticosterone treatment and infection status. In the control and chronic corticosterone treatments, ranavirus infection decreased survival and delayed metamorphosis, although chronic corticosterone exposure accelerated the rate of metamorphosis in uninfected larvae. Acute corticosterone exposure accelerated metamorphosis and increased survival in infected larvae.
Interactions between stress exposure (via glucocorticoid actions) and infection impose resource trade-offs that shape optimal allocation between development and somatic function. As a result, critical disease windows are likely shaped by stress exposure because any conditions that induce changes in differentiation rates will alter the duration and susceptibility of organisms to stressors or disease. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.
Multifractal analysis of the 2001 Mw 7.7 Bhuj earthquake sequence in Gujarat, Western India
NASA Astrophysics Data System (ADS)
Aggarwal, Sandeep Kumar; Pastén, Denisse; Khan, Prosanta Kumar
2017-12-01
The 2001 Mw 7.7 Bhuj mainshock seismic sequence in the Kachchh area, occurring from 2001 to 2012, has been analyzed using mono-fractal and multi-fractal dimension spectrum analysis techniques. This region was characterized by frequent moderate shocks of Mw ≥ 5.0 for more than a decade after the occurrence of the 2001 Bhuj earthquake. The present study is therefore important for precursory analysis using this sequence. The selected long sequence has been investigated for the first time for a completeness magnitude of Mc 3.0 using the maximum curvature method. Multi-fractal Dq spectrum (Dq ∼ q) analysis was carried out using an effective window length of 200 earthquakes moved in steps of 20 events (i.e., an overlap of 180 events). The robustness of the analysis has been tested by adding a magnitude completeness correction term of 0.2 to Mc 3.0 (giving Mc 3.2), and the error in the calculation of Dq was evaluated for each magnitude threshold. In addition, the stability of the analysis has been investigated down to a minimum magnitude of Mw ≥ 2.6 in the sequence. The analysis shows that the multi-fractal dimension spectrum Dq decreases with increasing clustering of events with time before a moderate-magnitude earthquake in the sequence, which accounts for non-randomness in the spatial distribution of epicenters and its self-organized criticality. Similar behavior is ubiquitous elsewhere around the globe, and warns of the proximity of a damaging seismic event in an area.
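The Dq spectrum used above is the generalized (Rényi) dimension spectrum. A minimal box-counting sketch for a one-dimensional point set (e.g., epicenter positions projected onto a line, or event times within one 200-event window); the scales, the linear fit, and the skipping of q = 1 are illustrative simplifications:

```python
import numpy as np

def generalized_dimensions(points, qs, scales):
    """Estimate the generalized (Renyi) dimension spectrum D_q of a 1-D
    point set by box counting: D_q = slope of log sum_i p_i^q versus
    log(scale), divided by (q - 1).  q = 1 (information dimension)
    needs a limit and is skipped in this sketch."""
    points = np.asarray(points, float)
    lo, hi = points.min(), points.max()
    dims = []
    for q in qs:
        logZ, logS = [], []
        for s in scales:
            nbox = max(1, int(np.ceil((hi - lo) / s)))
            idx = np.minimum(((points - lo) / s).astype(int), nbox - 1)
            p = np.bincount(idx, minlength=nbox) / len(points)
            p = p[p > 0]                       # occupied boxes only
            logZ.append(np.log((p ** q).sum()))
            logS.append(np.log(s))
        slope = np.polyfit(logS, logZ, 1)[0]
        dims.append(slope / (q - 1))
    return dims
```

For a uniform (mono-fractal) point set, D_q is flat at the embedding dimension; increasing clustering makes D_q fall off with q, which is the behavior the study tracks through time.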
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered on some event.
Visualization of IAV Genomes at the Single-Cell Level.
Wang, Dan; Ma, Wenjun
2017-10-01
Different influenza A viruses (IAVs) infect the same cell in a host, and can subsequently produce new viruses through genome reassortment. By combining padlock probe RNA labeling with a single-cell analysis, a new approach effectively captures IAV genome trafficking and defines a time window for genome reassortment from same-cell coinfections. Copyright © 2017 Elsevier Ltd. All rights reserved.
Caring About Dostoyevsky: The Untapped Potential of Studying Literature.
Willems, Roel M; Jacobs, Arthur M
2016-04-01
Should cognitive scientists and neuroscientists care about Dostoyevsky? Engaging with fiction is a natural and rich behavior, providing a unique window onto the mind and brain, particularly for mental simulation, emotion, empathy, and immersion. With advances in analysis techniques, it is time that cognitive scientists and neuroscientists embrace literature and fiction. Copyright © 2016 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... impact of eliminating the correction window from the electronic grant application submission process on... process a temporary error correction window to ensure a smooth and successful transition for applicants. This window provides applicants a period of time beyond the grant application due date to correct any...
Glerean, Enrico; Salmi, Juha; Lahnakoski, Juha M; Jääskeläinen, Iiro P; Sams, Mikko
2012-01-01
Functional brain activity and connectivity have been studied by calculating intersubject and seed-based correlations of hemodynamic data acquired with functional magnetic resonance imaging (fMRI). To inspect temporal dynamics, these correlation measures have been calculated over sliding time windows with necessary restrictions on the length of the temporal window that compromises the temporal resolution. Here, we show that it is possible to increase temporal resolution by using instantaneous phase synchronization (PS) as a measure of dynamic (time-varying) functional connectivity. We applied PS on an fMRI dataset obtained while 12 healthy volunteers watched a feature film. Narrow frequency band (0.04-0.07 Hz) was used in the PS analysis to avoid artifactual results. We defined three metrics for computing time-varying functional connectivity and time-varying intersubject reliability based on estimation of instantaneous PS across the subjects: (1) seed-based PS, (2) intersubject PS, and (3) intersubject seed-based PS. Our findings show that these PS-based metrics yield results consistent with both seed-based correlation and intersubject correlation methods when inspected over the whole time series, but provide an important advantage of maximal single-TR temporal resolution. These metrics can be applied both in studies with complex naturalistic stimuli (e.g., watching a movie or listening to music in the MRI scanner) and more controlled (e.g., event-related or blocked design) paradigms. A MATLAB toolbox FUNPSY ( http://becs.aalto.fi/bml/software.html ) is openly available for using these metrics in fMRI data analysis.
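The seed-based PS metric described above can be sketched for a pair of signals: band-pass to the narrow band (the paper uses 0.04-0.07 Hz), extract instantaneous phases with the Hilbert transform, and map the wrapped phase difference to a per-sample synchrony index. The filter order and the 0-to-1 index mapping are illustrative choices, not the toolbox's exact implementation.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def instantaneous_ps(x, y, fs, band=(0.04, 0.07)):
    """Time-resolved phase synchronization between two signals.

    Band-pass each signal, take instantaneous phases via the Hilbert
    transform, and return a per-sample index: 1 = perfectly in phase,
    0 = anti-phase.  fs is the sampling rate (1/TR for fMRI)."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phx = np.angle(hilbert(filtfilt(b, a, x)))
    phy = np.angle(hilbert(filtfilt(b, a, y)))
    dphi = np.angle(np.exp(1j * (phx - phy)))   # wrapped phase difference
    return 1.0 - np.abs(dphi) / np.pi
```

Because the index is defined at every sample, it attains the single-TR temporal resolution that sliding-window correlation cannot.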
Detrending moving average algorithm for multifractals
NASA Astrophysics Data System (ADS)
Gu, Gao-Feng; Zhou, Wei-Xing
2010-07-01
The detrending moving average (DMA) algorithm is a widely used technique to quantify the long-term correlations of nonstationary time series and the long-range correlations of fractal surfaces, which contains a parameter θ determining the position of the detrending window. We develop multifractal detrending moving average (MFDMA) algorithms for the analysis of one-dimensional multifractal measures and higher-dimensional multifractals, which is a generalization of the DMA method. The performance of the one-dimensional and two-dimensional MFDMA methods is investigated using synthetic multifractal measures with analytical solutions for backward (θ=0), centered (θ=0.5), and forward (θ=1) detrending windows. We find that the estimated multifractal scaling exponent τ(q) and the singularity spectrum f(α) are in good agreement with the theoretical values. In addition, the backward MFDMA method has the best performance, providing the most accurate estimates of the scaling exponents with the lowest error bars, while the centered MFDMA method has the worst performance. It is found that the backward MFDMA algorithm also outperforms multifractal detrended fluctuation analysis. The one-dimensional backward MFDMA method is applied to analyzing the time series of the Shanghai Stock Exchange Composite Index, and its multifractal nature is confirmed.
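The monofractal (q = 2) core of the DMA method can be sketched as follows: build the profile, subtract a moving average of length n positioned by θ, and take the RMS of the residual; the scaling exponent is the slope of log F(n) versus log n. The window sizes are illustrative, and the full MFDMA additionally varies the moment order q.

```python
import numpy as np

def dma_fluctuation(x, window, theta=0.0):
    """DMA fluctuation F(n): RMS residual of the profile about its moving
    average of length `window`.  theta=0 is the backward window (average
    over preceding points), theta=0.5 centered, theta=1 forward."""
    y = np.cumsum(np.asarray(x, float) - np.mean(x))   # profile
    n = window
    shift = int((n - 1) * theta)
    ma = np.convolve(y, np.ones(n) / n, mode="valid")  # moving average
    resid = y[n - 1 - shift: len(y) - shift] - ma      # theta-aligned residual
    return np.sqrt(np.mean(resid ** 2))

def dma_hurst(x, windows, theta=0.0):
    """Scaling (Hurst) exponent: slope of log F(n) versus log n."""
    f = [dma_fluctuation(x, n, theta) for n in windows]
    return np.polyfit(np.log(windows), np.log(f), 1)[0]
```

For uncorrelated noise the estimated exponent is near 0.5; persistent or anti-persistent series push it above or below that value.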
Fuzzy CMAC With incremental Bayesian Ying-Yang learning and dynamic rule construction.
Nguyen, M N
2010-04-01
Inspired by the philosophy of ancient Chinese Taoism, Xu's Bayesian Ying-Yang (BYY) learning technique performs clustering by harmonizing the training data (yang) with the solution (ying). In our previous work, the BYY learning technique was applied to a fuzzy cerebellar model articulation controller (FCMAC) to find the optimal fuzzy sets; however, this is not suitable for time series data analysis. To address this problem, we propose an incremental BYY learning technique in this paper, incorporating a sliding-window scheme and a dynamic rule-structure algorithm. Three contributions are made as a result of this research. First, an online expectation-maximization algorithm incorporating the sliding window is proposed for the fuzzification phase. Second, the memory requirement is greatly reduced, since the entire data set no longer needs to be held during the prediction process. Third, the rule-structure dynamic algorithm, with dynamic initialization, recruiting, and pruning of rules, relieves the "curse of dimensionality" problem that is inherent in the FCMAC. Because of these features, experimental results on the benchmark data sets of currency exchange rates and Mackey-Glass show that the proposed model is more suitable for real-time streaming data analysis.
NASA Astrophysics Data System (ADS)
Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Xiong, Xiong; Zhang, Wei; Zhou, Wei-Xing
2015-02-01
Traders develop and adopt different trading strategies attempting to maximize their profits in financial markets. These trading strategies not only result in specific topological structures in trading networks, which connect the traders with the pairwise buy-sell relationships, but also have potential impacts on market dynamics. Here, we present a detailed analysis of how market behaviors are correlated with the structures of traders in trading networks, based on audit trail data for the Baosteel stock and its warrant at the transaction level from 22 August 2005 to 23 August 2006. In our investigation, we divide each trading day into 48 rolling time windows with a length of 5 min, construct a trading network within each window, and obtain a time series of over 11,600 trading networks. We find that there are strong simultaneous correlations between the topological metrics (including network centralization, assortative index, and average path length) of trading networks that characterize the patterns of order execution and the financial variables (including return, volatility, intertrade duration, and trading volume) for the stock and its warrant. Our analysis may shed new light on how the microscopic interactions among elements within a complex system affect the system's performance.
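The per-window construction above can be sketched without any graph library: build an undirected network from the (buyer, seller) pairs of one 5-minute window and compute a topological metric. Freeman degree centralization is used here as a stand-in for the paper's network-centralization measure; the synthetic trades are illustrative.

```python
from collections import defaultdict

def window_network_metrics(trades):
    """Build the trading network of one time window from (buyer, seller)
    pairs and return (number of traders, degree centralization).

    Centralization is 1 for a star (one dominant counterparty) and 0
    for a degree-regular network."""
    neighbors = defaultdict(set)
    for buyer, seller in trades:
        neighbors[buyer].add(seller)
        neighbors[seller].add(buyer)
    n = len(neighbors)
    degrees = [len(v) for v in neighbors.values()]
    dmax = max(degrees)
    denom = (n - 1) * (n - 2)      # Freeman normalization for n nodes
    cent = sum(dmax - d for d in degrees) / denom if denom else 0.0
    return n, cent
```

Applied to each rolling window, this yields a time series of network metrics that can then be correlated with returns, volatility, and volume.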
Experience Gained From Launch and Early Orbit Support of the Rossi X-Ray Timing Explorer (RXTE)
NASA Technical Reports Server (NTRS)
Fink, D. R.; Chapman, K. B.; Davis, W. S.; Hashmall, J. A.; Shulman, S. E.; Underwood, S. C.; Zsoldos, J. M.; Harman, R. R.
1996-01-01
This paper reports the results to date of early mission support provided by the personnel of the Goddard Space Flight Center Flight Dynamics Division (FDD) for the Rossi X-Ray Timing Explorer (RXTE) spacecraft. For this mission, the FDD supports onboard attitude determination and ephemeris propagation by supplying ground-based orbit and attitude solutions and calibration results. The first phase of that support was to provide launch window analyses. As the launch window was determined, acquisition attitudes were calculated and calibration slews were planned. Postlaunch, these slews provided the basis for ground-determined calibration. Ground-determined calibration results are used to improve the accuracy of onboard solutions. The FDD is applying new calibration tools designed to facilitate use of the simultaneous, high-accuracy star observations from the two RXTE star trackers for ground attitude determination and calibration. An evaluation of the performance of these tools is presented. The FDD provides updates to the onboard star catalog based on preflight analysis and analysis of flight data. The in-flight results of the mission support in each area are summarized and compared with pre-mission expectations.
Some stylized facts of the Bitcoin market
NASA Astrophysics Data System (ADS)
Bariviera, Aurelio F.; Basgall, María José; Hasperué, Waldo; Naiouf, Marcelo
2017-10-01
In recent years a new type of tradable asset appeared, generically known as cryptocurrencies. Among them, the most widespread is Bitcoin. Given its novelty, this paper investigates some statistical properties of the Bitcoin market. This study compares the dynamics of Bitcoin and standard currencies and focuses on the analysis of returns at different time scales. We test the presence of long memory in return time series from 2011 to 2017, using transaction data from one Bitcoin platform. We compute the Hurst exponent by means of the Detrended Fluctuation Analysis method, using a sliding window in order to measure long-range dependence. We detect that the Hurst exponent changes significantly during the first years of existence of Bitcoin, tending to stabilize in recent times. Additionally, multiscale analysis shows a similar behavior of the Hurst exponent, implying a self-similar process.
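The sliding-window DFA procedure described above can be sketched as follows: a compact DFA(1) estimator applied to successive windows of the return series. Window size, step, and scales are illustrative, not the paper's settings.

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64)):
    """DFA(1) estimate of the Hurst exponent of the increment series x:
    integrate to a profile, detrend each segment linearly, and fit the
    slope of log F(s) versus log s."""
    y = np.cumsum(np.asarray(x, float) - np.mean(x))
    flucts = []
    for s in scales:
        nseg = len(y) // s
        msq = []
        for i in range(nseg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            msq.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(msq)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

def rolling_hurst(returns, window=500, step=100):
    """Hurst exponent in sliding windows, tracking the evolution of
    long-range dependence through time."""
    return [dfa_hurst(returns[i:i + window])
            for i in range(0, len(returns) - window + 1, step)]
```

Plotting the rolling estimates against time reproduces the kind of trajectory the paper reports: early excursions away from 0.5 that later stabilize.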
Time-localized frequency analysis of ultrasonic guided waves for nondestructive testing
NASA Astrophysics Data System (ADS)
Shin, Hyeon Jae; Song, Sung-Jin
2000-05-01
A time-localized frequency (TLF) analysis is employed for guided-wave mode identification and improved guided-wave applications. For the analysis of time-localized frequency content of digitized ultrasonic signals, TLF analysis consists of splitting the time-domain signal into overlapping segments, weighting each with a Hanning window, and forming the columns from their discrete Fourier transforms. The result is presented as a frequency-versus-time diagram showing frequency variation along the signal arrival time. To demonstrate the utility of TLF analysis, an experimental group-velocity dispersion pattern obtained by TLF analysis is compared with the dispersion diagram obtained from the theory of elasticity. The sample piping is carbon steel piping used for the underground transportation of natural gas. Guided-wave propagation characteristics of the piping are considered with TLF analysis and wave-structure concepts. TLF analysis is used for the detection of simulated corrosion defects and the assessment of weld joints using ultrasonic guided waves. TLF analysis has revealed that the difficulty of mode identification in multi-mode propagation can be overcome. The group-velocity dispersion pattern obtained by TLF analysis agrees well with theoretical results.
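The segmentation-windowing-DFT procedure described above is a spectrogram (short-time Fourier transform), and can be sketched directly with SciPy; the segment length and overlap below are illustrative choices.

```python
import numpy as np
from scipy.signal import spectrogram

def tlf_map(signal, fs, seg=256, overlap=192):
    """Time-localized frequency analysis: split the signal into
    overlapping segments, weight each with a Hann(ing) window, and take
    the DFT of each.  Returns (frequencies, segment times, |S|)."""
    f, t, sxx = spectrogram(signal, fs=fs, window="hann",
                            nperseg=seg, noverlap=overlap)
    return f, t, sxx
```

For a dispersive signal, the ridge of the map traces frequency against arrival time, which is what allows the mode identification discussed above.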
Wildfire cluster detection using space-time scan statistics
NASA Astrophysics Data System (ADS)
Tonini, M.; Tuia, D.; Ratle, F.; Kanevski, M.
2009-04-01
The aim of the present study is to identify spatio-temporal clusters of fire sequences using space-time scan statistics. These statistical methods are specifically designed to detect clusters and assess their significance. Basically, scan statistics work by comparing a set of events occurring inside a scanning window (or a space-time cylinder for spatio-temporal data) with those that lie outside. Windows of increasing size scan the zone across space and time: the likelihood ratio is calculated for each window (comparing the ratio "observed cases over expected" inside and outside); the window with the maximum value is assumed to be the most probable cluster, and so on. Under the null hypothesis of spatial and temporal randomness, these events are distributed according to a known discrete-state random process (Poisson or Bernoulli), whose parameters can be estimated. Given this assumption, it is possible to test whether or not the null hypothesis holds in a specific area. In order to deal with fire data, the space-time permutation scan statistic has been applied, since it does not require the explicit specification of the population-at-risk in each cylinder. The case study is daily fire detection in Florida using the Moderate Resolution Imaging Spectroradiometer (MODIS) active fire product during the period 2003-2006. As a result, statistically significant clusters have been identified. Performing the analyses over the entire time frame, three out of the five most likely clusters have been identified in the forest areas, in the north of the state; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. Furthermore, the analyses have been performed separately for the four years to analyze whether the wildfires recur each year during the same period.
It emerges that clusters of forest fires are more frequent in the hot seasons (spring and summer), while in the southern areas they are present throughout the year. The analysis of fire distribution, to evaluate whether fires are statistically more frequent in some areas and/or some periods of the year, can be useful to support fire management and to focus prevention measures.
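The per-window likelihood ratio at the heart of the scan procedure above can be sketched for the Poisson model (Kulldorff's statistic); this is the score computed for one candidate cylinder, with the maximum over all cylinders assessed by Monte Carlo in the full method.

```python
import math

def scan_llr(n_in, mu_in, n_total):
    """Poisson log-likelihood ratio for one scanning window/cylinder.

    n_in: observed cases inside; mu_in: expected cases inside under the
    null (expectations scaled so they sum to n_total); n_total: all
    cases.  Returns 0 unless the window holds an excess of cases;
    larger values flag a more anomalous candidate cluster."""
    if n_in <= mu_in:
        return 0.0
    n_out = n_total - n_in
    mu_out = n_total - mu_in
    return (n_in * math.log(n_in / mu_in)
            + n_out * math.log(n_out / mu_out))
```

Scanning all cylinder positions and sizes, taking the maximum of this score, and comparing it against the maxima from permuted data gives the cluster significance used in the study.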
Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.
2010-12-01
Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. To investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault element show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length.
Results are tabulated and then differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other over the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, comprising all the faults save the creeping section of the San Andreas. The analysis spanned 40,000 years of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults and 8,720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
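A minimal sketch of the scoring described above: for a pair of fault elements, count events on the second element that fall within a time window after each event on the first, subtract the count expected under a uniform-in-time null, and sum over window lengths with normalization by window size. The function name and the synthetic catalogs are illustrative, not the study's actual code.

```python
import numpy as np

def correlation_score(events_i, events_j, windows, T):
    """For each window length w, count events on element j occurring within
    w after each event on element i, subtract the count expected if the n_j
    events were uniform over [0, T], and sum the differences normalized by w."""
    events_j = np.sort(events_j)
    score = 0.0
    for w in windows:
        actual = sum(np.searchsorted(events_j, t + w, side='right')
                     - np.searchsorted(events_j, t, side='left')
                     for t in events_i)
        expected = len(events_i) * len(events_j) * w / T  # uniform-in-time null
        score += (actual - expected) / w
    return score
```

A triggered pair (element j slipping shortly after element i) scores strongly positive, while independent catalogs score near zero.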
GlastCam: A Telemetry-Driven Spacecraft Visualization Tool
NASA Technical Reports Server (NTRS)
Stoneking, Eric T.; Tsai, Dean
2009-01-01
Developed for the GLAST project, which is now the Fermi Gamma-ray Space Telescope, GlastCam software ingests telemetry from the Integrated Test and Operations System (ITOS) and generates four graphical displays of geometric properties in real time, allowing visual assessment of the attitude, configuration, position, and various cross-checks. Four windows are displayed: a "cam" window shows a 3D view of the satellite; a second window shows the standard position plot of the satellite on a Mercator map of the Earth; a third window displays star tracker fields of view, showing which stars are visible from the spacecraft in order to verify star tracking; and the fourth window depicts
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-01-01
Guidance of traditional artillery can significantly improve the attack accuracy and overall combat efficiency of projectiles, making them more adaptable to the information warfare of the future. Obviously, the accurate measurement of artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of an MR sensor measuring spin rate, the mathematical relationship between the frequency of the MR sensor output and the projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of the MR sensor output, whose frequency varies with time, this paper proposed a Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on a rolling Blackman window (BCZT), which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of a 155 mm artillery projectile. After extracting the spin rate, the impact that launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment. PMID:27322266
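The BCZT idea (slide a Blackman-tapered window along the signal and evaluate the spectrum only on a fine grid inside the band where the spin frequency lives) can be sketched as follows. For brevity the zoomed spectrum is evaluated as a direct in-band DTFT, which computes the same samples the chirp z-transform obtains efficiently via Bluestein's algorithm; all names and parameter values are illustrative.

```python
import numpy as np

def spin_rate_track(x, fs, win_len, hop, f_lo, f_hi, m=301):
    """Peak-frequency track of a rotation signal: Blackman-tapered sliding
    windows, each analyzed on a dense frequency grid inside [f_lo, f_hi]."""
    taper = np.blackman(win_len)
    freqs = np.linspace(f_lo, f_hi, m)
    # Direct zoomed DTFT; the CZT evaluates exactly these z-plane samples fast.
    kernel = np.exp(-2j * np.pi * np.outer(freqs, np.arange(win_len)) / fs)
    est = []
    for start in range(0, len(x) - win_len + 1, hop):
        spec = np.abs(kernel @ (x[start:start + win_len] * taper))
        est.append(freqs[np.argmax(spec)])
    return np.array(est)
```

Applied to a slow chirp mimicking a varying spin rate, each window's peak lands near the instantaneous frequency at the window center.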
Changes in dynamic resting state network connectivity following aphasia therapy.
Duncan, E Susan; Small, Steven L
2017-10-24
Resting state functional magnetic resonance imaging (rsfMRI) permits observation of intrinsic neural networks produced by task-independent correlations in low frequency brain activity. Various resting state networks have been described, with each thought to reflect common engagement in some shared function. There has been limited investigation of the plasticity in these network relationships after stroke or induced by therapy. Twelve individuals with language disorders after stroke (aphasia) were imaged at multiple time points before (baseline) and after an imitation-based aphasia therapy. Language assessment using a narrative production task was performed at the same time points. Group independent component analysis (ICA) was performed on the rsfMRI data to identify resting state networks. A sliding window approach was then applied to assess the dynamic nature of the correlations among these networks. Network correlations during each 30-second window were used to cluster the data into ten states for each window at each time point for each subject. Correlation was performed between changes in time spent in each state and therapeutic gains on the narrative task. The amount of time spent in a single one of the (ten overall) dynamic states was positively associated with behavioral improvement on the narrative task at the 6-week post-therapy maintenance interval, when compared with either baseline or assessment immediately following therapy. This particular state was characterized by minimal correlation among the task-independent resting state networks. Increased functional independence and segregation of resting state networks underlie improvement on a narrative production task following imitation-based aphasia treatment. This has important clinical implications for the targeting of noninvasive brain stimulation in post-stroke remediation.
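The core sliding-window step can be sketched in a few lines: window two network time courses, correlate within each window, and obtain a correlation time series whose values would then be clustered into states (the clustering itself is omitted here). The synthetic signals and the 60-sample window are illustrative stand-ins for ICA network time courses and the 30-second window.

```python
import numpy as np

def sliding_corr(a, b, win, hop):
    """Pearson correlation of two time series computed in sliding windows."""
    return np.array([np.corrcoef(a[s:s + win], b[s:s + win])[0, 1]
                     for s in range(0, len(a) - win + 1, hop)])
```

With two signals sharing a common drive only in their first half, the windowed correlations are high early and near zero late, which is exactly the kind of state fluctuation the clustering step summarizes.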
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of wavelets to capture trends in a data set over variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
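As an illustration of the principle (wavelet details measure fluctuations about a local trend, and their variance versus scale gives the scaling exponent), here is a minimal sketch using the Haar wavelet in place of the higher-order Daubechies wavelets the paper employs; for a random walk (Hurst exponent H = 0.5) the detail variance grows as 2^(j(2H+1)) = 4^j with level j.

```python
import numpy as np

def haar_detail_variances(x, levels):
    """Variance of Haar detail coefficients at dyadic scales 2^1 .. 2^levels.
    At each level the details capture fluctuations about the local trend held
    in the approximation; the slope of log2-variance vs. level estimates 2H + 1."""
    a = np.asarray(x, dtype=float)
    variances = []
    for _ in range(levels):
        a = a[:(len(a) // 2) * 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)   # detail (local fluctuation)
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)   # approximation (local trend)
        variances.append(np.mean(d ** 2))
    return np.array(variances)
```

For genuine multifractal analysis, higher-order Daubechies wavelets are preferred because they also remove polynomial trends that Haar cannot.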
Windows Into the Real World From a Virtual Globe
NASA Astrophysics Data System (ADS)
Rich, J.; Urban-Rich, J.
2007-12-01
Virtual globes such as Google Earth can be great tools for learning about the geographical variation of the earth. The key to virtual globes is the use of satellite imagery to provide a highly accurate view of the earth's surface. However, because the images are not updated regularly, variations in climate and vegetation over time cannot be easily seen. In order to enhance the view of the earth and observe these changes by region and over time, we are working to add near real time "windows" into the real world from a virtual globe. For the past 4 years we have been installing web cameras in areas of the world that will provide long term monitoring of global changes. By archiving hourly images from arctic, temperate and tropical regions we are creating a visual data set that is already beginning to tell the story of climate variability. The cameras are currently installed in 10 elementary schools in 3 countries and show the students' view out each window. The Windows Around the World program (http://www.WindowsAroundTheWorld.org) uses the images from these cameras to help students gain a better understanding of Earth processes and variability in climate and vegetation between different regions and over time. Previously we have used standard web based technologies such as DHTML and AJAX to provide near real-time access to these images and also provide enhanced functionality such as dynamic time lapse movies that allow users to see changes over months, days or hours up to the current hour (http://www.windowsaroundtheworld.org/north_america.aspx). We have integrated the camera images from Windows Around the World into Google Earth. Through network links and models we are creating a way for students to "fly" to another school in the program and see what the current view is out the window. By using a model as a screen, the image can be viewed from the same direction as the students who are sitting in a classroom at the participating school.
Once at the school, visiting students can move around the area in three dimensions and gain a better understanding of what they are seeing out the window. Currently, time-lapse images can be viewed at lower resolution for all schools on the globe, or, when flying into an individual school, at higher resolution. The observation of shadows, precipitation, movement of the sun and changes in vegetation allows the viewer to gain a better understanding of how the earth works and how the environment changes between regions and over time.
Zhong, Zhentao; Yu, Yue; Jin, Shufang; Pan, Jinming
2018-01-01
The hatch window, which varies from 24 to 48 h, is known to influence post-hatch performance of chicks. A narrow hatch window is needed by the commercial poultry industry to acquire a high level of uniformity of chick quality. Hatching synchronization observed in avian species presents possibilities for altering the hatch window in artificial incubation. Layer eggs laid on the same day by a single breeder flock and stored for no more than two days were set to start incubation 12 h apart to obtain a developmental distinction. The eggs of different initial incubation times were mixed, row adjacent to row, on day 12 of incubation. During the hatching period (from day 18), the hatching time of individual eggs and the hatch window were obtained by video recordings. Embryonic development (days 18 and 20) and post-hatch performance up to day 7 were measured. The manipulation of mixing eggs of different initial incubation times shortened the hatch window of the late incubated eggs in the manipulated group by delaying the onset of the hatching process, and improved the hatchability. Compared to the control groups, chick embryos or chicks in the egg redistribution group showed no significant difference in embryonic development and post-hatch performance up to day 7. We have demonstrated that eggs incubated with advanced eggs exhibited a narrower hatch spread with higher hatchability, normal embryonic development and unaffected chick quality. This specific manipulation is applicable in industrial poultry production to shorten the hatch window and improve the uniformity of chick quality.
Horesh, Yair; Wexler, Ydo; Lebenthal, Ilana; Ziv-Ukelson, Michal; Unger, Ron
2009-03-04
Scanning large genomes with a sliding window in search of locally stable RNA structures is a well motivated problem in bioinformatics. Given a predefined window size L and an RNA sequence S of size N (L < N), the consecutive windows folding problem is to compute the minimal free energy (MFE) for the folding of each of the L-sized substrings of S. The consecutive windows folding problem can be naively solved in O(NL^3) by applying any of the classical cubic-time RNA folding algorithms to each of the N-L windows of size L. Recently an O(NL^2) solution for this problem has been described. Here, we describe and implement an O(NL·ψ(L)) engine for the consecutive windows folding problem, where ψ(L) is shown to converge to O(1) under the assumption of a standard probabilistic polymer folding model, yielding an O(L) speedup which is experimentally confirmed. Using this tool, we note an intriguing directionality (5'-3' vs. 3'-5') folding bias, i.e. that the minimal free energy (MFE) of folding is higher in the native direction of the DNA than in the reverse direction for various genomic regions in several organisms, including regions of the genomes that do not encode proteins or ncRNA. This bias largely emerges from the genomic dinucleotide bias, which affects the MFE; however, we see some variations in the folding bias in the different genomic regions when normalized to the dinucleotide bias. We also present results from calculating the MFE landscape of mouse chromosome 1, characterizing the MFE of the long ncRNA molecules that reside in this chromosome. The efficient consecutive windows folding engine described in this paper allows for genome wide scans for ncRNA molecules as well as large-scale statistics. This is implemented here as a software tool, called RNAslider, and applied to the scanning of long chromosomes, leading to the observation of features that are visible only on a large scale.
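To make the problem concrete, the sketch below runs the naive O(N·L^3) consecutive-windows scan, with Nussinov base-pair maximization standing in for an energy-model MFE folder (a common textbook simplification; real scanners, including the paper's engine, use thermodynamic energy rules and share work between overlapping windows). Names are illustrative and the minimum hairpin-loop constraint is omitted for brevity.

```python
PAIRS = {('A', 'U'), ('U', 'A'), ('G', 'C'), ('C', 'G'), ('G', 'U'), ('U', 'G')}

def nussinov_pairs(s):
    """Maximum number of nested base pairs (Nussinov dynamic program) --
    a cubic-time stand-in for a minimum-free-energy folder."""
    n = len(s)
    dp = [[0] * n for _ in range(n)]
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                      # j left unpaired
            for k in range(i, j):                    # j pairs with k
                if (s[k], s[j]) in PAIRS:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + dp[k + 1][j - 1] + 1)
            dp[i][j] = best
    return dp[0][n - 1]

def consecutive_windows(seq, L):
    """Naive consecutive-windows scan: fold every L-mer independently,
    O(N * L^3) overall; the paper's engine reaches O(N * L * psi(L))."""
    return [nussinov_pairs(seq[i:i + L]) for i in range(len(seq) - L + 1)]
```

Each entry of the returned list plays the role the per-window MFE plays in the genome-wide landscape described above.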
Creating a Parallel Version of VisIt for Microsoft Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlock, B J; Biagas, K S; Rawson, P L
2011-12-07
VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers from modest desktops up to massively parallel clusters. VisIt is composed of a set of cooperating programs. All programs can be run locally or in client/server mode in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPU's has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release.
We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
Air transparent soundproof window
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Sang-Hoon, E-mail: shkim@mmu.ac.kr; Lee, Seong-Hyun
2014-11-15
A soundproof window or wall which is transparent to airflow is presented. The design is based on two wave theories: the theory of diffraction and the theory of acoustic metamaterials. It consists of a three-dimensional array of strong diffraction-type resonators with many holes centered on each individual resonator. The negative effective bulk modulus of the resonators produces evanescent waves, while air holes of subwavelength diameter on the surfaces of the window allow macroscopic air ventilation. The acoustic performance of two soundproof windows with air holes of 20 mm and 50 mm diameter was measured. The sound level was reduced by about 30-35 dB in the frequency range of 400-5,000 Hz with the 20 mm window, and by about 20-35 dB in the frequency range of 700-2,200 Hz with the 50 mm window. Multiple stop-bands were created by the multiple layers of the window. The attenuation length, or the thickness of the window, was limited by background noise. The effectiveness of the soundproof window with airflow was demonstrated by a real installation.
High-temperature, high-pressure optical port for rocket engine applications
NASA Technical Reports Server (NTRS)
Delcher, Ray; Nemeth, ED; Powers, W. T.
1993-01-01
This paper discusses the design, fabrication, and test of a window assembly for instrumentation of liquid-fueled rocket engine hot gas systems. The window was designed to allow optical measurements of hot gas in the SSME fuel preburner and appears to be the first window designed for application in a rocket engine hot gas system. Such a window could allow the use of a number of remote optical measurement technologies, including: Raman temperature and species concentration measurement, Rayleigh temperature measurement, flame emission monitoring, flow mapping, laser-induced fluorescence, and hardware imaging during engine operation. The window assembly has been successfully tested to 8,000 psi at 1000 F and over 11,000 psi at room temperature. A computer stress analysis shows the window will withstand high temperature and cryogenic thermal shock.
Gross, Arnd; Ziepert, Marita; Scholz, Markus
2012-01-01
Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912
Alternative Fuels Data Center: Schwan's Home Service Delivers With
distribute products across the United States. For information about this project, contact Twin Cities Clean Cities Coalition.
Displaying Special Characters and Symbols in Computer-Controlled Reaction Time Experiments.
ERIC Educational Resources Information Center
Friel, Brian M.; Kennison, Shelia M.
A procedure for using MEL2 (Version 2.0 of Microcomputer Experimental Laboratory) and FontWINDOW to present special characters and symbols in computer-controlled reaction time experiments is described. The procedure permits more convenience and flexibility than tachistoscopic and projection techniques. FontWINDOW allows researchers to design…
Attosecond light sources in the water window
NASA Astrophysics Data System (ADS)
Ren, Xiaoming; Li, Jie; Yin, Yanchun; Zhao, Kun; Chew, Andrew; Wang, Yang; Hu, Shuyuan; Cheng, Yan; Cunningham, Eric; Wu, Yi; Chini, Michael; Chang, Zenghu
2018-02-01
As a compact and burgeoning alternative to synchrotron radiation and free-electron lasers, high harmonic generation (HHG) has proven its superiority in static and time-resolved extreme ultraviolet spectroscopy over the past two decades and has recently attracted much interest and achieved success in generating soft x-ray emission covering the biologically important water window spectral region. Unlike synchrotron and free-electron sources, which suffer from relatively long pulse width or large timing jitter, soft x-ray sources from HHG offer attosecond time resolution and can be synchronized with their driving field to investigate time-resolved near edge absorption spectroscopy, which can reveal rich structural and dynamical information about the interrogated samples. In this paper, we review recent progress in generating and characterizing attosecond light sources in the water window region. We show our development of an energetic, two-cycle, carrier-envelope phase stable laser source at 1.7 μm and our achievement in producing a 53 as soft x-ray pulse covering the carbon K-edge in the water window. Such a source paves the way for next-generation x-ray spectroscopy with unprecedented temporal resolution.
NASA Technical Reports Server (NTRS)
2002-01-01
Dimension Technologies Inc. developed a line of 2-D/3-D Liquid Crystal Display (LCD) screens, including a 15-inch model priced at consumer levels. DTI's family of flat panel LCD displays, called the Virtual Window(TM), provides real-time 3-D images without the use of glasses, head trackers, helmets, or other viewing aids. Most of the company's initial 3-D display research was funded through NASA's Small Business Innovation Research (SBIR) program. The images on DTI's displays appear to leap off the screen and hang in space. The display accepts input from computers or stereo video sources, and can be switched from 3-D to full-resolution 2-D viewing with the push of a button. The Virtual Window displays have applications in data visualization, medicine, architecture, business, real estate, entertainment, and other research, design, military, and consumer applications. Displays are currently used for computer games, protein analysis, and surgical imaging. The technology greatly benefits the medical field, as surgical simulators are helping to increase the skills of surgical residents. Virtual Window(TM) is a trademark of Dimension Technologies Inc.
Image-guided adaptive gating of lung cancer radiotherapy: a computer simulation study
NASA Astrophysics Data System (ADS)
Aristophanous, Michalis; Rottmann, Joerg; Park, Sang-June; Nishioka, Seiko; Shirato, Hiroki; Berbeco, Ross I.
2010-08-01
The purpose of this study is to investigate the effect that image-guided adaptation of the gating window during treatment could have on the residual tumor motion, by simulating different gated radiotherapy techniques. There are three separate components of this simulation: (1) the 'Hokkaido Data', which are previously measured 3D data of lung tumor motion tracks and the corresponding 1D respiratory signals obtained during the entire ungated radiotherapy treatments of eight patients, (2) the respiratory gating protocol at our institution and the imaging performed under that protocol and (3) the actual simulation in which the Hokkaido Data are used to select tumor position information that could have been collected based on the imaging performed under our gating protocol. We simulated treatments with a fixed gating window and a gating window that is updated during treatment. The patient data were divided into different fractions, each with continuous acquisitions longer than 2 min. In accordance with the imaging performed under our gating protocol, we assume that we have tumor position information for the first 15 s of treatment, obtained from kV fluoroscopy, and for the remainder of each fraction the tumor position is only available during beam-on time from MV imaging. The gating window was set according to the information obtained from the first 15 s such that the residual motion was less than 3 mm. For the fixed gating window technique the gate remained the same for the entire treatment, while for the adaptive technique the range of the tumor motion during beam-on time was measured and used to adapt the gating window to keep the residual motion below 3 mm. The algorithm used to adapt the gating window is described. The residual tumor motion inside the gating window was reduced on average by 24% for the patients with regular breathing patterns and the difference was statistically significant (p-value = 0.01).
The magnitude of the residual tumor motion depended on the regularity of the breathing pattern suggesting that image-guided adaptive gating should be combined with breath coaching. The adaptive gating window technique was able to track the exhale position of the breathing cycle quite successfully. Out of a total of 53 fractions the duty cycle was greater than 20% for 42 fractions for the fixed gating window technique and for 39 fractions for the adaptive gating window technique. The results of this study suggest that real-time updating of the gating window can result in reliably low residual tumor motion and therefore can facilitate safe margin reduction.
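The abstract does not spell out its adaptation algorithm, so the following is only a toy illustration of the idea: re-center a fixed-width gate (the 3 mm residual-motion budget) on the recent exhale baseline estimated from positions observed during beam-on. All parameter values and names are invented for the illustration.

```python
import numpy as np

def adapt_gate(positions, init_gate, width=3.0, batch=20):
    """Toy adaptive gating: whenever `batch` in-gate (beam-on) samples have
    been observed, re-center the fixed-width gate on their median, so the
    gate follows a drifting exhale baseline instead of staying fixed."""
    lo, hi = init_gate
    gates, recent = [], []
    for p in positions:
        gates.append((lo, hi))
        if lo <= p <= hi:                 # beam on: position visible via MV imaging
            recent.append(p)
            if len(recent) >= batch:
                centre = float(np.median(recent))
                lo, hi = centre - width / 2, centre + width / 2
                recent = []
    return gates
```

On a breathing trace whose baseline drifts upward, the final gate sits near the drifted exhale level, whereas the fixed initial gate would have been left behind.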
Multi-Window Controllers for Autonomous Space Systems
NASA Technical Reports Server (NTRS)
Lurie, B, J.; Hadaegh, F. Y.
1997-01-01
Multi-window controllers select between elementary linear controllers using nonlinear windows based on the amplitude and frequency content of the feedback error. The controllers are relatively simple to implement and perform much better than linear controllers. The commanders for such controllers only order the destination point and are freed from generating the command time-profiles. Robotic missions rely heavily on the tasks of acquisition and tracking. For autonomous and optimal control of the spacecraft, the control bandwidth must be larger while the feedback can (and, therefore, must) be reduced. Combining linear compensators via a multi-window nonlinear summer guarantees the minimum phase character of the combined transfer function. It is shown that the solution may require using several parallel branches and windows. Several examples of multi-window nonlinear controller applications are presented.
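The amplitude-window selection idea can be sketched with two elementary PI controllers on a toy first-order plant: an aggressive gain set while the feedback error is large, and a gentle, well-damped set once the error enters the small-amplitude window. Frequency-content windowing is omitted for brevity, and all gains are invented for the illustration.

```python
def run_multi_window(target, steps=5000, dt=0.01):
    """Toy multi-window control of the plant x' = -x + u: the error amplitude
    selects which elementary PI gain set is active."""
    x, integ = 0.0, 0.0
    for _ in range(steps):
        e = target - x
        # amplitude windows: aggressive gains far from target, gentle near it
        Kp, Ki = (8.0, 5.0) if abs(e) >= 0.5 else (2.0, 1.0)
        integ += e * dt
        u = Kp * e + Ki * integ
        x += dt * (-x + u)      # forward-Euler step of the plant
    return x
```

Note the commander only supplies the destination point; no command time-profile is generated, matching the description above.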
Speed-of-light limitations in passive linear media
NASA Astrophysics Data System (ADS)
Welters, Aaron; Avniel, Yehuda; Johnson, Steven G.
2014-08-01
We prove that well-known speed-of-light restrictions on electromagnetic energy velocity can be extended to a new level of generality, encompassing even nonlocal chiral media in periodic geometries, while at the same time weakening the underlying assumptions to only passivity and linearity of the medium (either with a transparency window or with dissipation). As was also shown by other authors under more limiting assumptions, passivity alone is sufficient to guarantee causality and positivity of the energy density (with no thermodynamic assumptions). Our proof is general enough to include a very broad range of material properties, including anisotropy, bianisotropy (chirality), nonlocality, dispersion, periodicity, and even delta functions or similar generalized functions. We also show that the "dynamical energy density" used by some previous authors in dissipative media reduces to the standard Brillouin formula for dispersive energy density in a transparency window. The results in this paper are proved by exploiting deep results from linear-response theory, harmonic analysis, and functional analysis that had previously not been brought together in the context of electrodynamics.
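For reference, the standard Brillouin formula mentioned above — the time-averaged energy density in a lossless transparency window of a dispersive isotropic medium, written here for complex field amplitudes (the bianisotropic case discussed in the paper replaces the scalar derivatives by matrix derivatives of ωε and ωμ) — reads:

```latex
\langle U \rangle \;=\; \frac{1}{4}\left[
  \frac{\partial\bigl(\omega\,\varepsilon(\omega)\bigr)}{\partial\omega}\,
  \lvert \mathbf{E} \rvert^{2}
  \;+\;
  \frac{\partial\bigl(\omega\,\mu(\omega)\bigr)}{\partial\omega}\,
  \lvert \mathbf{H} \rvert^{2}
\right]
```

Positivity of the frequency derivatives in the transparency window is what makes this a legitimate (nonnegative) energy density.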
James, S. R.; Knox, H. A.; Abbott, R. E.; ...
2017-04-13
Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. Lastly, this improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
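The accumulation step described above — compounding small day-to-day dv/v measurements into the full seasonal change — can be sketched as follows. This is an illustration of the accumulation idea only, not the authors' MWCS code; the function name and input convention are assumptions.

```python
import numpy as np

def accumulate_dvv(daily_dvv):
    """Accumulate day-to-day relative velocity changes into a seasonal curve.

    daily_dvv[i] is the small dv/v measured between day i and day i+1
    with a moving reference. Successive relative changes compound
    multiplicatively, so the cumulative change is a running product of
    the daily ratios minus one.
    """
    ratios = 1.0 + np.asarray(daily_dvv, dtype=float)
    return np.cumprod(ratios) - 1.0
```

Two consecutive +10% daily changes accumulate to +21%, not +20%, which is why the moving-reference measurements cannot simply be summed.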
Thermal/structural/optical integrated design for optical sensor mounted on unmanned aerial vehicle
NASA Astrophysics Data System (ADS)
Zhang, Gaopeng; Yang, Hongtao; Mei, Chao; Wu, Dengshan; Shi, Kui
2016-01-01
With the rapid development of science and technology and the proliferation of local wars around the world, high-altitude optical sensors mounted on unmanned aerial vehicles are ever more widely applied in airborne remote sensing, measurement and detection. In order to obtain high-quality images from an aerial optical remote sensor, it is important to analyze its thermo-optical performance under high-speed, high-altitude conditions. Especially for key imaging assemblies such as the optical window, temperature variation and temperature gradients can cause defocus and aberrations in the optical system, which degrade image quality. In order to improve the optical performance of the optical window of a high-speed aerial camera, a thermal/structural/optical integrated design method is developed. Firstly, the flight environment of the optical window is analyzed. Based on the theory of aerodynamics and heat transfer, the convection heat-transfer coefficient is calculated. The temperature distribution of the optical window is simulated with finite element analysis software, and the maximum temperature difference between the inside and outside of the optical window is obtained. The deformation of the optical window under this maximum temperature difference is then calculated. The optical window surface deformation is fitted with Zernike polynomials as the interface, and the fitted Zernike coefficients are imported into and analyzed with the CODE V optical software. Finally, the transfer-function diagrams of the optical system under the temperature field are analyzed comparatively. The results show that the optical path difference caused by thermal deformation of the optical window is 138.2 nm, which satisfies the PV ≤ λ/4 criterion. This study can serve as an important reference for other optical window designs.
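The Zernike-fitting interface step can be sketched as an ordinary least-squares fit of low-order terms to the sampled surface deformation. This is a minimal illustration, not the paper's procedure: only piston, tilt and defocus terms are fitted here, and the function name is hypothetical.

```python
import numpy as np

def fit_zernike_low_order(x, y, dz):
    """Least-squares fit of a deformed surface with low-order Zernike terms.

    Fits piston, tilt-x, tilt-y and defocus (2r^2 - 1 on the unit disk)
    to sampled surface deformations dz at normalized coordinates (x, y).
    Returns the four fitted coefficients.
    """
    r2 = x**2 + y**2
    # Design matrix columns: piston, tilt-x, tilt-y, defocus
    A = np.column_stack([np.ones_like(x), x, y, 2 * r2 - 1])
    coeffs, *_ = np.linalg.lstsq(A, dz, rcond=None)
    return coeffs
```

In practice many more terms are used, and the coefficients are handed to the optical-design software to evaluate the transfer function.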
NASA Astrophysics Data System (ADS)
Leavey, Anna; Reed, Nathan; Patel, Sameer; Bradley, Kevin; Kulkarni, Pramod; Biswas, Pratim
2017-10-01
Advanced automobile technology, developed infrastructure, and changing economic markets have resulted in increasing commute times. Traffic is a major source of harmful pollutants and consequently daily peak exposures tend to occur near roadways or while travelling on them. The objective of this study was to measure simultaneous real-time particulate matter (particle numbers, lung-deposited surface area, PM2.5, particle number size distributions) and CO concentrations outside and in-cabin of an on-road car during regular commutes to and from work. Data were collected for different ventilation parameters (windows open or closed, fan on, AC on), whilst travelling along different road-types with varying traffic densities. Multiple predictor variables were examined using linear mixed-effects models. Ambient pollutants (NOx, PM2.5, CO) and meteorological variables (wind speed, temperature, relative humidity, dew point) explained 5-44% of outdoor pollutant variability, while the time spent travelling behind a bus was statistically significant for PM2.5, lung-deposited SA, and CO (adj-R2 values = 0.12, 0.10, 0.13). The geometric mean diameter (GMD) for outdoor aerosol was 34 nm. Larger cabin GMDs were observed when windows were closed compared to open (b = 4.3, p < 0.01). When windows were open, cabin total aerosol concentrations tracked those outdoors. With windows closed, the pollutants took longer to enter the vehicle cabin, but also longer to exit it. Concentrations of pollutants in the cabin were influenced by outdoor concentrations, ambient temperature, and the window/ventilation parameters. As expected, particle number concentrations were impacted the most by changes to window position/ventilation, and PM2.5 the least. Car drivers can expect their highest exposures when driving with windows open or the fan on, and their lowest exposures during windows closed or the AC on.
Final linear mixed-effects models could explain between 88 and 97% of cabin pollutant concentration variability. An individual may control their commuting exposure by applying dynamic behavior modification to adapt to changing pollutant scenarios.
Chauhan, Preeti; Cerdá, Magdalena; Messner, Steven F.; Tracy, Melissa; Tardiff, Kenneth; Galea, Sandro
2012-01-01
The current study evaluated a range of social influences including misdemeanor arrests, drug arrests, cocaine consumption, alcohol consumption, firearm availability, and incarceration that may be associated with changes in gun-related homicides by racial/ethnic group in New York City (NYC) from 1990 to 1999. Using police precincts as the unit of analysis, we used cross-sectional, time series data to examine changes in Black, White, and Hispanic homicides, separately. Bayesian hierarchical models with a spatial error term indicated that an increase in cocaine consumption was associated with an increase in Black homicides. An increase in firearm availability was associated with an increase in Hispanic homicides. Last, there were no significant predictors for White homicides. Support was found for the crack cocaine hypotheses but not for the broken windows hypothesis. Examining racially/ethnically disaggregated data can shed light on group-sensitive mechanisms that may explain changes in homicide over time. PMID:22328820
Directional time-distance probing of model sunspot atmospheres
NASA Astrophysics Data System (ADS)
Moradi, H.; Cally, P. S.; Przybylski, D.; Shelyag, S.
2015-05-01
A crucial feature not widely accounted for in local helioseismology is that surface magnetic regions actually open a window from the interior into the solar atmosphere, and that the seismic waves leak through this window, reflect high in the atmosphere, and then re-enter the interior to rejoin the seismic wave field normally confined there. In a series of recent numerical studies using translation invariant atmospheres, we utilized a `directional time-distance helioseismology' measurement scheme to study the implications of the returning fast and Alfvén waves higher up in the solar atmosphere on the seismology at the photosphere (Cally & Moradi 2013; Moradi & Cally 2014). In this study, we extend our directional time-distance analysis to more realistic sunspot-like atmospheres to better understand the direct effects of the magnetic field on helioseismic travel-time measurements in sunspots. In line with our previous findings, we uncover a distinct frequency-dependent directional behaviour in the travel-time measurements, consistent with the signatures of magnetohydrodynamic mode conversion. We found this to be the case regardless of the sunspot field strength or depth of its Wilson depression. We also isolated and analysed the direct contribution from purely thermal perturbations to the measured travel times, finding that waves propagating in the umbra are much more sensitive to the underlying thermal effects of the sunspot.
Carreiro, André V; Amaral, Pedro M T; Pinto, Susana; Tomás, Pedro; de Carvalho, Mamede; Madeira, Sara C
2015-12-01
Amyotrophic Lateral Sclerosis (ALS) is a devastating disease and the most common neurodegenerative disorder of young adults. ALS patients present a rapidly progressive motor weakness. This usually leads to death within a few years from respiratory failure. The correct prediction of respiratory insufficiency is thus key for patient management. In this context, we propose an innovative approach for prognostic prediction based on patient snapshots and time windows. We first cluster temporally-related tests to obtain snapshots of the patient's condition at a given time (patient snapshots). Then we use the snapshots to predict the probability of an ALS patient to require assisted ventilation after k days from the time of clinical evaluation (time window). This probability is based on the patient's current condition, evaluated using clinical features, including functional impairment assessments and a complete set of respiratory tests. The prognostic models include three temporal windows, allowing short-, medium- and long-term prognosis regarding progression to assisted ventilation. Experimental results show an area under the receiver operating characteristics curve (AUC) in the test set of approximately 79% for time windows of 90, 180 and 365 days. Creating patient snapshots using hierarchical clustering with constraints outperforms the state of the art, and the proposed prognostic model becomes the first non-population-based approach for prognostic prediction in ALS. The results are promising and should enhance the current clinical practice, largely supported by non-standardized tests and clinicians' experience. Copyright © 2015 Elsevier Inc. All rights reserved.
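The time-window labeling described above — does a snapshot progress to assisted ventilation within k days? — can be sketched as below. This is an illustrative labeling scheme, not the paper's pipeline; the argument names and the censoring convention are assumptions.

```python
def label_time_window(eval_day, ventilation_day, followup_end, k):
    """Label a patient snapshot for a k-day prognostic time window.

    Returns True if assisted ventilation was required within k days of the
    clinical evaluation, False if the patient is known not to have needed
    it within the window, and None if follow-up ended before the window
    closed (censored, so the label is undecidable).
    """
    if ventilation_day is not None and 0 <= ventilation_day - eval_day <= k:
        return True
    if followup_end - eval_day >= k:
        return False
    return None  # censored: follow-up shorter than the window
```

Snapshots labeled this way for k = 90, 180 and 365 would yield the three short-, medium- and long-term prediction tasks.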
NASA Astrophysics Data System (ADS)
Hong, Guosong; Zou, Yingping; Antaris, Alexander L.; Diao, Shuo; Wu, Di; Cheng, Kai; Zhang, Xiaodong; Chen, Changxin; Liu, Bo; He, Yuehui; Wu, Justin Z.; Yuan, Jun; Zhang, Bo; Tao, Zhimin; Fukunaga, Chihiro; Dai, Hongjie
2014-06-01
In vivo fluorescence imaging in the second near-infrared window (1.0-1.7 μm) can afford deep tissue penetration and high spatial resolution, owing to the reduced scattering of long-wavelength photons. Here we synthesize a series of low-bandgap donor/acceptor copolymers with tunable emission wavelengths of 1,050-1,350 nm in this window. Non-covalent functionalization with phospholipid-polyethylene glycol results in water-soluble and biocompatible polymeric nanoparticles, allowing for live cell molecular imaging at >1,000 nm with polymer fluorophores for the first time. Importantly, the high quantum yield of the polymer allows for in vivo, deep-tissue and ultrafast imaging of mouse arterial blood flow with an unprecedented frame rate of >25 frames per second. The high time-resolution results in spatially and time resolved imaging of the blood flow pattern in cardiogram waveform over a single cardiac cycle (~200 ms) of a mouse, which has not been observed with fluorescence imaging in this window before.
Low-complexity image processing for real-time detection of neonatal clonic seizures.
Ntonfo, Guy Mathurin Kouamou; Ferrari, Gianluigi; Raheli, Riccardo; Pisani, Francesco
2012-05-01
In this paper, we consider a novel low-complexity real-time image-processing-based approach to the detection of neonatal clonic seizures. Our approach is based on the extraction, from a video of a newborn, of an average luminance signal representative of the body movements. Since clonic seizures are characterized by periodic movements of parts of the body (e.g., the limbs), by evaluating the periodicity of the extracted average luminance signal it is possible to detect the presence of a clonic seizure. The periodicity is investigated, through a hybrid autocorrelation-Yin estimation technique, on a per-window basis, where a time window is defined as a sequence of consecutive video frames. While processing is first carried out on a single window basis, we extend our approach to interlaced windows. The performance of the proposed detection algorithm is investigated, in terms of sensitivity and specificity, through receiver operating characteristic curves, considering video recordings of newborns affected by neonatal seizures.
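The per-window periodicity check can be sketched with a plain autocorrelation (the paper uses a hybrid autocorrelation-Yin estimator, which this simplifies). The frequency band and function name are illustrative assumptions.

```python
import numpy as np

def window_periodicity(signal, fps, min_hz=0.5, max_hz=5.0):
    """Estimate the dominant movement frequency in one video window.

    Computes the autocorrelation of the average-luminance signal and
    returns (frequency, peak strength) for the highest autocorrelation
    peak in a plausible clonic-movement band. A strong peak suggests
    periodic limb movement within the window.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]
    ac = ac / ac[0]                       # normalize so ac[0] == 1
    lo, hi = int(fps / max_hz), int(fps / min_hz)
    lag = lo + np.argmax(ac[lo:hi])       # best lag in the search band
    return fps / lag, ac[lag]
```

For a 3-second window at 30 frames/s containing a 2 Hz oscillation, the estimator recovers 2 Hz with a strong peak.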
Smart glass as the method of improving the energy efficiency of high-rise buildings
NASA Astrophysics Data System (ADS)
Gamayunova, Olga; Gumerova, Eliza; Miloradova, Nadezda
2018-03-01
A key question in high-rise buildings is the choice of glazing and its service-life conditions. The contemporary market offers several types of window units, for instance wooden, aluminum, PVC and combined models. Wooden and PVC windows have become the most widespread and compete closely with each other. More recently, design engineers have turned to smart glass. In this article, the advantages and drawbacks of all types of windows are reviewed, and recommendations are given on the choice of window type to improve the energy efficiency of buildings.
Rendon, Alexis; Livingston, Melvin; Suzuki, Sumihiro; Hill, Whitney; Walters, Scott
2017-07-01
Self-reported substance use is commonly used as an outcome measure in treatment research. We evaluated the validity of self-reported drug use in a sample of 334 adults with mental health problems who were residing in supportive housing programs. The primary analysis was the calculation of the positive predictive values (PPVs) of self-report compared to an oral fluid test taken at the same time. A sensitivity analysis compared the positive predictive values of two self-reported drug use histories: biological testing window (ranging between the past 96 h to 30 days depending on drug type) or the full past 90-day comparison window (maximum length recorded during interview). A multivariable logistic regression was used to predict discordance between self-report and the drug test for users. Self-reported drug use and oral fluid drug tests were compared to determine the positive predictive value for amphetamines/methamphetamines/PCP (47.1% agreement), cocaine (43.8% agreement), and marijuana (69.7% agreement) drug tests. Participants who misreported their drug use were more likely to be older, non-White, have no medical insurance, and not report any alcohol use. In general, amphetamine/methamphetamine/PCP and cocaine use was adequately captured by the biological test, while marijuana use was best captured by a combination of self-report and biological data. Using the full past 90-day comparison window resulted in higher concordance with the oral fluid drug test, indicating that self-reported drug use in the past 90 days may be a proxy for drug use within the biological testing window. Self-report has some disadvantages when used as the sole measure of drug use in this population. Copyright © 2017 Elsevier Ltd. All rights reserved.
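The PPV computation at the heart of this study can be sketched as below. An assumption to flag: PPV is taken here as the fraction of self-reported positives confirmed by the oral fluid test, which is the standard definition when self-report is the classifier under evaluation; the function name and boolean encoding are illustrative.

```python
def positive_predictive_value(self_report, test_result):
    """PPV of self-reported use against a same-day oral fluid test.

    Both arguments are parallel lists of booleans (one entry per
    participant). PPV = confirmed self-reports / all self-reports.
    """
    pairs = list(zip(self_report, test_result))
    confirmed = sum(1 for s, t in pairs if s and t)
    reported = sum(1 for s, _ in pairs if s)
    return confirmed / reported if reported else float('nan')
```

With three self-reported users of whom two test positive, the PPV is 2/3.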
Frequency spectral analysis of GPR data over a crude oil spill
Burton, B.L.; Olhoeft, G.R.; Powers, M.H.; ,
2004-01-01
A multi-offset ground penetrating radar (GPR) dataset was acquired by the U.S. Geological Survey (USGS) at a crude oil spill site near Bemidji, Minnesota, USA. The dataset consists of two parallel profiles, each with 17 transmitter-receiver offsets ranging from 0.60 to 5.15 m. One profile was acquired over a known oil pool floating on the water table, and the other profile was acquired over an uncontaminated area. The data appear to be more attenuated, or at least exhibit less reflectivity, in the area over the oil pool. In an attempt to determine the frequency dependence of this apparent attenuation, several attributes of the frequency spectra of the data were analyzed after accounting for the effects on amplitude of the radar system (radiation pattern), changes in antenna-ground coupling, and spherical divergence. The attributes analyzed were amplitude-spectrum peak frequency, 6 dB down (half-amplitude) spectrum width, and the low and high frequency slopes between the 3 and 9 dB down points. The most consistent trend was observed for Fourier-transformed full traces at offsets 0.81, 1.01, and 1.21 m, which displayed steeper low frequency slopes over the area corresponding to the oil pool. The Fourier-transformed time-windowed traces, where each window was equal to twice the airwave wavelet length, exhibited weakly consistent attribute trends from offset to offset and from window to window. The fact that strong, consistent oil indicators are not seen in this analysis indicates that another mechanism due to the presence of the oil, such as a gradient in the electromagnetic properties, may simply suppress reflections over the contaminated zone.
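Two of the spectral attributes named above — peak frequency and 6 dB down (half-amplitude) width — can be sketched directly from a trace's amplitude spectrum. A minimal illustration, not the USGS processing chain; it ignores the radiation-pattern and coupling corrections described in the abstract.

```python
import numpy as np

def spectral_attributes(trace, dt):
    """Peak frequency and half-amplitude (6 dB down) width of one trace.

    Returns (peak_freq, width) from the one-sided amplitude spectrum:
    the frequency of the largest amplitude, and the span of frequencies
    whose amplitude is at least half that peak.
    """
    amp = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), dt)
    k = np.argmax(amp)
    half = amp[k] / 2.0                    # 6 dB down = half amplitude
    above = np.where(amp >= half)[0]
    return freqs[k], freqs[above[-1]] - freqs[above[0]]
```

The low/high-frequency slope attributes would be fitted similarly between the 3 and 9 dB down points on either side of the peak.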
14 CFR 417.229 - Far-field overpressure blast effects analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... characteristics; (2) The potential for broken windows due to peak incident overpressures below 1.0 psi and related... the potentially affected windows, including their size, location, orientation, glazing material, and...
Temporal Integration Windows in Neural Processing and Perception Aligned to Saccadic Eye Movements.
Wutz, Andreas; Muschter, Evelyn; van Koningsbruggen, Martijn G; Weisz, Nathan; Melcher, David
2016-07-11
When processing dynamic input, the brain balances the opposing needs of temporal integration and sensitivity to change. We hypothesized that the visual system might resolve this challenge by aligning integration windows to the onset of newly arriving sensory samples. In a series of experiments, human participants observed the same sequence of two displays separated by a brief blank delay when performing either an integration or segregation task. First, using magneto-encephalography (MEG), we found a shift in the stimulus-evoked time courses by a 150-ms time window between task signals. After stimulus onset, multivariate pattern analysis (MVPA) decoding of task in occipital-parietal sources remained above chance for almost 1 s, and the task-decoding pattern interacted with task outcome. In the pre-stimulus period, the oscillatory phase in the theta frequency band was informative about both task processing and behavioral outcome for each task separately, suggesting that the post-stimulus effects were caused by a theta-band phase shift. Second, when aligning stimulus presentation to the onset of eye fixations, there was a similar phase shift in behavioral performance according to task demands. In both MEG and behavioral measures, task processing was optimal first for segregation and then integration, with opposite phase in the theta frequency range (3-5 Hz). The best fit to neurophysiological and behavioral data was given by a dampened 3-Hz oscillation from stimulus or eye fixation onset. The alignment of temporal integration windows to input changes found here may serve to actively organize the temporal processing of continuous sensory input. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
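The "dampened 3-Hz oscillation" model that best fit the data can be sketched as a decaying cosine riding on a baseline. This is the generic model form only; the parameter names and defaults here are illustrative assumptions, not the paper's fitted values.

```python
import math

def damped_oscillation(t, amp, decay, freq=3.0, phase=0.0, base=0.5):
    """Dampened oscillation model for task performance after an onset.

    Accuracy oscillates around `base` at `freq` Hz, with the oscillation
    amplitude shrinking exponentially at rate `decay` (1/s) after the
    stimulus or fixation onset at t = 0.
    """
    return base + amp * math.exp(-decay * t) * math.cos(2 * math.pi * freq * t + phase)
```

The integration and segregation tasks would be fitted with opposite phase, so one task peaks where the other troughs within each ~333 ms cycle.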
Roemmelt, Andreas T; Steuer, Andrea E; Poetzsch, Michael; Kraemer, Thomas
2014-12-02
Forensic and clinical toxicological screening procedures are employing liquid chromatography-tandem mass spectrometry (LC-MS/MS) techniques with information-dependent acquisition (IDA) approaches more and more often. It is known that the complexity of a sample and the IDA settings might prevent important compounds from being triggered. Therefore, data-independent acquisition (DIA) methods should be more suitable for systematic toxicological analysis (STA). The DIA method sequential window acquisition of all theoretical fragment-ion spectra (SWATH), which uses Q1 windows of 20-35 Da for data-independent fragmentation, was systematically investigated for its suitability for STA. Quality of SWATH-generated mass spectra was evaluated with regard to mass error, relative abundance of the fragments, and library hits. With the Q1 window set to 20-25 Da, several precursors pass Q1 at the same time and are fragmented, thus impairing the library search algorithms to different extents: forward fit was less affected than reverse fit and purity fit. Mass error was not affected. The relative abundance of the fragments was concentration dependent for some analytes and was influenced by cofragmentation, especially of deuterated analogues. Also, the detection rate of IDA compared to SWATH was investigated in a forced coelution experiment (up to 20 analytes coeluting). Even using several different IDA settings, it was observed that IDA failed to trigger relevant compounds. Screening results of 382 authentic forensic cases revealed that SWATH's detection rate was superior to IDA, which failed to trigger ∼10% of the analytes.
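The sequential Q1 window scheme that gives SWATH its name can be sketched as tiling a precursor m/z range with fixed-width, slightly overlapping isolation windows. The 1 Da overlap below is an assumption for illustration; the abstract only specifies 20-35 Da window widths.

```python
def swath_windows(start, stop, width, overlap=1.0):
    """Q1 isolation windows for a SWATH-style DIA cycle.

    Tiles [start, stop] m/z with windows of the given width; consecutive
    windows overlap by `overlap` Da so precursors at window edges are
    not missed. Returns a list of (low, high) m/z tuples.
    """
    windows = []
    lo = start
    while lo < stop:
        windows.append((lo, min(lo + width, stop)))
        lo += width - overlap
    return windows
```

Every precursor in the range is fragmented once per cycle regardless of intensity, which is why DIA cannot "fail to trigger" a compound the way IDA can.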
Time resolved analysis of quetiapine and 7-OH-quetiapine in hair using LC/MS-MS.
Binz, Tina M; Yegles, Michel; Schneider, Serge; Neels, Hugo; Crunelle, Cleo L
2014-09-01
Hair analysis is a powerful tool for retrospective drug analysis and has a wide application window. This article describes the simultaneous determination and quantification of the short-acting atypical antipsychotic drug quetiapine and its main metabolite 7-OH quetiapine in hair. A sensitive and accurate method for the determination of these two compounds was developed using high-performance liquid chromatography coupled to tandem mass spectrometry detection (LC-MS/MS). The method was applied to 10 real case samples. For five patients, a time resolved hair analysis was done. Results varied from 0.35 ng/mg to 10.21 ng/mg hair for quetiapine and from 0.02 ng/mg to 3.19 ng/mg hair for 7-OH-quetiapine. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Visualization of system dynamics using phasegrams
Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh
2013-01-01
A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
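One column of a phasegram can be sketched by delay-embedding a single analysis window and recording where the trajectory crosses a Poincaré section. The section choice (rising zero crossings of the delayed coordinate) and the function name are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

def phasegram_column(window, delay):
    """Compute one phasegram column from a single analysis window.

    The window is delay-embedded as (x(t - delay), x(t)); the values of
    x(t) where the delayed coordinate crosses zero from below (a Poincare
    section) form the column. A periodic window yields a few distinct
    values (horizontal lines in the phasegram); chaos yields many.
    """
    x, xd = window[delay:], window[:-delay]
    rising = np.where((xd[:-1] < 0) & (xd[1:] >= 0))[0]
    return x[rising]
```

Sliding this over successive windows and stacking the columns along the x-axis produces the bifurcation-diagram-in-time the abstract describes, with no system parameter required.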
The Analysis of Design of Robust Nonlinear Estimators and Robust Signal Coding Schemes.
1982-09-16
… between uniform and nonuniform quantizers. For the nonuniform quantizer we can expect the mean-square error to … in the window greater than or equal to the value at p + 1; consequently, point p + 1 is the median. We define f^(n)(s) as the n-times filtered signal.
NASA Astrophysics Data System (ADS)
Reymond, D.
2016-12-01
We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to learning signal processing and seismology. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage into the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). The 3-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in a terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/nonlinear regression analysis of data sets. STK is developed in C/C++, mainly under Linux OS, and it has been partially implemented under MS-Windows as well. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has been used as a practical support for teaching the basics of signal processing.
Useful links:http://sourceforge.net/projects/seismic-toolkit/http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
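The PSD estimation that STK introduces can be sketched as an averaged periodogram (Welch's method). This is a minimal, self-contained illustration of the technique, not STK's C/C++ implementation; Python and the Hanning window are chosen here for brevity.

```python
import numpy as np

def psd_welch(x, fs, nperseg=256):
    """Averaged-periodogram (Welch) power spectral density estimate.

    Splits the signal into non-overlapping segments of nperseg samples,
    applies a Hanning window to each, and averages the squared magnitude
    spectra. Returns (frequencies, PSD) with standard density scaling.
    """
    nwin = len(x) // nperseg
    segs = np.reshape(x[:nwin * nperseg], (nwin, nperseg))
    w = np.hanning(nperseg)
    scale = fs * (w**2).sum()
    psd = np.mean(np.abs(np.fft.rfft(segs * w, axis=1))**2, axis=0) / scale
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, psd
```

Averaging over segments trades frequency resolution for variance reduction, which is why the PSD display is smoother than a single-trace spectrum.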
Grivna, Michal; Al-Marzouqi, Hanan M; Al-Ali, Maryam R; Al-Saadi, Nada N; Abu-Zidan, Fikri M
2017-01-01
Falls of children from heights (balconies and windows) usually result in severe injuries and death. Details on child falls from heights in the United Arab Emirates (UAE) are not easily accessible. Our aim was to assess the incidents, personal, and environmental risk factors for pediatric falls from windows/balconies using newspaper clippings. We used a retrospective study design to electronically assess all major UAE national Arabic and English newspapers for reports of unintentional child falls from windows and balconies during 2005-2016. A structured data collection form was developed to collect information. Data were entered into an Excel sheet and descriptive analysis was performed. Newspaper clippings documented 96 fall incidents. After cleaning the data and excluding duplicate cases and intentional injuries, 81 cases were included into the final analysis. Fifty-three percent (n = 42) were boys. The mean (range) age was 4.9 years (1-15). Thirty-eight (47%) children fell from windows and 36 (44%) from balconies. Twenty-two (27%) children climbed on the furniture placed on a balcony or close to a window. Twenty-five (31%) children were not alone in the apartment when they fell. Twenty-nine children fell from less than 5 floors (37%), 33 from 5 to 10 floors (42%) and 16 from more than 10 floors (21%). Fifteen children (19%) were hospitalized and survived the fall incident, while 66 died (81%). Newspapers proved to be useful to study pediatric falls from heights. It is necessary to improve window safety by installing window guards and raising awareness.
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2013-12-01
We use multiscale detrended fluctuation analysis (MSDFA) and multiscale detrended cross-correlation analysis (MSDCCA) to investigate auto-correlation (AC) and cross-correlation (CC) in the US and Chinese stock markets during 1997-2012. The results show that US and Chinese stock indices differ in terms of their multiscale AC structures. Stock indices in the same region also differ with regard to their multiscale AC structures. We analyze AC and CC behaviors among indices for the same region to determine similarity among six stock indices and divide them into four groups accordingly. We choose S&P500, NQCI, HSI, and the Shanghai Composite Index as representative samples for simplicity. MSDFA and MSDCCA results and average MSDFA spectra for local scaling exponents (LSEs) for individual series are presented. We find that the MSDCCA spectrum for LSE CC between two time series generally tends to be greater than the average MSDFA LSE spectrum for individual series. We obtain detailed multiscale structures and relations for CC between the four representatives. MSDFA and MSDCCA with secant rolling windows of different sizes are then applied to reanalyze the AC and CC. Vertical and horizontal comparisons of different window sizes are made. The MSDFA and MSDCCA results for the original window size are confirmed and some new interesting characteristics and conclusions regarding multiscale correlation structures are obtained.
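The building block of MSDFA is ordinary detrended fluctuation analysis: local scaling exponents come from computing this over rolling windows and scale ranges. A minimal sketch of first-order DFA, assuming the standard formulation rather than the authors' exact multiscale variant:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis scaling exponent of a series.

    Integrates the demeaned series, splits it into windows of each scale
    s, removes a linear trend from every window, and takes F(s) as the
    RMS residual. The exponent is the slope of log F(s) vs log s
    (~0.5 for white noise, ~1.0 for 1/f noise).
    """
    y = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n = len(y) // s
        windows = y[:n * s].reshape(n, s)
        t = np.arange(s)
        sq = []
        for w in windows:
            coef = np.polyfit(t, w, 1)            # per-window linear detrend
            sq.append(np.mean((w - np.polyval(coef, t))**2))
        F.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```

MSDCCA replaces the per-window variance with a detrended covariance between two series, giving the cross-correlation analogue used for the index pairs.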
Kaganovich, Natalya; Schumaker, Jennifer
2016-01-01
Sensitivity to the temporal relationship between auditory and visual stimuli is key to efficient audiovisual integration. However, even adults vary greatly in their ability to detect audiovisual temporal asynchrony. What underlies this variability is currently unknown. We recorded event-related potentials (ERPs) while participants performed a simultaneity judgment task on a range of audiovisual (AV) and visual-auditory (VA) stimulus onset asynchronies (SOAs) and compared ERP responses in good and poor performers to the 200 ms SOA, which showed the largest individual variability in the number of synchronous perceptions. Analysis of ERPs to the VA200 stimulus yielded no significant results. However, those individuals who were more sensitive to the AV200 SOA had significantly more positive voltage between 210 and 270 ms following the sound onset. In a follow-up analysis, we showed that the mean voltage within this window predicted approximately 36% of variability in sensitivity to AV temporal asynchrony in a larger group of participants. The relationship between the ERP measure in the 210-270 ms window and accuracy on the simultaneity judgment task also held for two other AV SOAs with significant individual variability - 100 and 300 ms. Because the identified window was time-locked to the onset of sound in the AV stimulus, we conclude that sensitivity to AV temporal asynchrony is shaped to a large extent by the efficiency in the neural encoding of sound onsets. PMID:27094850
Windowed time-reversal music technique for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements.
The Golden Cage: Growing up in the Socialist Yugoslavia
ERIC Educational Resources Information Center
Marjanovic-Shane, Ana
2018-01-01
From the mid 1950s through roughly the 1980s, some or many children and youth of the Socialist Yugoslavia, especially those of us in Belgrade, the capital, lived in a curious, almost surreal "window" in space and time. This surreal window of space-time offered the children and youth of Yugoslavia unprecedented opportunities for…
The Data Analysis in Gravitational Wave Detection
NASA Astrophysics Data System (ADS)
Wang, Xiao-ge; Lebigot, Eric; Du, Zhi-hui; Cao, Jun-wei; Wang, Yun-yong; Zhang, Fan; Cai, Yong-zhi; Li, Mu-zi; Zhu, Zong-hong; Qian, Jin; Yin, Cong; Wang, Jian-bo; Zhao, Wen; Zhang, Yang; Blair, David; Ju, Li; Zhao, Chun-nong; Wen, Lin-qing
2017-01-01
Gravitational wave (GW) astronomy, based on GW detection, is a rising interdisciplinary field and a new window through which humanity can observe the universe, following traditional astronomy with electromagnetic waves as the detection means. It is of considerable importance for studying the origin and evolution of the universe and for extending the scope of astronomical research. The advent of the laser interferometer GW detector has opened a new era of GW detection, and the data processing and analysis of GWs have developed quickly around the world, providing a sharp tool for GW astronomy. This paper systematically introduces the software tools commonly used for the data analysis of GWs, and discusses in detail the basic methods used, such as time-frequency analysis, composite analysis, pulsar timing analysis, matched filtering, templates, the χ2 test, and Monte Carlo simulation.
Application of Bounded Linear Stability Analysis Method for Metrics-Driven Adaptive Control
NASA Technical Reports Server (NTRS)
Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje
2009-01-01
This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The BLSA method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. Through the application of the BLSA method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Metrics-driven adaptive control is evaluated for a second-order system that represents the pitch attitude control of a generic transport aircraft. The analysis shows that the system with the metrics-conforming variable adaptive gain becomes more robust to unmodeled dynamics or time delay. The effect of the analysis time window on BLSA is also evaluated in order to meet the stability margin criteria.
Location identification of closed crack based on Duffing oscillator transient transition
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Bo, Lin; Liu, Yaolu; Zhao, Youxuan; Zhang, Jun; Deng, Mingxi; Hu, Ning
2018-02-01
The existence of a closed micro-crack in plates can be detected by using the nonlinear harmonic characteristics of the Lamb wave. However, identifying its location is difficult. By considering the transient nonlinear Lamb wave under noise interference, we propose a location identification method for the closed crack based on quantitative measurement of the Duffing oscillator's transient transition in phase space. A sliding short-time window was used to truncate the signal to be detected. Periodic extension processing of the transient nonlinear Lamb wave was then performed to ensure that the Duffing oscillator has adequate response time to reach a steady state. A transient autocorrelation method was used to reduce missed harmonic detections caused by the random variable phase of the nonlinear Lamb wave. Moreover, to overcome the deficiency of qualitative phase-trajectory inspection for assessing the Duffing system state, and to eliminate misjudgments caused by harmonic frequency components contained in broadband noise, a logic operation method for the oscillator state transition function based on circular zone partition was adopted to establish the mapping relation between the oscillator transition state and the time-domain information of the nonlinear harmonics. The final state transition discriminant function of the Duffing oscillator was used as the basis for identifying the reflected and transmitted harmonics from the crack. Chirplet time-frequency analysis was conducted to identify the mode of the generated harmonics and determine the propagation speed. Through these steps, accurate position identification of the closed crack was achieved.
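The first two steps of this pipeline (sliding short-time window truncation followed by periodic extension so the oscillator has enough cycles to settle) can be sketched as follows; the window length, step, repeat count, and synthetic burst are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def sliding_periodic_segments(signal, win_len, step, n_repeats):
    """Cut the signal into sliding short-time windows, then periodically
    extend each window so a driven oscillator can reach a steady state."""
    starts = range(0, len(signal) - win_len + 1, step)
    return [np.tile(signal[s:s + win_len], n_repeats) for s in starts]

fs = 1e6                                        # Hz, illustrative sampling rate
t = np.arange(2048) / fs
# Transient Lamb-wave-like burst: a tone confined to the first half of the record.
burst = np.sin(2 * np.pi * 50e3 * t) * (t < 1024 / fs)
segments = sliding_periodic_segments(burst, win_len=256, step=64, n_repeats=8)
```

Each extended segment would then be fed, in turn, to the Duffing oscillator as its driving term.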
Real-time spectral analysis of HRV signals: an interactive and user-friendly PC system.
Basano, L; Canepa, F; Ottonello, P
1998-01-01
We present a real-time system, built around a PC and a low-cost data acquisition board, for the spectral analysis of the heart rate variability signal. The Windows-like operating environment on which it is based makes the computer program very user-friendly, even for non-specialized personnel. The power spectral density is computed through a hybrid method in which a classical FFT analysis follows an autoregressive finite extension of the data; the stationarity of the sequence is continuously checked. The use of this algorithm gives the spectral estimation a high degree of robustness. Moreover, still in real time, the FFT of every data block is computed and displayed in order to corroborate the results as well as to allow the user to interactively choose a proper AR model order.
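A hybrid estimator of this kind (autoregressive extension of the data block followed by classical FFT analysis) can be sketched as below; the AR order, block length, sampling rate, and the simple least-squares AR fit are illustrative assumptions, not the system's actual implementation:

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares fit of AR(p) coefficients: x[t] ≈ sum_k a[k] * x[t-k-1]."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def hybrid_psd(x, p=8, n_extend=256, nfft=1024):
    """Extend the data block with its AR model, then take an FFT periodogram."""
    a = ar_fit(x, p)
    ext = list(x)
    for _ in range(n_extend):                   # AR forecast continuation
        ext.append(np.dot(a, ext[-1:-p - 1:-1]))
    w = np.hanning(len(ext))
    spec = np.fft.rfft(w * np.asarray(ext), nfft)
    return np.abs(spec) ** 2

fs = 4.0                                        # Hz, an illustrative HRV resampling rate
t = np.arange(256) / fs
x = np.sin(2 * np.pi * 0.25 * t)                # 0.25 Hz stand-in for a respiratory peak
psd = hybrid_psd(x)
peak_hz = np.argmax(psd) * fs / 1024
```

The AR extension lengthens the effective record before the FFT, which is what buys the frequency resolution a short real-time block would otherwise lack.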
Alterations in audiovisual simultaneity perception in amblyopia
Richards, Michael D; Goltz, Herbert C; Wong, Agnes M F
2017-01-01
Amblyopia is a developmental visual impairment that is increasingly recognized to affect higher-level perceptual and multisensory processes. To further investigate the audiovisual (AV) perceptual impairments associated with this condition, we characterized the temporal interval in which asynchronous auditory and visual stimuli are perceived as simultaneous 50% of the time (i.e., the AV simultaneity window). Adults with unilateral amblyopia (n = 17) and visually normal controls (n = 17) judged the simultaneity of a flash and a click presented with both eyes viewing. The signal onset asynchrony (SOA) varied from 0 ms to 450 ms for auditory-lead and visual-lead conditions. A subset of participants with amblyopia (n = 6) was tested monocularly. Compared to the control group, the auditory-lead side of the AV simultaneity window was widened by 48 ms (36%; p = 0.002), whereas that of the visual-lead side was widened by 86 ms (37%; p = 0.02). The overall mean window width was 500 ms, compared to 366 ms among controls (37% wider; p = 0.002). Among participants with amblyopia, the simultaneity window parameters were unchanged by viewing condition, but subgroup analysis revealed differential effects on the parameters by amblyopia severity, etiology, and foveal suppression status. Possible mechanisms to explain these findings include visual temporal uncertainty, interocular perceptual latency asymmetry, and disruption of normal developmental tuning of sensitivity to audiovisual asynchrony. PMID:28598996
Tumin, Dmitry; McConnell, Patrick I; Galantowicz, Mark; Tobias, Joseph D; Hayes, Don
2017-02-01
Young adult heart transplantation (HTx) recipients experience high mortality risk attributed to increased nonadherence to immunosuppressive medication in this age window. This study sought to test whether a high-risk age window in HTx recipients persisted in the absence of reported nonadherence. Heart transplantation recipients aged 2 to 40 years, transplanted between October 1999 and January 2007, were identified in the United Network for Organ Sharing database. Multivariable survival analysis was used to estimate influences of age at transplantation and attained posttransplant age on mortality hazard among patients stratified by center report of nonadherence to immunosuppression that compromised recovery. Three thousand eighty-one HTx recipients were included, with univariate analysis demonstrating peak hazards of mortality and reported nonadherence among 567 patients transplanted between ages 17 and 24 years. Multivariable analysis adjusting for reported nonadherence demonstrated lower mortality among patients transplanted at younger (hazards ratio, 0.813; 95% confidence interval, 0.663-0.997; P = 0.047) or older (hazards ratio, 0.835; 95% confidence interval, 0.701-0.994; P = 0.042) ages. Peak mortality hazard at ages 17 to 24 years was confirmed in the subgroup of patients with no nonadherence reported during follow-up. This result was replicated using attained age after HTx as the time metric, with younger and older ages predicting improved survival in the absence of reported nonadherence. Late adolescence and young adulthood coincide with greater mortality hazard and greater chances of nonadherence to immunosuppressive medication after HTx, but the elevation of mortality hazard in this age range persists in the absence of reported nonadherence. Other causes of the high-risk age window for post-HTx mortality should be demonstrated to identify opportunities for intervention.
NASA Technical Reports Server (NTRS)
Yuen, Vincent K.
1989-01-01
The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible, windows and CCTVs, with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.
Split delivery vehicle routing problem with time windows: a case study
NASA Astrophysics Data System (ADS)
Latiffianti, E.; Siswanto, N.; Firmandani, R. A.
2018-04-01
This paper aims to implement an extension of the VRP, the split delivery vehicle routing problem (SDVRP) with time windows, in a case study involving pickups and deliveries of workers from several points of origin to several destinations. Each origin represents a bus stop and each destination represents either a site or an office location. An integer linear programming formulation of the SDVRP is presented. The solution was generated in three stages: defining the starting points, assigning buses, and solving the SDVRP with time windows using an exact method. Although the overall computational time was relatively lengthy, the results indicated that the produced solution was better than the existing routing and scheduling used by the firm. The produced solution also reduced fuel cost by 9%, owing to the shorter total distance travelled by the shuttle buses.
The Use of Variable Q1 Isolation Windows Improves Selectivity in LC-SWATH-MS Acquisition.
Zhang, Ying; Bilbao, Aivett; Bruderer, Tobias; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard; Varesio, Emmanuel
2015-10-02
As tryptic peptides and metabolites are not equally distributed along the mass range, the probability of cross fragment ion interference is higher in certain windows when fixed Q1 SWATH windows are applied. We evaluated the benefits of utilizing variable Q1 SWATH windows with regards to selectivity improvement. Variable windows based on equalizing the distribution of either the precursor ion population (PIP) or the total ion current (TIC) within each window were generated by an in-house software, swathTUNER. These two variable Q1 SWATH window strategies outperformed, with respect to quantification and identification, the basic approach using a fixed window width (FIX) for proteomic profiling of human monocyte-derived dendritic cells (MDDCs). Thus, 13.8 and 8.4% additional peptide precursors, which resulted in 13.1 and 10.0% more proteins, were confidently identified by SWATH using the strategy PIP and TIC, respectively, in the MDDC proteomic sample. On the basis of the spectral library purity score, some improvement warranted by variable Q1 windows was also observed, albeit to a lesser extent, in the metabolomic profiling of human urine. We show that the novel concept of "scheduled SWATH" proposed here, which incorporates (i) variable isolation windows and (ii) precursor retention time segmentation further improves both peptide and metabolite identifications.
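The PIP strategy, equalizing the precursor ion population per window, amounts to cutting the precursor m/z distribution at its quantiles. A minimal sketch with a simulated precursor list and an assumed m/z survey range (this is not the swathTUNER code):

```python
import numpy as np

def variable_windows(mz_values, n_windows, mz_min=400.0, mz_max=1200.0):
    """Q1 window edges that put an equal share of precursors in each window."""
    inside = np.sort(mz_values[(mz_values >= mz_min) & (mz_values <= mz_max)])
    edges = np.quantile(inside, np.linspace(0.0, 1.0, n_windows + 1))
    edges[0], edges[-1] = mz_min, mz_max        # pin outer edges to the survey range
    return edges

rng = np.random.default_rng(1)
# Tryptic-peptide-like m/z distribution: dense in the lower half of the range.
mz = rng.gamma(shape=4.0, scale=60.0, size=20000) + 400.0
edges = variable_windows(mz, n_windows=32)
widths = np.diff(edges)
counts, _ = np.histogram(mz[(mz >= 400) & (mz <= 1200)], bins=edges)
```

Dense m/z regions get narrow windows and sparse regions get wide ones, which is exactly why the probability of cross fragment ion interference drops relative to fixed-width windows.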
NASA Astrophysics Data System (ADS)
Abrokwah, K.; O'Reilly, A. M.
2017-12-01
Groundwater is an important resource that is extracted every day because of its invaluable use for domestic, industrial and agricultural purposes. The need for sustaining groundwater resources is clearly indicated by declining water levels and has led to modeling and forecasting accurate groundwater levels. In this study, spectral decomposition of climatic forcing time series was used to develop hybrid wavelet analysis (WA) and moving window average (MWA) artificial neural network (ANN) models. These techniques are explored by modeling historical groundwater levels in order to provide understanding of potential causes of the observed groundwater-level fluctuations. Selection of the appropriate decomposition level for WA and window size for MWA helps in understanding the important time scales of climatic forcing, such as rainfall, that influence water levels. Discrete wavelet transform (DWT) is used to decompose the input time-series data into various levels of approximate and details wavelet coefficients, whilst MWA acts as a low-pass signal-filtering technique for removing high-frequency signals from the input data. The variables used to develop and validate the models were daily average rainfall measurements from five National Atmospheric and Oceanic Administration (NOAA) weather stations and daily water-level measurements from two wells recorded from 1978 to 2008 in central Florida, USA. Using different decomposition levels and different window sizes, several WA-ANN and MWA-ANN models for simulating the water levels were created and their relative performances compared against each other. The WA-ANN models performed better than the corresponding MWA-ANN models; also higher decomposition levels of the input signal by the DWT gave the best results. The results obtained show the applicability and feasibility of hybrid WA-ANN and MWA-ANN models for simulating daily water levels using only climatic forcing time series as model inputs.
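The MWA step described here, a low-pass filter that strips high-frequency signal from the model inputs, is a centered moving average. A small sketch with an illustrative 30-day window and a synthetic seasonal rainfall series (not the study's data):

```python
import numpy as np

def moving_window_average(x, window):
    """Centered moving average; acts as a low-pass filter on the input series."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(2)
days = np.arange(730)
rainfall_signal = 5.0 + 2.0 * np.sin(2 * np.pi * days / 365)   # seasonal cycle
noisy = rainfall_signal + rng.normal(0, 1.5, size=days.size)   # daily variability
smoothed = moving_window_average(noisy, window=30)
```

The smoothed series, rather than the raw daily values, would then be passed to the ANN; the window size plays the same role as the DWT decomposition level in selecting which time scales of climatic forcing the model sees.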
Charvat, A; Stasicki, B; Abel, B
2006-03-09
In the present article a novel approach for rapid product screening of fast reactions in IR-laser-heated liquid microbeams in a vacuum is highlighted. From absorbed energies, a shock wave analysis, high-speed laser stroboscopy, and thermodynamic data of high-temperature water the enthalpy, temperature, density, pressure, and the reaction time window for the hot water filament could be characterized. The experimental conditions (30 kbar, 1750 K, density approximately 1 g/cm3) present during the lifetime of the filament (20-30 ns) were extreme and provided a unique environment for high-temperature water chemistry. For the probe of the reaction products liquid beam desorption mass spectrometry was employed. A decisive feature of the technique is that ionic species, as well as neutral products and intermediates may be detected (neutrals as protonated aggregates) via time-of-flight mass spectrometry without any additional ionization laser. After the explosive disintegration of the superheated beam, high-temperature water reactions are efficiently quenched via expansion and evaporative cooling. For first exploratory experiments for chemistry in ultrahigh-temperature, -pressure and -density water, we have chosen resorcinol as a benchmark system, simple enough and well studied in high-temperature water environments much below 1000 K. Contrary to oxidation reactions usually present under less extreme and dense supercritical conditions, we have observed hydration and little H-atom abstraction during the narrow time window of the experiment. Small amounts of radicals but no ionic intermediates other than simple proton adducts were detected. The experimental findings are discussed in terms of the energetic and dense environment and the small time window for reaction, and they provide firm evidence for additional thermal reaction channels in extreme molecular environments.
Time-Frequency Distribution Analyses of Ku-Band Radar Doppler Echo Signals
NASA Astrophysics Data System (ADS)
Bujaković, Dimitrije; Andrić, Milenko; Bondžulić, Boban; Mitrović, Srđan; Simić, Slobodan
2015-03-01
Real radar echo signals of a pedestrian, a vehicle, and a group of helicopters are analyzed in order to maximize signal energy around the central Doppler frequency in the time-frequency plane. An optimization preserving this concentration is suggested, based on three well-known concentration measures. Various window functions and time-frequency distributions were optimization inputs. Experiments conducted on one analytic and three real signals showed that energy concentration depends significantly on the time-frequency distribution and window function used, for all three criteria.
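The kind of comparison described, scoring window functions by how well they concentrate short-time spectral energy around the central Doppler frequency, can be illustrated with a simple energy-fraction measure (the test tone, segment length, and ±3-bin band are illustrative choices, not the paper's concentration measures):

```python
import numpy as np

fs = 1000.0                                     # Hz
t = np.arange(0, 1.0, 1 / fs)
x = np.cos(2 * np.pi * 120.0 * t)               # stand-in for a Doppler echo line

def concentration(x, window, band_bins=3, nperseg=128):
    """Fraction of short-time FFT energy within ±band_bins of the peak bin."""
    n_seg = len(x) // nperseg
    segs = x[:n_seg * nperseg].reshape(n_seg, nperseg) * window
    power = (np.abs(np.fft.rfft(segs, axis=1)) ** 2).sum(axis=0)
    k = int(np.argmax(power))
    lo, hi = max(0, k - band_bins), k + band_bins + 1
    return power[lo:hi].sum() / power.sum()

windows = {"boxcar": np.ones(128), "hann": np.hanning(128),
           "blackman": np.blackman(128)}
scores = {name: concentration(x, w) for name, w in windows.items()}
```

Ranking candidate windows (and, in the paper's setting, candidate time-frequency distributions) by such a score is what the optimization loop selects over.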
Calibration of Safecast dose rate measurements.
Cervone, Guido; Hultquist, Carolynne
2018-10-01
A methodology is presented to calibrate contributed Safecast dose rate measurements acquired between 2011 and 2016 in the Fukushima prefecture of Japan. The Safecast data are calibrated using observations acquired by the U.S. Department of Energy at the time of the 2011 Fukushima Daiichi power plant nuclear accident. The methodology performs a series of interpolations between the U.S. government and contributed datasets at specific temporal windows and at corresponding spatial locations. The coefficients found for all the different temporal windows are aggregated and interpolated using quadratic regressions to generate a time dependent calibration function. Normal background radiation, decay rates, and missing values are taken into account during the analysis. Results show that the standard Safecast static transformation function overestimates the official measurements because it fails to capture the presence of two different Cesium isotopes and their changing magnitudes with time. A model is created to predict the ratio of the isotopes from the time of the accident through 2020. The proposed time dependent calibration takes into account this Cesium isotopes ratio, and it is shown to reduce the error between U.S. government and contributed data. The proposed calibration is needed through 2020, after which date the errors introduced by ignoring the presence of different isotopes will become negligible. Copyright © 2018 Elsevier Ltd. All rights reserved.
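The aggregation step, per-window calibration coefficients smoothed by quadratic regression into a time-dependent calibration function, can be sketched as follows; the window times and coefficient values are fabricated for illustration and are not the actual Safecast calibration:

```python
import numpy as np

# Hypothetical per-window ratios between reference and contributed dose rates,
# one coefficient per temporal window (time axis in years since the accident).
window_times = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
window_coefs = np.array([1.80, 1.62, 1.35, 1.18, 1.08, 1.02])

# Quadratic fit turns the discrete coefficients into a smooth,
# time-dependent calibration function.
a, b, c = np.polyfit(window_times, window_coefs, 2)

def calibrate(dose_rate, years_since_accident):
    """Apply the time-dependent coefficient instead of a static factor."""
    coef = a * years_since_accident**2 + b * years_since_accident + c
    return coef * dose_rate

calibrated = calibrate(0.30, 2.5)
```

A static transformation corresponds to forcing a = b = 0, which is why it cannot track the changing ratio of the two cesium isotopes over time.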
USB Storage Device Forensics for Windows 10.
Arshad, Ayesha; Iqbal, Waseem; Abbas, Haider
2018-05-01
Significantly increased use of USB devices, due to their user-friendliness and large storage capacities, poses various threats to many users and companies in terms of data theft, which becomes easier because of their efficient mobility. Investigation of such data theft activities requires gathering critical digital information capable of recovering digital forensic artifacts such as date, time, and device information. This research gathers three sets of registry and log data: first, before insertion; second, during insertion; and third, after removal of a USB device. These sets are analyzed to gather evidentiary information from the registry and Windows Event Log that helps in tracking a USB device. This research extends prior research on earlier versions of Microsoft Windows and compares it with the latest Windows 10 system. Comparison of Windows 8 and Windows 10 does not show much difference, except for a new subkey under the USB key in the registry. However, comparison of Windows 7 with the latest version indicates significant variances. © 2017 American Academy of Forensic Sciences.
11 Foot Unitary Plan Tunnel Facility Optical Improvement Large Window Analysis
NASA Technical Reports Server (NTRS)
Hawke, Veronica M.
2015-01-01
The test section of the 11- by 11-foot Unitary Plan Transonic Wind Tunnel (11-foot UPWT) may receive an upgrade of larger optical windows on both the north and south sides. These new larger windows will provide better access for optical imaging of test article flow phenomena, including surface and off-body flow characteristics. The installation of these new larger windows will likely produce a change in the aerodynamic characteristics of the flow in the test section. In an effort to understand the effect of this change, a computational model was employed to predict the flows through the slotted walls, in the test section, and around the model before and after the tunnel modification. This report documents the solid CAD model that was created and the inviscid computational analysis that was completed as a preliminary estimate of the effect of the changes.
Tsai, Kuo-Ming; Wang, He-Yi
2014-08-20
This study focuses on injection molding process window determination for obtaining optimal imaging optical properties, astigmatism, coma, and spherical aberration using plastic lenses. The Taguchi experimental method was first used to identify the optimized combination of parameters and significant factors affecting the imaging optical properties of the lens. Full factorial experiments were then implemented based on the significant factors to build the response surface models. The injection molding process windows for lenses with optimized optical properties were determined based on the surface models, and confirmation experiments were performed to verify their validity. The results indicated that the significant factors affecting the optical properties of lenses are mold temperature, melt temperature, and cooling time. According to experimental data for the significant factors, the oblique ovals for different optical properties on the injection molding process windows based on melt temperature and cooling time can be obtained using the curve fitting approach. The confirmation experiments revealed that the average errors for astigmatism, coma, and spherical aberration are 3.44%, 5.62%, and 5.69%, respectively. The results indicated that the process windows proposed are highly reliable.
The windows of SETI - Frequency and time in the search for extraterrestrial intelligence
NASA Technical Reports Server (NTRS)
Oliver, Bernard M.
1987-01-01
Since interstellar travel is not economically possible on the time scale of a human lifetime, communication with extraterrestrials can be achieved only by sending some form of energy or matter across space; photons (electromagnetic waves) are best. Of particular interest to SETI is the region from about 1,000-60,000 MHz known as the free-space microwave window. During the course of NASA's Cyclops program, it was pointed out that the hydrogen and hydroxyl lines bounded a band in which there were no other known lines. The threatened loss of the microwave window to earth-based services is discussed.
Bao, Yan; Pöppel, Ernst; Wang, Lingyan; Lin, Xiaoxiong; Yang, Taoxi; Avram, Mihai; Blautzik, Janusch; Paolini, Marco; Silveira, Sarita; Vedder, Aline; Zaytseva, Yuliya; Zhou, Bin
2015-12-01
Synchronizing neural processes, mental activities, and social interactions is considered to be fundamental for the creation of temporal order on the personal and interpersonal level. Several different types of synchronization are distinguished, and for each of them examples are given: self-organized synchronizations on the neural level giving rise to pre-semantically defined time windows of some tens of milliseconds and of approximately 3 s; time windows that are created by synchronizing different neural representations, as for instance in aesthetic appreciations or moral judgments; and synchronization of biological rhythms with geophysical cycles, like the circadian clock with the 24-hr rhythm of day and night. For the latter type of synchronization, an experiment is described that shows the importance of social interactions for sharing or avoiding common time. In a group study with four subjects being completely isolated together for 3 weeks from the external world, social interactions resulted both in intra- and interindividual circadian synchronization and desynchronization. A unique phenomenon in circadian regulation is described, the "beat phenomenon," which has been made visible by the interaction of two circadian rhythms with different frequencies in one body. The separation of the two physiological rhythms was the consequence of social interactions, that is, by the desire of a subject to share and to escape common time during different phases of the long-term experiment. The theoretical arguments on synchronization are summarized with the general statement: "Nothing in cognitive science makes sense except in the light of time windows." The hypothesis is forwarded that time windows that express discrete timing mechanisms in behavioral control and on the level of conscious experiences are the necessary bases to create cognitive order, and it is suggested that time windows are implemented by neural oscillations in different frequency domains. 
© 2015 The Institute of Psychology, Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while allowing almost continuous tracking of both the amplitude and frequency of signals over time. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case, amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible trends for future works are advanced.
VO2 thermochromic smart window for energy savings and generation
Zhou, Jiadong; Gao, Yanfeng; Zhang, Zongtao; Luo, Hongjie; Cao, Chuanxiang; Chen, Zhang; Dai, Lei; Liu, Xinling
2013-01-01
The ability to achieve energy saving in architectures and optimal solar energy utilisation affects the sustainable development of the human race. Traditional smart windows and solar cells cannot be combined into one device for energy saving and electricity generation. A VO2 film can respond to the environmental temperature to intelligently regulate infrared transmittance while maintaining visible transparency, and can be applied as a thermochromic smart window. Herein, we report for the first time a novel VO2-based smart window that partially utilises light scattering to solar cells around the glass panel for electricity generation. This smart window combines energy-saving and generation in one device, and offers potential to intelligently regulate and utilise solar radiation in an efficient manner. PMID:24157625
Optimal pulse design for communication-oriented slow-light pulse detection.
Stenner, Michael D; Neifeld, Mark A
2008-01-21
We present techniques for designing pulses for linear slow-light delay systems which are optimal in the sense that they maximize the signal-to-noise ratio (SNR) and signal-to-noise-plus-interference ratio (SNIR) of the detected pulse energy. Given a communication model in which input pulses are created in a finite temporal window and output pulse energy is measured in a temporally-offset output window, the SNIR-optimal pulses achieve typical improvements of 10 dB compared to traditional pulse shapes for a given output window offset. Alternatively, for fixed SNR or SNIR, the window offset (detection delay) can be increased by 0.3 times the window width. This approach also invites a communication-based model for delay and signal fidelity.
Is human sentence parsing serial or parallel? Evidence from event-related brain potentials.
Hopf, Jens-Max; Bader, Markus; Meng, Michael; Bayer, Josef
2003-01-01
In this ERP study we investigate the processes that occur in syntactically ambiguous German sentences at the point of disambiguation. Whereas most psycholinguistic theories agree on the view that processing difficulties arise when parsing preferences are disconfirmed (so-called garden-path effects), important differences exist with respect to theoretical assumptions about the parser's recovery from a misparse. A key distinction can be made between parsers that compute all alternative syntactic structures in parallel (parallel parsers) and parsers that compute only a single preferred analysis (serial parsers). To distinguish empirically between parallel and serial parsing models, we compare ERP responses to garden-path sentences with ERP responses to truly ungrammatical sentences. Garden-path sentences contain a temporary and ultimately curable ungrammaticality, whereas truly ungrammatical sentences remain so permanently--a difference which gives rise to different predictions in the two classes of parsing architectures. At the disambiguating word, ERPs in both sentence types show negative shifts of similar onset latency, amplitude, and scalp distribution in an initial time window between 300 and 500 ms. In a following time window (500-700 ms), the negative shift to garden-path sentences disappears at right central parietal sites, while it continues in permanently ungrammatical sentences. These data are taken as evidence for a strictly serial parser. The absence of a difference in the early time window indicates that temporary and permanent ungrammaticalities trigger the same kind of parsing responses. Later differences can be related to successful reanalysis in garden-path but not in ungrammatical sentences. Copyright 2003 Elsevier Science B.V.
Effects of window size and shape on accuracy of subpixel centroid estimation of target images
NASA Technical Reports Server (NTRS)
Welch, Sharon S.
1993-01-01
A new algorithm is presented for increasing the accuracy of subpixel centroid estimation of (nearly) point target images in cases where the signal-to-noise ratio is low and the signal amplitude and shape vary from frame to frame. In the algorithm, the centroid is calculated over a data window that is matched in width to the image distribution. Fourier analysis is used to explain the dependency of the centroid estimate on the size of the data window, and simulation and experimental results are presented which demonstrate the effects of window size for two different noise models. The effects of window shape were also investigated for uniform and Gaussian-shaped windows. The new algorithm was developed to improve the dynamic range of a close-range photogrammetric tracking system that provides feedback for control of a large gap magnetic suspension system (LGMSS).
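The matched-window centroiding idea above can be sketched in a few lines of Python: moment sums are taken only over a window whose half-width is matched to the spot size, excluding pixels that carry mostly noise. This is an illustrative sketch under assumed parameters (Gaussian spot, noise level, window size), not the paper's algorithm.

```python
import numpy as np

def windowed_centroid(image, coarse_center, half_width):
    """Subpixel centroid over a square data window.

    The window half-width should be matched to the width of the spot's
    intensity distribution so that noise-only pixels are excluded from
    the moment sums. `coarse_center` is an integer (row, col) guess.
    """
    r0, c0 = coarse_center
    sl = np.s_[r0 - half_width:r0 + half_width + 1,
               c0 - half_width:c0 + half_width + 1]
    win = image[sl]
    rows, cols = np.mgrid[sl]
    total = win.sum()
    return (rows * win).sum() / total, (cols * win).sum() / total

# Synthetic near-point target: Gaussian spot plus a small noise floor.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:32, 0:32]
true_rc = (16.3, 15.7)
spot = np.exp(-((yy - true_rc[0])**2 + (xx - true_rc[1])**2) / (2 * 2.0**2))
noisy = spot + rng.normal(0.0, 0.01, spot.shape)

# A window matched to roughly 2 sigma recovers the centroid to a
# fraction of a pixel despite the noise.
est = windowed_centroid(noisy, (16, 16), half_width=4)
```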
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
Displacement and frequency analyses of vibratory systems
NASA Astrophysics Data System (ADS)
Low, K. H.
1995-02-01
This paper deals with the frequency and response studies of vibratory systems represented by a set of n coupled second-order differential equations. The following numerical methods are used in the response analysis: central difference, fourth-order Runge-Kutta, and modal methods. Data generated in the response analysis are processed to obtain the system frequencies using the fast Fourier transform (FFT) or harmonic response methods. Two types of windows are used in the FFT analysis: rectangular and Hanning. Examples of two-, four- and seven-degree-of-freedom systems are considered to illustrate the proposed algorithms. Comparisons with existing results confirm the validity of the proposed methods. The Hanning window attenuates the signal, giving a narrower bandwidth around the peak compared with the rectangular window. It is also found that in free vibration a multi-mass system vibrates in a manner that is a superposition of the natural frequencies of the system, whereas in forced vibration the system vibrates at the driving frequency.
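The FFT windowing step described above can be illustrated with a minimal NumPy sketch (signal frequency, record length and inspected bins are illustrative choices, not taken from the paper): for a tone that falls between FFT bins, the Hanning window strongly suppresses the leakage far from the spectral peak relative to the rectangular window.

```python
import numpy as np

# Response-like signal whose frequency falls between FFT bins,
# so spectral leakage is visible.
N = 256
n = np.arange(N)
x = np.sin(2 * np.pi * 10.5 * n / N)   # 10.5 cycles per record

rect = np.abs(np.fft.rfft(x))                   # rectangular window
hann = np.abs(np.fft.rfft(x * np.hanning(N)))   # Hanning window

# Both spectra peak near bin 10-11, but far from the peak the
# Hanning spectrum falls off much faster (attenuated sidelobes).
leak_rect = rect[40]
leak_hann = hann[40]
```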
ASTP (SA-210) Launch vehicle operational flight trajectory. Part 3: Final documentation
NASA Technical Reports Server (NTRS)
Carter, A. B.; Klug, G. W.; Williams, N. W.
1975-01-01
Trajectory data are presented for a nominal and two launch window trajectory simulations. These trajectories are designed to insert a manned Apollo spacecraft into a 150/167 km. (81/90 n. mi.) earth orbit inclined at 51.78 degrees for rendezvous with a Soyuz spacecraft, which will be orbiting at approximately 225 km. (121.5 n. mi.). The launch window allocation defined for this launch is 500 pounds of S-IVB stage propellant. The launch window opening trajectory simulation depicts the earliest launch time deviation from a planar flight launch which conforms to this constraint. The launch window closing trajectory simulation was developed for the more stringent Air Force Eastern Test Range (AFETR) flight azimuth restriction of 37.4 degrees east-of-north. These trajectories enclose a 12.09 minute launch window, pertinent features of which are provided in a tabulation. Planar flight data are included for mid-window reference.
NASA Astrophysics Data System (ADS)
Pasten, D.; Comte, D.; Vallejos, J.
2013-05-01
During the last few decades, several authors have shown that the spatial distribution of earthquakes follows multifractal laws; the most interesting behavior is a decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55,920 microseismic events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue complete in magnitude, we used the data associated with the linear part of the Gutenberg-Richter law, with magnitudes greater than -1.5. The multifractal analysis was performed on the microseismic data, considering significant events to be those with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After several trials, we chose 200 events as the number of data points in each window, with consecutive windows shifted by 20 events. The complete data set was separated into six sections, and the multifractal analysis was applied to each section of 9320 events. The analysis of each section shows a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as observed in the seismic sequences of large earthquakes. This methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a correct-prediction rate varying between fifty and eighty percent. These results show the possibility of systematically tracking the Dq parameter in order to anticipate the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No. 3120237 (D.P.).
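The event-windowing scheme in this abstract (200-event windows shifted by 20 events over a 9320-event section) can be sketched generically in Python. The fractal-dimension estimator itself is beyond this sketch; only the windowing machinery that a per-window statistic would be computed over is shown.

```python
def moving_windows(events, size=200, step=20):
    """Yield overlapping windows of `size` consecutive events, with
    consecutive windows shifted by `step` events (200 and 20 in the
    abstract). Only complete windows are produced."""
    for start in range(0, len(events) - size + 1, step):
        yield events[start:start + size]

# One 9320-event section, as in the abstract. A per-window statistic
# (here it would be the fractal dimension Dq; omitted) is then tracked
# in time to look for a systematic decrease before a large event.
section = list(range(9320))
windows = list(moving_windows(section))
```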
NASA Astrophysics Data System (ADS)
Lyubushin, Alexey
2016-04-01
The problem of estimating current seismic danger from monitoring of seismic noise properties with the broadband seismic network F-net in Japan (84 stations) is considered. Variations of the following seismic noise parameters are analyzed: multifractal singularity spectrum support width, generalized Hurst exponent, minimum Hölder-Lipschitz exponent, and minimum normalized entropy of squared orthogonal wavelet coefficients. These parameters are estimated within adjacent time windows of length 1 day for the seismic noise waveforms from each station. Calculating daily median values of these parameters over all stations provides a 4-dimensional time series that describes integral properties of the seismic noise in the region covered by the network. Cluster analysis is applied to the sequence of clouds of 4-dimensional vectors within a moving time window of length 365 days with a mutual shift of 3 days, starting from the beginning of 1997 up to the current time. The purpose of the cluster analysis is to find the best number of clusters (BNC) among probe numbers varying from 1 up to a maximum of 40. The BNC is found from the maximum of the pseudo-F statistic (PFS). A 2D map can then be created presenting the dependence of the PFS on the tested probe number of clusters and on the right-hand end of the moving time window, rather similar to the usual spectral time-frequency diagrams. In [1] it was shown that the BNC before the Tohoku mega-earthquake of March 11, 2011, exhibited a strongly chaotic regime, with jumps from minimum up to maximum values in the year before the event, and that this interval was characterized by high PFS values. The PFS map is proposed as a method for extracting time intervals of high current seismic danger. The next such interval after the Tohoku mega-earthquake began at the end of 2012 and finished in the middle of 2013. Starting from the middle of 2015, high PFS values and the chaotic regime of BNC variations returned.
This could be interpreted as an increase in the danger of the next mega-earthquake in Japan in the region of the Nankai Trough [1] in the first half of 2016. References: 1. Lyubushin, A., 2013. How soon would the next mega-earthquake occur in Japan? Natural Science, 5 (8A1), 1-7. http://dx.doi.org/10.4236/ns.2013.58A1001
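The pseudo-F statistic used above to pick the best number of clusters is the Calinski-Harabasz index: between-cluster dispersion over within-cluster dispersion, each divided by its degrees of freedom. A minimal NumPy sketch (the clustering that produces the labels is assumed given; the data here are synthetic, not F-net parameters):

```python
import numpy as np

def pseudo_f(X, labels):
    """Calinski-Harabasz pseudo-F statistic. The best number of
    clusters (BNC) is the probe number maximizing this value.
    Requires 2 <= number of clusters < number of points."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    n, k = len(X), len(np.unique(labels))
    grand = X.mean(axis=0)
    between = within = 0.0
    for c in np.unique(labels):
        members = X[labels == c]
        centroid = members.mean(axis=0)
        between += len(members) * np.sum((centroid - grand) ** 2)
        within += np.sum((members - centroid) ** 2)
    return (between / (k - 1)) / (within / (n - k))

# Two well-separated 4-D clouds (the abstract's vectors are 4-D): the
# correct 2-cluster labelling scores far higher than an arbitrary one,
# which is exactly what maximizing the PFS detects.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)),
               rng.normal(8.0, 1.0, (50, 4))])
good = np.repeat([0, 1], 50)   # true partition
bad = np.tile([0, 1], 50)      # interleaved, meaningless partition
```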
Effects of the window openings on the micro-environmental condition in a school bus
NASA Astrophysics Data System (ADS)
Li, Fei; Lee, Eon S.; Zhou, Bin; Liu, Junjie; Zhu, Yifang
2017-10-01
The school bus is an important micro-environment for children's health because the level of in-cabin air pollution can increase due to the bus's own exhaust in addition to on-road traffic emissions. However, it has been challenging to understand in-cabin air quality, which is associated with complex airflow patterns inside and outside a school bus. This study conducted Computational Fluid Dynamics (CFD) modeling analyses to determine the effects of window openings on the self-pollution of a school bus. Infiltration through the window gaps is modeled by applying variable numbers of active computational cells as a function of the effective area ratio of the opening. Experimental data on ventilation rates from the literature were used to validate the model. Ultrafine particle (UFP) and black carbon (BC) concentrations were monitored in "real world" field campaigns using school buses. The modeling study examined the airflow pattern inside the school bus under four different types of side-window openings at 20, 40, and 60 mph (a total of 12 cases). We found that opening the driver's window could allow the infiltration of exhaust through window/door gaps in the back of the school bus, whereas opening windows in the middle of the school bus could mitigate this phenomenon. We also found that an increased driving speed (from 20 mph to 60 mph) could result in a higher ventilation rate (up to 3.4 times) and a lower mean age of air (down to 0.29 times) inside the bus.
Dynamic subcellular imaging of cancer cell mitosis in the brain of live mice.
Momiyama, Masashi; Suetsugu, Atsushi; Kimura, Hiroaki; Chishima, Takashi; Bouvet, Michael; Endo, Itaru; Hoffman, Robert M
2013-04-01
The ability to visualize cancer cell mitosis and apoptosis in the brain in real time would be of great utility in testing novel therapies. To achieve this goal, cancer cells were labeled with green fluorescent protein (GFP) in the nucleus and red fluorescent protein (RFP) in the cytoplasm, such that mitosis and apoptosis could be clearly imaged. A craniotomy open window was made in athymic nude mice for real-time fluorescence imaging of implanted cancer cells growing in the brain. The craniotomy window was reversibly closed with a skin flap. Mitosis of individual cancer cells was imaged dynamically in real time through the craniotomy open window. This model can be used to evaluate brain metastasis and brain cancer at the subcellular level.
NASA Astrophysics Data System (ADS)
Qi, Shuhong; Zhang, Zhihong
2015-03-01
The tumor immune microenvironment has become very important for tumor immunotherapy. Several kinds of immune cells are present in the tumor stroma, and they play very different roles in tumor growth. In order to observe the behaviors of multiple immune cells in the tumor microenvironment, and the interactions between immune cells and tumor cells, at the same time, we generated a multicolor-labeled tumor immune microenvironment model in which the tumor cells and immune cells were labeled with different fluorescent proteins. Using a skin-fold window chamber implanted in mice and intravital imaging technology, we could dynamically observe the different immune cells in the tumor microenvironment. From analysis of the video data, we characterized the behavior of TILs, DCs and Tregs in the tumor immune microenvironment and, furthermore, showed that these immune cells play different roles there.
Pandžić, Elvis; Abu-Arish, Asmahan; Whan, Renee M; Hanrahan, John W; Wiseman, Paul W
2018-02-16
Molecular, vesicular and organellar flows are of fundamental importance for the delivery of nutrients and of essential components used in cellular functions such as motility and division. With recent advances in fluorescence/super-resolution microscopy modalities we can resolve the movements of these objects at higher spatio-temporal resolution and with better sensitivity. Previously, spatio-temporal image correlation spectroscopy has been applied to map molecular flows by correlation analysis of fluorescence fluctuations in image series. However, an underlying assumption of this approach is that each sampled time window contains one dominant flowing component. Although this was true for most of the cases analyzed earlier, in some situations two or more different flowing populations can be present in the same spatio-temporal window. We introduce an approach, termed velocity landscape correlation (VLC), which detects and extracts multiple flow components present in a sampled image region via an extension of the correlation analysis of fluorescence intensity fluctuations. We first demonstrate theoretically how the approach works and test its performance on a range of computer-simulated image series with varying flow dynamics. Finally, we apply VLC to study variable fluxing of STIM1 proteins on microtubules connected to the plasma membrane of Cystic Fibrosis Bronchial Epithelial (CFBE) cells. Copyright © 2018 Elsevier Inc. All rights reserved.
Non-invasive detection of language-related prefrontal high gamma band activity with beamforming MEG.
Hashimoto, Hiroaki; Hasegawa, Yuka; Araki, Toshihiko; Sugata, Hisato; Yanagisawa, Takufumi; Yorifuji, Shiro; Hirata, Masayuki
2017-10-27
High gamma band (>50 Hz) activity is a key oscillatory phenomenon of brain activation. However, no non-invasive method has been established to detect language-related high gamma band activity. We used a 160-channel whole-head magnetoencephalography (MEG) system equipped with superconducting quantum interference device (SQUID) gradiometers to non-invasively investigate neuromagnetic activities during silent reading and verb generation tasks in 15 healthy participants. Individual data were divided into alpha (8-13 Hz), beta (13-25 Hz), low gamma (25-50 Hz), and high gamma (50-100 Hz) bands and analysed with the beamformer method, with the time window moved consecutively. Group analysis was performed to delineate common areas of brain activation. In the verb generation task, transient power increases in the high gamma band appeared in the left middle frontal gyrus (MFG) in the 550-750 ms post-stimulus window. We set a virtual sensor on the left MFG for time-frequency analysis, and high gamma event-related synchronization (ERS) induced by the verb generation task was demonstrated at 650 ms. In contrast, ERS in the high gamma band was not detected in the silent reading task. Thus, our study successfully measured language-related prefrontal high gamma band activity non-invasively.
NASA Astrophysics Data System (ADS)
Dua, Rohit; Watkins, Steve E.
2009-03-01
Strain analysis under vibration can provide insight into structural health. An Extrinsic Fabry-Perot Interferometric (EFPI) sensor under vibrational strain generates a non-linearly modulated output, and advanced signal processing techniques are required to demodulate it and extract important information such as absolute strain. Past research has employed Artificial Neural Networks (ANNs) and Fast Fourier Transforms (FFTs) to demodulate the EFPI sensor under limited conditions: those demodulation systems could only handle variations in the absolute value of strain and in the frequency of actuation during a vibration event. This project uses an ANN approach to extend the demodulation system to include variation in the damping coefficient of the actuating vibration, in a near real-time vibration scenario. A computer simulation of the theoretical EFPI sensor output provides training and testing data to demonstrate the approaches. The FFT is performed on a window of the EFPI output data; a small observation window was chosen heuristically while maintaining low absolute-strain prediction errors. Results are obtained and compared for different ANN architectures, including a multi-layered feedforward network trained with backpropagation (BPNN) and Generalized Regression Neural Networks (GRNNs). A two-layered algorithm-fusion system is developed and tested that yields better results.
Parallel Climate Data Assimilation PSAS Package
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Chan, Clara; Gennery, Donald B.; Ferraro, Robert D.
1996-01-01
We have designed and implemented a set of highly efficient and highly scalable algorithms for an unstructured computational package, the PSAS data assimilation package, as demonstrated by detailed performance analysis of systematic runs on up to a 512-node Intel Paragon. The equation solver achieves a sustained 18 Gflops. As a result, we achieved an unprecedented 100-fold reduction in solution time on the Intel Paragon parallel platform over the Cray C90. This not only meets and exceeds the DAO time requirements, but also significantly enlarges the window of exploration in climate data assimilation.
Percutaneous window chamber method for chronic intravital microscopy of sensor-tissue interactions.
Koschwanez, Heidi E; Klitzman, Bruce; Reichert, W Monty
2008-11-01
A dorsal, two-sided skin-fold window chamber model was employed previously by Gough in glucose sensor research to characterize poorly understood physiological factors affecting sensor performance. We have extended this work by developing a percutaneous one-sided window chamber model for the rodent dorsum that offers both a larger subcutaneous area and a less restrictive tissue space than previous animal models. A surgical procedure for implanting a sensor into the subcutis beneath an acrylic window (15 mm diameter) is presented. Methods to quantify changes in the microvascular network and red blood cell perfusion around the sensors using noninvasive intravital microscopy and laser Doppler flowmetry are described. The feasibility of combining interstitial glucose monitoring from an implanted sensor with intravital fluorescence microscopy was explored using a bolus injection of fluorescein and dextrose to observe real-time mass transport of a small molecule at the sensor-tissue interface. The percutaneous window chamber provides an excellent model for assessing the influence of different sensor modifications, such as surface morphologies, on neovascularization using real-time monitoring of the microvascular network and tissue perfusion. However, the tissue response to an implanted sensor was variable, and some sensors migrated entirely out of the field of view and could not be observed adequately. A percutaneous optical window provides direct, real-time images of the development and dynamics of microvascular networks, microvessel patency, and fibrotic encapsulation at the tissue-sensor interface. Additionally, observing microvessels following combined bolus injections of a fluorescent dye and glucose in the local sensor environment demonstrated a valuable technique to visualize mass transport at the sensor surface.
Towards component-based validation of GATE: aspects of the coincidence processor
Moraes, Eder R.; Poon, Jonathan K.; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D.
2014-01-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to “ground truth” obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the “multiple window method”), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the “single window method”). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. PMID:25240897
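The "single window" versus "multiple window" coincidence methods contrasted above can be sketched on a sorted list of single-event timestamps. This is a simplified illustration (multiple coincidences are simply discarded, and dead time, energy qualification and delayed windows are ignored), not the GATE, TMRU or UC Davis implementation.

```python
def single_window_coincidences(times, tau):
    """'Single window' method: a window is opened only by the first
    single to arrive after the previous window has closed. If exactly
    two singles fall in the window, they form a coincidence pair;
    multiples are discarded in this sketch."""
    times = sorted(times)
    pairs, i = [], 0
    while i < len(times):
        j = i + 1
        while j < len(times) and times[j] - times[i] <= tau:
            j += 1
        group = times[i:j]
        if len(group) == 2:
            pairs.append((group[0], group[1]))
        i = j  # the next window opens only after this one ends
    return pairs

def multiple_window_coincidences(times, tau):
    """'Multiple window' method: every single opens its own window and
    pairs with each later single that falls inside it."""
    times = sorted(times)
    pairs = []
    for i, t in enumerate(times):
        for u in times[i + 1:]:
            if u - t <= tau:
                pairs.append((t, u))
            else:
                break
    return pairs

# Toy event list: an isolated pair, then a triple burst. The two
# methods disagree on the burst, which is the kind of difference
# the validation above probes.
times = [0.0, 1.0, 5.0, 5.5, 6.0]
tau = 1.0
pairs_single = single_window_coincidences(times, tau)
pairs_multiple = multiple_window_coincidences(times, tau)
```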
A Maple package for improved global mapping forecast
NASA Astrophysics Data System (ADS)
Carli, H.; Duarte, L. G. S.; da Mota, L. A. C. P.
2014-03-01
We present a Maple implementation of the well known global approach to time series analysis and some further developments designed to improve the computational efficiency of the forecasting capabilities of the approach. This global approach can be summarized as a reconstruction of the phase space based on a time-ordered series of data obtained from the system. After that, using the reconstructed vectors, a portion of this space is used to produce a mapping, a polynomial fitting obtained through a minimization procedure, that represents the system and can be employed to forecast further entries for the series. In the present implementation, we introduce a set of commands, or tools, to perform all these tasks. For example, the command VecTS deals mainly with the reconstruction of the vectors in the phase space. The command GfiTS produces the minimization and the fitting. ForecasTS uses all these and produces the prediction of the next entries. For the non-standard algorithms, we present two commands: IforecasTS and NiforecasTS, which deal with one-step and N-step forecasting, respectively. Finally, we introduce two further tools to aid the forecasting. The commands GfiTS and AnalysTS basically perform an analysis of the behavior of each portion of a series with respect to the settings used in the commands just mentioned. Catalogue identifier: AERW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3001 No. of bytes in distributed program, including test data, etc.: 95018 Distribution format: tar.gz Programming language: Maple 14. Computer: Any capable of running Maple. Operating system: Any capable of running Maple. Tested on Windows ME, Windows XP, Windows 7.
RAM: 128 MB Classification: 4.3, 4.9, 5 Nature of problem: Time series analysis and improving forecast capability. Solution method: The method of solution is partially based on a result published in [1]. Restrictions: If the time series that is being analyzed presents a great amount of noise or if the dynamical system behind the time series is of high dimensionality (Dim≫3), then the method may not work well. Unusual features: Our implementation can, in the cases where the dynamics behind the time series is given by a system of low dimensionality, greatly improve the forecast. Running time: This depends strongly on the command that is being used. References: [1] Barbosa, L.M.C.R., Duarte, L.G.S., Linhares, C.A. and da Mota, L.A.C.P., Improving the global fitting method on nonlinear time series analysis, Phys. Rev. E 74, 026702 (2006).
Window-Based Channel Impulse Response Prediction for Time-Varying Ultra-Wideband Channels.
Al-Samman, A M; Azmi, M H; Rahman, T A; Khan, I; Hindia, M N; Fattouh, A
2016-01-01
This work proposes channel impulse response (CIR) prediction for time-varying ultra-wideband (UWB) channels by exploiting the fast movement of channel taps within delay bins. Considering the sparsity of UWB channels, we introduce a window-based CIR (WB-CIR) to approximate the high temporal resolutions of UWB channels. A recursive least square (RLS) algorithm is adopted to predict the time evolution of the WB-CIR. For predicting the future WB-CIR tap of window wk, three RLS filter coefficients are computed from the observed WB-CIRs of the left wk-1, the current wk and the right wk+1 windows. The filter coefficient with the lowest RLS error is used to predict the future WB-CIR tap. To evaluate our proposed prediction method, UWB CIRs are collected through measurement campaigns in outdoor environments considering line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios. Under similar computational complexity, our proposed method provides an improvement in prediction errors of approximately 80% for LOS and 63% for NLOS scenarios compared with a conventional method.
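The RLS time-evolution prediction at the core of the method above can be illustrated with a minimal one-step-ahead recursive-least-squares predictor for a single (real-valued) tap-amplitude series. The paper's three-filter left/current/right window selection and the UWB channel specifics are omitted; filter order, forgetting factor and initialization here are illustrative choices, not the authors' settings.

```python
import numpy as np

def rls_predict(series, order=3, lam=0.98, delta=100.0):
    """One-step-ahead RLS prediction of a tap-amplitude series.

    Standard exponentially-weighted RLS: predict series[n] from the
    `order` most recent samples, then update the filter with the
    a priori error. Returns predictions aligned with series[order:].
    """
    series = np.asarray(series, dtype=float)
    w = np.zeros(order)                # filter coefficients
    P = np.eye(order) * delta          # inverse correlation estimate
    preds = []
    for n in range(order, len(series)):
        x = series[n - order:n][::-1]  # most recent sample first
        preds.append(w @ x)            # a priori (one-step) prediction
        e = series[n] - w @ x          # a priori error
        k = P @ x / (lam + x @ P @ x)  # gain vector
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return np.array(preds)

# On a smoothly evolving (here, linearly ramping) tap amplitude the
# predictor converges quickly and tracks the next value closely.
series = np.arange(100, dtype=float)
preds = rls_predict(series)
```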
NASA Astrophysics Data System (ADS)
Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.
2017-11-01
Nowadays, with the increase in the computational power of modern computers, together with state-of-the-art reconstruction algorithms, it is possible to obtain Positron Emission Tomography (PET) images in practically real time. These facts open the door to new applications such as tracking radiopharmaceuticals inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real-time PET images. Fixed, incremental, overlapping, sliding and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source is placed in a dedicated breast PET device and moved along the FOV. These acquisitions are reconstructed according to the different statistical windows, resulting in a smoother transition of positions for the image reconstructions that use the sliding and hybrid windows.
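As an illustration of how such window schemes group acquisition data, a hypothetical helper (the device's actual block weighting is not described here) can return which data blocks feed each intermediate image for three of the modes:

```python
def window_indices(n_blocks, mode, width=4, step=1):
    """Block-index groups combined into each intermediate image.
    'fixed': disjoint groups; 'incremental': all blocks so far;
    'sliding': the most recent `width` blocks, advanced by `step`.
    (Overlapping and hybrid modes are omitted in this sketch.)"""
    if mode == "fixed":
        return [list(range(i, min(i + width, n_blocks)))
                for i in range(0, n_blocks, width)]
    if mode == "incremental":
        return [list(range(i + 1)) for i in range(n_blocks)]
    if mode == "sliding":
        return [list(range(i, i + width))
                for i in range(0, n_blocks - width + 1, step)]
    raise ValueError(mode)
```

The sliding mode produces one image per step, each sharing most of its data with its neighbor, which is what yields the smoother transition of source positions noted above.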
NASA Astrophysics Data System (ADS)
Reil, Frank; Thomas, John E.
2002-05-01
For the first time we are able to observe the time-resolved Wigner function of enhanced backscatter from a random medium using a novel two-window technique. This technique enables us to directly verify the phase-conjugating properties of random media. An incident divergent beam displays a convergent enhanced backscatter cone. We measure the joint position and momentum (x, p) distributions of the light field as a function of propagation time in the medium. The two-window technique allows us to independently control the resolutions for position and momentum, thereby surpassing the uncertainty limit associated with Fourier transform pairs. By using a low-coherence light source in a heterodyne detection scheme, we observe enhanced backscattering resolved by path length in the random medium, providing information about the evolution of optical coherence as a function of penetration depth in the random medium.
Longhi, M. Paula; Hoti, Mimoza; Patel, Minal B.; O’Dwyer, Michael; Nourshargh, Sussan; Barnes, Michael R.; Brohi, Karim
2017-01-01
Background Severe trauma induces a widespread response of the immune system. This “genomic storm” can lead to poor outcomes, including Multiple Organ Dysfunction Syndrome (MODS). MODS carries a high mortality and morbidity rate and adversely affects long-term health outcomes. Contemporary management of MODS is entirely supportive, and no specific therapeutics have been shown to be effective in reducing incidence or severity. The pathogenesis of MODS remains unclear, and several models are proposed, such as excessive inflammation, a second-hit insult, or an imbalance between pro- and anti-inflammatory pathways. We postulated that the hyperacute window after trauma may hold the key to understanding how the genomic storm is initiated and may lead to a new understanding of the pathogenesis of MODS. Methods and findings We performed whole blood transcriptome and flow cytometry analyses on a total of 70 critically injured patients (Injury Severity Score [ISS] ≥ 25) at The Royal London Hospital in the hyperacute time period within 2 hours of injury. We compared transcriptome findings in 36 critically injured patients with those of 6 patients with minor injuries (ISS ≤ 4). We then performed flow cytometry analyses in 34 critically injured patients and compared findings with those of 9 healthy volunteers. Immediately after injury, only 1,239 gene transcripts (4%) were differentially expressed in critically injured patients. By 24 hours after injury, 6,294 transcripts (21%) were differentially expressed compared to the hyperacute window. Only 202 (16%) genes differentially expressed in the hyperacute window were still expressed in the same direction at 24 hours postinjury. Pathway analysis showed principally up-regulation of pattern recognition and innate inflammatory pathways, with down-regulation of adaptive responses. 
Immune deconvolution, flow cytometry, and modular analysis suggested a central role for neutrophils and Natural Killer (NK) cells, with underexpression of T- and B cell responses. In the transcriptome cohort, 20 critically injured patients later developed MODS. Compared with the 16 patients who did not develop MODS (NoMODS), maximal differential expression was seen within the hyperacute window. In MODS versus NoMODS, 363 genes were differentially expressed on admission, compared to only 33 at 24 hours postinjury. MODS transcripts differentially expressed in the hyperacute window showed enrichment among diseases and biological functions associated with cell survival and organismal death rather than inflammatory pathways. There was differential up-regulation of NK cell signalling pathways and markers in patients who would later develop MODS, with down-regulation of neutrophil deconvolution markers. This study is limited by its sample size, precluding more detailed analyses of drivers of the hyperacute response and different MODS phenotypes, and requires validation in other critically injured cohorts. Conclusions In this study, we showed how the hyperacute postinjury time window contained a focused, specific signature of the response to critical injury that led to widespread genomic activation. A transcriptomic signature for later development of MODS was present in this hyperacute window; it showed a strong signal for cell death and survival pathways and implicated NK cells and neutrophil populations in this differential response. PMID:28715416
"Observation Obscurer" - Time Series Viewer, Editor and Processor
NASA Astrophysics Data System (ADS)
Andronov, I. L.
The program is described, which contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows the user to view the data in the "time" or "phase" mode, to remove ("obscure") or filter outstanding bad points, to make scale transformations and smoothing using a few methods (e.g. mean with phase binning, determination of the statistically optimal number of phase bins, and the "running parabola" fit (Andronov, 1997, As. Ap. Suppl., 125, 207)), and to make time series analysis using some methods, e.g. correlation, autocorrelation and histogram analysis, determination of extrema, etc. Some features have been developed specially for variable star observers, e.g. the barycentric correction and the creation and fast analysis of "O-C" diagrams. The manual for "hot keys" is presented. The computer code was compiled with 32-bit Free Pascal (www.freepascal.org).
Window Performance in Extreme Cold,
1982-12-01
Icing and condensation observations were made on windows in Alaska at outdoor temperatures ranging between -40 and 20°F, as shown in Table 2, to find a good predictor of when moisture or ice would occur on a window pane. Observations were made during the daytime and covered window sashes and frames, including vapor-loose indoor sashes.
Computed Tomographic Window Setting for Bronchial Measurement to Guide Double-Lumen Tube Size.
Seo, Jeong-Hwa; Bae, Jinyoung; Paik, Hyesun; Koo, Chang-Hoon; Bahk, Jae-Hyon
2018-04-01
The bronchial diameter measured on computed tomography (CT) can be used to guide double-lumen tube (DLT) sizes objectively. The bronchus is known to be measured most accurately in the so-called bronchial CT window. The authors investigated whether using the bronchial window results in the selection of more appropriately sized DLTs than using the other windows. CT image analysis and prospective randomized study. Tertiary hospital. Adults receiving left-sided DLTs. The authors simulated selection of DLT sizes based on the left bronchial diameters measured in the lung (width 1,500 Hounsfield unit [HU] and level -700 HU), bronchial (1,000 HU and -450 HU), and mediastinal (400 HU and 25 HU) CT windows. Furthermore, patients were randomly assigned to undergo imaging with either the bronchial or mediastinal window to guide DLT sizes. Using the underwater seal technique, the authors assessed whether the DLT was appropriately sized, undersized, or oversized for the patient. On 130 CT images, the bronchial diameter (9.9 ± 1.2 mm v 10.5 ± 1.3 mm v 11.7 ± 1.3 mm) and the selected DLT size were different in the lung, bronchial, and mediastinal windows, respectively (p < 0.001). In 13 patients (17%), the bronchial diameter measured in the lung window suggested too small DLTs (28 Fr) for adults. In the prospective study, oversized tubes were chosen less frequently in the bronchial window than in the mediastinal window (6/110 v 23/111; risk ratio 0.38; 95% CI 0.19-0.79; p = 0.003). No tubes were undersized after measurements in these two windows. The bronchial measurement in the bronchial window guided more appropriately sized DLTs compared with the lung or mediastinal windows. Copyright © 2017 Elsevier Inc. All rights reserved.
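The width/level pairs quoted above map to displayed HU intervals by the standard relation [level − width/2, level + width/2]; a quick check of the three settings compared in the study:

```python
def hu_range(width, level):
    """HU interval displayed by a CT window of the given width and level."""
    return (level - width / 2, level + width / 2)

# The three window settings compared in the study:
lung        = hu_range(1500, -700)
bronchial   = hu_range(1000, -450)
mediastinal = hu_range(400, 25)
```

The narrower, airway-centred bronchial interval explains why bronchial-wall edges, and hence measured diameters, differ between the three displays.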
Time-series analysis of multiple foreign exchange rates using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Inoue, Masayoshi
2018-01-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in multiple foreign exchange rates. The time-dependent pattern entropy of 7 foreign exchange rates (AUD/USD, CAD/USD, CHF/USD, EUR/USD, GBP/USD, JPY/USD, and NZD/USD) was found to be high in the long period after the Lehman shock and low in the long period after March 2012. We compared the correlation matrices between exchange rates in periods of high and low time-dependent pattern entropy.
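A sketch of the method as described: binarize daily changes (up/down) and compute the Shannon entropy of length-m symbol patterns inside a sliding window. The window length and pattern order below are illustrative choices, not the authors' parameters:

```python
import math

def pattern_entropy(series, window=50, m=3):
    """Time-dependent pattern entropy: 1 = up day, 0 = down day, then the
    Shannon entropy (bits) of length-m patterns in each sliding window."""
    symbols = [1 if b > a else 0 for a, b in zip(series, series[1:])]
    out = []
    for start in range(len(symbols) - window + 1):
        win = symbols[start:start + window]
        counts = {}
        for i in range(window - m + 1):
            pat = tuple(win[i:i + m])
            counts[pat] = counts.get(pat, 0) + 1
        total = sum(counts.values())
        out.append(-sum(c / total * math.log2(c / total)
                        for c in counts.values()))
    return out
```

A monotone series yields zero entropy (one pattern only), while a strictly alternating series splits evenly between two patterns, giving exactly one bit.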
Novel windowing technique realized in FPGA for radar system
NASA Astrophysics Data System (ADS)
Escamilla-Hernandez, E.; Kravchenko, V. F.; Ponomaryov, V. I.; Ikuo, Arai
2006-02-01
To improve weak-target detection in radar applications, pulse compression is usually used, which in the case of linear FM modulation can improve the SNR. One drawback is that it can add range side-lobes in reflectivity measurements. Using weighting-window processing in the time domain, it is possible to decrease the side-lobe level (SLL) significantly and resolve small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, etc. are usually used in window processing. In addition to the classical ones, in this paper we also use a novel class of windows based on atomic function (AF) theory. For comparison of simulation and experimental results we applied the standard parameters, such as coefficient of amplification, maximum side-lobe level, width of the main lobe, etc. To implement the compression-windowing model at the hardware level, an FPGA has been employed. This work aims at demonstrating a reasonably flexible implementation of the FM-linear signal, pulse compression and windowing employing FPGAs. Classical and novel AF window techniques have been investigated to reduce the SLL, taking into account the noise influence and increasing the detection ability of small or weak targets in imaging radar. The paper presents the experimental hardware results of windowing in pulse-compression radar resolving several targets for rectangular, Hamming, Kaiser-Bessel and (see manuscript for formula) function windows. The windows created using atomic functions offer a considerably better decrease of the SLL in the presence of noise, and away from the main lobe, in comparison with classical windows.
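The side-lobe reduction can be illustrated numerically: compress a linear-FM pulse with a rectangular versus a Hamming-weighted reference and compare peak side-lobe levels. This is a simplified sketch; the atomic-function windows of the paper are not reproduced, and the pulse parameters are illustrative:

```python
import numpy as np

n = 128
t = np.arange(n)
chirp = np.exp(1j * np.pi * t**2 / (2 * n))   # linear-FM pulse, ~half-band sweep

def peak_sidelobe_db(ref):
    """Peak side-lobe level (dB) of the compressed pulse, main lobe excluded."""
    y = np.abs(np.correlate(chirp, ref, mode="full"))
    y /= y.max()
    peak = int(np.argmax(y))
    guard = 4                                  # samples excluded around the main lobe
    side = np.r_[y[:peak - guard], y[peak + guard + 1:]]
    return 20 * np.log10(side.max())

psl_rect = peak_sidelobe_db(chirp)                  # rectangular window
psl_hamm = peak_sidelobe_db(chirp * np.hamming(n))  # Hamming-weighted reference
```

The Hamming weighting trades a slightly wider main lobe for substantially lower side lobes, which is the mechanism that unmasks small targets next to strong ones.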
Hydrogen Safety Project: Chemical analysis support task. Window "E" analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T E; Campbell, J A; Hoppe, E W
1992-09-01
Core samples taken from tank 101-SY at Hanford during "window E" were analyzed for organic and radiochemical constituents by staff of the Analytical Chemistry Laboratory at Pacific Northwest Laboratory. Westinghouse Hanford Company submitted these samples to the laboratory.
NASA Technical Reports Server (NTRS)
Lee, Nathaniel; Welch, Bryan W.
2018-01-01
NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally requires add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan
2015-03-01
In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak-shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks) must be handled to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that the Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed the compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
Alternative Fuels Data Center: Kentucky Charges Forward with All-Electric
Analysis of photonic Doppler velocimetry data based on the continuous wavelet transform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu Shouxian; Wang Detian; Li Tao
2011-02-15
The short-time Fourier transform (STFT) cannot resolve rapid velocity changes in most photonic Doppler velocimetry (PDV) data. A practical analysis method based on the continuous wavelet transform (CWT) was presented to overcome this difficulty. The adaptability of the wavelet family means that the continuous wavelet transform uses an adaptive time window to estimate the instantaneous frequency of signals. The local frequencies of the signal are accurately determined by finding the ridge in the spectrogram of the CWT and are then converted to target velocity according to the Doppler effect. A performance comparison between the CWT and STFT is demonstrated with data from a plate-impact experiment. The results illustrate that the new method is automatic and adequate for analysis of PDV data.
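The ridge-extraction idea can be sketched with a minimal Morlet-style CWT in NumPy: correlate the signal with atoms at trial frequencies and take the per-time argmax of the coefficient magnitude. The atom width (~5 cycles) and normalization are illustrative choices, not those of the paper:

```python
import numpy as np

def cwt_ridge(signal, fs, freqs):
    """Minimal Morlet CWT ridge: for each trial frequency, correlate the signal
    with a complex Gaussian-windowed atom; the per-time argmax of |coefficients|
    estimates the instantaneous frequency."""
    n = len(signal)
    coeffs = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        sigma = 5.0 / (2 * np.pi * f)            # ~5 cycles per atom
        half = int(4 * sigma * fs)
        tt = np.arange(-half, half + 1) / fs
        atom = np.exp(2j * np.pi * f * tt) * np.exp(-tt**2 / (2 * sigma**2))
        atom /= np.sqrt(sigma)                   # scale normalization
        coeffs[i] = np.abs(np.convolve(signal, atom, mode="same"))
    return freqs[np.argmax(coeffs, axis=0)]
```

Because the atom length adapts to frequency, time resolution improves at high frequencies, which is what lets the CWT follow velocity jumps the fixed-window STFT smears out. In PDV, the ridge frequency would then be scaled to velocity via the Doppler relation.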
Environment, susceptibility windows, development and child health
Wright, Robert O
2017-01-01
Purpose To illustrate the role of the exposome in child health while highlighting unique aspects of this research pertinent to children, such as the time dependency of environmental exposures on fetal programming, as well as the time dependent nature of child behavior, diet, and motor function, which alter the probability of exposure to different compounds. Future environmental health research will be more hypothesis generating but will also need to heed lessons learned from other “omic” sciences. The NIH Child Health Environmental Analysis Resource (CHEAR) is a major step towards providing the infrastructure needed to study the exposome and child health. Recent Findings Environmental exposures have overlapping mechanisms such as endocrine disruption and oxidative stress among others. The nature of the long term health impact of an exposure is dependent not only on dose, but also on the timing of exposure. Advances in exposure science, toxicology and biostatistics will create new opportunities to identify and better define windows of susceptibility to environmental exposures. Summary As exposure science matures, we will better understand the role of environment on health. Linking the exposome with genomics will unlock the root origins of multiple complex diseases. PMID:28107208
Artes, Paul H; McLeod, David; Henson, David B
2002-01-01
To report on differences between the latency distributions of responses to stimuli and to false-positive catch trials in suprathreshold perimetry. To describe an algorithm for defining response time windows and to report on its performance in discriminating between true- and false-positive responses on the basis of response time (RT). A sample of 435 largely inexperienced patients underwent suprathreshold visual field examination on a perimeter that was modified to record RTs. Data were analyzed from 60,500 responses to suprathreshold stimuli and from 523 false-positive responses to catch trials. False-positive responses had much more variable latencies than responses to suprathreshold stimuli. An algorithm defining RT windows on the basis of z-transformed individual latency samples correctly identified more than 70% of false-positive responses to catch trials, whereas fewer than 3% of responses to suprathreshold stimuli were classified as false-positive responses. Latency analysis can be used to detect a substantial proportion of false-positive responses in suprathreshold perimetry. Rejection of such responses may increase the reliability of visual field screening by reducing variability and bias in a small but clinically important proportion of patients.
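The z-transform idea can be sketched as follows. This is a hypothetical simplification (the threshold value and function names are not from the study, whose windowing rule is based on individual latency samples and is more involved):

```python
import statistics

def rt_window(rts, z=2.5):
    """Per-subject response-time window: mean ± z standard deviations of that
    subject's own latency sample. z = 2.5 is an illustrative threshold."""
    mu = statistics.mean(rts)
    sd = statistics.stdev(rts)
    return mu - z * sd, mu + z * sd

def classify(rts, response_time, z=2.5):
    """Flag a response as false-positive if its latency falls outside the window."""
    lo, hi = rt_window(rts, z)
    return "true-positive" if lo <= response_time <= hi else "false-positive"
```

Because false positives have far more variable latencies than genuine responses to stimuli, most of them fall outside such a window while almost all true responses fall inside it.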
Arcjet exploratory tests of ARC optical window design for the AFE vehicle
NASA Technical Reports Server (NTRS)
Whiting, Ellis E.; Terrazas-Salinas, Imelda; Craig, Roger A.; Sobeck, Charles K.; Sarver, George L., III; Salerno, Louis J.; Love, Wendell; Maa, Scott; Covington, AL
1991-01-01
Tests were made in the 20 MW arc jet facility at the NASA ARC to determine the suitability of sapphire and fused silica as window materials for the Aeroassist Flight Experiment (AFE) entry vehicle. Twenty-nine tests were made: 25 at a heating rate about 80 percent of that expected during the AFE entry and 4 at approximately the full, 100 percent AFE heating rate profile, which produces a temperature of about 2900 F on the surface of the tiles that protect the vehicle. These tests show that a conductively cooled window design using mechanical thermal contacts and sapphire is probably not practical. Cooling the window using mechanical thermal contacts produces thermal stresses in the sapphire that cause the window to crack. An insulated design using sapphire, which cools the window as little as possible, appears promising, although some spectral data in the vacuum-ultraviolet (VUV) will be lost due to the high temperature reached by the sapphire. The surfaces of the insulated sapphire windows, tested at the 100 percent AFE heating rate, showed some slight ablation, and cracks appeared in two of three test windows. One small group of cracks was obviously caused by mechanical binding of the window in the assembly, which can be eliminated with improved design. Other cracks were long, straight, thin crystallographic cracks that have very little effect on the optical transmission of the window. Also, the windows did not fall apart along these crystallographic cracks when the windows were removed from their assemblies. Theoretical results from the thermal analysis computer program SINDA indicate that increasing the window thickness from 4 to 8 mm may enable surface ablation to be avoided. An insulated design using a fused silica window tested at the nominal AFE heating rate experienced severe ablation; thus, fused silica is not considered to be an acceptable window material.
NASA Astrophysics Data System (ADS)
Lanka, Karthikeyan; Pan, Ming; Konings, Alexandra; Piles, María; Nagesh Kumar, D.; Wood, Eric
2017-04-01
Traditionally, passive microwave retrieval algorithms such as the Land Parameter Retrieval Model (LPRM) estimate soil moisture and Vegetation Optical Depth (VOD) simultaneously from brightness temperature (Tb) data. The algorithm requires a surface roughness parameter which, despite its implications, is generally assumed to be constant at the global scale. Due to inherent noise in the satellite data and retrieval algorithm, the VOD retrievals are usually observed to fluctuate strongly at the daily scale, which may not occur in reality. Such noisy VOD retrievals, along with a spatially invariant roughness parameter, may affect the quality of soil moisture retrievals. The current work aims to smooth the VOD retrievals (under the assumption that VOD remains constant over a period of time) and simultaneously generate, for the first time, a global surface roughness map using multiple descending X-band Tb observations of AMSR-E. The methodology utilizes Tb values in a moving-time-window setup to estimate concurrently the soil moisture of each day and a constant VOD in the window. Prior to this step, the surface roughness parameter is estimated using the complete time series of the Tb record. After carrying out the necessary sensitivity analysis, the smoothed VOD along with soil moisture retrievals is generated for the 10-year duration of AMSR-E (2002-2011) with a 7-day moving window using the LPRM framework. The spatial patterns of the resulting global VOD maps are coherent with vegetation biomass and climate conditions. The VOD results also exhibit a smoothing effect in terms of lower standard deviation values. This is also evident from a time series comparison, at several grid locations across the globe, of the new VOD against LPRM VOD retrievals without optimization over moving windows. The global surface roughness map also exhibited spatial patterns that are strongly influenced by topography and land-use conditions.
Noticeable features include high roughness over mountainous regions and heavily vegetated tropical rainforests, low roughness in desert areas and moderate roughness values over higher latitudes. The new datasets of VOD and surface roughness can help improve the quality of soil moisture retrievals. Moreover, the proposed methodology is generic in nature and can be implemented for the currently operating AMSR2, SMOS, and SMAP soil moisture missions.
Input-output characterization of an ultrasonic testing system by digital signal analysis
NASA Technical Reports Server (NTRS)
Williams, J. H., Jr.; Lee, S. S.; Karagulle, H.
1986-01-01
Ultrasonic test system input-output characteristics were investigated by directly coupling the transmitting and receiving transducers face to face without a test specimen. Some of the fundamentals of digital signal processing were summarized. Input and output signals were digitized by using a digital oscilloscope, and the digitized data were processed in a microcomputer by using digital signal-processing techniques. The continuous-time test system was modeled as a discrete-time, linear, shift-invariant system. In estimating the unit-sample response and frequency response of the discrete-time system, it was necessary to use digital filtering to remove low-amplitude noise, which interfered with deconvolution calculations. A digital bandpass filter constructed with the assistance of a Blackman window and a rectangular time window were used. Approximations of the impulse response and the frequency response of the continuous-time test system were obtained by linearly interpolating the defining points of the unit-sample response and the frequency response of the discrete-time system. The test system behaved as a linear-phase bandpass filter in the frequency range 0.6 to 2.3 MHz. These frequencies were selected in accordance with the criterion that they were 6 dB below the maximum peak of the amplitude of the frequency response. The output of the system to various inputs was predicted and the results were compared with the corresponding measurements on the system.
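The frequency-response estimation step under the discrete-time LSI model can be sketched as follows. The Blackman window and the small regularizer `eps` stand in for the digital filtering the authors used to keep low-amplitude noise from corrupting the deconvolution; both are illustrative choices:

```python
import numpy as np

def freq_response(x, y, eps=1e-6):
    """Estimate H(f) = Y(f)/X(f) for a discrete-time LSI system model.
    A Blackman window limits spectral leakage; eps keeps near-zero
    spectral bins from blowing up the division (illustrative value)."""
    w = np.blackman(len(x))
    X = np.fft.rfft(x * w)
    Y = np.fft.rfft(y * w)
    return Y * np.conj(X) / (np.abs(X) ** 2 + eps)
```

For a trivial system that scales its input by 0.5, the estimate recovers a flat 0.5 gain; for a real transducer pair, the magnitude of `H` would show the band-pass behavior reported above (0.6 to 2.3 MHz at the -6 dB points).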
TIGER: Turbomachinery interactive grid generation
NASA Technical Reports Server (NTRS)
Soni, Bharat K.; Shih, Ming-Hsin; Janus, J. Mark
1992-01-01
A three-dimensional, interactive grid generation code, TIGER, is being developed for analysis of flows around ducted or unducted propellers. TIGER is a customized grid generator that combines new technology with methods from general grid generation codes. The code generates multiple-block, structured grids around multiple blade rows with a hub and shroud for either C-grid or H-grid topologies. The code is intended for use with a Euler/Navier-Stokes solver also being developed, but is general enough for use with other flow solvers. TIGER features a Silicon Graphics interactive graphics environment that displays a pop-up window, graphics window, and text window. The geometry is read as a discrete set of points, with options for several industry-standard formats and NASA standard formats. Various splines are available for defining the surface geometries. Grid generation is done either interactively or through a batch-mode operation using history files from a previously generated grid. The batch-mode operation can be done either with a graphical display of the interactive session or with no graphics, so that the code can be run on another computer system. Run time can be significantly reduced by running on a Cray-YMP.
Koštiaková, Vladimíra; Moleti, Arturo; Wimmerová, Soňa; Jusko, Todd A; Palkovičová Murínová, Ľubica; Sisto, Renata; Richterová, Denisa; Kováč, Ján; Čonka, Kamil; Patayová, Henrieta; Tihányi, Juraj; Trnovec, Tomáš
2016-10-01
The study aim was to identify the timing of sensitive windows for ototoxicity related to perinatal exposure to PCBs. A total of 351 and 214 children from a birth cohort in eastern Slovakia underwent otoacoustic testing at 45 and 72 months, respectively, and distortion product otoacoustic emissions (DPOAEs) at 11 frequencies were recorded. Cord and child 6-, 16-, 45-, and 72- month blood samples were analyzed for PCB 153 concentration. The PCB 153 concentration-time profiles were approximated with a system model to calculate area under the PCB*time curves (AUCs) for specific time intervals (3 and 6 months for 45 and 72 months data, respectively). DPOAE amplitudes were correlated (Spearman) with cord serum PCB and AUCs, markers of prenatal and postnatal exposure, respectively. Two exposure critical windows were identified in infants, the first related to prenatal and early postnatal and the second to postnatal exposure to PCBs. Our data have shown tonotopicity, sexual dimorphism, and asymmetry in ototoxicity of PCBs. Copyright © 2016. Published by Elsevier Ltd.
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
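The inflection-point rule can be illustrated compactly with SciPy's smoothing spline: fit, evaluate the second derivative on a dense grid, and take sign changes as window boundaries. The grid density and smoothing factor below are illustrative; the paper tunes smoothing to the data:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def window_boundaries(pos, stat, smooth=None):
    """Fit a cubic smoothing spline to (position, statistic) data and return
    the positions where the spline's second derivative changes sign, i.e.
    the inflection points used as boundaries of adjacent windows."""
    spl = UnivariateSpline(pos, stat, k=3, s=smooth)
    grid = np.linspace(pos[0], pos[-1], 2000)
    d2 = spl.derivative(2)(grid)
    crossings = np.where(np.diff(np.sign(d2)) != 0)[0]
    return grid[crossings]
```

On a smooth test curve such as a sine wave, the only interior inflection point is recovered; on genomic data, boundaries would instead fall wherever the windowed statistic bends, so window sizes vary along the genome.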
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
Real-time image sequence segmentation using curve evolution
NASA Astrophysics Data System (ADS)
Zhang, Jun; Liu, Weisong
2001-04-01
In this paper, we describe a novel approach to image sequence segmentation and its real-time implementation. This approach uses the 3D structure tensor to produce a more robust frame difference signal and uses curve evolution to extract whole objects. Our algorithm is implemented on a standard PC running the Windows operating system with video capture from a USB camera that is a standard Windows video capture device. Using the Windows standard video I/O functionalities, our segmentation software is highly portable and easy to maintain and upgrade. In its current implementation on a Pentium 400, the system can perform segmentation at 5 frames/sec with a frame resolution of 160 by 120.
Sobolik, Tammy; Su, Ying-Jun; Ashby, Will; Schaffer, David K.; Wells, Sam; Wikswo, John P.; Zijlstra, Andries; Richmond, Ann
2016-01-01
We developed mammary imaging windows (MIWs) to evaluate leukocyte infiltration and cancer cell dissemination in mouse mammary tumors imaged by confocal microscopy. Previous techniques relied on surgical resection of a skin flap to image the tumor microenvironment restricting imaging time to a few hours. Utilization of mammary imaging windows offers extension of intravital imaging of the tumor microenvironment. We have characterized strengths and identified some previously undescribed potential weaknesses of MIW techniques. Through iterative enhancements of a transdermal portal we defined conditions for improved quality and extended confocal imaging time for imaging key cell-cell interactions in the tumor microenvironment. PMID:28243517
Sobolik, Tammy; Su, Ying-Jun; Ashby, Will; Schaffer, David K; Wells, Sam; Wikswo, John P; Zijlstra, Andries; Richmond, Ann
2016-01-01
We developed mammary imaging windows (MIWs) to evaluate leukocyte infiltration and cancer cell dissemination in mouse mammary tumors imaged by confocal microscopy. Previous techniques relied on surgical resection of a skin flap to image the tumor microenvironment restricting imaging time to a few hours. Utilization of mammary imaging windows offers extension of intravital imaging of the tumor microenvironment. We have characterized strengths and identified some previously undescribed potential weaknesses of MIW techniques. Through iterative enhancements of a transdermal portal we defined conditions for improved quality and extended confocal imaging time for imaging key cell-cell interactions in the tumor microenvironment.
Latitudinal and photic effects on diel foraging and predation risk in freshwater pelagic ecosystems
Hansen, Adam G.; Beauchamp, David A.
2014-01-01
1. Clark & Levy (American Naturalist, 131, 1988, 271–290) described an antipredation window for smaller planktivorous fish during crepuscular periods when light permits feeding on zooplankton, but limits visual detection by piscivores. Yet, how the window is influenced by the interaction between light regime, turbidity and cloud cover over a broad latitudinal gradient remains unexplored. 2. We evaluated how latitudinal and seasonal shifts in diel light regimes alter the foraging-risk environment for visually feeding planktivores and piscivores across a natural range of turbidities and cloud covers. Pairing a model of aquatic visual feeding with a model of sun and moon illuminance, we estimated foraging rates of an idealized planktivore and piscivore over depth and time across factorial combinations of latitude (0–70°), turbidity (0.1–5 NTU) and cloud cover (clear to overcast skies) during the summer solstice and autumnal equinox. We evaluated the foraging-risk environment based on changes in the magnitude, duration and peak timing of the antipredation window. 3. The model scenarios generated up to 10-fold shifts in magnitude, 24-fold shifts in duration and 5.5-h shifts in timing of the peak antipredation window. The size of the window increased with latitude. This pattern was strongest during the solstice. In clear water at low turbidity (0.1–0.5 NTU), peaks in the magnitude and duration of the window formed at 57–60° latitude, before falling to near zero as surface waters became saturated with light under a midnight sun and clear skies at latitudes near 70°. Overcast skies dampened the midnight sun enough to allow larger windows to form in clear water at high latitudes. Conversely, at turbidities ≥2 NTU, greater reductions in the visual range of piscivores than planktivores created a window for long periods at high latitudes. Latitudinal dependencies were essentially lost during the equinox, indicating a progressive compression of the window from early summer into autumn. 4. Model results show that diel-seasonal foraging and predation risk in freshwater pelagic ecosystems changes considerably with latitude, turbidity and cloud cover. These changes alter the structure of pelagic predator–prey interactions, and in turn, the broader role of pelagic consumers in habitat coupling in lakes.
High-Temperature Optical Window Design
NASA Technical Reports Server (NTRS)
Roeloffs, Norman; Taranto, Nick
1995-01-01
A high-temperature optical window is essential to the optical diagnostics of high-temperature combustion rigs. Laser Doppler velocimetry, schlieren photography, light sheet visualization, and laser-induced fluorescence spectroscopy are a few of the tests that require optically clear access to the combustor flow stream. A design was developed for a high-temperature window that could withstand the severe environment of the NASA Lewis 3200 F Lean Premixed Prevaporized (LPP) Flame Tube Test Rig. The development of this design was both time consuming and costly. This report documents the design process and the lessons learned, in an effort to reduce the cost of developing future designs for high-temperature optical windows.
Choy, G.L.; Boatwright, J.
2007-01-01
The rupture process of the Mw 9.1 Sumatra-Andaman earthquake lasted for approximately 500 sec, nearly twice as long as the teleseismic time windows between the P and PP arrival times generally used to compute radiated energy. In order to measure the P waves radiated by the entire earthquake, we analyze records that extend from the P-wave to the S-wave arrival times from stations at distances Δ > 60°. These 8- to 10-min windows contain the PP, PPP, and ScP arrivals, along with other multiply reflected phases. To gauge the effect of including these additional phases, we form the spectral ratio of the source spectrum estimated from extended windows (between TP and TS) to the source spectrum estimated from normal windows (between TP and TPP). The extended windows are analyzed as though they contained only the P-pP-sP wave group. We analyze four smaller earthquakes that occurred in the vicinity of the Mw 9.1 mainshock, with similar depths and focal mechanisms. These smaller events range in magnitude from an Mw 6.0 aftershock of 9 January 2005 to the Mw 8.6 Nias earthquake that occurred to the south of the Sumatra-Andaman earthquake on 28 March 2005. We average the spectral ratios for these four events to obtain a frequency-dependent operator for the extended windows. We then correct the source spectrum estimated from the extended records of the 26 December 2004 mainshock to obtain a complete or corrected source spectrum for the entire rupture process (~600 sec) of the great Sumatra-Andaman earthquake. Our estimate of the total seismic energy radiated by this earthquake is 1.4 × 10^17 J. When we compare the corrected source spectrum for the entire earthquake to the source spectrum from the first ~250 sec of the rupture process (obtained from normal teleseismic windows), we find that the mainshock radiated much more seismic energy in the first half of the rupture process than in the second half, especially over the period range from 3 sec to 40 sec.
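The correction scheme described above (average the extended-to-normal spectral ratios of calibration events, then divide the mainshock's extended-window spectrum by that operator) can be sketched as follows; the array contents are placeholders, not real spectra:

```python
import numpy as np

def window_correction(ext_specs, norm_specs):
    """Average spectral ratio (extended / normal window) over calibration
    events, giving a frequency-dependent operator for extended windows."""
    ratios = [e / n for e, n in zip(ext_specs, norm_specs)]
    return np.mean(ratios, axis=0)

def correct_spectrum(mainshock_ext_spec, operator):
    """Divide the mainshock's extended-window source spectrum by the
    operator to estimate the spectrum for the full rupture."""
    return mainshock_ext_spec / operator
```

With real data, each spectrum would be sampled on a common frequency grid before forming the ratios.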
Chang, Andrew; Eastwood, Hayden; Sly, David; James, David; Richardson, Rachael; O'Leary, Stephen
2009-09-01
To protect hearing in an experimental model of cochlear implantation by the application of dexamethasone to the round window prior to surgery. The present study examined the dosage and timing relationships required to optimise the hearing protection. Dexamethasone or saline (control) was absorbed into a pledget of carboxymethylcellulose and hyaluronic acid and applied to the round window of the guinea pig prior to cochlear implantation. The treatment groups were 2% w/v dexamethasone for 30, 60 and 120 min, and 20% dexamethasone applied for 30 min. Auditory sensitivity was determined pre-operatively, and at 1 week after surgery, with pure-tone auditory brainstem response audiometry (2–32 kHz). Cochlear implantation was performed via a cochleostomy drilled into the basal turn of the cochlea, into which a miniature cochlear implant dummy electrode was inserted using soft-surgery techniques. ABR thresholds were elevated after cochlear implantation, maximally at 32 kHz and to a lesser extent at lower frequencies. Thresholds were less elevated after dexamethasone treatment, and the hearing protection improved when 2% dexamethasone was applied to the round window for longer periods of time prior to implantation. The time that dexamethasone needs to be applied to achieve hearing protection could be reduced by increasing the concentration of steroid, with a 20% application for 30 min achieving similar levels of protection to a 60 min application of 2% dexamethasone. Hearing protection is improved by increasing the time that dexamethasone is applied to the round window prior to cochlear implantation, and the waiting time can be reduced by increasing the steroid concentration. These results suggest that the diffusion of dexamethasone through the cochlea is the prime determinant of the extent of hearing protection.
Vendemia, Nicholas; Chao, Jerry; Ivanidze, Jana; Sanelli, Pina; Spinelli, Henry M
2011-01-01
Medpor (Porex Surgical, Inc, Newnan, GA) is composed of porous polyethylene and is commonly used in craniofacial reconstruction. When complications such as seroma or abscess formation arise, diagnostic modalities are limited because Medpor is radiolucent on conventional radiologic studies. This poses a problem in situations where imaging is necessary to distinguish the implant from surrounding tissues. To present a clinically useful method for imaging Medpor with conventional computed tomographic (CT) scanning. Eleven patients (12 total implants) who had undergone reconstructive surgery with Medpor were included in the study. A retrospective review of CT scans done between 1 and 16 months postoperatively was performed using 3 distinct CT window settings. Measurements of implant dimensions and Hounsfield units were recorded and qualitatively assessed. Of the 3 distinct window settings studied, namely, "bone" (W1100/L450), "soft tissue" (W500/L50), and "implant" (W800/L200), the implant window proved the most suitable, allowing the investigators to visualize and evaluate Medpor in all cases. Qualitative analysis revealed that Medpor implants were able to be distinguished from surrounding tissue in both the implant and soft tissue windows, with a density falling between that of fat and fluid. In 1 case, Medpor could not be visualized in the soft tissue window, although it could be visualized in the implant window. Quantitative analysis demonstrated a mean (SD) density of -38.7 (7.4) Hounsfield units. Medpor may be optimally visualized on conventional CT scans using the implant window settings W800/L200, which can aid in imaging Medpor and diagnosing implant-related complications.
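The window settings quoted above follow the usual width/level (W/L) convention, which maps a band of Hounsfield units onto the display range. A minimal sketch of that mapping, assuming 8-bit display output:

```python
import numpy as np

def apply_ct_window(hu, width, level):
    """Map Hounsfield units to 0-255 display values for a CT window of
    given width (W) and level (L), e.g. the 'implant' window W800/L200:
    values below L - W/2 clip to black, above L + W/2 to white."""
    lo = level - width / 2.0
    hi = level + width / 2.0
    scaled = (np.asarray(hu, dtype=float) - lo) / (hi - lo) * 255.0
    return np.clip(scaled, 0, 255)
```

Under W800/L200 the displayed band runs from -200 to +600 HU, which is why an implant averaging about -38.7 HU falls comfortably inside the visible range.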
Launch window analysis of satellites in high eccentricity or large circular orbits
NASA Technical Reports Server (NTRS)
Renard, M. L.; Bhate, S. K.; Sridharan, R.
1973-01-01
Numerical methods and computer programs for studying the stability and evolution of orbits of large eccentricity are presented. Methods for determining launch windows and target dates are developed. Mathematical models are prepared to analyze the characteristics of specific missions.
Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.
Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio
2017-01-01
We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window. We can thus conclude that iQSA is better suited to developing real-time applications.
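The windowed feature extraction described above can be sketched roughly as follows; the homogeneity and contrast definitions here are illustrative stand-ins, since the abstract does not give the exact formulas:

```python
import numpy as np

def window_features(signal, fs, win_s):
    """Per-window summary features (mean, variance, and simple
    homogeneity/contrast proxies) over non-overlapping windows of
    win_s seconds at sampling rate fs. Definitions are illustrative,
    not the iQSA paper's exact feature formulas."""
    n = int(fs * win_s)                       # samples per window
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = np.asarray(signal[start:start + n], dtype=float)
        diffs = np.abs(np.diff(w))            # sample-to-sample steps
        feats.append({
            "mean": w.mean(),
            "variance": w.var(),
            "contrast": diffs.mean(),         # mean absolute step
            "homogeneity": 1.0 / (1.0 + diffs.mean()),
        })
    return feats
```

Each feature dictionary would then feed the boosted decision-tree classifier stage.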
Formal analysis and evaluation of the back-off procedure in IEEE802.11P VANET
NASA Astrophysics Data System (ADS)
Jin, Li; Zhang, Guoan; Zhu, Xiaojun
2017-07-01
The back-off procedure is one of the media access control technologies in 802.11P communication protocol. It plays an important role in avoiding message collisions and allocating channel resources. Formal methods are effective approaches for studying the performances of communication systems. In this paper, we establish a discrete time model for the back-off procedure. We use Markov Decision Processes (MDPs) to model the non-deterministic and probabilistic behaviors of the procedure, and use the probabilistic computation tree logic (PCTL) language to express different properties, which ensure that the discrete time model performs their basic functionality. Based on the model and PCTL specifications, we study the effect of contention window length on the number of senders in the neighborhood of given receivers, and that on the station’s expected cost required by the back-off procedure to successfully send packets. The variation of the window length may increase or decrease the maximum probability of correct transmissions within a time contention unit. We propose to use PRISM model checker to describe our proposed back-off procedure for IEEE802.11P protocol in vehicle network, and define different probability properties formulas to automatically verify the model and derive numerical results. The obtained results are helpful for justifying the values of the time contention unit.
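The role of the contention window can be illustrated with a toy Monte-Carlo estimate of slot-collision probability among neighboring senders. This is a simplification for intuition only, not the paper's MDP/PCTL analysis in PRISM:

```python
import random

def collision_probability(n_senders, cw, trials=20000, seed=1):
    """Estimate the chance that two or more of n_senders draw the same
    back-off slot from a contention window of cw slots (a toy model of
    the 802.11p back-off draw; real stations also freeze and resume
    their counters, which this sketch ignores)."""
    random.seed(seed)
    collisions = 0
    for _ in range(trials):
        slots = [random.randrange(cw) for _ in range(n_senders)]
        if len(set(slots)) < n_senders:   # any shared slot -> collision
            collisions += 1
    return collisions / trials
```

Even this toy shows the trade-off the paper formalizes: widening the window lowers collision probability but lengthens the expected wait before sending.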
Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method
Batres-Mendoza, Patricia; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Montoro-Sanjose, Carlos R.
2017-01-01
We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window. We can thus conclude that iQSA is better suited to developing real-time applications. PMID:29348744
Delay of cognitive gamma responses in Alzheimer's disease
Başar, Erol; Emek-Savaş, Derya Durusu; Güntekin, Bahar; Yener, Görsev G.
2016-01-01
Event-related oscillations (EROs) reflect cognitive brain dynamics, while sensory-evoked oscillations (SEOs) reflect sensory activities. Previous reports from our lab have shown that those with Alzheimer's disease (AD) or mild cognitive impairment (MCI) have decreased activity and/or coherence in delta, theta, alpha and beta cognitive responses. In the current study, we investigated gamma responses in visual SEO and ERO in 15 patients with AD and in 15 age-, gender- and education-matched healthy controls. The following parameters were analyzed over the parietal-occipital regions in both groups: (i) latency of the maximum gamma response over a 0–800 ms time window; (ii) the maximum peak-to-peak amplitudes for each participant's averaged SEO and ERO gamma responses in 3 frequency ranges (25–30, 30–35, 40–48 Hz); and (iii) the maximum peak-to-peak amplitudes for each participant's averaged SEO and ERO gamma responses over a 0–800 ms time block containing four divided time windows (0–200, 200–400, 400–600, and 600–800 ms). There were main group effects in terms of both latency and peak-to-peak amplitudes of gamma ERO. However, peak-to-peak gamma ERO amplitude differences became noticeable only when the time block was divided into four time windows. SEO amplitudes in the 25–30 Hz frequency range of the 0–200 ms time window over the left hemisphere were greater in the healthy controls than in those with AD. Gamma target ERO latency was delayed up to 138 ms in AD patients when compared to healthy controls. This finding may be an effect of lagged neural signaling in cognitive circuits, which is reflected by the delayed gamma responses in those with AD. Based on the results of this study, we propose that gamma responses should be examined in a more detailed fashion using multiple frequency and time windows. PMID:26937378
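Computing the maximum peak-to-peak amplitude inside divided time windows, as described above, might look like this (a sketch assuming a fixed sampling rate, not the authors' analysis pipeline):

```python
import numpy as np

def peak_to_peak_by_window(x, fs, windows_ms):
    """Maximum peak-to-peak amplitude of a response x (sampled at fs Hz)
    inside each (start, stop) window given in milliseconds, e.g. the
    0-200, 200-400, 400-600, and 600-800 ms windows above."""
    x = np.asarray(x, dtype=float)
    out = []
    for start, stop in windows_ms:
        i0 = int(start / 1000.0 * fs)     # window start sample
        i1 = int(stop / 1000.0 * fs)      # window end sample
        seg = x[i0:i1]
        out.append(seg.max() - seg.min())
    return out
```

In practice the signal would first be band-pass filtered into the gamma sub-band of interest (e.g. 25-30 Hz) before this step.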
Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin
2016-09-02
For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window with a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10^-5 in optimized correction and 1.427 × 10^-5 in un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.
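A plausible sketch of the metric function named above, the low spatial frequency content of the image spectral density, is the fraction of spectral power inside a small disc around DC; the disc radius here is an assumed parameter, not one from the paper:

```python
import numpy as np

def low_freq_sharpness(img, radius=8):
    """Fraction of image spectral power inside a low-frequency disc of
    the centered 2D power spectrum (an illustrative stand-in for the
    'low spatial frequency content' metric; radius is assumed)."""
    f = np.fft.fftshift(np.fft.fft2(np.asarray(img, dtype=float)))
    power = np.abs(f) ** 2
    h, w = power.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - h // 2) ** 2 + (xx - w // 2) ** 2 <= radius ** 2
    return power[mask].sum() / power.sum()
```

A model-based WSLAO loop would evaluate such a scalar metric after each trial mode perturbation, which is why shrinking the image size shortens the per-iteration cost.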
Beyond the Time Window of Intravenous Thrombolysis: Standing by or by Stenting?
Liu, Xinfeng
2012-01-01
Intravenous administration of tissue plasminogen activator within 4.5 h of symptom onset is presently the 'golden rule' for treating acute ischemic stroke. However, many patients miss the time window and others are excluded from this treatment due to a long list of contraindications. Mechanical embolectomy has recently progressed as a potential alternative for treating patients beyond the time window for IV thrombolysis. In this paper, recent progress in mechanical embolectomy, angioplasty, and stenting in acute stroke is reviewed. Despite worries concerning the long-term clinical outcomes and increased risk of intracranial hemorrhage, favorable clinical outcomes may be achieved after mechanical embolectomy in carefully selected patients even 4.5 h after stroke onset. Potential steps should be prepared and attempted in these patients, whose window of opportunity for recovery may quickly elapse. PMID:25187761
A refined technique to calculate finite helical axes from rigid body trackers.
McLachlin, Stewart D; Ferreira, Louis M; Dunning, Cynthia E
2014-12-01
Finite helical axes (FHAs) are a potentially effective tool for joint kinematic analysis. Unfortunately, no straightforward guidelines exist for calculating accurate FHAs using prepackaged six degree-of-freedom (6 DOF) rigid body trackers. Thus, this study aimed to: (1) describe a protocol for calculating FHA parameters from 6 DOF rigid body trackers using the screw matrix and (2) to maximize the number of accurate FHAs generated from a given data set using a moving window analysis. Four Optotrak® Smart Markers were used as the rigid body trackers, two moving and two fixed, at different distances from the hinge joint of a custom-machined jig. 6 DOF pose information was generated from 51 static positions of the jig rotated and fixed in 0.5 deg increments up to 25 deg. Output metrics included the FHA direction cosines, the rotation about the FHA, the translation along the axis, and the intercept of the FHA with the plane normal to the jig's hinge joint. FHA metrics were calculated using the relative tracker rotation from the starting position, and using a moving window analysis to define a minimum acceptable rotational displacement between the moving tracker data points. Data analysis found all FHA rotations calculated from the starting position were within 0.15 deg of the prescribed jig rotation. FHA intercepts were most stable when determined using trackers closest to the hinge axis. Increasing the moving window size improved the FHA direction cosines and center of rotation accuracy. Window sizes larger than 2 deg had an intercept deviation of less than 1 mm. Furthermore, compared to the 0 deg window size, the 2 deg window had a 90% improvement in FHA intercept precision while generating almost an equivalent number of FHA axes. This work identified a solution to improve FHA calculations for biomechanical researchers looking to describe changes in 3D joint motion.
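The screw decomposition underlying an FHA calculation can be sketched as follows for the relative pose (R, t) between two tracker frames. The axis-intercept step is omitted for brevity, and this is a textbook decomposition rather than the authors' exact protocol:

```python
import numpy as np

def finite_helical_axis(R, t):
    """Finite helical axis from the relative pose (R, t) between two
    tracker frames: unit axis n, rotation angle (rad), and translation
    along the axis. Axis extraction from the skew-symmetric part of R
    is valid away from rotation angles of 0 and pi."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    n = n / (2.0 * np.sin(angle))      # normalize to a unit axis
    slide = float(np.dot(n, t))        # translation along the axis
    return n, angle, slide
```

The near-zero-angle singularity is exactly why the moving window analysis described above enforces a minimum rotational displacement between the tracker poses used for each FHA.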
Long, Tom; Johnson, Ted; Ollison, Will
2004-07-01
Air pollution exposures in the motor vehicle cabin are significantly affected by air exchange rate, a function of vehicle speed, window position, vent status, fan speed, and air conditioning use. A pilot study conducted in Houston, Texas, during September 2000 demonstrated that useful information concerning the position of windows, sunroofs, and convertible tops as a function of temperature and vehicle speed could be obtained through the use of video recorders. To obtain similar data representing a wide range of temperature and traffic conditions, a follow-up study was conducted in and around Chapel Hill, North Carolina at five sites representing a central business district, an arterial road, a low-income commercial district, an interstate highway, and a rural road. Each site permitted an elevated view of vehicles as they proceeded through a turn, thereby exposing all windows to the stationary camcorder. A total of 32 videotaping sessions were conducted between February and October 2001, in which temperature varied from 41 degrees F to 93 degrees F and average vehicle speed varied from 21 to 77 mph. The resulting video tapes were processed to create a vehicle-specific database that included site location, date, time, vehicle type, vehicle color, vehicle age, window configuration, number of windows in each of three position categories (fully open, partially open, and closed), meteorological factors, and vehicle speed. Of the 4715 vehicles included in the database, 1905 (40.4%) were labeled as "open," indicating a window, sunroof, or convertible top was fully or partially open. Stepwise linear regression analyses indicated that "open" window status was affected by wind speed, relative humidity, vehicle speed, cloud cover, apparent temperature, day of week, time of day, vehicle type, vehicle age, vehicle color, number of windows, sunroofs, location, and air quality season. 
Open windows tended to occur less frequently when relative humidity was high, apparent temperature (a parameter incorporating wind chill and heat index) was below 50 degrees F, or the vehicle was relatively new. Although the effects of the identified parameters were relatively weak, they are statistically significant and should be considered by researchers attempting to model vehicle air exchange rates.
NASA Astrophysics Data System (ADS)
Hibino, Daisuke; Hsu, Mingyi; Shindo, Hiroyuki; Izawa, Masayuki; Enomoto, Yuji; Lin, J. F.; Hu, J. R.
2013-04-01
The impact on yield loss due to systematic defects which remain after Optical Proximity Correction (OPC) modeling has increased, and achieving an acceptable yield has become more difficult in leading technologies beyond the 20 nm node. Furthermore, the process window has become narrow because of the complexity of IC design and reduced process margin. In the past, systematic defects have been inspected by human eyes. However, judgment by human eyes is sometimes unstable and inaccurate. Moreover, an enormous amount of time and labor would have to be expended on the one-by-one judgment of several thousand hot-spot defects. In order to overcome these difficulties and improve yield and manufacturability, an automated system that can quantify shape differences with high accuracy and speed is needed. Inspection points could be increased to obtain higher yield if the automated system achieves our goal. The Defect Window Analysis (DWA) system developed by Hitachi High-Technologies, which uses high-precision contour extraction from SEM images on real silicon and a quantifying method that automatically calculates the difference between defect and non-defect patterns, has been applied to defect judgment in place of judgment by human eyes. The DWA result, which describes process behavior, can be fed back to design, OPC, or mask. This new methodology and evaluation results are presented in detail in this paper.
Activity Recognition on Streaming Sensor Data.
Krishnan, Narayanan C; Cook, Diane J
2014-02-01
Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by the human in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented on either a scripted or pre-segmented sequence of sensor events related to activities. In this paper we propose and evaluate a sliding window based approach to perform activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities can be best characterized by different window lengths of sensor events, we incorporate time decay and mutual information based weighting of sensor events within a window. Additional contextual information in the form of the previous activity and the activity of the previous window is also appended to the feature describing a sensor window. The experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual information based weighting of sensor events and adding past contextual information into the feature leads to the best performance for streaming activity recognition.
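The time-decay weighting of sensor events within a window, mentioned above, can be sketched with a simple exponential decay; the decay rate is an assumed parameter, not a value from the paper:

```python
import math

def time_decay_weights(event_times, rate=0.1):
    """Exponential time-decay weights for the sensor events in a window:
    the most recent event gets weight 1.0, and earlier events decay with
    their time lag. A sketch of the weighting idea, not the paper's
    exact formulation."""
    t_last = event_times[-1]           # timestamp of the newest event
    return [math.exp(-rate * (t_last - t)) for t in event_times]
```

These weights would scale each event's contribution to the window's feature vector, so stale events influence the classification less than recent ones.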
Prospective Heart Tracking for Whole-heart Magnetic Resonance Angiography
Moghari, Mehdi H.; Geva, Tal; Powell, Andrew J.
2015-01-01
Purpose To develop a prospective respiratory-gating technique (Heart-NAV) for use with contrast-enhanced 3D inversion recovery (IR) whole-heart magnetic resonance angiography (MRA) acquisitions that directly tracks heart motion without creating image inflow artifact. Methods With Heart-NAV, 1 of the startup pulses for the whole-heart steady-state free precession MRA sequence is used to collect the centerline of k-space, and its 1-dimensional reconstruction is fed into the standard diaphragm-navigator (NAV) signal analysis process to prospectively gate and track respiratory-induced heart displacement. Ten healthy volunteers underwent non-contrast whole-heart MRA acquisitions using the conventional diaphragm-NAV and Heart-NAV with 5 and 10 mm acceptance windows in a 1.5T scanner. Five patients underwent contrast-enhanced IR whole-heart MRA using a diaphragm-NAV and Heart-NAV with a 5 mm acceptance window. Results For non-contrast whole-heart MRA with both the 5 and 10 mm acceptance windows, Heart-NAV yielded coronary artery vessel sharpness and subjective visual scores that were not significantly different than those using a conventional diaphragm-NAV. Scan time for Heart-NAV was 10% shorter (p<0.05). In patients undergoing contrast-enhanced IR whole-heart MRA, inflow artifact was seen with the diaphragm-NAV but not with Heart-NAV. Conclusion Compared to a conventional diaphragm-NAV, Heart-NAV achieves similar image quality in a slightly shorter scan time and eliminates inflow artifact. PMID:26843458
Space-time interpolation of satellite winds in the tropics
NASA Astrophysics Data System (ADS)
Patoux, Jérôme; Levy, Gad
2013-09-01
A space-time interpolator for creating average geophysical fields from satellite measurements is presented and tested. It is designed for optimal spatiotemporal averaging of heterogeneous data. While it is illustrated with satellite surface wind measurements in the tropics, the methodology can be useful for interpolating, analyzing, and merging a wide variety of heterogeneous and satellite data in the atmosphere and ocean over the entire globe. The spatial and temporal ranges of the interpolator are determined by averaging satellite and in situ measurements over increasingly larger space and time windows and matching the corresponding variability at each scale. This matching provides a relationship between temporal and spatial ranges, but does not provide a unique pair of ranges as a solution to all averaging problems. The pair of ranges most appropriate for a given application can be determined by performing a spectral analysis of the interpolated fields and choosing the smallest values that remove any or most of the aliasing due to the uneven sampling by the satellite. The methodology is illustrated with the computation of average divergence fields over the equatorial Pacific Ocean from SeaWinds-on-QuikSCAT surface wind measurements, for which 72 h and 510 km are suggested as optimal interpolation windows. It is found that the wind variability is reduced over the cold tongue and enhanced over the Pacific warm pool, consistent with the notion that the unstably stratified boundary layer has generally more variable winds and more gustiness than the stably stratified boundary layer. It is suggested that the spectral analysis optimization can be used for any process where time-space correspondence can be assumed.
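The spatiotemporal averaging with matched space and time ranges can be sketched as a Gaussian-weighted mean. This toy version uses one spatial dimension for brevity and the 510 km / 72 h ranges suggested above; the exact kernel shape is an assumption:

```python
import numpy as np

def spacetime_average(obs, x0, t0, space_km=510.0, time_h=72.0):
    """Weighted average of scattered observations (x_km, t_h, value) at
    a target point (x0, t0), with Gaussian weights whose ranges match
    the 510 km / 72 h interpolation windows. One spatial dimension is
    used for brevity; the kernel choice is illustrative."""
    num = den = 0.0
    for x, t, v in obs:
        w = np.exp(-((x - x0) / space_km) ** 2 - ((t - t0) / time_h) ** 2)
        num += w * v
        den += w
    return num / den
```

Evaluating this on a regular grid of target points would produce the average wind (or divergence) fields whose spectra can then be checked for residual sampling aliasing.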
WINCADRE INORGANIC (WINDOWS COMPUTER-AIDED DATA REVIEW AND EVALUATION)
WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows-based program designed for computer-assisted data validation. WinCADRE is a powerful tool which significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed in...
14 CFR 1214.117 - Launch and orbit parameters for a standard launch.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Launch at a time, selected by NASA, from a launch window of not less than 1 hour (a more restrictive launch window may be provided as an optional service). (b) For shared flights from KSC to the standard...
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-05-28
Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently require high computing power. It is possible to develop parallel algorithms and code to perform the calculations on a high-performance computing (HPC) system. However, most commonly used statistical packages for genetic studies are non-parallel. Alternatively, one may use the cutting-edge technology of grid computing and its packages to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the use of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4-15.9 times faster, while Unphased jobs ran 1.1-18.6 times faster, compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
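The window-haplotype enumeration that generates the many independent jobs can be sketched as follows; the helper names and SNP labels are illustrative assumptions, and the actual FBAT/Unphased input formats are not reproduced here:

```python
from itertools import combinations

def consecutive_windows(loci, size):
    """All contiguous marker windows of a given size along the chromosome."""
    return [tuple(loci[i:i + size]) for i in range(len(loci) - size + 1)]

def combinational_windows(loci, size):
    """All (not necessarily contiguous) marker combinations of a given size."""
    return list(combinations(loci, size))

loci = [f"SNP{i}" for i in range(1, 27)]   # 26 loci, as in the study

# Each window is an independent statistical test, so each can be
# submitted as a separate FBAT or Unphased job to the queue scheduler
# and run concurrently across the cluster's compute nodes.
```

Because the combinational variant grows combinatorially with window size, splitting these windows across compute nodes is what makes the exhaustive analysis tractable with non-parallel software.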
Su, Meng-xiang; Zhou, Wen-di; Lan, Juan; Di, Bin; Hang, Tai-jun
2015-03-01
A simultaneous determination method based on liquid chromatography coupled with time-of-flight mass spectrometry was developed for the analysis of 11 bioactive constituents in tripterygium glycosides tablets, an immune and inflammatory prescription used in China. The analysis was fully optimized on a 1.8 μm particle size C18 column with linear gradient elution, permitting good separation of the 11 analytes and two internal standards in 21 min. The quantitation of each target constituent was carried out using narrow-window extracted ion chromatograms with a ±10 ppm extraction window, yielding good linearity (r(2) > 0.996) over a linear range of 10-1000 ng/mL. The limits of quantitation were low, ranging from 0.25 to 5.02 ng/mL for the 11 analytes, and the precision and repeatability were better than 1.6 and 5.3%, respectively. The recoveries obtained were acceptable, in the range of 93.4-107.4%. The proposed method was successfully applied to quantify the 11 bioactive constituents in commercial samples produced by nine pharmaceutical manufacturers to profile the quality of these preparations. The overall results demonstrate that the contents of the 11 bioactive constituents varied considerably among samples; therefore, the quality, clinical safety, and efficacy of this drug need further research and evaluation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
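The ±10 ppm extracted-ion window reduces to simple arithmetic on the target m/z. This minimal sketch (function names are illustrative, not from the paper) shows the bounds computation:

```python
def extraction_window(mz, ppm=10.0):
    """Return the (low, high) m/z bounds of a +/- ppm extraction window."""
    delta = mz * ppm * 1e-6
    return mz - delta, mz + delta

def in_window(observed_mz, target_mz, ppm=10.0):
    """True if an observed m/z falls inside the target's extraction window."""
    low, high = extraction_window(target_mz, ppm)
    return low <= observed_mz <= high
```

For a target at m/z 500, a ±10 ppm window spans only ±0.005 Da, which is what makes the extracted ion chromatograms selective enough for quantitation in a complex matrix.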
Rusterholz, Thomas; Achermann, Peter; Dürr, Roland; Koenig, Thomas; Tarokh, Leila
2017-06-01
Investigating functional connectivity between brain networks has become an area of interest in neuroscience. Several methods for investigating connectivity have recently been developed; however, these techniques need to be applied with care. We demonstrate that global field synchronization (GFS), a global measure of phase alignment in the EEG as a function of frequency, must be applied with signal processing principles in mind in order to yield valid results. Multichannel EEG (27 derivations) was analyzed for GFS based on the complex spectrum derived by the fast Fourier transform (FFT). We examined the effect of window functions on GFS, in particular of non-rectangular windows. Applying a rectangular window when calculating the FFT revealed high GFS values for high frequencies (>15 Hz) that were highly correlated (r=0.9) with spectral power in the lower frequency range (0.75-4.5 Hz) and tracked the depth of sleep. This turned out to be spurious synchronization. With a non-rectangular window (Tukey or Hanning window), this high-frequency synchronization vanished. Both GFS and power density spectra differed significantly between rectangular and non-rectangular windows. Previous papers using GFS typically did not specify the applied window and may have used a rectangular window function. However, the demonstrated impact of the window function raises the question of the validity of some previous findings at higher frequencies. We demonstrated that it is crucial to apply an appropriate window function when determining synchronization measures based on a spectral approach, to avoid spurious synchronization in the beta/gamma range. Copyright © 2017 Elsevier B.V. All rights reserved.
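The leakage mechanism behind this spurious synchronization can be reproduced with a few lines of NumPy. This is an illustrative sketch, not the authors' GFS pipeline; the signal and bin choices are arbitrary:

```python
import numpy as np

def amplitude_spectrum(signal, window="hanning"):
    """FFT amplitude spectrum with an optional non-rectangular taper.

    A rectangular window (no taper) leaks low-frequency power into
    high-frequency bins; a Hanning taper strongly suppresses this leakage.
    """
    n = len(signal)
    if window == "rectangular":
        taper = np.ones(n)
    elif window == "hanning":
        taper = np.hanning(n)
    else:
        raise ValueError(window)
    return np.abs(np.fft.rfft(signal * taper))

# A slow oscillation with a non-integer number of cycles per segment
# (as EEG delta activity in a finite FFT window) produces leakage.
t = np.arange(512)
slow = np.sin(2 * np.pi * 1.3 * t / 512)
rect = amplitude_spectrum(slow, "rectangular")
hann = amplitude_spectrum(slow, "hanning")
```

Comparing `rect` and `hann` in the upper bins shows how the rectangular window scatters delta-band power into the beta/gamma range, mirroring the spurious high-frequency synchronization the abstract describes.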
Kutkut, Ahmad M; Andreana, Sebastiano; Kim, Hyeong-Il; Monaco, Edward
2011-12-01
To propose a clinical recommendation based on the anatomy of the maxillary sinus before sinus augmentation procedures, using presurgical computerized axial tomography (CAT) scan images. CAT scan images were randomly selected from previously completed implant cases. The proposed area for the lateral window osteotomy was outlined on the panorex image of the CAT scan. The sagittal section of the CAT scan at the center of the outlined window was selected for sinus measurement analysis. On the CAT scan, two lines were drawn to measure the dimensions of the sinus: one measured the horizontal width, and the other measured the vertical height. Based on the measurement data, a classification of maxillary sinus anatomy was proposed: a narrow sinus cavity indicates favorable anatomy in terms of bone regeneration and healing, whereas a wide sinus cavity indicates less favorable anatomy for patient treatment planning. A narrow sinus with greater exposure to the blood supply should require shorter healing times after grafting. Conversely, a wider sinus cavity with less exposure to the blood supply would require a longer healing time before implant placement.
PSK Shift Timing Information Detection Using Image Processing and a Matched Filter
2009-09-01
...phase shifts are enhanced. Develop, design, and test the resulting phase shift identification scheme. Develop, design, and test an optional analysis window overlapping technique to improve phase shift detection. The resulting phase shift identification algorithm is investigated for SNR levels in the range -2 dB to 12 dB, and detection performances are derived...
Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T
2009-01-01
We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles.
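For readers unfamiliar with two of these agreement algorithms, the following sketch shows one common variant of exact agreement (identical event counts per bin) and of time-window analysis (a match within ±t seconds); the exact definitions used in individual JABA articles may differ:

```python
def exact_agreement(obs1, obs2):
    """Exact agreement: percentage of bins in which both observers
    recorded identical event counts (one illustrative variant)."""
    matches = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * matches / len(obs1)

def time_window_agreement(events1, events2, window=1.0):
    """Time-window analysis: an event recorded by observer 1 counts as
    an agreement if observer 2 recorded one within +/- window seconds."""
    hits = sum(any(abs(t1 - t2) <= window for t2 in events2)
               for t1 in events1)
    return 100.0 * hits / len(events1) if events1 else 100.0
```

Block-by-block agreement, the third algorithm mentioned, typically computes a smaller/larger count ratio within each block and averages across blocks; it is omitted here for brevity.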
Heidlmayr, Karin; Hemforth, Barbara; Moutier, Sylvain; Isel, Frédéric
2015-01-01
The present study was designed to examine the impact of bilingualism on neuronal activity in different executive control processes, namely conflict monitoring, control implementation (i.e., interference suppression and conflict resolution), and overcoming of inhibition. Twenty-two highly proficient but non-balanced successive French-German bilingual adults and 22 monolingual adults performed a combined Stroop/Negative priming task while event-related potentials (ERPs) were recorded online. The data revealed that the ERP effects were reduced in bilinguals in comparison to monolinguals, but only in the Stroop task and limited to the N400 and the sustained fronto-central negative-going potential time windows. This result suggests that bilingualism may impact the process of control implementation rather than the process of conflict monitoring (N200). Critically, our study revealed a differential time course of the involvement of the anterior cingulate cortex (ACC) and the prefrontal cortex (PFC) in conflict processing. While the ACC showed major activation in the early time windows (N200 and N400) but not in the latest time window (late sustained negative-going potential), the PFC became unilaterally active in the left hemisphere in the N400 and the late sustained negative-going potential time windows. Taken together, the present electroencephalography data lend support to a cascading neurophysiological model of executive control processes, in which the ACC and PFC may play a determining role.
Robust sliding-window reconstruction for accelerating the acquisition of MR fingerprinting.
Cao, Xiaozhi; Liao, Congyu; Wang, Zhixing; Chen, Ying; Ye, Huihui; He, Hongjian; Zhong, Jianhui
2017-10-01
To develop a method for accelerated and robust MR fingerprinting (MRF) with improved image reconstruction and parameter matching processes. A sliding-window (SW) strategy was applied to MRF, in which signal and dictionary matching was conducted between fingerprints, consisting of mixed-contrast image series reconstructed from consecutive data frames segmented by a sliding window, and a precalculated mixed-contrast dictionary. The effectiveness and performance of this new method, dubbed SW-MRF, was evaluated in both phantom and in vivo experiments. Error quantifications were conducted on results obtained with various settings of the SW reconstruction parameters. Compared with the original MRF strategy, the results of both phantom and in vivo experiments demonstrate that the proposed SW-MRF strategy either provided similar accuracy with reduced acquisition time, or improved accuracy with equal acquisition time. Parametric maps of T1, T2, and proton density of comparable quality could be achieved with a two-fold or greater reduction in acquisition time. The effect of the sliding-window width on dictionary sensitivity was also estimated. The novel SW-MRF recovers high-quality image frames from highly undersampled MRF data, which enables more robust dictionary matching with reduced numbers of data frames. This time efficiency may facilitate MRF applications in time-critical clinical settings. Magn Reson Med 78:1579-1588, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
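The core sliding-window grouping can be sketched as below. Real SW-MRF combines undersampled k-space data before reconstruction; this toy version simply averages consecutive frames to show how overlapping windows are formed:

```python
import numpy as np

def sliding_window_frames(frames, width, step=1):
    """Combine consecutive data frames into sliding-window frames.

    frames : array of shape (n_frames, ...) -- one entry per MRF time point.
    Returns an array of shape (n_windows, ...), where each window is the
    mean of `width` consecutive frames, shifted by `step` at a time.
    """
    n = frames.shape[0]
    starts = range(0, n - width + 1, step)
    return np.stack([frames[s:s + width].mean(axis=0) for s in starts])
```

Because adjacent windows share most of their frames, the window series retains the temporal signal evolution needed for dictionary matching while each window draws on far more k-space data than a single undersampled frame.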
Towards component-based validation of GATE: aspects of the coincidence processor.
Moraes, Eder R; Poon, Jonathan K; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D
2015-02-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to "ground truth" obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the "multiple window method"), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the "single window method"). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
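The two coincidence-sorting options can be illustrated on a list of single-event timestamps. This is a simplified sketch (singles assumed time-sorted, no multiple-coincidence handling), not the GATE implementation:

```python
def multiple_window(singles, tau):
    """Multiple window method: every single opens its own window of
    length tau and is paired with any later single inside it.
    Assumes `singles` is sorted in time."""
    coincidences = []
    for i, t in enumerate(singles):
        for u in singles[i + 1:]:
            if u - t <= tau:
                coincidences.append((t, u))
            else:
                break   # sorted input: no later single can qualify
    return coincidences

def single_window(singles, tau):
    """Single window method: a window opens only on the first single
    after the previous window has closed; singles arriving while it is
    open are paired with the opening event."""
    coincidences = []
    window_end = -float("inf")
    opener = None
    for t in singles:
        if t > window_end:
            opener, window_end = t, t + tau
        else:
            coincidences.append((opener, t))
    return coincidences
```

On the same event stream the multiple window method can report pairs that the single window method never forms, which is one reason the two options yield different true, scattered, and random coincidence counts.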