Due-Window Assignment Scheduling with Variable Job Processing Times
Wu, Yu-Bin
2015-01-01
We consider a common due-window assignment scheduling problem for jobs with variable processing times on a single machine, where the processing time of a job is a function of its position in a sequence (i.e., learning effect) or its starting time (i.e., deteriorating effect). The problem is to determine the optimal due-window and the processing sequence simultaneously so as to minimize a cost function that includes earliness, tardiness, the window location, the window size, and the weighted number of tardy jobs. We prove that the problem can be solved in polynomial time. PMID:25918745
Single-machine common/slack due window assignment problems with linear decreasing processing times
NASA Astrophysics Data System (ADS)
Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia
2017-08-01
This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach involves applying a Fast Fourier Transform (FFT) to the signal multiplied with an appropriate window function of fixed resolution. Selecting an appropriate window size is difficult when no background information about the input signal is available. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach switches to the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution, which results in high spectral resolution at low frequencies and high temporal resolution at high frequencies. A simple but effective framework for switching between STFT and CQT is provided. The proposed method also allows the dynamic construction of a filter bank according to user-defined parameters, which helps reduce redundant entries in the filter bank. Results obtained with the proposed method not only improve spectrogram visualization but also reduce the computation cost and achieve an appropriate window-length selection rate of 87.71%.
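A minimal sketch of the window-size trade-off described above, using SciPy's STFT: larger windows sharpen frequency resolution and blur time resolution, and vice versa. The sampling rate, test signal, and candidate window sizes are illustrative assumptions, not the empirical model from the paper.

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)
fs = 8000                          # sampling rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
# Narrow-band tone plus a brief wide-band noise burst
x = np.sin(2 * np.pi * 440 * t)
x[4000:4200] += rng.normal(size=200)

for nperseg in (64, 256, 1024):    # candidate window sizes (samples)
    f, tt, Z = stft(x, fs=fs, window="hann", nperseg=nperseg)
    df, dt = f[1] - f[0], tt[1] - tt[0]
    print(f"window={nperseg:4d}  freq res={df:7.2f} Hz  time res={dt * 1e3:6.2f} ms")
```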
Threshold network of a financial market using the P-value of correlation coefficients
NASA Astrophysics Data System (ADS)
Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun
2015-06-01
Threshold methods in financial networks are important tools for extracting information about the financial state of a market. Previously, absolute thresholds of correlation coefficients have been used; however, these bear no relation to the length of the time window. We assign a threshold value that depends on the size of the time window by using the P-value concept from statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We find that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window used when constructing the TN. We apply the same technique to the Standard and Poor's 500 (S&P500) and observe similar results.
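As a hedged illustration of tying the threshold to the window length through a P-value, the sketch below computes the critical Pearson correlation for a chosen significance level as a function of the number of returns in the window, using the standard t-distribution test for correlation coefficients. The significance level and window lengths are illustrative, not those used in the study.

```python
import numpy as np
from scipy import stats

def correlation_threshold(window_size: int, alpha: float = 0.01) -> float:
    """Critical |r| above which a Pearson correlation is significant
    at level alpha (two-sided) for a window of `window_size` returns."""
    df = window_size - 2                      # degrees of freedom
    t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-sided critical t value
    return t_crit / np.sqrt(df + t_crit ** 2)

for n in (20, 60, 250):                       # illustrative window lengths (days)
    print(f"n={n:4d}  threshold r_c={correlation_threshold(n):.3f}")
```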
A Simulation Study of Paced TCP
NASA Technical Reports Server (NTRS)
Kulik, Joanna; Coulter, Robert; Rockwell, Dennis; Partridge, Craig
2000-01-01
In this paper, we study the performance of paced TCP, a modified version of TCP designed especially for high delay-bandwidth networks. In typical networks, TCP optimizes its send rate by transmitting increasingly large bursts, or windows, of packets, one burst per round-trip time, until it reaches a maximum window size, which corresponds to the full capacity of the network. In a network with a high delay-bandwidth product, however, TCP's maximum window size may be larger than the queue size of the intermediate routers, and routers will begin to drop packets as soon as the windows become too large for the router queues. The TCP sender then concludes that the bottleneck capacity of the network has been reached, and it limits its send rate accordingly. Partridge proposed paced TCP as a means of solving the problem of queueing bottlenecks. A sender using paced TCP releases packets in multiple, small bursts during a round-trip time in which ordinary TCP would release a single, large burst of packets. This approach allows the sender to increase its send rate to the maximum window size without encountering queueing bottlenecks. This paper describes the performance of paced TCP in a simulated network and discusses implementation details that can affect its performance.
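A small sketch of the pacing idea described above: rather than emitting the whole congestion window as one burst, the sender spaces packets by roughly the RTT divided by the number of packets per window. The function name and the example path parameters are illustrative assumptions, not values from the simulations in the paper.

```python
def pacing_interval(cwnd_bytes: float, mss_bytes: float, rtt_s: float) -> float:
    """Inter-packet send gap (seconds) that spreads one congestion window
    evenly over a round-trip time instead of sending it as a single burst."""
    packets_per_rtt = cwnd_bytes / mss_bytes
    return rtt_s / packets_per_rtt

# Illustrative high delay-bandwidth path: 600 ms RTT, 1 MB window, 1460 B MSS
gap = pacing_interval(cwnd_bytes=1_000_000, mss_bytes=1460, rtt_s=0.6)
print(f"send one packet every {gap * 1e3:.2f} ms")
```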
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure, and the refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, yet more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size is selected, and the window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
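The core step, estimating a per-window retention-time shift from the peak of an FFT-based cross-correlation and then taking the mode across windows, can be sketched as below. The synthetic chromatograms, window size, and step are illustrative assumptions, and this is not the published MWFFT2 code.

```python
import numpy as np

def window_shift(ref_seg: np.ndarray, sam_seg: np.ndarray) -> int:
    """Shift (in scan points) of `sam_seg` relative to `ref_seg`, taken from
    the peak of their circular FFT cross-correlation (positive = elutes later)."""
    n = len(ref_seg)
    corr = np.fft.ifft(np.fft.fft(sam_seg) * np.conj(np.fft.fft(ref_seg))).real
    lag = int(np.argmax(corr))
    return lag if lag <= n // 2 else lag - n          # wrap to a signed shift

# Synthetic chromatograms: Gaussian peaks every 100 scans, sample shifted by 7
x = np.arange(1000)
peak_centres = np.arange(100, 1000, 100)
ref = sum(np.exp(-0.5 * ((x - c) / 8) ** 2) for c in peak_centres)
sample = sum(np.exp(-0.5 * ((x - c - 7) / 8) ** 2) for c in peak_centres)

win, step = 200, 100                                  # moving-window parameters
shifts = np.array([window_shift(ref[s:s + win], sample[s:s + win])
                   for s in range(0, len(x) - win + 1, step)])
values, counts = np.unique(shifts, return_counts=True)
print("per-window shifts:", shifts)
print("mode of shifts (refined estimate):", values[np.argmax(counts)])
```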
Fida, Benish; Bernabucci, Ivan; Bibbo, Daniele; Conforto, Silvia; Schmid, Maurizio
2015-07-01
The accuracy of systems that recognize daily living activities in real time depends heavily on the signal segmentation step. Windowing approaches are typically used to segment the data, and the window size is usually chosen based on previous studies; however, the literature offers little investigation of its effect on recognition accuracy when both short- and long-duration activities are considered. In this work, we present the impact of window size on the recognition of daily living activities, where transitions between different activities are also taken into account. The study was conducted on nine participants who wore a tri-axial accelerometer on their waist and performed some short (sitting, standing, and transitions between activities) and long (walking, stair descending and stair ascending) duration activities. Five different classifiers were tested, and among the different window sizes, a 1.5 s window was found to represent the best trade-off in recognition among activities, with an obtained accuracy well above 90%. Differences in recognition accuracy for each activity highlight the utility of developing adaptive segmentation criteria based on the duration of the activities. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
Wagner, Pablo; Ortiz, Cristian; Vela, Omar; Arias, Paul; Zanolli, Diego; Wagner, Emilio
2016-09-01
Tibialis posterior (TP) tendon transfer through the interosseous membrane is commonly performed in Charcot-Marie-Tooth disease. No clear recommendation has been made on the interosseous membrane (IOM) incision size needed to avoid entrapment of this tendon. The aim was to analyze the TP size at the transfer level and thereby determine the most adequate IOM window size to avoid muscle entrapment. Eleven lower extremity magnetic resonance images were analyzed. TP muscle measurements were made in axial views, obtaining the medial-lateral and antero-posterior diameters at various distances from the medial malleolus tip. The distance from the posterior to the anterior compartment was also measured. These measurements were applied to a mathematical model to predict the IOM window size necessary to allow an ample TP passage in an oblique direction. The average tendon diameter (confidence interval) at 15 cm proximal to the medial malleolus tip was 19.47 mm (17.47-21.48). The distance from the deep posterior compartment to the anterior compartment was 10.97 mm (9.03-12.90). Using the mathematical model, the estimated IOM window size ranges from 4.2 to 4.9 cm. The IOM window size is of utmost importance in trans-membrane TP transfers, given that a window equal to or smaller than the oblique diameter of the transposed tendon carries a high risk of entrapment. A membrane window of 5 cm, or 2.5 times the tendon diameter, should be made to theoretically diminish this complication. Copyright © 2015 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
Wang, Bing; Baby, Varghese; Tong, Wilson; Xu, Lei; Friedman, Michelle; Runser, Robert; Glesk, Ivan; Prucnal, Paul
2002-01-14
A novel optical switch based on cascading two terahertz optical asymmetric demultiplexers (TOADs) is presented. By utilizing the sharp edge of the asymmetric TOAD switching window profile, two TOAD switching windows are overlapped to produce a narrower aggregate switching window, not limited by the pulse propagation time in the SOA of the TOAD. Simulations of the cascaded TOAD switching window show relatively constant window amplitude for different window sizes. Experimental results on cascading two TOADs, each with a switching window of 8 ps but with the SOA on opposite sides of the fiber loop, show a minimum switching window of 2.7 ps.
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and offers good real-time performance, but it still has some shortcomings. It easily falls into a local optimum during tracking, its effectiveness is weak when the object moves fast, and because the size of the tracking window never changes, the method fails when the size of the moving object changes. We therefore propose a new method: a particle swarm optimization algorithm is used to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. The experimental results indicate that the proposed method can effectively track the object and that the size of the tracking window adapts accordingly.
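For reference, a minimal sketch of the plain mean-shift window update on a 2D weight map (for example, a histogram back-projection): the window repeatedly moves to the weighted centroid of the pixels it covers, and its size stays fixed, which is exactly the limitation the paper addresses. The toy weight map and window size are illustrative; the PSO and SIFT extensions are not reproduced.

```python
import numpy as np

def mean_shift(weights: np.ndarray, center, win=(30, 30), max_iter=20, eps=0.5):
    """Iteratively move a fixed-size window to the weighted centroid of the
    pixels it covers (plain mean shift; the window size never adapts)."""
    cy, cx = center
    h, w = win
    for _ in range(max_iter):
        y0, x0 = int(max(cy - h // 2, 0)), int(max(cx - w // 2, 0))
        patch = weights[y0:y0 + h, x0:x0 + w]
        if patch.sum() == 0:
            break
        ys, xs = np.mgrid[y0:y0 + patch.shape[0], x0:x0 + patch.shape[1]]
        ny = (ys * patch).sum() / patch.sum()
        nx = (xs * patch).sum() / patch.sum()
        if np.hypot(ny - cy, nx - cx) < eps:       # converged
            break
        cy, cx = ny, nx
    return cy, cx

# Toy weight map (e.g., a back-projection): blob centred at (60, 80)
yy, xx = np.mgrid[0:120, 0:160]
weights = np.exp(-((yy - 60) ** 2 + (xx - 80) ** 2) / (2 * 12 ** 2))
print(mean_shift(weights, center=(40, 50)))        # converges toward (60, 80)
```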
Windowed time-reversal music technique for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements.
Weibull Analysis and Area Scaling for Infrared Window Materials (U)
2016-08-01
This report illustrates how the strength of an optical window scales inversely with the size of the window in the absence of slow crack growth. Test data are given for aluminum oxynitride (ALON), calcium fluoride, chemical vapor... The report was reviewed for technical accuracy by Howard Poisl, Thomas M...
Tay, Benjamin Chia-Meng; Chow, Tzu-Hao; Ng, Beng-Koon; Loh, Thomas Kwok-Seng
2012-09-01
This study investigates the autocorrelation bandwidths of the dual-window (DW) optical coherence tomography (OCT) k-space scattering profiles of different-sized microspheres and their correlation to scatterer size. A dual-bandwidth spectroscopic metric, defined as the ratio of the 10% to 90% autocorrelation bandwidths, is found to change monotonically with microsphere size and gives the best contrast enhancement for scatterer size differentiation in the resulting spectroscopic image. A simulation model supports the experimental results and reveals a tradeoff between the smallest detectable scatterer size and the maximum scatterer size in the linear range of the dual-window dual-bandwidth (DWDB) metric, which depends on the choice of the light source optical bandwidth. Spectroscopic OCT (SOCT) images of microspheres and tonsil tissue samples based on the proposed DWDB metric showed clear differentiation between different-sized scatterers compared with those derived from conventional short-time Fourier transform metrics. The DWDB metric significantly improves the contrast in SOCT imaging and can aid the visualization and identification of dissimilar scatterer sizes in a sample. Potential applications include the early detection of cell nuclear changes in tissue carcinogenesis, the monitoring of healing tendons, and cell proliferation in tissue scaffolds.
Computed Tomographic Window Setting for Bronchial Measurement to Guide Double-Lumen Tube Size.
Seo, Jeong-Hwa; Bae, Jinyoung; Paik, Hyesun; Koo, Chang-Hoon; Bahk, Jae-Hyon
2018-04-01
The bronchial diameter measured on computed tomography (CT) can be used to guide double-lumen tube (DLT) sizes objectively. The bronchus is known to be measured most accurately in the so-called bronchial CT window. The authors investigated whether using the bronchial window results in the selection of more appropriately sized DLTs than using the other windows. CT image analysis and prospective randomized study. Tertiary hospital. Adults receiving left-sided DLTs. The authors simulated selection of DLT sizes based on the left bronchial diameters measured in the lung (width 1,500 Hounsfield unit [HU] and level -700 HU), bronchial (1,000 HU and -450 HU), and mediastinal (400 HU and 25 HU) CT windows. Furthermore, patients were randomly assigned to undergo imaging with either the bronchial or mediastinal window to guide DLT sizes. Using the underwater seal technique, the authors assessed whether the DLT was appropriately sized, undersized, or oversized for the patient. On 130 CT images, the bronchial diameter (9.9 ± 1.2 mm v 10.5 ± 1.3 mm v 11.7 ± 1.3 mm) and the selected DLT size were different in the lung, bronchial, and mediastinal windows, respectively (p < 0.001). In 13 patients (17%), the bronchial diameter measured in the lung window suggested too small DLTs (28 Fr) for adults. In the prospective study, oversized tubes were chosen less frequently in the bronchial window than in the mediastinal window (6/110 v 23/111; risk ratio 0.38; 95% CI 0.19-0.79; p = 0.003). No tubes were undersized after measurements in these two windows. The bronchial measurement in the bronchial window guided more appropriately sized DLTs compared with the lung or mediastinal windows. Copyright © 2017 Elsevier Inc. All rights reserved.
Micro-machined thermo-conductivity detector
Yu, Conrad
2003-01-01
A micro-machined thermal conductivity detector for a portable gas chromatograph. The detector is highly sensitive and has a fast response time, enabling detection of the small gas samples in a portable gas chromatograph, which are on the order of nanoliters. The high sensitivity and fast response time are achieved through micro-machined devices composed of, for example, a nickel wire on a silicon nitride window formed in a silicon member and about a millimeter square in size. In addition to operating as a thermal conductivity detector, the device's silicon nitride window with its micro-machined wire can be used as a fast-response heater for PCR applications.
NASA Astrophysics Data System (ADS)
He, L.; Chen, J. M.; Liu, J.; Mo, G.; Zhen, T.; Chen, B.; Wang, R.; Arain, M.
2013-12-01
Terrestrial ecosystem models have been widely used to simulate carbon, water and energy fluxes and climate-ecosystem interactions. In these models, some vegetation and soil parameters are determined based on limited studies from the literature without consideration of their seasonal variations. Data assimilation (DA) provides an effective way to optimize these parameters at different time scales. In this study, an ensemble Kalman filter (EnKF) is developed and applied to optimize two key parameters of an ecosystem model, namely the Boreal Ecosystem Productivity Simulator (BEPS): (1) the maximum photosynthetic carboxylation rate (Vcmax) at 25 °C, and (2) the soil water stress factor (fw) in the stomatal conductance formulation. These parameters are optimized by assimilating observations of gross primary productivity (GPP) and latent heat (LE) fluxes measured in a 74-year-old pine forest, which is part of the Turkey Point Flux Station's age-sequence sites. Vcmax is related to leaf nitrogen concentration and varies slowly over the season and from year to year. In contrast, fw varies rapidly in response to soil moisture dynamics in the root zone. Earlier studies suggested that DA of vegetation parameters at daily time steps leads to Vcmax values that are unrealistic. To overcome this problem, we developed a three-step scheme to optimize Vcmax and fw. First, the EnKF is applied daily to obtain precursor estimates of Vcmax and fw. Then Vcmax is optimized at different time scales, assuming fw is unchanged from the first step. The best temporal period, or window size, is then determined by analyzing the magnitude of the minimized cost function and the coefficient of determination (R2) and root-mean-square error (RMSE) of GPP and LE between simulation and observation. Finally, the daily fw value is optimized for rain-free days corresponding to the Vcmax curve from the best window size. The optimized fw is then used to model its relationship with soil moisture. We found that the optimized fw correlates best, and linearly, with soil water content at 5 to 10 cm depth. We also found that both the temporal scale (window size) and the a priori uncertainty of Vcmax (given as its standard deviation) are important in determining the seasonal trajectory of Vcmax. During the leaf expansion stage, an appropriate window size leads to a reasonable estimate of Vcmax. In the summer, the fluctuation of the optimized Vcmax is caused mainly by the uncertainties in Vcmax rather than by the window size. Our study suggests that a smooth Vcmax curve optimized with an optimal time window size is close to reality even though the RMSE of GPP at this window is not the minimum. It also suggests that, for accurate optimization of Vcmax, it is necessary to set appropriate levels of uncertainty for Vcmax in the spring and summer because the rate of change of leaf nitrogen concentration differs over the season. Parameter optimizations for more sites and multiple years are in progress.
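A hedged sketch of one EnKF analysis step that nudges an ensemble of a scalar parameter (standing in for Vcmax) toward an observation through a simple forward model; the toy linear "model", the noise levels, and the assimilation loop are illustrative assumptions, not BEPS or the authors' three-step scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(params: np.ndarray, predictions: np.ndarray,
                obs: float, obs_var: float) -> np.ndarray:
    """One ensemble Kalman filter analysis step for a scalar parameter.
    `params` and `predictions` are ensembles of the parameter and of the
    model-predicted observable (e.g. daily GPP)."""
    p_mean, y_mean = params.mean(), predictions.mean()
    cov_py = ((params - p_mean) * (predictions - y_mean)).mean()   # P_xy
    var_y = predictions.var() + obs_var                            # HPH' + R
    gain = cov_py / var_y                                          # Kalman gain
    # Perturbed observations keep the analysis ensemble spread realistic
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_var), size=params.size)
    return params + gain * (obs_pert - predictions)

# Toy forward model: observed flux proportional to the parameter (true value 60)
true_param, n_ens = 60.0, 100
ensemble = rng.normal(40.0, 10.0, n_ens)           # prior guess
for day in range(10):                              # ten daily assimilation steps
    obs = 0.5 * true_param + rng.normal(0.0, 1.0)  # noisy observation
    pred = 0.5 * ensemble + rng.normal(0.0, 1.0, n_ens)
    ensemble = enkf_update(ensemble, pred, obs, obs_var=1.0)
print(f"posterior mean ~ {ensemble.mean():.1f} (true 60)")
```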
Oval Window Size and Shape: a Micro-CT Anatomical Study With Considerations for Stapes Surgery.
Zdilla, Matthew J; Skrzat, Janusz; Kozerska, Magdalena; Leszczyński, Bartosz; Tarasiuk, Jacek; Wroński, Sebastian
2018-06-01
The oval window is an important structure with regard to stapes surgeries, including stapedotomy for the treatment of otosclerosis. Recent study of perioperative imaging of the oval window has revealed that oval window niche height can indicate both operative difficulty and subjective discomfort during otosclerosis surgery. With regard to shape, structures incorporated into the oval window niche, such as cartilage grafts, must be compatible with the shape of the oval window. Despite the clinical importance of the oval window, there is little information regarding its size and shape. This study assessed oval window size and shape via micro-computed tomography paired with modern morphometric methodology in the fetal, infant, child, and adult populations. Additionally, the study compared oval window size and shape between sexes and between left- and right-sided ears. No significant differences were found among traditional morphometric parameters among age groups, sides, or sexes. However, geometric morphometric methods revealed shape differences between age groups. Further, geometric morphometric methods provided the average oval window shape and most-likely shape variance. Beyond demonstrating oval window size and shape variation, the results of this report will aid in identifying patients among whom anatomical variation may contribute to surgical difficulty and surgeon discomfort, or otherwise warrant preoperative adaptations for the incorporation of materials into and around the oval window.
Wu, Tiee-Jian; Huang, Ying-Hsueh; Li, Lung-An
2005-11-15
Several measures of DNA sequence dissimilarity have been developed. The purpose of this paper is threefold. First, we compare the performance of several word-based or alignment-based methods. Second, we give a general guideline for choosing the window size and determining the optimal word sizes for several word-based measures at different window sizes. Third, we use a large-scale simulation method to simulate data from the distribution of SK-LD (symmetric Kullback-Leibler discrepancy). These simulated data can be used to estimate the degree of dissimilarity beta between any pair of DNA sequences. Our study shows (1) for whole-sequence similarity/dissimilarity identification, the window size should be as large as possible, but probably not >3000, as restricted by CPU time in practice, (2) for each measure the optimal word size increases with window size, (3) when the optimal word size is used, SK-LD performance is superior in both simulation and real data analysis, (4) the estimate of beta based on SK-LD can be used to quickly filter out a large number of dissimilar sequences and speed up alignment-based database searches for similar sequences and (5) this estimate is also applicable in local similarity comparisons. For example, it can help in selecting oligo probes with high specificity and therefore has potential in probe design for microarrays. The SK-LD algorithm, the beta estimator and the simulation software are implemented in MATLAB code and are available at http://www.stat.ncku.edu.tw/tjwu
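A minimal sketch of the symmetric Kullback-Leibler discrepancy between the word (k-mer) frequency distributions of two windows, the quantity behind SK-LD. The word size, pseudocount, and toy 1000-bp windows are illustrative assumptions, not the optimized settings or the MATLAB implementation from the paper.

```python
from collections import Counter
from itertools import product
import math

def kmer_probs(seq: str, k: int):
    """Relative frequencies of all 4**k words in `seq`, with a small
    pseudocount so that every word has nonzero probability."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    words = ["".join(w) for w in product("ACGT", repeat=k)]
    total = sum(counts[w] + 0.5 for w in words)
    return {w: (counts[w] + 0.5) / total for w in words}

def sk_ld(seq1: str, seq2: str, k: int = 2) -> float:
    """Symmetric Kullback-Leibler discrepancy between the k-mer
    distributions of two equal-length windows."""
    p, q = kmer_probs(seq1, k), kmer_probs(seq2, k)
    return sum(p[w] * math.log(p[w] / q[w]) + q[w] * math.log(q[w] / p[w])
               for w in p)

a = "ACGT" * 250                      # illustrative 1000-bp windows
b = "AACG" * 250
print(f"SK-LD(a, b) = {sk_ld(a, b):.3f}, SK-LD(a, a) = {sk_ld(a, a):.3f}")
```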
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
The blending process, an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. With the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported for blending applications, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend plain MBSD from the off-line to the online scenario and used to determine the endpoint of the blending process of Chinese medicine dispensing granules. Through online tuning of the window size, status changes of the materials during blending are reflected in the standard deviation calculation in real time. The proposed method was tested separately in the blending processes of dextrin and three other traditional Chinese medicine extracts. All results showed that, compared with the traditional MBSD method, the proposed MBSD method with a progressively increasing window size more clearly reflects the status changes of the materials during blending, making it suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
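One simple reading of the progressively increasing MBSD window is sketched below: the standard deviation is computed over a block of recent samples whose size grows by one sample per step, and the endpoint is flagged once that SD stays below a threshold for several consecutive samples. The thresholds, growth rule, and synthetic blending trace are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def online_endpoint(signal, w0=3, w_max=20, sd_threshold=0.02, patience=5):
    """Online endpoint detection with a moving-block standard deviation whose
    block size grows by one sample per step (from w0 up to w_max).  The
    endpoint is flagged once the block SD has stayed below `sd_threshold`
    for `patience` consecutive samples."""
    below, w = 0, w0
    for t in range(w0, len(signal)):
        block = signal[max(0, t - w + 1):t + 1]    # most recent w samples
        below = below + 1 if np.std(block) < sd_threshold else 0
        if below >= patience:
            return t
        w = min(w + 1, w_max)                      # progressively enlarge the window
    return None

# Synthetic blending trace: noisy at first, homogeneous (flat) after sample 60
rng = np.random.default_rng(1)
trace = np.concatenate([rng.normal(1.0, 0.2, 60), rng.normal(1.0, 0.005, 60)])
print("endpoint detected at sample", online_endpoint(trace))
```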
Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Curlett, Brian P.
1994-01-01
XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.
Walton, Emily; Casey, Christy; Mitsch, Jurgen; Vázquez-Diosdado, Jorge A; Yan, Juan; Dottorini, Tania; Ellis, Keith A; Winterlich, Anthony; Kaler, Jasmeet
2018-02-01
Automated behavioural classification and identification through sensors has the potential to improve the health and welfare of animals. The position of a sensor, the sampling frequency and the window size of the segmented signal data have a major impact on classification accuracy in activity recognition and on the energy needs of the sensor; yet no studies in precision livestock farming have evaluated the effect of all these factors simultaneously. The aim of this study was to evaluate the effects of position (ear and collar), sampling frequency (8, 16 and 32 Hz) of a triaxial accelerometer and gyroscope sensor and window size (3, 5 and 7 s) on the classification of important behaviours in sheep such as lying, standing and walking. Behaviours were classified using a random forest approach with 44 feature characteristics. The best performance for walking, standing and lying classification in sheep (accuracy 95%, F-score 91%-97%) was obtained using the combinations of 32 Hz with a 7 s window and 32 Hz with a 5 s window for both ear and collar sensors, although results obtained with 16 Hz and a 7 s window were comparable, with accuracy of 91%-93% and F-score of 88%-95%. Energy efficiency was best at a 7 s window. This suggests that sampling at 16 Hz with a 7 s window will offer benefits in a real-time behavioural monitoring system for sheep due to reduced energy needs.
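To make the windowing pipeline concrete, here is a hedged sketch of segmenting a tri-axial accelerometer stream into fixed windows at a chosen sampling rate and window length, computing a few simple per-window features, and training a random forest. The synthetic "lying"/"walking" data and the three features are illustrative stand-ins; the study's 44-feature set and sheep data are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def windows(data: np.ndarray, fs: int, win_s: float):
    """Split an (n_samples, 3) accelerometer array into non-overlapping
    windows of `win_s` seconds at sampling rate `fs` Hz."""
    step = int(fs * win_s)
    return [data[i:i + step] for i in range(0, len(data) - step + 1, step)]

def features(win: np.ndarray) -> np.ndarray:
    """A few illustrative per-window features (the study used 44)."""
    return np.concatenate([win.mean(axis=0), win.std(axis=0),
                           [np.abs(np.diff(win, axis=0)).mean()]])

# Synthetic data: 'lying' = low variance, 'walking' = high variance, 16 Hz, 7 s
rng = np.random.default_rng(0)
fs, win_s = 16, 7.0
lying = rng.normal(0.0, 0.05, (fs * 700, 3))
walking = rng.normal(0.0, 0.60, (fs * 700, 3))
X = np.array([features(w) for act in (lying, walking)
              for w in windows(act, fs, win_s)])
y = np.array([lab for lab, act in enumerate((lying, walking))
              for _ in windows(act, fs, win_s)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```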
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
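The ATVF model's fuzzy machinery is not reproduced here, but the adaptive-window idea of choosing the analysis window length that minimizes forecast error on the training data can be illustrated with a plain moving-average forecaster. The candidate window lengths and the synthetic series are assumptions for the sketch only.

```python
import numpy as np

def best_window(series: np.ndarray, candidates=(2, 3, 5, 8, 13)) -> int:
    """Pick the analysis window length whose moving-average forecast has the
    lowest mean squared error on the training series (adaptive-window idea
    only; the fuzzy-time-series rules of ATVF are not reproduced here)."""
    def mse(w):
        preds = [series[t - w:t].mean() for t in range(w, len(series))]
        return np.mean((np.array(preds) - series[w:]) ** 2)
    return min(candidates, key=mse)

rng = np.random.default_rng(2)
train = np.cumsum(rng.normal(0, 1, 300)) + 100    # synthetic index-like series
w = best_window(train)
print("selected window:", w, " next forecast:", round(train[-w:].mean(), 2))
```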
Latency as a region contrast: Measuring ERP latency differences with Dynamic Time Warping.
Zoumpoulaki, A; Alsufyani, A; Filetti, M; Brammer, M; Bowman, H
2015-12-01
Methods for measuring onset latency contrasts are evaluated against a new method utilizing the dynamic time warping (DTW) algorithm. This new method allows latency to be measured across a region instead of at a single point. We use computer simulations to compare the methods' power and Type I error rates under different scenarios. We perform per-participant analysis for different signal-to-noise ratios and two sizes of window (broad vs. narrow). In addition, the methods are tested in combination with single-participant and jackknife average waveforms for different effect sizes at the group level. DTW performs better than the other methods, being less sensitive to noise as well as to the placement and width of the window selected. © 2015 Society for Psychophysiological Research.
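A hedged sketch of the general approach: classic DTW aligns two ERP-like waveforms, and the average warp displacement inside a measurement window gives a region-level latency difference. The Gaussian test components, window bounds, and the displacement summary are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dtw_path(a: np.ndarray, b: np.ndarray):
    """Classic O(len(a)*len(b)) dynamic time warping; returns the warping
    path as (i, j) index pairs."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    i, j, path = n, m, []
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        i, j = (i - 1, j - 1) if step == 0 else (i - 1, j) if step == 1 else (i, j - 1)
    return path[::-1]

# Two ERP-like components, the second delayed by 20 samples
t = np.arange(400)
erp1 = np.exp(-0.5 * ((t - 150) / 30) ** 2)
erp2 = np.exp(-0.5 * ((t - 170) / 30) ** 2)
path = dtw_path(erp1, erp2)
# Average warp displacement within a measurement region (samples 100-250)
latency_shift = np.mean([j - i for i, j in path if 100 <= i <= 250])
print(f"estimated latency difference ~ {latency_shift:.1f} samples (true delay 20)")
```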
Estimating Characteristics of a Maneuvering Reentry Vehicle Observed by Multiple Sensors
2010-03-01
instead of as one large data set. This method allowed the filter to respond to changing dynamics. Jackson and Farbman's approach could be of... portion of the entire acceleration was due to drag. Lee and Liu adopted a more hybrid approach, combining least squares and Kalman filters [9... grows again as the window approaches the end of the available data. Three values for minimum window size, window size, and maximum window size are...
Influence of sampling window size and orientation on parafoveal cone packing density
Lombardo, Marco; Serrao, Sebastiano; Ducoli, Pietro; Lombardo, Giuseppe
2013-01-01
We assessed the agreement between sampling windows of different size and orientation on packing density estimates in images of the parafoveal cone mosaic acquired using a flood-illumination adaptive optics retinal camera. Horizontally and vertically oriented sampling windows of different sizes (320 × 160 µm, 160 × 80 µm and 80 × 40 µm) were selected at two retinal locations along the horizontal meridian in one eye of ten subjects. At each location, cone density tended to decline with decreasing sampling area. Although the differences in cone density estimates were not statistically significant, Bland-Altman plots showed that the agreement between cone densities estimated under the different sampling window conditions was moderate. The percentage of the preferred packing arrangements of cones by Voronoi tiles was slightly affected by window size and orientation. The results illustrate the importance of specifying the size and orientation of the sampling window used to derive cone metric estimates, so as to facilitate comparison of different studies. PMID:24009995
Horesh, Yair; Wexler, Ydo; Lebenthal, Ilana; Ziv-Ukelson, Michal; Unger, Ron
2009-03-04
Scanning large genomes with a sliding window in search of locally stable RNA structures is a well motivated problem in bioinformatics. Given a predefined window size L and an RNA sequence S of size N (L < N), the consecutive windows folding problem is to compute the minimal free energy (MFE) for the folding of each of the L-sized substrings of S. The consecutive windows folding problem can be naively solved in O(NL^3) time by applying any of the classical cubic-time RNA folding algorithms to each of the N-L windows of size L. Recently an O(NL^2) solution for this problem has been described. Here, we describe and implement an O(NL·ψ(L)) engine for the consecutive windows folding problem, where ψ(L) is shown to converge to O(1) under the assumption of a standard probabilistic polymer folding model, yielding an O(L) speedup which is experimentally confirmed. Using this tool, we note an intriguing directionality (5'-3' vs. 3'-5') folding bias, i.e. that the MFE of folding is higher in the native direction of the DNA than in the reverse direction for various genomic regions in several organisms, including regions of the genomes that do not encode proteins or ncRNA. This bias largely emerges from the genomic dinucleotide bias, which affects the MFE; however, we see some variation in the folding bias in the different genomic regions when normalized to the dinucleotide bias. We also present results from calculating the MFE landscape of mouse chromosome 1, characterizing the MFE of the long ncRNA molecules that reside in this chromosome. The efficient consecutive windows folding engine described in this paper allows for genome-wide scans for ncRNA molecules as well as large-scale statistics. It is implemented here as a software tool, called RNAslider, and applied to the scanning of long chromosomes, leading to the observation of features that are visible only on a large scale.
Templated fabrication of hollow nanospheres with 'windows' of accurate size and tunable number.
Xie, Duan; Hou, Yidong; Su, Yarong; Gao, Fuhua; Du, Jinglei
2015-01-01
The 'windows' or 'doors' on the surface of a closed hollow structure enable the exchange of material and information between the interior and exterior of a hollow sphere, or between two hollow spheres, and this exchange can also be controlled by altering the window size. Thus, it is very interesting and important to achieve the fabrication and adjustment of the 'windows' or 'doors' on the surface of a closed hollow structure. In this paper, we propose a new method based on template-assisted deposition to fabricate hollow spheres with windows of accurate size and number. By precisely controlling the deposition parameters (i.e., deposition angle and number), hollow spheres with windows of total size from 0% to 50% and number from 1 to 6 have been successfully achieved. A geometrical model has been developed for the morphology simulation and size calculation of the windows, and the simulation results agree well with the experiment. This model will greatly improve the convenience and efficiency of the template-assisted deposition method. In addition, these hollow spheres with the desired windows can also be dispersed into liquid or arranged regularly on any desired substrate. These advantages will maximize their applications in many fields, such as drug transport and nano-research containers.
Eye movements and the span of the effective stimulus in visual search.
Bertera, J H; Rayner, K
2000-04-01
The span of the effective stimulus during visual search through an unstructured alphanumeric array was investigated by using eye-contingent-display changes while the subjects searched for a target letter. In one condition, a window exposing the search array moved in synchrony with the subjects' eye movements, and the size of the window was varied. Performance reached asymptotic levels when the window was 5 degrees. In another condition, a foveal mask moved in synchrony with each eye movement, and the size of the mask was varied. The foveal mask conditions were much more detrimental to search behavior than the window conditions, indicating the importance of foveal vision during search. The size of the array also influenced performance, but performance reached asymptote for all array sizes tested at the same window size, and the effect of the foveal mask was the same for all array sizes. The results indicate that both acuity and difficulty of the search task influenced the span of the effective stimulus during visual search.
Characterization of 176Lu background in LSO-based PET scanners
NASA Astrophysics Data System (ADS)
Conti, Maurizio; Eriksson, Lars; Rothfuss, Harold; Sjoeholm, Therese; Townsend, David; Rosenqvist, Göran; Carlier, Thomas
2017-05-01
LSO and LYSO are today the most common scintillators used in positron emission tomography. Lutetium contains traces of 176Lu, a radioactive isotope that decays β− with a cascade of γ photons in coincidence. Therefore, Lutetium-based scintillators are characterized by a small natural radiation background. In this paper, we investigate and characterize the 176Lu radiation background via experiments performed on LSO-based PET scanners. LSO background was measured at different energy windows and different time coincidence windows, and by using shields to alter the original spectrum. The effect of radiation background in particularly count-starved applications, such as 90Y imaging, is analysed and discussed. Depending on the size of the PET scanner, between 500 and 1000 total random counts per second and between 3 and 5 total true coincidences per second were measured in standard coincidence mode. The LSO background counts in a Siemens mCT in the standard PET energy and time windows are in general negligible in terms of trues, and are comparable to those measured in a BGO scanner of similar size.
An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files
Chan, Anthony; Gropp, William; Lusk, Ewing
2008-01-01
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the trace files at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
Progesterone in experimental permanent stroke: a dose-response and therapeutic time-window study
Wali, Bushra; Ishrat, Tauheed; Won, Soonmi; Stein, Donald G.
2014-01-01
Currently, the only approved treatment for ischaemic stroke is tissue plasminogen activator, a clot-buster. This treatment can have dangerous consequences if not given within the first 4 h after stroke. Our group and others have shown progesterone to be beneficial in preclinical studies of stroke, but a progesterone dose-response and time-window study is lacking. We tested male Sprague-Dawley rats (12 months old) with permanent middle cerebral artery occlusion or sham operations on multiple measures of sensory, motor and cognitive performance. For the dose-response study, animals received intraperitoneal injections of progesterone (8, 16 or 32 mg/kg) at 1 h post-occlusion, and subcutaneous injections at 6 h and then once every 24 h for 7 days. For the time-window study, the optimal dose of progesterone was given starting at 3, 6 or 24 h post-stroke. Behavioural recovery was evaluated at repeated intervals. Rats were killed at 22 days post-stroke and brains extracted for evaluation of infarct volume. Both 8 and 16 mg/kg doses of progesterone produced attenuation of infarct volume compared with the placebo, and improved functional outcomes up to 3 weeks after stroke on locomotor activity, grip strength, sensory neglect, gait impairment, motor coordination and spatial navigation tests. In the time-window study, the progesterone group exhibited substantial neuroprotection as late as 6 h after stroke onset. Compared with placebo, progesterone showed a significant reduction in infarct size with 3- and 6-h delays. Moderate doses (8 and 16 mg/kg) of progesterone reduced infarct size and improved functional deficits in our clinically relevant model of stroke. The 8 mg/kg dose was optimal in improving motor, sensory and memory function, and this effect was observed over a large therapeutic time window. Progesterone shows promise as a potential therapeutic agent and should be examined for safety and efficacy in a clinical trial for ischaemic stroke. PMID:24374329
Du, Feng; Yin, Yue; Qi, Yue; Zhang, Kan
2014-08-01
In the present study, we examined whether a peripheral size-singleton distractor that matches the target-distractor size relation can capture attention and disrupt central target identification. Three experiments consistently showed that a size singleton that matches the target-distractor size relation cannot capture attention when it appears outside of the attentional window, even though the same size singleton produces a cuing effect. In addition, a color singleton that matches the target color, instead of a size singleton that matches the target-distractor size relation, captures attention when it is outside of the attentional window. Thus, a size-relation-matched distractor is much weaker than a color-matched distractor in capturing attention and cannot capture attention when the distractor appears outside of the attentional window.
Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae
2014-01-01
Background: We aimed to clarify that the size of lung adenocarcinoma evaluated using the mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin-section conditions (1.25 mm thickness on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results: Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were also significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions: Based on the univariate analyses, together with logistic regression and receiver operator curve analyses performed for variables with p-values of <0.05, our results suggest that measuring tumour size using the mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognostic modality in small adenocarcinoma. PMID:25365326
Gabler, Christopher A; Siemann, Evan
2013-01-01
The rate of new exotic recruitment following removal of adult invaders (reinvasion pressure) influences restoration outcomes and costs but is highly variable and poorly understood. We hypothesize that broad variation in average reinvasion pressure of Triadica sebifera (Chinese tallow tree, a major invader) arises from differences among habitats in spatiotemporal availability of realized recruitment windows. These windows are periods of variable duration long enough to permit establishment given local environmental conditions. We tested this hypothesis via a greenhouse mesocosm experiment that quantified how the duration of favorable moisture conditions prior to flood or drought stress (window duration), competition and nutrient availability influenced Triadica success in high stress environments. Window duration influenced pre-stress seedling abundance and size, growth during stress and final abundance; it interacted with other factors to affect final biomass and germination during stress. Stress type and competition impacted final size and biomass, plus germination, mortality and changes in size during stress. Final abundance also depended on competition and the interaction of window duration, stress type and competition. Fertilization interacted with competition and stress to influence biomass and changes in height, respectively, but did not affect Triadica abundance. Overall, longer window durations promoted Triadica establishment, competition and drought (relative to flood) suppressed establishment, and fertilization had weak effects. Interactions among factors frequently produced different effects in specific contexts. Results support our 'outgrow the stress' hypothesis and show that temporal availability of abiotic windows and factors that influence growth rates govern Triadica recruitment in stressful environments. These findings suggest that native seed addition can effectively suppress superior competitors in stressful environments. We also describe environmental scenarios where specific management methods may be more or less effective. Our results enable better niche-based estimates of local reinvasion pressure, which can improve restoration efficacy and efficiency by informing site selection and optimal management.
Statistical tests for power-law cross-correlated processes
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ_DCCA(T,n), where T is the total length of the time series and n the window size. For ρ_DCCA(T,n), we numerically calculated the Cauchy inequality -1 ≤ ρ_DCCA(T,n) ≤ 1. Here we derive -1 ≤ ρ_DCCA(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρ_DCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρ_DCCA(T,n) tends with increasing T to 1/T. Using ρ_DCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
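A hedged sketch of computing the detrended cross-correlation coefficient ρ_DCCA(T, n) for two series with overlapping windows and local linear detrending, following the standard definition (detrended covariance normalised by the two DFA fluctuations). The synthetic series, window size, and linear detrending order are illustrative assumptions.

```python
import numpy as np

def rho_dcca(x: np.ndarray, y: np.ndarray, n: int) -> float:
    """Detrended cross-correlation coefficient rho_DCCA(T, n): the detrended
    covariance of the integrated series in overlapping windows of size n,
    normalised by the two DFA fluctuation functions."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(n)
    f2_xy = f2_xx = f2_yy = 0.0
    for s in range(len(x) - n + 1):              # overlapping windows
        xs, ys = X[s:s + n], Y[s:s + n]
        # Remove the local linear trend from each profile
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2_xy += (rx * ry).mean()
        f2_xx += (rx * rx).mean()
        f2_yy += (ry * ry).mean()
    return f2_xy / np.sqrt(f2_xx * f2_yy)

rng = np.random.default_rng(3)
common = rng.normal(size=2000)
a = common + 0.5 * rng.normal(size=2000)         # two series sharing a component
b = common + 0.5 * rng.normal(size=2000)
print(f"rho_DCCA(n=50) = {rho_dcca(a, b, 50):.2f}")   # well inside [-1, 1]
```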
Effects of window size and shape on accuracy of subpixel centroid estimation of target images
NASA Technical Reports Server (NTRS)
Welch, Sharon S.
1993-01-01
A new algorithm is presented for increasing the accuracy of subpixel centroid estimation of (nearly) point target images in cases where the signal-to-noise ratio is low and the signal amplitude and shape vary from frame to frame. In the algorithm, the centroid is calculated over a data window that is matched in width to the image distribution. Fourier analysis is used to explain the dependency of the centroid estimate on the size of the data window, and simulation and experimental results are presented which demonstrate the effects of window size for two different noise models. The effects of window shape were also investigated for uniform and Gaussian-shaped windows. The new algorithm was developed to improve the dynamic range of a close-range photogrammetric tracking system that provides feedback for control of a large gap magnetic suspension system (LGMSS).
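A minimal sketch of the windowed centroid idea: the subpixel centroid is computed only over a data window centred on the brightest pixel, and the window half-width can be matched to the spot width. The Gaussian spot, noise level, and candidate half-widths are illustrative assumptions, not the tracking-system data from the paper.

```python
import numpy as np

def windowed_centroid(image: np.ndarray, half_width: int):
    """Intensity-weighted centroid computed only over a (2*half_width+1)^2
    data window centred on the brightest pixel, rather than the full frame."""
    py, px = np.unravel_index(np.argmax(image), image.shape)
    y0, y1 = max(py - half_width, 0), min(py + half_width + 1, image.shape[0])
    x0, x1 = max(px - half_width, 0), min(px + half_width + 1, image.shape[1])
    win = image[y0:y1, x0:x1]
    ys, xs = np.mgrid[y0:y1, x0:x1]
    return (ys * win).sum() / win.sum(), (xs * win).sum() / win.sum()

# Gaussian spot at (31.3, 47.7) with additive noise
yy, xx = np.mgrid[0:64, 0:96]
rng = np.random.default_rng(4)
img = np.exp(-((yy - 31.3) ** 2 + (xx - 47.7) ** 2) / (2 * 2.0 ** 2))
img += rng.normal(0.0, 0.02, img.shape)
for hw in (3, 6, 12):                 # window matched vs. wider than the spot
    cy, cx = windowed_centroid(img, hw)
    print(f"half-width {hw:2d}: centroid = ({cy:.2f}, {cx:.2f})")
```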
NEMA NU 2-2007 performance measurements of the Siemens Inveon™ preclinical small animal PET system
Kemp, Brad J; Hruska, Carrie B; McFarland, Aaron R; Lenox, Mark W; Lowe, Val J
2010-01-01
National Electrical Manufacturers Association (NEMA) NU 2-2007 performance measurements were conducted on the Inveon™ preclinical small animal PET system developed by Siemens Medical Solutions. The scanner uses 1.51 × 1.51 × 10 mm LSO crystals grouped in 20 × 20 blocks; a tapered light guide couples the LSO crystals of a block to a position-sensitive photomultiplier tube. There are 80 rings with 320 crystals per ring and the ring diameter is 161 mm. The transaxial and axial fields of view (FOVs) are 100 and 127 mm, respectively. The scanner can be docked to a CT scanner; the performance characteristics of the CT component are not included herein. Performance measurements of spatial resolution, sensitivity, scatter fraction and count rate performance were obtained for different energy windows and coincidence timing window widths. For brevity, the results described here are for an energy window of 350–650 keV and a coincidence timing window of 3.43 ns. The spatial resolution at the center of the transaxial and axial FOVs was 1.56, 1.62 and 2.12 mm in the tangential, radial and axial directions, respectively, and the system sensitivity was 36.2 cps kBq−1 for a line source (7.2% for a point source). For mouse- and rat-sized phantoms, the scatter fraction was 5.7% and 14.6%, respectively. The peak noise equivalent count rate with a noisy randoms estimate was 1475 kcps at 130 MBq for the mouse-sized phantom and 583 kcps at 74 MBq for the rat-sized phantom. The performance measurements indicate that the Inveon™ PET scanner is a high-resolution tomograph with excellent sensitivity that is capable of imaging at a high count rate. PMID:19321924
Effect of Data Assimilation Parameters on The Optimized Surface CO2 Flux in Asia
NASA Astrophysics Data System (ADS)
Kim, Hyunjung; Kim, Hyun Mee; Kim, Jinwoong; Cho, Chun-Ho
2018-02-01
In this study, CarbonTracker, an inverse modeling system based on the ensemble Kalman filter, was used to evaluate the effects of data assimilation parameters (assimilation window length and ensemble size) on the estimation of surface CO2 fluxes in Asia. Several experiments with different parameters were conducted, and the results were verified using CO2 concentration observations. The assimilation window lengths tested were 3, 5, 7, and 10 weeks, and the ensemble sizes were 100, 150, and 300; a total of 12 experiments using combinations of these parameters were therefore conducted. The experimental period was from January 2006 to December 2009. Differences between the optimized surface CO2 fluxes of the experiments were largest in the Eurasian Boreal (EB) area, followed by Eurasian Temperate (ET) and Tropical Asia (TA), and were larger in boreal summer than in boreal winter. Over Asia as a whole, the effect of ensemble size on the optimized biosphere flux is larger than that of the assimilation window length, but their relative importance varies among specific regions. The optimized biosphere flux was more sensitive to the assimilation window length in EB, whereas in ET it was sensitive to the ensemble size as well as the assimilation window length. The larger the ensemble size and the shorter the assimilation window length, the larger the uncertainty (i.e., ensemble spread) of the optimized surface CO2 fluxes. Based on several verifications against CO2 concentration measurements, a 10-week assimilation window with an ensemble size of 300 was the optimal configuration for CarbonTracker in the Asian region.
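To show how the two assimilation parameters enter an ensemble Kalman filter, the following minimal Python sketch updates a vector of surface-flux scaling factors against CO2 observations. The observation operator H, the error levels and the prior spread are placeholders, not CarbonTracker's actual configuration; a longer assimilation window simply means each flux element remains in the state vector for more analysis cycles like this one.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_flux_update(prior_scaling, H, obs, obs_err, n_ens, prior_spread=0.8):
    """One EnKF analysis of flux scaling factors (toy version).

    prior_scaling : (n_regions,) prior mean of the scaling factors
    H             : (n_obs, n_regions) linearized transport/observation operator (placeholder)
    obs, obs_err  : observed CO2 mole fractions and their 1-sigma errors
    n_ens         : ensemble size (one of the parameters varied in the study)
    """
    ens = prior_scaling[:, None] + prior_spread * rng.standard_normal((prior_scaling.size, n_ens))
    sim = H @ ens                                        # simulated CO2 at the observation sites
    X = ens - ens.mean(axis=1, keepdims=True)
    Y = sim - sim.mean(axis=1, keepdims=True)
    Pxy = X @ Y.T / (n_ens - 1)
    Pyy = Y @ Y.T / (n_ens - 1) + np.diag(obs_err**2)
    K = Pxy @ np.linalg.inv(Pyy)                         # Kalman gain
    analysis = ens + K @ (obs[:, None] - sim)
    return analysis.mean(axis=1), analysis.std(axis=1)   # posterior mean and ensemble spread
```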
Kahle, Logan Q; Flannery, Maureen E; Dumbacher, John P
2016-01-01
Bird-window collisions are a major and poorly-understood generator of bird mortality. In North America, studies of this topic tend to be focused east of the Mississippi River, resulting in a paucity of data from the Western flyways. Additionally, few available data can critically evaluate factors such as time of day, sex and age bias, and the effect of window pane size on collisions. We collected and analyzed 5 years of window strike data from a 3-story building in a large urban park in San Francisco, California. To evaluate our window collision data in context, we collected weekly data on local bird abundance in the adjacent parkland. Our study asks two overarching questions: first, what aspects of a bird's biology might make it more likely to fatally strike windows; and second, what characteristics of a building's design contribute to bird-window collisions. We used a dataset of 308 fatal bird strikes to examine the relationships of strikes relative to age, sex, time of day, time of year, and a variety of other factors, including mitigation efforts. We found that actively migrating birds may not be major contributors to collisions as has been found elsewhere. We found that males and young birds were both significantly overrepresented relative to their abundance in the habitat surrounding the building. We also analyzed the effect of external window shades as mitigation, finding that an overall reduction in large panes, whether covered or in some way broken up with mullions, effectively reduced window collisions. We conclude that effective mitigation or design will be required in all seasons, but that breeding seasons and migratory seasons are most critical, especially for low-rise buildings and other sites away from urban migrant traps. Finally, strikes occur throughout the day, but mitigation may be most effective in the morning and midday.
Ordering process in the diffusively coupled logistic lattice
NASA Astrophysics Data System (ADS)
Conrado, Claudine V.; Bohr, Tomas
1991-08-01
We study the ordering process in a lattice of diffusively coupled logistic maps for increasing lattice size. Within a window of parameters, the system goes into a weakly chaotic state with long range "antiferromagnetic" order. This happens for arbitrary lattice size L and the ordering time behaves as t ~ L², as we would expect from a picture of diffusing defects.
Early Warning for Large Magnitude Earthquakes: Is it feasible?
NASA Astrophysics Data System (ADS)
Zollo, A.; Colombelli, S.; Kanamori, H.
2011-12-01
The mega-thrust, Mw 9.0, 2011 Tohoku earthquake has re-opened the discussion among the scientific community about the effectiveness of Earthquake Early Warning (EEW) systems when applied to such large events. Many EEW systems are now under testing or development worldwide, and most of them are based on the real-time measurement of ground motion parameters in a few-second window after the P-wave arrival. Currently, we are using the initial Peak Displacement (Pd) and the Predominant Period (τc), among other parameters, to rapidly estimate the earthquake magnitude and damage potential. A well-known problem with the real-time estimation of magnitude is parameter saturation. Several authors have shown that the scaling laws between early warning parameters and magnitude are robust and effective up to magnitude 6.5-7; the correlation, however, has not yet been verified for larger events. The Tohoku earthquake occurred near the east coast of Honshu, Japan, on the subduction boundary between the Pacific and the Okhotsk plates. The high-quality KiK-net and K-NET networks provided a large quantity of strong motion records of the mainshock, with a wide azimuthal coverage both along the Japan coast and inland. More than 300 3-component accelerograms were available, with epicentral distances ranging from about 100 km up to more than 500 km. This earthquake thus presents an optimal case study for testing the physical bases of early warning and for investigating the feasibility of a real-time estimation of earthquake size and damage potential even for M > 7 earthquakes. In the present work we used the acceleration waveform data of the main shock for stations along the coast, up to 200 km epicentral distance. We measured the early warning parameters, Pd and τc, within different time windows, starting from 3 seconds and expanding the testing time window up to 30 seconds. The aim is to verify the correlation of these parameters with Peak Ground Velocity and Magnitude, respectively, as a function of the length of the P-wave window. The entire rupture process of the Tohoku earthquake lasted more than 120 seconds, as shown by the source time functions obtained by several authors. When a 3-second window is used to measure Pd and τc, the result is an obvious underestimation of the event size and final PGV. However, as the time window increases up to 27-30 seconds, the measured values of Pd and τc become comparable with those expected for a magnitude M≥8.5 earthquake, according to the τc vs. M and the PGV vs. Pd relationships obtained in a previous work. Since we did not observe any saturation effect for the predominant period and peak displacement measured within a 30-second P-wave window, we infer that, at least from a theoretical point of view, the estimation of earthquake damage potential through the early warning parameters is still feasible for large events, provided that a longer time window is used for parameter measurement. The off-line analysis of the Tohoku event records shows that reliable estimations of the damage potential could have been obtained 40-50 seconds after the origin time, by updating the measurements of the early warning parameters in progressively enlarged P-wave time windows from 3 to 30 seconds.
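For readers unfamiliar with the two parameters, the Python sketch below (assuming SciPy is available) computes Pd and τc from a vertical acceleration trace over a fixed P-wave window; the 0.075 Hz high-pass cut-off is an illustrative choice, not the authors' exact processing.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.integrate import cumulative_trapezoid, trapezoid

def pd_tauc(acc, dt, window_s, fc=0.075):
    """Pd (peak displacement) and tau_c from the first `window_s` seconds after the P arrival.

    acc : vertical acceleration trace starting at the P pick, sampled every dt seconds.
    """
    n = int(window_s / dt)
    vel = cumulative_trapezoid(acc[:n], dx=dt, initial=0.0)      # integrate to velocity
    disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)         # integrate to displacement
    b, a = butter(2, 2.0 * fc * dt, btype="highpass")            # remove drift from double integration
    vel, disp = filtfilt(b, a, vel), filtfilt(b, a, disp)
    r = trapezoid(vel**2, dx=dt) / trapezoid(disp**2, dx=dt)
    tau_c = 2.0 * np.pi / np.sqrt(r)
    return np.max(np.abs(disp)), tau_c
```

Calling this function with expanding values of window_s (3 s, 6 s, ..., 30 s) reproduces the kind of progressively enlarged P-wave measurement described in the abstract.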
NASA Astrophysics Data System (ADS)
Anstey, Josephine; Pape, Dave
2013-03-01
In this paper we discuss Mrs. Squandertime, a real-time, persistent simulation of a virtual character, her living room, and the view from her window, designed to be a wall-size, projected art installation. Through her large picture window, the eponymous Mrs. Squandertime watches the sea: boats, clouds, gulls, the tide going in and out, people on the sea wall. The hundreds of images that compose the view are drawn from historical printed sources. The program that assembles and animates these images is driven by weather, time, and tide data constantly updated from a real physical location. The character herself is rendered photographically in a series of slowly dissolving stills which correspond to the character's current behavior.
Measuring floodplain spatial patterns using continuous surface metrics at multiple scales
Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.
2015-01-01
Interactions between fluvial processes and floodplain ecosystems occur upon a floodplain surface that is often physically complex. Spatial patterns in floodplain topography have only recently been quantified over multiple scales, and discrepancies exist in how floodplain surfaces are perceived to be spatially organised. We measured spatial patterns in floodplain topography for pool 9 of the Upper Mississippi River, USA, using moving window analyses of eight surface metrics applied to a 1 × 1 m² DEM over multiple scales. The metrics used were Range, SD, Skewness, Kurtosis, CV, SDCURV, Rugosity, and Vol:Area, and window sizes ranged from 10 to 1000 m in radius. Surface metric values were highly variable across the floodplain and revealed a high degree of spatial organisation in floodplain topography. Moran's I correlograms fit to the landscape of each metric at each window size revealed that patchiness existed at nearly all window sizes, but the strength and scale of patchiness changed with window size, suggesting that multiple scales of patchiness and patch structure exist in the topography of this floodplain. Scale thresholds in the spatial patterns were observed, particularly between the 50 and 100 m window sizes for all surface metrics and between the 500 and 750 m window sizes for most metrics. These threshold scales are ~ 15-20% and 150% of the main channel width (1-2% and 10-15% of the floodplain width), respectively. These thresholds may be related to structuring processes operating across distinct scale ranges. By coupling surface metrics, multi-scale analyses, and correlograms, quantifying floodplain topographic complexity is possible in ways that should assist in clarifying how floodplain ecosystems are structured.
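As a minimal sketch of the kind of moving-window computation involved (not the authors' code), the Python snippet below derives two of the listed metrics, Range and SD, over an approximately circular window on a DEM array; the window radius is given in pixels, and the SD uses a square window for speed.

```python
import numpy as np
from scipy.ndimage import generic_filter, uniform_filter

def window_metrics(dem, radius_px):
    """Moving-window Range and SD of a DEM; the footprint approximates a circular window."""
    y, x = np.ogrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    disk = (x**2 + y**2) <= radius_px**2
    # Range: max - min of elevations inside the circular footprint
    elev_range = generic_filter(dem, lambda v: v.max() - v.min(), footprint=disk)
    # SD via the identity var = E[z^2] - E[z]^2, on a square window of the same radius
    size = 2 * radius_px + 1
    mean = uniform_filter(dem, size=size)
    sq_mean = uniform_filter(dem**2, size=size)
    sd = np.sqrt(np.maximum(sq_mean - mean**2, 0.0))
    return elev_range, sd
```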
Window decompression in laser-heated MagLIF targets
NASA Astrophysics Data System (ADS)
Woodbury, Daniel; Peterson, Kyle; Sefkow, Adam
2015-11-01
The Magnetized Liner Inertial Fusion (MagLIF) concept requires pre-magnetized fuel to be pre-heated with a laser before undergoing compression by a thick solid liner. Recent experiments and simulations suggest that yield has been limited to date by poor laser preheat and laser-induced mix in the fuel region. In order to assess laser energy transmission through the pressure-holding window, as well as resultant mix, we modeled window disassembly under different conditions using 1D and 2D simulations in both Helios and HYDRA. We present results tracking energy absorption, time needed for decompression, risk of laser-plasma interaction (LPI) that may scatter laser light, and potential for mix from various window thicknesses, laser spot sizes and gas fill densities. These results indicate that using thinner windows (0.5-1 μm) and relatively large laser spot radii (600 μm and above) can avoid deleterious effects and improve coupling with the fuel. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the National Nuclear Security Administration under DE-AC04-94AL85000.
A soft, transparent, freely accessible cranial window for chronic imaging and electrophysiology
Heo, Chaejeong; Park, Hyejin; Kim, Yong-Tae; Baeg, Eunha; Kim, Yong Ho; Kim, Seong-Gi; Suh, Minah
2016-01-01
Chronic in vivo imaging and electrophysiology are important for better understanding of neural functions and circuits. We introduce the new cranial window using soft, penetrable, elastic, and transparent, silicone-based polydimethylsiloxane (PDMS) as a substitute for the skull and dura in both rats and mice. The PDMS can be readily tailored to any size and shape to cover large brain area. Clear and healthy cortical vasculatures were observed up to 15 weeks post-implantation. Real-time hemodynamic responses were successfully monitored during sensory stimulation. Furthermore, the PDMS window allowed for easy insertion of microelectrodes and micropipettes into the cortical tissue for electrophysiological recording and chemical injection at any location without causing any fluid leakage. Longitudinal two-photon microscopic imaging of Cx3Cr1+/− GFP transgenic mice was comparable with imaging via a conventional glass-type cranial window, even immediately following direct intracortical injection. This cranial window will facilitate direct probing and mapping for long-term brain studies. PMID:27283875
Switching times of nanoscale FePt: Finite size effects on the linear reversal mechanism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, M. O. A.; Chantrell, R. W.
2015-04-20
The linear reversal mechanism in FePt grains ranging from 2.316 nm to 5.404 nm has been simulated using atomistic spin dynamics, parametrized from ab initio calculations. The Curie temperature and the critical temperature (T*), at which the linear reversal mechanism occurs, are observed to decrease with system size whilst the temperature window T*
Long-term, high-resolution imaging in the mouse neocortex through a chronic cranial window
Holtmaat, Anthony; Bonhoeffer, Tobias; Chow, David K; Chuckowree, Jyoti; De Paola, Vincenzo; Hofer, Sonja B; Hübener, Mark; Keck, Tara; Knott, Graham; Lee, Wei-Chung A; Mostany, Ricardo; Mrsic-Flogel, Tom D; Nedivi, Elly; Portera-Cailliau, Carlos; Svoboda, Karel; Trachtenberg, Joshua T; Wilbrecht, Linda
2011-01-01
To understand the cellular and circuit mechanisms of experience-dependent plasticity, neurons and their synapses need to be studied in the intact brain over extended periods of time. Two-photon excitation laser scanning microscopy (2PLSM), together with expression of fluorescent proteins, enables high-resolution imaging of neuronal structure in vivo. In this protocol we describe a chronic cranial window to obtain optical access to the mouse cerebral cortex for long-term imaging. A small bone flap is replaced with a coverglass, which is permanently sealed in place with dental acrylic, providing a clear imaging window with a large field of view (∼0.8–12 mm²). The surgical procedure can be completed within ∼1 h. The preparation allows imaging over time periods of months with arbitrary imaging intervals. The large size of the imaging window facilitates imaging of ongoing structural plasticity of small neuronal structures in mice, with low densities of labeled neurons. The entire dendritic and axonal arbor of individual neurons can be reconstructed. PMID:19617885
NASA Astrophysics Data System (ADS)
Susilawati, Enny; Mawengkang, Herman; Efendi, Syahril
2018-01-01
Generally, a Vehicle Routing Problem with Time Windows (VRPTW) can be defined as the problem of determining the optimal set of routes used by a fleet of vehicles to serve a given set of customers with service time restrictions; the objective is to minimize the total travel cost (related to the travel times or distances) and the operational cost (related to the number of vehicles used). In this paper we address a variant of the VRPTW in which the fleet of vehicles is heterogeneous due to the differing sizes of customer demands. The problem, called the Heterogeneous VRP (HVRP), also includes service levels. We use an integer programming model to describe the problem, and a feasible neighbourhood approach is proposed to solve it.
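As a small illustration of the time-window constraints involved (a generic sketch, not the paper's integer programming model or neighbourhood search), the following Python function checks whether a single route respects vehicle capacity and customer time windows:

```python
def route_feasible(route, travel, demand, capacity, tw, service):
    """Check capacity and time-window feasibility of a route [0, c1, ..., ck, 0] (0 = depot).

    travel[i][j] : travel time from i to j
    demand[i]    : demand of customer i (demand[0] = 0)
    tw[i]        : (earliest, latest) allowed start of service at i
    service[i]   : service duration at i
    """
    if sum(demand[c] for c in route[1:-1]) > capacity:
        return False
    t = tw[route[0]][0]                                   # leave the depot at its opening time
    for prev, cur in zip(route, route[1:]):
        arrival = t + service[prev] + travel[prev][cur]
        t = max(arrival, tw[cur][0])                      # wait if arriving before the window opens
        if t > tw[cur][1]:                                # too late: time window violated
            return False
    return True
```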
2014-10-16
Keywords: time-frequency analysis, short-time Fourier transform, Wigner-Ville distribution, Fourier-Bessel transform, fractional Fourier transform. The most widely used time-frequency transforms are the short-time Fourier transform (STFT) and the Wigner-Ville distribution (WVD). In the STFT, time and frequency resolutions are limited by the size of the window function used in calculating the STFT. For mono-component signals, the WVD gives the best time and frequency resolution.
The short time Fourier transform and local signals
NASA Astrophysics Data System (ADS)
Okumura, Shuhei
In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform by a fixed-sized, moving window to input series. We move the window by one time point at a time, so we have overlapping windows. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and their outputs in closed forms. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise time series input results in the STFT output being a complex-valued stationary time series and we can derive the time and time-frequency dependency structure such as the cross-covariance functions. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared modulus STFT time series has consecutive large values exceeding some threshold after one exceeding observation following one observation less than the threshold. We discuss a method to reduce the computation of such probabilities by the Box-Cox transformation and the delta method, and show that it works well in comparison to the Monte Carlo simulation method.
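A compact Python sketch of the overlapping-window STFT described here, together with a naive detector for a run of k consecutive above-threshold values immediately preceded by a below-threshold observation, is given below. The threshold and window length are illustrative; the thesis evaluates such exceedance probabilities analytically via the Box-Cox transformation and the delta method rather than by brute-force search.

```python
import numpy as np

def stft_squared_modulus(x, win_len):
    """Overlapping-window STFT (hop = 1 sample) and its squared modulus."""
    w = np.hanning(win_len)
    frames = np.lib.stride_tricks.sliding_window_view(x, win_len) * w
    return np.abs(np.fft.rfft(frames, axis=1))**2        # shape: (n_frames, n_freq_bins)

def detect_local_signal(mod2, k, thresh):
    """Return (frame, bin) pairs where k consecutive values exceed `thresh`
    right after one observation below the threshold."""
    hits = mod2 > thresh
    detections = []
    for j in range(mod2.shape[1]):
        col = hits[:, j]
        for t in range(1, len(col) - k + 1):
            if not col[t - 1] and col[t:t + k].all():
                detections.append((t, j))
    return detections
```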
Timing of Gestational Weight Gain on Fetal Growth and Infant Size at Birth in Vietnam
Young, Melissa F.; Hong Nguyen, Phuong; Addo, O. Yaw; Pham, Hoa; Nguyen, Son; Martorell, Reynaldo; Ramakrishnan, Usha
2017-01-01
Objective: To examine the importance of the timing of gestational weight gain during three time periods (1: ≤20 weeks gestation; 2: 21–29 weeks; 3: ≥30 weeks) on fetal growth and infant birth size. Methods: The study uses secondary data from the PRECONCEPT randomized controlled trial in Thai Nguyen province, Vietnam (n = 1436). Prospective data were collected on women starting pre-pregnancy through delivery. Maternal conditional weight gain (CWG) was defined as window-specific weight gain, uncorrelated with pre-pregnancy body mass index and all prior body weights. Fetal biometry was assessed by ultrasound measurements of head and abdomen circumferences, biparietal diameter, and femoral length throughout pregnancy. Birth size outcomes included weight and length, and head, abdomen and mid upper arm circumferences, as well as small for gestational age (SGA). Adjusted generalized linear and logistic models were used to examine associations. Results: Overall, three-quarters of women gained below the Institute of Medicine guidelines, and these women were 2.5 times more likely to give birth to an SGA infant. Maternal CWG in the first window (≤20 weeks), followed by 21–29 weeks, had the greatest association with all parameters of fetal growth (except abdomen circumference) and infant size at birth. For birth weight, a 1 SD increase in CWG in the first 20 weeks had 3 times the influence of later CWG (≥30 weeks) (111 g vs. 39 g) and was associated with a 43% reduction in SGA risk (OR (95% CI): 0.57 (0.46–0.70)). Conclusion: There is a need to target women before or early in pregnancy to ensure adequate nutrition to maximize the impact on fetal growth and birth size. Trial Registration: ClinicalTrials.gov, NCT01665378 PMID:28114316
Compensation for Blur Requires Increase in Field of View and Viewing Time
Kwon, MiYoung; Liu, Rong; Chien, Lillian
2016-01-01
Spatial resolution is an important factor for human pattern recognition. In particular, low resolution (blur) is a defining characteristic of low vision. Here, we examined spatial (field of view) and temporal (stimulus duration) requirements for blurry object recognition. The spatial resolution of an image, such as a letter or a face, was manipulated with a low-pass filter. In experiment 1, which studied the spatial requirement, observers viewed a fixed-size object through a window of varying sizes, which was repositioned until object identification (moving window paradigm). The field-of-view requirement, quantified as the number of “views” (window repositions) needed for correct recognition, was obtained for three blur levels, including no blur. In experiment 2, which studied the temporal requirement, we determined the threshold viewing time (the stimulus duration yielding criterion recognition accuracy) at six blur levels, including no blur. For letter and face recognition, we found blur significantly increased the number of views, suggesting a larger field of view is required to recognize blurry objects. We also found blur significantly increased threshold viewing time, suggesting longer temporal integration is necessary to recognize blurry objects. The temporal integration reflects the tradeoff between stimulus intensity and time. While humans excel at recognizing blurry objects, our findings suggest compensating for blur requires increased field of view and viewing time. The need for larger spatial and longer temporal integration for recognizing blurry objects may further challenge object recognition in low vision. Thus, interactions between blur and field of view should be considered for developing low vision rehabilitation or assistive aids. PMID:27622710
Solving the chemical master equation using sliding windows
2010-01-01
Background The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
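To make the idea concrete, here is a heavily simplified Python sketch of a sliding-window CME solver for a hypothetical birth-death process (production at rate k1, degradation at rate k2*n). The real method constructs the window from deterministic bounds on the species populations; this toy version simply re-centres a fixed-width window on the probability mass after each step.

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 10.0, 0.1     # hypothetical birth and per-molecule death rates

def cme_rhs(t, p, lo):
    """Truncated CME right-hand side on the window of copy numbers lo .. lo+len(p)-1."""
    n = lo + np.arange(p.size)
    dp = -(k1 + k2 * n) * p
    dp[1:] += k1 * p[:-1]               # birth into state n from n-1
    dp[:-1] += k2 * n[1:] * p[1:]       # death into state n from n+1
    return dp

def sliding_window_cme(p0, lo, t_end, steps=20, width=80):
    p, t = p0.copy(), 0.0
    for _ in range(steps):
        sol = solve_ivp(cme_rhs, (t, t + t_end / steps), p, args=(lo,), method="BDF")
        p, t = sol.y[:, -1], sol.t[-1]
        mean_n = np.sum((lo + np.arange(p.size)) * p) / p.sum()
        new_lo = max(int(round(mean_n)) - width // 2, 0)   # re-centre the window
        new_p = np.zeros(width)
        for i, prob in enumerate(p):                       # copy probabilities that stay inside
            j = lo + i - new_lo
            if 0 <= j < width:
                new_p[j] = prob
        p, lo = new_p, new_lo
    return lo, p

width = 80
p_init = np.zeros(width); p_init[0] = 1.0
lo, p = sliding_window_cme(p_init, 0, t_end=50.0)
print("window starts at", lo, "; mean copy number ~", np.sum((lo + np.arange(width)) * p) / p.sum())
```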
Scale up of large ALON® and spinel windows
NASA Astrophysics Data System (ADS)
Goldman, Lee M.; Kashalikar, Uday; Ramisetty, Mohan; Jha, Santosh; Sastri, Suri
2017-05-01
Aluminum Oxynitride (ALON® Transparent Ceramic) and Magnesia Aluminate Spinel (Spinel) combine broadband transparency with excellent mechanical properties. Their cubic structure means that they are transparent in their polycrystalline form, allowing them to be manufactured by conventional powder processing techniques. Surmet has scaled up its ALON® production capability to produce and deliver windows as large as 4.4 sq ft. We have also produced our first 6 sq ft window. We are in the process of producing 7 sq ft ALON® window blanks for armor applications; and scale up to even larger, high optical quality blanks for Recce window applications is underway. Surmet also produces spinel for customers that require superior transmission at the longer wavelengths in the mid wave infra-red (MWIR). Spinel windows have been limited to smaller sizes than have been achieved with ALON. To date the largest spinel window produced is 11x18-in, and windows 14x20-in size are currently in process. Surmet is now scaling up its spinel processing capability to produce high quality window blanks as large as 19x27-in for sensor applications.
Chen, Keguang; Yin, Dongming; Lyu, Huiying; Yang, Lin; Zhang, Tianyu; Dai, Peidong
2016-01-01
As the severity of the external auditory canal malformation increased, the size of the extra-niche fossa became smaller; these results provide concrete data and valuable information for better design, selection and safer implantation of the transducer in the area of the round window niche. Three-dimensional measurements and assessments before surgery might be helpful for a safer surgical approach and implantation of a vibrant soundbridge. The aim of this study was to investigate whether differences exist in the morphology of the posterior tympanum related to round window vibroplasty among congenital aural atresia (CAA), congenital aural stenosis (CAS), and a normal control group, and to analyze their effect on round window implantation of the vibrant soundbridge. CT images of 10 normal subjects (20 ears), 27 CAS patients (30 ears), and 25 CAA patients (30 ears) were analyzed. The depth and size of the fossa outside the round window niche that is relevant to round window vibroplasty (the extra-niche fossa), and the distances between the center of the round window niche and the extra-niche fossa, were calculated from three-dimensional reconstructions using Mimics software. Finally, the data were analyzed statistically. The extra-niche fossa in the atresia group was smaller than in the stenosis group (p < 0.05); furthermore, the extra-niche fossa in the stenosis group was smaller than in the control group (p < 0.05). There was no statistically significant difference in the depth of the extra-niche fossa among the groups.
Study of noise transmission through double wall aircraft windows
NASA Technical Reports Server (NTRS)
Vaicaitis, R.
1983-01-01
Analytical and experimental procedures were used to predict the noise transmitted through double wall windows into the cabin of a twin-engine G/A aircraft. The analytical model was applied to optimize cabin noise through parametric variation of the structural and acoustic parameters. The parametric study includes mass addition, increase in plexiglass thickness, decrease in window size, increase in window cavity depth, depressurization of the space between the two window plates, replacement of the air cavity with a transparent viscoelastic material, change in stiffness of the plexiglass material, and different absorptive materials for the interior walls of the cabin. It was found that increasing the exterior plexiglass thickness and/or decreasing the total window size could achieve the proper amount of noise reduction for this aircraft. The total added weight to the aircraft is then about 25 lbs.
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high-quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of a size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
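As a rough illustration of step iii) above (not the authors' actual data or array), the Python sketch below computes Taguchi's nominal-is-best signal-to-noise ratio for each run of a hypothetical two-level design (here a full 2^3 factorial for three made-up factors) and picks, for each factor, the level that maximizes the mean S/N ratio.

```python
import numpy as np

# Hypothetical design matrix: rows = runs, columns = factor levels (0/1).
design = np.array([[0,0,0],[0,0,1],[0,1,0],[0,1,1],
                   [1,0,0],[1,0,1],[1,1,0],[1,1,1]])
# Placeholder replicated window-size measurements (um) per run; the target dimension is assumed.
y = np.random.default_rng(1).normal(3.0, 0.2, size=(8, 5))

# Taguchi "nominal-is-best" signal-to-noise ratio per run: 10*log10(mean^2 / variance).
sn = 10.0 * np.log10(y.mean(axis=1)**2 / y.var(axis=1, ddof=1))

for f in range(design.shape[1]):
    level_sn = [sn[design[:, f] == lvl].mean() for lvl in (0, 1)]
    print(f"factor {f}: best level = {int(np.argmax(level_sn))}, S/N by level = {level_sn}")
```

Because the S/N ratio combines the process mean and variance, maximizing it addresses both quantities mentioned in the abstract.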
DOT National Transportation Integrated Search
2009-04-01
This paper studies approximations to the average length of Vehicle Routing Problems (VRP). The approximations are valuable for strategic and : planning analysis of transportation and logistics problems. The research focus is on VRP with varying numbe...
Writers Identification Based on Multiple Windows Features Mining
NASA Astrophysics Data System (ADS)
Fadhil, Murad Saadi; Alkawaz, Mohammed Hazim; Rehman, Amjad; Saba, Tanzila
2016-03-01
Nowadays, writer identification is in high demand, as it aims to identify the original writer of a script with high accuracy. One of the main challenges in writer identification is how to extract discriminative features from different authors' scripts so that they can be classified precisely. In this paper, an adaptive division method for offline Latin script is implemented using several variant window sizes. From fragments of the binarized text, a set of features is extracted and classified into clusters in the form of groups or classes. Finally, the proposed approach is tested with various parameters in terms of text division and window sizes. It is observed that selection of the right window size yields a well-positioned window division. The proposed approach is tested on the IAM standard dataset (IAM, Institut für Informatik und angewandte Mathematik, University of Bern, Bern, Switzerland), which is a constraint-free script database. Finally, the achieved results are compared with several techniques reported in the literature.
Next generation smart window display using transparent organic display and light blocking screen.
Kim, Gyeong Woo; Lampande, Raju; Choe, Dong Cheol; Ko, Ik Jang; Park, Jin Hwan; Pode, Ramchandra; Kwon, Jang Hyuk
2018-04-02
Transparent organic light emitting diodes (TOLEDs) have widespread applications in next-generation display devices, particularly in large-size transparent window and interactive displays. Herein, we report high-performance, stable and attractive smart window displays fabricated using a facile process. The advanced smart window display is realized by integrating a high-performance light blocking screen with a highly transparent white OLED panel. The full smart window display reveals a maximum transmittance as high as 64.2% at a wavelength of 600 nm and an extremely good, tunable ambient contrast ratio (171.94:1) compared with that of a normal TOLED (4.54:1). Furthermore, the performance-decisive light blocking screen demonstrates excellent optical and electrical characteristics, such as i) high transmittance (85.56% at 562 nm) in the light-penetrating state, ii) superior absorbance (2.30 at 562 nm) in the light-interrupting mode, iii) high optical contrast (85.50 at 562 nm), iv) high optical stability over more than 25,000 driving cycles, v) a fast switching time of 1.9 s, and vi) a low driving voltage of 1.7 V. The experimental results of the smart window display are also validated using optical simulation. The proposed smart window display technology allows the intensity of daylight entering the system to be adjusted quickly and conveniently.
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.
2006-01-01
Mechanical and physical properties of ZnSe windows to be used with the FEANICS (Flow Enclosure Accommodating Novel Investigations in Combustion of Solids) experiments were measured in order to determine design allowables. The average Young's modulus, Poisson's ratio, equibiaxial fracture strength, flaw size, grain size, Knoop hardness, Vickers hardness, and branching constant were 74.3 +/- 0.1 GPa, 0.31, 57.8 +/- 6.5 MPa, 21 +/- 4 microns, 43 +/- 9 microns, 0.97 +/- 0.02 GPa, 0.97 +/- 0.02 GPa, and 1.0 +/- 0.1 MPa√m, respectively. The properties of current ZnSe made by chemical vapor deposition are in good agreement with those measured in the 1970s. The hardness of CVD ZnSe windows is about one-twentieth that of the sapphire window being replaced, and about one-sixth that of window glass. Thus the ZnSe window must be handled with great care. The large grain size relative to the inherent crack size implies the need to use single-crystal crack-growth properties in the design process. In order to determine the local failure stresses in one of the test specimens, a solution was derived for the stresses between the support ring and the edge of a circular plate loaded between concentric rings.
Real-Time, Polyphase-FFT, 640-MHz Spectrum Analyzer
NASA Technical Reports Server (NTRS)
Zimmerman, George A.; Garyantes, Michael F.; Grimm, Michael J.; Charny, Bentsian; Brown, Randy D.; Wilck, Helmut C.
1994-01-01
A real-time polyphase fast-Fourier-transform (polyphase-FFT) spectrum analyzer was designed to aid in the detection of multigigahertz radio signals in two 320-MHz-wide polarization channels. The spectrum analyzer divides the total spectrum of 640 MHz into 33,554,432 frequency channels of about 20 Hz each. The size and cost of the polyphase-coefficient memory are substantially reduced, and much of the processing loss of windowed FFTs is eliminated.
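For context, a polyphase-FFT spectrometer applies a long windowed prototype filter whose taps are folded (summed) onto the FFT length before transforming, which recovers much of the loss of a plain windowed FFT. The Python sketch below shows a critically sampled channelizer of this kind; the prototype filter and tap count are illustrative choices, not the flight design.

```python
import numpy as np

def polyphase_channelizer(x, n_channels, taps_per_branch=8):
    """Critically sampled polyphase filter-bank channelizer (weighted overlap-add form)."""
    m = n_channels * taps_per_branch
    # Illustrative low-pass prototype: windowed sinc with cutoff at one channel width.
    h = np.hanning(m) * np.sinc((np.arange(m) - m / 2 + 0.5) / n_channels)
    n_frames = (len(x) - m) // n_channels
    out = np.empty((n_frames, n_channels), dtype=complex)
    for k in range(n_frames):
        seg = x[k * n_channels:k * n_channels + m] * h
        # Fold the taps_per_branch blocks onto the FFT length, then transform.
        out[k] = np.fft.fft(seg.reshape(taps_per_branch, n_channels).sum(axis=0))
    return out   # rows: time frames, columns: frequency channels
```

The folding step is what distinguishes the polyphase approach from simply windowing and transforming n_channels samples at a time.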
NASA Astrophysics Data System (ADS)
Leavey, Anna; Reed, Nathan; Patel, Sameer; Bradley, Kevin; Kulkarni, Pramod; Biswas, Pratim
2017-10-01
Advanced automobile technology, developed infrastructure, and changing economic markets have resulted in increasing commute times. Traffic is a major source of harmful pollutants and consequently daily peak exposures tend to occur near roadways or while travelling on them. The objective of this study was to measure simultaneous real-time particulate matter (particle numbers, lung-deposited surface area, PM2.5, particle number size distributions) and CO concentrations outside and in-cabin of an on-road car during regular commutes to and from work. Data was collected for different ventilation parameters (windows open or closed, fan on, AC on), whilst travelling along different road-types with varying traffic densities. Multiple predictor variables were examined using linear mixed-effects models. Ambient pollutants (NOx, PM2.5, CO) and meteorological variables (wind speed, temperature, relative humidity, dew point) explained 5-44% of outdoor pollutant variability, while the time spent travelling behind a bus was statistically significant for PM2.5, lung-deposited SA, and CO (adj-R2 values = 0.12, 0.10, 0.13). The geometric mean diameter (GMD) for outdoor aerosol was 34 nm. Larger cabin GMDs were observed when windows were closed compared to open (b = 4.3, p-value = <0.01). When windows were open, cabin total aerosol concentrations tracked those outdoors. With windows closed, the pollutants took longer to enter the vehicle cabin, but also longer to exit it. Concentrations of pollutants in cabin were influenced by outdoor concentrations, ambient temperature, and the window/ventilation parameters. As expected, particle number concentrations were impacted the most by changes to window position/ventilation, and PM2.5 the least. Car drivers can expect their highest exposures when driving with windows open or the fan on, and their lowest exposures during windows closed or the AC on. Final linear mixed-effects models could explain between 88 and 97% of cabin pollutant concentration variability. An individual may control their commuting exposure by applying dynamic behavior modification to adapt to changing pollutant scenarios.
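The final models mentioned here are linear mixed-effects regressions. As a generic illustration (synthetic data and assumed column names, not the study's variables), such a model with a random intercept per commute trip can be fitted in Python with statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for per-minute commute records; columns are assumptions, not the study's data.
rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "trip_id": rng.integers(0, 30, n),              # grouping factor: one commute trip
    "outdoor_pm25": rng.gamma(4.0, 5.0, n),
    "temperature": rng.normal(20.0, 5.0, n),
    "window_open": rng.integers(0, 2, n),
    "behind_bus_time": rng.exponential(1.0, n),
})
df["cabin_pm25"] = (0.6 * df.outdoor_pm25 + 2.0 * df.window_open
                    + 1.5 * df.behind_bus_time + rng.normal(0.0, 2.0, n))

# Linear mixed-effects model with a random intercept per trip.
model = smf.mixedlm("cabin_pm25 ~ outdoor_pm25 + temperature + window_open + behind_bus_time",
                    data=df, groups=df["trip_id"])
print(model.fit().summary())
```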
Eye movement evidence for defocused attention in dysphoria--a perceptual span analysis.
Brzezicka, Aneta; Krejtz, Izabela; von Hecker, Ulrich; Laubrock, Jochen
2012-07-01
The defocused attention hypothesis (von Hecker and Meiser, 2005) assumes that negative mood broadens attention, whereas the analytical rumination hypothesis (Andrews and Thompson, 2009) suggests a narrowing of the attentional focus with depression. We tested these conflicting hypotheses by directly measuring the perceptual span in groups of dysphoric and control subjects, using eye tracking. In the moving window paradigm, information outside of a variable-width gaze-contingent window was masked during reading of sentences. In measures of sentence reading time and mean fixation duration, dysphoric subjects were more pronouncedly affected than controls by a reduced window size. This difference supports the defocused attention hypothesis and seems hard to reconcile with a narrowing of attentional focus.
Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin
2016-01-01
For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberrations varying with look angle. In this paper, deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz mode, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can only correct dominant Lukosz modes and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of conformal window with scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10−5 in optimized correction and is 1.427 × 10−5 in un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161
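As an illustration of the metric used in the model-based wavefront sensorless correction above, the low-spatial-frequency content of the image spectral density can be computed as in the Python sketch below; the cutoff and the normalization by total power are assumptions, not the authors' exact definition.

```python
import numpy as np

def low_freq_sharpness(img, cutoff=0.1):
    """Image-sharpness metric: fraction of the spectral density below a normalized radial frequency."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    psd = np.abs(spectrum)**2
    ny, nx = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]    # cycles per pixel
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
    low_freq_mask = np.sqrt(fx**2 + fy**2) < cutoff
    return psd[low_freq_mask].sum() / psd.sum()
```

Because only a small image (e.g. 128 × 128) and a small set of low-order modes are needed, the metric can be evaluated quickly enough for dynamic correction, which is the optimization discussed in the abstract.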
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
The knowledge of the dynamic loads acting on a structure is required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time-history reconstruction algorithm based on Dynamic Principal Component Analysis (DPCA) of acceleration time-history measurements processed in moving windows. We also present an optimal sensor placement algorithm to place a limited number of sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that, unlike earlier formulations, it does not require a finite element idealization of the structure and is therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA-based method. The effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. The studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
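Dynamic PCA differs from ordinary PCA in that it operates on a time-lagged (Hankel-type) data matrix. The Python sketch below shows only that moving-window, lagged-matrix core; the window length, lag count and 50% overlap are arbitrary choices, and it is not the paper's full force-reconstruction algorithm.

```python
import numpy as np

def dpca_window_scores(acc, window, lags, n_components):
    """Moving-window dynamic PCA of multi-channel acceleration data.

    acc : (n_samples, n_channels) acceleration time histories.
    Returns, per window, the scores of the leading dynamic principal components,
    which act as a reduced-order description of the excitation within that window.
    """
    n_t, _ = acc.shape
    scores = []
    for start in range(0, n_t - window, window // 2):          # 50% window overlap
        seg = acc[start:start + window]
        # Build the lagged data matrix: current and `lags` delayed copies side by side.
        X = np.hstack([seg[l:window - lags + l] for l in range(lags + 1)])
        X = X - X.mean(axis=0)
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        scores.append(U[:, :n_components] * s[:n_components])  # principal component scores
    return scores
```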
NASA Astrophysics Data System (ADS)
Zaripov, D. I.; Renfu, Li
2018-05-01
The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera involves processing large volumes of data and is often time consuming. In order to speed up the ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique uses projections of the interrogation window instead of its two-dimensional field of luminous intensity. This simplification accelerates the ZNCC computation by up to 28.8 times compared with ZNCC calculated directly, depending on the size of the interrogation window and the region of interest. The results of three synthetic test cases, a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended for initial velocity field calculation, with further correction using more accurate techniques.
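The following Python sketch contrasts direct 2D ZNCC with a projection-based variant that correlates the row and column sums of the two interrogation windows. It illustrates the general idea of replacing the 2D intensity field with its projections; the paper's exact projection correlation procedure may differ.

```python
import numpy as np

def zncc_direct(a, b):
    """Zero-normalized cross-correlation of two equally sized image windows."""
    a, b = a - a.mean(), b - b.mean()
    return float(a.ravel() @ b.ravel() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def zncc_projection(a, b):
    """Approximate ZNCC computed from the row and column projections (sums) of the windows."""
    pa = np.concatenate([a.sum(axis=0), a.sum(axis=1)])   # column sums, then row sums
    pb = np.concatenate([b.sum(axis=0), b.sum(axis=1)])
    return zncc_direct(pa, pb)
```

For an N x N window the projection version correlates 2N numbers instead of N^2, which is where the reported speed-up comes from.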
NASA Astrophysics Data System (ADS)
Hachani, Sahar; Boudevillain, Brice; Bargaoui, Zoubeida; Delrieu, Guy
2015-04-01
During the first Special Observation Period (SOP) of the Hydrological cycle in the Mediterranean Experiment (HyMeX, www.hymex.org), held in fall 2012 in the northwestern Mediterranean region, an observation network dedicated to rain studies was implemented in the Cévennes region, France. It mainly consisted of weather radars, micro rain radars, disdrometers and rain gauges. Observations were performed by a network of 25 OTT Parsivel optical disdrometers distributed with inter-distances ranging from a few meters up to about one hundred kilometers. This presentation focuses on the comparison of observations from one optical disdrometer located at Villeneuve-de-Berg with observations from the Météo-France/ARAMIS weather radar located at Bollène, about 60 km from the disdrometer. The period from September to November 2012 is studied. To analyze the structure of the rain observed by radar, a window of investigation centered on the disdrometer was selected, and the mean spatial values, standard deviation, gradients, and intermittency of radar reflectivity or rainfall intensity were computed for a time step of 5 minutes. Four different window sizes were analyzed: 1 km², 25 km², 100 km² and 400 km². On the other hand, the total concentration of drops Nt, the characteristic diameter of drops Dc, and a Gamma distribution shape parameter µ were estimated. The Gamma distribution for the DSD related to the disdrometer observations was estimated according to the modeling framework proposed by Yu et al. (2014). The correlation coefficient between the intensity R obtained by the disdrometer and the window-averaged R estimated using radar data is nearly 0.70 whatever the window; the highest value is found for the 25 km² window (0.74). Correlation coefficients between Dc and window-averaged R vary from 0.35 for the 1 km² window to 0.4 for the 400 km² window; they are weak and not sensitive to the choice of the window. Contrarily, for the mean radar reflectivity Z, the correlation coefficients with Dc, Nt and µ vary to some extent from the 1 km² window size to the 100 km² window size. The most sensitive is the correlation coefficient between Z and Nt; however, it presents the smallest correlations, while the highest correlations are found for Dc (respectively 0.80 and 0.74). The overall relations between the rainfall structure variables and the DSD parameters will be presented in the communication, with special attention to the weather and/or rainfall types (orographic, stratiform, and convective). References: Yu, N., Delrieu, G., Boudevillain, B., Hazenberg, P., and Uijlenhoet, R., 2014: Unified formulation of single and multi-moment normalizations of the raindrop size distribution based on the gamma probability density function. Journal of Applied Meteorology and Climatology, 53, pp. 166-179.
Time reversal and phase coherent music techniques for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements. A modified TR-MUSIC imaging algorithm is used to account for ultrasound scattering from both density and compressibility contrasts. The phase response of ultrasound transducer elements is accounted for in a PC-MUSIC system.
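For orientation, the core of any TR-MUSIC imaging step is a singular value decomposition of the multistatic response matrix followed by projection of modelled Green's-function vectors onto the noise subspace. The Python sketch below shows that core only; it omits the windowing into overlapping sub-regions, the attenuation compensation and the finite-element-size corrections described above.

```python
import numpy as np

def trmusic_pseudospectrum(K, greens, n_scatterers):
    """TR-MUSIC pseudospectrum.

    K           : (n_elem, n_elem) multistatic response matrix of the transducer array.
    greens      : (n_elem, n_grid) modelled Green's-function vectors from the array to each grid point.
    n_scatterers: assumed number of point scatterers (signal-subspace dimension).
    """
    U, _, _ = np.linalg.svd(K)
    noise_subspace = U[:, n_scatterers:]
    # Projection of each steering (Green's) vector onto the noise subspace.
    proj = np.linalg.norm(noise_subspace.conj().T @ greens, axis=0)**2
    return 1.0 / (proj + 1e-15)          # large values (peaks) at scatterer locations
```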
Towards developing robust algorithms for solving partial differential equations on MIMD machines
NASA Technical Reports Server (NTRS)
Saltz, Joel H.; Naik, Vijay K.
1988-01-01
Methods for efficient computation of numerical algorithms on a wide variety of MIMD machines are proposed. These techniques reorganize the data dependency patterns to improve the processor utilization. The model problem finds the time-accurate solution to a parabolic partial differential equation discretized in space and implicitly marched forward in time. The algorithms are extensions of Jacobi and SOR. The extensions consist of iterating over a window of several timesteps, allowing efficient overlap of computation with communication. The methods increase the degree to which work can be performed while data are communicated between processors. The effect of the window size and of domain partitioning on the system performance is examined both by implementing the algorithm on a simulated multiprocessor system.
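A minimal Python sketch of the windowed iteration idea for an implicitly time-stepped 1D heat equation is given below. It is illustrative only: the paper targets MIMD message-passing implementations with overlapped communication, which are not modelled here, and the sweep uses the latest available values from the previous timestep in the window.

```python
import numpy as np

def windowed_jacobi_heat(u0, r, window, n_steps, sweeps=50):
    """Implicit Euler for u_t = u_xx (1D, zero Dirichlet BCs) solved by Jacobi-type sweeps
    over a window of `window` timesteps at once. r = dt / dx^2."""
    n = u0.size
    diag = 1.0 + 2.0 * r
    u_prev = u0.copy()
    history = [u0.copy()]
    for _ in range(n_steps // window):
        U = np.tile(u_prev, (window, 1))                  # initial guess for the whole window
        for _ in range(sweeps):                           # iterate over all timesteps in the window
            for k in range(window):
                rhs = u_prev if k == 0 else U[k - 1]      # latest available previous-step values
                nb = np.zeros(n)
                nb[1:-1] = r * (U[k, :-2] + U[k, 2:])     # neighbour terms from the old iterate
                U[k] = (rhs + nb) / diag
                U[k, 0] = U[k, -1] = 0.0                  # boundary conditions
        u_prev = U[-1].copy()
        history.extend(list(U))
    return np.array(history)
```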
Optimization and performance evaluation of the microPET II scanner for in vivo small-animal imaging
NASA Astrophysics Data System (ADS)
Yang, Yongfeng; Tai, Yuan-Chuan; Siegel, Stefan; Newport, Danny F.; Bai, Bing; Li, Quanzheng; Leahy, Richard M.; Cherry, Simon R.
2004-06-01
MicroPET II is a newly developed PET (positron emission tomography) scanner designed for high-resolution imaging of small animals. It consists of 17 640 LSO crystals each measuring 0.975 × 0.975 × 12.5 mm3, which are arranged in 42 contiguous rings, with 420 crystals per ring. The scanner has an axial field of view (FOV) of 4.9 cm and a transaxial FOV of 8.5 cm. The purpose of this study was to carefully evaluate the performance of the system and to optimize settings for in vivo mouse and rat imaging studies. The volumetric image resolution was found to depend strongly on the reconstruction algorithm employed and averaged 1.1 mm (1.4 µl) across the central 3 cm of the transaxial FOV when using a statistical reconstruction algorithm with accurate system modelling. The sensitivity, scatter fraction and noise-equivalent count (NEC) rate for mouse- and rat-sized phantoms were measured for different energy and timing windows. Mouse imaging was optimized with a wide open energy window (150-750 keV) and a 10 ns timing window, leading to a sensitivity of 3.3% at the centre of the FOV and a peak NEC rate of 235 000 cps for a total activity of 80 MBq (2.2 mCi) in the phantom. Rat imaging, due to the higher scatter fraction, and the activity that lies outside of the field of view, achieved a maximum NEC rate of 24 600 cps for a total activity of 80 MBq (2.2 mCi) in the phantom, with an energy window of 250-750 keV and a 6 ns timing window. The sensitivity at the centre of the FOV for these settings is 2.1%. This work demonstrates that different scanner settings are necessary to optimize the NEC count rate for different-sized animals and different injected doses. Finally, phantom and in vivo animal studies are presented to demonstrate the capabilities of microPET II for small-animal imaging studies.
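For reference, a hedged sketch of the standard noise-equivalent count rate formula; the exact randoms-correction factor used for microPET II is not stated in the abstract, so k is a parameter here:

```python
def nec_rate(trues, scatters, randoms, k=2.0):
    """Noise-equivalent count rate, NEC = T^2 / (T + S + k*R); k = 2 assumes
    delayed-window randoms subtraction, k = 1 a noiseless randoms estimate."""
    return trues ** 2 / (trues + scatters + k * randoms)
```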
Polyp measurement with CT colonography: multiple-reader, multiple-workstation comparison.
Young, Brett M; Fletcher, J G; Paulsen, Scott R; Booya, Fargol; Johnson, C Daniel; Johnson, Kristina T; Melton, Zackary; Rodysill, Drew; Mandrekar, Jay
2007-01-01
The risk of invasive colorectal cancer in colorectal polyps correlates with lesion size. Our purpose was to define the most accurate methods for measuring polyp size at CT colonography (CTC) using three models of workstations and multiple observers. Six reviewers measured 24 unique polyps of known size (5, 7, 10, and 12 mm), shape (sessile, flat, and pedunculated), and location (straight or curved bowel segment) using CTC data sets obtained at two doses (5 mAs and 65 mAs) and a previously described colonic phantom model. Reviewers measured the largest diameter of polyps on three proprietary workstations. Each polyp was measured with lung and soft-tissue windows on axial, 2D multiplanar reconstruction (MPR), and 3D images. There were significant differences among measurements obtained at various settings within each workstation (p < 0.0001). Measurements on 2D images were more accurate with lung window than with soft-tissue window settings (p < 0.0001). For the 65-mAs data set, the most accurate measurements were obtained in analysis of axial images with lung window, 2D MPR images with lung window, and 3D tissue cube images for Wizard, Advantage, and Vitrea workstations, respectively, without significant differences in accuracy among techniques (0.11 < p < 0.59). The mean absolute error values for these optimal settings were 0.48 mm, 0.61 mm, and 0.76 mm, respectively, for the three workstations. Within the ultralow-dose 5-mAs data set the best methods for Wizard, Advantage, and Vitrea were axial with lung window, 2D MPR with lung window, and 2D MPR with lung window, respectively. Use of nearly all measurement methods, except for the Vitrea 3D tissue cube and the Wizard 2D MPR with lung window, resulted in undermeasurement of the true size of the polyps. Use of CTC computer workstations facilitates accurate polyp measurement. For routine CTC examinations, polyps should be measured with lung window settings on 2D axial or MPR images (Wizard and Advantage) or 3D images (Vitrea). When these optimal methods are used, these three commercial workstations do not differ significantly in acquisition of accurate polyp measurements at routine dose settings.
NASA Astrophysics Data System (ADS)
Kim, Byung Soo; Lee, Woon-Seek; Koh, Shiegheun
2012-07-01
This article considers an inbound ordering and outbound dispatching problem for a single product in a third-party warehouse, where the demands are dynamic over a discrete and finite time horizon, and moreover, each demand has a time window in which it must be satisfied. Replenishing orders are shipped in containers and the freight cost is proportional to the number of containers used. The problem is classified into two cases, i.e. non-split demand case and split demand case, and a mathematical model for each case is presented. An in-depth analysis of the models shows that they are very complicated and difficult to find optimal solutions as the problem size becomes large. Therefore, genetic algorithm (GA) based heuristic approaches are designed to solve the problems in a reasonable time. To validate and evaluate the algorithms, finally, some computational experiments are conducted.
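A tiny illustration of the freight-cost term described above, assuming (as the abstract states) that replenishing orders ship in containers and the cost is proportional to the container count; the function and parameter names are made up:

```python
import math

def freight_cost(order_quantities, container_capacity, cost_per_container):
    """Freight cost when each period's replenishment order is shipped in
    containers and the cost is proportional to the number of containers used."""
    containers = sum(math.ceil(q / container_capacity)
                     for q in order_quantities if q > 0)
    return containers * cost_per_container

# e.g. orders of 120, 0 and 260 units in 100-unit containers -> 2 + 3 = 5 containers
print(freight_cost([120, 0, 260], 100, 50.0))   # 250.0
```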
The effect of exit beam phase aberrations on parallel beam coherent x-ray reconstructions
NASA Astrophysics Data System (ADS)
Hruszkewycz, S. O.; Harder, R.; Xiao, X.; Fuoss, P. H.
2010-12-01
Diffraction artifacts from imperfect x-ray windows near the sample are an important consideration in the design of coherent x-ray diffraction measurements. In this study, we used simulated and experimental diffraction patterns in two and three dimensions to explore the effect of phase imperfections in a beryllium window (such as a void or inclusion) on the convergence behavior of phasing algorithms and on the ultimate reconstruction. A predictive relationship between beam wavelength, sample size, and window position was derived to explain the dependence of reconstruction quality on beryllium defect size. Defects corresponding to this prediction cause the most damage to the sample exit wave and induce signature error oscillations during phasing that can be used as a fingerprint of experimental x-ray window artifacts. The relationship between x-ray window imperfection size and coherent x-ray diffractive imaging reconstruction quality explored in this work can play an important role in designing high-resolution in situ coherent imaging instrumentation and will help interpret the phasing behavior of coherent diffraction measured in these in situ environments.
Perceived Spaciousness and Preference in Sequential Experience.
Bokharaei, Saleheh; Nasar, Jack L
2016-11-01
We assessed the perceived spaciousness and preference for a destination space in relation to six attributes (size, lighting, window size, texture, wall mural, and amount of furniture) of it and of the space experienced before it. Studies have examined effects of these attributes but not for dynamic experience or preference. We created 24 virtual reality walks between each possible pair of two levels of each attribute. For each destination space, 31 students (13 men, 18 women) rated spaciousness and 30 students (16 men, 14 women) rated preference. We conducted separate 2 × 2 repeated-measure ANOVAs across each condition for perceived spaciousness and preference. Participants judged the space that was larger, was more brightly lit, with a larger window, or with less furniture as the more spacious. These attributes also increased preference. Consonant with adaptation-level theory, participants judged offices as higher in spaciousness and preference if preceded by a space that was smaller, was more dimly lit, or had smaller windows. The findings suggest that perceived spaciousness varies with size, lightness, window size, and amount of furniture but that perception also depends on the size, lightness, and window size of the space experienced before. Designers could use the findings to manipulate features to make a space appear larger or more desirable. © 2016, Human Factors and Ergonomics Society.
Mechanical Properties of ZnSe for the FEANICS Module
NASA Technical Reports Server (NTRS)
Salem, Jon
2006-01-01
Mechanical and physical properties of ZnSe windows to be used with the FEANICS (Flow Enclosure Accommodating Novel Investigations in Combustion of Solids) experiments were measured in order to determine design allowables. In addition, the literature on crack growth properties was summarized. The average Young's modulus, Poisson's ratio, equibiaxial fracture strength, flaw size, grain size, Knoop hardness, Vickers hardness, and branching constant were 74.3 +/- 0.1 GPa, 0.31, 57.8 +/- 6.5 MPa, 21 +/- 4 micron, 43 +/- 9 micron, 0.97 +/- 0.02 GPa, 0.97 +/- 0.02 GPa, and 1.0 +/- 0.1 MPam(exp 0.5), respectively. The properties of current ZnSe made by chemical vapor deposition are in good agreement with those measured in the 1970s. The hardness of CVD ZnSe windows is about one-twentieth that of the sapphire window being replaced, and about one-sixth that of window glass. Thus the ZnSe window must be handled with great care. The large grain size relative to the inherent crack size implies the need to use single-crystal crack growth properties in the design process. In order to determine the local failure stresses in one of the test specimens, a solution for the stresses between the support ring and the edge of a circular plate loaded between concentric rings was derived.
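For context, a hedged estimate of the critical surface-flaw size from fracture mechanics; the fracture toughness value and geometry factor below are assumptions (the abstract reports a branching constant, not K_Ic), so the numbers are only indicative:

```python
import math

def critical_flaw_size(K_Ic, sigma, Y=1.12):
    """Critical surface-flaw depth, a_c = (1/pi) * (K_Ic / (Y*sigma))^2."""
    return (K_Ic / (Y * sigma)) ** 2 / math.pi

# Assumed K_Ic ~ 0.5 MPa*sqrt(m) for CVD ZnSe and the 57.8 MPa mean strength above:
a_c = critical_flaw_size(0.5e6, 57.8e6)   # inputs in Pa*sqrt(m) and Pa
print(f"a_c ~ {a_c*1e6:.0f} um")          # on the order of tens of microns
```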
Plot size recommendations for biomass estimation in a midwestern old-growth forest
Martin A. Spetich; George R Parker
1998-01-01
The authors examine the relationship between disturbance regime and plot size for woody biomass estimation in a midwestern old-growth deciduous forest from 1926 to 1992. Analysis was done on the core 19.6 ac of a 50.1 ac forest in which every tree 4 in. d.b.h. and greater has been tagged and mapped since 1926. Five windows of time are compared—1926, 1976, 1981, 1986...
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2013-12-01
We use multiscale detrended fluctuation analysis (MSDFA) and multiscale detrended cross-correlation analysis (MSDCCA) to investigate auto-correlation (AC) and cross-correlation (CC) in the US and Chinese stock markets during 1997-2012. The results show that US and Chinese stock indices differ in terms of their multiscale AC structures. Stock indices in the same region also differ with regard to their multiscale AC structures. We analyze AC and CC behaviors among indices for the same region to determine similarity among six stock indices and divide them into four groups accordingly. We choose S&P500, NQCI, HSI, and the Shanghai Composite Index as representative samples for simplicity. MSDFA and MSDCCA results and average MSDFA spectra for local scaling exponents (LSEs) for individual series are presented. We find that the MSDCCA spectrum for LSE CC between two time series generally tends to be greater than the average MSDFA LSE spectrum for individual series. We obtain detailed multiscale structures and relations for CC between the four representatives. MSDFA and MSDCCA with secant rolling windows of different sizes are then applied to reanalyze the AC and CC. Vertical and horizontal comparisons of different window sizes are made. The MSDFA and MSDCCA results for the original window size are confirmed and some new interesting characteristics and conclusions regarding multiscale correlation structures are obtained.
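A minimal single-scale DFA sketch (ordinary DFA, not the full MSDFA/MSDCCA machinery with secant rolling windows), showing how the fluctuation function underlying the local scaling exponents is computed:

```python
import numpy as np

def dfa_fluctuation(x, scales, order=1):
    """Detrended fluctuation function F(s): integrate the series, split the
    profile into non-overlapping windows of size s, detrend each window with
    a polynomial of the given order, and return the RMS residual per scale."""
    y = np.cumsum(x - np.mean(x))                      # profile
    F = []
    for s in scales:
        rms = []
        for v in range(len(y) // s):
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

# The scaling exponent is the slope of log F(s) versus log s:
# alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```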
Yanagawa, Masahiro; Kusumoto, Masahiko; Johkoh, Takeshi; Noguchi, Masayuki; Minami, Yuko; Sakai, Fumikazu; Asamura, Hisao; Tomiyama, Noriyuki
2018-05-01
Measuring the size of invasiveness on computed tomography (CT) for the T descriptor size was deemed important in the 8th edition of the TNM lung cancer classification. We aimed to correlate the maximal dimensions of the solid portions using both lung and mediastinal window settings on CT imaging with the pathologic invasiveness (> 0.5 cm) in lung adenocarcinoma patients. The study population consisted of 378 patients with a histologic diagnosis of adenocarcinoma in situ (AIS), minimally invasive adenocarcinoma (MIA), invasive adenocarcinoma (IVA)-lepidic, IVA-acinar and/or IVA-papillary, and IVA-micropapillary and/or solid adenocarcinoma. A panel of 15 radiologists was divided into 2 groups (group A, 9 radiologists; and group B, 6 radiologists). The 2 groups independently measured the maximal and perpendicular dimensions of the solid components and entire tumors on the lung and mediastinal window settings. The solid proportion of nodule was calculated by dividing the solid portion size (lung and mediastinal window settings) by the nodule size (lung window setting). The maximal dimensions of the invasive focus were measured on the corresponding pathologic specimens by 2 pathologists. The solid proportion was larger in the following descending order: IVA-micropapillary and/or solid, IVA-acinar and/or papillary, IVA-lepidic, MIA, and AIS. For both groups A and B, a solid portion > 0.8 cm in the lung window setting or > 0.6 cm in the mediastinal window setting on CT was a significant indicator of pathologic invasiveness > 0.5 cm (P < .001; receiver operating characteristic analysis using Youden's index). A solid portion > 0.8 cm on the lung window setting or solid portion > 0.6 cm on the mediastinal window setting on CT predicts for histopathologic invasiveness to differentiate IVA from MIA and AIS. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Abrokwah, K.; O'Reilly, A. M.
2017-12-01
Groundwater is an important resource that is extracted every day because of its invaluable use for domestic, industrial and agricultural purposes. The need for sustaining groundwater resources is clearly indicated by declining water levels and has led to efforts to model and forecast groundwater levels accurately. In this study, spectral decomposition of climatic forcing time series was used to develop hybrid wavelet analysis (WA) and moving window average (MWA) artificial neural network (ANN) models. These techniques are explored by modeling historical groundwater levels in order to provide understanding of potential causes of the observed groundwater-level fluctuations. Selection of the appropriate decomposition level for WA and window size for MWA helps in understanding the important time scales of climatic forcing, such as rainfall, that influence water levels. The discrete wavelet transform (DWT) is used to decompose the input time-series data into various levels of approximation and detail wavelet coefficients, whilst MWA acts as a low-pass signal-filtering technique for removing high-frequency signals from the input data. The variables used to develop and validate the models were daily average rainfall measurements from five National Oceanic and Atmospheric Administration (NOAA) weather stations and daily water-level measurements from two wells recorded from 1978 to 2008 in central Florida, USA. Using different decomposition levels and different window sizes, several WA-ANN and MWA-ANN models for simulating the water levels were created and their relative performances compared against each other. The WA-ANN models performed better than the corresponding MWA-ANN models; also, higher decomposition levels of the input signal by the DWT gave the best results. The results obtained show the applicability and feasibility of hybrid WA-ANN and MWA-ANN models for simulating daily water levels using only climatic forcing time series as model inputs.
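A short sketch of how the two kinds of ANN inputs could be prepared, assuming PyWavelets for the DWT; the wavelet family, decomposition level and window length are placeholders, not the values used in the study:

```python
import numpy as np
import pywt

def wavelet_inputs(rainfall, wavelet="db4", level=3):
    """Decompose a daily rainfall series into approximation and detail
    coefficients (inputs for a WA-ANN); returns [cA_level, cD_level, ..., cD1]."""
    return pywt.wavedec(rainfall, wavelet, level=level)

def moving_window_average(rainfall, window=30):
    """Low-pass filtered series (input for an MWA-ANN): trailing mean over
    `window` days."""
    kernel = np.ones(window) / window
    return np.convolve(rainfall, kernel, mode="valid")
```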
Music chills: The eye pupil as a mirror to music's soul.
Laeng, Bruno; Eidet, Lise Mette; Sulutvedt, Unni; Panksepp, Jaak
2016-08-01
This study evaluated whether music-induced aesthetic "chill" responses, which typically correspond to peak emotional experiences, can be objectively monitored by degree of pupillary dilation. Participants listened to self-chosen songs versus control songs chosen by other participants. The experiment included an active condition where participants made key presses to indicate when experiencing chills and a passive condition (without key presses). Chills were reported more frequently for self-selected songs than control songs. Pupil diameter was concurrently measured by an eye-tracker while participants listened to each of the songs. Pupil size was larger within specific time-windows around the chill events, as monitored by key responses, than in comparison to pupil size observed during 'passive' song listening. In addition, there was a clear relationship between pupil diameter within the chills-related time-windows during both active and passive conditions, thus ruling out the possibility that chills-related pupil dilations were an artifact of making a manual response. These findings strongly suggest that music chills can be visible in the moment-to-moment changes in the size of pupillary responses and that a neuromodulatory role of the central norepinephrine system is thereby implicated in this phenomenon. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
46 CFR 177.1030 - Operating station visibility.
Code of Federal Regulations, 2010 CFR
2010-10-01
... TONS) CONSTRUCTION AND ARRANGEMENT Window Construction and Visibility § 177.1030 Operating station visibility. (a) Windows and other openings at the operating station must be of sufficient size and properly... glazing material used in windows at the operating station must have a light transmission of not less than...
Soft X-ray microscope with nanometer spatial resolution and its applications
NASA Astrophysics Data System (ADS)
Wachulak, P. W.; Torrisi, A.; Bartnik, A.; Wegrzynski, L.; Fok, T.; Patron, Z.; Fiedorowicz, H.
2016-12-01
A compact-size microscope based on a nitrogen double-stream gas-puff-target soft X-ray source, which emits radiation in the water-window spectral range at a wavelength of λ = 2.88 nm, is presented. The microscope employs an ellipsoidal grazing-incidence condenser mirror for sample illumination and a silicon nitride Fresnel zone plate objective for object magnification and imaging. The microscope is capable of capturing water-window images of objects with 60 nm spatial resolution and exposure times as short as a few seconds. Details about the microscopy system, as well as some examples of different applications from various fields of science, are presented and discussed.
Burnett, B R
2001-03-01
At issue in this case was whether an unusual window defect seen in two of the crime scene photographs was due to a bullet and, if so, whether that same bullet fatally wounded the victim. The window appeared to have been cracked prior to the apparent shot through it. A .22 bullet recovered from autopsy, when examined only by light microscopy, failed to show associated glass fragments. A previously cracked test window was shot a number of times with .22 caliber bullets near the cracks in an effort to simulate the window defect seen in the crime scene photographs. Several of the defects produced by the test window shots appeared similar to the crime scene window defect. The .22 bullet taken from the victim and several of the test bullets (collected by a cotton box) were examined by scanning electron microscopy/energy dispersive X-ray spectroscopy. The test bullets showed glass particles on and embedded in their surfaces. Particles of similar size and composition were found embedded in the surface of the bullet from the victim. The bullet likely struck the window prior to hitting the victim. It was apparent from the morphology of some of the mushroomed test .22 bullets that they had hit the window crack. These bullets showed that the glass on one side of a crack often fails before the other side during the strike. Aggregations of powdered glass on many of the mushroomed surfaces of the .22 bullets suggest that as the bullet mushrooms during impact on the window surface, the glass in contact with the bullet powderizes and coats the mushroomed surface of the bullet with a layer of fine glass particles.
Vincenzi, Simone
2014-08-06
One of the most dramatic consequences of climate change will be the intensification and increased frequency of extreme events. I used numerical simulations to understand and predict the consequences of a directional trend (i.e. mean state) and increased variability of a climate variable (e.g. temperature), increased probability of occurrence of point extreme events (e.g. floods), selection pressure and effect size of mutations on a quantitative trait determining individual fitness, as well as their effects on the population and genetic dynamics of a population of moderate size. The interaction among climate trend, variability and probability of point extremes had a minor effect on risk of extinction, time to extinction and distribution of the trait after accounting for their independent effects. The survival chances of a population strongly and linearly decreased with increasing strength of selection, as well as with increasing climate trend and variability. Mutation amplitude had no effects on extinction risk, time to extinction or genetic adaptation to the new climate. Climate trend and strength of selection largely determined the shift of the mean phenotype in the population. The extinction or persistence of the populations in an 'extinction window' of 10 years was well predicted by a simple model including mean population size and mean genetic variance over a 10-year time frame preceding the 'extinction window', although genetic variance had a smaller role than population size in predicting contemporary risk of extinction. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Sol-gel antireflective spin-coating process for large-size shielding windows
NASA Astrophysics Data System (ADS)
Belleville, Philippe F.; Prene, Philippe; Mennechez, Francoise; Bouigeon, Christian
2002-10-01
Interest in antireflective coatings applied onto large-area glass components increases every day for potential applications such as building or shop windows. Today, because of the use of large-size components, the sol-gel process is a competitive way for antireflective coating mass production. The dip-coating technique commonly used for liquid deposition implies a safety hazard due to coating-solution handling and storage when large amounts of highly flammable solvent are used. On the other hand, spin-coating is a low-liquid-consumption technique. Although mainly devoted to coating circular small-size substrates, we have developed a spin-coating machine able to coat large-size rectangular windows (up to 1 x 1.7 m2). Both solutions and coating conditions have been optimized to deposit optical layers with accurate and uniform thickness and to strongly limit edge effects. An experimental single-layer antireflective coating deposition process onto large-area shielding windows (1000 x 1700 x 20 mm3) is described. Results show that the as-developed process can produce low specular reflection values (down to 1% one side) onto white-glass windows over the visible range (460-750 nm). The low-temperature curing process (120°C) used after sol-gel deposition enables the antireflective coating to meet abrasion-resistance requirements in compliance with the US-MIL-C-0675C moderate test.
46 CFR 127.430 - Visibility from pilothouse.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ARRANGEMENTS Construction of Windows, Visibility, and Operability of Coverings § 127.430 Visibility from pilothouse. (a) Windows and other openings at the pilothouse must be of sufficient size and properly located... used in windows at the pilothouse must have a light transmission of at least 70 percent according to...
Breast cancer mitosis detection in histopathological images with spatial feature extraction
NASA Astrophysics Data System (ADS)
Albayrak, Abdülkadir; Bilgin, Gökhan
2013-12-01
In this work, cellular mitosis detection in histopathological images has been investigated. Mitosis detection is a very expensive and time-consuming process. The development of digital imaging in pathology has enabled a reasonable and effective solution to this problem. Segmentation of digital images provides easier analysis of cell structures in histopathological data. To differentiate normal and mitotic cells in histopathological images, the feature extraction step is crucial for system accuracy. A mitotic cell has more distinctive textural dissimilarities than the other normal cells. Hence, it is important to incorporate spatial information in the feature extraction or post-processing steps. As a main part of this study, the Haralick texture descriptor has been proposed with different spatial window sizes in the RGB and La*b* color spaces, so that the spatial dependencies of normal and mitotic cellular pixels can be evaluated within different pixel neighborhoods. Extracted features are compared for various sample sizes by Support Vector Machines using the k-fold cross-validation method. According to the presented results, the separation accuracy for mitotic and non-mitotic cellular pixels improves with increasing spatial window size.
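A hedged sketch of window-based texture features using scikit-image GLCM properties as a stand-in for the Haralick descriptor; the window half-width, grey-level quantization and the chosen properties are assumptions (older scikit-image releases spell the functions greycomatrix/greycoprops):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

def window_texture_features(gray, i, j, half=8, levels=32):
    """GLCM texture features (contrast, homogeneity, energy, correlation) for a
    (2*half+1)^2 window centred on pixel (i, j); gray is an 8-bit image."""
    win = gray[i - half:i + half + 1, j - half:j + half + 1]
    win = (win // (256 // levels)).astype(np.uint8)       # quantize to `levels` grey levels
    glcm = graycomatrix(win, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])
```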
NASA Astrophysics Data System (ADS)
Al-Ghraibah, Amani
Solar flares release stored magnetic energy in the form of radiation and can have significant detrimental effects on Earth, including damage to technological infrastructure. Recent work has considered methods to predict future flare activity on the basis of quantitative measures of the solar magnetic field. Accurate advanced warning of solar flare occurrence is an area of increasing concern and much research is ongoing in this area. Our previous work [111] utilized standard pattern recognition and classification techniques to determine (classify) whether a region is expected to flare within a predictive time window, using a Relevance Vector Machine (RVM) classification method. We extracted 38 features describing the complexity of the photospheric magnetic field; the resulting classification metrics provide the baseline against which we compare our new work. We find a true positive rate (TPR) of 0.8, a true negative rate (TNR) of 0.7, and a true skill score (TSS) of 0.49. This dissertation addresses three topics. The first topic is an extension of our previous work [111], where we consider a feature selection method to determine an appropriate feature subset, with cross-validation classification based on a histogram analysis of selected features. Classification using the top five features resulting from this analysis yields better classification accuracies across a large unbalanced dataset. In particular, the feature subsets provide better discrimination of the many regions that flare, where we find a TPR of 0.85, a TNR of 0.65 (slightly lower than our previous work), and a TSS of 0.5, an improvement over our previous work. In the second topic, we study the prediction of solar flare size and time-to-flare using support vector regression (SVR). When we consider flaring regions only, we find an average error in estimating flare size of approximately half a GOES class. When we additionally consider non-flaring regions, we find an increased average error of approximately 3/4 a GOES class. We also consider thresholding the regressed flare size for the experiment containing both flaring and non-flaring regions and find a TPR of 0.69 and a TNR of 0.86 for flare prediction, consistent with our previous studies of flare prediction using the same magnetic complexity features. The results for both of these size regression experiments are consistent across a wide range of predictive time windows, indicating that the magnetic complexity features may be persistent in appearance long before flare activity. This conjecture is supported by our larger error rates of some 40 hours in the time-to-flare regression problem. The magnetic complexity features considered here appear to have discriminative potential for flare size, but their persistence in time makes them less discriminative for the time-to-flare problem. We also study the prediction of solar flare size and time-to-flare using two temporal features, namely the Δ- and Δ-Δ-features; the same average size and time-to-flare regression errors are found when these temporal features are used in size and time-to-flare prediction. In the third topic, we study the temporal evolution of active region magnetic fields using Hidden Markov Models (HMMs), one of the efficient temporal analysis methods found in the literature. We extracted 38 features describing the complexity of the photospheric magnetic field. These features are converted into a sequence of symbols using a k-nearest neighbor search method.
We study several parameters before prediction, such as the length of the training window Wtrain, which denotes the number of history images used to train the flare and non-flare HMMs, and the number of hidden states Q. In the training phase, the model parameters of the HMM of each category are optimized so as to best describe the training symbol sequences. In the testing phase, we use the best flare and non-flare models to predict/classify active regions as flaring or non-flaring using a sliding window method. The best prediction result is found when the history training length is 15 images (i.e., Wtrain = 15) and the length of the sliding testing window is less than or equal to Wtrain; the best result gives a TPR of 0.79, consistent with previous flare prediction work, a TNR of 0.87 and a TSS of 0.66, both higher than in our previous flare prediction work. We find that the number of hidden states that best describes the temporal evolution of the solar ARs is five, while similar metrics are found using different numbers of states.
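A compact sketch of the HMM-based classification step, assuming hmmlearn with Gaussian emissions on the raw feature sequences for brevity (the dissertation instead quantizes the 38 features to discrete symbols); Wtrain-length sequences and five hidden states follow the description above:

```python
import numpy as np
from hmmlearn import hmm

def train_hmm(sequences, n_states=5):
    """Fit one Gaussian-emission HMM to a list of (Wtrain x n_features) arrays."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
    model.fit(X, lengths)
    return model

def predict_flare(flare_hmm, quiet_hmm, window):
    """Classify a sliding test window of feature vectors by comparing the
    log-likelihood under the flare and non-flare models."""
    return flare_hmm.score(window) > quiet_hmm.score(window)
```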
Latitudinal and photic effects on diel foraging and predation risk in freshwater pelagic ecosystems
Hansen, Adam G.; Beauchamp, David A.
2014-01-01
1. Clark & Levy (American Naturalist, 131, 1988, 271–290) described an antipredation window for smaller planktivorous fish during crepuscular periods when light permits feeding on zooplankton, but limits visual detection by piscivores. Yet, how the window is influenced by the interaction between light regime, turbidity and cloud cover over a broad latitudinal gradient remains unexplored. 2. We evaluated how latitudinal and seasonal shifts in diel light regimes alter the foraging-risk environment for visually feeding planktivores and piscivores across a natural range of turbidities and cloud covers. Pairing a model of aquatic visual feeding with a model of sun and moon illuminance, we estimated foraging rates of an idealized planktivore and piscivore over depth and time across factorial combinations of latitude (0–70°), turbidity (0.1–5 NTU) and cloud cover (clear to overcast skies) during the summer solstice and autumnal equinox. We evaluated the foraging-risk environment based on changes in the magnitude, duration and peak timing of the antipredation window. 3. The model scenarios generated up to 10-fold shifts in magnitude, 24-fold shifts in duration and 5.5-h shifts in timing of the peak antipredation window. The size of the window increased with latitude. This pattern was strongest during the solstice. In clear water at low turbidity (0.1–0.5 NTU), peaks in the magnitude and duration of the window formed at 57–60° latitude, before falling to near zero as surface waters became saturated with light under a midnight sun and clear skies at latitudes near 70°. Overcast skies dampened the midnight sun enough to allow larger windows to form in clear water at high latitudes. Conversely, at turbidities ≥2 NTU, greater reductions in the visual range of piscivores than planktivores created a window for long periods at high latitudes. Latitudinal dependencies were essentially lost during the equinox, indicating a progressive compression of the window from early summer into autumn. 4. Model results show that diel-seasonal foraging and predation risk in freshwater pelagic ecosystems changes considerably with latitude, turbidity and cloud cover. These changes alter the structure of pelagic predator–prey interactions, and in turn, the broader role of pelagic consumers in habitat coupling in lakes.
NASA Astrophysics Data System (ADS)
Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang
2016-05-01
In this study, a novel batch sliding window (BSW) based singularity mapping approach was proposed. Compared to the traditional sliding window (SW) technique, which has the disadvantages of an empirically predetermined, fixed maximum window size and the outlier sensitivity of the least-squares (LS) linear regression method, the BSW based singularity mapping approach can automatically determine the optimal size of the largest window for each estimated position and utilizes robust linear regression (RLR), which is insensitive to outlier values. In the case study, tin geochemical data in Gejiu, Yunnan, have been processed by the BSW based singularity mapping approach. The results show that the BSW approach can improve the accuracy of the calculated singularity exponent values owing to the determination of the optimal maximum window size. The use of the RLR method in the BSW approach smooths the distribution of singularity index values, with fewer (or even no) highly fluctuating values that look like noise points and usually make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between high geochemical anomalies and known tin polymetallic deposits. The target areas within high tin geochemical anomalies probably have much higher potential for the exploration of new tin polymetallic deposits than other areas, particularly areas that show strong tin geochemical anomalies but in which no tin polymetallic deposits have yet been found.
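A minimal sketch of the window-based singularity-index estimate, using scikit-learn's HuberRegressor as a stand-in for the robust linear regression; the set of window sizes is fixed here, whereas the BSW approach described above selects the largest window automatically:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def singularity_exponent(field, i, j, half_widths=(1, 2, 3, 5, 7, 10)):
    """alpha at pixel (i, j): robust slope of log(mean concentration in a
    square window) versus log(window size), plus 2 (the Euclidean dimension)."""
    sizes, means = [], []
    for h in half_widths:
        w = field[i - h:i + h + 1, j - h:j + h + 1]
        sizes.append(2 * h + 1)
        means.append(np.nanmean(w))
    X = np.log(sizes).reshape(-1, 1)
    y = np.log(means)
    slope = HuberRegressor().fit(X, y).coef_[0]   # robust linear regression (RLR)
    return slope + 2.0                            # alpha < 2: enrichment, alpha > 2: depletion
```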
Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang
2017-04-26
This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by outlier pixels located close to the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covers only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size and that the overall accuracies were higher than 88% at all window size scales. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, trade-off coefficient (C) and kernel width (s), in mapping a homogeneous specific land cover.
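A rough sketch of the parameter search against a window-based validation set, using scikit-learn's one-class SVM (closely related to SVDD with an RBF kernel) as a stand-in; the parameter grids and labels are placeholders:

```python
import numpy as np
from sklearn.svm import OneClassSVM

def select_svdd_params(train_X, val_X, val_y,
                       nus=(0.01, 0.05, 0.1), gammas=(0.1, 1.0, 10.0)):
    """Grid-search the trade-off and kernel-width parameters against a
    window-based validation set (val_y: +1 for target pixels, -1 for outliers)."""
    best, best_acc = None, -1.0
    for nu in nus:
        for g in gammas:
            model = OneClassSVM(kernel="rbf", nu=nu, gamma=g).fit(train_X)
            acc = np.mean(model.predict(val_X) == val_y)
            if acc > best_acc:
                best, best_acc = model, acc
    return best, best_acc
```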
Overestimation of the Projected Size of Objects on the Surface of Mirrors and Windows
ERIC Educational Resources Information Center
Lawson, Rebecca; Bertamini, Marco; Liu, Dan
2007-01-01
Four experiments investigated judgments of the size of projections of objects on the glass surface of mirrors and windows. The authors tested different ways of explaining the task to overcome the difficulty that people had in understanding what the projection was, and they varied the distance of the observer and the object to the mirror or window…
Neuropsychological basic deficits in preschoolers at risk for ADHD: a meta-analysis.
Pauli-Pott, Ursula; Becker, Katja
2011-06-01
Widely accepted neuropsychological theories on attention deficit hyperactivity disorder (ADHD) assume that the complex symptoms of the disease arise from developmentally preceding neuropsychological basic deficits. These deficits in executive functions and delay aversion are presumed to emerge in the preschool period. The corresponding normative developmental processes include phases of relative stability and rapid change. These non-linear developmental processes might have implications for concurrent and predictive associations between basic deficits and ADHD symptoms. To derive a description of the nature and strength of these associations, a meta-analysis was conducted. It is assumed that weighted mean effect sizes differ between basic deficits and depend on age. The meta-analysis included 25 articles (n=3005 children) in which associations between assessments of basic deficits (i.e. response inhibition, interference control, delay aversion, working memory, flexibility, and vigilance/arousal) in the preschool period and concurrent or subsequent ADHD symptoms or diagnosis of ADHD had been analyzed. For response inhibition and delay aversion, mean effect sizes were of medium to large magnitude while the mean effect size for working memory was small. Meta-regression analyses revealed that effect sizes of delay aversion tasks significantly decreased with increasing age while effect sizes of interference control tasks and Continuous Performance Tests (CPTs) significantly increased. Depending on the normative maturational course of each skill, time windows might exist that allow for a more or less valid assessment of a specific deficit. In future research these time windows might help to describe early developing forms of ADHD and to identify children at risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
Baity-Jesi, Marco; Calore, Enrico; Cruz, Andres; Fernandez, Luis Antonio; Gil-Narvión, José Miguel; Gordillo-Guerrero, Antonio; Iñiguez, David; Maiorano, Andrea; Marinari, Enzo; Martin-Mayor, Victor; Monforte-Garcia, Jorge; Muñoz Sudupe, Antonio; Navarro, Denis; Parisi, Giorgio; Perez-Gaviro, Sergio; Ricci-Tersenghi, Federico; Ruiz-Lorenzo, Juan Jesus; Schifano, Sebastiano Fabio; Tarancón, Alfonso; Tripiccione, Raffaele; Yllanes, David
2017-01-01
We have performed a very accurate computation of the nonequilibrium fluctuation–dissipation ratio for the 3D Edwards–Anderson Ising spin glass, by means of large-scale simulations on the special-purpose computers Janus and Janus II. This ratio (computed for finite times on very large, effectively infinite, systems) is compared with the equilibrium probability distribution of the spin overlap for finite sizes. Our main result is a quantitative statics-dynamics dictionary, which could allow the experimental exploration of important features of the spin-glass phase without requiring uncontrollable extrapolations to infinite times or system sizes. PMID:28174274
Windowed Green function method for the Helmholtz equation in the presence of multiply layered media
NASA Astrophysics Data System (ADS)
Bruno, O. P.; Pérez-Arancibia, C.
2017-06-01
This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
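For illustration, one standard C-infinity "slow-rise" window of the kind the method relies on (equal to 1 on [-A, A] and smoothly decaying to zero by c·A); the exact window used in the paper may differ, but any window of this smoothness class yields the super-algebraic error decay described:

```python
import numpy as np

def smooth_step(u):
    """C-infinity step: 0 for u <= 0, 1 for u >= 1, smooth in between."""
    u = np.clip(u, 0.0, 1.0)
    f = np.where(u > 0.0, np.exp(-1.0 / np.maximum(u, 1e-12)), 0.0)
    g = np.where(u < 1.0, np.exp(-1.0 / np.maximum(1.0 - u, 1e-12)), 0.0)
    return f / (f + g)

def slow_rise_window(x, A, c=2.0):
    """w = 1 for |x| <= A, w = 0 for |x| >= c*A, C-infinity transition between."""
    return 1.0 - smooth_step((np.abs(x) - A) / (A * (c - 1.0)))
```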
Funke, K; Wörgötter, F
1995-01-01
1. The spike interval pattern during the light responses of 155 on- and 81 off-centre cells of the dorsal lateral geniculate nucleus (LGN) was studied in anaesthetized and paralysed cats by the use of a novel analysis. Temporally localized interval distributions were computed from a 100 ms time window, which was shifted along the time axis in 10 ms steps, resulting in a 90% overlap between two adjacent windows. For each step the interval distribution was computed inside the time window with 1 ms resolution, and plotted as a greyscale-coded pixel line orthogonal to the time axis. For visual stimulation, light or dark spots of different size and contrast were presented with different background illumination levels. 2. Two characteristic interval patterns were observed during the sustained response component of the cells. Mainly on-cells (77%) responded with multimodal interval distributions, resulting in elongated 'bands' in the 2-dimensional time window plots. In similar situations, the interval distributions for most (71%) off-cells were rather wide and featureless. In those cases where interval bands (i.e. multimodal interval distributions) were observed for off-cells (14%), they were always much wider than for the on-cells. This difference between the on- and off-cell population was independent of the background illumination and the contrast of the stimulus. Y on-cells also tended to produce wider interval bands than X on-cells. 3. For most stimulation situations the first interval band was centred around 6-9 ms, which has been called the fundamental interval; higher order bands are multiples thereof. The fundamental interval shifted towards larger sizes with decreasing stimulus contrast. Increasing stimulus size, on the other hand, resulted in a redistribution of the intervals into higher order bands, while at the same time the location of the fundamental interval remained largely unaffected. This was interpreted as an effect of the increasing surround inhibition at the geniculate level, by which individual retinal EPSPs were cancelled. A changing level of adaptation can result in a mixed shift/redistribution effect because of the changing stimulus contrast and changing level of tonic inhibition. 4. The occurrence of interval bands is not directly related to the shape of the autocorrelation function, which can be flat, weakly oscillatory or strongly oscillatory, regardless of the interval band pattern. 5. A simple computer model was devised to account for the observed cell behaviour. The model is highly robust against parameter variations. (ABSTRACT TRUNCATED AT 400 WORDS) PMID:7562612
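A hedged reimplementation of the temporally localized interval analysis described above (100 ms window, 10 ms steps, 1 ms interval bins); array shapes and the plotting call are assumptions:

```python
import numpy as np

def windowed_isi_histograms(spike_times_ms, t_end_ms, win=100.0, step=10.0, max_isi=50):
    """Rows: window start times; columns: 1-ms interspike-interval bins (1..max_isi ms)."""
    starts = np.arange(0.0, t_end_ms - win, step)
    H = np.zeros((len(starts), max_isi))
    for k, t0 in enumerate(starts):
        s = spike_times_ms[(spike_times_ms >= t0) & (spike_times_ms < t0 + win)]
        isi = np.diff(s)
        H[k], _ = np.histogram(isi, bins=np.arange(0.5, max_isi + 1.0, 1.0))
    return starts, H   # e.g. plt.imshow(H.T, aspect="auto", origin="lower") for the greyscale plot
```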
Limits of the memory coefficient in measuring correlated bursts
NASA Astrophysics Data System (ADS)
Jo, Hang-Hyun; Hiraoka, Takayuki
2018-03-01
Temporal inhomogeneities in event sequences of natural and social phenomena have been characterized in terms of interevent times and correlations between interevent times. The inhomogeneities of interevent times have been extensively studied, while the correlations between interevent times, often called correlated bursts, are far from being fully understood. For measuring the correlated bursts, two relevant approaches were suggested, i.e., memory coefficient and burst size distribution. Here a burst size denotes the number of events in a bursty train detected for a given time window. Empirical analyses have revealed that the larger memory coefficient tends to be associated with the heavier tail of the burst size distribution. In particular, empirical findings in human activities appear inconsistent, such that the memory coefficient is close to 0, while burst size distributions follow a power law. In order to comprehend these observations, by assuming the conditional independence between consecutive interevent times, we derive the analytical form of the memory coefficient as a function of parameters describing interevent time and burst size distributions. Our analytical result can explain the general tendency of the larger memory coefficient being associated with the heavier tail of burst size distribution. We also find that the apparently inconsistent observations in human activities are compatible with each other, indicating that the memory coefficient has limits to measure the correlated bursts.
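The two quantities discussed above can be computed directly from an event sequence; the sketch below uses the standard definitions (memory coefficient as the correlation between consecutive interevent times, burst sizes from trains of events separated by less than a time window dt):

```python
import numpy as np

def memory_coefficient(iets):
    """Correlation between consecutive interevent times:
    M = <(t_i - m1)(t_{i+1} - m2)> / (s1 * s2)."""
    a = np.asarray(iets[:-1], float)
    b = np.asarray(iets[1:], float)
    return np.mean((a - a.mean()) * (b - b.mean())) / (a.std() * b.std())

def burst_sizes(event_times, dt):
    """Number of events in each bursty train: consecutive events closer than dt."""
    iets = np.diff(np.sort(event_times))
    sizes, current = [], 1
    for x in iets:
        if x <= dt:
            current += 1
        else:
            sizes.append(current)
            current = 1
    sizes.append(current)
    return np.array(sizes)
```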
Robotic Attention Processing And Its Application To Visual Guidance
NASA Astrophysics Data System (ADS)
Barth, Matthew; Inoue, Hirochika
1988-03-01
This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system, developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local-area windows. These local-area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high-speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using these attention skills was developed. The attention skills involved detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game, and later to an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than that of a human, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both the direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing for robotic attention processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanford, J.W.; Huang, Y.J.
The energy performance of skylights is similar to that of windows in admitting solar heat gain, while at the same time providing a pathway for convective and conductive heat transfer through the building envelope. Since skylights are typically installed at angles ranging from 0° to 45°, and differ from windows in both their construction and operation, their conductive and convective heat gains or losses, as well as solar heat gain, will differ for the same rough opening and thermal characteristics. The objective of this work is to quantify the impact of solar gain through skylights on building heating and cooling loads in 45 climates, and to develop a method for including these data into the SP53 residential loads data base previously developed by LBL in support of DOE's Automated Residential Energy Standard (ARES) program. The authors used the DOE-2.1C program to simulate the heating and cooling loads of a prototypical residential building while varying the size and solar characteristics of skylights and windows. The results are presented as Skylight Solar Loads, which are the contribution of solar gains through skylights to the overall building heating and cooling loads, and as Skylight Solar Load Ratios, which are the ratios of skylight solar loads to those for windows with the same orientation. The study shows that skylight solar loads are larger than those for windows in both heating and cooling. Skylight solar cooling loads are from three to four times greater than those for windows regardless of the skylight tilt, except for those facing north. These cooling loads are largest for south-facing skylights at a tilt angle of approximately 20°, and drop off at higher tilts and other orientations.
Laser-induced damage and fracture in fused silica vacuum windows
NASA Astrophysics Data System (ADS)
Campbell, John H.; Hurst, Patricia A.; Heggins, Dwight D.; Steele, William A.; Bumpas, Stanley E.
1997-05-01
Laser-induced damage that initiates catastrophic fracture has been observed in large, fused silica lenses that also serve as vacuum barriers in high-fluence positions on the Nova and Beamlet lasers. In nearly all cases damage occurs on the vacuum side of the lens. The damage can lead to catastrophic crack growth if the flaw size exceeds the critical flaw size for SiO2. If the elastic stored energy in the lens is high enough, the lens will fracture into many pieces, resulting in an implosion. The consequences of such an implosion can be severe, particularly for large vacuum systems. Three parameters control the degree of fracture in the vacuum barrier window: (1) the elastic stored energy, (2) the ratio of the window thickness to flaw depth and (3) secondary crack propagation. Fracture experiments have been carried out on 15-cm diameter fused silica windows that contain surface flaws caused by laser damage. The results of these experiments, combined with data from window failures on Beamlet and Nova, have been used to develop design criteria for a 'fail-safe' lens. Specifically, the window must be made thick enough that the peak tensile stress is less than 500 psi and the corresponding ratio of the thickness to critical flaw size is less than 6. Under these conditions a properly mounted window, upon failure, will break into only two pieces and will not implode. One caveat to these design criteria is that air must leak through the window before secondary crack growth occurs. Finite element stress calculations of a window before and immediately following fracture into two pieces show that the elastic stored energy is redistributed if the fragments 'lock' in place and thereby bridge the opening. In such cases, the peak stresses at the flaw site can increase, leading to further crack growth.
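An illustrative check of the two design criteria quoted above; the plate-bending formula assumes a simply supported circular window under uniform pressure, and the fracture toughness, Poisson's ratio and geometry factor are assumed values rather than numbers from the paper:

```python
import math

PSI = 6894.76   # Pa per psi

def window_check(radius, thickness, pressure, K_Ic, nu=0.17, Y=1.12):
    """Peak tensile stress of a simply supported circular plate under uniform
    pressure, plus the thickness-to-critical-flaw-size ratio (fail-safe if the
    stress is below 500 psi and the ratio is below 6)."""
    sigma = 3.0 * (3.0 + nu) * pressure * radius ** 2 / (8.0 * thickness ** 2)
    a_c = (K_Ic / (Y * sigma)) ** 2 / math.pi
    ok = sigma < 500.0 * PSI and thickness / a_c < 6.0
    return sigma / PSI, thickness / a_c, ok

# e.g. a 15 cm radius, 3 cm thick fused-silica window (assumed K_Ic ~ 0.75 MPa*sqrt(m)) at 1 atm
print(window_check(0.15, 0.03, 101325.0, 0.75e6))
```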
NASA Astrophysics Data System (ADS)
Musarudin, M.; Saripan, M. I.; Mashohor, S.; Saad, W. H. M.; Nordin, A. J.; Hashim, S.
2015-10-01
The energy window technique has been implemented in all positron emission tomography (PET) imaging protocols, with the aim of removing unwanted low-energy photons. Current practice in our institution, however, is to use the default energy threshold level regardless of the weight of the patient. Phantom size, which represents the size of the patient's body, is the factor that determines the level of scatter fraction during PET imaging. Thus, the motivation of this study is to determine the optimum energy threshold level for different sizes of human-shaped phantom, representing underweight, normal, overweight and obese patients. In this study, the scanner was modeled using the Monte Carlo code MCNP5. Five different sizes of elliptical-cylinder-shaped human-sized phantoms with diameters ranging from 15 to 30 cm were modeled. The tumor was modeled by a cylindrical line source filled with 1.02 MeV positron emitters at the center of the phantom. Various energy window widths, in the range of 10-50%, were applied to the data. In conclusion, the phantom mass volume did influence the scatter fraction within the volume. A bigger phantom caused more scattering events and thus led to a loss of coincidence counts. We evaluated the impact of phantom sizes on the sensitivity and visibility of the simulated models. Implementation of a wider energy window improved the sensitivity of the system and recovered coincidence photons that would otherwise be lost. Visibility of the tumor improved as an appropriate energy window was implemented for the different sizes of phantom.
Molecular matter waves - tools and applications
NASA Astrophysics Data System (ADS)
Juffmann, Thomas; Sclafani, Michele; Knobloch, Christian; Cheshnovsky, Ori; Arndt, Markus
2013-05-01
Fluorescence microscopy allows us to visualize the gradual emergence of a deterministic far-field matter-wave diffraction pattern from stochastically arriving single molecules. We create a slow beam of phthalocyanine molecules via laser desorption from a glass window. The small source size provides the transverse coherence required to observe an interference pattern in the far-field behind an ultra-thin nanomachined grating. There the molecules are deposited onto a quartz window and can be imaged in situ and in real time with single molecule sensitivity. This new setup not only allows for a textbook demonstration of quantum interference, but also enables quantitative explorations of the van der Waals interaction between molecules and material gratings.
Modeling methylene chloride exposure-reduction options for home paint-stripper users.
Riley, D M; Small, M J; Fischhoff, B
2000-01-01
Home improvement is a popular activity, but one that can also involve exposure to hazardous substances. Paint stripping is of particular concern because of the high potential exposures to methylene chloride, a solvent that is a potential human carcinogen and neurotoxicant. This article presents a general methodology for evaluating the effectiveness of behavioral interventions for reducing these risks. It doubles as a model that assesses exposure patterns, incorporating user time-activity patterns and risk-mitigation strategies. The model draws upon recent innovations in indoor air-quality modeling to estimate exposure through inhalation and dermal pathways to paint-stripper users. It is designed to use data gathered from home paint-stripper users about room characteristics, amount of stripper used, time-activity patterns and exposure-reduction strategies (e.g., increased ventilation and modification in the timing of stripper application, scraping, and breaks). Results indicate that the effectiveness of behavioral interventions depends strongly on characteristics of the room (e.g., size, number and size of doors and windows, base air-exchange rates). The greatest simple reduction in exposure is achieved by using an exhaust fan in addition to opening windows and doors. These results can help identify the most important information for product labels and other risk-communication materials.
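A minimal well-mixed single-zone sketch of the inhalation part of such a model (the article's model also treats dermal uptake and more detailed room characteristics); the emission rate, room volume and air-exchange rates below are illustrative assumptions:

```python
import numpy as np

def indoor_concentration(emission_g_per_h, volume_m3, ach_per_h, dt_h=0.01):
    """Well-mixed single-zone mass balance, dC/dt = E(t)/V - ACH*C,
    integrated with forward Euler; returns time (h) and C (g/m^3)."""
    n = len(emission_g_per_h)
    C = np.zeros(n)
    for k in range(1, n):
        dC = emission_g_per_h[k - 1] / volume_m3 - ach_per_h * C[k - 1]
        C[k] = C[k - 1] + dt_h * dC
    return np.arange(n) * dt_h, C

# e.g. 2 h of stripping (assumed 50 g/h of methylene chloride) in a 30 m^3 room:
E = np.where(np.arange(0, 4, 0.01) < 2.0, 50.0, 0.0)
t, C_closed = indoor_concentration(E, 30.0, 0.5)   # windows closed (~0.5 air changes/h)
t, C_open = indoor_concentration(E, 30.0, 5.0)     # windows open plus exhaust fan (~5 ACH)
```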
Least Squares Moving-Window Spectral Analysis.
Lee, Young Jong
2017-08-01
Least squares regression is proposed as a moving-window method for analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of the Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
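The core operation is an ordinary least-squares fit of intensity against the (possibly nonuniformly spaced) perturbation variable within each window, with the fitted slope taken as the local derivative. A minimal sketch under those assumptions, not the author's implementation:

```python
import numpy as np

def lsmw_slopes(perturbation, spectra, window=7):
    """Least-squares moving-window slopes.

    perturbation : (n,) possibly nonuniformly spaced perturbation values
    spectra      : (n, m) spectra, one row per perturbation step
    Returns an (n - window + 1, m) array of fitted slopes d(intensity)/d(perturbation).
    """
    n, m = spectra.shape
    out = np.empty((n - window + 1, m))
    for i in range(n - window + 1):
        x = perturbation[i:i + window]
        y = spectra[i:i + window]
        A = np.column_stack([x, np.ones_like(x)])     # y ≈ slope*x + intercept
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # solves all m channels at once
        out[i] = coef[0]
    return out
```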
PREDICTION OF SOLAR FLARE SIZE AND TIME-TO-FLARE USING SUPPORT VECTOR MACHINE REGRESSION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boucheron, Laura E.; Al-Ghraibah, Amani; McAteer, R. T. James
We study the prediction of solar flare size and time-to-flare using 38 features describing magnetic complexity of the photospheric magnetic field. This work uses support vector regression to formulate a mapping from the 38-dimensional feature space to a continuous-valued label vector representing flare size or time-to-flare. When we consider flaring regions only, we find an average error in estimating flare size of approximately half a geostationary operational environmental satellite (GOES) class. When we additionally consider non-flaring regions, we find an increased average error of approximately three-fourths a GOES class. We also consider thresholding the regressed flare size for the experiment containing both flaring and non-flaring regions and find a true positive rate of 0.69 and a true negative rate of 0.86 for flare prediction. The results for both of these size regression experiments are consistent across a wide range of predictive time windows, indicating that the magnetic complexity features may be persistent in appearance long before flare activity. This is supported by our larger error rates of some 40 hr in the time-to-flare regression problem. The 38 magnetic complexity features considered here appear to have discriminative potential for flare size, but their persistence in time makes them less discriminative for the time-to-flare problem.
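Support vector regression of a continuous label (flare size or time-to-flare) on a feature vector can be sketched with scikit-learn. The feature matrix and labels below are placeholders, not the 38 magnetic-complexity features of the study:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 38))                        # placeholder 38-dimensional features
y = 0.5 * X[:, 0] + rng.normal(scale=0.2, size=500)   # placeholder continuous label (e.g., log flare size)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))

# Thresholding the regressed size turns the regressor into a flare/no-flare classifier.
flare_predicted = model.predict(X_te) > 0.0           # illustrative threshold
```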
Surgical anatomy of the round window-Implications for cochlear implantation.
Luers, J C; Hüttenbrink, K B; Beutner, D
2018-04-01
The round window is an important portal for the application of active hearing aids and cochlear implants. The anatomical and topographical knowledge about the round window region is a prerequisite for successful insertion of a cochlear implant electrode. To sum up current knowledge about the round window anatomy and to give advice to the cochlear implant surgeon for optimal placement of an electrode. Systematic Medline search. Search term "round window[Title]" with no date restriction. Only publications in the English language were included. All abstracts were screened for relevance, that is, a focus on surgical anatomy of the round window. The search results were supplemented with hand searching of selected reviews and reference lists from included studies. Subjective assessment. There is substantial variability in size and shape of the round window. The round window is regarded as the most reliable surgical landmark to safely locate the scala tympani. Factors affecting the optimal trajectory line for atraumatic electrode insertion are the anatomy of the round window, the anatomy of the intracochlear hook region and the variable orientation and size of the cochlea's basal turn. The very close relation to the sensitive inner ear structures necessitates a thorough anatomic knowledge and careful insertion technique, especially when implanting patients with residual hearing. In order to avoid electrode migration between the scalae and to protect the modiolus and the basilar membrane, it is recommended to aim for an electrode insertion vector from postero-superior to antero-inferior. © 2017 John Wiley & Sons Ltd.
14 CFR 417.229 - Far-field overpressure blast effects analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... characteristics; (2) The potential for broken windows due to peak incident overpressures below 1.0 psi and related... the potentially affected windows, including their size, location, orientation, glazing material, and...
Local concurrent error detection and correction in data structures using virtual backpointers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.C.J.; Chen, P.P.; Fuchs, W.K.
1989-11-01
A new technique, based on virtual backpointers, is presented in this paper for local concurrent error detection and correction in linked data structures. Two new data structures utilizing virtual backpointers, the Virtual Double-Linked List and the B-Tree with Virtual Backpointers, are described. For these structures, double errors within a fixed-size checking window can be detected in constant time, and single errors detected during forward moves can be corrected in constant time.
Anomalous finite-size effects in the Battle of the Sexes
NASA Astrophysics Data System (ADS)
Cremer, J.; Reichenbach, T.; Frey, E.
2008-06-01
The Battle of the Sexes describes asymmetric conflicts in the mating behavior of males and females. Males can be philanderers or faithful, while females are either fast or coy, leading to cyclic dynamics. The adjusted replicator equation predicts stable coexistence of all four strategies. In this situation, we consider the effects of fluctuations stemming from a finite population size. We show that they unavoidably lead to extinction of two strategies in the population. However, the typical time until extinction occurs increases strongly with system size. In the emerging time window, a quasi-stationary probability distribution forms that is anomalously flat in the vicinity of the coexistence state. This behavior originates in a vanishing linear deterministic drift near the fixed point. We provide numerical data as well as an analytical approach to the mean extinction time and the quasi-stationary probability distribution.
NASA Astrophysics Data System (ADS)
Sun, Hao; Zou, Huanxin; Zhou, Shilin
2016-03-01
Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter, we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne recorded hyperspectral data demonstrate that the proposed method is adaptive to anomalies over a large range of sizes and is well suited for parallel processing.
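As a generic illustration of window-based anomaly detection (a simple local RX detector, not the collaborative-representation framework of the paper), each pixel can be scored by its Mahalanobis distance to background statistics estimated from a surrounding outer window with an inner guard region excluded:

```python
import numpy as np

def local_rx(cube, outer=9, inner=3):
    """cube: (rows, cols, bands) hyperspectral image.
    Scores each interior pixel against its local background; borders are left at zero."""
    r, c, b = cube.shape
    pad, guard = outer // 2, inner // 2
    gi, gj = np.meshgrid(np.arange(outer), np.arange(outer), indexing="ij")
    keep = (np.abs(gi - pad) > guard) | (np.abs(gj - pad) > guard)   # exclude guard window
    scores = np.zeros((r, c))
    for i in range(pad, r - pad):
        for j in range(pad, c - pad):
            block = cube[i - pad:i + pad + 1, j - pad:j + pad + 1].reshape(-1, b)
            bg = block[keep.ravel()]
            mu = bg.mean(axis=0)
            cov = np.cov(bg, rowvar=False) + 1e-6 * np.eye(b)        # regularized background covariance
            d = cube[i, j] - mu
            scores[i, j] = d @ np.linalg.solve(cov, d)               # Mahalanobis distance
    return scores
```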
40 CFR 600.302-08 - Fuel economy label format requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... or diesel fuel as calculated in § 600.210-08(a) and (b). (3) The fuel pump logo. (4) The following... *”. The title shall be positioned in the grey area above the window of the fuel pump logo, in a size and...)”]”. Both of these titles are centered in the grey area above the window of the fuel pump logo, with a size...
40 CFR 600.302-08 - Fuel economy label format requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... or diesel fuel as calculated in § 600.210-08(a) and (b). (3) The fuel pump logo. (4) The following... *”. The title shall be positioned in the grey area above the window of the fuel pump logo, in a size and...)”]”. Both of these titles are centered in the grey area above the window of the fuel pump logo, with a size...
Ng, Kenney; Steinhubl, Steven R; deFilippi, Christopher; Dey, Sanjoy; Stewart, Walter F
2016-11-01
Using electronic health records data to predict events and onset of diseases is increasingly common. Relatively little is known, however, about the tradeoffs between data requirements and model utility. We examined the performance of machine learning models trained to detect prediagnostic heart failure in primary care patients using longitudinal electronic health records data. Model performance was assessed in relation to data requirements defined by the prediction window length (time before clinical diagnosis), the observation window length (duration of observation before prediction window), the number of different data domains (data diversity), the number of patient records in the training data set (data quantity), and the density of patient encounters (data density). A total of 1684 incident heart failure cases and 13 525 sex, age-category, and clinic matched controls were used for modeling. Model performance improved as (1) the prediction window length decreases, especially when <2 years; (2) the observation window length increases but then levels off after 2 years; (3) the training data set size increases but then levels off after 4000 patients; (4) more diverse data types are used, but, in order, the combination of diagnosis, medication order, and hospitalization data was most important; and (5) data were confined to patients who had ≥10 phone or face-to-face encounters in 2 years. These empirical findings suggest possible guidelines for the minimum amount and type of data needed to train effective disease onset predictive models using longitudinal electronic health records data. © 2016 American Heart Association, Inc.
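The windowing scheme can be made concrete: for a case with index (diagnosis) date D, a prediction window of length P holds out records newer than D - P, and an observation window of length O further restricts records to [D - P - O, D - P). A minimal pandas sketch with hypothetical column names (patient_id, event_date, index_date):

```python
import pandas as pd

def observation_records(events: pd.DataFrame,
                        prediction_window: pd.Timedelta,
                        observation_window: pd.Timedelta) -> pd.DataFrame:
    """Keep only events inside the observation window that precedes the
    prediction window, relative to each patient's index (diagnosis) date."""
    end = events["index_date"] - prediction_window        # prediction window is held out
    start = end - observation_window
    keep = (events["event_date"] >= start) & (events["event_date"] < end)
    return events[keep]

# Usage (hypothetical data):
# events = pd.DataFrame({"patient_id": [...], "event_date": [...], "index_date": [...]})
# train_rows = observation_records(events, pd.Timedelta(days=365), pd.Timedelta(days=730))
```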
Applicability of optical scanner method for fine root dynamics
NASA Astrophysics Data System (ADS)
Kume, Tomonori; Ohashi, Mizue; Makita, Naoki; Khoon Kho, Lip; Katayama, Ayumi; Matsumoto, Kazuho; Ikeno, Hidetoshi
2016-04-01
Fine root dynamics is one of the important components in forest carbon cycling, as ~60% of tree photosynthetic production can be allocated to root growth and metabolic activities. Various techniques have been developed for monitoring fine root biomass, production, and mortality in order to understand the carbon pools and fluxes resulting from fine root dynamics. The minirhizotron method is now a widely used technique, in which a transparent tube is inserted into the soil and researchers count the increase and decrease of roots along the tube using images taken by a minirhizotron camera or minirhizotron video camera inside the tube. This method allows us to observe root behavior directly without destruction, but it has several weaknesses, e.g., the difficulty of scaling the results up to the stand level because of the small observation windows. Also, most of the image analysis is performed manually, which may yield insufficiently quantitative and objective data. Recently, a scanner method has been proposed, which can produce much larger images (A4 size) at lower cost than the minirhizotron methods. However, laborious and time-consuming image analysis still limits the applicability of this method. In this study, therefore, we aimed to develop a new protocol for scanner image analysis to extract root behavior in soil. We evaluated the applicability of this method in two ways: 1) the impact of different observers, including root-study professionals, semi-professionals and non-professionals, on the detected results of root dynamics such as abundance, growth, and decomposition, and 2) the impact of window size on the results using a random-sampling-based exercise. We applied our new protocol to analyze temporal changes of root behavior from sequential scanner images derived from a Bornean tropical forest. The results detected by the six observers showed considerable concordance in temporal changes in the abundance and the growth of fine roots, but less in the decomposition. We also examined potential errors due to window size in the temporal changes in abundance and growth using the detected results, suggesting high applicability of the scanner method with wide observation windows.
TH-CD-207B-03: How to Quantify Temporal Resolution in X-Ray MDCT Imaging?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budde, A; GE Healthcare Technologies, Madison, WI; Li, Y
Purpose: In modern CT scanners, a quantitative metric to assess temporal response, namely, to quantify the temporal resolution (TR), remains elusive. Rough surrogate metrics, such as half of the gantry rotation time for single source CT, a quarter of the gantry rotation time for dual source CT, or measurements of motion artifact’s size, shape, or intensity have previously been used. In this work, a rigorous framework which quantifies TR and a practical measurement method are developed. Methods: A motion phantom was simulated which consisted of a single rod that is in motion except during a static period at the temporal center of the scan, termed the TR window. If the image of the motion scan has negligible motion artifacts compared to an image from a totally static scan, then the system has a TR no worse than the TR window used. By repeating this comparison with varying TR windows, the TR of the system can be accurately determined. Motion artifacts were also visually assessed and the TR was measured across varying rod motion speeds, directions, and locations. Noiseless fan beam acquisitions were simulated and images were reconstructed with a short-scan image reconstruction algorithm. Results: The size, shape, and intensity of motion artifacts varied when the rod speed, direction, or location changed. TR measured using the proposed method, however, was consistent across rod speeds, directions, and locations. Conclusion: Since motion artifacts vary depending upon the motion speed, direction, and location, they are not suitable for measuring TR. In this work, a CT system with a specified TR is defined as having the ability to produce a static image with negligible motion artifacts, no matter what motion occurs outside of a static window of width TR. This framework allows for practical measurement of temporal resolution in clinical CT imaging systems. Funding support: GE Healthcare; Conflict of Interest: Employee, GE Healthcare.
Manufacturing and metrology for IR conformal windows and domes
NASA Astrophysics Data System (ADS)
Ferralli, Ian; Blalock, Todd; Brunelle, Matt; Lynch, Timothy; Myer, Brian; Medicus, Kate
2017-05-01
Freeform and conformal optics have the potential to dramatically improve optical systems by enabling systems with fewer optical components, reduced aberrations, and improved aerodynamic performance. These optical components differ from standard components in their surface shape, typically a non-symmetric equation based definition, and material properties. Traditional grinding and polishing tools are unable to handle these freeform shapes. Additionally, standard metrology tools cannot measure these surfaces. Desired substrates are typically hard ceramics, including poly-crystalline alumina or aluminum oxynitride. Notwithstanding the challenges that the hardness provides to manufacturing, these crystalline materials can be highly susceptible to grain decoration creating unacceptable scatter in optical systems. In this presentation, we will show progress towards addressing the unique challenges of manufacturing conformal windows and domes. Particular attention is given to our robotic polishing platform. This platform is based on an industrial robot adapted to accept a wide range of tooling and parts. The robot's flexibility has provided us an opportunity to address the unique challenges of conformal windows. Slurries and polishing active layers can easily be changed to adapt to varying materials and address grain decoration. We have the flexibility to change tool size and shape to address the varying sizes and shapes of conformal optics. In addition, the robotic platform can be a base for a deflectometry-based metrology tool to measure surface form error. This system, whose precision is independent of the robot's positioning accuracy, will allow us to measure optics in-situ saving time and reducing part risk. In conclusion, we will show examples of the conformal windows manufactured using our developed processes.
Lobster eye as a collector for water window microscopy
NASA Astrophysics Data System (ADS)
Pina, L.; Maršíková, V.; Inneman, A.; Nawaz, M. F.; Jančárek, A.; Havlíková, R.
2017-08-01
Imaging in the EUV, SXR and XR spectral bands is of increasing interest. Material science, biology and hot plasma are examples of relevant, fast-developing areas. Applications include spectroscopy, astrophysics, soft X-ray metrology, water window microscopy, radiography and tomography. Water window imaging in particular has potential in biology and medicine microscopy applications that is still not fully recognized. A theoretical study and design of Lobster Eye (LE) optics as a collector for water window (WW) microscopy, and a comparison with a similar-size ellipsoidal mirror condenser, are presented.
Rugged sensor window materials for harsh environments
NASA Astrophysics Data System (ADS)
Bayya, Shyam; Villalobos, Guillermo; Kim, Woohong; Sanghera, Jasbinger; Hunt, Michael; Aggarwal, Ishwar D.
2014-09-01
There are several military or commercial systems operating in very harsh environments that require rugged windows. On some of these systems, windows become the single point of failure. These applications include sensor or imaging systems, high-energy laser weapons systems, submarine photonic masts, IR countermeasures and missiles. Depending on the sea-, land- or air-based platform, the window or dome on these systems must withstand wave slap and underwater or ground-based explosions, or survive flight through heavy rain and sand storms, while maintaining good optical transmission in the desired wavelength range. Some of these applications still use softer ZnS or fused silica windows because rugged materials are not available in the required shapes or sizes. Sapphire, ALON and spinel are very rugged materials with significantly higher strengths compared to ZnS and fused silica. There have been recent developments in spinel, ALON and sapphire materials that allow fabrication in large sizes and conformal shapes. We have been developing spinel ceramics for several of these applications. We are also developing β-SiC as a transparent window material as it has higher hardness, strength, and toughness than sapphire, ALON and spinel. This paper gives a summary of our recent findings.
Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm
NASA Astrophysics Data System (ADS)
Hardi, S. M.; Tarigan, J. T.; Safrina, N.
2018-03-01
In this paper, we present an implementation of image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform the asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and runs as a stand-alone desktop application under the Windows operating system. Our tests show that the system is capable of encrypting an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.
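The hybrid pattern is: encrypt the bulk image bytes with a symmetric cipher and encrypt only the symmetric key with the asymmetric scheme. A minimal sketch of that pattern using RSA-OAEP and Fernet from the Python `cryptography` package as stand-ins for the paper's ElGamal and Double Playfair ciphers (the file name is hypothetical):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.fernet import Fernet

# Asymmetric key pair (stand-in for the ElGamal key pair).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Symmetric session key and bulk encryption of the image (stand-in for Double Playfair).
session_key = Fernet.generate_key()
with open("photo.png", "rb") as f:               # hypothetical input file
    ciphertext = Fernet(session_key).encrypt(f.read())

# Only the small session key is protected by the asymmetric cipher.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver side: unwrap the key, then decrypt the image bytes.
recovered_key = private_key.decrypt(wrapped_key, oaep)
image_bytes = Fernet(recovered_key).decrypt(ciphertext)
```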
Rapid feature-driven changes in the attentional window.
Leonard, Carly J; Lopez-Calderon, Javier; Kreither, Johanna; Luck, Steven J
2013-07-01
Spatial attention must adjust around an object of interest in a manner that reflects the object's size on the retina as well as the proximity of distracting objects, a process often guided by nonspatial features. This study used ERPs to investigate how quickly the size of this type of "attentional window" can adjust around a fixated target object defined by its color and whether this variety of attention influences the feedforward flow of subsequent information through the visual system. The task involved attending either to a circular region at fixation or to a surrounding annulus region, depending on which region contained an attended color. The region containing the attended color varied randomly from trial to trial, so the spatial distribution of attention had to be adjusted on each trial. We measured the initial sensory ERP response elicited by an irrelevant probe stimulus that appeared in one of the two regions at different times after task display onset. This allowed us to measure the amount of time required to adjust spatial attention on the basis of the location of the task-relevant feature. We found that the probe-elicited sensory response was larger when the probe occurred within the region of the attended dots, and this effect required a delay of approximately 175 msec between the onset of the task display and the onset of the probe. Thus, the window of attention is rapidly adjusted around the point of fixation in a manner that reflects the spatial extent of a task-relevant stimulus, leading to changes in the feedforward flow of subsequent information through the visual system.
Two-window heterodyne methods to characterize light fields
NASA Astrophysics Data System (ADS)
Reil, Frank
In this dissertation, I develop a novel Two-Window heterodyne technique for measuring the time-resolved Wigner function of light fields, which allows their complete characterization. A Wigner function is a quasi-probability density that describes the transverse position and transverse momentum of a light field and is Fourier-transform related to its mutual coherence function. It obeys rigorous transport equations and therefore provides an ideal way to characterize a light field and its propagation through various media. I first present the experimental setup of our Two-Window technique, which is based on a heterodyne scheme involving two phase-coupled Local Oscillator beams we call the Dual-LO. The Dual-LO consists of a focused beam ('SLO') which sets the spatial resolution, and a collimated beam ('BLO') which sets the momental resolution. The resolution in transverse position and transverse momentum can be adjusted individually by the size of the SLO and BLO, which enables a measurement resolution surpassing the uncertainty principle associated with Fourier-transform pairs which limits the resolution when just a single LO is used. We first use our technique to determine the beam size, transverse coherence length and radius of curvature of a Gaussian-Schell beam, as well as its longitudinal characteristics, which are related to its optical spectrum. We then examine Enhanced Backscattering at various path-lengths in the turbid medium. For the first time ever, we demonstrate the phase-conjugating properties of a turbid medium by observing the change in sign of the radius of curvature for a non-collimated field incident on the medium. We also perform time-resolved measurements in the transmission regime. In tenuous media we observe two peaks in phase-space confined by a hyperbola which are due to low-order scattering. Their distance depends on the chosen path-delay. Some coherence and even spatial properties of the incident field are preserved in those peaks as measurements with our Two-Window technique show. Various other applications are presented in less detail, such as the Wigner function of the field inside a speckle produced by a piece of glass containing air bubbles.
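For reference, a standard textbook form of the relation the abstract alludes to (normalization and sign conventions vary between references): the Wigner function is the Fourier transform of the mutual coherence function with respect to the difference coordinate,

```latex
\Gamma(x_1, x_2) = \langle E^*(x_1)\, E(x_2) \rangle, \qquad
W(x, p) = \frac{1}{2\pi} \int \Gamma\!\left(x - \tfrac{\epsilon}{2},\; x + \tfrac{\epsilon}{2}\right)
          e^{-i p \epsilon}\, d\epsilon .
```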
Statistical analysis of data and modeling of Nanodust measured by STEREO/WAVES at 1AU
NASA Astrophysics Data System (ADS)
Belheouane, S.; Zaslavsky, A.; Meyer-Vernet, N.; Issautier, K.; Czechowski, A.; Mann, I.; Le Chat, G.; Zouganelis, I.; Maksimovic, M.
2012-12-01
We study the flux of dust particles of nanometer size measured at 1AU by the S/WAVES instrument aboard the twin STEREO spacecraft. When they impact the spacecraft at very high speed, these nanodust particles, first detected by Meyer-Vernet et al. (2009), generate plasma clouds and produce voltage pulses measured by the electric antennas. The Time Domain Sampler (TDS) of the radio and plasma instrument produces temporal windows containing several pulses. We perform a statistical study of the distribution of pulse amplitudes and arrival times in the measuring window during the 2007-2012 period. We interpret the results using simulations of the dynamics of nanodust in the solar wind based on the model of Czechowski and Mann (2010). We also investigate the variations of nanodust fluxes while STEREO rotates about the sunward axis (Roll); this reveals that some directions are privileged.
Implementation of the Algorithm for Congestion control in the Dynamic Circuit Network (DCN)
NASA Astrophysics Data System (ADS)
Nalamwar, H. S.; Ivanov, M. A.; Buddhawar, G. U.
2017-01-01
Transmission Control Protocol (TCP) incast congestion happens when a number of senders work in parallel with the same server in a high-bandwidth, low-latency network. For many data center network applications, such as search engines, heavy traffic is present on such a server. Incast congestion degrades overall performance as packets are lost at the server side due to buffer overflow, and as a result the response time becomes longer. In this work, we focus on TCP throughput, round-trip time (RTT), the receive window and retransmission. Our method is based on proactively adjusting the TCP receive window before packet loss occurs. We aim to avoid wasting bandwidth by adjusting the window size according to the number of packets. To avoid packet loss, the ICTCP algorithm has been implemented in the data center network (ToR).
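A toy illustration of the underlying idea, proactively shrinking the advertised receive window as the number of concurrent senders grows so that the aggregate in-flight data stays below the shared buffer (a simplified heuristic, not the ICTCP algorithm itself):

```python
def advertised_window(buffer_bytes: int, n_senders: int,
                      mss: int = 1460, min_segments: int = 2) -> int:
    """Split the receiver/switch buffer evenly across senders, never
    advertising less than a couple of segments per connection."""
    if n_senders <= 0:
        return buffer_bytes
    share = buffer_bytes // n_senders
    return max(min_segments * mss, (share // mss) * mss)   # round down to whole segments

# Example: a 256 KB shared buffer divided among an increasing number of senders.
for n in (1, 8, 32, 128):
    print(n, advertised_window(256 * 1024, n))
```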
Performance evaluation of the Trans-PET®BioCaliburn® SH system
NASA Astrophysics Data System (ADS)
Zhu, Jun; Wang, Luyao; Kao, Chien-Min; Kim, Heejong; Xie, Qingguo
2015-03-01
The Trans-PET®BioCaliburn® SH system, recently introduced by the Raycan Technology Co. Ltd. (Suzhou, China), is a commercial positron emission tomography (PET) system designed for rodent imaging. The system contains 6 basic detector modules (BDMs) arranged on a 10.8 cm diameter ring to provide a transaxial field of view (FOV) of 6.5 cm and an axial FOV of 5.3 cm. In this paper, we report on its performance properties in accordance with the National Electrical Manufacturers Association (NEMA) 2008 NU-4 standards with modifications. The measured spatial resolution at the center of the FOV was 1.05 mm, 1.12 mm and 1.13 mm in the tangential, radial and axial directions, respectively. The measured system sensitivity was 3.29% for a point source at the center of the FOV when using a 350-650 keV energy window and a 5 ns coincidence time window. When a wider 250-750 keV energy window was used, it increased to 4.21%. For mouse- and rat-sized phantoms, the scatter fraction was 10.7% and 16.1%, respectively. The peak noise equivalent count rates were 36 kcps@8.52 MBq for the mouse-sized phantom and 16 kcps@6.79 MBq for the rat-sized phantom. The Derenzo phantom image showed that the system can resolve 1.0 mm diameter rods. The measured performance properties of the system indicate that the Trans-PET®BioCaliburn® SH is a versatile imaging device that can provide high spatial resolution for rodent imaging while offering competitive sensitivity and count-rate performance.
Zhu, Yue-Shan; Yang, Wan-Dong; Li, Xiu-Wen; Ni, Hong-Gang; Zeng, Hui
2018-02-01
The quality of indoor environments has a significant impact on public health. Usually, an indoor environment is treated as a static box, in which physicochemical reactions of indoor air contaminants are negligible. This results in conservative estimates for primary indoor air pollutant concentrations, while also ignoring secondary pollutants. Thus, understanding the relationship between indoor and outdoor particles and particle-bound pollutants is of great significance. For this reason, we collected simultaneous indoor and outdoor measurements of the size distribution of airborne brominated flame retardant (BFR) congeners. The time-dependent concentrations of indoor particles and particle-bound BFRs were then estimated with the mass balance model, accounting for the outdoor concentration, indoor source strength, infiltration, penetration, deposition and indoor resuspension. Based on qualitative observation, the size distributions of ΣPBDE and ΣHBCD were characterized by bimodal peaks. According to our results, particle-bound BDE209 and γ-HBCD underwent degradation. Regardless of the surface adsorption capability of particles and the physicochemical properties of the target compounds, the concentration of BFRs in particles of different size fractions seemed to be governed by the particle distribution. Based on our estimations, for airborne particles and particle-bound BFRs, a window-open ventilated room only takes a quarter of the time to reach an equilibrium between the concentration of pollutants inside and outside compared to a closed room. Unfortunately, indoor pollutants and outdoor pollutants always exist simultaneously, which poses a window-open-or-closed dilemma to achieve proper ventilation. Copyright © 2017 Elsevier Ltd. All rights reserved.
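The mass-balance structure described above can be written as a single ordinary differential equation for the indoor concentration and integrated numerically. A minimal sketch with illustrative parameter names (penetration factor P, air-exchange rate a, deposition rate k, indoor/resuspension source R), not the study's fitted values:

```python
import numpy as np

def indoor_concentration(c_out, a, P=0.8, k=0.2, R=0.0, c0=0.0, dt=0.1):
    """Euler integration of dC_in/dt = P*a*C_out + R - (a + k)*C_in.

    c_out : array of outdoor concentrations at each time step (ug/m^3)
    a     : air-exchange rate (1/h); larger when windows are open
    """
    c_in = np.empty_like(c_out, dtype=float)
    c = c0
    for i, co in enumerate(c_out):
        c += dt * (P * a * co + R - (a + k) * c)
        c_in[i] = c
    return c_in

# A window-open room (a = 2 /h) approaches the outdoor level much faster
# than a closed room (a = 0.3 /h), consistent with the comparison above.
t = np.arange(0, 24, 0.1)
outdoor = np.full_like(t, 10.0)
open_room = indoor_concentration(outdoor, a=2.0)
closed_room = indoor_concentration(outdoor, a=0.3)
```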
Prevalence of Imaging Biomarkers to Guide the Planning of Acute Stroke Reperfusion Trials.
Jiang, Bin; Ball, Robyn L; Michel, Patrik; Jovin, Tudor; Desai, Manisha; Eskandari, Ashraf; Naqvi, Zack; Wintermark, Max
2017-06-01
Imaging biomarkers are increasingly used as selection criteria for stroke clinical trials. The goal of our study was to determine the prevalence of commonly studied imaging biomarkers in different time windows after acute ischemic stroke onset to better facilitate the design of stroke clinical trials using such biomarkers for patient selection. This retrospective study included 612 patients admitted with a clinical suspicion of acute ischemic stroke with symptom onset no more than 24 hours before completing baseline imaging. Patients with subacute/chronic/remote infarcts and hemorrhage were excluded from this study. Imaging biomarkers were extracted from baseline imaging, which included a noncontrast head computed tomography (CT), perfusion CT, and CT angiography. The prevalence of dichotomized versions of each of the imaging biomarkers in several time windows (time since symptom onset) was assessed and statistically modeled to assess time dependence (or lack thereof). We created tables showing the prevalence of the imaging biomarkers pertaining to the core, the penumbra and the arterial occlusion for different time windows. All continuous imaging features vary over time. The dichotomized imaging features that vary significantly over time include: noncontrast head computed tomography Alberta Stroke Program Early CT (ASPECT) score and dense artery sign, perfusion CT infarct volume, and CT angiography collateral score and visible clot. The dichotomized imaging features that did not vary significantly over time include the thresholded perfusion CT penumbra volumes. As part of the feasibility analysis in stroke clinical trials, this analysis and the resulting tables can help investigators determine sample size and the number needed to screen. © 2017 American Heart Association, Inc.
A unified approach to the study of temporal, correlational, and rate coding.
Panzeri, S; Schultz, S R
2001-06-01
We demonstrate that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each reflecting something about potential coding mechanisms. This is possible in the coding regime in which few spikes are emitted in the relevant time window. This approach allows us to study the additional information contributed by spike timing beyond that present in the spike counts and to examine the contributions to the whole information of different statistical properties of spike trains, such as firing rates and correlation functions. It thus forms the basis for a new quantitative procedure for analyzing simultaneous multiple neuron recordings and provides theoretical constraints on neural coding strategies. We find a transition between two coding regimes, depending on the size of the relevant observation timescale. For time windows shorter than the timescale of the stimulus-induced response fluctuations, there exists a spike count coding phase, in which the purely temporal information is of third order in time. For time windows much longer than the characteristic timescale, there can be additional timing information of first order, leading to a temporal coding phase in which timing information may affect the instantaneous information rate. In this new framework, we study the relative contributions of the dynamic firing rate and correlation variables to the full temporal information, the interaction of signal and noise correlations in temporal coding, synergy between spikes and between cells, and the effect of refractoriness. We illustrate the utility of the technique by analyzing a few cells from the rat barrel cortex.
NASA Astrophysics Data System (ADS)
Baniamerian, Ali; Bashiri, Mahdi; Zabihi, Fahime
2018-03-01
Cross-docking is a relatively new warehousing policy in logistics that is widely used all over the world and has attracted many researchers' attention over the last decade. In the literature, economic aspects have often been studied, while one of the most significant factors for success in the competitive global market is improving the quality of customer service and focusing on customer satisfaction. In this paper, we introduce a vehicle routing and scheduling problem with cross-docking and time windows in a three-echelon supply chain that considers customer satisfaction. A set of homogeneous vehicles collect products from suppliers and, after a consolidation process in the cross-dock, immediately deliver them to customers. A mixed integer linear programming model is presented for this problem to minimize transportation cost and early/tardy deliveries, with scheduling of inbound and outbound vehicles to increase customer satisfaction. A two-phase genetic algorithm (GA) is developed for the problem. To investigate the performance of the algorithm, it was compared with exact solutions in small-size instances and with lower bounds in large-size instances. Results show that the proposed method achieves at least 86.6% customer satisfaction, whereas customer satisfaction in the classical model is at most 33.3%. Numerical results show that the proposed two-phase algorithm achieves optimal solutions in small-size instances. Also, in large-size instances, the proposed two-phase algorithm achieves better solutions with a smaller gap from the lower bound in less computational time than the classic GA.
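The customer-satisfaction component reduces to an earliness/tardiness penalty against each customer's time window. A small sketch of such an evaluation for one delivery schedule, with illustrative cost weights rather than the paper's model parameters:

```python
def time_window_penalty(arrivals, windows, early_penalty=1.0, late_penalty=2.0):
    """Earliness/tardiness cost of a schedule.
    arrivals: arrival time at each customer; windows: (earliest, latest) pairs."""
    cost = 0.0
    for t, (earliest, latest) in zip(arrivals, windows):
        if t < earliest:
            cost += early_penalty * (earliest - t)   # early delivery
        elif t > latest:
            cost += late_penalty * (t - latest)      # tardy delivery
    return cost

# Example: the second customer is served 0.5 time units late.
print(time_window_penalty([8.0, 12.5], [(7.5, 9.0), (10.0, 12.0)]))
```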
Mars Reconnaissance Orbiter Uplink Analysis Tool
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline
2008-01-01
This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
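The core check is whether a MER contact window, converted to the common reference time, overlaps a padded keep-out window around any MRO event. A minimal sketch of that interval test (written in Python rather than the tool's PHP, with illustrative names):

```python
from datetime import datetime, timedelta

def violates_keepout(contact, mro_events, pad_minutes=5):
    """contact: (start, end) datetimes already shifted to Earth Transmit Time.
    mro_events: list of MRO event datetimes in the same reference time.
    Returns the first event whose padded keep-out window overlaps the contact, else None."""
    pad = timedelta(minutes=pad_minutes)
    start, end = contact
    for ev in mro_events:
        if ev - pad <= end and ev + pad >= start:   # interval overlap test
            return ev
    return None

contact = (datetime(2008, 1, 1, 10, 0), datetime(2008, 1, 1, 10, 20))
events = [datetime(2008, 1, 1, 10, 23)]
print(violates_keepout(contact, events, pad_minutes=5))   # overlaps via the 5 min pad
```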
Lower HVAC Costs | Efficient Windows Collaborative
Efficient windows can allow a smaller HVAC system. Smaller HVAC systems cost less and as such can offset some of the cost of the efficient windows. First cost savings: smaller HVAC units cost less. If, for example, down-sizing the HVAC system by half a ton saves $275, the cost premium of energy-efficient windows does not present as big an up-front cost.
Yu, Zhan; Yu, Min; Zhou, Zhimin; Zhang, Zhibao; Du, Bo; Xiong, Qingqing
2014-01-01
Controlled-release carriers for local drug delivery have attracted increasing attention for inner-ear treatment recently. In this paper, flower-shaped bovine serum albumin (FBSA) particles were prepared by a modified desolvation method followed by glutaraldehyde or heat denaturation. The size of the FBSA particles varied from 10 μm to 100 μm, and most were 50-80 μm. Heat-denatured FBSA particles have good cytocompatibility with a prolonged survival time for L929 cells. The FBSA particles were utilized as carriers to investigate the release behaviors of the model drug - rhodamine B. Rhodamine B showed a sustained-release effect and penetrated the round-window membrane of guinea pigs. We also confirmed the attachment of FBSA particles onto the round-window membrane by microscopy. The FBSA particles, with good biocompatibility, drug-loading capacity, adhesive capability, and biodegradability, may have potential applications in the field of local drug delivery for inner-ear disease treatment.
Duy, Pham K; Chun, Seulah; Chung, Hoeil
2017-11-21
We have systematically characterized Raman scatterings in solid samples with different particle sizes and investigated subsequent trends of particle size-induced intensity variations. For this purpose, both lactose powders and pellets composed of five different particle sizes were prepared. Uniquely in this study, three spectral acquisition schemes with different sizes of laser illuminations and detection windows were employed for the evaluation, since it was expected that the experimental configuration would be another factor potentially influencing the intensity of the lactose peak, along with the particle size itself. In both samples, the distribution of Raman photons became broader with the increase in particle size, as the mean free path of laser photons, the average photon travel distance between consecutive scattering locations, became longer under this situation. When the particle size was the same, the Raman photon distribution was narrower in the pellets since the individual particles were more densely packed in a given volume (the shorter mean free path). When the size of the detection window was small, the number of photons reaching the detector decreased as the photon distribution was larger. Meanwhile, a large-window detector was able to collect the widely distributed Raman photons more effectively; therefore, the trends of intensity change with the variation in particle size were dissimilar depending on the employed spectral acquisition schemes. Overall, the Monte Carlo simulation was effective at probing the photon distribution inside the samples and helped to support the experimental observations.
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet-based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of wavelets to capture the trends in a data set over variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
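One common way to realize wavelet-based detrending is to take the low-pass (approximation) reconstruction as the local trend, subtract it, and examine how the residual fluctuations scale with window size. A brief sketch with PyWavelets and a Daubechies wavelet, as a generic illustration under those assumptions rather than the authors' exact procedure:

```python
import numpy as np
import pywt

def wavelet_fluctuations(x, wavelet="db4", level=6):
    """Return the series minus its wavelet low-pass trend."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # keep approximation only
    trend = pywt.waverec(coeffs_trend, wavelet)[: len(x)]
    return x - trend

def fluctuation_function(profile, window_sizes):
    """RMS fluctuation of the detrended profile over non-overlapping windows."""
    return [np.sqrt(np.mean([profile[i:i + s].std() ** 2
                             for i in range(0, len(profile) - s + 1, s)]))
            for s in window_sizes]

x = np.cumsum(np.random.default_rng(2).standard_normal(4096))  # toy nonstationary series
f = wavelet_fluctuations(x)
print(fluctuation_function(f, [16, 32, 64, 128]))
```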
Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.
Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie
2016-12-01
An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute value, entropy, and combinations of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, the effect of window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
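The per-window features named above can be computed directly from the detected spike train and the denoised samples. A short sketch for one analysis window, with illustrative choices (histogram-based amplitude entropy), not the study's code:

```python
import numpy as np

def window_features(samples, spike_train, n_bins=32):
    """Features for one analysis window (e.g., 200-400 ms of data).

    samples     : denoised signal samples in the window
    spike_train : 0/1 array marking detected action potentials
    """
    spike_count = int(spike_train.sum())
    mav = float(np.mean(np.abs(samples)))                  # mean absolute value
    hist, _ = np.histogram(samples, bins=n_bins)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum())               # Shannon entropy of amplitudes
    return np.array([spike_count, mav, entropy])
```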
Hosein, Riad B M; Mehta, Chetan; Stickley, John; Mcguirk, Simon P; Jones, Timothy J; Brawn, William J; Barron, David J
2007-11-01
A small sub-group of patients with hypoplastic left heart syndrome (HLHS) have normal-sized ascending aorta and arch. An alternative to the Norwood I procedure in these patients is the creation of an aorto-pulmonary (AP) window with a distal pulmonary artery band (PAB). We reviewed our experience with this technique and compared outcomes to the Norwood procedure for HLHS. All patients treated for HLHS in a single institution between 1992 and 2005 were analysed. This identified 13 patients treated with AP window and PAB compared to 333 patients undergoing stage I Norwood procedure. An unrestrictive AP window was created and the main PA was banded. Patient records and echocardiograms were analysed. Median follow-up was 10 (IQR 0-655) days and 100% complete. There were seven early deaths (54%) in the AP window group and two conversions to Norwood circulation. This was a significantly worse outcome than for the Norwood procedure over the same period, which had an early mortality of 29% (p=0.03). Kaplan-Meier actuarial analysis demonstrated a continued survival benefit of the Norwood group at 6 months (p=0.0005). Deaths were due to either low cardiac output syndrome (n=4) or sudden unheralded arrest (n=3). This occurred despite aortic cross-clamp and circulatory arrest times being significantly lower in the AP window group compared to the Norwood group (35+/-27 vs 55+/-16 min, p<0.01 and 16+/-29 vs 55+/-20 min, p<0.01, respectively). No differences in arterial saturations or systolic blood pressure existed between the groups, but diastolic blood pressure was significantly lower in the AP window group at 27+/-10 mmHg compared to 42+/-8 mmHg in the Norwood group (p=0.01) with evidence of flow reversal in the descending aorta. Differences in diastolic blood pressure between groups were abolished after conversion to stage II. Despite favourable anatomy and shorter ischaemic times, the AP window/PAB technique has a poor outcome compared to the Norwood procedure for HLHS. Low diastolic blood pressure with reversal of descending aortic flow in diastole was a feature of the AP window/PAB circulation. We recommend the Norwood procedure for these sub-types. This may have implications for newer 'hybrid' procedures for HLHS which create a similar palliative circulation.
Nonuniform Effects of Reinstatement within the Time Window
ERIC Educational Resources Information Center
Galluccio, Llissa; Rovee-Collier, Carolyn
2006-01-01
A time window is a limited period after an event initially occurs in which additional information can be integrated with the memory of that event. It shuts when the memory is forgotten. The time window hypothesis holds that the impact of a manipulation at different points within the time window is nonuniform. In two operant conditioning…
On the relationship between human search strategies, conspicuity, and search performance
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2005-05-01
We determined the relationship between search performance with a limited field of view (FOV) and several scanning- and scene parameters in human observer experiments. The observers (38 trained army scouts) searched through a large search sector for a target (a camouflaged person) on a heath. From trial to trial the target appeared at a different location. With a joystick the observers scanned through a panoramic image (displayed on a PC-monitor) while the scan path was registered. Four conditions were run differing in sensor type (visual or thermal infrared) and window size (large or small). In conditions with a small window size the zoom option could be used. Detection performance was highly dependent on zoom factor and deteriorated when scan speed increased beyond a threshold value. Moreover, the distribution of scan speeds scales with the threshold speed. This indicates that the observers are aware of their limitations and choose a (near) optimal search strategy. We found no correlation between the fraction of detected targets and overall search time for the individual observers, indicating that both are independent measures of individual search performance. Search performance (fraction detected, total search time, time in view for detection) was found to be strongly related to target conspicuity. Moreover, we found the same relationship between search performance and conspicuity for visual and thermal targets. This indicates that search performance can be predicted directly by conspicuity regardless of the sensor type.
Integrating speech in time depends on temporal expectancies and attention.
Scharinger, Mathias; Steinberg, Johanna; Tavano, Alessandro
2017-08-01
Sensory information that unfolds in time, such as in speech perception, relies on efficient chunking mechanisms in order to yield optimally-sized units for further processing. Whether two successive acoustic events receive a one-unit or a two-unit interpretation seems to depend on the fit between their temporal extent and a stipulated temporal window of integration. However, there is ongoing debate on how flexible this temporal window of integration should be, especially for the processing of speech sounds. Furthermore, there is no direct evidence of whether attention may modulate the temporal constraints on the integration window. For this reason, we here examine how different word durations, which lead to different temporal separations of sound onsets, interact with attention. In an Electroencephalography (EEG) study, participants actively and passively listened to words where word-final consonants were occasionally omitted. Words had either a natural duration or were artificially prolonged in order to increase the separation of speech sound onsets. Omission responses to incomplete speech input, originating in left temporal cortex, decreased when the critical speech sound was separated from previous sounds by more than 250 msec, i.e., when the separation was larger than the stipulated temporal window of integration (125-150 msec). Attention, on the other hand, only increased omission responses for stimuli with natural durations. We complemented the event-related potential (ERP) analyses by a frequency-domain analysis at the stimulus presentation rate. Notably, the power at the stimulation frequency showed the same duration and attention effects as the omission responses. We interpret these findings against the background of existing research on temporal integration windows and further suggest that our findings may be accounted for within the framework of predictive coding. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mestrovic, Ante; Chitsazzadeh, Shadi; Wells, Derek
2016-08-15
Purpose: To develop a highly sensitive patient specific QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: A platform was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside the ArcCheck. The Quasar phantom controller uses a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. With this system the ion chamber is used to QA the correct phase of the gated delivery and the ArcCheck diodes are used to QA the overall dose distribution. This novel approach requires a single plan delivery for a complete QA of a gated plan. The sensitivity of the gating QA procedure was investigated with respect to the following parameters: PTV size, exhale duration, baseline drift, gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns is currently underway to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
NASA Astrophysics Data System (ADS)
Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra
2005-10-01
Time series models, which are constructed from the projections of the molecular-dynamics (MD) runs on principal components (modes), are used to mimic the dynamics of two proteins: tendamistat and immunity protein of colicin E7 (ImmE7). Four independent MD runs of tendamistat and three independent runs of ImmE7 protein in vacuum are used to investigate the energy landscapes of these proteins. It is found that mean-square displacements of residues along the modes in different time scales can be mimicked by time series models, which are utilized in dividing protein dynamics into different regimes with respect to the dominating motion type. The first two regimes constitute the dominance of intraminimum motions during the first 5ps and the random walk motion in a hierarchically higher-level energy minimum, which comprise the initial time period of the trajectories up to 20-40ps for tendamistat and 80-120ps for ImmE7. These are also the time ranges within which the linear nonstationary time series are completely satisfactory in explaining protein dynamics. Encountering energy barriers enclosing higher-level energy minima constrains the random walk motion of the proteins, and pseudorelaxation processes at different levels of minima are detected in tendamistat, depending on the sampling window size. Correlation (relaxation) times of 30-40ps and 150-200ps are detected for two energy envelopes of successive levels for tendamistat, which gives an overall idea about the hierarchical structure of the energy landscape. However, it should be stressed that correlation times of the modes are highly variable with respect to conformational subspaces and sampling window sizes, indicating the absence of an actual relaxation. The random-walk step sizes and the time length of the second regime are used to illuminate an important difference between the dynamics of the two proteins, which cannot be clarified by the investigation of relaxation times alone: ImmE7 has lower-energy barriers enclosing the higher-level energy minimum, preventing the protein to relax and letting it move in a random-walk fashion for a longer period of time.
Hardware Implementation of a Bilateral Subtraction Filter
NASA Technical Reports Server (NTRS)
Huertas, Andres; Watson, Robert; Villalpando, Carlos; Goldberg, Steven
2009-01-01
A bilateral subtraction filter has been implemented as a hardware module in the form of a field-programmable gate array (FPGA). In general, a bilateral subtraction filter is a key subsystem of a high-quality stereoscopic machine vision system that utilizes images that are large and/or dense. Bilateral subtraction filters have been implemented in software on general-purpose computers, but the processing speeds attainable in this way, even on computers containing the fastest processors, are insufficient for real-time applications. The present FPGA bilateral subtraction filter is intended to accelerate processing to real-time speed and to be a prototype of a link in a stereoscopic-machine-vision processing chain, now under development, that would process large and/or dense images in real time and would be implemented in an FPGA. In terms that are necessarily oversimplified for the sake of brevity, a bilateral subtraction filter is a smoothing, edge-preserving filter for suppressing low-frequency noise. The filter operation amounts to replacing the value for each pixel with a weighted average of the values of that pixel and the neighboring pixels in a predefined neighborhood or window (e.g., a 9×9 window). The filter weights depend partly on pixel values and partly on the window size. The present FPGA implementation of a bilateral subtraction filter utilizes a 9×9 window. This implementation was designed to take advantage of the ability to do many of the component computations in parallel pipelines to enable processing of image data at the rate at which they are generated. The filter can be considered to be divided into the following parts (see figure): (a) an image pixel pipeline with a 9×9-pixel window generator, (b) an array of processing elements, (c) an adder tree, (d) a smoothing-and-delaying unit, and (e) a subtraction unit. After each 9×9 window is created, the affected pixel data are fed to the processing elements. Each processing element is fed the pixel value for its position in the window as well as the pixel value for the central pixel of the window. The absolute difference between these two pixel values is calculated and used as an address in a lookup table. Each processing element has a lookup table, unique for its position in the window, containing the weight coefficients for the Gaussian function for that position. The pixel value is multiplied by the weight, and the outputs of the processing element are the weight and the pixel-value/weight product. The products and weights are fed to the adder tree. The sum of the products and the sum of the weights are fed to the divider, which computes the sum of the products divided by the sum of the weights. The output of the divider is denoted the bilateral smoothed image. The smoothing function is a simple weighted average computed over a 3×3 subwindow centered in the 9×9 window. After smoothing, the image is delayed by an additional amount of time needed to match the processing time for computing the bilateral smoothed image. The bilateral smoothed image is then subtracted from the 3×3 smoothed image to produce the final output. The prototype filter as implemented in a commercially available FPGA processes one pixel per clock cycle. Operation at a clock speed of 66 MHz has been demonstrated, and results of a static timing analysis have been interpreted as suggesting that the clock speed could be increased to as much as 100 MHz.
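A brief software sketch of the same operation (bilateral smoothing over a 9×9 window with per-position Gaussian and pixel-value weights, a 3×3 box smoothing, then the subtraction), written in plain NumPy as an illustration rather than a description of the FPGA pipeline:

```python
import numpy as np

def bilateral_subtraction(img, window=9, sub=3, sigma_s=3.0, sigma_r=10.0):
    """Return (3x3 smoothed image) minus (bilateral smoothed image) for a 2D float image."""
    pad = window // 2
    padded = np.pad(img, pad, mode="edge")
    ys, xs = np.mgrid[-pad:pad + 1, -pad:pad + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))     # per-position Gaussian weights
    h, w = img.shape
    bilateral = np.zeros((h, w))
    box = np.zeros((h, w))
    s0 = (window - sub) // 2
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + window, j:j + window]
            diff = patch - img[i, j]
            wgt = spatial * np.exp(-(diff ** 2) / (2.0 * sigma_r ** 2))  # value-dependent weights
            bilateral[i, j] = (wgt * patch).sum() / wgt.sum()            # weighted 9x9 average
            box[i, j] = patch[s0:s0 + sub, s0:s0 + sub].mean()           # simple 3x3 average
    return box - bilateral
```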
Evaluation of Bias Correction Method for Satellite-Based Rainfall Data
Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter
2016-01-01
With the advances in remote sensing technology, satellite-based rainfall estimates are gaining attention in the field of hydrology, particularly in rainfall-runoff modeling. Since these estimates are affected by errors, correction is required. In this study, we tested the high-resolution National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km–30 min resolution are aggregated to a daily time step to match in-situ observations for the period 2003–2010. The study objectives are to assess the bias of the satellite estimates, to identify the optimum window size for applying bias correction, and to test the effectiveness of the bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SWs) of 3, 5, 7, 9, …, 31 days with the aim of assessing the error distribution between the in-situ observations and the CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. Accuracy of cumulative rainfall depth is assessed by the Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station-based bias factors are spatially interpolated to yield a bias factor map. The reliability of the interpolation is assessed by cross-validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to obtain bias-corrected CMORPH estimates. Findings are evaluated by RMSE, the correlation coefficient (r) and the standard deviation (SD). Results showed the existence of bias in the CMORPH rainfall. The 7-day SW approach was found to perform best for bias correction of CMORPH rainfall. The outcome of this study demonstrates the efficiency of our bias correction approach. PMID:27314363
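A minimal sketch of the window-based bias factor described here, assuming a simple multiplicative correction: for each day, the factor is the ratio of accumulated gauge rainfall to accumulated CMORPH rainfall over the preceding window (a 7-day backward window in this example). The variable names and the backward-window choice are illustrative assumptions.

```python
import numpy as np

def window_bias_factors(gauge, cmorph, window=7, min_sat=0.1):
    """Multiplicative bias factors from a backward-looking window of daily
    rainfall (illustrative sketch; the study also tests forward, central and
    sequential windows of 3-31 days)."""
    gauge = np.asarray(gauge, dtype=float)
    cmorph = np.asarray(cmorph, dtype=float)
    factors = np.ones_like(cmorph)
    for t in range(len(cmorph)):
        lo = max(0, t - window + 1)
        sat_sum = cmorph[lo:t + 1].sum()
        if sat_sum > min_sat:                 # avoid division by near-zero totals
            factors[t] = gauge[lo:t + 1].sum() / sat_sum
    return factors

# corrected = cmorph * window_bias_factors(gauge, cmorph)
```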
NASA Astrophysics Data System (ADS)
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can differ depending on the length of the observation period. If we study the whole period with a sliding window of a short period, the change of the linear regression parameters becomes a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results for weighted out-degree and betweenness centrality are mapped onto timelines, which shows how the results are distributed over time. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
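A minimal sketch of the pattern-transmission idea, under assumed discretization choices: fit a linear regression in each sliding window, map the (slope interval, significance) pair to a pattern label, and count transitions between consecutive windows as weighted directed edges. The interval boundaries and the p-value threshold are illustrative assumptions.

```python
import numpy as np
from collections import Counter
from scipy import stats

def pattern_transition_network(x, y, window=50, step=1, p_thresh=0.05):
    """Sliding-window linear regression patterns and their transition counts
    (illustrative sketch of the node/edge construction)."""
    def label(xw, yw):
        slope, _, _, p, _ = stats.linregress(xw, yw)
        sig = "sig" if p < p_thresh else "ns"
        band = "neg" if slope < -0.5 else "pos" if slope > 0.5 else "flat"  # assumed intervals
        return f"{band}|{sig}"

    labels = [label(x[i:i + window], y[i:i + window])
              for i in range(0, len(x) - window + 1, step)]
    edges = Counter(zip(labels[:-1], labels[1:]))   # directed, weighted by frequency
    return labels, edges
```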
Using the Shuttle In Situ Window and Radiator Data for Meteoroid Measurements
NASA Technical Reports Server (NTRS)
Matney, Mark
2015-01-01
Every time NASA's Space Shuttle flew in orbit, it was exposed to the natural meteoroid and artificial debris environment. NASA Johnson Space Center maintains a database of impact cratering data from 60 Shuttle missions flown since the mid-1990s that were inspected after flight. These represent a total net exposure time to the space environment of 2 years. Impact damage was recorded on the windows and radiators, and in many cases information on the impactor material was determined by later analysis of the crater residue. This information was used to segregate damage caused by natural meteoroids and artificial space debris. The windows represent a total area of 3.565 sq m, and were capable of resolving craters down to about 10 micrometers in size. The radiators represent a total area of 119.26 sq m, and saw damage from objects up to approximately 1 mm in diameter. These data were used extensively in the development of NASA's ORDEM 3.0 Orbital Debris Environment Model, and give a continuous picture of the orbital debris environment in material type and size ranging from about 10 micrometers to 1 mm. However, the meteoroid data from the Shuttles have never been fully analyzed. For the orbital debris work, special "as flown" files were created that tracked the pointing of the surface elements and their shadowing by structure (such as the ISS during docking). Unfortunately, such files for the meteoroid environment have not yet been created. This talk will introduce these unique impact data and describe how they were used for orbital debris measurements. We will then discuss some simple first-order analyses of the meteoroid data, and point the way for future analyses.
47 CFR 15.323 - Specific requirements for devices operating in the 1920-1930 MHz sub-band.
Code of Federal Regulations, 2010 CFR
2010-10-01
...] (c) Devices must incorporate a mechanism for monitoring the time and spectrum windows that its... transmission, devices must monitor the combined time and spectrum windows in which they intend to transmit for... windows without further monitoring. However, occupation of the same combined time and spectrum windows by...
Window performance and building energy use: Some technical options for increasing energy efficiency
NASA Astrophysics Data System (ADS)
Selkowitz, Stephen
1985-11-01
Window system design and operation have a major impact on energy use in buildings as well as on occupants' thermal and visual comfort. Window performance will be a function of optical and thermal properties, window management strategies, climate and orientation, and building type and occupancy. In residences, heat loss control is a primary concern, followed by sun control in more southerly climates. In commercial buildings, the daylight provided by windows may be the major energy benefit, but solar gain must be controlled so that increased cooling loads do not exceed daylighting savings. Reductions in peak electrical demand and HVAC system size may also be possible in well-designed daylighted buildings.
77 FR 12588 - Long Fence & Home, LLLP; Analysis of Proposed Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
... homeowners can realize by replacing their windows, including the home's geographic location, size, insulation... window of a specific composition in a building having a specific level of insulation in a specific region..., energy savings, energy …
Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V
2009-04-01
We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study 24 calculi, including 12 calcium oxalate monohydrate and 12 uric acid stones, that had been previously collected at our clinic were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded into human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings, including standard soft tissue windows (window width 320 and window level 50), standard bone windows (window width 1120 and window level 300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and the paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows, with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification, were significantly different from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm, p = 0.003, <0.001 and 0.0002, respectively). When comparing measurement errors between stones of different composition in vitro, the error for calcium oxalate calculi was significantly different from the gold standard for all methods except bone window settings with magnification. For uric acid calculi a measurement error was observed only in standard soft tissue window settings. In vivo, the 4.0x magnified bone window was superior to the 4.0x magnified soft tissue window in measurement accuracy. Magnified bone window measurements were not statistically different from digital caliper measurements (mean underestimation vs digital caliper 0.3 mm, p = 0.4), while magnified soft tissue windows were statistically distinct (mean underestimation 1.4 mm, p = 0.001). In this study magnified bone windows were the most accurate method of stone measurement in vitro and in vivo. Therefore, we recommend the routine use of magnified bone windows for computerized tomography measurement of stones. In vitro, the measurement error in calcium oxalate stones was greater than that in uric acid stones, suggesting that stone composition may be responsible for measurement inaccuracies.
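For readers unfamiliar with the window settings compared above, the sketch below shows the standard linear mapping from CT attenuation (Hounsfield units) to display gray levels for a given window width and level; the specific width/level pairs are those quoted in the abstract, and the 8-bit output range is an assumption.

```python
import numpy as np

def apply_ct_window(hu, width, level):
    """Map Hounsfield units to 0-255 display values for a given CT window."""
    lo, hi = level - width / 2.0, level + width / 2.0
    clipped = np.clip(hu, lo, hi)
    return np.round((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

# soft_tissue = apply_ct_window(hu_image, width=320, level=50)
# bone        = apply_ct_window(hu_image, width=1120, level=300)
```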
Large Acrylic Spherical Windows In Hyperbaric Underwater Photography
NASA Astrophysics Data System (ADS)
Lones, Joe J.; Stachiw, Jerry D.
1983-10-01
Both acrylic plastic and glass are common materials for hyperbaric optical windows. Although glass continues to be used occasionally for small windows, virtually all large viewports are made of acrylic. It is easy to understand the wide use of acrylic when comparing the design properties of this plastic with those of glass, and glass windows are relatively more difficult to fabricate and use. In addition, there are published guides for the design and fabrication of acrylic windows to be used in the hyperbaric environment of hydrospace. Although these procedures for fabricating the acrylic windows are somewhat involved, the results are extremely reliable. Acrylic viewports are now fabricated to very large sizes for manned observation or optical-quality instrumentation, as illustrated by the numerous acrylic submersible vehicle hulls for human occupancy currently in operation and a 360° large optical window recently developed for the Walt Disney Circle-Vision underwater camera housing.
Finding Frequent Closed Itemsets in Sliding Window in Linear Time
NASA Astrophysics Data System (ADS)
Chen, Junbo; Zhou, Bo; Chen, Lu; Wang, Xinyu; Ding, Yiqun
One of the most well-studied problems in data mining is computing the collection of frequent itemsets in large transactional databases. Since the introduction of the famous Apriori algorithm [14], many others have been proposed to find the frequent itemsets. Among such algorithms, the approach of mining closed itemsets has raised much interest in the data mining community. The algorithms taking this approach include TITANIC [8], CLOSET+ [6], DCI-Closed [4], FCI-Stream [3], GC-Tree [15], TGC-Tree [16], etc. Among these, FCI-Stream, GC-Tree and TGC-Tree are online algorithms that work in sliding-window environments. According to the performance evaluation in [16], GC-Tree [15] is the fastest. In this paper, an improved algorithm based on GC-Tree is proposed, whose computational complexity is proved to be a linear combination of the average transaction size and the average closed itemset size. The algorithm is based on the essential theorem presented in Sect. 4.2. Empirically, the new algorithm is several orders of magnitude faster than the state-of-the-art algorithm, GC-Tree.
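The improved GC-Tree algorithm itself is beyond a short snippet, but the sliding-window setting it operates in can be illustrated with a naive baseline: keep the window's transactions in a bounded deque and recount frequent itemsets on each slide. This brute-force sketch is only a reference point for the recomputation that closed-itemset algorithms avoid; the support threshold and window size are arbitrary.

```python
from collections import deque, Counter
from itertools import combinations

def frequent_itemsets(window, min_support):
    """Brute-force frequent-itemset counts over the transactions in a window."""
    counts = Counter()
    for txn in window:
        items = sorted(set(txn))
        for r in range(1, len(items) + 1):
            for combo in combinations(items, r):
                counts[combo] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

window = deque(maxlen=1000)          # sliding window of the latest transactions

def on_new_transaction(txn, min_support=50):
    window.append(txn)
    return frequent_itemsets(window, min_support)   # recomputed from scratch
```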
CB Database: A change blindness database for objects in natural indoor scenes.
Sareen, Preeti; Ehinger, Krista A; Wolfe, Jeremy M
2016-12-01
Change blindness has been a topic of interest in the cognitive sciences for decades. Change detection experiments are frequently used for studying various research topics such as attention and perception. However, creating change detection stimuli is tedious and there is no open repository of such stimuli using natural scenes. We introduce the Change Blindness (CB) Database with object changes in 130 colored images of natural indoor scenes. Size and eccentricity are provided for all changes, as well as reaction time data from a baseline experiment. In addition, we have two specialized satellite databases that are subsets of the 130 images. In one set, changes are seen in rooms or in mirrors in those rooms (Mirror Change Database). In the other, changes occur in a room or out a window (Window Change Database). Both sets have controlled background, change size, and eccentricity. The CB Database is intended to provide researchers with a stimulus set of natural scenes with defined stimulus parameters that can be used for a wide range of experiments. The CB Database can be found at http://search.bwh.harvard.edu/new/CBDatabase.html.
NASA Astrophysics Data System (ADS)
Turcu, I. C. E.; Ross, I. N.; Schulz, M. S.; Daido, H.; Tallents, G. J.; Krishnan, J.; Dwivedi, L.; Hening, A.
1993-06-01
The properties of a coherent x-ray point source in the water window spectral region generated using a small commercially available KrF laser system focused onto a Mylar (essentially carbon) target have been measured. By operating the source in a low-pressure (approximately 20 Torr) nitrogen environment, the degree of monochromaticity was improved due to the nitrogen acting as an x-ray filter and relatively enhancing the radiation at a wavelength of 3.37 nm (C vi 1s-2p). X-ray pinhole camera images show a minimum source size of 12 μm. A Young's double slit coherence measurement gave fringe visibilities of approximately 62% for a slit separation of 10.5 μm at a distance of 31.7 cm from the source. To demonstrate the viability of the laser plasma as a source for coherent imaging applications, a Gabor (in-line) hologram of two carbon fibers, of different sizes, was produced. The exposure time and the repetition rate were 2 min and 10 Hz, respectively.
Repliscan: a tool for classifying replication timing regions.
Zynda, Gregory J; Song, Jawon; Concia, Lorenzo; Wear, Emily E; Hanley-Bowdoin, Linda; Thompson, William F; Vaughn, Matthew W
2017-08-07
Replication timing experiments that use label incorporation and high throughput sequencing produce peaked data similar to ChIP-Seq experiments. However, the differences in experimental design, coverage density, and possible results make traditional ChIP-Seq analysis methods inappropriate for use with replication timing. To accurately detect and classify regions of replication across the genome, we present Repliscan. Repliscan robustly normalizes, automatically removes outlying and uninformative data points, and classifies Repli-seq signals into discrete combinations of replication signatures. The quality control steps and self-fitting methods make Repliscan generally applicable and more robust than previous methods that classify regions based on thresholds. Repliscan is simple and effective to use on organisms with different genome sizes. Even with analysis window sizes as small as 1 kilobase, reliable profiles can be generated with as little as 2.4x coverage.
JPARSS: A Java Parallel Network Package for Grid Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jie; Akers, Walter; Chen, Ying
2002-03-01
The emergence of high-speed wide area networks makes grid computing a reality. However, grid applications that need reliable data transfer still have difficulty achieving optimal TCP performance, because the TCP window size must be tuned to improve bandwidth and reduce latency on a high-speed wide area network. This paper presents a Java package called JPARSS (Java Parallel Secure Stream (Socket)) that divides data into partitions that are sent over several parallel Java streams simultaneously and allows Java or Web applications to achieve optimal TCP performance in a grid environment without the necessity of tuning the TCP window size. This package enables single sign-on, certificate delegation and secure or plain-text data transfer using several security components based on X.509 certificates and SSL. Several experiments will be presented to show that using Java parallel streams is more effective than tuning the TCP window size. In addition, a simple architecture using Web services …
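A minimal sketch of the parallel-stream idea (not JPARSS itself, which is a Java package with X.509/SSL support): split a buffer into partitions and send each over its own TCP connection concurrently, so aggregate throughput is less sensitive to any single connection's window. The host, port, stream count, and framing are placeholders.

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def send_parallel(data: bytes, host: str, port: int, n_streams: int = 4):
    """Send one buffer over several TCP connections concurrently (sketch)."""
    chunk = (len(data) + n_streams - 1) // n_streams
    parts = [data[i * chunk:(i + 1) * chunk] for i in range(n_streams)]

    def send_part(idx_part):
        idx, part = idx_part
        with socket.create_connection((host, port)) as s:
            # Simple framing: partition index and length, then the payload;
            # the receiver reassembles partitions by index.
            s.sendall(idx.to_bytes(4, "big") + len(part).to_bytes(8, "big"))
            s.sendall(part)

    with ThreadPoolExecutor(max_workers=n_streams) as pool:
        list(pool.map(send_part, enumerate(parts)))
```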
Conformal ALON® and spinel windows
NASA Astrophysics Data System (ADS)
Goldman, Lee M.; Smith, Mark; Ramisetty, Mohan; Jha, Santosh; Sastri, Suri
2017-05-01
The requirements for modern aircraft-based reconnaissance systems are driving the need for conformal windows for future sensor systems. However, limitations on optical systems and on the ability to produce windows in complex geometries currently limit the geometry of existing windows and window assemblies to faceted assemblies of flat windows. ALON consists primarily of aluminum and oxygen, similar to alumina, with a small amount of nitrogen added to help stabilize the cubic gamma-AlON phase. ALON's chemical similarity to alumina translates into a robust manufacturing process. This ease of processing has allowed Surmet to produce ALON windows and domes in a wide variety of geometries and sizes. Spinel (MgAl2O4) contains equal molar amounts of MgO and Al2O3, and is a cubic material that transmits further into the infrared than ALON. Spinel is produced via powder processing techniques similar to those used to produce ALON. Surmet is now applying the lessons learned with ALON to produce conformal spinel windows and domes as well.
Martin, C E; Brandmeyer, E A; Ross, R D
2013-01-01
Leaf temperatures were lower when light entry at the leaf tip window was prevented through covering the window with reflective tape, relative to leaf temperatures of plants with leaf tip windows covered with transparent tape. This was true when leaf temperatures were measured with an infrared thermometer, but not with a fine-wire thermocouple. Leaf tip windows of Lithops growing in high-rainfall regions of southern Africa were larger than the windows of plants (numerous individuals of 17 species) growing in areas with less rainfall and, thus, more annual insolation. The results of this study indicate that leaf tip windows of desert plants with an underground growth habit can allow entry of supra-optimal levels of radiant energy, thus most likely inhibiting photosynthetic activity. Consequently, the size of the leaf tip windows correlates inversely with habitat solar irradiance, minimising the probability of photoinhibition, while maximising the absorption of irradiance in cloudy, high-rainfall regions. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.
A new powder production route for transparent spinel windows: powder synthesis and window properties
NASA Astrophysics Data System (ADS)
Cook, Ronald; Kochis, Michael; Reimanis, Ivar; Kleebe, Hans-Joachim
2005-05-01
Spinel powders for the production of transparent polycrystalline ceramic windows have been produced using a number of traditional ceramic and sol-gel methods. We have demonstrated that magnesium aluminate spinel powders produced from the reaction of organo-magnesium compounds with surface modified boehmite precursors can be used to produce high quality transparent spinel parts. The new powder production method allows fine control over the starting particle size, size distribution, purity and stoichiometry. The new process involves formation of a boehmite sol-gel from the hydrolysis of aluminum alkoxides followed by surface modification of the boehmite nanoparticles using carboxylic acids. The resulting surface modified boehmite nanoparticles can then be metal exchanged at room temperature with magnesium acetylacetonate to make a precursor powder that is readily transformed into pure phase spinel.
Wachulak, Przemyslaw; Torrisi, Alfio; Nawaz, Muhammad F; Bartnik, Andrzej; Adjei, Daniel; Vondrová, Šárka; Turňová, Jana; Jančarek, Alexandr; Limpouch, Jiří; Vrbová, Miroslava; Fiedorowicz, Henryk
2015-10-01
A short illumination wavelength allows an extension of the diffraction limit toward the nanometer scale, thus improving spatial resolution in optical systems. Soft X-ray (SXR) radiation from the "water window" spectral range (λ = 2.3-4.4 nm), which is particularly suitable for biological imaging due to natural optical contrast, provides better spatial resolution than that obtained with visible-light microscopes. The high contrast in the "water window" is obtained because of selective radiation absorption by carbon and water, which are constituents of biological samples. The development of SXR microscopes permits the visualization of features on the nanometer scale, but often with a tradeoff between the exposure time and the size and complexity of the microscope. Thus, herein, we present a desk-top system that overcomes the aforementioned limitations and is capable of resolving 60 nm features with a very short exposure time. Even though the system is in its initial stage of development, we present different applications of the system for biology and nanotechnology. The construction of the microscope, along with recently acquired images of various samples, will be presented and discussed. Such a high-resolution imaging system represents an interesting solution for biomedical, material science, and nanotechnology applications.
Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu
2015-01-01
Background An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to understand these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. Methods The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Results Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals. Research Limitations Our study is based on ophthalmology journals and we only analyze the impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157
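A generalized impact factor with an n-year citation window can be written as the citations received in year Y to items published in the previous n years, divided by the number of citable items published in those years. The sketch below computes 2IF through 6IF from per-year publication and citation counts; the data structure is an assumption for illustration.

```python
def n_year_impact_factor(pubs_by_year, cites_in_year, year, n):
    """nIF for `year`: citations in `year` to items from the previous n years,
    divided by the number of citable items published in those years.

    pubs_by_year:  {pub_year: number of citable items}
    cites_in_year: {pub_year: citations received in `year` to that cohort}
    """
    window = range(year - n, year)
    items = sum(pubs_by_year.get(y, 0) for y in window)
    cites = sum(cites_in_year.get(y, 0) for y in window)
    return cites / items if items else float("nan")

# Example: 2IF..6IF for 2013 from assumed per-year counts
# for n in range(2, 7):
#     print(n, n_year_impact_factor(pubs, cites_2013, 2013, n))
```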
Lottman, Kristin K; Kraguljac, Nina V; White, David M; Morgan, Charity J; Calhoun, Vince D; Butt, Allison; Lahti, Adrienne C
2017-01-01
Resting-state functional connectivity studies in schizophrenia evaluating average connectivity over the entire experiment have reported aberrant network integration, but findings are variable. Examining time-varying (dynamic) functional connectivity may help explain some inconsistencies. We assessed dynamic network connectivity using resting-state functional MRI in patients with schizophrenia while unmedicated (n = 34), after 1 week (n = 29) and 6 weeks (n = 24) of treatment with risperidone, as well as in matched controls at baseline (n = 35) and after 6 weeks (n = 19). After identifying 41 independent components (ICs) comprising resting-state networks, sliding window analysis was performed on IC timecourses using an optimal window size validated with linear support vector machines. Windowed correlation matrices were then clustered into three discrete connectivity states (a relatively sparsely connected state, a relatively abundantly connected state, and an intermediately connected state). In unmedicated patients, static connectivity was increased between five pairs of ICs and decreased between two pairs of ICs when compared to controls, while dynamic connectivity showed increased connectivity between the thalamus and the somatomotor network in one of the three states. State statistics indicated that, in comparison to controls, unmedicated patients had shorter mean dwell times and a smaller fraction of time spent in the sparsely connected state, and longer dwell times and a larger fraction of time spent in the intermediately connected state. Risperidone appeared to normalize mean dwell times after 6 weeks, but not the fraction of time. Results suggest that static connectivity abnormalities in schizophrenia may partly be related to altered brain network temporal dynamics rather than consistent dysconnectivity within and between functional networks, and demonstrate the importance of implementing complementary data analysis techniques.
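The dynamic-connectivity pipeline described here (sliding-window correlations on component time courses, then clustering the windowed matrices into discrete states) can be sketched as follows; the window length, step, and number of states are illustrative, and the study's own window size was selected with support vector machine validation.

```python
import numpy as np
from sklearn.cluster import KMeans

def dynamic_connectivity_states(timecourses, win=30, step=1, n_states=3):
    """timecourses: array (T, n_components) of IC time courses.
    Returns per-window state labels and the state centroid matrices."""
    T, n = timecourses.shape
    mats = []
    for start in range(0, T - win + 1, step):
        window = timecourses[start:start + win]
        corr = np.corrcoef(window, rowvar=False)        # n x n correlation matrix
        mats.append(corr[np.triu_indices(n, k=1)])      # vectorize upper triangle
    mats = np.array(mats)
    km = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(mats)
    return km.labels_, km.cluster_centers_

# Dwell times and fraction of time per state follow from run lengths of `labels_`.
```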
Pereira, Telma; Lemos, Luís; Cardoso, Sandra; Silva, Dina; Rodrigues, Ana; Santana, Isabel; de Mendonça, Alexandre; Guerreiro, Manuela; Madeira, Sara C
2017-07-19
Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process. This hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stages. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows, thus fine-tuning the prognosis with respect to the time to conversion. In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4 and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, where all patients are used to learn the models, referred to as the First Last approach. This enables moving from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia within a specific time window?". The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Prognostic models using time windows have higher performance when predicting progression from MCI to dementia, compared to the prognostic approach commonly used in the literature. Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than at some point in the future and allowing clinicians to adjust treatments and clinical appointments in a timely manner.
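A minimal sketch of the Time Windows stratification: for each k-year window, label converters as those who progressed within k years of baseline and stable patients as those still MCI with at least k years of follow-up, then train one classifier per window. The field names and the choice of classifier are assumptions, not the study's actual implementation.

```python
from sklearn.ensemble import RandomForestClassifier

def build_window_dataset(patients, k_years):
    """patients: iterable of dicts with 'features', 'converted' (bool) and
    'years_to_event_or_last_visit' (float). Returns X, y for a k-year window."""
    X, y = [], []
    for p in patients:
        t = p["years_to_event_or_last_visit"]
        if p["converted"] and t <= k_years:
            X.append(p["features"]); y.append(1)      # converter within the window
        elif not p["converted"] and t >= k_years:
            X.append(p["features"]); y.append(0)      # stable with enough follow-up
        # other patients are ambiguous for this window and are excluded
    return X, y

# One prognostic model per time window (the `patients` cohort is assumed given):
# models = {k: RandomForestClassifier(random_state=0).fit(*build_window_dataset(patients, k))
#           for k in (2, 3, 4, 5)}
```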
Theoretical vibro-acoustic modeling of acoustic noise transmission through aircraft windows
NASA Astrophysics Data System (ADS)
Aloufi, Badr; Behdinan, Kamran; Zu, Jean
2016-06-01
In this paper, a fully vibro-acoustic model for sound transmission across a multi-pane aircraft window is developed. The proposed model is efficiently applied to a set of window models to perform extensive theoretical parametric studies. The studied window configurations generally simulate the passenger window designs of modern aircraft classes, which have an exterior multi-Plexiglas pane, an interior single acrylic glass pane and a dimmable glass ("smart" glass), all separated by thin air cavities. The sound transmission loss (STL) characteristics of three different models, triple-, quadruple- and quintuple-paned windows identical in size and surface density, are analyzed with the aim of improving acoustic insulation performance. Typical results describing the influence of several system parameters, such as the thicknesses, number and spacing of the window panes, on the transmission loss are then investigated. In addition, a comparison study is carried out to evaluate the acoustic reduction capability of each window model. The STL results show that the higher-frequency sound transmission loss performance can be improved by increasing the number of window panes; however, the low-frequency performance is decreased, particularly at the mass-spring resonances.
Design considerations for case series models with exposure onset measurement error.
Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V
2013-02-28
The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Haskins, Justin B.; Bauschlicher, Charles W.; Lawson, John W.
2015-01-01
Zero-temperature density functional theory (DFT), density functional theory molecular dynamics (DFT-MD), and classical molecular dynamics using polarizable force fields (PFF-MD) are employed to evaluate the influence of the lithium ion on the structure, transport, and electrochemical stability of three potential ionic liquid electrolytes: N-methyl-N-butylpyrrolidinium bis(trifluoromethanesulfonyl)imide ([pyr14][TFSI]), N-methyl-N-propylpyrrolidinium bis(fluorosulfonyl)imide ([pyr13][FSI]), and 1-ethyl-3-methylimidazolium tetrafluoroborate ([EMIM][BF4]). We characterize the lithium ion solvation shell through zero-temperature DFT simulations of [Li(Anion)_n]^(n-1)- clusters, DFT-MD simulations of isolated lithium ions in small ionic liquid systems, and PFF-MD simulations with high Li-doping levels in large ionic liquid systems. At low levels of Li-salt doping, highly stable solvation shells having 2-3 anions are seen in both [pyr14][TFSI] and [pyr13][FSI], while solvation shells with 4 anions dominate in [EMIM][BF4]. At higher levels of doping, we find the formation of complex Li-network structures that increase the frequency of 4-anion-coordinated solvation shells. A comparison of computational and experimental Raman spectra for a wide range of [Li(Anion)_n]^(n-1)- clusters shows that our proposed structures are consistent with experiment. We estimate the ion diffusion coefficients and quantify both size and simulation time effects. We find that the estimates of lithium ion diffusion are of a reasonable order of magnitude and can be corrected for simulation time effects. Simulation size, on the other hand, is also important, with diffusion coefficients from long PFF-MD simulations of small cells having 20-40% error compared to large-cell values. Finally, we compute the electrochemical window using differences in electronic energy levels of both isolated cation/anion pairs and small ionic liquid systems with Li-salt doping. The single-pair and liquid-phase systems provide similar estimates of the electrochemical window, while Li-doping in the liquid-phase systems results in electrochemical windows little changed from those of the neat systems. Pure and hybrid functionals systematically provide an upper and lower bound, respectively, to the experimental electrochemical window for the systems studied here.
The sonic window: second generation results
NASA Astrophysics Data System (ADS)
Walker, William F.; Fuller, Michael I.; Brush, Edward V.; Eames, Matthew D. C.; Owen, Kevin; Ranganathan, Karthik; Blalock, Travis N.; Hossack, John A.
2006-03-01
Medical Ultrasound Imaging is widely used clinically because of its relatively low cost, portability, lack of ionizing radiation, and real-time nature. However, even with these advantages ultrasound has failed to permeate the broad array of clinical applications where its use could be of value. A prime example of this untapped potential is the routine use of ultrasound to guide intravenous access. In this particular application existing systems lack the required portability, low cost, and ease-of-use required for widespread acceptance. Our team has been working for a number of years to develop an extremely low-cost, pocket-sized, and intuitive ultrasound imaging system that we refer to as the "Sonic Window." We have previously described the first generation Sonic Window prototype that was a bench-top device using a 1024 element, fully populated array operating at a center frequency of 3.3 MHz. Through a high degree of custom front-end integration combined with multiplexing down to a 2 channel PC based digitizer this system acquired a full set of RF data over a course of 512 transmit events. While initial results were encouraging, this system exhibited limitations resulting from low SNR, relatively coarse array sampling, and relatively slow data acquisition. We have recently begun assembling a second-generation Sonic Window system. This system uses a 3600 element fully sampled array operating at 5.0 MHz with a 300 micron element pitch. This system extends the integration of the first generation system to include front-end protection, pre-amplification, a programmable bandpass filter, four sample and holds, and four A/D converters for all 3600 channels in a set of custom integrated circuits with a combined area smaller than the 1.8 x 1.8 cm footprint of the transducer array. We present initial results from this front-end and present benchmark results from a software beamformer implemented on the Analog Devices BF-561 DSP. We discuss our immediate plans for further integration and testing. This second prototype represents a major reduction in size and forms the foundation of a fully functional, fully integrated, pocket sized prototype.
Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow
Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L.
2014-01-01
Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing function. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall and the response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and the difficulty of accessing the inner ear. Here we report on new methods for studying CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild-type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel-window can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. PMID:24780131
Multi-alternative decision-making with non-stationary inputs.
Nunes, Luana F; Gurney, Kevin
2016-08-01
One of the most widely implemented models for multi-alternative decision-making is the multihypothesis sequential probability ratio test (MSPRT). It is asymptotically optimal, straightforward to implement, and has found application in modelling biological decision-making. However, the MSPRT is limited in application to discrete ('trial-based'), non-time-varying scenarios. By contrast, real-world situations will be continuous and entail stimulus non-stationarity. In these circumstances, decision-making mechanisms (like the MSPRT) that work by accumulating evidence must be able to discard outdated evidence, which becomes progressively irrelevant. To address this issue, we introduce a new decision mechanism by augmenting the MSPRT with a rectangular integration window and a transparent decision boundary. This allows selection and de-selection of options as their evidence changes dynamically. Performance was enhanced by adapting the window size to problem difficulty. Further, we present an alternative windowing method that exponentially decays evidence and does not significantly degrade performance, while greatly reducing the memory resources necessary.
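A rough sketch of the two windowing variants described above, applied to a log-likelihood accumulator: a rectangular window sums only the last W evidence samples, while exponential decay discounts older evidence with a forgetting factor. The commit rule shown (softmax "posterior" of the leading option exceeding a threshold) is a simplified MSPRT-style stand-in for the paper's transparent decision boundary, and all parameter values are assumptions.

```python
import numpy as np
from collections import deque

class WindowedAccumulator:
    """Evidence accumulation with either a rectangular window of the last
    `window` samples or exponential decay with forgetting factor `decay`
    (simplified stand-in for the windowed MSPRT described above)."""
    def __init__(self, n_options, window=None, decay=0.99, threshold=0.95):
        self.decay, self.threshold = decay, threshold
        self.buffer = deque(maxlen=window) if window else None
        self.totals = np.zeros(n_options)

    def step(self, log_likelihoods):
        x = np.asarray(log_likelihoods, dtype=float)
        if self.buffer is not None:                  # rectangular window
            self.buffer.append(x)
            self.totals = np.sum(self.buffer, axis=0)
        else:                                        # exponential forgetting
            self.totals = self.decay * self.totals + x
        # Commit when the leading option's softmax "posterior" over the
        # windowed evidence clears the threshold; otherwise keep sampling.
        post = np.exp(self.totals - self.totals.max())
        post /= post.sum()
        leader = int(np.argmax(post))
        return leader if post[leader] > self.threshold else None
```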
Evaluation of beam tracking strategies for the THOR-CSW solar wind instrument
NASA Astrophysics Data System (ADS)
De Keyser, Johan; Lavraud, Benoit; Prech, Lubomir; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent
2017-04-01
We compare different beam tracking strategies for the Cold Solar Wind (CSW) plasma spectrometer on the ESA M4 THOR mission candidate. The goal is to intelligently select the energy and angular windows the instrument is sampling and to adapt these windows as the solar wind properties evolve, with the aim of maximizing the velocity distribution acquisition rate while maintaining excellent energy and angular resolution. Using synthetic data constructed from high-cadence measurements by the Faraday cup instrument on the Spektr-R mission (30 ms resolution), we test the performance of energy beam tracking with or without angular beam tracking. The algorithm can be fed either by data acquired by the plasma spectrometer during the previous measurement cycle or by data from another instrument, in this case the Faraday Cup (FAR) instrument foreseen on THOR. We verify how these beam tracking algorithms behave for different sizes of the energy and angular windows, and for different data integration times, in order to assess the limitations of the algorithm and to avoid situations in which the algorithm loses track of the beam.
Dynamics of retinal photocoagulation and rupture
NASA Astrophysics Data System (ADS)
Sramek, Christopher; Paulus, Yannis; Nomoto, Hiroyuki; Huie, Phil; Brown, Jefferson; Palanker, Daniel
2009-05-01
In laser retinal photocoagulation, short (<20 ms) pulses have been found to reduce thermal damage to the inner retina, decrease treatment time, and minimize pain. However, the safe therapeutic window (defined as the ratio of power for producing a rupture to that of mild coagulation) decreases with shorter exposures. To quantify the extent of retinal heating and maximize the therapeutic window, a computational model of millisecond retinal photocoagulation and rupture was developed. Optical attenuation of 532-nm laser light in ocular tissues was measured, including retinal pigment epithelial (RPE) pigmentation and cell-size variability. Threshold powers for vaporization and RPE damage were measured with pulse durations ranging from 1 to 200 ms. A finite element model of retinal heating inferred that vaporization (rupture) takes place at 180-190°C. RPE damage was accurately described by the Arrhenius model with activation energy of 340 kJ/mol. Computed photocoagulation lesion width increased logarithmically with pulse duration, in agreement with histological findings. The model will allow for the optimization of beam parameters to increase the width of the therapeutic window for short exposures.
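The Arrhenius description of RPE damage quoted above can be written as a damage integral Omega = integral of A*exp(-Ea/(R*T(t))) dt, with Omega = 1 conventionally taken as the damage threshold. The sketch below evaluates this integral for a temperature history; Ea = 340 kJ/mol is the value reported in the abstract, while the frequency factor A is an illustrative assumption since it is not given there.

```python
import numpy as np

R = 8.314          # J/(mol K), gas constant
EA = 340e3         # J/mol, activation energy reported for RPE damage
A_FREQ = 1.6e55    # 1/s, frequency factor -- illustrative assumption only

def arrhenius_damage(temps_celsius, dt):
    """Damage integral for a temperature history sampled every dt seconds.
    Omega >= 1 is conventionally the threshold for thermal coagulation."""
    T = np.asarray(temps_celsius) + 273.15
    return np.sum(A_FREQ * np.exp(-EA / (R * T))) * dt

# Example: a 20 ms exposure holding the RPE near 65 C
# omega = arrhenius_damage(np.full(200, 65.0), dt=1e-4)
```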
Laboratory-size three-dimensional water-window x-ray microscope with Wolter type I mirror optics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohsuka, Shinji; The Graduate School for the Creation of New Photonics Industries, 1955-1 Kurematsu-cho, Nishi-ku, Hamamatsu-City, 431-1202; Ohba, Akira
2016-01-28
We constructed a laboratory-size three-dimensional water-window x-ray microscope that combines wide-field transmission x-ray microscopy with tomographic reconstruction techniques. It consists of an electron-impact x-ray source emitting oxygen Kα x-rays, Wolter type I grazing incidence mirror optics, and a back-illuminated CCD for x-ray imaging. A spatial resolution limit better than 1.0 line pairs per micrometer was obtained for two-dimensional transmission images, and 1-μm-scale three-dimensional fine structures were resolved.
2011-04-01
…this limitation, the length of the windows needs to be shortened, which also leads to a narrower confidence interval (see Figure 2.9). … at least one event will occur within the window. The windows are then grouped in sets of two and the process is repeated for a window size twice as big …
Documentation of debris impact damage to flight deck window
1995-07-26
STS070-309-026 (13-22 JULY 1995) --- A close-up view of the space shuttle Discovery's window number 6, on the forward starboard side, nearest the pilot's station. A small impact in the window, about 1/16 inch in size, is clearly seen in the corner. Crew members told an August 11, 1995, gathering of Johnson Space Center (JSC) employees that a small piece of debris apparently struck the window while Discovery was in its wing velocity vector mode. It was noticed when the astronauts awoke from their sleep period. Though watched closely during the remainder of the mission, the impact never caused a major concern.
Carey, David L; Blanch, Peter; Ong, Kok-Leong; Crossley, Kay M; Crow, Justin; Morris, Meg E
2017-08-01
(1) To investigate whether a daily acute:chronic workload ratio informs injury risk in Australian football players; (2) to identify which combination of workload variable, acute and chronic time window best explains injury likelihood. Workload and injury data were collected from 53 athletes over 2 seasons in a professional Australian football club. Acute:chronic workload ratios were calculated daily for each athlete, and modelled against non-contact injury likelihood using a quadratic relationship. 6 workload variables, 8 acute time windows (2-9 days) and 7 chronic time windows (14-35 days) were considered (336 combinations). Each parameter combination was compared for injury likelihood fit (using R2). The ratio of moderate speed running workload (18-24 km/h) in the previous 3 days (acute time window) compared with the previous 21 days (chronic time window) best explained the injury likelihood in matches (R2=0.79) and in the immediate 2 or 5 days following matches (R2=0.76-0.82). The 3:21 acute:chronic workload ratio discriminated between high-risk and low-risk athletes (relative risk=1.98-2.43). Using the previous 6 days to calculate the acute workload time window yielded similar results. The choice of acute time window significantly influenced model performance and appeared to reflect the competition and training schedule. Daily workload ratios can inform injury risk in Australian football. Clinicians and conditioning coaches should consider the sport-specific schedule of competition and training when choosing acute and chronic time windows. For Australian football, the ratio of moderate speed running in a 3-day or 6-day acute time window and a 21-day chronic time window best explained injury risk.
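A minimal sketch of the daily acute:chronic workload ratio used here: the rolling average of one workload variable over a short acute window divided by the same quantity over a longer chronic window, evaluated each day. The rolling-average formulation and the pandas implementation are assumptions; the study's best-performing combination was a 3-day (or 6-day) acute and 21-day chronic window of moderate-speed running.

```python
import pandas as pd

def acute_chronic_ratio(daily_load: pd.Series, acute=3, chronic=21):
    """Daily acute:chronic workload ratio from a date-indexed series of one
    workload variable (e.g., metres run at 18-24 km/h per day)."""
    acute_avg = daily_load.rolling(window=acute, min_periods=acute).mean()
    chronic_avg = daily_load.rolling(window=chronic, min_periods=chronic).mean()
    return acute_avg / chronic_avg

# ratio = acute_chronic_ratio(moderate_speed_running_metres, acute=3, chronic=21)
```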
Time-marching multi-grid seismic tomography
NASA Astrophysics Data System (ADS)
Tong, P.; Yang, D.; Liu, Q.
2016-12-01
From the classic ray-based traveltime tomography to the state-of-the-art full waveform inversion, because of the nonlinearity of seismic inverse problems, a good starting model is essential for preventing the convergence of the objective function toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows has been explored to build the starting models of later time windows; (2) seismic data of later time windows could provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multi-grid in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
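The time-marching multi-grid workflow reads naturally as two nested loops: an outer loop over time windows of data, each starting from the previous window's final model, and an inner loop over mesh refinements from coarse to fine. The sketch below shows only this control flow; `invert_on_mesh` is a placeholder for the eikonal-based traveltime inversion and is not part of tomoQuake.

```python
def time_marching_multigrid(data_windows, initial_model, mesh_levels, invert_on_mesh):
    """Control-flow sketch: march over time windows, refining the model on
    progressively finer meshes within each window.

    data_windows: list of data sets, one per time window (chronological order)
    mesh_levels:  coarse-to-fine mesh descriptions
    invert_on_mesh(model, data, mesh): placeholder for one tomographic update
    """
    model = initial_model
    for window_data in data_windows:          # temporal time-marching
        for mesh in mesh_levels:              # spatial multi-grid, coarse -> fine
            model = invert_on_mesh(model, window_data, mesh)
        # the final model of this window seeds the next window's inversion
    return model
```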
Space station proximity operations windows: Human factors design guidelines
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1987-01-01
Proximity operations refers to all activities outside the Space Station which take place within a 1-km radius. Since there will be a large number of different operations involving manned and unmanned vehicles, single- and multiperson crews, automated and manually controlled flight, a wide variety of cargo, and construction/repair activities, accurate and continuous human monitoring of these operations from a specially designed control station on Space Station will be required. Total situational awareness will be essential. This paper presents numerous human factors design guidelines and related background information for control windows which will support proximity operations. Separate sections deal with natural and artificial illumination geometry; all basic rendezvous vector approaches; window field-of-view requirements; window size, shape, and placement criteria; window optical characteristics as they relate to human perception; maintenance and protection issues; and a comprehensive review of windows installed on U.S. and U.S.S.R. manned vehicles.
Universal entrainment mechanism controls contact times with motile cells
NASA Astrophysics Data System (ADS)
Mathijssen, Arnold J. T. M.; Jeanneret, Raphaël; Polin, Marco
2018-03-01
Contact between particles and motile cells underpins a wide variety of biological processes, from nutrient capture and ligand binding to grazing, viral infection, and cell-cell communication. The window of opportunity for these interactions depends on the basic mechanism determining contact time, which is currently unknown. By combining experiments on three different species—Chlamydomonas reinhardtii, Tetraselmis subcordiformis, and Oxyrrhis marina—with simulations and analytical modeling, we show that the fundamental physical process regulating proximity to a swimming microorganism is hydrodynamic particle entrainment. The resulting distribution of contact times is derived within the framework of Taylor dispersion as a competition between advection by the cell surface and microparticle diffusion, and predicts the existence of an optimal tracer size that is also observed experimentally. Spatial organization of flagella, swimming speed, and swimmer and tracer size influence entrainment features and provide tradeoffs that may be tuned to optimize the estimated probabilities for microbial interactions like predation and infection.
Prediction of CpG-island function: CpG clustering vs. sliding-window methods
2010-01-01
Background Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammal genomes. Conventionally, these regions are detected by sliding window approaches using %G + C, CpG observed/expected ratio and length thresholds as main parameters. Recently, clustering methods directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, at the same time showing less overlap with Alu retrotransposons. The major difference in the prediction was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within the promoter region. Finally, we show that window-based islands can spuriously overlap several, differentially regulated promoters as well as different methylation domains, which might indicate a wrong merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific when concerning the overlap with alternative transcription start sites or the detection of homogenous methylation domains. Conclusions The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional CpG islands. PMID:20500903
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chitsazzadeh, S; Wells, D; Mestrovic, A
2016-06-15
Purpose: To develop a QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: An interface was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside an ArcCheck diode array. The Quasar phantom controller used a patient specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. An amplitude-based RPM tracking system was specified to turn the beam on during the exhale phase of the breathing pattern. SABR plans were developed using Eclipse for liver PTVs ranging in size from 3-12 cm in diameter using a 2-arc VMAT technique. Dose was measured in the middle of the penumbra region, where the high dose gradient allowed for sensitive detection of any inaccuracies in gated dose delivery. The overall fidelity of the dose distribution was confirmed using ArcCheck. The sensitivity of the gating QA procedure was investigated with respect to the following four parameters: PTV size, duration of exhale, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns will be required to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
Rosowski, John J; Bowers, Peter; Nakajima, Hideko H
2018-03-01
While most models of cochlear function assume the presence of only two windows into the mammalian cochlea (the oval and round windows), a position that is generally supported by several lines of data, there is evidence for additional sound paths into and out of the inner ear in normal mammals. In this report we review the existing evidence for and against the 'two-window' hypothesis. We then determine how existing data and inner-ear anatomy restrict transmission of sound through these additional sound pathways in cat by utilizing a well-tested model of the cat inner ear, together with anatomical descriptions of the cat cochlear and vestibular aqueducts (potential additional windows to the cochlea). We conclude: (1) The existing data place limits on the size of the cochlear and vestibular aqueducts in cat and are consistent with small volume-velocities through these ducts during ossicular stimulation of the cochlea, (2) the predicted volume velocities produced by aqueducts with diameters half the size of the bony diameters match the functional data within ±10 dB, and (3) these additional volume velocity paths contribute to the inner ear's response to non-acoustic stimulation and conductive pathology. Copyright © 2017 Elsevier B.V. All rights reserved.
Artist concept of SIM PlanetQuest Artist Concept
2002-12-21
Artist's concept of the current mission configuration. SIM PlanetQuest (formerly called Space Interferometry Mission), currently under development, will determine the positions and distances of stars several hundred times more accurately than any previous program. This accuracy will allow SIM to determine the distances to stars throughout the galaxy and to probe nearby stars for Earth-sized planets. SIM will open a window to a new world of discoveries. http://photojournal.jpl.nasa.gov/catalog/PIA04248
CloudMC: a cloud computing application for Monte Carlo simulation.
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-04-21
This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based: the simulations just need to be of the form input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37 ×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
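The reported runtime figures can be sanity-checked against Amdahl's law. In the sketch below, the non-parallelizable fraction is inferred from the quoted 37× speedup on 64 instances; it is a back-of-the-envelope estimate, not a number given in the paper.

```python
# Quick check of the reported CloudMC figures against Amdahl's law.
serial_hours = 30.0          # CPU time on a single instance
parallel_minutes = 48.6      # wall time on 64 instances
n = 64

speedup = serial_hours * 60.0 / parallel_minutes
# Amdahl's law: speedup = 1 / (f + (1 - f)/n)  ->  solve for the serial fraction f
f = (n / speedup - 1.0) / (n - 1.0)

print(f"observed speedup: {speedup:.1f}x")            # ~37x, as reported
print(f"implied non-parallelizable fraction: {f:.3f}")
```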
A refined technique to calculate finite helical axes from rigid body trackers.
McLachlin, Stewart D; Ferreira, Louis M; Dunning, Cynthia E
2014-12-01
Finite helical axes (FHAs) are a potentially effective tool for joint kinematic analysis. Unfortunately, no straightforward guidelines exist for calculating accurate FHAs using prepackaged six degree-of-freedom (6 DOF) rigid body trackers. Thus, this study aimed to: (1) describe a protocol for calculating FHA parameters from 6 DOF rigid body trackers using the screw matrix and (2) maximize the number of accurate FHAs generated from a given data set using a moving window analysis. Four Optotrak® Smart Markers were used as the rigid body trackers, two moving and two fixed, at different distances from the hinge joint of a custom-machined jig. 6 DOF pose information was generated from 51 static positions of the jig rotated and fixed in 0.5 deg increments up to 25 deg. Output metrics included the FHA direction cosines, the rotation about the FHA, the translation along the axis, and the intercept of the FHA with the plane normal to the jig's hinge joint. FHA metrics were calculated using the relative tracker rotation from the starting position, and using a moving window analysis to define a minimum acceptable rotational displacement between the moving tracker data points. Data analysis found all FHA rotations calculated from the starting position were within 0.15 deg of the prescribed jig rotation. FHA intercepts were most stable when determined using trackers closest to the hinge axis. Increasing the moving window size improved the FHA direction cosines and center of rotation accuracy. Window sizes larger than 2 deg had an intercept deviation of less than 1 mm. Furthermore, compared to the 0 deg window size, the 2 deg window had a 90% improvement in FHA intercept precision while generating almost an equivalent number of FHA axes. This work identified a solution to improve FHA calculations for biomechanical researchers looking to describe changes in 3D joint motion.
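A minimal sketch of extracting finite helical axis parameters from a relative rigid-body transform between two tracker poses is given below. It follows the standard screw decomposition rather than the authors' exact implementation, and the 4×4 transform it expects would come from the 6 DOF tracker data.

```python
import numpy as np

def finite_helical_axis(T_rel, eps=1e-9):
    """Standard screw decomposition of a relative transform T_rel (4x4): returns the
    FHA direction, rotation about it (rad), translation along it, and a point on the
    axis. A sketch only, not the authors' implementation."""
    R, t = T_rel[:3, :3], T_rel[:3, 3]
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle < eps:                       # near-pure translation: axis is ill-defined
        return None
    # Rotation axis from the skew-symmetric part of R
    n = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = n / (2.0 * np.sin(angle))
    d = float(n @ t)                      # translation along the axis
    t_perp = t - d * n                    # translation perpendicular to the axis
    # A point on the axis (screw geometry): solves (I - R) p = t_perp
    p = 0.5 * (t_perp + np.cross(n, t_perp) / np.tan(angle / 2.0))
    return n, angle, d, p
```

Consistent with the moving-window finding above, such a decomposition is usually applied only once the relative rotation exceeds a small minimum displacement (on the order of 2 deg), since the axis direction and intercept become ill-conditioned for near-zero rotations.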
NASA Astrophysics Data System (ADS)
Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao
2017-04-01
Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined as important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machines and maximum likelihood classifiers were applied on each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training samples sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage for image selection for mapping S. alterniflora using GF-1 WFV imagery. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.
Colonius, Hans; Diederich, Adele
2011-07-01
The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time in order to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule that is based on the prior probability of a common source, the likelihood of temporal disparities between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task where subjects are asked to respond to signals from a target modality only. Evoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing the efficiency with which multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither of them deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, they reflected the magnitude of the time-window widths predicted by our decision-theoretic framework, i.e., a larger time window is associated with a higher prior probability.
Bloomfield, Rachel C; Gillespie, Graeme R; Kerswell, Keven J; Butler, Kym L; Hemsworth, Paul H
2015-01-01
The window of the visitor viewing area adjacent to an animal platform in an orangutan enclosure was altered to produce three viewing treatments in a randomized controlled experiment. These treatments were window uncovered, left side of the window covered or right side of the window covered. Observations were conducted on the orangutans present on the platform, and on their location (left or right side), and orientation (towards or away from the window) while on the platform. The partial covering of the window had little effect on the proportion of time orangutans spent on the viewing platform, or on the direction they faced when on the platform. When the orangutans were facing towards the window, and the right side was uncovered, irrespective of whether the left side was covered, they spent about three quarters of the time on the right side, suggesting a preference for the right side of the platform. However, when the right side was covered and the left side uncovered, the animals facing towards the window spent only about a quarter of the time on the right side, that is, they spent more time on the uncovered side. The results suggest that the orangutans have a preference to position themselves to face the window of the visitor viewing area. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Zielke, O.; Arrowsmith, R. J.
2005-12-01
The nonlinear dynamics of fault behavior are dominated by complex interactions among the multiple processes controlling the system. For example, temporal and spatial variations in pore pressure, healing effects, and stress transfer cause significant heterogeneities in fault properties and the stress-field at the sub-fault level. Numerical and laboratory fault models show that the interaction of large systems of fault elements causes the entire system to develop into a state of self-organized criticality. Once in this state, small perturbations of the system may result in chain reactions (i.e., earthquakes) which can affect any number of fault segments. This sensitivity to small perturbations is strong evidence for chaotic fault behavior, which implies that exact event prediction is not possible. However, earthquake prediction with a useful accuracy is nevertheless possible. Studies of other natural chaotic systems have shown that they may enter states of metastability, in which the system's behavior is predictable. Applying this concept to earthquake faults, these windows of metastable behavior should be characterized by periodic earthquake recurrence. The observed periodicity of the Parkfield, CA (M= 6) events may resemble such a window of metastability. I am statistically analyzing numerically generated seismic records to study these phases of periodic behavior. In this preliminary study, seismic records were generated using a model introduced by Nakanishi [Phys. Rev. A, 43, 6613-6621, 1991]. It consists of a one-dimensional chain of blocks (interconnected by springs) with a relaxation function that mimics velocity-weakened frictional behavior. The earthquakes occurring in this model show generally a power-law frequency-size distribution. However, for large events the distribution has a shoulder where the frequency of events is higher than expected from the power law. I have analyzed time-series of single block motions within the system. These time-series include noticeable periodicity during certain intervals in an otherwise aperiodic record. The observed periodic signal is not equally distributed over the range of offsets but shows a multi-modal distribution with increased periodicity for the smallest events and for large events that show a specific offset. These large events also form a shoulder in the frequency-size distribution. Apparently, the model exhibits characteristic earthquakes (defined by similar coseismic slip) that occur more frequently than expected from a power law distribution, and also are significantly more periodic. The wavelength of the periodic signal generally equals the minimum loading time, which is related to the loading velocity and the amount of coseismic slip (i.e., stress drop). No significant event occurs between the characteristic events as long as the system stays in a window of periodic behavior. Within the windows of periodic behavior, earthquake prediction is straightforward. Therefore, recognition of these windows not only in synthetic data but also in real seismic records, may improve the intra-window forecast of earthquakes. Further studies will attempt to determine the characteristics of onset, duration, and end of these windows of periodic earthquake recurrence. Only the motion of a single block within a bigger system was analyzed so far. 
Going from a zero dimensional scenario to a two dimensional case where the offsets not only of a single block but the displacement patterns caused by a certain event are analyzed will increase the verisimilitude of the detection of periodic earthquake recurrence within an otherwise chaotic seismic record.
Hatipoglu, Nuh; Bilgin, Gokhan
2017-10-01
In many computerized methods for cell detection, segmentation, and classification in digital histopathology that have recently emerged, the task of cell segmentation remains a chief problem for image processing in designing computer-aided diagnosis (CAD) systems. In research and diagnostic studies on cancer, pathologists can use CAD systems as second readers to analyze high-resolution histopathological images. Since cell detection and segmentation are critical for cancer grade assessments, cellular and extracellular structures should primarily be extracted from histopathological images. In response, we sought to identify a useful cell segmentation approach with histopathological images that uses not only prominent deep learning algorithms (i.e., convolutional neural networks, stacked autoencoders, and deep belief networks), but also spatial relationships, information of which is critical for achieving better cell segmentation results. To that end, we collected cellular and extracellular samples from histopathological images by windowing in small patches with various sizes. In experiments, the segmentation accuracies of the methods used improved as the window sizes increased due to the addition of local spatial and contextual information. Once we compared the effects of training sample size and influence of window size, results revealed that the deep learning algorithms, especially convolutional neural networks and partly stacked autoencoders, performed better than conventional methods in cell segmentation.
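The patch-windowing step described above can be sketched as follows; the patch size, the list of candidate centers, and the toy image are illustrative only, and in the study the extracted patches would subsequently be fed to a CNN, stacked autoencoder, or deep belief network.

```python
import numpy as np

def extract_patches(image, centers, window=32):
    """Cut square windows of side `window` around (row, col) centers, zero-padding at
    the borders. Larger windows add more local spatial context, as discussed above."""
    half = window // 2
    padded = np.pad(image, ((half, half), (half, half)), mode="constant")
    patches = []
    for r, c in centers:
        patches.append(padded[r:r + window, c:c + window])
    return np.stack(patches)

# Toy usage: 3 candidate locations in a random "histopathology" tile.
rng = np.random.default_rng(0)
tile = rng.random((256, 256))
batch = extract_patches(tile, [(10, 10), (128, 128), (250, 5)], window=32)
print(batch.shape)   # (3, 32, 32)
```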
Continuous Time Level Crossing Sampling ADC for Bio-Potential Recording Systems
Tang, Wei; Osman, Ahmad; Kim, Dongsoo; Goldstein, Brian; Huang, Chenxi; Martini, Berin; Pieribone, Vincent A.
2013-01-01
In this paper we present a fixed window level crossing sampling analog to digital convertor for bio-potential recording sensors. This is the first proposed and fully implemented fixed window level crossing ADC without local DACs and clocks. The circuit is designed to reduce data size, power, and silicon area in future wireless neurophysiological sensor systems. We built a testing system to measure bio-potential signals and used it to evaluate the performance of the circuit. The bio-potential amplifier offers a gain of 53 dB within a bandwidth of 200 Hz-20 kHz. The input-referred rms noise is 2.8 µV. In the asynchronous level crossing ADC, the minimum delta resolution is 4 mV. The input signal frequency of the ADC is up to 5 kHz. The system was fabricated using the AMI 0.5 µm CMOS process. The chip size is 1.5 mm by 1.5 mm. The power consumption of the 4-channel system from a 3.3 V supply is 118.8 µW in the static state and 501.6 µW with a 240 kS/s sampling rate. The conversion efficiency is 1.6 nJ/conversion. PMID:24163640
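A behavioural model of fixed-window level-crossing sampling, as described above, emits a sample only when the input moves by more than a fixed delta from the last sampled level. The sketch below uses an illustrative 4 mV delta and a synthetic test tone; it is not a model of the actual chip.

```python
import numpy as np

def level_crossing_sample(signal, delta):
    """Emit (index, value) pairs whenever the signal leaves a +/- delta window
    around the last sampled level (behavioural model only)."""
    samples = [(0, signal[0])]
    last = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        if abs(x - last) >= delta:
            samples.append((i, x))
            last = x
    return samples

t = np.linspace(0, 0.01, 2000)                  # 10 ms of signal
x = 0.05 * np.sin(2 * np.pi * 1000 * t)         # 1 kHz test tone, 50 mV amplitude
events = level_crossing_sample(x, delta=0.004)  # 4 mV delta, as in the abstract
print(f"{len(events)} events instead of {len(x)} uniform samples")
```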
Foo, Lee Kien; McGree, James; Duffull, Stephen
2012-01-01
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
7 CFR Exhibit B to Subpart A of... - Requirements for Modular/Panelized Housing Units
Code of Federal Regulations, 2010 CFR
2010-01-01
..., log wall houses, trussed roof rafters or floor trusses; open panel walls, and other types that can be... windows or crawl space vents with all sizes indicated. 2. Floor Plans of all levels. Show square footage... levels is required to indicate intended occupancy functions of the design. A window and door schedule...
7 CFR Exhibit B to Subpart A of... - Requirements for Modular/Panelized Housing Units
Code of Federal Regulations, 2011 CFR
2011-01-01
..., log wall houses, trussed roof rafters or floor trusses; open panel walls, and other types that can be... windows or crawl space vents with all sizes indicated. 2. Floor Plans of all levels. Show square footage... levels is required to indicate intended occupancy functions of the design. A window and door schedule...
Perceptual Span in Oral Reading: The Case of Chinese
ERIC Educational Resources Information Center
Pan, Jinger; Yan, Ming; Laubrock, Jochen
2017-01-01
The present study explores the perceptual span, that is, the physical extent of the area from which useful visual information is obtained during a single fixation, during oral reading of Chinese sentences. Characters outside a window of legible text were replaced by visually similar characters. Results show that the influence of window size on the…
Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow.
Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L
2014-07-01
Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing function. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall and response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and difficulty in accessing the inner ear. Here we report on new methods for studying the CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables the study of CoBF under relatively intact physiological conditions. A thin vessel-window method can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
Zhang, Zhengyi; Zhang, Gaoyan; Zhang, Yuanyuan; Liu, Hong; Xu, Junhai; Liu, Baolin
2017-12-01
This study aimed to investigate the functional connectivity in the brain during the cross-modal integration of polyphonic characters in Chinese audio-visual sentences. The visual sentences were all semantically reasonable and the audible pronunciations of the polyphonic characters in corresponding sentence contexts varied in four conditions. To measure the functional connectivity, correlation, coherence and phase synchronization index (PSI) were used, and then multivariate pattern analysis was performed to detect the consensus functional connectivity patterns. These analyses were confined to the time windows of three event-related potential components of P200, N400 and late positive shift (LPS) to investigate the dynamic changes of the connectivity patterns at different cognitive stages. We found that when differentiating the polyphonic characters with abnormal pronunciations from those with the appropriate ones in audio-visual sentences, significant classification results were obtained based on the coherence in the time window of the P200 component, the correlation in the time window of the N400 component and the coherence and PSI in the time window of the LPS component. Moreover, the spatial distributions in these time windows were also different, with the recruitment of frontal sites in the time window of the P200 component, the frontal-central-parietal regions in the time window of the N400 component and the central-parietal sites in the time window of the LPS component. These findings demonstrate that the functional interaction mechanisms are different at different stages of audio-visual integration of polyphonic characters.
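A compact sketch of computing windowed coherence and a phase synchronization index between two channels is shown below. The sampling rate, channel data, and window bounds (roughly an N400-like latency range) are placeholders, and the study's multivariate pattern analysis step is not included.

```python
import numpy as np
from scipy.signal import coherence, hilbert

fs = 500                                     # sampling rate (Hz), placeholder
rng = np.random.default_rng(1)
x = rng.standard_normal(fs)                  # channel 1, one 1-s epoch
y = 0.6 * x + 0.4 * rng.standard_normal(fs)  # channel 2, partly coupled with channel 1

# Restrict the analysis to an ERP time window, e.g. ~300-500 ms post-stimulus.
w = slice(int(0.3 * fs), int(0.5 * fs))
xw, yw = x[w], y[w]

# Magnitude-squared coherence averaged over frequencies (Welch-based estimate).
f, cxy = coherence(xw, yw, fs=fs, nperseg=min(64, len(xw)))
coh = cxy.mean()

# Phase synchronization index from instantaneous phase differences (Hilbert transform).
dphi = np.angle(hilbert(xw)) - np.angle(hilbert(yw))
psi = np.abs(np.mean(np.exp(1j * dphi)))

print(f"coherence ~ {coh:.2f}, PSI ~ {psi:.2f}")
```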
A multimodal logistics service network design with time windows and environmental concerns
Zhang, Dezhi; He, Runzhong; Wang, Zhongwei
2017-01-01
The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to set up the abovementioned optimal model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained. PMID:28934272
A multimodal logistics service network design with time windows and environmental concerns.
Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei
2017-01-01
The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to set up the abovementioned optimal model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained.
Artificial neural networks to predict activity type and energy expenditure in youth.
Trost, Stewart G; Wong, Weng-Keen; Pfeiffer, Karen A; Zheng, Yonglei
2012-09-01
Previous studies have demonstrated that pattern recognition approaches to accelerometer data reduction are feasible and moderately accurate in classifying activity type in children. Whether pattern recognition techniques can be used to provide valid estimates of physical activity (PA) energy expenditure in youth remains unexplored in the research literature. The objective of this study is to develop and test artificial neural networks (ANNs) to predict PA type and energy expenditure (PAEE) from processed accelerometer data collected in children and adolescents. One hundred participants between the ages of 5 and 15 yr completed 12 activity trials that were categorized into five PA types: sedentary, walking, running, light-intensity household activities or games, and moderate-to-vigorous-intensity games or sports. During each trial, participants wore an ActiGraph GT1M on the right hip, and VO2 was measured using the Oxycon Mobile (Viasys Healthcare, Yorba Linda, CA) portable metabolic system. ANNs to predict PA type and PAEE (METs) were developed using the following features: 10th, 25th, 50th, 75th, and 90th percentiles and the lag one autocorrelation. To determine the highest time resolution achievable, we extracted features from 10-, 15-, 20-, 30-, and 60-s windows. Accuracy was assessed by calculating the percentage of windows correctly classified and root mean square error (RMSE). As window size increased from 10 to 60 s, accuracy for the PA-type ANN increased from 81.3% to 88.4%. RMSE for the MET prediction ANN decreased from 1.1 METs to 0.9 METs. At any given window size, RMSE values for the MET prediction ANN were 30-40% lower than the conventional regression-based approaches. ANNs can be used to predict both PA type and PAEE in children and adolescents using count data from a single waist mounted accelerometer.
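The per-window feature vector described above (five percentiles plus the lag-one autocorrelation) can be computed as in the sketch below; the 60-s window and synthetic counts are illustrative, and the resulting features would then be passed to the ANN.

```python
import numpy as np

def window_features(counts):
    """Percentile and lag-1 autocorrelation features for one accelerometer window,
    mirroring the feature set described above."""
    p10, p25, p50, p75, p90 = np.percentile(counts, [10, 25, 50, 75, 90])
    c = counts - counts.mean()
    lag1 = (c[:-1] @ c[1:]) / (c @ c) if (c @ c) > 0 else 0.0
    return np.array([p10, p25, p50, p75, p90, lag1])

rng = np.random.default_rng(7)
epoch = rng.poisson(lam=30, size=60)    # sixty 1-s counts = one 60-s window (synthetic)
print(window_features(epoch))
```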
Iterated local search algorithm for solving the orienteering problem with soft time windows.
Aghezzaf, Brahim; Fahim, Hassan El
2016-01-01
In this paper we study the orienteering problem with time windows (OPTW) and the impact of relaxing the time windows on the profit collected by the vehicle. The time-window relaxation adopted in the orienteering problem with soft time windows (OPSTW) that we study in this research is a late-service relaxation that allows linearly penalized late services to customers. We solve this problem heuristically with a hybrid iterated local search. The results of the computational study show that the proposed approach is able to achieve promising solutions on the OPTW test instances available in the literature; one new best solution is found. On the newly generated test instances of the OPSTW, the results show that the profit collected by the OPSTW is better than the profit collected by the OPTW.
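A small sketch of how a linearly penalized late service can be scored for a candidate route is given below; the penalty weight, travel times, profits, and time windows are made-up toy data, and the iterated local search itself is not shown.

```python
def route_score(route, profits, travel, windows, penalty_per_unit_late=1.0):
    """Profit of a route minus linear lateness penalties (soft time windows).
    Arrivals before a window opens wait; arrivals after it closes are penalized."""
    t, score = 0.0, 0.0
    prev = route[0]                          # assume the route starts at the depot
    for node in route[1:]:
        t += travel[(prev, node)]
        open_t, close_t = windows[node]
        t = max(t, open_t)                   # waiting is allowed
        lateness = max(0.0, t - close_t)
        score += profits[node] - penalty_per_unit_late * lateness
        prev = node
    return score

# Toy instance: depot 0 and two customers.
profits = {1: 10.0, 2: 8.0}
travel = {(0, 1): 5.0, (1, 2): 4.0}
windows = {1: (0.0, 6.0), 2: (0.0, 7.0)}
print(route_score([0, 1, 2], profits, travel, windows))   # lateness at node 2 is 2
```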
Wildfire cluster detection using space-time scan statistics
NASA Astrophysics Data System (ADS)
Tonini, M.; Tuia, D.; Ratle, F.; Kanevski, M.
2009-04-01
The aim of the present study is to identify spatio-temporal clusters of fire sequences using space-time scan statistics. These statistical methods are specifically designed to detect clusters and assess their significance. Basically, scan statistics work by comparing a set of events occurring inside a scanning window (or a space-time cylinder for spatio-temporal data) with those that lie outside. Windows of increasing size scan the zone across space and time: the likelihood ratio is calculated for each window (comparing the ratio "observed cases over expected" inside and outside); the window with the maximum value is assumed to be the most probable cluster, and so on. Under the null hypothesis of spatial and temporal randomness, these events are distributed according to a known discrete-state random process (Poisson or Bernoulli), whose parameters can be estimated. Given this assumption, it is possible to test whether or not the null hypothesis holds in a specific area. In order to deal with fire data, the space-time permutation scan statistic has been applied since it does not require the explicit specification of the population-at-risk in each cylinder. The case study is represented by Florida daily fire detection using the Moderate Resolution Imaging Spectroradiometer (MODIS) active fire product during the period 2003-2006. As a result, statistically significant clusters have been identified. Performing the analyses over the entire study period, three out of the five most likely clusters have been identified in the forest areas in the north of the state; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. Furthermore, the analyses have been performed separately for the four years to assess whether the wildfires recur each year during the same period. It emerges that clusters of forest fires are more frequent in hot seasons (spring and summer), while in the southern areas they are present throughout the whole year. Analyzing the fire distribution to evaluate whether fires are statistically more frequent in some areas and/or periods of the year can be useful to support fire management and to focus prevention measures.
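The likelihood-ratio comparison inside a single scanning cylinder can be illustrated with a Poisson model as below; the counts and expectations are toy numbers, and in practice significance is assessed by Monte Carlo replication over many cylinders rather than a single evaluation.

```python
import numpy as np

def poisson_llr(c_in, e_in, c_tot, e_tot):
    """Log-likelihood ratio for one space-time cylinder under a Poisson model:
    observed vs expected counts inside the cylinder against the rest of the study area."""
    c_out, e_out = c_tot - c_in, e_tot - e_in
    if c_in / e_in <= c_out / e_out:        # only high-rate cylinders are of interest
        return 0.0
    return c_in * np.log(c_in / e_in) + c_out * np.log(c_out / e_out)

# Toy example: 60 fires inside a cylinder where 35 were expected, out of 400 in total.
print(f"LLR = {poisson_llr(60, 35.0, 400, 400.0):.2f}")
```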
Extended volume coverage in helical cone-beam CT by using PI-line based BPF algorithm
NASA Astrophysics Data System (ADS)
Cho, Seungryong; Pan, Xiaochuan
2007-03-01
We compared the data requirements of filtered-backprojection (FBP) and backprojection-filtration (BPF) algorithms based on PI-lines in helical cone-beam CT. Since the filtration process in the FBP algorithm needs all the projection data of PI-lines for each view, the required detector size should be bigger than the size that can cover the Tam-Danielsson (T-D) window to avoid data truncation. The BPF algorithm, however, requires the projection data only within the T-D window, which means a smaller detector size can be used to reconstruct the same image than in FBP. In other words, a longer helical pitch can be obtained by using the BPF algorithm without any truncation artifacts when a fixed detector size is given. The purpose of this work is to demonstrate numerically that extended volume coverage in helical cone-beam CT can be achieved by using the PI-line-based BPF algorithm.
Assen, Ayalew H; Belmabkhout, Youssef; Adil, Karim; Bhatt, Prashant M; Xue, Dong-Xu; Jiang, Hao; Eddaoudi, Mohamed
2015-11-23
Using isoreticular chemistry allows the design and construction of a new rare-earth metal (RE) fcu-MOF with a suitable aperture size for practical steric adsorptive separations. The judicious choice of a relatively short organic building block, namely fumarate, to bridge the 12-connected RE hexanuclear clusters has afforded the contraction of the well-defined RE-fcu-MOF triangular window aperture, the sole access to the two interconnected octahedral and tetrahedral cages. The newly constructed RE (Y(3+) and Tb(3+)) fcu-MOF analogues display unprecedented total exclusion of branched paraffins from normal paraffins. The resultant window aperture size of about 4.7 Å, regarded as a sorbate-size cut-off, enabled a complete sieving of branched paraffins from normal paraffins. The results are supported by collective single gas and mixed gas/vapor adsorption and calorimetric studies. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Climate Exposure of US National Parks in a New Era of Change
Monahan, William B.; Fisichelli, Nicholas A.
2014-01-01
US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (<5th percentile or >95th percentile) climates relative to their 1901–2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measureable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change. PMID:24988483
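A minimal illustration of the moving-window comparison used above places the mean of the most recent N years within the distribution of all same-length window means in the record; the temperature series below is synthetic.

```python
import numpy as np

def recent_window_percentile(annual_values, window=10):
    """Percentile rank of the most recent `window`-year mean within the distribution
    of all `window`-year means in the record (moving-window HRV comparison)."""
    means = np.convolve(annual_values, np.ones(window) / window, mode="valid")
    recent = means[-1]
    return 100.0 * np.mean(means <= recent)

rng = np.random.default_rng(3)
years = np.arange(1901, 2013)
temps = 10 + 0.01 * (years - 1901) + rng.normal(0, 0.5, years.size)  # synthetic warming trend
for w in (10, 20, 30):
    print(w, f"{recent_window_percentile(temps, w):.0f}th percentile")
```

A value above the 95th percentile (or below the 5th) would flag the kind of "extreme" recent climate described above, and comparing the three window lengths shows how sensitive that flag is to the choice of moving time window.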
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or least and shrinkage selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into training cohort (n = 518) and testing cohort (n = 127) according to the measurement time. Result: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
Climate exposure of US national parks in a new era of change.
Monahan, William B; Fisichelli, Nicholas A
2014-01-01
US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (<5th percentile or >95th percentile) climates relative to their 1901-2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measureable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change.
An approach to emotion recognition in single-channel EEG signals: a mother child interaction
NASA Astrophysics Data System (ADS)
Gómez, A.; Quintero, L.; López, N.; Castro, J.
2016-04-01
In this work, we present a first approach to emotion recognition from single-channel EEG signals recorded in a developmental psychology experiment with four (4) mother-child dyads. The single-channel EEG signals are analyzed and processed using several window sizes by performing a statistical analysis over features in the time and frequency domains. Finally, a neural network achieved an average classification accuracy of 99% for two emotional states, happiness and sadness.
Development of Facility Type Information Packages for Design of Air Force Facilities.
1983-03-01
solution. For example, the optimum size and location of windows for the incorporation of a passive solar heating system varies with location, time... conditioning load estimate M. Energy impact statement N. Majcom review comments O. Solar energy systems ... Information which could help in the development... and Passive solar systems. All facilities should have some aspects of passive solar incorporated into the design. Active solar systems should be con...
Downsampling Photodetector Array with Windowing
NASA Technical Reports Server (NTRS)
Patawaran, Ferze D.; Farr, William H.; Nguyen, Danh H.; Quirk, Kevin J.; Sahasrabudhe, Adit
2012-01-01
In a photon counting detector array, each pixel in the array produces an electrical pulse when an incident photon on that pixel is detected. Detection and demodulation of an optical communication signal that modulates the intensity of the optical signal requires counting the number of photon arrivals over a given interval. As the size of photon counting photodetector arrays increases, parallel processing of all the pixels exceeds the resources available in current application-specific integrated circuit (ASIC) and gate array (GA) technology; the desire for a high fill factor in avalanche photodiode (APD) detector arrays also precludes this. Through the use of downsampling and windowing portions of the detector array, the processing is distributed between the ASIC and GA. This allows demodulation of the optical communication signal incident on a large photon counting detector array, as well as providing an architecture amenable to algorithmic changes. The detector array readout ASIC functions as a parallel-to-serial converter, serializing the photodetector array output for subsequent processing. Additional downsampling functionality for each pixel is added to this ASIC. Due to the large number of pixels in the array, the readout time of the entire photodetector is greater than the time between photon arrivals; therefore, a downsampling pre-processing step is done in order to increase the time allowed for the readout to occur. Each pixel drives a small counter that is incremented at every detected photon arrival or, equivalently, the charge in a storage capacitor is incremented. At the end of a user-configurable counting period (calculated independently from the ASIC), the counters are sampled and cleared. This downsampled photon count information is then sent one counter word at a time to the GA. For a large array, processing even the downsampled pixel counts exceeds the capabilities of the GA. Windowing of the array, whereby several subsets of pixels are designated for processing, is used to further reduce the computational requirements. Because the photon count information is sent one word at a time to the GA, the aggregation of the pixels in a window can be achieved by selecting only the designated pixel counts from the serial stream of photon counts, thereby obviating the need to store the entire frame of pixel counts in the gate array. The pixel count sequence from each window can then be processed, forming lower-rate pixel statistics for each window. By having this processing occur in the GA rather than in the ASIC, future changes to the processing algorithm can be readily implemented. The high-bandwidth requirements of a photon counting array combined with the properties of the optical modulation being detected by the array present a unique problem that has not been addressed by current CCD or CMOS sensor array solutions.
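A behavioural sketch of the two-stage scheme described above is given below: per-pixel counters downsample photon arrivals over a counting period, the frame is serialized, and only pixels inside configured windows are selected from the serial stream. The array size, count statistics, and window definitions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
rows, cols = 32, 32
counts = rng.poisson(lam=0.8, size=(rows, cols))   # downsampled per-pixel counters (one period)

# Serialize the frame as the readout ASIC would (one counter word at a time).
serial_stream = counts.ravel()

# Windowing: keep only pixels inside designated sub-regions of the array.
windows = [(slice(0, 8), slice(0, 8)), (slice(20, 28), slice(10, 18))]
mask = np.zeros((rows, cols), dtype=bool)
for rs, cs in windows:
    mask[rs, cs] = True

# Select designated pixel counts directly from the serial stream, then aggregate per window.
selected = serial_stream[mask.ravel()]
window_sums = [counts[rs, cs].sum() for rs, cs in windows]
print(len(serial_stream), len(selected), window_sums)
```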
Rapid Corner Detection Using FPGAs
NASA Technical Reports Server (NTRS)
Morfopoulos, Arin C.; Metz, Brandon C.
2010-01-01
In order to perform precision landings for space missions, a control system must be accurate to within ten meters. Feature detection applied against images taken during descent and correlated against the provided base image is computationally expensive and requires tens of seconds of processing time for just one image, while the goal is to process multiple images per second. To solve this problem, this algorithm takes that processing load from the central processing unit (CPU) and gives it to a reconfigurable field programmable gate array (FPGA), which is able to compute data in parallel at very high clock speeds. The workload of the processor then becomes simpler: an image is read from a camera, transferred into the FPGA, and the results are read back from the FPGA. The Harris Corner Detector uses the determinant and trace to find a corner score, with each step of the computation occurring on independent clock cycles. Essentially, the image is converted into an x and y derivative map. Once three lines of pixel information have been queued up, valid pixel derivatives are clocked into the product and averaging phase of the pipeline. Each x and y derivative is squared, the product of the ix and iy derivatives is formed, and each value is stored in a WxN size buffer, where W represents the size of the integration window and N is the width of the image. In this particular case, a window size of 5 was chosen, and the image is 640×480. Over a WxN size window, an equidistant Gaussian weighting is applied (to bring out the stronger corners), and then each value in the entire window is summed and stored. The required components of the equation are in place, and it is just a matter of taking the determinant and trace. It should be noted that the squared trace is weighted by a constant k, a value that is found empirically to be within 0.04 to 0.15 (and in this implementation is 0.05). The constant k determines the number of corners available to be compared against a threshold sigma to mark a valid corner. After a fixed delay from when the first pixel is clocked in (to fill the pipeline), a score is produced on each successive clock. This score corresponds to an (x,y) location within the image. If the score is higher than the predetermined threshold sigma, then a flag is set high and the location is recorded.
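A compact numpy version of the Harris response pipeline outlined above (derivatives, windowed products, det - k·trace²) is sketched below; it uses a uniform 5×5 window in place of the FPGA's Gaussian weighting and is meant only to illustrate the arithmetic, not the hardware pipeline.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def harris_score(image, k=0.05, window=5):
    """Harris corner response: derivatives, windowed second-moment products,
    then det(M) - k * trace(M)^2 per pixel. Uses a uniform window instead of
    the Gaussian weighting mentioned above."""
    ix, iy = np.gradient(image.astype(float))
    ixx = uniform_filter(ix * ix, size=window)
    iyy = uniform_filter(iy * iy, size=window)
    ixy = uniform_filter(ix * iy, size=window)
    det = ixx * iyy - ixy * ixy
    trace = ixx + iyy
    return det - k * trace * trace

img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0                       # a bright square has four strong corners
score = harris_score(img)
corners = np.argwhere(score > 0.2 * score.max())   # simple threshold on the response
print(len(corners), "corner-like pixels")
```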
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
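A short sketch of the spline/inflection-point idea described above fits a cubic smoothing spline and takes sign changes of its second derivative as window breakpoints; the marker positions, test statistic, and smoothing factor below are simulated choices, not the authors' settings.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(11)
pos = np.arange(0, 5000, 10)                                   # marker positions (bp), simulated
signal = np.sin(pos / 400.0) + rng.normal(0, 0.3, pos.size)    # noisy per-marker statistic

# Cubic smoothing spline; the smoothing factor s is a tuning choice, not prescribed here.
spl = UnivariateSpline(pos, signal, k=3, s=pos.size * 0.3)

# Window boundaries = inflection points, i.e. sign changes of the second derivative.
second = spl.derivative(n=2)(pos)
boundaries = pos[1:][np.sign(second[1:]) != np.sign(second[:-1])]
print(len(boundaries), "breakpoints ->", len(boundaries) + 1, "windows")
```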
A frequency-based window width optimized two-dimensional S-Transform profilometry
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao
2017-11-01
A new scheme is proposed as a frequency-based window width optimized two-dimensional S-transform profilometry, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, which uses a Gaussian window with width proportional to the reciprocal local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis as well as the characteristics of the modified Gaussian window. Simulations are applied to evaluate the proposed scheme; the results show that the new scheme has better noise reduction ability and can extract the phase distribution more precisely in comparison with the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction for a complex plastic cat mask to show its effectiveness.
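Along one axis, the modified Gaussian window can be sketched as having a width proportional to 1/|f|^p instead of 1/|f|, with p = 1 recovering the standard S-transform behaviour; the value p = 0.8 below is purely illustrative.

```python
import numpy as np

def st_window(x, f, p=0.8):
    """1-D Gaussian window for the S-transform with width scaling as 1/|f|**p
    (p = 1 gives the standard S-transform window); illustrative only."""
    sigma = 1.0 / (np.abs(f) ** p)
    return np.exp(-(x ** 2) / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-0.5, 0.5, 1001)
for f in (2.0, 8.0):
    w_std = st_window(x, f, p=1.0)
    w_opt = st_window(x, f, p=0.8)
    print(f, w_std.max() / w_opt.max())   # p < 1 widens the window at high frequencies
```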
A note on windowing for the waveform relaxation
NASA Technical Reports Server (NTRS)
Zhang, Hong
1994-01-01
The technique of windowing has often been used in the implementation of waveform relaxations for solving ODEs or time-dependent PDEs. Its efficiency depends upon problem stiffness and operator splitting. Using model problems, estimates for the window length and convergence rate are derived. The effectiveness of windowing is then investigated for the non-stiff and stiff cases respectively. It concludes that for the former, windowing is highly recommended when a large discrepancy exists between the convergence rate on a time interval and the ones on its subintervals. For the latter, windowing does not provide any computational advantage if machine features are disregarded. The discussion is supported by experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Geun Young; Steemers, Koen
2010-07-15
This paper investigates occupant behaviour of window-use in night-time naturally ventilated offices on the basis of a pilot field study, conducted during the summers of 2006 and 2007 in Cambridge, UK, and then demonstrates the effects of employing night-time ventilation on indoor thermal conditions using predictive models of occupant window-use. A longitudinal field study shows that occupants make good use of night-time natural ventilation strategies when provided with openings that allow secure ventilation, and that there is a noticeable time of day effect in window-use patterns (i.e. increased probability of action on arrival and departure). We develop logistic models of window-use for night-time naturally ventilated offices, which are subsequently applied to a behaviour algorithm, including Markov chains and Monte Carlo methods. The simulations using the behaviour algorithm demonstrate a good agreement with the observational data of window-use, and reveal how building design and occupant behaviour collectively affect the thermal performance of offices. They illustrate that the provision of secure ventilation leads to more frequent use of the window, and thus contributes significantly to the achievement of a comfortable indoor environment during the daytime occupied period. For example, the maximum temperature for a night-time ventilated office is found to be 3 °C below the predicted value for a daytime-only ventilated office. (author)
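The behaviour algorithm mentioned above combines logistic window-use models with Markov chains and Monte Carlo sampling. The following minimal sketch shows that structure as a two-state chain driven by logistic transition probabilities; all coefficients and the temperature driver are hypothetical placeholders, not the fitted models from the field study.

```python
import numpy as np

def p_transition(a, b, t_out):
    """Logistic probability of a window-state transition given outdoor temperature."""
    return 1.0 / (1.0 + np.exp(-(a + b * t_out)))

def simulate_window_use(t_out_hourly, rng, a_open=-4.0, b_open=0.15,
                        a_close=-1.0, b_close=-0.10):
    """Monte Carlo simulation of hourly window state (0 = closed, 1 = open)
    as a two-state Markov chain; all coefficients are hypothetical."""
    state, states = 0, []
    for t_out in t_out_hourly:
        if state == 0:   # closed: may open with probability p_open
            state = int(rng.random() < p_transition(a_open, b_open, t_out))
        else:            # open: stays open unless a closing event occurs
            state = int(rng.random() >= p_transition(a_close, b_close, t_out))
        states.append(state)
    return np.array(states)

rng = np.random.default_rng(0)
t_out = 18 + 6 * np.sin(np.linspace(0, 2 * np.pi, 24))   # one synthetic summer day
print(simulate_window_use(t_out, rng))
```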
NASA Astrophysics Data System (ADS)
Kang, Jae-sik; Oh, Eun-Joo; Bae, Min-Jung; Song, Doo-Sam
2017-12-01
Given that the Korean government is implementing what has been termed the energy standards and labelling program for windows, window companies will be required to assign window ratings based on experimental results for their products. Because this adds to the cost and time required for laboratory tests, a simulation system for the thermal performance of windows has been prepared to reduce these burdens. In Korea, the thermal performance of a window is usually calculated with the WINDOW/THERM simulation tools, in compliance with ISO 15099. For a single window, the simulation results are similar to experimental results. A double window is also calculated using the same method, but the calculation results for this type of window are unreliable: ISO 15099 does not provide a recommended method for calculating the thermal properties of the air cavity between the window sashes of a double window. This causes a difference between simulation and experimental results for the thermal performance of a double window. In this paper, the thermal properties of air cavities between window sashes in a double window are analyzed through computational fluid dynamics (CFD) simulations, and the results are compared to calculation results certified by ISO 15099. The surface temperature of the air cavity analyzed by CFD is compared to the experimental temperatures. These results show that an appropriate calculation method for the air cavity between window sashes in a double window should be established to obtain reliable thermal performance results for a double window.
VIS-IR transmitting BGG glass windows
NASA Astrophysics Data System (ADS)
Bayya, Shyam S.; Chin, Geoff D.; Sanghera, Jasbinder S.; Aggarwal, Ishwar D.
2003-09-01
BaO-Ga2O3-GeO2 (BGG) glasses have the desired properties for various window applications in the 0.5-5 μm wavelength region. These glasses are low-cost alternatives to the currently used window materials. Fabrication of a high optical quality 18" diameter BGG glass window has been demonstrated with a transmitted wave front error of λ/10 at 632 nm. BGG substrates have also been successfully tested for environmental weatherability (MIL-F-48616) and rain erosion durability up to 300 mph. Preliminary EMI grids have been successfully applied on BGG glasses, demonstrating attenuation of 20 dB in X and Ku bands. Although the mechanical properties of BGG glasses are acceptable for various window applications, it is demonstrated here that the properties can be further improved significantly by the glass-ceramization process. The ceramization process does not add any significant cost to the final window material. The crystallite size in the present glass-ceramic limits its transmission to the 2-5 μm region.
NASA Astrophysics Data System (ADS)
Ghezavati, V. R.; Beigi, M.
2016-12-01
During the last decade, stringent pressures from environmental and social requirements have spurred interest in designing reverse logistics (RL) networks. The success of a logistics system may depend on the decisions about facility locations and vehicle routings. The location-routing problem (LRP) simultaneously locates the facilities and designs the travel routes for vehicles among established facilities and existing demand points. In this paper, the location-routing problem with time windows (LRPTW) with a homogeneous fleet, together with the design of a multi-echelon, capacitated reverse logistics network, is considered; such problems arise in many real-life situations in logistics management. Our proposed RL network consists of hybrid collection/inspection centers, recovery centers and disposal centers. Here, we present a new bi-objective mathematical programming (BOMP) model for the LRPTW in reverse logistics. Since this type of problem is NP-hard, the non-dominated sorting genetic algorithm II (NSGA-II) is proposed to obtain the Pareto frontier for the given problem. Several numerical examples are presented to illustrate the effectiveness of the proposed model and algorithm. The present work also implements the ɛ-constraint method in GAMS software for producing Pareto-optimal solutions of the BOMP, and the results of the proposed algorithm are compared with the ɛ-constraint method. The computational results show that the ɛ-constraint method is able to solve small-size instances to optimality within reasonable computing times, while for medium-to-large-sized problems the proposed NSGA-II works better than the ɛ-constraint method.
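For readers unfamiliar with the ɛ-constraint method used as the benchmark above, the sketch below shows its mechanics on a deliberately tiny bi-objective linear programme: one objective is minimized while the other is swept as a constraint bound. The toy model has nothing to do with the actual LRPTW formulation or the GAMS implementation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy bi-objective LP: minimise f1 = x0 + 2*x1 and f2 = 3*x0 + x1
# subject to x0 + x1 >= 4 and 0 <= x0, x1 <= 5.
c1 = np.array([1.0, 2.0])
c2 = np.array([3.0, 1.0])
A_ub = np.array([[-1.0, -1.0]])          # -(x0 + x1) <= -4
b_ub = np.array([-4.0])
bounds = [(0, 5), (0, 5)]

pareto = []
for eps in np.linspace(4.0, 16.0, 7):    # sweep the allowed level of f2
    res = linprog(c=c1,
                  A_ub=np.vstack([A_ub, c2]),   # add f2 <= eps as a constraint
                  b_ub=np.append(b_ub, eps),
                  bounds=bounds, method="highs")
    if res.success:
        pareto.append((round(res.fun, 3), round(float(c2 @ res.x), 3)))  # (f1, f2)
print(pareto)   # candidate Pareto points traced out by the eps sweep
```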
ON ASYMMETRY OF MAGNETIC HELICITY IN EMERGING ACTIVE REGIONS: HIGH-RESOLUTION OBSERVATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian Lirong; Alexander, David; Zhu Chunming
We employ the DAVE (differential affine velocity estimator) tracking technique on a time series of Michelson Doppler Imager (MDI)/1 minute high spatial resolution line-of-sight magnetograms to measure the photospheric flow velocity for three newly emerging bipolar active regions (ARs). We separately calculate the magnetic helicity injection rate of the leading and following polarities to confirm or refute the magnetic helicity asymmetry, found by Tian and Alexander using MDI/96 minute low spatial resolution magnetograms. Our results demonstrate that the magnetic helicity asymmetry is robust, being present in the three ARs studied, two of which have an observed balance of the magnetic flux. The magnetic helicity injection rate measured is found to depend little on the window size selected, but does depend on the time interval used between the two successive magnetograms being tracked. It is found that the measurement of the magnetic helicity injection rate performs well for a window size between 12 x 10 and 18 x 15 pixels and at a time interval Δt = 10 minutes. Moreover, the short-lived magnetic structures, 10-60 minutes, are found to contribute 30%-50% of the magnetic helicity injection rate. Comparing with the results calculated by MDI/96 minute data, we find that the MDI/96 minute data, in general, can outline the main trend of the magnetic properties, but they significantly underestimate the magnetic flux in strong field regions and are not appropriate for quantitative tracking studies, so provide a poor estimate of the amount of magnetic helicity injected into the corona.
Monetary benefits of preventing childhood lead poisoning with lead-safe window replacement.
Nevin, Rick; Jacobs, David E; Berg, Michael; Cohen, Jonathan
2008-03-01
Previous estimates of childhood lead poisoning prevention benefits have quantified the present value of some health benefits, but not the costs of lead paint hazard control or the benefits associated with housing and energy markets. Because older housing with lead paint constitutes the main exposure source today in the US, we quantify health benefits, costs, market value benefits, energy savings, and net economic benefits of lead-safe window replacement (which includes paint stabilization and other measures). The benefit per resident child from improved lifetime earnings alone is $21,195 in pre-1940 housing and $8685 in 1940-59 housing (in 2005 dollars). Annual energy savings are $130-486 per housing unit, with or without young resident children, with an associated increase in housing market value of $5900-14,300 per housing unit, depending on home size and number of windows replaced. Net benefits are $4490-5629 for each housing unit built before 1940, and $491-1629 for each unit built from 1940-1959, depending on home size and number of windows replaced. Lead-safe window replacement in all pre-1960 US housing would yield net benefits of at least $67 billion, which does not include many other benefits. These other benefits, which are shown in this paper, include avoided Attention Deficit Hyperactivity Disorder, other medical costs of childhood lead exposure, avoided special education, and reduced crime and juvenile delinquency in later life. In addition, such a window replacement effort would reduce peak demand for electricity, carbon emissions from power plants, and associated long-term costs of climate change.
Process Flow Features as a Host-Based Event Knowledge Representation
2012-06-14
an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from... [The remainder of this record is front-matter residue; its list of figures covers Davies-Bouldin and Dunn index results for sliding windows of size 5, 10, and 20 on Windows 7.]
Arai, Mamiko; Brandt, Vicky; Dabaghian, Yuri
2014-01-01
Learning arises through the activity of large ensembles of cells, yet most of the data neuroscientists accumulate is at the level of individual neurons; we need models that can bridge this gap. We have taken spatial learning as our starting point, computationally modeling the activity of place cells using methods derived from algebraic topology, especially persistent homology. We previously showed that ensembles of hundreds of place cells could accurately encode topological information about different environments (“learn” the space) within certain values of place cell firing rate, place field size, and cell population; we called this parameter space the learning region. Here we advance the model both technically and conceptually. To make the model more physiological, we explored the effects of theta precession on spatial learning in our virtual ensembles. Theta precession, which is believed to influence learning and memory, did in fact enhance learning in our model, increasing both speed and the size of the learning region. Interestingly, theta precession also increased the number of spurious loops during simplicial complex formation. We next explored how downstream readout neurons might define co-firing by grouping together cells within different windows of time and thereby capturing different degrees of temporal overlap between spike trains. Our model's optimum coactivity window correlates well with experimental data, ranging from ∼150–200 msec. We further studied the relationship between learning time, window width, and theta precession. Our results validate our topological model for spatial learning and open new avenues for connecting data at the level of individual neurons to behavioral outcomes at the neuronal ensemble level. Finally, we analyzed the dynamics of simplicial complex formation and loop transience to propose that the simplicial complex provides a useful working description of the spatial learning process. PMID:24945927
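The co-firing criterion used for the readout step can be made concrete: two cells are treated as coactive if they spike within the same coactivity window. The sketch below uses a 200 ms window, consistent with the ~150-200 ms optimum reported above; the spike trains themselves are synthetic.

```python
import numpy as np
from itertools import combinations

def coactive_pairs(spike_trains, window=0.2):
    """Return index pairs of cells that fire at least once within `window`
    seconds of each other. `spike_trains` is a list of sorted spike-time arrays."""
    pairs = []
    for i, j in combinations(range(len(spike_trains)), 2):
        a, b = spike_trains[i], spike_trains[j]
        # For each spike in a, locate the nearest spike in b and check the gap.
        idx = np.clip(np.searchsorted(b, a), 1, len(b) - 1)
        nearest = np.minimum(np.abs(a - b[idx - 1]), np.abs(a - b[idx]))
        if np.any(nearest <= window):
            pairs.append((i, j))
    return pairs

# Five synthetic place cells firing over a 60 s episode.
rng = np.random.default_rng(1)
trains = [np.sort(rng.uniform(0, 60, rng.integers(20, 60))) for _ in range(5)]
print(coactive_pairs(trains, window=0.2))
```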
Lombardo, Marco; Serrao, Sebastiano; Lombardo, Giuseppe
2014-01-01
Purpose To investigate the influence of various technical factors on the variation of cone packing density estimates in adaptive optics flood illuminated retinal images. Methods Adaptive optics images of the photoreceptor mosaic were obtained in fifteen healthy subjects. The cone density and Voronoi diagrams were assessed in sampling windows of 320×320 µm, 160×160 µm and 64×64 µm at 1.5 degree temporal and superior eccentricity from the preferred locus of fixation (PRL). The technical factors that have been analyzed included the sampling window size, the corrected retinal magnification factor (RMFcorr), the conversion from radial to linear distance from the PRL, the displacement between the PRL and foveal center and the manual checking of cone identification algorithm. Bland-Altman analysis was used to assess the agreement between cone density estimated within the different sampling window conditions. Results The cone density declined with decreasing sampling area and data between areas of different size showed low agreement. A high agreement was found between sampling areas of the same size when comparing density calculated with or without using individual RMFcorr. The agreement between cone density measured at radial and linear distances from the PRL and between data referred to the PRL or the foveal center was moderate. The percentage of Voronoi tiles with hexagonal packing arrangement was comparable between sampling areas of different size. The boundary effect, presence of any retinal vessels, and the manual selection of cones missed by the automated identification algorithm were identified as the factors influencing variation of cone packing arrangements in Voronoi diagrams. Conclusions The sampling window size is the main technical factor that influences variation of cone density. Clear identification of each cone in the image and the use of a large buffer zone are necessary to minimize factors influencing variation of Voronoi diagrams of the cone mosaic. PMID:25203681
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes, and, thus, allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
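The conventional estimator criticized above is simply the proportion of correct responses inside a moving trial window. A minimal sketch (the window length and synthetic data are arbitrary) makes the implicit constant-performance assumption visible: every trial inside a window contributes equally, however fast performance is actually changing.

```python
import numpy as np

def moving_window_learning_curve(responses, window=20):
    """Proportion of correct responses (0/1) within a moving window of trials --
    the conventional estimator, which tacitly assumes constant performance
    inside each window."""
    responses = np.asarray(responses, dtype=float)
    kernel = np.ones(window) / window
    # mode="same" returns one estimate per trial; edge windows are zero-padded.
    return np.convolve(responses, kernel, mode="same")

# Synthetic learning data: probability of a correct response ramps from 0.2 to 0.9.
rng = np.random.default_rng(42)
p_correct = np.linspace(0.2, 0.9, 300)
responses = rng.random(300) < p_correct
curve = moving_window_learning_curve(responses, window=20)
print(curve[:5].round(2), curve[-5:].round(2))
```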
ERIC Educational Resources Information Center
Kerzel, Dirk; Born, Sabine; Schonhammer, Josef
2012-01-01
A salient stimulus may interrupt visual search because of attentional capture. It has been shown that attentional capture occurs with a wide, but not with a small attentional window. We tested the hypothesis that capture depends more strongly on the shape of the attentional window than on its size. Search elements were arranged in two nested…
Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors.
Shoaib, Muhammad; Bosch, Stephan; Incel, Ozlem Durmaz; Scholten, Hans; Havinga, Paul J M
2016-03-24
The position of on-body motion sensors plays an important role in human activity recognition. Most often, mobile phone sensors at the trouser pocket or an equivalent position are used for this purpose. However, this position is not suitable for recognizing activities that involve hand gestures, such as smoking, eating, drinking coffee and giving a talk. To recognize such activities, wrist-worn motion sensors are used. However, these two positions are mainly used in isolation. To use richer context information, we evaluate three motion sensors (accelerometer, gyroscope and linear acceleration sensor) at both wrist and pocket positions. Using three classifiers, we show that the combination of these two positions outperforms the wrist position alone, mainly at smaller segmentation windows. Another problem is that less-repetitive activities, such as smoking, eating, giving a talk and drinking coffee, cannot be recognized easily at smaller segmentation windows unlike repetitive activities, like walking, jogging and biking. For this purpose, we evaluate the effect of seven window sizes (2-30 s) on thirteen activities and show how increasing window size affects these various activities in different ways. We also propose various optimizations to further improve the recognition of these activities. For reproducibility, we make our dataset publicly available.
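The segmentation step compared above (window sizes of 2-30 s) reduces to cutting fixed-length, possibly overlapping windows from the raw sensor stream. A sketch, assuming a 50 Hz sampling rate and 50% overlap (both assumptions, not values stated in the abstract):

```python
import numpy as np

def segment(signal, fs=50, window_s=10, overlap=0.5):
    """Split a (n_samples, n_channels) sensor stream into fixed-length windows.

    fs       -- sampling rate in Hz (assumed)
    window_s -- window length in seconds (the study compares 2-30 s)
    overlap  -- fraction of overlap between consecutive windows"""
    win = int(window_s * fs)
    step = max(1, int(win * (1 - overlap)))
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

# Five minutes of synthetic 3-axis accelerometer data at 50 Hz.
acc = np.random.randn(5 * 60 * 50, 3)
windows = segment(acc, fs=50, window_s=10, overlap=0.5)
print(windows.shape)   # (n_windows, samples_per_window, channels)
```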
A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume
NASA Astrophysics Data System (ADS)
Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration
2017-11-01
An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.
Critical androgen-sensitive periods of rat penis and clitoris development.
Welsh, Michelle; MacLeod, David J; Walker, Marion; Smith, Lee B; Sharpe, Richard M
2010-02-01
Androgen control of penis development/growth is unclear. In rats, androgen action in a foetal 'masculinisation programming window' (MPW; e15.5-e18.5) predetermines penile length and hypospadias occurrence. This has implications for humans (e.g. micropenis). Our studies aimed to establish in rats when androgen action/administration affects development/growth of the penis and if deficits in MPW androgen action were rescuable postnatally. Thus, pregnant rats were treated with flutamide during the MPW +/- postnatal testosterone propionate (TP) treatment. To assess penile growth responsiveness, rats were treated with TP in various time windows (late foetal, neonatal through early puberty, puberty onset, or combinations thereof). Phallus length, weight, and morphology, hypospadias and anogenital distance (AGD) were measured in mid-puberty (d25) or adulthood (d90) in males and females, plus serum testosterone in adult males. MPW flutamide exposure reduced adult penile length and induced hypospadias dose-dependently; this was not rescued by postnatal TP treatment. In normal rats, foetal (e14.5-e21.5) TP exposure did not affect male penis size but increased female clitoral size. In males, TP exposure from postnatal d1-24 or at puberty (d15-24) increased penile length at d25, but not ultimately in adulthood. Foetal + postnatal TP (e14-postnatal d24) increased penile size at d25 but reduced it at d90 (due to reduced endogenous testosterone). In females, this treatment caused the biggest increase in adult clitoral size but, unlike in males, phallus size was unaffected by TP during puberty (d15-24). Postnatal TP treatment advanced penile histology at d25 to more resemble adult histology. AGD strongly correlated with final penis length. It is concluded that adult penile size depends critically on androgen action during the MPW but subsequent growth depends on later androgen exposure. Foetal and/or postnatal TP exposure does not increase adult penile size above its 'predetermined' length though its growth towards this maximum is advanced by peripubertal TP treatment.
Study of the effects of condensation on the performance of Pioneer Venus probe windows
NASA Technical Reports Server (NTRS)
Testerman, M. K.
1974-01-01
The transmission loss of Pioneer Venus probe radiation windows was investigated for the case in which their exposed surfaces become contaminated with droplets of water, hydrochloric acid, sulfuric acid, or mercury, all of which may be found in the Venusian atmosphere. Transmission loss was studied as a function of the mass concentration of liquid droplets deposited on one surface of test window materials, for transmitting-radiation wavelengths in the range of 0.3 to 30 microns. The parameters that affect the transmittance of radiation through a window are: (1) particle size, (2) surface concentration of particles, (3) wavelength of the radiation, (4) angle of acceptance of the radiation by the detector, and (5) the refractive index of the aerosol.
Real-time camera-based face detection using a modified LAMSTAR neural network system
NASA Astrophysics Data System (ADS)
Girado, Javier I.; Sandin, Daniel J.; DeFanti, Thomas A.; Wolf, Laura K.
2003-03-01
This paper describes a cost-effective, real-time (640x480 at 30Hz) upright frontal face detector as part of an ongoing project to develop a video-based, tetherless 3D head position and orientation tracking system. The work is specifically targeted for auto-stereoscopic displays and projection-based virtual reality systems. The proposed face detector is based on a modified LAMSTAR neural network system. At the input stage, after achieving image normalization and equalization, a sub-window analyzes facial features using a neural network. The sub-window is segmented, and each part is fed to a neural network layer consisting of a Kohonen Self-Organizing Map (SOM). The output of the SOM neural networks are interconnected and related by correlation-links, and can hence determine the presence of a face with enough redundancy to provide a high detection rate. To avoid tracking multiple faces simultaneously, the system is initially trained to track only the face centered in a box superimposed on the display. The system is also rotationally and size invariant to a certain degree.
Ren, Chuancheng; Gao, Xuwen; Steinberg, Gary K.; Zhao, Heng
2009-01-01
Remote ischemic preconditioning is an emerging concept for stroke treatment, but its protection against focal stroke has not been established. We tested whether remote preconditioning, performed in the ipsilateral hind limb, protects against focal stroke and explored its protective parameters. Stroke was generated by a permanent occlusion of the left distal middle cerebral artery (MCA) combined with a 30 minute occlusion of the bilateral common carotid arteries (CCA) in male rats. Limb preconditioning was generated by 5 or 15 minute occlusion followed with the same period of reperfusion of the left hind femoral artery, and repeated for 2 or 3 cycles. Infarct was measured 2 days later. The results showed that rapid preconditioning with 3 cycles of 15 minutes performed immediately before stroke reduced infarct size from 47.7±7.6% of control ischemia to 9.8±8.6%; at 2 cycles of 15 minutes, infarct was reduced to 24.7±7.3%; at 2 cycles of 5 minutes, infarct was not reduced. Delayed preconditioning with 3 cycles of 15 minutes conducted 2 days before stroke also reduced infarct to 23.0 ±10.9%, but with 2 cycles of 15 minutes it offered no protection. The protective effects at these two therapeutic time windows of remote preconditioning are consistent with those of conventional preconditioning, in which the preconditioning ischemia is induced in the brain itself. Unexpectedly, intermediate preconditioning with 3 cycles of 15 minutes performed 12 hours before stroke also reduced infarct to 24.7±4.7%, which contradicts the current dogma for therapeutic time windows for the conventional preconditioning that has no protection at this time point. In conclusion, remote preconditioning performed in one limb protected against ischemic damage after focal cerebral ischemia. PMID:18201834
Is Latency to Test Deadline a Predictor of Student Test Performance?
ERIC Educational Resources Information Center
Landrum, R. Eric; Gurung, Regan A. R.
2013-01-01
When students are given a period or window of time to take an exam, is taking an exam earlier in the window (high latency to deadline) related to test scores? In Study 1, students (n = 236) were given windows of time to take online each of 13 quizzes and 4 exams. In Study 2, students (n = 251) similarly took 4 exams online within a test window. In…
Contrasting effects of climate on juvenile body size in a Southern Hemisphere passerine bird.
Kruuk, Loeske E B; Osmond, Helen L; Cockburn, Andrew
2015-08-01
Despite extensive research on the topic, it has been difficult to reach general conclusions as to the effects of climate change on morphology in wild animals: in particular, the effects of warming temperatures have been associated with increases, decreases or stasis in body size in different populations. Here, we use a fine-scale analysis of associations between weather and offspring body size in a long-term study of a wild passerine bird, the cooperatively breeding superb fairy-wren, in south-eastern Australia to show that such variation in the direction of associations occurs even within a population. Over the past 26 years, our study population has experienced increased temperatures, increased frequency of heatwaves and reduced rainfall - but the mean body mass of chicks has not changed. Despite the apparent stasis, mass was associated with weather across the previous year, but in multiple counteracting ways. Firstly, (i) chick mass was negatively associated with extremely recent heatwaves, but there were also positive associations with (ii) higher maximum temperatures and (iii) higher rainfall, both occurring in a period prior to and during the nesting period, and finally (iv) a longer-term negative association with higher maximum temperatures following the previous breeding season. Our results illustrate how a morphological trait may be affected by both short- and long-term effects of the same weather variable at multiple times of the year and that these effects may act in different directions. We also show that climate within the relevant time windows may not be changing in the same way, such that overall long-term temporal trends in body size may be minimal. Such complexity means that analytical approaches that search for a single 'best' window for one particular weather variable may miss other relevant information, and is also likely to make analyses of phenotypic plasticity and prediction of longer-term population dynamics difficult. © 2015 John Wiley & Sons Ltd.
A critical time window for organismal interactions in a pelagic ecosystem.
Benoit-Bird, Kelly J; McManus, Margaret A
2014-01-01
To measure organismal coherence in a pelagic ecosystem, we used moored sensors to describe the vertical dynamics of each step in the food chain in shelf waters off the west shore of Oahu, Hawaii. Horizontally extensive, intense aggregations of phytoplankton, zooplankton, and micronekton exhibited strong diel patterns in abundance and vertical distribution, resulting in a highly variable potential for interaction amongst trophic levels. Only around dusk did zooplankton layers overlap with phytoplankton layers. Shortly after sunset, micronekton ascended from the deep, aggregating on the island's shelf. Short-lived departures in migration patterns were detected in depth, vertical distribution, density, and total abundance of micronekton when zooplankton layers were present with typical patterns resuming within one hour. Layers of zooplankton began to disappear within 20 minutes of the arrival of micronekton with no layers present after 50 minutes. The effects of zooplankton layers cascaded even further up the food chain, affecting many behaviors of dolphins observed at dusk including their depth, group size, and inter-individual spacing. As a result of these changes in behavior, during a 30-minute window just after dusk, the number of feeding events observed for each dolphin and consequently the feeding time for each individual more than doubled when zooplankton layers were present. Dusk is a critical period for interactions amongst species in this system from phytoplankton to top predators. Our observations that short time windows can drive the structure and function of a complex suite of organisms highlight the importance of explicitly adding a temporal dimension at a scale relevant to individual organisms to our descriptions of heterogeneity in ocean ecosystems.
Hereford, Richard
2006-01-01
The software described here is used to process and analyze daily weather and surface-water data. The programs are refinements of earlier versions that include minor corrections and routines to calculate frequencies above a threshold on an annual or seasonal basis. Earlier versions of this software were used successfully to analyze historical precipitation patterns of the Mojave Desert and the southern Colorado Plateau regions, ecosystem response to climate variation, and variation of sediment-runoff frequency related to climate (Hereford and others, 2003; 2004; in press; Griffiths and others, 2006). The main program described here (Day_Cli_Ann_v5.3) uses daily data to develop a time series of various statistics for a user specified accounting period such as a year or season. The statistics include averages and totals, but the emphasis is on the frequency of occurrence in days of relatively rare weather or runoff events. These statistics are indices of climate variation; for a discussion of climate indices, see the Climate Research Unit website of the University of East Anglia (http://www.cru.uea.ac.uk/projects/stardex/) and the Climate Change Indices web site (http://cccma.seos.uvic.ca/ETCCDMI/indices.html). Specifically, the indices computed with this software are the frequency of high intensity 24-hour rainfall, unusually warm temperature, and unusually high runoff. These rare, or extreme events, are those greater than the 90th percentile of precipitation, streamflow, or temperature computed for the period of record of weather or gaging stations. If they cluster in time over several decades, extreme events may produce detectable change in the physical landscape and ecosystem of a given region. Although the software has been tested on a variety of data, as with any software, the user should carefully evaluate the results with their data. The programs were designed for the range of precipitation, temperature, and streamflow measurements expected in the semiarid Southwest United States. The user is encouraged to review the examples provided with the software. The software is written in Fortran 90 with Fortran 95 extensions and was compiled with the Digital Visual Fortran compiler version 6.6. The executables run on Windows 2000 and XP, and they operate in a MS-DOS console window that has only very simple graphical options such as font size and color, background color, and size of the window. Error trapping was not written into the programs. Typically, when an error occurs, the console window closes without a message.
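The central index computed by the program (the number of days per accounting period exceeding the 90th percentile of the station's period of record) is easy to restate. The original software is Fortran 90; the sketch below re-expresses the idea in Python with pandas, and the column names and synthetic data are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

def extreme_day_frequency(df, value_col="precip", date_col="date", q=0.90):
    """Count, per calendar year, the days on which `value_col` exceeds the
    q-th percentile computed over the full period of record."""
    df = df.copy()
    df[date_col] = pd.to_datetime(df[date_col])
    threshold = df[value_col].quantile(q)            # period-of-record percentile
    exceed = df[df[value_col] > threshold]
    return exceed.groupby(exceed[date_col].dt.year).size()

# Synthetic daily precipitation record, 1950-1999.
dates = pd.date_range("1950-01-01", "1999-12-31", freq="D")
rain = np.random.gamma(shape=0.3, scale=5.0, size=len(dates))
counts = extreme_day_frequency(pd.DataFrame({"date": dates, "precip": rain}))
print(counts.head())
```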
Baczkowski, Blazej M; Johnstone, Tom; Walter, Henrik; Erk, Susanne; Veer, Ilya M
2017-06-01
We evaluated whether sliding-window analysis can reveal functionally relevant brain network dynamics during a well-established fear conditioning paradigm. To this end, we tested if fMRI fluctuations in amygdala functional connectivity (FC) can be related to task-induced changes in physiological arousal and vigilance, as reflected in the skin conductance level (SCL). Thirty-two healthy individuals participated in the study. For the sliding-window analysis we used windows that were shifted by one volume at a time. Amygdala FC was calculated for each of these windows. Simultaneously acquired SCL time series were averaged over time frames that corresponded to the sliding-window FC analysis, which were subsequently regressed against the whole-brain seed-based amygdala sliding-window FC using the GLM. Surrogate time series were generated to test whether connectivity dynamics could have occurred by chance. In addition, results were contrasted against static amygdala FC and sliding-window FC of the primary visual cortex, which was chosen as a control seed, while a physio-physiological interaction (PPI) was performed as cross-validation. During periods of increased SCL, the left amygdala became more strongly coupled with the bilateral insula and anterior cingulate cortex, core areas of the salience network. The sliding-window analysis yielded a connectivity pattern that was unlikely to have occurred by chance, was spatially distinct from static amygdala FC and from sliding-window FC of the primary visual cortex, but was highly comparable to that of the PPI analysis. We conclude that sliding-window analysis can reveal functionally relevant fluctuations in connectivity in the context of an externally cued task. Copyright © 2017 Elsevier Inc. All rights reserved.
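The sliding-window step can be sketched as follows: a window is advanced one volume at a time, seed-to-voxel correlations are computed inside each window, and the SCL is averaged over the matching frames so the two series can later be regressed against each other. The window length below is an assumed value, not the one used in the study.

```python
import numpy as np

def sliding_window_fc(seed_ts, voxel_ts, scl, window=30):
    """Window-wise correlation between a seed time series and every voxel,
    plus the mean skin conductance level (SCL) over the same frames.

    seed_ts  -- (n_volumes,) seed (e.g. amygdala) time series
    voxel_ts -- (n_volumes, n_voxels) whole-brain time series
    scl      -- (n_volumes,) skin conductance resampled to the volume grid
    window   -- window length in volumes (assumed value)"""
    n = len(seed_ts)
    fc, scl_mean = [], []
    for start in range(n - window + 1):            # shift by one volume at a time
        sl = slice(start, start + window)
        seed = seed_ts[sl] - seed_ts[sl].mean()
        vox = voxel_ts[sl] - voxel_ts[sl].mean(axis=0)
        r = (seed @ vox) / (np.linalg.norm(seed) * np.linalg.norm(vox, axis=0))
        fc.append(r)
        scl_mean.append(scl[sl].mean())
    return np.array(fc), np.array(scl_mean)   # regress scl_mean onto fc per voxel

# Synthetic example: 200 volumes, 500 voxels.
rng = np.random.default_rng(0)
fc, scl_w = sliding_window_fc(rng.standard_normal(200),
                              rng.standard_normal((200, 500)),
                              rng.standard_normal(200))
print(fc.shape, scl_w.shape)
```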
Computed Tomography Window Blending: Feasibility in Thoracic Trauma.
Mandell, Jacob C; Wortman, Jeremy R; Rocha, Tatiana C; Folio, Les R; Andriole, Katherine P; Khurana, Bharti
2018-02-07
This study aims to demonstrate the feasibility of processing computed tomography (CT) images with a custom window blending algorithm that combines soft-tissue, bone, and lung window settings into a single image; to compare the time for interpretation of chest CT for thoracic trauma with window blending and conventional window settings; and to assess diagnostic performance of both techniques. Adobe Photoshop was scripted to process axial DICOM images from retrospective contrast-enhanced chest CTs performed for trauma with a window-blending algorithm. Two emergency radiologists independently interpreted the axial images from 103 chest CTs with both blended and conventional windows. Interpretation time and diagnostic performance were compared with Wilcoxon signed-rank test and McNemar test, respectively. Agreement with Nexus CT Chest injury severity was assessed with the weighted kappa statistic. A total of 13,295 images were processed without error. Interpretation was faster with window blending, resulting in a 20.3% time saving (P < .001), with no difference in diagnostic performance, within the power of the study to detect a difference in sensitivity of 5% as determined by post hoc power analysis. The sensitivity of the window-blended cases was 82.7%, compared to 81.6% for conventional windows. The specificity of the window-blended cases was 93.1%, compared to 90.5% for conventional windows. All injuries of major clinical significance (per Nexus CT Chest criteria) were correctly identified in all reading sessions, and all negative cases were correctly classified. All readers demonstrated near-perfect agreement with injury severity classification with both window settings. In this pilot study utilizing retrospective data, window blending allows faster preliminary interpretation of axial chest CT performed for trauma, with no significant difference in diagnostic performance compared to conventional window settings. Future studies would be required to assess the utility of window blending in clinical practice. Copyright © 2018 The Association of University Radiologists. All rights reserved.
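The specific blending algorithm scripted in Photoshop is not reproduced here, but the general idea of window blending can be sketched: map Hounsfield units through each window/level transform and combine the renderings. The window/level values below are common defaults and the equal-weight average is an assumption, not the study's algorithm.

```python
import numpy as np

def apply_window(hu, center, width):
    """Map Hounsfield units to [0, 1] display values for a given window/level."""
    lo = center - width / 2.0
    return np.clip((hu - lo) / width, 0.0, 1.0)

def blend_windows(hu, weights=(1/3, 1/3, 1/3)):
    """Blend soft-tissue, bone and lung window renderings into a single image.

    Window/level values are common defaults; equal weights are an assumption."""
    soft = apply_window(hu, center=40, width=400)
    bone = apply_window(hu, center=400, width=1800)
    lung = apply_window(hu, center=-600, width=1500)
    w1, w2, w3 = weights
    return w1 * soft + w2 * bone + w3 * lung

# Synthetic axial slice of Hounsfield units.
hu = np.random.randint(-1000, 1500, size=(512, 512)).astype(float)
blended = blend_windows(hu)
print(blended.min(), blended.max())
```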
NASA Astrophysics Data System (ADS)
Sato, Taketomo; Kaneshiro, Chinami; Okada, Hiroshi; Hasegawa, Hideki
1999-04-01
Attempts were made to form regular arrays of size- and position-controlled Pt dots on GaAs and InP by combining an in situ electrochemical process with electron beam (EB) lithography. This utilizes the precipitation of Pt nano-particles at the initial stage of electrodeposition. First, electrochemical conditions were optimized in the mode of self-assembled dot array formation on unpatterned substrates. Minimum in-plane dot diameters of 22 nm and 26 nm on GaAs and InP, respectively, were obtained under the optimal pulsed mode. Then, Pt dots were selectively formed on patterned substrates with open circular windows formed by EB lithography, thereby realizing dot-position control. The Pt dot was found to have been deposited at the center of each open window, and the in-plane diameter of the dot could be controlled by the number, width and period of the pulse waveform applied to the substrates. A minimum diameter of 20 nm was realized in windows with a diameter of 100 nm, using a single pulse. Current-voltage (I-V) measurements using an atomic force microscopy (AFM) system with a conductive probe indicated that each Pt dot/n-GaAs contact possessed a high Schottky barrier height of about 1 eV.
Surface Fitting Filtering of LIDAR Point Cloud with Waveform Information
NASA Astrophysics Data System (ADS)
Xing, S.; Li, P.; Xu, Q.; Wang, D.; Li, P.
2017-09-01
Full-waveform LiDAR is an active technology in photogrammetry and remote sensing. It provides more detailed information about objects along the path of a laser pulse than discrete-return topographic LiDAR. High-quality point clouds and waveform information can be obtained by waveform decomposition, which can contribute to accurate filtering. A surface fitting filtering method using waveform information is proposed to exploit this advantage. Firstly, the discrete point cloud and waveform parameters are resolved by globally convergent Levenberg-Marquardt decomposition. Secondly, ground seed points are selected, and abnormal ones among them are detected using waveform parameters and robust estimation. Thirdly, the terrain surface is fitted and the height difference threshold is determined in consideration of window size and mean square error. Finally, the points are classified gradually as the window size rises; the filtering process terminates once the window size exceeds a threshold. Waveform data from urban, farmland and mountain areas of "WATER (Watershed Allied Telemetry Experimental Research)" are selected for the experiments. The results show that, compared with the traditional method, the accuracy of point cloud filtering is further improved and the proposed method has high practical value.
The effect of low ceiling on the external combustion of the cabin fire
NASA Astrophysics Data System (ADS)
Su, Shichuan; Chen, Changyun; Wang, Liang; Wei, Chengyin; Cui, Haibing; Guo, Chengyu
2018-06-01
External combustion is a phenomenon in which the flame flares out of the window and burns outside. Because of the particular structure of a ship's cabin, external combustion poses a great danger. In this paper, three kinds of low-ceiling ship cabin fires are simulated and analyzed numerically based on the large eddy simulation technique. Through the analysis of temperature, flue gas velocity, heat flux density and other quantities, the external combustion phenomenon during fire development is characterized. The results show that when external combustion occurs, the amount of escaping fuel decreases with the roof height, while the temperature above the window increases with the height of the ceiling. The heat flux in the external combustion flame is provided mainly by radiation, with convection contributing only a small part; in the plume region there is a time period during which the convective heat flux exceeds the radiative heat flux, and this period increases with the ceiling height. Regardless of the ceiling height, external combustion will seriously damage the structure of the ship after a certain period of time. The velocity distributions for the three roof heights are similar, but the affected area grows with the ceiling height.
2015-01-01
Altitudinal clines in body size can result from the effects of natural and sexual selection on growth rates and developing times in seasonal environments. Short growing and reproductive seasons constrain the body size that adults can attain and their reproductive success. Little is known about the effects of altitudinal climatic variation on the diversification of Neotropical insects. In central Mexico, in addition to altitude, highly heterogeneous topography generates diverse climates that can occur even at the same latitude. Altitudinal variation and heterogeneous topography open an opportunity to test the relative impact of climatic variation on body size adaptations. In this study, we investigated the relationship between altitudinal climatic variation and body size, and the divergence rates of sexual size dimorphism (SSD) in Neotropical grasshoppers of the genus Sphenarium using a phylogenetic comparative approach. In order to distinguish the relative impact of natural and sexual selection on the diversification of the group, we also tracked the altitudinal distribution of the species and trends of both body size and SSD on the phylogeny of Sphenarium. The correlative evidence suggests no relationship between altitude and body size. However, larger species were associated with places having a warmer winter season in which the temporal window for development and reproduction can be longer. Nonetheless, the largest species were also associated with highly seasonal environments. Moreover, large body size and high levels of SSD have evolved independently several times throughout the history of the group and male body size has experienced a greater evolutionary divergence than females. These lines of evidence suggest that natural selection, associated with seasonality and sexual selection, on maturation time and body size could have enhanced the diversification of this insect group. PMID:26684616
Cheap streak camera based on the LD-S-10 intensifier tube
NASA Astrophysics Data System (ADS)
Dashevsky, Boris E.; Krutik, Mikhail I.; Surovegin, Alexander L.
1992-01-01
Basic properties of a new streak camera and its test results are reported. To intensify images on its screen, we employed modular G1 tubes, the LD-A-1.0 and LD-A-0.33, enabling magnification of 1.0 and 0.33, respectively. If necessary, the LD-A-0.33 tube may be substituted by any other image intensifier of the LD-A series, the choice to be determined by the size of the CCD matrix with fiber-optical windows. The reported camera employs a 12.5-mm-long CCD strip consisting of 1024 pixels, each 12 × 500 micrometers in size. Registered radiation was imaged on a 5 × 0.04 mm slit diaphragm tightly connected with the LD-S-10 fiber-optical input window. Electrons escaping the cathode are accelerated in a 5 kV electric field and focused onto a phosphor screen covering a fiber-optical plate as they travel between deflection plates. Sensitivity of the latter was 18 V/mm, which implies that the total deflecting voltage was 720 V per 40 mm of the screen surface, since reversed-polarity scan pulses +360 V and -360 V were applied across the deflection plate. The streak camera provides full scan times over the screen of 15, 30, 50, 100, 250, and 500 ns. Timing of the electrically or optically driven camera was done using a 10 ns step-controlled-delay (0 - 500 ns) circuit.
Letter-sound processing deficits in children with developmental dyslexia: An ERP study.
Moll, Kristina; Hasko, Sandra; Groth, Katharina; Bartling, Jürgen; Schulte-Körne, Gerd
2016-04-01
The time course during letter-sound processing was investigated in children with developmental dyslexia (DD) and typically developing (TD) children using electroencephalography. Thirty-eight children with DD and 25 TD children participated in a visual-auditory oddball paradigm. Event-related potentials (ERPs) elicited by standard and deviant stimuli in an early (100-190 ms) and late (560-750 ms) time window were analysed. In the early time window, ERPs elicited by the deviant stimulus were delayed and less left lateralized over fronto-temporal electrodes for children with DD compared to TD children. In the late time window, children with DD showed higher amplitudes extending more over right frontal electrodes. Longer latencies in the early time window and stronger right hemispheric activation in the late time window were associated with slower reading and naming speed. Additionally, stronger right hemispheric activation in the late time window correlated with poorer phonological awareness skills. Deficits in early stages of letter-sound processing influence later more explicit cognitive processes during letter-sound processing. Identifying the neurophysiological correlates of letter-sound processing and their relation to reading related skills provides insight into the degree of automaticity during letter-sound processing beyond behavioural measures of letter-sound-knowledge. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Liu, Shelley H; Bobb, Jennifer F; Lee, Kyu Ha; Gennings, Chris; Claus Henn, Birgit; Bellinger, David; Austin, Christine; Schnaas, Lourdes; Tellez-Rojo, Martha M; Hu, Howard; Wright, Robert O; Arora, Manish; Coull, Brent A
2018-07-01
The impact of neurotoxic chemical mixtures on children's health is a critical public health concern. It is well known that during early life, toxic exposures may impact cognitive function during critical time intervals of increased vulnerability, known as windows of susceptibility. Knowledge on time windows of susceptibility can help inform treatment and prevention strategies, as chemical mixtures may affect a developmental process that is operating at a specific life phase. There are several statistical challenges in estimating the health effects of time-varying exposures to multi-pollutant mixtures, such as: multi-collinearity among the exposures both within time points and across time points, and complex exposure-response relationships. To address these concerns, we develop a flexible statistical method, called lagged kernel machine regression (LKMR). LKMR identifies critical exposure windows of chemical mixtures, and accounts for complex non-linear and non-additive effects of the mixture at any given exposure window. Specifically, LKMR estimates how the effects of a mixture of exposures change with the exposure time window using a Bayesian formulation of a grouped, fused lasso penalty within a kernel machine regression (KMR) framework. A simulation study demonstrates the performance of LKMR under realistic exposure-response scenarios, and demonstrates large gains over approaches that consider each time window separately, particularly when serial correlation among the time-varying exposures is high. Furthermore, LKMR demonstrates gains over another approach that inputs all time-specific chemical concentrations together into a single KMR. We apply LKMR to estimate associations between neurodevelopment and metal mixtures in Early Life Exposures in Mexico and Neurotoxicology, a prospective cohort study of child health in Mexico City.
Thurlow, W R
1980-01-01
Messages were presented which moved from right to left along an electronic alphabetic display which was varied in "window" size from 4 through 32 letter spaces. Deaf subjects signed the messages they perceived. Relatively few errors were made even at the highest rate of presentation, which corresponded to a typing rate of 60 words/min. It is concluded that many deaf persons can make effective use of a small visual display. A reduced cost is then possible for visual communication instruments for these people through reduced display size. Deaf subjects who can profit from a small display can be located by a sentence test administered by tape recorder which drives the display of the communication device by means of the standard code of the deaf teletype network.
Absolute phase estimation: adaptive local denoising and global unwrapping.
Bioucas-Dias, Jose; Katkovnik, Vladimir; Astola, Jaakko; Egiazarian, Karen
2008-10-10
The paper attacks absolute phase estimation with a two-step approach: the first step applies an adaptive local denoising scheme to the modulo-2π noisy phase; the second step applies a robust phase unwrapping algorithm to the denoised modulo-2π phase obtained in the first step. The adaptive local modulo-2π phase denoising is a new algorithm based on local polynomial approximations. The zero-order and the first-order approximations of the phase are calculated in sliding windows of varying size. The zero-order approximation is used for pointwise adaptive window size selection, whereas the first-order approximation is used to filter the phase in the obtained windows. For phase unwrapping, we apply the recently introduced robust (in the sense of discontinuity preserving) PUMA unwrapping algorithm [IEEE Trans. Image Process. 16, 698 (2007)] to the denoised wrapped phase. Simulations give evidence that the proposed algorithm yields state-of-the-art performance, enabling strong noise attenuation while preserving image details. © 2008 Optical Society of America
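As a much-simplified illustration of the zero-order local estimate only (the pointwise adaptive window-size selection and the first-order refinement described above are omitted), one can average the unit phasor of the wrapped phase in a fixed window and take its angle; the wrapped-phase test image below is synthetic.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def zero_order_phase_denoise(wrapped, window=5):
    """Zero-order local estimate of a wrapped (modulo-2*pi) phase image:
    average the unit phasor exp(1j*phase) over a window and take its angle.
    This is only the simplest ingredient of the adaptive scheme."""
    phasor = np.exp(1j * wrapped)
    smoothed = (uniform_filter(phasor.real, size=window)
                + 1j * uniform_filter(phasor.imag, size=window))
    return np.angle(smoothed)              # still wrapped to (-pi, pi]

# Synthetic noisy wrapped ramp.
x, y = np.meshgrid(np.linspace(0, 20, 256), np.linspace(0, 20, 256))
true_phase = 0.8 * x + 0.3 * y
noisy = np.angle(np.exp(1j * (true_phase + 0.5 * np.random.randn(256, 256))))
denoised = zero_order_phase_denoise(noisy, window=7)
print(denoised.shape)
```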
Technologies for precision manufacture of current and future windows and domes
NASA Astrophysics Data System (ADS)
Hallock, Bob; Shorey, Aric
2009-05-01
The final finish and characterization of windows and domes presents a number of challenges in achieving desired precision with acceptable cost and schedule. This becomes more difficult with advanced materials and as window and dome shapes and requirements become more complex, including acute angle corners, transmitted wavefront specifications, aspheric geometries and trending toward conformal surfaces. Magnetorheological Finishing (MRF®) and Magnetorheological Jet (MR Jet®), along with metrology provided by Sub-aperture Stitching Interferometry (SSI®) have several unique attributes that provide them advantages in enhancing fabrication of current and next generation windows and domes. The advantages that MRF brings to the precision finishing of a wide range of shapes such as flats, spheres (including hemispheres), cylinders, aspheres and even freeform optics, has been well documented. Recent advancements include the ability to finish freeform shapes up to 2-meters in size as well as progress in finishing challenging IR materials. Due to its shear-based removal mechanism in contrast to the pressure-based process of other techniques, edges are not typically rolled, in particular on parts with acute angle corners. MR Jet provides additional benefits, particularly in the finishing of the inside of steep concave domes and other irregular shapes. The ability of MR Jet to correct the figure of conformal domes deterministically and to high precision has been demonstrated. Combining these technologies with metrology techniques, such as SSI provides a solution for finishing current and future windows and domes in a reliable, deterministic and cost-effective way. The ability to use the SSI to characterize a range of shapes such as domes and aspheres, as well as progress in using MRF and MR Jet for finishing conventional and conformal windows and domes with increasing size and complexity of design will be presented.
Voss, Susan E.; Rosowski, John J.; Merchant, Saumil N.; Peake, William T.
2008-01-01
Direct acoustic stimulation of the cochlea by the sound-pressure difference between the oval and round windows (called the “acoustic route”) has been thought to contribute to hearing in some pathological conditions, along with the normally dominant “ossicular route.” To determine the efficacy of this acoustic route and its constituent mechanisms in human ears, sound pressures were measured at three locations in cadaveric temporal bones [with intact and perforated tympanic membranes (TMs)]: (1) in the external ear canal lateral to the TM, P_TM; (2) in the tympanic cavity lateral to the oval window, P_OW; and (3) near the round window, P_RW. Sound transmission via the acoustic route is described by two concatenated processes: (1) coupling of sound pressure from ear canal to middle-ear cavity, H_PCAV ≡ P_CAV/P_TM, where P_CAV represents the middle-ear cavity pressure, and (2) sound-pressure difference between the windows, H_WPD ≡ (P_OW − P_RW)/P_CAV. Results show that: H_PCAV depends on perforation size but not perforation location; H_WPD depends on neither perforation size nor location. The results (1) provide a description of the window pressures based on measurements, (2) refute the common otological view that TM perforation location affects the “relative phase of the pressures at the oval and round windows,” and (3) show with an intact ossicular chain that acoustic-route transmission is substantially below ossicular-route transmission except for low frequencies with large perforations. Thus, hearing loss from TM perforations results primarily from reduction in sound coupling via the ossicular route. Some features of the frequency dependence of H_PCAV and H_WPD can be interpreted in terms of a structure-based lumped-element acoustic model of the perforation and middle-ear cavities. PMID:17902851
Tate, Kevin B.; Rhen, Turk; Eme, John; Kohl, Zachary F.; Crossley, Janna; Elsey, Ruth M.
2016-01-01
During embryonic development, environmental perturbations can affect organisms' developing phenotype, a process known as developmental plasticity. Resulting phenotypic changes can occur during discrete, critical windows of development. Critical windows are periods when developing embryos are most susceptible to these perturbations. We have previously documented that hypoxia reduces embryo size and increases relative heart mass in American alligator, and this study identified critical windows when hypoxia altered morphological, cardiovascular function and cardiac gene expression of alligator embryos. We hypothesized that incubation in hypoxia (10% O2) would increase relative cardiac size due to cardiac enlargement rather than suppression of somatic growth. We exposed alligator embryos to hypoxia during discrete incubation periods to target windows where the embryonic phenotype is altered. Hypoxia affected heart growth between 20 and 40% of embryonic incubation, whereas somatic growth was affected between 70 and 90% of incubation. Arterial pressure was depressed by hypoxic exposure during 50–70% of incubation, whereas heart rate was depressed in embryos exposed to hypoxia during a period spanning 70–90% of incubation. Expression of Vegf and PdgfB was increased in certain hypoxia-exposed embryo treatment groups, and hypoxia toward the end of incubation altered β-adrenergic tone for arterial pressure and heart rate. It is well known that hypoxia exposure can alter embryonic development, and in the present study, we have identified brief, discrete windows that alter the morphology, cardiovascular physiology, and gene expression in embryonic American alligator. PMID:27101296
Reading direction and the central perceptual span in Urdu and English.
Paterson, Kevin B; McGowan, Victoria A; White, Sarah J; Malik, Sameen; Abedipour, Lily; Jordan, Timothy R
2014-01-01
Normal reading relies on the reader making a series of saccadic eye movements along lines of text, separated by brief fixational pauses during which visual information is acquired from a region of text. In English and other alphabetic languages read from left to right, the region from which useful information is acquired during each fixational pause is generally reported to extend further to the right of each fixation than to the left. However, the asymmetry of the perceptual span for alphabetic languages read in the opposite direction (i.e., from right to left) has received much less attention. Accordingly, in order to more fully investigate the asymmetry in the perceptual span for these languages, the present research assessed the influence of reading direction on the perceptual span for bilingual readers of Urdu and English. Text in Urdu and English was presented either entirely as normal or in a gaze-contingent moving-window paradigm in which a region of text was displayed as normal at the reader's point of fixation and text outside this region was obscured. The windows of normal text extended symmetrically 0.5° of visual angle to the left and right of fixation, or asymmetrically by increasing the size of each window to 1.5° or 2.5° to either the left or right of fixation. When participants read English, performance for the window conditions was superior when windows extended to the right. However, when reading Urdu, performance was superior when windows extended to the left, and was essentially the reverse of that observed for English. These findings provide a novel indication that the perceptual span is modified by the language being read to produce an asymmetry in the direction of reading and show for the first time that such an asymmetry occurs for reading Urdu.
A test of multiple correlation temporal window characteristic of non-Markov processes
NASA Astrophysics Data System (ADS)
Arecchi, F. T.; Farini, A.; Megna, N.
2016-03-01
We introduce a sensitive test of memory effects in successive events. The test consists of a combination K of binary correlations at successive times. K decays monotonically from K = 1, the value for uncorrelated events such as a Markov process. For a monotonic memory fading, K < 1 always. Here we report evidence of a K > 1 temporal window in cognitive tasks consisting of the visual identification of the front face of the Necker cube after a previous presentation of the same. We speculate that memory effects provide a temporal window with K > 1, and this experiment could be a first step towards a better comprehension of this phenomenon. The K > 1 behaviour is maximal at an inter-measurement time τ around 2 s, with inter-subject differences. The K > 1 behaviour persists over a time window of 1 s around τ; outside this window the K < 1 behaviour is recovered. The universal occurrence of a K > 1 window in pairs of successive perceptions suggests that, at variance with single visual stimuli eliciting a suitable response, a pair of stimuli shortly separated in time displays mutual correlations.
Bhure, U N; Lardinois, D; Kalff, V; Hany, T F; Soltermann, A; Seifert, B; Steinert, H C
2010-10-01
Accurate determination of tumour size in lung adenocarcinoma with bronchoalveolar features (BAC) is important for the determination of TNM (tumour, nodes, metastasis) scores used in staging, prognosis and therapy response assessment. However, tumour sizes derived using lung window (LW) CT or soft-tissue/mediastinal window (MW) CT often give different results. This study examines which measurement correlates best with actual tumour size and which best identifies advanced disease. This retrospective study included 43 BAC patients who underwent surgical resection with mediastinal lymphadenectomy <4 weeks post CT scan. The largest unidimensional tumour diameter on each CT window was compared with actual histopathological tumour size (HP). LW, MW and HP size measurements and a recently described CT parameter - the modified tumour shadow disappearance rate (mTDR) = (1 - [MW/LW]) - were then used to determine which parameter best discriminated between the presence or absence of advanced disease. There was no difference between HP and LW sizes, but MW significantly underestimated HP size (p<0.0001). Unlike MW (p = 0.01) and mTDR (p = 0.001), neither HP (p = 0.14) nor LW (p = 0.10) distinguished between patients with or without advanced disease. On receiver operating characteristic (ROC) analysis at a cut-off of ≤0.13, the sensitivity and specificity of mTDR for detecting advanced disease were 69% and 89%, respectively. In patients with tumours ≤3 cm, only mTDR remained a significant predictor of advanced disease (p = 0.017), with best cut-off at ≤0.20, giving a sensitivity and specificity of 71% and 94%, respectively. MW better predicts advanced disease than LW and might also need to be recorded for RECIST (response evaluation criteria in solid tumours) assessment for T staging of BAC; however, mTDR appears to be an even better predictor and should also be used.
Improving the phase measurement by the apodization filter in the digital holography
NASA Astrophysics Data System (ADS)
Chang, Shifeng; Wang, Dayong; Wang, Yunxin; Zhao, Jie; Rong, Lu
2012-11-01
Due to the finite size of the hologram aperture in digital holography, high-frequency intensity and phase fluctuations appear along the edges of the images, which reduce the precision of phase measurement. In this paper, apodization filters are applied to improve the phase measurement in digital holography. Firstly, the experimental setup of the lensless Fourier transform digital holography is built, where the sample is a standard phase grating with a grating constant of 300 μm and a depth of 150 nm. Then, apodization filters with three kinds of window functions (the Tukey, Hanning, and Blackman windows) are applied to the phase measurement of the sample. Finally, the results are compared with the reference data given by a commercial white-light interferometer. It is shown that aperture diffraction effects can be reduced by digital apodization, and the phase measurement with apodization is more accurate than in the unapodized case. Meanwhile, the Blackman window function produces a better effect than the other two window functions in the measurement of the standard phase grating.
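As an illustration of the apodization step described above, the sketch below applies a separable 2-D window (Tukey, Hanning, or Blackman) to a hologram array before numerical reconstruction. This is a minimal sketch assuming NumPy/SciPy; the function name `apodize` and its parameters are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.signal.windows import tukey, hann, blackman

def apodize(hologram, kind="blackman", alpha=0.5):
    """Multiply a rectangular hologram aperture by a separable 2-D window
    to suppress edge diffraction before numerical reconstruction."""
    ny, nx = hologram.shape
    if kind == "tukey":
        wy, wx = tukey(ny, alpha), tukey(nx, alpha)
    elif kind == "hanning":
        wy, wx = hann(ny), hann(nx)
    else:                      # default: Blackman, the best performer above
        wy, wx = blackman(ny), blackman(nx)
    return hologram * np.outer(wy, wx)
```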
Impact of floating windows on the accuracy of depth perception in games
NASA Astrophysics Data System (ADS)
Stanfield, Brodie; Zerebecki, Christopher; Hogue, Andrew; Kapralos, Bill; Collins, Karen
2013-03-01
The floating window technique is commonly employed by stereoscopic 3D filmmakers to reduce the effects of window violations by masking out portions of the screen that contain visual information that doesn't exist in one of the views. Although widely adopted in the film industry, and despite its potential benefits, the technique has not been adopted by video game developers to the same extent possibly because of the lack of understanding of how the floating window can be utilized in such an interactive medium. Here, we describe a quantitative study that investigates how the floating window technique affects users' depth perception in a simple game-like environment. Our goal is to determine how various stereoscopic 3D parameters such as the existence, shape, and size of the floating window affect the user experience and to devise a set of guidelines for game developers wishing to develop stereoscopic 3D content. Providing game designers with quantitative knowledge of how these parameters can affect user experience is invaluable when choosing to design interactive stereoscopic 3D content.
Jordan, James E; Pereira, Beatriz D; Lane, Magan R; Morykwas, Michael J; McGee, Maria; Argenta, Louis C
2015-08-01
Myocardial ischemia-reperfusion injury is known to trigger an inflammatory response involving edema, apoptosis, and neutrophil activation/accumulation. Recently, mechanical tissue resuscitation (MTR) was described as a potent cardioprotective strategy for reduction of myocardial ischemia-reperfusion injury. Here, we further describe the protective actions of MTR and begin to define its therapeutic window. A left ventricular, free-wall ischemic area was created in anesthetized swine for 85 minutes and then reperfused for three hours. Animals were randomized to two groups: (1) untreated controls (Control) and (2) application of MTR that was delayed 90 minutes after the initiation of reperfusion (D90). Hemodynamics and regional myocardial blood flow were assessed at multiple time points. Infarct size and neutrophil accumulation were assessed following the reperfusion period. In separate cohorts, the effect of MTR on myocardial interstitial water (MRI imaging) and blood flow was examined. Both groups had similar areas at risk (AAR), hemodynamics, and arterial blood gas values. MTR, even when delayed 90 minutes into reperfusion (D90, 29.2 ± 5.0% of AAR), reduced infarct size significantly compared to Controls (51.9 ± 2.7%, p = 0.006). This protection was associated with a 33% decrease in neutrophil accumulation (p = 0.047). Improvements in blood flow and interstitial water were also observed. Moreover, we demonstrated that the therapeutic window for MTR lasts for at least 90 minutes following reperfusion. This study confirms our previous observations that MTR is an effective therapeutic approach to reducing reperfusion injury with a clinically useful treatment window. © 2015 Wiley Periodicals, Inc.
Addressing scale dependence in roughness and morphometric statistics derived from point cloud data.
NASA Astrophysics Data System (ADS)
Buscombe, D.; Wheaton, J. M.; Hensleigh, J.; Grams, P. E.; Welcker, C. W.; Anderson, K.; Kaplinski, M. A.
2015-12-01
The heights of natural surfaces can be measured with such spatial density that almost the entire spectrum of physical roughness scales can be characterized, down to the morphological form and grain scales. With an ability to measure 'microtopography' comes a demand for analytical/computational tools for spatially explicit statistical characterization of surface roughness. Detrended standard deviation of surface heights is a popular means to create continuous maps of roughness from point cloud data, using moving windows and reporting window-centered statistics of variations from a trend surface. If 'roughness' is the statistical variation in the distribution of relief of a surface, then 'texture' is the frequency of change and spatial arrangement of roughness. The variance in surface height as a function of frequency obeys a power law. In consequence, roughness is dependent on the window size through which it is examined, which has a number of potential disadvantages: 1) the choice of window size becomes crucial, and obstructs comparisons between data; 2) if windows are large relative to multiple roughness scales, it is harder to discriminate between those scales; 3) if roughness is not scaled by the texture length scale, information on the spacing and clustering of roughness `elements' can be lost; and 4) such practice is not amenable to models describing the scattering of light and sound from rough natural surfaces. We discuss the relationship between roughness and texture. Some useful parameters which scale vertical roughness to characteristic horizontal length scales are suggested, with examples of bathymetric point clouds obtained using multibeam from two contrasting riverbeds, namely those of the Colorado River in Grand Canyon, and the Snake River in Hells Canyon. Such work, aside from automated texture characterization and texture segmentation, roughness and grain size calculation, might also be useful for feature detection and classification from point clouds.
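The window-centered, detrended standard deviation mentioned above can be sketched as follows for a gridded surface. This is a minimal illustration assuming NumPy, a plane as the trend surface, and a fixed square window; it is not the authors' implementation.

```python
import numpy as np

def detrended_roughness(z, win=5, dx=1.0, dy=1.0):
    """Window-centered detrended standard deviation of a gridded surface z.
    In each (2*win+1)^2 window a best-fit plane is removed before taking the std."""
    H, W = z.shape
    out = np.full(z.shape, np.nan, dtype=float)
    ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
    A = np.column_stack([xs.ravel() * dx, ys.ravel() * dy, np.ones(xs.size)])
    for i in range(win, H - win):
        for j in range(win, W - win):
            patch = z[i - win:i + win + 1, j - win:j + win + 1].ravel()
            coef, *_ = np.linalg.lstsq(A, patch, rcond=None)
            out[i, j] = np.std(patch - A @ coef)   # roughness about the trend plane
    return out
```

As the abstract notes, the result depends on the chosen window size, so comparisons require either a common window or a scaling by a characteristic texture length.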
Final Results of Shuttle MMOD Impact Database
NASA Technical Reports Server (NTRS)
Hyde, J. L.; Christiansen, E. L.; Lear, D. M.
2015-01-01
The Shuttle Hypervelocity Impact Database documents damage features on each Orbiter thought to be from micrometeoroids (MM) or orbital debris (OD). Data is divided into tables for crew module windows, payload bay door radiators and thermal protection systems along with other miscellaneous regions. The combined number of records in the database is nearly 3000. Each database record provides impact feature dimensions, location on the vehicle and relevant mission information. Additional detail on the type and size of particle that produced the damage site is provided when sampling data and definitive spectroscopic analysis results are available. Guidelines are described which were used in determining whether impact damage is from micrometeoroid or orbital debris impact based on the findings from scanning electron microscopy chemical analysis. Relationships assumed when converting from observed feature sizes in different shuttle materials to particle sizes will be presented. A small number of significant impacts on the windows, radiators and wing leading edge will be highlighted and discussed in detail, including the hypervelocity impact testing performed to estimate particle sizes that produced the damage.
NASA Astrophysics Data System (ADS)
Ohsuka, Shinji; Ohba, Akira; Onoda, Shinobu; Nakamoto, Katsuhiro; Nakano, Tomoyasu; Miyoshi, Motosuke; Soda, Keita; Hamakubo, Takao
2014-09-01
We constructed a laboratory-size three-dimensional water window x-ray microscope that combines wide-field transmission x-ray microscopy with tomographic reconstruction techniques, and observed bio-medical samples to evaluate its applicability to life science research fields. It consists of a condenser and an objective grazing incidence Wolter type I mirror, an electron-impact type oxygen Kα x-ray source, and a back-illuminated CCD for x-ray imaging. A spatial resolution limit of around 1.0 line pairs per micrometer was obtained for two-dimensional transmission images, and 1-μm scale three-dimensional fine structures were resolved.
Ohsuka, Shinji; Ohba, Akira; Onoda, Shinobu; Nakamoto, Katsuhiro; Nakano, Tomoyasu; Miyoshi, Motosuke; Soda, Keita; Hamakubo, Takao
2014-09-01
We constructed a laboratory-size three-dimensional water window x-ray microscope that combines wide-field transmission x-ray microscopy with tomographic reconstruction techniques, and observed bio-medical samples to evaluate its applicability to life science research fields. It consists of a condenser and an objective grazing incidence Wolter type I mirror, an electron-impact type oxygen Kα x-ray source, and a back-illuminated CCD for x-ray imaging. A spatial resolution limit of around 1.0 line pairs per micrometer was obtained for two-dimensional transmission images, and 1-μm scale three-dimensional fine structures were resolved.
Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors
Shoaib, Muhammad; Bosch, Stephan; Incel, Ozlem Durmaz; Scholten, Hans; Havinga, Paul J. M.
2016-01-01
The position of on-body motion sensors plays an important role in human activity recognition. Most often, mobile phone sensors at the trouser pocket or an equivalent position are used for this purpose. However, this position is not suitable for recognizing activities that involve hand gestures, such as smoking, eating, drinking coffee and giving a talk. To recognize such activities, wrist-worn motion sensors are used. However, these two positions are mainly used in isolation. To use richer context information, we evaluate three motion sensors (accelerometer, gyroscope and linear acceleration sensor) at both wrist and pocket positions. Using three classifiers, we show that the combination of these two positions outperforms the wrist position alone, mainly at smaller segmentation windows. Another problem is that less-repetitive activities, such as smoking, eating, giving a talk and drinking coffee, cannot be recognized easily at smaller segmentation windows unlike repetitive activities, like walking, jogging and biking. For this purpose, we evaluate the effect of seven window sizes (2–30 s) on thirteen activities and show how increasing window size affects these various activities in different ways. We also propose various optimizations to further improve the recognition of these activities. For reproducibility, we make our dataset publicly available. PMID:27023543
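A minimal sketch (assuming NumPy) of the fixed-length segmentation windows discussed above; the window length, 50% overlap, and features shown are generic illustrative choices rather than the study's exact settings.

```python
import numpy as np

def segment(signal, fs, win_s=10.0, overlap=0.5):
    """Split a 1-D sensor stream into fixed-length windows (e.g. 2-30 s)."""
    size = int(win_s * fs)
    step = max(1, int(size * (1 - overlap)))
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def basic_features(window):
    """Simple per-window features: mean, standard deviation, mean absolute jerk."""
    w = np.asarray(window, dtype=float)
    return [w.mean(), w.std(), np.abs(np.diff(w)).mean()]
```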
Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data
Kim, Sehwi; Jung, Inkyung
2017-01-01
The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed to use the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adopt the Gini coefficient to be applicable to the spatial scan statistic for ordinal data to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked the optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns. PMID:28753674
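For reference, the Gini coefficient itself can be computed as below (a minimal NumPy sketch). How the coefficient is applied to the collection of candidate maximum reported cluster sizes for the ordinal scan statistic follows the modification described in the paper and is not reproduced here.

```python
import numpy as np

def gini(values):
    """Gini coefficient of non-negative values (0 = perfectly even, 1 = fully concentrated)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    cum = np.cumsum(v)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n
```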
Yurtkuran, Alkın; Emel, Erdal
2014-01-01
The traveling salesman problem with time windows (TSPTW) is a variant of the traveling salesman problem in which each customer should be visited within a given time window. In this paper, we propose an electromagnetism-like algorithm (EMA) that uses a new constraint handling technique to minimize the travel cost in TSPTW problems. The EMA utilizes the attraction-repulsion mechanism between charged particles in a multidimensional space for global optimization. This paper investigates the problem-specific constraint handling capability of the EMA framework using a new variable bounding strategy, in which real-coded particle's boundary constraints associated with the corresponding time windows of customers, is introduced and combined with the penalty approach to eliminate infeasibilities regarding time window violations. The performance of the proposed algorithm and the effectiveness of the constraint handling technique have been studied extensively, comparing it to that of state-of-the-art metaheuristics using several sets of benchmark problems reported in the literature. The results of the numerical experiments show that the EMA generates feasible and near-optimal results within shorter computational times compared to the test algorithms. PMID:24723834
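A minimal sketch of the penalty-based handling of time-window violations that such approaches combine with the search itself: arrival before a customer's window incurs waiting, arrival after it adds a penalty term to the tour cost. All names and the penalty weight are illustrative and not the paper's bounding strategy.

```python
def tour_cost(tour, travel, windows, service=0.0, penalty=1e3):
    """Travel cost of a TSPTW tour plus a penalty for time-window violations."""
    t, cost, violation = 0.0, 0.0, 0.0
    for i, j in zip(tour[:-1], tour[1:]):
        cost += travel[i][j]
        t += travel[i][j]
        earliest, latest = windows[j]
        if t < earliest:          # wait until the window opens
            t = earliest
        elif t > latest:          # lateness is penalized, not forbidden
            violation += t - latest
        t += service
    return cost + penalty * violation
```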
2013-03-01
interacted with (15). 4.3.3 Experimental Procedure Two MgAl2O4 spinel samples with nominal 0.6- and 1.6-μm mean grain sizes were tested using advanced...unable to make specific quantitative predictions at this time. Due to the nature of the experimental process, this technique is suitable only for...Information From Spherical Indentation; ARL-TR-4229; U.S. Army Research Laboratory: Aberdeen Proving Ground, MD, 2007. 24. ASTM E112. Standard Test
75 FR 11841 - Repowering Assistance Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... application window. SUMMARY: RBS is announcing a new application window to submit applications for the...-time application window for remaining FY 2009 funds. Paperwork Reduction Act In accordance with the... allocate all of the FY 2009 authorized funds. Therefore, the Agency is opening a new application window to...
Mapping conduction velocity of early embryonic hearts with a robust fitting algorithm
Gu, Shi; Wang, Yves T; Ma, Pei; Werdich, Andreas A; Rollins, Andrew M; Jenkins, Michael W
2015-01-01
Cardiac conduction maturation is an important and integral component of heart development. Optical mapping with voltage-sensitive dyes allows sensitive measurements of electrophysiological signals over the entire heart. However, accurate measurements of conduction velocity during early cardiac development is typically hindered by low signal-to-noise ratio (SNR) measurements of action potentials. Here, we present a novel image processing approach based on least squares optimizations, which enables high-resolution, low-noise conduction velocity mapping of smaller tubular hearts. First, the action potential trace measured at each pixel is fit to a curve consisting of two cumulative normal distribution functions. Then, the activation time at each pixel is determined based on the fit, and the spatial gradient of activation time is determined with a two-dimensional (2D) linear fit over a square-shaped window. The size of the window is adaptively enlarged until the gradients can be determined within a preset precision. Finally, the conduction velocity is calculated based on the activation time gradient, and further corrected for three-dimensional (3D) geometry that can be obtained by optical coherence tomography (OCT). We validated the approach using published activation potential traces based on computer simulations. We further validated the method by adding artificially generated noise to the signal to simulate various SNR conditions using a curved simulated image (digital phantom) that resembles a tubular heart. This method proved to be robust, even at very low SNR conditions (SNR = 2-5). We also established an empirical equation to estimate the maximum conduction velocity that can be accurately measured under different conditions (e.g. sampling rate, SNR, and pixel size). Finally, we demonstrated high-resolution conduction velocity maps of the quail embryonic heart at a looping stage of development. PMID:26114034
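The activation-time gradient step can be illustrated with a fixed-size window (a minimal NumPy sketch): a plane is least-squares fitted to the local activation times and the speed is the reciprocal of the gradient magnitude. The adaptive window enlargement, the cumulative-normal curve fitting of the action potentials, and the 3D geometric correction described above are omitted.

```python
import numpy as np

def conduction_velocity(act_time, px_size, win=5):
    """Local conduction velocity from an activation-time map.
    A plane T ~ a*x + b*y + c is fit in a (2*win+1)^2 window; speed = 1/|grad T|."""
    H, W = act_time.shape
    speed = np.full((H, W), np.nan)
    ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
    A = np.column_stack([xs.ravel() * px_size, ys.ravel() * px_size,
                         np.ones(xs.size)])
    for y in range(win, H - win):
        for x in range(win, W - win):
            T = act_time[y - win:y + win + 1, x - win:x + win + 1].ravel()
            (a, b, c), *_ = np.linalg.lstsq(A, T, rcond=None)
            g = np.hypot(a, b)            # |grad T|, e.g. ms per mm
            if g > 0:
                speed[y, x] = 1.0 / g     # e.g. mm per ms
    return speed
```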
Millisecond timing on PCs and Macs.
MacInnes, W J; Taylor, T L
2001-05-01
A real-time, object-oriented solution for displaying stimuli on Windows 95/98, MacOS and Linux platforms is presented. The program, written in C++, utilizes a special-purpose window class (GLWindow), OpenGL, and 32-bit graphics acceleration; it avoids display timing uncertainty by substituting the new window class for the default window code for each system. We report the outcome of tests for real-time capability across PC and Mac platforms running a variety of operating systems. The test program, which can be used as a shell for programming real-time experiments and testing specific processors, is available at http://www.cs.dal.ca/~macinnwj. We propose to provide researchers with a sense of the usefulness of our program, highlight the ability of many multitasking environments to achieve real time, as well as caution users about systems that may not achieve real time, even under optimal conditions.
Exclusive queueing model including the choice of service windows
NASA Astrophysics Data System (ADS)
Tanaka, Masahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro
2018-01-01
In a queueing system involving multiple service windows, choice behavior is a significant concern. This paper incorporates the choice of service windows into a queueing model with a floor represented by discrete cells. We contrived a logit-based choice algorithm for agents considering the numbers of agents and the distances to all service windows. Simulations were conducted with various parameters of agent choice preference for these two elements and for different floor configurations, including the floor length and the number of service windows. We investigated the model from the viewpoint of transit times and entrance block rates. The influences of the parameters on these factors were surveyed in detail and we determined that there are optimum floor lengths that minimize the transit times. In addition, we observed that the transit times were determined almost entirely by the entrance block rates. The results of the presented model are relevant to understanding queueing systems including the choice of service windows and can be employed to optimize facility design and floor management.
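A minimal sketch of a logit (softmax) window-choice rule of the kind described above, assuming NumPy; the utility weights for queue length and distance are illustrative parameters, not the paper's calibrated values.

```python
import numpy as np

def choose_window(queue_lengths, distances, beta_q=1.0, beta_d=1.0, rng=None):
    """Logit choice over service windows: utility falls with the number of
    agents already queued and with the distance to each window."""
    rng = np.random.default_rng() if rng is None else rng
    u = -beta_q * np.asarray(queue_lengths) - beta_d * np.asarray(distances)
    p = np.exp(u - u.max())
    p /= p.sum()
    return rng.choice(len(p), p=p), p
```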
Large Area Flat Panel Imaging Detectors for Astronomy and Night Time Sensing
NASA Astrophysics Data System (ADS)
Siegmund, O.; McPhate, J.; Frisch, H.; Elam, J.; Mane, A.; Wagner, R.; Varner, G.
2013-09-01
Sealed tube photo-sensing detectors for optical/IR detection have applications in astronomy, nighttime remote reconnaissance, and airborne/space situational awareness. The potential development of large area photon counting, imaging, timing detectors has significance for these applications and a number of other areas (High energy particle detection (RICH), biological single-molecule fluorescence lifetime imaging microscopy, neutron imaging, time of flight mass spectroscopy, diffraction imaging). We will present details of progress towards the development of a 20 cm sealed tube optical detector with nanoengineered microchannel plates for photon counting, imaging and sub-ns event time stamping. In the operational scheme of the photodetector incoming light passes through an entrance window and interacts with a semitransparent photocathode on the inside of the window. The photoelectrons emitted are accelerated across a proximity gap and are detected by an MCP pair. The pair of novel borosilicate substrate MCPs are functionalized by atomic layer deposition (ALD), and amplify the signal and the resulting electron cloud is detected by a conductive strip line anode for determination of the event positions and the time of arrival. The physical package is ~ 25 x 25 cm but only 1.5 cm thick. Development of such a device in a square 20 cm format presents challenges: hermetic sealing to a large entrance window, a 20 cm semitransparent photocathode with good efficiency and uniformity, 20 cm MCPs with reasonable cost and performance, robust construction to preserve high vacuum and withstand an atmosphere pressure differential. We will discuss the schemes developed to address these issues and present the results for the first test devices. The novel microchannel plates employing borosilicate micro-capillary arrays provide many performance characteristics typical of conventional MCPs, but have been made in sizes up to 20 cm, have low intrinsic background (0.08 events cm2 s-1) and have very stable gain behavior over > 7 C cm2 of charge extracted. They are high temperature compatible and have minimal outgassing, which shortens and simplifies the sealed tube production process and should improve overall lifetimes. Bialkali (NaKSb) semitransparent photocathodes with > 20% quantum efficiency have also been made on 20 cm borosilicate windows compatible with the window seals for the large sealed tube device. The photocathodes have good response uniformity and have been stable for > 5 months in testing. Tests with a 20 cm detector with a cross delay line readout have achieved ~50µm FWHM imaging with single photon sub-ns timing and MHz event rates, and tests with a 10 x 10cm detector with cross strip readout has achieved ~20µm FWHM imaging with >4 MHz event rates with ~10% deadtime. We will discuss the details and implications of these novel detector implementations and their potential applications.
Correlates of avian building strikes at a glass façade museum surrounded by avian habitat
NASA Astrophysics Data System (ADS)
Kahle, L.; Flannery, M.; Dumbacher, J. P.
2013-12-01
Bird window collisions are the second largest anthropogenic cause of bird deaths in the world. Effective mitigation requires an understanding of which birds are most likely to strike, when, and why. Here, we examine five years of avian window strike data from the California Academy of Sciences - a relatively new museum with significant glass façade situated in Golden Gate Park, San Francisco. We examine correlates of window-killed birds, including age, sex, season, and migratory or sedentary tendencies of the birds. We also examine correlates of window kills such as presence of habitat surrounding the building and overall window area. We found that males are almost three times more likely than females to mortally strike windows, and immature birds are three times more abundant than adults in our window kill dataset. Among seasons, strikes were not notably different in spring, summer, and fall; however they were notably reduced in winter. There was no statistical effect of building orientation (north, south, east, or west), and the presence of avian habitat directly adjacent to windows had a minor effect. We also report ongoing studies examining various efforts to reduce window kill (primarily external decals and large electronic window blinds.) We hope that improving our understanding of the causes of the window strikes will help us strategically reduce window strikes.
Non-stationary dynamics in the bouncing ball: A wavelet perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behera, Abhinna K., E-mail: abhinna@iiserkol.ac.in; Panigrahi, Prasanta K., E-mail: pprasanta@iiserkol.ac.in; Sekar Iyengar, A. N., E-mail: ansekar.iyengar@saha.ac.in
2014-12-01
The non-stationary dynamics of a bouncing ball, comprising both periodic as well as chaotic behavior, is studied through wavelet transform. The multi-scale characterization of the time series displays clear signatures of self-similarity, complex scaling behavior, and periodicity. Self-similar behavior is quantified by the generalized Hurst exponent, obtained through both wavelet-based multi-fractal detrended fluctuation analysis and Fourier methods. The scale dependent variable window size of the wavelets aptly captures both the transients and non-stationary periodic behavior, including the phase synchronization of different modes. The optimal time-frequency localization of the continuous Morlet wavelet is found to delineate the scales corresponding to neutral turbulence, viscous dissipation regions, and different time varying periodic modulations.
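A minimal sketch of a continuous Morlet wavelet scalogram, assuming the PyWavelets package; the scale grid is an illustrative choice, and the multifractal detrended fluctuation analysis used in the study is not reproduced here.

```python
import numpy as np
import pywt

def morlet_scalogram(x, dt, num_scales=64):
    """|CWT| of a 1-D series with the Morlet wavelet: wide scales track slow,
    quasi-periodic behavior, while narrow scales capture transients."""
    scales = np.geomspace(2, len(x) // 4, num_scales)
    coefs, freqs = pywt.cwt(x, scales, "morl", sampling_period=dt)
    return np.abs(coefs), freqs
```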
nodeGame: Real-time, synchronous, online experiments in the browser.
Balietti, Stefano
2017-10-01
nodeGame is a free, open-source JavaScript/HTML5 framework for conducting synchronous experiments online and in the lab directly in the browser window. It is specifically designed to support behavioral research along three dimensions: (i) larger group sizes, (ii) real-time (but also discrete time) experiments, and (iii) batches of simultaneous experiments. nodeGame has a modular source code, and defines an API (application programming interface) through which experimenters can create new strategic environments and configure the platform. With zero-install, nodeGame can run on a great variety of devices, from desktop computers to laptops, smartphones, and tablets. The current version of the software is 3.0, and extensive documentation is available on the wiki pages at http://nodegame.org.
Cross-Layer Scheme to Control Contention Window for Per-Flow in Asymmetric Multi-Hop Networks
NASA Astrophysics Data System (ADS)
Giang, Pham Thanh; Nakagawa, Kenji
The IEEE 802.11 MAC standard for wireless ad hoc networks adopts Binary Exponential Back-off (BEB) mechanism to resolve bandwidth contention between stations. BEB mechanism controls the bandwidth allocation for each station by choosing a back-off value from one to CW according to the uniform random distribution, where CW is the contention window size. However, in asymmetric multi-hop networks, some stations are disadvantaged in opportunity of access to the shared channel and may suffer severe throughput degradation when the traffic load is large. Then, the network performance is degraded in terms of throughput and fairness. In this paper, we propose a new cross-layer scheme aiming to solve the per-flow unfairness problem and achieve good throughput performance in IEEE 802.11 multi-hop ad hoc networks. Our cross-layer scheme collects useful information from the physical, MAC and link layers of own station. This information is used to determine the optimal Contention Window (CW) size for per-station fairness. We also use this information to adjust CW size for each flow in the station in order to achieve per-flow fairness. Performance of our cross-layer scheme is examined on various asymmetric multi-hop network topologies by using Network Simulator (NS-2).
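The contention-window mechanics behind such schemes can be illustrated with a toy sketch: the standard binary exponential backoff plus a per-flow scaling of CW driven by how far the flow is from its fair share. The scaling rule below is purely illustrative and is not the cross-layer algorithm proposed in the paper.

```python
import random

def draw_backoff(cw):
    """Uniform backoff draw in [1, CW], as in the BEB mechanism."""
    return random.randint(1, cw)

def next_cw(cw, collided, share_ratio=1.0, cw_min=32, cw_max=1024):
    """BEB update of CW, followed by an illustrative per-flow adjustment:
    share_ratio < 1 (flow below its fair throughput) shrinks CW so the flow
    contends more aggressively; share_ratio > 1 enlarges it."""
    cw = min(cw * 2, cw_max) if collided else cw_min
    return int(min(cw_max, max(cw_min, cw * share_ratio)))
```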
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Sallal, K.A.
1999-07-01
The study aims to explore the effect of different climates on window and skylight design in residential buildings. The study house is evaluated against climates that have design opportunities for passive systems, with emphasis on passive cooling. The study applies a variety of methods to evaluate the design. It has found that earth sheltering and night ventilation have the potential to provide 12–29% and 25–77% of the cooling requirements respectively for the study house in the selected climates. The reduction of the glazing area from 174 ft² to 115 ft² has different impacts on the cooling energy cost in the different climates. In climates such as Fresno and Tucson, one should treat cooling energy savings as a priority for window design, particularly when determining the window size. In other climates such as Albuquerque, the priority of window design should be first given to heating savings requirements.
Thermal Stress in HFEF Hot Cell Windows Due to an In-Cell Metal Fire
Solbrig, Charles W.; Warmann, Stephen A.
2016-01-01
This work investigates an accident during the pyrochemical extraction of Uranium and Plutonium from PWR spent fuel in an argon atmosphere hot cell. In the accident, the heavy metals (U and Pu) being extracted are accidentally exposed to air from a leaky instrument penetration which goes through the cell walls. The extracted pin size pieces of U and Pu metal readily burn when exposed to air. Technicians perform the electrochemical extraction using manipulators through a 4 foot thick hot cell concrete wall which protects them from the radioactivity of the spent fuel. Four foot thick windows placed in the wall allow the technicians to visually control the manipulators. These windows would be exposed to the heat of the metal fire. As a result, this analysis determines if the thermal stress caused by the fire would crack the windows and if the heat would degrade the window seals allowing radioactivity to escape from the cell.
Additive Manufacturing for Highly Efficient Window Inserts CRADA Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roschli, Alex C.; Chesser, Phillip C.; Love, Lonnie J.
ORNL partnered with the Mackinac Technology Company to demonstrate how additive manufacturing can be used to create highly energy-efficient window inserts for retrofit in pre-existing buildings. Many early iterations of the window inserts were fabricated using carbon fiber reinforced thermoplastics and polycarbonate films as a stand-in for the low-e coated films produced by the Mackinac Technology Company. After demonstration of the proof of concept, i.e. custom window inserts with tensioned film, the materials used for the manufacture of the frames were more closely examined. Hollow particle-filled syntactic foam and low-density polymer composites formed by expandable microspheres were explored as the materials used to additively manufacture the frames of the inserts. It was concluded that low-cost retrofit window inserts in custom sizes could be easily fabricated using large-scale additive manufacturing. Furthermore, the syntactic and expanded foams developed and tested satisfy the mechanical performance requirements for the application.
Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang
2017-11-01
The identification of the nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. In the adaptive Kalman filtering (KF) framework, the nonlinearity and coupling can be regarded as the model noise covariance and estimated by minimizing the innovation or residual errors of the states. However, this method requires a large time window of data to achieve a reliable covariance estimate, making it impractical for nonlinear systems that change rapidly. To deal with this problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor with the measurements and state estimates received from its connected sensors instead of enlarging the time window. A new cost function, the weighted sum of the bias and oscillation of the state, is used to obtain the "best" estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting over a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is then computed by minimizing the weighted cost function with an exhaustive search. A sensor selection method is added to the algorithm to decrease the computational load of the filter and increase the scalability of the sensor network. The existence, suboptimality, and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF over other filtering algorithms for a large class of systems.
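A loose, scalar illustration (assuming NumPy) of the exhaustive search over candidate model-noise covariances scored by a bias-plus-oscillation cost over a window of estimates. The definitions of "bias" and "oscillation" below, and the random-walk filter, are stand-ins; the paper's neighbour-weighted construction and distributed setting are not reproduced.

```python
import numpy as np

def kf_1d(zs, q, r):
    """Scalar random-walk Kalman filter; returns the state estimates."""
    x, p, xs = 0.0, 1.0, []
    for z in zs:
        p += q                       # predict
        k = p / (p + r)              # update
        x += k * (z - x)
        p *= 1 - k
        xs.append(x)
    return np.array(xs)

def window_cost(xs, zs, deg=2, w_bias=1.0, w_osc=1.0):
    """Weighted bias + oscillation of the estimates in the window (illustrative)."""
    t = np.arange(len(xs))
    trend = np.polyval(np.polyfit(t, xs, deg), t)
    bias = abs(np.mean(xs - np.asarray(zs)))   # systematic offset from measurements
    osc = np.std(xs - trend)                   # wiggle about the polynomial trend
    return w_bias * bias + w_osc * osc

def pick_q(zs, r, candidates=(1e-4, 1e-3, 1e-2, 1e-1, 1.0)):
    """Exhaustive search over candidate model-noise variances."""
    return min(candidates, key=lambda q: window_cost(kf_1d(zs, q, r), zs))
```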
NASA Technical Reports Server (NTRS)
Luo, Victor; Khanampornpan, Teerapat; Boehmer, Rudy A.; Kim, Rachel Y.
2011-01-01
This software graphically displays all pertinent information from a Predicted Events File (PEF) using the Java Swing framework, which allows for multi-platform support. The PEF is hard to weed through when looking for specific information, and the MRO (Mars Reconnaissance Orbiter) Mission Planning & Sequencing Team (MPST) wanted a different way to visualize the data. This tool will provide the team with a visual way of reviewing and error-checking the sequence product. The front end of the tool contains much of the aesthetically appealing material for viewing. The time stamp is displayed in the top left corner, and highlighted details are displayed in the bottom left corner. The time bar stretches along the top of the window, and the rest of the space is allotted for blocks and step functions. A preferences window is used to control the layout of the sections along with the ability to choose the color and size of the blocks. Double-clicking on a block will show information contained within the block. Zooming into a certain level will graphically display that information as an overlay on the block itself. Other functions include using hotkeys to navigate, an option to jump to a specific time, enabling a vertical line, and double-clicking to zoom in/out. The back end involves a configuration file that allows a more experienced user to pre-define the structure of a block, a single event, or a step function. The individual will have to determine what information is important within each block and what actually defines the beginning and end of a block. This gives the user much more flexibility in terms of what the tool is searching for. In addition to the configurability, all the settings in the preferences window are saved in the configuration file as well.
Tuning the emission of aqueous Cu:ZnSe quantum dots to yellow light window
NASA Astrophysics Data System (ADS)
Wang, Chunlei; Hu, Zhiyang; Xu, Shuhong; Wang, Yanbin; Zhao, Zengxia; Wang, Zhuyuan; Cui, Yiping
2015-07-01
Synthesis of internally doped Cu:ZnSe QDs in an aqueous solution still suffers from narrow tunable emissions from the blue to green light window. In this work, we extended the emission window of aqueous Cu:ZnSe QDs to the yellow light window. Our results show that high solution pH, multiple injections of Zn precursors, and nucleation doping strategy are three key factors for preparing yellow emitted Cu:ZnSe QDs. All these factors can depress the reactivity of CuSe nuclei and Zn monomers, promoting ZnSe growth outside CuSe nuclei rather than form ZnSe nuclei separately. With increased ZnSe QD size, the conduction band and nearby trap state energy levels shift to higher energy sites, causing Cu:ZnSe QDs to have a much longer emission.
Cluster fusion-fission dynamics in the Singapore stock exchange
NASA Astrophysics Data System (ADS)
Teh, Boon Kin; Cheong, Siew Ann
2015-10-01
In this paper, we investigate how the cross-correlations between stocks in the Singapore stock exchange (SGX) evolve over 2008 and 2009 within overlapping one-month time windows. In particular, we examine how these cross-correlations change before, during, and after the Sep-Oct 2008 Lehman Brothers Crisis. To do this, we extend the complete-linkage hierarchical clustering algorithm to obtain robust clusters of stocks with stronger intracluster correlations and weaker intercluster correlations. After we identify the robust clusters in all time windows, we visualize how these change in the form of a fusion-fission diagram. Such a diagram depicts graphically how the cluster sizes evolve, the exchange of stocks between clusters, as well as how strongly the clusters mix. From the fusion-fission diagram, we see a giant cluster growing and disintegrating in the SGX, up until the Lehman Brothers Crisis in September 2008 and the market crashes of October 2008. After the Lehman Brothers Crisis, clusters in the SGX remain small for a few months before giant clusters emerge once again. In the aftermath of the crisis, we also find strong mixing of component stocks between clusters. As a result, the correlation between initially strongly-correlated pairs of stocks decays exponentially with an average lifetime of about a month. These observations impact strongly how portfolios and trading strategies should be formulated.
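A minimal sketch (assuming NumPy/SciPy) of one step of such a procedure: correlations inside a one-month window are converted to distances and grouped by complete-linkage clustering. The robustness extension and the fusion-fission bookkeeping described above are not included, and the window length and cut threshold are illustrative.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def window_clusters(returns, start, win=21, cut=0.8):
    """Complete-linkage clusters of stocks within one time window.
    `returns` is a (T, N) array of daily log-returns; d = sqrt(2*(1 - rho))."""
    rho = np.corrcoef(returns[start:start + win], rowvar=False)
    d = np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))
    np.fill_diagonal(d, 0.0)
    Z = linkage(squareform(d, checks=False), method="complete")
    return fcluster(Z, t=cut, criterion="distance")
```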
On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2011-01-01
This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the σ-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin since it is dependent on the length of a time window and parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, then the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
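To illustrate the Pade-approximation route on a toy, non-adaptive loop (not the adaptive laws analyzed in the paper): for an integrator under proportional feedback the characteristic equation is s + K e^{-tau s} = 0, whose exact delay margin is pi/(2K); a first-order Pade substitution turns it into a polynomial whose roots can be swept over tau. The sketch assumes NumPy, and all symbols are illustrative.

```python
import numpy as np

def delay_margin_pade(K, taus=np.linspace(0.01, 3.0, 300)):
    """Smallest delay that destabilizes s + K*exp(-tau*s) = 0, using the
    1st-order Pade approximation exp(-tau*s) ~ (1 - tau*s/2)/(1 + tau*s/2)."""
    for tau in taus:
        # characteristic polynomial: (tau/2) s^2 + (1 - K*tau/2) s + K
        roots = np.roots([tau / 2.0, 1.0 - K * tau / 2.0, K])
        if np.any(roots.real >= 0.0):
            return tau
    return None

# exact margin is pi/(2K) ~ 1.57 for K = 1; the 1st-order Pade estimate is 2/K
print(delay_margin_pade(K=1.0))
```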
High-Reliability Waveguide Vacuum/Pressure Window
NASA Technical Reports Server (NTRS)
Britcliffe, Michael J.; Hanson, Theodore R.; Long, Ezra M.; Montanez, Steven
2013-01-01
The NASA Deep Space Network (DSN) uses commercial waveguide windows on the output waveguide of Ka-band (32 GHz) low-noise amplifiers. Mechanical failure of these windows resulted in an unacceptable loss in tracking time. To address this issue, a new Ka-band WR-28 waveguide window has been designed, fabricated, and tested. The window uses a slab of low-loss, low-dielectric constant foam that is bonded into a 1/2-wave-thick waveguide/flange. The foam is a commercially available, rigid, closed-cell polymethacrylimide. It has excellent electrical properties with a dielectric constant of 1.04, and a loss tangent of 0.01. It is relatively strong with a tensile strength of 1 MPa. The material is virtually impermeable to helium. The finished window exhibits a leak rate of less than 3×10^-3 cu cm/s with helium. The material is also chemically resistant and can be cleaned with acetone. The window is constructed by fabricating a window body by brazing a short length of WR-28 copper waveguide into a standard rectangular flange, and machining the resulting part to a thickness of 4.6 mm. The foam is machined to a rectangular shape with a dimension of 7.06 × 3.53 mm. The foam is bonded into the body with a two-part epoxy. After curing, the excess glue and foam are knife-trimmed by hand. The finished window has a loss of less than 0.08 dB (2%) and a return loss of greater than 25 dB at 32 GHz. This meets the requirements for the DSN application. The window is usable for most applications over the entire 26-to-40-GHz waveguide band. The window return loss can be tuned to a required frequency by varying the thickness of the window slightly. Most standard waveguide windows use a thin membrane of material bonded into a recess in a waveguide flange, or sandwiched between two flanges with a polymer seal. Designs using the recessed window are prone to mechanical failure over time due to constraints on the dimensions of the recess that allow the bond to fail. Designs using the sandwich method are often permeable to helium, which prohibits the use of helium leak detection. At the time of this reporting, 40 windows have been produced. Twelve are in operation with a combined operating time of over 30,000 hours without a failure.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... maintenance window for the Gulf individual fishing quota (IFQ) programs, and removing obsolete codified text..., etc.), extends the IFQ maintenance window an additional 8 hours to allow for more time to conduct end... maintenance window. All electronic IFQ transactions must be completed by December 31 at 6 p.m. eastern time...
Antireflective surface structures on optics for high energy lasers
NASA Astrophysics Data System (ADS)
Busse, Lynda E.; Florea, Catalin M.; Shaw, L. Brandon; Frantz, Jesse; Bayya, Shyam; Poutous, Menelaos K.; Joshi, Rajendra; Aggarwal, Ishwar D.; Sanghera, Jas S.
2014-02-01
We report results for antireflective surface structures (ARSS) fabricated directly into the surface of optics and lenses which are important as high energy (multi-kW) laser components, including fused silica windows and lenses, YAG crystals and ceramics, and spinel ceramics. Very low reflection losses as well as high laser damage thresholds have been measured for optics with ARSS. Progress to scale up the process for large-size windows will also be presented.
Temporal Characterization of Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Sullivan, Brenda M.; Rizzi, Stephen A.
2004-01-01
Current aircraft source noise prediction tools yield time-independent frequency spectra as functions of directivity angle. Realistic evaluation and human assessment of aircraft fly-over noise require the temporal characteristics of the noise signature. The purpose of the current study is to analyze empirical data from broadband jet and tonal fan noise sources and to provide the temporal information required for prediction-based synthesis. Noise sources included a one-tenth-scale engine exhaust nozzle and a one-fifth-scale turbofan engine. A methodology was developed to characterize the low frequency fluctuations employing the Short Time Fourier Transform in a MATLAB computing environment. It was shown that a trade-off is necessary between frequency and time resolution in the acoustic spectrogram. The procedure requires careful evaluation and selection of the data analysis parameters, including the data sampling frequency, Fourier Transform window size, associated time period and frequency resolution, and time period window overlap. Low frequency fluctuations were applied to the synthesis of broadband noise with the resulting records sounding virtually indistinguishable from the measured data in initial subjective evaluations. Amplitude fluctuations of blade passage frequency (BPF) harmonics were successfully characterized for conditions equivalent to take-off and approach. Data demonstrated that the fifth harmonic of the BPF varied more in frequency than the BPF itself and exhibited larger amplitude fluctuations over the duration of the time record. Frequency fluctuations were found to be not perceptible in the current characterization of tonal components.
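The window-size trade-off discussed above can be reproduced with a standard spectrogram (a minimal sketch using SciPy rather than the MATLAB environment of the study); the sampling rate, test signal, and window lengths below are illustrative.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 44100                                   # assumed sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
x = (1 + 0.3 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 1000 * t)

# long window: fine frequency resolution, coarse time resolution
f_lo, t_lo, S_lo = spectrogram(x, fs, nperseg=4096, noverlap=2048)
# short window: coarse frequency resolution, fine time resolution
f_hi, t_hi, S_hi = spectrogram(x, fs, nperseg=512, noverlap=256)
```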
Rotation Periods and Photometric Amplitudes for Cool Stars with TESS
NASA Astrophysics Data System (ADS)
Andrews, Hannah; Dominguez, Zechariah; Johnson, Sara; Buzasi, Derek L.
2018-06-01
The original Kepler mission observed 200000 stars in the same field nearly continuously for over four years, generating an unparalleled set of stellar rotation curves and new insights into the correlation between rotation periods and photometric variability on the lower main sequence. The continuation of Kepler in the guise of K2 has allowed us to examine a stellar sample comparable in size to that observed with Kepler, but drawn from new stellar populations. However, K2 observed each field for at most three months, limiting the inferences that can be drawn, particularly for older, slower-rotating stars. The upcoming TESS spacecraft will provide light curves for perhaps two orders of magnitude more stars, but with time windows as short as 27 days. In this work, we resample Kepler light curves using the TESS observing window, and study what can be learned from high-precision light curves of such short lengths, and how to compare those results to what we have learned from Kepler.
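As a sketch of what a short observing window allows, a Lomb-Scargle periodogram (assuming Astropy) gives a rotation-period estimate whose reliability degrades as the period approaches the window length; the period limits below are illustrative.

```python
import numpy as np
from astropy.timeseries import LombScargle

def rotation_period(t, flux, min_p=0.5, max_p=27.0):
    """Rotation period estimate from a single ~27-day photometric window;
    periods approaching the window length are poorly constrained."""
    freq, power = LombScargle(t, flux).autopower(
        minimum_frequency=1.0 / max_p, maximum_frequency=1.0 / min_p)
    return 1.0 / freq[np.argmax(power)]
```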
Comparisons of neural networks to standard techniques for image classification and correlation
NASA Technical Reports Server (NTRS)
Paola, Justin D.; Schowengerdt, Robert A.
1994-01-01
Neural network techniques for multispectral image classification and spatial pattern detection are compared to the standard techniques of maximum-likelihood classification and spatial correlation. The neural network produced a more accurate classification than maximum-likelihood of a Landsat scene of Tucson, Arizona. Some of the errors in the maximum-likelihood classification are illustrated using decision region and class probability density plots. As expected, the main drawback to the neural network method is the long time required for the training stage. The network was trained using several different hidden layer sizes to optimize both the classification accuracy and training speed, and it was found that one node per class was optimal. The performance improved when 3x3 local windows of image data were entered into the net. This modification introduces texture into the classification without explicit calculation of a texture measure. Larger windows were successfully used for the detection of spatial features in Landsat and Magellan synthetic aperture radar imagery.
JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.
Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J
2010-04-01
The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
Complex Patterns in Financial Time Series Through HIGUCHI’S Fractal Dimension
NASA Astrophysics Data System (ADS)
Grace Elizabeth Rani, T. G.; Jayalalitha, G.
2016-11-01
This paper analyzes the complexity of stock exchanges through fractal theory. Closing price indices of four stock exchanges with different industry sectors are selected. Degree of complexity is assessed through Higuchi’s fractal dimension. Various window sizes are considered in evaluating the fractal dimension. It is inferred that the data considered as a whole represents random walk for all the four indices. Analysis of financial data through windowing procedure exhibits multi-fractality. Attempts to apply moving averages to reduce noise in the data revealed lower estimates of fractal dimension, which was verified using fractional Brownian motion. A change in the normalization factor in Higuchi’s algorithm did improve the results. It is quintessential to focus on rural development to realize a standard and steady growth of economy. Tools must be devised to settle the issues in this regard. Micro level institutions are necessary for the economic growth of a country like India, which would induce a sporadic development in the present global economical scenario.
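For readers unfamiliar with the windowing procedure mentioned above, the following is a minimal sketch of Higuchi's fractal dimension evaluated over sliding windows of a series; the window length, step, and kmax are illustrative choices, and the random-walk surrogate stands in for a closing-price index.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Estimate Higuchi's fractal dimension of a 1-D series x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((idx.size - 1) * k)   # Higuchi's normalization factor
            lengths.append(dist * norm / k)
        lk.append(np.mean(lengths))
    k_vals = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lk), 1)
    return slope

def windowed_fd(series, window=250, step=50, kmax=10):
    """Higuchi FD over sliding windows, e.g. on closing-price indices."""
    return [higuchi_fd(series[i:i + window], kmax)
            for i in range(0, len(series) - window + 1, step)]

prices = np.cumsum(np.random.randn(2000))      # random-walk surrogate
print(np.round(windowed_fd(prices)[:5], 3))    # values near 1.5 for a random walk
```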
Lin, Sen; Bai, Xiaopeng; Wang, Haiyang; Wang, Haolun; Song, Jianan; Huang, Kai; Wang, Chang; Wang, Ning; Li, Bo; Lei, Ming; Wu, Hui
2017-11-01
Electrochromic smart windows (ECSWs) are considered as the most promising alternative to traditional dimming devices. However, the electrode technology in ECSWs remains stagnant, wherein inflexible indium tin oxide and fluorine-doped tin oxide are the main materials being used. Although various complicated production methods, such as high-temperature calcination and sputtering, have been reported, the mass production of flexible and transparent electrodes remains challenging. Here, a nonheated roll-to-roll process is developed for the continuous production of flexible, extralarge, and transparent silver nanofiber (AgNF) network electrodes. The optical and mechanical properties, as well as the electrical conductivity of these products (i.e., 12 Ω sq⁻¹ at 95% transmittance), are comparable with those of AgNF networks produced via high-temperature sintering. Moreover, the as-prepared AgNF network is successfully assembled into an A4-sized ECSW with short switching time, good coloration efficiency, and flexibility. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A fast non-local means algorithm based on integral image and reconstructed similar kernel
NASA Astrophysics Data System (ADS)
Lin, Zheng; Song, Enmin
2018-03-01
Image denoising is one of the essential methods in digital image processing. The non-local means (NLM) denoising approach is a remarkable denoising technique. However, its computational time complexity is high. In this paper, we design a fast NLM algorithm based on the integral image and a reconstructed similar kernel. First, the integral image is introduced into the traditional NLM algorithm. In doing so, it removes a great deal of repetitive operations in the parallel processing, which greatly improves the running speed of the algorithm. Secondly, in order to amend the error of the integral image, we construct a similar window resembling the Gaussian kernel in the pyramidal stacking pattern. Finally, in order to eliminate the influence produced by replacing the Gaussian weighted Euclidean distance with the Euclidean distance, we propose a scheme to construct a similar kernel with a size of 3 x 3 in a neighborhood window which will reduce the effect of noise on a single pixel. Experimental results demonstrate that the proposed algorithm is about seventeen times faster than the traditional NLM algorithm, yet produces comparable results in terms of Peak Signal-to-Noise Ratio (the PSNR increased by 2.9% on average) and perceptual image quality.
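The speed-up comes from the summed-area table: once an integral image of the squared difference between the image and a shifted copy is available, the patch distance needed by non-local means for that offset costs four lookups per pixel. The sketch below shows this core step only (it is not the authors' reconstructed-kernel variant), with the image, offset, and patch radius chosen arbitrarily.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero first row/column for easy indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def patch_sums(img, half):
    """Sum of img over (2*half+1)^2 patches, for every valid patch position."""
    ii = integral_image(img)
    size = 2 * half + 1
    # box sum via four integral-image lookups (O(1) per pixel)
    return (ii[size:, size:] - ii[:-size, size:]
            - ii[size:, :-size] + ii[:-size, :-size])

# Patch distances for one offset (dx, dy), as used inside non-local means:
# sum over each patch of (I(p) - I(p + offset))^2, computed for all pixels at once.
img = np.random.rand(256, 256)
dx, dy = 3, -2
shifted = np.roll(np.roll(img, dx, axis=0), dy, axis=1)
d2 = patch_sums((img - shifted) ** 2, half=2)   # 5x5 patch distances
print(d2.shape)
```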
Self spectrum window method in wigner-ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-term and auto-WVD terms in the integral kernel function are orthogonal. With the Self Spectrum Window (SSW) algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results. Satisfactory time-frequency distribution was obtained.
Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected every day around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from each window are its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem. We use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g. FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
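A minimal stand-in for the window classifier described above can be put together with scikit-learn's MLPClassifier, training a small multilayer perceptron on the same five per-window features by mini-batch gradient descent; the synthetic features, the toy labelling rule, and the network size below are all assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
# Five synthetic per-window features (values invented for illustration):
# cross-correlation value, cc time lag (s), amplitude ratio, window length (s), min STA/LTA
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),
    rng.normal(0.0, 3.0, n),
    rng.lognormal(0.0, 0.5, n),
    rng.uniform(10.0, 200.0, n),
    rng.uniform(1.0, 5.0, n),
])
# Toy labelling rule standing in for human-picked windows:
# "usable" means high correlation, small lag, amplitude ratio near one
y = ((X[:, 0] > 0.7) & (np.abs(X[:, 1]) < 3.0) & (np.abs(X[:, 2] - 1.0) < 0.5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 16), batch_size=64,  # mini-batch updates
                  max_iter=500, random_state=0),
)
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
```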
Free-breathing 3D Cardiac MRI Using Iterative Image-Based Respiratory Motion Correction
Moghari, Mehdi H.; Roujol, Sébastien; Chan, Raymond H.; Hong, Susie N.; Bello, Natalie; Henningsson, Markus; Ngo, Long H.; Goddu, Beth; Goepfert, Lois; Kissinger, Kraig V.; Manning, Warren J.; Nezafat, Reza
2012-01-01
Respiratory motion compensation using diaphragmatic navigator (NAV) gating with a 5 mm gating window is conventionally used for free-breathing cardiac MRI. Due to the narrow gating window, scan efficiency is low, resulting in long scan times, especially for patients with irregular breathing patterns. In this work, a new retrospective motion compensation algorithm is presented that reduces the scan time for free-breathing cardiac MRI by increasing the gating window to 15 mm without compromising image quality. The proposed algorithm iteratively corrects for respiratory-induced cardiac motion by optimizing the sharpness of the heart. To evaluate this technique, two coronary MRI datasets with 1.3 mm³ resolution were acquired from 11 healthy subjects (7 females, 25±9 years); one using a NAV with a 5 mm gating window acquired in 12.0±2.0 minutes and one with a 15 mm gating window acquired in 7.1±1.0 minutes. The images acquired with a 15 mm gating window were corrected using the proposed algorithm and compared to the uncorrected images acquired with the 5 mm and 15 mm gating windows. The image quality score, sharpness, and length of the three major coronary arteries were equivalent between the corrected images and the images acquired with a 5 mm gating window (p-value>0.05), while the scan time was reduced by a factor of 1.7. PMID:23132549
Kanehira, Takahiro; Matsuura, Taeko; Takao, Seishin; Matsuzaki, Yuka; Fujii, Yusuke; Fujii, Takaaki; Ito, Yoichi M; Miyamoto, Naoki; Inoue, Tetsuya; Katoh, Norio; Shimizu, Shinichi; Umegaki, Kikuo; Shirato, Hiroki
2017-01-01
To investigate the effectiveness of real-time-image gated proton beam therapy for lung tumors and to establish a suitable size for the gating window (GW). A proton beam gated by a fiducial marker entering a preassigned GW (as monitored by 2 fluoroscopy units) was used with 7 lung cancer patients. Seven treatment plans were generated: real-time-image gated proton beam therapy with GW sizes of ±1, 2, 3, 4, 5, and 8 mm and free-breathing proton therapy. The prescribed dose was 70 Gy (relative biological effectiveness)/10 fractions to 99% of the target. Each of the 3-dimensional marker positions in the time series was associated with the appropriate 4-dimensional computed tomography phase. The 4-dimensional dose calculations were performed. The dose distribution in each respiratory phase was deformed into the end-exhale computed tomography image. The D99 and D5 to D95 of the clinical target volume scaled by the prescribed dose with criteria of D99 >95% and D5 to D95 <5%, V20 for the normal lung, and treatment times were evaluated. Gating windows ≤ ±2 mm fulfilled the CTV criteria for all patients (whereas the criteria were not always met for GWs ≥ ±3 mm) and gave an average reduction in V20 of more than 17.2% relative to free-breathing proton therapy (whereas GWs ≥ ±4 mm resulted in similar or increased V20). The average (maximum) irradiation times were 384 seconds (818 seconds) for the ±1-mm GW, but less than 226 seconds (292 seconds) for the ±2-mm GW. The maximum increased considerably at ±1-mm GW. Real-time-image gated proton beam therapy with a GW of ±2 mm was demonstrated to be suitable, providing good dose distribution without greatly extending treatment time. Copyright © 2016 Elsevier Inc. All rights reserved.
Impact of Real-Time Image Gating on Spot Scanning Proton Therapy for Lung Tumors: A Simulation Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanehira, Takahiro; Matsuura, Taeko, E-mail: matsuura@med.hokudai.ac.jp; Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo
Purpose: To investigate the effectiveness of real-time-image gated proton beam therapy for lung tumors and to establish a suitable size for the gating window (GW). Methods and Materials: A proton beam gated by a fiducial marker entering a preassigned GW (as monitored by 2 fluoroscopy units) was used with 7 lung cancer patients. Seven treatment plans were generated: real-time-image gated proton beam therapy with GW sizes of ±1, 2, 3, 4, 5, and 8 mm and free-breathing proton therapy. The prescribed dose was 70 Gy (relative biological effectiveness)/10 fractions to 99% of the target. Each of the 3-dimensional marker positions in the time series was associated with the appropriate 4-dimensional computed tomography phase. The 4-dimensional dose calculations were performed. The dose distribution in each respiratory phase was deformed into the end-exhale computed tomography image. The D99 and D5 to D95 of the clinical target volume scaled by the prescribed dose with criteria of D99 >95% and D5 to D95 <5%, V20 for the normal lung, and treatment times were evaluated. Results: Gating windows ≤ ±2 mm fulfilled the CTV criteria for all patients (whereas the criteria were not always met for GWs ≥ ±3 mm) and gave an average reduction in V20 of more than 17.2% relative to free-breathing proton therapy (whereas GWs ≥ ±4 mm resulted in similar or increased V20). The average (maximum) irradiation times were 384 seconds (818 seconds) for the ±1-mm GW, but less than 226 seconds (292 seconds) for the ±2-mm GW. The maximum increased considerably at ±1-mm GW. Conclusion: Real-time-image gated proton beam therapy with a GW of ±2 mm was demonstrated to be suitable, providing good dose distribution without greatly extending treatment time.
Lagrangian Statistics of Slightly Buoyant Droplets in Isotropic Turbulence
NASA Astrophysics Data System (ADS)
Gopalan, Balaji; Malkiel, Edwin; Katz, Joseph
2006-11-01
This project examines the dynamics of slightly buoyant diesel droplets in isotropic turbulence using high speed in-line digital Holographic PIV. A cloud of droplets with specific gravity of 0.85 is injected into the central portion of an isotropic turbulence facility. The droplet trajectories are measured in a 50×50×50 mm³ sample volume using high speed in-line digital holography. An automated program has been developed to obtain accurate time history of droplet velocities. Data analysis determines the PDF of velocity and acceleration in three dimensions. The time histories enable us to calculate the three dimensional Lagrangian velocity autocorrelation function, and from them the diffusion coefficients. Due to buoyancy the vertical diffusion time scale exceeds the horizontal one by about 65%. The diffusion coefficients vary between 2.8 cm²/s in the horizontal direction to 5.5 cm²/s in the vertical direction. For droplets with size varying from 2 to 11 Kolmogorov scales there are no clear trends with size. The variations of diffusion rates for different turbulent intensities and the effect of finite window size are presently examined. For shorter time scales, when the diffusion need not be Fickian the three dimensional trajectories can be used to calculate the generalized dispersion tensor and measure the time elapsed for diffusion to become Fickian.
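A hedged sketch of the post-processing described above: given per-droplet velocity time histories, the Lagrangian velocity autocorrelation is averaged over tracks and integrated to an integral time scale, from which a diffusion coefficient follows via Taylor's relation D = <u'^2> T_L. The AR(1) synthetic tracks and parameter values below are placeholders for the holographic PIV data.

```python
import numpy as np

def velocity_autocorrelation(vel, max_lag):
    """Lagrangian autocorrelation of one velocity component.

    vel : (n_tracks, n_steps) array of a single velocity component.
    Returns R(tau) for tau = 0 .. max_lag-1, averaged over tracks.
    """
    v = vel - vel.mean(axis=1, keepdims=True)
    var = (v ** 2).mean()
    r = np.empty(max_lag)
    for lag in range(max_lag):
        r[lag] = (v[:, :v.shape[1] - lag] * v[:, lag:]).mean() / var
    return r

# Synthetic droplet tracks: AR(1) (Ornstein-Uhlenbeck-like) velocity signals
rng = np.random.default_rng(1)
dt, t_corr, sigma = 1e-3, 0.05, 0.02            # s, s, m/s (illustrative values)
alpha = np.exp(-dt / t_corr)
v = np.zeros((200, 2000))
for i in range(1, v.shape[1]):
    v[:, i] = alpha * v[:, i - 1] + sigma * np.sqrt(1 - alpha**2) * rng.standard_normal(200)

R = velocity_autocorrelation(v, max_lag=500)
T_L = np.sum(R) * dt                            # rectangle-rule integral time scale
D = v.var() * T_L                               # Taylor: D = <u'^2> * T_L
print(f"T_L = {T_L:.4f} s, D = {D * 1e4:.2f} cm^2/s")
```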
The route to chaos for the Kuramoto-Sivashinsky equation
NASA Technical Reports Server (NTRS)
Papageorgiou, Demetrios T.; Smyrlis, Yiorgos
1990-01-01
The results of extensive numerical experiments on the spatially periodic initial value problem for the Kuramoto-Sivashinsky equation are reported. This paper is concerned with the asymptotic nonlinear dynamics as the dissipation parameter decreases and spatio-temporal chaos sets in. To this end the initial condition is taken to be the same for all numerical experiments (a single sine wave is used) and the large time evolution of the system is followed numerically. Numerous computations were performed to establish the existence of windows, in parameter space, in which the solution has the following characteristics as the viscosity is decreased: a steady fully modal attractor to a steady bimodal attractor to another steady fully modal attractor to a steady trimodal attractor to a periodic attractor, to another steady fully modal attractor, to another periodic attractor, to a steady tetramodal attractor, to another periodic attractor having a full sequence of period-doublings (in parameter space) to chaos. Numerous solutions are presented which provide conclusive evidence of the period-doubling cascades which precede chaos for this infinite-dimensional dynamical system. These results permit a computation of the length of subwindows which in turn provide an estimate for their successive ratios as the cascade develops. A calculation based on the numerical results is also presented to show that the period doubling sequences found here for the Kuramoto-Sivashinsky equation are in complete agreement with Feigenbaum's universal constant of 4.669201609... . Some preliminary work shows several other windows following the first chaotic one, including periodic, chaotic, and a steady octamodal window; however, the windows shrink significantly in size, preventing concrete quantitative conclusions from being made.
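The ratio computation mentioned above is simple once the parameter values bounding successive subwindows are known: each ratio of consecutive subwindow lengths should approach Feigenbaum's constant 4.669201609... . Since the KS-equation window boundaries are not listed in the abstract, the sketch below uses the well-known period-doubling points of the logistic map as a stand-in for the subwindow boundaries.

```python
# Successive period-doubling points of the logistic map x -> r*x*(1-x),
# used here only as a stand-in for the KS-equation subwindow boundaries,
# which are not listed in the abstract.
r = [3.000000, 3.449490, 3.544090, 3.564407, 3.568759]

# Ratio of successive subwindow lengths; it approaches
# Feigenbaum's delta = 4.669201609...
for i in range(1, len(r) - 1):
    delta = (r[i] - r[i - 1]) / (r[i + 1] - r[i])
    print(f"delta_{i} = {delta:.4f}")
```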
Size and Location of Defects at the Coupling Interface Affect Lithotripter Performance
Li, Guangyan; Williams, James C.; Pishchalnikov, Yuri A.; Liu, Ziyue; McAteer, James A.
2012-01-01
OBJECTIVE To determine how the size and location of coupling defects caught between the therapy head of a lithotripter and the skin of a surrogate patient (acoustic window of a test chamber) affect the features of shock waves responsible for stone breakage. METHODS Model defects were placed in the coupling gel between the therapy head of a Dornier Compact-S electromagnetic lithotripter and the Mylar window of a water-filled coupling test system. A fiber-optic hydrophone was used to measure acoustic pressures and map the lateral dimensions of the focal zone of the lithotripter. The effect of coupling conditions on stone breakage was assessed using Gypsum model stones. RESULTS Stone breakage decreased in proportion to the area of the coupling defect; a centrally located defect blocking only 18% of the transmission area reduced stone breakage by an average of almost 30%. The effect on stone breakage was greater for defects located on-axis and decreased as the defect was moved laterally; an 18% defect located near the periphery of the coupling window (2.0 cm off-axis) reduced stone breakage by only ~15% compared to when coupling was completely unobstructed. Defects centered within the coupling window acted to narrow the focal width of the lithotripter; an 8.2% defect reduced the focal width ~30% compared to no obstruction (4.4 mm versus 6.5 mm). Coupling defects located slightly off center disrupted the symmetry of the acoustic field; an 18% defect positioned 1.0 cm off-axis shifted the focus of maximum positive pressure ~1.0 mm laterally. Defects on and off-axis imposed a significant reduction in the energy density of shock waves across the focal zone. CONCLUSIONS In addition to blocking the transmission of shock wave energy, coupling defects also disrupt the properties of shock waves that play a role in stone breakage, including the focal width of the lithotripter and the symmetry of the acoustic field; the effect is dependent on the size and location of defects, with defects near the center of the coupling window having the greatest effect. These data emphasize the importance of eliminating air pockets from the coupling interface, particularly defects located near the center of the coupling window. PMID:22938566
Dong, Jie; Wang, Dawei; Ma, Zhenshen; Deng, Guodong; Wang, Lanhua; Zhang, Jiandong
2017-01-01
The aim of the study was to evaluate the 3.0 T magnetic resonance (MR) perfusion imaging scanning time window following contrast injection for differentiating benign and malignant breast lesions and to determine the optimum scanning time window for increased scanner usage efficiency and reduced diagnostic adverse risk factors. A total of 52 women with breast abnormalities were selected for conventional MR imaging and T1 dynamic-enhanced imaging. Quantitative parameters [volume transfer constant (Ktrans), rate constant (Kep) and extravascular extracellular volume fraction (Ve)] were calculated at phases 10, 20, 30, 40 and 50, which represented time windows at 5, 10, 15, 20 and 25 min, respectively, following injection of contrast agent. The association of the parameters at different phases with benign and malignant tumor diagnosis was analyzed. MR perfusion imaging was verified as an effective modality in the diagnosis of breast malignancies and the best scanning time window was identified: i) Values of Ktrans and Kep at all phases were statistically significant in differentiating benign and malignant tumors (P<0.05), while the value of Ve had statistical significance only at stage 10, but not at any other stages (P>0.05); ii) values of Ve in benign tumors increased with phase number, but achieved no obvious changes at different phases in malignant tumors; iii) the optimum scanning time window of breast perfusion imaging with 3.0 T MR was between phases 10 and 30 (i.e., between 5 and 15 min after contrast agent injection). The variation trend of Ve values at different phases may serve as a diagnostic reference for differentiating benign and malignant breast abnormalities. The most efficient scanning time window was indicated to be 5 min after contrast injection, based on the observation that the Ve value only had statistical significance in diagnosis at stage 10. However, the optimal scanning time window is from 5 to 15 min following the injection of contrast agent, since the variation trend of Ve can serve as a diagnostic reference. PMID:28450944
Timing anthropogenic stressors to mitigate their impact on marine ecosystem resilience.
Wu, Paul Pao-Yen; Mengersen, Kerrie; McMahon, Kathryn; Kendrick, Gary A; Chartrand, Kathryn; York, Paul H; Rasheed, Michael A; Caley, M Julian
2017-11-02
Better mitigation of anthropogenic stressors on marine ecosystems is urgently needed to address increasing biodiversity losses worldwide. We explore opportunities for stressor mitigation using whole-of-systems modelling of ecological resilience, accounting for complex interactions between stressors, their timing and duration, background environmental conditions and biological processes. We then search for ecological windows, times when stressors minimally impact ecological resilience, defined here as risk, recovery and resistance. We show for 28 globally distributed seagrass meadows that stressor scheduling that exploits ecological windows for dredging campaigns can achieve up to a fourfold reduction in recovery time and 35% reduction in extinction risk. Although the timing and length of windows vary among sites to some degree, global trends indicate favourable windows in autumn and winter. Our results demonstrate that resilience is dynamic with respect to space, time and stressors, varying most strongly with: (i) the life history of the seagrass genus and (ii) the duration and timing of the impacting stress.
Platform for Postprocessing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don
2008-01-01
Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple signal- and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW, and its associated Advanced Signal Processing and Vision Toolkits. The software is usable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial-grade interface in which two main windows, Waveform Window and Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency-domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, or series of images, or a simple set of X-Y paired data in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from scan, or waves after deconvolution if system wave response is provided. Two types of deconvolution, time-based subtraction or inverse-filter, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations. The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, B-scan information, region-of-interest analysis, line profiles, and precision feature measurements).
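One of the operations named above, gating a portion of each waveform before image formation, reduces to taking the peak amplitude inside a time gate at every scan location. The sketch below is a generic illustration of that step only (in Python, not the NASA LabVIEW code); the array shape, gate indices, and synthetic echo are assumptions.

```python
import numpy as np

def gated_cscan(waveforms, gate_start, gate_end):
    """Peak-amplitude C-scan from a grid of time-domain waveforms.

    waveforms : (ny, nx, nt) array, one waveform per scan location.
    gate_start, gate_end : sample indices of the time gate of interest.
    """
    gated = waveforms[:, :, gate_start:gate_end]
    return np.abs(gated).max(axis=-1)

# Toy scan: 64 x 64 locations, 1024 samples each, with an echo near sample 400
ny, nx, nt = 64, 64, 1024
t = np.arange(nt)
echo = np.exp(-((t - 400) / 20.0) ** 2) * np.sin(2 * np.pi * t / 25.0)
waves = 0.05 * np.random.randn(ny, nx, nt) + echo   # echo broadcasts to every location
image = gated_cscan(waves, gate_start=350, gate_end=450)
print(image.shape, image.mean())
```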
Microperforations significantly enhance diffusion across round window membrane.
Kelso, Catherine M; Watanabe, Hirobumi; Wazen, Joseph M; Bucher, Tizian; Qian, Zhen J; Olson, Elizabeth S; Kysar, Jeffrey W; Lalwani, Anil K
2015-04-01
Introduction of microperforations in round window membrane (RWM) will allow reliable and predictable intracochlear delivery of pharmaceutical, molecular, or cellular therapeutic agents. Reliable delivery of medications into the inner ear remains a formidable challenge. The RWM is an attractive target for intracochlear delivery. However, simple diffusion across intact RWM is limited by what material can be delivered, size of material to be delivered, difficulty with precise dosing, timing, and precision of delivery over time. Further, absence of reliable methods for measuring diffusion across RWM in vitro is a significant experimental impediment. A novel model for measuring diffusion across guinea pig RWM, with and without microperforation, was developed and tested: cochleae, sparing the RWM, were embedded in 3D-printed acrylic holders using hybrid dental composite and light cured to adapt the round window niche to 3 ml Franz diffusion cells. Perforations were created with 12.5-μm-diameter needles and examined with light microscopy. Diffusion of 1 mM Rhodamine B across RWM in static diffusion cells was measured via fluorescence microscopy. The diffusion cell apparatus provided reliable and replicable measurements of diffusion across RWM. The permeability of Rhodamine B across intact RWM was 5.1 × 10⁻⁹ m/s. Manual application of microperforation with a 12.5-μm-diameter tip produced an elliptical tear removing 0.22 ± 0.07% of the membrane and was associated with a 35× enhancement in diffusion (P < 0.05). Diffusion cells can be applied to the study of RWM permeability in vitro. Microperforation in RWM is an effective means of increasing diffusion across the RWM.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
.... Actual pile driving time during this work window will depend on a number of factors, such as sediments... period beginning in November 2010, and ending in February 2011. This work window was selected to coincide.... The work window also coincides with the USFWS' required construction work window to avoid the peak...
ERIC Educational Resources Information Center
Roman, Harry T.
2010-01-01
Skyscrapers sure do have a lot of windows, and these windows are cleaned and checked regularly. All this takes time, money, and puts workers at potential risk. Might there be a better way to do it? In this article, the author discusses a window-washing challenge and describes how students can tackle this task, pick up the challenge, and creatively…
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
To satisfy the requirements of real-time performance and generality, a laser target simulator for a semi-physical simulation system based on the RTX+LabWindows/CVI platform is proposed in this paper. Compared with the upper/lower-computer simulation platform architecture used in most current real-time systems, this system has better maintainability and portability. The system runs on the Windows platform and uses the Windows RTX real-time extension subsystem, combined with a reflective memory network, to ensure real-time performance and to complete real-time tasks such as calculating the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, LabWindows/CVI is used to build a graphical interface and to complete non-real-time tasks in the simulation process, such as man-machine interaction and display and storage of the simulation data, which run under the Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is accomplished. The experimental results show that this system has strong real-time performance, high stability, and high simulation accuracy, together with good human-computer interaction.
Rapidity window dependences of higher order cumulants and diffusion master equation
NASA Astrophysics Data System (ADS)
Kitazawa, Masakiyo
2015-10-01
We study the rapidity window dependences of higher order cumulants of conserved charges observed in relativistic heavy ion collisions. The time evolution and the rapidity window dependence of the non-Gaussian fluctuations are described by the diffusion master equation. Analytic formulas for the time evolution of cumulants in a rapidity window are obtained for arbitrary initial conditions. We discuss that the rapidity window dependences of the non-Gaussian cumulants have characteristic structures reflecting the non-equilibrium property of fluctuations, which can be observed in relativistic heavy ion collisions with the present detectors. It is argued that a variety of information on the thermal and transport properties of the hot medium can be revealed experimentally by studying the rapidity window dependences, especially by the combined use of the higher order cumulants. Formulas of higher order cumulants for a probability distribution composed of sub-probabilities, which are useful for various studies of non-Gaussian cumulants, are also presented.
Measurement of the noise power spectrum in digital x-ray detectors
NASA Astrophysics Data System (ADS)
Aufrichtig, Richard; Su, Yu; Cheng, Yu; Granfors, Paul R.
2001-06-01
The noise power spectrum, NPS, is a key imaging property of a detector and one of the principal quantities needed to compute the detective quantum efficiency. NPS is measured by computing the Fourier transform of flat field images. Different measurement methods are investigated and evaluated with images obtained from an amorphous silicon flat panel x-ray imaging detector. First, the influence of fixed pattern structures is minimized by appropriate background corrections. For a given data set the effect of using different types of windowing functions is studied. Also different window sizes and amounts of overlap between windows are evaluated and compared to theoretical predictions. Results indicate that measurement error is minimized when applying overlapping Hanning windows on the raw data. Finally it is shown that radial averaging is a useful method of reducing the two-dimensional noise power spectrum to one dimension.
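A minimal sketch of the recommended recipe, overlapping Hann-windowed regions of a background-corrected flat-field image, periodogram averaging, and radial collapse to one dimension, is given below; the ROI size, overlap, and normalization (pixel-size and gain factors are omitted) are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

def nps_2d(flat, roi=128, overlap=0.5):
    """Relative 2-D noise power spectrum from a flat image, averaged over
    overlapping Hann-windowed ROIs (absolute normalization omitted)."""
    step = int(roi * (1.0 - overlap))
    win = np.outer(np.hanning(roi), np.hanning(roi))
    win_norm = (win ** 2).sum()
    spectra = []
    for y in range(0, flat.shape[0] - roi + 1, step):
        for x in range(0, flat.shape[1] - roi + 1, step):
            patch = flat[y:y + roi, x:x + roi]
            patch = patch - patch.mean()          # crude background/DC correction
            f = np.fft.fftshift(np.fft.fft2(patch * win))
            spectra.append(np.abs(f) ** 2 / win_norm)
    return np.mean(spectra, axis=0)

def radial_average(nps2d, nbins=64):
    """Collapse a centred 2-D NPS to a 1-D radial profile."""
    n = nps2d.shape[0]
    y, x = np.indices(nps2d.shape)
    r = np.hypot(x - n // 2, y - n // 2)
    bins = np.linspace(0, r.max(), nbins + 1)
    idx = np.digitize(r.ravel(), bins)
    flat = nps2d.ravel()
    prof = np.full(nbins, np.nan)
    for i in range(1, nbins + 1):
        sel = idx == i
        if sel.any():
            prof[i - 1] = flat[sel].mean()
    return prof

flat = np.random.normal(1000.0, 10.0, size=(512, 512))   # synthetic flat-field frame
print(radial_average(nps_2d(flat))[:5])
```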
Sapphire Viewports for a Venus Probe
NASA Technical Reports Server (NTRS)
Bates, Stephen
2012-01-01
A document discusses the creation of a viewport suitable for use on the surface of Venus. These viewports are rated for 500 °C and 100 atm pressure with appropriate safety factors and reliability required for incorporation into a Venus Lander. Sapphire windows should easily withstand the chemical environment, pressure, and temperature of the Venus surface. Novel fixture designs and seals appropriate to the environment are incorporated, as are materials compatible with exploration vessels. A test cell was fabricated, tested, and leak rate measured. The window features polish specification of the sides and corners, soft metal padding of the sapphire, and a metal C-ring seal. The system safety factor is greater than 2, and standard mechanical design theory was used to size the window, flange, and attachment bolts using available material property data. Maintenance involves simple cleaning of the window aperture surfaces. The only weakness of the system is its moderate rather than low leak rate for vacuum applications.
2010-01-01
investigate extracellular electron transfer in Shewanella oneidensis MR-1, where an array of nanoholes precludes or a single window allows for direct...the single-cell level (Fig. 1B) highlights the relative sizes of the nanohole and window openings in the insulating layer deposited over electrodes...relative to individual bacteria such as Shewanella. The nanoholes are sufficiently small to preclude direct contact of the bacterial cell body to the
Landmark Detection in Orbital Images Using Salience Histograms
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton Hoffer, Mary; Bunte, Melissa
2010-01-01
NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
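A hedged sketch of the sliding-window histogram step is shown below. The abstract does not specify the exact information-theoretic salience measure, so the example scores each window by the relative entropy (KL divergence) of its intensity histogram against the global histogram and then applies the upper-quartile threshold; window size, bin count, and smoothing are illustrative choices, not the paper's settings.

```python
import numpy as np

def salience_map(image, win=16, nbins=32):
    """Salience of each window position as the KL divergence between the
    window's intensity histogram and the global histogram (one plausible
    information-theoretic salience measure, not necessarily the paper's)."""
    edges = np.linspace(image.min(), image.max(), nbins + 1)
    global_h, _ = np.histogram(image, bins=edges)
    global_p = (global_h + 1.0) / (global_h.sum() + nbins)   # smoothed probabilities

    ny = image.shape[0] - win + 1
    nx = image.shape[1] - win + 1
    sal = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            h, _ = np.histogram(image[y:y + win, x:x + win], bins=edges)
            p = (h + 1.0) / (h.sum() + nbins)
            sal[y, x] = np.sum(p * np.log(p / global_p))     # KL(window || global)
    return sal

img = np.random.rand(64, 80)
img[20:30, 40:55] += 2.0                 # an "unusual" bright landmark
sal = salience_map(img)
thresh = np.percentile(sal, 75)          # upper quartile, as in the abstract
print((sal > thresh).sum(), "salient window positions")
```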
Optimization of ramp area aircraft push back time windows in the presence of uncertainty
NASA Astrophysics Data System (ADS)
Coupe, William Jeremy
It is well known that airport surface traffic congestion at major airports is responsible for increased taxi-out times, fuel burn and excess emissions and there is potential to mitigate these negative consequences through optimizing airport surface traffic operations. Due to a highly congested voice communication channel between pilots and air traffic controllers and a data communication channel that is used only for limited functions, one of the most viable near-term strategies for improvement of the surface traffic is issuing a push back advisory to each departing aircraft. This dissertation focuses on the optimization of a push back time window for each departing aircraft. The optimization takes into account both spatial and temporal uncertainties of ramp area aircraft trajectories. The uncertainties are described by a stochastic kinematic model of aircraft trajectories, which is used to infer distributions of combinations of push back times that lead to conflict among trajectories from different gates. The model is validated and the distributions are included in the push back time window optimization. Under the assumption of a fixed taxiway spot schedule, the computed push back time windows can be integrated with a higher level taxiway scheduler to optimize the flow of traffic from the gate to the departure runway queue. To enable real-time decision making the computational time of the push back time window optimization is critical and is analyzed throughout.
Osadchii, Oleg E.
2014-01-01
Normal hearts exhibit a positive time difference between the end of ventricular contraction and the end of QT interval, which is referred to as the electromechanical (EM) window. Drug-induced prolongation of repolarization may lead to the negative EM window, which was proposed to be a novel proarrhythmic marker. This study examined whether abnormal changes in the EM window may account for arrhythmogenic effects produced by hypokalemia. Left ventricular pressure, electrocardiogram, and epicardial monophasic action potentials were recorded in perfused hearts from guinea-pig and rabbit. Hypokalemia (2.5 mM K+) was found to prolong repolarization, reduce the EM window, and promote tachyarrhythmia. Nevertheless, during both regular pacing and extrasystolic excitation, the increased QT interval invariably remained shorter than the duration of mechanical systole, thus yielding positive EM window values. Hypokalemia-induced arrhythmogenicity was associated with slowed ventricular conduction, and shortened effective refractory periods, which translated to a reduced excitation wavelength index. Hypokalemia also evoked non-uniform prolongation of action potential duration in distinct epicardial regions, which resulted in increased spatial variability in the repolarization time. These findings suggest that arrhythmogenic effects of hypokalemia are not accounted for by the negative EM window, and are rather attributed to abnormal changes in ventricular conduction times, refractoriness, excitation wavelength, and spatial repolarization gradients. PMID:25141124
NASA Technical Reports Server (NTRS)
Shih, Y. H.; Sergienko, A. V.; Rubin, M. H.
1993-01-01
A pair of correlated photons generated from parametric down conversion was sent to two independent Michelson interferometers. Second order interference was studied by means of a coincidence measurement between the outputs of two interferometers. The reported experiment and analysis studied this second order interference phenomena from the point of view of Einstein-Podolsky-Rosen paradox. The experiment was done in two steps. The first step of the experiment used 50 psec and 3 nsec coincidence time windows simultaneously. The 50 psec window was able to distinguish a 1.5 cm optical path difference in the interferometers. The interference visibility was measured to be 38 percent and 21 percent for the 50 psec time window and 22 percent and 7 percent for the 3 nsec time window, when the optical path difference of the interferometers were 2 cm and 4 cm, respectively. By comparing the visibilities between these two windows, the experiment showed the non-classical effect which resulted from an E.P.R. state. The second step of the experiment used a 20 psec coincidence time window, which was able to distinguish a 6 mm optical path difference in the interferometers. The interference visibilities were measured to be 59 percent for an optical path difference of 7 mm. This is the first observation of visibility greater than 50 percent for a two interferometer E.P.R. experiment which demonstrates nonclassical correlation of space-time variables.
NASA Astrophysics Data System (ADS)
Abdul Ghani, B.
2005-09-01
"TEA CO 2 Laser Simulator" has been designed to simulate the dynamic emission processes of the TEA CO 2 laser based on the six-temperature model. The program predicts the behavior of the laser output pulse (power, energy, pulse duration, delay time, FWHM, etc.) depending on the physical and geometrical input parameters (pressure ratio of gas mixture, reflecting area of the output mirror, media length, losses, filling and decay factors, etc.). Program summaryTitle of program: TEA_CO2 Catalogue identifier: ADVW Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVW Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: P.IV DELL PC Setup: Atomic Energy Commission of Syria, Scientific Services Department, Mathematics and Informatics Division Operating system: MS-Windows 9x, 2000, XP Programming language: Delphi 6.0 No. of lines in distributed program, including test data, etc.: 47 315 No. of bytes in distributed program, including test data, etc.:7 681 109 Distribution format:tar.gz Classification: 15 Laser Physics Nature of the physical problem: "TEA CO 2 Laser Simulator" is a program that predicts the behavior of the laser output pulse by studying the effect of the physical and geometrical input parameters on the characteristics of the output laser pulse. The laser active medium consists of a CO 2-N 2-He gas mixture. Method of solution: Six-temperature model, for the dynamics emission of TEA CO 2 laser, has been adapted in order to predict the parameters of laser output pulses. A simulation of the laser electrical pumping was carried out using two approaches; empirical function equation (8) and differential equation (9). Typical running time: The program's running time mainly depends on both integration interval and step; for a 4 μs period of time and 0.001 μs integration step (defaults values used in the program), the running time will be about 4 seconds. Restrictions on the complexity: Using a very small integration step might leads to stop the program run due to the huge number of calculating points and to a small paging file size of the MS-Windows virtual memory. In such case, it is recommended to enlarge the paging file size to the appropriate size, or to use a bigger value of integration step.
Effects of the 7-8-year cycle in daily mean air temperature as a cross-scale information transfer
NASA Astrophysics Data System (ADS)
Jajcay, Nikola; Hlinka, Jaroslav; Paluš, Milan
2015-04-01
Using a novel nonlinear time-series analysis method, an information transfer from larger to smaller scales of the air temperature variability has been observed in daily mean surface air temperature (SAT) data from European stations as the influence of the phase of slow oscillatory phenomena with periods around 6-11 years on amplitudes of the variability characterized by smaller temporal scales from a few months to 4-5 years [1]. The strongest effect is exerted by an oscillatory mode with the period close to 8 years and its influence can be seen in 1-2 °C differences of the conditional SAT means taken conditionally on the phase of the 8-year cycle. The size of this effect, however, changes in space and time. The changes in time are studied using sliding window technique, showing that the effect evolves in time, and during the last decades the effect is stronger and significant. Sliding window technique was used along with seasonal division of the data, and it has been found that the cycle is most pronounced in the winter season. Different types of surrogate data are applied in order to establish statistical significance and distinguish the effect of the 7-8-yr cycle from climate variability on shorter time scales. [1] M. Palus, Phys. Rev. Lett. 112 078702 (2014) This study is supported by the Ministry of Education, Youth and Sports of the Czech Republic within the Program KONTAKT II, Project No. LH14001.
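The kind of conditioning described above can be illustrated as follows (a simplified stand-in for the method of [1]): band-pass a daily temperature anomaly around the 7-8-year period, take the instantaneous phase via the Hilbert transform, and compare SAT means across phase bins. The synthetic series and Butterworth filter settings are assumptions made for the example, not the authors' procedure.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(2)
n_days = 60 * 365
t = np.arange(n_days)
# Synthetic daily SAT anomaly: a weak 8-year mode buried in weather noise
sat = 0.8 * np.sin(2 * np.pi * t / (8 * 365.25)) + rng.normal(0.0, 2.0, n_days)

# Zero-phase band-pass around the 7-8-year period, then instantaneous phase
sos = butter(3, [1 / (9 * 365.25), 1 / (6 * 365.25)], btype="band", fs=1.0, output="sos")
mode = sosfiltfilt(sos, sat)
phase = np.angle(hilbert(mode))

# SAT means conditioned on the phase of the slow mode (8 phase bins)
bins = np.linspace(-np.pi, np.pi, 9)
idx = np.digitize(phase, bins) - 1
cond_means = np.array([sat[idx == i].mean() for i in range(8)])
print(np.round(cond_means, 2))   # the max-min spread is the "effect size" in degC
```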
Seeing the Light: A Classroom-Sized Pinhole Camera Demonstration for Teaching Vision
ERIC Educational Resources Information Center
Prull, Matthew W.; Banks, William P.
2005-01-01
We describe a classroom-sized pinhole camera demonstration (camera obscura) designed to enhance students' learning of the visual system. The demonstration consists of a suspended rear-projection screen onto which the outside environment projects images through a small hole in a classroom window. Students can observe these images in a darkened…
NASA Technical Reports Server (NTRS)
Gray, Perry; Guven, Ibrahim
2016-01-01
A new facility for making small particle impacts is being developed at NASA. Current sand/particle impact facilities are an erosion test and do not precisely measure and document the size and velocity of each of the impacting particles. In addition, evidence of individual impacts is often obscured by subsequent impacts. This facility will allow the number, size, and velocity of each particle to be measured and adjusted. It will also be possible to determine which particle produced damage at a given location on the target. The particle size and velocity will be measured by high speed imaging techniques. Information as to the extent of damage and debris from impacts will also be recorded. It will be possible to track these secondary particles, measuring size and velocity. It is anticipated that this additional degree of detail will provide input for erosion models and also help determine the impact physics of the erosion process. Particle impacts will be recorded at 90 degrees to the particle flight path and also from the top looking through the target window material.
Remembering the Important Things: Semantic Importance in Stream Reasoning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Rui; Greaves, Mark T.; Smith, William P.
Reasoning and querying over data streams rely on the ability to deliver a sequence of stream snapshots to the processing algorithms. These snapshots are typically provided using windows as views into streams and associated window management strategies. Generally, the goal of any window management strategy is to preserve the most important data in the current window and preferentially evict the rest, so that the retained data can continue to be exploited. A simple timestamp-based strategy is first-in-first-out (FIFO), in which items are replaced in strict order of arrival. All timestamp-based strategies implicitly assume that a temporal ordering reliably reflects importance to the processing task at hand, and thus that window management using timestamps will maximize the ability of the processing algorithms to deliver accurate interpretations of the stream. In this work, we explore a general notion of semantic importance that can be used for window management for streams of RDF data using semantically-aware processing algorithms like deduction or semantic query. Semantic importance exploits the information carried in RDF and surrounding ontologies for ranking window data in terms of its likely contribution to the processing algorithms. We explore the general semantic categories of query contribution, provenance, and trustworthiness, as well as the contribution of domain-specific ontologies. We describe how these categories behave using several concrete examples. Finally, we consider how a stream window management strategy based on semantic importance could improve overall processing performance, especially as available window sizes decrease.
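A minimal sketch of the window-management idea, stripped of the RDF and ontology machinery: a fixed-capacity window evicts the item with the lowest score under a pluggable importance function, and FIFO falls out as the special case where the score is simply the arrival order. The item structure and trust scores below are invented for illustration.

```python
import heapq
import itertools

class ManagedWindow:
    """Fixed-capacity stream window that evicts the least 'important' item.

    score_fn maps an item to an importance value; with the arrival order as
    the score, eviction degenerates to FIFO. A real system would score RDF
    triples by query contribution, provenance, trustworthiness, etc.
    """
    def __init__(self, capacity, score_fn):
        self.capacity = capacity
        self.score_fn = score_fn
        self._heap = []                       # (score, seq, item), min-heap
        self._seq = itertools.count()

    def push(self, item):
        heapq.heappush(self._heap, (self.score_fn(item), next(self._seq), item))
        if len(self._heap) > self.capacity:
            heapq.heappop(self._heap)         # evict the lowest-importance item

    def contents(self):
        return [item for _, _, item in sorted(self._heap, key=lambda e: e[1])]

# FIFO behaviour: importance == arrival order (newest items always kept)
fifo = ManagedWindow(3, score_fn=lambda item: item["arrival"])
# Semantic behaviour: importance == a trust/relevance score
semantic = ManagedWindow(3, score_fn=lambda item: item["trust"])

stream = [{"arrival": i, "trust": t, "fact": f"triple-{i}"}
          for i, t in enumerate([0.9, 0.2, 0.8, 0.1, 0.95])]
for item in stream:
    fifo.push(item)
    semantic.push(item)

print([x["fact"] for x in fifo.contents()])      # last three arrivals
print([x["fact"] for x in semantic.contents()])  # three most trusted items
```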
Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M
2013-08-30
In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
Haest, Birgen; Hüppop, Ommo; Bairlein, Franz
2018-04-01
Many migrant bird species that breed in the Northern Hemisphere show advancement in spring arrival dates. The North Atlantic Oscillation (NAO) index is one of the climatic variables that have been most often investigated and shown to be correlated with these changes in spring arrival. Although the NAO is often claimed to be a good predictor or even to have a marked effect on interannual changes in spring migration phenology of Northern Hemisphere breeding birds, the results on relations between spring migration phenology and NAO show a large variety, ranging from no, over weak, to a strong association. Several factors, such as geographic location, migration phase, and the NAO index time window, have been suggested to partly explain these observed differences in association. A combination of a literature meta-analysis, and a meta-analysis and sliding time window analysis of a dataset of 23 short- and long-distance migrants from the constant-effort trapping garden at Helgoland, Germany, however, paints a completely different picture. We found a statistically significant overall effect size of the NAO on spring migration phenology (coefficient = -0.14, SE = 0.054), but this on average only explains 0%-6% of the variance in spring migration phenology across all species. As such, the value and biological meaning of the NAO as a general predictor or explanatory variable for climate change effects on migration phenology of birds, seems highly questionable. We found little to no definite support for previously suggested factors, such as geographic location, migration phenology phase, or the NAO time window, to explain the heterogeneity in correlation differences. We, however, did find compelling evidence that the lack of accounting for trends in both time series has led to strongly inflated (spurious) correlations in many studies (coefficient = -0.13, SE = 0.019). © 2017 John Wiley & Sons Ltd.
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-07-01
Epidemiological research supports an association between maternal exposure to air pollution during pregnancy and adverse children's health outcomes. Advances in exposure assessment and statistics allow for estimation of both critical windows of vulnerability and exposure effect heterogeneity. Simultaneous estimation of windows of vulnerability and effect heterogeneity can be accomplished by fitting a distributed lag model (DLM) stratified by subgroup. However, this can provide an incomplete picture of how effects vary across subgroups because it does not allow for subgroups to have the same window but different within-window effects or to have different windows but the same within-window effect. Because the timing of some developmental processes are common across subpopulations of infants while for others the timing differs across subgroups, both scenarios are important to consider when evaluating health risks of prenatal exposures. We propose a new approach that partitions the DLM into a constrained functional predictor that estimates windows of vulnerability and a scalar effect representing the within-window effect directly. The proposed method allows for heterogeneity in only the window, only the within-window effect, or both. In a simulation study we show that a model assuming a shared component across groups results in lower bias and mean squared error for the estimated windows and effects when that component is in fact constant across groups. We apply the proposed method to estimate windows of vulnerability in the association between prenatal exposures to fine particulate matter and each of birth weight and asthma incidence, and estimate how these associations vary by sex and maternal obesity status in a Boston-area prospective pre-birth cohort study. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Experimental evidence for stochastic switching of supercooled phases in NdNiO3 nanostructures
NASA Astrophysics Data System (ADS)
Kumar, Devendra; Rajeev, K. P.; Alonso, J. A.
2018-03-01
A first-order phase transition is a dynamic phenomenon. In a multi-domain system, the presence of multiple domains of coexisting phases averages out the dynamical effects, making it nearly impossible to predict the exact nature of phase transition dynamics. Here, we report the metal-insulator transition in samples of sub-micrometer size NdNiO3 where the effect of averaging is minimized by restricting the number of domains under study. We observe the presence of supercooled metallic phases with supercooling of 40 K or more. The transformation from the supercooled metallic to the insulating state is a stochastic process that happens at different temperatures and times in different experimental runs. The experimental results are understood without incorporating material specific properties, suggesting that the behavior is of universal nature. The size of the sample needed to observe individual switching of supercooled domains, the degree of supercooling, and the time-temperature window of switching are expected to depend on the parameters such as quenched disorder, strain, and magnetic field.
NASA Astrophysics Data System (ADS)
Huang, Dong; Campos, Edwin; Liu, Yangang
2014-09-01
Statistical characteristics of cloud variability are examined for their dependence on averaging scales and best representation of probability density function with the decade-long retrieval products of cloud liquid water path (LWP) from the tropical western Pacific (TWP), Southern Great Plains (SGP), and North Slope of Alaska (NSA) sites of the Department of Energy's Atmospheric Radiation Measurement Program. The statistical moments of LWP show some seasonal variation at the SGP and NSA sites but not much at the TWP site. It is found that the standard deviation, relative dispersion (the ratio of the standard deviation to the mean), and skewness all quickly increase with the averaging window size when the window size is small and become more or less flat when the window size exceeds 12 h. On average, the cloud LWP at the TWP site has the largest values of standard deviation, relative dispersion, and skewness, whereas the NSA site exhibits the least. Correlation analysis shows that there is a positive correlation between the mean LWP and the standard deviation. The skewness is found to be closely related to the relative dispersion with a correlation coefficient of 0.6. The comparison further shows that the lognormal, Weibull, and gamma distributions reasonably explain the observed relationship between skewness and relative dispersion over a wide range of scales.
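The scale dependence described above is easy to reproduce in outline: compute the standard deviation, relative dispersion, and skewness inside non-overlapping averaging windows of increasing length and average them. The lognormal synthetic series below merely stands in for the retrieved LWP records; the sampling rate and distribution parameters are assumptions.

```python
import numpy as np
from scipy.stats import skew

def windowed_moments(x, window):
    """Standard deviation, relative dispersion (std/mean) and skewness,
    computed inside non-overlapping windows and then averaged."""
    n = (len(x) // window) * window
    blocks = x[:n].reshape(-1, window)
    std = blocks.std(axis=1)
    mean = blocks.mean(axis=1)
    return std.mean(), (std / mean).mean(), skew(blocks, axis=1).mean()

# Positively skewed synthetic "LWP-like" series, one sample per minute for a year
rng = np.random.default_rng(3)
lwp = rng.lognormal(mean=4.0, sigma=0.8, size=60 * 24 * 365)

for hours in (1, 3, 6, 12, 24, 48):
    s, d, g = windowed_moments(lwp, window=hours * 60)
    print(f"{hours:3d} h window: std={s:7.1f}  dispersion={d:.2f}  skewness={g:.2f}")
```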
Time and diffusion lesion size in major anterior circulation ischemic strokes.
Hakimelahi, Reza; Vachha, Behroze A; Copen, William A; Papini, Giacomo D E; He, Julian; Higazi, Mahmoud M; Lev, Michael H; Schaefer, Pamela W; Yoo, Albert J; Schwamm, Lee H; González, R Gilberto
2014-10-01
Major anterior circulation ischemic strokes caused by occlusion of the distal internal carotid artery or proximal middle cerebral artery or both account for about one third of ischemic strokes, with mostly poor outcomes. These strokes are treatable by intravenous tissue-type plasminogen activator and endovascular methods. However, the dynamics of infarct growth in these strokes are poorly documented. The purpose was to help understand infarct growth dynamics by measuring acute infarct size with diffusion-weighted imaging (DWI) at known times after stroke onset in patients with documented internal carotid artery/middle cerebral artery occlusions. Retrospectively, we included 47 consecutive patients with documented internal carotid artery/middle cerebral artery occlusions who underwent DWI within 30 hours of stroke onset. Prospectively, 139 patients were identified using the same inclusion criteria. DWI lesion volumes were measured and correlated to time since stroke onset. Perfusion data were reviewed in those who underwent perfusion imaging. Acute infarct volumes ranged from 0.41 to 318.3 mL. Infarct size and time did not correlate (R²=0.001). The majority of patients had DWI lesions that were <25% of the territory at risk (<70 mL), whether they were imaged <8 or >8 hours after stroke onset. DWI lesions corresponded to areas of greatly reduced perfusion. The poor correlation between infarct volume and time after stroke onset suggests that there are factors more powerful than time in determining infarct size within the first 30 hours. The observations suggest that highly variable cerebral perfusion via the collateral circulation may primarily determine infarct growth dynamics. If verified, clinical implications include the possibility of treating many patients outside traditional time windows. © 2014 American Heart Association, Inc.
NASA Astrophysics Data System (ADS)
Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang
2017-03-01
Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite-difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimum combination of window functions was designed for the finite-difference operator, based on a truncated approximation of the spatial convolution series in pseudo-spectral space, to normalize the outcomes of existing window functions for different orders. The proposed combined window functions not only inherit the characteristics of the constituent window functions, providing better truncation results, but also allow the truncation error of the finite-difference operator to be controlled manually and visually by adjusting the combination and analyzing the characteristics of the main and side lobes of the amplitude response. The error level and elastic forward modeling under the proposed combined scheme were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified binomial window finite difference and the conventional finite difference. Numerical simulation verifies the reliability of the proposed method.
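As a rough illustration of the idea, not the authors' actual scheme, first-derivative finite-difference coefficients can be obtained by truncating the ideal pseudo-spectral series and tapering it with a window, and combining two standard windows trades accuracy in the passband against side-lobe behavior. The sketch below uses a convex combination of a rectangular and a half-Hann taper; the windows, weights and error band are assumptions chosen only for demonstration.

```python
import numpy as np

def fd_coefficients(N, alpha=0.5):
    """First-derivative FD coefficients c_1..c_N on a unit grid: the ideal
    pseudo-spectral coefficients (-1)^(n+1)/n tapered by the combined window
    w = alpha*rectangular + (1-alpha)*half-Hann."""
    n = np.arange(1, N + 1)
    ideal = (-1.0) ** (n + 1) / n                       # truncated spatial convolution series
    hann = 0.5 * (1 + np.cos(np.pi * n / (N + 1)))      # half-Hann taper: 1 at n=0, 0 at n=N+1
    w = alpha * np.ones(N) + (1 - alpha) * hann
    return ideal * w

def amplitude_response(c, k):
    """Numerical wavenumber 2*sum_n c_n sin(n k); the exact operator returns k itself."""
    n = np.arange(1, len(c) + 1)
    return 2.0 * np.sin(np.outer(k, n)) @ c

k = np.linspace(0.01, np.pi, 400)
kk = k[k < 2.5]                                         # wavenumbers inside an assumed useful band
for alpha in (1.0, 0.5, 0.0):                           # pure truncation ... pure Hann taper
    c = fd_coefficients(8, alpha)
    err = np.max(np.abs(amplitude_response(c, kk) - kk))
    print(f"alpha = {alpha:3.1f}   max |k_num - k| for k < 2.5: {err:.4f}")
```

Varying alpha shows the usual trade-off: more tapering suppresses oscillatory (dispersive) error near the band edge at the cost of accuracy at mid wavenumbers.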
Large format geiger-mode avalanche photodiode LADAR camera
NASA Astrophysics Data System (ADS)
Yuan, Ping; Sudharsanan, Rengarajan; Bai, Xiaogang; Labios, Eduardo; Morris, Bryan; Nicholson, John P.; Stuart, Gary M.; Danny, Harrison
2013-05-01
Recently Spectrolab has successfully demonstrated a compact 32x32 Laser Detection and Ranging (LADAR) camera with single-photon-level sensitivity and a small size, weight, and power (SWAP) budget for three-dimensional (3D) topographic imaging at 1064 nm on various platforms. With a 20-kHz frame rate and 500-ps timing uncertainty, this LADAR system provides coverage down to inch-level fidelity and allows for effective wide-area terrain mapping. At a 10 mph forward speed and 1000 feet above ground level (AGL), it covers 0.5 square mile per hour with a resolution of 25 in²/pixel after data averaging. In order to increase the forward speed to suit more platforms and survey a large area more effectively, Spectrolab is developing a 32x128 Geiger-mode LADAR camera with 43 frame rate. With the increase in both frame rate and array size, the data collection rate is improved by 10 times. With a programmable bin size from 0.3 ps to 0.5 ns and 14-bit timing dynamic range, LADAR developers will have more freedom in system integration for various applications. Most of the special features of the Spectrolab 32x32 LADAR camera, such as non-uniform bias correction, variable range gate width, windowing for smaller arrays, and short pixel protection, are implemented in this camera.
Development of the Code RITRACKS
NASA Technical Reports Server (NTRS)
Plante, Ianik; Cucinotta, Francis A.
2013-01-01
A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).
Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan
2015-03-01
In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks) must be handled by researchers in order to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed a compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D
2018-06-08
Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. A code for calculating IPSA is provided. Copyright © 2018. Published by Elsevier Inc.
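For readers unfamiliar with the two estimators, the sketch below contrasts a tapered sliding-window correlation with instantaneous phase synchrony on a pair of synthetic signals; the signals, sampling rate, band limits and the particular synchrony convention (1 - |sin(Δφ/2)|) are assumptions for illustration, not the study's fMRI data or processing chain.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt, windows

fs, n = 1.0 / 0.72, 1200                      # HCP-like sampling, assumed
t = np.arange(n) / fs
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(n)
y = np.sin(2 * np.pi * 0.05 * t + 0.7) + 0.5 * rng.standard_normal(n)

# Narrow-band filtering (0.03-0.07 Hz) before phase extraction, as required for IPSA.
b, a = butter(2, [0.03, 0.07], btype="band", fs=fs)
xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)

# IPSA: single time-point phase synchrony from the Hilbert analytic signal.
dphi = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
ipsa = 1.0 - np.abs(np.sin(dphi / 2.0))       # 1 = perfect synchrony (one common convention)

# Tapered CSWA: Pearson correlation in a sliding Hamming-tapered window (~20 s).
win = int(20 * fs)
taper = windows.hamming(win)
cswa = np.full(n, np.nan)
for i in range(n - win):
    xs, ys = xf[i:i + win] * taper, yf[i:i + win] * taper
    cswa[i + win // 2] = np.corrcoef(xs, ys)[0, 1]

print("median IPSA:", np.nanmedian(ipsa), " median |CSWA|:", np.nanmedian(np.abs(cswa)))
```

Comparing ipsa against the absolute value of cswa mirrors the temporal comparison described in the abstract, where negative sliding-window correlations map onto high phase synchrony.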
Performance prediction using geostatistics and window reservoir simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontanilla, J.P.; Al-Khalawi, A.A.; Johnson, S.G.
1995-11-01
This paper is the first window model study in the northern area of a large carbonate reservoir in Saudi Arabia. It describes window reservoir simulation with geostatistics to model uneven water encroachment in the southwest producing area of the northern portion of the reservoir. In addition, this paper describes performance predictions that investigate the sweep efficiency of the current peripheral waterflood. A 50 x 50 x 549 (240 m. x 260 m. x 0.15 m. average grid block size) geological model was constructed with geostatistics software. Conditional simulation was used to obtain spatial distributions of porosity and volume of dolomite. Core data transforms were used to obtain horizontal and vertical permeability distributions. Simple averaging techniques were used to convert the 549-layer geological model to a 50 x 50 x 10 (240 m. x 260 m. x 8 m. average grid block size) window reservoir simulation model. Flux injectors and flux producers were assigned to the outermost grid blocks. Historical boundary flux rates were obtained from a coarsely-gridded full-field model. Pressure distributions, water cuts, GORs, and recent flowmeter data were history matched. Permeability correction factors and numerous parameter adjustments were required to obtain the final history match. The permeability correction factors were based on pressure transient permeability-thickness analyses. The prediction phase of the study evaluated the effects of infill drilling, the use of artificial lifts, workovers, horizontal wells, producing rate constraints, and tight zone development to formulate depletion strategies for the development of this area. The window model will also be used to investigate day-to-day reservoir management problems in this area.
Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel
2018-01-06
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current's spectrogram with a significant reduction of the required computational resources.
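A minimal sketch of the idea, computing a spectrogram of a stator-current-like signal with a Slepian (DPSS) analysis window instead of a gated Gaussian one, is given below; the test signal, sampling rate, time-bandwidth product and the crude "out-of-band energy" comparison are illustrative assumptions, not the parameters of the tested 3.15-MW machine.

```python
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import dpss, gaussian

fs = 5_000                                   # Hz, assumed sampling rate
t = np.arange(0, 4.0, 1 / fs)
# Toy "start-up" current: 50-Hz fundamental plus a weak swept component, loosely
# mimicking a fault-related sideband moving during the transient.
i_stator = np.cos(2 * np.pi * 50 * t) + 0.05 * np.cos(2 * np.pi * (10 + 10 * t) * t)

nwin = 2048
slepian = dpss(nwin, NW=2.5)                 # first DPSS sequence: maximal energy concentration
gauss = gaussian(nwin, std=nwin / 8)         # gated Gaussian window for comparison

for name, win in (("Slepian", slepian), ("Gaussian", gauss)):
    f, tau, Z = stft(i_stator, fs=fs, window=win, nperseg=nwin, noverlap=nwin // 2)
    # Energy falling outside a +/-5 Hz band around the fundamental, a rough leakage proxy.
    band = (f > 45) & (f < 55)
    leak = np.abs(Z[~band]).sum() / np.abs(Z).sum()
    print(f"{name:8s} window: fraction of spectrogram energy outside 45-55 Hz = {leak:.3f}")
```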
genepop'007: a complete re-implementation of the genepop software for Windows and Linux.
Rousset, François
2008-01-01
This note summarizes developments of the genepop software since its first description in 1995, and in particular those new to version 4.0: an extended input format, several estimators of neighbourhood size under isolation by distance, new estimators and confidence intervals for null allele frequency, and less important extensions to previous options. genepop now runs under Linux as well as under Windows, and can be entirely controlled by batch calls. © 2007 The Author.
A Simple Compression Scheme Based on ASCII Value Differencing
NASA Astrophysics Data System (ADS)
Tommy; Siregar, Rosyidah; Lubis, Imran; Marwan E, Andi; Mahmud H, Amir; Harahap, Mawaddah
2018-04-01
Each ASCII character is represented by a distinct numeric code, so any two characters differ by some numeric value. The characters commonly used in text messages have code values that are close to one another, i.e., their differences are small. These difference values can therefore be substituted for the characters themselves, generating a new, more compact message. This paper discusses exploiting the differences between the ASCII values in a message as a simpler substitution, using a dynamically sized window to obtain the differences among the ASCII values contained in the window as the basis for determining the bit substitution in the compressed file.
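The abstract does not spell out the exact encoding, so the sketch below is only one interpretation: within each window the first byte is kept as a reference and the remaining bytes are stored as differences from it, packed into however many bits the largest difference in that window needs. The window size, bit widths and helper names are hypothetical.

```python
def diff_encode(data: bytes, window: int = 8):
    """Split `data` into windows; store the first byte of each window as a reference
    and the remaining bytes as signed differences from that reference."""
    blocks = []
    for start in range(0, len(data), window):
        chunk = data[start:start + window]
        ref = chunk[0]
        diffs = [b - ref for b in chunk[1:]]
        # bits needed for the largest |difference| in this window, plus a sign bit
        width = max((abs(d) for d in diffs), default=0).bit_length() + 1
        blocks.append((ref, width, diffs))
    return blocks

def encoded_size_bits(blocks):
    # 8 bits for the reference + 4 bits recording the width + `width` bits per difference
    return sum(8 + 4 + width * len(diffs) for _, width, diffs in blocks)

text = b"this is an ordinary ascii text message used for a quick size comparison"
blocks = diff_encode(text)
print("original:", len(text) * 8, "bits   encoded (approx.):", encoded_size_bits(blocks), "bits")
```

Because ordinary text draws on a narrow range of ASCII codes, the per-window differences usually fit in far fewer than 8 bits, which is where the size reduction comes from.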
NASA Astrophysics Data System (ADS)
Cheng, Z.; Chen, Y.; Liu, Y.; Liu, W.; Zhang, G.
2015-12-01
Among hydrocarbon reservoir detection techniques, the time-frequency analysis based approach is one of the most widely used because of its straightforward indication of low-frequency anomalies in the time-frequency maps; that is, low-frequency bright spots usually indicate potential hydrocarbon reservoirs. The time-frequency analysis based approach is easy to implement and, more importantly, is usually of high fidelity in reservoir prediction compared with state-of-the-art alternatives, and thus is of great interest to petroleum geologists, geophysicists, and reservoir engineers. The S transform has been frequently used to obtain the time-frequency maps because it controls the compromise between time and frequency resolution better than alternatives such as the short-time Fourier transform, Gabor transform, and continuous wavelet transform. The window function used in the majority of previous S transform applications is the symmetric Gaussian window. However, one problem with the symmetric Gaussian window is the degradation of time resolution in the time-frequency map due to the long front taper. In our study, a bi-Gaussian S transform that substitutes an asymmetric bi-Gaussian window for the symmetric Gaussian window is proposed to analyze multi-channel seismic data in order to predict hydrocarbon reservoirs. The bi-Gaussian window introduces asymmetry into the resultant time-frequency spectrum, with time resolution better in the front direction than in the back direction. To our knowledge, this is the first time since its introduction in 2003 that the bi-Gaussian S transform has been used to analyze multi-channel post-stack seismic data for hydrocarbon reservoir prediction. The superiority of the bi-Gaussian S transform over the traditional S transform is tested on a real land seismic data example. The results show that the enhanced temporal resolution helps depict more clearly the edge of the hydrocarbon reservoir, especially when the thickness of the reservoir is small (such as thin beds).
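The sketch below implements a simple discrete S-like transform in which the usual frequency-scaled Gaussian window is replaced by an asymmetric bi-Gaussian whose two halves have different widths; the frequency-dependent scaling, the taper ratio and the test signal are assumptions for illustration rather than the authors' exact formulation.

```python
import numpy as np
from scipy.signal import fftconvolve

def bigaussian_s_transform(x, fs, freqs, ratio=0.4):
    """S-like transform with an asymmetric window: for analysis frequency f the window
    width is ~1/f, but one half of the window (lags < 0) is narrowed by `ratio`,
    making the time resolution asymmetric about the analysis time."""
    n = len(x)
    lags = np.arange(n) - n // 2                       # window support centred on lag 0
    t = np.arange(n) / fs
    out = np.empty((len(freqs), n), dtype=complex)
    for k, f in enumerate(freqs):
        sigma = 1.0 / f                                # frequency-dependent width (s)
        sig = np.where(lags < 0, ratio * sigma, sigma) * fs   # widths in samples
        w = np.exp(-0.5 * (lags / sig) ** 2)
        w /= w.sum()
        demod = x * np.exp(-2j * np.pi * f * t)        # shift frequency f to baseband
        out[k] = fftconvolve(demod, w, mode="same")    # local weighted average = windowed spectrum
    return out

fs = 500.0
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 30 * t) * (t < 1.0) + np.sin(2 * np.pi * 60 * t) * (t >= 1.0)
S = bigaussian_s_transform(x, fs, freqs=np.arange(10, 100, 5))
print("time-frequency map shape:", S.shape)
```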
Single-agent parallel window search
NASA Technical Reports Server (NTRS)
Powley, Curt; Korf, Richard E.
1991-01-01
Parallel window search is applied to single-agent problems by having different processes simultaneously perform iterations of Iterative-Deepening-A* (IDA*) on the same problem but with different cost thresholds. This approach is limited by the time to perform the goal iteration. To overcome this disadvantage, the authors consider node ordering. They discuss how global node ordering by minimum h among nodes with equal f = g + h values can reduce the time complexity of serial IDA* by reducing the time to perform the iterations prior to the goal iteration. Finally, the two ideas of parallel window search and node ordering are combined to eliminate the weaknesses of each approach while retaining the strengths. The resulting approach, called simply parallel window search, can be used to find a near-optimal solution quickly, improve the solution until it is optimal, and then finally guarantee optimality, depending on the amount of time available.
Levenson, M.
1960-10-25
A cave window is described. It is constructed of thick glass panes arranged so that interior panes have smaller windowpane areas and exterior panes have larger areas. Exterior panes on the radiation exposure side are remotely replaceable when darkened excessively. Metal shutters minimize exposure time to extend window life.
NASA Astrophysics Data System (ADS)
Zhu, Keyong; Huang, Yong; Pruvost, Jeremy; Legrand, Jack; Pilon, Laurent
2017-06-01
This study aims to quantify systematically the effect of non-absorbing cap-shaped droplets condensed on the backside of transparent windows on their directional-hemispherical transmittance and reflectance. Condensed water droplets have been blamed for reducing light transfer through windows in greenhouses, solar desalination plants, and photobioreactors. Here, the directional-hemispherical transmittance was predicted by the Monte Carlo ray-tracing method. For the first time, both monodisperse and polydisperse droplets were considered, with contact angles between 0 and 180°, arranged either in an ordered hexagonal pattern or randomly distributed on the window backside with projected surface area coverage between 0 and 90%. The directional-hemispherical transmittance was found to be independent of the size and spatial distributions of the droplets. Instead, it depended on (i) the incident angle, (ii) the optical properties of the window and droplets, (iii) the droplet contact angle and (iv) the projected surface area coverage. In fact, the directional-hemispherical transmittance decreased with increasing incident angle. Four optical regimes were identified in the normal-hemispherical transmittance. It was nearly constant for droplet contact angles either smaller than the critical angle θcr (predicted by Snell's law) for total internal reflection at the droplet/air interface or larger than 180°-θcr. However, between these critical contact angles, the normal-hemispherical transmittance decreased rapidly to reach a minimum at 90° and then increased rapidly with increasing contact angle up to 180°-θcr. This was attributed to total internal reflection at the droplet/air interface, which led to increasing reflectance. In addition, the normal-hemispherical transmittance increased slightly with increasing projected surface area coverage when the contact angle was smaller than θcr. However, it decreased monotonically with increasing droplet projected surface area coverage for contact angles larger than θcr. These results can be used to select materials or surface coatings with advantageous surface properties for applications in which dropwise condensation may otherwise have a negative effect on light transmittance.
NASA Technical Reports Server (NTRS)
Craig, Roger A.; Davy, William C.; Whiting, Ellis E.
1994-01-01
The Radiative Heating Experiment, RHE, aboard the Aeroassist Flight Experiment, AFE, (now cancelled) was to make in-situ measurements of the stagnation region shock layer radiation during an aerobraking maneuver from geosynchronous to low earth orbit. The measurements were to provide a data base to help develop and validate aerothermodynamic computational models. Although cancelled, much work was done to develop the science requirements and to successfully meet RHE technical challenges. This paper discusses the RHE scientific objectives and expected science performance of a small sapphire window for the RHE radiometers. The spectral range required was from 170 to 900 nm. The window size was based on radiometer sensitivity requirements including capability of on-orbit solar calibration.
EPICS Controlled Collimator for Controlling Beam Sizes in HIPPO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napolitano, Arthur Soriano; Vogel, Sven C.
2017-08-03
Controlling the beam spot size and shape in a diffraction experiment determines the probed sample volume. HIPPO, the High-Pressure-Preferred Orientation neutron time-of-flight diffractometer, is located at the Lujan Neutron Scattering Center at Los Alamos National Laboratory. HIPPO characterizes microstructural parameters, such as phase composition, strains, grain size, or texture, of bulk (cm-sized) samples. In the current setup, the beam spot has a 10 mm diameter. Using a collimator, consisting of two pairs of neutron-absorbing boron-nitride slabs, the horizontal and vertical dimensions of a rectangular beam spot can be defined. Using the HIPPO robotic sample changer for sample motion, the collimator would enable scanning of e.g. cylindrical samples along the cylinder axis by probing slices of such samples. The project presented here describes the implementation of such a collimator, in particular the motion control software. We utilized the EPICS (Experimental Physics and Industrial Control System) software interface to integrate the collimator control into the HIPPO instrument control system. Using EPICS, commands are sent to commercial stepper motors that move the beam windows.
Real-Time Detection of Dust Devils from Pressure Readings
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri
2009-01-01
A method for real-time detection of dust devils at a given location is based on identifying the abrupt, temporary decreases in atmospheric pressure that are characteristic of dust devils as they travel through that location. The method was conceived for use in a study of dust devils on the Martian surface, where bandwidth limitations encourage the transmission of only those blocks of data that are most likely to contain information about features of interest, such as dust devils. The method, which is a form of intelligent data compression, could readily be adapted for the same purpose in scientific investigation of dust devils on Earth. In this method, the readings of an atmospheric-pressure sensor are repeatedly digitized, recorded, and processed by an algorithm that looks for extreme deviations from a continually updated model of the current pressure environment. The key questions in formulating the algorithm are how to model normal current observations and what minimum deviation magnitude can be considered sufficiently anomalous to indicate the presence of a dust devil. There is no single, simple answer: any answer necessarily entails a compromise between false detections and misses. For the original Mars application, the answer was sought through analysis of sliding time windows of digitized pressure readings. Windows of 5-, 10-, and 15-minute durations were considered. The windows were advanced in increments of 30 seconds. Increments of other sizes can also be used, but the computational cost increases as the increment decreases and analysis is performed more frequently. Pressure models were defined using a polynomial fit to the data within the windows. For example, the figure depicts pressure readings from a 10-minute window wherein the model was defined by a third-degree polynomial fit to the readings and dust devils were identified as negative deviations larger than both 3 standard deviations (from the mean) and 0.05 mbar in magnitude. An algorithm embodying the detection scheme of this example was found to yield a miss rate of just 8 percent and a false-detection rate of 57 percent when evaluated on historical pressure-sensor data collected by the Mars Pathfinder lander. Since dust devils occur infrequently over the course of a mission, prioritizing observations that contain successful detections could greatly conserve bandwidth allocated to a given mission. This technique can be used on future Mars landers and rovers, such as Mars Phoenix and the Mars Science Laboratory.
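A minimal sketch of the detection rule described above, a third-degree polynomial fit over a sliding 10-minute window with negative residuals flagged when they exceed both 3 standard deviations and 0.05 mbar, is shown below; the synthetic pressure trace, sampling cadence and parameter names are assumptions used only to demonstrate the logic.

```python
import numpy as np

def detect_dust_devils(t, p, window_s=600, step_s=30, k_sigma=3.0, min_drop_mbar=0.05):
    """Slide a window over the pressure series, fit a cubic baseline inside it and
    flag samples whose negative residual exceeds both k_sigma*std and min_drop_mbar."""
    hits = set()
    t0, t_end = t[0], t[-1]
    while t0 + window_s <= t_end:
        m = (t >= t0) & (t < t0 + window_s)
        coef = np.polyfit(t[m], p[m], deg=3)           # local model of the pressure environment
        resid = p[m] - np.polyval(coef, t[m])
        thr = max(k_sigma * resid.std(), min_drop_mbar)
        hits.update(np.flatnonzero(m)[resid < -thr])
        t0 += step_s
    return sorted(hits)

# Synthetic example: slowly varying pressure with one sharp 0.3-mbar dip (the "dust devil").
t = np.arange(0, 3600, 2.0)                            # 1 h at 0.5 Hz, assumed cadence
rng = np.random.default_rng(2)
p = 6.8 + 0.05 * np.sin(2 * np.pi * t / 3600) + 0.005 * rng.standard_normal(t.size)
p[900:905] -= 0.3
idx = detect_dust_devils(t, p)
print("flagged sample indices:", idx[:10], "..." if len(idx) > 10 else "")
```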
NASA Astrophysics Data System (ADS)
Goel, Anju; Kumar, Prashant
2015-04-01
Quantification of the disproportionate contribution made by signalised traffic intersections (TIs) to overall daily commuting exposure is important but barely known. We carried out mobile measurements in a car of size-resolved particle number concentrations (PNCs) in the 5-560 nm range under five different ventilation settings on a 6 km long busy round route with 10 TIs. These ventilation settings were: windows fully open and both outdoor air intake from fan and heating off (Set1); windows closed, fan 25% on and heating 50% on (Set2); windows closed, fan 100% on and heating off (Set3); windows closed, fan off and heating 100% on (Set4); and windows closed, fan and heating off (Set5). Measurements were taken sequentially inside and outside the car cabin at a 10 Hz sampling rate using a solenoid switching system in conjunction with a fast-response differential mobility spectrometer (DMS50). The objectives were to: (i) identify traffic conditions under which TIs become hot-spots of PNCs, (ii) assess the effect of ventilation settings in free-flow and delay conditions (waiting time at a TI when the traffic signal is red) on in-cabin PNCs with respect to on-road PNCs at TIs, (iii) derive the relationship between the PNCs and the change in driving speed during delay time at the TIs, and (iv) quantify the contribution of exposure at TIs with respect to overall commuting exposure. Congested TIs were found to become hot-spots when vehicles accelerate from idling conditions. In-cabin peak PNCs followed a similar temporal trend to on-road peak PNCs. The reduction in in-cabin PNC with respect to outside PNC was highest (70%) during free-flow traffic conditions when both the fan drawing outdoor air into the cabin and the heating were switched off. Such a reduction in in-cabin PNCs at TIs was highest (88%) with respect to outside PNC during delay conditions when the fan was drawing outside air at the 25% setting and heating was at the 50% setting. PNCs and change in driving speed showed an exponential-fit relationship during the delay events at TIs. Short-term exposure for ∼2% of total commuting time in the car corresponded to ∼25% of total respiratory doses. This study highlights the need for more studies covering diverse traffic and geographical conditions in urban environments so that the disparate contribution of exposure at TIs can be quantified.
Body size limits dim-light foraging activity in stingless bees (Apidae: Meliponini).
Streinzer, Martin; Huber, Werner; Spaethe, Johannes
2016-10-01
Stingless bees constitute a species-rich tribe of tropical and subtropical eusocial Apidae that act as important pollinators for flowering plants. Many foraging tasks rely on vision, e.g. spatial orientation and detection of food sources and nest entrances. Meliponini workers are usually small, which sets limits on eye morphology and thus quality of vision. Limitations are expected both on acuity, and thus on the ability to detect objects from a distance, as well as on sensitivity, and thus on the foraging time window at dusk and dawn. In this study, we determined light intensity thresholds for flight under dim light conditions in eight stingless bee species in relation to body size in a Neotropical lowland rainforest. Species varied in body size (0.8-1.7 mm thorax width), and we found a strong negative correlation with light intensity thresholds (0.1-79 lx). Further, we measured eye size, ocelli diameter, ommatidia number, and facet diameter. All parameters significantly correlated with body size. A disproportionately low light intensity threshold in the minute Trigonisca pipioli, together with a large eye parameter (Peye), suggests specific adaptations to circumvent the optical constraints imposed by the small body size. We discuss the implications of body size in bees on foraging behavior.
Developmental time windows for axon growth influence neuronal network topology.
Lim, Sol; Kaiser, Marcus
2015-04-01
Early brain connectivity development consists of multiple stages: birth of neurons, their migration and the subsequent growth of axons and dendrites. Each stage occurs within a certain period of time depending on types of neurons and cortical layers. Forming synapses between neurons either by growing axons starting at similar times for all neurons (much-overlapped time windows) or at different time points (less-overlapped) may affect the topological and spatial properties of neuronal networks. Here, we explore the extreme cases of axon formation during early development, either starting at the same time for all neurons (parallel, i.e., maximally overlapped time windows) or occurring for each neuron separately one neuron after another (serial, i.e., no overlaps in time windows). For both cases, the number of potential and established synapses remained comparable. Topological and spatial properties, however, differed: Neurons that started axon growth early on in serial growth achieved higher out-degrees, higher local efficiency and longer axon lengths while neurons demonstrated more homogeneous connectivity patterns for parallel growth. Second, connection probability decreased more rapidly with distance between neurons for parallel growth than for serial growth. Third, bidirectional connections were more numerous for parallel growth. Finally, we tested our predictions with C. elegans data. Together, this indicates that time windows for axon growth influence the topological and spatial properties of neuronal networks opening up the possibility to a posteriori estimate developmental mechanisms based on network properties of a developed network.
Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers
NASA Technical Reports Server (NTRS)
Menasce, Daniel A.
1998-01-01
The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (the scenario) and, for each FTP session, the file size. The network parameters include the round-trip time, the packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver's Window Size. The modeling approach consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
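The abstract lists the model inputs but not its equations; as a stand-in, the sketch below estimates per-session TCP throughput with the well-known Mathis et al. loss-based approximation, caps it by the receiver window and the link bandwidth, and derives an FTP transfer time. All formulas, parameter values and the fair-sharing assumption here are illustrative assumptions, not the FTP Analyzer's actual model.

```python
import math

def tcp_throughput_bps(rtt_s, loss_rate, mss_bytes, rwnd_bytes, link_bps):
    """Rough steady-state TCP throughput: the minimum of the Mathis loss-based estimate,
    the receiver-window limit (rwnd/RTT) and the bottleneck link bandwidth."""
    if loss_rate > 0:
        mathis = (mss_bytes * 8) / rtt_s * math.sqrt(1.5 / loss_rate)
    else:
        mathis = float("inf")
    rwnd_limit = rwnd_bytes * 8 / rtt_s
    return min(mathis, rwnd_limit, link_bps)

def ftp_transfer_time(file_bytes, n_parallel, **net):
    """Assume n_parallel equal sessions share the limiting bandwidth fairly."""
    shared = dict(net, link_bps=net["link_bps"] / n_parallel)
    per_session = tcp_throughput_bps(**shared)
    return file_bytes * 8 / per_session

net = dict(rtt_s=0.06, loss_rate=1e-4, mss_bytes=1460, rwnd_bytes=64 * 1024, link_bps=45e6)
for sessions in (1, 2, 4, 8):
    t = ftp_transfer_time(100e6, sessions, **net)
    print(f"{sessions} parallel session(s): ~{t:6.1f} s per 100-MB file")
```

With these assumed numbers the receiver window, not the link, is the binding constraint for a single session, which is exactly the kind of trade-off such a model is meant to expose.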
Residual motion compensation in ECG-gated interventional cardiac vasculature reconstruction
NASA Astrophysics Data System (ADS)
Schwemmer, C.; Rohkohl, C.; Lauritsch, G.; Müller, K.; Hornegger, J.
2013-06-01
Three-dimensional reconstruction of cardiac vasculature from angiographic C-arm CT (rotational angiography) data is a major challenge. Motion artefacts corrupt image quality, reducing usability for diagnosis and guidance. Many state-of-the-art approaches depend on retrospective ECG-gating of projection data for image reconstruction. A trade-off has to be made regarding the size of the ECG-gating window. A large temporal window is desirable to avoid undersampling. However, residual motion will occur in a large window, causing motion artefacts. We present an algorithm to correct for residual motion. Our approach is based on a deformable 2D-2D registration between the forward projection of an initial, ECG-gated reconstruction, and the original projection data. The approach is fully automatic and does not require any complex segmentation of vasculature, or landmarks. The estimated motion is compensated for during the backprojection step of a subsequent reconstruction. We evaluated the method using the publicly available CAVAREV platform and on six human clinical datasets. We found a better visibility of structure, reduced motion artefacts, and increased sharpness of the vessels in the compensated reconstructions compared to the initial reconstructions. At the time of writing, our algorithm outperforms the leading result of the CAVAREV ranking list. For the clinical datasets, we found an average reduction of motion artefacts by 13 ± 6%. Vessel sharpness was improved by 25 ± 12% on average.
NASA Astrophysics Data System (ADS)
Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra
2005-10-01
Time series analysis tools are employed on the principal modes obtained from the Cα trajectories of two independent molecular-dynamics simulations of α-amylase inhibitor (tendamistat). Fluctuations inside an energy minimum (intraminimum motions), transitions between minima (interminimum motions), and relaxations in different hierarchical energy levels are investigated and compared with those encountered in vacuum by using different sampling window sizes and intervals. The low-frequency low-indexed mode relationship, established in vacuum, is also encountered in water, which shows the reliability of the important dynamics information offered by principal components analysis in water. It has been shown that examining a short data collection period (100 ps) may result in a high population of overdamped modes, while some of the low-frequency oscillations (<10 cm-1) can be captured in water by using a longer data collection period (1200 ps). Simultaneous analysis of short and long sampling window sizes gives the following picture of the effect of water on protein dynamics. Water makes the protein lose its memory: future conformations are less dependent on previous conformations due to the lowering of energy barriers in hierarchical levels of the energy landscape. In short-time dynamics (<10 ps), damping factors extracted from time series model parameters are lowered. For tendamistat, the friction coefficient in the Langevin equation is found to be around 40-60 cm-1 for the low-indexed modes, compatible with the literature. That water increases the friction while at the same time having a lubricating effect seems contradictory at first sight. However, this comes about because water enhances the transitions between minima and forces the protein to reduce its already inherent inability to maintain the oscillations observed in vacuum. Some of the frequencies lower than 10 cm-1 are found to be overdamped, while those higher than 20 cm-1 are slightly increased. As for the long-time dynamics in water, it is found that random-walk motion is maintained for approximately 200 ps (about five times that in vacuum) in the low-indexed modes, showing the lowering of energy barriers between the higher-level minima.
Sunlight Responsive Thermochromic Window System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millett, F. A.; Byker, H. J.
2006-10-27
Pleotint has embarked on a novel approach with our Sunlight Responsive Thermochromic, SRT™, windows. We are integrating dynamic sunlight control, high insulation values and low solar heat gain together in a high performance window. The Pleotint SRT window is dynamic because it reversibly changes light transmission based on thermochromics activated directly by the heating effect of sunlight. We can achieve a window package with a low solar heat gain coefficient (SHGC), a low U value and high insulation. At the same time our windows provide good daylighting. Our innovative window design offers architects and building designers the opportunity to choose their desired energy performance, excellent sound reduction, a self-cleaning external pane, or resistance to wind load, blasts, bullets or hurricanes. SRT windows would provide energy savings that are estimated at up to 30% over traditional window systems. Glass fabricators will be able to use existing equipment to make the SRT window while adding value and flexibility to the basic design. Glazing installers will have the ability to fit the windows with traditional methods without wires, power supplies and controllers. SRT windows can be retrofit into existing buildings.
Effect of the time window on the heat-conduction information filtering model
NASA Astrophysics Data System (ADS)
Guo, Qiang; Song, Wen-Jun; Hou, Lei; Zhang, Yi-Lu; Liu, Jian-Guo
2014-05-01
Recommendation systems have been proposed to filter out the potential tastes and preferences of normal online users; however, the physics of the time window effect on performance is missing, which is critical for saving memory and decreasing computational complexity. In this paper, by gradually expanding the time window, we investigate the impact of the time window on the heat-conduction information filtering model with ten similarity measures. The experimental results on the benchmark dataset Netflix indicate that by using only approximately 11.11% of the recent rating records, the accuracy could be improved by an average of 33.16% and the diversity could be improved by 30.62%. In addition, the recommendation performance on the dataset MovieLens could be preserved by considering only approximately 10.91% of the recent records. Given that recommendation performance is improved at the same time, our discoveries possess significant practical value by largely reducing the computational time and shortening the data storage space.
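A compact sketch of the experiment's core idea, scoring items with a heat-conduction-style spreading process on the user-item graph while restricting the input to only the most recent fraction of ratings, is given below; the tiny rating set, the median-timestamp cutoff and the specific heat-conduction normalization are illustrative assumptions rather than the paper's exact protocol.

```python
import numpy as np

def heat_conduction_scores(A, user):
    """Heat-conduction-style filtering on a user-item adjacency matrix A (users x items):
    resources placed on the target user's items spread to users and back to items,
    with averaging (rather than summing) at each step."""
    k_user = A.sum(axis=1, keepdims=True)                     # user degrees
    k_item = A.sum(axis=0, keepdims=True)                     # item degrees
    f0 = A[user].astype(float)                                # initial resource on collected items
    u = (A * f0).sum(axis=1, keepdims=True) / np.maximum(k_user, 1)    # average over each user's items
    f1 = (A * u).sum(axis=0) / np.maximum(k_item, 1).ravel()           # average over each item's users
    f1[A[user] == 1] = -np.inf                                # do not recommend collected items
    return f1

ratings = [  # (user, item, timestamp) -- assumed toy data
    (0, 0, 9), (0, 1, 8), (0, 4, 3),
    (1, 0, 1), (1, 1, 7), (1, 2, 9),
    (2, 0, 2), (2, 1, 6), (2, 3, 8),
    (3, 2, 2), (3, 3, 1), (3, 4, 10),
]
cutoff = np.median([ts for *_, ts in ratings])                # keep the most recent ~50% of records
A = np.zeros((4, 5), dtype=int)
for u_, i_, ts in ratings:
    if ts >= cutoff:                                          # time-window filter
        A[u_, i_] = 1
print("scores for user 0:", heat_conduction_scores(A, 0))
```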
Du, Yifeng; Kemper, Timothy; Qiu, Jiange; Jiang, Jianxiong
2016-01-01
Neuroinflammation is a common feature in nearly all neurological and some psychiatric disorders. Resembling its extraneural counterpart, neuroinflammation can be both beneficial and detrimental depending on the responding molecules. The overall effect of inflammation on disease progression is highly dependent on the extent of inflammatory mediator production and the duration of inflammatory induction. The time-dependent aspect of inflammatory responses suggests that the therapeutic time window for quelling neuroinflammation might vary with molecular targets and injury types. Therefore, it is important to define the therapeutic time window for anti-inflammatory therapeutics, as contradicting or negative results might arise when different treatment regimens are utilized even in similar animal models. Herein, we discuss a few critical factors that can help define the therapeutic time window and optimize treatment paradigm for suppressing the cyclooxygenase-2/prostaglandin-mediated inflammation after status epilepticus. These determinants should also be relevant to other anti-inflammatory therapeutic strategies for the CNS diseases. PMID:26689339
PET Performance Evaluation of an MR-Compatible PET Insert
Wu, Yibao; Catana, Ciprian; Farrell, Richard; Dokhale, Purushottam A.; Shah, Kanai S.; Qi, Jinyi; Cherry, Simon R.
2010-01-01
A magnetic resonance (MR) compatible positron emission tomography (PET) insert has been developed in our laboratory for simultaneous small animal PET/MR imaging. This system is based on lutetium oxyorthosilicate (LSO) scintillator arrays with position-sensitive avalanche photodiode (PSAPD) photodetectors. The PET performance of this insert has been measured. The average reconstructed image spatial resolution was 1.51 mm. The sensitivity at the center of the field of view (CFOV) was 0.35%, which is comparable to the simulation predictions of 0.40%. The average photopeak energy resolution was 25%. The scatter fraction inside the MRI scanner with a line source was 12% (with a mouse-sized phantom and standard 35 mm Bruker 1H RF coil), 7% (with RF coil only) and 5% (without phantom or RF coil) for an energy window of 350–650 keV. The front-end electronics had a dead time of 390 ns, and a trigger extension dead time of 7.32 μs that degraded counting rate performance for injected doses above ~0.75 mCi (28 MBq). The peak noise-equivalent count rate (NECR) of 1.27 kcps was achieved at 290 μCi (10.7 MBq). The system showed good imaging performance inside a 7-T animal MRI system; however improvements in data acquisition electronics and reduction of the coincidence timing window are needed to realize improved NECR performance. PMID:21072320
Predicting Visual Distraction Using Driving Performance Data
Kircher, Katja; Ahlstrom, Christer
2010-01-01
Behavioral variables are often used as performance indicators (PIs) of visual or internal distraction induced by secondary tasks. The objective of this study is to investigate whether visual distraction can be predicted by driving performance PIs in a naturalistic setting. Visual distraction is here defined by a gaze based real-time distraction detection algorithm called AttenD. Seven drivers used an instrumented vehicle for one month each in a small scale field operational test. For each of the visual distraction events detected by AttenD, seven PIs such as steering wheel reversal rate and throttle hold were calculated. Corresponding data were also calculated for time periods during which the drivers were classified as attentive. For each PI, means between distracted and attentive states were calculated using t-tests for different time-window sizes (2 – 40 s), and the window width with the smallest resulting p-value was selected as optimal. Based on the optimized PIs, logistic regression was used to predict whether the drivers were attentive or distracted. The logistic regression resulted in predictions which were 76 % correct (sensitivity = 77 % and specificity = 76 %). The conclusion is that there is a relationship between behavioral variables and visual distraction, but the relationship is not strong enough to accurately predict visual driver distraction. Instead, behavioral PIs are probably best suited as complementary to eye tracking based algorithms in order to make them more accurate and robust. PMID:21050615
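The pipeline sketched in the abstract, computing each performance indicator over a range of window sizes, keeping the window with the smallest t-test p-value for separating distracted from attentive periods, and then feeding the optimized PIs into a logistic regression, can be prototyped as below; the synthetic data, feature names, effect sizes and library choices (SciPy, scikit-learn) are assumptions, not the study's processing chain.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 400
labels = rng.integers(0, 2, n)                 # 1 = "distracted" event, 0 = attentive period

def pi_for_window(window_s, effect):
    """Hypothetical PI (e.g. steering reversal rate) aggregated over a window; here the
    class separation is made to peak near a ~20 s window purely for demonstration."""
    scale = np.exp(-((window_s - 20) / 15) ** 2)
    return rng.normal(labels * effect * scale, 1.0)

features = [("steering_reversal_rate", 0.8), ("throttle_hold", 0.6)]
windows = np.arange(2, 42, 2)                  # candidate window sizes, 2-40 s
best = {}
for name, effect in features:
    pvals = [ttest_ind(x[labels == 1], x[labels == 0]).pvalue
             for x in (pi_for_window(w, effect) for w in windows)]
    best[name] = windows[int(np.argmin(pvals))]
    print(f"{name}: best window = {best[name]} s")

# Logistic regression on PIs computed at their individually optimal windows.
X = np.column_stack([pi_for_window(best[name], effect) for name, effect in features])
acc = cross_val_score(LogisticRegression(), X, labels, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```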
Smart windows with functions of reflective display and indoor temperature-control
NASA Astrophysics Data System (ADS)
Lee, I.-Hui; Chao, Yu-Ching; Hsu, Chih-Cheng; Chang, Liang-Chao; Chiu, Tien-Lung; Lee, Jiunn-Yih; Kao, Fu-Jen; Lee, Chih-Kung; Lee, Jiun-Haw
2010-02-01
In this paper, a switchable window based on a cholesteric liquid crystal (CLC) was demonstrated. Under different applied voltages, incoming light at visible and infrared wavelengths was modulated, respectively. A mixture of CLC with a nematic liquid crystal and a chiral dopant selectively reflected infrared light without bias, which effectively reduced the indoor temperature under sunlight illumination. At the same time, transmission in the visible range was kept high and the window looked transparent. When the voltage was increased to 15 V, the CLC changed to the focal conic state and could be used as a reflective display, a privacy window, or a screen for a projector. Under a high voltage (30 V), the homeotropic state was achieved; both infrared and visible light were transmitted, so the device acted as a normal window, permitting the infrared spectrum of winter sunlight to enter the room and reduce the heating requirement. Such a device can be used as a switchable window in smart buildings, greenhouses and windshields.
Ullah, Sami; Daud, Hanita; Dass, Sarat C; Khan, Habib Nawaz; Khalil, Alamgir
2017-11-06
The ability to detect potential space-time clusters in spatio-temporal data on disease occurrences is necessary for conducting surveillance and implementing disease prevention policies. Most existing techniques use geometrically shaped (circular, elliptical or square) scanning windows to discover disease clusters. In certain situations, where the disease occurrences tend to cluster in very irregularly shaped areas, these algorithms are not feasible in practice for the detection of space-time clusters. To address this problem, a new algorithm is proposed, which uses a co-clustering strategy to detect prospective and retrospective space-time disease clusters with no restriction on shape and size. The proposed method detects space-time disease clusters by tracking the changes in the space-time occurrence structure instead of an in-depth search over space. This method was utilised to detect potential clusters in the annual and monthly malaria data for Khyber Pakhtunkhwa Province, Pakistan from 2012 to 2016, visualising the results on a heat map. The results of the annual data analysis showed that the most likely hotspot emerged in three sub-regions in the years 2013-2014. The most likely hotspots in the monthly data appeared in the months of July to October each year and showed a strong periodic trend.
Computational model for behavior shaping as an adaptive health intervention strategy.
Berardi, Vincent; Carretero-González, Ricardo; Klepeis, Neil E; Ghanipoor Machiani, Sahar; Jahangiri, Arash; Bellettiere, John; Hovell, Melbourne
2018-03-01
Adaptive behavioral interventions that automatically adjust in real time to participants' changing behavior, environmental contexts, and individual history are becoming more feasible as the use of real-time sensing technology expands. This development is expected to address shortcomings associated with traditional behavioral interventions, such as the reliance on imprecise intervention procedures and limited/short-lived effects. However, just-in-time adaptive intervention (JITAI) adaptation strategies often lack a theoretical foundation, and increasing the theoretical fidelity of a trial has been shown to increase effectiveness. This research explores the use of shaping, a well-known process from behavioral theory for engendering or maintaining a target behavior, as a JITAI adaptation strategy. A computational model of behavior dynamics and operant conditioning was modified to incorporate the construct of behavior shaping by adding the ability to vary, over time, the range of behaviors that were reinforced when emitted. Digital experiments were performed with this updated model for a range of parameters in order to identify the behavior shaping features that optimally generated target behavior. Narrowing the range of reinforced behaviors continuously in time led to better outcomes than a discrete narrowing of the reinforcement window. Rapid narrowing followed by more moderate decreases in window size was more effective in generating target behavior than the inverse scenario. The computational shaping model represents an effective tool for investigating JITAI adaptation strategies. Model parameters must now be translated from the digital domain to real-world experiments so that model findings can be validated.
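To make the comparison concrete, the toy simulation below reinforces an agent whenever its emitted behavior falls inside a reinforcement window around the target, and narrows that window either continuously or in a few discrete steps; the agent model, learning rule and parameter values are invented for illustration and are not the paper's computational model.

```python
import numpy as np

def shaping_run(narrowing, target=50.0, w0=55.0, w1=1.0, steps=3000, lr=0.3, seed=0):
    """Toy shaping loop: behavior ~ Normal(mu, 3); an emission is reinforced when it
    falls inside the current window [target - width, target + width], and reinforcement
    pulls mu toward the emitted value. `narrowing` maps progress in [0, 1) to the width."""
    rng = np.random.default_rng(seed)
    mu = 0.0
    for k in range(steps):
        width = narrowing(k / steps, w0, w1)
        b = rng.normal(mu, 3.0)
        if abs(b - target) <= width:               # reinforced emission
            mu += lr * (b - mu)
    return mu

continuous = lambda p, w0, w1: w0 + (w1 - w0) * p                         # linear narrowing
discrete = lambda p, w0, w1: [w0, (w0 + w1) / 2, w1][min(int(p * 3), 2)]  # 3 fixed stages
rapid_then_moderate = lambda p, w0, w1: w1 + (w0 - w1) * (1 - p) ** 3     # fast early narrowing

for name, f in [("continuous", continuous), ("discrete", discrete),
                ("rapid-then-moderate", rapid_then_moderate)]:
    final = np.mean([shaping_run(f, seed=s) for s in range(10)])
    print(f"{name:20s} mean final behavior: {final:5.1f}  (target 50)")
```

With these invented parameters the discrete schedule tends to stall once the window jumps well past the agent's current behavior, which illustrates, in miniature, why continuous narrowing fared better in the reported simulations.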
Improvement of the user interface of multimedia applications by automatic display layout
NASA Astrophysics Data System (ADS)
Lueders, Peter; Ernst, Rolf
1995-03-01
Multimedia research has mainly focused on real-time data capture and display combined with compression, storage and transmission of these data. However, there is another problem: selecting and arranging, in real time, a possibly large amount of data from multiple media on the computer screen, together with the textual and graphical data of regular software. This problem is already known from complex software systems, such as CASE and hypertext, and will be aggravated further in multimedia systems. The aim of our work is to relieve the user of the burden of continuously selecting, placing and sizing windows and their contents, without introducing solutions limited to only a few applications. We present an experimental system that controls the computer screen contents and layouts, directed by user- and/or tool-provided information filtering and prioritization. To be application independent, the screen layout is based on general layout optimization algorithms adapted from VLSI layout and controlled by application-specific objective functions. In this paper, we discuss the problems of a comprehensible screen layout, including the stability of displayed information over time, the information filtering, the layout algorithms and the adaptation of the objective function to a specific application. We give some examples of different standard applications with layout problems ranging from hierarchical graph layout to window layout. The results show that automatic, tool-independent display layout will be possible in a real-time interactive environment.
Optimal Window and Lattice in Gabor Transform. Application to Audio Analysis.
Lachambre, Helene; Ricaud, Benjamin; Stempfel, Guillaume; Torrésani, Bruno; Wiesmeyr, Christoph; Onchis-Moaca, Darian
2015-01-01
This article deals with the use of optimal lattice and optimal window in Discrete Gabor Transform computation. In the case of a generalized Gaussian window, extending earlier contributions, we introduce an additional local window adaptation technique for non-stationary signals. We illustrate our approach and the earlier one by addressing three time-frequency analysis problems to show the improvements achieved by the use of optimal lattice and window: close frequencies distinction, frequency estimation and SNR estimation. The results are presented, when possible, with real world audio signals.
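The notion of an "optimal" analysis window can be illustrated by a simple search over Gaussian window widths that keeps the spectrogram with the best time-frequency concentration, here scored with a Rényi-entropy criterion; this selection rule, the test chirp and the parameter grid are assumptions for illustration, not the optimization method used in the article.

```python
import numpy as np
from scipy.signal import stft

def renyi_entropy(P, alpha=3.0):
    """Rényi entropy of a normalized time-frequency distribution; lower values
    indicate a more concentrated (sharper) representation."""
    P = P / P.sum()
    return np.log2((P ** alpha).sum()) / (1.0 - alpha)

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
x = np.cos(2 * np.pi * (50 * t + 40 * t ** 2))        # linear chirp, 50 -> 210 Hz

best = None
for std in (8, 16, 32, 64, 128):                      # candidate Gaussian window widths (samples)
    nper = 8 * std
    f, tau, Z = stft(x, fs=fs, window=("gaussian", std), nperseg=nper, noverlap=nper // 2)
    H = renyi_entropy(np.abs(Z) ** 2)
    print(f"std = {std:4d} samples  ->  Renyi entropy {H:.2f}")
    if best is None or H < best[1]:
        best = (std, H)
print("most concentrated spectrogram obtained with std =", best[0], "samples")
```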
Cecere, Roberto; Gross, Joachim; Thut, Gregor
2016-06-01
The ability to integrate auditory and visual information is critical for effective perception and interaction with the environment, and is thought to be abnormal in some clinical populations. Several studies have investigated the time window over which audiovisual events are integrated, also called the temporal binding window, and revealed asymmetries depending on the order of audiovisual input (i.e. the leading sense). When judging audiovisual simultaneity, the binding window appears narrower and non-malleable for auditory-leading stimulus pairs and wider and trainable for visual-leading pairs. Here we specifically examined the level of independence of binding mechanisms when auditory-before-visual vs. visual-before-auditory input is bound. Three groups of healthy participants practiced audiovisual simultaneity detection with feedback, selectively training on auditory-leading stimulus pairs (group 1), visual-leading stimulus pairs (group 2) or both (group 3). Subsequently, we tested for learning transfer (crossover) from trained stimulus pairs to non-trained pairs with opposite audiovisual input. Our data confirmed the known asymmetry in size and trainability for auditory-visual vs. visual-auditory binding windows. More importantly, practicing one type of audiovisual integration (e.g. auditory-visual) did not affect the other type (e.g. visual-auditory), even if trainable by within-condition practice. Together, these results provide crucial evidence that audiovisual temporal binding for auditory-leading vs. visual-leading stimulus pairs are independent, possibly tapping into different circuits for audiovisual integration due to engagement of different multisensory sampling mechanisms depending on leading sense. Our results have implications for informing the study of multisensory interactions in healthy participants and clinical populations with dysfunctional multisensory integration. © 2016 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
R-mode constraints from neutron star equation of state
NASA Astrophysics Data System (ADS)
Papazoglou, M. C.; Moustakidis, C. C.
2016-03-01
Gravitational radiation was proposed long ago as an explanation for the observed relatively low spin frequencies of young neutron stars and of accreting neutron stars in low-mass X-ray binaries as well. In the present work we studied the effects of the neutron star equation of state on the r-mode instability window of rotating neutron stars. Firstly, we employed a set of analytical solutions of the Tolman-Oppenheimer-Volkoff equations, with special emphasis on the Tolman VII solution. In particular, we tried to clarify the effects of the bulk neutron star properties (mass, radius, density distribution, crust size and elasticity) on the r-mode instability window. We found that the critical angular velocity Ωc depends mainly on the neutron star radius. The effects of the gravitational mass and the mass distribution are almost negligible. Secondly, we studied the effect of the elasticity of the crust, via the slippage factor S, and also the effect of the nuclear equation of state, via the slope parameter L, on the instability window. We found that the crust effects are more pronounced compared with those originating from the equation of state. Moreover, we propose simple analytical expressions that relate the macroscopic quantity Ωc to the radius, the parameter L and the factor S. We also investigated the possibility of measuring the radius of a neutron star and the factor S with the help of accurate measurements of Ωc and the neutron star temperature. Finally, we studied the effects of mutual friction on the instability window and discussed the results in comparison with previous similar studies.
Donega, Vanessa; van Bel, Frank; Kas, Martien J. H.; Kavelaars, Annemieke; Heijnen, Cobi J.
2013-01-01
Mesenchymal stem cell (MSC) administration via the intranasal route could become an effective therapy to treat neonatal hypoxic-ischemic (HI) brain damage. We analyzed long-term effects of intranasal MSC treatment on lesion size, sensorimotor and cognitive behavior, and determined the therapeutic window and dose-response relationships. Furthermore, the appearance of MSCs at the lesion site in relation to the therapeutic window was examined. Nine-day-old mice were subjected to unilateral carotid artery occlusion and hypoxia. MSCs were administered intranasally at 3, 10 or 17 days after hypoxia-ischemia (HI). Motor, cognitive and histological outcomes were investigated. PKH-26 labeled cells were used to localize MSCs in the brain. We identified 0.5×10^6 MSCs as the minimal effective dose with a therapeutic window of at least 10 days but less than 17 days post-HI. A single dose was sufficient for a marked beneficial effect. MSCs reach the lesion site within 24 h when given 3 or 10 days after injury. However, no MSCs were detected in the lesion when administered 17 days following HI. We also show for the first time that intranasal MSC treatment after HI improves cognitive function. Improvement of sensorimotor function and histological outcome was maintained until at least 9 weeks post-HI. The capacity of MSCs to reach the lesion site within 24 h after intranasal administration at 10 days but not at 17 days post-HI indicates a therapeutic window of at least 10 days. Our data strongly indicate that intranasal MSC treatment may become a promising non-invasive therapeutic tool to effectively reduce neonatal encephalopathy. PMID:23300948
High Reliability R-10 Windows Using Vacuum Insulating Glass Units
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, David
2012-08-16
The objective of this effort was for EverSealed Windows (“EverSealed” or “ESW”) to design, assemble, thermally and environmentally test and demonstrate a Vacuum Insulating Glass Unit (“VIGU” or “VIG”) that would enable a whole window to meet or exceed an R-10 insulating value (U-factor ≤ 0.1). To produce a VIGU that could withstand any North American environment, ESW believed it needed to design, produce and use a flexible edge seal system. This is because a rigid edge seal, used by all other known VIG producers and developers, limits the size and/or thermal environment of the VIG to where the unit is not practical for typical IG sizes and cannot withstand severe outdoor environments. The rigid-sealed VIG’s use would be limited to mild climates where it would not have a reasonable economic payback when compared to traditional double-pane or triple-pane IGs. ESW’s goals, in addition to achieving a sufficiently high R-value to enable a whole window to achieve R-10, included creating a VIG design that could be produced for a cost equal to or lower than a traditional triple-pane IG (low-e, argon filled). ESW achieved these goals. EverSealed produced, tested and demonstrated a flexible edge-seal VIG that had an R-13 insulating value and the edge-seal system durability to operate reliably for at least 40 years in the harshest climates of North America.
Communicating likelihoods and probabilities in forecasts of volcanic eruptions
NASA Astrophysics Data System (ADS)
Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas
2014-02-01
The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in an underestimate of the likelihood of an event occurring ‘today’, leading to potentially inappropriate action choices. We thus present some initial guidelines for communicating such eruption forecasts.
Sood, Mehak; Besson, Pierre; Muthalib, Makii; Jindal, Utkarsh; Perrey, Stephane; Dutta, Anirban; Hayashibe, Mitsuhiro
2016-12-01
Transcranial direct current stimulation (tDCS) has been shown to perturb both cortical neural activity and hemodynamics during (online) and after the stimulation; however, the mechanisms of these tDCS-induced online and after-effects are not known. Here, online resting-state spontaneous brain activation may be relevant to monitor tDCS neuromodulatory effects that can be measured using electroencephalography (EEG) in conjunction with near-infrared spectroscopy (NIRS). We present a Kalman filter based online parameter estimation of an autoregressive (ARX) model to track the transient coupling relation between the changes in EEG power spectrum and NIRS signals during anodal tDCS (2 mA, 10 min) using a 4×1 ring high-definition montage. Our online ARX parameter estimation technique using the cross-correlation between log (base-10) transformed EEG band-power (0.5-11.25 Hz) and NIRS oxy-hemoglobin signal in the low frequency (≤0.1 Hz) range was shown in 5 healthy subjects to be sensitive to detect transient EEG-NIRS coupling changes in resting-state spontaneous brain activation during anodal tDCS. Conventional sliding window cross-correlation calculations suffer from a fundamental problem in computing the phase relationship, as the signal in the window is considered time-invariant and the choice of the window length and step size are subjective. Here, the Kalman filter based method allowed online ARX parameter estimation using time-varying signals that could capture transients in the coupling relationship between EEG and NIRS signals. Our new online ARX model based tracking method allows continuous assessment of the transient coupling between the electrophysiological (EEG) and the hemodynamic (NIRS) signals representing resting-state spontaneous brain activation during anodal tDCS. Published by Elsevier B.V.
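A minimal sketch of online tracking of ARX coefficients with a random-walk Kalman filter (essentially recursive least squares with process noise); the model orders, noise levels and variable names are illustrative assumptions, not the parameters used in the study:

```python
import numpy as np

def kalman_arx_track(y, u, na=2, nb=2, q=1e-5, r=1e-2):
    """Track time-varying ARX coefficients with a random-walk Kalman filter.

    y : output series (e.g., log10 EEG band power); u : input series
    (e.g., NIRS oxy-hemoglobin), both 1-D arrays of equal length.
    Returns an array of coefficient estimates, one row per sample.
    """
    y = np.asarray(y, float)
    u = np.asarray(u, float)
    n, d = len(y), na + nb
    theta = np.zeros(d)            # state: [a_1..a_na, b_1..b_nb]
    P = np.eye(d)                  # state covariance
    Q = q * np.eye(d)              # random-walk process noise
    history = np.zeros((n, d))
    for t in range(max(na, nb), n):
        # regressor: most recent past outputs and inputs
        phi = np.concatenate([y[t-na:t][::-1], u[t-nb:t][::-1]])
        # predict (random walk: theta unchanged, covariance grows)
        P = P + Q
        # update with the new measurement y[t]
        S = phi @ P @ phi + r
        K = P @ phi / S
        theta = theta + K * (y[t] - phi @ theta)
        P = P - np.outer(K, phi) @ P
        history[t] = theta
    return history
```

The time-varying input coefficients (the b part of the state) can then serve as a running summary of the EEG-NIRS coupling strength.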
Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom
2016-01-01
The detection of a moving target using an IR-UWB Radar involves the core task of separating the waves reflected by the static background and by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in the context of UWB Radar-based moving target detection. Robust PCA models are criticized for being batched-data-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method consists of processing a small batch of frames which is continually updated, without changing its size, as new frames are captured. We prove that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods demonstrate the robustness and efficiency of RPCA over the classic PCA and the commonly used exponential averaging method. PMID:27598159
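A compact sketch of the idea, assuming the standard inexact-ALM formulation of RPCA and a hypothetical frame stream; the window size, tolerances and the IALM parameter schedule are illustrative, not the paper's settings:

```python
import numpy as np

def rpca_ialm(D, lam=None, tol=1e-7, max_iter=500):
    """Decompose D into low-rank L plus sparse S via the inexact ALM method."""
    D = np.asarray(D, float)
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_two = np.linalg.norm(D, 2)
    Y = D / max(norm_two, np.abs(D).max() / lam)      # scaled dual variable
    mu = 1.25 / norm_two
    mu_bar, rho = mu * 1e7, 1.5
    L, S = np.zeros_like(D), np.zeros_like(D)
    for _ in range(max_iter):
        # low-rank update: singular-value soft thresholding
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse update: element-wise soft thresholding
        T = D - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        Z = D - L - S
        Y = Y + mu * Z
        mu = min(mu * rho, mu_bar)
        if np.linalg.norm(Z, 'fro') / np.linalg.norm(D, 'fro') < tol:
            break
    return L, S

def process_stream(frames, window=32):
    """Overlapping-window processing: keep a fixed-size buffer of the most
    recent frames and re-run the decomposition as each new frame arrives."""
    buffer = []
    for frame in frames:                     # each frame: 1-D array of samples
        buffer.append(np.asarray(frame, float))
        if len(buffer) > window:
            buffer.pop(0)                    # constant buffer size
        if len(buffer) == window:
            D = np.column_stack(buffer)      # columns are consecutive frames
            L, S = rpca_ialm(D)
            yield S[:, -1]                   # sparse (moving-target) part of the newest frame
```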
Influence of Shading on Cooling Energy Demand
NASA Astrophysics Data System (ADS)
Rabczak, Sławomir; Bukowska, Maria; Proszak-Miąsik, Danuta; Nowak, Krzysztof
2017-10-01
The article presents an analysis of the building cooling load taking into account the variability of the factors affecting the size of the heat gains. In order to minimize the cooling demand, the effect of shading elements installed on the outside of the windows on the required cooling capacity of the building's air conditioning system was estimated. Multivariate building cooling load calculations were carried out to determine the achievable reduction in cooling demand. The determination of solar heat gains is laborious, but it yields a result that reflects the influence of the transparent partitions, the devices used as sunscreens and their location on the building envelope with respect to orientation, together with the internal heat gains. In this study, the solar heat gains entering the balance were determined for three window shading variants: 0% shading of the transparent partitions; 50% shading with external shutters set at an angle of 45°; and 100% shading with external slat blinds, applied up to 12:00 on the N and E orientations and from 12:00 on the S and W orientations. The calculation of the average hourly cooling load also considered a variant assuming that the design room temperatures may hypothetically be exceeded for up to 10% of the cooling season. To reduce the electricity consumption of the cooling system, the variant with the smallest demand was used to identify the power supply needed to operate the cooling system. The financial benefits of temporarily relaxing thermal comfort were also assessed.
Consumptive and nonconsumptive effects of cannibalism in fluctuating age-structured populations.
Wissinger, Scott A; Whiteman, Howard H; Denoël, Mathieu; Mumford, Miranda L; Aubee, Catherine B
2010-02-01
Theory and empirical studies suggest that cannibalism in age-structured populations can regulate recruitment depending on the intensity of intraspecific competition between cannibals and victims and the nature of the cannibalism window, i.e., which size classes interact as cannibals and victims. Here we report on a series of experiments that quantify that window for age-structured populations of salamander larvae and paedomorphic adults. We determined body size limits on cannibalism in microcosms and then the consumptive and nonconsumptive (injuries, foraging and activity, diet, growth) effects on victims in mesocosms with seminatural levels of habitat complexity and alternative prey. We found that cannibalism by the largest size classes (paedomorphs and ≥3-yr-old larvae) occurs mainly on young-of-the-year (YOY) victims. Surviving YOY and other small larvae had increased injuries, reduced activity levels, and reduced growth rates in the presence of cannibals. Data on YOY survival in an experiment in which we manipulated the density of paedomorphs combined with historical data on the number of cannibals in natural populations indicate that dominant cohorts of paedomorphs can cause observed recruitment failures. Dietary data indicate that ontogenetic shifts in diet should preclude strong intraspecific competition between YOY and cannibals in this species. Thus our results are consistent with previous empirical and theoretical work that suggests that recruitment regulation by cannibalism is most likely when YOY are vulnerable to cannibalism but have low dietary overlap with cannibals. Understanding the role of cannibalism in regulating recruitment in salamander populations is timely, given the widespread occurrences of amphibian decline. Previous studies have focused on extrinsic (including anthropogenic) factors that affect amphibian population dynamics, whereas the data presented here combined with long-term field observations suggest the potential for intrinsically driven population cycles.
Zhang, Huacheng; Hou, Jue; Hu, Yaoxin; Wang, Peiyao; Ou, Ranwen; Jiang, Lei; Liu, Jefferson Zhe; Freeman, Benny D.; Hill, Anita J.; Wang, Huanting
2018-01-01
Porous membranes with ultrafast ion permeation and high ion selectivity are highly desirable for efficient mineral separation, water purification, and energy conversion, but it is still a huge challenge to efficiently separate monatomic ions of the same valence and similar sizes using synthetic membranes. We report metal organic framework (MOF) membranes, including ZIF-8 and UiO-66 membranes with uniform subnanometer pores consisting of angstrom-sized windows and nanometer-sized cavities for ultrafast selective transport of alkali metal ions. The angstrom-sized windows acted as ion selectivity filters for selection of alkali metal ions, whereas the nanometer-sized cavities functioned as ion conductive pores for ultrafast ion transport. The ZIF-8 and UiO-66 membranes showed a LiCl/RbCl selectivity of ~4.6 and ~1.8, respectively, which are much greater than the LiCl/RbCl selectivity of 0.6 to 0.8 measured in traditional porous membranes. Molecular dynamics simulations suggested that ultrafast and selective ion transport in ZIF-8 was associated with partial dehydration effects. This study reveals ultrafast and selective transport of monovalent ions in subnanometer MOF pores and opens up a new avenue to develop unique MOF platforms for efficient ion separations in the future. PMID:29487910
Zhang, Huacheng; Hou, Jue; Hu, Yaoxin; Wang, Peiyao; Ou, Ranwen; Jiang, Lei; Liu, Jefferson Zhe; Freeman, Benny D; Hill, Anita J; Wang, Huanting
2018-02-01
Porous membranes with ultrafast ion permeation and high ion selectivity are highly desirable for efficient mineral separation, water purification, and energy conversion, but it is still a huge challenge to efficiently separate monatomic ions of the same valence and similar sizes using synthetic membranes. We report metal organic framework (MOF) membranes, including ZIF-8 and UiO-66 membranes with uniform subnanometer pores consisting of angstrom-sized windows and nanometer-sized cavities for ultrafast selective transport of alkali metal ions. The angstrom-sized windows acted as ion selectivity filters for selection of alkali metal ions, whereas the nanometer-sized cavities functioned as ion conductive pores for ultrafast ion transport. The ZIF-8 and UiO-66 membranes showed a LiCl/RbCl selectivity of ~4.6 and ~1.8, respectively, which are much greater than the LiCl/RbCl selectivity of 0.6 to 0.8 measured in traditional porous membranes. Molecular dynamics simulations suggested that ultrafast and selective ion transport in ZIF-8 was associated with partial dehydration effects. This study reveals ultrafast and selective transport of monovalent ions in subnanometer MOF pores and opens up a new avenue to develop unique MOF platforms for efficient ion separations in the future.
NASA Astrophysics Data System (ADS)
Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher
2017-03-01
As critical dimensions for advanced two-dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method based on utilizing a focus modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay-related bridging hotspots can also be identified.
OSLG: A new granting scheme in WDM Ethernet passive optical networks
NASA Astrophysics Data System (ADS)
Razmkhah, Ali; Rahbar, Akbar Ghaffarpour
2011-12-01
Several granting schemes have been proposed to grant transmission windows and perform dynamic bandwidth allocation (DBA) in passive optical networks (PON). Generally, granting schemes suffer from bandwidth wastage of granted windows. Here, we propose a new granting scheme for WDM Ethernet PONs, called optical network unit (ONU) Side Limited Granting (OSLG), which conserves upstream bandwidth, thus decreasing queuing delay and packet drop ratio. In OSLG, each ONU, instead of the optical line terminal (OLT), determines its own transmission window. Two OSLG algorithms are proposed in this paper: the OSLG_GA algorithm, in which each ONU determines the size of its transmission window in such a way that the bandwidth wastage problem is relieved, and the OSLG_SC algorithm, which saves unused bandwidth for better bandwidth utilization later on. OSLG can be used as the granting scheme of any DBA to provide better performance in terms of packet drop ratio and queuing delay. Our performance evaluations show the effectiveness of OSLG in reducing packet drop ratio and queuing delay under different DBA techniques.
Gold, Raymond; Roberts, James H.
1989-01-01
A solid state track recording type dosimeter is disclosed to measure the time dependence of the absolute fission rates of nuclides or neutron fluence over a period of time. In a primary species an inner recording drum is rotatably contained within an exterior housing drum that defines a series of collimating slit apertures overlying windows defined in the stationary drum through which radiation can enter. Film type solid state track recorders are positioned circumferentially about the surface of the internal recording drum to record such radiation or its secondary products during relative rotation of the two elements. In another species both the recording element and the aperture element assume the configuration of adjacent disks. Based on slit size of apertures and relative rotational velocity of the inner drum, radiation parameters within a test area may be measured as a function of time and spectra deduced therefrom.
Optimizing searches for electromagnetic counterparts of gravitational wave triggers
NASA Astrophysics Data System (ADS)
Coughlin, Michael W.; Tao, Duo; Chan, Man Leong; Chatterjee, Deep; Christensen, Nelson; Ghosh, Shaon; Greco, Giuseppe; Hu, Yiming; Kapadia, Shasvath; Rana, Javed; Salafia, Om Sharan; Stubbs, Christopher
2018-04-01
With the detection of a binary neutron star system and its corresponding electromagnetic counterparts, a new window of transient astronomy has opened. Due to the size of the sky localization regions, which can span hundreds to thousands of square degrees, there are significant benefits to optimizing tilings for these large sky areas. The rich science promised by gravitational-wave astronomy has led to the proposal of a variety of tiling and time allocation schemes, and for the first time, we make a systematic comparison of some of these methods. We find that differences of a factor of 2 or more in efficiency are possible, depending on the algorithm employed. For this reason, with future surveys searching for electromagnetic counterparts, care should be taken when selecting tiling, time allocation, and scheduling algorithms to optimize counterpart detection.
Optimizing searches for electromagnetic counterparts of gravitational wave triggers
NASA Astrophysics Data System (ADS)
Coughlin, Michael W.; Tao, Duo; Chan, Man Leong; Chatterjee, Deep; Christensen, Nelson; Ghosh, Shaon; Greco, Giuseppe; Hu, Yiming; Kapadia, Shasvath; Rana, Javed; Salafia, Om Sharan; Stubbs, Christopher W.
2018-07-01
With the detection of a binary neutron star system and its corresponding electromagnetic counterparts, a new window of transient astronomy has opened. Due to the size of the sky localization regions, which can span hundreds to thousands of square degrees, there are significant benefits to optimizing tilings for these large sky areas. The rich science promised by gravitational wave astronomy has led to the proposal of a variety of tiling and time allocation schemes, and for the first time, we make a systematic comparison of some of these methods. We find that differences of a factor of 2 or more in efficiency are possible, depending on the algorithm employed. For this reason, with future surveys searching for electromagnetic counterparts, care should be taken when selecting tiling, time allocation, and scheduling algorithms to optimize counterpart detection.
Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn
2016-01-01
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
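A minimal sketch of the SWC pipeline described here (windowed Pearson correlations over all node pairs followed by k-means clustering into states); window length, offset and cluster count are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def sliding_window_correlation(ts, win_len=30, offset=1):
    """ts: (T, N) array of node time series.  Returns a (W, N*(N-1)/2) matrix
    of windowed Pearson correlations, one row per window position."""
    T, N = ts.shape
    iu = np.triu_indices(N, k=1)          # unique node pairs
    swc = []
    for start in range(0, T - win_len + 1, offset):
        window = ts[start:start + win_len]
        # a taper could be applied to `window` here to emulate tapered windows
        swc.append(np.corrcoef(window, rowvar=False)[iu])
    return np.array(swc)

# cluster windowed connectivity patterns into putative "brain states":
# swc = sliding_window_correlation(simulated_ts, win_len=30, offset=1)
# states = KMeans(n_clusters=4, n_init=10).fit_predict(swc)
```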
Peng, Sijia; Wang, Wenjuan; Chen, Chunlai
2018-05-10
Fluorescence correlation spectroscopy is a powerful single-molecule tool that is able to capture kinetic processes occurring at the nanosecond time scale. However, the upper limit of its time window is restricted by the dwell time of the molecule of interest in the confocal detection volume, which is usually around submilliseconds for a freely diffusing biomolecule. Here, we present a simple and easy-to-implement method, named surface transient binding-based fluorescence correlation spectroscopy (STB-FCS), which extends the upper limit of the time window to seconds. We further demonstrated that STB-FCS enables capture of both intramolecular and intermolecular kinetic processes whose time scales cross several orders of magnitude.
NASA Astrophysics Data System (ADS)
McClain, Bobbi J.; Porter, William F.
2000-11-01
Satellite imagery is a useful tool for large-scale habitat analysis; however, its limitations need to be tested. We tested these limitations by varying the methods of a habitat evaluation for white-tailed deer ( Odocoileus virginianus) in the Adirondack Park, New York, USA, utilizing harvest data to create and validate the assessment models. We used two classified images, one with a large minimum mapping unit but high accuracy and one with no minimum mapping unit but slightly lower accuracy, to test the sensitivity of the evaluation to these differences. We tested the utility of two methods of assessment, habitat suitability index modeling, and pattern recognition modeling. We varied the scale at which the models were applied by using five separate sizes of analysis windows. Results showed that the presence of a large minimum mapping unit eliminates important details of the habitat. Window size is relatively unimportant if the data are averaged to a large resolution (i.e., township), but if the data are used at the smaller resolution, then the window size is an important consideration. In the Adirondacks, the proportion of hardwood and softwood in an area is most important to the spatial dynamics of deer populations. The low occurrence of open area in all parts of the park either limits the effect of this cover type on the population or limits our ability to detect the effect. The arrangement and interspersion of cover types were not significant to deer populations.
Region of interest and windowing-based progressive medical image delivery using JPEG2000
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Mukhopadhyay, Sudipta; Wheeler, Frederick W.; Avila, Ricardo S.
2003-05-01
An important telemedicine application is the perusal of CT scans (digital format) from a central server housed in a healthcare enterprise across a bandwidth constrained network by radiologists situated at remote locations for medical diagnostic purposes. It is generally expected that a viewing station respond to an image request by displaying the image within 1-2 seconds. Owing to limited bandwidth, it may not be possible to deliver the complete image in such a short period of time with traditional techniques. In this paper, we investigate progressive image delivery solutions by using JPEG 2000. An estimate of the time taken in different network bandwidths is performed to compare their relative merits. We further make use of the fact that most medical images are 12-16 bits, but would ultimately be converted to an 8-bit image via windowing for display on the monitor. We propose a windowing progressive RoI technique to exploit this and investigate JPEG 2000 RoI based compression after applying a favorite or a default window setting on the original image. Subsequent requests for different RoIs and window settings would then be processed at the server. For the windowing progressive RoI mode, we report a 50% reduction in transmission time.
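A minimal sketch of the display windowing step that motivates the proposed mode (mapping a 12-16 bit image to 8 bits with a window center/width before display); the example window values are assumptions:

```python
import numpy as np

def apply_window(img16, center, width):
    """Map a 12-16 bit image to 8 bits with a window (center/width) setting,
    as done before display on an 8-bit monitor."""
    lo = center - width / 2.0
    scaled = (img16.astype(np.float32) - lo) / float(width)
    return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

# e.g. a typical CT lung window (center/width values are illustrative):
# display = apply_window(ct_slice, center=-600, width=1500)
```

The server can compress the image windowed with a default setting first, and only serve additional RoIs or alternative window settings on request.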
Tapiainen, V; Hartikainen, S; Taipale, H; Tiihonen, J; Tolppanen, A-M
2017-06-01
Studies investigating psychiatric disorders as Alzheimer's disease (AD) risk factors have yielded heterogeneous findings. Differences in time windows between the exposure and outcome could be one explanation. We examined whether (1) mental and behavioral disorders in general or (2) specific mental and behavioral disorder categories increase the risk of AD and (3) how the width of the time window between the exposure and outcome affects the results. We conducted a nationwide nested case-control study of all Finnish clinically verified AD cases alive in 2005 and their age-, sex- and region-of-residence-matched controls (n of case-control pairs 27,948). History of hospital-treated mental and behavioral disorders was available since 1972. Altogether 6.9% (n=1932) of the AD cases and 6.4% (n=1784) of controls had a history of any mental and behavioral disorder. Having any mental and behavioral disorder (adjusted OR=1.07, 95% CI=1.00-1.16) or depression/other mood disorder (adjusted OR=1.17, 95% CI=1.05-1.30) was associated with a higher risk of AD with a 5-year time window but not with a 10-year time window (adjusted OR, 95% CI 0.99, 0.91-1.08 for any disorder and 1.08, 0.96-1.23 for depression). The associations between mental and behavioral disorders and AD were modest and dependent on the time window. Therefore, some of the disorders may represent misdiagnosed prodromal symptoms of AD, which underlines the importance of proper differential diagnostics among older persons. These findings also highlight the importance of an appropriate time window in psychiatric and neuroepidemiology research. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Human Mars Mission: Launch Window from Earth Orbit. Pt. 1
NASA Technical Reports Server (NTRS)
Young, Archie
1999-01-01
The determination of orbital window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the Earth parking orbit. The orientation of the escape hyperbola asymptote relative to Earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as the nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle change and a plane change and, therefore, a ΔV penalty. Usually, because of the ΔV penalty, the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: (1) one impulsive maneuver from a Highly Elliptical Orbit (HEO); (2) two impulsive maneuvers from a HEO; (3) one impulsive maneuver from a Low Earth Orbit (LEO); (4) two impulsive maneuvers from LEO; and (5) three impulsive maneuvers from LEO.
NASA Astrophysics Data System (ADS)
Dwi Prastyo, Dedy; Handayani, Dwi; Fam, Soo-Fen; Puteri Rahayu, Santi; Suhartono; Luh Putu Satyaning Pradnya Paramita, Ni
2018-03-01
Risk assessment and evaluation is essential for financial institutions to measure the potential risk of their counterparties. From mid-2016 until the first quarter of 2017, the Indonesian government ran a national program known as the Tax Amnesty. One subsector with the potential to receive a positive impact from the Tax Amnesty program is property and real estate. This work evaluates the risk of the top five companies, in terms of capital share, listed on the Indonesia Stock Exchange (IDX). To do this, the Value-at-Risk (VaR) with an ARMAX-GARCHX approach is employed. The ARMAX-GARCHX simultaneously models the adaptive mean and variance of the stock return of each company considering exogenous variables, i.e. the IDR/USD exchange rate and the Jakarta Composite Index (JCI). The risk is evaluated in a moving time window scheme. The risk evaluation using the 5% quantile with a window size of 500 transaction days performs better than the other scenarios. In addition, a duration test is used to test the dependency between shortfalls; it indicates that the series of shortfalls is independent.
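A simplified stand-in that illustrates only the moving-window, 5%-quantile evaluation scheme; it uses an empirical quantile of past returns rather than the paper's ARMAX-GARCHX conditional distribution:

```python
import numpy as np

def rolling_var(returns, window=500, alpha=0.05):
    """Empirical Value-at-Risk from a moving window of past returns.
    Only the rolling-window evaluation is shown; a full treatment would
    replace the empirical quantile with the ARMAX-GARCHX forecast."""
    returns = np.asarray(returns, float)
    var = np.full(returns.shape, np.nan)
    for t in range(window, len(returns)):
        var[t] = np.quantile(returns[t - window:t], alpha)   # 5% quantile
    return var

# shortfalls (violations) occur where the realized return falls below VaR:
# hits = returns < rolling_var(returns, window=500)
```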
Phase unwrapping algorithm using polynomial phase approximation and linear Kalman filter.
Kulkarni, Rishikesh; Rastogi, Pramod
2018-02-01
A noise-robust phase unwrapping algorithm is proposed based on state space analysis and polynomial phase approximation using wrapped phase measurement. The true phase is approximated as a two-dimensional first order polynomial function within a small sized window around each pixel. The estimates of polynomial coefficients provide the measurement of phase and local fringe frequencies. A state space representation of spatial phase evolution and the wrapped phase measurement is considered with the state vector consisting of polynomial coefficients as its elements. Instead of using the traditional nonlinear Kalman filter for the purpose of state estimation, we propose to use the linear Kalman filter operating directly with the wrapped phase measurement. The adaptive window width is selected at each pixel based on the local fringe density to strike a balance between the computation time and the noise robustness. In order to retrieve the unwrapped phase, either a line-scanning approach or a quality guided strategy of pixel selection is used depending on the underlying continuous or discontinuous phase distribution, respectively. Simulation and experimental results are provided to demonstrate the applicability of the proposed method.
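A heavily simplified 1-D illustration of the core idea, assuming a two-element state (phase and local frequency) and computing the Kalman innovation on the wrapped residual; the method in the abstract is 2-D and window-based with polynomial coefficients as the state, which is not reproduced here:

```python
import numpy as np

def wrap(p):
    """Wrap phase to (-pi, pi]."""
    return (p + np.pi) % (2 * np.pi) - np.pi

def kalman_unwrap_1d(psi, q=1e-4, r=1e-2):
    """Unwrap a noisy 1-D wrapped phase signal with a linear Kalman filter.
    State x = [phase, local frequency]; the innovation uses the wrapped
    residual so the filter never sees 2*pi jumps."""
    psi = np.asarray(psi, float)
    F = np.array([[1.0, 1.0],
                  [0.0, 1.0]])              # constant local-frequency model
    H = np.array([[1.0, 0.0]])
    Q = q * np.eye(2)
    x = np.array([psi[0], 0.0])
    P = np.eye(2)
    out = np.zeros_like(psi)
    for k, z in enumerate(psi):
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the wrapped innovation
        innov = wrap(z - (H @ x)[0])
        S = (H @ P @ H.T)[0, 0] + r
        K = (P @ H.T / S).ravel()
        x = x + K * innov
        P = (np.eye(2) - np.outer(K, H)) @ P
        out[k] = x[0]
    return out
```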
Program Predicts Performance of Optical Parametric Oscillators
NASA Technical Reports Server (NTRS)
Cross, Patricia L.; Bowers, Mark
2006-01-01
A computer program predicts the performances of solid-state lasers that operate at wavelengths from ultraviolet through mid-infrared and that comprise various combinations of stable and unstable resonators, optical parametric oscillators (OPOs), and sum-frequency generators (SFGs), including second-harmonic generators (SHGs). The input to the program describes the signal, idler, and pump beams; the SFG and OPO crystals; and the laser geometry. The program calculates the electric fields of the idler, pump, and output beams at three locations (inside the laser resonator, just outside the input mirror, and just outside the output mirror) as functions of time for the duration of the pump beam. For each beam, the electric field is used to calculate the fluence at the output mirror, plus summary parameters that include the centroid location, the radius of curvature of the wavefront leaving through the output mirror, the location and size of the beam waist, and a quantity known, variously, as a propagation constant or beam-quality factor. The program provides a typical Windows interface for entering data and selecting files. The program can include as many as six plot windows, each containing four graphs.
Automated image segmentation-assisted flattening of atomic force microscopy images.
Wang, Yuliang; Lu, Tongda; Li, Xiaolai; Wang, Huimin
2018-01-01
Atomic force microscopy (AFM) images normally exhibit various artifacts. As a result, image flattening is required prior to image analysis. To obtain optimized flattening results, foreground features are generally manually excluded using rectangular masks in image flattening, which is time-consuming and inaccurate. In this study, a two-step scheme was proposed to achieve optimized image flattening in an automated manner. In the first step, the convex and concave features in the foreground were automatically segmented with accurate boundary detection. The extracted foreground features were taken as exclusion masks. In the second step, data points in the background were fitted as polynomial curves/surfaces, which were then subtracted from raw images to get the flattened images. Moreover, sliding-window-based polynomial fitting was proposed to process images with complex background trends. The working principle of the two-step image flattening scheme was presented, followed by an investigation of the influence of the sliding-window size and polynomial fitting direction on the flattened images. Additionally, the role of image flattening in the morphological characterization and segmentation of AFM images was verified with the proposed method.
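A minimal sketch of the second step (background-only polynomial fitting and subtraction, here per scan line), assuming the foreground mask comes from the segmentation step; the sliding-window variant would apply the same fit in overlapping segments along each line:

```python
import numpy as np

def flatten_rows(image, mask, order=2):
    """Subtract a per-scan-line polynomial background from an AFM image.

    image : (H, W) height data; mask : (H, W) boolean, True for foreground
    features to exclude from the fit (e.g., from automated segmentation).
    """
    flat = image.astype(float).copy()
    x = np.arange(image.shape[1])
    for i, row in enumerate(image):
        bg = ~mask[i]                              # background pixels only
        if bg.sum() > order:                       # enough points to fit
            coeffs = np.polyfit(x[bg], row[bg].astype(float), order)
            flat[i] = row - np.polyval(coeffs, x)  # subtract fitted trend
    return flat
```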
Boswell, Paul G.; Abate-Pella, Daniel; Hewitt, Joshua T.
2015-01-01
Compound identification by liquid chromatography-mass spectrometry (LC-MS) is a tedious process, mainly because authentic standards must be run on a user’s system to be able to confidently reject a potential identity from its retention time and mass spectral properties. Instead, it would be preferable to use shared retention time/index data to narrow down the identity, but shared data cannot be used to reject candidates with an absolute level of confidence because the data are strongly affected by differences between HPLC systems and experimental conditions. However, a technique called “retention projection” was recently shown to account for many of the differences. In this manuscript, we discuss an approach to calculate appropriate retention time tolerance windows for projected retention times, potentially making it possible to exclude candidates with an absolute level of confidence, without needing to have authentic standards of each candidate on hand. In a range of multi-segment gradients and flow rates run among seven different labs, the new approach calculated tolerance windows that were significantly more appropriate for each retention projection than global tolerance windows calculated for retention projections or linear retention indices. Though there were still some small differences between the labs that evidently were not taken into account, the calculated tolerance windows only needed to be relaxed by 50% to make them appropriate for all labs. Even then, 42% of the tolerance windows calculated in this study without standards were narrower than those required by WADA for positive identification, where standards must be run contemporaneously. PMID:26292624
Boswell, Paul G; Abate-Pella, Daniel; Hewitt, Joshua T
2015-09-18
Compound identification by liquid chromatography-mass spectrometry (LC-MS) is a tedious process, mainly because authentic standards must be run on a user's system to be able to confidently reject a potential identity from its retention time and mass spectral properties. Instead, it would be preferable to use shared retention time/index data to narrow down the identity, but shared data cannot be used to reject candidates with an absolute level of confidence because the data are strongly affected by differences between HPLC systems and experimental conditions. However, a technique called "retention projection" was recently shown to account for many of the differences. In this manuscript, we discuss an approach to calculate appropriate retention time tolerance windows for projected retention times, potentially making it possible to exclude candidates with an absolute level of confidence, without needing to have authentic standards of each candidate on hand. In a range of multi-segment gradients and flow rates run among seven different labs, the new approach calculated tolerance windows that were significantly more appropriate for each retention projection than global tolerance windows calculated for retention projections or linear retention indices. Though there were still some small differences between the labs that evidently were not taken into account, the calculated tolerance windows only needed to be relaxed by 50% to make them appropriate for all labs. Even then, 42% of the tolerance windows calculated in this study without standards were narrower than those required by WADA for positive identification, where standards must be run contemporaneously. Copyright © 2015 Elsevier B.V. All rights reserved.
A fast algorithm for vertex-frequency representations of signals on graphs
Jestrović, Iva; Coyle, James L.; Sejdić, Ervin
2016-01-01
The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the fast graph S-transform. Also, the results showed that noise has no effect on the results of the algorithm for the fast windowed graph Fourier transform or on the graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
An Evaluation of TCP with Larger Initial Windows
NASA Technical Reports Server (NTRS)
Allman, Mark; Hayes, Christopher; Ostermann, Shawn
1998-01-01
The Transmission Control Protocol's (TCP's) slow start algorithm gradually increases the amount of data a sender injects into the network, which prevents the sender from overwhelming the network with an inappropriately large burst of traffic. However, the slow start algorithm can make poor use of the available bandwidth for transfers which are small compared to the bandwidth-delay product of the link, such as file transfers of up to a few thousand characters over satellite links or even transfers of several hundred bytes over local area networks. This paper evaluates a proposed performance enhancement that raises the initial window used by TCP from 1 MSS-sized segment to roughly 4 KB. The paper evaluates the impact of using larger initial windows on TCP transfers over both the shared Internet and dialup modem links.
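A small sketch of the arithmetic behind the enhancement, counting idealized slow-start round trips for a short transfer; the segment size and the 4 KB ≈ 3 segments equivalence are assumptions:

```python
def rtts_to_transfer(nbytes, mss=1460, init_window_bytes=1460):
    """Round trips needed to deliver `nbytes` under idealized slow start
    (congestion window doubles each RTT, no loss, no delayed-ACK effects)."""
    cwnd = max(1, init_window_bytes // mss)   # congestion window in segments
    sent, rtts = 0, 0
    while sent < nbytes:
        sent += cwnd * mss
        cwnd *= 2
        rtts += 1
    return rtts

# a ~10 kB transfer: initial window of 1 segment vs. roughly 4 KB (3 segments):
# print(rtts_to_transfer(10_000, init_window_bytes=1460),
#       rtts_to_transfer(10_000, init_window_bytes=3 * 1460))
```

Saving even one round trip matters most on long-delay paths such as satellite links, where a single RTT can dominate the total transfer time of a short file.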
NASA Astrophysics Data System (ADS)
Tan, F.; Wang, G.; Chen, C.; Ge, Z.
2016-12-01
Back-projection of teleseismic P waves [Ishii et al., 2005] has been widely used to image the rupture of earthquakes. Besides the conventional narrowband beamforming in the time domain, approaches in the frequency domain, such as MUSIC back projection (Meng 2011) and compressive sensing (Yao et al., 2011), have been proposed to improve the resolution. Each method has its advantages and disadvantages and should be properly used in different cases. Therefore, a thorough study to compare and test these methods is needed. We wrote a GUI program that puts the three methods together so that users can conveniently apply different methods to the same data and compare the results. We then used all the methods to process data from several earthquakes, including the 2008 Mw 7.9 Wenchuan earthquake and the 2011 Mw 9.0 Tohoku-Oki earthquake, as well as theoretical seismograms of both simple sources and complex ruptures. Our results show differences in efficiency, accuracy and stability among the methods. Quantitative and qualitative analyses were applied to measure their dependence on data and parameters, such as station number, station distribution, grid size, calculation window length and so on. In general, back projection makes it possible to get a good result in a very short time using fewer than 20 high-quality traces with a proper station distribution, but the swimming artifact can be significant. Some measures, for instance combining global seismic data, could help ameliorate this method. MUSIC back projection needs relatively more data to obtain a better and more stable result, and also much more time, since its runtime grows noticeably faster than that of back projection as the station number increases. Compressive sensing deals more effectively with multiple sources in the same time window, but costs the longest time due to repeated matrix solving. The resolution of all the methods is complicated and depends on many factors. An important one is the grid size, which in turn influences runtime significantly. More detailed results from this research may help users choose proper data, methods and parameters.
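A minimal sketch of the conventional time-domain back projection (delay-and-sum stacking over a source grid in sliding windows), assuming a homogeneous velocity in place of a real 1-D travel-time model; array names and parameters are illustrative:

```python
import numpy as np

def back_project(traces, dt, station_xy, grid_xy, ref_xy,
                 velocity=6.0, win_len=64, step=16):
    """Delay-and-sum back projection of array waveforms onto a source grid.

    traces : (n_sta, n_samp) P waveforms aligned on the reference (hypocenter)
    arrival; coordinates in km; `velocity` (km/s) stands in for a travel-time table.
    Returns beam power with shape (n_windows, n_grid).
    """
    n_sta, n_samp = traces.shape
    d_grid = np.linalg.norm(grid_xy[None, :, :] - station_xy[:, None, :], axis=2)
    d_ref = np.linalg.norm(ref_xy[None, :] - station_xy, axis=1)
    shifts = np.round((d_grid - d_ref[:, None]) / velocity / dt).astype(int)
    lo, hi = shifts.min(), shifts.max()
    sta_idx = np.arange(n_sta)[:, None]
    power = []
    for start in range(max(0, -lo), n_samp - win_len - hi, step):
        beams = np.empty(grid_xy.shape[0])
        for g in range(grid_xy.shape[0]):
            idx = start + shifts[:, g]                    # per-station delay
            seg = traces[sta_idx, idx[:, None] + np.arange(win_len)]
            beams[g] = np.sum(seg.mean(axis=0) ** 2)      # stacked beam power
        power.append(beams)
    return np.array(power)
```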
A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases
NASA Astrophysics Data System (ADS)
Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie
2018-01-01
Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study to compare the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows Server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone search radii, computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4; cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (14 arcmin) and higher is masked to some extent by the timing scatter caused by the range of query sizes. At very high levels (20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begin to degrade performance. Thus, for the use patterns studied here the database performance is not critically dependent on the exact choices of index or level.
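A minimal sketch of the HEALPix side of such an index using the healpy package (one possible implementation), assuming nside = 2^level so that level 8 corresponds to cells of roughly 14 arcmin; the database column and B-tree details are assumptions:

```python
import numpy as np
import healpy as hp

NSIDE = 2 ** 8            # index level 8; larger NSIDE means smaller cells

def index_catalog(ra_deg, dec_deg):
    """HEALPix cell of each source, to be stored as an indexed integer
    column (e.g., behind a B-tree) in the catalog table."""
    theta = np.radians(90.0 - np.asarray(dec_deg))    # colatitude
    phi = np.radians(np.asarray(ra_deg))
    return hp.ang2pix(NSIDE, theta, phi, nest=True)

def cone_search_cells(ra0_deg, dec0_deg, radius_deg):
    """Cells overlapping a cone; the query then restricts rows to these
    cell values and applies an exact angular-distance cut afterwards."""
    vec = hp.ang2vec(np.radians(90.0 - dec0_deg), np.radians(ra0_deg))
    return hp.query_disc(NSIDE, vec, np.radians(radius_deg),
                         inclusive=True, nest=True)
```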
A case study of exposure to ultrafine particles from secondhand tobacco smoke in an automobile.
Liu, S; Zhu, Y
2010-10-01
Secondhand tobacco smoke (SHS) in enclosed spaces is a major source of potentially harmful airborne particles. To quantify exposure to ultrafine particles (UFP) because of SHS and to investigate the interaction between pollutants from SHS and vehicular emissions, number concentration and size distribution of UFP and other air pollutants (CO, CO(2) , and PM(2.5)) were measured inside a moving vehicle under five different ventilation conditions. A major interstate freeway with a speed limit of 60 mph and an urban roadway with a speed limit of 30 mph were selected to represent typical urban routes. In a typical 30-min commute on urban roadways, the SHS of one cigarette exposed passengers to approximately 10 times the UFP and 120 times the PM(2.5) of ambient air. The most effective solution to protect passengers from SHS exposure is to abstain from smoking in the vehicle. Opening a window is an effective method for decreasing pollutant exposures on most urban roadways. However, under road conditions with high UFP concentrations, such as tunnels or busy freeways with high proportion of heavy-duty diesel trucks (such as the 710 Freeway in Los Angeles, CA, USA), opening a window is not a viable method to reduce UFPs. Time budget studies show that Americans spend, on average, more than 60 min each day in enclosed vehicles. Smoking inside vehicles can expose the driver and other passengers to high levels of pollutants. Thus, an understanding of the variations and interactions of secondhand tobacco smoke (SHS) and vehicular emissions under realistic driving conditions is necessary. Results of this study indicated that high ventilation rates can effectively dilute ultrafine particles (UFP) inside moving vehicles on urban routes. However, driving with open windows and an increased air exchange rate (AER) are not recommended on tunnels and heavily travelled freeways.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-07
... defines a remotely managed Post Office (RMPO) as a Post Office that offers part-time window service hours... Administrative Post Office. The final rule also defines a part-time Post Office (PTPO) as a Post Office that offers part-time window service hours, is staffed by a Postal Service employee, and reports to a district...
Hawking radiation from squashed Kaluza-Klein black holes: A window to extra dimensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishihara, Hideki; Soda, Jiro
2007-09-15
We explore the observability of extra dimensions through five-dimensional squashed Kaluza-Klein black holes residing in the Kaluza-Klein spacetime. With the expectation that the Hawking radiation reflects the five-dimensional nature of the squashed horizon, we study the Hawking radiation of a scalar field in the squashed black hole background. As a result, we show that the luminosity of Hawking radiation tells us the size of the extra dimension, namely, the squashed Kaluza-Klein black holes open a window to extra dimensions.
Recalibration of the Multisensory Temporal Window of Integration Results from Changing Task Demands
Mégevand, Pierre; Molholm, Sophie; Nayak, Ashabari; Foxe, John J.
2013-01-01
The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands. PMID:23951203
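A minimal sketch of the race-model (Miller) inequality test used here to define multisensory facilitation; the empirical-CDF grid and variable names are illustrative assumptions, and the test would be repeated separately on the reaction times collected at each stimulus onset asynchrony:

```python
import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, t_grid=None):
    """Test Miller's race-model inequality: facilitation is indicated where
    the audiovisual RT CDF exceeds the sum of the unisensory CDFs (capped at 1)."""
    if t_grid is None:
        t_grid = np.linspace(min(map(np.min, (rt_av, rt_a, rt_v))),
                             max(map(np.max, (rt_av, rt_a, rt_v))), 200)

    def ecdf(samples, t):
        # fraction of reaction times at or below each time point t
        return np.searchsorted(np.sort(samples), t, side='right') / len(samples)

    g_av = ecdf(np.asarray(rt_av, float), t_grid)
    bound = np.minimum(ecdf(np.asarray(rt_a, float), t_grid)
                       + ecdf(np.asarray(rt_v, float), t_grid), 1.0)
    return t_grid, g_av - bound        # positive values indicate violation
```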
Imaging windows for long-term intravital imaging
Alieva, Maria; Ritsma, Laila; Giedt, Randy J; Weissleder, Ralph; van Rheenen, Jacco
2014-01-01
Intravital microscopy is increasingly used to visualize and quantitate dynamic biological processes at the (sub)cellular level in live animals. By visualizing tissues through imaging windows, individual cells (e.g., cancer, host, or stem cells) can be tracked and studied over a time-span of days to months. Several imaging windows have been developed to access tissues including the brain, superficial fascia, mammary glands, liver, kidney, pancreas, and small intestine among others. Here, we review the development of imaging windows and compare the most commonly used long-term imaging windows for cancer biology: the cranial imaging window, the dorsal skin fold chamber, the mammary imaging window, and the abdominal imaging window. Moreover, we provide technical details, considerations, and trouble-shooting tips on the surgical procedures and microscopy setups for each imaging window and explain different strategies to assure imaging of the same area over multiple imaging sessions. This review aims to be a useful resource for establishing the long-term intravital imaging procedure. PMID:28243510
Imaging windows for long-term intravital imaging: General overview and technical insights.
Alieva, Maria; Ritsma, Laila; Giedt, Randy J; Weissleder, Ralph; van Rheenen, Jacco
2014-01-01
Intravital microscopy is increasingly used to visualize and quantitate dynamic biological processes at the (sub)cellular level in live animals. By visualizing tissues through imaging windows, individual cells (e.g., cancer, host, or stem cells) can be tracked and studied over a time-span of days to months. Several imaging windows have been developed to access tissues including the brain, superficial fascia, mammary glands, liver, kidney, pancreas, and small intestine among others. Here, we review the development of imaging windows and compare the most commonly used long-term imaging windows for cancer biology: the cranial imaging window, the dorsal skin fold chamber, the mammary imaging window, and the abdominal imaging window. Moreover, we provide technical details, considerations, and trouble-shooting tips on the surgical procedures and microscopy setups for each imaging window and explain different strategies to assure imaging of the same area over multiple imaging sessions. This review aims to be a useful resource for establishing the long-term intravital imaging procedure.
Retrieval of Ice Cloud Properties Using Variable Phase Functions
NASA Astrophysics Data System (ADS)
Heck, Patrick W.; Minnis, Patrick; Yang, Ping; Chang, Fu-Lung; Palikonda, Rabindra; Arduini, Robert F.; Sun-Mack, Sunny
2009-03-01
An enhancement to NASA Langley's Visible Infrared Solar-infrared Split-window Technique (VISST) is developed to identify and account for situations when errors are induced by using smooth ice crystals. The retrieval scheme incorporates new ice cloud phase functions that utilize hexagonal crystals with roughened surfaces. In some situations, cloud optical depths are reduced, hence, cloud height is increased. Cloud effective particle size also changes with the roughened ice crystal models which results in varied effects on the calculation of ice water path. Once validated and expanded, the new approach will be integrated in the CERES MODIS algorithm and real-time retrievals at Langley.
Photorefractive-based adaptive optical windows
NASA Astrophysics Data System (ADS)
Liu, Yuexin; Yang, Yi; Wang, Bo; Fu, John Y.; Yin, Shizhuo; Guo, Ruyan; Yu, Francis T.
2004-10-01
Optical windows have been widely used in optical spectrographic processing systems. In this paper, various window profiles, such as rectangular, triangular, Hamming, Hanning, and Blackman, have been investigated in detail regarding their effect on the generated spectrograms, such as the joint time-frequency resolution ΔtΔω and the sidelobe amplitude attenuation. All of these windows can be synthesized in a photorefractive crystal by an angular-multiplexing holographic technique, which renders the system more adaptive. Experimental results are provided.
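A small numerical sketch of the window-profile trade-off mentioned here, estimating the peak sidelobe attenuation of several common windows with NumPy; the FFT size and Kaiser beta are arbitrary choices, and the holographic synthesis itself is not modeled:

```python
import numpy as np

def sidelobe_level_db(window, nfft=4096):
    """Peak sidelobe level (dB relative to the main lobe) of a window's spectrum."""
    spectrum = np.abs(np.fft.rfft(window, nfft))
    db = 20 * np.log10(spectrum / spectrum.max() + 1e-12)
    # walk down the main lobe to its first null, then take the highest peak beyond it
    k = 1
    while k < len(db) - 1 and db[k] <= db[k - 1]:
        k += 1
    return db[k:].max()

N = 64
for name, w in [("rectangular", np.ones(N)), ("hamming", np.hamming(N)),
                ("hanning", np.hanning(N)), ("blackman", np.blackman(N)),
                ("kaiser(beta=8.6)", np.kaiser(N, 8.6))]:
    print(f"{name:18s} peak sidelobe ~ {sidelobe_level_db(w):6.1f} dB")
```

Windows with stronger sidelobe suppression have wider main lobes, which is the joint time-frequency resolution trade-off the abstract refers to.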
Noise normalization and windowing functions for VALIDAR in wind parameter estimation
NASA Astrophysics Data System (ADS)
Beyon, Jeffrey Y.; Koch, Grady J.; Li, Zhiwen
2006-05-01
The wind parameter estimates from a state-of-the-art 2-μm coherent lidar system located at NASA Langley, Virginia, named VALIDAR (validation lidar), were compared after normalizing the noise by its estimated power spectra via the periodogram and the linear predictive coding (LPC) scheme. The power spectra and the Doppler shift estimates were the main parameter estimates for comparison. Different types of windowing functions were implemented in the VALIDAR data processing algorithm and their impact on the wind parameter estimates was observed. Time- and frequency-independent windowing functions, such as rectangular, Hanning, and Kaiser-Bessel, and a time- and frequency-dependent apodized windowing function were compared. A brief overview of current nonlinear algorithm development for Doppler shift correction follows.
Method of high speed flow field influence and restrain on laser communication
NASA Astrophysics Data System (ADS)
Meng, Li-xin; Wang, Chun-hui; Qian, Cun-zhu; Wang, Shuo; Zhang, Li-zhong
2013-08-01
For laser communication terminals carried by an airplane or airship, high-speed platform movement causes the surrounding air to affect both the platform and the terminal's optical window in two ways. First, aerodynamic loading deforms the optical window; second, a shock wave and a boundary layer are generated. For subsonic aircraft, the boundary layer is the dominant influence. The presence of a boundary layer changes the air density and the temperature at the optical window, which deflects the light and makes the received beam spot flicker. Ultimately, the energy fluctuation of the beam spot reaching the receiver increases, so the error rate increases. In this paper, aerodynamic theory is used to analyze the deformation of the optical window caused by high-speed airflow, and aero-optics theory is used to analyze the influence of the boundary layer on the laser communication link. On this basis, we explore methods to suppress aerodynamic and aero-optical effects from the perspective of optical window design. Based on the planned experimental aircraft types and equipment installation locations, we optimized the shape and thickness of the optical window and the shape and size of the air-management kit. Finally, the deformation of the optical window and the air flow distribution were simulated with fluid simulation software under different Mach number and altitude flight conditions. The simulation results showed that the optimized optical window can suppress the aerodynamic influence; in addition, the boundary layer is smoothed and the turbulence influence is reduced, which meets the requirements of airborne laser communication.
High-impact resistance optical sensor windows
NASA Astrophysics Data System (ADS)
Askinazi, Joel; Ceccorulli, Mark L.; Goldman, Lee
2011-06-01
Recent field experience with optical sensor windows on both ground and airborne platforms has shown a significant increase in window fracturing from foreign object debris (FOD) impacts and as a by-product of asymmetrical warfare. Common optical sensor window materials such as borosilicate glass do not typically have high impact resistance. Emerging advanced optical window materials such as aluminum oxynitride offer the potential for a significant improvement in FOD impact resistance due to their superior surface hardness, fracture toughness and strength properties. To confirm the potential impact resistance improvement achievable with these emerging materials, Goodrich ISR Systems in collaboration with Surmet Corporation undertook a set of comparative FOD impact tests of optical sensor windows made from borosilicate glass and from aluminum oxynitride. It was demonstrated that the aluminum oxynitride windows could withstand up to three times the FOD impact velocity (as compared with borosilicate glass) before fracture would occur. These encouraging test results confirm the utility of this window solution for new ground and airborne multispectral window applications as well as for retrofit to current production windows. We believe that this solution can significantly reduce the frequency and life-cycle cost of window replacement.
Application of MEMS-based x-ray optics as tuneable nanosecond choppers
NASA Astrophysics Data System (ADS)
Chen, Pice; Walko, Donald A.; Jung, Il Woong; Li, Zhilong; Gao, Ya; Shenoy, Gopal K.; Lopez, Daniel; Wang, Jin
2017-08-01
Time-resolved synchrotron x-ray measurements often rely on using a mechanical chopper to isolate a set of x-ray pulses. We have started the development of micro electromechanical systems (MEMS)-based x-ray optics as an alternative method to manipulate x-ray beams. In the application of x-ray pulse isolation, we recently achieved a pulse-picking time window of half a nanosecond, which is more than 100 times faster than mechanical choppers can achieve. The MEMS device consists of a comb-drive silicon micromirror designed to efficiently diffract an x-ray beam during oscillation. The MEMS devices were operated in Bragg geometry and their oscillation was synchronized to the x-ray pulses, with a frequency matching subharmonics of the cycling frequency of the x-ray pulses. The microscale structure of the silicon mirror, in terms of its curvature and crystalline quality, ensures a narrow angular spread of the Bragg reflection. After discussing the factors that determine the diffractive time window, this report presents our approach to narrowing the time window to half a nanosecond. The short diffractive time window will allow us to select a single x-ray pulse out of a train of pulses from synchrotron radiation facilities.
Windows of sensitivity to toxic chemicals in the motor effects development.
Ingber, Susan Z; Pohl, Hana R
2016-02-01
Many chemicals currently used are known to elicit nervous system effects. In addition, approximately 2000 new chemicals introduced annually have not yet undergone neurotoxicity testing. This review concentrated on motor development effects associated with exposure to environmental neurotoxicants to help identify critical windows of exposure and begin to assess data needs based on a subset of chemicals thoroughly reviewed by the Agency for Toxic Substances and Disease Registry (ATSDR) in Toxicological Profiles and Addenda. Multiple windows of sensitivity were identified that differed based on the maturity level of the neurological system at the time of exposure, as well as dose and exposure duration. Similar but distinct windows were found for both motor activity (GD 8-17 [rats], GD 12-14 and PND 3-10 [mice]) and motor function performance (insufficient data for rats, GD 12-17 [mice]). Identifying specific windows of sensitivity in animal studies was hampered by study designs oriented towards detection of neurotoxicity that occurred at any time throughout the developmental process. In conclusion, while this investigation identified some critical exposure windows for motor development effects, it demonstrates a need for more acute duration exposure studies based on neurodevelopmental windows, particularly during the exposure periods identified in this review. Published by Elsevier Inc.
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤ 12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection ≤ 12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for the performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
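To make the ‘Prevalence = Incidence x Duration’ relationship concrete, here is a small worked example (the counts and the 100-day window below are invented, not the study's values), rearranged as Incidence = Prevalence / Duration:

    # Window-based incidence from a recency algorithm, using
    # Prevalence = Incidence x Duration  =>  Incidence = Prevalence / Duration.
    # The counts and the 100-day window below are purely illustrative.
    def window_based_iir(n_recent, n_total, window_days):
        prevalence_of_recent = n_recent / n_total
        duration_years = window_days / 365.25
        return prevalence_of_recent / duration_years

    # e.g. 120 of 748 notifications ruled "recent" by an algorithm with a 100-day window
    print(round(window_based_iir(120, 748, 100.0), 3))   # incident infections per person-year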
Advances in spinel optical quality, size/shape capacity, and applications
NASA Astrophysics Data System (ADS)
Roy, Donald W.; Martin, Gay G., Jr.
1992-12-01
Polycrystalline MgAl2O4 Spinel, transparent from 200 nanometers to 6 microns, offers a unique combination of optical and physical properties. A superior dome and window material with respect to rain and particle erosion, solar radiation, high temperatures and humidity, it is resistant to attack by strong acids, alkali solutions, sea water and jet fuels. Residual microporosity from the powder process used for fabricating Spinel, which previously limited its use to thin walls and small sizes, has been significantly reduced by advanced hot press and hot isostatic press (HIP) technology. It is now possible to manufacture high-quality shallow domes up to seven inches in diameter with a 0.2-inch wall thickness. Eight-inch-diameter flat windows have been produced for an advanced missile system. Proof-of-process near-hemispherical 8-inch dome blanks have been fabricated. Recent measurements of refractive index, homogeneity, scatter and surface roughness are available for design purposes. Improvements in optical quality and in size/shape capability, along with several successful prototype tests, demonstrate that Spinel is ready for inclusion in appropriate production systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... impact of eliminating the correction window from the electronic grant application submission process on... process a temporary error correction window to ensure a smooth and successful transition for applicants. This window provides applicants a period of time beyond the grant application due date to correct any...
Centroid estimation for a Shack-Hartmann wavefront sensor based on stream processing.
Kong, Fanpeng; Polo, Manuel Cegarra; Lambert, Andrew
2017-08-10
When the center of gravity is used to estimate the centroid of the spot in a Shack-Hartmann wavefront sensor, the measurement is corrupted by photon and detector noise. Parameters such as the window size often require careful optimization to balance the noise error, dynamic range, and linearity of the response coefficient under different photon flux, and the method must be substituted by the correlation method for extended sources. We propose a centroid estimator based on stream processing, where the center-of-gravity calculation window floats with the incoming pixel stream from the detector. In comparison with conventional methods, we show that the proposed estimator simplifies the choice of optimized parameters, provides a unit linear coefficient response, and reduces the influence of background and noise. It is shown that the stream-based centroid estimator also works well for limited-size extended sources. A hardware implementation of the proposed estimator is discussed.
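For reference, a minimal NumPy sketch of the conventional windowed center-of-gravity estimator that the stream-based scheme refines (the 16x16 subaperture, spot position, and window size below are made up):

    import numpy as np

    def cog_centroid(img, window_size, background=0.0):
        """Center of gravity of a window_size x window_size region around the brightest pixel."""
        cy, cx = np.unravel_index(np.argmax(img), img.shape)
        h = window_size // 2
        y0, x0 = max(cy - h, 0), max(cx - h, 0)
        win = img[y0:cy + h + 1, x0:cx + h + 1].astype(float)
        win = np.clip(win - background, 0.0, None)   # crude background subtraction
        yy, xx = np.mgrid[0:win.shape[0], 0:win.shape[1]]
        total = win.sum()
        return (y0 + (yy * win).sum() / total, x0 + (xx * win).sum() / total)

    # Hypothetical subaperture: Gaussian spot at (7.3, 9.6) plus Poisson noise.
    rng = np.random.default_rng(1)
    y, x = np.mgrid[0:16, 0:16]
    spot = 100 * np.exp(-((y - 7.3) ** 2 + (x - 9.6) ** 2) / (2 * 1.5 ** 2))
    img = rng.poisson(spot + 2).astype(float)
    print(cog_centroid(img, window_size=9, background=2.0))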
Linear segmentation algorithm for detecting layer boundary with lidar.
Mao, Feiyue; Gong, Wei; Logan, Timothy
2013-11-04
The automatic detection of aerosol- and cloud-layer boundaries (base and top) is important in atmospheric lidar data processing, because the boundary information is not only useful for environment and climate studies, but can also be used as input for further data processing. Previous methods have demonstrated limitations in defining the base and top and in window-size setting, and have neglected the in-layer attenuation. To overcome these limitations, we present a new layer detection scheme for up-looking lidars based on linear segmentation with a reasonable threshold setting, boundary selecting, and false-positive removing strategies. Preliminary results from both real and simulated data show that this algorithm can not only detect the layer base as accurately as the simple multi-scale method, but can also detect the layer top more accurately than the simple multi-scale method. Our algorithm can be directly applied to uncalibrated data without requiring any additional measurements or window size selections.
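The sketch below is a deliberately simplified, slope-based toy version of segment-wise layer detection (it is not the paper's algorithm; the synthetic profile, segment length, and threshold are invented) to show how fitted segment slopes can flag a layer base and top:

    import numpy as np

    def detect_layer(profile, seg_len=10, slope_thresh=0.02):
        """Toy slope-based layer detector: fit a line to each sliding segment of a
        range-corrected profile; call the layer base the first segment whose slope
        exceeds +slope_thresh, and the layer top the end of the last segment
        (after the base) whose slope is below -slope_thresh."""
        x = np.arange(seg_len)
        slopes = np.array([np.polyfit(x, profile[i:i + seg_len], 1)[0]
                           for i in range(len(profile) - seg_len)])
        rising = np.flatnonzero(slopes > slope_thresh)
        if rising.size == 0:
            return None, None
        base = rising[0]
        falling = np.flatnonzero(slopes[base:] < -slope_thresh)
        top = base + falling[-1] + seg_len if falling.size else None
        return base, top

    # Hypothetical profile: decaying molecular background plus one layer near bins 45-70.
    z = np.arange(200)
    profile = np.exp(-z / 150.0)
    profile[40:70] += np.exp(-0.5 * ((z[40:70] - 55) / 5.0) ** 2)
    print(detect_layer(profile))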
NASA Astrophysics Data System (ADS)
Li, Xin; Zhou, Shihong; Ma, Jing; Tan, Liying; Shen, Tao
2013-08-01
Thanks to the development of APS (Active Pixel Sensor) technology, CMOS detectors offer a sub-window readout feature that makes them good candidate tracking detectors for satellite optical communication systems. For inter-satellite optical communications it is critical to estimate the direction of the incident laser beam precisely by measuring the centroid position of the incident beam spot. The presence of detector noise results in measurement error, which degrades the tracking performance of the system. In this research, the measurement error of a CMOS detector is derived, taking detector noise into consideration. It is shown that the measurement error depends on the pixel noise, the size of the tracking sub-window (number of pixels), the intensity of the incident laser beam, and the relative size of the beam spot. The influences of these factors are analyzed by numerical simulation. We hope the results obtained in this research will be helpful in the design of CMOS-detector-based satellite optical communication systems.
The uncrowded window of object recognition
Pelli, Denis G; Tillman, Katharine A
2009-01-01
It is now emerging that vision is usually limited by object spacing rather than size. The visual system recognizes an object by detecting and then combining its features. ‘Crowding’ occurs when objects are too close together and features from several objects are combined into a jumbled percept. Here, we review the explosion of studies on crowding—in grating discrimination, letter and face recognition, visual search, selective attention, and reading—and find a universal principle, the Bouma law. The critical spacing required to prevent crowding is equal for all objects, although the effect is weaker between dissimilar objects. Furthermore, critical spacing at the cortex is independent of object position, and critical spacing at the visual field is proportional to object distance from fixation. The region where object spacing exceeds critical spacing is the ‘uncrowded window’. Observers cannot recognize objects outside of this window and its size limits the speed of reading and search. PMID:18828191
External Vision Systems (XVS) Proof-of-Concept Flight Test Evaluation
NASA Technical Reports Server (NTRS)
Shelton, Kevin J.; Williams, Steven P.; Kramer, Lynda J.; Arthur, Jarvis J.; Prinzel, Lawrence, III; Bailey, Randall E.
2014-01-01
NASA's Fundamental Aeronautics Program, High Speed Project is performing research, development, test and evaluation of flight deck and related technologies to support future low-boom, supersonic configurations (without forward-facing windows) by use of an eXternal Vision System (XVS). The challenge of XVS is to determine a combination of sensor and display technologies which can provide an equivalent level of safety and performance to that provided by forward-facing windows in today's aircraft. This flight test was conducted with the goal of obtaining performance data on see-and-avoid and see-to-follow traffic using a proof-of-concept XVS design in actual flight conditions. Six data collection flights were flown in four traffic scenarios against two different sized participating traffic aircraft. This test utilized a 3x1 array of High Definition (HD) cameras, with a fixed forward field-of-view, mounted on NASA Langley's UC-12 test aircraft. Test scenarios, with participating NASA aircraft serving as traffic, were presented to two evaluation pilots per flight - one using the proof-of-concept (POC) XVS and the other looking out the forward windows. The camera images were presented on the XVS display in the aft cabin with Head-Up Display (HUD)-like flight symbology overlaying the real-time imagery. The test generated XVS performance data, including comparisons to natural vision, and post-run subjective acceptability data were also collected. This paper discusses the flight test activities, its operational challenges, and summarizes the findings to date.
Human Mars Mission: Launch Window from Earth Orbit. Pt. 1
NASA Technical Reports Server (NTRS)
Young, Archie
1999-01-01
The determination of orbital window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as a nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and a plane change and, therefore, a delta V penalty. Usually, because of the delta V penalty the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: 1) One impulsive maneuver from a Highly Elliptical Orbit (HEO); 2) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO); 3) One impulsive maneuver from a Low Earth Orbit (LEO); 4) Two impulsive maneuvers from LEO; and 5) Three impulsive maneuvers from LEO. The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. The formulation also provides vector and geometrical data sufficient for use as a good starting point in detailed trajectory analysis based on calculus of variations, steepest descent, or parameter optimization techniques.
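The delta V penalty for a non-coplanar injection is driven by the need to rotate the velocity vector; as a rough illustration of the magnitudes involved (not the report's formulation, and the speed and angle below are arbitrary), the standard plane-change relation gives:

    import math

    def plane_change_dv(speed_m_s, angle_deg):
        """Delta-V to rotate the velocity vector by angle_deg at the given speed:
        dv = 2 * v * sin(angle / 2)."""
        return 2.0 * speed_m_s * math.sin(math.radians(angle_deg) / 2.0)

    # Hypothetical example: rotating a 7.7 km/s LEO velocity vector by 5 degrees.
    print(f"{plane_change_dv(7700.0, 5.0):.0f} m/s")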
Human Exploration Missions Study Launch Window from Earth Orbit
NASA Technical Reports Server (NTRS)
Young, Archie
2001-01-01
The determination of orbital launch window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as a nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and a plane change and, therefore, a Delta(V) penalty. Usually, because of the Delta(V) penalty the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: (1) One impulsive maneuver from a Low Earth Orbit (LEO), (2) Two impulsive maneuvers from LEO, (3) Three impulsive maneuvers from LEO, (4) One impulsive maneuver from a Highly Elliptical Orbit (HEO), and (5) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO). The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. The formulation also provides vector and geometrical data sufficient for use as a good starting point in detailed trajectory analysis based on calculus of variations, steepest descent, or parameter optimization techniques.
Büttner, Kathrin; Salau, Jennifer; Krieter, Joachim
2016-07-01
Recent analyses of animal movement networks have focused on the static aggregation of trade contacts over different time windows, which neglects the system's temporal variation. In terms of disease spread, ignoring the temporal dynamics can lead to an over- or underestimation of an outbreak's speed and extent. This becomes particularly evident if the static aggregation allows for the existence of more paths than the number of time-respecting paths (i.e. paths in the right chronological order). Therefore, the aim of this study was to reveal differences between static and temporal representations of an animal trade network and to assess the quality of the static aggregation in comparison to the temporal counterpart. Contact data from a pig trade network (2006-2009) of a producer community in Northern Germany were analysed. The results show that a median value of 8.7% (4.6-14.1%) of the nodes and 3.1% (1.6-5.5%) of the edges were active on a weekly resolution. No obvious fluctuations in the activity patterns were observed. Furthermore, 50% of the nodes already had one trade contact after approximately six months. For an accumulation window with increasing size (one day each), the accumulation rate, i.e. the relative increase in the number of nodes or edges, stayed relatively constant below 0.07% for the nodes and 0.12% for the edges. The temporal distances had a much wider distribution than the topological distances; 84% of the temporal distances were smaller than 90 days. The maximum temporal distance was 1000 days, which corresponds to the temporal diameter of the present network. The median temporal correlation coefficient, which measures the probability for an edge to persist across two consecutive time steps, was 0.47, with a maximum value of 0.63 at the accumulation window of 88 days. The causal fidelity measures the fraction of static paths which can also be taken in the temporal network. For the whole observation period, relatively high values indicate that 67% of the time-respecting paths existed in both network representations. An increase to 0.87 (0.82-0.88) and 0.92 (0.80-0.98), respectively, could be observed for yearly and monthly aggregation windows. The results show that the investigated pig trade network in its static aggregation represents the temporal dynamics of the system sufficiently well. Therefore, the methodology for analysing static instead of dynamic networks can be used without losing too much information. Copyright © 2016 Elsevier B.V. All rights reserved.
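A minimal sketch of the temporal correlation coefficient referred to above, i.e. the average probability that an edge present in one snapshot persists into the next (the toy snapshots below are random and merely stand in for the weekly trade networks):

    import numpy as np

    def temporal_correlation(snapshots):
        """Temporal correlation coefficient of a sequence of binary adjacency
        matrices (shape T x N x N, same node set in every snapshot)."""
        A = np.asarray(snapshots, dtype=float)
        num = (A[:-1] * A[1:]).sum(axis=2)                       # shared neighbours
        den = np.sqrt(A[:-1].sum(axis=2) * A[1:].sum(axis=2))
        with np.errstate(divide="ignore", invalid="ignore"):
            c_it = np.where(den > 0, num / den, 0.0)
        return c_it.mean(axis=0).mean()                          # average over nodes and steps

    # Three weekly snapshots of a toy 4-node trade network.
    rng = np.random.default_rng(2)
    snaps = (rng.random((3, 4, 4)) < 0.4).astype(int)
    for s in snaps:
        np.fill_diagonal(s, 0)
    print(round(temporal_correlation(snaps), 3))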
GlastCam: A Telemetry-Driven Spacecraft Visualization Tool
NASA Technical Reports Server (NTRS)
Stoneking, Eric T.; Tsai, Dean
2009-01-01
Developed for the GLAST project, which is now the Fermi Gamma-ray Space Telescope, GlastCam software ingests telemetry from the Integrated Test and Operations System (ITOS) and generates four graphical displays of geometric properties in real time, allowing visual assessment of the attitude, configuration, position, and various cross-checks. Four windows are displayed: a "cam" window shows a 3D view of the satellite; a second window shows the standard position plot of the satellite on a Mercator map of the Earth; a third window displays star tracker fields of view, showing which stars are visible from the spacecraft in order to verify star tracking; and the fourth window depicts
NASA Astrophysics Data System (ADS)
Taira, T.; Kato, A.
2013-12-01
A high-resolution Vp/Vs ratio estimate is one of the key parameters for understanding spatial variations of composition and physical state within the Earth. Lin and Shearer (2007, BSSA) recently developed a methodology to obtain local Vp/Vs ratios in individual similar earthquake clusters, based on P- and S-wave differential times. A waveform cross-correlation approach is typically employed to measure those differential times for pairs of seismograms from similar earthquake clusters, in narrow time windows around the direct P and S waves. This approach effectively collects P- and S-wave differential times but requires robust P- and S-wave time windows that are extracted based on either manually or automatically picked P- and S-phases. We present another technique to estimate P- and S-wave differential times by exploiting the temporal properties of delayed time as a function of elapsed time on the seismograms with a moving-window cross-correlation analysis (e.g., Snieder, 2002, Phys. Rev. E; Niu et al. 2003, Nature). Our approach is based on the principle that the delayed time for the direct S wave differs from that for the direct P wave. Two seismograms aligned by the direct P waves from a pair of similar earthquakes yield delayed times of zero around the direct P wave. In contrast, delayed times obtained from time windows including the direct S wave have non-zero values. Our approach, in principle, is capable of measuring both P- and S-wave differential times from single-component seismograms. In an ideal case, the temporal evolution of delayed time becomes a step function with its discontinuity at the onset of the direct S wave. The offset in the resulting step function would be the S-wave differential time, relative to the P-wave differential time, as the two waveforms are aligned by the direct P wave. We apply our moving-window cross-correlation technique to two different data sets collected at: 1) the Wakayama district, Japan and 2) the Geysers geothermal field, California. Both target areas are characterized by earthquake swarms that provide a number of similar-event clusters. We use the following automated procedure to systematically analyze the two data sets: 1) identification of the direct P arrivals by using an Akaike Information Criterion based phase picking algorithm introduced by Zhang and Thurber (2003, BSSA), 2) waveform alignment by the P wave with a waveform cross-correlation to obtain the P-wave differential time, 3) moving-time-window analysis to estimate the S-wave differential time. Kato et al. (2010, GRL) have estimated the Vp/Vs ratios for a few similar earthquake clusters from the Wakayama data set by a conventional approach to obtaining differential times. We find that the resulting Vp/Vs ratios from our approach for the same earthquake clusters are comparable with those obtained by Kato et al. (2010, GRL). We show that the moving-window cross-correlation technique effectively measures both P- and S-wave differential times for seismograms in which clear P and S phases are not observed. We will show spatial distributions of Vp/Vs ratios in our two target areas.
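A schematic NumPy version of the moving-window idea (synthetic traces; the window length, step, and arrival times are invented): after aligning two records on the direct P wave, the lag that maximizes the correlation in each sliding window stays near zero around P and jumps to the S-wave differential time once the window covers the S arrival.

    import numpy as np

    def moving_window_delay(ref, tgt, win, step, max_lag):
        """Lag (in samples) maximizing cross-correlation in each sliding window.
        Windows containing only noise give meaningless lags."""
        delays, centers = [], []
        for start in range(0, len(ref) - win, step):
            a = ref[start:start + win] - ref[start:start + win].mean()
            best_lag, best_cc = 0, -np.inf
            for lag in range(-max_lag, max_lag + 1):
                if start + lag < 0:
                    continue
                b = tgt[start + lag:start + lag + win]
                if len(b) < win:
                    continue
                b = b - b.mean()
                cc = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
                if cc > best_cc:
                    best_cc, best_lag = cc, lag
            delays.append(best_lag)
            centers.append(start + win // 2)
        return np.array(centers), np.array(delays)

    # Two synthetic records from similar events: P aligned at sample 100,
    # S arrives at 300 in the reference and 308 in the target (dS - dP = 8 samples).
    rng = np.random.default_rng(3)
    n = 600
    u = np.arange(-25, 26)
    wavelet = np.exp(-0.5 * (u / 6.0) ** 2) * np.cos(u / 2.0)
    ref = 0.05 * rng.normal(size=n)
    tgt = 0.05 * rng.normal(size=n)
    for trace, s_arr in ((ref, 300), (tgt, 308)):
        trace[100 - 25:100 + 26] += wavelet          # direct P (aligned)
        trace[s_arr - 25:s_arr + 26] += 2 * wavelet  # direct S (delayed in tgt)
    centers, delays = moving_window_delay(ref, tgt, win=80, step=20, max_lag=20)
    print(list(zip(centers.tolist(), delays.tolist())))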
A large format membrane-based x-ray mask for microfluidic chip fabrication
NASA Astrophysics Data System (ADS)
Wang, Lin; Zhang, Min; Desta, Yohannes; Melzak, J.; Wu, C. H.; Peng, Zhengchun
2006-02-01
X-ray lithography is a very good option for the fabrication of micro-devices, especially when high-aspect-ratio patterns are required. Membrane-based x-ray masks are commonly used for high-resolution x-ray lithography. A thin layer of silicon nitride (Si3N4) or silicon carbide (SiC) film (1-2 µm) is normally used as the membrane material for x-ray mask fabrication (Wells G M, Reilly M, Nachman R, Cerrina F, El-Khakani M A and Chaker M 1993 Mater. Res. Soc. Conf. Proc. 306 81-9; Shoki T, Nagasawa H, Kosuga H, Yamaguchi Y, Annaka N, Amemiya I and Nagarekawa O 1993 SPIE Proc. 1924 450-6). The freestanding membrane window of an x-ray mask, which defines the exposing area of the mask, can be obtained by etching a pre-defined area on a silicon wafer from the backside (Wang L, Desta Y, Fettig R K, Goettert J, Hein H, Jakobs P and Chulz J 2004 J. Micromech. Microeng. 14 722-6). Usually, the window size of an x-ray mask is around 20 × 20 mm because of the low tensile stress of the membrane (10-100 MPa); a larger window may cause deformation of the membrane and lower the mask quality. However, x-ray masks with larger windows are preferred for micro-device fabrication in order to increase productivity. We analyzed the factors which influence the flatness of large-format x-ray masks and fabricated x-ray masks with window sizes of 55 × 55 mm and 46 × 65 mm on 1 µm thick membranes by increasing the tensile stress of the membranes (>300 MPa) and optimizing the stress of the absorber layer. The large-format x-ray mask was successfully applied to the fabrication of microfluidic chips.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, J; Knopp, MV; Miller, M
2016-06-15
Purpose: Replacement of the conventional PMT-based detector with a next-generation digital photon counting (DPC) detector is a technology leap for PET imaging. This study evaluated the performance and characteristics of the DPC system and its stability within a 1-year time window following its installation, focusing on the medical physics basis for clinical applications. Methods: A digital PET/CT scanner using 1:1 coupling of 23,040 crystal:detector elements was introduced and became operational at OSU. We tested and evaluated system performance and characteristics using NEMA NU2-2012. System stability in timing resolution, energy resolution, and detector temperature and humidity (T&H) was monitored over 1 year. Timing, energy and spatial resolution were characterized across the clinically relevant count rate range. CQIE uniformity PET and NEMA IEC-Body PET with hot spheres of varying sizes and contrasts were performed. PET images reconstructed in standard (4 mm), High (2 mm) and Ultra-High (1 mm) definitions were evaluated. Results: NEMA results showed PET spatial resolution (mm-FWHM) from 4.01 & 4.14 at 1 cm to 5.82 & 6.17 at 20 cm in transverse & axial. 322±3 ps timing and 11.0% energy resolution were measured. 5.7 kcps/MBq system sensitivity with 24 kcps/MBq effective sensitivity was obtained. The peak NECR was ∼171 kcps with the effective peak NECR >650 kcps@50 kBq/mL. Scatter fraction was ∼30%, and the maximum trues rate was >900 kcps. NEMA IQ demonstrated hot sphere contrast ranging from ∼62%±2% (10 mm) to ∼88%±2% (22 mm), cold sphere contrast of ∼86%±2% (28 mm) and ∼89%±3% (37 mm), and excellent uniformity. One year of stability monitoring revealed ∼1% change in timing, ±0.4% change in energy resolution, and <10% variation in T&H. CQIE PET gave <3% SUV variance axially. Recovery coefficients of 60%-100% across sphere sizes and contrast levels were achieved. Conclusion: The characteristics and stability of the next-generation DPC PET detector system over a 1-year time window were excellent and better than prior experience. It demonstrated improved and robust system characteristics and performance in spatial resolution, sensitivity, timing and energy resolution, count rate and image quality. Michael Miller is an employee of Philips Healthcare.
Windows Into the Real World From a Virtual Globe
NASA Astrophysics Data System (ADS)
Rich, J.; Urban-Rich, J.
2007-12-01
Virtual globes such as Google Earth can be great tools for learning about the geographical variation of the earth. The key to virtual globes is the use of satellite imagery to provide a highly accurate view of the earth's surface. However, because the images are not updated regularly, variations in climate and vegetation over time cannot be easily seen. In order to enhance the view of the earth and observe these changes by region and over time, we are working to add near-real-time "windows" into the real world from a virtual globe. For the past 4 years we have been installing web cameras in areas of the world that will provide long-term monitoring of global changes. By archiving hourly images from arctic, temperate and tropical regions we are creating a visual data set that is already beginning to tell the story of climate variability. The cameras are currently installed in 10 elementary schools in 3 countries and show the students' view out each window. The Windows Around the World program (http://www.WindowsAroundTheWorld.org) uses the images from these cameras to help students gain a better understanding of earth processes and variability in climate and vegetation between different regions and over time. Previously we have used standard web-based technologies such as DHTML and AJAX to provide near-real-time access to these images and also provide enhanced functionality such as dynamic time-lapse movies that allow users to see changes over months, days or hours up to the current hour (http://www.windowsaroundtheworld.org/north_america.aspx). We have integrated the camera images from Windows Around the World into Google Earth. Through network links and models we are creating a way for students to "fly" to another school in the program and see what the current view is out the window. By using a model as a screen, the image can be viewed from the same direction as the students who are sitting in a classroom at the participating school. Once at the school, visiting students can move around the area in three dimensions and gain a better understanding of what they are seeing out the window. Currently, time-lapse images can be viewed at a lower resolution for all schools on the globe, or, when flying into an individual school, higher-resolution time-lapse images can be seen. The observation of shadows, precipitation, movement of the sun and changes in vegetation allows the viewer to gain a better understanding of how the earth works and how the environment changes between regions and over time.
Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.
2010-12-01
Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons - Coulomb stick and slip friction law in order to drive the earthquake process by means of a back-slip model where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determine whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are then tabulated and then differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other in the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, which included all the faults save the creeping section of the San Andreas. The analysis spanned 40,000 yrs of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults, 8720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
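As a simplified sketch of the scoring described above (synthetic event times, not Virtual California output; the window lengths and 40,000-yr span are arbitrary), one can count events on element B within a time window of each event on element A, subtract the count expected under a uniform distribution of B's events, and normalize by the window size:

    import numpy as np

    def correlation_score(times_a, times_b, windows, total_time):
        """Sum over window lengths of (observed - expected) associations,
        each normalized by the window length."""
        times_b = np.sort(np.asarray(times_b, dtype=float))
        score = 0.0
        for w in windows:
            observed = 0
            for t in times_a:
                lo = np.searchsorted(times_b, t - w, side="left")
                hi = np.searchsorted(times_b, t + w, side="right")
                observed += hi - lo
            # expected count if B's events were uniformly distributed in time
            expected = len(times_a) * len(times_b) * (2 * w) / total_time
            score += (observed - expected) / w
        return score

    # Toy example: element B tends to slip about 1 yr after element A.
    rng = np.random.default_rng(4)
    a = np.sort(rng.uniform(0, 40000, 50))
    b = np.concatenate([a[:30] + rng.normal(1.0, 0.5, 30),
                        rng.uniform(0, 40000, 20)])
    print(round(correlation_score(a, b, windows=[1, 5, 10, 50], total_time=40000.0), 2))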
Zhong, Zhentao; Yu, Yue; Jin, Shufang; Pan, Jinming
2018-01-01
The hatch window, which varies from 24 to 48 h, is known to influence the post-hatch performance of chicks. A narrow hatch window is needed in the commercial poultry industry to achieve a high level of uniformity in chick quality. Hatching synchronization observed in avian species presents possibilities for altering the hatch window in artificial incubation. Layer eggs which were laid on the same day by a single breeder flock and stored for no more than two days started incubation 12 h apart to obtain a developmental distinction. The eggs of different initial incubation times were mixed, row adjacent to row, on day 12 of incubation. During the hatching period (from day 18), the hatching time of individual eggs and the hatch window were obtained by video recordings. Embryonic development (days 18 and 20) and post-hatch performance up to day 7 were measured. The manipulation of mixing eggs of different initial incubation times shortened the hatch window of the late-incubated eggs in the manipulated group by delaying the onset of the hatching process, and improved hatchability. Compared to the control groups, chick embryos or chicks in the egg redistribution group showed no significant difference in embryonic development and post-hatch performance up to day 7. We have demonstrated that eggs incubated alongside advanced eggs exhibited a narrow hatch spread with higher hatchability, normal embryonic development and unaffected chick quality. This specific manipulation is applicable in industrial poultry production to shorten the hatch window and improve the uniformity of chick quality.
Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wilson, Ander; Coull, Brent A; Pendo, Mathew P; Baccarelli, Andrea; Kloog, Itai; Schwartz, Joel; Wright, Robert O; Taveras, Elsie M; Wright, Rosalind J
2017-10-01
Evolving animal studies and limited epidemiological data show that prenatal air pollution exposure is associated with childhood obesity. Timing of exposure and child sex may play an important role in these associations. We applied an innovative method to examine sex-specific sensitive prenatal windows of exposure to PM2.5 on anthropometric measures in preschool-aged children. Analyses included 239 children born ≥ 37 weeks gestation in an ethnically-mixed lower-income urban birth cohort. Prenatal daily PM2.5 exposure was estimated using a validated satellite-based spatio-temporal model. Body mass index z-score (BMI-z), fat mass, % body fat, subscapular and triceps skinfold thickness, waist and hip circumferences and waist-to-hip ratio (WHR) were assessed at age 4.0 ± 0.7 years. Using Bayesian distributed lag interaction models (BDLIMs), we examined sex differences in sensitive windows of weekly averaged PM2.5 levels on these measures, adjusting for child age, maternal age, education, race/ethnicity, and pre-pregnancy BMI. Mothers were primarily Hispanic (55%) or Black (26%), had ≤ 12 years of education (66%) and never smoked (80%). Increased PM2.5 exposure 8-17 and 15-22 weeks gestation was significantly associated with increased BMI z-scores and fat mass in boys, but not in girls. Higher PM2.5 exposure 10-29 weeks gestation was significantly associated with increased WHR in girls, but not in boys. Prenatal PM2.5 was not significantly associated with other measures of body composition. Estimated cumulative effects across pregnancy, accounting for sensitive windows and within-window effects, were 0.21 (95%CI = 0.01-0.37) for BMI-z and 0.36 (95%CI = 0.12-0.68) for fat mass (kg) in boys, and 0.02 (95%CI = 0.01-0.03) for WHR in girls, all per µg/m3 increase in PM2.5. Increased prenatal PM2.5 exposure was more strongly associated with indices of increased whole body size in boys and with an indicator of body shape in girls. Methods to better characterize vulnerable windows may provide insight into underlying mechanisms contributing to sex-specific associations. Copyright © 2017 Elsevier Inc. All rights reserved.
Poser, H; Russello, G; Zanella, A; Bellini, L; Gelli, D
2011-12-01
Echocardiographic evaluation was performed in six healthy young adult non-sedated terrapins (Trachemys scripta elegans). The best imaging quality was obtained through the right cervical window. Base-apex inflow and outflow views were recorded; ventricular size, ventricular wall thickness and ventricular outflow tract were measured, and fractional shortening was calculated. Pulsed-wave Doppler interrogation enabled the diastolic biphasic atrio-ventricular flow and the systolic ventricular outflow patterns to be recorded. The following Doppler-derived functional parameters were calculated: early diastolic (E) and late diastolic (A) wave peak velocities, E/A ratio, ventricular outflow systolic peak and mean velocities and gradients, Velocity-Time Integral, acceleration and deceleration times, and Ejection Time. For each parameter the mean, standard deviation and 95% confidence interval were calculated. Echocardiography proved to be a useful and easy-to-perform diagnostic tool in this poorly known species, which can be difficult to evaluate.
Zhao, Chen; Zhang, Shunqi; Liu, Zhipeng; Yin, Tao
2015-07-01
A new method to improve the focalization and efficiency of the Figure of Eight (FOE) coil in rTMS is discussed in this paper. In order to decrease the half width of the distribution curve (HWDC) and to increase the ratio of the positive peak value to the negative peak value (RPN) of the induced electric field, a shield plate with a window and a ferromagnetic block are introduced to enhance the positive peak value of the induced electric field. The shield is made of highly conductive copper, and the block is made of highly permeable soft magnetic ferrite. A computer simulation was carried out in ANSYS® software to perform the finite element analysis (FEA). Two comparison coefficients were defined to optimize the sizes of the shield window and the block. Simulation results show that a shield with a 60 mm × 30 mm window, together with a block 40 mm thick, can decrease the focal area of the FOE coil by 46.7% while increasing the RPN by 135.9%. The block enhances the peak value of the electric field induced by the shielded FOE coil by 8.4%. A real human head model was used in this paper to further verify our method.
Generating Daily Synthetic Landsat Imagery by Combining Landsat and MODIS Data
Wu, Mingquan; Huang, Wenjiang; Niu, Zheng; Wang, Changyao
2015-01-01
Owing to low temporal resolution and cloud interference, there is a shortage of high spatial resolution remote sensing data. To address this problem, this study introduces a modified spatial and temporal data fusion approach (MSTDFA) to generate daily synthetic Landsat imagery. This algorithm was designed to avoid the limitations of the conditional spatial temporal data fusion approach (STDFA), including the constant window for disaggregation and the sensor difference. An adaptive window size selection method is proposed in this study to select the best window size and moving steps for the disaggregation of coarse pixels. The linear regression method is used to remove the influence of differences in sensor systems using the disaggregated mean coarse reflectance, with testing and validation in two study areas located in Xinjiang Province, China. The results show that the MSTDFA algorithm can generate daily synthetic Landsat imagery with a high correlation coefficient (R) ranging from 0.646 to 0.986 between the synthetic images and the actual observations. We further show that MSTDFA can be applied to 250 m 16-day MODIS MOD13Q1 products and Landsat Normalized Difference Vegetation Index (NDVI) data by generating a synthetic NDVI image highly similar to the actual Landsat NDVI observation, with a high R of 0.97. PMID:26393607
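The sketch below is a greatly simplified illustration of the general spatio-temporal fusion idea (regression-based sensor adjustment plus a disaggregated coarse-resolution change), not the MSTDFA algorithm itself; all data and the 10x10 block size are synthetic:

    import numpy as np

    def fuse(fine_t1, coarse_t1, coarse_t2, block):
        """Toy fusion: regress window-mean fine reflectance on the coarse
        reflectance at t1 to remove sensor bias, then add the regression-scaled
        coarse temporal change to the fine image."""
        h, w = coarse_t1.shape
        # Aggregate the fine image to the coarse grid (block x block windows).
        fine_mean = fine_t1.reshape(h, block, w, block).mean(axis=(1, 3))
        # Linear regression fine_mean = a * coarse_t1 + b (sensor adjustment).
        a, b = np.polyfit(coarse_t1.ravel(), fine_mean.ravel(), 1)
        delta = a * (coarse_t2 - coarse_t1)                    # adjusted coarse change
        delta_up = np.kron(delta, np.ones((block, block)))     # nearest "disaggregation"
        return fine_t1 + delta_up

    # Toy data: 8x8 coarse pixels, each covering a 10x10 fine-pixel window; the
    # coarse sensor has a gain of 1.1 and an offset of 0.02 relative to the fine one.
    rng = np.random.default_rng(5)
    truth_t1 = rng.uniform(0.1, 0.4, (80, 80))
    change = 0.05 * rng.standard_normal((8, 8))
    coarse_t1 = 1.1 * truth_t1.reshape(8, 10, 8, 10).mean(axis=(1, 3)) + 0.02
    coarse_t2 = coarse_t1 + 1.1 * change
    pred_t2 = fuse(truth_t1, coarse_t1, coarse_t2, block=10)
    true_t2 = truth_t1 + np.kron(change, np.ones((10, 10)))
    # In this noise-free toy the reconstruction is essentially exact (R near 1).
    print("R =", np.corrcoef(pred_t2.ravel(), true_t2.ravel())[0, 1].round(3))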
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
The Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with a highly concentrated energy distribution, widely used in seismic signal processing. However, it is contaminated by many cross terms. To suppress the cross terms of the WVD while keeping its high energy concentration, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen-class distribution and the Gabor transform, and combining a greedy strategy with the rotational invariance property of the fractional Fourier transform, the proposed window extends the one-dimensional, single-direction optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also solves the problem of the OFrGT, called the local concentration phenomenon, encountered with multi-component signals. Experiments on both signal models and real seismic signals reveal that the proposed window overcomes the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to the spectral decomposition of a seismic signal. The results show that the proposed method can explore the spatial distribution of a reservoir more precisely.
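For context, here is a compact discrete Wigner-Ville distribution computed directly from the instantaneous autocorrelation (illustrative only, not the paper's multi-directional filter; the two-tone test signal is arbitrary); the cross terms it produces for multi-component signals are what the ambiguity-domain windowing above is designed to suppress:

    import numpy as np
    from scipy.signal import hilbert

    def wigner_ville(x):
        """Discrete Wigner-Ville distribution (rows = time, columns = frequency bins)."""
        z = hilbert(np.asarray(x, dtype=float))      # analytic signal
        n = len(z)
        W = np.zeros((n, n))
        for t in range(n):
            taumax = min(t, n - 1 - t)
            tau = np.arange(-taumax, taumax + 1)
            r = np.zeros(n, dtype=complex)
            # instantaneous autocorrelation r[tau] = z[t+tau] * conj(z[t-tau])
            r[tau % n] = z[t + tau] * np.conj(z[t - tau])
            W[t] = np.fft.fft(r).real
        return W

    # Two-component test signal: 50 Hz and 120 Hz tones sampled at 500 Hz.
    fs = 500.0
    time = np.arange(0, 1.0, 1 / fs)
    x = np.cos(2 * np.pi * 50 * time) + np.cos(2 * np.pi * 120 * time)
    W = wigner_ville(x)
    # The lag convention z[t+tau]*conj(z[t-tau]) halves the frequency axis.
    freqs = np.fft.fftfreq(len(time), 1 / fs) / 2.0
    ridge = abs(freqs[np.argmax(W.mean(axis=0))])
    print(f"strongest time-averaged ridge near {ridge:.0f} Hz (auto terms at 50 and 120 Hz); "
          f"cross terms appear near 85 Hz but oscillate in time")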
Air transparent soundproof window
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Sang-Hoon, E-mail: shkim@mmu.ac.kr; Lee, Seong-Hyun
2014-11-15
A soundproof window or wall which is transparent to airflow is presented. The design is based on two wave theories: the theory of diffraction and the theory of acoustic metamaterials. It consists of a three-dimensional array of strong diffraction-type resonators with many holes centered on each individual resonator. The negative effective bulk modulus of the resonators produces an evanescent wave, while air holes of subwavelength diameter on the surfaces of the window provide macroscopic air ventilation. The acoustic performance of two soundproof windows with air holes of 20 mm and 50 mm diameter was measured. The sound level was reduced by about 30-35 dB in the frequency range of 400-5,000 Hz with the 20 mm window, and by about 20-35 dB in the frequency range of 700-2,200 Hz with the 50 mm window. A multiple stop-band was created by the multiple layers of the window. The attenuation length, or the thickness of the window, was limited by background noise. The effectiveness of the soundproof window with airflow was demonstrated by a real installation.
Madai, Vince Istvan; Wood, Carla N; Galinovic, Ivana; Grittner, Ulrike; Piper, Sophie K; Revankar, Gajanan S; Martin, Steve Z; Zaro-Weber, Olivier; Moeller-Hartmann, Walter; von Samson-Himmelstjerna, Federico C; Heiss, Wolf-Dieter; Ebinger, Martin; Fiebach, Jochen B; Sobesky, Jan
2016-01-01
With regard to acute stroke, patients with unknown time from stroke onset are not eligible for thrombolysis. Quantitative diffusion-weighted imaging (DWI) and fluid-attenuated inversion recovery (FLAIR) MRI relative signal intensity (rSI) biomarkers have been introduced to predict eligibility for thrombolysis, but have shown heterogeneous results in the past. In the present work, we investigated whether the inclusion of easily obtainable clinical-radiological parameters would improve the prediction of the thrombolysis time window by rSIs and compared their performance to the visual DWI-FLAIR mismatch. In a retrospective study, patients from 2 centers with proven stroke with onset <12 h were included. The DWI lesion was segmented and overlaid on ADC and FLAIR images. The rSI mean and SD were calculated as (mean ROI value / mean value of the unaffected hemisphere). Additionally, the visual DWI-FLAIR mismatch was evaluated. Prediction of the thrombolysis time window was evaluated by the area under the curve (AUC) derived from receiver operating characteristic (ROC) curve analysis. The association of age, National Institutes of Health Stroke Scale, MRI field strength, lesion size, vessel occlusion and Wahlund score with rSI was investigated, and the models were adjusted and stratified accordingly. In 82 patients, the unadjusted rSI measures DWI-mean and -SD showed the highest AUCs (AUC 0.86-0.87). Adjustment for clinical-radiological covariates significantly improved the performance of FLAIR-mean (0.91) and DWI-SD (0.91). The best prediction results based on the AUC were found for the final stratified and adjusted models of DWI-SD (0.94) and FLAIR-mean (0.96) and a multivariable DWI-FLAIR model (0.95). The adjusted visual DWI-FLAIR mismatch did not perform significantly worse (0.89). ADC-rSIs showed fair performance in all models. Quantitative DWI and FLAIR MRI biomarkers as well as the visual DWI-FLAIR mismatch provide excellent prediction of eligibility for thrombolysis in acute stroke when easily obtainable clinical-radiological parameters are included in the prediction models. © 2016 S. Karger AG, Basel.
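A minimal sketch of the rSI computation described above (the toy image, lesion mask, and hemisphere mask are invented):

    import numpy as np

    def relative_signal_intensity(image, lesion_mask, unaffected_mask):
        """rSI mean and SD: lesion voxels normalized by the mean signal of the
        unaffected (contralateral) hemisphere."""
        reference = image[unaffected_mask].mean()
        ratio = image[lesion_mask] / reference
        return ratio.mean(), ratio.std()

    # Toy 2D "FLAIR slice": left half unaffected, a bright 5x5 lesion on the right.
    rng = np.random.default_rng(6)
    img = rng.normal(100.0, 5.0, (64, 64))
    img[20:25, 40:45] += 15.0                      # hyperintense lesion
    lesion = np.zeros_like(img, dtype=bool)
    lesion[20:25, 40:45] = True
    unaffected = np.zeros_like(img, dtype=bool)
    unaffected[:, :32] = True
    print(relative_signal_intensity(img, lesion, unaffected))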
Second window of protection against infarction in conscious rabbits: real or artifactual.
Miki, T; Swafford, A N; Cohen, M V; Downey, J M
1999-04-01
To date, most studies of the second window of protection against infarction (SWOP) have evaluated infarct size by staining with triphenyltetrazolium chloride (TTC) soon after reperfusion. However, early TTC staining has been found to be an unreliable indicator of the ultimate infarct size following some interventions. Therefore, we tested whether SWOP could induce a sustained limitation of infarct size. Instrumented, conscious rabbits underwent 30 min of coronary occlusion. Infarct size was determined by either TTC staining after 3 h of reperfusion or conventional histology after 72 h of reperfusion. In the TTC study, 43.5+/-3.1% of the risk zone infarcted in the control group. Four cycles of 5 min ischemia/10 min reperfusion 24 h prior to 30 min ischemia significantly reduced infarct size measured by TTC to 32.5+/-2.3% (P<0.05 v control). In the histological study 57.8+/-3.6% of the risk zone infarcted in the control group. However, ischemic preconditioning 24 h prior to the 30 min ischemia did not protect the heart (59.3+/-4.4% infarction). Thus the infarct-limiting effect of SWOP evaluated with early TTC staining could not be demonstrated when infarction was assessed by histology after 3 days of reperfusion. These data suggest that SWOP may not have a sustained anti-infarct effect, but rather may simply delay the progression to infarction. Copyright 1999 Academic Press.
Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; ...
2014-03-28
Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid-framework simulations, in which the framework atoms were held at their crystallographic or geometry-optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
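Diffusion coefficients in MD studies like this are commonly extracted from the Einstein relation, MSD(t) ≈ 6Dt at long times; the generic sketch below (a synthetic random walk, not the HKUST-1/ZIF-8 trajectories) shows that calculation:

    import numpy as np

    def diffusion_coefficient(positions, dt):
        """Einstein-relation estimate: slope of the mean-squared displacement
        versus time, divided by 6 (3D). positions has shape (frames, atoms, 3)."""
        disp = positions - positions[0]               # displacement from frame 0
        msd = (disp ** 2).sum(axis=2).mean(axis=1)    # average over atoms
        t = np.arange(len(msd)) * dt
        half = len(msd) // 2                          # fit the latter, more linear half
        slope = np.polyfit(t[half:], msd[half:], 1)[0]
        return slope / 6.0

    # Synthetic 3D random walk with known step variance, so D = sigma^2 / (2*dt) per axis.
    rng = np.random.default_rng(7)
    dt, sigma = 1.0, 0.1
    steps = rng.normal(0.0, sigma, (2000, 50, 3))
    traj = np.cumsum(steps, axis=0)
    print(f"estimated D = {diffusion_coefficient(traj, dt):.4f}, "
          f"expected D = {sigma ** 2 / (2 * dt):.4f}")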
Displaying Special Characters and Symbols in Computer-Controlled Reaction Time Experiments.
ERIC Educational Resources Information Center
Friel, Brian M.; Kennison, Shelia M.
A procedure for using MEL2 (Version 2.0 of Microcomputer Experimental Laboratory) and FontWINDOW to present special characters and symbols in computer-controlled reaction time experiments is described. The procedure permits more convenience and flexibility than tachistoscopic and projection techniques. FontWINDOW allows researchers to design…
Attosecond light sources in the water window
NASA Astrophysics Data System (ADS)
Ren, Xiaoming; Li, Jie; Yin, Yanchun; Zhao, Kun; Chew, Andrew; Wang, Yang; Hu, Shuyuan; Cheng, Yan; Cunningham, Eric; Wu, Yi; Chini, Michael; Chang, Zenghu
2018-02-01
As a compact and burgeoning alternative to synchrotron radiation and free-electron lasers, high harmonic generation (HHG) has proven its superiority in static and time-resolved extreme ultraviolet spectroscopy over the past two decades and has recently attracted considerable interest and achieved success in generating soft x-ray emission covering the biologically important water window spectral region. Unlike synchrotron and free-electron sources, which suffer from relatively long pulse width or large timing jitter, soft x-ray sources from HHG can offer attosecond time resolution and be synchronized with their driving field to investigate time-resolved near-edge absorption spectroscopy, which can reveal rich structural and dynamical information about the interrogated samples. In this paper, we review recent progress on generating and characterizing attosecond light sources in the water window region. We show our development of an energetic, two-cycle, carrier-envelope-phase-stable laser source at 1.7 μm and our achievement in producing a 53 as soft x-ray pulse covering the carbon K-edge in the water window. Such a source paves the way for next-generation x-ray spectroscopy with unprecedented temporal resolution.
Aerial survey methodology for bison population estimation in Yellowstone National Park
Hess, Steven C.
2002-01-01
I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata was tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling', to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias in counting large groups of bison and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and increasingly occupied unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August. Low variability between surveys and high detection probability provide population estimates with an overall coefficient of variation of approximately 8% and have high power for detecting trends in population change. I demonstrated how population estimates from winter and summer can be integrated into a comprehensive monitoring program to estimate annual growth rates, overall winter mortality, and an index of calf production, requiring about 30 hours of flight per year.
2012-01-01
Background Data collection for economic evaluation alongside clinical trials is burdensome and cost-intensive. Limiting both the frequency of data collection and the recall periods can address this problem. As a consequence, gaps in survey periods arise and must be filled appropriately. The aims of our study are to assess the validity of incomplete cost data collection and define suitable resource categories. Methods In the randomised KORINNA study, cost data from 234 elderly patients were collected quarterly over a 1-year period. Different strategies for incomplete data collection were compared with complete data collection. The sample size calculation was modified in response to the elasticity of variance. Results Resource categories suitable for incomplete data collection were physiotherapy, ambulatory clinic in hospital, medication, consultations, outpatient nursing service and paid household help. Cost estimation from complete and incomplete data collection showed no difference when omitting information from one quarter. When omitting information from two quarters, costs were underestimated by 3.9% to 4.6%. Because of the observed increase in standard deviation, a 3% larger sample size would be required. Nevertheless, more time was saved than extra time would be required for additional patients. Conclusion Cost data can be collected efficiently by reducing the frequency of data collection. This can be achieved by incomplete data collection for shortened periods or by complete data collection with extended recall windows. In our analysis, annual cost estimates for ambulatory healthcare and non-healthcare services based on three data collections were as valid and accurate as those based on four complete data collections. In contrast, data on hospitalisation, rehabilitation stays and care insurance benefits should be collected for the entire target period, using extended recall windows. When applying the method of incomplete data collection, the sample size calculation has to be modified because of the increased standard deviation. This approach enables economic evaluation with lower costs to both study participants and investigators. Trial registration The trial registration number is ISRCTN02893746. PMID:22978572
Ultrasound-guided identification of cardiac imaging windows.
Liu, Garry; Qi, Xiu-Ling; Robert, Normand; Dick, Alexander J; Wright, Graham A
2012-06-01
Currently, the use of cine magnetic resonance imaging (MRI) to identify cardiac quiescent periods relative to the electrocardiogram (ECG) signal is insufficient for producing submillimeter-resolution coronary MR angiography (MRA) images. In this work, the authors perform a time series comparison between tissue Doppler echocardiograms of the interventricular septum (IVS) and concurrent biplane x-ray angiograms. Our results indicate very close agreement between the diastasis gating windows identified by both the IVS and x-ray techniques. Seven cath lab patients undergoing diagnostic angiograms were scanned simultaneously by ultrasound and biplane x-ray during a breath hold lasting six to eight heartbeats. The heart rate of each patient was stable. Dye was injected into either the left or right coronary vasculature. The IVS was imaged using color tissue Doppler in an apical four-chamber view. Diastasis was estimated on the IVS velocity curve. On the biplane angiograms, proximal, mid, and distal regions were identified on the coronary artery (CA). Frame-by-frame correlation was used to derive displacement, and then velocity, for each region. The quiescent periods for a CA and its subsegments were estimated based on velocity. Using Pearson's correlation coefficient and Bland-Altman analysis, the authors compared the start and end times of the diastasis windows as estimated from the IVS and CA velocities. The authors also estimated the vessel blur across the diastasis windows of multiple sequential heartbeats of each patient. In total, 17 heartbeats were analyzed. The range of heart rate observed across patients was 47-79 beats per minute (bpm) with a mean of 57 bpm. Significant correlations (R > 0.99; p < 0.01) were observed between the IVS and x-ray techniques for the identification of the start and end times of diastasis windows. The mean difference in the starting times between IVS and CA quiescent windows was -12.0 ms. The mean difference in end times between IVS and CA quiescent windows was -3.5 ms. In contrast, the correlations between the RR interval and both the start and duration of the x-ray gating windows were relatively weaker: R = 0.63 (p = 0.13) and R = 0.86 (p = 0.01). For IVS gating windows, the average estimated vessel blurs during single and multiple heartbeats were 0.5 and 0.66 mm, respectively. For x-ray gating windows, the corresponding values were 0.26 and 0.44 mm, respectively. In this study, the authors showed that IVS velocity can be used to identify periods of diastasis for coronary arteries. Despite variability in mid-diastolic rest positions over multiple steady rate heartbeats, vessel blurring of 0.5-1 mm was found to be achievable using the IVS gating technique. The authors envision this leading to a new cardiac gating system that, compared with conventional ECG gating, provides better resolution and shorter scan times for coronary MRA. © 2012 American Association of Physicists in Medicine.
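The gating-window idea above (find the interval where the septal or vessel velocity stays low) can be sketched as follows; the sampling rate, velocity threshold and synthetic trace are assumptions for illustration, not the authors' processing chain.

```python
# Hedged sketch: longest quiescent (diastasis-like) window from a velocity trace.
import numpy as np

def quiescent_window(velocity, t, threshold=10.0):
    """Return (start, end) times of the longest run where |velocity| < threshold (mm/s)."""
    quiet = np.abs(velocity) < threshold
    edges = np.diff(quiet.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if quiet[0]:
        starts = np.r_[0, starts]
    if quiet[-1]:
        ends = np.r_[ends, len(quiet)]
    k = np.argmax(ends - starts)                      # pick the longest quiet run
    return t[starts[k]], t[ends[k] - 1]

fs = 500.0                                            # Hz, assumed frame/sample rate
t = np.arange(0, 1.0, 1 / fs)                         # one heartbeat
velocity = 60 * np.sin(2 * np.pi * 1.2 * t) * (t < 0.55)   # motion early, quiescence later
print(quiescent_window(velocity, t))
```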
Image-guided adaptive gating of lung cancer radiotherapy: a computer simulation study
NASA Astrophysics Data System (ADS)
Aristophanous, Michalis; Rottmann, Joerg; Park, Sang-June; Nishioka, Seiko; Shirato, Hiroki; Berbeco, Ross I.
2010-08-01
The purpose of this study is to investigate the effect that image-guided adaptation of the gating window during treatment could have on the residual tumor motion, by simulating different gated radiotherapy techniques. There are three separate components of this simulation: (1) the 'Hokkaido Data', which are previously measured 3D data of lung tumor motion tracks and the corresponding 1D respiratory signals obtained during the entire ungated radiotherapy treatments of eight patients, (2) the respiratory gating protocol at our institution and the imaging performed under that protocol and (3) the actual simulation in which the Hokkaido Data are used to select tumor position information that could have been collected based on the imaging performed under our gating protocol. We simulated treatments with a fixed gating window and a gating window that is updated during treatment. The patient data were divided into different fractions, each with continuous acquisitions longer than 2 min. In accordance with the imaging performed under our gating protocol, we assume that we have tumor position information for the first 15 s of treatment, obtained from kV fluoroscopy, and for the rest of the fractions the tumor position is only available during the beam-on time from MV imaging. The gating window was set according to the information obtained from the first 15 s such that the residual motion was less than 3 mm. For the fixed gating window technique the gate remained the same for the entire treatment, while for the adaptive technique the range of the tumor motion during beam-on time was measured and used to adapt the gating window to keep the residual motion below 3 mm. The algorithm used to adapt the gating window is described. The residual tumor motion inside the gating window was reduced on average by 24% for the patients with regular breathing patterns and the difference was statistically significant (p-value = 0.01). The magnitude of the residual tumor motion depended on the regularity of the breathing pattern, suggesting that image-guided adaptive gating should be combined with breath coaching. The adaptive gating window technique was able to track the exhale position of the breathing cycle quite successfully. Out of a total of 53 fractions the duty cycle was greater than 20% for 42 fractions for the fixed gating window technique and for 39 fractions for the adaptive gating window technique. The results of this study suggest that real-time updating of the gating window can result in reliably low residual tumor motion and therefore can facilitate safe margin reduction.
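A hedged sketch of the adaptive-gating logic described above: the gate is initialized from the first seconds of tumor-position data and then re-centred on the observed exhale position after each beam-on period so that the residual motion stays below 3 mm. The percentile-based update rule and all numbers are illustrative assumptions, not the authors' exact algorithm.

```python
# Hedged sketch: fixed vs. adaptive gating window that tracks exhale-baseline drift.
import numpy as np

RESIDUAL_LIMIT = 3.0  # mm

def initial_gate(setup_positions, limit=RESIDUAL_LIMIT):
    exhale = np.percentile(setup_positions, 5)       # exhale approximated by a low percentile
    return exhale, exhale + limit                    # (lower, upper) gate edges in mm

def adapt_gate(gate, beam_on_positions, limit=RESIDUAL_LIMIT):
    """Re-centre the gate on the exhale position observed during the last beam-on period."""
    exhale = np.percentile(beam_on_positions, 5)
    return exhale, exhale + limit

rng = np.random.default_rng(2)
setup = 1.5 * np.sin(2 * np.pi * 0.25 * np.arange(0, 15, 0.1)) + rng.normal(0, 0.2, 150)
gate = initial_gate(setup)                           # from the first 15 s of positions
for drift in (0.0, 0.8, 1.6):                        # simulated slow exhale-baseline drift (mm)
    beam_on = drift + setup[:40]
    gate = adapt_gate(gate, beam_on)
    print(f"gate = ({gate[0]:.2f}, {gate[1]:.2f}) mm")
```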
NASA Astrophysics Data System (ADS)
Guzman, L.; Baeza-Blancas, E.; Reyes, I.; Angulo Brown, F.; Rudolf Navarro, A.
2017-12-01
By studying earthquake magnitude catalogs, previous studies have reported evidence that changes in the spatial and temporal organization of earthquake activity are observed before and after a main shock. These studies have used different approaches for detecting clustering behavior and event-distance density in order to point out the asymmetric behavior of foreshocks and aftershocks. Here, we present a statistical analysis of the seismic activity related to the M8.2 and M7.1 earthquakes that occurred on Sept. 7th and Sept. 19th, respectively. First, we calculated the interevent time and distance for the period Sept. 7th 2016 until Oct. 20th 2017 for each seismic region (a radius of 150 km centered at the coordinates of the M8.2 and M7.1 events). Next, we calculated the "velocity" of the walker as the ratio between the interevent distance and interevent time, and similarly constructed the "acceleration". A sliding pointer is used to estimate statistical features within time windows of size τ for the velocity and acceleration sequences before and after the main shocks. Specifically, we applied the fractal dimension method to detect changes in the correlation (persistence) behavior of events in the period before the main events. Our preliminary results point out that the fractal dimension associated with the velocity and acceleration sequences exhibits changes in persistence behavior before the main shock, while the scaling dimension values after the main events resemble a more uncorrelated behavior. Moreover, the relationship between the standard deviation of the velocity and the local mean velocity value for a given time window size τ is described by an exponent close to 1.5, and the cumulative distributions of velocity and acceleration are well described by power-law functions after the main shock and by stretched-exponential-like distributions before it. On the other hand, we present an analysis of patterns of seismic quiescence before the M8.2 earthquake based on the Schreider algorithm over a period of 27 years. This analysis also includes the modification of the Schreider method proposed by Muñoz-Diosdado et al. (2015).
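The "walker" quantities above (interevent distance over interevent time as a velocity, its differences as an acceleration, and sliding-window statistics of size τ) can be sketched as follows with a synthetic catalog; the catalog columns, distance conversion and window size are assumptions.

```python
# Hedged sketch: interevent "walker" velocity/acceleration and windowed statistics.
import numpy as np

def walker_series(times, lats, lons):
    """times in days, lats/lons in degrees; returns velocity and acceleration sequences."""
    dt = np.diff(times)
    dd = np.hypot(np.diff(lats), np.diff(lons)) * 111.0   # rough degrees -> km conversion
    v = dd / np.maximum(dt, 1e-6)
    a = np.diff(v) / np.maximum(dt[1:], 1e-6)
    return v, a

def windowed_stats(x, tau=50):
    """Mean and standard deviation of x in non-overlapping windows of tau events."""
    n = len(x) // tau
    blocks = x[: n * tau].reshape(n, tau)
    return blocks.mean(axis=1), blocks.std(axis=1)

rng = np.random.default_rng(3)
times = np.sort(rng.uniform(0, 400, 2000))               # synthetic catalog
lats = 15.0 + rng.normal(0, 0.5, 2000)
lons = -94.0 + rng.normal(0, 0.5, 2000)
v, a = walker_series(times, lats, lons)
mean_v, std_v = windowed_stats(v)
print(std_v[:5] / mean_v[:5])                            # local fluctuation ratio per window
```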
Multi-Window Controllers for Autonomous Space Systems
NASA Technical Reports Server (NTRS)
Lurie, B. J.; Hadaegh, F. Y.
1997-01-01
Multi-window controllers select between elementary linear controllers using nonlinear windows based on the amplitude and frequency content of the feedback error. The controllers are relatively simple to implement and perform much better than linear controllers. The commanders for such controllers only order the destination point and are freed from generating the command time-profiles. Robotic missions rely heavily on the tasks of acquisition and tracking. For autonomous and optimal control of the spacecraft, the control bandwidth must be larger while the feedback can (and, therefore, must) be reduced. Combining linear compensators via a multi-window nonlinear summer guarantees the minimum-phase character of the combined transfer function. It is shown that the solution may require using several parallel branches and windows. Several examples of multi-window nonlinear controller applications are presented.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of the moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is long connectivity memory. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high-frequency (10 minutes) wind data from 120 measuring stations in Switzerland, for the time period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are compared with the Pearson correlation network as well.
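A sketch of building a directed connectivity network from windowed Granger causality tests, in the spirit of the analysis above; the synthetic stations, window size and 5% significance level are assumptions, and the implementation relies on statsmodels rather than the authors' code.

```python
# Hedged sketch: moving-window Granger-causality adjacency matrices for wind stations.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def granger_adjacency(data, maxlag=2, alpha=0.05):
    """data: (n_samples, n_stations). Edge j -> i if column j Granger-causes column i."""
    n = data.shape[1]
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            res = grangercausalitytests(data[:, [i, j]], maxlag=maxlag, verbose=False)
            pval = res[maxlag][0]["ssr_ftest"][1]
            adj[j, i] = pval < alpha
    return adj

rng = np.random.default_rng(4)
wind = rng.normal(0, 1, size=(2000, 5))              # synthetic 10-min wind speeds, 5 stations
wind[1:, 1] += 0.6 * wind[:-1, 0]                    # station 0 drives station 1 with lag 1
window = 432                                         # roughly 3 days of 10-min data (assumed)
for start in range(0, len(wind) - window + 1, window):
    adj = granger_adjacency(wind[start : start + window])
    print(f"window starting at {start}: {adj.sum()} directed edges")
```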
NASA Astrophysics Data System (ADS)
Li, Zhuo; Seo, Min-Woong; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji
2016-04-01
This paper presents the design and implementation of a time-resolved CMOS image sensor with a high-speed lateral electric field modulation (LEFM) gating structure for time domain fluorescence lifetime measurement. Time-windowed signal charge can be transferred from a pinned photodiode (PPD) to a pinned storage diode (PSD) by turning on a pair of transfer gates, which are situated beside the channel. Unwanted signal charge can be drained from the PPD to the drain by turning on another pair of gates. The pixel array contains 512 (V) × 310 (H) pixels with 5.6 × 5.6 µm² pixel size. The imager chip was fabricated using 0.11 µm CMOS image sensor process technology. The prototype sensor has a time response of 150 ps at 374 nm. The fill factor of the pixels is 5.6%. The usefulness of the prototype sensor is demonstrated for fluorescence lifetime imaging through simulation and measurement results.
NASA Astrophysics Data System (ADS)
Jalali, Payman; Hyppänen, Timo
2017-06-01
In loose or moderately dense particle mixtures, the contact forces between particles due to successive collisions create an average volumetric solid-solid drag force between different granular phases (of different particle sizes). The derivation of the mathematical formula for this drag force is based on the homogeneity of the mixture within the computational control volume. This assumption fails especially when the size ratio of particles grows to a large value of 10 or greater. The size-driven inhomogeneity is responsible for the deviation of the intergranular force from the continuum formula. In this paper, we have implemented discrete element method (DEM) simulations to obtain the volumetric mean force exchanged between granular phases with size ratios greater than 10. First, the force is calculated directly from DEM, averaged over a proper time window. Second, the continuum formula is applied to calculate the drag forces using the DEM quantities. We show that the two volumetric forces are in good agreement as long as the homogeneity condition is maintained. However, the relative motion of larger particles in a cloud of finer particles imposes an inhomogeneous distribution of finer particles around the larger ones. We present correction factors to the volumetric force from the continuum formula.
ERIC Educational Resources Information Center
American School and University, 1983
1983-01-01
An energy-conscious renovation, in which a middle school was turned into a junior high school, utilized fewer windows and an earth berm. These and other conservation measures allowed the school, now double in size, to maintain the existing heating plant. (MLF)
Airborne target tracking algorithm against oppressive decoys in infrared imagery
NASA Astrophysics Data System (ADS)
Sun, Xiechang; Zhang, Tianxu
2009-10-01
This paper presents an approach for tracking airborne targets against oppressive infrared decoys. An oppressive decoy lures an infrared-guided missile with its high infrared radiation. Traditional tracking algorithms suffer degraded stability, and may even fail, when an airborne target continuously releases decoys. The proposed approach first determines an adaptive tracking window. The center of the tracking window is set at a predicted target position computed from a uniform motion model. Different strategies are applied to determine the tracking window size according to the target state. The image within the tracking window is segmented and multiple features of candidate targets are extracted. The most similar candidate target is associated with the tracked target by using a decision function, which calculates a weighted sum of normalized feature differences between two comparable targets. The integrated intensity ratio of the associated target and the tracked target, and the target centroid, are examined to estimate the target state in the presence of decoys. The tracking ability and robustness of the proposed approach have been validated by processing available real-world and simulated infrared image sequences containing airborne targets and oppressive decoys.
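The association step described above (score candidates inside the adaptive tracking window by a weighted sum of normalized feature differences) might look roughly like the following; the feature set, weights and normalization scales are illustrative assumptions, not the paper's actual parameters.

```python
# Hedged sketch: candidate-target association by a weighted feature-difference score.
import numpy as np

FEATURES = ("area", "mean_intensity", "aspect_ratio", "centroid_dist")
WEIGHTS = np.array([0.25, 0.35, 0.15, 0.25])          # assumed relative importance

def decision_score(candidate, target, scales):
    """Lower is better: weighted sum of |feature difference| / typical scale."""
    diffs = np.array([abs(candidate[f] - target[f]) / scales[f] for f in FEATURES])
    return float(WEIGHTS @ diffs)

def associate(candidates, target, scales):
    scores = [decision_score(c, target, scales) for c in candidates]
    return int(np.argmin(scores))

target = {"area": 120.0, "mean_intensity": 210.0, "aspect_ratio": 1.4, "centroid_dist": 0.0}
scales = {"area": 50.0, "mean_intensity": 60.0, "aspect_ratio": 0.5, "centroid_dist": 15.0}
candidates = [
    {"area": 300.0, "mean_intensity": 250.0, "aspect_ratio": 1.0, "centroid_dist": 4.0},  # decoy-like
    {"area": 115.0, "mean_intensity": 205.0, "aspect_ratio": 1.5, "centroid_dist": 6.0},  # target-like
]
print("associated candidate:", associate(candidates, target, scales))
```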
Sink detection on tilted terrain for automated identification of glacial cirques
NASA Astrophysics Data System (ADS)
Prasicek, Günther; Robl, Jörg; Lang, Andreas
2016-04-01
Glacial cirques are morphologically distinct but complex landforms and represent a vital part of high mountain topography. Their distribution, elevation and relief are expected to hold information on (1) the extent of glacial occupation, (2) the mechanism of glacial cirque erosion, and (3) how glacial processes, in concert with periglacial processes, can limit peak altitude and mountain range height. While easily detectable to the expert's eye both in nature and on various representations of topography, their complicated nature makes them a nemesis for computer algorithms. Consequently, manual mapping of glacial cirques is commonplace in many mountain landscapes worldwide, but consistent datasets of cirque distribution and objectively mapped cirques and their morphometric attributes are lacking. Among the biggest problems for algorithm development are the complexity in shape and the great variability of cirque size. For example, glacial cirques can be rather circular or longitudinal in extent, exist as individual or composite landforms, show prominent topographic depressions or be entirely filled with water or sediment. For these reasons, attributes like circularity, size, drainage area and topology of landform elements (e.g. a flat floor surrounded by steep walls) have only a limited potential for automated cirque detection. Here we present a novel geomorphometric method for automated identification of glacial cirques on digital elevation models that exploits their genetic bowl-like shape. First, we differentiate between glacial and fluvial terrain employing an algorithm based on a moving-window approach and multi-scale curvature, which is also capable of fitting the analysis window to valley width. We then fit a plane to the valley stretch clipped by the analysis window and rotate the terrain around the center cell until the plane is level. Doing so, we produce sinks of considerable size if the clipped terrain represents a cirque, while no or only very small sinks develop on other valley stretches. We normalize sink area by window size for sink classification, apply this method to the Sawtooth Mountains, Idaho, and to Fiordland, New Zealand, and compare the results to manually mapped reference cirques. Results indicate that false negatives are produced only in very rugged terrain and false positives occur only in rare cases, when valleys are strongly curved in the longitudinal direction.
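The core detection step above (clip a terrain window, level it by a fitted plane, and measure the sinks that appear, normalized by window size) can be sketched as follows; the plane fit stands in for the described rotation, the depression filling uses morphological reconstruction, and the depth threshold and synthetic terrain are assumptions.

```python
# Hedged sketch: normalized sink area of a detrended terrain window.
import numpy as np
from skimage.morphology import reconstruction

def normalized_sink_area(window_dem, depth_tol=2.0):
    """window_dem: 2-D elevation array (m). Returns sink area / window area."""
    rows, cols = np.indices(window_dem.shape)
    A = np.column_stack([rows.ravel(), cols.ravel(), np.ones(window_dem.size)])
    coeffs, *_ = np.linalg.lstsq(A, window_dem.ravel(), rcond=None)
    detrended = window_dem - (A @ coeffs).reshape(window_dem.shape)

    # Fill depressions by morphological reconstruction by erosion from the border.
    seed = detrended.copy()
    seed[1:-1, 1:-1] = detrended.max()
    filled = reconstruction(seed, detrended, method="erosion")
    sink_depth = filled - detrended
    return (sink_depth > depth_tol).sum() / window_dem.size

# Synthetic bowl-shaped "cirque" on a tilted slope versus a plain tilted slope.
y, x = np.mgrid[0:80, 0:80]
slope = 0.5 * x
bowl = slope - 30.0 * np.exp(-((x - 40) ** 2 + (y - 40) ** 2) / 200.0)
print(normalized_sink_area(bowl), normalized_sink_area(slope))
```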
Numerical and experimental validation for the thermal transmittance of windows with cellular shades
Hart, Robert
2018-02-21
Some highly energy efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux up to 20% when AR = 0.5. CFD analysis also showed that wave-type surfaces of cellular shades have less than 2% impact on heat flux through the cavities and less than 5% for natural convection of room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.
Gundogdu, Erhan; Ozkan, Huseyin; Alatan, A Aydin
2017-11-01
Correlation filters have been successfully used in visual tracking due to their modeling power and computational efficiency. However, the state-of-the-art correlation filter-based (CFB) tracking algorithms tend to quickly discard the previous poses of the target, since they consider only a single filter in their models. On the contrary, our approach is to register multiple CFB trackers for previous poses and exploit the registered knowledge when an appearance change occurs. To this end, we propose a novel tracking algorithm [of complexity O(D)] based on a large ensemble of CFB trackers. The ensemble [of size O(2^D)] is organized over a binary tree (depth D), and learns the target appearance subspaces such that each constituent tracker becomes an expert of a certain appearance. During tracking, the proposed algorithm combines only the appearance-aware relevant experts to produce boosted tracking decisions. Additionally, we propose a versatile spatial windowing technique to enhance the individual expert trackers. For this purpose, spatial windows are learned for target objects as well as the correlation filters, and then the windowed regions are processed for more robust correlations. In our extensive experiments on benchmark datasets, we achieve a substantial performance increase by using the proposed tracking algorithm together with the spatial windowing.
Interactive floating windows: a new technique for stereoscopic video games
NASA Astrophysics Data System (ADS)
Zerebecki, Chris; Stanfield, Brodie; Tawadrous, Mina; Buckstein, Daniel; Hogue, Andrew; Kapralos, Bill
2012-03-01
The film industry has a long history of creating compelling experiences in stereoscopic 3D. Recently, the video game as an artistic medium has matured into an effective way to tell engaging and immersive stories. Given the current push to bring stereoscopic 3D technology into the consumer market, there is considerable interest in developing stereoscopic 3D video games. Game developers have largely ignored the need to design their games specifically for stereoscopic 3D and have thus relied on automatic conversion and driver technology. Game developers need to evaluate solutions used in other media, such as film, to correct perceptual problems such as window violations, and modify or create new solutions to work within an interactive framework. In this paper we extend the dynamic floating window technique into the interactive domain, enabling the player to position a virtual window in space. By interactively changing the position, size, and 3D rotation of the virtual window, objects can be made to 'break the mask', dramatically enhancing the stereoscopic effect. By demonstrating that solutions from the film industry can be extended into the interactive space, we hope to initiate further discussion in the game development community on strengthening story-telling mechanisms in stereoscopic 3D games.
Carreiro, André V; Amaral, Pedro M T; Pinto, Susana; Tomás, Pedro; de Carvalho, Mamede; Madeira, Sara C
2015-12-01
Amyotrophic Lateral Sclerosis (ALS) is a devastating disease and the most common neurodegenerative disorder of young adults. ALS patients present a rapidly progressive motor weakness. This usually leads to death within a few years from respiratory failure. The correct prediction of respiratory insufficiency is thus key for patient management. In this context, we propose an innovative approach for prognostic prediction based on patient snapshots and time windows. We first cluster temporally related tests to obtain snapshots of the patient's condition at a given time (patient snapshots). Then we use the snapshots to predict the probability of an ALS patient requiring assisted ventilation after k days from the time of clinical evaluation (time window). This probability is based on the patient's current condition, evaluated using clinical features, including functional impairment assessments and a complete set of respiratory tests. The prognostic models include three temporal windows, allowing short-, medium- and long-term prognosis regarding progression to assisted ventilation. Experimental results show an area under the receiver operating characteristic curve (AUC) in the test set of approximately 79% for time windows of 90, 180 and 365 days. Creating patient snapshots using hierarchical clustering with constraints outperforms the state of the art, and the proposed prognostic model becomes the first non-population-based approach for prognostic prediction in ALS. The results are promising and should enhance current clinical practice, which is largely supported by non-standardized tests and clinicians' experience. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hong, Guosong; Zou, Yingping; Antaris, Alexander L.; Diao, Shuo; Wu, Di; Cheng, Kai; Zhang, Xiaodong; Chen, Changxin; Liu, Bo; He, Yuehui; Wu, Justin Z.; Yuan, Jun; Zhang, Bo; Tao, Zhimin; Fukunaga, Chihiro; Dai, Hongjie
2014-06-01
In vivo fluorescence imaging in the second near-infrared window (1.0-1.7 μm) can afford deep tissue penetration and high spatial resolution, owing to the reduced scattering of long-wavelength photons. Here we synthesize a series of low-bandgap donor/acceptor copolymers with tunable emission wavelengths of 1,050-1,350 nm in this window. Non-covalent functionalization with phospholipid-polyethylene glycol results in water-soluble and biocompatible polymeric nanoparticles, allowing live-cell molecular imaging at >1,000 nm with polymer fluorophores for the first time. Importantly, the high quantum yield of the polymer allows for in vivo, deep-tissue and ultrafast imaging of mouse arterial blood flow with an unprecedented frame rate of >25 frames per second. The high time resolution enables spatially and temporally resolved imaging of the blood-flow pattern in a cardiogram waveform over a single cardiac cycle (~200 ms) of a mouse, which had not previously been observed with fluorescence imaging in this window.
Low-complexity image processing for real-time detection of neonatal clonic seizures.
Ntonfo, Guy Mathurin Kouamou; Ferrari, Gianluigi; Raheli, Riccardo; Pisani, Francesco
2012-05-01
In this paper, we consider a novel low-complexity real-time image-processing-based approach to the detection of neonatal clonic seizures. Our approach is based on the extraction, from a video of a newborn, of an average luminance signal representative of the body movements. Since clonic seizures are characterized by periodic movements of parts of the body (e.g., the limbs), by evaluating the periodicity of the extracted average luminance signal it is possible to detect the presence of a clonic seizure. The periodicity is investigated, through a hybrid autocorrelation-Yin estimation technique, on a per-window basis, where a time window is defined as a sequence of consecutive video frames. While processing is first carried out on a single window basis, we extend our approach to interlaced windows. The performance of the proposed detection algorithm is investigated, in terms of sensitivity and specificity, through receiver operating characteristic curves, considering video recordings of newborns affected by neonatal seizures.
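A minimal sketch of the per-window periodicity test described above: the average luminance signal is split into time windows, each window is autocorrelated, and a strong peak at a physiologically plausible lag flags clonic-like periodic movement. The frame rate, lag range and threshold are assumptions, and the Yin component of the hybrid estimator is omitted for brevity.

```python
# Hedged sketch: windowed autocorrelation test for periodic (clonic-like) movement.
import numpy as np

def is_periodic(window, fs, f_min=0.5, f_max=5.0, threshold=0.5):
    """window: average-luminance samples; fs: video frame rate (Hz)."""
    x = window - window.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1 :]
    ac /= ac[0] + 1e-12                               # normalized autocorrelation
    lo, hi = int(fs / f_max), int(fs / f_min)         # lags corresponding to 0.5-5 Hz movements
    return ac[lo:hi].max() > threshold

fs = 25.0                                             # frames per second (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
luminance = np.where(t < 5,
                     rng.normal(0, 1, t.size),                                   # random movement
                     np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.normal(0, 1, t.size))  # 2 Hz clonic-like
windows = luminance.reshape(-1, int(2 * fs))          # 2-second analysis windows
print([is_periodic(w, fs) for w in windows])
```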
Size and location of defects at the coupling interface affect lithotripter performance.
Li, Guangyan; Williams, James C; Pishchalnikov, Yuri A; Liu, Ziyue; McAteer, James A
2012-12-01
Study Type--Therapy (case series) Level of Evidence 4. What's known on the subject? and What does the study add? In shock wave lithotripsy air pockets tend to get caught between the therapy head of the lithotripter and the skin of the patient. Defects at the coupling interface hinder the transmission of shock wave energy into the body, reducing the effectiveness of treatment. This in vitro study shows that ineffective coupling not only blocks the transmission of acoustic pulses but also alters the properties of shock waves involved in the mechanisms of stone breakage, with the effect dependent on the size and location of defects at the coupling interface. • To determine how the size and location of coupling defects caught between the therapy head of a lithotripter and the skin of a surrogate patient (i.e. the acoustic window of a test chamber) affect the features of shock waves responsible for stone breakage. • Model defects were placed in the coupling gel between the therapy head of a Dornier Compact-S electromagnetic lithotripter (Dornier MedTech, Kennesaw, GA, USA) and the Mylar (biaxially oriented polyethylene terephthalate) (DuPont Teijin Films, Chester, VA, USA) window of a water-filled coupling test system. • A fibre-optic probe hydrophone was used to measure acoustic pressures and map the lateral dimensions of the focal zone of the lithotripter. • The effect of coupling conditions on stone breakage was assessed using gypsum model stones. • Stone breakage decreased in proportion to the area of the coupling defect; a centrally located defect blocking only 18% of the transmission area reduced stone breakage by an average of almost 30%. • The effect on stone breakage was greater for defects located on-axis and decreased as the defect was moved laterally; an 18% defect located near the periphery of the coupling window (2.0 cm off-axis) reduced stone breakage by only ~15% compared to when coupling was completely unobstructed. • Defects centred within the coupling window acted to narrow the focal width of the lithotripter; an 8.2% defect reduced the focal width ~30% compared to no obstruction (4.4 mm vs 6.5 mm). • Coupling defects located slightly off centre disrupted the symmetry of the acoustic field; an 18% defect positioned 1.0 cm off-axis shifted the focus of maximum positive pressure ~1.0 mm laterally. • Defects on and off-axis imposed a significant reduction in the energy density of shock waves across the focal zone. • In addition to blocking the transmission of shock-wave energy, coupling defects also disrupt the properties of shock waves that play a role in stone breakage, including the focal width of the lithotripter and the symmetry of the acoustic field • The effect is dependent on the size and location of defects, with defects near the centre of the coupling window having the greatest effect. • These data emphasize the importance of eliminating air pockets from the coupling interface, particularly defects located near the centre of the coupling window. © 2012 BJU INTERNATIONAL.
Smart glass as the method of improving the energy efficiency of high-rise buildings
NASA Astrophysics Data System (ADS)
Gamayunova, Olga; Gumerova, Eliza; Miloradova, Nadezda
2018-03-01
A question that has to be answered in high-rise construction is the choice of glazing and its service-life conditions. The contemporary market offers several types of window units, for instance wooden, aluminum, PVC and combined models. Wooden and PVC windows have become the most widespread and compete closely with each other. More recently, design engineers have also turned to smart glass. In this article, the advantages and drawbacks of all types of windows are reviewed, and recommendations are given for choosing a window type in order to improve the energy efficiency of buildings.
Smith, Lauren H; Hargrove, Levi J; Lock, Blair A; Kuiken, Todd A
2011-04-01
Pattern recognition-based control of myoelectric prostheses has shown great promise in research environments, but has not been optimized for use in a clinical setting. To explore the relationship between classification error, controller delay, and real-time controllability, 13 able-bodied subjects were trained to operate a virtual upper-limb prosthesis using pattern recognition of electromyogram (EMG) signals. Classification error and controller delay were varied by training different classifiers with a variety of analysis window lengths ranging from 50 to 550 ms and either two or four EMG input channels. Offline analysis showed that classification error decreased with longer window lengths (p < 0.01). Real-time controllability was evaluated with the target achievement control (TAC) test, which prompted users to maneuver the virtual prosthesis into various target postures. The results indicated that user performance improved with lower classification error (p < 0.01) and was reduced with longer controller delay (p < 0.01), as determined by the window length. Therefore, both of these effects should be considered when choosing a window length; it may be beneficial to increase the window length if this results in a reduced classification error, despite the corresponding increase in controller delay. For the system employed in this study, the optimal window length was found to be between 150 and 250 ms, which is within acceptable controller delays for conventional multistate amplitude controllers.
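The window-length trade-off above can be illustrated with a toy experiment: longer analysis windows yield more stable EMG features (lower classification error) at the cost of added controller delay. The synthetic signals, features and LDA classifier below are illustrative stand-ins, not the study's setup.

```python
# Hedged sketch: classification error vs. analysis window length for synthetic 2-channel EMG.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def features(window):
    """Simple time-domain features per channel: mean absolute value and waveform length."""
    return np.concatenate([np.abs(window).mean(axis=0),
                           np.abs(np.diff(window, axis=0)).sum(axis=0)])

fs = 1000                                              # Hz, assumed sampling rate
rng = np.random.default_rng(6)
for win_ms in (50, 150, 250, 550):
    n = int(fs * win_ms / 1000)
    X, y = [], []
    for label, gain in enumerate([(1.0, 0.85), (0.85, 1.0)]):   # two "motions", two channels
        for _ in range(200):
            emg = rng.normal(0, gain, size=(n, 2))
            X.append(features(emg))
            y.append(label)
    acc = cross_val_score(LinearDiscriminantAnalysis(), np.array(X), np.array(y), cv=5).mean()
    # The delay figure is only a rough proxy (roughly half the window plus processing time).
    print(f"{win_ms:3d} ms window: error = {100 * (1 - acc):.1f}%, delay ~ {win_ms / 2:.0f} ms")
```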
Autofocus algorithm for curvilinear SAR imaging
NASA Astrophysics Data System (ADS)
Bleszynski, E.; Bleszynski, M.; Jaroszewicz, T.
2012-05-01
We describe an approach to autofocusing for large apertures on curved SAR trajectories. It is a phase-gradient-type method in which phase corrections compensating trajectory perturbations are estimated not directly from the image itself, but rather on the basis of "partial" SAR data (functions of the slow and fast times) reconstructed (by an appropriate forward-projection procedure) from windowed scene patches, of sizes comparable to distances between distinct targets or localized features of the scene. The resulting "partial data" can be shown to contain the same information on the phase perturbations as that in the original data, provided the frequencies of the perturbations do not exceed a quantity proportional to the patch size. The algorithm uses as input a sequence of conventional scene images based on moderate-size subapertures constituting the full aperture for which the phase corrections are to be determined. The subaperture images are formed with pixel sizes comparable to the range resolution which, for the optimal subaperture size, should also be approximately equal to the cross-range resolution. The method does not restrict the size or shape of the synthetic aperture and can be incorporated in the data collection process in persistent sensing scenarios. The algorithm has been tested on the publicly available set of GOTCHA data, intentionally corrupted by random-walk-type trajectory fluctuations (a possible model of errors caused by imprecise inertial navigation system readings) of maximum frequencies compatible with the selected patch size. It was able to efficiently remove image corruption for apertures of sizes up to 360 degrees.
NASA Astrophysics Data System (ADS)
Creusen, I. M.; Hazelhoff, L.; De With, P. H. N.
2013-10-01
In large-scale automatic traffic sign surveying systems, the primary computational effort is concentrated at the traffic sign detection stage. This paper focuses on reducing the computational load of the sliding-window object detection algorithm, which is employed for traffic sign detection. Sliding-window object detectors often use a linear SVM to classify the features in a window. In this case, the classification can be seen as a convolution of the feature maps with the SVM kernel. It is well known that convolution can be efficiently implemented in the frequency domain for kernels larger than a certain size. We show that by careful reordering of sliding-window operations, most of the frequency-domain transformations can be eliminated, leading to a substantial increase in efficiency. Additionally, we suggest using the overlap-add method to keep memory use within reasonable bounds. This allows us to keep all the transformed kernels in memory, thereby eliminating even more domain transformations, and allows all scales in a multiscale pyramid to be processed using the same set of transformed kernels. For a typical sliding-window implementation, we have found that detector execution performance improves by a factor of 5.3. As a bonus, many of the detector improvements from the literature, e.g. chi-squared kernel approximations, sub-class splitting algorithms, etc., can be more easily applied at a lower performance penalty because of the improved scalability.
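The frequency-domain trick above, scoring every window position of a linear SVM as a cross-correlation of the feature maps with the SVM weights via FFT-based (overlap-add) convolution, can be sketched as follows; the feature-map contents, channel count and kernel size are illustrative assumptions.

```python
# Hedged sketch: dense linear-SVM scoring of all window positions via overlap-add convolution.
import numpy as np
from scipy.signal import oaconvolve

def svm_score_map(feature_maps, svm_weights, bias):
    """feature_maps: (C, H, W); svm_weights: (C, h, w). Returns per-position SVM scores."""
    score = None
    for fmap, w in zip(feature_maps, svm_weights):
        # cross-correlation equals convolution with a flipped kernel
        s = oaconvolve(fmap, w[::-1, ::-1], mode="valid")
        score = s if score is None else score + s
    return score + bias

rng = np.random.default_rng(7)
feats = rng.normal(0, 1, size=(31, 120, 160))      # e.g. HOG-like channels of one pyramid scale
weights = rng.normal(0, 0.1, size=(31, 16, 8))     # linear SVM weights for a 16x8-cell model
scores = svm_score_map(feats, weights, bias=-1.0)
detections = np.argwhere(scores > 0)               # window positions classified as traffic signs
print(scores.shape, len(detections))
```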
Knapp, R.W.; Anderson, N.L.
1994-01-01
Data may be overprinted by steady-state cyclical noise (hum). Steady-state indicates that the noise is invariant with time; its attributes (frequency, amplitude, and phase) do not change with time. Hum recorded on seismic data usually is powerline noise and associated higher harmonics; leakage from full-waveform rectified cathodic protection devices, which contains the odd higher harmonics of powerline frequencies; or vibrational noise from mechanical devices. The fundamental frequency of powerline hum may be removed during data acquisition with the use of notch filters. Unfortunately, notch filters do not discriminate between signal and noise, attenuating both. They also distort adjacent frequencies by phase shifting. Finally, they attenuate only the fundamental mode of the powerline noise; higher harmonics and frequencies other than those of powerlines are not removed. Digital notch filters, applied during processing, have many of the same problems as analog filters applied in the field. The method described here removes hum of a particular frequency. Hum attributes are measured by discrete Fourier analysis, and the hum is canceled from the data by subtraction. Errors are slight and result from the presence of (random) noise in the window or from asynchrony of the hum and data sampling. Error is minimized by increasing window size or by resampling to a finer interval. Errors affect the degree of hum attenuation, not the signal. The residual is steady-state hum of the same frequency. © 1994.
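A minimal sketch of the cancellation idea above: estimate the amplitude and phase of a known hum frequency over the analysis window by a single-frequency Fourier (least-squares) fit, then subtract the fitted sinusoid. The sample rate, hum frequency and synthetic trace are assumptions, not the paper's data.

```python
# Hedged sketch: estimate and subtract a steady-state hum component from a trace.
import numpy as np

def remove_hum(trace, fs, f_hum):
    """Least-squares fit of a*cos + b*sin at f_hum over the whole window, then subtract it."""
    t = np.arange(len(trace)) / fs
    basis = np.column_stack([np.cos(2 * np.pi * f_hum * t), np.sin(2 * np.pi * f_hum * t)])
    coeffs, *_ = np.linalg.lstsq(basis, trace, rcond=None)
    return trace - basis @ coeffs

fs = 500.0                                            # samples per second (2 ms sampling, assumed)
t = np.arange(0, 2.0, 1 / fs)
signal = np.exp(-((t - 1.0) ** 2) / 0.002)            # a "reflection" wavelet (synthetic)
hum = 0.4 * np.sin(2 * np.pi * 60.0 * t + 0.7)        # 60 Hz powerline hum
clean = remove_hum(signal + hum, fs, 60.0)
print(f"residual hum rms: {np.sqrt(np.mean((clean - signal) ** 2)):.4f}")
```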
NASA Astrophysics Data System (ADS)
Hamuro, Yoshitomo
2017-03-01
A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
Teramoto, Wataru; Nakazaki, Takuyuki; Sekiyama, Kaoru; Mori, Shuji
2016-01-01
The present study investigated whether word width and length affect the optimal character size for reading of horizontally scrolling Japanese words, using reading speed as a measure. In Experiment 1, three Japanese words, each consisting of four Hiragana characters, sequentially scrolled on a display screen from right to left. Participants, all Japanese native speakers, were instructed to read the words aloud as accurately as possible, irrespective of their order within the sequence. To quantitatively measure their reading performance, we used a rapid serial visual presentation paradigm, where the scrolling rate was increased until the participants began to make mistakes. Thus, the highest scrolling rate at which the participants' performance exceeded an 88.9% correct rate was calculated for each character size (0.3°, 0.6°, 1.0°, and 3.0°) and scroll window size (5 or 10 character spaces). Results showed that reading performance was highest in the range of 0.6° to 1.0°, irrespective of the scroll window size. Experiment 2 investigated whether the optimal character size observed in Experiment 1 was applicable for any word width and word length (i.e., the number of characters in a word). Results showed that reading speeds were slower for longer than for shorter words and that a word width of 3.6° was optimal among the word lengths tested (three-, four-, and six-character words). Considering that character size varied depending on word width and word length in the present study, this means that the optimal character size can change with word width and word length in scrolling Japanese words. PMID:26909052
Using Parameters of Dynamic Pulse Function for 3d Modeling in LOD3 Based on Random Textures
NASA Astrophysics Data System (ADS)
Alizadehashrafi, B.
2015-12-01
The pulse function (PF) is a technique based on a procedural preprocessing system that generates a computerized virtual photo of the façade within a fixed-size square (Alizadehashrafi et al., 2009, Musliman et al., 2010). The Dynamic Pulse Function (DPF) is an enhanced version of the PF which creates the final photo in proportion to the real geometry. This avoids distortion when the computerized photo is projected onto the generated 3D model (Alizadehashrafi and Rahman, 2013). Producing the 3D model in LoD3 rather than LoD2 is the challenging issue addressed, and the final aim achieved, in this paper. In the DPF-based technique, the geometries of the windows and doors are saved in an XML file schema that has no connection with the 3D model in LoD2 and CityGML format. In this research the parameters of the Dynamic Pulse Function are utilized via the Ruby programming language in Trimble SketchUp to generate the windows and doors (exact position and depth) automatically in LoD3, based on the same concept as the DPF. The advantage of this technique is the automatic generation of a huge number of similar geometries, e.g. windows, by utilizing the parameters of the DPF along with defining entities and window layers. When the SKP file is converted to CityGML via FME software or CityGML plugins, the 3D model contains the semantic database about the entities and window layers, which can connect the CityGML to MySQL (Alizadehashrafi and Baig, 2014). The concept behind the DPF is to use logical operations to project the texture onto the background image in dynamic proportion to the real geometry. The projection is based on two dynamic pulses, one vertical and one horizontal, starting from the upper-left corner of the background wall in the down and right directions, respectively, in the image coordinate system. A logical one/zero at the intersection of the two pulses projects/does not project the texture onto the background image. It is possible to define a priority for each layer. For instance, the priority of the door layer can be higher than that of the window layer, which means that the window texture cannot be projected onto the door layer. Orthogonal and rectified perpendicular symmetric photos of the 3D objects, proportional to the real façade geometry, must be utilized to generate the output frame for the DPF. The DPF produces output image files of very high quality and small data size, with considerably smaller dimensions compared with the photorealistic texturing method. The disadvantage of the DPF is its preprocessing approach to generating the output image file, rather than online processing to generate the texture within a 3D environment such as CityGML. Furthermore, the result of the DPF can be utilized for a 3D model in LoD2 rather than LoD3. In the current work, random textures of the window layers are created based on the parameters of the DPF within the Ruby console of Trimble SketchUp to generate the deeper geometries of the windows and their exact positions on the façade automatically, thereby increasing the Level of Realism (LoR) (Scarpino, 2010). As the output frame of the DPF is proportional to the real geometry (the height and width of the façade), it is possible to query the XML database and convert the values to units such as meters automatically. In this technique, the perpendicular terrestrial photo of the façade is rectified by a projective transformation based on a frame that is in constrained proportion to the real geometry.
The rectified photos, which are not suitable for texturing but are necessary for measuring, can be resized in constrained proportion to the real geometry before the measuring process. The heights and widths of windows and doors, and the horizontal and vertical distances between windows measured from the upper-left corner of the photo, are the parameters that must be measured to run the program as a plugin in Trimble SketchUp. The system uses these parameters, together with texture file names and file paths, to create the façade semi-automatically. To avoid leaning geometry, the textures of windows, doors, etc. should be cropped and rectified from perpendicular photos so that they can be used in the program to create the whole façade along with its geometries. Texture enhancement, such as removing disturbing objects, exposure adjustment and left-right/up-down transformation, should be done in advance. In fact, the quality, small data size, scale and semantic database for each façade are the prominent advantages of this method.
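A hedged sketch of the pulse-function projection described above: a vertical and a horizontal pulse train, measured from the upper-left corner of the façade image, are combined by a logical AND so that the window texture is projected only at their intersections. All dimensions, counts and textures below are illustrative assumptions, not the paper's implementation (which runs as a Ruby plugin in SketchUp).

```python
# Hedged sketch: pulse-function texture projection onto a façade raster.
import numpy as np

def pulse(length, offset, size, spacing, count):
    """1-D pulse train: `count` pulses of `size` px, starting at `offset`, every `size+spacing` px."""
    p = np.zeros(length, dtype=bool)
    for k in range(count):
        start = offset + k * (size + spacing)
        p[start : start + size] = True
    return p

def project_facade(wall_px, win_px, rows, cols):
    """wall_px: (H, W, 3) background; win_px: (h, w, 3) window texture tiled into the mask."""
    H, W, _ = wall_px.shape
    mask = np.outer(rows, cols)                      # logical intersection of the two pulses
    facade = wall_px.copy()
    h, w, _ = win_px.shape
    tiled = np.tile(win_px, (H // h + 1, W // w + 1, 1))[:H, :W]
    facade[mask] = tiled[mask]
    return facade

wall = np.full((400, 600, 3), 180, dtype=np.uint8)            # plain wall colour
window_tex = np.zeros((80, 60, 3), dtype=np.uint8)            # dark glass placeholder
rows = pulse(400, offset=40, size=80, spacing=60, count=3)    # vertical pulse (row direction)
cols = pulse(600, offset=50, size=60, spacing=80, count=4)    # horizontal pulse (column direction)
print(project_facade(wall, window_tex, rows, cols).shape)
```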
The Golden Cage: Growing up in the Socialist Yugoslavia
ERIC Educational Resources Information Center
Marjanovic-Shane, Ana
2018-01-01
From the mid 1950s through roughly the 1980s, some or many children and youth of the Socialist Yugoslavia, especially those of us in Belgrade, the capital, lived in a curious, almost surreal "window" in space and time. This surreal window of space-time offered the children and youth of Yugoslavia unprecedented opportunities for…
ERIC Educational Resources Information Center
Hallberg, Rosemary; Hurley, Janet
2009-01-01
Now that green building has become more popular, school facility directors and architects are beginning to make different choices during construction. These choices may involve energy-efficient lighting, window size, building materials and design elements. Often, though, what happens during construction has unexpected consequences--unwanted…
Okamoto, Takumi; Koide, Tetsushi; Sugi, Koki; Shimizu, Tatsuya; Anh-Tuan Hoang; Tamaki, Toru; Raytchev, Bisser; Kaneda, Kazufumi; Kominami, Yoko; Yoshida, Shigeto; Mieno, Hiroshi; Tanaka, Shinji
2015-08-01
With the increase in colorectal cancer patients in recent years, the need for quantitative evaluation of colorectal cancer has grown, and a computer-aided diagnosis (CAD) system that supports doctors' diagnoses is essential. In this paper, a hardware design of the type identification module in a CAD system for colorectal endoscopic images with narrow band imaging (NBI) magnification is proposed for real-time processing of full high-definition images (1920 × 1080 pixels). A pyramid-style image segmentation with SVMs over multi-size scan windows is proposed for actual, complex colorectal endoscopic images; it can be implemented on an FPGA with a small circuit area while achieving high accuracy.
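The FPGA design itself is not reproduced in the abstract; the scikit-learn sketch below only illustrates the general idea of classifying multi-size scan windows with an SVM and voting the decisions into a per-pixel map. The window sizes, stride, toy features, and helper names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC  # the classifier is assumed to be trained beforehand

def patch_features(patch):
    # Toy features (per-channel mean and standard deviation); a real CAD
    # system would use texture descriptors tuned to NBI magnification images.
    p = patch.reshape(-1, patch.shape[-1]).astype(np.float64)
    return np.concatenate([p.mean(axis=0), p.std(axis=0)])

def scan_windows(image, clf, window_sizes=(120, 60, 30), stride_ratio=0.5):
    """Slide square windows of several sizes over the image (coarse to fine),
    classify each window with a binary SVM (e.g. lesion vs. non-lesion), and
    majority-vote the decisions into a per-pixel mask."""
    h, w = image.shape[:2]
    votes = np.zeros((h, w), dtype=np.float64)
    counts = np.zeros((h, w), dtype=np.float64)
    for size in window_sizes:
        stride = max(1, int(size * stride_ratio))
        for y in range(0, h - size + 1, stride):
            for x in range(0, w - size + 1, stride):
                patch = image[y:y + size, x:x + size]
                label = clf.predict(patch_features(patch)[None, :])[0]
                votes[y:y + size, x:x + size] += label
                counts[y:y + size, x:x + size] += 1
    return votes / np.maximum(counts, 1.0) > 0.5   # majority vote per pixel

# Usage (hypothetical training data with the same feature layout):
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# mask = scan_windows(frame, clf)
```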
Coalescent genealogy samplers: windows into population history
Kuhner, Mary K.
2016-01-01
Coalescent genealogy samplers attempt to estimate past qualities of a population, such as its size, growth rate, patterns of gene flow or time of divergence from another population, based on samples of molecular data. Genealogy samplers are increasingly popular because of their potential to disentangle complex population histories. In the last decade they have been widely applied to systems ranging from humans to viruses. Findings include detection of unexpected reproductive inequality in fish, new estimates of historical whale abundance, exoneration of humans for the prehistoric decline of bison and inference of a selective sweep on the human Y chromosome. This review summarizes available genealogy-sampler software, including data requirements and limitations on the use of each program. PMID:19101058
Bloem, Robbert; Garrett-Roe, Sean; Strzalka, Halina; Hamm, Peter; Donaldson, Paul
2010-12-20
We demonstrate how quasi-phase-cycling achieved by sub-cycle delay modulation can be used to replace optical chopping in a box-CARS 2D IR experiment in order to enhance the signal size, and, at the same time, completely eliminate any scattering contamination. Two optical devices are described that can be used for this purpose, a wobbling Brewster window and a photoelastic modulator. They are simple to construct, easy to incorporate into any existing 2D IR setup, and have attractive features such as a high optical throughput and a fast modulation frequency needed to phase cycle on a shot-to-shot basis.
Improvements to the modal holographic wavefront sensor.
Kong, Fanpeng; Lambert, Andrew
2016-05-01
The Zernike coefficients of a light wavefront can be calculated directly by intensity ratios of pairs of spots in the reconstructed image plane of a holographic wavefront sensor (HWFS). However, the response curve of the HWFS heavily depends on the position and size of the detector for each spot and the distortions introduced by other aberrations. In this paper, we propose a method to measure the intensity of each spot by setting a threshold to select effective pixels and using the weighted average intensity within a selected window. Compared with using the integral intensity over a small window for each spot, we show through a numerical simulation that the proposed method reduces the dependency of the HWFS's response curve on the selection of the detector window. We also recorded a HWFS on a holographic plate using a blue laser and demonstrated its capability to detect the strength of encoded Zernike terms in an aberrated beam.
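The abstract does not spell out the exact weighting scheme, so the Python sketch below shows one plausible reading: pixels are selected by a threshold set relative to the window peak, and the spot intensity is their intensity-weighted average rather than the integral over the whole window. The window size, threshold value, and weighting choice are all illustrative assumptions.

```python
import numpy as np

def spot_intensity(image, center, window=15, rel_threshold=0.2):
    """Estimate a spot's intensity inside a square window around `center`:
    keep only the pixels above a threshold relative to the window peak and
    return their intensity-weighted average (assumes the window lies fully
    inside the image)."""
    cy, cx = center
    half = window // 2
    roi = image[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    effective = roi[roi >= rel_threshold * roi.max()]   # threshold-selected pixels
    weights = effective                                  # brighter pixels weigh more
    return float(np.sum(weights * effective) / np.sum(weights))

# A Zernike coefficient is then read out from a pair of spots, e.g. via the
# normalized ratio (I_plus - I_minus) / (I_plus + I_minus); the exact mapping
# is given by the sensor's response curve.
```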
Joint histogram-based cost aggregation for stereo matching.
Min, Dongbo; Lu, Jiangbo; Do, Minh N
2013-10-01
This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of the cost aggregation in stereo matching significantly. Differently from previous methods which have tried to reduce the complexity in terms of the size of an image and a matching window, our approach focuses on reducing the computational redundancy that exists among the search range, caused by a repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
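As a point of reference for the redundancy the paper targets, the following Python sketch shows the conventional baseline: an absolute-difference cost volume with a box filter applied once per disparity hypothesis. It is not the joint-histogram method itself; the image format (rectified grayscale float arrays) and all parameter values are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def aggregate_costs(left, right, max_disp=64, window=9):
    """Baseline window-based aggregation: build an absolute-difference cost
    volume and box-filter each disparity slice.  The filtering is repeated
    once per hypothesis, which is exactly the redundancy across the search
    range that a joint-histogram reformulation seeks to remove."""
    h, w = left.shape
    volume = np.full((max_disp, h, w), 255.0)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, :w - d])
        volume[d, :, d:] = uniform_filter(diff, size=window)
    return np.argmin(volume, axis=0)       # winner-take-all disparity map
```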
Iconic Meaning in Music: An Event-Related Potential Study.
Cai, Liman; Huang, Ping; Luo, Qiuling; Huang, Hong; Mo, Lei
2015-01-01
Although there has been extensive research on the processing of the emotional meaning of music, little is known about other aspects of listeners' experience of music. The present study investigated the neural correlates of the iconic meaning of music. Event-related potentials (ERP) were recorded while a group of 20 music majors and a group of 20 non-music majors performed a lexical decision task in the context of implicit musical iconic meaning priming. ERP analysis revealed a significant N400 effect of congruency in time window 260-510 ms following the onset of the target word only in the group of music majors. Time-course analysis using 50 ms windows indicated significant N400 effects both within the time window 410-460 ms and 460-510 ms for music majors, whereas only a partial N400 effect during time window 410-460 ms was observed for non-music majors. There was also a trend for the N400 effects in the music major group to be stronger than those in the non-major group in the sub-windows of 310-360 ms and 410-460 ms. Especially in the sub-window of 410-460 ms, the topographical map of the difference waveforms between congruent and incongruent conditions revealed different N400 distribution between groups; the effect was concentrated in bilateral frontal areas for music majors, but in central-parietal areas for non-music majors. These results imply probable neural mechanism differences underlying automatic iconic meaning priming of music. Our findings suggest that processing of the iconic meaning of music can be accomplished automatically and that musical training may facilitate the understanding of the iconic meaning of music.
Finding minimum spanning trees more efficiently for tile-based phase unwrapping
NASA Astrophysics Data System (ADS)
Sawaf, Firas; Tatam, Ralph P.
2006-06-01
The tile-based phase unwrapping method employs an algorithm for finding the minimum spanning tree (MST) in each tile. We first examine the properties of a tile's representation from a graph theory viewpoint, observing that it is possible to make use of a more efficient class of MST algorithms. We then describe a novel linear time algorithm which reduces the size of the MST problem by half at the least, and solves it completely at best. We also show how this algorithm can be applied to a tile using a sliding window technique. Finally, we show how the reduction algorithm can be combined with any other standard MST algorithm to achieve a more efficient hybrid, using Prim's algorithm for empirical comparison and noting that the reduction algorithm takes only 0.1% of the time taken by the overall hybrid.
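For orientation, a compact Python version of Prim's algorithm, the standard reference the authors combine with their reduction step, is sketched below. The graph representation and example values are illustrative; the paper's novel reduction algorithm is not reproduced here.

```python
import heapq

def prim_mst(n, adj):
    """Minimum spanning tree of a connected undirected graph via Prim's
    algorithm.  `adj[u]` is a list of (weight, v) edges; returns MST edges
    as (parent, node, weight) tuples."""
    visited = [False] * n
    mst = []
    heap = [(0, 0, -1)]                     # (edge weight, node, parent)
    while heap:
        w, u, parent = heapq.heappop(heap)
        if visited[u]:
            continue
        visited[u] = True
        if parent != -1:
            mst.append((parent, u, w))
        for weight, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (weight, v, u))
    return mst

# Tiny example graph with 4 nodes:
adj = {
    0: [(1, 1), (4, 2)],
    1: [(1, 0), (2, 2), (6, 3)],
    2: [(4, 0), (2, 1), (3, 3)],
    3: [(6, 1), (3, 2)],
}
print(prim_mst(4, adj))   # -> [(0, 1, 1), (1, 2, 2), (2, 3, 3)]
```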
Thermal damage study of beryllium windows used as vacuum barriers in synchrotron radiation beamlines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holdener, F.R.; Johnson, G.L.; Karpenko, V.P.
An experimental study to investigate thermal-induced damage to SSRL-designed beryllium foil windows was performed at LLNL's Laser Welding Research Facility. The primary goal of this study was to determine the threshold at which thermal-stress-induced damage occurs in these commonly used vacuum barriers. An Nd:YAG pulsed laser with cylindrical optics and a carefully designed test cell provided a test environment that closely resembles the actual beamline conditions at SSRL. Tests performed on two beryllium window geometries, with different vertical aperture dimensions but equal foil thicknesses of 0.254 mm, resulted in two focused total-power thresholds at which incipient damage was determined. For a beam spot size similar to that of the Beamline-X Wiggler Line, onset of surface damage for a 5-mm by 25-mm aperture window was observed at 170 W after 174,000 laser pulses (1.2-ms pulse at 100 pps). A second window with double the vertical aperture dimension (10 mm by 25 mm) was observed to have surface cracking after 180,000 laser pulses with 85 W impinging its front surface. It failed after approximately 1,000,000 pulses. Another window of the same type (10 mm by 25 mm) received 2,160,000 laser pulses at 74.4 W, and subsequent metallographic sectioning revealed no signs of through-thickness damage. Comparison of windows with equal foil thicknesses and aperture dimensions has effectively identified the heat flux limit for incipient failure. The data show that halving the aperture's vertical dimension allows doubling the total incident power for equivalent onsets of thermal-induced damage.
NASA Technical Reports Server (NTRS)
Yuen, Vincent K.
1989-01-01
The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation, a rotating optics system is used to provide the widest possible field of view. To provide video signals to as many viewpoints (windows and CCTVs) as possible with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.
Split delivery vehicle routing problem with time windows: a case study
NASA Astrophysics Data System (ADS)
Latiffianti, E.; Siswanto, N.; Firmandani, R. A.
2018-04-01
This paper implements an extension of the VRP, the split delivery vehicle routing problem (SDVRP) with time windows, in a case study involving pickups and deliveries of workers from several points of origin to several destinations. Each origin represents a bus stop, and each destination represents either a site or an office location. An integer linear programming formulation of the SDVRP is presented. The solution was generated in three stages: defining the starting points, assigning buses, and solving the SDVRP with time windows using an exact method. Although the overall computation time was relatively long, the results indicated that the produced solution was better than the routing and scheduling the firm had been using. The solution also reduced fuel cost by 9%, a saving obtained from the shorter total distance travelled by the shuttle buses.
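The abstract does not give the paper's exact model; for readers unfamiliar with the problem class, a generic SDVRP-with-time-windows formulation along standard textbook lines is sketched below. The notation and constraint forms are assumptions, not necessarily the authors' formulation.

```latex
% x_{ijk}: vehicle k travels arc (i,j); y_{ik}: quantity delivered to customer i
% by vehicle k; s_{ik}: service start time; d_i: demand; Q: capacity;
% [e_i, l_i]: time window; M: a large constant.
\begin{align*}
\min\; & \sum_{k \in K} \sum_{(i,j) \in A} c_{ij}\, x_{ijk} \\
\text{s.t.}\;
 & \sum_{k \in K} y_{ik} = d_i                   & \forall i \in N \quad & \text{(demand may be split)}\\
 & \sum_{i \in N} y_{ik} \le Q                   & \forall k \in K \quad & \text{(vehicle capacity)}\\
 & y_{ik} \le d_i \sum_{j} x_{jik}               & \forall i,\, k  \quad & \text{(deliver only if visited)}\\
 & s_{ik} + t_{ij} - M(1 - x_{ijk}) \le s_{jk}   & \forall (i,j),\, k \quad & \text{(travel-time propagation)}\\
 & e_i \le s_{ik} \le l_i                        & \forall i,\, k  \quad & \text{(time windows)}\\
 & x_{ijk} \in \{0,1\}, \quad y_{ik} \ge 0.
\end{align*}
```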