Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution
NASA Astrophysics Data System (ADS)
Wang, Jianming; Liu, Lihua; Yu, Hua
2015-12-01
The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for warhead discrimination during flight. In order to extract the micro-Doppler features of ballistic missile targets, time-frequency analysis is employed to process the micro-Doppler-modulated time-varying radar signal. Images of the time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristics very well. However, there are many time-frequency analysis methods for generating TFD images, including the short-time Fourier transform (STFT), the Wigner distribution (WD), and the Cohen class distributions. Against the background of ballistic missile defence, this paper aims to work out an effective time-frequency analysis method for discriminating the ballistic missile warhead from the decoys.
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a remote teleoperation case study that assesses the effect of communication delays and of the allocation of robot control functions on control system hardware requirements.
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
NASA Astrophysics Data System (ADS)
Ke, Jyh-Bin; Lee, Wen-Chiung; Wang, Kuo-Hsiung
2007-07-01
This paper presents the reliability and sensitivity analysis of a system with M primary units, W warm standby units, and R unreliable service stations, where warm standby units switching to the primary state might fail. Failure times of primary and warm standby units are assumed to have exponential distributions, and service times of the failed units are exponentially distributed. In addition, breakdown times and repair times of the service stations also follow exponential distributions. Expressions for the system reliability, R_Y(t), and the mean time to system failure, MTTF, are derived. Sensitivity analysis and relative sensitivity analysis of the system reliability and the mean time to failure with respect to the system parameters are also investigated.
EMD-WVD time-frequency distribution for analysis of multi-component signals
NASA Astrophysics Data System (ADS)
Chai, Yunzi; Zhang, Xudong
2016-10-01
Time-frequency distribution (TFD) is a two-dimensional function that indicates the time-varying frequency content of one-dimensional signals. The Wigner-Ville distribution (WVD) is an important and effective time-frequency analysis method that can efficiently show the characteristics of a mono-component signal. However, a major drawback is the extra cross-terms that appear when multi-component signals are analyzed by the WVD. In order to eliminate the cross-terms, we first decompose signals into single-frequency components - intrinsic mode functions (IMFs) - by using empirical mode decomposition (EMD), and then use the WVD to analyze each IMF. In this paper, we define this new time-frequency distribution as EMD-WVD. Experiment results show that the proposed time-frequency method solves the cross-term problem effectively and improves the accuracy of WVD time-frequency analysis.
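Illustrative only (not the authors' implementation): a direct discrete WVD can be built from the instantaneous autocorrelation. For a single complex tone the energy concentrates on one frequency line, while adding a second tone introduces the spurious mid-frequency cross-term that the EMD preprocessing is designed to avoid.

```python
import numpy as np

def wvd(x):
    """Discrete Wigner-Ville distribution of a complex (analytic) signal.

    W[n] is the DFT over the lag k of x[n+k] * conj(x[n-k]).  With this
    common discrete convention, a tone at normalized frequency f0 peaks
    at frequency bin 2 * f0 * N.
    """
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)            # largest symmetric lag at time n
        r = np.zeros(N, dtype=complex)      # instantaneous autocorrelation
        for k in range(-kmax, kmax + 1):
            r[k % N] = x[n + k] * np.conj(x[n - k])
        W[n] = np.fft.fft(r).real           # real because r[-k] = conj(r[k])
    return W

N = 128
n = np.arange(N)
tone = np.exp(2j * np.pi * 0.125 * n)       # mono-component: one clean ridge
W = wvd(tone)                               # peak at bin 2 * 0.125 * 128 = 32
# For tone1 + tone2, W would also show energy midway between the two
# auto-terms: the cross-term that motivates decomposing with EMD first.
```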
Visualization and Analysis for Near-Real-Time Decision Making in Distributed Workflows
Pugmire, David; Kress, James; Choi, Jong; ...
2016-08-04
Data-driven science is becoming increasingly common and complex, and is placing tremendous stresses on visualization and analysis frameworks. Data sources producing 10 GB per second (and more) are becoming increasingly commonplace in simulation, sensor, and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query, and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware, and workflow systems. This paper discusses initial research into visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions about large volumes of time-varying data.
An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests
ERIC Educational Resources Information Center
Attali, Yigal
2010-01-01
Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
2003-11-01
Distributions: In contrast to the linear time-frequency transforms such as the short-time Fourier transform, the Wigner-Ville distribution (WVD) is... Results of nine TFDs: (a) Wigner-Ville distribution, (b) Born-Jordan distribution, (c) Choi-Williams distribution, (d) bilinear TFD... are applied in the Wigner-Ville class of time-frequency transforms, and the reassignment methods, which are applied to any time-frequency distribution
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
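The maximum-entropy step can be made explicit (a standard derivation, not reproduced from the paper): among densities on $[0,\infty)$ with a fixed mean $\mu$, introducing Lagrange multipliers for the normalization and mean constraints and maximizing the Shannon entropy gives the exponential law:

```latex
% Maximize Shannon entropy subject to normalization and a fixed mean:
\max_{f}\; -\int_0^\infty f(x)\,\ln f(x)\,dx
\quad\text{s.t.}\quad
\int_0^\infty f(x)\,dx = 1, \qquad \int_0^\infty x\,f(x)\,dx = \mu .
% Stationarity of the Lagrangian gives \ln f(x) = -1 - \lambda_0 - \lambda_1 x, so
f(x) = \frac{1}{\mu}\,e^{-x/\mu}.
```

Constraining additional moments in the same way yields the heavier-tailed families (e.g. the Weibull form invoked above for hop distances).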
2014-10-16
Time-frequency analysis, short-time Fourier transform, Wigner-Ville distribution, Fourier-Bessel transform, fractional Fourier transform. I. INTRODUCTION: The most widely used time-frequency transforms are the short-time Fourier transform (STFT) and the Wigner-Ville distribution (WVD). In the STFT, time and frequency resolutions are limited by the size of the window function used in calculating the STFT. For mono-component signals, the WVD gives the best time and frequency
Spatiotemporal Analysis of the Ebola Hemorrhagic Fever in West Africa in 2014
NASA Astrophysics Data System (ADS)
Xu, M.; Cao, C. X.; Guo, H. F.
2017-09-01
Ebola hemorrhagic fever (EHF) is an acute hemorrhagic disease caused by the Ebola virus, which is highly contagious. This paper aimed to explore the possible gathering areas of EHF cases in West Africa in 2014 and to identify endemic areas and their tendency by means of space-time analysis. We mapped the distribution of EHF incidence and explored statistically significant spatial, temporal, and space-time disease clusters. We utilized hotspot analysis to find the spatial clustering pattern on the basis of the actual outbreak cases. Spatial-temporal cluster analysis was used to analyze the spatial and temporal distribution of the disease and to examine whether its distribution is statistically significant. Local clusters were investigated using Kulldorff's scan statistic approach. The results reveal that the epidemic mainly gathered in the western part of Africa near the North Atlantic, with an obvious regional distribution. For the current epidemic, we found areas with a high incidence of EVD by means of spatial cluster analysis.
Radar Imaging Using The Wigner-Ville Distribution
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Kenny, Owen P.; Whitehouse, Harper J.
1989-12-01
The need for analysis of time-varying signals has led to the formulation of a class of joint time-frequency distributions (TFDs). One of these TFDs, the Wigner-Ville distribution (WVD), has useful properties which can be applied to radar imaging. This paper first discusses the radar equation in terms of the time-frequency representation of the signal received from a radar system. It then presents a method of tomographic reconstruction for time-frequency images to estimate the scattering function of the aircraft. An optical architecture is then discussed for the real-time implementation of the analysis method based on the WVD.
Roccato, Anna; Uyttendaele, Mieke; Membré, Jeanne-Marie
2017-06-01
In the framework of food safety, when mimicking the consumer phase, the storage time and temperature used are mainly considered as single point estimates instead of probability distributions. This single-point approach does not take into account the variability within a population and could lead to an overestimation of the parameters. Therefore, the aim of this study was to analyse data on domestic refrigerator temperatures and storage times of chilled food in European countries in order to draw general rules which could be used either in shelf-life testing or risk assessment. In relation to domestic refrigerator temperatures, 15 studies provided pertinent data. Twelve studies presented normal distributions, according to the authors or from the data fitted into distributions. Analysis of the temperature distributions revealed that the countries were separated into two groups: northern European countries and southern European countries. The overall variability of European domestic refrigerators is described by a normal distribution: N(7.0, 2.7) °C for the southern countries and N(6.1, 2.8) °C for the northern countries. Concerning storage times, seven papers were pertinent. Analysis indicated that the storage time was likely to end in the first days or weeks (depending on the product use-by date) after purchase. Data fitting showed the exponential distribution was the most appropriate distribution to describe the time that food spent at the consumer's place. The storage time was described by an exponential distribution corresponding to the use-by date period divided by 4. In conclusion, knowing that collecting data is costly in time and money, in the absence of data, and at least for the European market and for refrigerated products, building a domestic refrigerator temperature distribution using a Normal law and a time-to-consumption distribution using an Exponential law would be appropriate. Copyright © 2017 Elsevier Ltd. All rights reserved.
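Following the study's recommendation, a consumer-phase exposure model could sample temperature from N(7.0, 2.7) °C (southern countries) and storage time from an exponential with mean equal to the use-by period divided by 4. A minimal sketch, in which the 10-day shelf life is an assumed example value:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Domestic refrigerator temperature, southern Europe: Normal(7.0, 2.7) degC
temps = rng.normal(loc=7.0, scale=2.7, size=n)

# Storage time: exponential with mean = use-by period / 4
use_by_days = 10.0                        # assumed example shelf life
times = rng.exponential(scale=use_by_days / 4.0, size=n)

share_above_8 = (temps > 8.0).mean()      # fraction of fridges above 8 degC
```

Sampling both inputs as distributions rather than point estimates is exactly what avoids the overestimation the authors warn about.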
ERIC Educational Resources Information Center
Doerann-George, Judith
The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations; normal,…
NASA Astrophysics Data System (ADS)
Sun, Wenxiu; Liu, Guoqiang; Xia, Hui; Xia, Zhengwu
2018-03-01
Accurate acquisition of the detection signal travel time plays a very important role in cross-hole tomography. An experimental platform of an aluminum plate under a perpendicular magnetic field is established, and the bilinear time-frequency analysis methods, the Wigner-Ville distribution (WVD) and the pseudo-Wigner-Ville distribution (PWVD), are applied to analyse the Lamb wave signals detected by an electromagnetic acoustic transducer (EMAT). By extracting the component of the time-frequency spectrum at the excitation frequency, the travel time information can be obtained. In comparison with traditional linear time-frequency analysis methods such as the short-time Fourier transform (STFT), the bilinear time-frequency analysis method PWVD is more appropriate for extracting travel time and recognizing patterns of Lamb waves.
Hu, Y; Luk, K D; Lu, W W; Holmes, A; Leong, J C
2001-05-01
Spinal somatosensory evoked potential (SSEP) has been employed to monitor the integrity of the spinal cord during surgery. To detect both temporal and spectral changes in SSEP waveforms, an investigation of the application of time-frequency analysis (TFA) techniques was conducted. SSEP signals from 30 scoliosis patients were analysed using different techniques: short-time Fourier transform (STFT), Wigner-Ville distribution (WVD), Choi-Williams distribution (CWD), cone-shaped distribution (CSD) and adaptive spectrogram (ADS). The time-frequency distributions (TFDs) computed using these methods were assessed and compared with each other. WVD, ADS, CSD and CWD showed better resolution than STFT. Comparing normalised peak widths, CSD showed the sharpest peak width (0.13+/-0.1) in the frequency dimension, and a mean peak width of 0.70+/-0.12 in the time dimension. Both WVD and CWD produced cross-term interference, distorting the TFA distribution, but this was not seen with CSD and ADS. CSD appeared to give a lower mean peak power bias (10.3%+/-6.2%) than ADS (41.8%+/-19.6%). Application of the CSD algorithm showed both good resolution and accurate spectrograms, and it is therefore recommended as the most appropriate TFA technique for the analysis of SSEP signals.
Discrete Deterministic and Stochastic Petri Nets
NASA Technical Reports Server (NTRS)
Zijal, Robert; Ciardo, Gianfranco
1996-01-01
Petri nets augmented with timing specifications have gained wide acceptance in the area of performance and reliability evaluation of complex systems exhibiting concurrency, synchronization, and conflicts. The state space of time-extended Petri nets is mapped onto its basic underlying stochastic process, which can be shown to be Markovian under the assumption of exponentially distributed firing times. The integration of exponentially and non-exponentially distributed timing is still one of the major problems for the analysis and was first attacked for continuous-time Petri nets at the cost of structural or analytical restrictions. We propose a discrete deterministic and stochastic Petri net (DDSPN) formalism with no imposed structural or analytical restrictions, where transitions can fire either in zero time or according to arbitrary firing times that can be represented as the time to absorption in a finite absorbing discrete-time Markov chain (DTMC). Exponentially distributed firing times are then approximated arbitrarily well by geometric distributions; deterministic firing times are a special case of the geometric distribution. The underlying stochastic process of a DDSPN is then also a DTMC, from which the transient and stationary solutions can be obtained by standard techniques. A comprehensive algorithm and some state space reduction techniques for the analysis of DDSPNs are presented, comprising the automatic detection of conflicts and confusions, which removes a major obstacle for the analysis of discrete time models.
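The geometric approximation mentioned above is easy to check numerically (an illustrative sketch, not the DDSPN solver itself): discretizing an exponential(λ) delay with step Δ gives a geometric firing time with per-step success probability p = 1 − e^(−λΔ), whose mean Δ/p converges to 1/λ as Δ shrinks.

```python
import math

def geometric_mean_time(lam, delta):
    """Mean firing time when an exponential(lam) delay is discretized
    as a geometric distribution with time step delta."""
    p = 1.0 - math.exp(-lam * delta)   # probability of firing in one step
    return delta / p                    # geometric mean (1/p steps) in time units

lam = 2.0                               # exponential rate, true mean 1/lam = 0.5
for delta in (0.1, 0.01, 0.001):
    approx = geometric_mean_time(lam, delta)
    # approx decreases toward 0.5 as the discretization step shrinks
```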
Feature Extraction for Bearing Prognostics and Health Management (PHM) - A Survey (Preprint)
2008-05-01
Envelope analysis • Cepstrum analysis • Higher-order spectrum • Short-time Fourier transform (STFT) • Wigner-Ville distribution (WVD) • Empirical mode... techniques are the short-time Fourier transform (STFT), the Wigner-Ville distribution, and the wavelet transform. In this paper we categorize wavelets... diagnosis has been shown in many publications, for example, [22]. b) Wigner-Ville distribution: The afore-mentioned STFT is conceptually simple. However
Time-Frequency Domain Analysis of Helicopter Transmission Vibration
1991-08-01
Wigner-Ville distribution (WVD) have been reported, including speech... FREQUENCY DISTRIBUTIONS. 6. THE WIGNER-VILLE DISTRIBUTION: 6.1 History; 6.2 Definition; 6.3 Discrete-Time/Frequency Wigner-Ville Distribution... signals are examined to indicate how various forms of modulation are portrayed using the Wigner-Ville distribution. Practical examples: A signal is
A time-frequency approach for the analysis of normal and arrhythmia cardiac signals.
Mahmoud, Seedahmed S; Fang, Qiang; Davidović, Dragomir M; Cosic, Irena
2006-01-01
Previously, electrocardiogram (ECG) signals have been analyzed in either a time-indexed or spectral form. The reality is that the ECG and all other biological signals belong to the family of multicomponent nonstationary signals, and for this reason the use of time-frequency analysis is unavoidable for these signals. The Husimi and Wigner distributions are normally used in quantum mechanics for phase-space representations of the wavefunction. In this paper, we introduce the Husimi distribution (HD) to analyze normal and abnormal ECG signals in the time-frequency domain. The abnormal cardiac signal was taken from a patient with supraventricular arrhythmia. Simulation results show that the HD performs well in the analysis of ECG signals compared with the Wigner-Ville distribution (WVD).
Dynamic Singularity Spectrum Distribution of Sea Clutter
NASA Astrophysics Data System (ADS)
Xiong, Gang; Yu, Wenxian; Zhang, Shuning
2015-12-01
Fractal and multifractal theory have provided new approaches for radar signal processing and target detection against an ocean background. However, the related research has mainly focused on the fractal dimension or multifractal spectrum (MFS) of sea clutter. In this paper, a new dynamic singularity analysis method for sea clutter using the MFS distribution is developed, based on moving detrending analysis (DMA-MFSD). Theoretically, we introduce time information by using the cyclic autocorrelation of sea clutter. For the transient correlation series, the instantaneous singularity spectrum based on the multifractal detrending moving analysis (MF-DMA) algorithm is calculated, and the dynamic singularity spectrum distribution of sea clutter is acquired. In addition, we analyze the time-varying singularity exponent ranges and the maximum position function in the DMA-MFSD of sea clutter. For real sea clutter data in sea state III, we analyze the dynamic singularity spectrum distribution and conclude that radar sea clutter is non-stationary, has a time-varying scale characteristic, and exhibits a time-varying singularity spectrum distribution under the proposed DMA-MFSD method. The DMA-MFSD will also provide a reference for nonlinear dynamics and multifractal signal processing.
NASA Astrophysics Data System (ADS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart; LHCb Collaboration
2011-12-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables, as well as a special case of event history analysis and multistate demography. The ideas of the hazard function and failure time analysis, however, have not been properly introduced to, nor commonly discussed by, demographers in Japan. The concept of the hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of the exponential distribution, the normal distribution, and proportional hazards models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
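As a sketch of the idea (ours, not from the paper): the hazard h(t) = f(t)/S(t) is the continuous analogue of the life table's force of mortality. For the exponential case it is a constant, while a Weibull hazard rises or falls with age depending on its shape parameter.

```python
import math

def exp_hazard(t, lam):
    """Hazard of an exponential lifetime: density over survival."""
    f = lam * math.exp(-lam * t)   # density f(t)
    s = math.exp(-lam * t)         # survival function S(t)
    return f / s                   # = lam for every t: the memoryless property

def weibull_hazard(t, shape, scale):
    """Weibull hazard: increasing for shape > 1, decreasing for shape < 1."""
    return (shape / scale) * (t / scale) ** (shape - 1)
```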
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry
The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase. Tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort needed to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors in metrics such as the highest and lowest voltage occurring on the feeder, the number of voltage regulator tap operations, and the total losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
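A toy version of the variable-time-step idea (assumed logic for illustration, not the paper's solver): trigger a full power-flow solve only when the load profile has drifted more than a tolerance since the last solved point, and reuse the previous solution otherwise.

```python
import numpy as np

def variable_step_solve(profile, tol=0.01):
    """Return the indices at which a full power-flow solve is triggered.

    A solve is (re)run whenever the load has drifted more than `tol`
    (per-unit) from the last solved point; all other time steps reuse
    the previous solution.
    """
    solved = [0]
    last = profile[0]
    for i in range(1, len(profile)):
        if abs(profile[i] - last) > tol:
            solved.append(i)
            last = profile[i]
    return solved

# Synthetic smooth daily load shape at 1-second resolution (86,400 steps)
t = np.linspace(0.0, 2.0 * np.pi, 86_400)
profile = 0.7 + 0.3 * np.sin(t)
solves = variable_step_solve(profile, tol=0.01)
reduction = 1.0 - len(solves) / len(profile)   # fraction of solves avoided
```

On smooth profiles almost all 1-second steps are skipped; the real engineering difficulty, as the report above notes, is bounding the error this introduces in controller-driven metrics such as tap operations.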
A Study of ATLAS Grid Performance for Distributed Analysis
NASA Astrophysics Data System (ADS)
Panitkin, Sergey; Fine, Valery; Wenaus, Torre
2012-12-01
In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.
Weblog patterns and human dynamics with decreasing interest
NASA Astrophysics Data System (ADS)
Guo, J.-L.; Fan, C.; Guo, Z.-H.
2011-06-01
To describe the phenomenon that people's interest in an activity is high at the beginning and gradually decreases until reaching a balance, a model of interest attenuation is proposed to reflect the fact that people's interest becomes more stable after a long time. We give a rigorous analysis of this model using non-homogeneous Poisson processes. Our analysis indicates that the interarrival-time distribution mixes exponential and power-law features: it is a power law with an exponential cutoff. We then collect blogs on ScienceNet.cn and carry out an empirical study of the interarrival time distribution. The empirical results agree well with the theoretical analysis, obeying a special power law with an exponential cutoff, that is, a special kind of Gamma distribution. These empirical results verify the model and provide evidence for a new class of phenomena in human dynamics: besides power-law distributions, other distributions arise in human dynamics. These findings demonstrate the variety of human behavioral dynamics.
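The interarrival law found empirically above, a power law with an exponential cutoff, is a Gamma density proportional to x^(α−1) e^(−x/β); as an illustration (parameter values are ours, not the paper's), it can be sampled and checked against its moments:

```python
import numpy as np

rng = np.random.default_rng(42)
shape, scale = 0.6, 5.0            # shape < 1: power-law-like head,
                                   # exponential cutoff in the tail
waits = rng.gamma(shape, scale, size=200_000)

# Gamma moments: mean = shape * scale, variance = shape * scale**2
mean_wait = waits.mean()           # should be near 0.6 * 5.0 = 3.0
```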
Analysis of thrips distribution: application of spatial statistics and Kriging
John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard
1991-01-01
Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
A Spectral Analysis of Discrete-Time Quantum Walks Related to the Birth and Death Chains
NASA Astrophysics Data System (ADS)
Ho, Choon-Lin; Ide, Yusuke; Konno, Norio; Segawa, Etsuo; Takumi, Kentaro
2018-04-01
In this paper, we consider a spectral analysis of discrete-time quantum walks on the path. For isospectral coin cases, we show that the time-averaged distribution and the stationary distributions of the quantum walks are described by the pair of eigenvalues of the coins, as well as by the eigenvalues and eigenvectors of the corresponding random walks, which are usually referred to as birth and death chains. As an example of the results, we derive the time-averaged distribution of the so-called Szegedy walk, which is related to the Ehrenfest model. It is represented by the Krawtchouk polynomials, which are the eigenvectors of the model, and it includes the arcsine law.
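A minimal coined discrete-time walk (a Hadamard coin on a cycle, for illustration only; not Szegedy's walk itself) shows the kind of unitary evolution whose time-averaged distribution is being analyzed. Because each step is unitary, the position distribution remains normalized at every step:

```python
import numpy as np

def hadamard_walk(n_sites, steps, start):
    """Coined discrete-time quantum walk on a cycle of n_sites."""
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard coin
    psi = np.zeros((n_sites, 2), dtype=complex)
    psi[start, 0] = 1.0                              # localized initial state
    for _ in range(steps):
        psi = psi @ H.T                              # apply coin at each site
        # Conditional shift: coin state 0 moves left, coin state 1 moves right
        psi = np.stack([np.roll(psi[:, 0], -1),
                        np.roll(psi[:, 1], +1)], axis=1)
    return (np.abs(psi) ** 2).sum(axis=1)            # position distribution

p = hadamard_walk(n_sites=64, steps=30, start=32)
```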
Time series behaviour of the number of Air Asia passengers: A distributional approach
NASA Astrophysics Data System (ADS)
Asrah, Norhaidah Mohd; Djauhari, Maman Abdurachman
2013-09-01
The common practice in time series analysis is to fit a model and then conduct further analysis on the residuals. However, if we know the distributional behavior of the time series, model identification, parameter estimation, and model checking become more straightforward. In this paper, we show that the number of Air Asia passengers can be represented as a geometric Brownian motion process. Therefore, instead of using the standard approach to model fitting, we use an appropriate transformation to obtain a stationary, normally distributed, and even independent time series. An example of forecasting the number of Air Asia passengers illustrates the advantages of the method.
Analysis of vector wind change with respect to time for Cape Kennedy, Florida
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1978-01-01
Multivariate analysis was used to study vector wind change with respect to time. The joint distribution of the four variables represented by the components of the wind vector at an initial time and after a specified elapsed time is hypothesized to be quadrivariate normal; the fourteen statistics of this distribution, calculated from 15 years of twice-daily rawinsonde data, are presented by monthly reference periods for each month from 0 to 27 km. The hypotheses that the wind component change with respect to time is univariate normal, that the joint distribution of wind component changes is bivariate normal, and that the modulus of vector wind change is Rayleigh are tested by comparison with observed distributions. Statistics of the conditional bivariate normal distributions of the vector wind at a future time, given the vector wind at an initial time, are derived. Wind changes over time periods of 1 to 5 hours, calculated from Jimsphere data, are presented. Extension of the theoretical prediction (based on rawinsonde data) of the standard deviation of wind component change to time periods of 1 to 5 hours falls, with a few exceptions, within the 95th-percentile confidence band of the population estimate obtained from the Jimsphere sample data. The joint distributions of wind change components, conditional wind components, and 1 km vector wind shear change components are illustrated by probability ellipses at the 95th-percentile level.
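One of the hypotheses tested above, that the modulus of vector wind change is Rayleigh, follows whenever the two change components are independent zero-mean normals with a common σ. An illustrative numerical sketch (the σ value is an assumption of ours, not the report's data):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n = 3.0, 200_000                   # assumed component std dev (m/s)

du = rng.normal(0.0, sigma, n)            # zonal wind change component
dv = rng.normal(0.0, sigma, n)            # meridional wind change component
modulus = np.hypot(du, dv)                # magnitude of the vector wind change

# A Rayleigh(sigma) variable has mean sigma * sqrt(pi / 2)
expected_mean = sigma * np.sqrt(np.pi / 2.0)
```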
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique combining simulation and analysis is used to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.
Time-frequency analysis of backscattered signals from diffuse radar targets
NASA Astrophysics Data System (ADS)
Kenny, O. P.; Boashash, B.
1993-06-01
The need for analysis of time-varying signals has led to the formulation of a class of joint time-frequency distributions (TFDs). One of these TFDs, the Wigner-Ville distribution (WVD), has useful properties which can be applied to radar imaging. The authors discuss time-frequency representation of the backscattered signal from a diffuse radar target. It is then shown that for point scatterers which are statistically dependent or for which the reflectivity coefficient has a nonzero mean value, reconstruction using time of flight positron emission tomography on time-frequency images is effective for estimating the scattering function of the target.
Model-centric distribution automation: Capacity, reliability, and efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...
2016-02-26
A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.
Inverse statistics in the foreign exchange market
NASA Astrophysics Data System (ADS)
Jensen, M. H.; Johansen, A.; Petroni, F.; Simonsen, I.
2004-09-01
We investigate intra-day foreign exchange (FX) time series using the inverse statistic analysis developed by Simonsen et al. (Eur. Phys. J. 27 (2002) 583) and Jensen et al. (Physica A 324 (2003) 338). Specifically, we study the time-averaged distributions of waiting times needed to obtain a certain increase (decrease) ρ in the price of an investment. The analysis is performed for the Deutsche Mark (DM) against the US dollar for the full year of 1998, but similar results are obtained for the Japanese Yen against the US dollar. With high statistical significance, the presence of “resonance peaks” in the waiting time distributions is established. Such peaks are a consequence of the trading habits of the market participants, as they are not present in the corresponding tick (business) waiting time distributions. Furthermore, a new stylized fact is observed for the (normalized) waiting time distribution in the form of a power-law pdf. This result is achieved by rescaling the physical waiting time by the corresponding tick time, thereby partially removing scale-dependent features of the market activity.
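The inverse-statistics idea above (fix a return level ρ and measure how long one must wait to first reach it) can be sketched as follows. This is a minimal illustration on synthetic prices, not the authors' code, and the function name is our own.

```python
import numpy as np

def inverse_statistics(prices, rho):
    """For each starting time t, the waiting time (in steps) until the
    log-return first reaches +rho; starts that never reach it are dropped."""
    logp = np.log(np.asarray(prices, dtype=float))
    waits = []
    for t in range(len(logp) - 1):
        gains = logp[t + 1:] - logp[t]
        hit = np.nonzero(gains >= rho)[0]
        if hit.size:
            waits.append(int(hit[0]) + 1)  # +1: gains[0] is a one-step wait
    return np.array(waits)

# deterministic 1% exponential growth: a 4.5% gain always takes 5 steps
prices = np.exp(0.01 * np.arange(50))
waits = inverse_statistics(prices, rho=0.045)
```

On real FX data the histogram of `waits` would exhibit the resonance peaks the abstract describes.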
The use of the Wigner Distribution to analyze structural impulse responses
NASA Technical Reports Server (NTRS)
Wahl, T. J.; Bolton, J. S.
1990-01-01
In this paper it is argued that the time-frequency analysis of structural impulse responses may be used to reveal the wave types carrying significant energy through a structure. Since each wave type is characterized by its own dispersion relation, each wave type may be associated with particular features appearing in the time-frequency domain representation of an impulse response. Here the Wigner Distribution is introduced as a means for obtaining appropriate time-frequency representations of impulse responses. Practical aspects of the calculation of the Wigner Distribution are discussed and examples of its application to the analysis of structural impulse responses are given. These examples will show that the Wigner Distribution may be conveniently used to distinguish between the contributions of various wave types to a total structural response.
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate analytically and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.
Real-time modeling and simulation of distribution feeder and distributed resources
NASA Astrophysics Data System (ADS)
Singh, Pawan
The analysis of electrical systems dates back to the days when analog network analyzers were used. With the advent of digital computers, many programs were written for power-flow and short-circuit analysis to improve the electrical system. Real-time computer simulations can answer many what-if scenarios in an existing or proposed power system. In this thesis, the standard IEEE 13-node distribution feeder is developed and validated on the real-time platform OPAL-RT. The concept and the challenges of real-time simulation are studied and addressed. Distributed energy resources, including commonly used distributed generation and storage devices such as a diesel engine, a solar photovoltaic array, and a battery storage system, are modeled and simulated on the real-time platform. A microgrid encompasses a portion of an electric power distribution system located downstream of the distribution substation. Normally, the microgrid operates in parallel with the grid; however, scheduled or forced isolation can take place. In such conditions, the microgrid must be able to operate stably and autonomously. The microgrid can operate in grid-connected and islanded modes; both operating modes are studied in the last chapter. Towards the end, a simple microgrid controller for energy management and protection of the microgrid is modeled and simulated on the real-time platform.
NASA Astrophysics Data System (ADS)
Senthilkumar, K.; Ruchika Mehra Vijayan, E.
2017-11-01
This paper aims to illustrate real-time analysis of large-scale data. For practical implementation, we perform sentiment analysis on live Twitter feeds, scoring each individual tweet. To analyze sentiments, we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.
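As a toy, single-machine illustration of lexicon-based tweet scoring (the paper itself trains on SentiWordNet and runs on Spark/Hadoop in Java; the tiny lexicon below is made up for the example):

```python
# hypothetical mini-lexicon standing in for SentiWordNet polarity scores
LEXICON = {"good": 0.7, "great": 0.9, "bad": -0.7, "terrible": -0.9}

def tweet_polarity(text):
    """Mean polarity of the known words in a tweet; 0.0 if none are known."""
    words = text.lower().split()
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

score = tweet_polarity("Great service but bad coffee")
```

In the distributed setting, a map step would apply `tweet_polarity` to each tweet of the stream and a reduce step would aggregate the scores.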
NASA Astrophysics Data System (ADS)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.
1996-02-01
Neutron coincidence counting is commonly used for the non-destructive assay of plutonium bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is related to the fact that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass ( 240Pu eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste however may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu eq mass. The presented theory—which will be indicated as Time Interval Analysis (TIA)—is complementary to Time Correlation Analysis (TCA) theories which were developed in the past, but is from the theoretical point of view much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
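The one-dimensional Rossi-alpha distribution described above (a histogram of the time differences between each trigger pulse and all subsequent pulses inside a fixed window) can be sketched as follows. This is a schematic numpy illustration on a synthetic Poisson pulse train, not the authors' time-interval analyser: for a purely random train the histogram should be flat, whereas correlated fission chains would add a decaying excess at short intervals.

```python
import numpy as np

def rossi_alpha(times, window, bins):
    """One-dimensional Rossi-alpha histogram: for every pulse taken as a
    trigger, histogram the delays of all later pulses within `window`."""
    times = np.sort(np.asarray(times, dtype=float))
    diffs = []
    for i, t0 in enumerate(times):
        j = np.searchsorted(times, t0 + window)  # first pulse outside window
        diffs.extend(times[i + 1:j] - t0)
    hist, edges = np.histogram(diffs, bins=bins, range=(0.0, window))
    return hist, edges

# uncorrelated (Poisson) pulse train at unit rate: flat Rossi-alpha expected
rng = np.random.default_rng(0)
times = np.cumsum(rng.exponential(1.0, size=20000))
hist, edges = rossi_alpha(times, window=2.0, bins=8)
```

The two-dimensional (triplet-triggered) variant used in the paper extends the same bookkeeping to pairs of delays.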
Study on time-frequency analysis method of very fast transient overvoltage
NASA Astrophysics Data System (ADS)
Li, Shuai; Liu, Shiming; Huang, Qiyan; Fu, Chuanshun
2018-04-01
The operation of disconnectors in gas insulated substations (GIS) may produce very fast transient overvoltage (VFTO), which is characterized by short rise time, short duration, high amplitude and rich frequency content. VFTO can damage GIS and secondary equipment, and the frequency components contained in the VFTO can cause resonance overvoltage inside transformers, so it is necessary to study the spectral characteristics of VFTO. From a signal-processing perspective, VFTO is a non-stationary signal; the traditional Fourier transform cannot describe its time-varying frequency content, so time-frequency analysis is needed to analyze the VFTO spectral characteristics. In this paper, we analyze the performance of the short-time Fourier transform (STFT), Wigner-Ville distribution (WVD), pseudo Wigner-Ville distribution (PWVD) and smoothed pseudo Wigner-Ville distribution (SPWVD). The results show that the SPWVD performs best: its time-frequency aggregation is higher than that of the STFT, and it is free of cross-interference terms, which meets the requirements of VFTO spectrum analysis.
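Of the methods compared above, the STFT is the simplest to sketch. The following minimal numpy implementation (our own simplification with a Hann window, not the paper's code) resolves a test signal whose frequency steps from fs/8 to fs/4 halfway through:

```python
import numpy as np

def stft_mag(x, win_len=64, hop=16):
    """Magnitude STFT with a Hann window: rows = frames, cols = freq bins."""
    win = np.hanning(win_len)
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        seg = x[start:start + win_len] * win
        frames.append(np.abs(np.fft.rfft(seg)))
    return np.array(frames)

# two-tone test signal: 1/8 of the sampling rate, then 1/4 halfway through
n = np.arange(1024)
x = np.where(n < 512, np.sin(2 * np.pi * n / 8), np.sin(2 * np.pi * n / 4))
S = stft_mag(x)
early_peak = int(S[2].argmax())   # a frame well inside the first half
late_peak = int(S[-3].argmax())   # a frame well inside the second half
```

The window length trades time resolution against frequency resolution, which is exactly the limitation that motivates the quadratic distributions (WVD, PWVD, SPWVD) compared in the paper.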
Self spectrum window method in Wigner-Ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
The Wigner-Ville distribution (WVD) is an important time-frequency analysis tool in biomedical signal processing, but the cross-term interference in the WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. In the SSW algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the short-time Fourier transform (STFT) spectrum of the signal was used as the window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation, which yielded a satisfactory time-frequency distribution.
Mwakanyamale, Kisa; Slater, Lee; Day-Lewis, Frederick D.; Elwaseif, Mehrez; Johnson, Carole D.
2012-01-01
Characterization of groundwater-surface water exchange is essential for improving understanding of contaminant transport between aquifers and rivers. Fiber-optic distributed temperature sensing (FODTS) provides rich spatiotemporal datasets for quantitative and qualitative analysis of groundwater-surface water exchange. We demonstrate how time-frequency analysis of FODTS and synchronous river stage time series from the Columbia River adjacent to the Hanford 300-Area, Richland, Washington, provides spatial information on the strength of stage-driven exchange of uranium contaminated groundwater in response to subsurface heterogeneity. Although used in previous studies, the stage-temperature correlation coefficient proved an unreliable indicator of the stage-driven forcing on groundwater discharge in the presence of other factors influencing river water temperature. In contrast, S-transform analysis of the stage and FODTS data definitively identifies the spatial distribution of discharge zones and provides information on the dominant forcing periods (≥2 d) of the complex dam operations driving stage fluctuations and hence groundwater-surface water exchange at the 300-Area.
Double Fourier analysis for Emotion Identification in Voiced Speech
NASA Astrophysics Data System (ADS)
Sierra-Sosa, D.; Bastidas, M.; Ortiz P., D.; Quintero, O. L.
2016-04-01
We propose a novel analysis alternative, based on two Fourier transforms, for emotion recognition from speech. Fourier analysis allows one to display and synthesize different signals in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier transform with Gaussian windows; this spectrogram portrays frequency-related features, such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The time-frequency representation from the spectrogram is then treated as an image and processed through a two-dimensional Fourier transform in order to perform a spatial Fourier analysis of it. Finally, features related to emotions in voiced speech are extracted and presented.
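The two-stage analysis described above (a Gaussian-window spectrogram followed by a 2D Fourier transform of the spectrogram image) can be sketched as follows. This is an illustrative numpy reconstruction under our own parameter choices, not the authors' implementation: for a stationary tone, all spectrogram frames are identical, so the second transform concentrates all energy at zero temporal frequency (row 0).

```python
import numpy as np

def gaussian_window(m, std):
    k = np.arange(m) - (m - 1) / 2
    return np.exp(-0.5 * (k / std) ** 2)

def double_fourier(x, win_len=64, hop=16):
    """First stage: Gaussian-window spectrogram. Second stage: 2D FFT of
    the spectrogram treated as an image."""
    win = gaussian_window(win_len, std=win_len / 6)
    frames = [np.abs(np.fft.rfft(x[s:s + win_len] * win))
              for s in range(0, len(x) - win_len + 1, hop)]
    spec = np.array(frames)                  # the spectrogram "image"
    return spec, np.abs(np.fft.fft2(spec))   # its spatial 2D spectrum

n = np.arange(2048)
x = np.sin(2 * np.pi * n / 8)                # stationary tone at fs/8
spec, spec2d = double_fourier(x)
```

Emotion-induced modulations (pitch contours, tremor) would instead spread energy into nonzero temporal-frequency rows, which is where the discriminative features live.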
Avalanche statistics from data with low time resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeBlanc, Michael; Nawano, Aya; Wright, Wendelin J.; ...
2016-11-22
Extracting avalanche distributions from experimental microplasticity data can be hampered by limited time resolution. We compute the effects of low time resolution on avalanche size distributions and give quantitative criteria for diagnosing and circumventing problems associated with low time resolution. We show that traditional analysis of data obtained at low acquisition rates can lead to avalanche size distributions with incorrect power-law exponents or no power-law scaling at all. Furthermore, we demonstrate that it can lead to apparent data collapses with incorrect power-law and cutoff exponents. We propose new methods to analyze low-resolution stress-time series that can recover the size distribution of the underlying avalanches even when the resolution is so low that naive analysis methods give incorrect results. We test these methods on both downsampled simulation data from a simple model and downsampled bulk metallic glass compression data and find that the methods recover the correct critical exponents.
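The core artifact the paper addresses (naive thresholding of a downsampled record merges nearby avalanches) can be demonstrated with a toy numpy experiment. This is not the authors' reconstruction method, just an illustration of the problem on a synthetic pulse train:

```python
import numpy as np

def avalanche_sizes(signal):
    """Sizes (summed activity) of contiguous above-zero excursions."""
    active = signal > 0
    edges = np.diff(active.astype(int))
    ups = np.nonzero(edges == 1)[0] + 1
    downs = np.nonzero(edges == -1)[0] + 1
    if active[0]:
        ups = np.r_[0, ups]
    if active[-1]:
        downs = np.r_[downs, len(signal)]
    return np.array([signal[a:b].sum() for a, b in zip(ups, downs)])

# synthetic record: 3000 short bursts ("avalanches") on a quiet baseline
rng = np.random.default_rng(1)
rate = np.zeros(100000)
starts = rng.choice(100000 - 10, size=3000, replace=False)
for s in starts:
    rate[s:s + rng.integers(1, 10)] += 1.0

full = avalanche_sizes(rate)
coarse = avalanche_sizes(rate.reshape(-1, 10).sum(axis=1))  # 10x lower rate
```

Total avalanche mass is conserved by downsampling, but distinct events fuse into fewer, larger ones, which is what distorts the apparent power-law exponent.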
2008-10-01
attempts to measure the long-term distribution of storage time have relied on unrealistic assumptions, but two recent studies suggest a new approach. As...sediment age. Everitt (1968) mapped the age distribution of cottonwoods along a 34 km stretch of the Little Missouri River in North Dakota...Dietrich et al. (1982) applied Eriksson's (1971) method to estimate the residence time distribution from Everitt's age distribution. Somewhat mysteriously
NASA Astrophysics Data System (ADS)
Nhu Y, Do
2018-03-01
Vietnam has many advantages in wind power resources. Over time, both the capacity and the number of wind power projects in Vietnam have increased. As more wind power is fed into the national grid, it is necessary to carry out research and analysis to ensure the safety and reliability of the wind power connection. In the national distribution grid, voltage sags occur regularly and can strongly influence the operation of wind power plants; the most serious consequence is disconnection. The paper presents an analysis of the distribution grid's transient process when voltage sags occur. Based on the analysis, solutions are recommended to improve the reliability and effective operation of wind power resources.
Study on ion energy distribution in low-frequency oscillation time scale of Hall thrusters
NASA Astrophysics Data System (ADS)
Wei, Liqiu; Li, Wenbo; Ding, Yongjie; Han, Liang; Yu, Daren; Cao, Yong
2017-11-01
This paper reports on the dynamic characteristics of the ion energy distribution during Hall thruster discharge on the low-frequency oscillation time scale, based on experimental studies together with a statistical analysis of the time-varying peak and width of the ion energy and the ratio of high-energy ions during the low-frequency oscillation. The results show that the ion energy distribution exhibits a periodic change during the low-frequency oscillation. Moreover, the variation in the ion energy peak is opposite to that of the discharge current, while the variations in the width of the ion energy distribution and the ratio of high-energy ions are consistent with that of the discharge current. The variation characteristics of the ion density and discharge potential were simulated by one-dimensional hybrid-direct kinetic simulations; the simulation results and analysis indicate that the periodic change in the ion energy distribution during the low-frequency oscillation depends on the relationship between the ionization source term and the discharge potential distribution during ionization in the discharge channel.
Improving Department of Defense Global Distribution Performance Through Network Analysis
2016-06-01
network performance increase. Subject terms: supply chain metrics, distribution networks, requisition shipping time, strategic distribution database...peace and war" (p. 4). USTRANSCOM Metrics and Analysis Branch defines, develops, tracks, and maintains outcomes-based supply chain metrics to...(2014a, p. 8). The Joint Staff defines a TDD standard as the maximum number of days the supply chain can take to deliver requisitioned materiel
A Heuristic Approach to the Theater Distribution Problem
2014-03-27
outstanding guidance on this thesis research as well as the introduction to joint mobility modeling in OPER 674, which sparked my interest in this area of... List of acronyms: AMP, Analysis of Mobility Platform; DARP, Dial-A-Ride Problem; ...tabu; SMM, Strategic Mobility Modeling; TDD, time definite delivery; TDM, Theater Distribution Model; TDP, Theater Distribution Problem; TPFDD, Time Phased Force...
An Empirical Analysis of the Cascade Secret Key Reconciliation Protocol for Quantum Key Distribution
2011-09-01
performance with the parity checks within each pass increasing and as a result, the processing time is expected to increase as well. A conclusion is drawn... timely manner has driven efforts to develop new key distribution methods. The most promising method is Quantum Key Distribution (QKD) and is...thank the QKD Project Team for all of the insight and support they provided in such a short time period. Thanks are especially in order for my
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1977-01-01
Wind vector change with respect to time at Cape Kennedy, Florida, is examined according to the theory of multivariate normality. The joint distribution of the four variables represented by the components of the wind vector at an initial time and after a specified elapsed time is hypothesized to be quadravariate normal; the fourteen statistics of this distribution, calculated from fifteen years of twice-daily rawinsonde data, are presented by monthly reference periods for each month from 0 to 27 km. The hypotheses that the wind component change with respect to time is univariate normal, that the joint distribution of wind component changes is bivariate normal, and that the modulus of vector wind change is Rayleigh are tested by comparison with observed distributions. Statistics of the conditional bivariate normal distributions of vector wind at a future time given the vector wind at an initial time are derived. Wind changes over time periods from one to five hours, calculated from Jimsphere data, are presented.
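The Rayleigh hypothesis above follows directly from bivariate normality: if the two wind-change components are independent N(0, σ²), their modulus is Rayleigh distributed with mean σ√(π/2). A quick Monte Carlo check of that relationship (illustrative only, with an arbitrary σ):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 3.0                                  # arbitrary component std (m/s)
du = rng.normal(0.0, sigma, 100000)          # u-component change
dv = rng.normal(0.0, sigma, 100000)          # v-component change
modulus = np.hypot(du, dv)                   # modulus of the vector change

# theoretical Rayleigh mean for scale parameter sigma
rayleigh_mean = sigma * np.sqrt(np.pi / 2)
```

Comparing the empirical distribution of `modulus` against the Rayleigh form is the same kind of goodness-of-fit test the abstract describes, performed there on observed wind data.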
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
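The inter-amount-time idea above (sample the flow record each time a fixed amount of volume has accumulated, rather than at fixed clock intervals) can be sketched as follows. The function and the flashy toy record are our own illustration, not the authors' code:

```python
import numpy as np

def inter_amount_times(flow, amount):
    """Times (in samples) needed to accumulate each successive `amount`
    of flow volume: the sampling is tuned to the flow, not to the clock."""
    cum = np.cumsum(np.asarray(flow, dtype=float))
    targets = np.arange(amount, cum[-1], amount)
    crossing = np.searchsorted(cum, targets)  # sample index reaching each target
    return np.diff(np.r_[0, crossing])

# flashy toy record: low base flow with a short high-flow burst in the middle
flow = np.r_[np.full(100, 1.0), np.full(10, 50.0), np.full(100, 1.0)]
iat = inter_amount_times(flow, amount=25.0)
```

High-flow bursts produce many near-zero inter-amount times while base flow produces long ones, so the spread of `iat` is a direct flashiness measure.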
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract: An automated computer time distribution system broadcasts standard time to users using computers and modems via...contributed to delays: software platform (50% of the delay), transmission speed of time codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
Taki, M; Signorini, A; Oton, C J; Nannipieri, T; Di Pasquale, F
2013-10-15
We experimentally demonstrate the use of cyclic pulse coding for distributed strain and temperature measurements in hybrid Raman/Brillouin optical time-domain analysis (BOTDA) optical fiber sensors. The highly integrated proposed solution effectively addresses the strain/temperature cross-sensitivity issue affecting standard BOTDA sensors, allowing for simultaneous meter-scale strain and temperature measurements over 10 km of standard single mode fiber using a single narrowband laser source only.
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-08-30
Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collecting and analysis models, e.g., the Internet of Things, cyber-physical systems, big data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were also collected and analyzed in depth. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
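The abstract does not specify the two-phase regression in detail, so the sketch below illustrates one plausible reading (an assumption on our part): fit task progress with two joined line segments, scanning every interior breakpoint and keeping the least-squares best, so the second segment's slope can extrapolate the finishing time.

```python
import numpy as np

def two_phase_fit(t, y):
    """Fit y(t) with two line segments, scanning every interior breakpoint
    and returning the breakpoint index with the smallest squared error.
    (Illustrative two-phase regression; not the paper's exact TPR method.)"""
    best = None
    for b in range(2, len(t) - 2):
        err = 0.0
        for ts, ys in ((t[:b], y[:b]), (t[b:], y[b:])):
            coef = np.polyfit(ts, ys, 1)                  # local linear fit
            err += float(((np.polyval(coef, ts) - ys) ** 2).sum())
        if best is None or err < best[0]:
            best = (err, b)
    return best[1]

# synthetic task progress: slow first phase, fast second phase after t = 25
t = np.arange(40.0)
y = np.where(t < 25, 0.5 * t, 12.5 + 2.0 * (t - 25))
bp = two_phase_fit(t, y)
```

With the kink at t = 25 lying on both lines, either adjacent breakpoint index gives an exact fit.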
Waiting-time distributions of magnetic discontinuities: clustering or Poisson process?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greco, A; Matthaeus, W H; Servidio, S; Dmitruk, P
2009-10-01
Using solar wind data from the Advanced Composition Explorer spacecraft, with the support of Hall magnetohydrodynamic simulations, the waiting-time distributions of magnetic discontinuities have been analyzed. A possible phenomenon of clusterization of these discontinuities is studied in detail. We perform a local Poisson's analysis in order to establish if these intermittent events are randomly distributed or not. Possible implications about the nature of solar wind discontinuities are discussed.
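A quick discriminator in the spirit of the waiting-time analysis above (our illustration, not the authors' local Poisson procedure): for a memoryless Poisson process the waiting times are exponential, so their coefficient of variation is 1, while clustering produces an excess of short waits and long gaps that pushes it above 1.

```python
import numpy as np

rng = np.random.default_rng(7)

# memoryless (Poisson) event sequence: exponential waiting times
poisson_waits = rng.exponential(1.0, 50000)

# clustered surrogate with the same overall mean: bursts of short waits
# inside clusters, separated by long inter-cluster gaps
clustered_waits = np.r_[rng.exponential(0.1, 45000),
                        rng.exponential(9.0, 5000)]

def cv(w):
    """Coefficient of variation: std/mean (equals 1 for an exponential)."""
    return w.std() / w.mean()
```

The paper's local analysis refines this idea by testing Poisson statistics within sliding windows rather than globally.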
Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta
2017-02-15
Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.
NASA Astrophysics Data System (ADS)
Bidari, Pooya Sobhe; Alirezaie, Javad; Tavakkoli, Jahan
2017-03-01
This paper presents a method for modeling and simulation of shear wave generation from a nonlinear Acoustic Radiation Force Impulse (ARFI) that is considered as a distributed force applied at the focal region of a HIFU transducer radiating in nonlinear regime. The shear wave propagation is simulated by solving the Navier's equation from the distributed nonlinear ARFI as the source of the shear wave. Then, the Wigner-Ville Distribution (WVD) as a time-frequency analysis method is used to detect the shear wave at different local points in the region of interest. The WVD results in an estimation of the shear wave time of arrival, its mean frequency and local attenuation which can be utilized to estimate medium's shear modulus and shear viscosity using the Voigt model.
Kim, Young Hoon; Song, Kwang Yong
2017-06-26
A Brillouin optical time domain analysis (BOTDA) system utilizing tailored compensation for the propagation loss of the pump pulse is demonstrated for long-range and high-resolution distributed sensing. A continuous pump wave for distributed Brillouin amplification (DBA pump) of the pump pulse co-propagates with the probe wave, where gradual variation of the spectral width is additionally introduced to the DBA pump to obtain a uniform Brillouin gain along the position. In the experimental confirmation, a distributed strain measurement along a 51.2 km fiber under test is presented with a spatial resolution of 20 cm, in which the measurement error (σ) of less than 1.45 MHz and the near-constant Brillouin gain of the probe wave are maintained throughout the fiber.
Barua, N U; Bienemann, A S; Woolley, M; Wyatt, M J; Johnson, D; Lewis, O; Irving, C; Pritchard, G; Gill, S
2015-10-15
Mesencephalic astrocyte-derived neurotrophic factor (MANF) is a 20 kDa human protein which has both neuroprotective and neurorestorative activity on dopaminergic neurons and therefore may have application for the treatment of Parkinson's disease (PD). The aims of this study were to determine the translational potential of convection-enhanced delivery (CED) of MANF for the treatment of PD by studying its distribution in porcine putamen and substantia nigra and to correlate histological distribution with co-infused gadolinium-DTPA using real-time magnetic resonance imaging. We describe the distribution of MANF in porcine putamen and substantia nigra delivered through an implantable CED catheter system, with co-infused gadolinium-DTPA allowing real-time MRI tracking of infusate distribution. The distribution of gadolinium-DTPA on MRI correlated well with immunohistochemical (IHC) analysis of MANF distribution. Volumetric analysis of MANF IHC staining indicated a volume of infusion (Vi) to volume of distribution (Vd) ratio of 3 in putamen and 2 in substantia nigra. This study confirms the translational potential of CED of MANF as a novel treatment strategy in PD and also supports the co-infusion of gadolinium as a proxy measure of MANF distribution in future clinical studies. Further study is required to determine the optimum infusion regime, flow rate and frequency of infusions in human trials.
Stable distribution and long-range correlation of Brent crude oil market
NASA Astrophysics Data System (ADS)
Yuan, Ying; Zhuang, Xin-tian; Jin, Xiu; Huang, Wei-qiang
2014-11-01
An empirical study of stable distributions and long-range correlation in the Brent crude oil market is presented. First, it is found that the empirical distribution of Brent crude oil returns can be fitted well by a stable distribution, which is significantly different from a normal distribution. Second, detrended fluctuation analysis of the Brent crude oil returns shows that there are long-range correlations in returns, implying that there are patterns or trends in returns that persist over time. Third, the detrended fluctuation analysis also shows that after the 2008 financial crisis, the Brent crude oil market became more persistent, implying that the crisis may have increased the frequency and strength of the interdependence and correlations between the financial time series. All of these findings may be used to improve current fractal theories.
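The detrended fluctuation analysis used above can be sketched in a few lines (illustrative only; the series, window sizes, and local-trend order below are assumptions, not taken from the paper):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis of a 1-D series.

    Returns the fluctuation F(n) for each window size n; the slope of
    log F(n) vs log n estimates the scaling exponent alpha
    (alpha > 0.5 suggests long-range persistence).
    """
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        msq = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear detrend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

# white noise should give alpha close to 0.5 (no persistence)
rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(returns, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

Persistent series (as reported for post-2008 oil returns) would yield alpha above 0.5 under this estimator.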
Waiting time distribution revealing the internal spin dynamics in a double quantum dot
NASA Astrophysics Data System (ADS)
Ptaszyński, Krzysztof
2017-07-01
Waiting time distribution and the zero-frequency full counting statistics of unidirectional electron transport through a double quantum dot molecule attached to spin-polarized leads are analyzed using the quantum master equation. The waiting time distribution exhibits a nontrivial dependence on the value of the exchange coupling between the dots and the gradient of the applied magnetic field, which reveals the oscillations between the spin states of the molecule. The zero-frequency full counting statistics, on the other hand, is independent of the aforementioned quantities, thus giving no insight into the internal dynamics. The fact that the waiting time distribution and the zero-frequency full counting statistics give nonequivalent information is associated with two factors. First, it can be explained by sensitivity to different timescales of the dynamics of the system. Second, it is associated with the presence of correlations between subsequent waiting times, which make the renewal theory relating the full counting statistics and the waiting time distribution no longer applicable. The study highlights the particular usefulness of the waiting time distribution for the analysis of the internal dynamics of mesoscopic systems.
Distributions-per-level: a means of testing level detectors and models of patch-clamp data.
Schröder, I; Huth, T; Suitchmezian, V; Jarosik, J; Schnell, S; Hansen, U P
2004-01-01
Level or jump detectors generate the reconstructed time series from a noisy record of patch-clamp current. The reconstructed time series is used to create dwell-time histograms for the kinetic analysis of the Markov model of the investigated ion channel. It is shown here that some additional lines in the software of such a detector can provide a powerful new means of patch-clamp analysis. For each current level that can be recognized by the detector, an array is declared. The new software assigns every data point of the original time series to the array that belongs to the actual state of the detector. From the data sets in these arrays distributions-per-level are generated. Simulated and experimental time series analyzed by Hinkley detectors are used to demonstrate the benefits of these distributions-per-level. First, they can serve as a test of the reliability of jump and level detectors. Second, they can reveal beta distributions as resulting from fast gating that would usually be hidden in the overall amplitude histogram. Probably the most valuable feature is that the malfunctions of the Hinkley detectors turn out to depend on the Markov model of the ion channel. Thus, the errors revealed by the distributions-per-level can be used to distinguish between different putative Markov models of the measured time series.
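The distributions-per-level idea — routing each raw data point to an array keyed by the detector's current state — can be shown with a minimal sketch (the two-level channel, noise level, and simple threshold detector below are hypothetical stand-ins for a Hinkley detector):

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical two-state channel current: 0 pA (closed), 5 pA (open)
true_levels = np.repeat(rng.choice([0.0, 5.0], size=200), 50)
trace = true_levels + rng.normal(0.0, 0.8, true_levels.size)

# stand-in for a jump/level detector: a fixed mid-point threshold
detected = np.where(trace > 2.5, 5.0, 0.0)

# distributions-per-level: one sample array per detector state,
# from which per-level amplitude histograms can be built
per_level = {lvl: trace[detected == lvl] for lvl in (0.0, 5.0)}

# per-level means should sit near the nominal amplitudes; systematic
# deviations or skewed histograms would flag detector malfunctions
means = {lvl: s.mean() for lvl, s in per_level.items()}
```

Fast gating would show up here as per-level histograms departing from Gaussian (e.g., beta-distributed), even when the overall amplitude histogram hides it.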
Liang, Liang; Liu, Minliang; Martin, Caitlin; Sun, Wei
2018-01-01
Structural finite-element analysis (FEA) has been widely used to study the biomechanics of human tissues and organs, as well as tissue-medical device interactions, and treatment strategies. However, patient-specific FEA models usually require complex procedures to set up and long computing times to obtain final simulation results, preventing prompt feedback to clinicians in time-sensitive clinical applications. In this study, by using machine learning techniques, we developed a deep learning (DL) model to directly estimate the stress distributions of the aorta. The DL model was designed and trained to take the FEA input and directly output the aortic wall stress distributions, bypassing the FEA calculation process. The trained DL model is capable of predicting the stress distributions with average errors of 0.492% and 0.891% in the Von Mises stress distribution and peak Von Mises stress, respectively. This is, to our knowledge, the first study that demonstrates the feasibility and great potential of using the DL technique as a fast and accurate surrogate of FEA for stress analysis. © 2018 The Author(s).
Regression analysis using dependent Polya trees.
Schörgendorfer, Angela; Branscum, Adam J
2013-11-30
Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.
Chan, H L; Lin, J L; Huang, H H; Wu, C P
1997-09-01
A new technique for interference-term suppression in the Wigner-Ville distribution (WVD) is proposed for signals with a 1/f spectral shape. The spectral characteristic of the signal is altered by f^α filtering before time-frequency analysis and compensated after analysis. Utilizing the proposed technique in the smoothed pseudo Wigner-Ville distribution, excellent suppression of interference components can be achieved.
Wigner-Ville distribution and Gabor transform in Doppler ultrasound signal processing.
Ghofrani, S; Ayatollahi, A; Shamsollahi, M B
2003-01-01
Time-frequency distributions have been used extensively for nonstationary signal analysis; they describe how the frequency content of a signal changes in time. The Wigner-Ville distribution (WVD) is the best known. The drawback of the WVD is cross-term artifacts. An alternative to the WVD is the Gabor transform (GT), a signal decomposition method which displays the time-frequency energy of a signal on a joint t-f plane without generating considerable cross-terms. In this paper the WVD and GT of ultrasound echo signals are computed analytically.
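A minimal discrete Wigner-Ville distribution, exhibiting the cross-term artifact the abstract refers to, might look like this (the two-tone test signal and symmetric lag windowing are assumptions for illustration):

```python
import numpy as np
from scipy.signal import hilbert

def wvd(x):
    """Discrete pseudo Wigner-Ville distribution of a real signal.

    For each time index, the instantaneous autocorrelation
    z(n+tau) z*(n-tau) is Fourier-transformed over the lag tau.
    """
    z = hilbert(x)                       # analytic signal
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        m = min(n, N - 1 - n)            # largest symmetric lag
        tau = np.arange(-m, m + 1)
        acf = z[n + tau] * np.conj(z[n - tau])
        K = np.zeros(N, dtype=complex)
        K[tau % N] = acf
        W[n] = np.real(np.fft.fft(K))
    return W

# two tones at 32 Hz and 96 Hz: the WVD shows energy at both,
# plus an oscillating cross term midway between them (64 Hz)
fs = 256
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * 32 * t) + np.cos(2 * np.pi * 96 * t)
W = wvd(x)
```

Because the lag step is one sample, frequency bin k corresponds to k·fs/(2N) Hz; averaging W over time suppresses the oscillating cross term while the auto-terms persist.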
The source of electrostatic fluctuations in the solar-wind
NASA Technical Reports Server (NTRS)
Lemons, D. S.; Asbridge, J. R.; Bame, S. J.; Feldman, W. C.; Gary, S. P.; Gosling, J. T.
1979-01-01
Solar wind electron and ion distribution functions measured simultaneously with or close to times of intense electrostatic fluctuations are subjected to a linear Vlasov stability analysis. Although all distributions tested were found to be stable, the analysis suggests that the ion beam instability is the most likely source of the fluctuations.
Category Induction via Distributional Analysis: Evidence from a Serial Reaction Time Task
ERIC Educational Resources Information Center
Hunt, Ruskin H.; Aslin, Richard N.
2010-01-01
Category formation lies at the heart of a number of higher-order behaviors, including language. We assessed the ability of human adults to learn, from distributional information alone, categories embedded in a sequence of input stimuli using a serial reaction time task. Artificial grammars generated corpora of input strings containing a…
Time-frequency analysis of SEMG--with special consideration to the interelectrode spacing.
Alemu, M; Kumar, Dinesh Kant; Bradley, Alan
2003-12-01
The surface electromyogram (SEMG) is a complex, nonstationary signal. The spectrum of the SEMG is dependent on the force of contraction being generated and other factors like muscle fatigue and interelectrode distance (IED). The spectrum of the signal is time variant. This paper reports the experimental research conducted to study the influence of force of muscle contraction and IED on the SEMG signal using time-frequency (T-F) analysis. Two T-F techniques have been used: Wigner-Ville distribution (WVD) and Choi-Williams distribution (CWD). The experiment was conducted with the help of ten healthy volunteers (five males and five females) who performed isometric elbow flexions of the active right arm at 20%, 50%, and 80% of their maximal voluntary contraction. The SEMG signal was recorded using surface electrodes placed at a distance of 18 and 36 mm over biceps brachii muscle. The results indicate that the two distributions were spread out across the frequency range at smaller IED. Further, regardless of the spacing, both distributions displayed increased spectral compression with time at higher contraction level.
Scale-free avalanche dynamics in the stock market
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Leinweber, D. B.; Thomas, A. W.
2006-10-01
Self-organized criticality (SOC) has been claimed to play an important role in many natural and social systems. In the present work we empirically investigate the relevance of this theory to stock-market dynamics. Avalanches in stock-market indices are identified using a multi-scale wavelet-filtering analysis designed to remove Gaussian noise from the index. Here, new methods are developed to identify the optimal filtering parameters which maximize the noise removal. The filtered time series is reconstructed and compared with the original time series. A statistical analysis of both high-frequency Nasdaq E-mini Futures and daily Dow Jones data is performed. The results of this new analysis confirm earlier results revealing a robust power-law behaviour in the probability distribution function of the sizes, duration and laminar times between avalanches. This power-law behaviour holds the potential to be established as a stylized fact of stock market indices in general. While the memory process, implied by the power-law distribution of the laminar times, is not consistent with classical models for SOC, we note that a power-law distribution of the laminar times cannot be used to rule out self-organized critical behaviour.
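The power-law fitting step can be sketched as follows (synthetic avalanche sizes with a known exponent stand in for the filtered index data, and the Hill/maximum-likelihood estimator is a common choice, not necessarily the authors'):

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic avalanche sizes with a known power-law tail P(s) ~ s^-alpha,
# drawn by inverting the CDF of a Pareto distribution
alpha_true = 2.5
s_min = 1.0
sizes = s_min * (1.0 - rng.random(20000)) ** (-1.0 / (alpha_true - 1.0))

# maximum-likelihood (Hill) estimator for the tail exponent
alpha_hat = 1.0 + sizes.size / np.sum(np.log(sizes / s_min))
```

The same estimator applied to avalanche sizes, durations, and laminar times would quantify the robust power-law behaviour claimed in the abstract.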
Multiple-parameter bifurcation analysis in a Kuramoto model with time delay and distributed shear
NASA Astrophysics Data System (ADS)
Niu, Ben; Zhang, Jiaming; Wei, Junjie
2018-05-01
In this paper, time delay effects and distributed shear are considered in the Kuramoto model. On the Ott-Antonsen manifold, through analyzing the associated characteristic equation of the reduced functional differential equation, the stability boundary of the incoherent state is derived in multiple-parameter space. Moreover, very rich dynamical behavior such as stability switches inducing synchronization switches can occur in this equation. With the loss of stability, Hopf bifurcating coherent states arise, and the criticality of the Hopf bifurcations is determined by applying the normal form theory and the center manifold theorem. On the one hand, theoretical analysis indicates that the width of the shear distribution and the time delay can both eliminate synchronization and lead the Kuramoto model to incoherence. On the other hand, time delay can induce several coexisting coherent states. Finally, some numerical simulations are given to support the obtained results, where several bifurcation diagrams are drawn and the effect of time delay and shear is discussed.
Inverse Statistics and Asset Allocation Efficiency
NASA Astrophysics Data System (ADS)
Bolgorian, Meysam
In this paper, using inverse statistics analysis, the effect of the investment horizon on the efficiency of portfolio selection is examined. Inverse statistics analysis is a general tool, also known as the probability distribution of exit time, used for detecting the distribution of the time at which a stochastic process exits from a zone. This analysis was used in Refs. 1 and 2 for studying financial returns time series. This distribution provides an optimal investment horizon, which determines the most likely horizon for gaining a specific return. Using samples of stocks from the Tehran Stock Exchange (TSE) as an emerging market and the S&P 500 as a developed market, the effect of the optimal investment horizon on asset allocation is assessed. It is found that taking into account the optimal investment horizon in the TSE leads to more efficiency for large portfolios, while for stocks selected from the S&P 500, regardless of portfolio size, this strategy does not produce more efficient portfolios; instead, longer investment horizons provide more efficiency.
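An empirical exit-time (inverse statistics) distribution can be computed along these lines (the synthetic log-returns and the return threshold are assumptions; the mode of the histogram plays the role of the optimal investment horizon):

```python
import numpy as np

def exit_times(log_returns, rho):
    """First-passage times: for each start index, the number of steps
    until the cumulative log-return first reaches +rho (starts that
    never reach it are ignored)."""
    out = []
    n = len(log_returns)
    for i in range(n):
        c = np.cumsum(log_returns[i:])
        hit = np.nonzero(c >= rho)[0]
        if hit.size:
            out.append(hit[0] + 1)
    return np.array(out)

rng = np.random.default_rng(3)
r = rng.normal(0.0005, 0.01, 5000)   # synthetic daily log-returns
taus = exit_times(r, 0.05)           # times to first gain of +5%

# the most likely exit time is the "optimal investment horizon"
optimal_horizon = np.bincount(taus).argmax()
```

Applied per stock, the resulting horizons could then feed the portfolio-construction step described in the abstract.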
Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution
NASA Technical Reports Server (NTRS)
Zoladz, T. F.; Jones, J. H.; Jong, J.
1992-01-01
A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time-frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high-frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.
On the properties of stochastic intermittency in rainfall processes.
Molini, A; La Barbera, P; Lanza, L G
2002-01-01
In this work we propose a mixed approach to the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictability power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high resolution pluviometric time series covering one decade, recorded at the tipping bucket station of the University of Genova, is analysed by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution both by their position within the event and by their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels, and its satisfactory agreement with a typical extreme value distribution is shown.
Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.
Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal
2012-09-01
While the value of geographic information systems (GIS) is widely applied in public health there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying and visualization. The present report addresses some spatio-analyses in a complex food distribution system and defines influence areas as travel time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research, GIS is used to map the Canadian produce distribution system, analyze accessibility to contaminated product by consumers, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Huang, D.; Wang, G.
2014-12-01
Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by wavelet-packet parameters proposed by Yamamoto and Baker (2013). The wavelet-packet parameters fully characterize ground-motion time histories in terms of energy content, time-frequency-domain characteristics and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground-motion data from eight well-recorded earthquakes in California, Mexico, Japan and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and the cokriging technique, wavelet-packet parameters at unmeasured locations can be best estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrated that the simulated ground motions generally agree well with the actual recorded data if the influence of regional site conditions is considered. The developed method has great potential to be used in computational seismic analysis and loss estimation at a regional scale.
ERIC Educational Resources Information Center
Roxburgh, Susan
2006-01-01
In this article, I examine the distribution of time pressure associated with the roles of marital partner and parent using data from a telephone survey. Results of an analysis of open-ended responses indicate that less than a quarter of respondents are satisfied with the time they spend with their children and spouses. Women are more likely to…
Wang, Xinghu; Hong, Yiguang; Yi, Peng; Ji, Haibo; Kang, Yu
2017-05-24
In this paper, a distributed optimization problem is studied for continuous-time multiagent systems with unknown-frequency disturbances. A distributed gradient-based control is proposed for the agents to achieve the optimal consensus with estimating unknown frequencies and rejecting the bounded disturbance in the semi-global sense. Based on convex optimization analysis and adaptive internal model approach, the exact optimization solution can be obtained for the multiagent system disturbed by exogenous disturbances with uncertain parameters.
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Lovell, Brian; White, Langford
1988-01-01
Time-Frequency analysis based on the Wigner-Ville Distribution (WVD) is shown to be optimal for a class of signals where the variation of instantaneous frequency is the dominant characteristic. Spectral resolution and instantaneous frequency tracking is substantially improved by using a Modified WVD (MWVD) based on an Autoregressive spectral estimator. Enhanced signal-to-noise ratio may be achieved by using 2D windowing in the Time-Frequency domain. The WVD provides a tool for deriving descriptors of signals which highlight their FM characteristics. These descriptors may be used for pattern recognition and data clustering using the methods presented in this paper.
Statistical analysis of flight times for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Graves, M. E.; Perlmutter, M.
1974-01-01
Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
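A Monte Carlo version of such a flight-time study might be sketched as below (the two-state weather chain, transition probabilities, leg count, and one-day hold policy are all hypothetical, not the report's values):

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical two-state weather Markov chain ("persistence of events"):
# state 0 = flyable, 1 = inclement; daily transition probabilities assumed
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def ferry_time(n_legs=4, leg_hours=3.0):
    """One Monte Carlo trial: total trip hours including weather holds."""
    state, hours = 0, 0.0
    for _ in range(n_legs):
        while state == 1:                 # hold one day per inclement day
            hours += 24.0
            state = rng.choice(2, p=P[1])
        hours += leg_hours
        state = rng.choice(2, p=P[0])     # weather evolves after each leg
    return hours

samples = np.array([ferry_time() for _ in range(2000)])
mean_hours = samples.mean()
```

The empirical histogram of `samples` is the statistical flight-time distribution; re-running with altered `P` mimics the sensitivity test of varying constraints.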
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-01-01
Distributed Computing has achieved tremendous development since cloud computing was proposed in 2006, and played a vital role promoting rapid growth of data collecting and analysis models, e.g., Internet of things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation on execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have drawn interests with detailed analysis report being made. According to the results, the prediction accuracy of concurrent tasks’ execution time can be improved, in particular for some regular jobs. PMID:27589753
Effect of lag time distribution on the lag phase of bacterial growth - a Monte Carlo analysis
USDA-ARS?s Scientific Manuscript database
The objective of this study is to use Monte Carlo simulation to evaluate the effect of lag time distribution of individual bacterial cells incubated under isothermal conditions on the development of lag phase. The growth of bacterial cells of the same initial concentration and mean lag phase durati...
Adaptive phase extraction: incorporating the Gabor transform in the matching pursuit algorithm.
Wacker, Matthias; Witte, Herbert
2011-10-01
Short-time Fourier transform (STFT), Gabor transform (GT), wavelet transform (WT), and the Wigner-Ville distribution (WVD) are just some examples of time-frequency analysis methods which are frequently applied in biomedical signal analysis. However, all of these methods have their individual drawbacks. The STFT, GT, and WT have a time-frequency resolution that is determined by algorithm parameters, and the WVD is contaminated by cross terms. In 1993, Mallat and Zhang introduced the matching pursuit (MP) algorithm, which decomposes a signal into a sum of atoms and uses a cross-term-free pseudo-WVD to generate a data-adaptive power distribution in the time-frequency space. Thus, it solved some of the problems of the GT and WT but lacks phase information that is crucial, e.g., for synchronization analysis. We introduce a new time-frequency analysis method that combines the MP with a pseudo-GT. First, the signal is decomposed into a set of Gabor atoms. Afterward, each atom is analyzed with a Gabor analysis, where the time-domain Gaussian window of the analysis matches that of the specific atom envelope. A superposition of the single time-frequency planes gives the final result. This is the first time that a complete analysis of the complex time-frequency plane can be performed in a fully data-adaptive and frequency-selective manner. We demonstrate the capabilities of our approach on a simulation and on real-life magnetoencephalogram data.
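A bare-bones matching pursuit over a tiny Gabor dictionary can illustrate the greedy decomposition step (the dictionary parameters and test signal are invented for the example; a real MP uses a much denser dictionary):

```python
import numpy as np

def gabor_atom(N, t0, f, s):
    """Unit-norm real Gabor atom: Gaussian envelope at t0, scale s, freq f."""
    t = np.arange(N)
    g = np.exp(-0.5 * ((t - t0) / s) ** 2) * np.cos(2 * np.pi * f * t)
    return g / np.linalg.norm(g)

def matching_pursuit(x, atoms, n_iter=3):
    """Greedy MP: repeatedly subtract the best-matching atom."""
    residual = x.copy()
    picks = []
    for _ in range(n_iter):
        corr = atoms @ residual               # inner products with dictionary
        k = np.argmax(np.abs(corr))
        picks.append((k, corr[k]))
        residual = residual - corr[k] * atoms[k]
    return picks, residual

N = 512
# a small hypothetical dictionary: two centres x two frequencies
params = [(128, 0.05, 30.0), (128, 0.20, 30.0),
          (384, 0.05, 30.0), (384, 0.20, 30.0)]
atoms = np.array([gabor_atom(N, *p) for p in params])

# signal built from two known atoms; MP should recover both
signal = 2.0 * atoms[1] + 1.0 * atoms[2]
picks, residual = matching_pursuit(signal, atoms, n_iter=2)
```

The proposed method would then re-analyze each picked atom with a Gabor analysis matched to its envelope, recovering the phase information plain MP discards.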
Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering
NASA Technical Reports Server (NTRS)
Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)
2001-01-01
Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
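The distribution-then-SVD pipeline can be approximated in a few lines (an STFT energy density stands in for the wavelet multiscale stage, and the entropy feature is one possible moment, not necessarily the authors' choice):

```python
import numpy as np
from scipy.signal import stft

# a chirp-like nonstationary test signal (rising instantaneous frequency)
fs = 1024
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * (50 * t + 100 * t ** 2))

# discrete time-frequency energy density distribution
_, _, Z = stft(x, fs=fs, nperseg=128)
E = np.abs(Z) ** 2

# SVD of the density; the normalized singular values form a discrete
# distribution whose moments summarize the principal features
s = np.linalg.svd(E, compute_uv=False)
p = s / s.sum()
entropy = -np.sum(p * np.log(p + 1e-12))   # one candidate feature
```

Features such as this entropy (or higher moments of `p`) could then feed classification or modal-parameter estimation, as described above.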
Noncolocated Time-Reversal MUSIC: High-SNR Distribution of Null Spectrum
NASA Astrophysics Data System (ADS)
Ciuonzo, Domenico; Rossi, Pierluigi Salvo
2017-04-01
We derive the asymptotic distribution of the null spectrum of the well-known Multiple Signal Classification (MUSIC) in its computational Time-Reversal (TR) form. The result pertains to a single-frequency non-colocated multistatic scenario, and several TR-MUSIC variants are investigated. The analysis builds upon the 1st-order perturbation of the singular value decomposition and allows a simple characterization of null-spectrum moments (up to the 2nd order). This enables a comparison in terms of spectrum stability. Finally, a numerical analysis is provided to confirm the theoretical findings.
A statistical analysis of the daily streamflow hydrograph
NASA Astrophysics Data System (ADS)
Kavvas, M. L.; Delleur, J. W.
1984-03-01
In this study a periodic statistical analysis of daily streamflow data in Indiana, U.S.A., was performed to gain some new insight into the stochastic structure which describes the daily streamflow process. This analysis was performed by the periodic mean and covariance functions of the daily streamflows, by the time- and peak-discharge-dependent recession limb of the daily streamflow hydrograph, by the time- and discharge-exceedance-level (DEL)-dependent probability distribution of the hydrograph peak interarrival time, and by the time-dependent probability distribution of the time to peak discharge. Some new statistical estimators were developed and used in this study. In general features, this study has shown that: (a) the persistence properties of daily flows depend on the storage state of the basin at the specified time origin of the flow process; (b) the daily streamflow process is time irreversible; (c) the probability distribution of the daily hydrograph peak interarrival time depends both on the occurrence time of the peak from which the interarrival time originates and on the discharge exceedance level; and (d) if the daily streamflow process is modeled as the release from a linear watershed storage, this release should depend on the state of the storage and on the time of the release, as the persistence properties and the recession limb decay rates were observed to change with the state of the watershed storage and time. Therefore, a time-varying reservoir system needs to be considered if the daily streamflow process is to be modeled as the release from a linear watershed storage.
Phelps, Michael; Latif, Asad; Thomsen, Robert; Slodzinski, Martin; Raghavan, Rahul; Paul, Sharon Leigh; Stonemetz, Jerry
2017-08-01
Use of an anesthesia information management system (AIMS) has been reported to improve accuracy of recorded information. We tested the hypothesis that analyzing the distribution of times charted on paper and computerized records could reveal possible rounding errors, and that this effect could be modulated by differences in the user interface for documenting certain event times with an AIMS. We compared the frequency distribution of start and end times for anesthesia cases completed with paper records and an AIMS. Paper anesthesia records had significantly more times ending with "0" and "5" compared to those from the AIMS (p < 0.001). For case start times, AIMS still exhibited end-digit preference, with times whose last digits had significantly higher frequencies of "0" and "5" than other integers. This effect, however, was attenuated compared to that for paper anesthesia records. For case end times, the distribution of minutes recorded with AIMS was almost evenly distributed, unlike those from paper records that still showed significant end-digit preference. The accuracy of anesthesia case start times and case end times, as inferred by statistical analysis of the distribution of the times, is enhanced with the use of an AIMS. Furthermore, the differences in AIMS user interface for documenting case start and case end times likely affects the degree of end-digit preference, and likely accuracy, of those times.
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
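The Markovian-arrivals material — exponential waiting times and Poisson-distributed counts — can be demonstrated numerically (the rate and window size are arbitrary choices for the example):

```python
import numpy as np

rng = np.random.default_rng(5)

# memoryless (Markovian) arrivals at rate lam: waiting times between
# events are exponential with mean 1/lam
lam = 4.0
waits = rng.exponential(1.0 / lam, size=100000)
arrival_times = np.cumsum(waits)

# counts in consecutive unit-time windows should follow Poisson(lam),
# whose hallmark is mean equal to variance
T = int(arrival_times[-1])
counts = np.histogram(arrival_times, bins=np.arange(T + 1))[0]
mean_c, var_c = counts.mean(), counts.var()
```

The near-equality of `mean_c` and `var_c` is the Poisson signature exploited in the biomedical applications the lecture describes.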
Are anesthesia start and end times randomly distributed? The influence of electronic records.
Deal, Litisha G; Nyland, Michael E; Gravenstein, Nikolaus; Tighe, Patrick
2014-06-01
To perform a frequency analysis of start minute digits (SMD) and end minute digits (EMD) taken from the electronic, computer-assisted, and manual anesthesia billing-record systems. Retrospective cross-sectional review. University medical center. This cross-sectional review was conducted on billing records from a single healthcare institution over a 15-month period. A total of 30,738 cases were analyzed. For each record, the start time and end time were recorded. Distributions of SMD and EMD were tested against the null hypothesis of a frequency distribution equivalently spread between zero and nine. SMD and EMD aggregate distributions each differed from equivalency (P < 0.0001). When stratified by type of anesthetic record, no differences were found between the recorded and expected equivalent distribution patterns for electronic anesthesia records for start minute (P < 0.98) or end minute (P < 0.55). Manual and computer-assisted records maintained nonequivalent distribution patterns for SMD and EMD (P < 0.0001 for each comparison). Comparison of cumulative distributions between SMD and EMD distributions suggested a significant difference between the two patterns (P < 0.0001). An electronic anesthesia record system, with automated time capture of events verified by the user, produces a more unified distribution of billing times than do more traditional methods of entering billing times. Copyright © 2014 Elsevier Inc. All rights reserved.
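The end-digit frequency test described above, comparing last minute digits against a uniform 0-9 expectation, can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual analysis code; the data below are synthetic, and the 16.92 threshold is the standard chi-square critical value for df = 9 at alpha = 0.05:

```python
from collections import Counter

def end_digit_chi2(minutes):
    """Chi-square statistic of last-digit counts against a uniform 0-9 expectation."""
    counts = Counter(m % 10 for m in minutes)
    expected = len(minutes) / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

# Heavily rounded times (all multiples of 5) concentrate mass on digits 0 and 5,
# so the statistic far exceeds the df=9, alpha=0.05 critical value of ~16.92.
rounded = [0, 5, 10, 15, 30, 45, 50, 55, 20, 25] * 30
uniform = list(range(300))  # every last digit occurs equally often
print(end_digit_chi2(rounded))  # 1200.0 -- strong end-digit preference
print(end_digit_chi2(uniform))  # 0.0 -- no preference
```

The same statistic computed per record type (manual, computer-assisted, electronic) would reproduce the kind of comparison the study reports.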
Payne, Brennan R; Stine-Morrow, Elizabeth A L
2014-06-01
We report a secondary data analysis investigating age differences in the effects of clause and sentence wrap-up on reading time distributions during sentence comprehension. Residual word-by-word self-paced reading times were fit to the ex-Gaussian distribution to examine age differences in the effects of clause and sentence wrap-up on both the location and shape of participants' reaction time (RT) distributions. The ex-Gaussian distribution showed good fit to the data in both younger and older adults. Sentence wrap-up increased the central tendency, the variability, and the tail of the distribution, and these effects were exaggerated among the old. In contrast, clause wrap-up influenced the tail of the distribution only, and did so differentially for older adults. Effects were confirmed via nonparametric vincentile plots. Individual differences in visual acuity, working memory, speed of processing, and verbal ability were differentially related to ex-Gaussian parameters reflecting wrap-up effects on underlying reading time distributions. These findings argue against simple pause mechanisms to explain end-of-clause and end-of-sentence reading time patterns; rather, the findings are consistent with a cognitively effortful view of wrap-up and suggest that age and individual differences in attentional allocation to semantic integration during reading, as revealed by RT distribution analyses, play an important role in sentence understanding. PsycINFO Database Record (c) 2014 APA, all rights reserved.
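The ex-Gaussian decomposition used above can be sketched with moment-based estimators (mean, variance, skewness) rather than the maximum-likelihood fitting such studies typically employ; the simulated reading times and parameter values below are hypothetical:

```python
import math
import random

def ex_gaussian_moments(rts):
    """Moment-based estimates of ex-Gaussian parameters (mu, sigma, tau).
    tau = s * (skew/2)^(1/3) captures the exponential tail; the Gaussian
    component follows from mu = mean - tau and sigma^2 = var - tau^2."""
    n = len(rts)
    mean = sum(rts) / n
    var = sum((x - mean) ** 2 for x in rts) / n
    std = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in rts) / (n * std ** 3)
    tau = std * (max(skew, 1e-9) / 2) ** (1 / 3)
    sigma2 = max(var - tau ** 2, 0.0)
    return mean - tau, math.sqrt(sigma2), tau

random.seed(1)
# Hypothetical reading times (ms): Gaussian(400, 50) plus an exponential tail, tau = 150.
# A wrap-up effect confined to the tail would show up as a change in tau only.
rts = [random.gauss(400, 50) + random.expovariate(1 / 150) for _ in range(20000)]
mu, sigma, tau = ex_gaussian_moments(rts)
```

Comparing estimated tau across conditions (clause-final vs. sentence-final words) mirrors the distributional contrast the study draws between tail-only and whole-distribution effects.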
Huang, Chongyang; Zhou, Qi; Gao, Shan; Bao, Qingjia; Chen, Fang; Liu, Chaoyang
2016-01-20
Different ginger cultivars may contain different nutritional and medicinal values. In this study, a time-domain nuclear magnetic resonance method was employed to study water dynamics in different ginger cultivars. Significant differences in transverse relaxation time T2 values assigned to the distribution of water in different parts of the plant were observed between Henan ginger and four other ginger cultivars. Ion concentration and metabolic analysis showed similar differences in Mn ion concentrations and organic solutes among the different ginger cultivars, respectively. On the basis of Pearson's correlation analysis, many organic solutes and 6-gingerol, the main active substance of ginger, exhibited significant correlations with water distribution as determined by NMR T2 relaxation, suggesting that the organic solute differences may impact water distribution. Our work demonstrates that low-field NMR relaxometry provides useful information about water dynamics in different ginger cultivars as affected by the presence of different organic solutes.
Bayesian analysis of volcanic eruptions
NASA Astrophysics Data System (ADS)
Ho, Chih-Hsiang
1990-10-01
The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
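The gamma-mixed Poisson construction described above can be illustrated numerically: drawing each period's eruptive rate λ from a gamma prior and then counting Poisson events reproduces the negative binomial's overdispersion, variance = mean × (1 + scale). This is a generic sketch with arbitrary parameter values, not the paper's fitted model:

```python
import random

def gamma_poisson_counts(shape, scale, n, seed=0):
    """Event counts per unit time when the Poisson rate lambda is itself
    gamma-distributed -- the compound model leading to the negative binomial."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n):
        lam = rng.gammavariate(shape, scale)
        # Poisson count via exponential inter-arrival times within one unit of time
        k, t = 0, rng.expovariate(lam)
        while t < 1.0:
            k += 1
            t += rng.expovariate(lam)
        counts.append(k)
    return counts

counts = gamma_poisson_counts(shape=2.0, scale=3.0, n=50000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Negative binomial moments: mean = shape * scale = 6, variance = mean * (1 + scale) = 24,
# i.e. markedly overdispersed relative to a simple Poisson (variance = mean).
```

With scale → 0 (holding the mean fixed) the gamma prior collapses to a point mass and the simple constant-rate Poisson model is recovered.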
NASA Astrophysics Data System (ADS)
Doebrich, Marcus; Markstaller, Klaus; Karmrodt, Jens; Kauczor, Hans-Ulrich; Eberle, Balthasar; Weiler, Norbert; Thelen, Manfred; Schreiber, Wolfgang G.
2005-04-01
In this study, an algorithm was developed to measure the distribution of pulmonary time constants (TCs) from dynamic computed tomography (CT) data sets during a sudden airway pressure step up. Simulations with synthetic data were performed to test the methodology as well as the influence of experimental noise. Furthermore the algorithm was applied to in vivo data. In five pigs sudden changes in airway pressure were imposed during dynamic CT acquisition in healthy lungs and in a saline lavage ARDS model. The fractional gas content in the imaged slice (FGC) was calculated by density measurements for each CT image. Temporal variations of the FGC were analysed assuming a model with a continuous distribution of exponentially decaying time constants. The simulations proved the feasibility of the method. The influence of experimental noise could be well evaluated. Analysis of the in vivo data showed that in healthy lungs ventilation processes can be more likely characterized by discrete TCs whereas in ARDS lungs continuous distributions of TCs are observed. The temporal behaviour of lung inflation and deflation can be characterized objectively using the described new methodology. This study indicates that continuous distributions of TCs reflect lung ventilation mechanics more accurately compared to discrete TCs.
NASA Technical Reports Server (NTRS)
Gurgiolo, Chris; Vinas, Adolfo F.
2009-01-01
This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
ERIC Educational Resources Information Center
Steinhauser, Marco; Hubner, Ronald
2009-01-01
It has been suggested that performance in the Stroop task is influenced by response conflict as well as task conflict. The present study investigated the idea that both conflict types can be isolated by applying ex-Gaussian distribution analysis which decomposes response time into a Gaussian and an exponential component. Two experiments were…
NASA Astrophysics Data System (ADS)
Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi
2017-07-01
Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: Computer program for beginners; Epilogue; Bibliography; Index.
DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.
Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien
2017-09-01
Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
Optimal distribution of integration time for intensity measurements in Stokes polarimetry.
Li, Xiaobo; Liu, Tiegen; Huang, Bingjing; Song, Zhanjie; Hu, Haofeng
2015-10-19
We consider the typical Stokes polarimetry system, which performs four intensity measurements to estimate a Stokes vector. We show that if the total integration time of intensity measurements is fixed, the variance of the Stokes vector estimator depends on the distribution of the integration time among the four intensity measurements. Therefore, by optimizing the distribution of integration time, the variance of the Stokes vector estimator can be decreased. In this paper, we obtain the closed-form solution of the optimal distribution of integration time by employing the Lagrange multiplier method. According to the theoretical analysis and a real-world experiment, it is shown that the total variance of the Stokes vector estimator can be significantly decreased, by about 40%, in the case discussed in this paper. The method proposed in this paper can effectively decrease the measurement variance and thus statistically improve the measurement accuracy of the polarimetric system.
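The paper's closed form depends on its specific polarimetric noise model, but the Lagrange-multiplier structure can be sketched for a generic model in which measurement i contributes variance w_i / t_i: minimizing Σ w_i/t_i subject to Σ t_i = T gives t_i ∝ √w_i. The weights below are hypothetical:

```python
import math

def optimal_times(weights, total_time):
    """Allocate integration times t_i to minimize sum(w_i / t_i) subject to
    sum(t_i) = total_time; the Lagrange-multiplier solution is t_i ∝ sqrt(w_i)."""
    roots = [math.sqrt(w) for w in weights]
    s = sum(roots)
    return [total_time * r / s for r in roots]

def total_variance(weights, times):
    return sum(w / t for w, t in zip(weights, times))

w = [1.0, 4.0, 4.0, 9.0]          # hypothetical per-measurement noise weights
t_opt = optimal_times(w, total_time=1.0)
t_eq = [0.25] * 4                  # naive equal split of the total time
print(total_variance(w, t_opt))    # 64.0 -- the optimum, equal to (sum of sqrt(w_i))^2
print(total_variance(w, t_eq))     # 72.0 -- equal split is strictly worse
```

The gain over the equal split grows with the spread of the weights, which is consistent with the abstract's point that variance reduction comes entirely from how a fixed time budget is distributed.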
Interevent time distributions of human multi-level activity in a virtual world
NASA Astrophysics Data System (ADS)
Mryglod, O.; Fuchs, B.; Szell, M.; Holovatch, Yu.; Thurner, S.
2015-02-01
Studying human behavior in virtual environments provides extraordinary opportunities for a quantitative analysis of social phenomena with levels of accuracy that approach those of the natural sciences. In this paper we use records of player activities in the massive multiplayer online game Pardus over 1238 consecutive days, and analyze dynamical features of sequences of actions of players. We build on previous work where temporal structures of human actions of the same type were quantified, and provide an empirical understanding of human actions of different types. This study of multi-level human activity can be seen as a dynamic counterpart of static multiplex network analysis. We show that the interevent time distributions of actions in the Pardus universe follow highly non-trivial distribution functions, from which we extract action-type specific characteristic 'decay constants'. We discuss characteristic features of interevent time distributions, including periodic patterns on different time scales, bursty dynamics, and various functional forms on different time scales. We comment on gender differences of players in emotional actions, and find that while males and females act similarly when performing some positive actions, females are slightly faster for negative actions. We also observe effects of player age: more experienced players are generally faster in making decisions about engaging in and terminating enmity and friendship, respectively.
The distribution of first-passage times and durations in FOREX and future markets
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: the Weibull distribution with a power-law tail. This distribution compensates for the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory.
Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting time. We find that our distribution is applicable as long as durations follow a Weibull law for short times and do not have too heavy a tail.
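The renewal-theory prediction for the average waiting time mentioned above can be sketched for the pure Weibull case: for i.i.d. durations, the mean residual life at a random inspection time is E[T²] / (2 E[T]), and the Weibull moments E[Tⁿ] = scaleⁿ · Γ(1 + n/shape) give it in closed form. The parameter values are illustrative, not fitted to any market data:

```python
import math

def weibull_mean_waiting_time(shape, scale):
    """Renewal-theory mean waiting time (residual life) for i.i.d. Weibull
    durations: E[W] = E[T^2] / (2 * E[T]), with E[T^n] = scale^n * Gamma(1 + n/shape)."""
    m1 = scale * math.gamma(1 + 1 / shape)
    m2 = scale ** 2 * math.gamma(1 + 2 / shape)
    return m2 / (2 * m1)

# Exponential case (shape = 1) is memoryless: waiting time equals the mean duration.
print(weibull_mean_waiting_time(1.0, 10.0))  # 10.0
# A heavier-tailed Weibull (shape < 1) inflates the waiting time well past the mean,
# which is why the tail behavior matters so much for the statistics discussed above.
print(weibull_mean_waiting_time(0.6, 10.0))
```

Attaching a power-law tail, as the abstract proposes, would further modify E[T²] and hence the predicted waiting time; the pure-Weibull formula is the short-duration baseline.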
The role of local heterogeneity in transport through steep hillslopes.
NASA Astrophysics Data System (ADS)
Fiori, A.; Russo, D.
2009-04-01
A stochastic model is developed for the analysis of the travel time distribution in a hillslope. The latter is represented as a system made up of a highly permeable soil underlain by a less permeable subsoil or bedrock. The heterogeneous hydraulic conductivity K is described as a stationary random space function. The travel time distribution is obtained through a stochastic Lagrangian model of transport, after adopting a first-order approximation in the logconductivity variance. The results show that the travel time pdf pertaining to the soil is power-law, with exponent variable between -1 and -0.5; the behavior is mainly determined by unsaturated transport. The subsoil is mainly responsible for the tail of the travel time distribution. Analysis of the first and second moments of travel time shows that the spreading of solute is controlled by the variations in the flow-paths (geomorphological dispersion), which depend on the hillslope geometry. Conversely, the contribution of the K heterogeneity to spreading appears less relevant. The model is tested against a detailed three-dimensional numerical simulation with reasonably good agreement.
Nonlinear Reduced-Order Analysis with Time-Varying Spatial Loading Distributions
NASA Technical Reports Server (NTRS)
Przekop, Adam
2008-01-01
Oscillating shocks acting in combination with high-intensity acoustic loadings present a challenge to the design of resilient hypersonic flight vehicle structures. This paper addresses some features of this loading condition and certain aspects of a nonlinear reduced-order analysis with emphasis on system identification leading to formation of a robust modal basis. The nonlinear dynamic response of a composite structure subject to the simultaneous action of locally strong oscillating pressure gradients and high-intensity acoustic loadings is considered. The reduced-order analysis used in this work has been previously demonstrated to be both computationally efficient and accurate for time-invariant spatial loading distributions, provided that an appropriate modal basis is used. The challenge of the present study is to identify a suitable basis for loadings with time-varying spatial distributions. Using a proper orthogonal decomposition and modal expansion, it is shown that such a basis can be developed. The basis is made more robust by incrementally expanding it to account for changes in the location, frequency and span of the oscillating pressure gradient.
2003-04-01
…Wigner-Ville Distribution (WVD) of the signal. This distribution is a signal representation consisting in the mapping of the… The aim of this section is to show how time-frequency representation by WVD of the echoes received by a SAR provides a… "…frequency analysis by Wigner-Ville distribution". IEE Proc., Pt. F, Vol. 139, No. 1, February 1992, pp. 89-97. [BFA94] S. Barbarossa, A. …
NASA Astrophysics Data System (ADS)
Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.
2005-03-01
Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data previous to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters like occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.
Hougaard, P; Lee, M L; Whitmore, G A
1997-12-01
Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)
NASA Astrophysics Data System (ADS)
Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter
2013-04-01
Earthquakes exhibit complex phenomenology that is revealed from the fractal structure in space, time and magnitude. For that reason other tools rather than the simple Poissonian statistics seem more appropriate to describe the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics [NESP] to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with an important continental N-S extension and high seismicity rates. The NESP concept refers to the non-additive Tsallis entropy Sq, which includes Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrences of earthquakes. Furthermore, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution through different time windows along the evolution of the seismic activity at the west Corinth rift. The results of this analysis reveal a different level of organization and clusterization of the seismic activity in time. Acknowledgments. GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
14 CFR 417.209 - Malfunction turn analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... nozzle burn-through. For each cause of a malfunction turn, the analysis must establish the launch vehicle... the launch vehicle's turning capability in the event of a malfunction during flight. A malfunction... launch vehicle is capable. (4) The time, as a single value or a probability time distribution, when each...
14 CFR 417.209 - Malfunction turn analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... nozzle burn-through. For each cause of a malfunction turn, the analysis must establish the launch vehicle... the launch vehicle's turning capability in the event of a malfunction during flight. A malfunction... launch vehicle is capable. (4) The time, as a single value or a probability time distribution, when each...
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
2017-03-23
…sector, “it costs about .5 times the annual salary plus benefits to replace an hourly worker, 1.5 times the annual salary plus benefits to replace a salaried employee, and as much as 5 times the annual salary plus benefits to replace an executive” (Colquitt, Lepine, & Wesson, 2011). Direct costs … DISTRIBUTION STATEMENT A. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. The views expressed in this thesis are those of the …
Using response time distributions to examine top-down influences on attentional capture.
Burnham, Bryan R
2013-02-01
Three experiments examined contingent attentional capture, which is the finding that cuing effects are larger when cues are perceptually similar to a target than when they are dissimilar to the target. This study also analyzed response times (RTs) in terms of the underlying distributions for valid cues and invalid cues. Specifically, an ex-Gaussian analysis and a vincentile analysis examined the influence of top-down attentional control settings on the shift and skew of RT distributions and how the shift and the skew contributed to the cuing effects in the mean RTs. The results showed that cue/target similarity influenced the size of cuing effects. The RT distribution analyses showed that the cuing effects reflected only a shifting effect, not a skewing effect, in the RT distribution between valid cues and invalid cues. That is, top-down attentional control moderated the cuing effects in the mean RTs through distribution shifting, not distribution skewing. The results support the contingent orienting hypothesis (Folk, Remington, & Johnston, Journal of Experimental Psychology: Human Perception and Performance, 18, 1030-1044, 1992) over the attentional disengagement account (Theeuwes, Atchley, & Kramer, 2000) as an explanation for when top-down attentional settings influence the selection of salient stimuli.
Defense Applications of Signal Processing
1999-08-27
…class of multiscale autoregressive moving average (MARMA) processes. These are generalisations of ARMA models in time series analysis, and they contain… …including the two theoretical sinusoidal components. Analysis of the amplitude and frequency time series provided some novel insight into the real… …communication channels, underwater acoustic signals, radar systems, economic time series and biomedical signals [7]. The alpha stable (αS) distribution has…
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are now applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. In doing so, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws. We use the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time as compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived very accurately and parameterized, which allows for much more precise tests on nonlinearities.
Analysis of cardiac signals using spatial filling index and time-frequency domain
Faust, Oliver; Acharya U, Rajendra; Krishnan, SM; Min, Lim Choo
2004-01-01
Background Analysis of heart rate variation (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). HRV analysis is based on the concept that fast fluctuations may specifically reflect changes of sympathetic and vagal activity. It shows that the structure generating the signal is not simply linear, but also involves nonlinear contributions. These signals are essentially non-stationary; they may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random in the time scale. However, to study and pinpoint abnormalities in voluminous data collected over several hours is strenuous and time-consuming. Methods This paper presents the spatial filling index and time-frequency analysis of the heart rate variability signal for disease identification. Renyi's entropy is evaluated for the signal in the Wigner-Ville and Continuous Wavelet Transformation (CWT) domains. Results Renyi's entropy gives a lower p value for the scalogram than for the Wigner-Ville distribution, and the contours of the scalogram visually show the features of the diseases; thus, in the time-frequency analysis, the scalogram gives better results than the Wigner-Ville distribution. Conclusion The spatial filling index and Renyi's entropy show distinct regions for the various diseases, with an accuracy of more than 95%. PMID:15361254
Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C
2013-06-01
Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed with a p value of <0.0001 and R(2) of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effect of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe placed below the Comil® generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, and Flowability Index (p value > 0.05).
IUTAM Symposium on Statistical Energy Analysis, 8-11 July 1997, Programme
1997-01-01
Distribution is unlimited. This was the first international scientific gathering devoted to statistical energy analysis (SEA). Keywords: energy flow, continuum dynamics, vibrational energy, statistical energy analysis (SEA).
Erdeljić, Viktorija; Francetić, Igor; Bošnjak, Zrinka; Budimir, Ana; Kalenić, Smilja; Bielen, Luka; Makar-Aušperger, Ksenija; Likić, Robert
2011-05-01
The relationship between antibiotic consumption and selection of resistant strains has been studied mainly by employing conventional statistical methods. A time delay in effect must be anticipated and this has rarely been taken into account in previous studies. Therefore, distributed lags time series analysis and simple linear correlation were compared in their ability to evaluate this relationship. Data on monthly antibiotic consumption for ciprofloxacin, piperacillin/tazobactam, carbapenems and cefepime as well as Pseudomonas aeruginosa susceptibility were retrospectively collected for the period April 2006 to July 2007. Using distributed lags analysis, a significant temporal relationship was identified between ciprofloxacin, meropenem and cefepime consumption and the resistance rates of P. aeruginosa isolates to these antibiotics. This effect was lagged for ciprofloxacin and cefepime [1 month (R=0.827, P=0.039) and 2 months (R=0.962, P=0.001), respectively] and was simultaneous for meropenem (lag 0, R=0.876, P=0.002). Furthermore, a significant concomitant effect of meropenem consumption on the appearance of multidrug-resistant P. aeruginosa strains (resistant to three or more representatives of classes of antibiotics) was identified (lag 0, R=0.992, P<0.001). This effect was not delayed and it was therefore identified both by distributed lags analysis and the Pearson's correlation coefficient. Correlation coefficient analysis was not able to identify relationships between antibiotic consumption and bacterial resistance when the effect was delayed. These results indicate that the use of diverse statistical methods can yield significantly different results, thus leading to the introduction of possibly inappropriate infection control measures. Copyright © 2010 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.
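A minimal stand-in for the lagged-correlation idea behind distributed lags analysis: correlate resistance rates against consumption shifted back by 0..k months and identify the strongest lag. The toy monthly series below are synthetic, with a two-month delay built in for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

def lagged_correlations(consumption, resistance, max_lag=3):
    """Pearson r of resistance vs. consumption shifted back by 0..max_lag
    months; a simplified stand-in for a full distributed lags model."""
    out = {}
    for lag in range(max_lag + 1):
        x = consumption[:len(consumption) - lag] if lag else consumption
        r, p = pearsonr(x, resistance[lag:])
        out[lag] = (r, p)
    return out

rng = np.random.default_rng(1)
n = 24
use = rng.normal(100.0, 10.0, n)                   # monthly consumption (e.g. DDD)
res = np.empty(n)
res[2:] = use[:-2] + rng.normal(0.0, 1.0, n - 2)   # resistance follows use with a 2-month lag
res[:2] = rng.normal(100.0, 10.0, 2)               # pad the first two months

corrs = lagged_correlations(use, res)
best = max(corrs, key=lambda k: corrs[k][0])
print("strongest correlation at lag", best)
```

A simple Pearson correlation at lag 0 would miss the delayed relationship, which is the abstract's central point.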
Simple 2.5 GHz time-bin quantum key distribution
NASA Astrophysics Data System (ADS)
Boaron, Alberto; Korzh, Boris; Houlmann, Raphael; Boso, Gianluca; Rusca, Davide; Gray, Stuart; Li, Ming-Jun; Nolan, Daniel; Martin, Anthony; Zbinden, Hugo
2018-04-01
We present a 2.5 GHz quantum key distribution setup with the emphasis on a simple experimental realization. It features a three-state time-bin protocol based on a pulsed diode laser and a single intensity modulator. Implementing an efficient one-decoy scheme and finite-key analysis, we achieve record-breaking secret key rates of 1.5 kbps over 200 km of standard optical fibers.
Colour cyclic code for Brillouin distributed sensors
NASA Astrophysics Data System (ADS)
Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne
2015-09-01
For the first time, colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code provides an additional gain of √2 while keeping the same number of sequences as colour coding. A comparison with a standard BOTDA sensor is performed and validates the theoretical coding gain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaunaurd, G.; Strifors, H.C.
1996-09-01
Time series data have been traditionally analyzed in either the time or the frequency domain. For signals with a time-varying frequency content, the combined time-frequency (TF) representations, based on the Cohen class of (generalized) Wigner distributions (WDs), offer a powerful analysis tool. Using them, it is possible to: (1) trace the time evolution of the resonance features usually present in a standard sonar cross section (SCS) or in a radar cross section (RCS); and (2) extract target information that may be difficult to even notice in an ordinary SCS or RCS. After a brief review of the fundamental properties of the WD, the authors discuss ways to reduce or suppress the cross-term interference that appears in the WD of multicomponent systems. These points are illustrated with a variety of three-dimensional (3-D) plots of Wigner and pseudo-Wigner distributions (PWD), in which the strength of the distribution is depicted as the height of a Wigner surface, with height scales measured by various color shades or pseudocolors. The authors also review studies they have made of the echoes returned by conducting or dielectric targets in the atmosphere when they are illuminated by broadband radar pings. A TF-domain analysis of these impulse radar returns demonstrates their superior informative content. These plots allow the identification of targets in an easier and clearer fashion than by the conventional RCS of narrowband systems. The authors show computed and measured plots of WD and PWD of various types of aircraft to illustrate the classification advantages of the approach at any aspect angle. They also show analogous results for metallic objects buried underground, in dielectric media, at various depths.
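A bare-bones discrete Wigner-Ville distribution can be sketched as below. With no smoothing applied, the cross-term interference discussed above remains visible for multicomponent signals (here, near the transition between the two tones). The test signal and sampling rate are invented for the demonstration.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal.
    Rows of the result are frequency bins (f = k * fs / (2N)), columns
    are time samples. No smoothing: cross terms are left visible."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    W = np.empty((n, n))
    for t in range(n):
        taumax = min(t, n - 1 - t)
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(n, dtype=complex)
        kernel[tau % n] = x[t + tau] * np.conj(x[t - tau])
        W[t] = np.fft.fft(kernel).real
    return W.T

fs = 128.0
t = np.arange(256) / fs
# Two-tone test signal: 10 Hz for the first second, 30 Hz afterwards
x = np.where(t < 1.0, np.exp(2j * np.pi * 10 * t), np.exp(2j * np.pi * 30 * t))
W = wigner_ville(x)
print(W.shape)
```

Smoothing this distribution in time and/or frequency (pseudo-Wigner, or other Cohen-class kernels) is what reduces the interference terms the authors describe.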
NASA Astrophysics Data System (ADS)
Godsey, S. E.; Kirchner, J. W.
2008-12-01
The mean residence time - the average time that it takes rainfall to reach the stream - is a basic parameter used to characterize catchment processes. Heterogeneities in these processes lead to a distribution of travel times around the mean residence time. By examining this travel time distribution, we can better predict catchment response to contamination events. A catchment system with shorter residence times or narrower distributions will respond quickly to contamination events, whereas systems with longer residence times or longer-tailed distributions will respond more slowly to those same contamination events. The travel time distribution of a catchment is typically inferred from time series of passive tracers (e.g., water isotopes or chloride) in precipitation and streamflow. Variations in the tracer concentration in streamflow are usually damped compared to those in precipitation, because precipitation inputs from different storms (with different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is represented by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. Because convolution in the time domain is equivalent to multiplication in the frequency domain, it is relatively straightforward to estimate the parameters of the travel time distribution in either domain. In the time domain, the parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. In the frequency domain, the travel time distribution parameters can be estimated by fitting a power-law curve to the ratio of precipitation spectral power to stream spectral power. Differences between the methods of parameter estimation in the time and frequency domain mean that these two methods may respond differently to variations in data quality, record length and sampling frequency. 
Here we evaluate how well these two methods of travel time parameter estimation respond to different sources of uncertainty and compare the methods to one another. We do this by generating synthetic tracer input time series of different lengths and convolving these with specified travel-time distributions to generate synthetic output time series. We then sample both the input and output time series at various sampling intervals and corrupt the time series with realistic error structures. Using these 'corrupted' time series, we infer the apparent travel time distribution and compare it to the known distribution that was used to generate the synthetic data in the first place. This analysis allows us to quantify how different record lengths, sampling intervals, and error structures in the tracer measurements affect the apparent mean residence time and the apparent shape of the travel time distribution.
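The convolution step described above can be sketched as follows, assuming (purely for illustration) an exponential travel time distribution with a 30-day mean residence time and daily time steps:

```python
import numpy as np

# Exponential travel time distribution, mean residence time 30 days,
# discretised at daily steps (values are assumptions for the demo)
dt, tmax, mrt = 1.0, 365.0, 30.0
tau = np.arange(0.0, tmax, dt)
ttd = np.exp(-tau / mrt) / mrt * dt        # discrete TTD, sums to ~1

rng = np.random.default_rng(0)
c_in = rng.normal(0.0, 1.0, 730)           # synthetic precipitation tracer anomalies

# Stream tracer output = convolution of input with the TTD
c_out = np.convolve(c_in, ttd)[:len(c_in)]
print(f"input std {c_in.std():.2f}, output std {c_out.std():.2f}")  # damped
</```

The reduced standard deviation of the output reproduces the damping of tracer variations that the abstract attributes to mixing within the catchment.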
2009-12-01
events. Work associated with aperiodic tasks have the same statistical behavior and the same timing requirements. The timing deadlines are soft. • Sporadic...answers, but it is possible to calculate how precise the estimates are. Simulation-based performance analysis of a model includes a statistical ...to evaluate all pos- sible states in a timely manner. This is the principle reason for resorting to simulation and statistical analysis to evaluate
NASA Astrophysics Data System (ADS)
Pedretti, Daniele
2017-04-01
Power-law (PL) distributions are widely adopted to define the late-time scaling of solute breakthrough curves (BTCs) during transport experiments in highly heterogeneous media. However, from a statistical perspective, distinguishing between a PL distribution and another tailed distribution is difficult, particularly when a qualitative assessment based on visual analysis of double-logarithmic plotting is used. This presentation discusses the results of a recent analysis in which a suite of statistical tools was applied to rigorously evaluate the scaling of BTCs from experiments that generate tailed distributions typically described as PL at late time. To this end, a set of BTCs from numerical simulations in highly heterogeneous media was generated using a transition probability approach (T-PROGS) coupled to a finite-difference numerical solver of the flow equation (MODFLOW) and a random walk particle tracking approach for Lagrangian transport (RW3D). The T-PROGS fields assumed randomly distributed hydraulic heterogeneities with long correlation scales, creating solute channeling and anomalous transport. For simplicity, transport was simulated as purely advective. This combination of tools generates strongly non-symmetric BTCs visually resembling PL distributions at late time when plotted in double-log scales. Unlike other combinations of modeling parameters and boundary conditions (e.g., matrix diffusion in fractures), at late time no direct link exists between the mathematical functions describing the scaling of these curves and the physical parameters controlling transport. The results suggest that the statistical tests fail to describe the majority of curves as PL distributed. Moreover, they suggest that PL and lognormal distributions have the same likelihood of representing parametrically the shape of the tails. 
It is noticeable that forcing a model to reproduce the tail as a PL function results in a distribution of PL slopes between 1.2 and 4, which are the typical values observed during field experiments. We conclude that care must be taken when defining a BTC late-time distribution as a power-law function. Even though the estimated scaling factors fall in traditional ranges, the actual distribution controlling the scaling of concentration may differ from a power-law function, with direct consequences, for instance, for the selection of effective parameters in upscaling modeling solutions.
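One simple quantitative alternative to visual log-log assessment is to compare the maximized log-likelihoods of a power-law (Pareto) fit and a lognormal fit to the tail samples. The sketch below is a simplified stand-in for the rigorous tests referenced above; the synthetic sample, threshold, and fitting choices are assumptions.

```python
import numpy as np
from scipy import stats

def tail_loglik_ratio(sample, xmin):
    """Difference of maximized log-likelihoods, power-law (Pareto) minus
    lognormal, for tail samples >= xmin. Positive favours the power law.
    A simplified stand-in for rigorous tail tests (e.g. Clauset-style)."""
    t = sample[sample >= xmin]
    b, _, _ = stats.pareto.fit(t, floc=0, fscale=xmin)
    ll_pl = stats.pareto.logpdf(t, b, loc=0, scale=xmin).sum()
    s, _, scale = stats.lognorm.fit(t, floc=0)
    ll_ln = stats.lognorm.logpdf(t, s, loc=0, scale=scale).sum()
    return float(ll_pl - ll_ln)

# Synthetic "BTC tail" drawn from a lognormal, which often looks linear
# in a double-log plot and can be mistaken for a power law
rng = np.random.default_rng(2)
sample = stats.lognorm.rvs(1.0, scale=10.0, size=2000, random_state=rng)
print(tail_loglik_ratio(sample, xmin=10.0))
```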
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduced an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, for which given flow amounts are accumulated in shorter times. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
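IAT sampling can be sketched by inverting the cumulative flow volume at fixed flow-amount increments. A synthetic sinusoidal hydrograph is assumed below for illustration:

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Times to accumulate each successive fixed flow amount, from a
    regularly sampled flow series (t: times, q: flow rates)."""
    step = t[1] - t[0]
    cum = np.cumsum(q) * step                    # cumulative flow volume
    levels = np.arange(amount, cum[-1], amount)  # successive amount thresholds
    crossings = np.interp(levels, cum, t)        # when each threshold is reached
    return np.diff(crossings, prepend=t[0])

t = np.arange(0.0, 100.0, 1.0)                   # hours (synthetic)
q = 1.0 + 0.9 * np.sin(2 * np.pi * t / 50.0)     # synthetic hydrograph, always > 0
iats = inter_amount_times(t, q, amount=5.0)
print(f"{iats.min():.2f} .. {iats.max():.2f} h")  # short IATs at high flow, long at low flow
```

High-flow periods produce many short IATs and low-flow periods few long ones, which is exactly the reweighting of the distribution tails described in the abstract.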
Analysis of Digital Communication Signals and Extraction of Parameters.
1994-12-01
Fast Fourier Transform (FFT). The correlation methods utilize modified time-frequency distributions, one of which is based on the Wigner-Ville Distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNRs).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, W.; Quinn, B.; Crnkovic, J. D.
Beam dynamics plays an important role in achieving unprecedented precision in the measurement of the muon anomalous magnetic moment in the Fermilab Muon g-2 Experiment. The muon momentum distribution in the storage ring must be determined in order to evaluate the electric field correction to the muon anomalous precession frequency. We show how to use the time evolution of the beam bunch structure to extract the muon momentum distribution by applying a fast rotation analysis to the decay electron signals.
Cryptographic robustness of a quantum cryptography system using phase-time coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molotkov, S. N.
2008-01-15
A cryptographic analysis is presented of a new quantum key distribution protocol using phase-time coding. An upper bound is obtained for the error rate that guarantees secure key distribution. It is shown that the maximum tolerable error rate for this protocol depends on the counting rate in the control time slot. When no counts are detected in the control time slot, the protocol guarantees secure key distribution if the bit error rate in the sifted key does not exceed 50%. This protocol partially discriminates between errors due to system defects (e.g., imbalance of a fiber-optic interferometer) and eavesdropping. In the absence of eavesdropping, the counts detected in the control time slot are not caused by interferometer imbalance, which reduces the requirements for interferometer stability.
Asquith, William H.; Roussel, Meghan C.; Cleveland, Theodore G.; Fang, Xing; Thompson, David B.
2006-01-01
The design of small runoff-control structures, from simple floodwater-detention basins to sophisticated best-management practices, requires the statistical characterization of rainfall as a basis for cost-effective, risk-mitigated, hydrologic engineering design. The U.S. Geological Survey, in cooperation with the Texas Department of Transportation, has developed a framework to estimate storm statistics including storm interevent times, distributions of storm depths, and distributions of storm durations for eastern New Mexico, Oklahoma, and Texas. The analysis is based on hourly rainfall recorded by the National Weather Service. The database contains more than 155 million hourly values from 774 stations in the study area. Seven sets of maps depicting ranges of mean storm interevent time, mean storm depth, and mean storm duration, by county, as well as tables listing each of those statistics, by county, were developed. The mean storm interevent time is used in probabilistic models to assess the frequency distribution of storms. The Poisson distribution is suggested to model the distribution of storm occurrence, and the exponential distribution is suggested to model the distribution of storm interevent times. The four-parameter kappa distribution is judged as an appropriate distribution for modeling the distribution of both storm depth and storm duration. Preference for the kappa distribution is based on interpretation of L-moment diagrams. Parameter estimates for the kappa distributions are provided. Separate dimensionless frequency curves for storm depth and duration are defined for eastern New Mexico, Oklahoma, and Texas. Dimension is restored by multiplying curve ordinates by the mean storm depth or mean storm duration to produce quantile functions of storm depth and duration. Minimum interevent time and location have slight influence on the scale and shape of the dimensionless frequency curves. 
Ten example problems and solutions to possible applications are provided.
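The suggested exponential/Poisson pairing can be illustrated numerically: exponential interevent times imply Poisson-distributed storm counts per fixed window, so count mean and variance should agree. The 72-hour mean interevent time below is an invented value, not one of the report's statistics.

```python
import numpy as np

rng = np.random.default_rng(3)
mean_iet = 72.0                         # assumed mean storm interevent time (hours)
iet = rng.exponential(mean_iet, 5000)   # exponential interevent times

# Exponential interevent times imply Poisson-distributed storm counts
# per fixed window, so the count mean and variance should agree.
arrivals = np.cumsum(iet)
window = 30.0 * 24.0                    # 30-day counting windows
edges = np.arange(0.0, arrivals[-1], window)
counts = np.histogram(arrivals, bins=edges)[0]
print(f"estimated mean IET: {iet.mean():.1f} h")
print(f"count mean {counts.mean():.2f}, count variance {counts.var():.2f}")
```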
Coalescence computations for large samples drawn from populations of time-varying sizes
Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek
2017-01-01
We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for coalescent with large sample size. The obtained results are based on computational methodologies, which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluation of accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for analysis of large human mitochondrial DNA dataset. PMID:28170404
Modeling of Engine Parameters for Condition-Based Maintenance of the MTU Series 2000 Diesel Engine
2016-09-01
are suitable. To model the behavior of the engine, an autoregressive distributed lag (ARDL) time series model of engine speed and exhaust gas temperature is derived. The lag length for ARDL is determined by whitening of residuals...
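A minimal ARDL fit can be sketched with ordinary least squares on lagged regressors. The lag lengths and ground-truth coefficients below are invented; a real analysis would choose the lag lengths by whitening the residuals, as the report describes.

```python
import numpy as np

def fit_ardl(y, x, p=2, q=2):
    """Least-squares fit of a minimal ARDL(p, q) model:
    y_t = c + sum_i a_i y_{t-i} + sum_j b_j x_{t-j} + e_t.
    Returns [c, a_1..a_p, b_0..b_q]. Lag lengths are fixed here."""
    k = max(p, q)
    rows = [[1.0] + [y[t - i] for i in range(1, p + 1)]
            + [x[t - j] for j in range(0, q + 1)]
            for t in range(k, len(y))]
    coef, *_ = np.linalg.lstsq(np.array(rows), y[k:], rcond=None)
    return coef

# Synthetic ground truth: y_t = 0.5 y_{t-1} + 0.8 x_{t-1} + noise
rng = np.random.default_rng(4)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.05 * rng.normal()
print(np.round(fit_ardl(y, x), 2))
```

The recovered coefficients approximate the generating values (0.5 on the lagged response, 0.8 on the once-lagged input), with the remaining terms near zero.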
Contact Time in Random Walk and Random Waypoint: Dichotomy in Tail Distribution
NASA Astrophysics Data System (ADS)
Zhao, Chen; Sichitiu, Mihail L.
Contact time (or link duration) is a fundamental factor that affects performance in Mobile Ad Hoc Networks. Previous research on the theoretical analysis of contact time distributions for random walk (RW) models assumes that contact events can be modeled as either consecutive random walks or direct traversals, which are two extreme cases of random walk, thus reaching two different conclusions. In this paper we conduct comprehensive research on this topic in the hope of bridging the gap between the two extremes. The conclusions from the two extreme cases imply a power-law or exponential tail in the contact time distribution, respectively. However, we show that the actual distribution varies between the two extremes: a power-law-sub-exponential dichotomy, whose transition point depends on the average flight duration. Through simulation results we show that this conclusion also applies to random waypoint.
Real time thermal imaging for analysis and control of crystal growth by the Czochralski technique
NASA Technical Reports Server (NTRS)
Wargo, M. J.; Witt, A. F.
1992-01-01
A real time thermal imaging system with temperature resolution better than +/- 0.5 C and spatial resolution of better than 0.5 mm has been developed. It has been applied to the analysis of melt surface thermal field distributions in both Czochralski and liquid encapsulated Czochralski growth configurations. The sensor can provide single/multiple point thermal information; a multi-pixel averaging algorithm has been developed which permits localized, low noise sensing and display of optical intensity variations at any location in the hot zone as a function of time. Temperature distributions are measured by extraction of data along a user selectable linear pixel array and are simultaneously displayed, as a graphic overlay, on the thermal image.
Particle sizing of pharmaceutical aerosols via direct imaging of particle settling velocities.
Fishler, Rami; Verhoeven, Frank; de Kruijf, Wilbur; Sznitman, Josué
2018-02-15
We present a novel method for characterizing in near real-time the aerodynamic particle size distributions from pharmaceutical inhalers. The proposed method is based on direct imaging of airborne particles followed by a particle-by-particle measurement of settling velocities using image analysis and particle tracking algorithms. Due to the simplicity of the principle of operation, this method has the potential of circumventing potential biases of current real-time particle analyzers (e.g. Time of Flight analysis), while offering a cost effective solution. The simple device can also be constructed in laboratory settings from off-the-shelf materials for research purposes. To demonstrate the feasibility and robustness of the measurement technique, we have conducted benchmark experiments whereby aerodynamic particle size distributions are obtained from several commercially-available dry powder inhalers (DPIs). Our measurements yield size distributions (i.e. MMAD and GSD) that are closely in line with those obtained from Time of Flight analysis and cascade impactors suggesting that our imaging-based method may embody an attractive methodology for rapid inhaler testing and characterization. In a final step, we discuss some of the ongoing limitations of the current prototype and conceivable routes for improving the technique. Copyright © 2017 Elsevier B.V. All rights reserved.
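The settling-velocity-to-size conversion underlying the method can be sketched with Stokes' law (neglecting the slip correction), where the aerodynamic diameter is defined relative to unit density (1000 kg/m³):

```python
import math

def aerodynamic_diameter(v_settle, mu=1.81e-5, rho0=1000.0, g=9.81):
    """Aerodynamic diameter (m) from a settling velocity (m/s), inverting
    Stokes' law v = rho0 * d^2 * g / (18 * mu). Small-Reynolds-number
    regime; the Cunningham slip correction is neglected in this sketch."""
    return math.sqrt(18.0 * mu * v_settle / (rho0 * g))

# Round trip: a 3 um aerodynamic-diameter particle settles at ~0.27 mm/s
v = 1000.0 * 9.81 * (3e-6) ** 2 / (18.0 * 1.81e-5)
d = aerodynamic_diameter(v)
print(f"settling velocity {v * 1000:.3f} mm/s -> diameter {d * 1e6:.2f} um")
```

Applying this conversion particle by particle to tracked settling velocities yields the aerodynamic size distribution (MMAD, GSD) that the abstract compares against Time of Flight and cascade impactor results.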
A new parallel-vector finite element analysis software on distributed-memory computers
NASA Technical Reports Server (NTRS)
Qin, Jiangning; Nguyen, Duc T.
1993-01-01
A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed-memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. Block-skyline storage scheme along with vector-unrolling techniques are used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
NASA Astrophysics Data System (ADS)
Wang, Haijiang; Yang, Ling
2014-12-01
In this paper, the application of vector analysis tools to the study of the illuminated area and the Doppler frequency distribution for airborne pulse radar is presented. An important feature of vector analysis is that it closely combines geometric ideas with algebraic calculations. Through coordinate transforms, the relationship between the frame of the radar antenna and the ground, under aircraft motion attitude, is derived. Through time-space analysis, the overlap area between the footprint of the radar beam and the pulse-illuminated zone is obtained. Furthermore, the Doppler frequency expression is deduced, and the Doppler frequency distribution is plotted. Using the time-space analysis results, some important parameters of a specified airborne radar system are obtained. The results are also applied to correct the phase error introduced by attitude change in airborne synthetic aperture radar (SAR) imaging.
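For a monostatic radar, the Doppler shift of a ground point follows from projecting the platform velocity onto the line of sight. The geometry below (speed, wavelength, look angle) is invented for illustration:

```python
import numpy as np

def doppler_frequency(v, wavelength, unit_los):
    """Monostatic Doppler shift f_d = 2 (v . u) / lambda, with platform
    velocity v (m/s) and unit line-of-sight vector u toward the ground."""
    return 2.0 * float(np.dot(v, unit_los)) / wavelength

# Invented geometry: aircraft at 200 m/s along x, X-band (3 cm) radar,
# line of sight 30 degrees off the flight direction in the x-y plane
v = np.array([200.0, 0.0, 0.0])
theta = np.deg2rad(30.0)
u = np.array([np.cos(theta), np.sin(theta), 0.0])
fd = doppler_frequency(v, 0.03, u)
print(f"{fd:.0f} Hz")
```

Sweeping the line-of-sight vector over the beam footprint (after the antenna-to-ground coordinate transform) yields the Doppler frequency distribution the paper plots.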
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
In a company producing refined sugar, the production floor has not reached the target level of critical machine availability because the machines often suffer breakdowns. This results in sudden losses of production time and production opportunities. The problem can be addressed with a Reliability Engineering approach, in which historical failure data are analyzed statistically to identify the pattern of the distribution. The method provides the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. Distribution testing of the time-between-failures (MTTF) data shows that the flexible hose component follows a lognormal distribution, while the Teflon cone lifting component follows a Weibull distribution. Distribution testing of the mean time to repair (MTTR) data shows that the flexible hose component follows an exponential distribution, while the Teflon cone lifting component follows a Weibull distribution. On a replacement schedule of every 720 hours, the flexible hose component has a reliability of 0.2451 and an availability of 0.9960; on a replacement schedule of every 1,944 hours, the critical Teflon cone lifting component has a reliability of 0.4083 and an availability of 0.9927.
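The reliability figures above follow from standard lifetime-distribution formulas; for the Weibull case, a sketch with invented parameters (the abstract reports reliabilities but not its fitted shape and scale values):

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for a Weibull time-to-failure model."""
    return math.exp(-((t / eta) ** beta))

def weibull_mttf(beta, eta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# Invented parameters for a wearing-out component (beta > 1)
beta, eta = 1.8, 1500.0
print(f"R(720 h) = {weibull_reliability(720.0, beta, eta):.4f}")
print(f"MTTF     = {weibull_mttf(beta, eta):.0f} h")
```

Evaluating R(t) at the candidate replacement interval, as done here at 720 h, is how the schedule-specific reliabilities quoted in the abstract are obtained.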
NASA Astrophysics Data System (ADS)
Anikushina, T. A.; Naumov, A. V.
2013-12-01
This article demonstrates the principal advantages of the technique of analyzing the long-term spectral evolution of single molecules (SMs) in the study of the microscopic nature of dynamic processes in low-temperature polymers. We performed a detailed analysis of the spectral trail of a single tetra-tert-butylterrylene (TBT) molecule in an amorphous polyisobutylene matrix, measured over 5 hours at T = 7 K. It has been shown that the slow temporal dynamics is in qualitative agreement with the standard model of two-level systems and the stochastic sudden-jump model. At the same time, the distributions of the first four moments (cumulants) of the spectra of the selected SM, measured at different time points, were found to be inconsistent with the standard theory prediction. This is considered evidence that, over the given time interval, the system is not ergodic.
A strategy for reducing turnaround time in design optimization using a distributed computer system
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Padula, Sharon L.; Rogers, James L.
1988-01-01
There is a need to explore methods for reducing the lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
DOT National Transportation Integrated Search
2006-01-01
The project focuses on two major issues - the improvement of current work zone design practices and an analysis of : vehicle interarrival time (IAT) and speed distributions for the development of a digital computer simulation model for : queues and t...
Relaxation of ferroelectric states in 2D distributions of quantum dots: EELS simulation
NASA Astrophysics Data System (ADS)
Cortés, C. M.; Meza-Montes, L.; Moctezuma, R. E.; Carrillo, J. L.
2016-06-01
The relaxation time of collective electronic states in a 2D distribution of quantum dots is investigated theoretically by simulating EELS experiments. From the numerical calculation of the probability of energy loss of an electron beam traveling parallel to the distribution, it is possible to estimate the damping time of ferroelectric-like states. We generate this collective response of the distribution by introducing a mean-field interaction among the quantum dots; the model is then extended to incorporate effects of long-range correlations through a Bragg-Williams approximation. The behavior of the dielectric function, the energy loss function, and the relaxation time of ferroelectric-like states is then investigated as a function of the temperature of the distribution and the damping constant of the electronic states in the single quantum dots. The robustness of the trends and tendencies of our results indicates that this scheme of analysis can guide experimentalists in developing tailored quantum dot distributions for specific applications.
Ma, Da; Tang, Liang; Pan, Yan-Huan
2007-12-01
A three-dimensional finite element method was used to analyze the stress and strain distributions in the periodontal ligament of abutments under dynamic loads. Finite element analysis was performed on the model under dynamic loads in vertical and oblique directions. The stress and strain distributions and stress-time curves were analyzed to study the biomechanical behavior of the periodontal ligament of abutments. The stress and strain distributions of the periodontal ligament under dynamic load were the same as under static load, but the maximum stress and strain decreased markedly, with a rate of change between 60% and 75%. The periodontal ligament showed time-dependent mechanical behavior, and some residual stress remained in the periodontal ligament after one mastication period. The stress-free time under oblique load was shorter than under vertical load. In conclusion, the maximum stress and strain decrease markedly under dynamic loads; the periodontal ligament has time-dependent mechanical behavior during mastication; and some residual stress remains after one mastication period, its level being related to the magnitude and direction of the loads. The direction of the applied load is an important factor affecting the stress distribution and the accumulation and release of stress in the abutment periodontal ligament.
Detecting Non-Gaussian and Lognormal Characteristics of Temperature and Water Vapor Mixing Ratio
NASA Astrophysics Data System (ADS)
Kliewer, A.; Fletcher, S. J.; Jones, A. S.; Forsythe, J. M.
2017-12-01
Many operational data assimilation and retrieval systems assume that the errors and variables come from a Gaussian distribution. This study builds upon previous results showing that positive definite variables, specifically water vapor mixing ratio and temperature, can follow a non-Gaussian, and moreover a lognormal, distribution. Previously, statistical testing procedures, which included the Jarque-Bera test, the Shapiro-Wilk test, the Chi-squared goodness-of-fit test, and a composite test incorporating the results of the former tests, were employed to determine locations and time spans where atmospheric variables assume a non-Gaussian distribution. These tests are now investigated in a "sliding window" fashion in order to extend the testing procedure to near real-time. The analyzed 1-degree-resolution data come from the National Oceanic and Atmospheric Administration (NOAA) Global Forecast System (GFS) six-hour forecast from the 0Z analysis. These results indicate the necessity for a Data Assimilation (DA) system to be able to properly use the lognormally distributed variables in an appropriate Bayesian analysis that does not assume the variables are Gaussian.
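A sliding-window version of the composite test can be sketched as below, flagging windows where both the Jarque-Bera and Shapiro-Wilk tests reject Gaussianity. The window length, step, and synthetic Gaussian/lognormal series are assumptions for the demonstration.

```python
import numpy as np
from scipy.stats import jarque_bera, shapiro

def sliding_gaussianity(x, window=100, step=50, alpha=0.05):
    """Flag windows where Gaussianity is rejected by BOTH the Jarque-Bera
    and Shapiro-Wilk tests (a sketch of a composite sliding-window test)."""
    flags = []
    for start in range(0, len(x) - window + 1, step):
        w = x[start:start + window]
        flags.append(jarque_bera(w).pvalue < alpha
                     and shapiro(w).pvalue < alpha)
    return np.array(flags)

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(size=500),       # Gaussian segment
                    rng.lognormal(size=500)])   # lognormal segment
flags = sliding_gaussianity(x)
print(flags.astype(int))
```

Windows drawn from the lognormal segment are flagged, while purely Gaussian windows generally are not, mirroring the composite-test logic described above.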
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
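As a minimal illustration of the Weibull strength model underlying this kind of analysis (a sketch with invented parameters, not the CARES/Life implementation):

```python
import math

def weibull_survival(stress, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull probability of survival for a uniaxially
    stressed element: P_s = exp(-(V/V0) * (stress/sigma0)^m).

    sigma0 is the characteristic strength, m the Weibull modulus; both
    values below are hypothetical.
    """
    return math.exp(-volume_ratio * (stress / sigma0) ** m)

# Survival probability falls steeply as stress approaches sigma0.
p_low = weibull_survival(200.0, sigma0=400.0, m=10)
p_high = weibull_survival(350.0, sigma0=400.0, m=10)
```

In the transient setting described above, the stress (and the Weibull parameters themselves) would additionally vary with time, and multiaxial stresses would be combined via the principle of independent action or the Batdorf theory.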
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally than the conventional measures, including to Poincaré plots with multiple clusters, and can address questions regarding the potential structure underlying the variability of a data set.
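The delay-parameterized Poincaré construction, together with the conventional SD1/SD2 dispersion measures that TPV is contrasted with, can be sketched as follows; the RR-like series and function names are illustrative, not the paper's implementation:

```python
import numpy as np

def poincare_points(x, delay=1):
    """Pairs (x[n], x[n+delay]) for a Poincaré (return) plot at a given delay."""
    x = np.asarray(x, dtype=float)
    return np.column_stack((x[:-delay], x[delay:]))

def sd1_sd2(x, delay=1):
    """Conventional dispersion measures of the point cloud: SD1 (spread
    perpendicular to the identity line) and SD2 (spread along it)."""
    a, b = poincare_points(x, delay).T
    sd1 = np.std((a - b) / np.sqrt(2))
    sd2 = np.std((a + b) / np.sqrt(2))
    return sd1, sd2

# Synthetic RR-interval-like series: a slow oscillation plus small noise.
rng = np.random.default_rng(1)
rr = (0.9 + 0.05 * np.sin(np.linspace(0, 20 * np.pi, 1000))
      + 0.005 * rng.normal(size=1000))
s1, s2 = sd1_sd2(rr)   # s1 << s2 for a strongly correlated series
```

SD1/SD2 summarize only the cumulative point cloud; TPV's contribution, per the abstract, is to quantify how the points populate the plot over time and across several delays.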
NASA Astrophysics Data System (ADS)
Chen, Xiaowang; Feng, Zhipeng
2016-12-01
Planetary gearboxes are widely used in many sorts of machinery owing to their large transmission ratio and high load-bearing capacity in a compact structure. Their fault diagnosis relies on effective identification of fault characteristic frequencies. However, in addition to the vibration complexity caused by intricate mechanical kinematics, volatile external conditions result in time-varying running speed and/or load, and therefore nonstationary vibration signals. This usually leads to time-varying complex fault characteristics and adds difficulty to planetary gearbox fault diagnosis. Time-frequency analysis is an effective approach to extracting the frequency components of nonstationary signals and their variation over time. Nevertheless, the commonly used time-frequency analysis methods suffer from poor time-frequency resolution as well as outer and inner interferences, which hinder accurate identification of time-varying fault characteristic frequencies. Although time-frequency reassignment improves time-frequency readability, it is essentially subject to the constraints of mono-component signals and of time-frequency distributions symmetric about the true instantaneous frequency. Hence, it is still susceptible to erroneous energy reallocation, or even generates pseudo interferences, particularly for multi-component signals of highly nonlinear instantaneous frequency. In this paper, to overcome the limitations of time-frequency reassignment, we propose an improvement that achieves fine time-frequency resolution and is free from interferences for highly nonstationary multi-component signals, by exploiting the merits of iterative generalized demodulation. The signal is firstly decomposed into mono-components of constant frequency by iterative generalized demodulation. Time-frequency reassignment is then applied to each generalized demodulated mono-component, obtaining a fine time-frequency distribution.
Finally, the time-frequency distribution of each signal component is restored and superposed to obtain the time-frequency distribution of the original signal. The proposed method is validated using both numerically simulated and lab experimental planetary gearbox vibration signals. The time-varying gear fault symptoms are successfully extracted, showing the effectiveness of the proposed iterative generalized time-frequency reassignment method in planetary gearbox fault diagnosis under nonstationary conditions.
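The core generalized-demodulation step, mapping a component of time-varying frequency to a constant-frequency line before time-frequency analysis, can be sketched on a synthetic chirp. This is an illustrative reconstruction, not the authors' algorithm: the instantaneous frequency law is assumed known here, whereas the paper estimates the phase functions iteratively.

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)

# Chirp component with instantaneous frequency f(t) = 50 + 100*t Hz,
# i.e. phase = 2*pi*(50*t + 50*t^2).
phase = 2 * np.pi * (50 * t + 50 * t ** 2)
x = np.cos(phase)

# Generalized demodulation: multiply by exp(-j * nonlinear phase part),
# so the chirp maps to a constant 50 Hz line.
demod_phase = 2 * np.pi * (50 * t ** 2)
y = x * np.exp(-1j * demod_phase)

# The dominant spectral line of the demodulated signal sits at 50 Hz.
spec = np.abs(np.fft.fft(y))
freqs = np.fft.fftfreq(len(y), 1 / fs)
peak = freqs[np.argmax(spec)]
```

After demodulation, a reassigned spectrogram of `y` would show a sharp constant line; restoring the original time-frequency trajectory then amounts to adding the removed phase derivative back, as described in the abstract.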
NASA Astrophysics Data System (ADS)
Sato, Aki-Hiro
2010-12-01
This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case, a q-Gaussian distribution can be theoretically derived as the stationary probability distribution of a multiplicative stochastic differential equation with mutually independent multiplicative and additive noises. Using the proposed stochastic differential equation, a method is proposed to evaluate the default probability under a given risk buffer.
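A minimal Euler-Maruyama sketch of a one-dimensional process of this type shows the fat-tailed (q-Gaussian, Student-t-like) stationary behavior; the drift and the two noise amplitudes below are illustrative choices, not values from the paper:

```python
import numpy as np

def simulate(gamma=1.0, sig_mult=1.0, sig_add=0.2, dt=1e-3,
             n_steps=100_000, seed=0):
    """Euler-Maruyama integration of
        dX = -gamma*X dt + sig_mult*X dW1 + sig_add dW2
    with mutually independent Wiener increments dW1, dW2."""
    rng = np.random.default_rng(seed)
    sq = np.sqrt(dt)
    x, out = 0.0, np.empty(n_steps)
    for i in range(n_steps):
        dw1, dw2 = rng.normal(0.0, sq, 2)
        x += -gamma * x * dt + sig_mult * x * dw1 + sig_add * dw2
        out[i] = x
    return out

path = simulate()
z = path[20_000:]                      # discard the initial transient
# The stationary density is a q-Gaussian with power-law tails, so the
# sample excess kurtosis is clearly positive (0 for a true Gaussian).
excess_kurtosis = np.mean((z - z.mean()) ** 4) / np.var(z) ** 2 - 3.0
```

With these parameters the stationary density decays as a power law, which is what makes such processes useful for modeling risk quantities with heavy tails.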
Forecasting weed distributions using climate data: a GIS early warning tool
Jarnevich, Catherine S.; Holcombe, Tracy R.; Barnett, David T.; Stohlgren, Thomas J.; Kartesz, John T.
2010-01-01
The number of invasive exotic plant species establishing in the United States is continuing to rise. When prevention of exotic species from entering a country fails at the national level and a species establishes, reproduces, spreads, and becomes invasive, the most successful action at the local level is early detection followed by eradication. We have developed a simple geographic information system (GIS) analysis for developing watch lists for early detection of invasive exotic plants that relies upon currently available species distribution data coupled with environmental data to describe coarse-scale potential distributions. This GIS analysis tool develops environmental envelopes for a species based upon its known distribution and represents a first approximation of its potential habitat while the data necessary for more in-depth analyses are collected. To validate this method we examined a time series of species distributions for 66 species in Pacific Northwest and northern Rocky Mountain counties. The time series analysis presented here did select counties that the invasive exotic weeds invaded in subsequent years, showing that this technique could be useful in developing watch lists for the spread of particular exotic species. We applied this same habitat-matching model, based upon bioclimatic envelopes, to 100 invasive exotics with various levels of known distributions within continental U.S. counties. For species with climatically limited distributions, county watch lists describe county-specific vulnerability to invasion; species with matching habitats in a county would be added to that county's list. These watch lists can inform management decisions for early warning, control prioritization, and targeted research to determine specific locations within vulnerable counties.
This tool provides useful information for rapid assessment of the potential distribution of new invasive exotic species, based upon climate envelopes of their current distributions.
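A minimal version of such an environmental-envelope watch list can be sketched as a min/max box over climate variables at known occurrence sites; the variables, county names, and values below are hypothetical:

```python
import numpy as np

def climate_envelope(known_climate):
    """Min/max envelope over climate variables at known occurrence sites."""
    known = np.asarray(known_climate, dtype=float)
    return known.min(axis=0), known.max(axis=0)

def watch_list(county_names, county_climate, envelope):
    """Counties whose climate falls entirely inside the species envelope."""
    lo, hi = envelope
    c = np.asarray(county_climate, dtype=float)
    inside = np.all((c >= lo) & (c <= hi), axis=1)
    return [name for name, ok in zip(county_names, inside) if ok]

# Hypothetical rows: (annual mean temperature in C, annual precipitation in mm)
occurrences = [(8.0, 400.0), (10.0, 600.0), (12.0, 500.0)]
counties = ["A", "B", "C"]
climates = [(9.0, 450.0), (15.0, 700.0), (11.0, 550.0)]

env = climate_envelope(occurrences)
flagged = watch_list(counties, climates, env)   # counties to watch
```

County B falls outside the temperature envelope and is dropped; A and C would be added to the watch list. This is the "first approximation" step the abstract describes, intended to be superseded by more detailed habitat models as data accumulate.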
The joint time-frequency spectrogram structure of heptanes boilover noise
NASA Astrophysics Data System (ADS)
Xu, Qiang
2006-04-01
An experiment was conducted to study the noise characteristics of the boilover phenomenon. Boilover occurs in the combustion of a liquid fuel floating on water and causes a sharp increase in burning rate and external radiation; the resulting explosive burning of the fuel poses a potential safety hazard. Combustion noise accompanies the development of the fire and displays different characteristics in each typical period, and these characteristics can be used to predict the start time of boilover. The acoustic signal during the boilover process in the combustion of a heptanes-water mixture was obtained in a set of experiments. Joint time-frequency analysis (JTFA) methods were applied in the treatment of the noise data. Several JTFA algorithms were evaluated, including the Gabor transform, adaptive spectrogram, cone-shape distribution, Choi-Williams distribution, Wigner-Ville distribution, and the short-time Fourier transform with different windows such as rectangular, Blackman, Hamming, and Hanning. Time-frequency distribution patterns of the combustion noise were obtained and compared with those from jet flow and from small plastic bubble blow-up.
Non-Poissonian Distribution of Tsunami Waiting Times
NASA Astrophysics Data System (ADS)
Geist, E. L.; Parsons, T.
2007-12-01
Analysis of the global tsunami catalog indicates that tsunami waiting times deviate from the exponential distribution one would expect from a Poisson process. Empirical density distributions of tsunami waiting times were determined using both global tsunami origin times and tsunami arrival times at a particular site with a sufficient catalog: Hilo, Hawai'i. Most sources for the tsunamis in the catalog are earthquakes; other sources include landslides and volcanogenic processes. Both datasets indicate an over-abundance of short waiting times in comparison to an exponential distribution. Two types of probability models are investigated to explain this observation. Model (1) is a universal scaling law that describes long-term clustering of sources with a gamma distribution. The shape parameter (γ) for the global tsunami distribution, γ=0.63-0.67, is similar to that of the global earthquake catalog [Corral, 2004]. For the Hilo catalog, γ is slightly greater (0.75-0.82) and closer to an exponential distribution. This is explained by the fact that tsunamis from smaller triggered earthquakes or landslides are less likely to be recorded at a far-field station such as Hilo than in the global catalog, which includes a greater proportion of local tsunamis. Model (2) is based on two distributions derived from Omori's law for the temporal decay of triggered sources (aftershocks). The first is the ETAS distribution derived by Saichev and Sornette [2007], which is shown to fit the distribution of observed tsunami waiting times. The second is a simpler two-parameter distribution: the exponential distribution augmented by a linear decay in aftershocks multiplied by a time constant Ta. Examination of the sources associated with short tsunami waiting times indicates that triggered events include both earthquake and landslide tsunamis that begin in the vicinity of the primary source.
Triggered seismogenic tsunamis do not necessarily originate from the same fault zone, however. For example, subduction-thrust and outer-rise earthquake pairs are evident, such as the November 2006 and January 2007 Kuril Islands tsunamigenic pair. Because of variations in tsunami source parameters, such as water depth above the source, triggered tsunami events with short waiting times are not systematically smaller than the primary tsunami.
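The gamma-versus-exponential comparison at the heart of Model (1) can be sketched with synthetic waiting times; maximum-likelihood fitting (scipy's `fit`, with the location fixed at zero) recovers a shape parameter below 1 for clustered data and near 1 for Poissonian data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic waiting times: clustered (gamma, shape < 1) vs Poissonian.
clustered = rng.gamma(shape=0.65, scale=1.0, size=5000)
poissonian = rng.exponential(scale=1.0, size=5000)

# A fitted gamma shape below 1 signals an over-abundance of short
# waiting times relative to an exponential (for which the shape is 1).
shape_c, _, _ = stats.gamma.fit(clustered, floc=0)
shape_p, _, _ = stats.gamma.fit(poissonian, floc=0)
```

The clustered sample here mimics the γ≈0.65 quoted for the global catalog; real catalogs would of course require careful handling of completeness and detection thresholds.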
Dual-induced multifractality in online viewing activity.
Qin, Yu-Hao; Zhao, Zhi-Dan; Cai, Shi-Min; Gao, Liang; Stanley, H Eugene
2018-01-01
Although recent studies have found that long-term correlations related to the fat-tailed distribution of inter-event times exist in human activity and indicate the presence of fractality, the property of fractality and its origin have not been analyzed. We use both detrended fluctuation analysis and multifractal detrended fluctuation analysis to analyze time series of online viewing activity drawn from Movielens and Netflix. We find long-term correlations at both the individual and communal levels, and the extent of correlation at the individual level is determined by the activity level. These long-term correlations also indicate that there is fractality in the pattern of online viewing. We are the first to identify a multifractality that results from the combined effect of the fat-tailed distribution of inter-event times (i.e., the times between successive viewing actions of individuals) and the long-term correlations in online viewing activity, and we verify this finding using three synthesized series. We therefore conclude that the multifractality in online viewing activity is caused by both the fat-tailed distribution of inter-event times and the long-term correlations, and that this extends the generic properties of human activity from physical space to cyberspace.
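The first of the two tools named here can be illustrated with a bare-bones first-order DFA implementation; the scales and test signals are arbitrary choices, and the multifractal variant (MF-DFA) additionally varies the fluctuation moment order:

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """First-order detrended fluctuation analysis: slope of log F(s) vs log s.

    alpha ~ 0.5 for uncorrelated noise; alpha > 0.5 indicates long-term
    positive correlations.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    fluct = []
    for s in scales:
        f2 = []
        for i in range(len(profile) // s):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(7)
alpha_white = dfa_exponent(rng.normal(size=4096))            # ~0.5
alpha_walk = dfa_exponent(np.cumsum(rng.normal(size=4096)))  # ~1.5
```

Applied to inter-event time series of individual users, an exponent above 0.5 is the kind of long-term correlation the abstract reports.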
The Accuracy of GBM GRB Localizations
NASA Astrophysics Data System (ADS)
Briggs, Michael Stephen; Connaughton, V.; Meegan, C.; Hurley, K.
2010-03-01
We report a study of the accuracy of GBM GRB localizations, analyzing three types of localizations: those produced automatically by the Flight Software on board GBM, those produced automatically with ground software in near real time, and those produced with human guidance. The two types of automatic locations are distributed in near real time via GCN Notices; the human-guided locations are distributed on a timescale of many minutes or hours via GCN Circulars. This work uses a Bayesian analysis that models the distribution of the GBM total location error by comparing GBM locations to more accurate locations obtained with other instruments. Reference locations are obtained from Swift, Super-AGILE, the LAT, and the IPN. We model the GBM total location errors as having systematic errors in addition to the statistical errors and use the Bayesian analysis to constrain the systematic errors.
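One way to sketch this kind of analysis, constraining a single systematic error added in quadrature to the statistical error, is to model each total offset as Rayleigh-distributed and evaluate a grid posterior with a flat prior. The offsets and statistical errors below are invented, and the paper's actual model (e.g. core-plus-tail systematics) may differ:

```python
import numpy as np

# Hypothetical total offsets (deg) between GBM and reference locations,
# with the per-burst statistical errors (also hypothetical).
offsets = np.array([3.1, 4.0, 2.2, 6.5, 3.8, 5.1])
stat = np.array([1.0, 2.0, 0.8, 3.0, 1.5, 2.5])

# Model: per-axis variance = stat^2 + sys^2, so the total angular offset
# r follows a Rayleigh density r/var * exp(-r^2 / (2*var)).
sys_grid = np.linspace(0.01, 15.0, 1500)
log_post = np.empty_like(sys_grid)
for j, s in enumerate(sys_grid):
    var = stat ** 2 + s ** 2
    log_post[j] = np.sum(np.log(offsets / var) - offsets ** 2 / (2 * var))

sys_map = sys_grid[np.argmax(log_post)]   # maximum a posteriori systematic
```

Because the invented offsets exceed the statistical errors, the posterior peaks at a nonzero systematic error, which is the qualitative conclusion such comparisons are designed to test.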
On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios
2013-04-01
The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power law during the loading cycle. I will discuss the implications of this work and present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since interevent-time distributions other than the Weibull are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. Keywords: hypothesis testing, modified Weibull, hazard rate, finite size. References: [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D., 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution, Phys. Rev. Lett., 102(16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] I. Eliazar and J. Klafter, 2006.
Growth-collapse and decay-surge evolutions, and geometric Langevin equations, Physica A, 367, 106-128.
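The Kolmogorov-Smirnov screening described above can be sketched with scipy on synthetic interevent times; as the comments flag, fitting the parameters from the same sample biases the KS p-value upward (the Lilliefors issue), so this is only a screening test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
interevent = 10.0 * rng.weibull(1.5, size=2000)   # synthetic interevent times

# Fit a Weibull with the location fixed at zero, then apply the KS test.
c, _, scale = stats.weibull_min.fit(interevent, floc=0)
_, p_weibull = stats.kstest(interevent, 'weibull_min', args=(c, 0, scale))

# Compare against an exponential with the same mean.
_, p_expon = stats.kstest(interevent, 'expon', args=(0, interevent.mean()))
```

For these Weibull-generated data, the exponential hypothesis is rejected while the Weibull is not; on a real catalog the same machinery identifies which candidate distributions survive.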
[An EMD based time-frequency distribution and its application in EEG analysis].
Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping
2007-10-01
The Hilbert-Huang transform (HHT) is a new time-frequency analysis method for nonlinear and non-stationary signals. The key step of this method is empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMFs). In this paper, a new EMD-based method for suppressing the cross-terms of the Wigner-Ville distribution (WVD) is developed and applied to the analysis of epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-terms of the WVD effectively, with excellent resolution.
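The Hilbert step of the HHT, applied after EMD has isolated mono-components, can be sketched for a single tone; in the full method each IMF would be processed this way (the sifting procedure of EMD itself is not reproduced here):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 80 * t)     # a mono-component, IMF-like signal

# Analytic signal -> instantaneous phase -> instantaneous frequency.
analytic = hilbert(x)
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)   # in Hz

mean_freq = inst_freq[50:-50].mean()   # trim edge effects
```

Because each IMF is (nearly) mono-component, its WVD contains no cross-terms; combining the per-IMF distributions is what lets an EMD-based method suppress the cross-terms that plague the WVD of the raw multi-component EEG signal.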
Estimating division and death rates from CFSE data
NASA Astrophysics Data System (ADS)
de Boer, Rob J.; Perelson, Alan S.
2005-12-01
The division tracking dye carboxyfluorescein diacetate succinimidyl ester (CFSE) is currently the most informative labeling technique for characterizing the division history of cells in the immune system. Gett and Hodgkin (Nat. Immunol. 1 (2000) 239-244) have proposed to normalize CFSE data by the 2-fold expansion that is associated with each division, and have argued that the mean of the normalized data increases linearly with time, t, with a slope reflecting the division rate p. We develop a number of mathematical models for the clonal expansion of quiescent cells after stimulation and show, within the context of these models, under which conditions this approach is valid. We compare three means of the distribution of cells over the CFSE profile at time t: the mean, μ(t); the mean of the normalized distribution, μ2(t); and the mean of the normalized distribution excluding nondivided cells. In the simplest models, which deal with homogeneous populations of cells with constant division and death rates, the normalized frequency distribution of the cells over the respective division numbers is a Poisson distribution with mean μ2(t) = pt, where p is the division rate. The fact that in the data these distributions seem Gaussian is therefore insufficient to establish that the times at which cells are recruited into the first division have a Gaussian variation, because the Poisson distribution approaches the Gaussian distribution for large pt. Excluding nondivided cells complicates the data analysis, because the corresponding mean only approaches a slope p after an initial transient. In models where the first division of the quiescent cells takes longer than later divisions, all three means have an initial transient before they approach an asymptotic regime with the expected μ(t) = 2pt. Such a transient markedly complicates the data analysis. After the same initial transients, the normalized cell numbers tend to decrease at a rate exp(-dt), where d is the death rate. Nonlinear parameter fitting of CFSE data obtained from Gett and Hodgkin to ordinary differential equation (ODE) models with first-order terms for cell proliferation and death gave poor fits to the data. The Smith-Martin model, with an explicit time delay for the deterministic phase of the cell cycle, performed much better. Nevertheless, the insights gained from analysis of the ODEs proved useful, as we showed by generating virtual CFSE data with a simulation model, where cell cycle times were drawn from various distributions, and then computing the various mean division numbers.
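The homogeneous-population result, that the 2^k-normalized CFSE profile is Poisson with mean pt, can be checked numerically; `max_div` is a truncation choice, and the rate and time are arbitrary:

```python
import numpy as np
from math import factorial

def normalized_mean_division(p, t, max_div=40):
    """Mean of the 2^k-normalized CFSE profile when the division numbers
    follow a Poisson(p*t) distribution; equals p*t for a homogeneous
    population with constant division rate p."""
    k = np.arange(max_div + 1)
    pois = np.array([np.exp(-p * t) * (p * t) ** i / factorial(i) for i in k])
    pois /= pois.sum()            # renormalize the truncated distribution
    return float((k * pois).sum())

mu2 = normalized_mean_division(p=0.5, t=4.0)   # expect p*t = 2.0
```

This is the linear-in-time behavior that justifies reading the division rate off the slope of the normalized mean, and it is exactly the relation that the transients discussed above break when the first division is slower than later ones.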
NASA Astrophysics Data System (ADS)
Korzeniewska, Ewa; Szczesny, Artur; Krawczyk, Andrzej; Murawski, Piotr; Mróz, Józef; Seme, Sebastian
2018-03-01
In this paper, the authors describe the distribution of temperature around electroconductive pathways created by a physical vacuum deposition process on flexible textile substrates used in elastic electronics and textronics. Cordura was chosen as the substrate material, and silver with 99.99% purity was used as the deposited metal. The research was based on thermographic photographs of the produced samples. Analysis of the temperature field around the electroconductive layer was carried out using Image ThermaBase EU software, and the analysis of the temperature distribution highlights the software's usefulness in determining the homogeneity of the created metal layer. Locally elevated temperatures combined with a non-uniform temperature distribution can negatively influence the operation of the textronic system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marekova, Elisaveta
Series of relatively large earthquakes in different regions of the Earth are studied. The regions chosen have high seismic activity and good contemporary networks for recording seismic events. The main purpose of this investigation is to describe the seismic process analytically in space and time. We consider the statistical distributions of the distances and times between consecutive earthquakes (so-called pair analysis). Studies conducted on approximating the statistical distributions of the parameters of consecutive seismic events indicate the existence of characteristic functions that describe them best. Such a mathematical description allows the distributions of the examined parameters to be compared to other model distributions.
Decreasing-Rate Pruning Optimizes the Construction of Efficient and Robust Distributed Networks.
Navlakha, Saket; Barth, Alison L; Bar-Joseph, Ziv
2015-07-01
Robust, efficient, and low-cost networks are advantageous in both biological and engineered systems. During neural network development in the brain, synapses are massively over-produced and then pruned-back over time. This strategy is not commonly used when designing engineered networks, since adding connections that will soon be removed is considered wasteful. Here, we show that for large distributed routing networks, network function is markedly enhanced by hyper-connectivity followed by aggressive pruning and that the global rate of pruning, a developmental parameter not previously studied by experimentalists, plays a critical role in optimizing network structure. We first used high-throughput image analysis techniques to quantify the rate of pruning in the mammalian neocortex across a broad developmental time window and found that the rate is decreasing over time. Based on these results, we analyzed a model of computational routing networks and show using both theoretical analysis and simulations that decreasing rates lead to more robust and efficient networks compared to other rates. We also present an application of this strategy to improve the distributed design of airline networks. Thus, inspiration from neural network formation suggests effective ways to design distributed networks across several domains.
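The hyper-connect-then-prune strategy with a decreasing removal rate can be sketched on a toy routing network. This is an illustrative reconstruction: random edge selection stands in for the usage-based pruning criterion, the schedule numbers are arbitrary, and removals that would disconnect the network are skipped.

```python
import random
from collections import deque

def is_connected(adj):
    """BFS connectivity check on an adjacency-set graph."""
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(adj)

def prune(adj, schedule, seed=0):
    """Remove edges round by round per the (decreasing) schedule,
    undoing any removal that would disconnect the network."""
    rng = random.Random(seed)
    for quota in schedule:
        edges = sorted({tuple(sorted((u, v))) for u in adj for v in adj[u]})
        rng.shuffle(edges)
        removed = 0
        for u, v in edges:
            if removed == quota:
                break
            adj[u].discard(v); adj[v].discard(u)
            if is_connected(adj):
                removed += 1
            else:
                adj[u].add(v); adj[v].add(u)   # undo: edge was a bridge
    return adj

# Hyper-connected start: complete graph on 20 nodes (190 edges), then an
# aggressive-early, gentle-late schedule (removal rate decreasing by round).
n = 20
adj = {i: set(range(n)) - {i} for i in range(n)}
prune(adj, schedule=[80, 40, 20, 10])
n_edges = sum(len(s) for s in adj.values()) // 2
```

The abstract's claim is that such decreasing-rate schedules, mirroring the measured synaptic pruning rates, yield more robust and efficient final topologies than constant or increasing rates under a usage-driven removal criterion.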
Alwan, Faris M; Baharum, Adam; Hassan, Geehan S
2013-01-01
The reliability of the electrical distribution system is a contemporary research field due to the diverse applications of electricity in everyday life and in industry; however, few research papers on the topic exist in the literature. This paper proposes a methodology for assessing the reliability of 33/11 kilovolt high-power stations based on the average time between failures. The objective is to find the optimal fit for the failure data via the time between failures. We determine the parameter estimates for all components of the station, and estimate the reliability of each component and of the system as a whole. The best-fitting distribution for the time between failures is a three-parameter Dagum distribution, with one scale parameter and two shape parameters. Our analysis reveals that the reliability value decreases by 38.2% every 30 days. We believe that the current paper is the first to address this issue, and the results obtained in this research reflect its originality. We also suggest the practicality of using these results for power systems, both for maintenance of power-system models and for preventive maintenance models.
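The Dagum (Burr III) reliability computation can be sketched directly from its closed-form CDF; the parameter values below are placeholders, since the fitted values from the paper are not reproduced here:

```python
import numpy as np

def dagum_cdf(x, a, b, p):
    """Dagum (Burr III) CDF: F(x) = (1 + (x/b)^-a)^-p,
    with shape parameters a, p and scale parameter b."""
    return (1.0 + (np.asarray(x, dtype=float) / b) ** (-a)) ** (-p)

def reliability(t, a, b, p):
    """R(t) = 1 - F(t): probability the time between failures exceeds t."""
    return 1.0 - dagum_cdf(t, a, b, p)

# Hypothetical fitted parameters for illustration only.
a, b, p = 1.8, 120.0, 0.9
r30 = reliability(30.0, a, b, p)    # reliability at 30 days
r60 = reliability(60.0, a, b, p)    # lower reliability at 60 days
```

Evaluating `reliability` on a grid of 30-day intervals is how one would reproduce the kind of month-over-month reliability decline the abstract reports.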
NASA Astrophysics Data System (ADS)
Cao, Bao; Luo, Hong; Gao, Zhenji
2009-10-01
Space-time Information Expression and Analysis (SIEA) uses graphic representations to organize information units according to distribution rules and arrangements, and combines them with the powerful data-processing capabilities of information technology to analyze and integrate those units. In this paper, a new SIEA approach is proposed and its model is constructed, and the basic units, methodology, and steps of SIEA are discussed. Taking China's coastland as an example, the new SIEA approach was applied to air humidity, rainfall, and surface temperature from 1981 to 2000. The case study shows that the parameters change considerably from month to month but little from year to year. In terms of spatial distribution, the parameters differ significantly between the north and the south of China's coastland. The approach proposed in this paper is intuitive and graphic, and it solves the problem that the space-time distribution of biophysical parameters is difficult to express using traditional charts and tables. It can reveal the complexity and natural laws behind the observed phenomena and analyze them quantitatively, while retaining the advantages of traditional graphic ways of thinking. SIEA provides a new space-time analysis and expression approach, integrating 3S technologies, for Earth system science research.
Oliker, Nurit; Ostfeld, Avi
2014-03-15
This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers and a subsequent sequence analysis for the classification of contamination events. The contribution of this study is an improvement in the ability to detect contamination events and a multi-dimensional analysis of the data, in contrast to the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are far more normal than event-time measurements), and incorporating the time factor through a time-decay coefficient that ascribes higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on data from a real water distribution system (WDS) with randomly simulated events superimposed on the original measurements. The model is notable for its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters), and it showed high accuracy and better detection ability compared to previous attempts to model contamination event detection.
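The weighting idea (class-imbalance correction multiplied by an exponential time decay) can be sketched with scikit-learn's SVC; the two water-quality parameters, the synthetic event cluster, and the half-life are invented for illustration and are not the paper's data or calibration:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two hypothetical water-quality parameters, with strong class imbalance.
n_normal, n_event = 400, 20
X_normal = rng.normal([0.8, 1.0], 0.1, size=(n_normal, 2))
X_event = rng.normal([0.4, 1.6], 0.1, size=(n_event, 2))
X = np.vstack([X_normal, X_event])
y = np.array([0] * n_normal + [1] * n_event)

# Sample weights: balance the two classes, then multiply by an
# exponential time decay so recent observations count more
# (half-life of 200 time steps is an assumption).
age = np.arange(len(y))[::-1]                  # 0 = most recent sample
class_w = np.where(y == 1, n_normal / n_event, 1.0)
decay = 0.5 ** (age / 200.0)
clf = SVC(kernel='rbf', gamma='scale').fit(X, y, sample_weight=class_w * decay)

pred = clf.predict([[0.45, 1.55], [0.8, 1.0]])   # event-like, normal-like
```

In the full system such per-time-step classifications would feed the subsequent sequence analysis that decides whether a run of outliers constitutes a contamination event.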
NASA Astrophysics Data System (ADS)
Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.
1998-05-01
A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
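The coordinate transformation can be made concrete with a schematic equation (an illustrative form consistent with the limits described, not necessarily the authors' exact notation): with depth z playing the role of time, the stress σ(x, z) obeys a memory-possessing propagation equation

```latex
\frac{\partial \sigma}{\partial z}
  \;=\; \int_0^{z} \phi(z - z')\,
        \frac{\partial^2 \sigma}{\partial x^2}(x, z')\, \mathrm{d}z' .
```

A memoryless kernel $\phi(z) = D\,\delta(z)$ reduces this to the diffusive limit $\partial_z \sigma = D\,\partial_x^2 \sigma$, while a constant kernel $\phi(z) = c^2$ gives, after differentiating once in $z$, the wavelike limit $\partial_z^2 \sigma = c^2\,\partial_x^2 \sigma$; these are the two previous treatments recovered as particular cases.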
The Measurement of Pressure Through Tubes in Pressure Distribution Tests
NASA Technical Reports Server (NTRS)
Hemke, Paul E
1928-01-01
The tests described in this report were made to determine the error caused by using small tubes to connect orifices on the surface of aircraft to central pressure capsules in making pressure distribution tests. Aluminum tubes of 3/16-inch inside diameter were used to determine this error. Lengths from 20 feet to 226 feet and pressures whose maxima varied from 2 inches to 140 inches of water were used. Single-pressure impulses for which the time of rise of pressure from zero to a maximum varied from 0.25 second to 3 seconds were investigated. The results show that the pressure recorded at the capsule on the far end of the tube lags behind the pressure at the orifice end and experiences also a change in magnitude. For the values used in these tests the time lag and pressure change vary principally with the time of rise of pressure from zero to a maximum and the tube length. Curves are constructed showing the time lag and pressure change. Empirical formulas are also given for computing the time lag. Analysis of pressure distribution tests made on airplanes in flight shows that the recorded pressures are slightly higher than the pressures at the orifice and that the time lag is negligible. The apparent increase in pressure is usually within the experimental error, but in the case of the modern pursuit type of airplane the pressure increase may be 5 per cent. For pressure-distribution tests on airships the analysis shows that the time lag and pressure change may be neglected.
Study of Solid State Drives performance in PROOF distributed analysis system
NASA Astrophysics Data System (ADS)
Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.
2010-04-01
Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. SSDs also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system that makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss how PROOF system performance scales with the number of simultaneously running analysis jobs.
Photofragment image analysis using the Onion-Peeling Algorithm
NASA Astrophysics Data System (ADS)
Manzhos, Sergei; Loock, Hans-Peter
2003-07-01
With the growing popularity of the velocity map imaging technique, a need for the analysis of photoion and photoelectron images arose. Here, a computer program is presented that allows for the analysis of cylindrically symmetric images. It permits the inversion of the projection of the 3D charged particle distribution using the Onion Peeling Algorithm. Further analysis includes the determination of radial and angular distributions, from which velocity distributions and spatial anisotropy parameters are obtained. Identification and quantification of the different photolysis channels is therefore straightforward. In addition, the program features geometry correction, centering, and multi-Gaussian fitting routines, as well as a user-friendly graphical interface and the possibility of generating synthetic images using either the fitted or user-defined parameters. Program summary. Title of program: Glass Onion. Catalogue identifier: ADRY. Program Summary URL: http://cpc.cs.qub.ac.uk/summaries/ADRY. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: none. Computer: IBM PC. Operating system under which the program has been tested: Windows 98, Windows 2000, Windows NT. Programming language used: Delphi 4.0. Memory required to execute with typical data: 18 Mwords. No. of bits in a word: 32. No. of bytes in distributed program, including test data, etc.: 9 911 434. Distribution format: zip file. Keywords: photofragment image, onion peeling, anisotropy parameters. Nature of physical problem: Information about the velocity and angular distributions of photofragments is the basis on which the analysis of the photolysis process rests. Reconstructing the three-dimensional distribution from the photofragment image is the first step; further processing involves angular and radial integration of the inverted image to obtain velocity and angular distributions.
Provisions have to be made to correct for slight distortions of the image, and to verify the accuracy of the analysis process. Method of solution: The "Onion Peeling" algorithm described by Helm [Rev. Sci. Instrum. 67 (6) (1996)] is used to perform the image reconstruction. Angular integration with a subsequent multi-Gaussian fit supplies information about the velocity distribution of the photofragments, whereas radial integration with subsequent expansion of the angular distributions over Legendre polynomials gives the spatial anisotropy parameters. Fitting algorithms have been developed to centre the image and to correct for image distortion. Restrictions on the complexity of the problem: The maximum image size (1280×1280) and resolution (16 bit) are restricted by available memory and can be changed in the source code. Initial centre coordinates within 5 pixels may be required for the correction and the centering algorithm to converge. Peaks on the velocity profile separated by less than the peak width may not be deconvolved. In the charged particle image reconstruction, it is assumed that the kinetic energy released in the dissociation process is small compared to the energy acquired in the electric field. For the fitting parameters to be physically meaningful, cylindrical symmetry of the image has to be assumed, but the actual inversion algorithm is stable against distortions of such symmetry in experimental images. Typical running time: The analysis procedure can be divided into three parts: inversion, fitting, and geometry correction. The inversion time grows approximately as R³, where R is the radius of the region of interest: for R=200 pixels it is less than a minute, for R=400 pixels less than 6 min on a 400 MHz IBM personal computer. The time for the velocity fitting procedure to converge depends strongly on the number of peaks in the velocity profile and the convergence criterion. It ranges between less than a second for simple curves and a few minutes for profiles with up to twenty peaks. The time taken for the image correction scales as R² and depends on the curve profile. It is on the order of a few minutes for images with R=500 pixels. Unusual features of the program: Our centering and image correction algorithm is based on Fourier analysis of the radial distribution to ensure the sharpest velocity profile and is insensitive to an uneven intensity distribution. An angular averaging option exists to stabilize the inversion algorithm without losing resolution.
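The peeling step itself is simple enough to sketch. Below is a minimal NumPy version of onion peeling for a single row of a cylindrically symmetric projection, assuming the density is constant within annular shells (the published Glass Onion program is far more complete, adding centering, distortion correction and fitting):

```python
import numpy as np

def onion_peel(P, dr=1.0):
    """Invert a projection P[k] of a cylindrically symmetric distribution
    by onion peeling: assume the density f is constant within annular
    shells and strip the shells from the outside in."""
    P = np.asarray(P, dtype=float).copy()
    n = len(P)
    f = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # chord length of ray i through its own (outermost remaining) shell
        f[i] = P[i] / (2.0 * dr * np.sqrt((i + 1) ** 2 - i ** 2))
        # subtract shell i's contribution from all inner lines of sight
        for k in range(i):
            chord = 2.0 * dr * (np.sqrt((i + 1) ** 2 - k ** 2)
                                - np.sqrt(i ** 2 - k ** 2))
            P[k] -= f[i] * chord
    return f

def project(f, dr=1.0):
    """Forward model: line-of-sight integral over the shell densities f."""
    n = len(f)
    P = np.zeros(n)
    for k in range(n):
        for i in range(k, n):
            P[k] += f[i] * 2.0 * dr * (np.sqrt((i + 1) ** 2 - k ** 2)
                                       - np.sqrt(i ** 2 - k ** 2))
    return P
```

Because the forward projection is upper triangular in the shell densities, peeling from the outermost shell inward inverts it exactly.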
Broadband Time-Frequency Analysis Using a Multicomputer
2004-09-30
FFT 512-pt waterfall WVD display ... © 2004 Mercury Computer Systems, Inc. Smoothed pseudo Wigner-Ville distribution, one of many interference reduction ... The Wigner-Ville distribution, the scalogram, and the discrete Gabor transform are among the most well-known of these methods. Due to specific ... based upon the FFT Accumulation Method • Continuous Wavelet Transform (scalogram) • Discrete Wigner-Ville distribution with a selected set of interference
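As an illustration of the first method listed, here is a naive O(N²) NumPy sketch of the discrete Wigner-Ville distribution. It is a reference implementation, not the optimized multicomputer version the report concerns, and the smoothing that yields the pseudo-WVD is omitted:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a complex signal x.
    Row n is the spectrum at time n; FFT bin k corresponds to the
    normalized frequency k / (2N) cycles per sample."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)          # lags limited by the signal edges
        kern = np.zeros(N, dtype=complex)
        for m in range(-mmax, mmax + 1):
            # instantaneous autocorrelation x(n+m) x*(n-m)
            kern[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kern).real
    return W
```

For a pure complex exponential at normalized frequency f0, the mid-signal spectrum peaks at bin k = 2·f0·N, reflecting the factor-of-two frequency scaling of the WVD lag kernel.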
Gardner, B.; Sullivan, P.J.; Morreale, S.J.; Epperly, S.P.
2008-01-01
Loggerhead (Caretta caretta) and leatherback (Dermochelys coriacea) sea turtle distributions and movements in offshore waters of the western North Atlantic are not well understood despite continued efforts to monitor, survey, and observe them. Loggerhead and leatherback sea turtles are listed as endangered by the World Conservation Union, and thus anthropogenic mortality of these species, including fishing, is of elevated interest. This study quantifies spatial and temporal patterns of sea turtle bycatch distributions to identify potential processes influencing their locations. A Ripley's K function analysis was employed on the NOAA Fisheries Atlantic Pelagic Longline Observer Program data to determine spatial, temporal, and spatio-temporal patterns of sea turtle bycatch distributions within the pattern of the pelagic fishery distribution. Results indicate that loggerhead and leatherback sea turtle catch distributions change seasonally, with patterns of spatial clustering appearing from July through October. The results from the space-time analysis indicate that sea turtle catch distributions are related on a relatively fine scale (30-200 km and 1-5 days). The use of spatial and temporal point pattern analysis, particularly K function analysis, is a novel way to examine bycatch data and can be used to inform fishing practices such that fishing could still occur while minimizing sea turtle bycatch. © 2008 NRC.
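The K function analysis used here can be sketched in a few lines of NumPy. This is the basic spatial Ripley's K estimator without edge correction; the study's space-time version, which also conditions on the fishery distribution, is more involved:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive estimator of Ripley's K at distance r (no edge correction):
    K(r) = area * (# ordered pairs closer than r) / (n * (n - 1))."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # exclude self-pairs
    return area * np.sum(d < r) / (n * (n - 1))
```

Under complete spatial randomness K(r) ≈ πr², so estimates above that value indicate clustering at scale r, which is how seasonal clustering of bycatch locations would show up.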
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Summary: Robust and accurate estimates of rainfall frequencies are difficult to make with short, arid-climate rainfall records; here, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing the rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. The analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to the sea and by elevation. Regional statistical algorithms found that the sites passed discordancy tests of the coefficient of variation, skewness and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated 500 times in the regional analysis method, and goodness-of-fit tests then identified the best candidate distribution as the generalized extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate the location, shape, and scale parameters. In the global analysis, the distribution was prescribed a priori as GEV Type II, the shape parameter was set a priori to 0.15, and a time interval term was constructed so that one set of parameters could be used for all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE showed a parabolic trend with time interval. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic time-interval trend was found for the global method. The global method's relative RMSE and bias trends with time interval may be caused by fitting a single scale value for all time intervals.
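A hedged sketch of the distribution-fitting step with SciPy. Note that `scipy.stats.genextreme` uses the sign convention c = −ξ, so a GEV Type II shape of ξ = 0.15 corresponds to c = −0.15, and SciPy fits by maximum likelihood rather than the L-moments used in the regional method (an L-moments fit would need a dedicated package). The sample below is synthetic, not the Cyprus data:

```python
import numpy as np
from scipy import stats

# Hypothetical annual-maximum rainfall depths (mm), drawn from a GEV
# Type II with shape xi = 0.15 (c = -0.15 in SciPy's convention).
rng = np.random.default_rng(0)
xi = 0.15
sample = stats.genextreme.rvs(c=-xi, loc=30.0, scale=8.0,
                              size=5000, random_state=rng)

# Maximum-likelihood fit of shape, location and scale
c_hat, loc_hat, scale_hat = stats.genextreme.fit(sample)

# 50-year return level: the (1 - 1/T) quantile of the fitted distribution
rl50 = stats.genextreme.ppf(1 - 1 / 50, c_hat, loc_hat, scale_hat)
```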
Bayesian analysis of the kinetics of quantal transmitter secretion at the neuromuscular junction.
Saveliev, Anatoly; Khuzakhmetova, Venera; Samigullin, Dmitry; Skorinkin, Andrey; Kovyazina, Irina; Nikolsky, Eugeny; Bukharaeva, Ellya
2015-10-01
The timing of transmitter release from nerve endings is considered nowadays as one of the factors determining the plasticity and efficacy of synaptic transmission. In the neuromuscular junction, the moments of release of individual acetylcholine quanta are related to the synaptic delays of uniquantal endplate currents recorded under conditions of lowered extracellular calcium. Using Bayesian modelling, we performed a statistical analysis of synaptic delays in mouse neuromuscular junction with different patterns of rhythmic nerve stimulation and when the entry of calcium ions into the nerve terminal was modified. We have obtained a statistical model of the release timing which is represented as the summation of two independent statistical distributions. The first of these is the exponentially modified Gaussian distribution. The mixture of normal and exponential components in this distribution can be interpreted as a two-stage mechanism of early and late periods of phasic synchronous secretion. The parameters of this distribution depend on both the stimulation frequency of the motor nerve and the calcium ions' entry conditions. The second distribution was modelled as quasi-uniform, with parameters independent of nerve stimulation frequency and calcium entry. Two different probability density functions for the distribution of synaptic delays suggest at least two independent processes controlling the time course of secretion, one of them potentially involving two stages. The relative contribution of these processes to the total number of mediator quanta released depends differently on the motor nerve stimulation pattern and on calcium ion entry into nerve endings.
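The two-component delay model can be sketched with SciPy's exponentially modified Gaussian (`scipy.stats.exponnorm`, whose shape parameter is K = τ/σ), mixed with a uniform component. Parameter names and values here are illustrative, not the authors' fitted ones:

```python
import numpy as np
from scipy import stats

def delay_pdf(t, w, mu, sigma, tau, t_max):
    """Sketch of the two-component synaptic-delay model: an exponentially
    modified Gaussian (weight w) for phasic synchronous release, plus a
    quasi-uniform component on [0, t_max] (weight 1 - w) for the rest."""
    emg = stats.exponnorm.pdf(t, K=tau / sigma, loc=mu, scale=sigma)
    uniform = stats.uniform.pdf(t, loc=0.0, scale=t_max)
    return w * emg + (1.0 - w) * uniform
```

In the paper the EMG parameters depend on stimulation frequency and calcium entry while the quasi-uniform component does not; here that would correspond to varying (mu, sigma, tau) while holding t_max fixed.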
From heavy-tailed to exponential distribution of interevent time in cellphone top-up behavior
NASA Astrophysics Data System (ADS)
Wang, Peng; Ma, Qiang
2017-05-01
Cellphone top-up is an activity driven, to a great extent, by individual consumption rather than personal interest, so in common sense this behavior should be stable. However, our research finds heavy tails both in the interevent time distribution and in the purchase frequency distribution at the global level. Moreover, we find that the memories of both the interevent time series and the unit price series are negative, which differs from previously studied bursty activities. We divide individuals into five groups according to purchase frequency and to average unit price, respectively. The group analysis then shows significant heterogeneity in this behavior. On one hand, only the individuals with high purchase frequency have a heavy-tailed interevent time distribution; conversely, the negative memory is caused only by low purchase-frequency individuals without burstiness. On the other hand, individuals with different preferred prices also have different power-law exponents at the group level, and there is no data collapse after rescaling between these distributions. Our findings provide evidence for significant heterogeneity of human activity in many aspects.
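The two quantities analyzed, interevent times and their memory, are easy to sketch in NumPy; the memory coefficient below is the standard Goh-Barabási measure, the Pearson correlation between consecutive interevent times:

```python
import numpy as np

def interevent_times(event_times):
    """Waiting times between consecutive events."""
    return np.diff(np.sort(np.asarray(event_times, dtype=float)))

def memory_coefficient(tau):
    """Pearson correlation between consecutive interevent times
    (Goh & Barabasi's memory M); a negative value means a long gap
    tends to be followed by a short one, as found in this paper."""
    tau = np.asarray(tau, dtype=float)
    return np.corrcoef(tau[:-1], tau[1:])[0, 1]
```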
Design and Analysis of Scheduling Policies for Real-Time Computer Systems
1992-01-01
C. M. Krishna, "The Impact of Workload on the Reliability of Real-Time Processor Triads," to appear in Micro. Rel. [17] J. F. Kurose, "Performance Analysis of Minimum Laxity Scheduling in Discrete-Time Queueing Systems" ... exponentially distributed service times and deadlines. A similar model was developed for the ED policy for a single-processor system under identical
Statistical physics approaches to financial fluctuations
NASA Astrophysics Data System (ADS)
Wang, Fengzhong
2009-12-01
Complex systems attract many researchers from various scientific fields. Financial markets are one of these widely studied complex systems. Statistical physics, which was originally developed to study large systems, provides novel ideas and powerful methods to analyze financial markets. The study of financial fluctuations characterizes market behavior, and helps to better understand the underlying market mechanism. Our study focuses on volatility, a fundamental quantity to characterize financial fluctuations. We examine equity data of the entire U.S. stock market during 2001 and 2002. To analyze the volatility time series, we develop a new approach, called return interval analysis, which examines the time intervals between two successive volatilities exceeding a given value threshold. We find that the return interval distribution displays scaling over a wide range of thresholds. This scaling is valid for a range of time windows, from one minute up to one day. Moreover, our results are similar for commodities, interest rates, currencies, and for stocks of different countries. Further analysis shows some systematic deviations from a scaling law, which we can attribute to nonlinear correlations in the volatility time series. We also find a memory effect in return intervals for different time scales, which is related to the long-term correlations in the volatility. To further characterize the mechanism of price movement, we simulate the volatility time series using two different models, fractionally integrated generalized autoregressive conditional heteroscedasticity (FIGARCH) and fractional Brownian motion (fBm), and test these models with the return interval analysis. We find that both models can mimic time memory but only fBm shows scaling in the return interval distribution. In addition, we examine the volatility of daily opening to closing and of closing to opening. We find that each volatility distribution has a power law tail. 
Using the detrended fluctuation analysis (DFA) method, we show long-term auto-correlations in these volatility time series. We also analyze return, the actual price changes of stocks, and find that the returns over the two sessions are often anti-correlated.
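The return interval analysis described above can be sketched in NumPy: pick a volatility threshold, record the waiting times between successive exceedances, and rescale by the mean interval to test for the scaling the thesis reports (function names are illustrative):

```python
import numpy as np

def return_intervals(volatility, threshold):
    """Waiting times (in samples) between successive moments at which
    the volatility exceeds `threshold`."""
    idx = np.flatnonzero(np.asarray(volatility) > threshold)
    return np.diff(idx)

def scaled_interval_distribution(volatility, threshold, bins=20):
    """Histogram of intervals rescaled by their mean; under the scaling
    reported in the thesis, curves for different thresholds collapse."""
    r = return_intervals(volatility, threshold)
    mean = r.mean()
    hist, edges = np.histogram(r / mean, bins=bins, density=True)
    return hist, edges, mean
```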
USDA-ARS?s Scientific Manuscript database
This paper assesses the impact of different likelihood functions in identifying sensitive parameters of the highly parameterized, spatially distributed Soil and Water Assessment Tool (SWAT) watershed model for multiple variables at multiple sites. The global one-factor-at-a-time (OAT) method of Morr...
Observation of distorted Maxwell-Boltzmann distribution of epithermal ions in LHD
NASA Astrophysics Data System (ADS)
Ida, K.; Kobayashi, T.; Yoshinuma, M.; Akiyama, T.; Tokuzawa, T.; Tsuchiya, H.; Itoh, K.; LHD Experiment Group
2017-12-01
A distorted Maxwell-Boltzmann distribution of epithermal ions is observed, associated with the collapse of energetic ions triggered by the tongue-shaped deformation. The tongue-shaped deformation is characterized by a plasma displacement localized in the toroidal, poloidal, and radial directions at a non-rational magnetic flux surface in a toroidal plasma. Moment analysis of the ion velocity distribution measured with charge exchange spectroscopy is performed in order to investigate the impact of the tongue event on the ion distribution. A clearly non-zero skewness (3rd moment) and kurtosis (4th moment minus 3) of the ion velocity distribution in the epithermal region (within three times the thermal velocity) is observed after the tongue event. This observation provides clear evidence of the distortion of the ion velocity distribution from a Maxwell-Boltzmann distribution. The distortion is observed in the outer one-third of the plasma minor radius, near the plasma edge, and disappears on the ion-ion collision time scale.
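The moment analysis described reduces to computing standardized moments of the measured velocity distribution. A minimal NumPy sketch, optionally weighted (e.g. by spectral intensity); both quantities vanish for a Gaussian velocity component, so nonzero values signal a distorted Maxwell-Boltzmann distribution:

```python
import numpy as np

def velocity_moments(v, weights=None):
    """Return (skewness, excess kurtosis) of a velocity sample:
    skewness = m3 / m2**1.5, excess kurtosis = m4 / m2**2 - 3,
    where m_k are central moments, optionally weighted."""
    v = np.asarray(v, dtype=float)
    w = np.ones_like(v) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    mean = np.sum(w * v)
    m2 = np.sum(w * (v - mean) ** 2)
    m3 = np.sum(w * (v - mean) ** 3)
    m4 = np.sum(w * (v - mean) ** 4)
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0
```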
Time-evolution of grain size distributions in random nucleation and growth crystallization processes
NASA Astrophysics Data System (ADS)
Teran, Anthony V.; Bill, Andreas; Bergmann, Ralf B.
2010-02-01
We study the time dependence of the grain size distribution N(r,t) during crystallization of a d -dimensional solid. A partial differential equation, including a source term for nuclei and a growth law for grains, is solved analytically for any dimension d . We discuss solutions obtained for processes described by the Kolmogorov-Avrami-Mehl-Johnson model for random nucleation and growth (RNG). Nucleation and growth are set on the same footing, which leads to a time-dependent decay of both effective rates. We analyze in detail how model parameters, the dimensionality of the crystallization process, and time influence the shape of the distribution. The calculations show that the dynamics of the effective nucleation and effective growth rates play an essential role in determining the final form of the distribution obtained at full crystallization. We demonstrate that for one class of nucleation and growth rates, the distribution evolves in time into the logarithmic-normal (lognormal) form discussed earlier by Bergmann and Bill [J. Cryst. Growth 310, 3135 (2008)]. We also obtain an analytical expression for the finite maximal grain size at all times. The theory allows for the description of a variety of RNG crystallization processes in thin films and bulk materials. Expressions useful for experimental data analysis are presented for the grain size distribution and the moments in terms of fundamental and measurable parameters of the model.
Diffusion of active chiral particles
NASA Astrophysics Data System (ADS)
Sevilla, Francisco J.
2016-12-01
The diffusion of chiral active Brownian particles in three-dimensional space is studied analytically, by consideration of the corresponding Fokker-Planck equation for the probability density of finding a particle at position x and moving along the direction v̂ at time t, and numerically, by the use of Langevin dynamics simulations. The analysis is focused on the marginal probability density of finding a particle at a given location and at a given time (independently of its direction of motion), which is found from an infinite hierarchy of differential-recurrence relations for the coefficients that appear in the multipole expansion of the probability distribution, which contains the whole kinematic information. This approach allows the explicit calculation of the time dependence of the mean-squared displacement and of the kurtosis of the marginal probability distribution, quantities from which the effective diffusion coefficient and the "shape" of the position distribution are examined. Oscillations between two characteristic values were found in the time evolution of the kurtosis, namely, between the value that corresponds to a Gaussian and the one that corresponds to a distribution of spherical-shell shape. In the case of an ensemble of particles, each one rotating around a uniformly distributed random axis, evidence is found of the so-called "anomalous, yet Brownian, diffusion" effect, in which particles follow a non-Gaussian position distribution yet the mean-squared displacement is a linear function of time.
NASA Astrophysics Data System (ADS)
Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi
2018-02-01
The distribution network is the part of the power grid closest to the customers of electric service providers such as PT PLN. The dispatching center of a power grid company is also its data center, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing with online analytical processing (OLAP) has been used to manage and analyze this large volume of data. The specific outputs of the online analytics information system, produced by processing the data warehouse with OLAP, are chart and query reports. The chart reports consist of a load distribution chart over repeated time periods, a distribution chart by area, a substation region chart and an electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide an analysis of electric power consumption load, and offer an alternative way of presenting information related to peak load.
Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.
2013-01-01
Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303
Vibrational Analysis of Engine Components Using Neural-Net Processing and Electronic Holography
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Fite, E. Brian; Mehmed, Oral; Thorp, Scott A.
1997-01-01
The use of computational-model trained artificial neural networks to acquire damage specific information from electronic holograms is discussed. A neural network is trained to transform two time-average holograms into a pattern related to the bending-induced-strain distribution of the vibrating component. The bending distribution is very sensitive to component damage unlike the characteristic fringe pattern or the displacement amplitude distribution. The neural network processor is fast for real-time visualization of damage. The two-hologram limit makes the processor more robust to speckle pattern decorrelation. Undamaged and cracked cantilever plates serve as effective objects for testing the combination of electronic holography and neural-net processing. The requirements are discussed for using finite-element-model trained neural networks for field inspections of engine components. The paper specifically discusses neural-network fringe pattern analysis in the presence of the laser speckle effect and the performances of two limiting cases of the neural-net architecture.
Real-Time Support on IEEE 802.11 Wireless Ad-Hoc Networks: Reality vs. Theory
NASA Astrophysics Data System (ADS)
Kang, Mikyung; Kang, Dong-In; Suh, Jinwoo
The usable throughput of an IEEE 802.11 system for an application is much less than the raw bandwidth. Although 802.11b has a theoretical maximum of 11 Mbps, more than half of the bandwidth is consumed by overhead, leaving at most 5 Mbps of usable bandwidth. Considering this characteristic, this paper proposes and analyzes a real-time distributed scheduling scheme based on the existing IEEE 802.11 wireless ad-hoc networks, using USC/ISI's Power Aware Sensing Tracking and Analysis (PASTA) hardware platform. We compared the distributed real-time scheduling scheme with a real-time polling scheme in terms of meeting deadlines, and compared the measured real bandwidth with the theoretical result. The theoretical and experimental results show that the distributed scheduling scheme can guarantee real-time traffic and improves performance by up to 74% compared with the polling scheme.
Middle-high latitude N2O distributions related to the arctic vortex breakup
NASA Astrophysics Data System (ADS)
Zhou, L. B.; Zou, H.; Gao, Y. Q.
2006-03-01
The relationship of N2O distributions to the Arctic vortex breakup is analyzed for the first time with a probability distribution function (PDF) analysis. The N2O concentration shows different distributions between early and late vortex breakup years. In the early breakup years, the N2O concentration shows low values and large dispersion after the vortex breakup, which is related to the inhomogeneity of the vertical advection in the middle- and high-latitude lower stratosphere; the horizontal diffusion coefficient (Kyy) shows a correspondingly larger value. In the late breakup years, the N2O concentration shows higher values and more uniform distributions than in the early years after the vortex breakup, with smaller vertical advection and Kyy after the vortex breakup. It is found that the N2O distributions are largely affected by the Arctic vortex breakup time, but the dynamically defined vortex breakup time is not the only factor.
A comparative analysis of massed vs. distributed practice on basic math fact fluency growth rates.
Schutte, Greg M; Duhon, Gary J; Solomon, Benjamin G; Poncy, Brian C; Moore, Kathryn; Story, Bailey
2015-04-01
To best remediate academic deficiencies, educators need to not only identify empirically validated interventions but also be able to apply instructional modifications that result in more efficient student learning. The current study compared the effect of massed and distributed practice with an explicit timing intervention to evaluate the extent to which these modifications lead to increased math fact fluency on basic addition problems. Forty-eight third-grade students were placed into one of three groups, with each group completing four 1-min explicit math timing procedures each day across 19 days. Group one completed all four 1-min timings consecutively; group two completed two back-to-back 1-min timings in the morning and two back-to-back 1-min timings in the afternoon; and group three completed a single 1-min timing four times distributed across the day. Growth curve modeling was used to examine progress throughout the course of the study. Results suggested that students in the distributed practice conditions, both four times per day and two times per day, showed significantly higher fluency growth rates than those practicing only once per day in a massed format. These results indicate that combining distributed practice with explicit timing procedures is a useful modification that enhances student learning without the addition of extra instructional time when targeting math fact fluency. Copyright © 2015 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael; Turitsyn, Konstantin; Sulc, Petr
The anticipated increase in the number of plug-in electric vehicles (EV) will put additional strain on electrical distribution circuits. Many control schemes have been proposed to control EV charging. Here, we develop control algorithms based on randomized EV charging start times and simple one-way broadcast communication allowing for a time delay between communication events. Using arguments from queuing theory and statistical analysis, we seek to maximize the utilization of excess distribution circuit capacity while keeping the probability of a circuit overload negligible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theuws, P.G.A.; Beijerinck, H.C.W.; Schram, D.C.
Velocity analysis of the molecular beam is done with a time-of-flight method. The measured velocity distribution of the fast neutral atoms is described by the sum of two Maxwell-Boltzmann distributions with temperatures on the order of 0.25 and 1 eV, respectively. This bimodal distribution is attributed to an overpopulation of the high-energy tail of the ion velocity distribution. The measured intensities of the fast neutrals vary between 5 × 10^14 and 7 × 10^15 molecules s^-1 sr^-1.
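A bimodal fit of this kind can be sketched with standard tools; the following is a minimal illustration on synthetic data, assuming scipy's `maxwell` distribution as the component shape (the scale values are arbitrary stand-ins, not the 0.25 and 1 eV temperatures of the experiment):

```python
import numpy as np
from scipy.stats import maxwell
from scipy.optimize import curve_fit

# Synthetic stand-in for a time-of-flight speed distribution: a mixture of
# two Maxwell-Boltzmann components with illustrative scale parameters.
rng = np.random.default_rng(0)
v = np.concatenate([maxwell.rvs(scale=1.0, size=35_000, random_state=rng),
                    maxwell.rvs(scale=3.0, size=15_000, random_state=rng)])

def bimodal(v, w, s1, s2):
    """Weighted sum of two Maxwell-Boltzmann speed densities."""
    return w * maxwell.pdf(v, scale=s1) + (1 - w) * maxwell.pdf(v, scale=s2)

counts, edges = np.histogram(v, bins=200, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
(w, s1, s2), _ = curve_fit(bimodal, centers, counts, p0=[0.5, 0.8, 2.5])
print(w, s1, s2)  # mixture weight and the two temperature-like scales
```

The two fitted scale parameters play the role of the two beam temperatures; on real time-of-flight data the histogram would be replaced by the measured arrival-time spectrum.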
Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.
Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar
2010-09-01
A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDFs of the compound states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of the vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met3PbCl) on the ion current probability distribution. Ion currents were measured by the patch-clamp technique. It was shown that Met3PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
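The idea that higher moments of a two-state mixture carry gating information can be illustrated with a toy version of the model (all parameters below are invented for illustration, not the Beta vulgaris measurements):

```python
import numpy as np
from scipy import stats

# Two-state channel model: closed-state and open-state currents are each
# normal; the recorded current is a mixture of the two (illustrative values).
p_open, mu_c, mu_o, sd_c, sd_o = 0.4, 0.0, 5.0, 0.3, 0.8

rng = np.random.default_rng(1)
state = rng.random(200_000) < p_open
current = np.where(state,
                   rng.normal(mu_o, sd_o, state.size),
                   rng.normal(mu_c, sd_c, state.size))

# The second, third and fourth moments of the compound PDF reveal the
# strongly non-Gaussian, two-state character that the mean alone misses.
print(current.var(), stats.skew(current), stats.kurtosis(current))
```

For this mixture the variance is dominated by the between-state separation, the skewness is positive (the open-state peak sits above the mean), and the excess kurtosis is strongly negative, a signature of bimodality.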
The Transit-Time Distribution from the Northern Hemisphere Midlatitude Surface
NASA Technical Reports Server (NTRS)
Orbe, Clara; Waugh, Darryn W.; Newman, Paul A.; Strahan, Susan; Steenrod, Stephen
2015-01-01
The distribution of transit times from the Northern Hemisphere (NH) midlatitude surface is a fundamental property of tropospheric transport. Here we present an analysis of the transit time distribution (TTD) since air last contacted the northern midlatitude surface layer, as simulated by the NASA Global Modeling Initiative Chemistry Transport Model. We find that throughout the troposphere the TTD is characterized by long flat tails that reflect the recirculation of old air from the Southern Hemisphere and result in mean ages that are significantly larger than the modal age. Key aspects of the TTD -- its mode, mean and spectral width -- are interpreted in terms of tropospheric dynamics, including seasonal shifts in the location and strength of tropical convection and variations in quasi-isentropic transport out of the northern midlatitude surface layer. Our results indicate that current diagnostics of tropospheric transport are insufficient for comparing model transport and that the full distribution of transit times is a more appropriate constraint.
Requirements analysis for a hardware, discrete-event, simulation engine accelerator
NASA Astrophysics Data System (ADS)
Taylor, Paul J., Jr.
1991-12-01
An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description and Design Language (VHDL), of a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.
Self-similarity of waiting times in fracture systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niccolini, G.; Bosia, F.; Carpinteri, A.
2009-08-15
Experimental and numerical results are presented for a fracture experiment carried out on a fiber-reinforced element under flexural loading, and a statistical analysis is performed for acoustic emission waiting-time distributions. By an optimization procedure, a recently proposed scaling law describing these distributions for different event magnitude scales is confirmed by both experimental and numerical data, thus reinforcing the idea that fracture of heterogeneous materials has scaling properties similar to those found for earthquakes. Analysis of the different scaling parameters obtained for experimental and numerical data leads us to formulate the hypothesis that the type of scaling function obtained depends on the level of correlation among fracture events in the system.
Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Yutaka, Ono; Furukawa, Toshiaki A.
2017-01-01
Background Several recent studies have shown that total scores on depressive symptom measures in a general population approximate an exponential pattern except for the lower end of the distribution. Furthermore, we confirmed that the exponential pattern is present in the individual item responses on the Center for Epidemiologic Studies Depression Scale (CES-D). To confirm the reproducibility of such findings, we investigated the total score distribution and item responses of the Kessler Screening Scale for Psychological Distress (K6) in a nationally representative study. Methods Data were drawn from the National Survey of Midlife Development in the United States (MIDUS), which comprises four subsamples: (1) a national random digit dialing (RDD) sample, (2) oversamples from five metropolitan areas, (3) siblings of individuals from the RDD sample, and (4) a national RDD sample of twin pairs. K6 items are scored using a 5-point scale: “none of the time,” “a little of the time,” “some of the time,” “most of the time,” and “all of the time.” The patterns of the total score distribution and item responses were analyzed using graphical analysis and an exponential regression model. Results The total score distributions of the four subsamples exhibited an exponential pattern with similar rate parameters. The item responses of the K6 approximated a linear pattern from “a little of the time” to “all of the time” on log-normal scales, while the “none of the time” response was not related to this exponential pattern. Discussion The total score distribution and item responses of the K6 showed exponential patterns, consistent with other depressive symptom scales. PMID:28289560
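The exponential-regression step used here amounts to fitting a straight line to log-frequencies; a minimal sketch on invented score-frequency data (the counts and rate below are illustrative, not the MIDUS values):

```python
import numpy as np

# Hypothetical score-frequency data following an exponential pattern
# N(s) = A * exp(-lam * s), with mild multiplicative noise.
scores = np.arange(1, 21)
rng = np.random.default_rng(2)
counts = 5000 * np.exp(-0.35 * scores) * rng.normal(1.0, 0.02, scores.size)

# On a log scale an exponential distribution is a straight line, so the
# rate parameter falls out of an ordinary least-squares fit.
slope, intercept = np.polyfit(scores, np.log(counts), 1)
rate = -slope
print(rate)  # recovers the generating rate, here 0.35
```

Plotting `log(counts)` against `scores` is the graphical analysis the abstract refers to; departures from the line at the lowest scores correspond to the "none of the time" deviation reported.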
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed for python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis also allows one to identify outliers in the empirical datasets and to determine judiciously whether data trimming is needed and, if so, at which points it should be done. PMID:29765345
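Independently of ExGUtils, the ex-Gaussian (a Gaussian convolved with an exponential) is also available in scipy as `exponnorm`, with shape parameter K = tau/sigma; a sketch of a maximum-likelihood fit on synthetic reaction times (the mu, sigma, tau values are illustrative):

```python
import numpy as np
from scipy import stats

# Ex-Gaussian: Normal(mu, sigma) convolved with Exponential(tau).
# scipy parameterizes it as exponnorm with shape K = tau / sigma.
mu, sigma, tau = 300.0, 50.0, 100.0          # illustrative RT parameters (ms)
rt = stats.exponnorm.rvs(K=tau / sigma, loc=mu, scale=sigma,
                         size=20_000, random_state=3)

# Maximum-likelihood fit, the alternative to least squares on a histogram
# discussed in the abstract.
K_hat, loc_hat, scale_hat = stats.exponnorm.fit(rt)
tau_hat = K_hat * scale_hat
print(loc_hat, scale_hat, tau_hat)  # estimates of mu, sigma, tau
```

The long exponential tail governed by tau is what makes simple Gaussian summaries of reaction-time data misleading, and why tail-sensitive fits matter for outlier and trimming decisions.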
Global time-size distribution of volcanic eruptions on Earth.
Papale, Paolo
2018-05-01
Volcanic eruptions differ enormously in their size and impacts, ranging from quiet lava flow effusions along the volcano flanks to colossal events with the potential to affect our entire civilization. Knowledge of the time and size distribution of volcanic eruptions is of obvious relevance for understanding the dynamics and behavior of the Earth system, as well as for defining global volcanic risk. From the analysis of recent global databases of volcanic eruptions extending back more than 2 million years, I show here that the return times of eruptions with similar magnitude follow an exponential distribution. The associated relative frequency of eruptions with different magnitude displays a power-law, scale-invariant distribution over at least six orders of magnitude. These results suggest that similar mechanisms underlie explosive eruptions from small to colossal, raising concerns about the theoretical possibility of predicting the magnitude and impact of impending volcanic eruptions.
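The two statistical signatures described, exponential return times and a power-law size-frequency relation, can be checked on a synthetic catalogue in a few lines (the rate and exponent below are illustrative, not the values of the eruption databases):

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative synthetic catalogue: event onsets as a Poisson process
# (exponential return times) and power-law distributed relative sizes.
returns = rng.exponential(scale=2.0, size=20_000)   # years between events
mags = rng.pareto(a=1.5, size=20_000) + 1.0         # Pareto sizes, x_min = 1

# Exponential check: mean and standard deviation of return times coincide.
print(returns.mean(), returns.std())

# Power-law check: the empirical survival function is a straight line in
# log-log coordinates, with slope equal to minus the Pareto exponent.
x = np.sort(mags)
surv = 1.0 - np.arange(1, x.size + 1) / x.size
mask = (x < np.quantile(x, 0.99)) & (surv > 0)
slope, _ = np.polyfit(np.log(x[mask]), np.log(surv[mask]), 1)
print(slope)  # close to -1.5 for these synthetic data
```

On a real catalogue the same two diagnostics (mean-to-standard-deviation ratio of return times, log-log slope of the size survival function) are what distinguish exponential waiting times and scale-invariant sizes from alternatives.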
Universality classes of fluctuation dynamics in hierarchical complex systems
NASA Astrophysics Data System (ADS)
Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.
2017-03-01
A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.
Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation
Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.
2013-01-01
Pitch is well known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as a musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by applying it to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. In rigorous tests, the analysis not only faithfully reproduced simulated distributions embedded in realistic time series that drift and include pitch breaks, but also revealed that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
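The cents parameterization of pitch interval is a standard logarithmic mapping; a minimal sketch (the reference frequency and example pitches are illustrative, not the study's data):

```python
import numpy as np

def pitch_interval_cents(f, f_root):
    """Musical pitch interval H, in cents, of frequency f above f_root."""
    return 1200.0 * np.log2(np.asarray(f) / f_root)

# One octave is 1200 cents; an equal-tempered semitone is 100 cents.
print(pitch_interval_cents(880.0, 440.0))   # exactly 1200.0
print(pitch_interval_cents(466.16, 440.0))  # about 100
```

Normalizing each peak word-pitch to a slowly varying root, as the paper does, amounts to computing H against a drifting `f_root` rather than a fixed one, which removes the gamut drift before the distributional analysis.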
Ong, Marcus E H; Ng, Faith S P; Overton, Jerry; Yap, Susan; Andresen, Derek; Yong, David K L; Lim, Swee Han; Anantharaman, V
2009-03-01
Pre-hospital ambulance calls are not random events; they occur in patterns and trends related to the movement patterns of people as well as the geographical epidemiology of the population. This study describes the geographic-time epidemiology of ambulance calls in a large urban city and conducts a time demand analysis, to facilitate a Systems Status Plan for the deployment of ambulances based on the most cost-effective deployment strategy. An observational prospective study looking at the geographic-time epidemiology of all ambulance calls in Singapore. Locations of ambulance calls were spot-mapped using Geographic Information Systems (GIS) technology. Ambulance response times were mapped and a demand analysis conducted by postal district. Between 1 January 2006 and 31 May 2006, 31,896 patients were enrolled into the study. Mean age of patients was 51.6 years (S.D. 23.0), with 60.0% male. Race distribution was 62.5% Chinese, 19.4% Malay, 12.9% Indian and 5.2% others. Trauma accounted for 31.2% of calls and medical cases for 68.8%; 9.7% of cases were priority 1 (most severe) and 70.1% priority 2 (moderate severity). Mean time from call receipt to arrival at scene was 8.0 min (S.D. 4.8). Call volumes in the day were almost twice those at night, with the most calls on Mondays. We found a definite geographical distribution pattern, with heavier call volumes in the suburban town centres in the eastern and southern parts of the country. We characterised the top 35 districts with the highest call volumes by time period, which will form the basis for ambulance deployment plans. We found a definite geographical distribution pattern of ambulance calls. This study demonstrates the utility of GIS with despatch demand analysis and has implications for maximising the effectiveness of ambulance deployment.
Statistical analysis of hydrodynamic cavitation events
NASA Astrophysics Data System (ADS)
Gimenez, G.; Sommer, R.
1980-10-01
The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law, its parameters not being time-evolving.
Signal Processing Applications Of Wigner-Ville Analysis
NASA Astrophysics Data System (ADS)
Whitehouse, H. J.; Boashash, B.
1986-04-01
The Wigner-Ville distribution (WVD), a form of time-frequency analysis, is shown to be useful in the analysis of a variety of non-stationary signals both deterministic and stochastic. The properties of the WVD are reviewed and alternative methods of calculating the WVD are discussed. Applications are presented.
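A discrete WVD can be computed directly from its definition as the FFT over lag of the instantaneous autocorrelation; the following minimal implementation (illustrative, without the smoothing used in practical pseudo-WVDs) shows the characteristic energy concentration along the instantaneous frequency of a linear chirp:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a complex (analytic) signal.

    W[n, m] is the FFT over lag k of x[n + k] * conj(x[n - k]);
    frequency bin m corresponds to m / (2N) cycles per sample.
    """
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)                 # largest symmetric lag
        k = np.arange(-kmax, kmax + 1)
        r = np.zeros(N, dtype=complex)
        r[k % N] = x[n + k] * np.conj(x[n - k])  # instantaneous autocorrelation
        W[n] = np.fft.fft(r).real                # real by conjugate symmetry
    return W

# Linear chirp sweeping from f0 to f1 cycles/sample: the WVD concentrates
# along the instantaneous frequency f0 + (f1 - f0) * n / N.
N = 128
t = np.arange(N)
f0, f1 = 0.05, 0.20
x = np.exp(2j * np.pi * (f0 * t + 0.5 * (f1 - f0) / N * t ** 2))
W = wigner_ville(x)
print(W[N // 2].argmax())  # peak bin at mid-signal, ~2 * N * f_inst
```

For a single linear chirp the quadratic kernel is exact and there are no cross-terms; with multicomponent signals the cross-terms appear, which is what motivates the smoothed variants discussed elsewhere in this collection.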
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high-technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate of the flow distribution within the manifold. A complex, 3D, multiple-zone, structured grid was generated from a 3D CAD file of the geometry. An Euler solution was computed with a fully implicit compressible flow solver. Post-processing consisted of full 3D color graphics and mass-averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in, and performance characteristics of, the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick-turnaround CFD analysis of the next iteration of the manifold design.
NASA Astrophysics Data System (ADS)
Bliss, Donald; Franzoni, Linda; Rouse, Jerry; Manning, Ben
2005-09-01
An analysis method for time-dependent broadband diffuse sound fields in enclosures is described. Beginning with a formulation utilizing time-dependent broadband intensity boundary sources, the strength of these wall sources is expanded in a series in powers of an absorption parameter, thereby giving a separate boundary integral problem for each power. The temporal behavior is characterized by a Taylor expansion in the delay time for a source to influence an evaluation point. The lowest-order problem has a uniform interior field proportional to the reciprocal of the absorption parameter, as expected, and exhibits relatively slow exponential decay. The next-order problem gives a mean-square pressure distribution that is independent of the absorption parameter and is primarily responsible for the spatial variation of the reverberant field. This problem, which is driven by input sources and the lowest-order reverberant field, depends on source location and the spatial distribution of absorption. Additional problems proceed at integer powers of the absorption parameter, but are essentially higher-order corrections to the spatial variation. Temporal behavior is expressed in terms of an eigenvalue problem, with boundary source strength distributions expressed as eigenmodes. Solutions exhibit rapid short-time spatial redistribution followed by long-time decay of a predominant spatial mode.
A model of return intervals between earthquake events
NASA Astrophysics Data System (ADS)
Zhou, Yu; Chechkin, Aleksei; Sokolov, Igor M.; Kantz, Holger
2016-06-01
Application of the diffusion entropy analysis and the standard deviation analysis to the time sequence of the southern California earthquake events from 1976 to 2002 uncovered scaling behavior typical of anomalous diffusion. However, the origin of such behavior is still under debate. Some studies attribute the scaling behavior to the correlations in the return intervals, or waiting times, between aftershocks or mainshocks. To elucidate the nature of the scaling, we applied specific reshuffling techniques to eliminate correlations between different types of events and then examined how this affects the scaling behavior. We demonstrate that the origin of the observed scaling behavior is the interplay between the mainshock waiting time distribution and the structure of clusters of aftershocks, not correlations in waiting times between the mainshocks and aftershocks themselves. Our findings are corroborated by numerical simulations of a simple model showing very similar behavior. The mainshocks are modeled by a renewal process with a power-law waiting time distribution between events, and aftershocks follow a nonhomogeneous Poisson process with the rate governed by Omori's law.
An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers
Sun, Kewen; Jin, Tian; Yang, Dongkai
2015-01-01
In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner–Ville distribution (RSPWVD) has been proposed in interference detection for Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, the two-dimensional low-pass filtering smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time, the reassignment method is adopted to improve the TF concentration properties of the auto-terms of the signal components. This proposed interference detection method is evaluated by experiments on GPS L1 signals in the disturbing scenarios compared to the state-of-the-art interference detection approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-terms problem and also preserves good TF localization properties, which has been proven to be effective and valid to enhance the interference detection performance of the GNSS receivers, particularly in the jamming environments. PMID:25905704
Measuring the effects of heterogeneity on distributed systems
NASA Technical Reports Server (NTRS)
El-Toweissy, Mohamed; Zeineldine, Osman; Mukkamala, Ravi
1991-01-01
Distributed computer systems in daily use are becoming more and more heterogeneous. Currently, much of the design and analysis of such systems assumes homogeneity. This assumption of homogeneity has been driven mainly by the resulting simplicity in modeling and analysis. A simulation study is presented which investigated the effects of heterogeneity on scheduling algorithms for hard real-time distributed systems. In contrast to previous results indicating that random scheduling may be as good as a more complex scheduler, the scheduling algorithm studied here is shown to be consistently better than a random scheduler. This conclusion holds most strongly at high workloads as well as at high levels of heterogeneity.
Principal components analysis of the photoresponse nonuniformity of a matrix detector.
Ferrero, Alejandro; Alda, Javier; Campos, Joaquín; López-Alonso, Jose Manuel; Pons, Alicia
2007-01-01
The principal component analysis is used to identify and quantify spatial distributions of relative photoresponse as a function of the exposure time for a visible CCD array. The analysis shows a simple way to define an invariant photoresponse nonuniformity and compare it with the definition of this invariant pattern as the one obtained for long exposure times. Experimental data of radiant exposure from levels of irradiance obtained in a stable and well-controlled environment are used.
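The PCA step described can be sketched with an SVD of the mean-centred frame stack; the synthetic detector below (pattern amplitude, exposure range, and noise level all illustrative) shows the first component recovering an exposure-invariant nonuniformity pattern:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic CCD: a fixed relative-photoresponse pattern scaled linearly by
# exposure time, plus readout noise (all values illustrative).
npix = 64 * 64
pattern = 1.0 + 0.02 * rng.standard_normal(npix)   # true nonuniformity
exposures = np.linspace(0.1, 2.0, 20)              # exposure times
frames = (exposures[:, None] * pattern[None, :]
          + 0.01 * rng.standard_normal((20, npix)))

# PCA via SVD of the mean-centred stack: the first principal component
# isolates the exposure-invariant photoresponse nonuniformity.
centred = frames - frames.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
pc1 = vt[0]
share = s[0] ** 2 / np.sum(s ** 2)                 # variance explained
corr = np.corrcoef(pc1, pattern)[0, 1]
print(share, abs(corr))
```

The dominant variance share quantifies how well a single invariant pattern describes the response at all exposure times, which is the invariance the abstract compares against the long-exposure definition.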
Phylogenetic analysis reveals a scattered distribution of autumn colours
Archetti, Marco
2009-01-01
Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636
Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klumpp, John
We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution.
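The learning-mode/detection-mode split can be sketched in its simplest nonparametric form: accumulate background counts, then set a decision threshold from the empirical distribution at a chosen false-alarm probability (the Poisson rate, sample sizes, and source reading below are invented for illustration; the paper's full Bayesian treatment covers multiple channels and detectors):

```python
import numpy as np

rng = np.random.default_rng(6)

# 'Learning mode': accumulate background measurements and keep them as a
# nonparametric (empirical) sampling distribution.
background = rng.poisson(lam=40, size=10_000)   # counts per interval

# 'Detection mode': flag a new measurement if it exceeds a threshold drawn
# from the empirical background distribution at a target false-alarm rate.
false_alarm = 1e-3
threshold = np.quantile(background, 1.0 - false_alarm)

source_reading = 75   # hypothetical measurement with a source present
print(threshold, source_reading > threshold)
```

Because the threshold comes from the measured background rather than an assumed model, it automatically tracks site- and detector-specific variations as the background archive grows.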
Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory
NASA Astrophysics Data System (ADS)
Wang, Na; Li, Dong; Wang, Qiwen
2012-12-01
The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in the paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and that the degree distributions of the networks associated with the growth rates of the GDP series are scale-free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics.
We find strong influences of government policies in China on the dynamics of GDP and the adjustment of the three industries. The work in our paper provides a new way to understand the dynamics of economic development.
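The natural visibility algorithm underlying this analysis is compact enough to sketch directly; a minimal, unoptimized O(n^2) implementation (the random-walk test series is illustrative, not the Chinese macroeconomic data):

```python
import numpy as np
from collections import Counter

def natural_visibility_graph(y):
    """Edge list of the natural visibility graph of a time series.

    Points (a, y[a]) and (c, y[c]) are linked when every intermediate
    sample lies strictly below the straight line joining them.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = []
    for a in range(n):
        for c in range(a + 1, n):
            line = y[a] + (y[c] - y[a]) * (np.arange(a + 1, c) - a) / (c - a)
            if np.all(y[a + 1:c] < line):   # vacuously true for neighbors
                edges.append((a, c))
    return edges

# Degree distribution of the graph built from a random-walk series.
rng = np.random.default_rng(7)
series = np.cumsum(rng.standard_normal(500))
edges = natural_visibility_graph(series)
degree = Counter()
for a, c in edges:
    degree[a] += 1
    degree[c] += 1
print(len(edges), max(degree.values()))
```

The degree sequence collected here is exactly the quantity whose distribution (exponential versus scale-free) distinguishes the industry series from the GDP series in the abstract.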
NASA Astrophysics Data System (ADS)
Kar, Soummya; Moura, José M. F.
2011-08-01
The paper considers gossip distributed estimation of a (static) distributed random field (a.k.a. a large-scale unknown parameter vector) observed by sparsely interconnected sensors, each of which only observes a small fraction of the field. We consider linear distributed estimators whose structure combines the information flow among sensors (the consensus term resulting from the local gossiping exchange among sensors when they are able to communicate) and the information gathering measured by the sensors (the sensing or innovations term). This leads to mixed time-scale algorithms: one time scale associated with the consensus and the other with the innovations. The paper establishes a distributed observability condition (global observability plus mean connectedness) under which the distributed estimates are consistent and asymptotically normal. We introduce the distributed notion equivalent to the (centralized) Fisher information rate, which is a bound on the mean square error reduction rate of any distributed estimator; we show that under the appropriate modeling and structural network communication conditions (gossip protocol) the distributed gossip estimator attains this distributed Fisher information rate, asymptotically achieving the performance of the optimal centralized estimator. Finally, we study the behavior of the distributed gossip estimator when the measurements fade (noise variance grows) with time; in particular, we determine the maximum rate at which the noise variance can grow while the distributed estimator remains consistent, showing that, as long as the centralized estimator is consistent, the distributed estimator is as well.
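The consensus-plus-innovations structure can be sketched in a deterministic toy setting (fixed fully connected graph rather than random gossip, constant gains, noiseless measurements; all values illustrative). Each sensor sees only part of a two-parameter field, yet global observability lets every local estimate converge to the full field:

```python
import numpy as np

# Three sensors, each observing only part of a 2-parameter field theta.
theta = np.array([1.0, -2.0])                  # unknown field
H = [np.array([[1.0, 0.0]]),                   # sensor 1 sees theta[0]
     np.array([[0.0, 1.0]]),                   # sensor 2 sees theta[1]
     np.array([[1.0, 1.0]])]                   # sensor 3 sees the sum
y = [Hi @ theta for Hi in H]                   # noiseless measurements
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # fully connected graph

alpha, beta = 0.1, 0.1                         # innovations / consensus gains
x = [np.zeros(2) for _ in range(3)]
for _ in range(5000):
    x = [xi
         - beta * sum(xi - x[j] for j in neighbors[i])   # consensus term
         + alpha * H[i].T @ (y[i] - H[i] @ xi)           # innovations term
         for i, xi in enumerate(x)]
print([np.round(xi, 3) for xi in x])   # every local estimate approaches theta
```

No single sensor's H matrix is invertible, so neither term works alone: the innovations term injects local data, and the consensus term circulates it until the globally observable field is reconstructed everywhere. The paper's mixed time scales correspond to decaying, differently scaled gain sequences rather than the constants used here.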
Software system for data management and distributed processing of multichannel biomedical signals.
Franaszczuk, P J; Jouny, C C
2004-01-01
The presented software is designed for efficient utilization of a cluster of PC computers for signal analysis of multichannel physiological data. The system consists of three main components: 1) a library of input and output procedures, 2) a database storing additional information about location in a storage system, and 3) a user interface for selecting data for analysis, choosing programs for analysis, and distributing computation and output data on cluster nodes. The system allows for processing multichannel time series data in multiple binary formats. The descriptions of the data format, channels and time of recording are included in separate text files. Definition and selection of multiple channel montages is possible. Epochs for analysis can be selected both manually and automatically. Implementation of new signal processing procedures is possible with minimal programming overhead for the input/output processing and user interface. The number of nodes in the cluster used for computations and the amount of storage can be changed with no major modification to the software. Current implementations include the time-frequency analysis of multiday, multichannel recordings of intracranial EEG of epileptic patients as well as evoked response analyses of repeated cognitive tasks.
Estimation of value at risk and conditional value at risk using normal mixture distributions model
NASA Astrophysics Data System (ADS)
Kamaruzzaman, Zetty Ain; Isa, Zaidi
2013-04-01
The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, fitting it to the real data. Second, we present its application in risk analysis, using it to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
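For concreteness, the VaR/CVaR step can be sketched as follows once mixture parameters are in hand; the two-component parameters below are hypothetical, and the EM fitting stage is not shown:

```python
import math

def mix_cdf(x, comps):
    """CDF of a normal mixture; comps is a list of (weight, mu, sigma)."""
    return sum(w * 0.5 * (1 + math.erf((x - m) / (s * math.sqrt(2))))
               for w, m, s in comps)

def var_cvar(comps, level=0.05):
    """VaR and CVaR (as positive losses) at the given lower-tail level.

    VaR is -q, where the mixture CDF satisfies F(q) = level (found by
    bisection); CVaR is -E[X | X < q], using the truncated-normal identity
    E[X 1{X<q}] = sum_i w_i (mu_i Phi(a_i) - sigma_i phi(a_i)),
    with a_i = (q - mu_i) / sigma_i."""
    lo, hi = -10.0, 10.0
    for _ in range(100):             # bisection on the monotone mixture CDF
        mid = (lo + hi) / 2
        if mix_cdf(mid, comps) < level:
            lo = mid
        else:
            hi = mid
    q = (lo + hi) / 2
    tail = sum(w * (m * 0.5 * (1 + math.erf((q - m) / (s * math.sqrt(2))))
                    - s * math.exp(-(q - m) ** 2 / (2 * s * s))
                      / math.sqrt(2 * math.pi))
               for w, m, s in comps)
    return -q, -tail / level

# Hypothetical fitted parameters: a calm regime and a volatile regime.
comps = [(0.8, 0.008, 0.04), (0.2, -0.01, 0.10)]
var95, cvar95 = var_cvar(comps, level=0.05)
```

With a single standard normal component the routine reproduces the textbook values (VaR about 1.645, CVaR about 2.063 at the 5% level), which is a useful self-check before applying it to fitted mixtures.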
NASA Astrophysics Data System (ADS)
Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.
2016-12-01
Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
NASA Astrophysics Data System (ADS)
Bao, Yi; Hoehler, Matthew S.; Smith, Christopher M.; Bundy, Matthew; Chen, Genda
2017-10-01
In this study, a Brillouin scattering-based distributed fiber optic sensor is implemented for the first time to measure temperature distributions and detect cracks in concrete structures subjected to fire. A telecommunication-grade optical fiber is characterized as a high-temperature sensor with pulse pre-pump Brillouin optical time domain analysis (PPP-BOTDA) and implemented to measure spatially distributed temperatures in reinforced concrete beams in fire. Four beams were tested to failure in a natural gas fueled compartment fire, each instrumented with one fused silica, single-mode optical fiber as a distributed sensor and four thermocouples. Prior to concrete cracking, the distributed temperature was validated at the locations of the thermocouples to within a relative difference of less than 9%. Cracks in the concrete can be identified as sharp peaks in the temperature distribution because the cracks are locally filled with hot air. Concrete cracking did not affect the sensitivity of the distributed sensor, but concrete spalling broke the optical fiber loop required for PPP-BOTDA measurements.
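The crack signature described above, a sharp local peak in the temperature profile, suggests a simple detection rule; the window size and temperature-jump threshold below are illustrative assumptions, not values from the study:

```python
import statistics

def find_crack_peaks(position_m, temp_c, window=5, jump=20.0):
    """Flag sharp local peaks in a distributed temperature trace (sketch).

    A point is flagged when it exceeds the median of its neighborhood by
    more than `jump` degrees, mimicking the hot-air-filled-crack signature
    while ignoring the smooth background temperature gradient."""
    half = window // 2
    peaks = []
    for i in range(half, len(temp_c) - half):
        neigh = temp_c[i - half:i] + temp_c[i + 1:i + half + 1]
        if temp_c[i] - statistics.median(neigh) > jump:
            peaks.append(position_m[i])
    return peaks

# Synthetic 2 m profile sampled every 0.1 m with one crack-like spike.
pos = [i * 0.1 for i in range(20)]
temp = [200.0] * 20
temp[10] = 260.0
cracks = find_crack_peaks(pos, temp)
```

A median (rather than mean) baseline keeps a single spike from inflating its own neighborhood estimate.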
Progress on Ultra-Dense Quantum Communication Using Integrated Photonic Architecture
2013-01-01
Table of contents excerpt: 2 Protocol Development; 2.1 Achieving multiple secure bits per coincidence in time-energy entanglement-based quantum key distribution; 2.2 Extended dispersive-optics QKD (DO-QKD) protocol; 2.3 Analysis of non-local correlations of entangled photon pairs for arbitrary dispersion.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process are functionally dependent on the autocorrelation parameter and the crossing level. Using values for the mean and standard deviation predicted by the method of moments, the distribution parameters were estimated. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time can be calculated.
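The simulation-and-estimation idea can be reproduced in miniature; the AR(1) autocorrelation structure here is an assumed stand-in for the paper's selected autocorrelation functions:

```python
import random

def crossing_prob(rho, level, horizon, trials=2000, seed=7):
    """Monte Carlo sketch: probability that a stationary Gaussian AR(1)
    process with autocorrelation `rho` and unit marginal variance exceeds
    `level` at least once within `horizon` steps."""
    rng = random.Random(seed)
    innov_sd = (1 - rho * rho) ** 0.5   # keeps the marginal variance at 1
    hits = 0
    for _ in range(trials):
        x = rng.gauss(0, 1)             # start in the stationary law
        if x > level:
            hits += 1
            continue
        for _ in range(horizon):
            x = rho * x + rng.gauss(0, innov_sd)
            if x > level:
                hits += 1
                break
    return hits / trials
```

Running it for several (rho, level) pairs reproduces the qualitative dependence the abstract describes: the exceedance probability falls as the crossing level rises, and varies with the autocorrelation parameter.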
Aeroelastic Analysis of a Distributed Electric Propulsion Wing
NASA Technical Reports Server (NTRS)
Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer
2017-01-01
An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, which were seen to be well converged. It was found that no oscillatory instability existed, only divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.
NASA Astrophysics Data System (ADS)
Zheng, F.; Shi, X.; Wu, J.; Gao, Y. W.
2013-12-01
Chlorinated solvents such as trichloroethene (TCE) and tetrachloroethene (PCE) are widespread groundwater contaminants often referred to as dense non-aqueous phase liquids (DNAPLs). Accurate description of the spreading behavior and configuration of subsurface DNAPL migration is important, particularly for designing effective remediation strategies. In this study, a 2-D experiment was conducted to investigate the infiltration behavior and spatial distribution of PCE in saturated porous media. Accusand 20/30 mesh sand (Unimin, Le Sueur, MN) was used as the background medium, with two lenses of 70/80 and 60/70 mesh sand embedded to simulate heterogeneous conditions. Dyed PCE (100 ml) was released into the flow cell twice at a constant rate of 2 ml/min using a Harvard Apparatus syringe pump with a 50 ml glass syringe, while water was continuously injected at 5 ml/min through the inlet at the left side of the sandbox and withdrawn at the same rate at the right side to create hydrodynamic conditions. A light transmission (LT) system with a thermoelectrically air-cooled charge-coupled device (CCD) camera was used to record the migration of PCE and determine its saturation distribution in the sandbox experiment. All images were processed using MATLAB to calculate the thickness-averaged PCE saturation for each pixel. Mass balance was checked by comparing the known injected amounts of PCE with those calculated from the LT analysis. Results showed that the LT method is effective for delineating PCE migration pathways and quantifying the saturation distribution. The relative errors of the total PCE volumes calculated by LT analysis at different times were within 15% of the injected PCE volumes. Simulations were conducted using the multiphase modeling software T2VOC, calibrated against the LT analysis results of three recorded time steps to fit the complete spatio-temporal distribution of the PCE saturation.
Model verification was then performed using the other eight recorded time steps. The simulated results showed that the model could successfully reproduce the migration pathways and distribution configuration observed in the laboratory experiment and LT analysis, except for a smaller pool height on the lenses and lower saturation values in the PCE accumulation area due to local heterogeneities. Owing to the influence of water flow, the PCE distribution was asymmetrical, and the distribution area, pool height, and PCE saturation in the accumulation region on the right side were much greater. Acknowledgment: This work is financially supported by the National Natural Science Foundation of China under grants No. 41030746 and 41172206.
Time-frequency analysis of human motion during rhythmic exercises.
Omkar, S N; Vyas, Khushi; Vikranth, H N
2011-01-01
Biomechanical signals due to human movements during exercise are represented in the time-frequency domain using the Wigner Distribution Function (WDF). Analysis based on the WDF reveals instantaneous spectral and power changes during a rhythmic exercise. Investigations were carried out on 11 healthy subjects who performed 5 cycles of sun salutation, with a body-mounted Inertial Measurement Unit (IMU) as a motion sensor. The variances of Instantaneous Frequency (IF) and Instantaneous Power (IP) for performance analysis of the subjects were estimated using a one-way ANOVA model. Results reveal that joint time-frequency analysis of biomechanical signals during motion facilitates a better understanding of grace and consistency during rhythmic exercise.
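To illustrate how the WDF exposes instantaneous frequency in such signals, here is a naive stdlib-only discrete pseudo Wigner-Ville sketch applied to a synthetic chirp (a real analysis would use an FFT library and actual IMU data):

```python
import cmath
import math

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution via a naive O(N^2) DFT.

    Returns W[t] = list of |W(t, k)| over frequency bins k for each time t;
    the usable lag window shrinks near the ends of the record."""
    n = len(x)
    W = []
    for t in range(n):
        half = min(t, n - 1 - t)
        # instantaneous autocorrelation r(tau) = x(t+tau) * conj(x(t-tau))
        r = [x[t + tau] * (x[t - tau]).conjugate()
             for tau in range(-half, half + 1)]
        m = len(r)
        row = [abs(sum(r[j] * cmath.exp(-2j * math.pi * k * j / m)
                       for j in range(m))) for k in range(m)]
        W.append(row)
    return W

# Analytic chirp whose instantaneous frequency rises linearly with time,
# standing in for a body-mounted IMU signal.
sig = [cmath.exp(2j * math.pi * (0.05 * t + 0.002 * t * t)) for t in range(64)]
W = wigner_ville(sig)
```

For the chirp, the ridge of the distribution (the per-time peak frequency bin) climbs with time, which is exactly the kind of instantaneous-frequency tracking the abstract exploits.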
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... series in the pilot: (1) A time series analysis of open interest; and (2) An analysis of the distribution... times the number of shares outstanding. These are summed for all 500 stocks and divided by a... below $3.00 and $0.10 for all other series. Strike price intervals would be set no less than 5 points...
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and the relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures for model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
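The moving average and autoregressive models, and their combination, can be simulated directly in the time domain; this stdlib Python sketch (not the paper's FORTRAN code) generates an ARMA series and checks its lag-1 autocorrelation:

```python
import random

def simulate_arma(phi, theta, n, sigma=1.0, seed=42):
    """Simulate an ARMA(p, q) process in the time domain:
    x[t] = sum_i phi[i]*x[t-1-i] + e[t] + sum_j theta[j]*e[t-1-j]."""
    rng = random.Random(seed)
    p, q = len(phi), len(theta)
    x, e = [], []
    for t in range(n):
        et = rng.gauss(0, sigma)
        xt = et
        xt += sum(phi[i] * x[t - 1 - i] for i in range(min(p, t)))
        xt += sum(theta[j] * e[t - 1 - j] for j in range(min(q, t)))
        x.append(xt)
        e.append(et)
    return x

def sample_acf(series, lag):
    """Sample autocorrelation at the given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((v - mean) ** 2 for v in series) / n
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag)) / n
    return cov / var

# A pure AR(1) process with coefficient 0.7 has lag-1 autocorrelation 0.7.
x = simulate_arma(phi=[0.7], theta=[], n=5000)
rho1 = sample_acf(x, 1)
```

Setting `theta` non-empty turns the same routine into a moving average or mixed model, mirroring the model combinations discussed in the paper.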
The rates and time-delay distribution of multiply imaged supernovae behind lensing clusters
NASA Astrophysics Data System (ADS)
Li, Xue; Hjorth, Jens; Richard, Johan
2012-11-01
Time delays of gravitationally lensed sources can be used to constrain the mass model of a deflector and determine cosmological parameters. We here present an analysis of the time-delay distribution of multiply imaged sources behind 17 strong lensing galaxy clusters with well-calibrated mass models. We find that for time delays less than 1000 days, at z = 3.0, their logarithmic probability distribution functions are well represented by P(log Δt) = 5.3 × 10^-4 Δt^β̃ / M_250^(2β̃), with β̃ = 0.77, where M_250 is the projected cluster mass inside 250 kpc (in units of 10^14 M_⊙) and β̃ is the power-law slope of the distribution. The resultant probability distribution function enables us to estimate the time-delay distribution in a lensing cluster of known mass. For a cluster with M_250 = 2 × 10^14 M_⊙, the fraction of time delays less than 1000 days is approximately 3%. Taking Abell 1689 as an example, its dark halo and brightest galaxies, with central velocity dispersions σ ≥ 500 km s^-1, mainly produce large time delays, while galaxy-scale mass clumps are responsible for generating smaller time delays. We estimate the probability of observing multiple images of a supernova in the known images of Abell 1689. A two-component model for estimating the supernova rate is applied in this work. For a magnitude threshold of m_AB = 26.5, the yearly rate of Type Ia (core-collapse) supernovae with time delays less than 1000 days is 0.004 ± 0.002 (0.029 ± 0.001). If the magnitude threshold is lowered to m_AB ~ 27.0, the rate of core-collapse supernovae suitable for time-delay observation is 0.044 ± 0.015 per year.
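The quoted fraction can be sanity-checked by integrating the fitted distribution over log Δt; the lower cutoff and normalization handling below are assumptions (the published fit's exact limits may differ), so this is only an order-of-magnitude check:

```python
import math

def frac_below(dt_max_days, m250, beta=0.77, a=5.3e-4, steps=20000):
    """Midpoint-rule integral of P(log dt) = a * dt^beta / m250^(2*beta)
    over log10(dt) up to log10(dt_max). The lower limit of -3 (about
    0.001 days) is an assumption; the integrand is negligible there."""
    u_hi = math.log10(dt_max_days)
    u_lo = -3.0
    du = (u_hi - u_lo) / steps
    total = 0.0
    for i in range(steps):
        u = u_lo + (i + 0.5) * du
        total += a * (10 ** u) ** beta / m250 ** (2 * beta) * du
    return total

# Fraction of time delays below 1000 days for M_250 = 2 (x 10^14 M_sun):
frac = frac_below(1000, 2.0)
```

With these assumptions the integral lands at the few-percent level quoted in the abstract, and shrinks rapidly as the time-delay cutoff is reduced, as expected for a rising power law.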
GPS FOM Chimney Analysis using Generalized Extreme Value Distribution
NASA Technical Reports Server (NTRS)
Ott, Rick; Frisbee, Joe; Saha, Kanan
2004-01-01
Often an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The Generalized Extreme Value (GEV) distribution method can be profitably employed for such an estimate in many situations. It is well known that, according to the Central Limit Theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, it is observed that the extreme value of a data set often has a distribution that can be formulated with a Generalized Extreme Value distribution. In Space Shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM chimney, a period of time during which the FOM value stays higher than 5. A longer period with FOM of 6 or higher causes the navigated state to accumulate more error for lack of a state update. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data with FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential, and the extreme value statistical technique is applied to analyze them. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
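A sketch of this extreme-value workflow on simulated chimney-like data, using the Gumbel member of the GEV family fitted by the method of moments rather than the paper's maximum likelihood, and an assumed exponential duration model:

```python
import math
import random

def gumbel_fit_moments(maxima):
    """Fit a Gumbel law (GEV with zero shape parameter) to block maxima by
    the method of moments; returns (location mu, scale beta)."""
    n = len(maxima)
    mean = sum(maxima) / n
    sd = (sum((v - mean) ** 2 for v in maxima) / (n - 1)) ** 0.5
    beta = sd * math.sqrt(6) / math.pi
    mu = mean - 0.5772156649 * beta       # Euler-Mascheroni constant
    return mu, beta

def gumbel_quantile(p, mu, beta):
    """Inverse Gumbel CDF: the limit value exceeded with probability 1-p."""
    return mu - beta * math.log(-math.log(p))

# Block maxima of hypothetical chimney durations: the worst of 50
# exponential draws per simulated entry run, over 400 runs.
rng = random.Random(3)
maxima = [max(rng.expovariate(1.0) for _ in range(50)) for _ in range(400)]
mu, beta = gumbel_fit_moments(maxima)
limit_95 = gumbel_quantile(0.95, mu, beta)
```

The maximum of 50 unit-rate exponentials is approximately Gumbel with location ln 50 and unit scale, so the fit provides its own check; `limit_95` plays the role of the limit value statistic in the abstract.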
Evaluation of performance of distributed delay model for chemotherapy-induced myelosuppression.
Krzyzanski, Wojciech; Hu, Shuhua; Dunlavey, Michael
2018-04-01
The distributed delay model has been introduced to replace the transit compartments in the classic model of chemotherapy-induced myelosuppression with a convolution integral. The maturation of granulocyte precursors in the bone marrow is described by the gamma probability density function with shape parameter ν. If ν is a positive integer, the distributed delay model coincides with the classic model with ν transit compartments. The purpose of this work was to evaluate the performance of the distributed delay model, with particular focus on deterministic model identifiability in the presence of the shape parameter. The classic model served as a reference for comparison. Previously published white blood cell (WBC) count data in rats receiving bolus doses of 5-fluorouracil were fitted by both models. The negative two log-likelihood objective function (-2LL) and running times were used as the major markers of performance. A local sensitivity analysis was done to evaluate the impact of ν on the pharmacodynamic WBC response. The ν estimate was 1.46 with a CV of 16.1%, compared to ν = 3 for the classic model. The difference of 6.78 in -2LL between the classic model and the distributed delay model implied that the latter performed significantly better than the former according to the log-likelihood ratio test (P = 0.009), although the overall improvement was modest. The running times were 1 s and 66.2 min, respectively. The long running time of the distributed delay model was attributed to the computationally intensive evaluation of the convolution integral. The sensitivity analysis revealed that ν strongly influences the WBC response by controlling cell proliferation and the elimination of WBCs from the circulation. In conclusion, the distributed delay model was deterministically identifiable from typical cytotoxic data. Its performance was modestly better than that of the classic model, at the cost of a significantly longer running time.
CDF-XL: computing cumulative distribution functions of reaction time data in Excel.
Houghton, George; Grange, James A
2011-12-01
In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
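The core computation behind such plots is straightforward to reproduce outside Excel; this Python sketch (not part of CDF-XL) averages per-participant RT percentiles by condition:

```python
from collections import defaultdict

def percentile(sorted_vals, p):
    """Linear-interpolation percentile of a sorted list, p in [0, 1]."""
    idx = p * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (idx - lo) * (sorted_vals[hi] - sorted_vals[lo])

def cdf_by_condition(rows, probs=(0.1, 0.3, 0.5, 0.7, 0.9)):
    """Sketch of the 'standard' CDF analysis: percentiles of each
    participant's RT distribution per condition, then averaged
    (Vincentized) across participants. rows: (subject, condition, rt)."""
    cell = defaultdict(list)
    for subj, cond, rt in rows:
        cell[(subj, cond)].append(rt)
    by_cond = defaultdict(list)
    for (subj, cond), rts in cell.items():
        rts.sort()
        by_cond[cond].append([percentile(rts, p) for p in probs])
    return {cond: [sum(col) / len(col) for col in zip(*plists)]
            for cond, plists in by_cond.items()}

# Tiny illustrative data set: two subjects, one condition.
rows = [("s1", "A", 300), ("s1", "A", 400), ("s1", "A", 500),
        ("s2", "A", 500), ("s2", "A", 600), ("s2", "A", 700)]
curves = cdf_by_condition(rows)
```

Each condition's curve is a list of mean RTs at the chosen cumulative probabilities, ready for plotting or for entry into further statistical analysis, mirroring output format (i) in the abstract.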
Review of Vibration-Based Helicopters Health and Usage Monitoring Methods
2001-04-05
FM4, NA4, NA4*, NB4 and NB4* (Polyshchuk et al., 1998). The Wigner-Ville distribution (WVD) is a joint time-frequency signal analysis. The WVD is one... signal processing methodologies that are of relevance to vibration-based damage detection (e.g., Wavelet Transform and Wigner-Ville distribution) will be... operation cost, reduce maintenance flights, and increase flight safety. Key Words: HUMS; Wavelet Transform; Wigner-Ville distribution; O&S; Machinery
Probability Density Functions of Observed Rainfall in Montana
NASA Technical Reports Server (NTRS)
Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.
1995-01-01
The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis on large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis readily allow development of radar reflectivity factor (and, by extension, rain rate) distributions. Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, one PDF would exist for all cases, or many PDFs would have the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases would validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89 percent of the observed distributions. Further analysis indicates that the Type 1 curve does approximate the shape of the distributions but quantitatively does not produce a great fit.
Areal and time distributions of volcanic formations on Mars
NASA Technical Reports Server (NTRS)
Katterfeld, G. N.; Vityaz, V. I.
1987-01-01
The analysis of igneous rock distribution was carried out on the basis of the 1:5,000,000-scale geomorphological map of Mars, using data obtained from interpretation of 1:2,000,000-scale pictures from Mariner 9, Mars 4, Mars 5, and Vikings 1 and 2. Areological areas are listed as having been distinguished as the stratigraphic basis for a Martian time scale. The area of volcanic eruptions and the number of eruptive centers were calculated on 10 x 10 deg cells and for each areological era. The largest area of eruptive activity at different times is related to the Tharsis tectonic uplift. The study of the distribution of igneous rock areas and the number of volcanic centers over 10 deg sectors and zones revealed concentration belts of volcanic formations.
NASA Astrophysics Data System (ADS)
Ripamonti, Giancarlo; Lacaita, Andrea L.
1993-03-01
The extreme sensitivity and time resolution of Geiger-mode avalanche photodiodes (GM-APDs) have already been exploited for optical time domain reflectometry (OTDR). Better than 1 cm spatial resolution in Rayleigh scattering detection was demonstrated. Distributed and quasi-distributed optical fiber sensors can take advantage of the capabilities of GM-APDs. Extensive studies have recently disclosed the main characteristics and limitations of silicon devices, both commercially available and developmental. In this paper we report an analysis of the performance of these detectors. The main characteristics of GM-APDs of interest for distributed optical fiber sensors are briefly reviewed. Command electronics (active quenching) is then introduced. The detector timing performance sets the maximum spatial resolution in experiments employing OTDR techniques. We highlight that the achievable time resolution depends on the physics of the avalanche spreading over the device area. On the basis of these results, the trade-off between the important parameters (quantum efficiency, time resolution, background noise, and afterpulsing effects) is considered. Finally, we show first results on germanium devices, capable of single photon sensitivity at 1.3 and 1.5 micrometers with sub-nanosecond time resolution.
NASA Astrophysics Data System (ADS)
Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.
2013-04-01
Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations in real time (according to different metrics). Since none of the instruments for measuring aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the field is still searching for an appropriate and robust method. In this context, this exploratory study investigates a statistical method for analysing time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data were used from a workplace study that investigated the potential for inhalation exposure during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters: one measuring at the source of the released particles and the other measuring in parallel in the far field. The Bayesian probabilistic approach allows probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions.
The probability distributions obtained from the time-resolved data at the source can be compared with those obtained far-field, leading to a quantitative estimate of the airborne particles released at the source while the task is performed. Beyond the results obtained, this exploratory study indicates that such analysis requires specific experience in statistics.
Integrating software architectures for distributed simulations and simulation analysis communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael
2005-10-01
The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld (trademark) collaboration client, which uses the GroupMeld (trademark) synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
Dorazio, Robert; Karanth, K. Ullas
2017-01-01
Motivation: Several spatial capture-recapture (SCR) models have been developed to estimate animal abundance by analyzing the detections of individuals in a spatial array of traps. Most of these models do not use the actual dates and times of detection, even though this information is readily available when using continuous-time recorders, such as microphones or motion-activated cameras. Instead most SCR models either partition the period of trap operation into a set of subjectively chosen discrete intervals and ignore multiple detections of the same individual within each interval, or they simply use the frequency of detections during the period of trap operation and ignore the observed times of detection. Both practices make inefficient use of potentially important information in the data. Model and data analysis: We developed a hierarchical SCR model to estimate the spatial distribution and abundance of animals detected with continuous-time recorders. Our model includes two kinds of point processes: a spatial process to specify the distribution of latent activity centers of individuals within the region of sampling and a temporal process to specify temporal patterns in the detections of individuals. We illustrated this SCR model by analyzing spatial and temporal patterns evident in the camera-trap detections of tigers living in and around the Nagarahole Tiger Reserve in India. We also conducted a simulation study to examine the performance of our model when analyzing data sets of greater complexity than the tiger data. Benefits: Our approach provides three important benefits: First, it exploits all of the information in SCR data obtained using continuous-time recorders. Second, it is sufficiently versatile to allow the effects of both space use and behavior of animals to be specified as functions of covariates that vary over space and time.
Third, it allows both the spatial distribution and abundance of individuals to be estimated, effectively providing a species distribution model, even in cases where spatial covariates of abundance are unknown or unavailable. We illustrated these benefits in the analysis of our data, which allowed us to quantify differences between nocturnal and diurnal activities of tigers and to estimate their spatial distribution and abundance across the study area. Our continuous-time SCR model allows an analyst to specify many of the ecological processes thought to be involved in the distribution, movement, and behavior of animals detected in a spatial trapping array of continuous-time recorders. We plan to extend this model to estimate the population dynamics of animals detected during multiple years of SCR surveys.
Combining real-time monitoring and knowledge-based analysis in MARVEL
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.
1993-01-01
Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.
Stochastic modeling of a serial killer
Simkin, M.V.; Roychowdhury, V.P.
2014-01-01
We analyze the time pattern of the activity of a serial killer, who during twelve years had murdered 53 people. The plot of the cumulative number of murders as a function of time is of “Devil’s staircase” type. The distribution of the intervals between murders (step length) follows a power law with the exponent of 1.4. We propose a model according to which the serial killer commits murders when neuronal excitation in his brain exceeds certain threshold. We model this neural activity as a branching process, which in turn is approximated by a random walk. As the distribution of the random walk return times is a power law with the exponent 1.5, the distribution of the inter-murder intervals is thus explained. We illustrate analytical results by numerical simulation. Time pattern activity data from two other serial killers further substantiate our analysis. PMID:24721476
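The power-law claim can be checked numerically: first-return times of a simple symmetric random walk have a probability density with exponent near 1.5, which a Hill-style estimator recovers (the tail cutoff below is an illustrative choice):

```python
import math
import random

def return_times(steps, seed=11):
    """First-return times to the origin of a +/-1 symmetric random walk."""
    rng = random.Random(seed)
    pos, last, times = 0, 0, []
    for t in range(1, steps + 1):
        pos += rng.choice((-1, 1))
        if pos == 0:
            times.append(t - last)   # completed excursion length
            last = t
    return times

def hill_pdf_exponent(times, tmin=10):
    """Hill-style tail estimate: for a pdf ~ t^(-a), returns
    a = 1 + 1 / mean(log(t / tmin)) over the samples with t >= tmin."""
    tail = [t for t in times if t >= tmin]
    return 1 + 1 / (sum(math.log(t / tmin) for t in tail) / len(tail))

# Return times of a long walk; theory gives pdf exponent 3/2.
a_hat = hill_pdf_exponent(return_times(500000))
```

The estimate lands near 1.5, matching the random-walk approximation that the paper uses to explain the observed inter-murder interval exponent of 1.4.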
Time frequency analysis for automated sleep stage identification in fullterm and preterm neonates.
Fraiwan, Luay; Lweesy, Khaldon; Khasawneh, Natheer; Fraiwan, Mohammad; Wenz, Heinrich; Dickhaus, Hartmut
2011-08-01
This work presents a new methodology for automated sleep stage identification in neonates based on the time-frequency distribution of a single electroencephalogram (EEG) recording and artificial neural networks (ANN). The Wigner-Ville distribution (WVD), Hilbert-Huang spectrum (HHS) and continuous wavelet transform (CWT) time-frequency distributions were used to represent the EEG signal, from which features were extracted using time-frequency entropy. The features were classified using a feed-forward back-propagation ANN. The system was trained and tested using data taken from neonates at a post-conceptional age of 40 weeks, both preterm (14 recordings) and fullterm (15 recordings). The identification of sleep stages was successfully implemented, and the classification based on the WVD outperformed the approaches based on CWT and HHS. The accuracy and kappa coefficient were 0.84 and 0.65, respectively, for the fullterm neonates' recordings, and 0.74 and 0.50, respectively, for the preterm neonates' recordings.
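As an illustration of the entropy feature, here is a minimal numpy-only sketch that computes Shannon entropy over a normalized STFT power distribution. This is a stand-in for the paper's WVD/HHS/CWT representations, and the signals and window parameters are invented.

```python
import numpy as np

def tf_entropy(x, win=64, hop=32):
    """Shannon entropy of a normalized STFT power distribution,
    a stand-in for the paper's time-frequency entropy feature."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win, hop)]
    P = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    p = P / P.sum()                     # joint time-frequency distribution
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.arange(2048)
tone = np.sin(2 * np.pi * 0.1 * t)      # energy concentrated: low entropy
noise = rng.standard_normal(2048)       # energy spread out: high entropy
```

A pure tone concentrates power in a few time-frequency cells (low entropy) while noise spreads it over many (high entropy); sleep-stage-dependent EEG organization produces the same kind of contrast that such features exploit.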
Finite time synchronization of memristor-based Cohen-Grossberg neural networks with mixed delays.
Chen, Chuan; Li, Lixiang; Peng, Haipeng; Yang, Yixian
2017-01-01
Finite-time synchronization, meaning that synchronization is achieved within a settling time, is desirable in some practical applications. However, most published results on finite-time synchronization either do not include delays or include only discrete delays. In view of the fact that distributed delays inevitably exist in neural networks, this paper investigates the finite-time synchronization of memristor-based Cohen-Grossberg neural networks (MCGNNs) with both discrete and distributed delays (mixed delays). By means of a simple feedback controller and novel finite-time synchronization analysis methods, several new criteria are derived to ensure the finite-time synchronization of MCGNNs with mixed delays. The obtained criteria are concise and easy to verify. Numerical simulations demonstrate the effectiveness of the theoretical results.
Fractal scaling analysis of groundwater dynamics in confined aquifers
NASA Astrophysics Data System (ADS)
Tu, Tongbi; Ercan, Ali; Kavvas, M. Levent
2017-10-01
Groundwater closely interacts with surface water and even climate systems in most hydroclimatic settings. Fractal scaling analysis of groundwater dynamics is of significance in modeling hydrological processes by considering potential temporal long-range dependence and scaling crossovers in the groundwater level fluctuations. In this study, it is demonstrated that the groundwater level fluctuations in confined aquifer wells with long observations exhibit site-specific fractal scaling behavior. Detrended fluctuation analysis (DFA) was utilized to quantify the monofractality, and multifractal detrended fluctuation analysis (MF-DFA) and multiscale multifractal analysis (MMA) were employed to examine the multifractal behavior. The DFA results indicated that fractals exist in groundwater level time series, and it was shown that the estimated Hurst exponent is closely dependent on the length and specific time interval of the time series. The MF-DFA and MMA analyses showed that different levels of multifractality exist, which may be partially due to a broad probability density distribution with infinite moments. Furthermore, it is demonstrated that the underlying distribution of groundwater level fluctuations exhibits either non-Gaussian characteristics, which may be fitted by the Lévy stable distribution, or Gaussian characteristics depending on the site characteristics. However, fractional Brownian motion (fBm), which has been identified as an appropriate model to characterize groundwater level fluctuation, is Gaussian with finite moments. Therefore, fBm may be inadequate for the description of physical processes with infinite moments, such as the groundwater level fluctuations in this study. It is concluded that there is a need for generalized governing equations of groundwater flow processes that can model both the long-memory behavior and the Brownian finite-memory behavior.
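A minimal sketch of order-1 DFA, the monofractal method named above, may clarify the procedure. The window sizes and the white-noise test signal are arbitrary choices, not the groundwater data.

```python
import numpy as np

def dfa(x, scales=(16, 32, 64, 128, 256)):
    """Order-1 detrended fluctuation analysis: returns the scaling
    exponent alpha from F(s) ~ s^alpha (alpha ~ 0.5 for white noise)."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            a, b = np.polyfit(t, seg, 1)   # local linear trend
            ms.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(ms)))     # fluctuation at scale s
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return float(alpha)

rng = np.random.default_rng(42)
alpha_white = dfa(rng.standard_normal(8192))
```

For uncorrelated noise the exponent is near 0.5; persistent long-range correlations push it toward 1, and integrated (Brownian-like) signals give values above 1, which is how the site-specific scaling behaviors in the study are distinguished.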
NMR relaxation in natural soils: Fast Field Cycling and T1-T2 Determination by IR-MEMS
NASA Astrophysics Data System (ADS)
Haber-Pohlmeier, S.; Pohlmeier, A.; Stapf, S.; van Dusschoten, D.
2009-04-01
Soils are natural porous media of the highest importance for food production and the sustainment of water resources. For these functions, the prominent properties are their ability to retain and transport water, which is mainly controlled by the pore size distribution. The latter is related to the NMR relaxation times of water molecules, of which the longitudinal relaxation time can be determined non-invasively by fast field cycling (FFC) relaxometry, and both are obtainable by inversion recovery multi-echo imaging (IR-MEMS). The advantage of the FFC method is the determination of the field-dependent dispersion of the spin-lattice relaxation rate, whereas MRI at high field is capable of yielding spatially resolved T1 and T2 times. Here we present T1 relaxation time distributions of water in three natural soils, obtained by analyzing FFC data by means of the inverse Laplace transformation (CONTIN). Kaldenkirchen soil shows relatively broad bimodal distribution functions D(T1), which shift to higher relaxation rates with increasing relaxation field. These data are compared to spatially resolved T1 and T2 distributions obtained by IR-MEMS. The distribution of T1 corresponds well to that obtained by FFC.
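The inverse Laplace step can be sketched as a Tikhonov-regularized non-negative least-squares fit of a multi-exponential kernel, which is the general idea behind CONTIN-style inversion. This simplified version assumes a pure decay kernel and synthetic single-component data, not the IR-MEMS signal equation.

```python
import numpy as np
from scipy.optimize import nnls

def invert_t1(t, signal, T1_grid, lam=0.1):
    """CONTIN-style inversion sketch: fit signal(t) = sum_j D_j*exp(-t/T1_j)
    with D_j >= 0, stabilized by Tikhonov regularization of strength lam."""
    K = np.exp(-t[:, None] / T1_grid[None, :])      # decay kernel matrix
    A = np.vstack([K, lam * np.eye(len(T1_grid))])  # augmented system
    b = np.concatenate([signal, np.zeros(len(T1_grid))])
    D, _ = nnls(A, b)
    return D

t = np.linspace(0.001, 2.0, 200)
T1_grid = np.logspace(-3, 1, 60)
signal = np.exp(-t / 0.1)            # synthetic single component, T1 = 0.1 s
D = invert_t1(t, signal, T1_grid)
T1_peak = T1_grid[np.argmax(D)]      # recovered peak of D(T1)
```

The regularization broadens the recovered distribution, which is why CONTIN outputs smooth D(T1) functions rather than sharp spikes; the peak position is what carries the pore-size information.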
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system uses Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms.
Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1% and 23.3% (considering workflow execution times), and the accuracy of the model estimations increases with higher computational demands in the reconstruction tasks.
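The three-stage performance model described above (transfer, queue, compute) can be sketched as a simple additive cost used to rank candidate sites. All site parameters below are hypothetical illustrations, not measurements from the paper.

```python
def estimate_time(data_gb, site):
    """Three-stage model sketch: transfer + queue + compute time, in seconds.
    All site parameters are hypothetical."""
    transfer = data_gb / site["bandwidth_gbps"] * 8      # GB -> Gb over link
    compute = site["work_units"] / site["throughput"]    # reconstruction tasks
    return transfer + site["queue_s"] + compute

sites = {
    "cluster_a": {"bandwidth_gbps": 10, "queue_s": 300,
                  "work_units": 5000, "throughput": 50},
    "cluster_b": {"bandwidth_gbps": 1, "queue_s": 30,
                  "work_units": 5000, "throughput": 40},
}
best = min(sites, key=lambda s: estimate_time(100, sites[s]))
```

With these numbers the fast-network site wins (80 s transfer + 300 s queue + 100 s compute = 480 s) despite its longer queue, which is the kind of trade-off the paper's models are built to predict.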
Critical thresholds for eventual extinction in randomly disturbed population growth models.
Peckham, Scott D; Waymire, Edward C; De Leenheer, Patrick
2018-02-16
This paper considers several single species growth models featuring a carrying capacity, which are subject to random disturbances that lead to instantaneous population reduction at the disturbance times. This is motivated in part by growing concerns about the impacts of climate change. Our main goal is to understand whether or not the species can persist in the long run. We consider the discrete-time stochastic process obtained by sampling the system immediately after the disturbances, and find various thresholds for several modes of convergence of this discrete process, including thresholds for the absence or existence of a positively supported invariant distribution. These thresholds are given explicitly in terms of the intensity and frequency of the disturbances on the one hand, and the population's growth characteristics on the other. We also perform a similar threshold analysis for the original continuous-time stochastic process, and obtain a formula that allows us to express the invariant distribution for this continuous-time process in terms of the invariant distribution of the discrete-time process, and vice versa. Examples illustrate that these distributions can differ, and this sends a cautionary message to practitioners who wish to parameterize these and related models using field data. Our analysis relies heavily on a particular feature shared by all the deterministic growth models considered here, namely that their solutions exhibit an exponentially weighted averaging property between a function of the initial condition, and the same function applied to the carrying capacity. This property is due to the fact that these systems can be transformed into affine systems.
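The sampled discrete-time process can be illustrated for logistic growth with periodic disturbances that scale the population by a factor q. The closed-form logistic solution gives the post-disturbance map, and the persistence threshold appears as q·e^{rτ} crossing 1. The paper treats random disturbance times; the periodic version and all parameters here are illustrative.

```python
import math

def logistic(x0, r, K, t):
    """Closed-form logistic growth from initial size x0 over time t."""
    g = math.exp(r * t)
    return K * x0 * g / (K + x0 * (g - 1.0))

def post_disturbance_orbit(x0, q, r=1.0, K=1.0, tau=1.0, n=200):
    """Sample the population immediately after each disturbance, which
    multiplies the population by q (0 < q < 1); disturbances are periodic
    here, whereas the paper allows random disturbance times."""
    x = x0
    for _ in range(n):
        x = q * logistic(x, r, K, tau)
    return x

persists = post_disturbance_orbit(0.5, q=0.6)  # q*e^{r*tau} ~ 1.63 > 1
dies = post_disturbance_orbit(0.5, q=0.3)      # q*e^{r*tau} ~ 0.82 < 1
```

When q·e^{rτ} > 1 the sampled process settles at a positive fixed point (a positively supported invariant distribution in the stochastic case); below the threshold the population decays geometrically toward extinction.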
Baltzer, Pascal Andreas Thomas; Renz, Diane M; Kullnig, Petra E; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A
2009-04-01
The identification of the most suspect enhancing part of a lesion is regarded as a major diagnostic criterion in dynamic magnetic resonance mammography. Computer-aided diagnosis (CAD) software allows the semi-automatic analysis of the kinetic characteristics of complete enhancing lesions, providing additional information about lesion vasculature. The diagnostic value of this information has not yet been quantified. Consecutive patients from routine diagnostic studies (1.5 T, 0.1 mmol gadopentetate dimeglumine, dynamic gradient-echo sequences at 1-minute intervals) were analyzed prospectively using CAD. Dynamic sequences were processed and reduced to a parametric map. Curve types were classified by initial signal increase (not significant, intermediate, and strong) and the delayed time course of signal intensity (continuous, plateau, and washout). Lesion enhancement was measured using CAD. The most suspect curve, the curve-type distribution percentage, and combined dynamic data were compared. Statistical analysis included logistic regression analysis and receiver-operating characteristic analysis. Fifty-one patients with 46 malignant and 44 benign lesions were enrolled. On receiver-operating characteristic analysis, the most suspect curve showed diagnostic accuracy of 76.7 ± 5%. In comparison, the curve-type distribution percentage demonstrated accuracy of 80.2 ± 4.9%. Combined dynamic data had the highest diagnostic accuracy (84.3 ± 4.2%). These differences did not achieve statistical significance. With appropriate cutoff values, sensitivity and specificity, respectively, were found to be 80.4% and 72.7% for the most suspect curve, 76.1% and 83.6% for the curve-type distribution percentage, and 78.3% and 84.5% for both parameters. The integration of whole-lesion dynamic data tends to improve specificity. However, no statistical significance backs up this finding.
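Diagnostic accuracies like those above come from receiver-operating-characteristic analysis. A minimal sketch of the ROC area via the Mann-Whitney relation, on synthetic scores, shows the computation; the group sizes mirror the study's 46/44 split, but the scores themselves are invented.

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney relation: the probability that a
    random positive (malignant) case scores above a random negative one,
    with ties counted as one half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

rng = np.random.default_rng(7)
malignant = rng.normal(1.0, 1.0, 46)  # hypothetical kinetic scores
benign = rng.normal(0.0, 1.0, 44)
a = auc(malignant, benign)
```

An AUC of 0.5 means no discrimination and 1.0 means perfect separation, which is the scale on which the study's 0.767 vs 0.802 vs 0.843 comparison is made.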
Automated video analysis system reveals distinct diurnal behaviors in C57BL/6 and C3H/HeN mice.
Adamah-Biassi, E B; Stepien, I; Hudson, R L; Dubocovich, M L
2013-04-15
Advances in rodent behavior dissection using automated video recording and analysis allow detailed phenotyping. This study compared and contrasted 15 diurnal behaviors recorded continuously, using an automated behavioral analysis system, for 14 days under a 14/10 light/dark cycle in singly housed C3H/HeN (C3H) or C57BL/6 (C57) male mice. Diurnal behaviors, recorded with minimal experimental interference and analyzed using phenotypic array and temporal distribution analysis, showed bimodal and unimodal profiles in the C57 and C3H mice, respectively. Phenotypic array analysis revealed distinct behavioral rhythms in Activity-Like Behaviors (walk, hang, jump, come down) (ALB), Exploration-Like Behaviors (dig, groom, rear up, sniff, stretch) (ELB), Ingestion-Like Behaviors (drink, eat) (ILB) and Resting-Like Behaviors (awake, remain low, rest, twitch) (RLB) of C3H and C57 mice. Temporal distribution analysis demonstrated that strain and time of day affect the magnitude and distribution of the spontaneous homecage behaviors. Wheel running activity and water and food measurements correlated with the timing of homecage behaviors. Subcutaneous (3 mg/kg) or oral (0.02 mg/ml) melatonin treatments in C57 mice did not modify either the total 24 h magnitude or the temporal distribution of homecage behaviors when compared with vehicle treatments. We conclude that C3H and C57 mice show different spontaneous activity and behavioral rhythms, specifically during the night period, which are not modulated by melatonin. Copyright © 2013 Elsevier B.V. All rights reserved.
One Step Quantum Key Distribution Based on EPR Entanglement.
Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao
2016-06-30
A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes a vulnerability of the first protocol and improves maneuverability. Moreover, a security analysis is given: a simple eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the two-step "Ping-pong" protocol, the proposed protocol does not need to store the qubit and involves only one step.
Shu, Xu; Schaubel, Douglas E
2016-06-01
Times between successive events (i.e., gap times) are of great importance in survival analysis. Although many methods exist for estimating covariate effects on gap times, very few existing methods allow for comparisons between gap times themselves. Motivated by the comparison of primary and repeat transplantation, our interest is specifically in contrasting the gap time survival functions and their integration (restricted mean gap time). Two major challenges in gap time analysis are non-identifiability of the marginal distributions and the existence of dependent censoring (for all but the first gap time). We use Cox regression to estimate the (conditional) survival distributions of each gap time (given the previous gap times). Combining fitted survival functions based on those models, along with multiple imputation applied to censored gap times, we then contrast the first and second gap times with respect to average survival and restricted mean lifetime. Large-sample properties are derived, with simulation studies carried out to evaluate finite-sample performance. We apply the proposed methods to kidney transplant data obtained from a national organ transplant registry. Mean 10-year graft survival of the primary transplant is significantly greater than that of the repeat transplant, by 3.9 months (p=0.023), a result that may lack clinical importance. © 2015, The International Biometric Society.
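The restricted mean survival time used to contrast gap times can be sketched, in a simplified nonparametric form, as the area under a Kaplan-Meier curve up to a horizon τ. The paper combines Cox models and multiple imputation; this sketch assumes distinct event times and synthetic data.

```python
import numpy as np

def km_rmst(time, event, tau):
    """Kaplan-Meier restricted mean survival time: area under the KM
    curve up to horizon tau. event=1 is observed, 0 is censored.
    Assumes distinct event times for simplicity."""
    order = np.argsort(time)
    t, d = np.asarray(time, float)[order], np.asarray(event)[order]
    S, rmst, prev_t = 1.0, 0.0, 0.0
    at_risk = len(t)
    for ti, di in zip(t, d):
        if ti > tau:
            break
        rmst += S * (ti - prev_t)       # accumulate area at current S
        if di:
            S *= 1.0 - 1.0 / at_risk    # KM step at an observed event
        at_risk -= 1
        prev_t = ti
    rmst += S * (tau - prev_t)          # tail area out to the horizon
    return rmst

# With no censoring, RMST reduces to the mean of min(time, tau).
r = km_rmst([2.0, 4.0, 6.0, 8.0], [1, 1, 1, 1], tau=10.0)
```

Comparing such restricted means between the first and second gap time is the kind of contrast (e.g. the 3.9-month difference in 10-year graft survival) reported in the abstract.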
Comparative analysis through probability distributions of a data set
NASA Astrophysics Data System (ADS)
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in fields as diverse as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data, which is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
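Ranking candidate distributions by a goodness-of-fit statistic can be sketched with scipy.stats using the Kolmogorov-Smirnov distance. The candidate list and data below are illustrative; the paper also uses Anderson-Darling and Chi-squared statistics.

```python
import numpy as np
from scipy import stats

def rank_fits(data, candidates=("norm", "expon", "gamma")):
    """Fit each candidate distribution by maximum likelihood and rank by
    the KS distance between the data and the fitted model (smaller is a
    better fit)."""
    results = []
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(data)
        ks = stats.kstest(data, name, args=params).statistic
        results.append((ks, name))
    return sorted(results)

rng = np.random.default_rng(3)
data = rng.normal(10.0, 2.0, 500)       # synthetic Gaussian sample
ranking = rank_fits(data)               # best fit first
```

For symmetric Gaussian data the exponential candidate ends up last, exactly the kind of ordering the article's "distance to the tested distribution" idea produces; formal acceptance still requires comparing the statistic to its critical value.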
Tağluk, M E; Cakmak, E D; Karakaş, S
2005-04-30
Cognitive brain responses to external stimuli, as measured by event related potentials (ERPs), have been analyzed from a variety of perspectives to investigate brain dynamics. Here, the brain responses of healthy subjects to auditory oddball paradigms, standard and deviant stimuli, recorded at an Fz electrode site, were studied using a short-term version of the smoothed Wigner-Ville distribution (STSW) method. A smoothing kernel was designed to preserve the auto energy of the signal with maximum time and frequency resolutions. Analysis was conducted mainly on the time-frequency distributions (TFDs) of sweeps recorded during successive trials, including the TFD of the averaged single sweeps as the evoked time-frequency (ETF) brain response and the average of the TFDs of single sweeps as the time-frequency (TF) brain response. The power entropy and the phase angles of the signal at frequency f and time t, locked to the stimulus onset, were also studied across single trials as the TF power-locked and TF phase-locked brain responses, respectively. TFDs represented in this way demonstrated the ERP spectro-temporal characteristics from multiple perspectives. The time-varying energy of the individual components manifested interesting TF structures in the form of amplitude modulated (AM) and frequency modulated (FM) energy bursts. The TF power-locked and phase-locked brain responses revealed ERP energies modulated by cognitive functions, an observation requiring further investigation. These results may lead to a better understanding of integrative brain dynamics.
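A minimal numpy sketch of the (unsmoothed) discrete Wigner-Ville distribution may clarify the construction; it omits the smoothing kernel and short-term windowing of the STSW method. Note the well-known factor-of-two frequency scaling of the discrete x[n+τ]·conj(x[n−τ]) kernel, so a tone at normalized frequency f0 peaks at bin 2·f0·N.

```python
import numpy as np

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution of a complex signal:
    for each time n, FFT the instantaneous autocorrelation over lag tau."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        tmax = min(n, N - 1 - n)
        acf = np.zeros(N, dtype=complex)
        tau = np.arange(-tmax, tmax + 1)
        acf[tau % N] = x[n + tau] * np.conj(x[n - tau])
        W[n] = np.fft.fft(acf).real     # real for this symmetric lag kernel
    return W

N = 128
f0 = 0.125                                  # cycles/sample
x = np.exp(2j * np.pi * f0 * np.arange(N))  # analytic pure tone
W = wigner_ville(x)
peak_bin = int(np.argmax(W[N // 2]))        # middle time slice peaks at 2*f0*N
```

For multicomponent signals like ERPs, this raw distribution produces cross-terms between components, which is precisely why the paper applies a smoothing kernel.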
Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.
Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung
2012-04-10
We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient depends on the states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis of a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is short compared with the relaxation time scale of the rate coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much longer than the relaxation time of the rate coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for the reaction system with rate coefficient fluctuations. The simulation results confirm the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of these results, we propose a method of quantitative analysis of the reaction event counting statistics of reaction systems with rate coefficient fluctuations, which enables one to extract information about the magnitude and relaxation times of the fluctuating reaction rate coefficient without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate, for reaction systems with a nonequilibrium initial state distribution as well as for systems with the equilibrium initial state distribution.
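The non-renewal regime can be illustrated in a simplified limit: if the rate is effectively frozen within each measurement window (window short compared with the rate's relaxation time), the counts become super-Poissonian, with variance exceeding the mean, while a constant-rate reference stays near Fano factor 1. The two rate states and window length below are invented.

```python
import numpy as np

def fano(counts):
    """Fano factor: variance of event counts over their mean
    (1 for a homogeneous Poisson process)."""
    return counts.var() / counts.mean()

rng = np.random.default_rng(11)
T, n_windows = 10.0, 20000

# Dynamic heterogeneity in the slow-switching limit: each window sees
# one of two frozen rate states.
rates = rng.choice([0.5, 1.5], size=n_windows)
hetero = rng.poisson(rates * T)

# Homogeneous reference with the same mean rate.
homo = rng.poisson(1.0 * T, size=n_windows)
```

For the heterogeneous case the excess variance scales as T²·Var(λ), i.e. with the square of the mean count, matching the short-time regime of the abstract; at windows much longer than the rate relaxation time the excess averages out and renewal (Fano ≈ 1) behavior is recovered.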
Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
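Fitting the GPD to threshold exceedances (the peaks-over-threshold idea used above) can be sketched with scipy.stats.genpareto. The data here are synthetic Gaussian anomalies and the 95th-percentile threshold is an arbitrary choice, not the Arosa series or its daily moving threshold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic "total ozone" anomalies; the GPD models the excesses over a
# high threshold, per the peaks-over-threshold approach.
data = rng.standard_normal(20000)
u = np.quantile(data, 0.95)
excess = data[data > u] - u

# Fit the GPD with location pinned at zero, since exceedances are
# measured from the threshold itself.
shape, loc, scale = stats.genpareto.fit(excess, floc=0)
```

The fitted shape parameter classifies the tail (near zero for Gaussian-like tails, positive for heavy tails, negative for bounded tails), and the fitted model then yields exceedance frequencies, such as for the ELOs/EHOs and ozone mini-holes of the study.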
First Extraction of Transversity from a Global Analysis of Electron-Proton and Proton-Proton Data
NASA Astrophysics Data System (ADS)
Radici, Marco; Bacchetta, Alessandro
2018-05-01
We present the first extraction of the transversity distribution in the framework of collinear factorization based on the global analysis of pion-pair production in deep-inelastic scattering and in proton-proton collisions with a transversely polarized proton. The extraction relies on the knowledge of dihadron fragmentation functions, which are taken from the analysis of electron-positron annihilation data. For the first time, the transversity is extracted from a global analysis similar to what is usually done for the spin-averaged and helicity distributions. The knowledge of transversity is important for, among other things, detecting possible signals of new physics in high-precision low-energy experiments.
Tracking and imaging humans on heterogeneous infrared sensor arrays for law enforcement applications
NASA Astrophysics Data System (ADS)
Feller, Steven D.; Zheng, Y.; Cull, Evan; Brady, David J.
2002-08-01
We present a plan for the integration of geometric constraints in the source, sensor and analysis levels of sensor networks. The goal of geometric analysis is to reduce the dimensionality and complexity of distributed sensor data analysis so as to achieve real-time recognition and response to significant events. Application scenarios include biometric tracking of individuals, counting and analysis of individuals in groups of humans and distributed sentient environments. We are particularly interested in using this approach to provide networks of low cost point detectors, such as infrared motion detectors, with complex imaging capabilities. By extending the capabilities of simple sensors, we expect to reduce the cost of perimeter and site security applications.
NASA Astrophysics Data System (ADS)
Shaochuan, Lu; Vere-Jones, David
2011-10-01
The paper studies the statistical properties of deep earthquakes around North Island, New Zealand. We first evaluate the catalogue coverage and completeness of deep events according to cusum (cumulative sum) statistics and earlier literature. The epicentral, depth, and magnitude distributions of deep earthquakes are then discussed. It is worth noting that strong grouping effects are observed in the epicentral distribution of these deep earthquakes. Also, although the spatial distribution of deep earthquakes does not change, their occurrence frequencies vary from time to time, active in one period, relatively quiescent in another. The depth distribution of deep earthquakes also hardly changes except for events with focal depth less than 100 km. On the basis of spatial concentration we partition deep earthquakes into several groups—the Taupo-Bay of Plenty group, the Taranaki group, and the Cook Strait group. Second-order moment analysis via the two-point correlation function reveals only very small-scale clustering of deep earthquakes, presumably limited to some hot spots only. We also suggest that some models usually used for shallow earthquakes fit deep earthquakes unsatisfactorily. Instead, we propose a switching Poisson model for the occurrence patterns of deep earthquakes. The goodness-of-fit test suggests that the time-varying activity is well characterized by a switching Poisson model. Furthermore, detailed analysis carried out on each deep group by use of switching Poisson models reveals similar time-varying behavior in occurrence frequencies in each group.
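The switching Poisson occurrence model proposed above can be sketched by simulation: a hidden active/quiescent state flips at random, and daily counts follow the current state's rate. The rates, switching probability, and daily discretization are illustrative assumptions, not the paper's fitted values:

```python
import math
import random

def simulate_switching_poisson(rates, switch_p, n_days, seed=0):
    """Daily event counts from a two-state switching Poisson model: a hidden
    quiescent/active state may flip each day with probability switch_p, and
    the day's count is Poisson-distributed with that state's rate."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplicative sampler (adequate for small rates)
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    state, counts, states = 0, [], []
    for _ in range(n_days):
        if rng.random() < switch_p:
            state = 1 - state
        states.append(state)
        counts.append(poisson(rates[state]))
    return counts, states

# Quiescent rate 0.2 events/day vs active rate 2.0 events/day
counts, states = simulate_switching_poisson((0.2, 2.0), switch_p=0.01, n_days=2000)
active = [c for c, s in zip(counts, states) if s == 1]
quiet = [c for c, s in zip(counts, states) if s == 0]
```

The simulated record alternates between long active and quiescent episodes, the kind of time-varying occurrence frequency the goodness-of-fit test in the paper is designed to detect.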
Capture of activation during ventricular arrhythmia using distributed stimulation.
Meunier, Jason M; Ramalingam, Sanjiv; Lin, Shien-Fong; Patwardhan, Abhijit R
2007-04-01
Results of previous studies suggest that pacing strength stimuli can capture activation during ventricular arrhythmia locally near pacing sites. The existence of spatio-temporal distribution of excitable gap during arrhythmia suggests that multiple and timed stimuli delivered over a region may permit capture over larger areas. Our objective in this study was to evaluate the efficacy of using spatially distributed pacing (DP) to capture activation during ventricular arrhythmia. Data were obtained from rabbit hearts which were placed against a lattice of parallel wires through which biphasic pacing stimuli were delivered. Electrical activity was recorded optically. Pacing stimuli were delivered in sequence through the parallel wires starting with the wire closest to the apex and ending with one closest to the base. Inter-stimulus delay was based on conduction velocity. Time-frequency analysis of optical signals was used to determine variability in activation. A decrease in standard deviation of dominant frequencies of activation from a grid of locations that spanned the captured area and a concurrence with paced frequency were used as an index of capture. Results from five animals showed that the average standard deviation decreased from 0.81 Hz during arrhythmia to 0.66 Hz during DP at pacing cycle length of 125 ms (p = 0.03) reflecting decreased spatio-temporal variability in activation during DP. Results of time-frequency analysis during these pacing trials showed agreement between activation and paced frequencies. These results show that spatially distributed and timed stimulation can be used to modify and capture activation during ventricular arrhythmia.
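The capture index above, the standard deviation of dominant frequencies across recording sites, can be sketched with a brute-force DFT. The sample rate, signal length, and two synthetic "sites" are illustrative assumptions, not the optical-mapping data of the study:

```python
import cmath
import math
import statistics

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest non-DC DFT magnitude."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(coef) > best_mag:
            best_k, best_mag = k, abs(coef)
    return best_k * fs / n

fs = 100.0  # assumed sample rate (Hz)
t = [i / fs for i in range(200)]
# Two recording sites: one captured at an 8 Hz pacing rate, one still activating at 10 Hz
site_a = [math.sin(2 * math.pi * 8.0 * ti) for ti in t]
site_b = [math.sin(2 * math.pi * 10.0 * ti) for ti in t]
freqs = [dominant_frequency(site_a, fs), dominant_frequency(site_b, fs)]
spread = statistics.pstdev(freqs)  # capture index: smaller spread = more uniform capture
```

During full capture both sites would lock to the paced frequency and the spread would fall toward zero, mirroring the decrease in standard deviation reported in the study.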
Nanoflare vs Footpoint Heating : Observational Signatures
NASA Technical Reports Server (NTRS)
Winebarger, Amy; Alexander, Caroline; Lionello, Roberto; Linker, Jon; Mikic, Zoran; Downs, Cooper
2015-01-01
Time lag analysis shows very long time lags between all channel pairs. Impulsive heating cannot account for these long time lags. 3D simulations of footpoint heating show a similar pattern of time lags (magnitude and distribution) to observations. Time lags and relative peak intensities may be able to differentiate between TNE and impulsive heating solutions. Adding a high-temperature channel (like XRT Be-thin) may improve diagnostics.
Real time testing of intelligent relays for synchronous distributed generation islanding detection
NASA Astrophysics Data System (ADS)
Zhuang, Davy
As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be tripped offline for safety reasons when disconnected or isolated from the main feeder, as distributed generator islanding may create hazards to utility and third-party personnel and possibly damage the distribution system infrastructure, including the distributed generators themselves. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays in a real-time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay by running a large number of tests reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms the frequency, voltage, and rate-of-change-of-frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
Photon nonlinear mixing in subcarrier multiplexed quantum key distribution systems.
Capmany, José
2009-04-13
We provide, for the first time to our knowledge, an analysis of the influence of nonlinear photon mixing on the end-to-end quantum bit error rate (QBER) performance of subcarrier multiplexed quantum key distribution systems. The results show that negligible impact is to be expected for modulation indexes in the range of 2%.
A comparison of dynamic and static economic models of uneven-aged stand management
Robert G. Haight
1985-01-01
Numerical techniques have been used to compute the discrete-time sequence of residual diameter distributions that maximize the present net worth (PNW) of harvestable volume from an uneven-aged stand. Results contradicted optimal steady-state diameter distributions determined with static analysis. In this paper, optimality conditions for solutions to dynamic and static...
ERIC Educational Resources Information Center
Ahmed, Iftikhar; Sadeq, Muhammad Jafar
2006-01-01
Current distance learning systems are increasingly packing highly data-intensive contents on servers, resulting in the congestion of network and server resources at peak service times. A distributed learning system based on faded information field (FIF) architecture that employs mobile agents (MAs) has been proposed and simulated in this work. The…
A Real-Time PCR with Melting Curve Analysis for Molecular Typing of Vibrio parahaemolyticus.
He, Peiyan; Wang, Henghui; Luo, Jianyong; Yan, Yong; Chen, Zhongwen
2018-05-23
Foodborne disease caused by Vibrio parahaemolyticus is a serious public health problem in many countries. Molecular typing has great scientific significance and application value for epidemiological research on V. parahaemolyticus. In this study, a real-time PCR with melting curve analysis was established for molecular typing of V. parahaemolyticus. Eighteen large variably presented gene clusters (LVPCs) of V. parahaemolyticus, which have different distributions in the genomes of different strains, were selected as targets. Primer pairs for the 18 LVPCs were distributed into three tubes. To validate this newly developed assay, we tested 53 V. parahaemolyticus strains, which were classified into 13 different types. Furthermore, cluster analysis using NTSYS PC 2.02 software could divide the 53 V. parahaemolyticus strains into six clusters at a relative similarity coefficient of 0.85. This method is fast, simple, and convenient for molecular typing of V. parahaemolyticus.
J.J. McDonnell; K. McGuire; P. Aggarwal; K.J. Beven; D. Biondi; G. Destouni; S. Dunn; A. James; J. Kirchner; P. Kraft; S. Lyon; P. Maloszewski; B. Newman; L. Pfister; A. Rinaldo; A. Rodhe; T. Sayama; J. Seibert; K. Solomon; C. Soulsby; M. Stewart; D. Tetzlaff; C. Tobin; P. Troch; M. Weiler; A. Western; A. Wörman; S. Wrede
2010-01-01
The time water spends travelling subsurface through a catchment to the stream network (i.e. the catchment water transit time) fundamentally describes the storage, flow pathway heterogeneity and sources of water in a catchment. The distribution of transit times reflects how catchments retain and release water and solutes that in turn set biogeochemical conditions and...
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis under the stationarity assumption has been challenged by changing environments. A method that takes the nonstationarity context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distributional assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationarity context for the POT series of the Weihe basin, China under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed, and the difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin.
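The choice between Poisson and NB arrival models can be sketched by checking the dispersion of annual exceedance counts, since the Poisson model forces variance equal to mean. The synthetic counts and method-of-moments NB fit below are illustrative assumptions, not the Weihe basin analysis:

```python
import math
import random
import statistics

def dispersion(counts):
    """Variance-to-mean ratio of annual exceedance counts: ~1 supports a
    Poisson arrival model, substantially >1 favours the Negative Binomial."""
    return statistics.variance(counts) / statistics.mean(counts)

def nb_fit_moments(counts):
    """Method-of-moments Negative Binomial fit (r, p); requires var > mean."""
    m, v = statistics.mean(counts), statistics.variance(counts)
    return m * m / (v - m), m / v

rng = random.Random(42)

def poisson(lam):
    # Knuth's multiplicative sampler
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

# Overdispersed synthetic record: the exceedance rate itself varies year to year
counts = [poisson(rng.choice([1.0, 5.0])) for _ in range(200)]
r, p = nb_fit_moments(counts)
```

Because the underlying rate fluctuates between years, the mixture is overdispersed and the NB fit is admissible, the situation in which the paper reports the Poisson assumption becoming invalid.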
Song, Qiang; Liu, Fang; Wen, Guanghui; Cao, Jinde; Yang, Xinsong
2017-04-24
This paper considers position-based consensus in a network of agents with double-integrator dynamics and directed topology. Two types of distributed observer algorithms are proposed to solve the consensus problem by utilizing continuous and intermittent position measurements, respectively, where each observer does not interact with any other observers. For the case of continuous communication between network agents, convergence conditions are derived for reaching consensus in a network with a single constant delay or multiple time-varying delays on the basis of eigenvalue analysis and the descriptor method. When the network agents can only obtain intermittent position data from local neighbors at discrete time instants, consensus in the network without time delay or with nonuniform delays is investigated by using Wirtinger's inequality and the delayed-input approach. Numerical examples are given to illustrate the theoretical analysis.
Research on distributed heterogeneous data PCA algorithm based on cloud platform
NASA Astrophysics Data System (ADS)
Zhang, Jin; Huang, Gang
2018-05-01
Principal component analysis (PCA) of distributed heterogeneous data sets can overcome the limited scalability of centralized processing. In order to reduce the intermediate data and error components generated for distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs eigenvalue processing by using Householder tridiagonalization and QR factorization, calculating the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the method is feasible and reliable in terms of execution time and accuracy.
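The core eigenvalue step of any PCA pipeline can be sketched with power iteration on the sample covariance. This is a centralized stand-in under stated assumptions; it does not reproduce the paper's distributed Householder/QR scheme or its error-component handling:

```python
import random

def first_principal_component(rows, iters=200, seed=0):
    """Leading eigenvector of the sample covariance via power iteration,
    applying C = (X^T X) / n implicitly without forming the matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(d)]
    for _ in range(iters):
        proj = [sum(xi[j] * v[j] for j in range(d)) for xi in x]  # X v
        w = [sum(p * xi[j] for p, xi in zip(proj, x)) / n for j in range(d)]  # X^T (X v) / n
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

rng = random.Random(1)
# Synthetic 2-D data whose dominant variance lies along (1, 1) / sqrt(2)
rows = []
for _ in range(500):
    t = rng.gauss(0.0, 3.0)
    rows.append([t + rng.gauss(0.0, 0.3), t + rng.gauss(0.0, 0.3)])
pc1 = first_principal_component(rows)
```

The recovered direction should align (up to sign) with the dominant (1, 1) axis of the synthetic data.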
Inverse statistics and information content
NASA Astrophysics Data System (ADS)
Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.
2010-12-01
Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. The maximum of this distribution determines the most likely horizon for gaining a specific return. There is a significant difference between the inverse statistics of financial market data and those of fractional Brownian motion (fBm) as an uncorrelated time series, which makes this a suitable criterion for measuring the information content of financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability distribution to detect when the behavior of the stocks is the same as fBm.
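The inverse-statistics construction can be sketched as follows: for each starting day, record how long it takes the log-return to first reach a target level, then take the mode of that distribution. The geometric random walk, drift, volatility, and 2% target are illustrative assumptions, not the DJIA/S&P500/TEPIX data:

```python
import math
import random
from collections import Counter

def investment_horizons(prices, target):
    """For each starting day, the number of days until the log-return first
    reaches the target level (inverse statistics); starting days that never
    reach the target are skipped."""
    logp = [math.log(p) for p in prices]
    horizons = []
    for i in range(len(logp) - 1):
        for j in range(i + 1, len(logp)):
            if logp[j] - logp[i] >= target:
                horizons.append(j - i)
                break
    return horizons

rng = random.Random(7)
# Geometric random walk: an uncorrelated benchmark akin to fBm with H = 0.5
prices = [100.0]
for _ in range(1000):
    prices.append(prices[-1] * (1.0 + rng.gauss(0.0005, 0.01)))

horizons = investment_horizons(prices, target=0.02)
# The mode of the horizon distribution is the most likely waiting time
optimal_horizon = Counter(horizons).most_common(1)[0][0]
```

Comparing this empirical horizon distribution against the one generated by an uncorrelated benchmark is exactly the contrast the paper uses to quantify information content.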
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
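The connectivity time series studied above derives from the natural visibility graph, which can be sketched directly from its geometric criterion. The short input series is an illustrative toy, not the Ito-generated data of the paper:

```python
def visibility_degrees(series):
    """Connectivity series of the natural visibility graph: nodes a and b
    are linked if every intermediate sample lies strictly below the
    straight line joining (a, series[a]) and (b, series[b])."""
    n = len(series)
    degree = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                degree[a] += 1
                degree[b] += 1
    return degree

# Adjacent samples are always mutually visible; peaks collect many links
deg = visibility_degrees([3.0, 1.0, 2.0, 5.0, 0.5, 4.0])
```

The highest sample (the 5.0 peak) sees every other node, illustrating how the degree sequence inherits structure, and hence persistence, from the heights of the input series.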
Objective sea level pressure analysis for sparse data areas
NASA Technical Reports Server (NTRS)
Druyan, L. M.
1972-01-01
A computer procedure was used to analyze the pressure distribution over the North Pacific Ocean for eleven synoptic times in February, 1967. Independent knowledge of the central pressures of lows is shown to reduce the analysis errors for very sparse data coverage. The application of planned remote sensing of sea-level wind speeds is shown to make a significant contribution to the quality of the analysis especially in the high gradient mid-latitudes and for sparse coverage of conventional observations (such as over Southern Hemisphere oceans). Uniform distribution of the available observations of sea-level pressure and wind velocity yields results far superior to those derived from a random distribution. A generalization of the results indicates that the average lower limit for analysis errors is between 2 and 2.5 mb based on the perfect specification of the magnitude of the sea-level pressure gradient from a known verification analysis. A less than perfect specification will derive from wind-pressure relationships applied to satellite observed wind speeds.
Gopal, Sandeep; Pocock, Roger
2018-04-19
The Caenorhabditis elegans (C. elegans) germline is used to study several biologically important processes including stem cell development, apoptosis, and chromosome dynamics. While the germline is an excellent model, the analysis is often two dimensional due to the time and labor required for three-dimensional analysis. Major readouts in such studies are the number/position of nuclei and protein distribution within the germline. Here, we present a method to perform automated analysis of the germline using confocal microscopy and computational approaches to determine the number and position of nuclei in each region of the germline. Our method also analyzes germline protein distribution that enables the three-dimensional examination of protein expression in different genetic backgrounds. Further, our study shows variations in cytoskeletal architecture in distinct regions of the germline that may accommodate specific spatial developmental requirements. Finally, our method enables automated counting of the sperm in the spermatheca of each germline. Taken together, our method enables rapid and reproducible phenotypic analysis of the C. elegans germline.
Reshadat, S; Saedi, S; Zangeneh, A; Ghasemi, S R; Gilan, N R; Karbasi, A; Bavandpoor, E
2015-09-08
Geographic information systems (GIS) analysis has not been widely used in underdeveloped countries to ensure that vulnerable populations have accessibility to primary health-care services. This study applied GIS methods to analyse the spatial accessibility to urban primary-care centres of the population in Kermanshah city, Islamic Republic of Iran, by age and sex groups. In a descriptive-analytical study over 3 time periods, network analysis, mean centre and standard distance methods were applied using ArcGIS 9.3. The analysis was based on a standard radius of 750 m distance from health centres, walking speed of 1 m/s and desired access time to health centres of 12.5 mins. The proportion of the population with inadequate geographical access to health centres rose from 47.3% in 1997 to 58.4% in 2012. The mean centre and standard distance mapping showed that the spatial distribution of health centres in Kermanshah needed to be adjusted to changes in population distribution.
Coherent optical determination of the leaf angle distribution of corn
NASA Technical Reports Server (NTRS)
Ulaby, F. T. (Principal Investigator); Pihlman, M.
1981-01-01
A coherent optical technique for the diffraction analysis of an image is presented. Developments in radar remote sensing show a need to understand plant geometry and its relationship to plant moisture, soil moisture, and the radar backscattering coefficient. A corn plant changes its leaf angle distribution, as a function of time, from a uniform distribution to one that is strongly vertical. It is shown that plant and soil moisture may have an effect on plant geometry.
Multiscale power analysis for heart rate variability
NASA Astrophysics Data System (ADS)
Zeng, Peng; Liu, Hongxing; Ni, Huangjing; Zhou, Jing; Xia, Lan; Ning, Xinbao
2015-06-01
We introduce the multiscale power (MSP) method to assess the power distribution of physiological signals on multiple time scales. Simulations on synthetic data and experiments on heart rate variability (HRV) support the approach. Results show that both physical and psychological changes influence the power distribution significantly. A quantitative parameter, termed the power difference (PD), is introduced to evaluate the degree of power distribution alteration. We find that the dynamical correlation of HRV is destroyed completely when PD > 0.7.
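A multiscale power computation can be sketched by coarse-graining the series at successive scales and taking the variance at each scale. This is a minimal sketch on white noise under stated assumptions, not the paper's MSP definition or its HRV data:

```python
import random

def multiscale_power(series, max_scale):
    """Variance (power) of the non-overlapping coarse-grained series at each
    time scale, mirroring the coarse-graining step used in multiscale
    entropy analysis."""
    powers = []
    for s in range(1, max_scale + 1):
        grained = [sum(series[i:i + s]) / s
                   for i in range(0, len(series) - s + 1, s)]
        mean = sum(grained) / len(grained)
        powers.append(sum((g - mean) ** 2 for g in grained) / len(grained))
    return powers

rng = random.Random(3)
white = [rng.gauss(0.0, 1.0) for _ in range(4096)]
powers = multiscale_power(white, max_scale=8)
# For white noise, averaging s independent samples divides the variance by s
```

A signal with long-range correlations would decay more slowly across scales than white noise, which is the kind of scale-dependent power distribution the MSP method quantifies.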
Analysis of transverse field distributions in Porro prism resonators
NASA Astrophysics Data System (ADS)
Litvin, Igor A.; Burger, Liesl; Forbes, Andrew
2007-05-01
A model to describe the transverse field distribution of the output beam from Porro prism resonators is proposed. The model allows the prediction of the output transverse field distribution by assuming that the main areas of loss are located at the apexes of the Porro prisms. Experimental work on a particular system showed some interesting correlations between the time domain behavior of the resonator and the transverse field output. These findings are presented and discussed.
Daniels, Carter W; Sanabria, Federico
2017-03-01
The distribution of latencies and interresponse times (IRTs) of rats was compared between two fixed-interval (FI) schedules of food reinforcement (FI 30 s and FI 90 s), and between two levels of food deprivation. Computational modeling revealed that latencies and IRTs were well described by mixture probability distributions embodying two-state Markov chains. Analysis of these models revealed that only a subset of latencies is sensitive to the periodicity of reinforcement, and prefeeding only reduces the size of this subset. The distribution of IRTs suggests that behavior in FI schedules is organized in bouts that lengthen and ramp up in frequency with proximity to reinforcement. Prefeeding slowed down the lengthening of bouts and increased the time between bouts. When concatenated, latency and IRT models adequately reproduced sigmoidal FI response functions. These findings suggest that behavior in FI schedules fluctuates in and out of schedule control; an account of such fluctuation suggests that timing and motivation are dissociable components of FI performance. These mixture-distribution models also provide novel insights on the motivational, associative, and timing processes expressed in FI performance. These processes may be obscured, however, when performance in timing tasks is analyzed in terms of mean response rates.
Analysis of rainfall distribution in Kelantan river basin, Malaysia
NASA Astrophysics Data System (ADS)
Che Ros, Faizah; Tosaka, Hiroyuki
2018-03-01
Using rain gauge data on its own as input carries great uncertainty in runoff estimation, especially when the area is large and rainfall is measured and recorded at irregularly spaced gauging stations. Hence spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at ungauged points as input to the rainfall-runoff processes of distributed and semi-distributed numerical models. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river. Thus, a good knowledge of rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using inverse-distance weighting (IDW), inverse-distance and elevation weighting (IDEW) methods, and average rainfall distribution. Sensitivity analyses for the distance and elevation parameters were conducted to see the variation produced. The accuracy of the interpolated datasets was examined using cross-validation assessment.
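The IDW scheme above can be sketched in a few lines: each gauge contributes to the estimate with a weight inversely proportional to a power of its distance. The gauge coordinates, rainfall values, and power exponent of 2 below are illustrative assumptions, not the Kelantan station network:

```python
def idw(stations, query, power=2.0):
    """Inverse-distance-weighted rainfall estimate at a query point from
    (x, y, rain) gauge records; returns the gauge value when coincident."""
    num = den = 0.0
    for x, y, rain in stations:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return rain
        w = d2 ** (-power / 2.0)
        num += w * rain
        den += w
    return num / den

# Three hypothetical gauges: (x, y, daily rainfall in mm)
gauges = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
estimate = idw(gauges, (0.2, 0.2))
```

The estimate stays within the range of the gauge values and leans toward the nearest gauge; the IDEW variant in the paper additionally folds an elevation term into the weights.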
NASA Astrophysics Data System (ADS)
Sohrabi, M.; Habibi, M.; Ramezani, V.
2017-02-01
The paper presents an experimental study and analysis of full helium ion density angular distributions in a 4-kJ plasma focus device (PFD) at pressures of 10, 15, 25, and 30 mbar using large-area polycarbonate track detectors (PCTDs) (15-cm etchable diameter) processed by 50-Hz-HV electrochemical etching (ECE). Helium ion track distributions at different pressures, in particular at the main axis of the PFD, are presented. A maximum ion track density of 4.4 × 10⁴ tracks/cm² was obtained in the PCTD placed 6 cm from the anode. The ion distributions for all pressures applied are ring-shaped, which is possibly due to the hollow cylindrical copper anode used. The large-area PCTD processed by ECE proves, at the present state of the art, a superior method for direct observation and analysis of ion distributions at a glance with minimal effort and time. Some observations of the ion density distributions at different pressures are reported and discussed.
Krecar, Dragan; Vassileva, Vassilka; Danninger, Herbert; Hutter, Herbert
2004-06-01
Powder metallurgy is a highly developed method of manufacturing reliable ferrous parts. The main processing steps in a powder metallurgical line are pressing and sintering. Sintering can be strongly enhanced by the formation of a liquid phase during the sintering process when using phosphorus as a sintering activator. In this work the distribution (effect) of phosphorus was investigated by means of secondary ion mass spectrometry (SIMS) supported by Auger electron spectroscopy (AES) and electron probe micro analysis (EPMA). To verify the influence of the process conditions (phosphorus content, sintering atmosphere, time) on the mechanical properties, additional measurements of the microstructure (pore shape) and of impact energy were performed. Analysis of fracture surfaces was performed by means of scanning electron microscopy (SEM). The concentration of phosphorus varies between the samples from 0 to 1% (w/w). Samples with higher phosphorus concentrations (1% (w/w) and above) are also measurable by EPMA, whereas the distributions of P at technically relevant concentrations and the distributions of possible impurities are only detectable by means of SIMS. The influence of the sintering time on the phosphorus distribution is also demonstrated. In addition, the grain boundary segregation of P was measured by AES at the surface of in-situ broken samples. It is shown that the distribution of phosphorus also depends on the concentration of carbon in the samples.
Geographic Distribution of Urologists in Korea, 2007 to 2012
Song, Yun Seob; Shim, Sung Ryul; Jung, Insoo; Sun, Hwa Yeon; Song, Soo Hyun; Kwon, Soon-Sun; Ko, Young Myoung
2015-01-01
The adequacy of the urologist work force in Korea has never been investigated. This study investigated the geographic distribution of urologists in Korea. County level data from the National Health Insurance Service and National Statistical Office was analyzed in this ecological study. Urologist density was defined by the number of urologists per 100,000 individuals. National patterns of urologist density were mapped graphically at the county level using GIS software. To control the time sequence, regression analysis with fitted line plot was conducted. The difference of distribution of urologist density was analyzed by ANCOVA. Urologist density showed an uneven distribution according to county characteristics (metropolitan cities vs. nonmetropolitan cities vs. rural areas; mean square=102.329, P<0.001) and also according to year (mean square=9.747, P=0.048). Regression analysis between metropolitan and non-metropolitan cities showed significant difference in the change of urologists per year (P=0.019). Metropolitan cities vs. rural areas and non-metropolitan cities vs. rural areas showed no differences. Among the factors, the presence of training hospitals was the affecting factor for the uneven distribution of urologist density (P<0.001). Uneven distribution of urologists in Korea likely originated from the relatively low urologist density in rural areas. However, considering the time sequencing data from 2007 to 2012, there was a difference between the increase of urologist density in metropolitan and non-metropolitan cities. PMID:26539009
Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity
NASA Astrophysics Data System (ADS)
Codano, C.; Alonzo, M. L.; Vilardo, G.
The clustering structure of Vesuvian earthquakes is investigated by means of statistical tools: the inter-event time distribution, the running mean, and multifractal analysis. The first cannot clearly distinguish between a Poissonian process and a clustered one, owing to the difficulty of discriminating between an exponential distribution and a power-law one. The running-mean test reveals the clustering of the earthquakes but loses information about the structure of the distribution at global scales. The multifractal approach can reveal the clustering at small scales, while the global behaviour remains Poissonian. Subsequently the clustering of the events is interpreted in terms of diffusive processes of the stress in the earth's crust.
Evaluation of Uranium-235 Measurement Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaspar, Tiffany C.; Lavender, Curt A.; Dibert, Mark W.
2017-05-23
Monolithic U-Mo fuel plates are rolled to final fuel element form from the original cast ingot, and thus any inhomogeneities in 235U distribution present in the cast ingot are maintained, and potentially exaggerated, in the final fuel foil. The tolerance for inhomogeneities in the 235U concentration in the final fuel element foil is very low. A near-real-time, nondestructive technique to evaluate the 235U distribution in the cast ingot is required in order to provide feedback to the casting process. Based on the technical analysis herein, gamma spectroscopy has been recommended to provide a near-real-time measure of the 235U distribution in U-Mo cast plates.
NASA Technical Reports Server (NTRS)
Koeberlein, Ernest, III; Pender, Shaw Exum
1994-01-01
This paper describes the Multimission Telemetry Visualization (MTV) data acquisition/distribution system. MTV was developed by JPL's Multimedia Communications Laboratory (MCL) and designed to process and display digital, real-time, science and engineering data from JPL's Mission Control Center. The MTV system can be accessed using UNIX workstations and PCs over common datacom and telecom networks from worldwide locations. It is designed to lower data distribution costs while increasing data analysis functionality by integrating low-cost, off-the-shelf desktop hardware and software. MTV is expected to significantly lower the cost of real-time data display, processing, and distribution, and to allow for greater spacecraft safety and mission data access.
Improving queuing service at McDonald's
NASA Astrophysics Data System (ADS)
Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.
2014-07-01
Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based upon the concept of Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
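Que's internals are not described beyond its Poisson assumption, but the textbook M/M/1 model (Poisson arrivals, exponential service, one server) shows the kind of performance indices such a tool can report. A minimal sketch with hypothetical lunch-hour rates:

```python
def mm1_metrics(lam, mu):
    """Steady-state indices for an M/M/1 queue (Poisson arrivals,
    exponential service times, a single server). Requires lam < mu."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu  # server utilisation
    return {
        "rho": rho,
        "L":  rho / (1.0 - rho),       # mean number of customers in the system
        "Lq": rho ** 2 / (1.0 - rho),  # mean queue length (excluding in service)
        "W":  1.0 / (mu - lam),        # mean time in the system
        "Wq": rho / (mu - lam),        # mean waiting time in the queue
    }

# Hypothetical lunch-hour load: 54 arrivals/hour against 60 services/hour,
# expressed as rates per minute.
m = mm1_metrics(lam=54 / 60.0, mu=60 / 60.0)
print({k: round(v, 2) for k, v in m.items()})
```

At 90% utilisation the mean time in the system is 10 minutes, consistent with Little's law (L = lam * W).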
NASA Astrophysics Data System (ADS)
Zaccaria, V.; Tucker, D.; Traverso, A.
2016-09-01
Solid oxide fuel cells are characterized by very high efficiency, low emission levels, and large fuel flexibility. Unfortunately, their elevated costs and relatively short lifetimes reduce the economic feasibility of these technologies at the present time. Several mechanisms degrade fuel cell performance over time, and the study of these degradation modes and of potential mitigation actions is critical to ensure the durability of the fuel cell and its long-term stability. In this work, localized degradation of a solid oxide fuel cell is modeled in real time and its effects on various cell parameters are analyzed. Profile distributions of overpotential, temperature, heat generation, and temperature gradients in the stack are investigated during degradation. Several causes of failure could occur in the fuel cell if no proper control actions are applied. A local analysis of critical parameters shows where the issues arise and how they could be mitigated in order to extend the life of the cell.
A Technical Survey on Optimization of Processing Geo Distributed Data
NASA Astrophysics Data System (ADS)
Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.
2018-04-01
With growing cloud services and technology, there has been growth in geographically distributed data centers that store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, etc.; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication, the key ones being time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, using techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service and on reducing computation and communication cost; and SAGE achieves performance improvement in processing geo-distributed data sets.
A statistical analysis of North East Atlantic (submicron) aerosol size distributions
NASA Astrophysics Data System (ADS)
Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.
2011-08-01
The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring, and these 12 clusters could be further combined into 4 categories with similar characteristics, namely: a coastal nucleation category (occurring 21.3% of the time), an open ocean nucleation category (32.6% of the time), a background clean marine category (26.1% of the time) and an anthropogenic category (20% of the time). The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine characteristic is a clear bimodality in the size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. By contrast, the continentally influenced size distributions are generally more mono-modal, albeit with traces of bi-modality. The open ocean category occurs more often during May, June and July, corresponding with the N. E. Atlantic high biological period. Combined with its relatively high frequency of occurrence (32.6%), this suggests that marine biota are an important source of new aerosol particles in N. E. Atlantic air.
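The clustering step can be sketched in a few lines. The study used the Hartigan-Wong k-means variant on measured size distributions; the toy below uses plain Lloyd's k-means (a simpler relative of Hartigan-Wong) on synthetic 2-D points standing in for two aerosol modes, with deterministic initialisation for reproducibility. It is an illustration of the technique, not the Mace Head analysis.

```python
import random

random.seed(0)

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, init, iters=30):
    """Plain Lloyd's k-means: alternate nearest-center assignment
    and center recomputation."""
    centers = list(init)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            clusters[min(range(k), key=lambda j: dist2(p, centers[j]))].append(p)
        for j, members in enumerate(clusters):
            if members:
                # Move each center to the mean of its cluster.
                centers[j] = tuple(sum(c) / len(members) for c in zip(*members))
    return centers, clusters

# Toy stand-in for two aerosol modes (e.g. nucleation vs. Aitken sizes).
mode_a = [(random.gauss(10.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(100)]
mode_b = [(random.gauss(40.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(100)]
points = mode_a + mode_b
# Deterministic initialisation: one seed point from each end of the data.
centers, clusters = kmeans(points, 2, init=[points[0], points[-1]])
print(sorted(round(c[0]) for c in centers))
```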
NASA Astrophysics Data System (ADS)
Eremenko, D. O.; Drozdov, V. A.; Fotina, O. V.; Platonov, S. Yu.; Yuminov, O. A.
2016-07-01
Background: It is well known that the anomalous behavior of angular anisotropies of fission fragments at sub- and near-barrier energies is associated with a memory of conditions in the entrance channel of the heavy-ion reactions, particularly the deformations and spins of the colliding nuclei, which determine the initial distributions of the components of the total angular momentum over the symmetry axis of the fissioning system and the beam axis. Purpose: We develop a new dynamic approach that allows the description of the memory effects in the fission fragment angular distributions and provides new information on fusion and fission dynamics. Methods: The approach is based on the dynamic model of the fission fragment angular distributions, which takes into account stochastic aspects of nuclear fission and thermal fluctuations for the tilting mode, characterized by the projection of the total angular momentum onto the symmetry axis of the fissioning system. The approach also builds on a quantum-mechanical method to calculate the initial distributions over the components of the total angular momentum of the nuclear system immediately following complete fusion. Results: A method is suggested for calculating the initial distributions of the total angular momentum projection onto the symmetry axis for the nuclear systems formed in reactions of complete fusion of deformed nuclei with spins. The angular distributions of fission fragments for the 16O+232Th, 12C+235,236,238U, and 13C+235U reactions have been analyzed within the dynamic approach over a range of sub- and above-barrier energies. The analysis allowed us to determine the relaxation time for the tilting mode and the fraction of fission events occurring in times not larger than that relaxation time. Conclusions: It is shown that the memory effects play an important role in the formation of the angular distributions of fission fragments in reactions induced by heavy ions. The approach developed for the analysis of these effects is a suitable tool for gaining insight into complete fusion-fission dynamics, in particular for investigating the mechanism of complete fusion and the fission time scale.
Voter model with non-Poissonian interevent intervals
NASA Astrophysics Data System (ADS)
Takaguchi, Taro; Masuda, Naoki
2011-09-01
Recent analysis of social communications among humans has revealed that the interval between interactions for a pair of individuals and for an individual often follows a long-tail distribution. We investigate the effect of such a non-Poissonian nature of human behavior on dynamics of opinion formation. We use a variant of the voter model and numerically compare the time to consensus of all the voters with different distributions of interevent intervals and different networks. Compared with the exponential distribution of interevent intervals (i.e., the standard voter model), the power-law distribution of interevent intervals slows down consensus on the ring. This is because of the memory effect; in the power-law case, the expected time until the next update event on a link is large if the link has not had an update event for a long time. On the complete graph, the consensus time in the power-law case is close to that in the exponential case. Regular graphs bridge these two results such that the slowing down of the consensus in the power-law case as compared to the exponential case is less pronounced as the degree increases.
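The memory effect invoked above can be demonstrated directly: for a memoryless (exponential) inter-event distribution, the expected wait to the next update does not depend on how long a link has already waited, while for a power-law distribution it grows with the elapsed time. A small sketch (illustrative parameters, not the paper's voter-model simulation) estimates the conditional mean residual time by rejection sampling:

```python
import random

random.seed(7)

def mean_residual(sampler, waited, n=400000):
    """Mean remaining time to the next event on a link that has already
    gone `waited` time units without one (conditional sample mean)."""
    total, count = 0.0, 0
    for _ in range(n):
        x = sampler()
        if x > waited:
            total += x - waited
            count += 1
    return total / count

exponential = lambda: random.expovariate(1.0)
# Pareto inter-event times with survival P(X > t) = t**-2.5 for t >= 1.
power_law = lambda: (1.0 - random.random()) ** (-1.0 / 2.5)

results = {}
for waited in (1.0, 6.0):
    e = mean_residual(exponential, waited)   # stays near 1 (memoryless)
    p = mean_residual(power_law, waited)     # grows with the elapsed wait
    results[waited] = (e, p)
    print(waited, round(e, 2), round(p, 2))
```

For this Pareto tail the residual mean is t/1.5 for t >= 1, so a link that has waited six units expects to wait about four more, whereas the exponential link still expects to wait one.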
Time trends in recurrence of juvenile nasopharyngeal angiofibroma: Experience of the past 4 decades.
Mishra, Anupam; Mishra, Subhash Chandra
2016-01-01
An analysis of the time distribution of juvenile nasopharyngeal angiofibroma (JNA) recurrences from the last 4 decades is presented. Sixty recurrences were analyzed by actuarial survival methods. SPSS software was used to generate Kaplan-Meier (KM) curves, and time distributions were compared by the Log-rank, Breslow and Tarone-Ware tests. The overall recurrence rate was 17.59%. The majority underwent open transpalatal approach(es) without embolization. The probability of detecting a recurrence was 95% in the first 24 months, and the comparison of KM curves across the 4 time periods was not significant. This is the first and largest series to address the time distribution. The required follow-up period is 2 years. Our recurrence rate is just half that of the largest series reported so far, suggesting the superiority of transpalatal techniques. The similarity of the curves makes it less likely that recent technical advances influence recurrence, which, per our hypothesis, more likely reflects tumor biology per se. Copyright © 2016 Elsevier Inc. All rights reserved.
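The Kaplan-Meier product-limit estimate behind such curves is simple to state: at each observed event time, survival is multiplied by (1 - d/n), where d is the number of recurrences at that time and n the number of patients still at risk. A minimal sketch with hypothetical follow-up data (not the JNA cohort):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times:  follow-up time for each subject (e.g. months)
    events: 1 if recurrence observed at that time, 0 if censored
    Returns [(t, S(t))] at each event time.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # still at risk at time t
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Hypothetical follow-up data: months to recurrence, 0 = censored.
times  = [6, 10, 10, 14, 18, 24, 24, 30, 36, 48]
events = [1,  1,  0,  1,  1,  1,  0,  0,  0,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Censored subjects leave the risk set without forcing a survival step, which is exactly why the estimator handles incomplete follow-up.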
Data-Aware Retrodiction for Asynchronous Harmonic Measurement in a Cyber-Physical Energy System.
Liu, Youda; Wang, Xue; Liu, Yanchi; Cui, Sujin
2016-08-18
Cyber-physical energy systems provide a networked solution for safety, reliability and efficiency problems in smart grids. On the demand side, a secure and trustworthy energy supply requires real-time supervision and online power quality assessment. Harmonics measurement is necessary in power quality evaluation. However, under a large-scale distributed metering architecture, harmonic measurement faces the out-of-sequence measurement (OOSM) problem, which results from latencies in sensing or the communication process and introduces deviations in data fusion. This paper describes a distributed measurement network for large-scale asynchronous harmonic analysis and exploits a nonlinear autoregressive model with exogenous inputs (NARX) network to reorder the out-of-sequence measuring data. The NARX network learns the characteristics of the electrical harmonics from practical data rather than from kinematic equations. Thus, the data-aware network approximates the behavior of the practical electrical parameter with real-time data and improves the retrodiction accuracy. Theoretical analysis demonstrates that the data-aware method maintains a reasonable consumption of computing resources. Experiments on a practical testbed of a cyber-physical system are implemented, and harmonic measurement and analysis accuracy are adopted to evaluate the measuring mechanism under a distributed metering network. Results demonstrate an improvement in harmonics analysis precision and validate the asynchronous measuring method in cyber-physical energy systems.
Wang, Jin
2005-03-01
With brilliant synchrotron X-ray sources, microsecond time-resolved synchrotron X-ray radiography and tomography have been used to elucidate the detailed three-dimensional structure and dynamics of high-pressure high-speed fuel sprays in the near-nozzle region. The measurement allows quantitative determination of the fuel distribution in the optically impenetrable region owing to the multiple scattering of visible light by small atomized fuel droplets surrounding the jet. X-radiographs of the jet-induced shock waves prove that the fuel jets become supersonic under appropriate injection conditions and that the quantitative analysis of the thermodynamic properties of the shock waves can also be derived from the most direct measurement. In other situations where extremely axial-asymmetric sprays are encountered, mass deconvolution and cross-sectional fuel distribution models can be computed based on the monochromatic and time-resolved X-radiographic images collected from various rotational orientations of the sprays. Such quantitative analysis reveals the never-before-reported characteristics and most detailed near-nozzle mass distribution of highly transient fuel sprays.
Sun, Wenjun; Liu, Wenjun; Cui, Lifeng; Zhang, Minglu; Wang, Bei
2013-08-01
This study describes the identification and characterization of a new chlorine-resistant bacterium, Sphingomonas TS001, isolated from a model drinking water distribution system. The isolate was identified by 16S rRNA gene analysis and by morphological and physiological characteristics. Phylogenetic analysis indicates that TS001 belongs to the genus Sphingomonas. The model distribution system heterotrophic plate count (HPC) results showed that, when the chlorine residual was greater than 0.7 mg L(-1), 100% of the detected heterotrophic bacteria were TS001. Bench-scale inactivation efficiency testing showed that this strain was very resistant to chlorine: 4 mg L(-1) of chlorine with 240 min retention time provided only approximately 5% viability reduction of TS001. In contrast, a 3-log inactivation (99.9%) was obtained for a UV fluence of 40 mJ cm(-2). A highly chlorine-resistant and UV-sensitive bacterium, Sphingomonas TS001, was documented for the first time. Copyright © 2013 Elsevier B.V. All rights reserved.
Multifractal analysis of mobile social networks
NASA Astrophysics Data System (ADS)
Zheng, Wei; Zhang, Zifeng; Deng, Yufan
2017-09-01
As Wireless Fidelity (Wi-Fi)-enabled handheld devices have come into wide use, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. Moreover, when the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. We therefore introduce a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.
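Box-covering a network is involved, but the core multifractal quantity, the generalized dimension D_q obtained from the partition function sum_i p_i^q at a covering scale, can be sketched on a standard toy measure. The binomial multiplicative cascade below is an illustration of the method, not the paper's algorithm: a uniform cascade is monofractal (D_q = 1 for all q), while a skewed one yields a q-dependent spectrum.

```python
import math

def cascade_masses(p, levels):
    """Binomial multiplicative cascade: at each level every interval splits
    in two, carrying fractions p and 1-p of its mass. A textbook toy
    multifractal measure."""
    masses = [1.0]
    for _ in range(levels):
        masses = [m * w for m in masses for w in (p, 1.0 - p)]
    return masses

def generalized_dimension(masses, q):
    """D_q from the partition function sum(p_i**q) at box size eps = 1/len.
    Undefined at q = 1 (the information dimension needs a limit there)."""
    eps = 1.0 / len(masses)
    z = sum(m ** q for m in masses if m > 0)
    return math.log(z) / ((q - 1) * math.log(eps))

levels = 12
uniform = cascade_masses(0.5, levels)  # monofractal: D_q = 1 for every q
skewed = cascade_masses(0.3, levels)   # multifractal: D_q varies with q

for q in (-2, 0, 2):
    print(q, round(generalized_dimension(uniform, q), 3),
             round(generalized_dimension(skewed, q), 3))
```

For the cascade the result is exact: D_q = -log2(p^q + (1-p)^q) / (q - 1), so the p = 0.3 measure gives D_2 ≈ 0.786 and D_-2 ≈ 1.239, a clear spread of singularity strengths.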
Application of ideal pressure distribution in development process of automobile seats.
Kilincsoy, U; Wagner, A; Vink, P; Bubb, H
2016-07-19
In designing a car seat, the ideal pressure distribution is important, as the seat is the largest contact surface between the human and the car. Because obstacles hinder a more general application of the ideal pressure distribution in seating design, multidimensional measuring techniques combined with extensive user tests are necessary. The objective of this study is to apply and integrate knowledge about the ideal pressure distribution into the seat design process of a car manufacturer in an efficient way. The ideal pressure distribution was combined with pressure measurement, in this case pressure mats. In order to integrate this theoretical knowledge of seating comfort into the seat development process, a special user interface was defined and developed. Mapping the measured pressure distribution in real time, accurately scaled to the actual seats during test setups, directly led to design implications for seat design even during the test situation. Detailed analysis of the subjects' feedback was correlated with objective measurements of the subjects' pressure distribution in real time, and existing seating characteristics were taken into account as well. A user interface can incorporate theoretical and validated state-of-the-art models of comfort. Consequently, this information can reduce extensive testing and lead to more detailed results in a shorter time period.
Software Comparison for Renewable Energy Deployment in a Distribution Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian
The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values; they allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increase the time necessary to become familiar with the software packages.
Analysis of PV Advanced Inverter Functions and Setpoints under Time Series Simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seuss, John; Reno, Matthew J.; Broderick, Robert Joseph
Utilities are increasingly concerned about the potential negative impacts distributed PV may have on the operational integrity of their distribution feeders. Some have proposed novel methods for controlling a PV system's grid-tie inverter to mitigate potential PV-induced problems. This report investigates the effectiveness of several of these PV advanced inverter controls in improving distribution feeder operational metrics. The controls are simulated on a large PV system interconnected at several locations within two realistic distribution feeder models. Due to the time-domain nature of the advanced inverter controls, quasi-static time series simulations are performed under one week of representative variable irradiance and load data for each feeder. A parametric study is performed on each control type to determine how well certain measurable network metrics improve as a function of the control parameters. This methodology is used to determine appropriate advanced inverter settings for each location on the feeder and overall for any interconnection location on the feeder.
Analysis of anabolic steroids in hair: time courses in guinea pigs.
Shen, Min; Xiang, Ping; Yan, Hui; Shen, Baohua; Wang, Mengye
2009-09-01
Sensitive, specific, and reproducible methods for the quantitative determination of eight anabolic steroids in guinea pig hair have been developed using LC/MS/MS and GC/MS/MS. Methyltestosterone, stanozolol, methandienone, nandrolone, trenbolone, boldenone, methenolone and DHEA were administered intraperitoneally in guinea pigs. After the first injection, black hair segments were collected on shaved areas of skin. The analysis of these segments revealed the distribution of anabolic steroids in the guinea pig hair. The major components in hair are the parent anabolic steroids. The time courses of the concentrations of the steroids in hair (except methenolone, which does not deposit in hair) demonstrated that the peak concentrations were reached on days 2-4, except stanozolol, which peaked on day 10 after administration. The concentrations in hair appeared to be related to the physicochemical properties of the drug compound and to the dosage. These studies on the distribution of drugs in the hair shaft and on the time course of their concentration changes provide information relevant to the optimal time and method of collecting hair samples. Such studies also provide basic data that will be useful in the application of hair analysis in the control of doping and in the interpretation of results.
NASA Astrophysics Data System (ADS)
Panozzo, M.; Quintero-Quiroz, C.; Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.
2017-11-01
Semiconductor lasers with time-delayed optical feedback display a wide range of dynamical regimes, which have found various practical applications. They also provide excellent testbeds for data analysis tools for characterizing complex signals. Recently, several of us analyzed experimental intensity time traces and quantitatively identified the onset of different dynamical regimes as the laser current increases. Specifically, we identified the onset of low-frequency fluctuations (LFFs), where the laser intensity displays abrupt dropouts, and the onset of coherence collapse (CC), where the intensity fluctuations are highly irregular. Here we map these regimes when both the laser current and the feedback strength vary. We show that the shape of the distribution of intensity fluctuations (characterized by the standard deviation, the skewness, and the kurtosis) makes it possible to distinguish among noise, LFFs and CC, and to determine quantitatively (in spite of the gradual nature of the transitions) the boundaries of the three regimes. Ordinal analysis of the inter-dropout time intervals consistently identifies the three regimes in the same parameter regions as the analysis of the intensity distribution. Simulations of the well-known time-delayed Lang-Kobayashi model are in good qualitative agreement with the observations.
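The shape statistics used to separate such regimes are ordinary sample moments. The sketch below (synthetic traces, not the experimental data) shows why they discriminate: Gaussian-like noise has near-zero skewness and excess kurtosis, while an LFF-like trace with rare deep dropouts is strongly left-skewed and heavy-tailed.

```python
import math
import random

random.seed(1)

def shape_stats(xs):
    """Standard deviation, skewness, and excess kurtosis of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    std = math.sqrt(m2)
    return std, m3 / std ** 3, m4 / m2 ** 2 - 3.0

# Noise-like intensity: symmetric, light-tailed.
noise = [random.gauss(0.0, 1.0) for _ in range(50000)]

# LFF-like intensity: mostly quiet with rare deep dropouts.
lff = [random.gauss(0.0, 0.2) - (8.0 if random.random() < 0.02 else 0.0)
       for _ in range(50000)]

noise_stats = shape_stats(noise)
lff_stats = shape_stats(lff)
print([round(v, 2) for v in noise_stats])
print([round(v, 2) for v in lff_stats])
```

The dropout trace shows a large negative skewness and a large positive excess kurtosis, the signature the paper exploits to locate the LFF regime boundary.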
Seeking a fingerprint: analysis of point processes in actigraphy recording
NASA Astrophysics Data System (ADS)
Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek
2016-05-01
Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
Analysis of electrophoresis performance
NASA Technical Reports Server (NTRS)
Roberts, G. O.
1984-01-01
The SAMPLE computer code models electrophoresis separation in a wide range of conditions. Results are included for steady three dimensional continuous flow electrophoresis (CFE), time dependent gel and acetate film experiments in one or two dimensions and isoelectric focusing in one dimension. The code evolves N two dimensional radical concentration distributions in time, or distance down a CFE chamber. For each time or distance increment, there are six stages, successively obtaining the pH distribution, the corresponding degrees of ionization for each radical, the conductivity, the electric field and current distribution, and the flux components in each direction for each separate radical. The final stage is to update the radical concentrations. The model formulation for ion motion in an electric field ignores activity effects, and is valid only for low concentrations; for larger concentrations the conductivity is, therefore, also invalid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, David; Hershey, Ronald L.
Methods were developed to quantify uncertainty and sensitivity for NETPATH inverse water-rock reaction models and to calculate dissolved inorganic carbon, carbon-14 groundwater travel times. The NETPATH models calculate upgradient groundwater mixing fractions that produce the downgradient target water chemistry, along with amounts of mineral phases that are either precipitated or dissolved. Carbon-14 groundwater travel times are calculated based on the upgradient source-water fractions, carbonate mineral phase changes, and isotopic fractionation. Custom scripts and statistical code were developed for this study to facilitate modifying input parameters, running the NETPATH simulations, extracting relevant output, postprocessing the results, and producing graphs and summaries. The scripts read user-specified values for each constituent's coefficient of variation, distribution, sensitivity parameter, maximum dissolution or precipitation amounts, and number of Monte Carlo simulations. Monte Carlo methods for analysis of parametric uncertainty assign a distribution to each uncertain variable, sample from those distributions, and evaluate the ensemble output. The uncertainty in input affected the variability of outputs, namely source-water mixing, phase dissolution and precipitation amounts, and carbon-14 travel time. Although NETPATH may provide models that satisfy the constraints, it is up to the geochemist to determine whether the results are geochemically reasonable. Two example water-rock reaction models from previous geochemical reports were considered in this study. Sensitivity analysis was also conducted to evaluate the change in output caused by a small change in input, one constituent at a time. Results were standardized to allow for sensitivity comparisons across all inputs, which yields a representative value for each scenario. The approach yielded insight into the uncertainty in water-rock reactions and travel times.
For example, there was little variation in source-water fraction between the deterministic and Monte Carlo approaches, and therefore, little variation in travel times between approaches. Sensitivity analysis proved very useful for identifying the most important input constraints (dissolved-ion concentrations), which can reveal the variables that have the most influence on source-water fractions and carbon-14 travel times. Once these variables are determined, more focused effort can be applied to determining the proper distribution for each constraint. Second, Monte Carlo results for water-rock reaction modeling showed discrete and nonunique results. The NETPATH models provide the solutions that satisfy the constraints of upgradient and downgradient water chemistry. There can exist multiple, discrete solutions for any scenario and these discrete solutions cause grouping of results. As a result, the variability in output may not easily be represented by a single distribution or a mean and variance and care should be taken in the interpretation and reporting of results.
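The Monte Carlo workflow described above (assign a distribution to each uncertain input, sample, evaluate the ensemble, then perturb one constituent at a time) can be sketched generically. The `travel_time` function below is a hypothetical stand-in with made-up parameters, NOT the actual NETPATH chemistry; only the workflow around it mirrors the report.

```python
import math
import random
import statistics

random.seed(3)

def travel_time(source_fraction, carbonate_mgl, c14_pmc):
    """Hypothetical stand-in for a carbon-14 travel-time calculation:
    a smooth toy function used only to demonstrate the workflow."""
    mean_life = 8267.0                      # 14C mean life in years (5730 / ln 2)
    dilution = 1.0 + 0.01 * carbonate_mgl   # toy carbonate dilution correction
    initial_pmc = 100.0 * source_fraction
    return mean_life * math.log(initial_pmc / (c14_pmc * dilution))

def sample_inputs():
    # Assign a distribution to each uncertain input; draw one realization.
    return (random.gauss(0.9, 0.05),   # source-water fraction
            random.gauss(20.0, 5.0),   # dissolved carbonate, mg/L
            random.gauss(30.0, 3.0))   # measured 14C, percent modern carbon

# Monte Carlo: evaluate the ensemble output.
ensemble = [travel_time(*sample_inputs()) for _ in range(5000)]
mc_mean, mc_std = statistics.mean(ensemble), statistics.stdev(ensemble)
print(round(mc_mean), round(mc_std))

# One-at-a-time sensitivity: bump each input by +1% and record the change.
base = (0.9, 20.0, 30.0)
t0 = travel_time(*base)
for i, name in enumerate(("source_fraction", "carbonate_mgl", "c14_pmc")):
    bumped = list(base)
    bumped[i] *= 1.01
    print(name, round(travel_time(*bumped) - t0, 1))
```

Even on this toy, the one-at-a-time pass ranks the inputs: the source fraction and the measured 14C move the travel time far more per percent than the carbonate term, which is the kind of ranking the report uses to decide where to refine input distributions.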
NASA Astrophysics Data System (ADS)
Haneef, Shahna M.; Srijith, K.; Venkitesh, D.; Srinivasan, B.
2017-04-01
We propose and demonstrate the use of cross recurrence plot analysis (CRPA) to accurately determine the Brillouin shift due to strain and temperature in a Brillouin distributed fiber sensor. This signal processing technique, implemented in Brillouin sensors for the first time, relies on a priori data, i.e., the lineshape of the Brillouin gain spectrum and its similarity with the spectral features measured at different locations along the fiber. Analytical and experimental investigations of the proposed scheme are presented in this paper.
Vercruysse, Jurgen; Toiviainen, Maunu; Fonteyne, Margot; Helkimo, Niko; Ketolainen, Jarkko; Juuti, Mikko; Delaet, Urbain; Van Assche, Ivo; Remon, Jean Paul; Vervaet, Chris; De Beer, Thomas
2014-04-01
Over the last decade, there has been increased interest in the application of twin screw granulation as a continuous wet granulation technique for pharmaceutical drug formulations. However, the mixing of granulation liquid and powder material during the short residence time inside the screw chamber, and the atypical particle size distribution (PSD) of granules produced by twin screw granulation, are not yet fully understood. This study therefore aims at visualizing the granulation liquid mixing and distribution during continuous twin screw granulation using NIR chemical imaging. First, the residence time of material inside the barrel was investigated as a function of screw speed and moisture content, followed by visualization of the granulation liquid distribution as a function of different formulation and process parameters (liquid feed rate, liquid addition method, screw configuration, moisture content and barrel filling degree). The link between moisture uniformity and granule size distributions was also studied. In the residence time analysis, increased screw speed and lower moisture content resulted in a shorter mean residence time and a narrower residence time distribution. In addition, the distribution of granulation liquid was more homogeneous at higher moisture content and with more kneading zones on the granulator screws. After optimization of the screw configuration, a two-level full factorial experimental design was performed to evaluate the influence of moisture content, screw speed and powder feed rate on the mixing efficiency of the powder and liquid phases. From these results, it was concluded that only increasing the moisture content significantly improved the granulation liquid distribution. This study demonstrates that NIR chemical imaging is a fast and adequate measurement tool for process visualization, and hence provides better process understanding of a continuous twin screw granulation system. Copyright © 2013 Elsevier B.V. All rights reserved.
Wu, Hao; Wang, Ruoxu; Liu, Deming; Fu, Songnian; Zhao, Can; Wei, Huifeng; Tong, Weijun; Shum, Perry Ping; Tang, Ming
2016-04-01
We propose and demonstrate a few-mode fiber (FMF) based optical-fiber sensor for distributed curvature measurement through the quasi-single-mode Brillouin frequency shift (BFS). By central-alignment splicing of the FMF and a single-mode fiber (SMF) with a fusion taper, an SMF-component-compatible distributed curvature sensor based on FMF is realized using a conventional Brillouin optical time-domain analysis system. The distributed BFS change induced by bending in the FMF has been theoretically and experimentally investigated, and the BFS response to the curvature along the fiber link has been precisely calibrated. A proof-of-concept experiment validates the sensor's effectiveness in distributed curvature measurement.
Evolution and Advances in Satellite Analysis of Volcanoes
NASA Astrophysics Data System (ADS)
Dean, K. G.; Dehn, J.; Webley, P.; Bailey, J.
2008-12-01
Over the past 20 years, satellite data used for monitoring and analysis of volcanic eruptions have evolved in terms of timeliness, access, distribution, resolution and understanding of volcanic processes. Initially, satellite data were used for retrospective analysis, but they now feed proactive monitoring systems. Timely acquisition of data and the capability to distribute large data files paralleled advances in computer technology and were critical components for near real-time monitoring. The sharing of these data and the resulting discussions have improved our understanding of eruption processes and, even more importantly, their impact on society. To illustrate this evolution, critical scientific discoveries will be highlighted, including detection of airborne ash and sulfur dioxide, cloud-height estimates, prediction of ash cloud movement, and detection of thermal anomalies as precursor signals to eruptions. The Alaska Volcano Observatory (AVO) has been a leader in implementing many of these advances in an operational setting, such as automated eruption detection, database analysis systems, and remotely accessible web-based analysis systems. Finally, limitations resulting from trade-offs in resolution, and the weaknesses they introduce in detection techniques and hazard assessments, will be presented.
Comparison of different functional EIT approaches to quantify tidal ventilation distribution.
Zhao, Zhanqi; Yun, Po-Jen; Kuo, Yen-Liang; Fu, Feng; Dai, Meng; Frerichs, Inez; Möller, Knut
2018-01-30
The aim of the study was to examine the pros and cons of different types of functional EIT (fEIT) for quantifying tidal ventilation distribution in a clinical setting. fEIT images were calculated from (1) the standard deviation of the pixel time curve, (2) regression coefficients of global and local impedance time curves, or (3) mean tidal variations. To characterize temporal heterogeneity of tidal ventilation distribution, another fEIT image of pixel inspiration times is also proposed. fEIT-regression is very robust to signals with different phase information. When the respiratory signal must be distinguished from the heart-beat-related signal, or during high-frequency oscillatory ventilation, fEIT-regression is superior to the other types. fEIT-tidal variation is the most stable image type with respect to baseline shift, and we recommend this type of fEIT image for preliminary evaluation of the acquired EIT data. However, all of these fEITs would be misleading in their assessment of ventilation distribution in the presence of temporal heterogeneity. The analysis software provided by currently available commercial EIT equipment offers only fEIT of standard deviation or tidal variation. Considering the pros and cons of each fEIT type, we recommend embedding more types into the analysis software to allow physicians to deal with more complex clinical applications of on-line EIT measurements.
NASA Technical Reports Server (NTRS)
Press, Harry; Mazelsky, Bernard
1954-01-01
The applicability of some results from the theory of generalized harmonic analysis (or power-spectral analysis) to the analysis of gust loads on airplanes in continuous rough air is examined. The general relations for linear systems between the power spectrums of a random input disturbance and an output response are used to relate the spectrum of airplane load in rough air to the spectrum of atmospheric gust velocity. The power spectrum of loads is shown to provide a measure of the load intensity in terms of the standard deviation (root mean square) of the load distribution for an airplane in flight through continuous rough air. For the case of a load output having a normal distribution, which appears from experimental evidence to apply to homogeneous rough air, the standard deviation is shown to describe the probability distribution of loads, or the proportion of total time that the load has given values. Thus, for an airplane in flight through homogeneous rough air, the probability distribution of loads may be determined from a power-spectral analysis. In order to illustrate the application of power-spectral analysis to gust-load analysis and to obtain insight into the relations between loads and airplane gust-response characteristics, two selected series of calculations are presented. The results indicate that both methods of analysis yield results that are consistent to a first approximation.
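The central relation used in this abstract — that the standard deviation (RMS) of the load follows from integrating its power spectrum — can be sketched numerically. The snippet below is a hedged illustration, not the report's actual computation: the white-noise record, sampling rate, and Welch segment length are arbitrary choices made only to show that the PSD integral recovers the signal variance.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)
fs = 100.0                       # sampling rate in Hz (arbitrary)
x = rng.normal(size=200_000)     # stand-in for a gust-velocity or load record

# Welch estimate of the one-sided power spectral density
f, pxx = welch(x, fs=fs, nperseg=4096)

# Integrating the PSD over frequency recovers the variance of the record,
# so the RMS load follows directly from the load spectrum.
var_from_psd = trapezoid(pxx, f)
print(var_from_psd, np.var(x))
```

For a linear airplane response, the same integral applied to |H(f)|² times the gust-velocity spectrum gives the load variance directly, which is the report's route from gust statistics to load statistics.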
Tinkelman, Igor; Melamed, Timor
2005-06-01
In Part I of this two-part investigation [J. Opt. Soc. Am. A 22, 1200 (2005)], we presented a theory for phase-space propagation of time-harmonic electromagnetic fields in an anisotropic medium characterized by a generic wave-number profile. In this Part II, these investigations are extended to transient fields, setting a general analytical framework for local analysis and modeling of radiation from time-dependent extended-source distributions. In this formulation the field is expressed as a superposition of pulsed-beam propagators that emanate from all space-time points in the source domain and in all directions. Using time-dependent quadratic-Lorentzian windows, we represent the field by a phase-space spectral distribution in which the propagating elements are pulsed beams, which are formulated by a transient plane-wave spectrum over the extended-source plane. By applying saddle-point asymptotics, we extract the beam phenomenology in the anisotropic environment resulting from short-pulsed processing. Finally, the general results are applied to the special case of uniaxial crystal and compared with a reference solution.
Re-Organizing Earth Observation Data Storage to Support Temporal Analysis of Big Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2017-01-01
The Earth Observing System Data and Information System archives many datasets that are critical to understanding long-term variations in Earth science properties. Thus, some of these are large, multi-decadal datasets. Yet the challenge in long time series analysis comes less from the sheer volume than the data organization, which is typically one (or a small number of) time steps per file. The overhead of opening and inventorying complex, API-driven data formats such as Hierarchical Data Format introduces a small latency at each time step, which nonetheless adds up for datasets with O(10^6) single-timestep files. Several approaches to reorganizing the data can mitigate this overhead by an order of magnitude: pre-aggregating data along the time axis (time-chunking); storing the data in a highly distributed file system; or storing data in distributed columnar databases. Storing a second copy of the data incurs extra costs, so some selection criteria must be employed, which would be driven by expected or actual usage by the end user community, balanced against the extra cost.
Wada, Daichi; Igawa, Hirotaka; Kasai, Tokio
2016-09-01
We demonstrate a dynamic distributed monitoring technique using a long-length fiber Bragg grating (FBG) interrogated by optical frequency domain reflectometry (OFDR), which measures strain at a rate of 150 Hz with a spatial resolution of 1 mm over a measurement range of 20 m. A 5 m FBG is bonded to a 5.5 m helicopter blade model, and vibration is applied by the step relaxation method. The time-domain responses of the strain distributions are measured, and the blade deflections are calculated from the strain distributions. Frequency response functions are obtained using the time-domain responses of the calculated deflection induced by the preload release, and the modal parameters are retrieved. The experimental results demonstrate the dynamic monitoring performance of the OFDR-FBG technique and its applicability to modal analysis.
Distribution of coniferin in freeze-fixed stem of Ginkgo biloba L. by cryo-TOF-SIMS/SEM
NASA Astrophysics Data System (ADS)
Aoki, Dan; Hanaya, Yuto; Akita, Takuya; Matsushita, Yasuyuki; Yoshida, Masato; Kuroda, Katsushi; Yagami, Sachie; Takama, Ruka; Fukushima, Kazuhiko
2016-08-01
To clarify the role of coniferin in planta, the semi-quantitative cellular distribution of coniferin in quick-frozen Ginkgo biloba L. (ginkgo) was visualized by cryo time-of-flight secondary ion mass spectrometry and scanning electron microscopy (cryo-TOF-SIMS/SEM) analysis. The amount and rough distribution of coniferin were confirmed by quantitative chromatography using serial tangential sections of the freeze-fixed ginkgo stem. The lignification stage of the sample was estimated from microscopic observations. The coniferin distribution visualized at the transverse and radial surfaces of the freeze-fixed ginkgo stem suggested that coniferin is stored in the vacuoles, and agreed well with the timing of coniferin assimilation into lignin in differentiating xylem. Consequently, it is suggested that coniferin is stored in the tracheid cells of differentiating xylem and is a lignin precursor.
NASA Astrophysics Data System (ADS)
Uglov, A. A.; Smurov, I. Yu; Gus'kov, A. G.; Aksenov, L. V.
1990-08-01
A theoretical study is reported of melting and thermocapillary convection under the action of laser radiation with a nonmonotonic spatial distribution of the power density. An analysis is made of changes in the geometry of the molten bath with time. The transition from a nonmonotonic boundary of the melt, corresponding to the spatial distribution of the radiation, to a monotonic one occurs in a time of the order of 1 ms when the power density of the laser radiation is 10^5 W/cm^2. The vortex structure of the flow in the molten bath is governed by the spatial distribution of the laser radiation in such a way that each local power density maximum corresponds to two vortices with oppositely directed velocity components.
Voice-onset time and buzz-onset time identification: A ROC analysis
NASA Astrophysics Data System (ADS)
Lopez-Bascuas, Luis E.; Rosner, Burton S.; Garcia-Albea, Jose E.
2004-05-01
Previous studies have employed signal detection theory to analyze data from speech and nonspeech experiments. Typically, signal distributions were assumed to be Gaussian. Schouten and van Hessen [J. Acoust. Soc. Am. 104, 2980-2990 (1998)] explicitly tested this assumption for an intensity continuum and a speech continuum. They measured response distributions directly and, assuming an interval scale, concluded that the Gaussian assumption held for both continua. However, Pastore and Macmillan [J. Acoust. Soc. Am. 111, 2432 (2002)] applied ROC analysis to Schouten and van Hessen's data, assuming only an ordinal scale. Their ROC curves supported the Gaussian assumption for the nonspeech signals only. Previously, Lopez-Bascuas [Proc. Audit. Bas. Speech Percept., 158-161 (1997)] found evidence with a rating scale procedure that the Gaussian model was inadequate for a voice-onset time continuum but not for a noise-buzz continuum. Both continua contained ten stimuli with asynchronies ranging from -35 ms to +55 ms. ROC curves (double-probability plots) are now reported for each pair of adjacent stimuli on the two continua. Both speech and nonspeech ROCs often appeared nonlinear, indicating non-Gaussian signal distributions under the usual zero-variance assumption for response criteria.
A statistical analysis of North East Atlantic (submicron) aerosol size distributions
NASA Astrophysics Data System (ADS)
Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.
2011-12-01
The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories with similar characteristics, namely: coastal nucleation (occurring 21.3% of the time), open ocean nucleation (occurring 32.6% of the time), background clean marine (occurring 26.1% of the time) and anthropogenic (occurring 20% of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm, while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. Peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic air.
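The clustering step described above can be sketched in a few lines. This is a hedged illustration only: the diameter grid, lognormal mode parameters, noise level, and cluster count below are invented, and scipy's `kmeans2` (Lloyd's algorithm) stands in for the Hartigan-Wong implementation used in the paper.

```python
import numpy as np
from scipy.cluster.vq import kmeans2, whiten

rng = np.random.default_rng(1)
d = np.logspace(0, 3, 50)   # particle diameter grid in nm (hypothetical)

def mode(center, width):
    # lognormal-shaped number-size mode on the diameter grid
    return np.exp(-0.5 * ((np.log(d) - np.log(center)) / width) ** 2)

# Synthetic spectra: a "nucleation-like" family (mode < 10 nm) and an
# "accumulation-like" family (mode near 200 nm), plus noise.
spectra = np.vstack(
    [mode(8, 0.3) + 0.05 * rng.normal(size=50) for _ in range(40)] +
    [mode(200, 0.4) + 0.05 * rng.normal(size=50) for _ in range(40)]
)

# k-means on the per-channel-normalized spectra
centroids, labels = kmeans2(whiten(spectra), k=2, minit='++', seed=1)
print(np.bincount(labels))
```

In the paper the same idea is applied to a year of measured size distributions with k = 12, and the resulting clusters are then grouped by eye into the four physical categories.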
User-Perceived Reliability of M-for-N (M: N) Shared Protection Systems
NASA Astrophysics Data System (ADS)
Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue
In this paper we investigate the reliability of general-type shared protection systems, i.e., M-for-N (M:N), which can typically be applied to various telecommunication network devices. We focus on the reliability perceived by an end user of one of the N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner. The mathematical analysis gives the closed-form solution of the availability, and recursive computing algorithms for the MTTFF (Mean Time to First Failure) and the MTTF (Mean Time to Failure) perceived by an arbitrary end user. We also show that, under a certain condition, the probability distribution of the TTFF (Time to First Failure) can be approximated by a simple exponential distribution. The analysis provides useful information for the analysis and design of not only telecommunication network devices but also other general shared protection systems that are subject to service level agreements (SLAs) involving user-perceived reliability measures.
Ocean Surface Wave Optical Roughness: Analysis of Innovative Measurements
2013-12-16
relationship of MSS to wind speed, and at times has shown a reversal of the Cox-Munk linear relationship. Furthermore, we observe measurable changes in... [1985]. The variable speed allocation method has the effect of aliasing (cb) to slower waves, thereby increasing the exponent -m. Our analysis based... (RaDyO) program. The primary research goals of the program are to (1) examine time-dependent oceanic radiance distribution in relation to dynamic...
Wang, Dandan; Zong, Qun; Tian, Bailing; Shao, Shikai; Zhang, Xiuyun; Zhao, Xinyi
2018-02-01
The distributed finite-time formation tracking control problem for multiple unmanned helicopters is investigated in this paper. The control objective is to maintain the positions of follower helicopters in formation under external disturbances. The helicopter model is divided into a second-order outer-loop subsystem and a second-order inner-loop subsystem based on multiple-time-scale features. Using the radial basis function neural network (RBFNN) technique, we first propose a novel finite-time multivariable neural network disturbance observer (FMNNDO) to estimate the external disturbance and model uncertainty, where the neural network (NN) approximation errors can be dynamically compensated by an adaptive law. Next, based on FMNNDO, a distributed finite-time formation tracking controller and a finite-time attitude tracking controller are designed using the nonsingular fast terminal sliding mode (NFTSM) method. In order to estimate the second derivative of the virtual desired attitude signal, a novel finite-time sliding mode integral filter is designed. Finally, Lyapunov analysis and the multiple-time-scale principle ensure the realization of the control goal in finite time. The effectiveness of the proposed FMNNDO and controllers is then verified by numerical simulations.
Temporal Methods to Detect Content-Based Anomalies in Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.
Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster the documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores produced by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
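The first technique — comparing term distributions across time periods — can be sketched with a Jensen-Shannon divergence, one standard information-theoretic distance between distributions. The toy documents, vocabulary handling, and smoothing constant below are invented; the paper's exact measure may differ.

```python
import numpy as np
from collections import Counter

def term_distribution(docs, vocab):
    # relative frequency of each vocabulary term in one time period's documents
    counts = Counter(w for doc in docs for w in doc.split())
    freq = np.array([counts[w] for w in vocab], dtype=float) + 1e-12
    return freq / freq.sum()

def js_divergence(p, q):
    # Jensen-Shannon divergence (base 2) between two term distributions
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log2(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# toy "time periods": two normal periods and one anomalous one
normal_1 = ["cats dogs pets", "dogs cats"]
normal_2 = ["cats dogs pets", "dogs cats"]
anomalous = ["earthquake alert alert", "earthquake news"]
vocab = sorted({w for d in normal_1 + anomalous for w in d.split()})

p = term_distribution(normal_1, vocab)
q = term_distribution(normal_2, vocab)
r = term_distribution(anomalous, vocab)
print(js_divergence(p, q), js_divergence(p, r))
```

Periods with near-zero divergence from the running norm would be scored as normal; a spike in divergence flags a content anomaly.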
Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?
NASA Astrophysics Data System (ADS)
Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.
2018-02-01
The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 events in each catalog and found the statistical range of β values. The observed value of β = 0.83 for the CMT catalog corresponds to a p value of 0.004, leading us to conclude that the interevent natural times in the CMT catalog are not random. For the time series analysis, we calculated the autocorrelation function for the sequence of natural time intervals between large global earthquakes and again compared with data from 1.5 × 10^4 synthetic catalogs of random data. In this case, the spread of autocorrelation values was much larger, so we concluded that this approach is insensitive to deviations from random behavior.
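The quantification step described in this abstract — fitting a Weibull shape exponent β to the interevent counts, with β = 1 for a random (exponential) distribution and β < 1 indicating clustering — can be sketched as follows. The counts here are synthetic with an arbitrary scale, not the CMT catalog.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(3)

# Synthetic natural-time interevent counts for a random (Poisson-like)
# catalog: exponentially distributed counts correspond to beta = 1.
counts = rng.exponential(scale=50.0, size=2000)

# Fit a Weibull distribution with location fixed at zero; the fitted
# shape parameter is the beta exponent used in the paper.
beta, loc, scale = weibull_min.fit(counts, floc=0)
print(round(beta, 2))
```

Applied to a real catalog's interevent counts, a fitted β well below 1 (like the paper's 0.83) signals temporal clustering relative to the random baseline.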
Modeling chloride transport using travel time distributions at Plynlimon, Wales
NASA Astrophysics Data System (ADS)
Benettin, Paolo; Kirchner, James W.; Rinaldo, Andrea; Botter, Gianluca
2015-05-01
Here we present a theoretical interpretation of high-frequency, high-quality tracer time series from the Hafren catchment at Plynlimon in mid-Wales. We make use of the formulation of transport by travel time distributions to model chloride transport originating from atmospheric deposition and to compute catchment-scale travel time distributions. The relevance of the approach lies in the explanatory power of the chosen tools, particularly in highlighting hydrologic processes otherwise clouded by the integrated nature of the measured outflux signal. The analysis reveals the key role of residual storages that are poorly visible in the hydrological response but are shown to strongly affect water quality dynamics. Our calibrated model reproduces the data with significant accuracy. A detailed representation of catchment-scale travel time distributions has been derived, including the time evolution of the overall dispersion processes (which can be expressed in terms of time-varying storage sampling functions). Mean computed travel times span a broad range of values (from 80 to 800 days) depending on the catchment state. Results also suggest that, on average, discharge waters are younger than storage waters. The model proves able to capture high-frequency fluctuations in the measured chloride concentrations, which are broadly explained by the sharp transition between groundwaters and faster flows originating from topsoil layers. This article was corrected on 22 JUN 2015. See the end of the full text for details.
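In its simplest stationary form, the travel time formulation writes the outflux concentration as the influx convolved with a travel time distribution. The sketch below illustrates only that idea: the gamma-shaped TTD, mean input of 5 mg/L, and noise level are invented, and a real catchment (as the abstract emphasizes) has time-varying TTDs rather than a fixed kernel.

```python
import numpy as np

rng = np.random.default_rng(6)
days = np.arange(365.0)

# Illustrative gamma-shaped travel time distribution (not calibrated)
ttd = days**1.5 * np.exp(-days / 60.0)
ttd /= ttd.sum()

# Noisy atmospheric chloride input around 5 mg/L (invented numbers)
c_in = 5.0 + rng.normal(0.0, 1.0, size=2000)

# Steady-state sketch: outflux = influx convolved with the TTD
c_out = np.convolve(c_in, ttd, mode='valid')
print(np.std(c_in), np.std(c_out))
```

The convolution strongly damps the input variability while preserving its mean, which is why an integrated outflux signal can hide the fast processes that the paper's time-varying formulation is designed to expose.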
Cardiovascular response to acute stress in freely moving rats: time-frequency analysis.
Loncar-Turukalo, Tatjana; Bajic, Dragana; Japundzic-Zigon, Nina
2008-01-01
Spectral analysis of cardiovascular series is an important tool for assessing the features of the autonomic control of the cardiovascular system. In this experiment, Wistar rats equipped with an intraarterial catheter for blood pressure (BP) recording were exposed to stress induced by blowing air. The problem of nonstationary data was overcome by applying the Smoothed Pseudo Wigner-Ville (SPWV) time-frequency distribution. Spectral analysis was done before stress, during stress, immediately after stress and later in recovery. The spectral indices were calculated for both the systolic blood pressure (SBP) and pulse interval (PI) series. The time evolution of the spectral indices showed a perturbed sympathovagal balance.
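Time-frequency analysis of this kind can be sketched with an ordinary STFT spectrogram standing in for the SPWV distribution used in the study. Everything below is invented for illustration: the 20 Hz rate, the synthetic "SBP-like" series, and the frequency step at t = 50 s mimicking a spectral shift at stress onset; real SBP series would be resampled beat-to-beat data.

```python
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(4)
fs = 20.0                              # resampled rate in Hz (invented)
t = np.arange(0.0, 100.0, 1.0 / fs)

# Synthetic series whose low-frequency component jumps from 0.1 Hz to
# 0.4 Hz at t = 50 s, mimicking a shift at stress onset
f_inst = np.where(t < 50.0, 0.1, 0.4)
phase = 2.0 * np.pi * np.cumsum(f_inst) / fs
x = np.sin(phase) + 0.1 * rng.normal(size=t.size)

f, tt, sxx = spectrogram(x, fs=fs, nperseg=400, noverlap=300)

# dominant frequency well before and well after the transition
early = f[sxx[:, tt < 40.0].mean(axis=1).argmax()]
late = f[sxx[:, tt > 60.0].mean(axis=1).argmax()]
print(early, late)
```

The SPWV distribution serves the same purpose with better joint time-frequency resolution and suppressed cross-terms, which is why the study prefers it for short, nonstationary stress epochs.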
Wen-Ching Chuang; Christopher G. Boone; Dexter H. Locke; J. Morgan Grove; Ali Whitmer; Geoffrey Buckley; Sainan Zhang
2017-01-01
Trees provide important health, ecosystem, and aesthetic services in urban areas, but they are unevenly distributed. Some neighborhoods have abundant tree canopy and others nearly none. We analyzed how neighborhood characteristics and changes in income over time related to the distribution of urban tree canopy in Washington, D.C. and Baltimore, MD. We used stepwise...
A study of a diffusive model of asset returns and an empirical analysis of financial markets
NASA Astrophysics Data System (ADS)
Alejandro Quinones, Angel Luis
A diffusive model for market dynamics is studied and the predictions of the model are compared to real financial markets. The model has a non-constant diffusion coefficient which depends both on the asset value and the time. A general solution for the distribution of returns is obtained and shown to match the results of computer simulations for two simple cases, piecewise linear and quadratic diffusion. The effects of discreteness in the market dynamics on the model are also studied. For the quadratic diffusion case, a type of phase transition leading to fat tails is observed as the discrete distribution approaches the continuum limit. It is also found that the model captures some of the empirical stylized facts observed in real markets, including fat-tails and scaling behavior in the distribution of returns. An analysis of empirical data for the EUR/USD currency exchange rate and the S&P 500 index is performed. Both markets show time scaling behavior consistent with a value of 1/2 for the Hurst exponent. Finally, the results show that the distribution of returns for the two markets is well fitted by the model, and the corresponding empirical diffusion coefficients are determined.
Fluctuations in Wikipedia access-rate and edit-event data
NASA Astrophysics Data System (ADS)
Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev
2012-12-01
Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
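Detrended fluctuation analysis itself is compact enough to sketch. The version below uses order-1 detrending and non-overlapping windows with arbitrarily chosen scales; it recovers α ≈ 0.5 for uncorrelated noise, whereas the access-rate series in the paper show the strongly correlated α ≈ 0.9.

```python
import numpy as np

def dfa(x, scales):
    # Detrended fluctuation analysis with order-1 (linear) detrending
    y = np.cumsum(x - np.mean(x))               # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:                        # detrend each window
            coef = np.polyfit(t, seg, 1)
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(ms)))
    return np.array(flucts)

rng = np.random.default_rng(5)
x = rng.normal(size=20_000)                     # uncorrelated noise
scales = np.array([16, 32, 64, 128, 256])
fl = dfa(x, scales)

# slope of log F(s) vs log s is the fluctuation exponent alpha
alpha = np.polyfit(np.log(scales), np.log(fl), 1)[0]
print(round(alpha, 2))
```

Feeding an hourly access-rate series through the same function (after removing the daily/weekly patterns, as the paper's analysis requires) would yield the α ≈ 0.9 long-term correlations reported above.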
Compiling global name-space programs for distributed execution
NASA Technical Reports Server (NTRS)
Koelbel, Charles; Mehrotra, Piyush
1990-01-01
Distributed memory machines do not provide hardware support for a global address space. Thus programmers are forced to partition the data across the memories of the architecture and use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for the analysis of a high-level source program and its translation to a set of independently executing tasks communicating via messages. If the compiler has enough information, this translation can be carried out at compile time. Otherwise, run-time code is generated to implement the required data movement. The analysis required in both situations is described, and the performance of the generated code on the Intel iPSC/2 is presented.
Microscopic analysis of currency and stock exchange markets.
Kador, L
1999-08-01
Recently it was shown that distributions of short-term price fluctuations in foreign-currency exchange exhibit striking similarities to those of velocity differences in turbulent flows. Similar profiles represent the spectral-diffusion behavior of impurity molecules in disordered solids at low temperatures. It is demonstrated that a microscopic statistical theory of the spectroscopic line shapes can be applied to the other two phenomena. The theory interprets the financial data in terms of information which becomes available to the traders and their reactions as a function of time. The analysis shows that there is no characteristic time scale in financial markets, but that instead stretched-exponential or algebraic memory functions yield good agreement with the price data. For an algebraic function, the theory yields truncated Lévy distributions which are often observed in stock exchange markets.
A unified Bayesian semiparametric approach to assess discrimination ability in survival analysis
Zhao, Lili; Feng, Dai; Chen, Guoan; Taylor, Jeremy M.G.
2015-01-01
The discriminatory ability of a marker for censored survival data is routinely assessed by the time-dependent ROC curve and the c-index. The time-dependent ROC curve evaluates the ability of a biomarker to predict whether a patient lives past a particular time t. The c-index measures the global concordance of the marker and the survival time regardless of the time point. We propose a Bayesian semiparametric approach to estimate these two measures. The proposed estimators are based on the conditional distribution of the survival time given the biomarker and the empirical biomarker distribution. The conditional distribution is estimated by a linear dependent Dirichlet process mixture model. The resulting ROC curve is smooth, as it is estimated by a mixture of parametric functions. The proposed c-index estimator is shown to be more efficient than the commonly used Harrell's c-index, since it uses all pairs of data rather than only informative pairs. The proposed estimators are evaluated through simulations and illustrated using a lung cancer dataset. PMID:26676324
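Harrell's c-index mentioned above uses only "usable" pairs — pairs in which the earlier observed time is an actual event, not a censoring. A minimal O(n²) sketch with invented data (assuming higher marker values indicate higher risk, i.e. shorter survival, and distinct observed times):

```python
import numpy as np

def c_index(time, event, marker):
    """Harrell's c-index: among usable pairs, the fraction where the
    subject with the higher marker value has the shorter survival time."""
    conc = ties = usable = 0
    n = len(time)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so subject a has the smaller observed time
            a, b = (i, j) if time[i] < time[j] else (j, i)
            if not event[a]:
                continue              # earlier time is censored: unusable
            usable += 1
            if marker[a] > marker[b]:
                conc += 1
            elif marker[a] == marker[b]:
                ties += 1
    return (conc + 0.5 * ties) / usable

time = np.array([2.0, 5.0, 3.0, 8.0, 1.0])
event = np.array([1, 1, 0, 1, 1])     # 1 = death observed, 0 = censored
marker = np.array([0.9, 0.4, 0.5, 0.1, 0.8])
print(c_index(time, event, marker))   # 7 of 8 usable pairs concordant
```

The abstract's point is that pairs skipped by the `continue` above still carry partial information, which the proposed model-based estimator exploits for efficiency.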
Directional Migration of Recirculating Lymphocytes through Lymph Nodes via Random Walks
Thomas, Niclas; Matejovicova, Lenka; Srikusalanukul, Wichat; Shawe-Taylor, John; Chain, Benny
2012-01-01
Naive T lymphocytes exhibit extensive antigen-independent recirculation between blood and lymph nodes, where they may encounter dendritic cells carrying cognate antigen. We examine how long different T cells may spend in an individual lymph node by analysing data from long-term cannulation of blood and efferent lymphatics of a single lymph node in the sheep. We determine empirically the distribution of transit times of migrating T cells by applying the Least Absolute Shrinkage and Selection Operator (LASSO), a regularised regression method, to fit experimental data describing the proportion of labelled infused cells in blood and efferent lymphatics over time. The optimal inferred solution reveals a distribution with high variance and strong skew. The mode transit time is typically between 10 and 20 hours, but a significant number of cells spend more than 70 hours before exiting. We complement the empirical machine-learning-based approach by modelling lymphocyte passage through the lymph node. On the basis of previous two-photon analysis of lymphocyte movement, we optimised distributions which describe the transit times (first passage times) of discrete one-dimensional and continuous (Brownian) three-dimensional random walks with drift. The optimal fit is obtained when drift is small, i.e. the ratio of probabilities of migrating forward and backward within the node is close to one. These distributions are qualitatively similar to the inferred empirical distribution, with high variance and strong skew. In contrast, an optimised normal distribution of transit times (symmetrical around the mean) fitted the data poorly. The results demonstrate that the rapid recirculation of lymphocytes observed at a macro level is compatible with predominantly randomised movement within lymph nodes, and significant probabilities of long transit times.
We discuss how this pattern of migration may contribute to facilitating interactions between low frequency T cells and antigen presenting cells carrying cognate antigen. PMID:23028891
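The discrete one-dimensional random walk with weak drift described above can be simulated directly. This sketch uses invented parameters (not values fitted to the sheep data) and reproduces the qualitative result: a high-variance, right-skewed first-passage-time distribution whose mean exceeds its median:

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage_times(n_walks, p_forward=0.55, exit_pos=20, max_steps=100_000):
    """Simulate 1D random walks starting at 0 with a reflecting entry
    boundary; record the step at which each walk first reaches exit_pos."""
    times = []
    for _ in range(n_walks):
        pos, t = 0, 0
        while pos < exit_pos and t < max_steps:
            step = 1 if rng.random() < p_forward else -1
            pos = max(0, pos + step)   # cannot exit back through the entry
            t += 1
        times.append(t)
    return np.array(times)

fpt = first_passage_times(2000)
# weak drift (p_forward close to 0.5) gives a long right tail:
print(f"mean={fpt.mean():.0f}  median={np.median(fpt):.0f}")
```

With the forward/backward probability ratio close to one, a few walks wander for much longer than the typical transit, mirroring the cells observed to remain beyond 70 hours.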
One Step Quantum Key Distribution Based on EPR Entanglement
Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao
2016-01-01
A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves the maneuverability. Moreover, a security analysis is given, showing that a simple type of eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the “Ping-pong” protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step. PMID:27357865
Moving Average Models with Bivariate Exponential and Geometric Distributions.
1985-03-01
Dao Duc, Khanh; Parutto, Pierre; Chen, Xiaowei; Epsztein, Jérôme; Konnerth, Arthur; Holcman, David
2015-01-01
The dynamics of neuronal networks connected by synaptic dynamics can sustain long periods of depolarization lasting hundreds of milliseconds, such as Up states recorded during sleep or anesthesia. Yet the underlying mechanism driving these periods remains unclear. We show here within a mean-field model that the residence time of the neuronal membrane potential in cortical Up states does not follow a Poissonian law, but presents several peaks. Furthermore, the present modeling approach allows extracting some information about the neuronal network connectivity from the time-distribution histogram. Based on a synaptic-depression model, we find that these peaks, which can be observed in histograms of patch-clamp recordings, are not artifacts of electrophysiological measurements, but rather an inherent property of the network dynamics. Analysis of the equations reveals a stable focus located close to the unstable limit cycle, delimiting a region that defines the Up state. The model further shows that the peaks observed in the Up state time distribution are due to winding around the focus before escaping from the basin of attraction. Finally, we use in vivo recordings of intracellular membrane potential and recover from the peak distribution some information about the network connectivity. We conclude that it is possible to recover the network connectivity from the distribution of times that the neuronal membrane voltage spends in Up states.
Waiting time distribution in public health care: empirics and theory.
Dimakou, Sofia; Dimakou, Ourania; Basso, Henrique S
2015-12-01
Excessive waiting times for elective surgery have been a long-standing concern in many national healthcare systems in the OECD. How do the hospital admission patterns that generate waiting lists affect different patients? What are the hospital characteristics that determine waiting times? By developing a model of healthcare provision and analysing the entire waiting time distribution empirically, we attempt to shed some light on these issues. We first build a theoretical model that describes the optimal waiting time distribution for capacity-constrained hospitals. Secondly, employing duration analysis, we obtain empirical representations of that distribution across hospitals in the UK from 1997 to 2005. We observe important differences in the 'scale' and the 'shape' of admission rates. Scale refers to how quickly patients are treated, and shape represents trade-offs across duration-treatment profiles. By fitting the theoretical to the empirical distributions we estimate the main structural parameters of the model and are able to closely identify the main drivers of these empirical differences. We find that the level of resources allocated to elective surgery (budget and physical capacity), which determines how constrained the hospital is, explains differences in scale. Changes in the benefit and cost structures of healthcare provision, which relate, respectively, to the desire to prioritise patients by duration and the reduction in costs due to delayed treatment, determine the shape, affecting short- and long-duration patients differently. JEL Classification: I11; I18; H51.
Inversion Analysis of Postseismic Deformation in Poroelastic Material Using Finite Element Method
NASA Astrophysics Data System (ADS)
Kawamoto, S.; Ito, T.; Hirahara, K.
2005-12-01
Following a large earthquake, postseismic deformations in the focal source region have been observed by several geodetic measurements. To explain the postseismic deformations, researchers have proposed several physical mechanisms, known as afterslip, viscoelastic relaxation and poroelastic rebound. There are a number of studies of postseismic deformation, but few address poroelastic rebound. We therefore calculated the postseismic deformations caused by afterslip and poroelastic rebound using the modified FEM code "CAMBIOT3D", originally developed by the Geotech. Lab., Gunma University, Japan (2003). The postseismic deformations caused by both afterslip and poroelastic rebound differ characteristically from those caused by afterslip alone. This suggests that the slip distributions on the fault estimated from geodetic measurements also change. Because of this, we developed an inversion method that accounts for both afterslip and poroelastic rebound using FEM, to estimate the difference in slip distributions on the fault quantitatively. The inversion analysis takes the following steps. First, we calculate the coseismic and postseismic response functions on each fault segment induced by a unit slip, where the postseismic response functions represent the poroelastic rebound. Next, we form the observation equations at each time step using the response functions and estimate the spatiotemporal distribution of slip on the fault. In solving this inverse problem, we assume the slip distributions on the fault are smooth in space and time except for rapid (coseismic) change. Because hyperparameters that control the smoothness of the spatial and temporal distributions of slip are needed, we determine the best hyperparameters using ABIC. In this presentation, we introduce an example of analysis results obtained using this method.
Effort Drivers Estimation for Brazilian Geographically Distributed Software Development
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio
To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams due to cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and may differ in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.
Use of DAGMan in CRAB3 to improve the splitting of CMS user jobs
NASA Astrophysics Data System (ADS)
Wolf, M.; Mascheroni, M.; Woodard, A.; Belforte, S.; Bockelman, B.; Hernandez, J. M.; Vaandering, E.
2017-10-01
CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). Research in high energy physics often requires the analysis of large collections of files, referred to as datasets. The task is divided into jobs that are distributed among a large collection of worker nodes throughout the Worldwide LHC Computing Grid (WLCG). Splitting a large analysis task into optimally sized jobs is critical to efficient use of distributed computing resources. Jobs that are too big will have excessive runtimes and will not distribute the work across all of the available nodes. However, splitting the project into a large number of very small jobs is also inefficient, as each job creates additional overhead which increases load on infrastructure resources. Currently this splitting is done manually, using parameters provided by the user. However the resources needed for each job are difficult to predict because of frequent variations in the performance of the user code and the content of the input dataset. As a result, dividing a task into jobs by hand is difficult and often suboptimal. In this work we present a new feature called “automatic splitting” which removes the need for users to manually specify job splitting parameters. We discuss how HTCondor DAGMan can be used to build dynamic Directed Acyclic Graphs (DAGs) to optimize the performance of large CMS analysis jobs on the Grid. We use DAGMan to dynamically generate interconnected DAGs that estimate the processing time the user code will require to analyze each event. This is used to calculate an estimate of the total processing time per job, and a set of analysis jobs is run using this estimate as a specified time limit. Some jobs may not finish within the allotted time; they are terminated at the time limit, and the unfinished data is regrouped into smaller jobs and resubmitted.
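The regrouping idea can be sketched as a greedy time-based packer: given an estimated seconds-per-event and per-file event counts, files are packed into jobs whose estimated runtime stays under a target limit. This is an illustrative simplification, not CRAB3's actual splitting code; the file names and estimates are invented:

```python
def split_jobs(file_events, sec_per_event, target_seconds):
    """Greedily pack (filename, n_events) pairs into jobs whose
    estimated runtime does not exceed target_seconds."""
    jobs, current, current_t = [], [], 0.0
    for fname, n_events in file_events:
        t = n_events * sec_per_event
        if current and current_t + t > target_seconds:
            jobs.append(current)           # close the current job
            current, current_t = [], 0.0
        current.append(fname)
        current_t += t
    if current:
        jobs.append(current)
    return jobs

files = [("a.root", 4000), ("b.root", 1000), ("c.root", 3000),
         ("d.root", 2500), ("e.root", 500)]
print(split_jobs(files, sec_per_event=0.5, target_seconds=2000))
```

In the real system the seconds-per-event figure is itself estimated by probe jobs inside the DAG, and oversized jobs are split again after hitting the time limit.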
Load flow and state estimation algorithms for three-phase unbalanced power distribution systems
NASA Astrophysics Data System (ADS)
Madvesh, Chiranjeevi
Distribution load flow and state estimation are two important functions in distribution energy management systems (DEMS) and advanced distribution automation (ADA) systems. Distribution load flow analysis is a tool which helps to analyze the status of a power distribution system under steady-state operating conditions. In this research, an effective and comprehensive load flow algorithm is developed to extensively incorporate the distribution system components. Distribution system state estimation is a mathematical procedure which aims to estimate the operating states of a power distribution system by utilizing the information collected from available measurement devices in real-time. An efficient and computationally effective state estimation algorithm adapting the weighted-least-squares (WLS) method has been developed in this research. Both the developed algorithms are tested on different IEEE test-feeders and the results obtained are justified.
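The WLS estimator at the core of such algorithms can be sketched in a few lines. This is a generic linear-measurement illustration, not the three-phase unbalanced formulation of the thesis; the measurement matrix and variances are invented for the example:

```python
import numpy as np

# Linear measurement model z = H x + e, with weight matrix W equal to the
# inverse measurement variances.  WLS estimate: x = (H^T W H)^{-1} H^T W z.
H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0]])             # e.g. two injections and one flow
sigma = np.array([0.01, 0.01, 0.02])    # measurement standard deviations
W = np.diag(1.0 / sigma**2)

x_true = np.array([0.95, 1.02])
rng = np.random.default_rng(1)
z = H @ x_true + rng.normal(0.0, sigma)  # noisy measurements

G = H.T @ W @ H                          # gain matrix
x_hat = np.linalg.solve(G, H.T @ W @ z)  # solve normal equations
print(np.round(x_hat, 3))
```

In a power-system state estimator the measurement model is nonlinear, so this linear solve is applied iteratively to the Jacobian of the measurement functions until the state update converges.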
Solder, John; Stolp, Bernard J.; Heilweil, Victor M.; Susong, David D.
2016-01-01
Environmental tracers (noble gases, tritium, industrial gases, stable isotopes, and radiocarbon) and hydrogeology were interpreted to determine groundwater transit-time distributions and calculate mean transit time (MTT) with lumped-parameter modeling at 19 large springs distributed throughout the Upper Colorado River Basin (UCRB), USA. The predictive value of the MTT for evaluating the pattern and timing of groundwater response to hydraulic stress (i.e., vulnerability) is examined by a statistical analysis of MTT, historical spring discharge records, and the Palmer Hydrological Drought Index. MTTs of the springs range from 10 to 15,000 years, and 90% of the cumulative discharge-weighted travel-time distribution falls within the range of 2 to 10,000 years. Historical variability in discharge was assessed as the ratio of 10% to 90% flow exceedance (R10/90) and ranged from 2.8 to 1.1 for select springs with available discharge data. The lag time (i.e., the delay in discharge response to drought conditions) was determined by cross-correlation analysis and ranged from 0.5 to 6 years for the same select springs. Springs with shorter MTTs (<80 years) statistically correlate with larger discharge variations and faster responses to drought, indicating that MTT can be used for estimating the relative magnitude and timing of groundwater response. Results indicate that groundwater discharge to streams in the UCRB will likely respond on the order of years to climate variation and increasing groundwater withdrawals.
Distributed Secure Coordinated Control for Multiagent Systems Under Strategic Attacks.
Feng, Zhi; Wen, Guanghui; Hu, Guoqiang
2017-05-01
This paper studies a distributed secure consensus tracking control problem for multiagent systems subject to strategic cyber attacks modeled by a random Markov process. A hybrid stochastic secure control framework is established for designing a distributed secure control law such that mean-square exponential consensus tracking is achieved. A connectivity restoration mechanism is considered and the properties of attack frequency and attack length rate are investigated, respectively. Based on the solutions of an algebraic Riccati equation and an algebraic Riccati inequality, a procedure to select the control gains is provided and stability analysis is carried out using Lyapunov's method. The effect of strategic attacks on discrete-time systems is also investigated. Finally, numerical examples are provided to illustrate the effectiveness of the theoretical analysis.
A rapid local singularity analysis algorithm with applications
NASA Astrophysics Data System (ADS)
Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits
2015-04-01
The local singularity model developed by Cheng is fast gaining popularity in characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, one of the conventional algorithms, which involves computing moving-average values at different scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast algorithm used within the Viola-Jones object detection framework in computer vision. Historically, the principle of the SAT is well known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (the area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variant, the Rotated Summed Area Table, into isotropic, anisotropic or directional local singularity mapping. Once the SAT is computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only 4 array accesses, in constant time independent of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, MATLAB and C++ were implemented to serve different applications, especially big data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear spacing or log spacing) for non-iterative or iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
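The constant-time region sum described above can be sketched in a few lines of NumPy. The four-access formula is standard; the array here is a toy stand-in for a geochemical grid:

```python
import numpy as np

def summed_area_table(img):
    """Cumulative 2D sum: sat[i, j] = sum of img[:i+1, :j+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def region_sum(sat, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] using at most 4 table accesses (O(1))."""
    total = sat[r1, c1]
    if r0 > 0:
        total -= sat[r0 - 1, c1]
    if c0 > 0:
        total -= sat[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += sat[r0 - 1, c0 - 1]   # add back the doubly subtracted corner
    return total

img = np.arange(16.0).reshape(4, 4)
sat = summed_area_table(img)
print(region_sum(sat, 1, 1, 2, 2))   # sum of the central 2x2 block: 30.0
print(img[1:3, 1:3].sum())           # same value by direct summation: 30.0
```

Because every window sum costs the same regardless of window size, the moving-average step of the singularity index becomes independent of the scale being evaluated.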
da Silva, Marjorie; Noll, Fernando Barbosa; E Castro, Adriana C Morales-Corrêa
2018-01-01
Swarm-founding wasps are endemic and common representatives of the neotropical fauna and compose an interesting social tribe of vespids, presenting both complex social characteristics and uncommon traits for a eusocial group, such as the absence of castes with distinct morphology. The paper wasp Protonectarina sylveirae (Saussure) presents a broad distribution across Brazil, Argentina and Paraguay, occurring widespread in the Atlantic rainforest and arboreal Caatinga and being absent in the Amazon region. Given this peculiar distribution among swarm-founding wasps, an integrative approach to reconstruct the evolutionary history of P. sylveirae in a spatial-temporal framework was performed to investigate: the presence of genetic structure and its relationship with the geography, the evolution of distinct morphological lineages, and the possible historical event(s) in the Neotropical region which could explain the observed phylogeographic pattern. Individuals of P. sylveirae were obtained from populations in 16 areas throughout its distribution for DNA extraction and amplification of the mitochondrial genes 12S, 16S and COI. Analysis of genetic diversity, construction of a haplotype network, analysis of population structure and dating analysis of divergence times were performed. A morphometric analysis was also performed, using eight body measurements of adult workers, to test whether there are morphological distinctions among populations. Thirty-five haplotypes were identified, most of them exclusive to a single group, and high population structure was found. The possibility of genetic divergence because of isolation by distance was rejected. Morphological analysis pointed to great uniformity in phenotypes, with only a small degree of differentiation between the southern populations and the remaining ones. Divergence time analysis showed a Middle/Late Miocene origin, a period when an extensive marine ingression occurred in South America.
Divergence of haplogroups began from the Plio/Pleistocene boundary and the last glacial maximum most likely modeled the current distribution of species, even though it was not the cause of genetic breaks. PMID:29538451
pyCTQW: A continuous-time quantum walk simulator on distributed memory computers
NASA Astrophysics Data System (ADS)
Izaac, Josh A.; Wang, Jingbo B.
2015-01-01
In the general field of quantum information and computation, quantum walks are playing an increasingly important role in constructing physical models and quantum algorithms. We have recently developed a distributed memory software package pyCTQW, with an object-oriented Python interface, that allows efficient simulation of large multi-particle CTQW (continuous-time quantum walk)-based systems. In this paper, we present an introduction to the Python and Fortran interfaces of pyCTQW, discuss various numerical methods of calculating the matrix exponential, and demonstrate the performance behavior of pyCTQW on a distributed memory cluster. In particular, the Chebyshev and Krylov-subspace methods for calculating the quantum walk propagation are provided, as well as methods for visualization and data analysis.
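The propagation that pyCTQW accelerates can be illustrated at toy scale with a dense matrix exponential. This sketch is not the pyCTQW API, just the underlying mathematics (|ψ(t)⟩ = exp(−iHt)|ψ(0)⟩) on a 4-vertex cycle; the dense expm here stands in for the Chebyshev/Krylov propagators the package uses at scale:

```python
import numpy as np
from scipy.linalg import expm

# Hamiltonian: adjacency matrix of a 4-vertex cycle graph.
n = 4
H = np.zeros((n, n))
for v in range(n):
    H[v, (v + 1) % n] = H[(v + 1) % n, v] = 1.0

psi0 = np.zeros(n, dtype=complex)
psi0[0] = 1.0                        # walker starts on vertex 0

t = 1.5
psi_t = expm(-1j * H * t) @ psi0     # continuous-time quantum walk propagation
prob = np.abs(psi_t) ** 2
print(np.round(prob, 4))             # vertex occupation probabilities
```

Unitary evolution preserves the norm, and the cycle's reflection symmetry makes vertices 1 and 3 equally likely; both properties are cheap sanity checks for any propagator implementation.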
In Situ Distribution Guided Analysis and Visualization of Transonic Jet Engine Simulations.
Dutta, Soumya; Chen, Chun-Ming; Heinlein, Gregory; Shen, Han-Wei; Chen, Jen-Ping
2017-01-01
Study of flow instability in turbine engine compressors is crucial to understand the inception and evolution of engine stall. Aerodynamics experts have been working on detecting the early signs of stall in order to devise novel stall suppression technologies. A state-of-the-art Navier-Stokes based, time-accurate computational fluid dynamics simulator, TURBO, has been developed at NASA to enhance the understanding of flow phenomena undergoing rotating stall. Despite the proven high modeling accuracy of TURBO, the volume of simulation data it produces makes post-hoc analysis prohibitive in both storage and I/O time. To address these issues and allow the expert to perform scalable stall analysis, we have designed an in situ distribution-guided stall analysis technique. Our method summarizes statistics of important properties of the simulation data in situ using a probabilistic data modeling scheme. This data summarization enables statistical anomaly detection for flow instability in post analysis, which reveals the spatiotemporal trends of rotating stall for the expert to conceive new hypotheses. Furthermore, the verification of the hypotheses and exploratory visualization using the summarized data are realized using probabilistic visualization techniques such as uncertain isocontouring. Positive feedback from the domain scientist has indicated the efficacy of our system in exploratory stall analysis.
Time-Frequency Signal Analysis And Synthesis The Choice Of A Method And Its Application
NASA Astrophysics Data System (ADS)
Boashash, Boualem
1988-02-01
In this paper, the problem of choosing a method for time-frequency signal analysis is discussed. It is shown that a natural approach leads to the introduction of the concepts of the analytic signal and instantaneous frequency. The Wigner-Ville Distribution (WVD) is a method of analysis based upon these concepts and it is shown that an accurate Time-Frequency representation of a signal can be obtained by using the WVD for the analysis of a class of signals referred to as "asymptotic". For this class of signals, the instantaneous frequency describes an important physical parameter characteristic of the process under investigation. The WVD procedure for signal analysis and synthesis is outlined and its properties are reviewed for deterministic and random signals.
Moutsopoulou, Karolina; Waszak, Florian
2012-04-01
The differential effects of task and response conflict in priming paradigms where associations are strengthened between a stimulus, a task, and a response have been demonstrated in recent years with neuroimaging methods. However, such effects are not easily disentangled with only measurements of behavior, such as reaction times (RTs). Here, we report the application of ex-Gaussian distribution analysis on task-switching RT data and show that conflict related to stimulus-response associations retrieved after a switch of tasks is reflected in the Gaussian component. By contrast, conflict related to the retrieval of stimulus-task associations is reflected in the exponential component. Our data confirm that the retrieval of stimulus-task and -response associations affects behavior differently. Ex-Gaussian distribution analysis is a useful tool for pulling apart these different levels of associative priming that are not distinguishable in analyses of RT means.
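The decomposition into Gaussian and exponential components can be sketched with SciPy's exponnorm distribution, which parameterizes the ex-Gaussian by the shape K = τ/σ. The RT values below are synthetic, not the study's task-switching data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic RTs (ms): a Gaussian stage (mu, sigma) plus an exponential
# stage (tau), the standard ex-Gaussian generative model.
mu, sigma, tau = 450.0, 40.0, 120.0
rt = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

# Maximum-likelihood fit; scipy's exponnorm uses shape K = tau / sigma.
K, loc, scale = stats.exponnorm.fit(rt)
tau_hat = K * scale
print(f"mu~{loc:.0f}  sigma~{scale:.0f}  tau~{tau_hat:.0f}")
```

In an analysis like the one described, the fitted μ and σ (Gaussian component) and τ (exponential component) are compared across conditions, so that effects loading on one component can be distinguished from effects on the other.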
Characterizations of particle size distribution of the droplets exhaled by sneeze
Han, Z. Y.; Weng, W. G.; Huang, Q. Y.
2013-01-01
This work focuses on the size distribution of sneeze droplets exhaled immediately at the mouth. Twenty healthy subjects participated in the experiment and 44 sneezes were measured using a laser particle size analyser. Two types of distributions are observed: unimodal and bimodal. For each sneeze, the droplets exhaled at different times in the sneeze duration have the same distribution characteristics, with good time stability. The volume-based size distributions of sneeze droplets can be represented by a lognormal distribution function, and the relationship between the distribution parameters and the physiological characteristics of the subjects is studied using linear regression analysis. The geometric mean of the droplet size of all the subjects is 360.1 µm for the unimodal distribution and 74.4 µm for the bimodal distribution, with geometric standard deviations of 1.5 and 1.7, respectively. For the two peaks of the bimodal distribution, the geometric mean (geometric standard deviation) is 386.2 µm (1.8) for peak 1 and 72.0 µm (1.5) for peak 2. The influences of the measurement method, the limitations of the instrument, the evaporation effects of the droplets, and the differences in biological dynamic mechanism and characteristics between sneezes and other respiratory activities are also discussed. PMID:24026469
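A geometric mean and geometric standard deviation fully specify a lognormal model, so the reported unimodal parameters (360.1 µm, GSD 1.5) can be turned into a working distribution directly. A small sketch:

```python
import numpy as np
from scipy import stats

# Unimodal case from the abstract: geometric mean 360.1 um, GSD 1.5.
gm, gsd = 360.1, 1.5
mu, sigma = np.log(gm), np.log(gsd)       # lognormal parameters

d = stats.lognorm(s=sigma, scale=np.exp(mu))
print(round(d.median(), 1))               # the median equals the geometric mean
within = d.cdf(gm * gsd) - d.cdf(gm / gsd)
print(round(within, 3))                   # fraction within one "geometric sigma"
```

For a lognormal distribution the interval [GM/GSD, GM·GSD] always contains about 68.3% of the volume, mirroring the ±1σ rule of the underlying normal in log space.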
ERIC Educational Resources Information Center
Kessel, Robert; Lucke, Robert L.
2008-01-01
Shull, Gaynor and Grimes advanced a model for interresponse time distribution using probabilistic cycling between a higher-rate and a lower-rate response process. Both response processes are assumed to be random in time with a constant rate. The cycling between the two processes is assumed to have a constant transition probability that is…
NASA Astrophysics Data System (ADS)
Lu, Hongbin; Chen, Can; Wang, Zhanwen; Qu, Jin; Xu, Daqi; Wu, Tianding; Cao, Yong; Zhou, Jingyong; Zheng, Cheng; Hu, Jianzhong
2015-09-01
Tendon attaches to bone through a functionally graded fibrocartilage zone, including uncalcified fibrocartilage (UF), tidemark (TM) and calcified fibrocartilage (CF). This transition zone plays a pivotal role in relaxing load transfer between tendon and bone, and serves as a boundary between otherwise structurally and functionally distinct tissue types. Calcium and zinc are believed to play important roles in the normal growth, mineralization, and repair of the fibrocartilage zone of the bone-tendon junction (BTJ). However, the spatial distributions of calcium and zinc in the fibrocartilage zone of the BTJ and their distribution-function relationship are not fully understood. Thus, synchrotron radiation-based micro X-ray fluorescence analysis (SR-μXRF) in combination with backscattered electron imaging (BEI) was employed to characterize the distributions of calcium and zinc in the fibrocartilage zone of the rabbit patella-patellar tendon complex (PPTC). For the first time, the unique distributions of calcium and zinc in the fibrocartilage zone of the PPTC were clearly mapped by this method. The distributions of calcium and zinc in the fibrocartilage zone of the PPTC were inhomogeneous. A significant accumulation of zinc was exhibited in the transition region between UF and CF. The highest zinc content (3.17 times that of the patellar tendon) was found in the TM of the fibrocartilage zone. The calcium content began to increase near the TM and increased exponentially across the calcified fibrocartilage region towards the patella. The highest calcium content (43.14 times that of the patellar tendon) was found in the transitional zone between the calcified fibrocartilage region and the patella, approximately 69 μm from the location with the highest zinc content. This study indicated, for the first time, that there is a differential distribution of calcium and zinc in the fibrocartilage zone of the PPTC.
These observations reveal new insights into region-dependent changes across the fibrocartilage zone of BTJ and will serve as critical benchmark parameters for current efforts in BTJ repair.
Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity
NASA Astrophysics Data System (ADS)
Tanaka, Hiroki; Aizawa, Yoji
2017-02-01
The interoccurrence time statistics of seismicity are studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known that the interoccurrence time statistics are well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), in which the conditional probability is described by two kinds of correlation coefficients: one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from numerical data analysis carried out with the Preliminary Determination of Epicenters (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is used to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all numerical data in our analysis well, where several common features, or invariant aspects, are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving-time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
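The Weibull approximation mentioned above can be sketched as a maximum-likelihood fit on synthetic waiting times; the parameters below are illustrative, not values fitted to the PDE or JMA catalogs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic interoccurrence times drawn from a Weibull distribution.
# A shape parameter below 1 gives the clustered, bursty waiting-time
# behaviour typically reported for seismicity.
shape_true, scale_true = 0.8, 100.0
waits = scale_true * rng.weibull(shape_true, 5000)

# Maximum-likelihood fit with the location fixed at zero.
shape, loc, scale = stats.weibull_min.fit(waits, floc=0.0)
print(f"shape~{shape:.2f}  scale~{scale:.0f}")
```

In an analysis like the one described, such fits would be repeated for event subsets above different magnitude thresholds, and the dependence of the fitted shape and scale on magnitude is what the multi-fractal relation organizes.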
Comparison of Salmonella enteritidis phage types isolated from layers and humans in Belgium in 2005.
Welby, Sarah; Imberechts, Hein; Riocreux, Flavien; Bertrand, Sophie; Dierick, Katelijne; Wildemauwe, Christa; Hooyberghs, Jozef; Van der Stede, Yves
2011-08-01
The aim of this study was to investigate the available results for Belgium of the European Union coordinated monitoring program (2004/665 EC) on Salmonella in layers in 2005, as well as the results of the monthly outbreak reports of Salmonella Enteritidis in humans in 2005, to identify possible statistically significant trends in both populations. Separate descriptive statistics and univariate analyses were carried out, and parametric and/or non-parametric hypothesis tests were conducted. A time cluster analysis was performed for all Salmonella Enteritidis phage types (PTs) isolated. The proportions of each Salmonella Enteritidis PT in layers and in humans were compared, and the monthly distribution of the most common PT isolated in both populations was evaluated. The time cluster analysis revealed significant clusters during the months May and June for layers and May, July, August, and September for humans. PT21, the most frequently isolated PT in both populations in 2005, seemed to be responsible for these significant clusters. PT4 was the second most frequently isolated PT. No significant difference was found in the monthly trend evolution of either PT in the two populations based on parametric and non-parametric methods. A similar monthly trend of PT distribution in humans and layers during the year 2005 was observed. The time cluster analysis and the statistical significance testing confirmed these results. Moreover, the time cluster analysis showed significant clusters during the summer, slightly delayed in time (humans after layers). These results suggest a common link between the prevalence of Salmonella Enteritidis in layers and the occurrence of the pathogen in humans. Phage typing was confirmed to be a useful tool for identifying temporal trends.
Distributed collaborative response surface method for mechanical dynamic assembly reliability design
NASA Astrophysics Data System (ADS)
Bai, Guangchen; Fei, Chengwei
2013-11-01
Because of the randomness of the many factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to be accomplished from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, the mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function, and verified by the assembly relationship reliability analysis of an aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Through comparison of the DCRSM, the traditional response surface method (RSM) and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish computational tasks that are infeasible for the other methods when the number of simulations exceeds 100,000, but also that its computational precision is basically consistent with the MCM and improved by 0.40–4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is about 188 times that of the MCM and 55 times that of the RSM under 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
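The response-surface idea underlying the DCRSM can be sketched minimally: fit a quadratic surface to an expensive limit-state function at a few design points, then run cheap Monte Carlo on the surrogate. This is not the authors' implementation; the clearance function, its coefficients, and the failure threshold below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical limit-state function, standing in for an expensive clearance model.
def clearance(x):
    return 1.5 + 0.3 * x[..., 0] - 0.2 * x[..., 1] + 0.05 * x[..., 0] * x[..., 1]

# Design points for fitting the quadratic response surface.
X = rng.normal(size=(200, 2))
y = clearance(X)

# Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2.
def basis(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2], axis=-1)

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# Monte Carlo on the cheap surrogate instead of the expensive model.
samples = rng.normal(size=(100_000, 2))
approx = basis(samples) @ coef
pf = np.mean(approx < 1.0)   # failure-probability estimate for a threshold of 1.0
print(f"estimated failure probability: {pf:.4f}")
```

Because each surrogate evaluation is a dot product, very large Monte Carlo sample counts become affordable, which is the efficiency gain the abstract quantifies.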
The influence of wildfires on aerosol size distributions in rural areas.
Alonso-Blanco, E; Calvo, A I; Fraile, R; Castro, A
2012-01-01
The number of particles and their size distributions were measured in a rural area, during the summer, using a PCASP-X. The aim was to study the influence of wildfires on particle size distributions. The comparative studies carried out reveal an average tenfold increase in the number of particles in the fine mode, especially in sizes between 0.10 and 0.14 μm, where the increase is nearly twentyfold. An analysis carried out at three different points in time (before, during, and after the passing of the smoke plume from the wildfires) shows that the mean geometric diameter of the fine mode in the measurements affected by the fire, with average values of 0.11 μm, is smaller than the one obtained in the measurements carried out immediately before and after (0.14 μm).
Aab, Alexander
2015-03-30
In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the E > 8 EeV energy bin, with an amplitude for the first harmonic in right ascension r1(α) = (4.4 ± 1.0) × 10^(-2), which has a chance probability P(≥ r1(α)) = 6.4 × 10^(-5), reinforcing the hint previously reported with vertical events alone.
A reexamination of plasma measurements from the Mariner 5 Venus encounter
NASA Technical Reports Server (NTRS)
Shefer, R. E.; Lazarus, A. J.; Bridge, H. S.
1979-01-01
Mariner 5 plasma data from the Venus encounter have been analyzed with twice the time resolution of the original analysis of Bridge et al. (1967). The velocity distribution function for each spectrum is used to determine more precisely the locations of boundaries and characteristic flow parameters in the interaction region around the planet. A new region is identified in the flow, located between magnetosheathlike plasma inside the shock front and an interior low-flux region near the geometrical shadow of the planet. The region is characterized by a wide velocity distribution function and a decrease in ion flux. On the basis of the highest-time-resolution magnetic field data, it is proposed that rapid magnetic field fluctuations in this region may result in an artificial broadening of the distribution function. It is concluded that very high time resolution is required in future experiments in order to determine the true nature of the plasma in this region.
Ocean Surface Wave Optical Roughness - Analysis of Innovative Measurements
2012-09-30
goals of the program are to (1) examine time-dependent oceanic radiance distribution in relation to dynamic surface boundary layer (SBL) processes; (2) … RESULTS: An overview of results is provided by Zappa et al. [2012] and Dickey et al. [2012]. … TOGA-COARE and air-sea fluxes … time series …
NASA Astrophysics Data System (ADS)
Lu, Xiaodong; Arfaoui, Helene; Mori, Kinji
In the highly dynamic electronic commerce environment, the need for adaptability and rapid response times in information service systems has become increasingly important. In order to cope with continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that brings high assurance of services through a recursive, demand-oriented provision of the most popular information closer to the users, trading off the cost of information service allocation against the cost of access. In this paper, based on an analysis of the relationship among the user distribution, information provision and access time, we propose a technology for FIF design that resolves the competing requirements of users and providers to improve users' access time. In addition, to achieve dynamic load balancing under changing user preferences, an autonomous information reallocation technology is proposed. We demonstrate the effectiveness of the proposed technology through simulation and comparison with a conventional system.
Temporal scaling and spatial statistical analyses of groundwater level fluctuations
NASA Astrophysics Data System (ADS)
Sun, H.; Yuan, L., Sr.; Zhang, Y.
2017-12-01
Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, and their statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analysis. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying fractal-scaling behavior that changes with time and location. Hence, we can distinguish a potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increments of groundwater level fluctuations exhibit a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can depict the transient dynamics (i.e., the fractal, non-Gaussian property) of groundwater levels well, while fractional Brownian motion is inadequate for describing natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters. This problem is solved in this paper. The time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At sufficiently small probability levels, the distributions show arbitrarily large deviations from the model of a normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7–0.9. The analysis leads to the conclusion that a model based on a Poisson random process is applicable for the statistical description of the variations {δfoF2} and for probabilistic estimates of these variations during heliogeophysical disturbances.
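The kind of Kolmogorov-criterion comparison described above can be illustrated with synthetic data. This is a generic sketch, not the paper's Poisson-based model: Gaussian draws stand in for quiet-time deviations and a heavy-tailed Student t sample stands in for disturbed-time deviations; the former passes a KS test against a fitted normal while the latter fails it:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical quiet-time and disturbed-time deviation samples (stand-ins).
quiet = rng.normal(0.0, 1.0, size=1500)
disturbed = stats.t.rvs(df=3, size=1500, random_state=rng)  # heavy-tailed

def gaussian_ks(sample):
    """KS p-value of the sample against a normal with moment-matched parameters."""
    mu, sigma = sample.mean(), sample.std(ddof=1)
    return stats.kstest(sample, stats.norm(mu, sigma).cdf).pvalue

p_quiet = gaussian_ks(quiet)
p_disturbed = gaussian_ks(disturbed)
print(f"quiet p={p_quiet:.3f}, disturbed p={p_disturbed:.2e}")
```

The rejection of the Gaussian model for the heavy-tailed sample mirrors the paper's observation that disturbed-period distributions depart from the normal law.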
Climate change and fishing: a century of shifting distribution in North Sea cod
Engelhard, Georg H; Righton, David A; Pinnegar, John K
2014-01-01
Globally, spatial distributions of fish stocks are shifting, but although the role of climate change in range shifts is increasingly appreciated, little is known of the likely additional impact that high levels of fishing pressure might have on distribution. For North Sea cod, we show for the first time and in great spatial detail how the stock has shifted its distribution over the past 100 years. We digitized extensive historical fisheries data from paper charts in UK government archives and combined these with contemporary data into a time-series spanning 1913–2012 (excluding both World Wars). New analysis of old data revealed that the current distribution pattern of cod – mostly in the deeper, northern- and north-easternmost parts of the North Sea – is almost opposite to that during most of the twentieth century – mainly concentrated in the west, off England and Scotland. Statistical analysis revealed that the deepening, northward shift is likely attributable to warming; however, the eastward shift is best explained by fishing pressure, suggestive of significant depletion of the stock from its previous stronghold off the coasts of England and Scotland. These spatial patterns were confirmed for the most recent 3½ decades by data from fisheries-independent surveys, which go back to the 1970s. Our results demonstrate the fundamental importance of both climate change and fishing pressure for our understanding of changing distributions of commercially exploited fish. PMID:24375860
Empirical behavior of a world stock index from intra-day to monthly time scales
NASA Astrophysics Data System (ADS)
Breymann, W.; Lüthi, D. R.; Platen, E.
2009-10-01
Most of the papers that study the distributional and fractal properties of financial instruments focus on stock prices or foreign exchange rates. This typically leads to mixed results concerning the distributions of log-returns and some multi-fractal properties of exchange rates, stock prices, and regional indices. This paper uses a well diversified world stock index as the central object of analysis. Such an index approximates the growth optimal portfolio which, as demonstrated under the benchmark approach, is the ideal reference unit for studying basic securities. When denominating this world index in units of a given currency, one measures the movements of the currency against the entire market. This provides a least disturbed observation of the currency dynamics. In this manner, one can expect to disentangle, e.g., the superposition of the two currencies involved in an exchange rate. This benchmark approach to the empirical analysis of financial data allows us to establish remarkable stylized facts. Most important is the observation that the repeatedly documented multi-fractal appearance of financial time series is very weak and much less pronounced than the deviation of the mono-scaling properties from Brownian-motion-type scaling. The generalized Hurst exponent H(2) assumes typical values between 0.55 and 0.6. Accordingly, autocorrelations of log-returns decay according to a power law, and the quadratic variation vanishes as the observation time step size goes to zero. Furthermore, one can identify the Student t distribution as the log-return distribution of a well-diversified world stock index for long time horizons when a long enough data series is used for estimation. The study of dependence properties, finally, reveals that jumps at the daily horizon originate primarily in the stock market, while at the 5-min horizon they originate in the foreign exchange market.
The principal message of the empirical analysis is that there is evidence that a diffusion model without multi-scaling could reasonably well model the dynamics of a broadly diversified world stock index.
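The generalized Hurst exponent H(q) used in the analysis can be estimated from the scaling of the q-th absolute moments of increments, E|x(t+τ) − x(t)|^q ∝ τ^(qH(q)). A minimal sketch on a synthetic Brownian path (for which H(2) should be near 0.5; the paper reports 0.55–0.6 for the world index); the path length, lag range, and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in log-price path: Brownian motion, so H(2) should be close to 0.5.
log_price = np.cumsum(rng.normal(0.0, 0.01, size=20000))

def generalized_hurst(x, q=2.0, lags=range(2, 100)):
    """Estimate H(q) from the scaling E|x(t+tau)-x(t)|^q ~ tau^(q*H(q))."""
    taus = np.array(list(lags))
    moments = np.array([np.mean(np.abs(x[t:] - x[:-t]) ** q) for t in taus])
    slope = np.polyfit(np.log(taus), np.log(moments), 1)[0]
    return slope / q

H2 = generalized_hurst(log_price, q=2.0)
print(f"H(2) = {H2:.3f}")
```

Estimating H(q) for several values of q and checking whether it varies with q is the basic diagnostic for multi-scaling discussed in the abstract.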
Body size distributions signal a regime shift in a lake ecosystem
Spanbauer, Trisha; Allen, Craig R.; Angeler, David G.; Eason, Tarsha; Fritz, Sherilyn C.; Garmestani, Ahjond S.; Nash, Kirsty L.; Stone, Jeffery R.; Stow, Craig A.; Sundstrom, Shana M.
2016-01-01
Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this study, we assessed whether body size patterns serve as an indicator of a threshold between alternative regimes. Over the past 7000 years, the biological communities of Foy Lake (Montana, USA) have undergone a major regime shift owing to climate change. We used a palaeoecological record of diatom communities to estimate diatom sizes, and then analysed the discontinuous distribution of organism sizes over time. We used Bayesian classification and regression tree models to determine that all time intervals exhibited aggregations of sizes separated by gaps in the distribution and found a significant change in diatom body size distributions approximately 150 years before the identified ecosystem regime shift. We suggest that discontinuity analysis is a useful addition to the suite of tools for the detection of early warning signals of regime shifts.
Voltage stress effects on microcircuit accelerated life test failure rates
NASA Technical Reports Server (NTRS)
Johnson, G. M.
1976-01-01
The applicability of the Arrhenius and Eyring reaction rate models for describing microcircuit aging characteristics as a function of junction temperature and applied voltage was evaluated. The results of a matrix of accelerated life tests with a single metal oxide semiconductor microcircuit operated at six different combinations of temperature and voltage were used to evaluate the models. A total of 450 devices from two different lots were tested at ambient temperatures between 200 °C and 250 °C and applied voltages between 5 Vdc and 15 Vdc. A statistical analysis of the surface-related failure data resulted in bimodal failure distributions comprising two lognormal distributions: a 'freak' distribution observed early in time, and a 'main' distribution observed later in time. The Arrhenius model was shown to provide a good description of device aging as a function of temperature at a fixed voltage. The Eyring model also appeared to provide a reasonable description of main-distribution device aging as a function of temperature and voltage. Circuit diagrams are shown.
Colloquium: Statistical mechanics of money, wealth, and income
NASA Astrophysics Data System (ADS)
Yakovenko, Victor M.; Rosser, J. Barkley, Jr.
2009-10-01
This Colloquium reviews statistical models for money, wealth, and income distributions developed in the econophysics literature since the late 1990s. By analogy with the Boltzmann-Gibbs distribution of energy in physics, it is shown that the probability distribution of money is exponential for certain classes of models with interacting economic agents. Alternative scenarios are also reviewed. Data analysis of the empirical distributions of wealth and income reveals a two-class distribution. The majority of the population belongs to the lower class, characterized by the exponential (“thermal”) distribution, whereas a small fraction of the population in the upper class is characterized by the power-law (“superthermal”) distribution. The lower part is very stable, stationary in time, whereas the upper part is highly dynamical and out of equilibrium.
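The exponential money distribution can be reproduced with a toy agent-based model of the kind reviewed in the Colloquium. The specific exchange rule below (random reshuffle of a pair's combined money) is one classic choice from this literature, used here purely as an illustration; the agent count and step count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)

n_agents, steps = 5000, 200_000
money = np.full(n_agents, 100.0)  # everyone starts equal; total money is conserved

for _ in range(steps):
    i, j = rng.integers(0, n_agents, size=2)
    if i == j:
        continue
    # Random reshuffle of the pair's combined money (a classic exchange rule).
    pot = money[i] + money[j]
    share = rng.random() * pot
    money[i], money[j] = share, pot - share

# Boltzmann-Gibbs prediction: exponential with "temperature" = mean money,
# for which the standard deviation equals the mean.
print(f"mean={money.mean():.1f}, std={money.std():.1f}")
```

The ratio of standard deviation to mean approaching 1 is the signature of the exponential ("thermal") equilibrium distribution described in the abstract.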
Extended Poisson process modelling and analysis of grouped binary data.
Faddy, Malcolm J; Smith, David M
2012-05-01
A simple extension of the Poisson process results in binomially distributed counts of events in a time interval. A further extension generalises this to probability distributions under- or over-dispersed relative to the binomial distribution. Substantial levels of under-dispersion are possible with this modelling, but only modest levels of over-dispersion - up to Poisson-like variation. Although simple analytical expressions for the moments of these probability distributions are not available, approximate expressions for the mean and variance are derived, and used to re-parameterise the models. The modelling is applied in the analysis of two published data sets, one showing under-dispersion and the other over-dispersion. More appropriate assessment of the precision of estimated parameters and reliable model checking diagnostics follow from this more general modelling of these data sets.
Krishnan, Rohin J; Uruthiramoorthy, Lavanya; Jawaid, Noor; Steele, Margaret; Jones, Douglas L
2018-01-01
The Schulich School of Medicine & Dentistry in London, Ontario, has a mentorship program for all full-time faculty. The school would like to expand its outreach to physician faculty located at distributed medical education sites. The purpose of this study was to determine what mentorship, if any, distributed physician faculty currently have, to gauge their interest in expanding the mentorship program to distributed physician faculty, and to determine their vision of the most appropriate design of a mentorship program that would address their needs. We conducted a mixed-methods study. The quantitative phase consisted of surveys sent to all distributed faculty members that elicited information on basic demographic characteristics and mentorship experiences/needs. The qualitative phase consisted of 4 focus groups of distributed faculty administered in 2 large and 2 small centres in both regions of the school's distributed education network: Sarnia, Leamington, Stratford and Hanover. Interviews were 90 minutes long and involved standardized semistructured questions. Of the 678 surveys sent, 210 (31.0%) were returned. Most respondents (136 [64.8%]) were men, and almost half (96 [45.7%]) were family physicians. Most respondents (197 [93.8%]) were not formal mentors to Schulich faculty, and 178 (84.8%) were not currently being formally mentored. Qualitative analysis suggested that many respondents were involved in informal mentoring. In addition, about half of the respondents (96 [45.7%]) wished to be formally mentored in the future, but they may be inhibited by time constraints and geographical isolation. Consistently, respondents wished to be mentored by a colleague in a similar practice, with one-on-one mentoring seen as the most practical format. Our analysis suggests that the school's current formal mentoring program may not be directly applicable and will require modification to address the needs of distributed faculty.
Sinchenko, Elena; Gibbs, W E Keith; Davis, Claire E; Stoddart, Paul R
2010-11-20
A distributed optical-fiber sensing system based on pulsed excitation and time-gated photon counting has been used to locate a fluorescent region along the fiber. The complex Alq3 and the infrared dye IR-125 were examined with 405 and 780 nm excitation, respectively. A model to characterize the response of the distributed fluorescence sensor to a Gaussian input pulse was developed and tested. Analysis of the Alq3 fluorescent response confirmed the validity of the model and enabled the fluorescence lifetime to be determined. The intrinsic lifetime obtained (18.2±0.9 ns) is in good agreement with published data. The decay rate was found to be proportional to concentration, which is indicative of collisional deactivation. The model allows the spatial resolution of a distributed sensing system to be improved for fluorophores with lifetimes that are longer than the resolution of the sensing system.
Sun, J
1995-09-01
In this paper we discuss the non-parametric estimation of a distribution function based on incomplete data for which the measurement origin of a survival time or the date of enrollment in a study is known only to belong to an interval. The survival time of interest itself is also observed from a truncated distribution and is known only to lie in an interval. To estimate the distribution function, a simple self-consistency algorithm, a generalization of Turnbull's (1976, Journal of the Royal Statistical Society, Series B 38, 290-295) self-consistency algorithm, is proposed. This method is then used to analyze two AIDS cohort studies, for which direct use of the EM algorithm (Dempster, Laird and Rubin, 1977, Journal of the Royal Statistical Society, Series B 39, 1-38), which is computationally complicated, has previously been the usual method of analysis.
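The self-consistency iteration for interval-censored data has a compact numerical form. The sketch below is a simplified illustration, not the paper's generalized algorithm: the intervals are hypothetical, and a fixed grid of support points stands in for Turnbull's innermost intervals:

```python
import numpy as np

# Hypothetical interval-censored observations: each survival time is only
# known to lie in [left, right].
intervals = np.array([[0, 2], [1, 3], [2, 4], [0, 1], [3, 5], [1, 4]], float)

# Candidate support points for the probability mass (a simplification of
# Turnbull's innermost intervals).
support = np.arange(0.5, 5.0, 1.0)

# alpha[i, j] = 1 if support point j lies inside observation i's interval.
alpha = ((intervals[:, :1] <= support) & (support <= intervals[:, 1:])).astype(float)

p = np.full(len(support), 1.0 / len(support))
for _ in range(500):  # self-consistency iteration (an EM fixed point)
    denom = alpha @ p                                   # P(observation i's interval)
    p_new = (alpha * p).T @ (1.0 / denom) / len(intervals)
    if np.max(np.abs(p_new - p)) < 1e-10:
        p = p_new
        break
    p = p_new

print("estimated masses:", np.round(p, 3))
```

Each pass redistributes every observation's unit mass over the support points its interval covers, in proportion to the current estimate; the fixed point is the non-parametric MLE on this grid.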
Time-dependent Hartree-Fock approach to nuclear ``pasta'' at finite temperature
NASA Astrophysics Data System (ADS)
Schuetrumpf, B.; Klatt, M. A.; Iida, K.; Maruhn, J. A.; Mecke, K.; Reinhard, P.-G.
2013-05-01
We present simulations of neutron-rich matter at subnuclear densities, like supernova matter, with the time-dependent Hartree-Fock approximation at temperatures of several MeV. The initial state consists of α particles randomly distributed in space that have a Maxwell-Boltzmann distribution in momentum space. Adding a neutron background initialized with Fermi distributed plane waves the calculations reflect a reasonable approximation of astrophysical matter. This matter evolves into spherical, rod-like, and slab-like shapes and mixtures thereof. The simulations employ a full Skyrme interaction in a periodic three-dimensional grid. By an improved morphological analysis based on Minkowski functionals, all eight pasta shapes can be uniquely identified by the sign of only two valuations, namely the Euler characteristic and the integral mean curvature. In addition, we propose the variance in the cell density distribution as a measure to distinguish pasta matter from uniform matter.
Finite-key analysis for measurement-device-independent quantum key distribution.
Curty, Marcos; Xu, Feihu; Cui, Wei; Lim, Charles Ci Wen; Tamaki, Kiyoshi; Lo, Hoi-Kwong
2014-04-29
Quantum key distribution promises unconditionally secure communications. However, as practical devices tend to deviate from their specifications, the security of some practical systems is no longer valid. In particular, an adversary can exploit imperfect detectors to learn a large part of the secret key, even though the security proof claims otherwise. Recently, a practical approach--measurement-device-independent quantum key distribution--has been proposed to solve this problem. However, so far its security has only been fully proven under the assumption that the legitimate users of the system have unlimited resources. Here we fill this gap and provide a rigorous security proof against general attacks in the finite-key regime. This is obtained by applying large deviation theory, specifically the Chernoff bound, to perform parameter estimation. For the first time we demonstrate the feasibility of long-distance implementations of measurement-device-independent quantum key distribution within a reasonable time frame of signal transmission.
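The role of a concentration bound in finite-key analysis can be illustrated with the simpler Hoeffding inequality; the paper itself applies the Chernoff bound, so this sketch only shows the generic point that the statistical fluctuation term shrinks with the number of signals:

```python
import math

def deviation_bound(n, eps):
    """Hoeffding-style fluctuation term: with probability at least 1 - eps,
    the true rate differs from the observed rate by less than this t."""
    return math.sqrt(math.log(1.0 / eps) / (2.0 * n))

# Finite-key flavour: the penalty on parameter estimation shrinks as the
# number of signals grows (eps = 1e-10 is an illustrative security parameter).
for n in (10**4, 10**6, 10**8):
    print(f"n={n:>9}: t = {deviation_bound(n, 1e-10):.5f}")
```

The Chernoff bound used in the paper gives tighter, rate-dependent estimates than this worst-case form, which is what makes reasonable signal-transmission times feasible.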
NASA Technical Reports Server (NTRS)
Misakian, M.; Mumma, M. J.; Faris, J. F.
1975-01-01
Dissociative excitation of CO2 by electron impact was studied using the methods of translational spectroscopy and angular distribution analysis. Earlier time-of-flight studies revealed two overlapping spectra, the slower of which was attributed to metastable CO(a3 pi) fragments. The fast peak is the focus of this study. Threshold energy, angular distribution, and improved time-of-flight measurements indicate that the fast peak actually consists of five overlapping features. The slowest of the five features is found to consist of metastable O(5S) produced by predissociation of a sigma u + state of CO2 into O(5S) + CO(a3 pi). Oxygen Rydberg fragments originating directly from a different sigma u + state are believed to make up the next fastest feature. Mechanisms for producing the three remaining features are discussed.
Leal, Aura Lucia; Montañez, Anita Maria; Buitrago, Giancarlo; Patiño, Jaime; Camacho, German; Moreno, Vivian Marcela; Colombia, Red Neumo
2017-01-01
Abstract Background Trends in the distribution of S. pneumoniae capsular serotypes are associated with the introduction of pneumococcal conjugate vaccines (PCV) in a population. In Colombia, the 10-valent PCV (PCV10) has been included in the national vaccination program since 2011. As part of the pneumococcal surveillance network (SIREVA), Colombia has gathered data on serotype distribution since 1993. The aim of this work is to determine the effect of PCV10 introduction on serotypes not covered by PCV10 in Colombia, specifically serotypes 6A, 19A and 3. Methods Information was obtained from the national surveillance program from 1993 to 2016 in children under 5 years. The isolates came from sterile sites (blood, cerebrospinal fluid, pleural fluid, articular and peritoneal fluids). All the isolates were serotyped by the National Institute of Health. An interrupted time series analysis was performed to determine the effect of the PCV10 introduction on the 6A, 19A and 3 serotypes (ARIMA model). Results Serotyping was performed on 4683 isolates. The annual proportion trends of the 6A, 19A and 3 serotypes remained constant until 2012. A roughly twofold increase in the serotype proportion trends was observed after 2012 (Figure). The interrupted time-series analysis showed a positive effect of the PCV10 introduction on the trends of the 19A and 3 serotypes, with coefficients 20.92 (P = 0.00, ARIMA(2,0,1)) and 6.32 (P = 0.00, ARIMA(2,1,1)), respectively. There was no significant effect on the 6A serotype trend. Conclusion The introduction of PCV10 in the national vaccination program in Colombia affected the distribution of capsular types included in PCV13 but not in PCV7 or PCV10 in children under 5 years. This information emphasizes the importance of monitoring changes in serotype distributions to guide prevention strategies in children under 5 years in Colombia. Figure 1. Trends in the distribution of serotypes 19A, 3 and 6A in children under 5 years, Colombia.
Sefiddashti, Sara Emamgholipour; Arab, Mohammad; Ghazanfari, Sadegh; Kazemi, Zhila; Rezaei, Satar; Karyani, Ali Kazemi
2016-07-01
Considering the scarcity of skilled workers in the health sector, the appropriate distribution of human resources in this sector is very important for improving people's health. Information about the degree of equality in the distribution of health human resources and its time trends is necessary for better planning and efficient use of these resources. The aim of this study was to determine the trend of inequality in the allocation of human resources in the health sector in Tehran between 2007 and 2013. This cross-sectional study was conducted in Tehran Province in Iran. The inequality in the distribution of human resources (specialists, general practitioners, pharmacists, paramedics, dentists, nurses and community health workers (Behvarzes)) in 10 cities of Tehran Province was investigated using the Gini coefficient and the dissimilarity index. The time trend of inequality was examined by regression analysis. The required data were collected from the statistical yearbook of the Iran Statistics Center (ISC). The highest value of the Gini coefficient (GC) was related to nurses (GC = 0.291) in 2007. The highest values of the Gini coefficient were related to nurses and Behvarzes in 2008 and 2009, respectively. The distribution of specialists had the highest inequality in 2010 (GC = 0.298), 2011 (GC = 0.300) and 2013 (GC = 0.316). General practitioners had the lowest Gini coefficient for 2007, 2008 and 2012; nurses for 2009; and Behvarzes for 2010, 2011 and 2013. The dissimilarity indexes for specialists and general practitioners were 26.64 and 8.72 in 2013, respectively. The means of this index for the included resources were 31.35, 18.27, 16.91, 22.32, 15.82, 26.74, and 24.33, respectively. The time trend analysis showed that the coefficient of time was positive for all of the human resources except Behvarzes, and only the coefficient for general practitioners was statistically significant (p < 0.01).
Over time, inequalities in the distribution of resources in the health sector have been increasing. Given the growth of the private sector and its tendency to operate in the more developed regions, health policy makers should continually evaluate the distribution of human resources, and they should arrange a specific plan for the allocation of human resources in the health sector.
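The Gini coefficient used throughout the study can be computed from the Lorenz curve of cumulative resource share against cumulative population share. A minimal sketch, with illustrative counts rather than the study's data:

```python
import numpy as np

def gini(counts, populations):
    """Gini coefficient for a resource (e.g. nurses) distributed over regions."""
    counts = np.asarray(counts, dtype=float)
    pops = np.asarray(populations, dtype=float)
    order = np.argsort(counts / pops)            # sort regions by per-capita share
    x = np.cumsum(pops[order]) / pops.sum()      # cumulative population share
    y = np.cumsum(counts[order]) / counts.sum()  # cumulative resource share
    x = np.concatenate([[0.0], x])
    y = np.concatenate([[0.0], y])
    # Gini = 1 - 2 * (area under the Lorenz curve), via the trapezoidal rule
    area = np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0)
    return 1.0 - 2.0 * area

# equal per-capita distribution -> Gini near 0; concentration -> Gini toward 1
print(gini([10, 20, 30], [100, 200, 300]))  # near 0
print(gini([90, 5, 5], [100, 100, 100]))    # clearly positive
```

The dissimilarity index reported alongside it measures the fraction of the resource that would have to be redistributed to match population shares.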
Lindley frailty model for a class of compound Poisson processes
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Ata, Nihal
2013-10-01
The Lindley distribution gains importance in survival analysis due to its similarity to the exponential distribution while allowing different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model when misspecified or omitted covariates are described by an unobservable random variable. Although the frailty distribution is generally assumed to be continuous, it is appropriate to consider discrete frailty distributions in some circumstances. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived, and maximum likelihood estimation procedures for the parameters are studied. Then, the fit of the models to an earthquake data set from Turkey is examined.
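The Lindley density f(x; θ) = θ²/(1+θ) · (1+x) · e^(−θx) yields a closed-form survival function and an increasing hazard, which is the flexibility the abstract contrasts with the exponential distribution's constant hazard. A minimal sketch of those baseline functions (the frailty machinery itself is omitted):

```python
import numpy as np

def lindley_pdf(x, theta):
    # f(x) = theta^2 / (1 + theta) * (1 + x) * exp(-theta * x), x > 0
    return theta**2 / (1 + theta) * (1 + x) * np.exp(-theta * x)

def lindley_survival(x, theta):
    # S(x) = (1 + theta + theta * x) / (1 + theta) * exp(-theta * x)
    return (1 + theta + theta * x) / (1 + theta) * np.exp(-theta * x)

def lindley_hazard(x, theta):
    # h(x) = f(x) / S(x); increasing in x, unlike the constant exponential hazard
    return lindley_pdf(x, theta) / lindley_survival(x, theta)

# the hazard rises with x, illustrating the non-exponential shape
print(lindley_hazard(0.5, 1.0) > lindley_hazard(0.1, 1.0))  # True
```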
R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation
NASA Astrophysics Data System (ADS)
Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao
2017-03-01
Human factors have become the most serious cause of accidents in civil aviation, which motivates the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences, collected from the NTT, approximately exhibit a log-normal distribution. We apply the χ2 test to compute the goodness of fit by transforming the time sequences with the Box-Cox transformation to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is represented by the Hurst exponent via Rescaled Range analysis, also known as Range/Standard deviation (R/S) analysis. The differing Hurst exponents suggest the existence of different collective behaviors and different intrinsic patterns of human factors in civil aviation.
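Rescaled Range analysis estimates the Hurst exponent as the slope of log(R/S) against log(window size). A minimal sketch on synthetic white noise (the NTT reaction-time sequences are not reproduced here):

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())   # mean-adjusted cumulative deviations
            r = z.max() - z.min()         # range R of the cumulative deviations
            s = w.std()                   # standard deviation S of the window
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    # Hurst exponent = slope of log(R/S) versus log(n)
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(0)
h = hurst_rs(rng.normal(size=4096), [16, 32, 64, 128, 256])
# uncorrelated noise should give H near 0.5; H > 0.5 indicates persistence
```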
Data-Aware Retrodiction for Asynchronous Harmonic Measurement in a Cyber-Physical Energy System
Liu, Youda; Wang, Xue; Liu, Yanchi; Cui, Sujin
2016-01-01
Cyber-physical energy systems provide a networked solution for safety, reliability and efficiency problems in smart grids. On the demand side, a secure and trustworthy energy supply requires real-time supervision and online power quality assessment. Harmonic measurement is necessary in power quality evaluation. However, under a large-scale distributed metering architecture, harmonic measurement faces the out-of-sequence measurement (OOSM) problem, which is the result of latencies in sensing or the communication process and brings deviations in data fusion. This paper depicts a distributed measurement network for large-scale asynchronous harmonic analysis and exploits a nonlinear autoregressive model with exogenous inputs (NARX) network to reorder the out-of-sequence measuring data. The NARX network learns the characteristics of the electrical harmonics from practical data rather than from kinematic equations. Thus, the data-aware network approximates the behavior of the practical electrical parameter with real-time data and improves the retrodiction accuracy. Theoretical analysis demonstrates that the data-aware method maintains a reasonable consumption of computing resources. Experiments on a practical testbed of a cyber-physical system are implemented, and harmonic measurement and analysis accuracy are adopted to evaluate the measuring mechanism under a distributed metering network. Results demonstrate an improvement of the harmonics analysis precision and validate the asynchronous measuring method in cyber-physical energy systems. PMID:27548171
NASA Astrophysics Data System (ADS)
Hemingway, Jordon D.; Rothman, Daniel H.; Rosengard, Sarah Z.; Galy, Valier V.
2017-11-01
Serial oxidation coupled with stable carbon and radiocarbon analysis of sequentially evolved CO2 is a promising method to characterize the relationship between organic carbon (OC) chemical composition, source, and residence time in the environment. However, observed decay profiles depend on experimental conditions and oxidation pathway. It is therefore necessary to properly assess serial oxidation kinetics before utilizing decay profiles as a measure of OC reactivity. We present a regularized inverse method to estimate the distribution of OC activation energy (E), a proxy for bond strength, using serial oxidation. Here, we apply this method to ramped temperature pyrolysis or oxidation (RPO) analysis but note that this approach is broadly applicable to any serial oxidation technique. RPO analysis directly compares thermal reactivity to isotope composition by determining the E range for OC decaying within each temperature interval over which CO2 is collected. By analyzing a decarbonated test sample at multiple masses and oven ramp rates, we show that OC decay during RPO analysis follows a superposition of parallel first-order kinetics and that resulting E distributions are independent of experimental conditions. We therefore propose the E distribution as a novel proxy to describe OC thermal reactivity and suggest that E vs. isotope relationships can provide new insight into the compositional controls on OC source and residence time.
Dynamic Power Distribution System Management With a Locally Connected Communication Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall-Anese, Emiliano; Zhang, Kaiqing; Basar, Tamer
Coordinated optimization and control of distribution-level assets can enable a reliable and optimal integration of massive amounts of distributed energy resources (DERs) and facilitate distribution system management (DSM). Accordingly, the objective is to coordinate the power injection at the DERs to maintain certain quantities across the network, e.g., voltage magnitude, line flows, or line losses, close to a desired profile. By and large, the performance of DSM algorithms has been challenged by two factors: i) the possibly non-strongly connected communication network over the DERs, which hinders coordination; and ii) the dynamics of the real system caused by DERs with heterogeneous capabilities, time-varying operating conditions, and real-time measurement mismatches. In this paper, we investigate the modeling and algorithm design and analysis with consideration of these two factors. In particular, a game-theoretic characterization is first proposed to account for a locally connected communication network over the DERs, along with an analysis of the existence and uniqueness of the Nash equilibrium (NE) therein. To achieve the equilibrium in a distributed fashion, a projected-gradient-based asynchronous DSM algorithm is then advocated. The algorithm performance, including the convergence speed and the tracking error, is analytically guaranteed under the dynamic setting. Extensive numerical tests on both synthetic and realistic cases corroborate the analytical results derived.
A non-stationary cost-benefit based bivariate extreme flood estimation approach
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo
2018-02-01
Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities, in both the dependence of flood variables and the marginal distributions, on extreme flood estimation. The dependence is modeled using copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is used to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probabilities of exceedance calculated from the copula functions and from the marginal distributions. This study provides, for the first time, a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could benefit cost-benefit based non-stationary bivariate design flood estimation worldwide.
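The abstract does not name the copula family; as an illustration, a Gumbel-Hougaard copula (a common choice for positively dependent flood peak/volume pairs) can express the joint exceedance probability, with non-stationarity entering through a time-varying dependence parameter. The linear trend in θ below is purely hypothetical:

```python
import numpy as np

def gumbel_copula(u, v, theta):
    # C(u, v) = exp(-((-ln u)^theta + (-ln v)^theta)^(1/theta)), theta >= 1
    return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1.0 / theta))

def joint_exceedance(u, v, theta):
    # P(U > u and V > v) = 1 - u - v + C(u, v)  ("AND" design event)
    return 1.0 - u - v + gumbel_copula(u, v, theta)

def theta_t(t, a=1.5, b=0.01):
    # hypothetical linear trend in the dependence parameter over time t (years)
    return a + b * t

p0 = joint_exceedance(0.99, 0.99, theta_t(0))
p50 = joint_exceedance(0.99, 0.99, theta_t(50))
print(p50 > p0)  # True: strengthening dependence raises joint exceedance risk
```

In the non-stationary setting the marginal quantiles u and v would also be evaluated from time-varying marginal distributions.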
Cheng, Rui; Xia, Li; Sima, Chaotan; Ran, Yanli; Rohollahnejad, Jalal; Zhou, Jiaao; Wen, Yongqiang; Yu, Can
2016-02-08
Ultrashort fiber Bragg gratings (US-FBGs) have significant potential as weak grating sensors for distributed sensing, but their exploitation has been limited by their inherently broad spectra, which are undesirable for most traditional wavelength measurements. To address this, we recently introduced a new interrogation concept using shifted optical Gaussian filters (SOGF), which is well suited to US-FBG measurements. Here, we apply it to demonstrate, for the first time, a US-FBG-based self-referencing distributed optical sensing technique with the advantages of adjustable sensitivity and range, high-speed and wide-range (potentially >14000 με) intensity-based detection, and resistance to disturbance by nonuniform parameter distribution. The entire system is essentially based on a microwave network, which incorporates the SOGF with a fiber delay line between the two arms. Differential detections of the cascaded US-FBGs are performed individually in the network time-domain response, which can be obtained by analyzing its complex frequency response. Experimental results are presented and discussed using eight cascaded US-FBGs. A comprehensive numerical analysis is also conducted to assess the system performance, which shows that the use of US-FBGs instead of conventional weak FBGs could significantly improve the power budget and capacity of the distributed sensing system while maintaining the crosstalk level and intensity decay rate, providing a promising route for future sensing applications.
Study of temperature distributions in wafer exposure process
NASA Astrophysics Data System (ADS)
Lin, Zone-Ching; Wu, Wen-Jang
During the exposure process of photolithography, the wafer absorbs the exposure energy, which results in a rising temperature and thermal expansion. This phenomenon was often neglected due to its limited effect in previous process generations. However, in the new generation of processes, it may very likely become a factor to be considered. In this paper, a finite element model for analyzing the transient behavior of the wafer temperature distribution during exposure was established under the assumption that the wafer is clamped by a vacuum chuck without warpage. The model is capable of simulating the distribution of the wafer temperature under different exposure conditions. The analysis begins with the simulation of transient behavior in a single exposure region, and then varies the exposure energy, the interval between exposure locations, and the interval of exposure time under continuous exposure to investigate the distribution of wafer temperature. The simulation results indicate that widening the interval between exposure locations has a greater impact on improving the distribution of wafer temperature than extending the interval of exposure time between neighboring image fields. Moreover, as long as the distance between the field centers of two neighboring exposure regions exceeds a straight-line distance of three image-field widths, the interacting thermal effect during wafer exposure can be ignored. The analysis flow proposed in this paper can serve as a supporting reference tool for engineers in planning exposure paths.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sohrabi, M.; Habibi, M., E-mail: mortezahabibi@gmail.com; Ramezani, V.
2017-02-15
The paper presents an experimental study and analysis of full helium ion density angular distributions in a 4-kJ plasma focus device (PFD) at pressures of 10, 15, 25, and 30 mbar using large-area polycarbonate track detectors (PCTDs) (15-cm etchable diameter) processed by 50-Hz-HV electrochemical etching (ECE). Helium ion track distributions at different pressures, in particular along the main axis of the PFD, are presented. A maximum ion track density of ~4.4 × 10⁴ tracks/cm² was obtained in the PCTD placed 6 cm from the anode. The ion distributions for all pressures applied are ring-shaped, which is possibly due to the hollow cylindrical copper anode used. The large-area PCTD processed by ECE proves, at the present state-of-the-art, to be a superior method for direct observation and analysis of ion distributions at a glance with minimal effort and time. Some observations of the ion density distributions at different pressures are reported and discussed.
Network topology and resilience analysis of South Korean power grid
NASA Astrophysics Data System (ADS)
Kim, Dong Hwan; Eisenberg, Daniel A.; Chun, Yeong Han; Park, Jeryang
2017-01-01
In this work, we present topological and resilience analyses of the South Korean power grid (KPG) across a broad range of voltage levels. While topological analysis of the KPG restricted to high-voltage infrastructure shows an exponential degree distribution, providing further empirical evidence on power grid topology, the inclusion of low-voltage components generates a distribution with a larger variance and a smaller average degree. This result suggests that the topology of a power grid may converge to a highly skewed degree distribution as more low-voltage data are considered. Moreover, when compared to ER random and BA scale-free networks, the KPG has a lower efficiency and a higher clustering coefficient, implying that a highly clustered structure does not necessarily guarantee the functional efficiency of a network. Error and attack tolerance analysis, evaluated with efficiency, indicates that the KPG is more vulnerable to random or degree-based attacks than to betweenness-based intentional attacks. Cascading failure analysis with a recovery mechanism demonstrates that the resilience of the network depends on both tolerance capacity and recovery initiation time. Also, when these two factors are fixed, the KPG is the most vulnerable of the three networks. Based on our analysis, we propose that the topology of power grids should be designed so that loads are homogeneously distributed, or so that functional hubs and their neighbors have high tolerance capacity, to enhance resilience.
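The global efficiency used to evaluate attack tolerance is the average inverse shortest-path length over all node pairs. A dependency-free sketch on toy graphs (not the KPG data):

```python
from collections import deque

def global_efficiency(adj):
    """Average inverse shortest-path length over all ordered node pairs
    of an unweighted graph given as an adjacency dict."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        # BFS hop counts from source s
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total / (n * (n - 1))

# a 4-node path graph is less efficient than a fully meshed 4-node network
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
full = {i: [j for j in range(4) if j != i] for i in range(4)}
print(global_efficiency(path) < global_efficiency(full))  # True
```

Removing nodes (randomly, or by degree or betweenness rank) and recomputing this quantity gives the tolerance curves the abstract describes.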
Amplitude and recurrence time analysis of LP activity at Mount Etna, Italy
NASA Astrophysics Data System (ADS)
Cauchie, Léna; Saccorotti, Gilberto; Bean, Christopher J.
2015-09-01
The aim of this work is to improve our understanding of the long-period (LP) source mechanism at Mount Etna (Italy) through a statistical analysis of detailed LP catalogues. The behavior of LP activity is compared with the empirical laws governing earthquake recurrence, in order to investigate whether any relationships exist between these two apparently different earthquake classes. We analyzed a family of 8894 events detected during a temporary experiment in August 2005. For that time interval, the LP activity is sustained in time and the volcano did not exhibit any evident sign of unrest. The completeness threshold of the catalogue is established through a detection test based on synthetic waveforms. The retrieved amplitude distribution differs significantly from the Gutenberg-Richter law, and the interevent times distribution does not follow the typical γ law, expected for tectonic activity. In order to compare these results with a catalogue for which the source mechanism is well established, we applied the same procedure to a data set from Stromboli Volcano, where recurrent LP activity is closely related to very-long-period pulses, in turn associated with the summit explosions. Our results indicate that the two catalogues exhibit similar behavior in terms of amplitude and interevent time distributions. This suggests that the Etna's LP signals are most likely driven by stress changes caused by an intermittent degassing process occurring at depth, similar to that which drives the summit explosions at Stromboli Volcano.
Characteristic Lifelength of Coherent Structure in the Turbulent Boundary Layer
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.
2006-01-01
A characteristic lifelength is defined by fitting a Gaussian distribution to data correlated over a three-sensor array sampling streamwise sidewall pressure. The data were acquired at subsonic, transonic and supersonic speeds aboard a Tu-144. Lifelengths are estimated using the cross spectrum and are shown to compare favorably with Efimtsov's prediction of correlation space scales. Lifelength distributions are computed in the time/frequency domain using an interval correlation technique on the continuous wavelet transform of the original time data. The median values of the lifelength distributions are found to be very close to the frequency-averaged result. The interval correlation technique is shown to allow the retrieval and inspection of the original time data of each event in the lifelength distribution, thus providing a means to locate and study the nature of the coherent structure in the turbulent boundary layer. The lifelength data can be converted to lifetimes using the convection velocity. The lifetimes of events in the time/frequency domain are displayed in Lifetime Maps. The primary purpose of the paper is to validate these new analysis techniques so that they can be used with confidence to further characterize coherent structure in the turbulent boundary layer.
NASA Technical Reports Server (NTRS)
Levy, Lionel L., Jr.; Yoshikawa, Kenneth K.
1959-01-01
A method based on linearized and slender-body theories, which is easily adapted to electronic-machine computing equipment, is developed for calculating the zero-lift wave drag of single- and multiple-component configurations from a knowledge of the second derivative of the area distribution of a series of equivalent bodies of revolution. The accuracy and computational time required of the method to calculate zero-lift wave drag are evaluated relative to another numerical method which employs the Tchebichef form of harmonic analysis of the area distribution of a series of equivalent bodies of revolution. The results of the evaluation indicate that the total zero-lift wave drag of a multiple-component configuration can generally be calculated most accurately as the sum of the zero-lift wave drag of each component alone plus the zero-lift interference wave drag between all pairs of components. The accuracy and computational time required of both methods to calculate total zero-lift wave drag at supersonic Mach numbers are comparable for airplane-type configurations. For systems of bodies of revolution both methods yield similar results with comparable accuracy; however, the present method only requires up to 60 percent of the computing time required of the harmonic-analysis method for two bodies of revolution and less time for a larger number of bodies.
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1991-01-01
We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make recent and future theoretical advances in real-time system design and engineering readily usable by practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.
Disease clusters, exact distributions of maxima, and P-values.
Grimson, R C
1993-10-01
This paper presents combinatorial (exact) methods that are useful in the analysis of disease cluster data obtained from small environments, such as buildings and neighbourhoods. Maxwell-Boltzmann and Fermi-Dirac occupancy models are compared in terms of appropriateness of representation of disease incidence patterns (space and/or time) in these environments. The methods are illustrated by a statistical analysis of the incidence pattern of bone fractures in a setting wherein fracture clustering was alleged to be occurring. One of the methodological results derived in this paper is the exact distribution of the maximum cell frequency in occupancy models.
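Under the Maxwell-Boltzmann occupancy model, the exact distribution of the maximum cell frequency for n cases in k equally likely cells can be obtained from a truncated exponential generating function. A sketch with exact rational arithmetic (the case and cell counts are illustrative, not the bone-fracture data):

```python
from math import factorial
from fractions import Fraction

def p_max_at_most(n, k, m):
    """P(max cell count <= m) for n distinguishable cases in k equally likely
    cells (Maxwell-Boltzmann): n! * [x^n] (sum_{j<=m} x^j/j!)^k / k^n."""
    poly = [Fraction(0)] * (n + 1)
    poly[0] = Fraction(1)
    for _ in range(k):  # multiply the truncated EGF in, one cell at a time
        new = [Fraction(0)] * (n + 1)
        for i, c in enumerate(poly):
            if c == 0:
                continue
            for j in range(0, min(m, n - i) + 1):
                new[i + j] += c / factorial(j)
        poly = new
    return poly[n] * factorial(n) / Fraction(k) ** n

# exact P-values for a cluster test: P(max = m) = P(max <= m) - P(max <= m-1)
n, k = 6, 3
pmf = [p_max_at_most(n, k, m) - p_max_at_most(n, k, m - 1) for m in range(1, n + 1)]
print(sum(pmf) == 1)  # True: the exact probabilities sum to one
```

The observed maximum's upper-tail probability under this null is the exact P-value for alleged clustering.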
NASA Astrophysics Data System (ADS)
Setty, V.; Sharma, A.
2013-12-01
Characterization of extreme conditions of space weather is essential for potential mitigation strategies. The non-equilibrium nature of the magnetosphere makes such efforts complicated, and new techniques to understand its extreme event distribution are required. The heavy-tailed distribution in such systems can be modeled using a stable distribution, whose stability parameter is a measure of scaling in the cumulative distribution and is related to the Hurst exponent. This exponent can be readily measured in stationary time series using several techniques, and detrended fluctuation analysis (DFA) is widely used in the presence of non-stationarities. However, DFA has severe limitations in cases with non-linear and atypical trends. We propose a new technique that utilizes nonlinear dynamical predictions as a measure of trends and estimates the Hurst exponents. Furthermore, such a measure provides a new way to characterize predictability, as perfectly detrended data have no long-term memory, akin to Gaussian noise. Ab initio calculation of weekly Hurst exponents using the auroral electrojet index AL over a span of a few decades shows that these exponents are time-varying, and so is the fractal structure of the series. Such time series with time-varying Hurst exponents are modeled well by multifractional Brownian motion, and it is shown that DFA estimates a single time-averaged value of the Hurst exponent for such data. Our results show that using the time-varying Hurst exponent structure, we can (a) estimate the stability parameter, a measure of scaling in heavy tails; (b) define and identify epochs when the magnetosphere switches between regimes with and without extreme events; and (c) study the dependence of the Hurst exponents on solar activity.
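The DFA baseline the authors compare against detrends the integrated series piecewise and reads the scaling exponent off log-log fluctuation growth. A minimal first-order DFA sketch on white noise, for which the exponent should be near 0.5:

```python
import numpy as np

def dfa(x, window_sizes, order=1):
    """Detrended fluctuation analysis: slope alpha of log F(n) vs log n."""
    y = np.cumsum(x - np.mean(x))  # the "profile" (integrated series)
    log_n, log_f = [], []
    for n in window_sizes:
        k = len(y) // n
        segs = y[:k * n].reshape(k, n)
        t = np.arange(n)
        mse = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)   # local polynomial trend
            mse.append(np.mean((seg - np.polyval(coef, t))**2))
        log_n.append(np.log(n))
        log_f.append(0.5 * np.log(np.mean(mse)))  # log RMS fluctuation F(n)
    return np.polyfit(log_n, log_f, 1)[0]

rng = np.random.default_rng(1)
alpha = dfa(rng.normal(size=8192), [16, 32, 64, 128, 256])
# white noise: alpha near 0.5 (no long-term memory)
```

The abstract's point is that this single slope averages away any time variation in the exponent, which their prediction-based detrending is designed to resolve.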
Wacker, M; Witte, H
2013-01-01
This review outlines the methodological fundamentals of the most frequently used non-parametric time-frequency analysis techniques in biomedicine and their main properties, as well as providing decision aids concerning their applications. The short-term Fourier transform (STFT), the Gabor transform (GT), the S-transform (ST), the continuous Morlet wavelet transform (CMWT), and the Hilbert transform (HT) are introduced as linear transforms by using a unified concept of the time-frequency representation which is based on a standardized analytic signal. The Wigner-Ville distribution (WVD) serves as an example of the 'quadratic transforms' class. The combination of WVD and GT with the matching pursuit (MP) decomposition and that of the HT with the empirical mode decomposition (EMD) are explained; these belong to the class of signal-adaptive approaches. Similarities between linear transforms are demonstrated and differences with regard to the time-frequency resolution and interference (cross) terms are presented in detail. By means of simulated signals the effects of different time-frequency resolutions of the GT, CMWT, and WVD as well as the resolution-related properties of the interference (cross) terms are shown. The method-inherent drawbacks and their consequences for the application of the time-frequency techniques are demonstrated by instantaneous amplitude, frequency and phase measures and related time-frequency representations (spectrogram, scalogram, time-frequency distribution, phase-locking maps) of measured magnetoencephalographic (MEG) signals. The appropriate selection of a method and its parameter settings will ensure readability of the time-frequency representations and reliability of results. When the time-frequency characteristics of a signal strongly correspond with the time-frequency resolution of the analysis then a method may be considered 'optimal'. 
The MP-based signal-adaptive approaches are preferred as these provide an appropriate time-frequency resolution for all frequencies while simultaneously reducing interference (cross) terms.
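Among the linear transforms reviewed, the STFT is the simplest to sketch: window the signal, Fourier-transform each frame, and stack the magnitudes. A minimal numpy version on a chirp, whose rising frequency a single FFT would not localize in time:

```python
import numpy as np

def stft_mag(x, win_len=256, hop=64):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier transform."""
    w = np.hanning(win_len)
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        seg = x[start:start + win_len] * w
        frames.append(np.abs(np.fft.rfft(seg)))
    return np.array(frames).T  # shape: (freq_bins, time_frames)

fs = 1024
t = np.arange(fs * 2) / fs
x = np.sin(2 * np.pi * (50 + 50 * t) * t)  # instantaneous frequency rises in time

S = stft_mag(x)
peak_bins = S.argmax(axis=0)         # dominant frequency bin in each frame
print(peak_bins[-1] > peak_bins[0])  # True: the spectral ridge moves upward
```

The fixed window length illustrates the review's central trade-off: a longer window sharpens frequency resolution at the cost of time resolution, and vice versa.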
NASA Astrophysics Data System (ADS)
Phillips, R. C.; Samadi, S. Z.; Meadows, M. E.
2018-07-01
This paper examines the frequency, distribution tails, and peak-over-threshold (POT) behavior of extreme floods through an analysis centered on the October 2015 flooding in North Carolina (NC) and South Carolina (SC), United States (US). The most striking features of the October 2015 flooding were a short time to peak (Tp) and a multi-hour continuous flood peak, which caused intensive and widespread damage to human lives, property, and infrastructure. The 2015 flooding was produced by a sequence of intense rainfall events originating from category 4 hurricane Joaquin over a period of four days. Here, the probability distribution and distribution parameters (i.e., location, scale, and shape) of floods were investigated by comparing the upper part of the empirical distributions of the annual maximum flood (AMF) and POT data with light- to heavy-tailed theoretical distributions: Fréchet, Pareto, Gumbel, Weibull, Beta, and Exponential. Specifically, four sets of U.S. Geological Survey (USGS) gauging data from the central Carolinas with record lengths of approximately 65 to 125 years were used. The analysis suggests that heavier-tailed distributions are in better agreement with the POT data, and to a lesser extent the AMF data, than the more commonly used exponential (light-tailed) probability distributions. Further, the threshold selection and record length affect the heaviness of the tail and the fluctuations of the parent distributions. The shape parameter and its evolution over the period of record play a critical and poorly understood role in determining the scaling of flood response to intense rainfall.
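POT analysis conventionally fits a generalized Pareto distribution to threshold excesses, with a positive shape parameter indicating a heavy tail. A sketch on synthetic heavy-tailed data (illustrative, not the USGS records):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
# synthetic "discharge" series with a Pareto (heavy) upper tail
flows = rng.pareto(a=3.0, size=20000) + 1.0

threshold = np.quantile(flows, 0.95)  # the threshold choice the paper stresses
excesses = flows[flows > threshold] - threshold

# fit a generalized Pareto to the excesses, holding the location at zero;
# shape c > 0 indicates a heavy (Pareto-type) tail, c = 0 an exponential tail
c, loc, scale = genpareto.fit(excesses, floc=0)
print(c > 0)  # True for this Pareto-tailed synthetic series
```

Refitting while varying the threshold quantile reproduces the paper's observation that threshold selection affects the estimated tail heaviness.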
A quality-of-life-oriented endpoint for comparing therapies.
Gelber, R D; Gelman, R S; Goldhirsch, A
1989-09-01
An endpoint, time without symptoms of disease and toxicity of treatment (TWiST), is defined to provide a single measure of length and quality of survival. Time with subjective side effects of treatment and time with unpleasant symptoms of disease are subtracted from overall survival time to calculate TWiST for each patient. The purpose of this paper is to describe the construction of this endpoint, and to elaborate on its interpretation for patient care decision-making. Estimating the distribution of TWiST using actuarial methods is shown by simulation studies to be biased as a result of induced dependency between TWiST and its censoring distribution. Considering the distribution of TWiST accumulated within a specified time from start of therapy, L, allows one to reduce this bias by substituting estimated TWiST for censored values and provides a method to evaluate the "payback" period for early toxic effects. Quantile distance plots provide graphical representations for treatment comparisons. The analysis of Ludwig Trial III evaluating toxic adjuvant therapies versus a no-treatment control group for postmenopausal women with node-positive breast cancer illustrates the methodology.
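The endpoint itself is simple arithmetic: subtract time with treatment toxicity and time with disease symptoms from survival, optionally restricted to a follow-up horizon L. A minimal sketch; the horizon handling below simply truncates survival, a simplification of the paper's treatment of censoring and accumulation within L:

```python
def twist(overall_survival, time_with_toxicity, time_with_symptoms, horizon=None):
    """TWiST = survival time minus time with subjective treatment toxicity
    and time with unpleasant disease symptoms, floored at zero."""
    if horizon is not None:
        overall_survival = min(overall_survival, horizon)
    return max(overall_survival - time_with_toxicity - time_with_symptoms, 0.0)

# a patient surviving 60 months with 6 months of toxicity and 4 of symptoms
print(twist(60, 6, 4))               # 50
print(twist(60, 6, 4, horizon=36))   # 26
```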
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated and its significance used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
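A rough sketch of the detection pipeline on synthetic spectra: wavelet detail coefficients, PCA via SVD, and Hotelling's T² scores. A Haar wavelet is substituted for the paper's coiflet to keep the sketch dependency-free, and the data, component count, and contamination step are all illustrative (the sequential Bayesian alarm stage is omitted):

```python
import numpy as np

def haar_detail(x):
    """One-level Haar DWT detail coefficients, which localize abrupt
    changes along the wavelength axis."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def hotelling_t2(scores):
    """Hotelling's T^2 of each row of scores against the sample covariance."""
    centered = scores - scores.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(centered, rowvar=False))
    return np.einsum('ij,jk,ik->i', centered, inv_cov, centered)

rng = np.random.default_rng(3)
# 200 synthetic UV spectra over 64 wavelengths: smooth baseline plus noise,
# with a step-like absorbance change in the last 5 scans (the "contamination")
spectra = np.linspace(1.0, 0.0, 64) + rng.normal(0.0, 0.01, size=(200, 64))
spectra[195:] += 0.5 * (np.arange(64) > 40)

details = np.array([haar_detail(s) for s in spectra])
centered = details - details.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)  # PCA via SVD
scores = centered @ vt[:3].T                             # keep 3 components
t2 = hotelling_t2(scores)
# the contaminated scans score far above the clean baseline scans
```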
Continuous distribution of emission states from single CdSe/ZnS quantum dots.
Zhang, Kai; Chang, Hauyee; Fu, Aihua; Alivisatos, A Paul; Yang, Haw
2006-04-01
The photoluminescence dynamics of colloidal CdSe/ZnS/streptavidin quantum dots were studied using time-resolved single-molecule spectroscopy. Statistical tests of the photon-counting data suggested that the simple "on/off" discrete state model is inconsistent with experimental results. Instead, a continuous emission state distribution model was found to be more appropriate. Autocorrelation analysis of lifetime and intensity fluctuations showed a nonlinear correlation between them. These results were consistent with the model that charged quantum dots were also emissive, and that time-dependent charge migration gave rise to the observed photoluminescence dynamics.
Automated Video Analysis System Reveals Distinct Diurnal Behaviors in C57BL/6 and C3H/HeN Mice
Adamah-Biassi, E. B.; Stepien, I.; Hudson, R.L.; Dubocovich, M.L.
2013-01-01
Advances in rodent behavior dissection using automated video recording and analysis allow detailed phenotyping. This study compared and contrasted 15 diurnal behaviors recorded continuously using an automated behavioral analysis system for a period of 14 days under a 14/10 light/dark cycle in singly housed C3H/HeN (C3H) or C57BL/6 (C57) male mice. Diurnal behaviors, recorded with minimal experimental interference and analyzed using phenotypic array and temporal distribution analysis, showed bimodal and unimodal profiles in the C57 and C3H mice, respectively. Phenotypic array analysis revealed distinct behavioral rhythms in activity-like behaviors (i.e. walk, hang, jump, come down) (ALB), exploration-like behaviors (i.e. dig, groom, rear up, sniff, stretch) (ELB), ingestion-like behaviors (i.e. drink, eat) (ILB) and resting-like behaviors (i.e. awake, remain low, rest, twitch) (RLB) of C3H and C57 mice. Temporal analysis demonstrated that strain and time of day affect the magnitude and distribution of the spontaneous homecage behaviors. Wheel running activity and water and food measurements correlated with the timing of homecage behaviors. Subcutaneous (3 mg/kg, sc) or oral (0.02 mg/ml) melatonin treatments in C57 mice did not modify either the total 24 hr magnitude or the temporal distribution of homecage behaviors when compared with vehicle treatments. We conclude that C3H and C57 mice show different spontaneous activity and behavioral rhythms, specifically during the night period, which are not modulated by melatonin. PMID:23337734
Input-output relationship in social communications characterized by spike train analysis
NASA Astrophysics Data System (ADS)
Aoki, Takaaki; Takaguchi, Taro; Kobayashi, Ryota; Lambiotte, Renaud
2016-10-01
We study the dynamical properties of human communication through different channels, i.e., short messages, phone calls, and emails, adopting techniques from neuronal spike train analysis in order to characterize the temporal fluctuations of successive interevent times. We first measure the so-called local variation (LV) of incoming and outgoing event sequences of users and find that these in- and out-LV values are positively correlated for short messages and uncorrelated for phone calls and emails. Second, we analyze the response-time distribution after receiving a message to focus on the input-output relationship in each of these channels. We find that the time scales and amplitudes of response differ between the three channels. To understand the effects of the response-time distribution on the correlations between the LV values, we develop a point process model whose activity rate is modulated by incoming and outgoing events. Numerical simulations of the model indicate that a quick response to incoming events and a refractory effect after outgoing events are key factors to reproduce the positive LV correlations.
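The local variation measure used in the study has a standard closed form from the spike-train literature (Shinomoto et al.). A minimal sketch of its computation, on synthetic interevent data rather than the authors' communication logs, is:

```python
import numpy as np

def local_variation(event_times):
    """Local variation (LV) of an event sequence:
        LV = 3/(n-1) * sum_i ((T_{i+1} - T_i) / (T_{i+1} + T_i))^2,
    where T_i are successive interevent intervals and n is their count.
    LV is ~1 for a Poisson process, <1 for regular trains, >1 for bursty ones."""
    t = np.asarray(event_times, dtype=float)
    iv = np.diff(t)                       # interevent intervals T_i
    num = (iv[1:] - iv[:-1]) ** 2
    den = (iv[1:] + iv[:-1]) ** 2
    return 3.0 * np.mean(num / den)

# A perfectly regular event sequence has LV = 0.
print(local_variation([0, 1, 2, 3, 4, 5]))  # -> 0.0
```

In the paper's setting, this statistic would be evaluated separately on each user's incoming and outgoing event sequences before correlating the two.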
Uncertainty analysis in seismic tomography
NASA Astrophysics Data System (ADS)
Owoc, Bartosz; Majdański, Mariusz
2017-04-01
The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, the inversion path, and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, our analysis shows that for manual travel-time picking the uncertainty distribution is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomography code, with data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.
Empirical analysis and modeling of manual turnpike tollbooths in China
NASA Astrophysics Data System (ADS)
Zhang, Hao
2017-03-01
To address the low level of service satisfaction at tollbooths of many turnpikes in China, we conduct an empirical study and use a queueing model to investigate performance measures. In this paper, we collect archived data from six tollbooths of a turnpike in China and analyze the vehicles' time-dependent arrival process and the collectors' time-dependent service times. The analysis shows that the vehicle arrival process follows a non-homogeneous Poisson process, while the collector service time follows a log-normal distribution. Further, we model the toll-collection process at tollbooths as a MAP/PH/1/FCFS queue for mathematical tractability and present some numerical examples.
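As an illustrative sketch (not the authors' code), the two empirical findings can be reproduced with synthetic data: a log-normal fit to service times via log-domain moments, and non-homogeneous Poisson arrivals generated by Lewis-Shedler thinning. All rates and parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical service times (seconds); fit the log-normal in the log domain.
service = rng.lognormal(mean=2.0, sigma=0.4, size=5000)
mu_hat, sigma_hat = np.log(service).mean(), np.log(service).std()

def nhpp_thinning(rate_fn, t_end, rate_max, rng):
    """Sample arrival times on [0, t_end] for intensity rate_fn(t) by thinning:
    draw candidates from a homogeneous Poisson envelope at rate_max, then
    accept each with probability rate_fn(t) / rate_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / rate_max:
            times.append(t)

# Purely illustrative morning-peak intensity (vehicles/min) over 12 hours.
rate = lambda t: 2.0 + 1.5 * np.sin(np.pi * t / 720.0)
arrivals = nhpp_thinning(rate, t_end=720.0, rate_max=3.5, rng=rng)
print(round(mu_hat, 2), round(sigma_hat, 2), arrivals.size > 0)
```

The fitted (mu, sigma) recover the generating values, and the accepted arrival stream follows the time-varying intensity, mimicking the paper's empirical characterization.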
NASA Astrophysics Data System (ADS)
Gallton, D. A.; Manweiler, J. W.; Gerrard, A. J.; Cravens, T.; Lanzerotti, L. J.; Patterson, J. D.
2017-12-01
The increased frequency of the Van Allen Probes (VAP) lapping events provides a unique opportunity to examine the scaling length and structure of the magnetospheric plasma at microscales. Onboard the probes is the RBSPICE instrument, an energetic particle detector capable of observing ions (H+, He^n+, O^n+) from approximately 7 keV up to values of 1 MeV. Here we provide a correlation analysis of the probes during quiet-time lapping events, examining the behavior of the particle populations when the probes are within 1,000 km of separation distance, at a distance greater than 15,000 km from Earth, and when the Kp and AE magnetic indices show minimal geomagnetic activity. The correlation values of the energetic particle distributions are examined and the falloff distances associated with the tail end of the plasma distribution are calculated. We provide an overview of the initial analysis results for H+ during the quiet-time lapping events and a discussion of the causal relationship.
NASA Astrophysics Data System (ADS)
Deljanin, Isidora; Antanasijević, Davor; Vuković, Gordana; Urošević, Mira Aničić; Tomašević, Milica; Perić-Grujić, Aleksandra; Ristić, Mirjana
2015-09-01
This paper presents the first investigation of the use of the Kohonen self-organizing map (SOM), incorporating lead concentration and its isotopic composition in moss bags, to assess the spatial and temporal patterns of lead in urban microenvironments. The moss-bag experiment was carried out during 2011 in a city tunnel in Belgrade, as well as in street canyons at different heights (4, 8 and 16 m) and in public garages. The moss bags were exposed for 5 and 10 weeks. The results revealed that a 10-week period represents a suitable exposure time for screening Pb isotopic composition in active biomonitoring analysis. They also showed that SOM analysis, which recognized slight differences among moss samples with respect to exposure time and horizontal and vertical spatial distribution using both stable lead isotope ratios and Pb concentration, can be recommended for biomonitoring analysis of lead distribution in urban microenvironments.
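For readers unfamiliar with the method, a minimal Kohonen SOM can be sketched in a few lines. This toy implementation on synthetic two-cluster data (not the moss-bag dataset) shows the core update rule: the best-matching unit pulls its grid neighbourhood toward each sample, with learning rate and neighbourhood width shrinking over time:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen SOM: each grid node holds a weight vector that is
    pulled toward samples; the neighbourhood kernel shrinks with training."""
    rng = np.random.default_rng(seed)
    gx, gy = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij")
    coords = np.stack([gx.ravel(), gy.ravel()], axis=1).astype(float)
    w = rng.normal(size=(grid[0] * grid[1], data.shape[1]))
    for e in range(epochs):
        frac = e / epochs
        lr = lr0 * (1.0 - frac)                    # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5        # decaying neighbourhood width
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))        # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distance to BMU
        h = np.exp(-d2 / (2.0 * sigma ** 2))               # neighbourhood kernel
        w += lr * h[:, None] * (x - w)
    return w

# Toy data: two well-separated clusters stand in for distinct moss samples.
data = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (50, 2)),
                  np.random.default_rng(2).normal(3.0, 0.1, (50, 2))])
weights = train_som(data)
print(weights.shape)  # -> (16, 2)
```

After training, nearby grid nodes hold similar weight vectors, which is what lets the SOM expose "slight differences" between samples as neighbouring map regions.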
Mächtle, W
1999-01-01
Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit on the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same problems are faced in the polymer industry, where dispersions of latices, pigments, lacquers, and emulsions must be characterized. There is a rich history of methods developed for the polymer industry finding use in the biochemical sciences. Two such methods are presented. These use analytical ultracentrifugation to determine the density and size distributions for submicron-sized particles. Both methods rely on Stokes' equations to estimate particle size and density, whereas turbidity, corrected using Mie's theory, provides the concentration measurement. The first method uses the sedimentation time in dispersion media of different densities to evaluate the particle density and size distribution. This method works provided the sample is chemically homogeneous. The second method splices together data gathered at different sample concentrations, thus permitting the high-resolution determination of the size distribution of particle diameters ranging from 10 to 3000 nm. By increasing the rotor speed exponentially from 0 to 40,000 rpm over a 1-h period, size distributions may be measured for extremely broadly distributed dispersions. Presented here is a short history of particle size distribution analysis using the ultracentrifuge, along with a description of the newest experimental methods. Several applications of the methods are provided that demonstrate the breadth of their utility, including extensions to samples containing nonspherical and chromophoric particles. PMID:9916040
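The Stokes-based size estimate can be illustrated with a small calculation. All material constants and run parameters below are hypothetical, chosen only to show the form of the relation between boundary movement and particle diameter:

```python
import math

def stokes_diameter(eta, rho_p, rho_f, omega, r1, r2, t):
    """Particle diameter (m) from a sedimenting boundary moving from radius
    r1 to r2 in time t at angular speed omega, via Stokes' sedimentation law:
        d = sqrt( 18*eta*ln(r2/r1) / ((rho_p - rho_f) * omega**2 * t) )
    eta: solvent viscosity (Pa*s); rho_p, rho_f: particle and fluid densities
    (kg/m^3); r1, r2 in m; t in s."""
    return math.sqrt(18.0 * eta * math.log(r2 / r1)
                     / ((rho_p - rho_f) * omega ** 2 * t))

# Illustrative numbers: a polystyrene-like particle (1050 kg/m^3) in water,
# spun at 40,000 rpm, boundary moving from 6.0 to 6.5 cm over 600 s.
omega = 40000 * 2 * math.pi / 60        # rpm -> rad/s
d = stokes_diameter(1.0e-3, 1050.0, 998.0, omega, 0.060, 0.065, 600.0)
print(f"{d * 1e9:.0f} nm")  # -> "51 nm"
```

The same relation, inverted, explains why running at different rotor speeds and media densities resolves both size and density for turbidity-detected dispersions.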
ZWD time series analysis derived from NRT data processing. A regional study of PW in Greece.
NASA Astrophysics Data System (ADS)
Pikridas, Christos; Balidakis, Kyriakos; Katsougiannopoulos, Symeon
2015-04-01
ZWD (Zenith Wet/non-hydrostatic Delay) estimates have been routinely derived in near real time by the newly established Analysis Center in the Department of Geodesy and Surveying of Aristotle University of Thessaloniki (DGS/AUT-AC), in the framework of E-GVAP (EUMETNET GNSS water vapour project), since October 2014. This process takes place on an hourly basis and yields, among other products, station coordinates and tropospheric parameter estimates for a network of 90+ permanent GNSS (Global Navigation Satellite System) stations distributed across the wider Hellenic region. In this study, the temporal and spatial variability of the ZWD estimates was examined, as well as their relation to coordinate series extracted from both float and fixed solutions of the initial phase ambiguities. For this investigation, Bernese GNSS Software v5.2 was used to process the six-month dataset from the aforementioned network. For time series analysis we employed techniques such as the generalized Lomb-Scargle periodogram and Burg's maximum entropy method, due to the inefficiency of applying the Discrete Fourier Transform to the test dataset. The analysis yielded interesting results for further geophysical interpretation. In addition, the spatial and temporal distributions of Precipitable Water vapour (PW) obtained from both the ZWD estimates and ERA-Interim reanalysis grids were investigated.
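As a sketch of the periodogram step, assuming an unevenly sampled series with a diurnal signal (synthetic values, not the DGS/AUT-AC data), the Lomb-Scargle analysis can be approximated with SciPy's classical routine:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(42)

# Synthetic "ZWD-like" series: a 24 h cycle plus noise, sampled irregularly.
t = np.sort(rng.uniform(0.0, 240.0, 400))            # hours, with gaps
y = 0.05 * np.sin(2 * np.pi * t / 24.0) + 0.01 * rng.normal(size=t.size)
y -= y.mean()                                        # lombscargle expects zero mean

periods = np.linspace(6.0, 48.0, 2000)               # candidate periods (hours)
power = lombscargle(t, y, 2 * np.pi / periods)       # takes angular frequencies
best = periods[np.argmax(power)]
print(f"dominant period ~ {best:.1f} h")
```

The periodogram peaks near the injected 24 h period, which is the kind of diurnal signature one would look for in ZWD and PW series.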
Visibility graph network analysis of natural gas price: The case of North American market
NASA Astrophysics Data System (ADS)
Sun, Mei; Wang, Yaqi; Gao, Cuixia
2016-11-01
Fluctuations in the price of natural gas significantly affect the global economy. Research on the characteristics of natural gas price fluctuations, their turning points, and the influence of each cycle on subsequent price series is therefore of great significance. Global natural gas trade concentrates on three regional markets: the North American market, the European market and the Asia-Pacific market, with North America having the most developed natural gas financial market. In addition, mature legal supervision and coordinated regulations make the North American market more open and more competitive. This paper focuses on the North American natural gas market specifically. The Henry Hub natural gas spot price time series is converted into a visibility graph network, which provides a new direction for macro analysis of time series, and several indicators are investigated: degree and degree distribution, average shortest path length, and community structure. The internal mechanisms underlying price fluctuations are explored through these indicators. The results show that the natural gas price visibility graph network (NGP-VGN) exhibits both small-world and scale-free properties. After random rearrangement of the original price time series, the degree distribution of the network becomes exponential, different from the original one. This means that the original price time series has long-range negatively correlated fractal characteristics. In addition, nodes with large degree correspond to significant geopolitical or economic events, and communities correspond to time cycles in the visibility graph network. The cycles of the time series and the impact scope of hubs can be found by community structure partition.
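The conversion from a price series to a network follows the natural visibility criterion of Lacasa et al.: two samples are linked if the straight line between them clears every intermediate sample. A minimal O(n^2) sketch on a toy series (not Henry Hub data) is:

```python
import numpy as np

def visibility_graph(series):
    """Natural visibility graph: node a sees node b if the straight line
    between (a, y_a) and (b, y_b) passes strictly above every intermediate
    point.  Returns the set of undirected edges as (a, b) index pairs."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# Toy "price" series; degree sequence of the resulting graph:
degrees = np.zeros(5, dtype=int)
for a, b in visibility_graph([3.0, 1.0, 2.0, 1.5, 4.0]):
    degrees[a] += 1
    degrees[b] += 1
print(sorted(degrees.tolist()))  # -> [2, 2, 3, 3, 4]
```

On real series, the degree distribution of this graph is what distinguishes scale-free (fractal) dynamics from the exponential distribution obtained after shuffling.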
Leonid Storm Flux Analysis From One Leonid MAC Video AL50R
NASA Technical Reports Server (NTRS)
Gural, Peter S.; Jenniskens, Peter; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
A detailed meteor flux analysis is presented of a seventeen-minute portion of one videotape, collected on November 18, 1999, during the Leonid Multi-instrument Aircraft Campaign. The data were recorded around the peak of the Leonid meteor storm using an intensified CCD camera pointed towards the low southern horizon. Positions of meteors on the sky were measured. These measured meteor distributions were compared to a Monte Carlo simulation, which is a new approach to parameter estimation for mass ratio and flux. Comparison of simulated flux versus observed flux levels, seen between 1:50:00 and 2:06:41 UT, indicates a magnitude population index of r = 1.8 +/- 0.1 and a mass ratio of s = 1.64 +/- 0.06. The average spatial density of the material contributing to the Leonid storm peak is measured at 0.82 +/- 0.19 particles per square kilometer per hour for particles of at least absolute visual magnitude +6.5. Clustering analysis of the arrival times of Leonids impacting the earth's atmosphere over the total observing interval shows no enhancement or clumping down to time scales of the video frame rate. This indicates a uniformly random temporal distribution of particles in the stream encountered during the 1999 epoch. Based on the observed distribution of meteors on the sky and the model distribution, recommendations are made for the optimal pointing directions for video camera meteor counts during future ground and airborne missions.
NASA Technical Reports Server (NTRS)
Lameris, J.
1984-01-01
The development of a thermal and structural model for a hypersonic wing test structure using the NASTRAN finite-element method as its primary analytical tool is described. A detailed analysis was defined to obtain the temperature and thermal stress distribution in the whole wing as well as the five upper and lower root panels. During the development of the models, it was found that the thermal application of NASTRAN and the VIEW program, used for the generation of the radiation exchange coefficients, were deficient. Although solutions could be found for most of these deficiencies, one particular deficiency in the current thermal model prevented the final computation of the temperature distributions. A SPAR analysis of a single bay of the wing, using data converted from the original NASTRAN model, indicates that local temperature-time distributions can be obtained in good agreement with the test data. Conversion of the NASTRAN thermal model into a SPAR model is recommended to meet the immediate goal of obtaining an accurate thermal stress distribution.
Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis
NASA Astrophysics Data System (ADS)
Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A.; Ghosh, Sujit K.; Montet, Benjamin T.; Newton, Elisabeth R.
2017-05-01
Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
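The premise that Gaussian processes define distributions over flexible, continuous functions can be illustrated by drawing samples from a GP prior with a squared-exponential kernel. This sketch is generic and is not the open-source package referenced in the abstract:

```python
import numpy as np

def se_kernel(x1, x2, amp=1.0, length=0.5):
    """Squared-exponential covariance k(x, x') = amp^2 * exp(-(x-x')^2 / 2l^2).
    Nearby inputs are highly correlated, so sampled functions are smooth."""
    d = x1[:, None] - x2[None, :]
    return amp ** 2 * np.exp(-0.5 * (d / length) ** 2)

# Draw three smooth random functions from the GP prior -- the kind of
# flexible, continuous curves used to model stellar spectra.
x = np.linspace(0.0, 5.0, 200)
K = se_kernel(x, x) + 1e-10 * np.eye(x.size)      # jitter for numerical stability
rng = np.random.default_rng(7)
samples = rng.multivariate_normal(np.zeros(x.size), K, size=3)
print(samples.shape)  # -> (3, 200)
```

In the disentangling application, kernels like this one serve as priors over the rest-frame component spectra while the orbital parameters shift them in velocity.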
Noise analysis of antibiotic permeation through bacterial channels
NASA Astrophysics Data System (ADS)
Nestorovich, Ekaterina M.; Danelon, Christophe; Winterhalter, Mathias; Bezrukov, Sergey M.
2003-05-01
Statistical analysis of high-resolution current recordings from a single ion channel reconstituted into a planar lipid membrane allows us to study transport of antibiotics at the molecular detail. Working with the general bacterial porin, OmpF, we demonstrate that addition of zwitterionic β-lactam antibiotics to the membrane-bathing solution introduces transient interruptions in the small-ion current through the channel. Time-resolved measurements reveal that one antibiotic molecule blocks one of the monomers in the OmpF trimer for characteristic times from microseconds to hundreds of microseconds. Spectral noise analysis enables us to perform measurements over a wide range of changing parameters. In all cases studied, the residence time of an antibiotic molecule in the channel exceeds the estimated time for free diffusion by orders of magnitude. This demonstrates that, in analogy to substrate-specific channels that evolved to bind specific metabolite molecules, antibiotics have 'evolved' to be channel-specific. The charge distribution of an efficient antibiotic complements the charge distribution at the narrowest part of the bacterial porin. Interaction of these charges creates a zone of attraction inside the channel and compensates the penetrating molecule's entropy loss and desolvation energy. This facilitates antibiotic translocation through the narrowest part of the channel and accounts for higher antibiotic permeability rates.
Local spectrum analysis of field propagation in an anisotropic medium. Part I. Time-harmonic fields.
Tinkelman, Igor; Melamed, Timor
2005-06-01
The phase-space beam summation is a general analytical framework for local analysis and modeling of radiation from extended source distributions. In this formulation, the field is expressed as a superposition of beam propagators that emanate from all points in the source domain and in all directions. In this Part I of a two-part investigation, the theory is extended to include propagation in anisotropic medium characterized by a generic wave-number profile for time-harmonic fields; in a companion paper [J. Opt. Soc. Am. A 22, 1208 (2005)], the theory is extended to time-dependent fields. The propagation characteristics of the beam propagators in a homogeneous anisotropic medium are considered. With use of Gaussian windows for the local processing of either ordinary or extraordinary electromagnetic field distributions, the field is represented by a phase-space spectral distribution in which the propagating elements are Gaussian beams that are formulated by using Gaussian plane-wave spectral distributions over the extended source plane. By applying saddle-point asymptotics, we extract the Gaussian beam phenomenology in the anisotropic environment. The resulting field is parameterized in terms of the spatial evolution of the beam curvature, beam width, etc., which are mapped to local geometrical properties of the generic wave-number profile. The general results are applied to the special case of uniaxial crystal, and it is found that the asymptotics for the Gaussian beam propagators, as well as the physical phenomenology attached, perform remarkably well.
Does player time-in-game affect tackle technique in elite level rugby union?
Tierney, Gregory J; Denvir, Karl; Farrell, Garreth; Simms, Ciaran K
2018-02-01
It has been hypothesised that fatigue may be a major factor in tackle-related injury risk in rugby union and hence more injuries occur in the later stages of a game. The aim of this study is to identify changes in ball carrier or tackler proficiency characteristics, using elite level match video data, as player time-in-game increases. Qualitative observational cohort study. Three 2014/15 European Rugby Champions Cup games were selected for ball carrier and tackler proficiency analysis. Analysis was only conducted on players who started and remained on the field for the entire game. A separate analysis was conducted on 10 randomly selected 2014/15 European Rugby Champions Cup/Pro 12 games to assess the time distribution of tackles throughout a game. A Chi-square test and a one-way ANOVA with post-hoc testing were conducted to identify significant differences (p<0.05) in proficiency characteristics and tackle counts between quarters of the game, respectively. Player time-in-game did not affect tackle proficiency for either the ball carrier or the tackler. Any results that showed statistical significance did not indicate a trend of deterioration in proficiency with increased player time-in-game. The time distribution analysis indicated that more tackles occurred in the final quarter of the game than in the first (p=0.04) and second (p<0.01). It appears that player time-in-game does not affect tackler or ball carrier tackle technique proficiency at the elite level. More tackles occurring in the final quarter of a game provides an alternative explanation for more tackle-related injuries occurring at this stage. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Reliability analysis of degradable networks with modified BPR
NASA Astrophysics Data System (ADS)
Wang, Yu-Qing; Zhou, Chao-Fan; Jia, Bin; Zhu, Hua-Bing
2017-12-01
In this paper, the effect of speed limits on degradable networks with capacity restrictions and forced flow is investigated. A link performance function considering the road capacity is proposed. Additionally, the probability density distribution and the cumulative distribution of link travel time are introduced for the degradable network. By distinguishing the value of the speed limit, four cases are discussed. Means and variances of the link and route travel times of the degradable road network are calculated. Furthermore, numerical simulation experiments on a specific network show that the speed limit strategy can reduce the travel time budget and the mean link and route travel times. Moreover, the speed limit strategy can reduce the variance of network travel times to some extent.
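Although the paper's modified link performance function is not reproduced here, the standard BPR form it builds on is easy to state. The coefficients alpha = 0.15 and beta = 4 are the conventional Bureau of Public Roads defaults, not values from the paper:

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
    """Standard BPR link performance function:
        t = t0 * (1 + alpha * (v/c)**beta)
    t0: free-flow travel time; v: link flow; c: link capacity.
    A degraded or speed-limited link is modelled by lowering c or raising t0."""
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)

# Travel time grows steeply once flow approaches and exceeds capacity:
print(bpr_travel_time(10.0, 800, 1000))    # -> ~10.614 (v/c = 0.8)
print(bpr_travel_time(10.0, 1200, 1000))   # -> ~13.110 (congested, v/c = 1.2)
```

Treating the capacity c as a random variable is what turns this deterministic curve into the travel-time distributions analyzed in the paper.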
Spatio-Temporal Analysis of Smear-Positive Tuberculosis in the Sidama Zone, Southern Ethiopia
Dangisso, Mesay Hailu; Datiko, Daniel Gemechu; Lindtjørn, Bernt
2015-01-01
Background Tuberculosis (TB) is a disease of public health concern, with a varying distribution across settings depending on socio-economic status, HIV burden, and the availability and performance of the health system. Ethiopia is a country with a high burden of TB, with regional variations in TB case notification rates (CNRs). However, TB program reports are often compiled and reported at higher administrative units that do not show the burden at lower units, so there is limited information about the spatial distribution of the disease. We therefore aim to assess the spatial distribution and presence of spatio-temporal clustering of the disease in different geographic settings over 10 years in the Sidama Zone in southern Ethiopia. Methods A retrospective space-time and spatial analysis were carried out at the kebele level (the lowest administrative unit within a district) to identify spatial and space-time clusters of smear-positive pulmonary TB (PTB). Scan statistics, Global Moran's I, and Getis-Ord (Gi*) statistics were all used to help analyze the spatial distribution and clusters of the disease across settings. Results A total of 22,545 smear-positive PTB cases notified over 10 years were used for spatial analysis. In a purely spatial analysis, we identified the most likely cluster of smear-positive PTB in 192 kebeles in eight districts (RR = 2, p<0.001), with 12,155 observed and 8,668 expected cases. The Gi* statistic also identified clusters in the same areas, and the spatial clusters showed stability in most areas in each year during the study period. The space-time analysis also detected the most likely cluster in 193 kebeles in the same eight districts (RR = 1.92, p<0.001), with 7,584 observed and 4,738 expected cases in 2003-2012. Conclusion The study found variations in CNRs and significant spatio-temporal clusters of smear-positive PTB in the Sidama Zone.
The findings can be used to guide TB control programs to devise effective TB control strategies for the geographic areas characterized by the highest CNRs. Further studies are required to understand the factors associated with clustering based on individual level locations and investigation of cases. PMID:26030162
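The Global Moran's I statistic used in the spatial analysis has a compact matrix form. A minimal sketch on a toy four-unit lattice (not the Sidama kebele data) is:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation:
        I = (n / S0) * (z' W z) / (z' z),  z = x - mean(x),  S0 = sum(W).
    values: length-n attribute vector (e.g. CNRs per unit);
    weights: n x n spatial weight matrix with zero diagonal.
    I > 0 indicates clustering of similar values; I < 0 indicates dispersion."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

# Toy four-unit chain: each unit's neighbours are the adjacent units.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i([1.0, 2.0, 3.0, 4.0], w), 3))   # -> 0.333 (clustered trend)
```

A monotone gradient gives positive I, while an alternating high-low pattern on the same lattice gives strongly negative I; significance in practice is assessed against a permutation null.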
Murakoshi, Takuma; Masuda, Tomohiro; Utsumi, Ken; Tsubota, Kazuo; Wada, Yuji
2013-01-01
Previous studies have reported the effects of the statistics of luminance distribution on visual freshness perception using pictures covering the degradation process of food samples. However, these studies did not examine the effect of individual differences between foods of the same kind. Here we examine whether luminance distribution continues to have a significant effect on visual freshness perception when the visual stimuli include individual differences in addition to the degradation process of the foods. We took pictures of the degradation of three fishes over 3.29 hours in a controlled environment, then cropped square patches of their eyes from the original images as visual stimuli. Eleven participants performed paired comparison tests judging the visual freshness of the fish eyes at three points of degradation. Perceived freshness scores (PFS) were calculated using the Bradley-Terry model for each image. The ANOVA revealed that the PFS for each fish decreased as the degradation time increased; however, the differences in the PFS between individual fish were larger for the shorter degradation time and smaller for the longer degradation time. A multiple linear regression analysis was conducted in order to determine the relative importance of the statistics of luminance distribution of the stimulus images in predicting PFS. The results show that standard deviation and skewness of the luminance distribution have a significant influence on PFS. These results show that even when foodstuffs contain individual differences, visual freshness perception and changes in luminance distribution correlate with degradation time.
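The two luminance statistics found to predict freshness, standard deviation and skewness, are straightforward to compute. This sketch uses toy pixel patches, not the fish-eye stimuli:

```python
import numpy as np

def luminance_stats(image):
    """Standard deviation and (Fisher-Pearson) skewness of a grayscale
    luminance distribution -- the two predictors of perceived freshness."""
    v = np.asarray(image, dtype=float).ravel()
    sd = v.std()
    skew = np.mean(((v - v.mean()) / sd) ** 3)
    return sd, skew

# A symmetric patch vs a right-skewed one (a few bright specular pixels,
# as on a fresh, glossy surface).  Values are illustrative 8-bit luminances.
dull = np.array([100, 110, 120, 130, 140], dtype=float)
shiny = np.array([100, 100, 100, 100, 220], dtype=float)
print(luminance_stats(shiny)[1])  # -> 1.5 (positive skew from the highlight)
```

In the study's regression, such per-image statistics would be the predictors and the Bradley-Terry PFS the response.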
NASA Astrophysics Data System (ADS)
Han, H. J.; Kang, J. H.
2016-12-01
Since July 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi-real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing) is part of the KIAPS data assimilation system and has been performing well in the semi-real-time forecast system. In this study, since KPOP is now able to handle scatterometer wind data, we analyze the effect of scatterometer winds (ASCAT-A/B) on the KIAPS semi-real-time forecast system. The global O-B (observation minus background) distribution and statistics of the scatterometer winds show two things: the difference between the background field and the observations is not too large, and KPOP processes the scatterometer wind data well. The changes in the analysis increment due to the assimilated winds appear most prominently in the lower atmosphere. The scatterometer wind data also cover wide ocean areas where observations would otherwise be scarce. The impact of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields, and through the vertical statistics of O-A. From these results, we conclude that scatterometer wind data have a positive effect on the lower-level performance of the KIAPS semi-real-time forecast system. Longer-term results on the effect of scatterometer wind data will be analyzed in future work.
Improved first-order uncertainty method for water-quality modeling
Melching, C.S.; Anmangandla, S.
1992-01-01
Uncertainties are unavoidable in water-quality modeling and the subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have frequently been used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: for Monte Carlo simulation it is mainly computational time; for first-order analysis it is mainly accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distributions of the critical dissolved-oxygen deficit and the critical dissolved oxygen, using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probability estimated by Monte Carlo simulation for the Streeter-Phelps model output, while using two orders of magnitude less computer time, regardless of the probability distributions assumed for the uncertain model parameters.
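A plain Monte Carlo baseline for the critical-deficit exceedance probability, the quantity against which the advanced first-order method is compared, can be sketched as follows. The Streeter-Phelps sag equation is standard, but all parameter distributions below are illustrative, not those of the paper's examples:

```python
import numpy as np

def critical_deficit(k1, k2, L0, D0, t_grid):
    """Maximum DO deficit over t_grid for the Streeter-Phelps sag curve:
        D(t) = k1*L0/(k2 - k1) * (exp(-k1*t) - exp(-k2*t)) + D0*exp(-k2*t)
    k1: deoxygenation rate, k2: reaeration rate (1/day),
    L0: initial BOD, D0: initial deficit (mg/L)."""
    D = (k1 * L0 / (k2 - k1)) * (np.exp(-k1 * t_grid) - np.exp(-k2 * t_grid)) \
        + D0 * np.exp(-k2 * t_grid)
    return D.max()

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1001)                    # days

# Illustrative lognormal uncertainty in the rate coefficients.
k1 = rng.lognormal(np.log(0.3), 0.2, 2000)
k2 = rng.lognormal(np.log(0.7), 0.2, 2000)
dc = np.array([critical_deficit(a, b, L0=20.0, D0=1.0, t_grid=t)
               for a, b in zip(k1, k2)])

# Monte Carlo exceedance probability for a 6 mg/L critical-deficit threshold.
p_exceed = float((dc > 6.0).mean())
print(0.0 <= p_exceed <= 1.0)  # -> True
```

The advanced first-order method replaces the 2000 model evaluations above with a handful of linearizations, one per output level of interest, which is where the two-orders-of-magnitude speedup comes from.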
Moutsopoulou, Karolina; Waszak, Florian
2013-05-01
It has been shown that in associative learning it is possible to disentangle the effects caused on behaviour by the associations between a stimulus and a classification (S-C) and the associations between a stimulus and the action performed towards it (S-A). Such evidence has been provided using ex-Gaussian distribution analysis to show that different parameters of the reaction time distribution reflect the different processes. Here, using this method, we investigate another difference between these two types of associations: What is the relative durability of these associations across time? Using a task-switching paradigm and by manipulating the lag between the point of the creation of the associations and the test phase, we show that S-A associations have stronger effects on behaviour when the lag between the two repetitions of a stimulus is short. However, classification learning affects behaviour not only in short-term lags but also (and equally so) when the lag between prime and probe is long and the same stimuli are repeatedly presented within a different classification task, demonstrating a remarkable durability of S-C associations.
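The ex-Gaussian decomposition of reaction-time distributions can be fitted with SciPy's exponnorm, whose shape parameter K relates to the exponential component via tau = K * scale. The data below are synthetic, not the task-switching RTs from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic reaction times (ms): a Gaussian stage (mu, sigma) plus an
# exponential tail (tau) -- the standard ex-Gaussian generative model.
mu, sigma, tau = 400.0, 40.0, 120.0
rt = rng.normal(mu, sigma, 4000) + rng.exponential(tau, 4000)

# SciPy parameterizes the ex-Gaussian as exponnorm(K, loc, scale),
# with loc = mu, scale = sigma, and tau = K * scale.
K, loc, scale = stats.exponnorm.fit(rt)
print(round(loc), round(scale), round(K * scale))  # estimates of mu, sigma, tau
```

In studies like this one, S-A and S-C effects are then compared through shifts in the fitted mu versus changes in the tail parameter tau, rather than through mean RT alone.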
Teixeira, E R; Sato, Y; Akagawa, Y; Shindoi, N
1998-04-01
Further validation of finite element analysis (FEA) in implant biomechanics requires an increase in modelled range and mesh refinement, with a consequent increase in element number and calculation time. To develop a new method that allows a decrease in the modelled range and element number (along with less calculation time and less computer memory), 10 FEA models of the mandible with different mesio-distal lengths and element counts were constructed based on three-dimensional graphic data of the bone structure around an osseointegrated implant. Analysis of the stress distribution under 100 N loading, with the most external planes of the models fixed, indicated that a minimal bone length of 4.2 mm on the mesial and distal sides was acceptable for FEA representation. Moreover, unification of elements located far away from the implant surface did not affect the stress distribution. These results suggest that it may be possible to develop a replica FEA implant model of the mandible with less range and fewer elements without altering the stress distribution.
NASA Technical Reports Server (NTRS)
Sepehry-Fard, F.; Coulthard, Maurice H.
1995-01-01
The process of predicting the values of maintenance time-dependent variable parameters, such as mean time between failures (MTBF), over time must not in turn introduce uncontrolled deviation into the results of the ILS analysis, such as life cycle costs, spares calculations, etc. A minor deviation in the values of maintenance time-dependent variable parameters such as MTBF over time will have a significant impact on logistics resource demands, International Space Station availability, and maintenance support costs. There are two types of parameters in the logistics and maintenance world: (a) fixed and (b) variable. Fixed parameters, such as cost per man-hour, are relatively easy to predict and forecast; they normally follow a linear path and do not change randomly. However, the variable parameters that are the subject of this report, such as MTBF, do not follow a linear path; they normally fall within the distribution curves discussed in this publication. The challenging task then becomes the use of statistical techniques to accurately forecast future non-linear time-dependent variable arisings and events with a high confidence level. This, in turn, will translate into tremendous cost savings and improved availability all around.
Elixir - how to handle 2 trillion pixels
NASA Astrophysics Data System (ADS)
Magnier, Eugene A.; Cuillandre, Jean-Charles
2002-12-01
The Elixir system at CFHT provides automatic data quality assurance and calibration for the wide-field mosaic imager camera CFH12K. Elixir consists of a variety of tools, including: a real-time analysis suite which runs at the telescope to provide quick feedback to the observers; a detailed analysis of the calibration data; and an automated pipeline for processing data to be distributed to observers. To date, 2.4 × 10^12 night-time sky pixels from CFH12K have been processed by the Elixir system.
NASA Astrophysics Data System (ADS)
Mohammed, Touseef Ahmed Faisal
Since 2000, renewable electricity installations in the United States (excluding hydropower) have more than tripled. Renewable electricity grew at a compounded annual average of nearly 14% per year from 2000-2010. Wind, concentrated solar power (CSP) and solar photovoltaic (PV) are the fastest growing renewable energy sectors. In 2010 in the U.S., solar PV grew by over 71% and CSP by 18% from the previous year. Globally, renewable electricity installations more than quadrupled from 2000-2010, and solar PV generation grew by a factor of more than 28 between 2000 and 2010. CSP and solar PV installations are increasingly being added on the distribution grid. These PV installations push electrical current from the load centers back toward the generating stations, yet the transmission and distribution grid was designed for uni-directional flow of electrical energy from generating stations to load centers. This causes imbalances in the voltage and switchgear of the electrical circuitry. With the continuous rise in PV installations, analysis of voltage profiles and penetration levels remains an active area of research. Standard distributed photovoltaic (PV) generators represented in simulation studies do not reflect the exact location and variability properties, such as the distance between interconnection points and substations, voltage regulators, solar irradiance and other environmental factors. Quasi-static simulations assist in hour- and day-ahead peak load planning, as their time-sequence analysis helps in generation allocation. Simulation models can be daily, hourly or yearly depending on the duty cycle and dynamics of the system. High penetration of PV into the power grid dynamically changes the voltage profile and power flow in the distribution circuits due to the inherent variability of PV. A number of modeling and simulation tools are available for the study of such high-penetration PV scenarios.
This thesis specifically utilizes OpenDSS, an open-source Distribution System Simulator developed by the Electric Power Research Institute, to simulate the grid voltage profile with a large-scale PV system under quasi-static time series, considering variations of PV output in seconds and minutes as well as the average daily load variations. A 13-bus IEEE distribution feeder model with distributed residential- and commercial-scale PV at different buses is used for the simulation studies. Time series simulations are discussed for various modes of operation, considering dynamic PV penetration at different time periods in a day. In addition, this thesis demonstrates simulations taking into account the presence of moving cloud cover for solar forecasting studies.
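The quasi-static time-series (QSTS) idea described above can be sketched without OpenDSS: the toy two-bus feeder below is re-solved once per hourly step with a time-varying PV injection. All numbers and profile shapes are illustrative assumptions, not the thesis's IEEE 13-bus model; a real study would drive OpenDSS at each step instead.

```python
# Minimal QSTS sweep on a toy two-bus feeder: a steady-state solve per
# time step with a varying PV injection (illustrative parameters only).
import math

V_SOURCE = 1.0    # source bus voltage, per unit
R_LINE   = 0.05   # feeder resistance, per unit (reactance ignored)

def load_bus_voltage(p_load, p_pv):
    """Linearized per-unit voltage at the load bus for a net power flow."""
    p_net = p_load - p_pv            # positive: power flows toward the load
    return V_SOURCE - R_LINE * p_net # toy voltage-drop model

def daily_profile(steps=24):
    """Hourly QSTS sweep over one day with assumed load and sun shapes."""
    volts = []
    for h in range(steps):
        p_load = 0.6 + 0.3 * math.sin(math.pi * (h - 6) / 12) ** 2
        sun = max(0.0, math.sin(math.pi * (h - 6) / 12))  # daylight 6h-18h
        p_pv = 0.8 * sun
        volts.append(load_bus_voltage(p_load, p_pv))
    return volts

v = daily_profile()
print(f"min {min(v):.3f} pu, max {max(v):.3f} pu")
```

Midday PV support raises the load-bus voltage relative to night hours, which is the voltage-profile effect the thesis studies at feeder scale.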
Exact probability distribution functions for Parrondo's games
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete-time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data; here we find the phenomenon in model systems and provide a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
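The exact capital distribution can also be obtained by direct probability propagation rather than the Fourier-transform method of the paper. The sketch below uses the standard textbook parameters for the capital-dependent game B (an assumption, not taken from this work) and exact rational arithmetic:

```python
# Exact capital distribution for Parrondo's capital-dependent games by
# propagating probabilities round by round (textbook game parameters).
from fractions import Fraction

EPS  = Fraction(0)                 # bias parameter (EPS = 0: fair limit)
P_A  = Fraction(1, 2) - EPS        # game A win probability
P_B1 = Fraction(1, 10) - EPS       # game B, capital divisible by 3
P_B2 = Fraction(3, 4) - EPS        # game B, otherwise

def step(dist, game):
    """One round: map {capital: probability} to the next distribution."""
    new = {}
    for c, p in dist.items():
        pw = P_A if game == 'A' else (P_B1 if c % 3 == 0 else P_B2)
        new[c + 1] = new.get(c + 1, Fraction(0)) + p * pw
        new[c - 1] = new.get(c - 1, Fraction(0)) + p * (1 - pw)
    return new

def play(sequence, rounds):
    """Exact capital distribution after playing a periodic game sequence."""
    dist = {0: Fraction(1)}
    for i in range(rounds):
        dist = step(dist, sequence[i % len(sequence)])
    return dist

d = play('B', 20)
print(float(sum(c * p for c, p in d.items())))  # mean capital, 20 rounds
```

Comparing `play('B', n)` with an alternation such as `play('AB', n)` reproduces the paradox that mixing two losing games can yield a winning one; the odd/even parity of the support is the source of the two limiting distributions the paper mentions.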
NASA Technical Reports Server (NTRS)
Podwysocki, M. H.
1976-01-01
A study was made of the field size distributions for LACIE test sites 5029, 5033, and 5039, People's Republic of China. Field lengths and widths were measured from LANDSAT imagery, and field area was statistically modeled. Field size parameters have log-normal or Poisson frequency distributions. These were normalized to the Gaussian distribution and theoretical population curves were made. When compared to fields in other areas of the same country measured in the previous study, field lengths and widths in the three LACIE test sites were 2 to 3 times smaller and areas were smaller by an order of magnitude.
A method for developing design diagrams for ceramic and glass materials using fatigue data
NASA Technical Reports Server (NTRS)
Heslin, T. M.; Magida, M. B.; Forrest, K. A.
1986-01-01
The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress, parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution, and using either median-value or homologous-ratio analysis of the test results.
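The Weibull assumption mentioned above can be illustrated with a short sketch: median-rank regression, a common way to estimate Weibull parameters from strength or fatigue data. The data here are synthetic and the parameters invented; this is a generic estimator, not the report's procedure.

```python
# Median-rank regression for Weibull shape and scale: plot
# ln(-ln(1 - F)) against ln(x) and fit a straight line.
import math, random

def weibull_fit(samples):
    """Return (shape, scale) estimated by median-rank regression."""
    xs = sorted(samples)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs, 1):
        f = (i - 0.3) / (n + 0.4)  # Benard's median-rank approximation
        pts.append((math.log(x), math.log(-math.log(1.0 - f))))
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    slope = (sum((x - mx) * (y - my) for x, y in pts)
             / sum((x - mx) ** 2 for x, _ in pts))
    intercept = my - slope * mx
    return slope, math.exp(-intercept / slope)  # shape, scale

rng = random.Random(3)
data = [rng.weibullvariate(100.0, 8.0) for _ in range(2000)]
shape, scale = weibull_fit(data)
print(f"shape={shape:.1f} scale={scale:.1f}")
```

The fitted shape parameter (Weibull modulus) is what design diagrams propagate into failure-probability curves for ceramics and glasses.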
de Wit, Bianca; Kinoshita, Sachiko
2015-01-01
The magnitude of the semantic priming effect is known to increase as the proportion of related prime-target pairs in an experiment increases. This relatedness proportion (RP) effect was studied in a lexical decision task at a short prime-target stimulus onset asynchrony (240 ms), which is widely assumed to preclude strategic prospective usage of the prime. The analysis of the reaction time (RT) distribution suggested that the observed RP effect reflected a modulation of a retrospective semantic matching process. The pattern of the RP effect on the RT distribution found here is contrasted to that reported in De Wit and Kinoshita's (2014) semantic categorization study, and it is concluded that the RP effect is driven by different underlying mechanisms in lexical decision and semantic categorization.
GET electronics samples data analysis
NASA Astrophysics Data System (ADS)
Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G. F.; Pancin, J.; Pedroza, J. L.; Pibernat, J.; Pollacco, E.; Rebii, A.; Roger, T.; Sizun, P.
2016-12-01
The General Electronics for TPCs (GET) system has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose first analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of its time distribution, from the registered output samples.
NASA Technical Reports Server (NTRS)
1974-01-01
Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.
NASA Astrophysics Data System (ADS)
Chen, Y.; Reeves, G.; Friedel, R. H.
2005-12-01
The source of relativistic electrons in the Earth's radiation belts during the recovery phase of geomagnetic storms is still an open question that requires more observational analysis. To address it, we first need to differentiate between two competing mechanisms: inward radial transport and in-situ energization. Recent work has focused on analysis of the phase space density distribution for specific storms of interest. Here we expand on the results of earlier event studies by surveying the radial distribution of phase space density and its temporal evolution during storms over a two-year period (2001-2002). Data in this work are from the IES and HIST electron detectors on board POLAR, whose orbit crossed the outer part of the outer radiation belt through the equatorial plane roughly every 18 hours during this period. The fact that detected electrons with given 1st and 2nd adiabatic invariants can cover L* ~ 6-10 allows tracing of the temporally evolving radial gradient, which can help in determining the source of new electrons. Initial analysis of approximately 190 days suggests that the energization of relativistic electrons may result from a more complicated combination of radial transport and in-situ acceleration than is usually assumed.
Towards a Cloud Based Smart Traffic Management Framework
NASA Astrophysics Data System (ADS)
Rahimi, M. M.; Hakimpour, F.
2017-09-01
Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we used OpenStreetMap (OSM) real trajectories and road networks on a distributed environment. Our evaluation results indicate that the speed of importing data into this framework exceeds 8000 records per second when the dataset size is near 5 million. We also evaluated the performance of data retrieval in the proposed framework; the retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of the proposed framework through parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
Shi, Wei; Xia, Jun
2017-02-01
Water quality risk management is a hot research topic globally, linked with sustainable water resource development. Ammonium nitrogen (NH3-N) and the permanganate index (CODMn), the focus indicators in the Huai River Basin, are selected to reveal their joint transition laws based on Markov theory. A time-varying moments model, with either time or a land cover index as the explanatory variable, is applied to build the time-varying marginal distributions of the water quality time series. A time-varying copula model, which takes into consideration the non-stationarity in the marginal distributions and/or the time variation in the dependence structure between the water quality series, is constructed to describe a bivariate frequency analysis for the NH3-N and CODMn series at the same monitoring gauge. The large first-order Markov joint transition probability indicates that water quality states Class Vw, Class IV and Class III will occur easily in the water body at Bengbu Sluice. Both the marginal distribution and copula models are nonstationary, and the explanatory variable time yields better performance than the land cover index in describing the non-stationarities in the marginal distributions. In modelling the dependence structure changes, the time-varying copula has a better fitting performance than a copula with a constant or time-trend dependence parameter. The largest synchronous encounter risk probability of NH3-N and CODMn simultaneously reaching Class V is 50.61%, while the asynchronous encounter risk probability is largest when NH3-N and CODMn are inferior to the Class V and Class IV water quality standards, respectively.
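The first-order Markov transition probabilities at the heart of the joint transition analysis can be estimated from a categorical class series as sketched below. The labels are synthetic stand-ins for the NH3-N/CODMn class sequences; the estimator itself is the standard empirical one.

```python
# Empirical first-order Markov transition probabilities from a
# categorical time series: count consecutive pairs, normalize by row.
from collections import Counter

def transition_probs(states):
    """Map (from_class, to_class) -> estimated transition probability."""
    pair_counts = Counter(zip(states, states[1:]))
    row_totals = Counter(states[:-1])  # last state starts no transition
    return {pair: c / row_totals[pair[0]] for pair, c in pair_counts.items()}

series = list("IIIVVIVVVIIIVVVVIIIV")  # toy classes 'I' and 'V'
P = transition_probs(series)
print(P)
```

Each row of the resulting matrix sums to one, so the estimates can be chained to obtain multi-step joint transition probabilities like those reported for Bengbu Sluice.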
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series, we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small counts that are outside the applicability range of the normal distribution. The method is demonstrated on numerical data generated by a known system, and applied to time series of traffic accidents, the batting average of a baseball player and the sales volume of home electronics.
NASA Astrophysics Data System (ADS)
Kurdhi, N. A.; Jamaluddin, A.; Jauhari, W. A.; Saputro, D. R. S.
2017-06-01
In this study, we consider a stochastic integrated manufacturer-retailer inventory model with a service level constraint. The model analyzed in this article considers the situation in which the vendor and the buyer establish a long-term contract and strategic partnership to jointly determine the best strategy. The lead time and setup cost are assumed to be controllable by an additional crashing cost and an investment, respectively. It is assumed that shortages are allowed and partially backlogged on the buyer's side, and that the protection interval (i.e., review period plus lead time) demand distribution is unknown but has given finite first and second moments. The objective is to apply the minmax distribution-free approach to simultaneously optimize the review period, the lead time, the setup cost, the safety factor, and the number of deliveries in order to minimize the joint total expected annual cost. The service level constraint guarantees that the service level requirement is satisfied in the worst case. By constructing a Lagrange function, the solution procedure is analyzed and a solution algorithm is developed. Moreover, a numerical example and sensitivity analysis are given to illustrate the proposed model and to provide some observations and managerial implications.
Bayesian view of single-qubit clocks, and an energy versus accuracy tradeoff
NASA Astrophysics Data System (ADS)
Gopalkrishnan, Manoj; Kandula, Varshith; Sriram, Praveen; Deshpande, Abhishek; Muralidharan, Bhaskaran
2017-09-01
We bring a Bayesian approach to the analysis of clocks. Using exponential distributions as priors for clocks, we analyze how well one can keep time with a single qubit freely precessing under a magnetic field. We find that, at least with a single qubit, quantum mechanics does not allow exact timekeeping, in contrast to classical mechanics, which does. We find the design of the single-qubit clock that leads to maximum accuracy. Further, we find an energy versus accuracy tradeoff: the energy cost is at least k_B T times the improvement in accuracy, as measured by the entropy reduction in going from the prior distribution to the posterior distribution. We propose a physical realization of the single-qubit clock using charge transport across a capacitively coupled quantum dot.
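The stated bound (energy cost of at least k_B T times the entropy reduction) can be put into numbers with a back-of-envelope sketch. The exponential prior and posterior means below are assumptions chosen for illustration, not values from the paper; entropies are differential entropies in nats.

```python
# Minimum energy cost k_B * T * (H_prior - H_posterior) for sharpening
# an exponential time estimate (illustrative numbers, entropies in nats).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def exp_entropy(mean):
    """Differential entropy (nats) of an exponential with the given mean."""
    return 1.0 + math.log(mean)

def min_energy_cost(prior_mean, post_mean, temperature=300.0):
    """Lower bound on the energy cost of the stated entropy reduction."""
    d_entropy = exp_entropy(prior_mean) - exp_entropy(post_mean)
    return K_B * temperature * d_entropy

# Sharpening the clock's time estimate from a 10 s spread to a 1 s spread:
print(min_energy_cost(10.0, 1.0), "J")
```

At room temperature the bound is on the order of 1e-20 J per tenfold sharpening, tiny in absolute terms but nonzero, which is the point of the tradeoff.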
Teng, Lei; Zhang, Hongying; Dong, Yongkang; Zhou, Dengwang; Jiang, Taofei; Gao, Wei; Lu, Zhiwei; Chen, Liang; Bao, Xiaoyi
2016-09-15
A temperature-compensated distributed hydrostatic pressure sensor based on Brillouin dynamic gratings (BDGs) is proposed and demonstrated experimentally for the first time, to the best of our knowledge. The principle is to measure the hydrostatic pressure induced birefringence changes through exciting and probing the BDGs in a thin-diameter pure silica polarization-maintaining photonic crystal fiber. The temperature cross-talk to the hydrostatic pressure sensing can be compensated through measuring the temperature-induced Brillouin frequency shift (BFS) changes using Brillouin optical time-domain analysis. A distributed measurement of hydrostatic pressure is demonstrated experimentally using a 4-m sensing fiber, which has a high sensitivity, with a maximum measurement error less than 0.03 MPa at a 20-cm spatial resolution.
Time-Dependent Hartree-Fock Approach to Nuclear Pasta at Finite Temperature
NASA Astrophysics Data System (ADS)
Schuetrumpf, B.; Klatt, M. A.; Iida, K.; Maruhn, J. A.; Mecke, K.; Reinhard, P.-G.
2013-03-01
We present simulations of neutron-rich matter at subnuclear densities, such as supernova matter, with the time-dependent Hartree-Fock approximation at temperatures of several MeV. The initial state consists of α particles randomly distributed in space with a Maxwell-Boltzmann distribution in momentum space. Adding a neutron background initialized with Fermi-distributed plane waves, the calculations provide a reasonable approximation of astrophysical matter. This matter evolves into spherical, rod-like, and slab-like shapes and mixtures thereof. The simulations employ a full Skyrme interaction on a periodic three-dimensional grid. By an improved morphological analysis based on Minkowski functionals, all eight pasta shapes can be uniquely identified by the signs of only two valuations, namely the Euler characteristic and the integral mean curvature.
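One of the two valuations used above, the Euler characteristic, can be computed for a voxelized shape as χ = V − E + F − C over the cubical complex of occupied cells. The sketch below is a generic construction for binary voxel data, not the authors' analysis code.

```python
# Euler characteristic of a set of unit voxels: count the distinct
# vertices, edges, faces and cubes of the cubical complex they span.
def euler_characteristic(voxels):
    """chi = V - E + F - C for the cubical complex of occupied voxels."""
    verts, edges, faces, cubes = set(), set(), set(), set()
    for (i, j, k) in voxels:
        cubes.add((i, j, k))
        for di in (0, 1):
            for dj in (0, 1):
                for dk in (0, 1):
                    verts.add((i + di, j + dj, k + dk))
        for axis in range(3):
            other = [a for a in range(3) if a != axis]
            for o1 in (0, 1):           # 4 edges along each axis
                for o2 in (0, 1):
                    v = [i, j, k]
                    v[other[0]] += o1
                    v[other[1]] += o2
                    edges.add((tuple(v), axis))
            for off in (0, 1):          # 2 faces normal to each axis
                v = [i, j, k]
                v[axis] += off
                faces.add((tuple(v), axis))
    return len(verts) - len(edges) + len(faces) - len(cubes)

print(euler_characteristic({(0, 0, 0)}))  # a single cube has chi = 1
```

A solid blob gives χ = 1, a ring of voxels (a solid torus, the rod-like pasta topology) gives χ = 0, which is how the sign of χ separates pasta phases.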
Wang, Haibin; Zha, Daifeng; Li, Peng; Xie, Huicheng; Mao, Lili
2017-01-01
The Stockwell transform (ST) time-frequency representation (ST-TFR) is a time-frequency analysis method which combines the short-time Fourier transform with the wavelet transform, and the ST time-frequency filtering (ST-TFF) method, which takes advantage of time-frequency localized spectra, can separate signals from Gaussian noise. The ST-TFR and ST-TFF methods are used to analyze fault signals, which is reasonable and effective in the general Gaussian noise case. However, this paper shows that mechanical bearing fault signals belong to an Alpha (α) stable distribution process (1 < α < 2), and in some special cases the noise is also α-stable distributed. The performance of the ST-TFR method degrades under α stable distribution noise, and the ST-TFF method consequently fails. Hence, a new fractional lower order ST time-frequency representation (FLOST-TFR) method, employing fractional lower order moments and the ST, together with an inverse FLOST (IFLOST), is proposed in this paper. A new FLOST time-frequency filtering (FLOST-TFF) algorithm based on the FLOST-TFR method and IFLOST is also proposed, and its simplified version is presented. The discrete implementation of the FLOST-TFF algorithm is deduced, and the relevant steps are summarized. Simulation results demonstrate that the FLOST-TFR algorithm is clearly better than the existing ST-TFR algorithm under α stable distribution noise, still works well under Gaussian noise, and is robust. The FLOST-TFF method can effectively filter out α stable distribution noise and restore the original signal. The performance of the FLOST-TFF algorithm is better than that of the ST-TFF method, with smaller mixed MSEs as α and the generalized signal-to-noise ratio (GSNR) change. Finally, the FLOST-TFR and FLOST-TFF methods are applied to analyze an outer race fault signal and extract its fault features under α stable distribution noise, where excellent performance is shown. PMID:28406916
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze
2015-02-15
An innovative preparation procedure, based on microwave-assisted non-hydrolytic sol-gel synthesis, to obtain spherical magnetite nanoparticles is reported, together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. Nanoparticle growth was analyzed as a function of synthesis time and described in terms of crystallization degree, employing the Rietveld method on the magnetic nanostructured system for the determination of the amorphous content, using hematite as an internal standard. Product crystallinity increases as the microwave thermal treatment is extended, reaching very high percentages for synthesis times longer than 1 h. The microstructural evolution of the nanocrystals was followed by integral breadth methods to obtain information on the crystallite size-strain distribution. The results of the diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of transmission electron microscopy (TEM) images. A variation both in the average grain size and in the distribution of the coherently diffracting domains is evidenced, suggesting a relationship between the two quantities. The traditional integral breadth methods prove valid for a rapid assessment of diffraction line broadening effects in the above-mentioned nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. Highlights: • Fe3O4 nanocrystals were obtained by MW-assisted non-hydrolytic sol-gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinements was discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
Empirical analysis on the runners' velocity distribution in city marathons
NASA Astrophysics Data System (ADS)
Lin, Zhenquan; Meng, Fan
2018-01-01
In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility. In this paper, we empirically investigated the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of the subset of finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners over the eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (the first several courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (the last several courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens again in the last course of the middle stage, a transition from Gaussian back to log-normal follows at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
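The log-normal fitting step behind finding (1) is simple to sketch: for log-normally distributed velocities, the maximum-likelihood parameters are just the mean and standard deviation of the log-velocities. The velocities below are synthetic with assumed parameters, not marathon data.

```python
# Fitting a log-normal to a sample of velocities by taking logs:
# the MLE parameters are the mean and stdev of log(v).
import math, random, statistics

def fit_lognormal(vs):
    """Return (mu, sigma) of the underlying normal in log-space."""
    logs = [math.log(v) for v in vs]
    return statistics.fmean(logs), statistics.stdev(logs)

rng = random.Random(7)
velocities = [math.exp(rng.gauss(1.2, 0.15)) for _ in range(50000)]
mu, sigma = fit_lognormal(velocities)
mode = math.exp(mu - sigma ** 2)  # most common velocity under the fit
print(f"mu={mu:.3f} sigma={sigma:.3f} mode={mode:.2f}")
```

Comparing the log-likelihood of this fit against a Gaussian fit on the raw velocities is one way to detect the log-normal/Gaussian transitions reported in finding (2).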
The Energy Spectrum of Accelerated Electrons from Wave-Plasma Interactions in the Ionosphere
2012-06-29
Distribution A: Approved for public release; distribution is unlimited. A HAARP ... data was obtained. It was therefore necessary to find the resources to repeat the campaign effort (see budget below). A HAARP campaign was ... were highly structured in space and time. This fact, and the lack of electron temperature data at HAARP, made data analysis difficult. It became
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
1998-01-01
Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
Acoustic analysis in Mudejar-Gothic churches: Experimental results
NASA Astrophysics Data System (ADS)
Galindo, Miguel; Zamarreño, Teófilo; Girón, Sara
2005-05-01
This paper describes the preliminary results of research work in acoustics, conducted in a set of 12 Mudejar-Gothic churches in the city of Seville in the south of Spain. Despite common architectural style, the churches feature individual characteristics and have volumes ranging from 3947 to 10 708 m^3. Acoustic parameters were measured in unoccupied churches according to the ISO-3382 standard. An extensive experimental study was carried out using impulse response analysis through a maximum length sequence measurement system in each church. It covered aspects such as reverberation (reverberation times, early decay times), distribution of sound levels (sound strength); early to late sound energy parameters derived from the impulse responses (center time, clarity for speech, clarity, definition, lateral energy fraction), and speech intelligibility (rapid speech transmission index), which all take both spectral and spatial distribution into account. Background noise was also measured to obtain the NR indices. The study describes the acoustic field inside each temple and establishes a discussion for each one of the acoustic descriptors mentioned by using the theoretical models available and the principles of architectural acoustics. Analysis of the quality of the spaces for music and speech is carried out according to the most widespread criteria for auditoria.
Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Nikbay, Melike; Heeg, Jennifer
2017-01-01
This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport (S4T) wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain, representing them as samples drawn from statistical distributions and propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that of a deterministic assessment. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes, and also focused on determining input sample distributions.
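The "samples drawn from statistical distributions, propagated through the analysis" workflow can be sketched in miniature with plain Monte Carlo. The response function and input distributions below are toy assumptions, not the S4T solvers or the Task Group's actual parameters; PCE and POD exist precisely to do this with far fewer solver calls.

```python
# Minimal Monte Carlo uncertainty propagation: sample uncertain inputs,
# evaluate a (toy) response model, summarize the output distribution.
import random, statistics

def response(stiffness, density):
    """Toy stand-in for an aeroelastic response quantity."""
    return (stiffness / density) ** 0.5

rng = random.Random(42)
samples = [response(rng.gauss(100.0, 5.0), rng.gauss(1.0, 0.02))
           for _ in range(10000)]
print(statistics.fmean(samples), statistics.stdev(samples))
```

The output standard deviation quantifies how input uncertainty maps into the vehicle characteristic; the cost (10000 model evaluations here) is what makes reduced-order surrogates attractive for high-fidelity solvers.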
Henríquez-Henríquez, Marcela Patricia; Billeke, Pablo; Henríquez, Hugo; Zamorano, Francisco Javier; Rothhammer, Francisco; Aboitiz, Francisco
2014-01-01
Intra-individual variability of response times (RTisv) is considered as potential endophenotype for attentional deficit/hyperactivity disorder (ADHD). Traditional methods for estimating RTisv lose information regarding response times (RTs) distribution along the task, with eventual effects on statistical power. Ex-Gaussian analysis captures the dynamic nature of RTisv, estimating normal and exponential components for RT distribution, with specific phenomenological correlates. Here, we applied ex-Gaussian analysis to explore whether intra-individual variability of RTs agrees with criteria proposed by Gottesman and Gould for endophenotypes. Specifically, we evaluated if normal and/or exponential components of RTs may (a) present the stair-like distribution expected for endophenotypes (ADHD > siblings > typically developing children (TD) without familiar history of ADHD) and (b) represent a phenotypic correlate for previously described genetic risk variants. This is a pilot study including 55 subjects (20 ADHD-discordant sibling-pairs and 15 TD children), all aged between 8 and 13 years. Participants resolved a visual Go/Nogo with 10% Nogo probability. Ex-Gaussian distributions were fitted to individual RT data and compared among the three samples. In order to test whether intra-individual variability may represent a correlate for previously described genetic risk variants, VNTRs at DRD4 and SLC6A3 were identified in all sibling-pairs following standard protocols. Groups were compared adjusting independent general linear models for the exponential and normal components from the ex-Gaussian analysis. Identified trends were confirmed by the non-parametric Jonckheere-Terpstra test. Stair-like distributions were observed for μ (p = 0.036) and σ (p = 0.009). An additional "DRD4-genotype" × "clinical status" interaction was present for τ (p = 0.014) reflecting a possible severity factor. Thus, normal and exponential RTisv components are suitable as ADHD endophenotypes.
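The ex-Gaussian decomposition of RT distributions used above (normal component μ, σ plus exponential component τ) can be sketched with a simple method-of-moments estimator; published studies typically use maximum-likelihood fits instead, and the RTs below are synthetic with assumed parameters.

```python
# Method-of-moments recovery of ex-Gaussian parameters (mu, sigma, tau)
# from a response-time sample: tau from the skewness, then mu and sigma
# from the mean and variance.
import math, random, statistics

def ex_gaussian_sample(mu, sigma, tau, n, seed=1):
    """Draw RTs as Normal(mu, sigma) plus Exponential(mean tau)."""
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau)
            for _ in range(n)]

def fit_ex_gaussian_moments(rts):
    m = statistics.fmean(rts)
    s = statistics.stdev(rts)
    skew = (sum((x - m) ** 3 for x in rts) / len(rts)) / s ** 3
    tau = s * (max(skew, 1e-9) / 2.0) ** (1.0 / 3.0)
    mu = m - tau
    sigma = math.sqrt(max(s * s - tau * tau, 1e-9))
    return mu, sigma, tau

rts = ex_gaussian_sample(mu=400.0, sigma=40.0, tau=100.0, n=20000)
mu, sigma, tau = fit_ex_gaussian_moments(rts)
print(f"mu={mu:.0f} sigma={sigma:.0f} tau={tau:.0f}")
```

Fitting μ, σ, τ separately per participant is what lets such studies test, as here, whether the normal components show the stair-like endophenotype pattern while τ interacts with genotype.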
2007-01-01
Background The US Food and Drug Administration approved the Charité artificial disc on October 26, 2004. This approval was based on an extensive analysis and review process; 20 years of disc usage worldwide; and the results of a prospective, randomized, controlled clinical trial that compared lumbar artificial disc replacement to fusion. The results of the investigational device exemption (IDE) study led to a conclusion that clinical outcomes following lumbar arthroplasty were at least as good as outcomes from fusion. Methods The author performed a new analysis of the Visual Analog Scale pain scores and the Oswestry Disability Index scores from the Charité artificial disc IDE study and used a nonparametric statistical test, because observed data distributions were not normal. The analysis included all of the enrolled subjects in both the nonrandomized and randomized phases of the study. Results Subjects from both the treatment and control groups improved from the baseline situation (P < .001) at all follow-up times (6 weeks to 24 months). Additionally, these pain and disability levels with artificial disc replacement were superior (P < .05) to the fusion treatment at all follow-up times including 2 years. Conclusions The a priori statistical plan for an IDE study may not adequately address the final distribution of the data. Therefore, statistical analyses more appropriate to the distribution may be necessary to develop meaningful statistical conclusions from the study. A nonparametric statistical analysis of the Charité artificial disc IDE outcomes scores demonstrates superiority for lumbar arthroplasty versus fusion at all follow-up time points to 24 months. PMID:25802574
Measuring Asymmetric Interactions in Resting State Brain Networks*
Joshi, Anand A.; Salloum, Ronald; Bhushan, Chitresh; Leahy, Richard M.
2015-01-01
Directed graph representations of brain networks are increasingly being used in brain image analysis to indicate the direction and level of influence among brain regions. Most of the existing techniques for directed graph representations are based on time series analysis and the concept of causality, and use time lag information in the brain signals. These time lag-based techniques can be inadequate for functional magnetic resonance imaging (fMRI) signal analysis due to the limited time resolution of fMRI as well as the low frequency hemodynamic response. The aim of this paper is to present a novel measure of necessity that uses asymmetry in the joint distribution of brain activations to infer the direction and level of interaction among brain regions. We present a mathematical formula for computing necessity and extend this measure to partial necessity, which can potentially distinguish between direct and indirect interactions. These measures do not depend on time lag for directed modeling of brain interactions and therefore are more suitable for fMRI signal analysis. The necessity measures were used to analyze resting state fMRI data to determine the presence of hierarchy and asymmetry of brain interactions during resting state. We performed ROI-wise analysis using the proposed necessity measures to study the default mode network. The empirical joint distribution of the fMRI signals was determined using kernel density estimation, and was used for computation of the necessity and partial necessity measures. The significance of these measures was determined using a one-sided Wilcoxon rank-sum test. Our results are consistent with the hypothesis that the posterior cingulate cortex plays a central role in the default mode network. PMID:26221690
Calibrated tree priors for relaxed phylogenetics and divergence time estimation.
Heled, Joseph; Drummond, Alexei J
2012-01-01
The use of fossil evidence to calibrate divergence time estimation has a long history. More recently, Bayesian Markov chain Monte Carlo has become the dominant method of divergence time estimation, and fossil evidence has been reinterpreted as the specification of prior distributions on the divergence times of calibration nodes. These so-called "soft calibrations" have become widely used, but the statistical properties of calibrated tree priors in a Bayesian setting have not been carefully investigated. Here, we clarify that calibration densities, such as those defined in BEAST 1.5, do not represent the marginal prior distribution of the calibration node. We illustrate this with a number of analytical results on small trees. We also describe an alternative construction for a calibrated Yule prior on trees that allows direct specification of the marginal prior distribution of the calibrated divergence time, with or without the restriction of monophyly. This method requires the computation of the Yule prior conditional on the height of the divergence being calibrated. Unfortunately, a practical solution for multiple calibrations remains elusive. Our results suggest that direct estimation of the prior induced by specifying multiple calibration densities should be a prerequisite of any divergence time dating analysis.
Analysis of Multispectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been widely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters describing vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operative context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary step to make all scenes radiometrically consistent, no-change regression normalization was applied to the time series; then all the available data concerning forest maps, municipal boundaries, water basins, rivers, and roads were overlaid in a GIS environment. From the 2006 image we elaborated the NDVI map and analyzed the distribution for each land cover class. To separate out physiological variability and identify anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basic mapping units and classification label aggregation. A map of priority areas was then produced to support the field survey plan. To analyze the territorial evolution, historical land cover maps were elaborated by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of territorial dynamics and in particular for evaluating the efficacy of past intervention activities.
NASA Astrophysics Data System (ADS)
Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.
2015-08-01
Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This allowed us to estimate the uncertainty in the separated GPP at half-hourly time steps. This yielded the posterior distribution of GPP at each half hour and allowed the quantification of uncertainty. The time series of posterior distributions thus obtained allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
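The non-rectangular hyperbola light-response curve at the heart of the model can be sketched as a simple least-squares fit. This is only an illustration of the NRH functional form with hypothetical parameter values (α, P_max, θ), not the paper's Bayesian MCMC analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

def nrh(par_i, alpha, p_max, theta):
    """Non-rectangular hyperbola: GPP as a function of incident PAR.
    alpha: initial slope, p_max: asymptotic GPP, theta: curvature (0 < theta < 1)."""
    s = alpha * par_i + p_max
    return (s - np.sqrt(s ** 2 - 4.0 * theta * alpha * par_i * p_max)) / (2.0 * theta)

rng = np.random.default_rng(1)
par = np.linspace(0.0, 2000.0, 60)              # PAR, umol m-2 s-1 (hypothetical)
gpp_obs = nrh(par, 0.05, 25.0, 0.7) + rng.normal(0.0, 0.5, par.size)

# Point-estimate fit; the study instead placed priors on these parameters
popt, _ = curve_fit(nrh, par, gpp_obs, p0=[0.04, 20.0, 0.5],
                    bounds=([1e-4, 1.0, 0.01], [1.0, 100.0, 0.999]))
```

The curvature θ is what distinguishes the NRH from the rectangular hyperbola; keeping θ below 1 guarantees the square-root argument stays non-negative.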
Lee, Chi Hyun; Luo, Xianghua; Huang, Chiung-Yu; DeFor, Todd E; Brunstein, Claudio G; Weisdorf, Daniel J
2016-06-01
Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this article, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. © 2015, The International Biometric Society.
Lee, Chi Hyun; Huang, Chiung-Yu; DeFor, Todd E.; Brunstein, Claudio G.; Weisdorf, Daniel J.
2015-01-01
Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this paper, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. PMID:26575402
Distribution of crystalloid fluid changes with the rate of infusion: a population-based study.
Hahn, R G; Drobin, D; Zdolsek, J
2016-05-01
Crystalloid fluid requires 30 min for complete distribution throughout the extracellular fluid space and tends to cause long-standing peripheral edema. A kinetic analysis of the distribution of Ringer's acetate with increasing infusion rates was performed to obtain a better understanding of these characteristics of crystalloids. Data were retrieved from six studies in which 76 volunteers and preoperative patients had received between 300 ml and 2375 ml of Ringer's acetate solution at a rate of 20-80 ml/min (0.33-0.83 ml/min/kg). Serial measurements of the blood hemoglobin concentration were used as inputs in a kinetic analysis based on a two-volume model with micro-constants, using software for nonlinear mixed effects. The micro-constants describing distribution (k12) and elimination (k10) were unchanged when the rate of infusion increased, with half-times of 16 and 26 min, respectively. In contrast, the micro-constant describing how rapidly the already distributed fluid left the peripheral space (k21) decreased by 90% when the fluid was infused more rapidly, corresponding to an increase in the half-time from 3 to 30 min. The central volume of distribution (V(c)) doubled. The return of Ringer's acetate from the peripheral fluid compartment to the plasma was slower with high than with low infusion rates. Edema is a normal consequence of plasma volume expansion with this fluid, even in healthy volunteers. The results are consistent with the view that the viscoelastic properties of the interstitial matrix are responsible for the distribution and redistribution characteristics of crystalloid fluid. © 2016 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
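The two-volume micro-constant model described above can be sketched numerically. The following is an illustrative Euler simulation, not the study's mixed-effects analysis; the rate constants are back-computed from the reported half-times (half-time = ln 2 / k) and the infusion settings are hypothetical:

```python
import numpy as np

def simulate_fluid(rate_ml_min, t_inf, k10, k12, k21, dt=0.01, t_end=120.0):
    """Euler integration of a two-volume micro-constant model: vc and vt are
    central and peripheral volume expansions (ml) above baseline."""
    steps = int(t_end / dt)
    vc = vt = 0.0
    out = np.empty((steps, 3))
    for i in range(steps):
        t = i * dt
        r = rate_ml_min if t < t_inf else 0.0   # infusion stops at t_inf
        dvc = r - (k10 + k12) * vc + k21 * vt   # central: input, elimination, return
        dvt = k12 * vc - k21 * vt               # peripheral: distribution in / out
        vc += dvc * dt
        vt += dvt * dt
        out[i] = (t, vc, vt)
    return out

# Rate constants from the reported half-times: half-time = ln 2 / k
k12 = np.log(2) / 16.0   # distribution, 16 min
k10 = np.log(2) / 26.0   # elimination, 26 min
k21 = np.log(2) / 3.0    # return flow at a low infusion rate, 3 min
traj = simulate_fluid(rate_ml_min=40.0, t_inf=30.0, k10=k10, k12=k12, k21=k21)
```

Re-running with a tenfold smaller k21 (the high-rate case reported in the study) shows the peripheral expansion draining back much more slowly, which is the kinetic signature of the edema discussed in the abstract.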
NASA Astrophysics Data System (ADS)
Duman, M. S.; Kaplan, E.; Cuvalcı, O.
2018-01-01
The present paper is based on experimental studies and numerical simulations of the surface fatigue failure of PTFE-bronze layered journal bearings under real-time loading. 'Permaglide Plain Bearings P10' type journal bearings were experimentally tested under different real-time dynamic loadings using the real-time journal bearing test system in our laboratory. The journal bearing consists of a PTFE-bronze layer approximately 0.32 mm thick on a steel support layer 2.18 mm thick. Two different approaches were considered in the experiments: (i) real-time constant loading with varying bearing widths, and (ii) different real-time loadings at constant bearing width. Fatigue regions, micro-crack dispersion and stress distributions occurring in the journal bearing were experimentally and theoretically investigated. The relation between fatigue regions and pressure distributions was investigated by determining the circumferential pressure distribution under real-time dynamic loadings at every 10° of crank angle. In the theoretical part, stress and deformation distributions at the surface of the journal bearing were analysed using finite element methods to determine the relationship between stress and fatigue behaviour. As a result of this study, the maximum oil pressure and fatigue cracks were observed in the most heavily loaded regions of the bearing surface. Experimental results show that the fatigue behaviour of PTFE-bronze layered journal bearings is better than that of bearings with a white metal alloy.
Understanding Processes and Timelines for Distributed Photovoltaic
data from more than 30,000 PV systems across 87 utilities in 16 states to better understand solar photovoltaic (PV) interconnection process time frames in the United States. This study includes an analysis of interconnection metrics covering the four steps involved in the utility interconnection process for solar
Fight or Flight: An Account of a Professor's First Year Obsession
ERIC Educational Resources Information Center
León, Raina J.
2014-01-01
In this article, a junior faculty member explores her obsessions with the distribution of time in the areas of teaching, scholarship, service and personal life through an intensive analysis of an academic calendar, populated with data points in those areas. Through this analysis, she examines her first year and her own development as an academic.
Analysis of high-resolution foreign exchange data of USD-JPY for 13 years
NASA Astrophysics Data System (ADS)
Mizuno, Takayuki; Kurihara, Shoko; Takayasu, Misako; Takayasu, Hideki
2003-06-01
We analyze high-resolution foreign exchange data consisting of 20 million data points of USD-JPY over 13 years to report firm statistical laws in the distributions and correlations of exchange rate fluctuations. A conditional probability density analysis clearly shows the existence of trend-following movements at a time scale of 8 ticks, about 1 min.
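The idea behind a conditional probability analysis of tick data can be shown with a toy series (the move process and `p_follow` value below are hypothetical, not the USD-JPY data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy tick-by-tick move signs with a trend-following tendency: each move
# keeps the previous move's sign with probability p_follow
p_follow = 0.6
keep = np.where(rng.random(49_999) < p_follow, 1, -1)
moves = np.cumprod(np.concatenate(([1], keep)))   # +1 = up-tick, -1 = down-tick

# Estimated conditional probability that a move repeats the previous sign;
# a value above 0.5 indicates trend-following at the one-tick scale
p_hat = (moves[1:] == moves[:-1]).mean()
```

Repeating the estimate at coarser lags (comparing moves k ticks apart) would mimic the time-scale dependence the authors report.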
Climate change and fishing: a century of shifting distribution in North Sea cod.
Engelhard, Georg H; Righton, David A; Pinnegar, John K
2014-08-01
Globally, spatial distributions of fish stocks are shifting but although the role of climate change in range shifts is increasingly appreciated, little remains known of the likely additional impact that high levels of fishing pressure might have on distribution. For North Sea cod, we show for the first time and in great spatial detail how the stock has shifted its distribution over the past 100 years. We digitized extensive historical fisheries data from paper charts in UK government archives and combined these with contemporary data to a time-series spanning 1913-2012 (excluding both World Wars). New analysis of old data revealed that the current distribution pattern of cod - mostly in the deeper, northern- and north-easternmost parts of the North Sea - is almost opposite to that during most of the Twentieth Century - mainly concentrated in the west, off England and Scotland. Statistical analysis revealed that the deepening, northward shift is likely attributable to warming; however, the eastward shift is best explained by fishing pressure, suggestive of significant depletion of the stock from its previous stronghold, off the coasts of England and Scotland. These spatial patterns were confirmed for the most recent 3 1/2 decades by data from fisheries-independent surveys, which go back to the 1970s. Our results demonstrate the fundamental importance of both climate change and fishing pressure for our understanding of changing distributions of commercially exploited fish. © 2013 Crown copyright. Global Change Biology published by John Wiley & Sons Ltd. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
Meng, Qiang; Weng, Jinxian
2013-01-01
Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and emergency medical service (EMS) response time were modeled as 2 random variables following the lognormal distribution. Their mean values and standard deviations were respectively formulated as functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather, and work zone type. Work zone traffic accident data from the Fatality Analysis Reporting System between 2002 and 2009 were utilized to determine the distributions of the ANT and the EMS arrival time in the United States. A mixed logistic regression model, taking into account the uncertainty associated with the ANT and the EMS response time, was developed to estimate the risk of death. The results showed that the uncertainty of the ANT was primarily influenced by crash time and road type, whereas the uncertainty of the EMS response time was greatly affected by road type, weather, and light conditions. In addition, work zone accidents occurring during a holiday and in poor light conditions were found to be statistically associated with a longer mean ANT and longer EMS response time. The results also show that shortening the ANT was a more effective approach to reducing the risk of death than shortening the EMS response time in work zones. To shorten the ANT and the EMS response time, work zone activities are suggested to be undertaken during non-holidays, during the daytime, and in good weather and light conditions.
Broadband spectral analysis of non-Debye dielectric relaxation in percolating heterostructures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuncer, Enis; Bellatar, J; Achour, M E
2011-01-01
In this study, the main features of dielectric relaxation in carbon black epoxy composites are discussed using several types of complementary modelling (i.e., the Cole-Cole phenomenological equation, Jonscher's universal dielectric response, and an approach that relies on a continuous distribution of relaxation times). These methods of characterizing the relaxation were applied below Tg. Through the numerical model we can obtain the characteristic effective relaxation time and exponents straightforwardly. However, the true relaxation spectrum can be obtained from the distribution of relaxation times calculated from the complex dielectric permittivity. Over the compositional range explored, relaxation follows a Vogel-Tammann-Fulcher-like temperature dependence within the limits of experimental accuracy.
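The Cole-Cole phenomenological equation mentioned above is easy to evaluate directly; the sketch below uses hypothetical parameter values (ε∞, Δε, τ, α), where α = 0 recovers the single-relaxation-time Debye response:

```python
import numpy as np

def cole_cole(omega, eps_inf, d_eps, tau, alpha):
    """Cole-Cole complex permittivity: a symmetrically broadened (non-Debye)
    relaxation; alpha in [0, 1) controls the width of the relaxation-time
    distribution (alpha = 0 is the Debye limit)."""
    return eps_inf + d_eps / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

omega = np.logspace(-2, 6, 200)                      # angular frequency, rad/s
eps = cole_cole(omega, eps_inf=3.0, d_eps=12.0, tau=1e-3, alpha=0.3)
loss = -eps.imag                                     # dielectric loss eps''
```

The real part steps from ε∞ + Δε at low frequency down to ε∞ at high frequency, while the loss peak near ω = 1/τ is lower and broader than the Debye peak Δε/2, which is the non-Debye signature the abstract refers to.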
NASA Astrophysics Data System (ADS)
Barnhart, B. L.; Eichinger, W. E.; Prueger, J. H.
2010-12-01
Hilbert-Huang transform (HHT) is a relatively new data analysis tool which is used to analyze nonstationary and nonlinear time series data. It consists of an algorithm, called empirical mode decomposition (EMD), which extracts the cyclic components embedded within time series data, as well as Hilbert spectral analysis (HSA) which displays the time and frequency dependent energy contributions from each component in the form of a spectrogram. The method can be considered a generalized form of Fourier analysis which can describe the intrinsic cycles of data with basis functions whose amplitudes and phases may vary with time. The HHT will be introduced and compared to current spectral analysis tools such as Fourier analysis, short-time Fourier analysis, wavelet analysis and Wigner-Ville distributions. A number of applications are also presented which demonstrate the strengths and limitations of the tool, including analyzing sunspot number variability and total solar irradiance proxies as well as global averaged temperature and carbon dioxide concentration. Also, near-surface atmospheric quantities such as temperature and wind velocity are analyzed to demonstrate the nonstationarity of the atmosphere.
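The Hilbert spectral analysis step can be illustrated on a single synthetic mode. This sketch skips the EMD stage (it assumes the mode has already been extracted) and uses `scipy.signal.hilbert` to recover a time-varying instantaneous frequency, which a Fourier spectrum cannot localize in time:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)

# A chirp-like mode whose instantaneous frequency drifts from 5 Hz to 15 Hz
f_inst_true = 5.0 + 5.0 * t
phase = 2.0 * np.pi * np.cumsum(f_inst_true) / fs
x = np.cos(phase)

# Analytic signal -> instantaneous phase -> instantaneous frequency
analytic = hilbert(x)
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) * fs / (2.0 * np.pi)
```

Away from the edges (where the Hilbert transform suffers end effects), `inst_freq` tracks the drifting frequency sample by sample; plotting it against time with the envelope `np.abs(analytic)` gives one row of the Hilbert spectrogram.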
Higher-Order Theory: Structural/MicroAnalysis Code (HOTSMAC) Developed
NASA Technical Reports Server (NTRS)
Arnold, Steven M.
2002-01-01
The full utilization of advanced materials (be they composite or functionally graded materials) in lightweight aerospace components requires the availability of accurate analysis, design, and life-prediction tools that enable the assessment of component and material performance and reliability. Recently, a new commercially available software product called HOTSMAC (Higher-Order Theory--Structural/MicroAnalysis Code) was jointly developed by Collier Research Corporation, Engineered Materials Concepts LLC, and the NASA Glenn Research Center under funding provided by Glenn's Commercial Technology Office. The analytical framework for HOTSMAC is based on almost a decade of research into the coupled micromacrostructural analysis of heterogeneous materials. Consequently, HOTSMAC offers a comprehensive approach for analyzing/designing the response of components with various microstructural details, including certain advantages not always available in standard displacement-based finite element analysis techniques. The capabilities of HOTSMAC include combined thermal and mechanical analysis, time-independent and time-dependent material behavior, and internal boundary cells (e.g., those that can be used to represent internal cooling passages, see the preceding figure) to name a few. In HOTSMAC problems, materials can be randomly distributed and/or functionally graded (as shown in the figure, wherein the inclusions are distributed linearly), or broken down by strata, such as in the case of thermal barrier coatings or composite laminates.
Queueing analysis of a canonical model of real-time multiprocessors
NASA Technical Reports Server (NTRS)
Krishna, C. M.; Shin, K. G.
1983-01-01
A logical classification of multiprocessor structures from the point of view of control applications is presented. A computation of the response time distribution for a canonical model of a real-time multiprocessor is presented. The multiprocessor is approximated by a blocking model. Two separate models are derived: one from the system's point of view, and the other from the point of view of an incoming task.
Blading Design for Axial Turbomachines
1989-05-01
three-dimensional, viscous computation systems appear to have a long development period ahead, in which fluid shear stress modeling and computation time ... and n directions and T is the shear stress. As a consequence the solution time is longer than for integral methods, dependent largely on the accuracy of ... distributions over airfoils is an adaptation of thin plate deflection theory from stress analysis. At the same time, it minimizes designer effort
Lipid Vesicle Shape Analysis from Populations Using Light Video Microscopy and Computer Vision
Zupanc, Jernej; Drašler, Barbara; Boljte, Sabina; Kralj-Iglič, Veronika; Iglič, Aleš; Erdogmus, Deniz; Drobne, Damjana
2014-01-01
We present a method for giant lipid vesicle shape analysis that combines manually guided large-scale video microscopy and computer vision algorithms to enable analyzing vesicle populations. The method retains the benefits of light microscopy and enables non-destructive analysis of vesicles from suspensions containing up to several thousands of lipid vesicles (1–50 µm in diameter). For each sample, image analysis was employed to extract data on vesicle quantity and size distributions of their projected diameters and isoperimetric quotients (measure of contour roundness). This process enables a comparison of samples from the same population over time, or the comparison of a treated population to a control. Although vesicles in suspensions are heterogeneous in sizes and shapes and have distinctively non-homogeneous distribution throughout the suspension, this method allows for the capture and analysis of repeatable vesicle samples that are representative of the population inspected. PMID:25426933
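The isoperimetric quotient used as the roundness measure above has a compact definition, Q = 4πA/P², which equals 1 for a circle and falls below 1 for any other contour. A minimal sketch (synthetic contours, not the paper's video-microscopy pipeline):

```python
import numpy as np

def isoperimetric_quotient(contour):
    """Q = 4*pi*A / P^2 for a closed polygon given as an (n, 2) vertex array;
    Q = 1 for a circle, < 1 for any less round shape."""
    x, y = contour[:, 0], contour[:, 1]
    # Shoelace formula for the enclosed area
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    seg = np.diff(np.vstack([contour, contour[:1]]), axis=0)
    perim = np.sqrt((seg ** 2).sum(axis=1)).sum()
    return 4.0 * np.pi * area / perim ** 2

theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
ellipse = np.column_stack([2 * np.cos(theta), np.sin(theta)])   # 2:1 ellipse
```

Applied to each segmented vesicle contour, this yields the per-population distribution of roundness values described in the abstract.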
A multiprocessing architecture for real-time monitoring
NASA Technical Reports Server (NTRS)
Schmidt, James L.; Kao, Simon M.; Read, Jackson Y.; Weitzenkamp, Scott M.; Laffey, Thomas J.
1988-01-01
A multitasking architecture for performing real-time monitoring and analysis using knowledge-based problem solving techniques is described. To handle asynchronous inputs and perform in real time, the system consists of three or more distributed processes which run concurrently and communicate via a message passing scheme. The Data Management Process acquires, compresses, and routes the incoming sensor data to other processes. The Inference Process consists of a high performance inference engine that performs a real-time analysis on the state and health of the physical system. The I/O Process receives sensor data from the Data Management Process and status messages and recommendations from the Inference Process, updates its graphical displays in real time, and acts as the interface to the console operator. The distributed architecture has been interfaced to an actual spacecraft (NASA's Hubble Space Telescope) and is able to process the incoming telemetry in real-time (i.e., several hundred data changes per second). The system is being used in two locations for different purposes: (1) in Sunnyvale, California at the Space Telescope Test Control Center it is used in the preflight testing of the vehicle; and (2) in Greenbelt, Maryland at NASA/Goddard it is being used on an experimental basis in flight operations for health and safety monitoring.
NASA Astrophysics Data System (ADS)
Żymełka, Piotr; Nabagło, Daniel; Janda, Tomasz; Madejski, Paweł
2017-12-01
Balanced distribution of air in a coal-fired boiler is one of the most important factors in the combustion process and is strongly connected to the overall system efficiency. Reliable and continuous information about combustion airflow and fuel rate is essential for achieving an optimal stoichiometric ratio as well as efficient and safe operation of a boiler. Imbalances in air distribution result in reduced boiler efficiency, increased pollutant gas emissions and operating problems, such as corrosion, slagging or fouling. Monitoring of air flow trends in the boiler is an effective method for further analysis and can help identify important dependencies and trigger optimization actions. Accurate real-time monitoring of the air distribution in a boiler can bring economic, environmental and operational benefits. The paper presents a novel concept for an online monitoring system of air distribution in a coal-fired boiler based on real-time numerical calculations. The proposed mathematical model allows for identification of mass flow rates of secondary air to individual burners and to overfire air (OFA) nozzles. Numerical models of the air and flue gas system were developed using software for power plant simulation. The correctness of the developed model was verified and validated against reference measurement values. The presented numerical model for real-time monitoring of air distribution is capable of giving continuous determination of the complete air flows based on available distributed control system (DCS) data.
Estimating interevent time distributions from finite observation periods in communication networks
NASA Astrophysics Data System (ADS)
Kivelä, Mikko; Porter, Mason A.
2015-11-01
A diverse variety of processes—including recurrent disease episodes, neuron firing, and communication patterns among humans—can be described using interevent time (IET) distributions. Many such processes are ongoing, although event sequences are only available during a finite observation window. Because the observation time window is more likely to begin or end during long IETs than during short ones, the analysis of such data is susceptible to a bias induced by the finite observation period. In this paper, we illustrate how this length bias arises and how it can be corrected without assuming any particular shape for the IET distribution. To do this, we model event sequences using stationary renewal processes, and we formulate simple heuristics for determining the severity of the bias. To illustrate our results, we focus on the example of empirical communication networks, which are temporal networks that are constructed from communication events. The IET distributions of such systems guide efforts to build models of human behavior, and the variance of IETs is very important for estimating the spreading rate of information in networks of temporal interactions. We analyze several well-known data sets from the literature, and we find that the resulting bias can lead to systematic underestimates of the variance in the IET distributions and that correcting for the bias can lead to qualitatively different results for the tails of the IET distributions.
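The length bias can be demonstrated with a small simulation. This is an illustrative sketch under the stationary-renewal assumption, not the paper's correction method: for a window of length W, an IET of length x lies fully inside the window with probability proportional to max(0, W − x), so long IETs are preferentially lost and the observed variance shrinks.

```python
import numpy as np

rng = np.random.default_rng(3)

# Ground-truth IETs from a heavy-tailed (lognormal) renewal process
iets = rng.lognormal(mean=1.0, sigma=1.0, size=100_000)

# Stationary placement of a finite window of length W: accept each IET
# with probability max(0, W - x) / W, the chance it fits fully inside
window = 50.0
keep = rng.random(iets.size) * window < np.clip(window - iets, 0.0, None)
observed = iets[keep]
```

Comparing `observed.var()` with `iets.var()` shows the systematic underestimate of the IET variance that the authors report; shrinking `window` toward the mean IET makes the effect more severe.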
Flood Frequency Curves - Use of information on the likelihood of extreme floods
NASA Astrophysics Data System (ADS)
Faber, B.
2011-12-01
Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events (1) are independent, (2) are identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How do changes in the watershed or climate over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoid these assumptions vary from estimating trend and shift and removing them from early data (and so forming a homogeneous data set), to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically-based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed.
This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
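The method-of-moments Log-Pearson Type III fit described above can be sketched in a few lines. The snippet below is a minimal illustration on a synthetic annual-maximum record (all values hypothetical), using `scipy.stats.pearson3` to supply the frequency factors; it is not the full Bulletin 17-style procedure, which also treats outliers, historical floods, and regional skew.

```python
import numpy as np
from scipy import stats

def lp3_quantiles(peak_flows, probs):
    """Fit a Log-Pearson Type III distribution to annual peak flows by the
    method of moments on log-transformed data, and return the flow quantiles
    for the given non-exceedance probabilities."""
    logq = np.log10(np.asarray(peak_flows, dtype=float))
    mean, std = logq.mean(), logq.std(ddof=1)
    skew = stats.skew(logq, bias=False)
    # pearson3.ppf on a standardized variate supplies the frequency factor K,
    # so the quantile in log space is mean + K * std
    k = stats.pearson3.ppf(probs, skew)
    return 10.0 ** (mean + k * std)

# Synthetic 50-year annual-maximum record (hypothetical flows)
rng = np.random.default_rng(7)
flows = 10.0 ** rng.normal(3.0, 0.25, size=50)

q = lp3_quantiles(flows, [0.5, 0.9, 0.99])  # 2-, 10-, and 100-year floods
print(np.round(q))
```

The short record illustrates the representativeness problem the abstract raises: the 100-year quantile is an extrapolation far beyond the 50 observed years, so its sampling uncertainty is large.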
Nishiura, Hiroshi
2007-05-11
The incubation period of infectious diseases, the time from infection with a microorganism to onset of disease, is directly relevant to prevention and control. Since explicit models of the incubation period enhance our understanding of the spread of disease, previous classic studies were revisited, focusing on the modeling methods employed and paying particular attention to relatively unknown historical efforts. The earliest study on the incubation period of pandemic influenza was published in 1919, providing estimates of the incubation period of Spanish flu using the daily incidence on ships departing from several ports in Australia. Although the study explicitly dealt with an unknown time of exposure, the assumed periods of exposure, which had an equal probability of infection, were too long, and thus, likely resulted in slight underestimates of the incubation period. After the suggestion that the incubation period follows lognormal distribution, Japanese epidemiologists extended this assumption to estimates of the time of exposure during a point source outbreak. Although the reason why the incubation period of acute infectious diseases tends to reveal a right-skewed distribution has been explored several times, the validity of the lognormal assumption is yet to be fully clarified. At present, various different distributions are assumed, and the lack of validity in assuming lognormal distribution is particularly apparent in the case of slowly progressing diseases. The present paper indicates that (1) analysis using well-defined short periods of exposure with appropriate statistical methods is critical when the exact time of exposure is unknown, and (2) when assuming a specific distribution for the incubation period, comparisons using different distributions are needed in addition to estimations using different datasets, analyses of the determinants of incubation period, and an understanding of the underlying disease mechanisms.
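As the abstract notes, comparisons using different candidate distributions are needed when a specific form such as the lognormal is assumed for the incubation period. A minimal sketch of such a comparison, on hypothetical incubation-period data, fits lognormal and gamma distributions by maximum likelihood and compares them by AIC:

```python
import numpy as np
from scipy import stats

# Hypothetical incubation periods in days (synthetic, not real outbreak data)
rng = np.random.default_rng(0)
incubation = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=1000)

def aic(loglik, n_params):
    """Akaike information criterion; lower is better."""
    return 2 * n_params - 2 * loglik

# Fit two right-skewed candidates by maximum likelihood (location fixed at 0)
ln_s, _, ln_scale = stats.lognorm.fit(incubation, floc=0)
ga_a, _, ga_scale = stats.gamma.fit(incubation, floc=0)

aic_ln = aic(stats.lognorm.logpdf(incubation, ln_s, 0, ln_scale).sum(), 2)
aic_ga = aic(stats.gamma.logpdf(incubation, ga_a, 0, ga_scale).sum(), 2)
print(f"lognormal AIC = {aic_ln:.1f}, gamma AIC = {aic_ga:.1f}")
```

Because the synthetic data here are generated from a lognormal, the lognormal fit wins; on real data from slowly progressing diseases the ranking may reverse, which is the abstract's point.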
Komada, Fusao
2018-01-01
The aim of this study was to investigate the time-to-onset of drug-induced interstitial lung disease (DILD) following the administration of small molecule molecularly-targeted drugs via the use of the spontaneous adverse reaction reporting system of the Japanese Adverse Drug Event Report database. DILD datasets for afatinib, alectinib, bortezomib, crizotinib, dasatinib, erlotinib, everolimus, gefitinib, imatinib, lapatinib, nilotinib, osimertinib, sorafenib, sunitinib, temsirolimus, and tofacitinib were used to calculate the median onset times of DILD and the Weibull distribution parameters, and to perform the hierarchical cluster analysis. The median onset times of DILD for afatinib, bortezomib, crizotinib, erlotinib, gefitinib, and nilotinib were within one month. The median onset times of DILD for dasatinib, everolimus, lapatinib, osimertinib, and temsirolimus ranged from 1 to 2 months. The median onset times of the DILD for alectinib, imatinib, and tofacitinib ranged from 2 to 3 months. The median onset times of the DILD for sunitinib and sorafenib ranged from 8 to 9 months. Weibull distributions for these drugs when using the cluster analysis showed that there were 4 clusters. Cluster 1 described a subgroup with early to later onset DILD and early failure type profiles or a random failure type profile. Cluster 2 exhibited early failure type profiles or a random failure type profile with early onset DILD. Cluster 3 exhibited a random failure type profile or wear out failure type profiles with later onset DILD. Cluster 4 exhibited an early failure type profile or a random failure type profile with the latest onset DILD.
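The failure-type classification by Weibull shape parameter used in analyses of this kind can be sketched as follows. The onset times below are hypothetical, and the 0.9-1.1 band for a "random failure" profile is an illustrative convention, not the paper's exact criterion.

```python
import numpy as np
from scipy import stats

def weibull_failure_type(onset_days):
    """Fit a Weibull distribution to times-to-onset (location fixed at 0) and
    classify the hazard profile by the shape parameter beta.
    The 0.9-1.1 band for 'random failure' is an illustrative convention."""
    beta, _, eta = stats.weibull_min.fit(onset_days, floc=0)
    if beta < 0.9:
        kind = "early failure (decreasing hazard)"
    elif beta <= 1.1:
        kind = "random failure (roughly constant hazard)"
    else:
        kind = "wear-out (increasing hazard)"
    return beta, eta, kind

# Hypothetical onset times in days for two drug-like profiles
rng = np.random.default_rng(1)
early = rng.weibull(0.7, 300) * 30    # most events soon after starting therapy
late = rng.weibull(2.5, 300) * 250    # events concentrate around ~8 months

b_early, _, kind_early = weibull_failure_type(early)
b_late, _, kind_late = weibull_failure_type(late)
print(f"early: beta={b_early:.2f} -> {kind_early}")
print(f"late:  beta={b_late:.2f} -> {kind_late}")
```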
NASA Technical Reports Server (NTRS)
1979-01-01
The SEASAT-A commercial demonstration program ASVT is described. The program consists of two elements: (1) a set of experiments evaluating a real-time data distribution system, the SEASAT-A user data distribution system, which provides near real-time dissemination of ocean-condition and weather data products from the U.S. Navy Fleet Numerical Weather Central to a selected set of commercial and industrial users; and (2) case studies, performed by commercial and industrial users, using the data gathered by SEASAT-A during its operational life. The impact of the SEASAT-A data on business operations is evaluated by the commercial and industrial users. The approach followed in performing the case studies is described, along with the methodology used to analyze and integrate the case study results to estimate the actual and potential economic benefits of improved ocean-condition and weather forecast data.
Phenytoin crystal growth rates in the presence of phosphate and chloride ions
NASA Astrophysics Data System (ADS)
Zipp, G. L.; Rodríguez-Hornedo, N.
1992-09-01
Phenytoin crystal growth kinetics have been measured as a function of supersaturation in pH 2.2 phosphoric acid and pH 2.2 hydrochloric acid solutions. Two different methods were used for the kinetic analysis. The first involved a zone-sensing device which provided an analysis of the distribution of crystals in a batch crystallizer. Crystal growth rates were calculated from the increase in the size of the distribution with time. In the second method, growth rates were evaluated from the change in size with time of individual crystals observed under an inverted microscope. The results from each method compare favorably. The use of both techniques provides an excellent opportunity to exploit the strengths of each: an average growth rate from a population of crystals from batch crystallization and insight into the effect of growth on the morphology of the crystals from the individual crystal measurements.
Survival Analysis of Patients with End Stage Renal Disease
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) under Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both. The results also show that female patients have a greater risk of death than males. The probability of risk was given by the equation R = 1 − e^(−H(t)), where e^(−H(t)) is the survival function and H(t) the cumulative hazard function, estimated using Cox regression.
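A minimal sketch of the quantities involved, on hypothetical follow-up data: a Kaplan-Meier survival estimate, a Nelson-Aalen cumulative hazard H(t), and the cumulative risk R = 1 − e^(−H(t)). (The paper used NCSS, SPSS, and Cox regression; this only illustrates the formulas.)

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate S(t) and Nelson-Aalen cumulative
    hazard H(t) at each distinct event time; events: 1 = death, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_grid = np.unique(times[events == 1])
    s, h = [], []
    surv, cumhaz = 1.0, 0.0
    for t in t_grid:
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        surv *= 1.0 - deaths / at_risk
        cumhaz += deaths / at_risk
        s.append(surv)
        h.append(cumhaz)
    return t_grid, np.array(s), np.array(h)

# Hypothetical follow-up data (months); 1 = death observed, 0 = censored
times = [5, 8, 12, 12, 15, 20, 22, 30, 30, 34]
events = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
t, S, H = kaplan_meier(times, events)
risk = 1.0 - np.exp(-H)       # R = 1 - e^(-H(t)), as in the paper
print(np.round(S, 3))          # survival decreases over time
print(np.round(risk, 3))       # cumulative risk increases over time
```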
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Mauldin, J.
1984-01-01
The Huntsville Operations Support Center (HOSC) is a distributed computer system used to provide real-time data acquisition, analysis and display during NASA space missions and to perform simulation and study activities during non-mission times. The primary purpose is to provide a HOSC system simulation model that can be used to investigate the effects of various HOSC system configurations. Such a model would be valuable in planning the future growth of HOSC and in ascertaining the effects of data rate variations, update table broadcasting and smart display terminal data requirements on the HOSC HYPERchannel network system. A simulation model was developed in PASCAL, and results of the simulation model for various system configurations were obtained. A tutorial of the model is presented and the results of simulation runs are given. Some very high data rate situations were simulated to observe the effects of the HYPERchannel switch-over from contention to priority mode under high channel loading.
Din, Ghiyas Ud; Chughtai, Imran Rafiq; Inayat, Mansoor Hameed; Khan, Iqbal Hussain
2008-12-01
Axial dispersion, holdup and slip velocity of dispersed phase have been investigated for a range of dispersed and continuous phase superficial velocities in a pulsed sieve plate extraction column using radiotracer residence time distribution (RTD) analysis. Axial dispersion model (ADM) was used to simulate the hydrodynamics of the system. It has been observed that increase in dispersed phase superficial velocity results in a decrease in its axial dispersion and increase in its slip velocity while its holdup increases till a maximum asymptotic value is achieved. An increase in superficial velocity of continuous phase increases the axial dispersion and holdup of dispersed phase until a maximum value is obtained, while slip velocity of dispersed phase is found to decrease in the beginning and then it increases with increase in superficial velocity of continuous phase.
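RTD analysis of this kind typically reduces the measured tracer curve to its moments and infers a dispersion parameter. As an illustration, the sketch below estimates a Peclet number from a synthetic RTD via the closed-closed vessel moment relation; the tracer curve is hypothetical, and the relation assumes the axial dispersion model applies.

```python
import numpy as np
from scipy.optimize import brentq

def peclet_from_rtd(t, c):
    """Estimate the axial-dispersion Peclet number from a tracer RTD curve
    c(t) on a uniform time grid, via the closed-closed vessel moment relation
    sigma_theta^2 = 2/Pe - (2/Pe**2) * (1 - exp(-Pe))."""
    dt = t[1] - t[0]
    e = c / (c.sum() * dt)                   # normalize to an E(t) curve
    tm = (t * e).sum() * dt                  # mean residence time
    var = ((t - tm) ** 2 * e).sum() * dt     # variance of residence times
    s2 = var / tm ** 2                       # dimensionless variance
    f = lambda pe: 2.0 / pe - 2.0 / pe**2 * (1.0 - np.exp(-pe)) - s2
    return brentq(f, 1e-3, 1e4)

# Synthetic tracer response (hypothetical): Gaussian-like RTD centred at 60 s
t = np.linspace(0.0, 200.0, 2001)
c = np.exp(-0.5 * ((t - 60.0) / 12.0) ** 2)
pe = peclet_from_rtd(t, c)
print(f"Pe = {pe:.1f}")  # narrow RTD -> high Pe, i.e. low axial dispersion
```

A narrower RTD (smaller dimensionless variance) maps to a higher Peclet number, which is the direction of the dispersed-phase trend the abstract reports.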
2013-01-01
Background: Designs and analyses of clinical trials with a time-to-event outcome almost invariably rely on the hazard ratio to estimate the treatment effect and, implicitly therefore, on the proportional hazards assumption. However, the results of some recent trials indicate that there is no guarantee that the assumption will hold. Here, we describe the use of the restricted mean survival time as a possible alternative tool in the design and analysis of these trials. Methods: The restricted mean is a measure of average survival from time 0 to a specified time point, and may be estimated as the area under the survival curve up to that point. We consider the design of such trials according to a wide range of possible survival distributions in the control and research arm(s). The distributions are conveniently defined as piecewise exponential distributions and can be specified through piecewise constant hazards and time-fixed or time-dependent hazard ratios. Such designs can embody proportional or non-proportional hazards of the treatment effect. Results: We demonstrate the use of restricted mean survival time and a test of the difference in restricted means as an alternative measure of treatment effect. We support the approach through the results of simulation studies and real examples from several cancer trials. We illustrate the required sample size under proportional and non-proportional hazards, as well as the significance level and power of the proposed test. Values are compared with those from the standard approach, which utilizes the logrank test. Conclusions: We conclude that the hazard ratio cannot be recommended as a general measure of the treatment effect in a randomized controlled trial, nor is it always appropriate when designing a trial. Restricted mean survival time may provide a practical way forward and deserves greater attention. PMID:24314264
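The restricted mean survival time is simply the area under the Kaplan-Meier curve up to a chosen horizon tau. A minimal sketch on hypothetical two-arm data (not taken from the paper's trials):

```python
import numpy as np

def rmst(times, events, tau):
    """Restricted mean survival time: area under the Kaplan-Meier step
    function from 0 to tau; events: 1 = event observed, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_ev = np.unique(times[(events == 1) & (times <= tau)])
    s_prev, t_prev, area = 1.0, 0.0, 0.0
    for t in t_ev:
        area += s_prev * (t - t_prev)       # flat segment before this drop
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s_prev *= 1.0 - deaths / at_risk
        t_prev = t
    area += s_prev * (tau - t_prev)         # last flat segment up to tau
    return area

# Hypothetical two-arm trial (months), illustrating an RMST comparison
control = ([2, 4, 6, 9, 11, 14, 18, 24, 28, 33], [1, 1, 1, 1, 1, 0, 1, 1, 0, 1])
treated = ([3, 7, 10, 13, 16, 21, 26, 30, 35, 36], [1, 0, 1, 1, 0, 1, 1, 0, 1, 0])
tau = 36.0
print(f"RMST control = {rmst(*control, tau):.2f} months")
print(f"RMST treated = {rmst(*treated, tau):.2f} months")
```

The difference in restricted means reads directly as "average months of survival gained by tau", which stays interpretable even when hazards are non-proportional.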
Software for rapid time dependent ChIP-sequencing analysis (TDCA).
Myschyshyn, Mike; Farren-Dai, Marco; Chuang, Tien-Jui; Vocadlo, David
2017-11-25
Chromatin immunoprecipitation followed by DNA sequencing (ChIP-seq) and associated methods are widely used to define the genome wide distribution of chromatin associated proteins, post-translational epigenetic marks, and modifications found on DNA bases. An area of emerging interest is to study time dependent changes in the distribution of such proteins and marks by using serial ChIP-seq experiments performed in a time resolved manner. Despite such time resolved studies becoming increasingly common, software to facilitate analysis of such data in a robust automated manner is limited. We have designed software called Time-Dependent ChIP-Sequencing Analyser (TDCA), which is the first program to automate analysis of time-dependent ChIP-seq data by fitting to sigmoidal curves. We provide users with guidance for experimental design of TDCA for modeling of time course (TC) ChIP-seq data using two simulated data sets. Furthermore, we demonstrate that this fitting strategy is widely applicable by showing that automated analysis of three previously published TC data sets accurately recapitulates key findings reported in these studies. Using each of these data sets, we highlight how biologically relevant findings can be readily obtained by exploiting TDCA to yield intuitive parameters that describe behavior at either a single locus or sets of loci. TDCA enables customizable analysis of user input aligned DNA sequencing data, coupled with graphical outputs in the form of publication-ready figures that describe behavior at either individual loci or sets of loci sharing common traits defined by the user. TDCA accepts sequencing data as standard binary alignment map (BAM) files and loci of interest in browser extensible data (BED) file format. TDCA accurately models the number of sequencing reads, or coverage, at loci from TC ChIP-seq studies or conceptually related TC sequencing experiments. 
TC experiments are reduced to intuitive parametric values that facilitate biologically relevant data analysis, and the uncovering of variations in the time-dependent behavior of chromatin. TDCA automates the analysis of TC ChIP-seq experiments, permitting researchers to easily obtain raw and modeled data for specific loci or groups of loci with similar behavior while also enhancing consistency of data analysis of TC data within the genomics field.
Scientific approaches to science policy.
Berg, Jeremy M
2013-11-01
The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.
Directional pair distribution function for diffraction line profile analysis of atomistic models
Leonardi, Alberto; Leoni, Matteo; Scardi, Paolo
2013-01-01
The concept of the directional pair distribution function is proposed to describe line broadening effects in powder patterns calculated from atomistic models of nano-polycrystalline microstructures. The approach provides at the same time a description of the size effect for domains of any shape and a detailed explanation of the strain effect caused by the local atomic displacement. The latter is discussed in terms of different strain types, also accounting for strain field anisotropy and grain boundary effects. The results can in addition be directly read in terms of traditional line profile analysis, such as that based on the Warren–Averbach method. PMID:23396818
NASA Astrophysics Data System (ADS)
Wada, Yuji; Yuge, Kohei; Tanaka, Hiroki; Nakamura, Kentaro
2017-07-01
Numerical analysis of the rotation of an ultrasonically levitated droplet in a centrifugal coordinate system is discussed. A droplet levitated in an acoustic chamber is simulated using the distributed point source method and the moving particle semi-implicit method. The centrifugal coordinate system is adopted to avoid the Laplacian differential error, which causes numerical divergence or inaccuracy in global-coordinate calculations. Consequently, the duration of stable calculation is 30 times longer than that in a previous paper. Moreover, the droplet radius versus rotational acceleration characteristics show a trend similar to the theoretical and experimental values in the literature.
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with high energy concentration for seismic signal processing. However, it is interfered with by many cross terms. To suppress the cross terms of the WVD while keeping the concentration of its high energy distribution, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen class distribution and the Gabor transform, and combining a greedy strategy with the rotational invariance property of the fractional Fourier transform, the one-dimensional, one-directional optimal window function of the optimal fractional Gabor transform (OFrGT) is extended to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also resolves a shortcoming of the OFrGT, the so-called local concentration phenomenon, when encountering a multi-component signal. Experiments on both signal models and real seismic signals reveal that the proposed window can overcome the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to a seismic signal's spectral decomposition. The results show that the proposed method can explore the spatial distribution of a reservoir more precisely.
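The cross-term problem that motivates the proposed window can be reproduced in a few lines. The sketch below computes a discrete Wigner-Ville distribution of a hypothetical two-tone signal; a spurious component appears midway between the two auto terms, and at a given instant it can even dominate them.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x:
    W[n, k] = FFT over the lag m of x[n+m] * conj(x[n-m]).
    With this lag convention a tone at normalized frequency f peaks
    at FFT bin 2*f*len(x)."""
    n = len(x)
    w = np.zeros((n, n))
    for t in range(n):
        mmax = min(t, n - 1 - t)
        m = np.arange(-mmax, mmax + 1)
        kernel = np.zeros(n, dtype=complex)
        kernel[m % n] = x[t + m] * np.conj(x[t - m])
        w[t] = np.fft.fft(kernel).real   # Hermitian kernel -> real spectrum
    return w

# Hypothetical two-component signal: tones at f = 0.125 and f = 0.25
n = 256
t = np.arange(n)
x = np.exp(2j * np.pi * 0.125 * t) + np.exp(2j * np.pi * 0.25 * t)
w = wigner_ville(x)

slice_mid = np.abs(w[n // 2])
# auto terms sit at bins 64 and 128; the cross term appears at bin 96
print(sorted(int(i) for i in np.argsort(slice_mid)[-3:]))
```

At the mid-signal slice the cross term at the midpoint frequency is roughly twice as strong as either auto term, which is exactly why cross-term suppression windows such as the one proposed here are needed.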
Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.
Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko
2017-11-01
A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within the granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates of mean granule size. Overall, SA was a time-consuming but accurate technique that provided reliable information on the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size. In general, FS was two to three orders of magnitude faster than SA.
Frasca, Denis; Dahyot-Fizelier, Claire; Adier, Christophe; Mimoz, Olivier; Debaene, Bertrand; Couet, William; Marchand, Sandrine
2014-01-01
The distribution of metronidazole in the central nervous system has only been described based on cerebrospinal fluid data. However, extracellular fluid (ECF) concentrations may better predict its antimicrobial effect and/or side effects. We sought to explore brain ECF metronidazole distribution by microdialysis in patients with acute brain injury. Four brain-injured patients monitored by cerebral microdialysis received 500 mg of metronidazole over 0.5 h every 8 h. Brain dialysates and blood samples were collected at steady state over 8 h. Probe recoveries for metronidazole were evaluated by in vivo retrodialysis in each patient. Metronidazole and OH-metronidazole were assayed by high-pressure liquid chromatography, and a noncompartmental pharmacokinetic analysis was performed. Probe recovery was 78.8% ± 1.3% for metronidazole. Unbound brain metronidazole concentration-time curves were delayed relative to unbound plasma concentration-time curves, but with a mean unbound brain/plasma AUC0-τ ratio of 102% ± 19% (ranging from 87% to 124%). The unbound plasma concentration-time profiles for OH-metronidazole were flat, with mean average steady-state concentrations of 4.0 ± 0.7 μg ml(-1). This microdialysis study describes the steady-state brain distribution of metronidazole in patients and confirms its extensive distribution.
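The noncompartmental brain/plasma comparison reduces to trapezoidal AUC0-τ estimates over the dosing interval. A minimal sketch on hypothetical concentration-time data (values invented to mimic a delayed brain curve with near-complete distribution, not the study's measurements):

```python
import numpy as np

# Hypothetical unbound concentration-time data over one dosing interval (tau = 8 h)
t = np.array([0, 0.5, 1, 2, 4, 6, 8], dtype=float)           # hours
plasma = np.array([2.0, 14.1, 11.8, 8.9, 5.6, 3.4, 2.1])     # ug/mL
brain = np.array([3.0, 5.2, 8.7, 10.1, 7.9, 5.5, 3.8])       # ug/mL, delayed peak

def auc(t, c):
    """Linear trapezoidal AUC over the sampling interval."""
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))

ratio = auc(t, brain) / auc(t, plasma)
print(f"brain/plasma AUC0-tau ratio = {ratio:.2f}")
```

Even though the brain curve is shifted in time, its AUC over the interval is comparable to plasma, mirroring the near-100% ratio reported above.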
Report of the Defense Science Board Task Force on DoD Energy Strategy, More Fight - Less Fuel
2008-02-01
Approved for public release; distribution unlimited.
Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling
NASA Technical Reports Server (NTRS)
Kenton, Marc A.
2001-01-01
The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g., by allowing the integration of multidisciplinary models), facilitate collaboration by geographically-distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).
NASA Astrophysics Data System (ADS)
Baitimirova, M.; Osite, A.; Katkevics, J.; Viksna, A.
2012-08-01
Burning of candles generates fine particulate matter that degrades indoor air quality and so may harm human health. In this study, solid aerosol particles from the burning of candles of different composition and from kerosene combustion were collected in a closed laboratory system. The present work describes particulate matter collection for structural analysis and the relationship between the source and the size distribution of the particulate matter. The formation mechanism of the particulate matter and its tendency to agglomerate are also described. Particles obtained from kerosene combustion have a normal size distribution, whereas particles generated from the burning of stearin candles have a distribution shifted towards the finer particle size range. When stearin is added to a paraffin candle, the particle size distribution likewise shifts towards finer particles. Particles obtained from kerosene combustion tend to form agglomerates within a short time, while particles obtained from burning candles of different compositions show no such tendency. Particles from candles and kerosene combustion are Aitken- and accumulation-mode particles.
GRB Diversity vs. Utility as Cosmological Probes
NASA Technical Reports Server (NTRS)
Norris, J. P.; Scargle, J. D.; Bonnell, J. T.; Nemiroff, R. J.; Young, Richard E. (Technical Monitor)
1997-01-01
Recent detections of apparent gamma-ray burst (GRB) counterparts in optical and radio wavebands strongly favor the cosmological distance scale, at least for some GRBs, opening the possibility of GRBs serving as cosmological probes. But GRBs exhibit great diversity: in total duration; in the number, width, and configuration of pulses; and in pulse and overall spectral evolution. However, it is possible that a portion of this behavior reflects a luminosity distribution, and that evolution with cosmic time introduces dispersion into the average GRB characteristics -- issues analogous to those encountered with quasars. The temporal domain offers a rich avenue to investigate this problem. When corrected for assumed spectral redshift, time dilation of event durations, pulse widths, and intervals between pulses must yield the same time-dilation factor as a function of peak flux; otherwise, a luminosity distribution may be the cause of observed time-dilation effects. We describe results of burst analysis using an automated, Bayesian-based algorithm to determine burst temporal characteristics for different peak-flux groups, and derived constraints on any physical process that would introduce a luminosity distribution.
Zhang, Zheshen; Mower, Jacob; Englund, Dirk; Wong, Franco N C; Shapiro, Jeffrey H
2014-03-28
High-dimensional quantum key distribution (HDQKD) offers the possibility of a high secure-key rate with high photon-information efficiency. We consider HDQKD based on the time-energy entanglement produced by spontaneous parametric down-conversion and show that it is secure against collective attacks. Its security rests upon visibility data, obtained from Franson and conjugate-Franson interferometers, that probe photon-pair frequency correlations and arrival-time correlations. From these measurements, an upper bound can be established on the eavesdropper's Holevo information by translating the Gaussian-state security analysis for continuous-variable quantum key distribution so that it applies to our protocol. We show that visibility data from just the Franson interferometer provides a weaker, but nonetheless useful, secure-key-rate lower bound. To handle multiple-pair emissions, we incorporate the decoy-state approach into our protocol. Our results show that over a 200-km transmission distance in optical fiber, time-energy entanglement HDQKD could permit a 700-bit/sec secure-key rate and a photon information efficiency of 2 secure-key bits per photon coincidence in the key-generation phase using receivers with 15% system efficiency.
Intertime jump statistics of state-dependent Poisson processes.
Daly, Edoardo; Porporato, Amilcare
2007-01-01
A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. The method uses the survivor function obtained from a modified version of the master equation associated with the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. Application of the method to a model of neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.
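State-dependent Poisson processes of this kind can be simulated by thinning, which makes their interarrival statistics easy to examine empirically. The sketch below uses a simple Markovian self-exciting model (the state decays between jumps, each jump increments it, and the rate is affine in the state); all parameters are illustrative, not taken from the paper's models.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_jumps(a=0.5, b=0.8, k=1.0, n_jumps=20000):
    """Simulate a state-dependent Poisson process by adaptive thinning.
    Between jumps the state decays (dx/dt = -k*x); each jump adds 1 to the
    state, and the jump rate is lambda(x) = a + b*x, so jumps cluster."""
    x, t = 0.0, 0.0
    arrivals = []
    while len(arrivals) < n_jumps:
        lam_max = a + b * x                  # valid bound: rate only decays
        w = rng.exponential(1.0 / lam_max)   # candidate waiting time
        x = x * np.exp(-k * w)               # state at the candidate time
        t += w
        if rng.random() < (a + b * x) / lam_max:
            arrivals.append(t)               # accept: a jump occurs
            x += 1.0
    return np.diff(arrivals)

iat = simulate_jumps()
cv = iat.std() / iat.mean()
print(f"interarrival CV = {cv:.2f}")  # CV > 1: clustered, non-exponential timing
```

A coefficient of variation above 1 signals a heavier-than-exponential interarrival distribution, the kind of departure from homogeneous Poisson timing the survivor-function method characterizes analytically.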
Characterization of autoregressive processes using entropic quantifiers
NASA Astrophysics Data System (ADS)
Traversaro, Francisco; Redelico, Francisco O.
2018-01-01
The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
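The Bandt and Pompe ordinal methodology mentioned above can be sketched compactly. The snippet estimates normalized permutation entropy on synthetic data and shows that it responds to autocorrelation strength (an AR(1) series scores below white noise); since ordinal patterns discard amplitudes, it is blind to the marginal distribution, the gap the causal-amplitude plane aims to fill.

```python
import numpy as np
from math import log, factorial

def permutation_entropy(x, order=4, delay=1):
    """Normalized Bandt-Pompe permutation entropy in [0, 1]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))   # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-(p * np.log(p)).sum() / log(factorial(order)))

rng = np.random.default_rng(3)
white = rng.normal(size=5000)          # uncorrelated noise
ar1 = np.zeros(5000)                   # strongly autocorrelated noise
for i in range(1, 5000):
    ar1[i] = 0.9 * ar1[i - 1] + rng.normal()

h_white = permutation_entropy(white)
h_ar1 = permutation_entropy(ar1)
print(f"white noise H = {h_white:.3f}, AR(1) H = {h_ar1:.3f}")
```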
Zhang, Guodong; Zeng, Zhigang; Hu, Junhao
2018-01-01
This paper is concerned with the global exponential dissipativity of memristive inertial neural networks with discrete and distributed time-varying delays. By constructing appropriate Lyapunov-Krasovskii functionals, some new sufficient conditions ensuring global exponential dissipativity of memristive inertial neural networks are derived. Moreover, the globally exponential attractive sets and positive invariant sets are also presented here. In addition, the new proposed results complement and extend the earlier publications on conventional or memristive neural network dynamical systems. Finally, numerical simulations are given to illustrate the effectiveness of the obtained results.