Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory
NASA Astrophysics Data System (ADS)
Rahimi, A.; Zhang, L.
2012-12-01
Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence between rainfall and runoff needs to be studied. Conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or fix the form of the marginal distributions. This paper therefore presents an approach to deriving an entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. Modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, in two steps: (a) using nonparametric statistics to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; and (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The univariate analysis shows that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of log-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that: (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff; and (2) K-S goodness-of-fit tests confirm that the re-derived marginal distributions recover the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution captures dependence structures that conventional bivariate joint distributions cannot. [Figure: Joint rainfall-runoff entropy-based PDF, with corresponding marginal PDFs and histograms, for the W12 watershed. Table: K-S test results and RMSE for univariate distributions derived from the maximum-entropy-based joint probability distribution.]
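The entropy-based derivation can be illustrated in one dimension. Below is a minimal sketch, not the authors' code: with constraints on E[X] and E[ln X] (the same kind used above), the maximum-entropy density takes the form f(x) ∝ exp(-λ1·x - λ2·ln x), which is a gamma density, and the Lagrange multipliers can be found by minimizing the convex dual. The grid bounds, target moments, and optimizer settings are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize
from scipy.special import digamma

# Illustrative targets: E[X] and E[ln X] of a gamma(k=2, theta=1.5) density.
k, theta = 2.0, 1.5
m1 = k * theta                        # E[X]
m2 = digamma(k) + np.log(theta)       # E[ln X]

x = np.linspace(1e-6, 60.0, 200_000)  # integration grid (assumed support)

def dual(lams):
    # Convex dual of the entropy maximization: log Z(lambda) + lambda . m
    l1, l2 = lams
    logZ = np.log(trapezoid(np.exp(-l1 * x - l2 * np.log(x)), x))
    return logZ + l1 * m1 + l2 * m2

l1, l2 = minimize(dual, x0=[1.0, 0.0], method="Nelder-Mead").x
f = np.exp(-l1 * x - l2 * np.log(x))
f /= trapezoid(f, x)                  # normalized maximum-entropy density

# A gamma(k, theta) density should be recovered: l1 ~ 1/theta, -l2 ~ k - 1.
print(f"lambda1 = {l1:.3f}  (1/theta = {1 / theta:.3f})")
print(f"-lambda2 = {-l2:.3f}  (k - 1   = {k - 1:.3f})")
print("E[X] check   :", trapezoid(x * f, x))
print("E[ln X] check:", trapezoid(np.log(x) * f, x))
```

The bivariate case adds a covariance constraint (a λ·x·y term in the exponent) but follows the same dual-minimization logic.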
NASA Astrophysics Data System (ADS)
Lu, Siqi; Wang, Xiaorong; Wu, Junyong
2018-01-01
The paper presents a data-driven method, based on the K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in a network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, the Pareto-optimal front is obtained through a self-adaptive genetic algorithm (GA), and the solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, after detailed analysis, the planning schemes at the top of the ranking list are selected according to different planning emphases. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
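TOPSIS ranks alternatives by their closeness to an ideal solution. A minimal sketch of that ranking step follows; the decision matrix, weights, and cost/benefit labels are made-up placeholders, not the paper's data.

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives (rows of X) by closeness to the ideal solution.
    X: (m, n) decision matrix; w: criterion weights; benefit: True where
    larger-is-better, False for cost criteria."""
    V = X / np.linalg.norm(X, axis=0) * w          # vector-normalize, weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus  = np.linalg.norm(V - ideal, axis=1)    # distance to ideal
    d_minus = np.linalg.norm(V - anti,  axis=1)    # distance to anti-ideal
    return d_minus / (d_plus + d_minus)            # closeness coefficient

# Hypothetical Pareto solutions: columns = losses, cost, profit, voltage offset.
X = np.array([[0.42, 1.8, 3.1, 0.021],
              [0.39, 2.1, 3.4, 0.025],
              [0.45, 1.6, 2.9, 0.018]])
w = np.array([0.3, 0.2, 0.3, 0.2])                 # assumed weights
benefit = np.array([False, False, True, False])    # only profit is a benefit
score = topsis(X, w, benefit)
print("ranking (best first):", np.argsort(-score))
```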
Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis
NASA Astrophysics Data System (ADS)
Chen, Lu; Singh, Vijay P.
2018-02-01
Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes, but no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that use different kinds of information encoded as constraints. These distributions are the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B), and Halphen type inverse B (Hal-IB) distributions; the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. The root mean square error (RMSE) values were very small, indicating that the five generalized distributions fit the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit; these generalized distributions are therefore among the best choices for frequency analysis. The entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
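In the spirit of the comparison above, candidate distributions can be fitted by maximum likelihood and compared via AIC. A sketch follows using SciPy's generalized gamma; GB2 and the Halphen family are not available in SciPy, so a common stand-in set is used, and the synthetic data are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rain = rng.gamma(shape=1.3, scale=25.0, size=500)   # synthetic "extreme rainfall"

candidates = {
    "generalized gamma": stats.gengamma,
    "gamma": stats.gamma,
    "GEV": stats.genextreme,
}
for name, dist in candidates.items():
    params = dist.fit(rain)                          # maximum likelihood fit
    loglik = np.sum(dist.logpdf(rain, *params))
    aic = 2 * len(params) - 2 * loglik               # AIC = 2k - 2 ln L
    print(f"{name:18s} k={len(params)}  AIC={aic:9.2f}")
```

Lower AIC indicates a better trade-off between fit and parameter count, which is how the paper ranks the five generalized distributions.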
Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis
NASA Astrophysics Data System (ADS)
Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.
2012-07-01
We use Fourier-transform-based Warren-Averbach (WA) analysis to separate the contributions of crystallite size and microstrain to X-ray diffraction (XRD) profile broadening for magnetic iron oxide nanoparticles. The shape of the column-length distribution obtained from WA analysis is used to analyze the shape of the nanoparticles. From the column-length distribution, the crystallite size and its distribution are estimated and compared with the size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution obtained from WA analysis are explained in terms of the experimental parameters employed in preparing these magnetic iron oxide nanoparticles. The variation of the volume-weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core-shell model, Ms = Mbulk(1 - 6g/Dv), where Mbulk is the bulk magnetization of iron oxide and g is the magnetically disordered shell thickness.
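The core-shell relation can be fitted directly with nonlinear least squares. A sketch with scipy.optimize.curve_fit follows; the (Dv, Ms) pairs below are invented for illustration, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def core_shell(Dv, M_bulk, g):
    # Ms = M_bulk * (1 - 6 g / Dv): magnetization reduced by a disordered shell
    return M_bulk * (1.0 - 6.0 * g / Dv)

Dv = np.array([8.0, 10.0, 12.0, 15.0, 20.0])      # nm, hypothetical
Ms = np.array([41.0, 48.0, 53.0, 58.0, 64.0])     # emu/g, hypothetical

(M_bulk, g), cov = curve_fit(core_shell, Dv, Ms, p0=[80.0, 0.5])
print(f"M_bulk = {M_bulk:.1f} emu/g, shell thickness g = {g:.2f} nm")
```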
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.
1991-01-01
Viewgraphs on DataHub knowledge-based assistance for science visualization and analysis using large distributed databases are presented. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.
Robust Methods for Moderation Analysis with a Two-Level Regression Model.
Yang, Miao; Yuan, Ke-Hai
2016-01-01
Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
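A hand-rolled sketch of Huber-type M-estimation for a moderation model, via iteratively reweighted least squares, is given below. The data, tuning constant, and convergence tolerance are illustrative, and this single-level sketch is not the authors' two-level implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
x, m = rng.normal(size=n), rng.normal(size=n)
# Moderation model: the effect of x on y depends on m (interaction term x*m).
y = 0.5 + 0.4 * x + 0.3 * m + 0.25 * x * m + rng.standard_t(df=3, size=n)

X = np.column_stack([np.ones(n), x, m, x * m])

def huber_irls(X, y, c=1.345, tol=1e-8, max_iter=100):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS start
    for _ in range(max_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745           # robust scale via MAD
        u = np.abs(r / s)
        w = np.where(u <= c, 1.0, c / u)            # Huber weights
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            break
        beta = beta_new
    return beta

print("robust estimates [b0, b_x, b_m, b_xm]:", huber_irls(X, y).round(3))
```

Downweighting large residuals is what protects the interaction estimate when errors are heavy-tailed, as in the t-distributed noise above.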
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold-change criteria are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold-change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates than Smyth's parametric method when data are not normally distributed, and higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy
NASA Astrophysics Data System (ADS)
Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo
2014-01-01
In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
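The polynomial approach can be sketched at unit (nucleon-number) resolution: each element's isotopic abundances form a polynomial that is raised to the element's atom count and the element polynomials are convolved together. This is a simplified toy, far coarser than MIDAs; abundances are rounded and only C/H/N/O/S are included.

```python
import numpy as np

# Isotope abundance "polynomials", indexed by extra neutrons (A - A_lightest).
ABUND = {
    "C": [0.9893, 0.0107],                        # 12C, 13C
    "H": [0.999885, 0.000115],                    # 1H, 2H
    "N": [0.99636, 0.00364],                      # 14N, 15N
    "O": [0.99757, 0.00038, 0.00205],             # 16O, 17O, 18O
    "S": [0.9499, 0.0075, 0.0425, 0.0, 0.0001],   # 32S..36S
}

def element_poly(p, count):
    """Abundance polynomial of `count` atoms: p(z)**count by repeated squaring."""
    out, base = np.array([1.0]), np.array(p, dtype=float)
    while count:
        if count & 1:
            out = np.convolve(out, base)
        base = np.convolve(base, base)
        count >>= 1
    return out

def isotopic_distribution(formula):
    dist = np.array([1.0])
    for elem, count in formula.items():
        dist = np.convolve(dist, element_poly(ABUND[elem], count))
    return dist / dist.sum()

# Example: a bovine-insulin-like formula (counts used here for illustration).
dist = isotopic_distribution({"C": 254, "H": 377, "N": 65, "O": 75, "S": 6})
for i, p in enumerate(dist[:6]):
    print(f"M+{i}: {p:.4f}")
```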
Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis
NASA Astrophysics Data System (ADS)
Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei
2018-01-01
To evaluate the development level of the low-voltage distribution network objectively and scientifically, the analytic hierarchy process is used to construct an evaluation index model of the low-voltage distribution network. Based on principal component analysis and the roughly logarithmic distribution of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The improved algorithm decorrelates and reduces the dimensions of the evaluation model, and the resulting comprehensive score has a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, realizing a stratified evaluation of the courts. An example is given to verify the objectivity and scientific soundness of the evaluation method.
Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model
NASA Astrophysics Data System (ADS)
Yuan, Zhongda; Deng, Junxiang; Wang, Dawei
2018-02-01
An aero-engine is a complex mechatronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Owing to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to introducing the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, greatly improving the precision of the mixed-distribution reliability model. All of this favors popularizing the Weibull distribution model in engineering applications.
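A two-component mixed Weibull model with a weight coefficient can be written down and fitted by maximum likelihood. The sketch below uses synthetic failure times; starting values and optimizer settings are illustrative, and no dynamic weighting or correlation-coefficient optimization is attempted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
# Synthetic failure times from two failure modes (e.g., early wear vs. fatigue).
t = np.concatenate([weibull_min(1.5, scale=300).rvs(150, random_state=rng),
                    weibull_min(4.0, scale=900).rvs(100, random_state=rng)])

def neg_loglik(p):
    w = 1.0 / (1.0 + np.exp(-p[0]))          # mixing weight kept in (0, 1)
    k1, s1, k2, s2 = np.exp(p[1:])           # shapes and scales kept positive
    pdf = w * weibull_min.pdf(t, k1, scale=s1) + \
          (1 - w) * weibull_min.pdf(t, k2, scale=s2)
    return -np.sum(np.log(pdf + 1e-300))

res = minimize(neg_loglik,
               x0=[0.0, np.log(1.0), np.log(400.0), np.log(3.0), np.log(800.0)],
               method="Nelder-Mead", options={"maxiter": 20000})
w = 1 / (1 + np.exp(-res.x[0]))
k1, s1, k2, s2 = np.exp(res.x[1:])
print(f"w={w:.2f}; mode 1: k={k1:.2f}, scale={s1:.0f}; "
      f"mode 2: k={k2:.2f}, scale={s2:.0f}")
```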
Money-center structures in dynamic banking systems
NASA Astrophysics Data System (ADS)
Li, Shouwei; Zhang, Minghui
2016-10-01
In this paper, we propose a dynamic model for banking systems based on the description of balance sheets. It reproduces several features identified through empirical analysis. Simulation analysis of the model shows that banking systems exhibit money-center structures, that bank asset distributions follow power laws, and that contract size distributions are log-normal.
Thermal analysis of void cavity for heat pipe receiver under microgravity
NASA Astrophysics Data System (ADS)
Gui, Xiaohong; Song, Xiange; Nie, Baisheng
2017-04-01
Based on a theoretical analysis of the PCM (phase change material) solidification process, an improved model in which the void cavity distribution tends toward the high-temperature region is established. Numerical results are compared with NASA (National Aeronautics and Space Administration) results. The analysis shows that the outer-wall temperature, the melting ratio of the PCM, and the temperature gradient of the PCM canister differ greatly under different void cavity distributions; the form of the void distribution strongly affects the phase-change process. Based on simulation results with the improved void cavity distribution model, the phase-change heat transfer process in the thermal storage container is analyzed. The main goal of the improved PCM canister design is to reduce the concentrated distribution of void cavities by adding foam metal to the phase change material.
Parrado-Hernández, Emilio; Gómez-Sánchez, Eduardo; Dimitriadis, Yannis A
2003-09-01
An evaluation of distributed learning as a means to attenuate the category proliferation problem in Fuzzy ARTMAP based neural systems is carried out, from both qualitative and quantitative points of view. The study involves two original winner-take-all (WTA) architectures, Fuzzy ARTMAP and FasArt, and their distributed versions, dARTMAP and dFasArt. A qualitative analysis of the distributed learning properties of dARTMAP is made, focusing on the new elements introduced to endow Fuzzy ARTMAP with distributed learning. In addition, a quantitative study on a selected set of classification problems points out that problems have to present certain features in their output classes in order to noticeably reduce the number of recruited categories and achieve an acceptable classification accuracy. As part of this analysis, distributed learning was successfully adapted to a member of the Fuzzy ARTMAP family, FasArt, and similar procedures can be used to extend distributed learning capabilities to other Fuzzy ARTMAP based systems.
2018 Military Retirement Options: An Expected Net Present Value Decision Analysis Model
2017-03-23
Master's thesis by Bret N. Witham, Captain, USAF (MS in Operations Research, March 2017), Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Available at https://scholar.afit.edu/etd. Distribution Statement A: approved for public release; distribution unlimited.
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
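The route from Jaynes's principle to the exponential form can be made explicit. Below is a compact textbook-style derivation for a non-negative variable (velocity or travel time) with a fixed mean; it is a sketch of the standard calculation, not the paper's full constrained analysis.

```latex
\begin{aligned}
&\text{maximize } H[f] = -\int_{0}^{\infty} f(u)\,\ln f(u)\,du \\
&\text{subject to } \int_{0}^{\infty} f(u)\,du = 1,
\qquad \int_{0}^{\infty} u\,f(u)\,du = \bar{u}. \\[4pt]
&\text{Stationarity of the Lagrangian: } -\ln f(u) - 1 - \lambda_{0} - \lambda_{1}u = 0
\;\Longrightarrow\; f(u) = e^{-1-\lambda_{0}}\,e^{-\lambda_{1}u}. \\[4pt]
&\text{The two constraints fix } \lambda_{1} = 1/\bar{u}, \text{ giving }
f(u) = \tfrac{1}{\bar{u}}\,e^{-u/\bar{u}}.
\end{aligned}
```

Adding a constraint on the mean magnitude of a signed variable yields the Laplace form for accelerations by the same steps.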
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
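The normal-based power computation that the article builds in Excel can be reproduced in a few lines. The sketch below covers a one-sided one-sample z-test; the effect size and alpha are arbitrary examples.

```python
from scipy.stats import norm

def z_test_power(effect_size, n, alpha=0.05):
    """Power of a one-sided one-sample z-test: P(reject H0 | H1 true)."""
    z_crit = norm.ppf(1 - alpha)                 # rejection threshold under H0
    return 1 - norm.cdf(z_crit - effect_size * n ** 0.5)

# Example: Cohen's d = 0.3, n = 100, alpha = 0.05.
print(f"power = {z_test_power(0.3, 100):.3f}")   # ~0.912
```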
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
Cluster analysis for determining distribution center location
NASA Astrophysics Data System (ADS)
Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian
2017-12-01
Determining the location of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk, which raises new problems: how many facilities should there be, and where should they be located? This study examines a fast-food restaurant brand located in Greater Jakarta, one of the top five fast-food restaurant chains by retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center, and the network analysis results show a more efficient process, reflected in a shorter distance for the distribution process.
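A minimal sketch of the clustering stage with scikit-learn follows. The outlet coordinates are invented, and real work would use demand-weighted locations and road-network distances rather than Euclidean ones.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Hypothetical outlet coordinates (lon, lat) around Greater Jakarta.
outlets = rng.normal(loc=[106.8, -6.2], scale=[0.15, 0.12], size=(60, 2))

# Two clusters ~ two distribution centers; centroids are candidate DC sites.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(outlets)
for i, c in enumerate(km.cluster_centers_):
    members = (km.labels_ == i).sum()
    print(f"candidate DC {i}: lon={c[0]:.3f}, lat={c[1]:.3f}, "
          f"serves {members} outlets")
```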
NASA Astrophysics Data System (ADS)
Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo
2017-07-01
Modal analysis is commonly considered an effective tool for obtaining the intrinsic characteristics of structures, including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) is an effective numerical tool for performing modal analysis. In this paper, experimental strain modal analysis based on the CMIF is introduced, and a distributed fiber-optic sensor, as a dense measuring device, is applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes can be obtained. To test the effectiveness of the method, a mass lump (considered as a linear damage component) was attached to the surface of the beam, and damage detection based on strain mode shapes was carried out. The results show that strain modal parameters can be estimated effectively using the CMIF, based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.
GIS-based poverty and population distribution analysis in China
NASA Astrophysics Data System (ADS)
Cui, Jing; Wang, Yingjie; Yan, Hong
2009-07-01
Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the analysis shows that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger family size.
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a case of remote teleoperation that assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
Dynamic Singularity Spectrum Distribution of Sea Clutter
NASA Astrophysics Data System (ADS)
Xiong, Gang; Yu, Wenxian; Zhang, Shuning
2015-12-01
Fractal and multifractal theory have provided new approaches for radar signal processing and target detection against an ocean background. However, related research has mainly focused on the fractal dimension or multifractal spectrum (MFS) of sea clutter. In this paper, a new dynamic singularity analysis method for sea clutter using the MFS distribution is developed, based on moving detrending analysis (DMA-MFSD). Theoretically, we introduce time information by using the cyclic auto-correlation of sea clutter. For the transient correlation series, the instantaneous singularity spectrum is calculated with the multifractal detrended moving average (MF-DMA) algorithm, and the dynamic singularity spectrum distribution of sea clutter is acquired. In addition, we analyze the time-varying singularity exponent ranges and the maximum position function in the DMA-MFSD of sea clutter. For real sea clutter data in sea state III, we analyze the dynamic singularity spectrum distribution and conclude that radar sea clutter is non-stationary with time-varying scaling characteristics, exhibiting a time-varying singularity spectrum distribution under the proposed DMA-MFSD method. The DMA-MFSD should also serve as a reference for nonlinear dynamics and multifractal signal processing.
Analysis and numerical simulation research of the heating process in the oven
NASA Astrophysics Data System (ADS)
Chen, Yawei; Lei, Dingyou
2016-10-01
How to use an oven to bake delicious food is a chief concern of both oven designers and users. To this end, this paper analyzes the heat distribution in an oven based on its basic operating principles and simulates the temperature distribution on the rack cross-section. We construct a differential-equation model of the changing temperature distribution in the pan during oven operation, based on heat radiation and heat conduction, and, following the idea of using cellular automata to simulate the heat transfer process, use ANSYS software to perform numerical simulations for rectangular, round-cornered rectangular, elliptical, and circular pans, giving the instantaneous temperature distribution for each pan shape. The temperature distributions of the rectangular and circular pans show that the product overcooks easily at the corners and edges of a rectangular pan but not in a round pan.
NASA Astrophysics Data System (ADS)
Yang, Wenxiu; Liu, Yanbo; Zhang, Ligai; Cao, Hong; Wang, Yang; Yao, Jinbo
2016-06-01
Needleless electrospinning is considered a better avenue for producing nanofibrous materials at large scale, and the electric field intensity and its distribution play an important role in controlling nanofiber diameter and the quality of the nanofibrous web during electrospinning. In this study, a novel needleless electrospinning method is proposed based on Von Koch curves of fractal configuration. Simulation and analysis of the electric field intensity and distribution in the new electrospinning process were performed with the finite element analysis software COMSOL Multiphysics 4.4, based on linear and nonlinear Von Koch fractal curves (hereafter called fractal models). The simulations indicated that the second-level fractal structure is the optimal linear electrospinning spinneret in terms of field intensity and uniformity. Further simulation showed that the circular type of fractal spinneret yields better field intensity and distribution than the spiral type in the nonlinear fractal electrospinning technology. An electrospinning apparatus with the optimal Von Koch fractal spinneret was set up to verify the theoretical results from the COMSOL simulations, achieving a more uniform electric field distribution and lower energy cost than current needle and needleless electrospinning technologies.
On the distribution of saliency.
Berengolts, Alexander; Lindenbaum, Michael
2006-12-01
Detecting salient structures is a basic task in perceptual organization. Saliency algorithms typically mark edge-points with some saliency measure, which grows with the length and smoothness of the curve on which these edge-points lie. Here, we propose a modified saliency estimation mechanism that is based on probabilistically specified grouping cues and on curve length distributions. In this framework, the Shashua and Ullman saliency mechanism may be interpreted as a process for detecting the curve with maximal expected length. Generalized types of saliency naturally follow. We propose several specific generalizations (e.g., gray-level-based saliency) and rigorously derive the limitations on generalized saliency types. We then carry out a probabilistic analysis of expected length saliencies. Using ergodicity and asymptotic analysis, we derive the saliency distributions associated with the main curves and with the rest of the image. We then extend this analysis to finite-length curves. Using the derived distributions, we derive the optimal threshold on the saliency for discriminating between figure and background and bound the saliency-based figure-from-ground performance.
PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURES BASED ON THE ΔB METHOD
NASA Astrophysics Data System (ADS)
Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao
Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes stability easier to understand. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and amounts to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures, using the distribution of the restraining force along survey lines, which is based on the distribution of survey-line safety factors derived from the above analysis. It also presents the transverse distributive method of restraining force used for planning ground stabilization, illustrated with an example analysis.
A distributed data base management system. [for Deep Space Network
NASA Technical Reports Server (NTRS)
Bryan, A. I.
1975-01-01
Major system design features of a distributed data management system for the NASA Deep Space Network (DSN) designed for continuous two-way deep space communications are described. The reasons for which the distributed data base utilizing third-generation minicomputers is selected as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.
On Prolonging Network Lifetime through Load-Similar Node Deployment in Wireless Sensor Networks
Li, Qiao-Qin; Gong, Haigang; Liu, Ming; Yang, Mei; Zheng, Jun
2011-01-01
This paper is focused on the study of the energy hole problem in the Progressive Multi-hop Rotational Clustered (PMRC)-structure, a highly scalable wireless sensor network (WSN) architecture. Based on an analysis on the traffic load distribution in PMRC-based WSNs, we propose a novel load-similar node distribution strategy combined with the Minimum Overlapping Layers (MOL) scheme to address the energy hole problem in PMRC-based WSNs. In this strategy, sensor nodes are deployed in the network area according to the load distribution. That is, more nodes shall be deployed in the range where the average load is higher, and then the loads among different areas in the sensor network tend to be balanced. Simulation results demonstrate that the load-similar node distribution strategy prolongs network lifetime and reduces the average packet latency in comparison with existing nonuniform node distribution and uniform node distribution strategies. Note that, besides the PMRC structure, the analysis model and the proposed load-similar node distribution strategy are also applicable to other multi-hop WSN structures. PMID:22163809
Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W
2011-11-01
Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
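The profile statistics described can be computed directly. A sketch for one myocyte's intensity profile along its principal axis follows; the synthetic profile and the end-region fraction are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic gap-junction intensity profile along a myocyte's long axis:
# most label near the cell ends (intercalated discs), some central signal.
z = np.linspace(0.0, 1.0, 200)                     # normalized axial position
profile = (np.exp(-((z - 0.03) / 0.05) ** 2) +
           np.exp(-((z - 0.97) / 0.05) ** 2) +
           0.1 * rng.random(z.size))

p = profile / profile.sum()                        # treat as a distribution over z
mean = np.sum(z * p)
var = np.sum((z - mean) ** 2 * p)
skew = np.sum(((z - mean) / np.sqrt(var)) ** 3 * p)   # 3rd standardized moment
kurt = np.sum(((z - mean) / np.sqrt(var)) ** 4 * p)   # 4th standardized moment
end_frac = p[(z < 0.1) | (z > 0.9)].sum()          # "polarization" at cell ends

print(f"skewness={skew:.2f}, kurtosis={kurt:.2f}, "
      f"fraction in end 10%={end_frac:.2f}")
```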
An XML-Based Protocol for Distributed Event Services
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
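The idea of a standard XML encoding for performance events can be sketched with the Python standard library. The element and attribute names below are invented for illustration and are not the protocol's actual schema.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def encode_event(source, name, value):
    """Serialize one performance event as a small XML document."""
    ev = ET.Element("perfEvent", name=name, source=source,
                    timestamp=datetime.now(timezone.utc).isoformat())
    ET.SubElement(ev, "value").text = str(value)
    return ET.tostring(ev, encoding="unicode")

def decode_event(payload):
    """Parse the wire format back into Python values."""
    ev = ET.fromstring(payload)
    return ev.get("source"), ev.get("name"), float(ev.find("value").text)

msg = encode_event("node17.grid.example.org", "cpu.load", 0.73)
print(msg)                      # wire format, ready to send over a socket
print(decode_event(msg))        # round trip back to Python values
```

A shared, self-describing format like this is what lets heterogeneous grid components exchange performance data without prior agreement on binary layouts.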
NASA Astrophysics Data System (ADS)
Korzeniewska, Ewa; Szczesny, Artur; Krawczyk, Andrzej; Murawski, Piotr; Mróz, Józef; Seme, Sebastian
2018-03-01
In this paper, the authors describe the distribution of temperatures around electroconductive pathways created by a physical vacuum deposition process on flexible textile substrates used in elastic electronics and textronics. Cordura material was chosen as the substrate. Silver with 99.99% purity was used as the deposited metal. This research was based on thermographic photographs of the produced samples. Analysis of the temperature field around the electroconductive layer was carried out using Image ThermaBase EU software. The analysis of the temperature distribution highlights the software's usefulness in determining the homogeneity of the created metal layer. Higher local temperatures and non-uniform distributions at the same time can negatively influence the work of the textronic system.
Kashima, Saori; Inoue, Kazuo; Matsumoto, Masatoshi
2017-01-01
The Great East Japan Earthquake occurred on 11 March 2011 near the northeast coast of the main island, 'Honshu', of Japan. It wreaked enormous damage in two main ways: a giant tsunami and an accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP). This disaster may have affected the distribution of physicians in the region. Here, we evaluate the effect of the disaster on the distribution of hospital physicians in the three most severely affected prefectures (Iwate, Miyagi, and Fukushima). We obtained individual information about physicians from the Physician Census in 2010 (pre-disaster) and 2012 (post-disaster). We examined geographical distributions of physicians in two ways: (1) municipality-based analysis for demographic evaluation; and (2) hospital-based analysis for geographic evaluation. In each analysis, we calculated the rate of change in physician distributions between pre- and post-disaster years at various distances from the tsunami-affected coast, and from the restricted area due to the FDNPP accident. The change in all, hospital, and clinic physicians were 0.2%, 0.7%, and -0.7%, respectively. In the municipality-based analysis, after taking account of the decreased population, physician numbers only decreased within the restricted area. In the hospital-based analysis, hospital physician numbers did not decrease at any distance from the tsunami-affected coast. In contrast, there was a 3.3% and 2.3% decrease in hospital physicians 0-25 km and 25-50 km from the restricted area surrounding the FDNPP, respectively. Additionally, decreases were larger and increases were smaller in areas close to the FDNPP than in areas further away. Our results suggest that the tsunami did not affect the distribution of physicians in the affected regions. However, the FDNPP accident changed physician distribution in areas close to the power plant.
Inoue, Kazuo; Matsumoto, Masatoshi
2017-01-01
Objective The Great East Japan Earthquake occurred on 11 March 2011 near the northeast coast of the main island, ‘Honshu’, of Japan. It wreaked enormous damage in two main ways: a giant tsunami and an accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP). This disaster may have affected the distribution of physicians in the region. Here, we evaluate the effect of the disaster on the distribution of hospital physicians in the three most severely affected prefectures (Iwate, Miyagi, and Fukushima). Methods We obtained individual information about physicians from the Physician Census in 2010 (pre-disaster) and 2012 (post-disaster). We examined geographical distributions of physicians in two ways: (1) municipality-based analysis for demographic evaluation; and (2) hospital-based analysis for geographic evaluation. In each analysis, we calculated the rate of change in physician distributions between pre- and post-disaster years at various distances from the tsunami-affected coast, and from the restricted area due to the FDNPP accident. Results The change in all, hospital, and clinic physicians were 0.2%, 0.7%, and −0.7%, respectively. In the municipality-based analysis, after taking account of the decreased population, physician numbers only decreased within the restricted area. In the hospital-based analysis, hospital physician numbers did not decrease at any distance from the tsunami-affected coast. In contrast, there was a 3.3% and 2.3% decrease in hospital physicians 0–25 km and 25–50 km from the restricted area surrounding the FDNPP, respectively. Additionally, decreases were larger and increases were smaller in areas close to the FDNPP than in areas further away. Conclusions Our results suggest that the tsunami did not affect the distribution of physicians in the affected regions. However, the FDNPP accident changed physician distribution in areas close to the power plant. PMID:28542461
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
ERIC Educational Resources Information Center
Doerann-George, Judith
The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations; normal,…
Tian, Guo-Liang; Li, Hui-Qiong
2017-08-01
Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independency assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap testing hypothesis methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independency assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independency assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.
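The bootstrap step can be sketched for a simple functional of a 2x2 table. This sketch uses complete counts only; the parametric resampling model and the odds ratio as the parameter of interest are illustrative simplifications of the paper's incomplete-table setting.

```python
import numpy as np

rng = np.random.default_rng(11)
table = np.array([[45, 15],          # observed complete 2x2 counts (synthetic)
                  [20, 40]])
n = table.sum()

def odds_ratio(t):
    return (t[0, 0] * t[1, 1]) / (t[0, 1] * t[1, 0])

# Parametric bootstrap: resample tables from the fitted multinomial.
probs = (table / n).ravel()
boot = np.empty(10_000)
for b in range(boot.size):
    t = rng.multinomial(n, probs).reshape(2, 2)
    boot[b] = odds_ratio(t + 0.5)    # continuity correction guards against zeros

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"OR = {odds_ratio(table):.2f}, 95% bootstrap CI: ({lo:.2f}, {hi:.2f})")
```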
NASA Technical Reports Server (NTRS)
Heldenfels, Richard R
1951-01-01
A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.
Improving Department of Defense Global Distribution Performance Through Network Analysis
2016-06-01
… network performance increase. Subject terms: supply chain metrics, distribution networks, requisition shipping time, strategic distribution database. … "peace and war" (p. 4). USTRANSCOM Metrics and Analysis Branch defines, develops, tracks, and maintains outcomes-based supply chain metrics … (2014a, p. 8). The Joint Staff defines a TDD standard as the maximum number of days the supply chain can take to deliver requisitioned materiel.
Network topology and resilience analysis of South Korean power grid
NASA Astrophysics Data System (ADS)
Kim, Dong Hwan; Eisenberg, Daniel A.; Chun, Yeong Han; Park, Jeryang
2017-01-01
In this work, we present topological and resilience analyses of the South Korean power grid (KPG) across a broad range of voltage levels. While topological analysis of the KPG restricted to high-voltage infrastructure shows an exponential degree distribution, providing further empirical evidence on power grid topology, including low-voltage components yields a distribution with larger variance and smaller average degree. This suggests that the topology of a power grid may converge to a highly skewed degree distribution as more low-voltage data are considered. Moreover, compared to ER random and BA scale-free networks, the KPG has lower efficiency and a higher clustering coefficient, implying that a highly clustered structure does not necessarily guarantee the functional efficiency of a network. Error and attack tolerance analysis, evaluated with efficiency, indicates that the KPG is more vulnerable to random or degree-based attacks than to betweenness-based intentional attacks. Cascading failure analysis with a recovery mechanism demonstrates that network resilience depends on both tolerance capacity and recovery initiation time; when the two factors are fixed, the KPG is the most vulnerable of the three networks. Based on our analysis, we propose that power grid topology should be designed so that loads are homogeneously distributed, or so that functional hubs and their neighbors have high tolerance capacity, to enhance resilience.
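The attack-tolerance experiment can be sketched with NetworkX: remove nodes in a chosen order and track global efficiency. The Barabási-Albert test graph and removal schedule below are stand-ins for the KPG data, which are not public here.

```python
import networkx as nx

def efficiency_under_attack(G, order, steps=10):
    """Global efficiency as nodes are removed in the given order."""
    H = G.copy()
    curve = [nx.global_efficiency(H)]
    chunk = max(1, len(order) // steps)
    for i in range(0, len(order), chunk):
        H.remove_nodes_from(order[i:i + chunk])
        curve.append(nx.global_efficiency(H))
    return curve

G = nx.barabasi_albert_graph(200, 2, seed=1)       # stand-in network
by_degree = sorted(G, key=G.degree, reverse=True)  # degree-based attack order
bc = nx.betweenness_centrality(G)
by_betweenness = sorted(G, key=bc.get, reverse=True)

print("degree attack     :",
      [f"{e:.2f}" for e in efficiency_under_attack(G, by_degree)])
print("betweenness attack:",
      [f"{e:.2f}" for e in efficiency_under_attack(G, by_betweenness)])
```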
Wu, Hao; Wang, Ruoxu; Liu, Deming; Fu, Songnian; Zhao, Can; Wei, Huifeng; Tong, Weijun; Shum, Perry Ping; Tang, Ming
2016-04-01
We propose and demonstrate a few-mode fiber (FMF) based optical-fiber sensor for distributed curvature measurement through the quasi-single-mode Brillouin frequency shift (BFS). By center-aligned splicing of FMF and single-mode fiber (SMF) with a fusion taper, an SMF-component-compatible distributed curvature sensor based on FMF is realized using a conventional Brillouin optical time-domain analysis system. The bending-induced distributed BFS change in the FMF has been theoretically and experimentally investigated, and the BFS response to curvature along the fiber link has been precisely calibrated. A proof-of-concept experiment validates its effectiveness in distributed curvature measurement.
Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution
NASA Astrophysics Data System (ADS)
Wang, Jianming; Liu, Lihua; Yu, Hua
2015-12-01
The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for warhead discrimination during flight. To extract the micro-Doppler features of ballistic missile targets, time-frequency analysis is employed to process the micro-Doppler-modulated, time-varying radar signal. Images of the time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristics very well. However, many time-frequency analysis methods exist for generating time-frequency distribution images, including the short-time Fourier transform (STFT), the Wigner distribution (WD), and Cohen-class distributions. Against the background of ballistic missile defence, this paper aims to work out an effective time-frequency analysis method for discriminating ballistic missile warheads from decoys.
LEOPARD: A grid-based dispersion relation solver for arbitrary gyrotropic distributions
NASA Astrophysics Data System (ADS)
Astfalk, Patrick; Jenko, Frank
2017-01-01
Particle velocity distributions measured in collisionless space plasmas often show strong deviations from idealized model distributions. Despite this observational evidence, linear wave analysis in space plasma environments such as the solar wind or Earth's magnetosphere is still mainly carried out using dispersion relation solvers based on Maxwellians or other parametric models. To enable a more realistic analysis, we present the new grid-based kinetic dispersion relation solver LEOPARD (Linear Electromagnetic Oscillations in Plasmas with Arbitrary Rotationally-symmetric Distributions) which no longer requires prescribed model distributions but allows for arbitrary gyrotropic distribution functions. In this work, we discuss the underlying numerical scheme of the code and we show a few exemplary benchmarks. Furthermore, we demonstrate a first application of LEOPARD to ion distribution data obtained from hybrid simulations. In particular, we show that in the saturation stage of the parallel fire hose instability, the deformation of the initial bi-Maxwellian distribution invalidates the use of standard dispersion relation solvers. A linear solver based on bi-Maxwellians predicts further growth even after saturation, while LEOPARD correctly indicates vanishing growth rates. We also discuss how this complies with former studies on the validity of quasilinear theory for the resonant fire hose. In the end, we briefly comment on the role of LEOPARD in directly analyzing spacecraft data, and we refer to an upcoming paper which demonstrates a first application of that kind.
Spatially distributed modal signals of free shallow membrane shell structronic system
NASA Astrophysics Data System (ADS)
Yue, H. H.; Deng, Z. Q.; Tzou, H. S.
2008-11-01
Based on smart materials and structronics technology, distributed sensing and control of shell structures have developed rapidly over the last 20 years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems, and other engineering applications. However, the distributed monitoring technique for shallow paraboloidal membrane shells, and the resulting global spatially distributed sensing signals, are not clearly understood. In this paper, the modeling of a free flexible paraboloidal shell with spatially distributed sensors, the characteristics of the micro-sensing signals, and the placement of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that signal generation depends on modal membrane strains in the meridional and circumferential directions, of which the latter is more significant, since all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.
NASA Astrophysics Data System (ADS)
Aguila, Alexander; Wilson, Jorge
2017-07-01
This paper develops a methodology to assess a group of electrical improvement measures in distribution systems, combining technical and economic criteria. To address the problem of energy losses in distribution systems, a technical-economic analysis was performed based on a mathematical model that establishes a direct relationship between the energy saved through minimized losses and the costs of implementing the proposed measures. The paper analyses the feasibility of reducing energy losses in distribution systems by replacing existing network conductors with larger cross-section conductors and by raising the distribution voltage. The methodology provides a highly efficient mathematical tool for analysing the feasibility of improvement projects based on their costs, which is very useful for distribution companies and serves as a starting point for analysing this type of project in distribution systems.
ERIC Educational Resources Information Center
Baoyan, Yang; Minggang, Wan
2015-01-01
To a certain extent, the distribution of high school education opportunities among the population determines the stratification of high school education opportunities. The researchers examined the distribution of high school education opportunities within the county region based on survey data on middle school graduation education tracking in Q…
Reliable and More Powerful Methods for Power Analysis in Structural Equation Modeling
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Zhang, Zhiyong; Zhao, Yanyun
2017-01-01
The normal-distribution-based likelihood ratio statistic T_ml = nF_ml is widely used for power analysis in structural equation modeling (SEM). In such an analysis, power and sample size are computed by assuming that T_ml follows a central chi-square distribution under H_0 and a noncentral chi-square…
NASA Astrophysics Data System (ADS)
Einstein, Theodore L.; Pimpinelli, Alberto; González, Diego Luis; Morales-Cifuentes, Josue R.
2015-09-01
In studies of epitaxial growth, analysis of the distribution of the areas of capture zones (i.e., proximity polygons or Voronoi tessellations with respect to island centers) is often the best way to extract the critical nucleus size i. For non-random nucleation, the normalized areas s of these Voronoi cells are well described by the generalized Wigner distribution (GWD) P_β(s) = a s^β exp(-b s^2), particularly in the central region 0.5 < s < 2 where data are least noisy. Extensive Monte Carlo simulations reveal inadequacies of our earlier mean-field analysis, suggesting β = i + 2 for diffusion-limited aggregation (DLA). Since simulations generate orders of magnitude more data than experiments, they permit close examination of the tails of the distribution, which differ from the simple GWD form; one refinement is based on a fragmentation model. We also compare island-size distributions, contrasting analysis via the island-size distribution with analysis via the scaling of island density with flux. Modifications appear for attach-limited aggregation (ALA). We focus on the experimental system para-hexaphenyl on amorphous mica, comparing the results of the three analysis techniques and reconciling them via a novel model of hot precursors based on rate equations, which points to intermediate scaling regimes between DLA and ALA.
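For reference, the GWD constants a and b follow from requiring unit normalization and unit mean. The sketch below uses the standard closed forms attributed to the Pimpinelli-Einstein formulation and verifies them numerically, which is the safer way to present them.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as G

def gwd(s, beta):
    """Generalized Wigner distribution P_beta(s) = a s^beta exp(-b s^2),
    with a, b fixed by normalization and unit mean <s> = 1."""
    b = (G((beta + 2) / 2) / G((beta + 1) / 2)) ** 2
    a = 2 * b ** ((beta + 1) / 2) / G((beta + 1) / 2)
    return a * s ** beta * np.exp(-b * s ** 2)

for i in (1, 2, 3):                  # critical nucleus size
    beta = i + 2                     # the DLA conjecture discussed above
    norm, _ = quad(lambda s: gwd(s, beta), 0, np.inf)
    mean, _ = quad(lambda s: s * gwd(s, beta), 0, np.inf)
    print(f"i={i}: beta={beta}, integral={norm:.6f}, <s>={mean:.6f}")
```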
Gis-Based Spatial Statistical Analysis of College Graduates Employment
NASA Astrophysics Data System (ADS)
Tang, R.
2012-07-01
It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis for the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the number of enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that graduates' majors have an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.
Distributed architecture and distributed processing mode in urban sewage treatment
NASA Astrophysics Data System (ADS)
Zhou, Ruipeng; Yang, Yuanming
2017-05-01
Rural sewage treatment facilities are decentralized over a broad area, which makes operation and management difficult. Based on an analysis of rural sewage treatment models and in response to these challenges, we describe the principle, structure and function of a distributed remote monitoring system with networking technology and network communications technology at its core, and use case analysis to explore the features of the remote monitoring system in the daily operation and management of decentralized rural sewage treatment facilities. Practice shows that the remote monitoring system provides technical support for the long-term operation and effective supervision of the facilities, and reduces operating, maintenance and supervision costs.
Spatial analysis of cities using Renyi entropy and fractal parameters
NASA Astrophysics Data System (ADS)
Chen, Yanguang; Feng, Jian
2017-12-01
The spatial distributions of cities fall into two groups: one is the simple distribution with a characteristic scale (e.g. the exponential distribution), and the other is the complex distribution without a characteristic scale (e.g. the power-law distribution). The latter belongs to the scale-free distributions, which can be modeled with fractal geometry. However, fractal dimension is not suitable for the former type. In contrast, spatial entropy can be used to measure any type of urban distribution. This paper is devoted to generalizing multifractal parameters by means of the dual relation between Euclidean and fractal geometries. The main method is mathematical derivation and empirical analysis, and the theoretical foundation is the finding that the normalized fractal dimension is equal to the normalized entropy. Based on this finding, a set of useful spatial indexes termed dummy multifractal parameters is defined for geographical analysis. These indexes can be employed to describe both simple and complex distributions. The dummy multifractal indexes are applied to the population density distribution of Hangzhou city, China. The calculation results reveal the features of the spatio-temporal evolution of Hangzhou's urban morphology. This study indicates that fractal dimension and spatial entropy can be combined to produce a new methodology for spatial analysis of city development.
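For reference, a minimal sketch of the Renyi entropy underlying the analysis above; the density grid and q values below are illustrative, and the normalization by log N mirrors the normalized-entropy idea in the abstract.

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy H_q = log(sum(p^q)) / (1 - q); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    p = p / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# Illustrative density grid (e.g., cell counts of urban population)
counts = np.random.default_rng(0).poisson(5.0, size=64 * 64)
for q in (0, 1, 2):
    H = renyi_entropy(counts, q)
    print(f"q={q}: H_q={H:.3f}, normalized H_q={H / np.log(counts.size):.3f}")
```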
NASA Astrophysics Data System (ADS)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.
1996-02-01
Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste and for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions compared to the calibration samples, which often biases the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique that includes an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are intended to be measured by a PC-based fast time-interval analyser. The Rossi-alpha distributions can easily be expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory, referred to as Time Interval Analysis (TIA), is complementary to the Time Correlation Analysis (TCA) theories developed in the past, but it is theoretically much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
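As a toy illustration of the Rossi-alpha idea above (our construction, not the authors' TIA formalism), the sketch below builds a pulse train of accidental events plus correlated pairs with an exponential die-away time and histograms the intervals from each trigger pulse to later pulses; real coincidences appear as an exponential excess over the flat accidental background.

```python
import numpy as np

rng = np.random.default_rng(1)
T, rate_acc, rate_pair, tau = 10.0, 2000.0, 300.0, 50e-6  # s, 1/s, 1/s, die-away

acc = rng.uniform(0, T, rng.poisson(rate_acc * T))        # accidental pulses
first = rng.uniform(0, T, rng.poisson(rate_pair * T))     # first pulse of a pair
second = first + rng.exponential(tau, first.size)         # correlated partner
pulses = np.sort(np.concatenate([acc, first, second]))

# One-dimensional Rossi-alpha: intervals from each trigger to later pulses
window, nbins = 500e-6, 50
edges = np.linspace(0, window, nbins + 1)
hist = np.zeros(nbins)
for i, t0 in enumerate(pulses):
    j = np.searchsorted(pulses, t0 + window)
    hist += np.histogram(pulses[i + 1:j] - t0, bins=edges)[0]

print("exponential excess near zero vs flat background:", hist[:5], hist[-5:])
```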
Robust LOD scores for variance component-based linkage analysis.
Blangero, J; Williams, J T; Almasy, L
2000-01-01
The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the LOD score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust LOD score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.
Impelluso, Thomas J
2003-06-01
An algorithm for bone remodeling is presented which allows for both a redistribution of density and a continuous change of principal material directions for the orthotropic material properties of bone. It employs a modal analysis to add density for growth and a local effective-strain-based analysis to redistribute density. General redistribution functions are presented. The model utilizes theories of cellular solids to relate density and strength. The code predicts the same general density distributions and local orthotropy as observed in reality.
A non-stationary cost-benefit based bivariate extreme flood estimation approach
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo
2018-02-01
Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities, in both the dependence of flood variables and the marginal distributions, on extreme flood estimation. The dependence is modeled using copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is used to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and from marginal distributions. This study provides, for the first time, a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.
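The paper's specific copula families and marginals are not reproduced here; as a minimal sketch of non-stationary bivariate sampling, assume a Gaussian copula whose correlation, and gamma marginals whose shape parameters, drift with time (all drifts below are illustrative assumptions).

```python
import numpy as np
from scipy import stats

def sample_flood_pair(n, year, seed=0):
    """Sample (peak, volume) pairs from a non-stationary Gaussian copula model.
    All parameter drifts below are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    rho = min(0.8, 0.5 + 0.005 * year)                 # time-varying dependence
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    u = stats.norm.cdf(z)                              # copula (uniform) scores
    peak = stats.gamma.ppf(u[:, 0], a=2.0 + 0.01 * year, scale=50.0)
    volume = stats.gamma.ppf(u[:, 1], a=3.0 + 0.02 * year, scale=80.0)
    return peak, volume

peak, volume = sample_flood_pair(10_000, year=30)
print(f"empirical Kendall tau: {stats.kendalltau(peak, volume)[0]:.3f}")
```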
40 CFR 258.53 - Ground-water sampling and analysis requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... considering the number of samples in the background data base, the data distribution, and the range of the... the background data base, the data distribution, and the range of the concentration values for each... samples collected to establish ground-water quality data must be consistent with the appropriate...
40 CFR 258.53 - Ground-water sampling and analysis requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... considering the number of samples in the background data base, the data distribution, and the range of the... the background data base, the data distribution, and the range of the concentration values for each... samples collected to establish ground-water quality data must be consistent with the appropriate...
A Study of ATLAS Grid Performance for Distributed Analysis
NASA Astrophysics Data System (ADS)
Panitkin, Sergey; Fine, Valery; Wenaus, Torre
2012-12-01
In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
2016-09-12
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
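DSA generalizes classical variance-based indices; as background, a pick-freeze (Saltelli-style) estimator of first-order Sobol indices can be sketched as follows (a generic illustration with uniform inputs, not the paper's DSA implementation).

```python
import numpy as np

def first_order_indices(f, d, n=200_000, seed=0):
    """Saltelli-style estimate of first-order Sobol indices S_i for
    f: (n, d) -> (n,) with independent U(0,1) inputs (illustrative choice)."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                         # freeze all but input i
        S[i] = np.mean(fB * (f(ABi) - fA)) / var    # V_i = Var(E[Y | X_i])
    return S

# Test function where the first input should dominate the output variance
f = lambda X: np.sin(2 * np.pi * X[:, 0]) + 0.3 * X[:, 1] + 0.05 * X[:, 2]
print(first_order_indices(f, d=3).round(3))
```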
Forecasting weed distributions using climate data: a GIS early warning tool
Jarnevich, Catherine S.; Holcombe, Tracy R.; Barnett, David T.; Stohlgren, Thomas J.; Kartesz, John T.
2010-01-01
The number of invasive exotic plant species establishing in the United States continues to rise. When prevention of an exotic species from entering a country fails at the national level and the species establishes, reproduces, spreads, and becomes invasive, the most successful action at the local level is early detection followed by eradication. We have developed a simple geographic information system (GIS) analysis for developing watch lists for early detection of invasive exotic plants that relies upon currently available species distribution data coupled with environmental data to aid in describing coarse-scale potential distributions. This GIS analysis tool develops environmental envelopes for a species based upon its known distribution and represents a first approximation of its potential habitat while the data necessary for more in-depth analyses are collected. To validate this method we looked at a time series of species distributions for 66 species in Pacific Northwest and northern Rocky Mountain counties. The time series analysis presented here did select counties that the invasive exotic weeds invaded in subsequent years, showing that this technique could be useful in developing watch lists for the spread of particular exotic species. We applied this same habitat-matching model based upon bioclimatic envelopes to 100 invasive exotics with various levels of known distribution within continental U.S. counties. For species with climatically limited distributions, county watch lists describe county-specific vulnerability to invasion; a species with matching habitat in a county would be added to that county's list. These watch lists can inform management decisions for early warning, control prioritization, and targeted research to determine specific locations within vulnerable counties. This tool provides useful information for rapid assessment of the potential distribution, based upon climate envelopes of current distributions, for new invasive exotic species.
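A minimal sketch of the envelope idea above (hypothetical data and variable choices): bound each climate variable by the range observed in invaded counties and flag candidate counties that fall inside every bound.

```python
import numpy as np

def climate_envelope(occupied, candidates):
    """occupied, candidates: (n, k) arrays of county climate variables
    (e.g., annual precipitation, mean temperature). A county is flagged
    when every variable lies within the min-max envelope of occupied counties."""
    lo, hi = occupied.min(axis=0), occupied.max(axis=0)
    return np.all((candidates >= lo) & (candidates <= hi), axis=1)

# Hypothetical example: 2 climate variables, 4 candidate counties
occupied = np.array([[700, 8.0], [900, 10.5], [650, 7.2]])   # invaded counties
candidates = np.array([[800, 9.0], [400, 15.0], [880, 7.9], [660, 11.0]])
print(climate_envelope(occupied, candidates))                # watch-list flags
```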
Mixed-Fidelity Approach for Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Shields, Elwood; Geiselhart, Karl
2011-01-01
This paper documents a mixed-fidelity approach for the design of low-boom supersonic aircraft with a focus on fuselage shaping. A low-boom configuration that is based on low-fidelity analysis is used as the baseline. The fuselage shape is modified iteratively to obtain a configuration whose equivalent-area distribution, derived from computational fluid dynamics analysis, attempts to match a predetermined low-boom target area distribution and also yields a low-boom ground signature. The ground signature of the final configuration is calculated by using a state-of-the-art computational-fluid-dynamics-based boom analysis method that generates accurate midfield pressure distributions for propagation to the ground with ray tracing. The ground signature propagated from a midfield pressure distribution has a shaped ramp front, similar to the ground signature propagated from the computational fluid dynamics equivalent-area distribution. This result supports the validity of low-boom supersonic configuration design by matching a low-boom equivalent-area target, which is easier to accomplish than matching a low-boom midfield pressure target.
Age distribution and age-related outcomes of olfactory neuroblastoma: a population-based analysis.
Yin, Zhenzhen; Wang, Youyou; Wu, Yuemei; Zhang, Ximei; Wang, Fengming; Wang, Peiguo; Tao, Zhen; Yuan, Zhiyong
2018-01-01
The objective of the study was to describe the age distribution and to evaluate the prognostic value of age on survival in patients diagnosed with olfactory neuroblastoma (ONB). A population-based retrospective analysis was conducted using the Surveillance, Epidemiology, and End Results (SEER) tumor registry; patients diagnosed with ONB from 1973 to 2014 were analyzed. The cohort included 876 patients with a median age of 54 years. There was a unimodal distribution of age, and ONBs most frequently occurred in the fifth to sixth decades of life. Kaplan-Meier analysis demonstrated overall survival (OS) and cancer-specific survival (CSS) rates of 69% and 78% at 5 years. Multivariable Cox regression analysis showed that age, SEER stage, and surgery were independent prognostic factors for CSS. The risk of overall death and cancer-specific death increased 3.1% and 1.6% per year, respectively. Patients aged >60 years presented significantly poorer OS and CSS compared with patients aged ≤60 years, even in patients with loco-regional disease or in those treated with surgery. This study adds to the growing evidence that there is a unimodal age distribution of ONB and that age is an important adverse prognostic factor.
Evaluation model of distribution network development based on ANP and grey correlation analysis
NASA Astrophysics Data System (ADS)
Ma, Kaiqiang; Zhan, Zhihong; Zhou, Ming; Wu, Qiang; Yan, Jun; Chen, Genyong
2018-06-01
The existing distribution network evaluation system cannot scientifically and comprehensively reflect the development status of the distribution network. Furthermore, the evaluation model is monotonous and not suitable for horizontal comparison of many regional power grids. For these reasons, this paper constructs a universally adaptable evaluation index system and model for distribution network development. Firstly, the evaluation system is built from power supply capability, power grid structure, technical equipment, intelligence level, power grid efficiency and power grid development benefit. Then the comprehensive weights of the indices are calculated by combining the AHP with grey correlation analysis. Finally, the index scoring function is obtained by fitting the index evaluation criteria to a curve, and a multiply-add operator is used to compute the evaluation result for each sample. The example analysis shows that the model can reflect the development of a distribution network and identify the strengths and weaknesses of its development. Besides, the model provides suggestions for the development and construction of distribution networks.
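As an illustration of the grey-correlation step, the sketch below applies the common grey relational analysis formulation to a hypothetical scheme-by-index score matrix; the resolution coefficient rho = 0.5 and equal index weights are illustrative, and the paper's exact weighting may differ.

```python
import numpy as np

def grey_relational_grades(X, rho=0.5):
    """X: (schemes, indices), larger-is-better scores normalized to [0, 1].
    Reference sequence = ideal scheme (score 1 for every index)."""
    delta = np.abs(1.0 - X)                            # distance to reference
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)    # relational coefficients
    return xi.mean(axis=1)                             # equal-weight grades

# Three distribution networks scored on four normalized indices (hypothetical)
X = np.array([[0.8, 0.6, 0.9, 0.7],
              [0.5, 0.9, 0.6, 0.8],
              [0.9, 0.7, 0.8, 0.6]])
print(grey_relational_grades(X).round(3))
```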
Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.
Zhang, Ziyi; Bao, Xiaoyi
2008-07-07
A fully distributed optical fiber vibration sensor is demonstrated based on spectrum analysis of a Polarization-OTDR system. Without performing any data averaging, detection of vibration disturbances up to 5 kHz is successfully demonstrated on a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; the relation between the disturbance at each frequency component and location allows simultaneous detection of multiple events with the same or different frequency components.
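A minimal sketch of the per-location spectrum analysis on synthetic data (the array layout and all parameters are assumptions): FFT each position's time series and report the dominant frequency wherever a strong tone is present.

```python
import numpy as np

fs, n_t, n_z = 20_000.0, 4096, 100        # sampling rate (Hz), samples, positions
t = np.arange(n_t) / fs
traces = np.random.default_rng(2).normal(0, 0.01, (n_t, n_z))
traces[:, 40] += 0.5 * np.sin(2 * np.pi * 1200 * t)   # event 1: 1.2 kHz at z=40
traces[:, 75] += 0.5 * np.sin(2 * np.pi * 5000 * t)   # event 2: 5.0 kHz at z=75

spec = np.abs(np.fft.rfft(traces, axis=0))            # FFT at each position
freqs = np.fft.rfftfreq(n_t, d=1 / fs)
for z in np.flatnonzero(spec[1:].max(axis=0) > 100):  # skip DC, simple threshold
    print(f"event at position {z}: {freqs[1 + spec[1:, z].argmax()]:.0f} Hz")
```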
Spatial Signal Characteristics of Shallow Paraboloidal Shell Structronic Systems
NASA Astrophysics Data System (ADS)
Yue, H. H.; Deng, Z. Q.; Tzou, H. S.
Based on smart material and structronics technology, distributed sensing and control of shell structures have developed rapidly over the last twenty years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring techniques and the resulting global spatially distributed sensing signals of thin flexible membrane shells are not clearly understood. In this paper, the modeling of a free thin paraboloidal shell with spatially distributed sensors, micro-sensing signal characteristics, and the placement of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that signal generation depends on modal membrane strains in the meridional and circumferential directions, the latter being more significant than the former, since all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.
Pasi, Marco; Maddocks, John H.; Lavery, Richard
2015-01-01
Microsecond molecular dynamics simulations of B-DNA oligomers carried out in an aqueous environment with a physiological salt concentration enable us to perform a detailed analysis of how potassium ions interact with the double helix. The oligomers studied contain all 136 distinct tetranucleotides and we are thus able to make a comprehensive analysis of base sequence effects. Using a recently developed curvilinear helicoidal coordinate method we are able to analyze the details of ion populations and densities within the major and minor grooves and in the space surrounding DNA. The results show higher ion populations than have typically been observed in earlier studies and sequence effects that go beyond the nature of individual base pairs or base pair steps. We also show that, in some special cases, ion distributions converge very slowly and, on a microsecond timescale, do not reflect the symmetry of the corresponding base sequence. PMID:25662221
Bagri, Akbar; Hanson, John P.; Lind, J. P.; ...
2016-10-25
We use high-energy X-ray diffraction microscopy (HEDM) to characterize the microstructure of Ni-base alloy 725. HEDM is a non-destructive technique capable of providing three-dimensional reconstructions of grain shapes and orientations in polycrystals. The present analysis yields the grain size distribution in alloy 725 as well as the grain boundary character distribution (GBCD) as a function of lattice misorientation and boundary plane normal orientation. We find that the GBCD of Ni-base alloy 725 is similar to that previously determined in pure Ni and other fcc-based metals. We find an elevated density of Σ9 and Σ3 grain boundaries. We also observe a preponderance of grain boundaries along low-index planes, with those along (1 1 1) planes being the most common, even after Σ3 twins have been excluded from the analysis.
NASA Astrophysics Data System (ADS)
Dowell, M.; Moore, T.; Follows, M.; Dutkiewicz, S.
2006-12-01
In recent years there has been significant progress both in the use of satellite ocean colour remote sensing and in coupled hydrodynamic-biological models for producing maps of different dominant phytoplankton groups in the global ocean. In parallel to these initiatives, there is ongoing research, largely following on from Alan Longhurst's seminal work, on defining a template of distinct ecological and biogeochemical provinces for the oceans based on their physical and biochemical characteristics. For these products and models to be of maximum use in their subsequent inclusion in re-analysis and climate-scale models, there is a need to understand how the "observed" distributions of dominant phytoplankton (realized niche) coincide with the environmental constraints in which they occur (fundamental niche). In the current paper, we base our analysis on recently published results on the distribution of dominant phytoplankton species at global scale, derived both from satellite and model analysis. Furthermore, we present research on defining biogeochemical provinces using satellite and model data inputs and a fuzzy-logic-based approach. This is compared with ongoing modelling efforts, which include competitive exclusion and are therefore compatible with the definition of the realized ecological niche, to define the emergent distribution of dominant phytoplankton species. Ultimately we investigate the coherence of these two distinct approaches to studying phytoplankton distributions and discuss its significance in the context of modelling and analysis at various scales.
2016-09-01
Securing Healthcare's Quantified-Self Data: A Comparative Analysis Versus Personal Financial Account Aggregators Based on Porter's Five Forces Framework for…
Review of Vibration-Based Helicopters Health and Usage Monitoring Methods
2001-04-05
FM4, NA4, NA4*, NB4 and NB48* (Polyshchuk et al., 1998). The Wigner-Ville distribution (WVD) is a joint time-frequency signal analysis. The WVD is one…signal processing methodologies that are of relevance to vibration-based damage detection (e.g., Wavelet Transform and Wigner-Ville distribution) will be…operation cost, reduce maintenance flights, and increase flight safety. Key Words: HUMS; Wavelet Transform; Wigner-Ville distribution; O&S; Machinery
Research on distributed heterogeneous data PCA algorithm based on cloud platform
NASA Astrophysics Data System (ADS)
Zhang, Jin; Huang, Gang
2018-05-01
Principal component analysis (PCA) of distributed heterogeneous data sets can overcome the limited scalability of centralized data. In order to reduce the intermediate data and error components generated for distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs the eigenvalue computation using Householder tridiagonalization and QR factorization, and calculates the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the method is feasible and reliable in terms of execution time and accuracy.
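For orientation, the Householder-tridiagonalization step named above is what LAPACK's symmetric eigensolvers perform internally before solving the resulting tridiagonal eigenproblem; a single-node analogue of the eigenvalue step (not the paper's distributed, error-component-aware algorithm) looks like this.

```python
import numpy as np

def pca(X, k):
    """PCA of a row-sample matrix X via the symmetric eigenproblem.
    np.linalg.eigh calls LAPACK, which reduces the covariance matrix to
    tridiagonal form with Householder reflections before diagonalizing it."""
    Xc = X - X.mean(axis=0)
    C = (Xc.T @ Xc) / (len(X) - 1)          # covariance matrix
    w, V = np.linalg.eigh(C)                # eigenvalues in ascending order
    order = np.argsort(w)[::-1][:k]
    return Xc @ V[:, order], w[order]       # scores and explained variances

scores, variances = pca(np.random.default_rng(3).normal(size=(500, 10)), k=3)
print(variances.round(3))
```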
Performance analysis of the node shell on a container door based on ANSYS
NASA Astrophysics Data System (ADS)
Li, Qingzhou; Zhou, Yi; Hu, Changqing; Cheng, Jiamin; Zeng, Xiaochen
2018-01-01
The structure of the node shell on a container door was designed and analyzed in this study. The model of the shell was developed with ANSYS. The grids of the model were divided based on the Hex dominant method, and the stress distribution and the temperature distribution of the shell were calculated using the FEA (Finite Element Analysis) method. The analysis results indicated that the concave upward side has the highest stress, which is nevertheless lower than the strength limit of the material. The temperature at the magnet installation location was the highest; therefore the glue fixing the magnet must have high temperature resistance. The results provide the basis for further optimization of the shell.
Progress on Ultra-Dense Quantum Communication Using Integrated Photonic Architecture
2013-01-01
[Table-of-contents excerpt] 2 Protocol Development; 2.1 Achieving multiple secure bits per coincidence in time-energy entanglement-based quantum key distribution; 2.2 Extended dispersive-optics QKD (DO-QKD) protocol; 2.3 Analysis of non-local correlations of entangled photon pairs for arbitrary dispersion…
Distributed software framework and continuous integration in hydroinformatics systems
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao
2017-08-01
When hydroinformatics systems involve multiple complicated models, multisource structured and unstructured data, and complex requirements analysis, platform design and integration become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. The distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a lake group in Wuhan, China is established.
One Step Quantum Key Distribution Based on EPR Entanglement.
Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao
2016-06-30
A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves maneuverability. Moreover, a security analysis is given: a simple type of eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step.
A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution.
Hayashi, Hideaki; Furui, Akira; Kurita, Yuichi; Tsuji, Toshio
2017-11-01
Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
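A minimal sketch of the generative model above and a simplified parameter-recovery step (distribution fitting on segment-averaged squared signals rather than the paper's marginal-likelihood maximization; all parameters are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
alpha, beta = 4.0, 2.0                     # inverse gamma shape and scale
seg_len, n_seg = 500, 400
variance = stats.invgamma.rvs(alpha, scale=beta, size=n_seg, random_state=rng)
# Gaussian white noise whose variance varies slowly (constant per segment)
emg = rng.normal(0.0, np.sqrt(np.repeat(variance, seg_len)))

# Variance estimates from the squared signal, averaged per segment
var_est = (emg ** 2).reshape(n_seg, seg_len).mean(axis=1)

a_hat, _, scale_hat = stats.invgamma.fit(var_est, floc=0)
print(f"true (shape, scale) = ({alpha}, {beta}); "
      f"fitted = ({a_hat:.2f}, {scale_hat:.2f})")
```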
2014-05-01
Information, Understanding, and Influence: An Agency Theory Strategy for Air Base Communications and Cyberspace Support. …present communications and cyberspace support organizations. Next, it introduces a strategy based on this analysis to bring information, understanding…
A Mixed-Fidelity Approach for Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Shields, Elwood; Geiselhart, Karl A.
2010-01-01
This paper documents a mixed-fidelity approach for the design of low-boom supersonic aircraft as a viable path to a practical low-boom supersonic configuration. A low-boom configuration that is based on low-fidelity analysis is used as the baseline. Tail lift is included to help tailor the aft portion of the ground signature. A comparison of low- and high-fidelity analysis results demonstrates the necessity of using computational fluid dynamics (CFD) analysis in a low-boom supersonic configuration design process. The fuselage shape is modified iteratively to obtain a configuration with a CFD equivalent-area distribution that matches a predetermined low-boom target distribution. The mixed-fidelity approach can easily refine the low-fidelity low-boom baseline into a low-boom configuration with the use of CFD equivalent-area analysis. The ground signature of the final configuration is calculated by using a state-of-the-art CFD-based boom analysis method that generates accurate midfield pressure distributions for propagation to the ground with ray tracing. The ground signature propagated from a midfield pressure distribution has a shaped ramp front, similar to the ground signature propagated from the CFD equivalent-area distribution. This result confirms the validity of low-boom supersonic configuration design by matching a low-boom equivalent-area target, which is easier to accomplish than matching a low-boom midfield pressure target.
Baradez, Marc-Olivier; Marshall, Damian
2011-01-01
The transition from traditional culture methods towards bioreactor based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to have an accuracy >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13 day bioreactor culture period and how changes to manufacture processes such as initial cell seeding density can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes facilitating the transition towards bioreactor based manufacture for clinical grade cells. PMID:22028809
Assessing population exposure for landslide risk analysis using dasymetric cartography
NASA Astrophysics Data System (ADS)
Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.
2015-04-01
Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on a count of inhabitants or inhabitant density over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units with predominantly rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by the FCT - Portuguese Foundation for Science and Technology.
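A minimal sketch of building areal weighting (our illustration with hypothetical numbers): a census unit's population is reallocated to its buildings in proportion to footprint area, preserving the unit total.

```python
import numpy as np

def dasymetric_by_building(unit_population, building_areas):
    """Distribute a census unit's population over its buildings
    proportionally to building footprint area (areal weighting)."""
    areas = np.asarray(building_areas, dtype=float)
    return unit_population * areas / areas.sum()

# Hypothetical parish: 1200 inhabitants, four buildings (areas in m^2)
pop = dasymetric_by_building(1200, [250.0, 800.0, 120.0, 430.0])
print(pop.round(1), pop.sum())   # the unit total is preserved
```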
An Event-Based Approach to Distributed Diagnosis of Continuous Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon
2010-01-01
Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
Parametric distribution approach for flow availability in small hydro potential analysis
NASA Astrophysics Data System (ADS)
Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel
2016-10-01
Small hydro systems are one of the important sources of renewable energy and have been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy of flowing water to produce electricity, is often questioned because the power generated is inconsistent and intermittent. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in small hydro systems. Considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the defining characteristic of the Pearson system: the direct relation between the distribution and its first four statistical moments. The advantage of applying these statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.
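As an illustration of the moment-based idea (the Pearson system selects a distribution family from the first four moments), the sketch below computes sample moments for a synthetic flow record and, assuming the Pearson type III case often used for stream flow, fits it by moment matching; the flow record and parameters are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
flow = rng.gamma(shape=2.0, scale=3.0, size=5000)     # synthetic daily flows

# The first four moments characterize the family within the Pearson system
m, s = flow.mean(), flow.std(ddof=1)
g1, g2 = stats.skew(flow), stats.kurtosis(flow)
print(f"mean={m:.2f} sd={s:.2f} skew={g1:.2f} excess kurtosis={g2:.2f}")

# Pearson type III fitted by moment matching (scipy parameterizes it by skew)
p3 = stats.pearson3(g1, loc=m, scale=s)
print(f"flow exceeded 5% of the time: {p3.ppf(0.95):.2f}")
```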
Gear Fatigue Crack Diagnosis by Vibration Analysis Using Embedded Modeling
2001-04-05
gave references on the Wigner-Ville Distribution (WVD) and some statistics-based methods including FM4, NA4 and NB4. There are limitations for vibration…
Analysis of Product Distribution Strategy in Digital Publishing Industry Based on Game-Theory
NASA Astrophysics Data System (ADS)
Xu, Li-ping; Chen, Haiyan
2017-04-01
Digital publishing output has increased significantly year by year. It has become the most vigorous point of economic growth and increasingly important to the press and publication industry. Its distribution channels are diversified, unlike those of the traditional industry. In-depth research on the digital publishing industry was conducted to clarify the composition of the industry chain and to establish an industry chain model. The cooperative and competitive relationships between different distribution channels have been analyzed based on game theory. By comparing distribution quantities and market size under static and dynamic distribution strategies, we obtain theoretical evidence on how to choose a distribution strategy for optimal benefit.
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
Wang, Xinghu; Hong, Yiguang; Yi, Peng; Ji, Haibo; Kang, Yu
2017-05-24
In this paper, a distributed optimization problem is studied for continuous-time multiagent systems with unknown-frequency disturbances. A distributed gradient-based control is proposed for the agents to achieve optimal consensus while estimating the unknown frequencies and rejecting the bounded disturbances in the semi-global sense. Based on convex optimization analysis and an adaptive internal model approach, the exact optimal solution can be obtained for the multiagent system disturbed by exogenous disturbances with uncertain parameters.
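A discrete-time sketch of the gradient-plus-consensus mechanism (the paper's continuous-time adaptive internal model for disturbance rejection is not reproduced): with local objectives f_i(x) = (x - a_i)^2, the global optimum is the mean of the a_i, and each agent mixes neighbor states before taking a local gradient step.

```python
import numpy as np

a = np.array([1.0, 4.0, 2.0, 7.0])          # local targets; optimum = a.mean()
W = np.array([[0.50, 0.25, 0.00, 0.25],     # doubly stochastic mixing matrix
              [0.25, 0.50, 0.25, 0.00],     # of a ring communication graph
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
x = np.zeros(4)
for k in range(5000):
    step = 0.5 / (k + 1)                    # diminishing step for exact consensus
    x = W @ x - step * 2 * (x - a)          # neighbor mixing + local gradients
print(x.round(3), "target:", a.mean())
```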
Performance of concrete members subjected to large hydrocarbon pool fires
Zwiers, Renata I.; Morgan, Bruce J.
1989-01-01
The authors discuss an investigation to determine analytically if the performance of concrete beams and columns in a hydrocarbon pool test fire would differ significantly from their performance in a standard test fire. The investigation consisted of a finite element analysis to obtain temperature distributions in typical cross sections, a comparison of the resulting temperature distribution in the cross section, and a strength analysis of a beam based on temperature distribution data. Results of the investigation are reported.
Motion/imagery secure cloud enterprise architecture analysis
NASA Astrophysics Data System (ADS)
DeLay, John L.
2012-06-01
Cloud computing with storage virtualization and new service-oriented architectures brings a new perspective to the aspect of a distributed motion imagery and persistent surveillance enterprise. Our existing research focuses mainly on content management, distributed analytics, and WAN performance issues of distributed cloud-based technologies. The potential of leveraging cloud-based technologies for hosting motion imagery, imagery and analytics workflows for DOD and security applications is relatively unexplored. This paper examines technologies for managing, storing, processing and disseminating motion imagery and imagery within a distributed network environment. Finally, we propose areas for future research in the area of distributed cloud content management enterprises.
Distribution-Connected PV's Response to Voltage Sags at Transmission-Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry; Ding, Fei
The ever-increasing amount of residential- and commercial-scale distribution-connected PV generation being installed and operated on the U.S. electric power system necessitates the use of higher-fidelity representative distribution system models for transmission stability studies in order to ensure the continued safe and reliable operation of the grid. This paper describes a distribution model-based analysis that determines the amount of distribution-connected PV that trips off-line for a given voltage sag seen at the distribution circuit's substation. Such sags are what could potentially be experienced over a wide area of an interconnection during a transmission-level line fault. The results of this analysis show that the voltage diversity of the distribution system causes different amounts of PV generation to be lost for differing severities of voltage sag. The variation of the response is most directly a function of the loading of the distribution system. At low load levels the inversion of the circuit's voltage profile results in considerable differences in the aggregated response of distribution-connected PV. Less variation is seen in the response to specific PV deployment scenarios, unless pushed to extremes, and in the total amount of PV penetration attained. A simplified version of the combined CMPLDW and PVD1 models is compared to the results from the model-based analysis. Furthermore, the parameters of the simplified model are tuned to better match the determined response. The resulting tuning parameters do not match the expected physical model of the distribution and PV systems and thus may indicate that another modeling approach is warranted.
Analysis of BSRT Profiles in the LHC at Injection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitterer, M.; Stancari, G.; Papadopoulou, S.
The beam synchrotron radiation telescope (BSRT) at the LHC allows profiles of the transverse beam distribution to be taken, which can provide useful additional insight into the evolution of the transverse beam distribution. A python class has been developed [1], which allows one to read in the BSRT profiles, usually stored in binary format, run different analysis tools, and generate plots of the statistical parameters and profiles as well as videos of the profiles. The detailed analysis is described in this note. The analysis is based on data obtained at injection energy (450 GeV) during MD1217 [2] and MD1415 [3], which are also used as illustrative examples. A similar approach is also taken with a MATLAB-based analysis described in [4].
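The note's python class is not reproduced here; as a minimal sketch of the per-profile statistics step, one can fit a Gaussian to a synthetic transverse profile and extract its centroid and width (all data and parameters below are illustrative).

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

x = np.linspace(-10, 10, 200)                       # transverse position (a.u.)
rng = np.random.default_rng(6)
profile = gaussian(x, 1.0, 0.8, 2.3, 0.05) + rng.normal(0, 0.02, x.size)

p0 = [profile.max(), x[profile.argmax()], 1.0, profile.min()]  # initial guess
(amp, mu, sigma, offset), _ = curve_fit(gaussian, x, profile, p0=p0)
print(f"centroid = {mu:.3f}, width sigma = {abs(sigma):.3f}")
```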
A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.
Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W
2005-01-01
We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.
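A generic sketch of the wavelet-shrinkage step using PyWavelets (a soft universal threshold on a synthetic 1-D signal; the paper's frequency-adaptive scheme and heavy-tail modeling are not reproduced):

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) * (t > 0.3)      # synthetic activation
noisy = signal + rng.normal(0, 0.4, t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale, finest level
denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, sigma * np.sqrt(2 * np.log(t.size)), mode="soft")
    for c in coeffs[1:]
]
denoised = pywt.waverec(denoised_coeffs, "db4")[:t.size]
print("residual std:", np.std(denoised - signal).round(3))
```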
ERIC Educational Resources Information Center
Ahmed, Iftikhar; Sadeq, Muhammad Jafar
2006-01-01
Current distance learning systems are increasingly packing highly data-intensive contents on servers, resulting in the congestion of network and server resources at peak service times. A distributed learning system based on faded information field (FIF) architecture that employs mobile agents (MAs) has been proposed and simulated in this work. The…
The global impact distribution of Near-Earth objects
NASA Astrophysics Data System (ADS)
Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.
2016-02-01
Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases, objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted with the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of a uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.
Improving Distributed Diagnosis Through Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino
2011-01-01
Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.
Ozegowski, S; Sundmacher, L
2012-10-01
Since the 1990s, licenses for opening a medical practice in Germany have been granted based on a needs-based planning system which regulates the regional allocation of physicians in primary care. This study aims at an analysis of the distribution of physicians (and hence the effects of the planning system) with regard to the overarching objective of primary care supply: the safeguarding of "needs-based and evenly distributed health care provision" (Section 70 para 1 German Social Code V). The need for health care provision of each German district (or region) and the actual number of physicians in the respective area are compared using a concentration analysis. For this purpose, the local health-care need was approximated in a model based on the morbidity predictors age and sex, and by combining data on the local population structure with the age- and sex-specific frequency of physician consultations (according to data of the GEK sickness fund). The concentration index then measures the degree of regional inequity in the distribution of outpatient care. The results of the analysis demonstrate an inequitable regional distribution between the medical needs of the local population and the existing outpatient health care provider capacities. These regional disparities in needs-adjusted supply densities are particularly large for outpatient secondary care physicians and psychotherapists, even when taking into account the care provision of urban physicians for peri-urban areas as well as the adequacy of longer travel times to specialists. One major reason for these inequities is the design of today's physician planning mechanism, which mainly conserves a suboptimal status quo of the past. The initiated reforms of the planning mechanism should progress and be further deepened. In particular, today's quota-based allocation of practice licenses requires fundamental changes taking into account the relevant factors approximating local health care needs, re-assessing the adequate spatial planning level and expanding opportunities for introducing innovative and more flexible health care service models.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating the mean and standard deviation from various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions, and the corresponding average relative error (ARE) approaches zero as sample size increases. For data generated from the normal distribution, our ABC performs well, although the Wan et al. method is best for estimating the standard deviation under normality. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can also be applied using other reported summary statistics, such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
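Below is a minimal rejection-ABC sketch of this idea, assuming a normal data-generating model, illustrative priors, and a simple distance on the reported median, minimum and maximum; it is not the authors' implementation.

```python
# Rejection ABC for a study's mean/SD from reported median, min, max.
# Priors, tolerance and the distance function are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def abc_mean_sd(median_obs, min_obs, max_obs, n, n_draws=50_000, keep_frac=0.002):
    mu0 = median_obs
    sd0 = (max_obs - min_obs) / 4.0          # crude range-based SD guess
    mu = rng.normal(mu0, 2.0 * sd0, n_draws)  # prior draws for the mean
    sd = rng.uniform(0.1 * sd0, 3.0 * sd0, n_draws)  # prior draws for SD
    obs = np.array([median_obs, min_obs, max_obs])
    dist = np.empty(n_draws)
    for i in range(n_draws):
        # Simulate a pseudo-study of size n and record its summaries.
        x = rng.normal(mu[i], sd[i], n)
        summ = np.array([np.median(x), x.min(), x.max()])
        dist[i] = np.linalg.norm((summ - obs) / sd0)
    # Keep the parameter draws whose summaries are closest to those reported.
    keep = dist <= np.quantile(dist, keep_frac)
    return mu[keep].mean(), sd[keep].mean()

mu_hat, sd_hat = abc_mean_sd(median_obs=10.0, min_obs=2.0, max_obs=19.0, n=50)
print(f"ABC estimates: mean ~ {mu_hat:.2f}, SD ~ {sd_hat:.2f}")
```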
Extending GIS Technology to Study Karst Features of Southeastern Minnesota
NASA Astrophysics Data System (ADS)
Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.
2001-12-01
This paper summarizes ongoing research on karst feature distribution in southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical, and hydrogeologic. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer™ was used to produce high-quality 3D maps and animations using data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All current nearest-neighbor analyses indicate that sinkholes in southeastern Minnesota are not evenly distributed (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation and regression have been used to study the spatial distributions of some mapped karst features of southeastern Minnesota. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis and nearest-neighbor analysis. A series of karst features for Winona County, including sinkholes, springs, seeps, stream sinks and outcrops, has been mapped and entered into the Karst Feature Database of Southeastern Minnesota, which is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern Area in Fillmore County were scanned and geo-referenced into our GIS system; this approach has proved very useful for identifying sinkholes and studying the rate of sinkhole development.
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
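A minimal sketch of the core idea, estimating the a and b paths by median (0.5-quantile) regression and taking a*b as the indirect effect, is given below on synthetic heavy-tailed data; the authors' exact estimator and inference procedure are not reproduced here.

```python
# Median-regression mediation sketch: X -> M -> Y with heavy-tailed errors.
# The data-generating values (0.5, 0.4, 0.2) are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.standard_t(df=3, size=n)            # mediator, heavy tails
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)  # outcome

# Path a: effect of X on the mediator M at the median.
a = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5).params[1]

# Path b: effect of M on Y at the median, controlling for X.
Xb = sm.add_constant(np.column_stack([m, x]))
b = sm.QuantReg(y, Xb).fit(q=0.5).params[1]

print(f"indirect (mediated) effect a*b ~ {a * b:.3f}")
```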
Hierarchical analysis of species distributions and abundance across environmental gradients
Jeffery Diez; Ronald H. Pulliam
2007-01-01
Abiotic and biotic processes operate at multiple spatial and temporal scales to shape many ecological processes, including species distributions and demography. Current debate about the relative roles of niche-based and stochastic processes in shaping species distributions and community composition reflects, in part, the challenge of understanding how these processes...
Development of Distributed System for Electronic Business Based on Java-Technologies
ERIC Educational Resources Information Center
Kintonova, Aliya Zh.; Andassova, Bakkaisha Z.; Ermaganbetova, Madina A.; Maikibaeva, Elmira K.
2016-01-01
This paper describes the results of studies on the peculiarities of distributed information systems, and the research of distributed systems technology development. The paper also provides the results of the analysis of E-business market research in the Republic of Kazakhstan. The article briefly describes the implementation of a possible solution…
LWT Based Sensor Node Signal Processing in Vehicle Surveillance Distributed Sensor Network
NASA Astrophysics Data System (ADS)
Cha, Daehyun; Hwang, Chansik
Previous vehicle surveillance research on distributed sensor networks focused on overcoming power limitations and communication bandwidth constraints in sensor nodes. In spite of these constraints, a vehicle surveillance sensor node must perform signal compression, feature extraction, target localization, noise cancellation and collaborative signal processing with low computation and communication energy dissipation. In this paper, we introduce an algorithm for lightweight wireless sensor node signal processing based on lifting-scheme wavelet analysis feature extraction in a distributed sensor network.
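As a flavor of the kind of low-cost transform such a node might run, here is a minimal one-stage Haar lifting sketch; the paper's actual filter bank is not specified here, so Haar is an illustrative assumption.

```python
# One lifting-scheme wavelet stage (Haar lifting): a predict step followed
# by an update step, both cheap enough for low-power sensor nodes.
import numpy as np

def haar_lifting_forward(x):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - even            # predict step
    approx = even + detail / 2.0   # update step (preserves the signal mean)
    return approx, detail

def haar_lifting_inverse(approx, detail):
    even = approx - detail / 2.0
    odd = even + detail
    out = np.empty(even.size + odd.size)
    out[0::2], out[1::2] = even, odd
    return out

x = np.array([4, 6, 10, 12, 8, 6, 5, 9])
a, d = haar_lifting_forward(x)
assert np.allclose(haar_lifting_inverse(a, d), x)   # perfect reconstruction
print("approximation:", a, "detail:", d)
```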
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection tool for different types of data, such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on the Weibull distribution. It may also be used with other survival distributions, such as the exponential, gamma, and log-normal. The proposed method is applied to the survival data of tuberculosis patients for the years 2004-2005 in the Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
One Step Quantum Key Distribution Based on EPR Entanglement
Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao
2016-01-01
A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves maneuverability. Moreover, a security analysis is given: a simple eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol involving two steps, the proposed protocol does not need to store the qubit and involves only one step. PMID:27357865
Drawing The Red Line: Cost Benefit Analysis on Large Life Rafts
2013-06-13
…Patterson Air Force Base, Ohio. DISTRIBUTION STATEMENT A: Approved for public release; distribution unlimited. Anderson, BS, MS, Major, USAF, June 2013 (AFIT-ENS-GRP-13-J-1). AMC has set up a Fuel-Efficiency Office (FEO) in order to analyze costs and make decisions that will save money on operational expenses related to…
Self-Organizing Maps and Parton Distribution Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
K. Holcomb, Simonetta Liuti, D. Z. Perry
2011-05-01
We present a new method to extract parton distribution functions from high energy experimental data based on a specific type of neural network, the Self-Organizing Map. We illustrate the features of our new procedure that are particularly useful for an analysis directed at extracting generalized parton distributions from data. We show quantitative results of our initial analysis of the parton distribution functions from inclusive deep inelastic scattering.
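For orientation, a minimal 1D self-organizing map training loop is sketched below on synthetic data; the actual PDF-extraction procedure in the paper is far more elaborate.

```python
# Minimal 1D self-organizing map: competitive learning with a decaying
# Gaussian neighborhood. Node count, decay schedules and data are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, n_nodes=10, n_iter=2000, lr0=0.5, sigma0=3.0):
    weights = rng.normal(size=(n_nodes, data.shape[1]))
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Learning rate and neighborhood width shrink over time.
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        h = np.exp(-((np.arange(n_nodes) - bmu) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

data = rng.normal(size=(500, 3))
print(train_som(data)[:3])   # first three trained node vectors
```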
Zanatta, Rayssa Ferreira; Barreto, Bruno de Castro Ferreira; Xavier, Tathy Aparecida; Versluis, Antheunis; Soares, Carlos José
2015-02-01
This study evaluated the influence of punch and base orifice diameters on push-out test results by means of finite element analysis (FEA). FEA was performed using 3D models of the push-out test with 3 base orifice diameters (2.5, 3.0, and 3.5 mm) and 3 punch diameters (0.5, 1.0, and 1.5 mm) using MARC/MENTAT (MSC.Software). The image of a cervical slice from a root restored with a fiberglass post was used to construct the models. The mechanical properties of dentin, post, and resin cement were obtained from the literature. Bases and punches were constructed as rigid bodies. A 10-N force was applied by the punch in the center of the post in a nonlinear contact analysis. Modified von Mises stress, maximum principal stress, as well as shear and normal stress components were calculated. Both punch and base orifice sizes influenced the stress distribution of the push-out test. Bases with larger diameters and punches with smaller diameters caused higher stress in dentin and at the dentin/cement interface. FEA showed that the diameter of the orifice base had a more significant influence on the stress distribution than did the punch diameter. For this reason, both factors should be taken into account during push-out experimental tests.
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Weiguo, E-mail: wang_weiguo@vip.163.com
High-purity (99.99%) iron samples were cold rolled to a thickness reduction of 87%, followed by annealing at 620 °C for 15 and 45 min; the grain boundary plane distributions were then characterized by electron backscatter diffraction and a stereology- and statistics-based five-parameter analysis. The results showed that the grain boundary planes were mainly distributed on (1 0 0), and this distribution was markedly enhanced when the annealing holding time increased from 15 to 45 min, during which extensive grain growth and a dramatic texture evolution took place. Further analysis indicated that this distribution was largely due to the rotation of grain boundary planes of the [1 0 0] misoriented grains. Additionally, some random boundaries of [1 0 0] misoriented grains were observed to change from asymmetrical tilt into (1 0 0)/(−1 0 0) twist as annealing proceeded from 15 to 45 min. Based on the present work and results reported by other investigators, it is suggested that chemistry is critical in determining the grain boundary plane distribution in iron-based materials. Highlights: • Grain boundary (GB) plane distribution (GBPD) in high-purity (99.99%) iron was studied. • GB planes are mainly distributed on (1 0 0) in the cold rolled and annealed samples. • This distribution is largely due to the rotation of GB planes of [1 0 0] misoriented grains. • Some [1 0 0] asymmetrical tilt boundaries evolve into twist ones during annealing.
Keller, Katharina; Mertens, Valerie; Qi, Mian; Nalepa, Anna I; Godt, Adelheid; Savitsky, Anton; Jeschke, Gunnar; Yulikov, Maxim
2017-07-21
Extraction of distance distributions between high-spin paramagnetic centers from relaxation induced dipolar modulation enhancement (RIDME) data is affected by the presence of overtones of dipolar frequencies. As previously proposed, we account for these overtones by using a modified kernel function in Tikhonov regularization analysis. This paper analyzes the performance of such an approach on a series of model compounds with the Gd(iii)-PyMTA complex serving as the paramagnetic high-spin label. We describe the calibration of the overtone coefficients for the RIDME kernel, demonstrate the accuracy of distance distributions obtained with this approach, and show that for our series of Gd-rulers the RIDME technique provides more accurate distance distributions than Gd(iii)-Gd(iii) double electron-electron resonance (DEER). The analysis of RIDME data including harmonic overtones can be performed using the MATLAB-based program OvertoneAnalysis, which is available as open-source software from the web page of ETH Zurich. This approach opens a perspective for the routine use of the RIDME technique with high-spin labels in structural biology and structural studies of other soft matter.
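A minimal sketch of the overtone-augmented Tikhonov step is given below, with hypothetical overtone coefficients and a synthetic Gaussian distance distribution; it is not the OvertoneAnalysis implementation, and the physical constants are simplified for illustration.

```python
# Tikhonov regularization with an overtone-augmented dipolar kernel.
# Overtone coefficients (0.6, 0.3, 0.1) and the test distribution are
# illustrative assumptions, not the paper's calibrated values.
import numpy as np
from scipy.optimize import nnls

t = np.linspace(0, 4e-6, 150)            # time axis, s
r = np.linspace(2.0, 6.0, 80)            # distance axis, nm
theta = np.linspace(0, np.pi / 2, 100)   # powder-average angle grid

def ridme_kernel(t, r, overtones=(0.6, 0.3, 0.1)):
    K = np.zeros((t.size, r.size))
    for j, rj in enumerate(r):
        w = 2 * np.pi * 52.04e6 / rj ** 3          # dipolar frequency, rad/s
        for k, c in enumerate(overtones, start=1):  # k-th harmonic overtone
            arg = k * w * (1 - 3 * np.cos(theta) ** 2)
            # sin(theta)-weighted orientation average of the oscillation
            K[:, j] += c * np.trapz(np.cos(np.outer(t, arg)) * np.sin(theta),
                                    theta, axis=1)
    return K

K = ridme_kernel(t, r)
p_true = np.exp(-(r - 3.5) ** 2 / (2 * 0.25 ** 2))   # synthetic distribution
signal = K @ p_true

# Tikhonov with non-negativity: min ||K p - s||^2 + alpha^2 ||p||^2, p >= 0.
alpha = 0.1 * np.linalg.norm(K, 2)
A = np.vstack([K, alpha * np.eye(r.size)])
b = np.concatenate([signal, np.zeros(r.size)])
p_fit, _ = nnls(A, b)
print("recovered peak at r ~", r[np.argmax(p_fit)], "nm")
```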
Regional analysis of annual maximum rainfall using TL-moments method
NASA Astrophysics Data System (ADS)
Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd
2011-06-01
Information on the distribution of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, the regional frequency analysis approach based on L-moments is revisited, and an alternative regional frequency analysis using the TL-moments method is then employed; the results from both methods are compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed to determine the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the method of TL-moments was more efficient for lower quantile estimation compared with the L-moments.
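For illustration, sample TL-moments with symmetric trimming t = 1 (the Elamir-Seheult estimator) can be computed as sketched below on synthetic annual maxima; the regional pooling, distribution fitting and Z-test steps are omitted.

```python
# Sample TL-moments with symmetric trimming t = 1, used to form
# TL-moment ratios (TL-CV, TL-skewness) for distribution selection.
# The synthetic Gumbel "annual maxima" are an illustrative assumption.
import numpy as np
from scipy.special import comb

def tl_moment(x, r, t=1):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    total = 0.0
    for k in range(r):
        # Order-statistic weights; comb() is zero outside the valid range.
        w = comb(i - 1, r + t - 1 - k) * comb(n - i, t + k)
        total += (-1) ** k * comb(r - 1, k) * np.sum(w * x)
    return total / (r * comb(n, r + 2 * t))

rng = np.random.default_rng(7)
amax = rng.gumbel(loc=50.0, scale=12.0, size=40)   # annual maxima, mm

l1, l2, l3 = (tl_moment(amax, r) for r in (1, 2, 3))
print(f"TL-CV = {l2 / l1:.3f}, TL-skewness = {l3 / l2:.3f}")
```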
Distributed bearing fault diagnosis based on vibration analysis
NASA Astrophysics Data System (ADS)
Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani
2016-01-01
Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
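A minimal envelope-spectrum sketch, the standard demodulation step underlying such bearing vibration analysis, is shown below on a simulated outer-race fault signal; the fault frequency, resonance and signal model are illustrative assumptions.

```python
# Envelope spectrum via the Hilbert transform: impacts at a fault
# frequency amplitude-modulate a structural resonance; demodulating and
# taking the FFT of the envelope reveals the fault frequency.
import numpy as np
from scipy.signal import hilbert

fs = 20_000                          # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
bpfo = 87.0                          # assumed outer-race fault frequency, Hz

# Impulse train at the fault rate convolved with a decaying 3 kHz ringing.
impacts = (np.sin(2 * np.pi * bpfo * t) > 0.99).astype(float)
ringing = np.exp(-t[:400] * 800) * np.sin(2 * np.pi * 3000 * t[:400])
x = np.convolve(impacts, ringing, mode="same")
x += 0.1 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(x))                       # demodulation
env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print("envelope spectrum peak at", freqs[np.argmax(env_spec)], "Hz")
```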
The impact of distributed computing on education
NASA Technical Reports Server (NTRS)
Utku, S.; Lestingi, J.; Salama, M.
1982-01-01
In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.
NASA Technical Reports Server (NTRS)
Egolf, T. A.; Landgrebe, A. J.
1982-01-01
A user's manual is provided which includes the technical approach for the Prescribed Wake Rotor Inflow and Flow Field Prediction Analysis. The analysis is used to provide the rotor wake induced velocities at the rotor blades for use in blade airloads and response analyses and to provide induced velocities at arbitrary field points such as at a tail surface. This analysis calculates the distribution of rotor wake induced velocities based on a prescribed wake model. Section operating conditions are prescribed from blade motion and controls determined by a separate blade response analysis. The analysis represents each blade by a segmented lifting line, and the rotor wake by discrete segmented trailing vortex filaments. Blade loading and circulation distributions are calculated based on blade element strip theory including the local induced velocity predicted by the numerical integration of the Biot-Savart Law applied to the vortex wake model.
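The induced velocity of a single straight vortex segment, the building block of such a numerical Biot-Savart integration over a segmented vortex wake, can be sketched as follows; the function and test values are illustrative, not taken from the manual.

```python
# Biot-Savart law for a straight vortex filament segment from a to b,
# evaluated at point p; 'core' is a crude regularization near the filament.
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-6):
    r1, r2 = p - a, p - b
    r0 = b - a
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < core:                 # avoid the singularity on the filament
        return np.zeros(3)
    k = gamma / (4 * np.pi * denom)
    return k * cross * np.dot(r0, r1 / np.linalg.norm(r1)
                                  - r2 / np.linalg.norm(r2))

# Sanity check: two long collinear segments approximate an infinite line
# vortex, whose induced speed at distance h is gamma / (2*pi*h).
g, h = 1.0, 0.5
p = np.array([0.0, h, 0.0])
v = (segment_induced_velocity(p, np.array([-100.0, 0, 0]), np.array([0.0, 0, 0]), g)
     + segment_induced_velocity(p, np.array([0.0, 0, 0]), np.array([100.0, 0, 0]), g))
print(abs(v[2]), "vs", g / (2 * np.pi * h))
```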
2017-01-01
In Mexico, the Long-tailed Wood-Partridge (Dendrortyx macroura) is distributed in the mountains of the Trans-Mexican Volcanic Belt, Sierra Madre del Sur and Sierra Norte de Oaxaca, while the Bearded Wood-Partridge (D. barbatus) is distributed in the Sierra Madre Oriental (SMO). There is a controversial overlap in distribution (sympatry) between these two species (on the Cofre de Perote and Pico de Orizaba volcanoes, SMO and Sierra Norte de Oaxaca), given the ambiguity and current lack of information regarding the distribution of these two species. In order to disentangle the possible presence of both species in the area of sympatry, we conducted a detailed analysis of the historical knowledge regarding the geographic distribution of both species, based on a review of the scientific literature, database records, examination of specimens in ornithological collections, field work, and a reconstruction of the distribution range based on Ecological Niche Modeling. Our results support the presence of only one of these two species in the overlapping area, rejecting the existence of such an area of sympatry between the two species. We discuss alternative hypotheses that could explain the historically reported distribution pattern: 1) an error in the single existing historical record; 2) a possible local extinction of the species; and 3) past interspecific competition that has since been resolved under the principle of competitive exclusion. We propose that the Santo Domingo River in northern Oaxaca and the western slope of the Sierra Madre Oriental mark the distribution limits between these species. PMID:28863140
Stability Analysis of Distributed Order Fractional Chen System
Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.
2013-01-01
We first investigate sufficient and necessary conditions for the stability of nonlinear distributed order fractional systems and then generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of the distributed order fractional Chen system is discussed. In addition, we find that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508
NASA Astrophysics Data System (ADS)
Shankman, C.; Kavelaars, JJ.; Gladman, B. J.; Alexandersen, M.; Kaib, N.; Petit, J.-M.; Bannister, M. T.; Chen, Y.-T.; Gwyn, S.; Jakubik, M.; Volk, K.
2016-02-01
We measure the absolute magnitude, H, distribution, dN(H) ∝ 10^(αH), of the scattering Trans-Neptunian Objects (TNOs) as a proxy for their size-frequency distribution. We show that the H-distribution of the scattering TNOs is not consistent with a single-slope distribution, but must transition around H_g ≈ 9 to either a knee with a shallow slope or to a divot, which is a differential drop followed by a second exponential distribution. Our analysis is based on a sample of 22 scattering TNOs drawn from three different TNO surveys (the Canada-France Ecliptic Plane Survey, Alexandersen et al., and the Outer Solar System Origins Survey, all of which provide well-characterized detection thresholds), combined with a cosmogonic model for the formation of the scattering TNO population. Our measured absolute magnitude distribution result is independent of the choice of cosmogonic model. Based on our analysis, we estimate that the number of scattering TNOs is (2.4-8.3) × 10^5 for H_r < 12. A divot H-distribution is seen in a variety of formation scenarios and may explain several puzzles in Kuiper Belt science. We find that a divot H-distribution simultaneously explains the observed scattering TNO, Neptune Trojan, Plutino, and Centaur H-distributions while predicting a scattering TNO population large enough to act as the sole supply of the Jupiter-Family Comets.
Bluetooth-based distributed measurement system
NASA Astrophysics Data System (ADS)
Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng
2007-07-01
A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are performed so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed together with a measurement flow chart for the Bluetooth-based distributed measurement system, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been successfully used in the Daqing oilfield, China, to measure parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.
Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis
NASA Astrophysics Data System (ADS)
Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi
2018-02-01
The study aims at regional and probabilistic evaluation of bivariate drought characteristics to assess both the past and future drought duration and severity in Bangladesh. The procedures involve applying (1) standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using Fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while to exceed either value are 40% less than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. These suggest that compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under/overestimate the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed in the west side of the country. Future drought trend based on four climate models and two scenarios showed the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
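A minimal sketch of the copula step is given below, assuming a Gumbel-Hougaard copula with an illustrative dependence parameter and placeholder marginals; the study's fitted distributions, parameters and regionalization are not reproduced.

```python
# Joint ("AND"/"OR") return periods of drought duration and severity from
# a Gumbel-Hougaard copula. Marginals, theta and mu are illustrative.
import numpy as np
from scipy import stats

def gumbel_copula(u, v, theta):
    # C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))

# Placeholder marginal CDFs for duration (months) and severity.
F_dur = stats.gamma(a=1.5, scale=2.0).cdf
F_sev = stats.genextreme(c=-0.1, loc=3.0, scale=1.2).cdf

mu = 0.5        # mean interarrival time of drought events, years
theta = 2.5     # copula dependence parameter (illustrative)

d, s = 6.0, 5.0                 # event: duration >= 6 months, severity >= 5
u, v = F_dur(d), F_sev(s)
C = gumbel_copula(u, v, theta)

T_and = mu / (1 - u - v + C)    # both thresholds exceeded
T_or = mu / (1 - C)             # either threshold exceeded
print(f"T_and ~ {T_and:.1f} yr, T_or ~ {T_or:.1f} yr")
```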
Complementing Gender Analysis Methods.
Kumar, Anant
2016-01-01
The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also incorporates the concept and role of social capital, equity, and doing gender. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on the existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory for resolving gender conflict by using the concept of social and psychological capital.
Final Report - Cloud-Based Management Platform for Distributed, Multi-Domain Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhury, Pulak; Mukherjee, Biswanath
2017-11-03
In this Department of Energy (DOE) Small Business Innovation Research (SBIR) Phase II project final report, Ennetix presents the development of a solution for end-to-end monitoring, analysis, and visualization of network performance for distributed networks. This solution benefits enterprises of all sizes, operators of distributed and federated networks, and service providers.
Gaze Step Distributions Reflect Fixations and Saccades: A Comment on Stephen and Mirman (2010)
ERIC Educational Resources Information Center
Bogartz, Richard S.; Staub, Adrian
2012-01-01
In three experimental tasks Stephen and Mirman (2010) measured gaze steps, the distance in pixels between gaze positions on successive samples from an eyetracker. They argued that the distribution of gaze steps is best fit by the lognormal distribution, and based on this analysis they concluded that interactive cognitive processes underlie eye…
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
NEAT: an efficient network enrichment analysis test.
Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C
2016-09-05
Network enrichment analysis is a powerful method that integrates gene set enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichment is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and by inspecting associations between the functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN (https://cran.r-project.org/package=neat).
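The hypergeometric idea can be sketched as follows with illustrative counts; the variable names are hypothetical and the R package's exact interface and null model details are not reproduced.

```python
# Hypergeometric enrichment test in the spirit of NEAT: is the observed
# number of links from gene set A into gene set B larger than expected
# under a hypergeometric null? All counts below are illustrative.
from scipy.stats import hypergeom

total_links = 5000    # total links in the network universe
links_to_B = 400      # links pointing into set B
links_from_A = 250    # links leaving set A
n_AB = 40             # observed links from A into B

# P(X >= n_AB) under the hypergeometric null of no enrichment.
p_enrich = hypergeom.sf(n_AB - 1, total_links, links_to_B, links_from_A)
print(f"enrichment p-value ~ {p_enrich:.3g}")
```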
NASA Technical Reports Server (NTRS)
Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.
1992-01-01
The current controversy regarding regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, which bias the observed nearest-neighbor statistics toward regularity. The interpretation of these nearest-neighbor statistics is discussed for many cases of superposed clustering, randomness, and regularity. A detailed analysis of cumulus cloud field spatial distributions, based upon Landsat, AVHRR, and Skylab data, shows that when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
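A minimal sketch of comparing an empirical nearest-neighbor cumulative distribution with the exact Poisson (complete spatial randomness) expectation G(r) = 1 − exp(−λπr²) follows, on synthetic stand-in coordinates; edge effects are ignored for simplicity.

```python
# Empirical nearest-neighbor CDF vs the Poisson/CSR expectation.
# The "cloud positions" are synthetic stand-ins for satellite retrievals.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
pts = rng.uniform(0, 100, size=(400, 2))        # positions in a 100x100 km box
lam = pts.shape[0] / (100.0 * 100.0)            # intensity, clouds per km^2

# Nearest-neighbor distances (k=2 because the closest point is the point itself).
d, _ = cKDTree(pts).query(pts, k=2)
nn = np.sort(d[:, 1])

G_emp = np.arange(1, nn.size + 1) / nn.size     # empirical CDF
G_poisson = 1 - np.exp(-lam * np.pi * nn ** 2)  # CSR expectation
print("max |G_emp - G_Poisson| =", np.abs(G_emp - G_poisson).max())
```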
Analysis of Digital Communication Signals and Extraction of Parameters.
1994-12-01
Fast Fourier Transform (FFT). The correlation methods utilize modified time-frequency distributions, one of which is based on the Wigner-Ville Distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNRs).
Metagenomic Analysis of Water Distribution System Bacterial Communities
The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...
NASA Astrophysics Data System (ADS)
Schweizer, Steffen; Schlueter, Steffen; Hoeschen, Carmen; Koegel-Knabner, Ingrid; Mueller, Carsten W.
2017-04-01
Soil organic matter (SOM) is distributed on mineral surfaces depending on physicochemical soil properties that vary at the submicron scale. Nanoscale secondary ion mass spectrometry (NanoSIMS) can be used to visualize the spatial distribution of up to seven elements simultaneously at a lateral resolution of approximately 100 nm, from which patterns of SOM coatings can be derived. Existing computational methods are mostly confined to visualization and lack spatial quantification measures of the coverage and connectivity of organic matter coatings. This study proposes a methodology for the spatial analysis of SOM coatings based on supervised pixel classification and automatic image analysis of the 12C, 12C14N (indicative of SOM) and 16O (indicative of mineral surfaces) secondary ion distributions. The segmentation of the secondary ion images into mineral particle surface and organic coating was done with a machine learning algorithm, which accounts for multiple features such as size, color, intensity, edge and texture in all three ion distributions simultaneously. Our workflow allowed the spatial analysis of differences in SOM coverage during soil development in the Damma glacier forefield (Switzerland) based on NanoSIMS measurements (n=121, containing ca. 4000 particles). The Damma chronosequence comprises several stages of soil development with increasing ice-free period (from ca. 15 to >700 years). To investigate mineral-associated SOM in the developing soil, we obtained clay fractions (<2 μm) from two density fractions: light mineral (1.6 to 2.2 g cm−3) and heavy mineral (>2.2 g cm−3). We found increased coverage and a simultaneous development from patchily distributed organic coatings to more connected coatings with increasing time after glacial retreat. The normalized N:C ratio (12C14N:(12C14N + 12C)) on the organic matter coatings was higher in the medium-aged soils than in the young and mature ones in both the heavy and light mineral fractions. This reflects the sequential accumulation of proteinaceous SOM in the medium-aged soils and of C-rich compounds in the mature soils. The results of our microscale image analysis correlated well with the SOM concentrations of the fractions measured by an elemental analyzer. Image analysis in combination with secondary ion distributions provides a powerful tool at the required microscale and enhances our mechanistic understanding of SOM stabilization in soil.
Structural Analysis via Generalized Interactive Graphics - STAGING. Volume III. System Manual.
1979-09-01
DISTRIBUTION UNLIMITED 17 DISTRIBUTION ST ATEMENT (of the abnsrct entered in Block 20. it different from, Report) IS SJPPLEMENTARY NOTES I9 KEY WORDS ILCI-lue on...prefixMENUDRIVER and the menu data base itself is cataloged as prefixMENU. The maintanance of the STAGING Material Property Data Base (MPDB...Property Data Base System, and conversion routines as describe,: -n Section 1.2 through 1.6. If any difficulties arise due to differences in operatire
Cheng, Rui; Xia, Li; Sima, Chaotan; Ran, Yanli; Rohollahnejad, Jalal; Zhou, Jiaao; Wen, Yongqiang; Yu, Can
2016-02-08
Ultrashort fiber Bragg gratings (US-FBGs) have significant potential as weak grating sensors for distributed sensing, but their exploitation has been limited by their inherently broad spectra, which are undesirable for most traditional wavelength measurements. To address this, we have recently introduced a new interrogation concept using shifted optical Gaussian filters (SOGF) that is well suited to US-FBG measurements. Here, we apply it to demonstrate, for the first time, a US-FBG-based self-referencing distributed optical sensing technique, with the advantages of adjustable sensitivity and range, high-speed and wide-range (potentially >14000 με) intensity-based detection, and resistance to disturbance by nonuniform parameter distribution. The entire system is essentially based on a microwave network, which incorporates the SOGF with a fiber delay line between the two arms. Differential detections of the cascaded US-FBGs are performed individually in the network time-domain response, which can be obtained by analyzing its complex frequency response. Experimental results are presented and discussed using eight cascaded US-FBGs. A comprehensive numerical analysis is also conducted to assess the system performance, which shows that the use of US-FBGs instead of conventional weak FBGs could significantly improve the power budget and capacity of the distributed sensing system while maintaining the crosstalk level and intensity decay rate, providing a promising route for future sensing applications.
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and the concurrence of extreme events significantly increases the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted-average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
Stability Analysis of Distributed Engine Control Systems Under Communication Packet Drop (Postprint)
2008-07-01
Currently, Full Authority Digital Engine Control (FADEC), based on a centralized architecture framework, is widely used for gas turbine engine control. However, current FADEC is not able to meet the…system (DEC). FADEC based on Distributed Control Systems (DCS) offers modularity, improved control system prognostics and fault tolerance along with…
Fajardo, Javier; Lessmann, Janeth; Bonaccorso, Elisa; Devenish, Christian; Muñoz, Jesús
2014-01-01
Conservation planning is crucial for megadiverse countries where biodiversity is coupled with incomplete reserve systems and limited resources to invest in conservation. Using Peru as an example of a megadiverse country, we asked whether the national system of protected areas satisfies biodiversity conservation needs. Further, to complement the existing reserve system, we identified and prioritized potential conservation areas using a combination of species distribution modeling, conservation planning and connectivity analysis. Based on a set of 2,869 species, including mammals, birds, amphibians, reptiles, butterflies, and plants, we used species distribution models to represent species' geographic ranges to reduce the effect of biased sampling and partial knowledge about species' distributions. A site-selection algorithm then searched for efficient and complementary proposals, based on the above distributions, for a more representative system of protection. Finally, we incorporated connectivity among areas in an innovative post-hoc analysis to prioritize those areas maximizing connectivity within the system. Our results highlight severe conservation gaps in the Coastal and Andean regions, and we propose several areas, which are not currently covered by the existing network of protected areas. Our approach helps to find areas that contribute to creating a more representative, connected and efficient network. PMID:25479411
A fully traits-based approach to modeling global vegetation distribution.
van Bodegom, Peter M; Douma, Jacob C; Verheijen, Lieneke M
2014-09-23
Dynamic Global Vegetation Models (DGVMs) are indispensable for our understanding of climate change impacts. The application of traits in DGVMs is increasingly refined. However, a comprehensive analysis of the direct impacts of trait variation on global vegetation distribution does not yet exist. Here, we present such analysis as proof of principle. We run regressions of trait observations for leaf mass per area, stem-specific density, and seed mass from a global database against multiple environmental drivers, making use of findings of global trait convergence. This analysis explained up to 52% of the global variation of traits. Global trait maps, generated by coupling the regression equations to gridded soil and climate maps, showed up to orders of magnitude variation in trait values. Subsequently, nine vegetation types were characterized by the trait combinations that they possess using Gaussian mixture density functions. The trait maps were input to these functions to determine global occurrence probabilities for each vegetation type. We prepared vegetation maps, assuming that the most probable (and thus, most suited) vegetation type at each location will be realized. This fully traits-based vegetation map predicted 42% of the observed vegetation distribution correctly. Our results indicate that a major proportion of the predictive ability of DGVMs with respect to vegetation distribution can be attained by three traits alone if traits like stem-specific density and seed mass are included. We envision that our traits-based approach, our observation-driven trait maps, and our vegetation maps may inspire a new generation of powerful traits-based DGVMs.
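In the spirit of that final step, a minimal sketch with scikit-learn Gaussian mixtures on synthetic trait data follows; the nine vegetation types, trait values and model settings of the study are not reproduced, and only two hypothetical types are shown.

```python
# Vegetation-type assignment from traits via per-type Gaussian mixture
# densities: fit one mixture per type, then assign a grid cell's predicted
# traits to the type with the highest density. All data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Columns: log(leaf mass per area), stem-specific density, log(seed mass).
traits_type1 = rng.normal([4.0, 0.6, 1.0], 0.2, size=(300, 3))
traits_type2 = rng.normal([4.8, 0.4, 2.0], 0.2, size=(300, 3))

gm1 = GaussianMixture(n_components=2, random_state=0).fit(traits_type1)
gm2 = GaussianMixture(n_components=2, random_state=0).fit(traits_type2)

# For one grid cell's predicted trait combination, pick the most probable type.
cell = np.array([[4.6, 0.45, 1.8]])
log_p = [gm1.score_samples(cell)[0], gm2.score_samples(cell)[0]]
print("most probable vegetation type:", int(np.argmax(log_p)) + 1)
```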
Efficient transformer study: Analysis of manufacture and utility data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burkes, Klaehn; Cordaro, Joe; McIntosh, John
Distribution transformers convert power from the distribution system voltage to the end-customer voltage, serving residences, businesses, distributed generation, campus systems, and manufacturing facilities. Amorphous metal distribution transformers (AMDT) are more efficient but also more expensive and heavier than conventional silicon steel distribution transformers. This, together with the difficulty of measuring the benefit of energy efficiency and low awareness of the technology, has hindered the adoption of AMDT. This report presents the cost savings from installing AMDT and the amount of energy saved based on the improved efficiency.
Log-Normal Distribution of Cosmic Voids in Simulations and Mocks
NASA Astrophysics Data System (ADS)
Russell, E.; Pycke, J.-R.
2017-01-01
Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
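For reference, a three-parameter log-normal (shape, location/threshold, scale) can be fit to void radii as sketched below with scipy; the radii here are synthetic, not drawn from the Cosmic Void Catalog.

```python
# Three-parameter log-normal fit: scipy parameterizes it as
# shape s = sigma, loc = threshold, scale = exp(mu). Radii are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
radii = stats.lognorm.rvs(s=0.45, loc=2.0, scale=8.0, size=2000,
                          random_state=rng)       # illustrative units (Mpc/h)

sigma, loc, scale = stats.lognorm.fit(radii)      # 3-parameter MLE
ks = stats.kstest(radii, "lognorm", args=(sigma, loc, scale))
print(f"sigma={sigma:.3f}, loc={loc:.3f}, scale={scale:.3f}, "
      f"KS p-value={ks.pvalue:.3f}")
```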
Fluctuation-Enhanced Sensing for Biological Agent Detection and Identification
2009-01-01
…method for bacterium detection published earlier; sensing and evaluating the odors of microbes; and spectral and amplitude distribution analysis of noise in light scattering to identify spores based on their…
NASA Astrophysics Data System (ADS)
Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao
2013-09-01
The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigation is still needed into techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measurement named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative description of scatter distribution patterns in various regions and at various temporal scales. We apply this method to heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, and find that the discrimination between them is most significant in the first quadrant, which implies significant impacts on vagal regulation brought about by CHF. We also investigate the day-night differences of young healthy people, and the results present a clear circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate with different trends as the scale factor varies. The same phenomenon also appears in the circadian rhythm investigation of young healthy subjects, which implies that the cardiac dynamic system is affected differently at various temporal scales by physiological or pathological factors.
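A simplified stand-in for the quadrant-wise measure is sketched below: Poincaré points (RR_n, RR_{n+1}) are split into four quadrants about their center, and a Shannon entropy of the binned point distribution is computed per quadrant. The paper's exact DE definition and multi-scale procedure are not reproduced.

```python
# Quadrant-wise Shannon entropy of a binned Poincaré plot, a simplified
# stand-in for the paper's distribution entropy (DE). RR series is synthetic.
import numpy as np

def quadrant_distribution_entropy(rr, bins=16):
    x, y = rr[:-1], rr[1:]                    # Poincaré plot coordinates
    cx, cy = x.mean(), y.mean()               # quadrant origin
    quadrants = [(x >= cx) & (y >= cy), (x < cx) & (y >= cy),
                 (x < cx) & (y < cy), (x >= cx) & (y < cy)]
    entropies = []
    for mask in quadrants:
        h, _, _ = np.histogram2d(x[mask], y[mask], bins=bins)
        p = h[h > 0] / h.sum()                # normalized occupancy
        entropies.append(float(-(p * np.log(p)).sum()))
    return entropies

rr = 0.8 + 0.05 * np.random.default_rng(9).standard_normal(3000)  # RR, s
print(quadrant_distribution_entropy(rr))
```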
NASA Astrophysics Data System (ADS)
Xiang, Xing; Wang, Ruicheng; Wang, Hongmei; Gong, Linfeng; Man, Baiying; Xu, Ying
2017-03-01
The high abundance and widespread distribution of the archaeal phylum Bathyarchaeota in marine environments have been recognized recently, but knowledge about Bathyarchaeota in terrestrial settings and their correlation with environmental parameters is fairly limited. Here we report the abundance of Bathyarchaeota members across different ecosystems and their correlation with environmental factors by constructing 16S rRNA clone libraries of peat from the Dajiuhu Peatland, coupled with bioinformatics analysis of the 16S rRNA data available to date in the NCBI database. In total, 1456 Bathyarchaeota sequences from 28 sites were subjected to UniFrac analysis based on phylogenetic distance and to multivariate regression tree analysis of taxonomy. Both phylogenetic and taxon-based approaches showed that salinity, total organic carbon and temperature significantly influenced the distribution of Bathyarchaeota across different terrestrial habitats. By applying the ecological concept of 'indicator species', we identify 9 indicator groups among the 6 habitats, with the most found in estuary sediments. Network analysis showed that members of Bathyarchaeota formed the "backbone" of the archaeal community and often co-occurred with Methanomicrobia. These results suggest that Bathyarchaeota may play an important ecological role within archaeal communities via a potential symbiotic association with Methanomicrobia. Our results shed light on the biogeography and potential functions of Bathyarchaeota and the environmental conditions that influence their distribution in terrestrial settings.
Andrews, Ross N; Serio, Joseph; Muralidharan, Govindarajan; Ilavsky, Jan
2017-06-01
Intermetallic γ' precipitates typically strengthen nickel-based superalloys. The shape, size and spatial distribution of strengthening precipitates critically influence alloy strength, while their temporal evolution characteristics determine the high-temperature alloy stability. Combined ultra-small-, small- and wide-angle X-ray scattering (USAXS-SAXS-WAXS) analysis can be used to evaluate the temporal evolution of an alloy's precipitate size distribution (PSD) and phase structure during in situ heat treatment. Analysis of PSDs from USAXS-SAXS data employs either least-squares fitting of a preordained PSD model or a maximum entropy (MaxEnt) approach, the latter avoiding a priori definition of a functional form of the PSD. However, strong low- q scattering from grain boundaries and/or structure factor effects inhibit MaxEnt analysis of typical alloys. This work describes the extension of Bayesian-MaxEnt analysis methods to data exhibiting structure factor effects and low- q power law slopes and demonstrates their use in an in situ study of precipitate size evolution during heat treatment of a model Ni-Al-Si alloy.
Andrews, Ross N.; Serio, Joseph A.; Muralidharan, Govindarajan; ...
2017-05-30
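To make the MaxEnt step concrete, the following is a minimal sketch that recovers a sphere PSD from synthetic small-angle scattering intensities by maximizing the Skilling entropy against a chi-squared misfit. The kernel, grids, regularization weight and synthetic data are illustrative assumptions; the paper's Bayesian-MaxEnt handling of structure factors and low-q power laws is not reproduced here.

```python
# Sketch: maximum-entropy PSD recovery for dilute spherical precipitates.
# All values are illustrative, not the paper's algorithm or data.
import numpy as np
from scipy.optimize import minimize

def sphere_kernel(q, r):
    """Scattering kernel |F(q,r)|^2 V(r)^2 for spheres of radius r."""
    qr = np.outer(q, r)
    F = 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3
    V = 4.0 / 3.0 * np.pi * r**3
    return (F * V)**2

q = np.logspace(-2, 0, 60)        # scattering vector, 1/nm
r = np.linspace(1.0, 50.0, 80)    # radius grid, nm
K = sphere_kernel(q, r)

# Synthetic "measured" intensity from a lognormal PSD.
true_psd = np.exp(-0.5 * (np.log(r / 15.0) / 0.3)**2)
I_obs = K @ true_psd
sigma = 0.02 * I_obs

m = np.full_like(r, true_psd.mean())  # flat default model
alpha = 1.0                           # regularization weight (chosen by
                                      # Bayesian evidence in real MaxEnt)

def objective(log_f):
    f = np.exp(log_f)                 # positivity by construction
    chi2 = np.sum(((K @ f - I_obs) / sigma)**2)
    entropy = np.sum(f - m - f * np.log(f / m))  # Skilling entropy, <= 0
    return 0.5 * chi2 - alpha * entropy

res = minimize(objective, np.log(m), method="L-BFGS-B")
f_maxent = np.exp(res.x)
print("recovered mode radius:", r[np.argmax(f_maxent)], "nm")
```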
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1993-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful at small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1992-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
NASA Astrophysics Data System (ADS)
Cox, M.; Shirono, K.
2017-10-01
A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM's Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
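A minimal numerical sketch of the idea, assuming a normal likelihood, a flat prior on the mean and a uniform prior on the standard deviation between assumed bounds; the paper's closed-form multiplier is not reproduced here, the posterior is simply integrated on a grid.

```python
# Sketch: Type A evaluation with an informative prior on sigma.
# Data and prior bounds are illustrative assumptions.
import numpy as np

x = np.array([10.03, 10.01, 9.98])      # n = 3 observations
n, xbar, s = len(x), x.mean(), x.std(ddof=1)

sig_lo, sig_hi = 0.005, 0.05            # assumed prior bounds on sigma
mu = np.linspace(xbar - 10 * s, xbar + 10 * s, 2001)
sig = np.linspace(sig_lo, sig_hi, 400)
MU, SIG = np.meshgrid(mu, sig, indexing="ij")

# Normal likelihood; flat prior on mu, uniform prior on sigma in bounds.
ss = np.sum((x[:, None, None] - MU[None])**2, axis=0)
log_post = -n * np.log(SIG) - ss / (2 * SIG**2)
post = np.exp(log_post - log_post.max())

# Marginalize sigma on the grid, then take posterior moments of mu.
p_mu = post.sum(axis=1)
p_mu /= p_mu.sum()
mu_mean = np.sum(mu * p_mu)
u = np.sqrt(np.sum((mu - mu_mean)**2 * p_mu))

print(f"standard uncertainty u = {u:.4g}")
print(f"multiplier over s/sqrt(n) = {u / (s / np.sqrt(n)):.3f}")
```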
Dislocation, crystallite size distribution and lattice strain of magnesium oxide nanoparticles
NASA Astrophysics Data System (ADS)
Sutapa, I. W.; Wahid Wahab, Abdul; Taba, P.; Nafie, N. L.
2018-03-01
Magnesium oxide nanoparticles were synthesized using the sol-gel method, and an analysis of their structural properties was conducted. The functional groups of the nanoparticles were analysed by Fourier transform infrared spectroscopy (FT-IR). Dislocations, average crystallite size, strain, stress, crystal energy density, crystallite size distribution and crystal morphology were determined based on X-ray diffraction profile analysis. The morphology of the crystals was analysed based on SEM images. The crystallite size distribution was calculated on the assumption that the particle size follows a lognormal distribution. The preferred crystal orientations were determined from the crystal texture in the X-ray diffraction data. FT-IR results showed the stretching vibration mode of Mg-O-Mg as a broad band in the range of 400.11-525 cm⁻¹. The average crystallite size of the resulting nanoparticles is 9.21 nm, with a crystal dislocation density of 0.012 nm⁻². The strain, stress and crystal energy density are 1.5 × 10⁻⁴, 37.31 MPa and 0.72 MPa, respectively. The highest texture coefficient value of the crystal is 0.98. This result is supported by morphological analysis using SEM, which shows mostly regular cubic-shaped crystals. The synthesis method is suitable as a simple and cost-effective synthesis model for MgO nanoparticles.
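These quantities follow standard XRD profile relations (Scherrer size, dislocation density δ = 1/D², Williamson-Hall strain); note that 1/(9.21 nm)² ≈ 0.012 nm⁻², matching the quoted dislocation density. A minimal sketch with an illustrative MgO reflection (peak position and width assumed, not the paper's raw data):

```python
# Sketch: standard XRD profile relations for crystallite size,
# dislocation density and strain. Peak values are illustrative.
import numpy as np

LAMBDA = 0.15406   # Cu K-alpha wavelength, nm
K = 0.9            # Scherrer shape factor (assumed)

def scherrer_size(beta_rad, theta_rad):
    """Crystallite size D = K * lambda / (beta * cos(theta))."""
    return K * LAMBDA / (beta_rad * np.cos(theta_rad))

# Illustrative MgO (200) reflection: 2-theta = 42.9 deg, FWHM = 0.95 deg.
two_theta = np.radians(42.9)
beta = np.radians(0.95)

D = scherrer_size(beta, two_theta / 2)           # crystallite size, nm
delta = 1.0 / D**2                               # dislocation density, nm^-2
strain = beta / (4.0 * np.tan(two_theta / 2))    # single-peak W-H strain
# (With real profiles, size and strain broadening are separated over
#  several reflections; this single-peak value is an upper bound.)

print(f"D = {D:.2f} nm, delta = {delta:.3f} nm^-2, strain = {strain:.2e}")
```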
Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A
2015-01-01
This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
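A minimal sketch of the kind of reference-model fitting described above: non-linear regression of a lognormal CDF against an empirical cumulative number-based distribution, with relative standard errors (RSEs) taken from the fit covariance. The simulated diameters and starting values are assumptions, not RM8012 measurements.

```python
# Sketch: lognormal reference-model fit to TEM area-equivalent diameters.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import lognorm

rng = np.random.default_rng(0)
d = rng.lognormal(mean=np.log(27.6), sigma=0.1, size=500)  # nm (synthetic)

# Empirical cumulative number-based distribution.
d_sorted = np.sort(d)
ecdf = (np.arange(len(d)) + 0.5) / len(d)

def lognormal_cdf(x, mu, sigma):
    return lognorm.cdf(x, s=sigma, scale=np.exp(mu))

popt, pcov = curve_fit(lognormal_cdf, d_sorted, ecdf, p0=[np.log(25), 0.2])
perr = np.sqrt(np.diag(pcov))
rse = perr / np.abs(popt) * 100   # relative standard errors, %

print(f"fitted median diameter: {np.exp(popt[0]):.1f} nm")
print(f"RSE(mu) = {rse[0]:.2f}%, RSE(sigma) = {rse[1]:.2f}%")
```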
Ownership reform and the changing manufacturing landscape in Chinese cities: The case of Wuxi.
Zhou, Lei; Yang, Shan; Wang, Shuguang; Xiong, Liyang
2017-01-01
Since the economic transition, manufacturing in China has undergone profound changes not only in number of enterprises, but also in ownership structure and intra-urban spatial distribution. Investigating the changing manufacturing landscape from the perspective of ownership structure is critical to a deep understanding of the changing role of market and government in re-shaping manufacturing location behavior. Through a case study of Wuxi, a city experiencing comprehensive ownership reform, this paper presents a detailed analysis of the intra-urban spatial shift of manufacturing, identifies the location discrepancies, and examines the underlying forces responsible for the geographical differentiations. Through zone- and district-based analysis, a distinctive trend of decentralization and suburbanization, as well as an uneven distribution of manufacturing, is unveiled. The results of Location Quotient analysis show that the distribution of manufacturing by ownership exhibits distinctive spatial patterns, which is characterized by a historically-based, market-led, and institutionally-created spatial variation. By employing Hot Spot analysis, the role of development zones in attracting manufacturing enterprises of different ownerships is established. Overall, the location behavior of the diversified manufacturing has been increasingly based on the forces of market since the land marketization began. A proactive role played by local governments has also guided the enterprise location decision through spatial planning and regulatory policies.
Ownership reform and the changing manufacturing landscape in Chinese cities: The case of Wuxi
Zhou, Lei; Yang, Shan; Wang, Shuguang
2017-01-01
PMID:28278284
Probabilistic Analysis of a Composite Crew Module
NASA Technical Reports Server (NTRS)
Mason, Brian H.; Krishnamurthy, Thiagarajan
2011-01-01
An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10⁻¹¹ due to the conservative nature of the factors of safety on the deterministic loads.
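A minimal Monte Carlo sketch of the reliability calculation described above: failure occurs when a strength-based failure index exceeds 1. The limit-state function, distributions and coefficients of variation are illustrative assumptions, not the CCM model.

```python
# Sketch: crude Monte Carlo estimate of probability of first-ply failure.
import numpy as np

rng = np.random.default_rng(42)
N = 2_000_000

# Load scale factor and strength allowable, both lognormal with assumed
# coefficients of variation (CoV).
load = rng.lognormal(mean=np.log(1.0), sigma=0.08, size=N)      # CoV ~ 8%
strength = rng.lognormal(mean=np.log(1.5), sigma=0.10, size=N)  # CoV ~ 10%

failure_index = load / strength        # deterministic FI ~ 0.67 here
pf = np.mean(failure_index > 1.0)

print(f"estimated probability of first-ply failure: {pf:.2e}")
# For rare events like the ~1e-11 reported above, crude Monte Carlo is
# impractical; FORM or conditional sampling (as the paper uses) is needed.
```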
TOPICS IN THEORY OF GENERALIZED PARTON DISTRIBUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radyushkin, Anatoly V.
Several topics in the theory of generalized parton distributions (GPDs) are reviewed. First, we give a brief overview of the basics of the theory of generalized parton distributions and their relationship with simpler phenomenological functions, viz. form factors, parton densities and distribution amplitudes. Then, we discuss recent developments in building models for GPDs that are based on the formalism of double distributions (DDs). Special attention is given to a careful analysis of the singularity structure of DDs. The DD formalism is applied to construction of model GPDs with a singular Regge behavior. Within the developed DD-based approach, we discuss the structure of GPD sum rules. It is shown that separation of DDs into the so-called ``plus'' part and the $D$-term part may be treated as a renormalization procedure for the GPD sum rules. This approach is compared with an alternative prescription based on analytic regularization.
Aerodynamic design of a rotor blade for minimum noise radiation
NASA Technical Reports Server (NTRS)
Karamcheti, K.; Yu, Y. H.
1974-01-01
An analysis of the aerodynamic design of a hovering rotor blade for obtaining minimum aerodynamic rotor noise has been carried out. In this analysis, which is based on both acoustical and aerodynamic considerations, attention is given only to the rotational noise due to the pressure fluctuations on the blade surfaces. The lift distribution obtained in this analysis has different characteristics from those of the conventional distribution. The present distribution shows negative lift values over a quarter of the span from the blade tip, and a maximum lift at about the midspan. Results are presented to show that the noise field is considerably affected by the shape of the lift distribution along the blade and that noise reduction of about 5 dB may be obtained by designing the rotor blade to yield minimum noise.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where the functionality to complete a mission is disseminated across multiple UAVs (distributed), as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Bivariate drought frequency analysis using the copula method
NASA Astrophysics Data System (ADS)
Mirabbasi, Rasoul; Fakheri-Fard, Ahmad; Dinpashoh, Yagob
2012-04-01
Droughts are major natural hazards with significant environmental and economic impacts. In this study, two-dimensional copulas were applied to the analysis of the meteorological drought characteristics of the Sharafkhaneh gauge station, located in the northwest of Iran. Two major drought characteristics, duration and severity, as defined by the standardized precipitation index, were abstracted from observed drought events. Since drought duration and severity exhibited a significant correlation and since they were modeled using different distributions, copulas were used to construct the joint distribution function of the drought characteristics. The parameter of copulas was estimated using the method of the Inference Function for Margins. Several copulas were tested in order to determine the best data fit. According to the error analysis and the tail dependence coefficient, the Galambos copula provided the best fit for the observed drought data. Some bivariate probabilistic properties of droughts, based on the derived copula-based joint distribution, were also investigated. These probabilistic properties can provide useful information for water resource planning and management.
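A minimal sketch of copula-based joint return periods for drought duration and severity using the Galambos copula named above. The marginal families, dependence parameter and mean interarrival time are illustrative assumptions, not the fitted Sharafkhaneh values.

```python
# Sketch: Galambos copula joint return periods for drought D and S.
import numpy as np
from scipy.stats import gamma, expon

def galambos_cdf(u, v, theta):
    """Galambos copula C(u,v) = u*v*exp(((-ln u)^-t + (-ln v)^-t)^(-1/t))."""
    a = (-np.log(u))**(-theta) + (-np.log(v))**(-theta)
    return u * v * np.exp(a**(-1.0 / theta))

theta = 1.2        # dependence parameter (assumed, e.g. from IFM fitting)
mu_years = 0.8     # mean drought interarrival time, years (assumed)

# Marginals (assumed families): duration ~ exponential, severity ~ gamma.
dur, sev = 6.0, 8.0                      # event of interest
u = expon.cdf(dur, scale=4.0)
v = gamma.cdf(sev, a=2.0, scale=3.0)

C = galambos_cdf(u, v, theta)
T_or = mu_years / (1.0 - C)              # D > d OR S > s
T_and = mu_years / (1.0 - u - v + C)     # D > d AND S > s

print(f"OR return period: {T_or:.1f} yr, AND return period: {T_and:.1f} yr")
```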
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
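A minimal sketch of the distributed workflow described above: opening a large netCDF archive lazily with XArray and Dask, then computing a simple indicator. The file pattern, variable name, threshold units and selected year are placeholders.

```python
# Sketch: lazy, chunked climate-indicator computation with xarray + Dask.
import xarray as xr
from dask.distributed import Client

client = Client()  # local cluster here; could connect to a remote one

# Lazily open many netCDF files as one dataset, chunked for parallelism.
ds = xr.open_mfdataset("tasmax_day_*.nc", combine="by_coords",
                       chunks={"time": 365})

# Indicator: annual count of days with tasmax above 35 C, per grid cell
# (assumes the variable is stored in degrees Celsius).
hot_days = (ds["tasmax"] > 35.0).groupby("time.year").sum("time")

# Trigger the distributed computation and pull a small result locally.
result = hot_days.compute()
print(result.sel(year=2050).values)
```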
Laser fluorescence fluctuation excesses in molecular immunology experiments
NASA Astrophysics Data System (ADS)
Galich, N. E.; Filatov, M. V.
2007-04-01
A novel approach to the statistical analysis of flow cytometry fluorescence data has been developed and applied to population analysis of blood neutrophils stained with hydroethidine during the respiratory burst reaction. The staining is based on intracellular oxidation of hydroethidine to ethidium bromide, which intercalates into cell DNA. Fluorescence of the resultant product serves as a measure of the neutrophil ability to generate superoxide radicals after induction of the respiratory burst reaction by phorbol myristate acetate (PMA). It was demonstrated that polymorphonuclear leukocytes of persons with inflammatory diseases showed a considerably changed response. The cytofluorometric histograms obtained carry unique information about the condition of the neutrophil population, which may allow determination of the type of pathological process connected with such inflammation. The novel approach to histogram analysis is based on the analysis of the high-moment dynamics of the distribution. The features of the fluctuation excesses of the distribution carry unique information about the disease under consideration.
NASA Astrophysics Data System (ADS)
Ugolnikov, Oleg S.; Maslov, Igor A.
2018-03-01
Polarization measurements of the twilight background with Wide-Angle Polarization Camera (WAPC) are used to detect the depolarization effect caused by stratospheric aerosol near the altitude of 20 km. Based on a number of observations in central Russia in spring and summer 2016, we found the parameters of lognormal size distribution of aerosol particles. This confirmed the previously published results of the colorimetric method as applied to the same twilights. The mean particle radius (about 0.1 micrometers) and size distribution are also in agreement with the recent data of in situ and space-based remote sensing of stratospheric aerosol. Methods considered here provide two independent techniques of the stratospheric aerosol study based on the twilight sky analysis.
BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.
Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron
2009-06-01
BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.
NASA Astrophysics Data System (ADS)
Loye, A.; Jaboyedoff, M.; Pedrazzini, A.
2009-10-01
The availability of high resolution Digital Elevation Models (DEM) at a regional scale enables the analysis of topography with high levels of detail. Hence, a DEM-based geomorphometric approach becomes more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified according to the slope angle distribution deduced from the high resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. Terrain is considered a potential rockfall source when its slope angle lies above an angle threshold, defined where the Gaussian distribution of the morphological unit "rock cliffs" becomes dominant over that of "steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. These, however, contain "flat areas", so only slope angle values above the mode of the Gaussian distribution of the morphological unit "steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3200 km²), Switzerland. The results were compared with rockfall sources observed in the field and through orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology over six different DEM resolutions.
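A minimal sketch of the decomposition step: fitting a Gaussian mixture to slope angles and picking the rockfall-source threshold where the "rock cliffs" component becomes dominant over "steep slopes". The synthetic angles and the number of components are assumptions.

```python
# Sketch: Gaussian decomposition of a slope-angle distribution.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic DEM slope angles (deg): plains, footslopes, steep slopes, cliffs.
angles = np.concatenate([
    rng.normal(3, 2, 40000), rng.normal(15, 5, 30000),
    rng.normal(35, 7, 20000), rng.normal(60, 8, 10000),
]).clip(0, 90).reshape(-1, 1)

gmm = GaussianMixture(n_components=4, random_state=0).fit(angles)
order = np.argsort(gmm.means_.ravel())
steep, cliffs = order[-2], order[-1]

# Threshold: smallest angle where the cliff component's responsibility
# (weighted density share) exceeds the steep-slope component's.
grid = np.linspace(0, 90, 901).reshape(-1, 1)
resp = gmm.predict_proba(grid)
threshold = grid[resp[:, cliffs] > resp[:, steep]].min()
print(f"rockfall source threshold: {float(threshold):.1f} deg")
```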
PCB congener analysis with Hall electrolytic conductivity detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edstrom, R.D.
1989-01-01
This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, 1262 along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomas saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the analyzed fish samples during the fall of 1985 collected from the lower James River and lower Chesapeake Bay.
NASA Astrophysics Data System (ADS)
Yin, X.; Chen, G.; Li, W.; Huthchins, D. A.
2013-01-01
Previous work indicated that the capacitive imaging (CI) technique is a useful NDE tool which can be used on a wide range of materials, including metals, glass/carbon fibre composite materials and concrete. The imaging performance of the CI technique for a given application is determined by design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole probe sensitivity distribution based on the finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained. Influencing factors on sensitivity distribution were studied. Comparisons between CI probes with point-to-point triangular electrode pair and back-to-back triangular electrode pair were made based on the analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.
Laboratory E-Notebooks: A Learning Object-Based Repository
ERIC Educational Resources Information Center
Abari, Ilior; Pierre, Samuel; Saliah-Hassane, Hamadou
2006-01-01
During distributed virtual laboratory experiment sessions, a major problem is to be able to collect, store, manage and share heterogeneous data (intermediate results, analysis, annotations, etc) manipulated simultaneously by geographically distributed teammates composing a virtual team. The electronic notebook is a possible response to this…
Radar Imaging Using The Wigner-Ville Distribution
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Kenny, Owen P.; Whitehouse, Harper J.
1989-12-01
The need for analysis of time-varying signals has led to the formulation of a class of joint time-frequency distributions (TFDs). One of these TFDs, the Wigner-Ville distribution (WVD), has useful properties which can be applied to radar imaging. This paper first discusses the radar equation in terms of the time-frequency representation of the signal received from a radar system. It then presents a method of tomographic reconstruction for time-frequency images to estimate the scattering function of the aircraft. An optical architecture is then discussed for the real-time implementation of the analysis method based on the WVD.
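A minimal sketch of the discrete WVD underlying such time-frequency imaging, computed for an analytic test chirp; windowing, interference suppression and the tomographic reconstruction step are beyond this sketch.

```python
# Sketch: discrete Wigner-Ville distribution of a linear chirp.
import numpy as np
from scipy.signal import hilbert

N = 256
t = np.arange(N)
x = np.cos(2 * np.pi * (0.05 + 0.075 * t / N) * t)  # chirp, f: 0.05 -> 0.2
z = hilbert(x)                                      # analytic signal

wvd = np.zeros((N, N))
for n in range(N):
    L = min(n, N - 1 - n)                 # max symmetric lag at time n
    m = np.arange(-L, L + 1)
    kern = np.zeros(N, dtype=complex)
    kern[m % N] = z[n + m] * np.conj(z[n - m])  # instantaneous autocorrelation
    # FFT over lag; bin k corresponds to frequency k/(2N) cycles/sample.
    wvd[n] = np.real(np.fft.fft(kern))

# Peak frequency bin versus time should track the chirp (k ~ 2*f*N).
print(wvd.argmax(axis=1)[::32])
```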
Thermal-stress analysis for a wood composite blade
NASA Technical Reports Server (NTRS)
Fu, K. C.; Harb, A.
1984-01-01
A thermal-stress analysis of a wind turbine blade made of wood composite material is reported. First, the governing partial differential equation for heat conduction is derived; then, a finite element procedure using a variational approach is developed for the solution of the governing equation. Thus, the temperature distribution throughout the blade is determined. Next, based on the temperature distribution, a finite element procedure using a potential energy approach is applied to determine the thermal-stress distribution. A set of results, considered satisfactory, was obtained by computer. All computer programs are contained in the report.
NASA Astrophysics Data System (ADS)
Danilov, A. A.; Kramarenko, V. K.; Nikolaev, D. V.; Rudnev, S. G.; Salamatova, V. Yu; Smirnov, A. V.; Vassilevski, Yu V.
2013-04-01
In this work, an adaptive unstructured tetrahedral mesh generation technology is applied for simulation of segmental bioimpedance measurements using high-resolution whole-body model of the Visible Human Project man. Sensitivity field distributions for a conventional tetrapolar, as well as eight- and ten-electrode measurement configurations are obtained. Based on the ten-electrode configuration, we suggest an algorithm for monitoring changes in the upper lung area.
2006-09-01
Lavoie, D. Kurts, SYNTHETIC ENVIRONMENTS AT THE ENTREPRISE LEVEL: OVERVIEW OF A GOVERNMENT OF CANADA (GOC), ACADEMIA and INDUSTRY DISTRIBUTED...vehicle (UAV) focused to locate the radiological source, and by comparing the performance of these assets in terms of various capability based...framework to analyze homeland security capabilities • Illustrate how a rapidly configured distributed simulation involving academia, industry and
Application of a data-mining method based on Bayesian networks to lesion-deficit analysis
NASA Technical Reports Server (NTRS)
Herskovits, Edward H.; Gerring, Joan P.
2003-01-01
Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.
NASA Astrophysics Data System (ADS)
Ramnath, Vishal
2017-11-01
In the field of pressure metrology the effective area is Ae = A0 (1 + λP), where A0 is the zero-pressure area and λ is the distortion coefficient, and the conventional practice is to construct univariate probability density functions (PDFs) for A0 and λ. As a result, analytical generalized non-Gaussian bivariate joint PDFs have not featured prominently in pressure metrology. Recently, extended lambda distribution based quantile functions have been successfully utilized for summarizing univariate arbitrary PDF distributions of gas pressure balances. Motivated by this development, we investigate the feasibility and utility of extending and applying quantile functions to systems which naturally exhibit bivariate PDFs. Our approach is to utilize the GUM Supplement 1 methodology to solve and generate Monte Carlo based multivariate uncertainty data for an oil based pressure balance laboratory standard that is used to generate known high pressures, which are in turn cross-floated against another pressure balance transfer standard in order to deduce the transfer standard's respective area. We then numerically analyse the uncertainty data by formulating and constructing an approximate bivariate quantile distribution that directly couples A0 and λ, in order to compare and contrast its accuracy to an exact GUM Supplement 2 based uncertainty quantification analysis.
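A minimal GUM Supplement 1-style Monte Carlo sketch for a pressure balance with effective area Ae = A0 (1 + λP), propagated to the generated pressure P = F / Ae (solved by fixed-point iteration). Input PDFs and values are illustrative, not the laboratory standard's.

```python
# Sketch: Monte Carlo uncertainty propagation for a pressure balance.
import numpy as np

rng = np.random.default_rng(7)
M = 200_000

A0 = rng.normal(3.357e-5, 4e-10, M)     # zero-pressure area, m^2 (assumed)
lam = rng.normal(8.0e-13, 5e-14, M)     # distortion coefficient, 1/Pa
F = rng.normal(1682.0, 0.02, M)         # applied force, N

# Fixed-point iteration for P = F / (A0 * (1 + lam * P)).
P = F / A0
for _ in range(10):
    P = F / (A0 * (1.0 + lam * P))

print(f"P = {P.mean():.1f} Pa, u(P) = {P.std(ddof=1):.2f} Pa")
# The joint samples (A0, lam) could then be summarized by a bivariate
# quantile distribution coupling A0 and lambda, as the abstract proposes.
```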
[An EMD based time-frequency distribution and its application in EEG analysis].
Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping
2007-10-01
Hilbert-Huang transform (HHT) is a new time-frequency analytic method to analyze the nonlinear and the non-stationary signals. The key step of this method is the empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMF). In this paper, a new EMD based method for suppressing the cross-term of Wigner-Ville distribution (WVD) is developed and is applied to analyze the epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-term of the WVD effectively with an excellent resolution.
NASA Astrophysics Data System (ADS)
Boashash, Boualem; Lovell, Brian; White, Langford
1988-01-01
Time-Frequency analysis based on the Wigner-Ville Distribution (WVD) is shown to be optimal for a class of signals where the variation of instantaneous frequency is the dominant characteristic. Spectral resolution and instantaneous frequency tracking is substantially improved by using a Modified WVD (MWVD) based on an Autoregressive spectral estimator. Enhanced signal-to-noise ratio may be achieved by using 2D windowing in the Time-Frequency domain. The WVD provides a tool for deriving descriptors of signals which highlight their FM characteristics. These descriptors may be used for pattern recognition and data clustering using the methods presented in this paper.
Binocular optical axis parallelism detection precision analysis based on Monte Carlo method
NASA Astrophysics Data System (ADS)
Ying, Jiaju; Liu, Bingqi
2018-02-01
According to the working principle of the binocular photoelectric instrument optical axis parallelism digital calibration instrument, and in view of all components of the instrument, the various factors affecting system precision are analyzed, and a precision analysis model is then established. Based on the error distribution, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide the error distribution, optimize control of the factors which have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
DataHub: Science data management in support of interactive exploratory analysis
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Rubin, Mark R.
1993-01-01
The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
DataHub - Science data management in support of interactive exploratory analysis
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Rubin, Mark R.
1993-01-01
DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.
Hack, C Eric
2006-04-17
Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
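A toy sketch of the MCMC calibration idea described above, using a Metropolis random walk on the log-parameters of a one-compartment kinetic model C(t) = D/V exp(-kt). The data, priors and proposal width are illustrative; real PBTK calibrations involve many more parameters and hierarchical variability.

```python
# Sketch: Metropolis MCMC calibration of a toy toxicokinetic model.
import numpy as np

rng = np.random.default_rng(3)
t = np.array([0.5, 1, 2, 4, 8, 24.0])                 # sampling times, h
D = 100.0                                             # dose, mg
obs = 5.0 * np.exp(-0.25 * t) * rng.lognormal(0, 0.1, t.size)  # conc, mg/L

guess = np.log([0.2, 15.0])           # prior centres for (ln k, ln V)

def log_post(theta, sigma=0.1):
    k, V = np.exp(theta)
    pred = D / V * np.exp(-k * t)
    # Lognormal measurement error; unit-variance normal priors on logs.
    loglik = -np.sum((np.log(obs) - np.log(pred))**2) / (2 * sigma**2)
    logprior = -np.sum((theta - guess)**2) / 2
    return loglik + logprior

theta, lp, chain = guess.copy(), log_post(guess), []
for _ in range(20000):
    prop = theta + 0.05 * rng.normal(size=2)   # symmetric random walk
    lp_new = log_post(prop)
    if np.log(rng.uniform()) < lp_new - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_new
    chain.append(np.exp(theta))

post = np.array(chain[5000:])                  # discard burn-in
print("posterior k:", post[:, 0].mean().round(3), "+/-", post[:, 0].std().round(3))
print("posterior V:", post[:, 1].mean().round(2), "+/-", post[:, 1].std().round(2))
```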
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusion about a research question from independent studies. Most research on detection and correction for publication bias in meta-analysis focus mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies of estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators will be obtained using maximum likelihood estimation and method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare with the non-parametric Trim and Fill method based on funnel plot. We find that our methods based on truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
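A minimal sketch of the truncation idea: observed effects are kept only above a cutoff, and the underlying mean and truncation proportion are recovered by maximum likelihood on a truncated normal. This follows the spirit of the approach above, not its exact fixed- and random-effects estimators.

```python
# Sketch: publication bias as a truncated-normal MLE problem.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)
mu_true, sd, cutoff = 0.2, 0.4, 0.1
effects = rng.normal(mu_true, sd, 2000)
published = effects[effects > cutoff]          # small effects unpublished

def neg_loglik(params):
    mu, log_sd = params
    s = np.exp(log_sd)
    # Truncated normal log-density: normal renormalized above the cutoff.
    logp = norm.logpdf(published, mu, s) - norm.logsf(cutoff, mu, s)
    return -np.sum(logp)

res = minimize(neg_loglik, x0=[published.mean(), np.log(published.std())],
               method="Nelder-Mead")
mu_hat, sd_hat = res.x[0], np.exp(res.x[1])
trunc_prop = norm.cdf(cutoff, mu_hat, sd_hat)  # estimated unpublished share

print(f"naive mean of published studies: {published.mean():.3f}")
print(f"MLE of underlying mean: {mu_hat:.3f}, truncation proportion: {trunc_prop:.2f}")
```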
Grid-based Continual Analysis of Molecular Interior for Drug Discovery, QSAR and QSPR.
Potemkin, Andrey V; Grishina, Maria A; Potemkin, Vladimir A
2017-01-01
In 1979, R.D. Cramer and M. Milne made a first realization of 3D comparison of molecules by aligning them in space and mapping their molecular fields to a 3D grid. This approach was further developed as DYLOMMS (Dynamic Lattice-Oriented Molecular Modelling System). In 1984, H. Wold and S. Wold proposed the use of partial least squares (PLS) analysis, instead of principal component analysis, to correlate the field values with biological activities. Then, in 1988, the method called CoMFA (Comparative Molecular Field Analysis) was introduced and the appropriate software became commercially available. Since 1988, many 3D QSAR methods, algorithms and their modifications have been introduced for solving virtual drug discovery problems (e.g., CoMSIA, CoMMA, HINT, HASL, GOLPE, GRID, PARM, Raptor, BiS, CiS, ConGO). All the methods can be divided into two groups (classes): 1) methods studying the exterior of molecules; 2) methods studying the interior of molecules. A series of grid-based computational technologies for Continual Molecular Interior analysis (CoMIn) is presented in the current paper. The grid-based analysis is fulfilled by means of a lattice construction, analogously to many other grid-based methods. The further continual elucidation of molecular structure is performed in two ways. (i) In terms of intermolecular interaction potentials, represented as a superposition of Coulomb and Van der Waals interactions and hydrogen bonds. All the potentials are well-known continual functions and their values can be determined at all lattice points for a molecule. (ii) In terms of quantum functions such as the electron density distribution, the Laplacian and Hamiltonian of the electron density distribution, the potential energy distribution, the distributions of the highest occupied and lowest unoccupied molecular orbitals, and their superposition. To reduce the time of calculations using quantum methods based on first principles, an original quantum free-orbital approach, AlteQ, is proposed. All the functions can be calculated using a quantum approach at a sufficient level of theory and their values can be determined at all lattice points for a molecule. Then, the molecules of a dataset can be superimposed in the lattice for maximal coincidence (or minimal deviation) of the potentials (i) or the quantum functions (ii). The methods and criteria of the superimposition are discussed. After that, a functional relationship between biological activity or property and the characteristics of the potentials (i) or functions (ii) is created. The methods of constructing the quantitative relationship are discussed. New approaches for rational virtual drug design based on the intermolecular potentials and quantum functions are presented. All the methods are implemented at the www.chemosophia.com web page. Therefore, a set of 3D QSAR approaches for continual molecular interior study is presented, giving many opportunities for virtual drug discovery, virtual screening and ligand-based drug design.
Parsons, Brendon A; Marney, Luke C; Siegler, W Christopher; Hoggard, Jamin C; Wright, Bob W; Synovec, Robert E
2015-04-07
Comprehensive two-dimensional (2D) gas chromatography coupled with time-of-flight mass spectrometry (GC × GC-TOFMS) is a versatile instrumental platform capable of collecting highly informative, yet highly complex, chemical data for a variety of samples. Fisher-ratio (F-ratio) analysis applied to the supervised comparison of sample classes algorithmically reduces complex GC × GC-TOFMS data sets to find class distinguishing chemical features. F-ratio analysis, using a tile-based algorithm, significantly reduces the adverse effects of chromatographic misalignment and spurious covariance of the detected signal, enhancing the discovery of true positives while simultaneously reducing the likelihood of detecting false positives. Herein, we report a study using tile-based F-ratio analysis whereby four non-native analytes were spiked into diesel fuel at several concentrations ranging from 0 to 100 ppm. Spike level comparisons were performed in two regimes: comparing the spiked samples to the nonspiked fuel matrix and to each other at relative concentration factors of two. Redundant hits were algorithmically removed by refocusing the tiled results onto the original high resolution pixel level data. To objectively limit the tile-based F-ratio results to only features which are statistically likely to be true positives, we developed a combinatorial technique using null class comparisons, called null distribution analysis, by which we determined a statistically defensible F-ratio cutoff for the analysis of the hit list. After applying null distribution analysis, spiked analytes were reliably discovered at ∼1 to ∼10 ppm (∼5 to ∼50 pg using a 200:1 split), depending upon the degree of mass spectral selectivity and 2D chromatographic resolution, with minimal occurrence of false positives. To place the relevance of this work among other methods in this field, results are compared to those for pixel and peak table-based approaches.
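A minimal sketch of F-ratio feature discovery for a two-class comparison, with each tile reduced to one summed signal, plus a crude null-distribution cutoff from a class-label scramble. Tile summation, the scramble scheme and the data are simplified assumptions, not the paper's pipeline.

```python
# Sketch: tile-wise Fisher-ratio analysis with a null-distribution cutoff.
import numpy as np

rng = np.random.default_rng(9)
n_tiles, n_per_class = 500, 6

# Class A: fuel matrix; class B: spiked fuel. One tile carries a spike.
A = rng.normal(100, 5, (n_per_class, n_tiles))
B = rng.normal(100, 5, (n_per_class, n_tiles))
B[:, 123] += 25                                   # spiked analyte tile

def f_ratio(a, b):
    """Between-class over pooled within-class variance, per tile
    (two classes of equal size, so df_between = 1)."""
    n = len(a)
    grand = np.vstack([a, b]).mean(axis=0)
    between = n * ((a.mean(axis=0) - grand)**2 + (b.mean(axis=0) - grand)**2)
    within = (((a - a.mean(axis=0))**2).sum(axis=0)
              + ((b - b.mean(axis=0))**2).sum(axis=0))
    return between / (within / (2 * n - 2))

fr = f_ratio(A, B)

# Null comparison: pseudo-classes mixing A and B give an empirical
# F-ratio cutoff for calling hits.
half = n_per_class // 2
null = f_ratio(np.vstack([A[:half], B[:half]]), np.vstack([A[half:], B[half:]]))
cutoff = np.percentile(null, 99.9)
print("hit tiles:", np.where(fr > cutoff)[0])
```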
NASA Astrophysics Data System (ADS)
Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar
2015-06-01
In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and also to provide guidelines for determining the design periods of flood control structures. The traditional FFA was extensively performed by considering the univariate scenario for both at-site and regional estimation of return periods. However, due to the inherent mutual dependence of the flood variables or characteristics [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], analysis has been further extended to the multivariate scenario, with some restrictive assumptions. To overcome the assumption of the same family of marginal density function for all flood variables, the concept of copula has been introduced. Although the advancement from univariate to multivariate analyses drew formidable attention from the FFA research community, the basic limitation was that the analyses were performed with only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in the field of multivariate FFA; however, a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and multivariate copula-based applications. Nevertheless, the potential of obtaining the best fit using nonparametric distributions might be improved because such distributions reproduce the sample's characteristics, resulting in more accurate estimations of the multivariate return period. Hence, the current study shows the importance of conjugating the multivariate nonparametric approach with multivariate parametric and copula-based approaches, thereby resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, this approach can also be applied to regional FFA because regional estimations ideally include at-site estimations. The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution with a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework for frequency analysis is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers over the continental USA, which shows that for seven rivers all the flood variables followed the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provide the best fit for one or two flood variables. Thus the summary of results shows that the nonparametric method cannot substitute the parametric and copula-based approaches, but should be considered during any at-site FFA to provide the broadest choices for best estimation of the flood return periods.
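A minimal sketch of the nonparametric piece: a bivariate Gaussian-kernel density for peak flow and volume, and an AND-type joint return period from the estimated joint exceedance probability. Synthetic data stand in for the Allegheny series; the bandwidth is scipy's default (Scott's rule).

```python
# Sketch: Gaussian-kernel bivariate flood frequency analysis.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(11)
n = 110                                      # years of annual maxima
peak = rng.gamma(4.0, 250.0, n)              # P, m^3/s (synthetic)
vol = 0.8 * peak + rng.gamma(2.0, 150.0, n)  # V, correlated with P

kde = gaussian_kde(np.vstack([peak, vol]))   # bivariate Gaussian kernel

# AND joint exceedance P(P > p0, V > v0), integrated by Monte Carlo
# over a large resample from the fitted density.
sample = kde.resample(200_000, seed=1)
p0, v0 = np.quantile(peak, 0.9), np.quantile(vol, 0.9)
p_exc = np.mean((sample[0] > p0) & (sample[1] > v0))

print(f"AND exceedance prob: {p_exc:.4f}, return period: {1 / p_exc:.1f} yr")
```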
NASA Astrophysics Data System (ADS)
Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi
2018-02-01
The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyse this great capacity of data. The specific methods for online analytical information systems resulting from data warehouse processing with OLAP are chart and query reporting. The information in the form of chart reporting consists of the load distribution chart over repeated time periods, the distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, as well as an analysis of information on electric power consumption loads, and provide an alternative for presenting information related to peak load.
Broadband Time-Frequency Analysis Using a Multicomputer
2004-09-30
Smoothed pseudo Wigner-Ville distribution: one of many interference-reduction variants. The Wigner-Ville distribution, the scalogram, and the discrete Gabor transform are among the most well-known of these methods. Implementations discussed include a 512-point FFT waterfall WVD display, a method based upon the FFT accumulation method, the continuous wavelet transform (scalogram), and the discrete Wigner-Ville distribution with a selected set of interference terms.
Command and Control for Distributed Lethality
2017-06-01
A model-based systems engineering (MBSE) approach to C2 within the distributed lethality environment requires development of methodologies to provide definition and structure for existing operational concepts while providing... Sections include Scope and Methodology and Stakeholder Analysis.
[Geographical distribution of left ventricular Tei index based on principal component analysis].
Xu, Jinhui; Ge, Miao; He, Jinwei; Xue, Ranyin; Yang, Shaofang; Jiang, Jilin
2014-11-01
To provide a scientific standard of the left ventricular Tei index for healthy people from various regions of China, and to lay a reliable foundation for the evaluation of left ventricular diastolic and systolic function. Correlation and principal component analysis were used to explore the left ventricular Tei index, based on data of 3 562 samples from 50 regions of China obtained by means of literature retrieval. The nine geographical factors were longitude (X₁), latitude (X₂), altitude (X₃), annual sunshine hours (X₄), annual average temperature (X₅), annual average relative humidity (X₆), annual precipitation (X₇), annual temperature range (X₈) and annual average wind speed (X₉). ArcGIS software was applied to calculate the spatial distribution regularities of the left ventricular Tei index. There is a significant correlation between healthy people's left ventricular Tei index and the geographical factors, with correlation coefficients of -0.107 (r₁), -0.301 (r₂), -0.029 (r₃), -0.277 (r₄), -0.256 (r₅), -0.289 (r₆), -0.320 (r₇), -0.310 (r₈) and -0.117 (r₉), respectively. A linear equation between the Tei index and the geographical factors was obtained by regression analysis based on the three extracted principal components. A geographical distribution tendency chart for healthy people's left ventricular Tei index was fitted by ArcGIS spatial interpolation analysis. The geographical distribution of the left ventricular Tei index in China follows a certain pattern: the reference value in the North is higher than that in the South, while the value in the East is higher than that in the West.
Integral finite element analysis of turntable bearing with flexible rings
NASA Astrophysics Data System (ADS)
Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng
2018-03-01
This paper suggests a method to calculate the internal load distribution and contact stress of a thrust angular contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and the plastic deformation of the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. Based on this, the nonlinear contact between the rolling elements and the inner and outer ring raceways is modeled as a nonlinear compression spring, and an integral finite element model of the bearing, including the support structure, is established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the finite element results for load distribution, inner and outer ring stress, contact stress and other quantities with traditional bearing theory, which provides guidance for improving the design of slewing bearings.
NASA Astrophysics Data System (ADS)
Nhu Y, Do
2018-03-01
Vietnam has many advantages in wind power resources, and both the capacity and the number of wind power projects in Vietnam keep increasing. Corresponding to the increase of wind power fed into the national grid, research and analysis are necessary to ensure the safety and reliability of the wind power connection. In the national distribution grid, voltage sags occur regularly and can strongly influence the operation of wind power; the most serious consequence is disconnection. The paper presents an analysis of the distribution grid's transient process when voltage sags occur. Based on the analysis, solutions are recommended to improve the reliability and effective operation of wind power resources.
The Statistical Value of Raw Fluorescence Signal in Luminex xMAP Based Multiplex Immunoassays
Breen, Edmond J.; Tan, Woei; Khan, Alamgir
2016-01-01
Tissue samples (plasma, saliva, serum or urine) from 169 patients classified as either normal or having one of seven possible diseases are analysed across three 96-well plates for the presence of 37 analytes using cytokine inflammation multiplexed immunoassay panels. Censoring of concentration data caused problems for the analysis of the low-abundance analytes. Using fluorescence analysis instead of concentration-based analysis allowed these low-abundance analytes to be analysed. Mixed-effects analysis on the resulting fluorescence and concentration responses reveals that the combination of censoring and mapping the fluorescence responses to concentration values, through a 5PL curve, changed the observed analyte concentrations. Simulation verifies this by showing a dependence of the observed analyte concentration levels on the mean fluorescence response and its distribution. Departures from normality in the fluorescence responses can lead to differences in concentration estimates and unreliable probabilities for treatment effects. When fluorescence responses are normally distributed, probabilities of treatment effects from fluorescence-based t-tests have greater statistical power than the same probabilities from concentration-based t-tests. We add evidence that the fluorescence response, unlike concentration values, does not require censoring, and we show, with respect to differential analysis on the fluorescence responses, that background correction is not required. PMID:27243383
Bujkiewicz, Sylwia; Riley, Richard D
2016-01-01
Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(−1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example are used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), the amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(−1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing data. PMID:26988929
LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu
2017-01-20
Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
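For readers who want to reproduce this kind of fit, a minimal sketch using SciPy's three-parameter log-normal (shape, location, scale) on synthetic void radii is shown below; the data, parameter values, and KS check are illustrative stand-ins, not the catalog analysis itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "void radii": a shifted log-normal sample standing in for
# the Cosmic Void Catalog distributions discussed in the abstract.
true_shape, true_loc, true_scale = 0.6, 2.0, 8.0
radii = stats.lognorm.rvs(true_shape, loc=true_loc, scale=true_scale,
                          size=2000, random_state=rng)

# Fit the three-parameter log-normal (shape, location, scale).
shape, loc, scale = stats.lognorm.fit(radii)
print(f"shape={shape:.3f}, loc={loc:.3f}, scale={scale:.3f}")

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted model.
ks_stat, p_value = stats.kstest(radii, 'lognorm', args=(shape, loc, scale))
print(f"KS statistic={ks_stat:.4f}, p={p_value:.3f}")
```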
Jeong, Y J; Oh, T I; Woo, E J; Kim, K J
2017-07-01
Recently, highly flexible and soft pressure distribution imaging sensors have been in great demand for tactile sensing, gait analysis, ubiquitous life-care based on activity recognition, and therapeutics. In this study, we integrate piezo-capacitive and piezo-electric nanowebs with conductive fabric sheets for detecting static and dynamic pressure distributions over a large sensing area. Electrical impedance tomography (EIT) and electric source imaging are applied to reconstruct pressure distribution images from current-voltage data measured on the boundary of the hybrid fabric sensor. We evaluated the piezo-capacitive nanoweb sensor, the piezo-electric nanoweb sensor, and the hybrid fabric sensor. The results show the feasibility of static and dynamic pressure distribution imaging from the boundary measurements of the fabric sensors.
Research on intelligent power distribution system for spacecraft
NASA Astrophysics Data System (ADS)
Xia, Xiaodong; Wu, Jianju
2017-10-01
The power distribution system (PDS) realizes the power distribution and management of the electrical loads of the whole spacecraft; it is directly related to the success or failure of the mission and hence is an important part of the spacecraft. In order to improve the reliability and the degree of intelligence of the PDS, and considering the function and composition of spacecraft power distribution systems, this paper systematically expounds the design principles and methods of an intelligent power distribution system based on SSPCs, and additionally provides analysis and verification of the test data.
NASA Astrophysics Data System (ADS)
Xia, Xintao; Wang, Zhongyu
2008-10-01
For some statistical methods of system stability analysis, it is difficult to resolve the problems of unknown probability distributions and small samples. Therefore, a novel method is proposed in this paper to resolve these problems. This method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound and the upper bound of the system using fuzzy-set theory. Then the empirical distribution function is investigated to ensure a confidence level above 95%, and a degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems. The proposed method of systematic stability analysis is thereby validated.
Information Interaction Study for DER and DMS Interoperability
NASA Astrophysics Data System (ADS)
Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui
The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements of DER operation and management for DMS advanced applications. DER modeling was studied from a system point of view, and the article initially proposes a CIM-extended information model. By analyzing the basic structure of the message interaction between DMS and DER, a bidirectional message-mapping method based on data exchange is proposed.
ERIC Educational Resources Information Center
Ousley, Chris
2010-01-01
This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…
Statistical dynamics of regional populations and economies
NASA Astrophysics Data System (ADS)
Huo, Jie; Wang, Xu-Ming; Hao, Rui; Wang, Peng
Quantitative analysis of human behavior and social development is becoming a hot spot in some interdisciplinary studies. A statistical analysis of the population and GDP of 150 cities in China from 1990 to 2013 is conducted. The result indicates that the cumulative probability distributions of the populations and of the GDPs obey shifted power laws. In order to understand these characteristics, a generalized Langevin equation describing the variation of population is proposed, based on the correlations between population and GDP as well as the random fluctuations of the related factors. The equation is transformed into the Fokker-Planck equation to express the evolution of the population distribution. The general solution demonstrates a transition of the distribution from the normal Gaussian distribution to a shifted power law, which suggests a critical point of time at which the transition takes place. The shifted power law distribution in the supercritical situation is qualitatively in accordance with the empirical result. The distribution of the GDPs is derived from the well-known Cobb-Douglas production function. The result presents a change, in the supercritical situation, from a shifted power law to the Gaussian distribution. This is a surprising result: the regional GDP distribution of our world will be Gaussian one day in the future. Discussions based on the changing trend of economic growth suggest it will be true. Therefore, these theoretical attempts may draw a historical picture of our society in the aspects of population and economy.
Regional frequency analysis of extreme rainfalls using partial L moments method
NASA Astrophysics Data System (ADS)
Zakaria, Zahrahtul Amani; Shabri, Ani
2013-07-01
An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the method of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for estimation of large return period events.
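As a companion to the abstract above, the sketch below computes ordinary unbiased sample L-moments (the quantities that PL moments generalize by censoring small observations); the rainfall values are synthetic, and the PL censoring step itself is not reproduced.

```python
import numpy as np

def sample_l_moments(data):
    """Unbiased sample L-moments l1, l2 and L-skewness t3 via
    probability-weighted moments (Hosking, 1990)."""
    x = np.sort(np.asarray(data, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                      # L-location (mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
    return l1, l2, l3 / l2       # mean, L-scale, L-skewness

# Example: annual maximum rainfall sample (synthetic values, mm).
rain = [86.1, 102.3, 77.5, 131.0, 95.8, 88.4, 120.2, 99.7, 110.5, 84.9]
l1, l2, t3 = sample_l_moments(rain)
print(f"l1={l1:.2f}, l2={l2:.2f}, t3={t3:.3f}")
```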
2012-09-30
recognition; algorithm design, statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research... short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource... The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA
On the theoretical velocity distribution and flow resistance in natural channels
NASA Astrophysics Data System (ADS)
Moramarco, Tommaso; Dingman, S. Lawrence
2017-12-01
The velocity distribution in natural channels is of considerable interest for streamflow measurements to obtain information on discharge and flow resistance. This study focuses on the comparison of theoretical velocity distributions based on 1) entropy theory, and 2) the two-parameter power law. The analysis identifies the correlation between the parameters of the distributions and defines their dependence on the geometric and hydraulic characteristics of the channel. Specifically, we investigate how the parameters are related to the flow resistance in terms of Manning roughness, shear velocity and water surface slope, and several formulae showing their relationships are proposed. Velocity measurements carried out in the past 20 years at Ponte Nuovo gauged section along the Tiber River, central Italy, are the basis for the analysis.
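A hedged illustration of the two profile families compared above: the entropy-based curve below uses the Chiu-type form u = (u_max/M) ln[1 + (e^M − 1)ξ], one common entropy-derived velocity law (assumed here, since the abstract does not spell out its exact form), alongside the two-parameter power law. The parameter values are arbitrary.

```python
import numpy as np

def chiu_velocity(xi, u_max, M):
    """Entropy-based velocity profile (Chiu-type form, assumed): xi in
    [0, 1] is a normalized coordinate from the bed (0) to the location
    of maximum velocity (1); M is the entropy parameter."""
    return (u_max / M) * np.log(1.0 + (np.exp(M) - 1.0) * xi)

def power_law_velocity(y, a, b):
    """Two-parameter power law: u = a * y**b, with y the height above
    the bed (m) and a, b the fitted parameters."""
    return a * y ** b

# Compare the two shapes over a 2 m deep section (illustrative values).
y = np.linspace(0.01, 2.0, 50)
u_entropy = chiu_velocity(y / y.max(), u_max=1.5, M=2.2)
u_power = power_law_velocity(y, a=1.2, b=0.25)
print(f"surface velocities: entropy {u_entropy[-1]:.3f}, power {u_power[-1]:.3f} m/s")
```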
Espinosa, Manuel O; Polop, Francisco; Rotela, Camilo H; Abril, Marcelo; Scavuzzo, Carlos M
2016-11-21
The main objective of this study was to obtain and analyse the space-time dynamics of Aedes aegypti breeding sites in Clorinda City, Formosa Province, Argentina coupled with landscape analysis using the maximum entropy approach in order to generate a dengue vector niche model. In urban areas, without vector control activities, 12 entomologic (larval) samplings were performed during three years (October 2011 to October 2014). The entomologic surveillance area represented 16,511 houses. Predictive models for Aedes distribution were developed using vector breeding abundance data, density analysis, clustering and geoprocessing techniques coupled with Earth observation satellite data. The spatial analysis showed a vector spatial distribution pattern with clusters of high density in the central region of Clorinda with a well-defined high-risk area in the western part of the city. It also showed a differential temporal behaviour among different areas, which could have implications for risk models and control strategies at the urban scale. The niche model obtained for Ae. aegypti, based on only one year of field data, showed that 85.8% of the distribution of breeding sites is explained by the percentage of water supply (48.2%), urban distribution (33.2%), and the percentage of urban coverage (4.4%). The consequences for the development of control strategies are discussed with reference to the results obtained using distribution maps based on environmental variables.
Monitoring Method of Cow Anthrax Based on Gis and Spatial Statistical Analysis
NASA Astrophysics Data System (ADS)
Li, Lin; Yang, Yong; Wang, Hongbin; Dong, Jing; Zhao, Yujun; He, Jianbin; Fan, Honggang
Geographic information system (GIS) is a computer application system which possesses the ability to manipulate spatial information and has been used in many fields related to spatial information management. Many methods and models have been established for analyzing animal disease distributions and temporal-spatial transmission, and great benefits have been gained from the application of GIS in animal disease epidemiology; GIS is now a very important tool in animal disease epidemiological research. The spatial analysis function of GIS can be widened and strengthened by spatial statistical analysis, allowing deeper exploration, analysis, manipulation and interpretation of the spatial pattern and spatial correlation of animal disease. In this paper, we analyzed the spatial distribution characteristics of cow anthrax in a target district A (called district A because the epidemic data are confidential), based on the GIS of cow anthrax established for this district in combination with spatial statistical analysis. Cow anthrax is a biogeochemical disease whose geographical distribution is related closely to the environmental factors of its habitats and shows certain spatial characteristics; correct analysis of the spatial distribution of cow anthrax therefore plays a very important role in its monitoring, prevention and control. However, the application of classic statistical methods is very difficult in some areas because of the pastoral nomadic context: the high mobility of livestock and the lack of suitable sampling currently make it nearly impossible to apply rigorous random sampling methods. It is thus necessary to develop an alternative sampling method which could overcome the lack of sampling and meet the requirements for randomness. The GIS application software ArcGIS 9.1 was used to overcome the lack of data on sampling sites. Using ArcGIS 9.1 and GeoDa to analyze the spatial distribution of cow anthrax in district A, we reached two conclusions about cow anthrax density: (1) it follows a spatial clustering model, and (2) it exhibits strong spatial autocorrelation. We established a prediction model to estimate the anthrax distribution based on the spatial characteristics of cow anthrax density. Compared with the true distribution, the prediction model agrees well and is feasible in application. The GIS-based method can contribute significantly to cow anthrax monitoring and investigation, and the spatial-statistics-based prediction model provides a foundation for other studies on spatially related animal diseases.
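Since the abstract's second conclusion rests on spatial autocorrelation, a minimal sketch of the standard global Moran's I statistic is given below; the density values and contiguity weights are toy stand-ins, and GeoDa's permutation-based significance testing is not reproduced.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a vector of regional values and a spatial
    weights matrix W (w_ij > 0 when regions i and j are neighbours)."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    w = np.asarray(weights, dtype=float)
    n = z.size
    s0 = w.sum()
    return (n / s0) * (z @ w @ z) / (z @ z)

# Toy example: four districts on a line, rook-style contiguity weights.
density = [12.0, 10.5, 3.1, 2.8]          # e.g. anthrax case density
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(f"Moran's I = {morans_i(density, W):.3f}")  # > 0: positive autocorrelation
```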
Effect of extreme data loss on heart rate signals quantified by entropy analysis
NASA Astrophysics Data System (ADS)
Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao
2015-02-01
The phenomenon of data loss always occurs in the analysis of large databases, and maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate a synthetic signal in which data segments are randomly removed according to the Gaussian distribution and the exponential distribution of the original signal. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to the logistic map test. (2) The calculation results have preferable stability for base-scale entropy analysis, which is not sensitive to data loss. (3) The loss percentage of HRV signals should be controlled below p = 30%, which can provide useful information in clinical applications.
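Of the two entropy measures compared above, approximate entropy has a compact definition that is easy to sketch; the implementation below follows Pincus's standard formulation and uses the logistic map, as in the abstract's verification step, though the exact parameter choices here are assumptions.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) with tolerance r = r_factor * std(x).
    Self-matches are counted, following the original definition."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        # Embed the series: each row is a length-m template vector.
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev (max-norm) distance between every pair of templates.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)  # fraction of templates within r
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Logistic map test signal, as used for verification in the abstract.
x = np.empty(1000)
x[0] = 0.4
for i in range(999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
print(f"ApEn = {approximate_entropy(x):.3f}")
```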
Optimum structural design based on reliability and proof-load testing
NASA Technical Reports Server (NTRS)
Shinozuka, M.; Yang, J. N.
1969-01-01
A proof-load test eliminates structures with strength less than the proof load and improves the reliability value used in analysis. It truncates the distribution function of strength at the proof load, thereby alleviating the need to verify a fitted distribution function at the lower tail portion, where data are usually nonexistent.
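The truncation argument can be made concrete in a few lines: under an assumed (hypothetical) normal strength model, the post-proof-test failure probability follows from renormalizing the distribution above the proof load.

```python
import numpy as np
from scipy import stats

def truncated_cdf(x, dist, proof_load):
    """CDF of strength after a proof-load test: specimens weaker than
    the proof load are eliminated, truncating the distribution there."""
    x = np.asarray(x, dtype=float)
    fq = dist.cdf(proof_load)
    out = (dist.cdf(x) - fq) / (1.0 - fq)
    return np.clip(out, 0.0, None)  # zero below the proof load

# Hypothetical strength model: normal, mean 100, std 10 (arbitrary units).
strength = stats.norm(loc=100.0, scale=10.0)
proof = 90.0
service_stress = 92.0

p_fail_before = strength.cdf(service_stress)
p_fail_after = truncated_cdf(service_stress, strength, proof)
print(f"P(failure) without proof test: {p_fail_before:.4f}")
print(f"P(failure) after proof test:   {p_fail_after:.4f}")
```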
As part of an EPA/USGS project to predict the relative vulnerability of near-coastal species to climate change, we have synthesized in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT), the biogeographic distributions and abundances of bivalves, found in dept...
NASA Astrophysics Data System (ADS)
Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.
2015-12-01
The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (blocks to sand grain size) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, due to their potential advantages over classical ones: speed and the final detailed content of information (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually by experts applying the same Rosiwal method. The new algorithm has the same accuracy as the classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for the analysis of clast deposits after recording field outcrop images can thus be increased significantly.
2016-09-01
is to fit empirical Beta distributions to observed data, and then to use a randomization approach to make inferences on the difference between... a Ridit analysis on the often sparse data sets in many Flying Qualities applications. The method of this paper is to fit empirical Beta... One such measure is the discrete-probability-distribution version of the (squared) Hellinger distance (Yang & Le Cam, 2000): $H^2(P,Q) = 1 - \sum_i \sqrt{p_i q_i}$.
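The Hellinger distance mentioned in the snippet reduces to a one-line computation for discrete distributions; the sketch below uses made-up rating distributions purely for illustration.

```python
import numpy as np

def hellinger_squared(p, q):
    """Squared Hellinger distance between two discrete distributions:
    H^2(P, Q) = 1 - sum_i sqrt(p_i * q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 1.0 - np.sum(np.sqrt(p * q))

# Two pilot-rating distributions over a 5-point scale (made-up numbers).
p = [0.10, 0.20, 0.40, 0.20, 0.10]
q = [0.05, 0.15, 0.30, 0.30, 0.20]
print(f"H^2 = {hellinger_squared(p, q):.4f}")
```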
Taki, M; Signorini, A; Oton, C J; Nannipieri, T; Di Pasquale, F
2013-10-15
We experimentally demonstrate the use of cyclic pulse coding for distributed strain and temperature measurements in hybrid Raman/Brillouin optical time-domain analysis (BOTDA) optical fiber sensors. The highly integrated proposed solution effectively addresses the strain/temperature cross-sensitivity issue affecting standard BOTDA sensors, allowing for simultaneous meter-scale strain and temperature measurements over 10 km of standard single mode fiber using a single narrowband laser source only.
NASA Astrophysics Data System (ADS)
Ying, Shen; Li, Lin; Gao, Yurong
2009-10-01
Spatial visibility analysis is an important direction in pedestrian behavior research, because visual perception of space is the most direct way for pedestrians to acquire environmental information and navigate their movements. Based on agent modeling and a top-down method, the paper develops a framework for the analysis of visibility-dependent pedestrian flow. We use viewsheds in the visibility analysis and impose the derived parameters on the agent simulation to direct the agents' motion in urban space. We analyze pedestrian behaviors at the micro-scale and macro-scale of urban open space. At the micro-scale, individual agents use visual affordances to determine their direction of motion along urban streets or within districts. At the macro-scale, we compare the distribution of pedestrian flow with the configuration of the urban environment, and mine the relationship between pedestrian flow and the distribution of urban facilities and urban functions. The paper first computes the visibility conditions at vantage points in urban open space, such as the street network, and quantifies the visibility parameters. The multiple agents use these visibility parameters to decide their directions of motion, and pedestrian flow finally reaches a stable state in the urban environment through the simulation of the multi-agent system. We compare the morphology of the visibility parameters and the pedestrian distribution with the urban functions and facilities layout to confirm the consistency between them, which can be used for decision support in urban design.
NASA Astrophysics Data System (ADS)
Lian, Enyang; Ren, Yingyu; Han, Yunfeng; Liu, Weixin; Jin, Ningde; Zhao, Junying
2016-11-01
Multi-scale analysis is an important method for characterizing nonlinear systems. In this study, we carry out experiments and measure the fluctuation signals from a rotating electric field conductance sensor with eight electrodes. We first use a recurrence plot to recognise flow patterns in vertical upward gas-liquid two-phase pipe flow from the measured signals. Then we apply a multi-scale morphological analysis based on the first-order difference scatter plot to investigate the signals captured from the vertical upward gas-liquid two-phase flow loop test. We find that the invariant scaling exponent extracted from the multi-scale first-order difference scatter plot, with the bisector of the second-fourth quadrant as the reference line, is sensitive to the inhomogeneous distribution characteristics of the flow structure, and the variation trend of the exponent is helpful for understanding the process of breakup and coalescence of the gas phase. In addition, we explore the dynamic mechanism influencing the inhomogeneous distribution of the gas phase in terms of adaptive optimal kernel time-frequency representation. The research indicates that the system energy is a factor influencing the distribution of the gas phase, and that multi-scale morphological analysis based on the first-order difference scatter plot is an effective method for indicating the inhomogeneous distribution of the gas phase in gas-liquid two-phase flow.
Using the Kaldor-Hicks Tableau Format for Cost-Benefit Analysis and Policy Evaluation
ERIC Educational Resources Information Center
Krutilla, Kerry
2005-01-01
This note describes the Kaldor-Hicks (KH) tableau format as a framework for distributional accounting in cost-benefit analysis and policy evaluation. The KH tableau format can serve as a heuristic aid for teaching microeconomics-based policy analysis, and offer insight to policy analysts and decisionmakers beyond conventional efficiency analysis.
NASA Astrophysics Data System (ADS)
Aktas, Metin; Maral, Hakan; Akgun, Toygar
2018-02-01
Extinction ratio is an inherent limiting factor that has a direct effect on the detection performance of phase-OTDR-based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These scenarios are constructed to represent typically observed cases, such as multiple vibration sources cluttered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths, and varying ADC bit resolutions. Results show that an insufficient extinction ratio can result in a high optical noise floor and effectively hide the effects of elaborate system improvement efforts.
Theiss-Nyland, Katherine; Lynch, Michael; Lines, Jo
2016-05-04
In addition to mass distribution campaigns, the World Health Organization (WHO) recommends the continuous distribution of long-lasting insecticidal nets (LLINs) to all pregnant women attending antenatal care (ANC) and all infants attending the Expanded Programme on Immunization (EPI) services in countries implementing mosquito nets for malaria control. Countries report LLIN distribution data to the WHO annually. For this analysis, these data were used to assess policy and practice in implementing these recommendations and to compare the numbers of LLINs available through ANC and EPI services with the numbers of women and children attending these services. For each reporting country in sub-Saharan Africa, the presence of a reported policy for LLIN distribution through ANC and EPI was reviewed. Prior to inclusion in the analysis the completeness of data was assessed in terms of the numbers of LLINs distributed through all channels (campaigns, EPI, ANC, other). For each country with adequate data, the numbers of LLINs reportedly distributed by national programmes to ANC was compared to the number of women reportedly attending ANC at least once; the ratio between these two numbers was used as an indicator of LLIN availability at ANC services. The same calculations were repeated for LLINs distributed through EPI to produce the corresponding LLIN availability through this distribution channel. Among 48 malaria-endemic countries in Africa, 33 malaria programmes reported adopting policies of ANC-based continuous distribution of LLINs, and 25 reported adopting policies of EPI-based distribution. Over a 3-year period through 2012, distribution through ANC accounted for 9 % of LLINs distributed, and LLINs distributed through EPI accounted for 4 %. The LLIN availability ratios achieved were 55 % through ANC and 34 % through EPI. For 38 country programmes reporting on LLIN distribution, data to calculate LLIN availability through ANC and EPI was available for 17 and 16, respectively. These continuous LLIN distribution channels appear to be under-utilized, especially EPI-based distribution. However, quality data from more countries are needed for consistent and reliable programme performance monitoring. A greater focus on routine data collection, monitoring and reporting on LLINs distributed through both ANC and EPI can provide insight into both strengths and weaknesses of continuous distribution, and improve the effectiveness of these delivery channels.
Kato, Hirotomo; Gomez, Eduardo A; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa
2016-07-01
A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas.
Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks
NASA Astrophysics Data System (ADS)
Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.
2017-12-01
The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt this shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we used an image analysis tool, developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials, based on two-dimensional continuous wavelet analysis. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability to unravel the physical processes at the base of catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.
Performance Analysis of an Actor-Based Distributed Simulation
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1998-01-01
Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
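A minimal sketch of the actor pattern the paper describes, using Python threads and queues (an assumption; the paper's own implementation is not specified here): each actor is an active object with its own thread of control that communicates only via messages placed in its mailbox.

```python
import threading
import queue

class Actor(threading.Thread):
    """A minimal actor: an active object with its own thread of control
    that communicates only via messages, never via direct method calls."""

    def __init__(self, name):
        super().__init__(daemon=True)
        self.name = name
        self.mailbox = queue.Queue()

    def send(self, message):
        self.mailbox.put(message)

    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            # Simulation step: advance local state for the received event.
            print(f"{self.name} processed event: {msg}")

# Two concurrently executing simulation components exchanging messages.
a, b = Actor("pump"), Actor("valve")
a.start(); b.start()
a.send("pressure_update"); b.send("flow_update")
a.send("stop"); b.send("stop")
a.join(); b.join()
```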
Bao, Yi; Chen, Yizheng; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda
2017-01-01
This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor, and the application of the measured temperatures and building-code-recommended material parameters in an enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that agree locally with thermocouple measurements to within a 4.7% average difference at the 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at the 95% confidence level, and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C.
NASA Astrophysics Data System (ADS)
Wattanasakulpong, Nuttawit; Chaikittiratana, Arisara; Pornpeerakeat, Sacharuck
2018-06-01
In this paper, vibration analysis of functionally graded porous beams is carried out using the third-order shear deformation theory. The beams have uniform and non-uniform porosity distributions across their thickness and both ends are supported by rotational and translational springs. The material properties of the beams such as elastic moduli and mass density can be related to the porosity and mass coefficient utilizing the typical mechanical features of open-cell metal foams. The Chebyshev collocation method is applied to solve the governing equations derived from Hamilton's principle, which is used in order to obtain the accurate natural frequencies for the vibration problem of beams with various general and elastic boundary conditions. Based on the numerical experiments, it is revealed that the natural frequencies of the beams with asymmetric and non-uniform porosity distributions are higher than those of other beams with uniform and symmetric porosity distributions.
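As an illustration of the numerical machinery named above, the sketch below builds Chebyshev-Gauss-Lobatto collocation points and the associated first-derivative matrix (the standard construction from Trefethen's Spectral Methods in MATLAB, adapted to Python); the beam equations themselves are not reproduced.

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto points on [-1, 1] and the corresponding
    first-derivative (differentiation) matrix."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)          # collocation points
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                       # diagonal: negative row sums
    return D, x

# Sanity check: differentiate sin(x) at the collocation points.
D, x = cheb(16)
err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))
print(f"max derivative error: {err:.2e}")   # spectrally small
```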
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions is often caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable alone is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less desirable and less flexible analytical techniques, such as linear interpolation.
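Both remedies are available off the shelf; a minimal sketch of the Box-Cox step with SciPy, on synthetic right-skewed responses, is shown below (the data and test choices are illustrative, not the paper's protocol).

```python
import numpy as np
from scipy import stats

# Synthetic toxicity-style responses: right-skewed, strictly positive
# values, as the Box-Cox transformation requires.
rng = np.random.default_rng(0)
response = rng.lognormal(mean=2.0, sigma=0.6, size=60)

# Box-Cox transformation; lambda is chosen by maximum likelihood.
transformed, lam = stats.boxcox(response)
print(f"estimated lambda = {lam:.3f}")

# Normality before and after, via the Shapiro-Wilk test.
print("raw p =", stats.shapiro(response).pvalue)
print("transformed p =", stats.shapiro(transformed).pvalue)
```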
Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain
NASA Technical Reports Server (NTRS)
Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem
2016-01-01
The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
Online location of a break in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2003-08-01
Breaks often occur to urban water distribution systems under severely cold weather, or due to corrosion of pipes, deformation of ground, etc., and the breaks cannot easily be located, especially immediately after the events. This paper develops a methodology to locate a break in a water distribution system by monitoring water pressure online at some nodes in the water distribution system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can well be used. A neural network-based inverse analysis method is constructed for locating the break based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides a quick, effective, and practical way in which a break in a water distribution system can be located.
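A hedged sketch of the inverse-analysis idea follows: train a small feedforward network to map node pressure patterns to the index of the broken pipe. The data below are random stand-ins for the analytically simulated patterns the paper describes, and scikit-learn's MLPClassifier is an assumed substitute for the paper's unspecified network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

# Hypothetical training data: each row holds pressure readings at 8
# monitored nodes; the label is the index of the pipe assumed broken.
n_pipes, n_nodes, n_samples = 12, 8, 600
labels = rng.integers(0, n_pipes, size=n_samples)
signature = rng.normal(0.0, 1.0, size=(n_pipes, n_nodes))  # per-pipe pressure pattern
pressures = signature[labels] + rng.normal(0.0, 0.1, size=(n_samples, n_nodes))

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(pressures[:500], labels[:500])
print("held-out accuracy:", clf.score(pressures[500:], labels[500:]))
```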
Bao, Yi; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda
2017-10-01
In this study, distributed fiber optic sensors based on pulse pre-pump Brillouin optical time domain analysis (PPP-BOTDA) are characterized and deployed to measure spatially distributed temperatures in reinforced concrete specimens exposed to fire. Four beams were tested to failure in a natural-gas-fueled compartment fire, each instrumented with one fused silica, single-mode optical fiber as a distributed sensor and four thermocouples. Prior to concrete cracking, the distributed temperature was validated at the locations of the thermocouples with a relative difference of less than 9%. Cracks in the concrete can be identified as sharp peaks in the temperature distribution, since the cracks are locally filled with hot air. Concrete cracking did not affect the sensitivity of the distributed sensor, but concrete spalling broke the optical fiber loop required for PPP-BOTDA measurements.
Liu, Zhao; Sun, Jiuai; Smith, Lyndon; Smith, Melvyn; Warr, Robert
2012-05-01
Computerised analysis of skin lesion images has been reported to be helpful in achieving objective and reproducible diagnosis of melanoma. In particular, asymmetry in shape, colour and structure reflects the irregular growth of melanin under the skin and is of great importance for diagnosing the malignancy of skin lesions. This paper proposes a novel asymmetry analysis based on a newly developed pigmentation elevation model and global point signatures (GPSs). Specifically, the pigmentation elevation model was first constructed by computer-based analysis of dermoscopy images for the identification of melanin and haemoglobin. Asymmetry of skin lesions was then assessed by quantifying distributions of the pigmentation elevation model using the GPSs, derived from a Laplace-Beltrami operator. This new approach allows quantifying the shape and pigmentation distributions of cutaneous lesions simultaneously. Algorithm performance was tested on 351 dermoscopy images, including 88 malignant melanomas and 263 benign naevi, employing a support vector machine (SVM) with a tenfold cross-validation strategy. Competitive diagnostic results were achieved using the proposed asymmetry descriptor alone: 86.36% sensitivity, 82.13% specificity and 83.43% overall accuracy. In addition, the proposed GPS-based asymmetry analysis can work on dermoscopy images from different databases and proved to be inherently robust to external imaging variations. These advantages suggest that the proposed method has good potential for follow-up treatment.
NASA Astrophysics Data System (ADS)
Akbardin, J.; Parikesit, D.; Riyanto, B.; TMulyono, A.
2018-05-01
Zones producing land fishery commodities have limited distribution capability because of the availability and condition of infrastructure. High demand for fishery commodities has caused distribution to grow over inefficient distribution distances. Developing gravity theory with a constraint on movement generation from the production zone can increase the interaction between zones and make distribution effective and efficient, with shorter movement distribution distances. A regression analysis with multiple variables describing transportation infrastructure condition, based on service level and quantitative capacity, is used to estimate the 'mass' of movement generation that is formed. The resulting movement distribution model has the equation Tid = 27.04 - 0.49 tid, based on a power-model barrier function with calibration value β = 0.0496. Developing the movement generation 'mass' boundary at the production zone will shorten the distribution distance effectively, and shorter distribution distances will increase the accessibility between zones to interact according to the magnitude of the movement generation 'mass'.
Detecting microsatellites within genomes: significant variation among algorithms.
Leclercq, Sébastien; Rivals, Eric; Jarne, Philippe
2007-04-18
Microsatellites are short, tandemly-repeated DNA sequences which are widely distributed among genomes. Their structure, role and evolution can be analyzed based on exhaustive extraction from sequenced genomes. Several dedicated algorithms have been developed for this purpose. Here, we compared the detection efficiency of five of them (TRF, Mreps, Sputnik, STAR, and RepeatMasker). Our analysis was first conducted on the human X chromosome, and microsatellite distributions were characterized by microsatellite number, length, and divergence from a pure motif. The algorithms work with user-defined parameters, and we demonstrate that the parameter values chosen can strongly influence microsatellite distributions. The five algorithms were then compared by fixing parameters settings, and the analysis was extended to three other genomes (Saccharomyces cerevisiae, Neurospora crassa and Drosophila melanogaster) spanning a wide range of size and structure. Significant differences for all characteristics of microsatellites were observed among algorithms, but not among genomes, for both perfect and imperfect microsatellites. Striking differences were detected for short microsatellites (below 20 bp), regardless of motif. Since the algorithm used strongly influences empirical distributions, studies analyzing microsatellite evolution based on a comparison between empirical and theoretical size distributions should therefore be considered with caution. We also discuss why a typological definition of microsatellites limits our capacity to capture their genomic distributions.
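None of the five algorithms is reproduced here, but a naive detector for perfect microsatellites shows the core idea (and why parameter choices such as the minimum repeat number matter); unlike TRF or Sputnik, this regex sketch cannot handle imperfect repeats.

```python
import re

def find_microsatellites(seq, min_unit=1, max_unit=6, min_repeats=3):
    """Naive detector for perfect microsatellites: a motif of 1-6 bp
    tandemly repeated at least `min_repeats` times in total."""
    pattern = re.compile(
        r"([ACGT]{%d,%d}?)\1{%d,}" % (min_unit, max_unit, min_repeats - 1)
    )
    for m in pattern.finditer(seq):
        yield m.start(), m.group(0), m.group(1)

dna = "TTGACACACACACAGGGTCTCTCTCTAAACGT"
for start, repeat, unit in find_microsatellites(dna):
    print(f"pos {start}: {repeat} (unit {unit})")
```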
Coupling Analysis of Heat Island Effects, Vegetation Coverage and Urban Flood in Wuhan
NASA Astrophysics Data System (ADS)
Liu, Y.; Liu, Q.; Fan, W.; Wang, G.
2018-04-01
In this paper, satellite imagery, remote sensing techniques and geographic information system techniques are the main technical bases, and comprehensive analysis of spectral and other factors together with visual interpretation are the main methods. We use GF-1 and Landsat 8 remote sensing satellite images of Wuhan as the data source, from which we extract the vegetation distribution, the urban heat island relative intensity distribution map and the urban flood submergence range. Based on the extracted information, through spatial analysis and regression analysis, we find correlations among the heat island effect, vegetation coverage and urban flood. The results show that there is a high degree of overlap between the urban heat island and urban flood areas. The urban heat island areas contain buildings with little vegetation cover, which may be one of the reasons for local heavy rainstorms. Furthermore, the urban heat island has a negative correlation with vegetation coverage, and the heat island effect can be alleviated by vegetation to a certain extent. It is thus understandable that the new industrial zones and commercial areas under construction distributed across the city, whose land surfaces are bare or have low vegetation coverage, can easily form new heat islands.
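The abstract does not state how the vegetation distribution was extracted, but NDVI thresholding is the usual first step with GF-1/Landsat 8 imagery; the sketch below shows that assumed step on toy reflectance patches.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance bands: NDVI = (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Toy 2x2 reflectance patches: top row vegetated, bottom row bare/built-up.
nir = np.array([[0.50, 0.45], [0.30, 0.28]])
red = np.array([[0.08, 0.10], [0.25, 0.27]])
veg_mask = ndvi(nir, red) > 0.3   # a common, threshold-based vegetation mask
print(ndvi(nir, red).round(2), veg_mask, sep="\n")
```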
NASA Astrophysics Data System (ADS)
Jin, Y.; Lee, D. K.; Jeong, S. G.
2015-12-01
The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as their other ecological values, play an important role in regional and national conservation planning, and the preservation of habitats is linked to the protection of biodiversity. For mapping habitats, species distribution models (SDMs) are used to predict the suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not contain contextual or topological information. In order to provide more accurate habitat predictions, a continuous-field view that better approximates the real world is required. Here we analyze and compare, at different scales, habitats of the yellow marten (Martes flavigula), which is a top predator and an umbrella species in South Korea. The object scale, which groups pixels with similar spatial and spectral characteristics, and the pixel scale were both used for the SDM. Our analysis using the SDM at different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as for species habitat monitoring.
Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.
Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen
2015-01-01
Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
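A minimal sketch of the first stage, building the access correlation matrix from historical session logs, is given below; the placement heuristic that consumes this matrix is the paper's contribution and is not reproduced.

```python
import numpy as np

def access_correlation(log_sessions, n_blocks):
    """Build an access correlation matrix from historical access logs:
    C[i, j] counts how often blocks i and j are requested in the same
    session, revealing co-access patterns among geospatial image files."""
    C = np.zeros((n_blocks, n_blocks), dtype=int)
    for session in log_sessions:
        for i in session:
            for j in session:
                if i != j:
                    C[i, j] += 1
    return C

# Toy log: each session lists the image blocks one query touched.
sessions = [[0, 1, 2], [0, 1], [2, 3], [0, 1, 3]]
C = access_correlation(sessions, n_blocks=4)
print(C)
# A placement heuristic would then spread strongly correlated blocks
# (large C[i, j]) across different storage nodes to maximize parallel I/O.
```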
NASA Astrophysics Data System (ADS)
Yin, Qiang; Chen, Tian-jin; Li, Wei-yang; Xiong, Ze-cheng; Ma, Rui
2017-09-01
In order to obtain the deformation map and equivalent stress distribution of a rectifier cabinet for nuclear power generating stations, the mass distribution of the structural and electrical components is described, the tensile strengths of the lifting rings are checked, and a finite element model of the cabinet is set up in ANSYS. The transport conditions of the hoisting state and the fork-loading state are analyzed, and the deformation map and equivalent stress distribution are obtained. Problems requiring attention are pointed out. The analysis method and the obtained results serve as a reference for the transport of rectifier cabinets for nuclear power generating stations.
Wang, Leimin; Zeng, Zhigang; Ge, Ming-Feng; Hu, Junhao
2018-05-02
This paper deals with the stabilization problem of memristive recurrent neural networks with inertial items, discrete delays, and bounded and unbounded distributed delays. First, for inertial memristive recurrent neural networks (IMRNNs) with second-order derivatives of states, an appropriate variable substitution method is invoked to transform the IMRNNs into a first-order differential form. Then, based on nonsmooth analysis theory, several algebraic criteria are established for the global stabilizability of IMRNNs under the proposed feedback control, where the cases with both bounded and unbounded distributed delays are successfully addressed. Finally, the theoretical results are illustrated via numerical simulations.
A Poisson process approximation for generalized K-S confidence regions
NASA Technical Reports Server (NTRS)
Arsham, H.; Miller, D. R.
1982-01-01
One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The bandwidth of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault tolerant systems.
NASA Astrophysics Data System (ADS)
Wada, Daichi; Sugimoto, Yohei
2017-04-01
Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads could be used onboard to control the aircraft, and the accumulated data could be used for condition-based maintenance and as feedback for fatigue and critical load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated a promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. Using these techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads on a flat panel model and conduct the inverse identification of the load distributions. We take two approaches to building the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
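The structural-model approach reduces to a linear inverse problem: if a finite element model relates a distributed load vector q to measured strains via an influence matrix A, the loads follow from regularized least squares. A minimal Python sketch with a made-up influence matrix is given below; the pseudo-inverse with Tikhonov regularization stands in for the paper's FE-based system, and all values are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(40, 10))          # strain-per-unit-load influence matrix (from an FE model)
    q_true = np.linspace(1.0, 0.2, 10)     # distributed load, e.g. a spanwise pressure profile
    eps = A @ q_true + 0.01 * rng.normal(size=40)   # measured strains with noise

    lam = 1e-2                             # Tikhonov regularization weight
    q_est = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ eps)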
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shankman, C.; Kavelaars, JJ.; Bannister, M. T.
We measure the absolute magnitude, H, distribution, dN(H) ∝ 10^(αH), of the scattering Trans-Neptunian Objects (TNOs) as a proxy for their size-frequency distribution. We show that the H-distribution of the scattering TNOs is not consistent with a single-slope distribution, but must transition around H_g ∼ 9 to either a knee with a shallow slope or a divot, which is a differential drop followed by a second exponential distribution. Our analysis is based on a sample of 22 scattering TNOs drawn from three different TNO surveys, the Canada-France Ecliptic Plane Survey, Alexandersen et al., and the Outer Solar System Origins Survey, all of which provide well-characterized detection thresholds, combined with a cosmogonic model for the formation of the scattering TNO population. Our measured absolute magnitude distribution result is independent of the choice of cosmogonic model. Based on our analysis, we estimate that the number of scattering TNOs is (2.4-8.3) × 10^5 for H_r < 12. A divot H-distribution is seen in a variety of formation scenarios and may explain several puzzles in Kuiper Belt science. We find that a divot H-distribution simultaneously explains the observed scattering TNO, Neptune Trojan, Plutino, and Centaur H-distributions while predicting a large enough scattering TNO population to act as the sole supply of the Jupiter-Family Comets.
A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.
2014-12-01
Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distributing scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics, and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while managing the uncertainties of scientific conclusions derived from such capabilities. This talk will provide an overview of JPL's efforts in developing a comprehensive architectural approach to data science.
NASA Technical Reports Server (NTRS)
Patton, Jeff A.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C)/Electrical Power Generation (EPG) hardware. The EPD and C/EPG hardware is required for performing critical functions of cryogenic reactant storage, electrical power generation and product water distribution in the Orbiter. Specifically, the EPD and C/EPG hardware consists of the following components: Power Section Assembly (PSA); Reactant Control Subsystem (RCS); Thermal Control Subsystem (TCS); Water Removal Subsystem (WRS); and Power Reactant Storage and Distribution System (PRSDS). The IOA analysis process utilized available EPD and C/EPG hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.
Structural diversity effects of multilayer networks on the threshold of interacting epidemics
NASA Astrophysics Data System (ADS)
Wang, Weihong; Chen, MingMing; Min, Yong; Jin, Xiaogang
2016-02-01
Foodborne diseases always spread through multiple vectors (e.g. fresh vegetables and fruits), revealing that a multilayer network can spread a fatal pathogen through complex interactions. In this paper, first, we use a "top-down" analysis framework that depends on only two distributions to describe a random multilayer network with any number of layers. These two distributions are the overlaid degree distribution and the edge-type distribution of the multilayer network. Second, based on the two distributions, we adopt three indicators of multilayer network diversity to measure the correlation between network layers: network richness, likeness, and evenness. The network richness is the number of layers forming the multilayer network. The network likeness is the degree to which different layers share the same edges. The network evenness is the variance of the number of edges in every layer. Third, based on a simple epidemic model, we analyze the influence of network diversity on the threshold of interacting epidemics with the coexistence of collaboration and competition. Our work extends the "top-down" analysis framework to deal with more complex epidemic situations and more diversity indicators and quantifies the trade-off between thresholds of inter-layer collaboration and intra-layer transmission.
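The three diversity indicators can be computed directly from an edge list. The following Python sketch is a plain reading of the definitions above and is illustrative only; edges are stored as unique (u, v, layer) tuples, and the paper's exact normalizations may differ.

    from collections import Counter

    def diversity(edges, n_layers):
        """edges: iterable of unique (u, v, layer) tuples with u < v."""
        per_layer = Counter(layer for _, _, layer in edges)
        pairs = Counter((u, v) for u, v, _ in edges)
        richness = n_layers                                           # number of layers
        likeness = sum(1 for c in pairs.values() if c > 1) / max(len(pairs), 1)
        counts = [per_layer.get(k, 0) for k in range(n_layers)]
        mean = sum(counts) / n_layers
        evenness = sum((c - mean) ** 2 for c in counts) / n_layers    # variance of edge counts
        return richness, likeness, evenness

    edges = [(1, 2, 0), (1, 2, 1), (2, 3, 0), (3, 4, 1)]
    print(diversity(edges, n_layers=2))   # (2, 0.333..., 0.0)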
The application of diffusion theory to the analysis of hydrogen desorption data at 25 deg C
NASA Technical Reports Server (NTRS)
Danford, M. D.
1985-01-01
The application of diffusion theory to the analysis of hydrogen desorption data (coulombs of H2 desorbed versus time) has been studied. From these analyses, important information concerning hydrogen solubilities and the nature of the hydrogen distributions in the metal has been obtained. Two nickel-base alloys, Rene' 41 and Waspaloy, and one ferrous alloy, 4340 steel, are studied in this work. For the nickel-base alloys, it is found that the hydrogen distributions after electrolytic charging conform closely to those predicted by diffusion theory. For Waspaloy samples charged at 5,000 psi, it is found that the hydrogen distributions are essentially the same as those obtained by electrolytic charging. The hydrogen distributions in electrolytically charged 4340 steel, on the other hand, are essentially uniform, which would not be predicted by diffusion theory. A possible explanation is proposed. Finally, it is found that the hydrogen desorption is completely explained by the nature of the hydrogen distribution in the metal, and that the fast hydrogen is not due to surface and sub-surface hydride formation, as was originally proposed.
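For a uniform initial hydrogen distribution in a plane sheet, diffusion theory gives a closed-form desorption curve (the classical series solution for a sheet of thickness L with both faces free, as in Crank's treatment of diffusion in a plane sheet). The Python sketch below evaluates it and is illustrative only; it is not the fitting code used in the report, and the example values are assumptions.

    import numpy as np

    def desorbed_fraction(t, D, L, n_terms=100):
        """M_t / M_inf for a plane sheet of thickness L, uniform initial content.
        t: time (scalar), D: diffusivity, in consistent units."""
        n = np.arange(n_terms)
        k = (2 * n + 1) * np.pi
        return 1.0 - np.sum(8.0 / k**2 * np.exp(-(k**2) * D * t / L**2))

    # example: fraction desorbed after 1 hour for D = 1e-10 cm^2/s, L = 0.1 cm
    frac = desorbed_fraction(3600.0, 1e-10, 0.1)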
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
Quantile Functions, Convergence in Quantile, and Extreme Value Distribution Theory.
1980-11-01
Gnanadesikan (1968). Quantile functions are advocated by Parzen (1979) as providing an approach to probability-based data analysis. Quantile functions are… Gnanadesikan, R. (1968). Probability Plotting Methods for the Analysis of Data, Biometrika, 55, 1-17.
BAT - The Bayesian analysis toolkit
NASA Astrophysics Data System (ADS)
Caldwell, Allen; Kollár, Daniel; Kröninger, Kevin
2009-11-01
We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner.
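The core loop of such a toolkit, exploring the posterior with Markov Chain Monte Carlo, can be illustrated generically. The sketch below is a plain random-walk Metropolis sampler in Python for illustration; BAT itself is a C++ package, and its actual interface is not reproduced here.

    import numpy as np

    def metropolis(log_post, x0, steps=10000, scale=0.5, seed=1):
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        lp = log_post(x)
        chain = np.empty((steps, x.size))
        for t in range(steps):
            prop = x + scale * rng.normal(size=x.size)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
                x, lp = prop, lp_prop
            chain[t] = x
        return chain   # samples approximate the full posterior distribution

    # example: standard normal posterior in two parameters
    samples = metropolis(lambda th: -0.5 * np.sum(th**2), x0=[0.0, 0.0])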
Comparing the index-flood and multiple-regression methods using L-moments
NASA Astrophysics Data System (ADS)
Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.
In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward's clustering and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was performed using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency methods. The results of the factor analysis showed that the length of the main waterway, the compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward's clustering method. The homogeneity test based on L-moments showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the generalised extreme value (GEV) distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions of the study area, and it was concluded that the GEV distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve-fitting (plotting position) method. In general, the index-flood method gives more reliable estimates for various flood magnitudes at different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin in central Iran. To estimate floods of various return periods for gauged catchments in the study area, the mean annual peak flood of the catchments may be multiplied by the corresponding values of the growth factors computed using the GEV distribution.
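Sample L-moments, the building blocks of both the homogeneity test and the Z-statistic distribution selection, follow from probability-weighted moments of the ordered sample. A minimal Python sketch using the standard unbiased estimators (Hosking's formulas) is given below for illustration; dedicated packages exist, but none is assumed here.

    import numpy as np

    def sample_lmoments(x):
        """First four sample L-moments via probability-weighted moments."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        i = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((i - 1) / (n - 1) * x) / n
        b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
        b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
        l1 = b0
        l2 = 2 * b1 - b0
        l3 = 6 * b2 - 6 * b1 + b0
        l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
        return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

    x = np.random.default_rng(0).gumbel(size=100)   # synthetic annual peaks, illustrative
    l1, l2, t3, t4 = sample_lmoments(x)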
NASA Astrophysics Data System (ADS)
Diamanti, Eleni; Takesue, Hiroki; Langrock, Carsten; Fejer, M. M.; Yamamoto, Yoshihisa
2006-12-01
We present a quantum key distribution experiment in which keys that were secure against all individual eavesdropping attacks allowed by quantum mechanics were distributed over 100 km of optical fiber. We implemented the differential phase shift quantum key distribution protocol and used low timing jitter 1.55 µm single-photon detectors based on frequency up-conversion in periodically poled lithium niobate waveguides and silicon avalanche photodiodes. Based on the security analysis of the protocol against general individual attacks, we generated secure keys at a practical rate of 166 bit/s over 100 km of fiber. The use of the low jitter detectors also increased the sifted key generation rate to 2 Mbit/s over 10 km of fiber.
A physically based catchment partitioning method for hydrological analysis
NASA Astrophysics Data System (ADS)
Menduni, Giovanni; Riboni, Vittoria
2000-07-01
We propose a partitioning method for the topographic surface, which is particularly suitable for hydrological distributed modelling and shallow-landslide distributed modelling. The model provides variable mesh size and appears to be a natural evolution of contour-based digital terrain models. The proposed method allows the drainage network to be derived from the contour lines. The single channels are calculated via a search for the steepest downslope lines. Then, for each network node, the contributing area is determined by means of a search for both steepest upslope and downslope lines. This leads to the basin being partitioned into physically based finite elements delimited by irregular polygons. In particular, the distributed computation of local geomorphological parameters (i.e. aspect, average slope and elevation, main stream length, concentration time, etc.) can be performed easily for each single element. The contributing area system, together with the information on the distribution of geomorphological parameters provide a useful tool for distributed hydrological modelling and simulation of environmental processes such as erosion, sediment transport and shallow landslides.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight, and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
Observer-based distributed adaptive iterative learning control for linear multi-agent systems
NASA Astrophysics Data System (ADS)
Li, Jinsha; Liu, Sanyang; Li, Junmin
2017-10-01
This paper investigates the consensus problem for linear multi-agent systems from the viewpoint of two-dimensional systems when the state information of each agent is not available. An observer-based, fully distributed adaptive iterative learning protocol is designed. A local observer is designed for each agent, and it is shown that, without using any global information about the communication graph, all agents achieve consensus perfectly for any undirected connected communication graph as the number of iterations tends to infinity. A Lyapunov-like energy function is employed to facilitate the learning protocol design and property analysis. Finally, a simulation example is given to illustrate the theoretical analysis.
ERIC Educational Resources Information Center
Lin, Yu-Tzu; Chen, Ming-Puu; Chang, Chia-Hu; Chang, Pu-Chen
2017-01-01
The benefits of social learning have been recognized by existing research. To explore knowledge distribution in social learning and its effects on learning achievement, we developed a social learning platform and explored students' behaviors of peer interactions by the proposed algorithms based on social network analysis. An empirical study was…
Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L
2008-01-15
The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
NASA Astrophysics Data System (ADS)
Bartkiewicz, Karol; Černoch, Antonín; Lemr, Karel; Miranowicz, Adam; Nori, Franco
2016-06-01
Temporal steering, which is a temporal analog of Einstein-Podolsky-Rosen steering, refers to temporal quantum correlations between the initial and final state of a quantum system. Our analysis of temporal steering inequalities in relation to the average quantum bit error rates reveals the interplay between temporal steering and quantum cloning, which guarantees the security of quantum key distribution based on mutually unbiased bases against individual attacks. The key distributions analyzed here include the Bennett-Brassard 1984 protocol and the six-state 1998 protocol by Bruss. Moreover, we define a temporal steerable weight, which enables us to identify a kind of monogamy of temporal correlation that is essential to quantum cryptography and useful for analyzing various scenarios of quantum causality.
[Study on ecological suitability regionalization of Eucommia ulmoides in Guizhou].
Kang, Chuan-Zhi; Wang, Qing-Qing; Zhou, Tao; Jiang, Wei-Ke; Xiao, Cheng-Hong; Xie, Yu
2014-05-01
To study the ecological suitability regionalization of Eucommia ulmoides in order to select artificial planting bases and high-quality industrial raw material purchase areas for the herb in Guizhou. Based on an investigation of 14 Eucommia ulmoides producing areas, pinoresinol diglucoside content and ecological factors were obtained. A spatial analysis method was used to carry out the ecological suitability regionalization. Meanwhile, combining the pinoresinol diglucoside content, the correlations between the major active components and environmental factors were analyzed by statistical analysis. The most suitable planting area for Eucommia ulmoides was the northwest of Guizhou. The distribution of Eucommia ulmoides was mainly affected by the type and pH value of the soil and by monthly precipitation. The major active components of Eucommia ulmoides were randomly distributed in global space but had a single aggregation point with a high positive correlation in local space. The major active components of Eucommia ulmoides had no correlation with altitude, longitude, or latitude. Using spatial and statistical analysis methods based on environmental factors and pinoresinol diglucoside content, the ecological suitability regionalization of Eucommia ulmoides can provide a reference for the selection of suitable planting areas and artificial planting bases and for directing production layout.
Two-dimensional analysis of coupled heat and moisture transport in masonry structures
NASA Astrophysics Data System (ADS)
Krejčí, Tomáš
2016-06-01
Reconstruction and maintenance of historical buildings and bridges require good knowledge of the temperature and moisture distribution, since sharp changes in temperature and moisture can lead to damage. This paper describes an analysis of coupled heat and moisture transfer in masonry based on a two-level approach: the macro-scale level describes the whole structure, while the meso-scale level takes into account the detailed composition of the masonry. The two-level approach is very computationally demanding, and it was therefore implemented in parallel. The approach was used in an analysis of the temperature and moisture distribution in the Charles Bridge in Prague, Czech Republic.
Automatic contouring of geologic fabric and finite strain data on the unit hyperboloid
NASA Astrophysics Data System (ADS)
Vollmer, Frederick W.
2018-06-01
Fabric and finite strain analysis, an integral part of studies of geologic structures and orogenic belts, is commonly done by the analysis of particles whose shapes can be approximated as ellipses. Given a sample of such particles, the mean and confidence intervals of particular parameters can be calculated; however, taking the extra step of plotting and contouring the density distribution can identify asymmetries or modes related to sedimentary fabrics or other factors. A common graphical strain analysis technique is to plot final ellipse ratios, Rf, versus orientations, ϕf, on polar Elliott or Rf/ϕ plots to examine the density distribution. The plot may be contoured; however, it is desirable to have a contouring method that is rapid, reproducible, and based on the underlying geometry of the data. The unit hyperboloid, H2, gives a natural parameter space for two-dimensional strain, and various projections, including equal-area and stereographic, have useful properties for examining density distributions for anisotropy. An index, Ia, is given to quantify the magnitude and direction of anisotropy. Elliott and Rf/ϕ plots can be understood by applying hyperbolic geometry and recognizing them as projections of H2. Both of these distort area, however, so the equal-area projection is preferred for examining density distributions. The algorithm presented here gives fast, accurate, and reproducible contours of density distributions calculated directly on H2. The algorithm back-projects the data onto H2, where the density calculation is done at regular nodes using a weighting value based on the hyperboloid distribution, which is then contoured. It is implemented as an Octave-compatible MATLAB function that plots ellipse data using a variety of projections and calculates and displays contours of their density distribution on H2.
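To make the geometry concrete: assuming the common parameterization in which the hyperbolic radius is ρ = ln Rf and orientations are doubled (so that ϕf and ϕf + 180° coincide), the equal-area condition r dr = sinh ρ dρ gives a projected radius r = 2 sinh(ρ/2). The Python sketch below applies this mapping; it is a hedged reconstruction of the projection step only, not the author's Octave/MATLAB code, and the density estimation on H2 is omitted.

    import numpy as np

    def equal_area_h2(Rf, phi_f):
        """Project ellipse data (Rf, phi_f in radians) to an equal-area polar plot,
        treating rho = ln(Rf) as hyperbolic distance on the unit hyperboloid H2."""
        rho = np.log(np.asarray(Rf, dtype=float))
        r = 2.0 * np.sinh(rho / 2.0)       # equal-area radial coordinate
        theta = 2.0 * np.asarray(phi_f)    # orientation angle doubled on H2
        return r * np.cos(theta), r * np.sin(theta)

    x, y = equal_area_h2([1.5, 2.0, 3.0], np.radians([10.0, 45.0, 80.0]))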
NASA Astrophysics Data System (ADS)
Hirose, Misa; Toyota, Saori; Tsumura, Norimichi
2018-02-01
In this research, we evaluate the visibility of age spots and freckles as the blood volume changes, based on simulated spectral reflectance distributions and on actual facial color images, and we compare these results. First, we generate three types of spatial distributions of age spots and freckles in patch-like images based on the simulated spectral reflectance. The spectral reflectance is simulated using Monte Carlo simulation of light transport in multi-layered tissue. Next, we reconstruct the facial color image with changing blood volume. We acquire the concentration distributions of the melanin, hemoglobin, and shading components by applying independent component analysis to a facial color image. We reproduce images using the obtained melanin and shading concentrations and the changed hemoglobin concentration. Finally, we evaluate the visibility of the pigmentations using the simulated spectral reflectance distributions and the facial color images. For the simulated spectral reflectance distributions, we found that the visibility became lower as the blood volume increased. However, the results for the facial color images show that a specific blood volume reduces the visibility of the actual pigmentations.
NASA Technical Reports Server (NTRS)
Sobel, Larry; Buttitta, Claudio; Suarez, James
1993-01-01
Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.
Interoperating Cloud-based Virtual Farms
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.
2015-12-01
The present work aims at optimizing the use of computing resources available at the grid Italian Tier-2 sites of the ALICE experiment at CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic (“on-demand”) provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. Storage capacities of the participating sites are seen as a single federated storage area, preventing the need of mirroring data across them: high data access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a transparent way from an end-user perspective. Moreover, the interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. The tests of the investigated solutions for both cloud computing and distributed storage on wide area network will be presented.
Grimsley, Jasmine M S; Gadziola, Marie A; Wenstrup, Jeffrey J
2012-01-01
Mouse pups vocalize at high rates when they are cold or isolated from the nest. The proportions of each syllable type produced carry information about disease state and are being used as behavioral markers for the internal state of animals. Manual classifications of these vocalizations identified 10 syllable types based on their spectro-temporal features. However, manual classification of mouse syllables is time consuming and vulnerable to experimenter bias. This study uses an automated cluster analysis to identify acoustically distinct syllable types produced by CBA/CaJ mouse pups, and then compares the results to prior manual classification methods. The cluster analysis identified two syllable types, based on their frequency bands, that have continuous frequency-time structure, and two syllable types featuring abrupt frequency transitions. Although cluster analysis computed fewer syllable types than manual classification, the clusters represented well the probability distributions of the acoustic features within syllables. These probability distributions indicate that some of the manually classified syllable types are not statistically distinct. The characteristics of the four classified clusters were used to generate a Microsoft Excel-based mouse syllable classifier that rapidly categorizes syllables, with over a 90% match, into the syllable types determined by cluster analysis.
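The clustering step can be reproduced generically. The following Python sketch (scikit-learn, with a hypothetical feature file and placeholder feature columns) clusters syllables on spectro-temporal features and picks the cluster count by silhouette score; the study's exact algorithm and feature set may differ.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score
    from sklearn.preprocessing import StandardScaler

    # rows: syllables; columns: e.g. start/end frequency, bandwidth, duration
    raw = np.loadtxt("syllable_features.csv", delimiter=",")   # hypothetical file
    features = StandardScaler().fit_transform(raw)

    scores = {}
    for k in range(2, 11):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
        scores[k] = silhouette_score(features, labels)
    best_k = max(scores, key=scores.get)   # the study settled on four acoustic clusters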
Glyph-based analysis of multimodal directional distributions in vector field ensembles
NASA Astrophysics Data System (ADS)
Jarema, Mihaela; Demir, Ismail; Kehrer, Johannes; Westermann, Rüdiger
2015-04-01
Ensemble simulations are increasingly often performed in the geosciences in order to study the uncertainty and variability of model predictions. Describing ensemble data by mean and standard deviation can be misleading in case of multimodal distributions. We present first results of a glyph-based visualization of multimodal directional distributions in 2D and 3D vector ensemble data. Directional information on the circle/sphere is modeled using mixtures of probability density functions (pdfs), which enables us to characterize the distributions with relatively few parameters. The resulting mixture models are represented by 2D and 3D lobular glyphs showing direction, spread and strength of each principal mode of the distributions. A 3D extension of our approach is realized by means of an efficient GPU rendering technique. We demonstrate our method in the context of ensemble weather simulations.
NASA Astrophysics Data System (ADS)
Dong, Fang; Chen, Jian; Yang, Fan
2018-01-01
Based on medium-resolution Landsat 8 OLI/TIRS imagery, the temperature distribution in the four seasons in the urban area of Jinan City was obtained by using an atmospheric correction method for the retrieval of land surface temperature. The spatio-temporal distribution characteristics and development trend of the urban thermal environment, the seasonal variation, and the relationship between land surface temperature and the normalized difference vegetation index (NDVI) were analyzed quantitatively. The results show that the distribution of high-temperature areas is concentrated in Jinan and tends to expand from east to west, revealing a negative correlation between the land surface temperature distribution and NDVI. The results provide theoretical references and a scientific basis for improving the ecological environment of Jinan City, strengthening scientific planning, and making overall plans addressing climate change.
NASA Astrophysics Data System (ADS)
Letendre, Steven Emery
The U.S. electric utility sector in its current configuration is unsustainable. The majority of electricity in the United States is produced using finite fossil fuels. In addition, significant potential exists to improve the nation's efficient use of energy. A sustainable electric utility sector will be characterized by increased use of renewable energy sources and high levels of end-use efficiency. This dissertation analyzes two alternative policy approaches designed to move the U.S. electric utility sector toward sustainability. One approach, labeled incremental, involves maintaining the centralized structure of the electric utility sector while facilitating the introduction of renewable energy and efficiency into the electrical system through the pricing mechanism. A second policy approach is described in which structural changes are encouraged based on the emerging distributed utility (DU) concept. A structural policy orientation attempts to capture the unique localized benefits that distributed renewable resources and energy efficiency offer to electric utility companies and their customers. A market penetration analysis of PV in centralized energy supply and distributed peak-shaving applications is conducted for a case-study electric utility company. Sensitivity analysis was performed based on the incremental and structural policy orientations. The analysis provides compelling evidence that policies designed to bring about structural change in the electric utility sector are needed to move the industry toward sustainability. Specifically, the analysis demonstrates that PV technology, a key renewable energy option likely to play an important role in a renewable energy future, will begin to penetrate the electrical system in distributed peak-shaving applications long before the technology is introduced as a centralized energy supply option. Most policies to date, which I term incremental, attempt to encourage energy efficiency and renewables through the pricing system. Based on past policy experience, it is unlikely that such an approach would allow PV to compete in Delaware as an energy supply option in the next ten to twenty years. Alternatively, a market-based, or green pricing, approach will not create significant market opportunities for PV as a centralized energy supply option. However, structural policies designed to encourage the explicit recognition of the localized benefits of distributed resources could result in PV being introduced into the electrical system early in the next century.
Aeroelastic Analysis of a Distributed Electric Propulsion Wing
NASA Technical Reports Server (NTRS)
Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer
2017-01-01
An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, and the solutions were seen to be well converged. It was found that no oscillatory instability existed, only that of divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.
Three-Axis Ground Reaction Force Distribution during Straight Walking.
Hori, Masataka; Nakai, Akihito; Shimoyama, Isao
2017-10-24
We measured the three-axis ground reaction force (GRF) distribution during straight walking. Small three-axis force sensors composed of rubber and sensor chips were fabricated and calibrated. After sensor calibration, 16 force sensors were attached to the left shoe. The three-axis force distribution during straight walking was measured, and the local features of the three-axis force under the sole of the shoe were analyzed. The heel area played a role in receiving the braking force, the base area of the fourth and fifth toes applied little vertical or shear force, the base area of the second and third toes generated a portion of the propulsive force and received a large vertical force, and the base area of the big toe helped move the body's center of mass to the other foot. The results demonstrate that measuring the three-axis GRF distribution is useful for a detailed analysis of bipedal locomotion.
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated, and observations exceeding its significance threshold are flagged as outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested online using a pilot-scale setup and experimental data.
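The wavelet-plus-PCA scoring step can be sketched briefly. The Python code below is illustrative only (PyWavelets with a coiflet wavelet, then PCA and the T² statistic); the decomposition level, component count, and alarm threshold are assumptions, not the paper's settings.

    import numpy as np
    import pywt
    from sklearn.decomposition import PCA

    def hotelling_t2(spectra, n_components=3, wavelet="coif1", level=3):
        """spectra: (n_obs, n_wavelengths) array of UV absorbance scans."""
        feats = np.array([np.concatenate(pywt.wavedec(s, wavelet, level=level))
                          for s in spectra])
        pca = PCA(n_components=n_components).fit(feats)
        scores = pca.transform(feats)
        return np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling's T^2

    # flag an event when T^2 stays above a baseline quantile over several consecutive
    # scans, mirroring the sequential confirmation step described above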
IsoMAP (Isoscape Modeling, Analysis, and Prediction)
NASA Astrophysics Data System (ADS)
Miller, C. C.; Bowen, G. J.; Zhang, T.; Zhao, L.; West, J. B.; Liu, Z.; Rapolu, N.
2009-12-01
IsoMAP is a TeraGrid-based web portal aimed at building the infrastructure that brings together distributed multi-scale and multi-format geospatial datasets to enable statistical analysis and modeling of environmental isotopes. A typical workflow enabled by the portal includes (1) data source exploration and selection; (2) statistical analysis and model development; (3) predictive simulation of isotope distributions using models developed in (1) and (2); and (4) analysis and interpretation of simulated spatial isotope distributions (e.g., comparison with independent observations, pattern analysis). The gridded models and data products created by one user can be shared and reused among users within the portal, enabling collaboration and knowledge transfer. This infrastructure and the research it fosters can lead to fundamental changes in our knowledge of the water cycle and of ecological and biogeochemical processes through analysis of network-based isotope data, but it will be important (a) that those with whom the data and models are shared can be sure of the origin, quality, inputs, and processing history of these products, and (b) that the system is agile and intuitive enough to facilitate this sharing (rather than just 'allow' it). IsoMAP researchers are therefore building into the portal's architecture several components meant to increase the amount of metadata about users' products and to repurpose those metadata to make sharing and discovery more intuitive and robust both for expected, professional users and for unforeseeable populations from other sectors.
Federated software defined network operations for LHC experiments
NASA Astrophysics Data System (ADS)
Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon
2013-09-01
The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being addressed by adopting an advanced Internet technology called software defined networking (SDN). Stable SDN operations and management are required to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve an enhanced data delivery performance based on data traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.
CDF-XL: computing cumulative distribution functions of reaction time data in Excel.
Houghton, George; Grange, James A
2011-12-01
In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
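The "standard" CDF construction reduces to per-participant percentiles computed by condition and then averaged across participants. A compact Python/pandas sketch of that step follows, mirroring CDF-XL's Subject/Condition/RT input layout; it is an illustration of the idea, not a port of the workbook.

    import numpy as np
    import pandas as pd

    def vincentized_cdf(df, probs=(0.1, 0.3, 0.5, 0.7, 0.9)):
        """Per-participant RT quantiles by condition, averaged over participants."""
        q = (df.groupby(["Condition", "Subject"])["RT"]
               .quantile(list(probs))
               .unstack())                     # rows: (Condition, Subject); columns: quantiles
        return q.groupby("Condition").mean()   # grand-mean CDF points per condition

    df = pd.DataFrame({"Subject": [1, 1, 2, 2] * 10,
                       "Condition": ["A", "B"] * 20,
                       "RT": np.random.default_rng(0).gamma(2.0, 150.0, 40)})
    print(vincentized_cdf(df))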
Vo, T D; Dwyer, G; Szeto, H H
1986-04-01
A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (the Intel 8087 math co-processor), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that has distributed data acquisition capability, performed by a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly language subroutines perform, on-line or off-line, the fast Fourier transform and spectral analysis of the EEG, which is stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.
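In modern terms, the FFT/spectral step amounts to estimating a power spectral density. A brief SciPy sketch is given below for illustration; it is not the original BASIC/assembly code, and the sampling rate, epoch, and frequency band are assumptions.

    import numpy as np
    from scipy.signal import welch

    fs = 256.0                                          # assumed sampling rate, Hz
    eeg = np.random.default_rng(0).normal(size=4096)    # stand-in for a recorded epoch
    f, psd = welch(eeg, fs=fs, nperseg=512)             # averaged-periodogram spectrum
    band = (f >= 8.0) & (f <= 12.0)
    alpha_power = np.trapz(psd[band], f[band])          # power in the alpha band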
Yacoub, Haitham A.; Sadek, Mahmoud A.; Uversky, Vladimir N.
2017-01-01
This study was conducted to identify the source of animal meat based on the peculiarities of protein intrinsic disorder distribution in mitochondrial cytochrome b (mtCyt-b). The analysis revealed that animal and avian species can be discriminated based on the proportions of two groups of residues, Leu+Ile and Ser+Pro+Ala, in the amino acid sequences of their mtCyt-b. Although the level of overall intrinsic disorder in mtCyt-b is not very high, the peculiarities of disorder distribution within the sequences of mtCyt-b from different species vary in a rather specific way. In fact, the positions and intensities of disorder/flexibility "signals" in the corresponding disorder profiles are relatively unique to avian and animal species. Therefore, it is possible to devise a set of simple rules based on the peculiarities of the disorder profiles of their mtCyt-b proteins to discriminate among species. This intrinsic disorder-based analysis represents a new technique that could provide a promising solution for identifying the source of meat. PMID:28331777
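The first discriminant, residue-group proportions, is a few lines over the sequence. A minimal Python sketch follows; the sequence fragment is illustrative only, and the decision thresholds (which the paper derives from its disorder profiles) are not reproduced here.

    def residue_fractions(seq):
        """Proportions of Leu+Ile and Ser+Pro+Ala residues in a protein sequence."""
        seq = seq.upper()
        n = len(seq)
        li = sum(seq.count(a) for a in "LI") / n
        spa = sum(seq.count(a) for a in "SPA") / n
        return li, spa

    li, spa = residue_fractions("MTNIRKSHPLIKIANNALVDLPAPS")   # fragment, illustrative only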
NASA Astrophysics Data System (ADS)
Gugsa, Solomon A.; Davies, Angela
2005-08-01
Characterizing an aspheric micro lens is critical for understanding performance and providing feedback to manufacturing. We describe a method to find the best-fit conic of an aspheric micro lens using least-squares minimization and Monte Carlo analysis. Our analysis is based on scanning white light interferometry measurements, and we compare the standard rapid technique, where a single measurement is taken of the apex of the lens, to the more time-consuming stitching technique, where more surface area is measured. Both are corrected for tip/tilt based on a planar fit to the substrate. Four major parameters and their uncertainties are estimated from the measurement, and a chi-square minimization is carried out to determine the best-fit conic constant. The four parameters are the base radius of curvature, the aperture of the lens, the lens center, and the sag of the lens. A probability distribution is chosen for each of the four parameters based on the measurement uncertainties, and a Monte Carlo process is used to iterate the minimization process. Eleven measurements were taken, and data are also chosen randomly from the group during the Monte Carlo simulation to capture the measurement repeatability. A distribution of best-fit conic constants results, where the mean is a good estimate of the best-fit conic and the distribution width represents the combined measurement uncertainty. We also compare the Monte Carlo process for the stitched and unstitched data. Our analysis allows us to analyze the residual surface error in terms of Zernike polynomials and determine uncertainty estimates for each coefficient.
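The fit itself rests on the standard conic sag equation, z(r) = r^2 / (R (1 + sqrt(1 - (1 + k) r^2 / R^2))). A hedged Python sketch of the least-squares minimization wrapped in a Monte Carlo loop follows; the noise level, the form of the parameter perturbation, and all values are illustrative, not the measured ones.

    import numpy as np
    from scipy.optimize import curve_fit

    def sag(r, R, k):
        # standard conic sag; the sqrt argument is clipped for solver robustness
        u = np.maximum(1.0 - (1.0 + k) * r**2 / R**2, 1e-9)
        return r**2 / (R * (1.0 + np.sqrt(u)))

    rng = np.random.default_rng(0)
    r = np.linspace(0.0, 0.4, 200)                           # radial coordinate within aperture
    z = sag(r, 1.0, -1.2) + 5e-6 * rng.normal(size=r.size)   # synthetic profile with noise

    ks = []
    for _ in range(200):                          # Monte Carlo over parameter perturbations
        r_mc = r * (1.0 + 1e-3 * rng.normal())    # e.g. aperture/scale uncertainty
        (R_fit, k_fit), _ = curve_fit(sag, r_mc, z, p0=(1.0, -0.5))
        ks.append(k_fit)
    k_best, k_unc = np.mean(ks), np.std(ks)       # best-fit conic and combined uncertainty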
Online monitoring of seismic damage in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2004-07-01
Water distribution systems can be damaged by earthquakes, and such seismic damage cannot easily be located, especially immediately after the event. Earthquake experience shows that accurate and quick location of seismic damage is critical to the emergency response of water distribution systems. This paper develops a methodology to locate seismic damage, in the form of multiple breaks in a water distribution system, by monitoring water pressure online at a limited number of positions in the system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can readily be used. A neural network-based inverse analysis method is constructed for locating the seismic damage based on the variation of water pressure. The neural network is trained using analytically simulated data from the water distribution system and validated using a set of data never used in the training. It is found that the methodology provides an effective and practical way in which seismic damage in a water distribution system can be accurately and quickly located.
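The inverse mapping from pressure readings to break locations can be sketched with any supervised learner trained on simulated damage scenarios. The Python sketch below is a generic scikit-learn stand-in for the paper's neural network; the data files, network size, and labels are hypothetical.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # hypothetical training set: pressure-deviation patterns at monitored nodes for
    # simulated break scenarios, labeled by the break location that produced them
    X = np.load("simulated_pressure_patterns.npy")   # (n_scenarios, n_sensors); hypothetical file
    y = np.load("break_locations.npy")               # hypothetical labels
    model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                          random_state=0).fit(X, y)
    # after an event, measured SCADA pressures go to model.predict(...) to locate breaks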
NASA Astrophysics Data System (ADS)
Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf
2018-01-01
We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.
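The exchange rule underlying such kinetic wealth models is short to state: at each step two agents pool the non-saved parts of their wealth and split the pool randomly. A minimal Python sketch of the classic saving-propensity rule is given below for illustration; trap agents and the paper's modified wealth measure are not included.

    import numpy as np

    def trade(wealth, lam, steps, seed=0):
        """Pairwise kinetic exchange; lam is the saving propensity (scalar or per-agent)."""
        rng = np.random.default_rng(seed)
        w = wealth.copy()
        lam = np.broadcast_to(lam, w.shape)
        n = w.size
        for _ in range(steps):
            i, j = rng.choice(n, 2, replace=False)
            eps = rng.uniform()
            pool = (1 - lam[i]) * w[i] + (1 - lam[j]) * w[j]   # non-saved wealth at stake
            w[i] = lam[i] * w[i] + eps * pool
            w[j] = lam[j] * w[j] + (1 - eps) * pool            # total wealth is conserved
        return w

    # distributed saving propensities, uniform on (0, 1)
    w = trade(np.ones(1000), np.random.default_rng(1).uniform(size=1000), steps=200000)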
NASA Astrophysics Data System (ADS)
Bao, Yi; Hoehler, Matthew S.; Smith, Christopher M.; Bundy, Matthew; Chen, Genda
2017-10-01
In this study, a Brillouin scattering-based distributed fiber optic sensor is implemented to measure temperature distributions and detect cracks in concrete structures subjected to fire for the first time. A telecommunication-grade optical fiber is characterized as a high-temperature sensor with pulse pre-pump Brillouin optical time domain analysis (PPP-BOTDA) and implemented to measure spatially distributed temperatures in reinforced concrete beams in fire. Four beams were tested to failure in a natural gas fueled compartment fire, each instrumented with one fused silica, single-mode optical fiber as a distributed sensor and four thermocouples. Prior to concrete cracking, the distributed temperature was validated at the locations of the thermocouples to a relative difference of less than 9%. Cracks in the concrete can be identified as sharp peaks in the temperature distribution, since the cracks are locally filled with hot air. Concrete cracking did not affect the sensitivity of the distributed sensor, but concrete spalling broke the optical fiber loop required for PPP-BOTDA measurements.
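Cracks appear as sharp local peaks on the distributed temperature trace, which makes the detection step a generic peak search. The Python sketch below uses scipy.signal.find_peaks on a synthetic profile; the spatial step, background shape, and prominence threshold are all illustrative assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    # temperature profile along the fiber (position z in meters); synthetic stand-in
    z = np.arange(0.0, 10.0, 0.02)
    T = 200.0 + 30.0 * z                      # smooth background rising toward the fire
    T[z.searchsorted(3.4)] += 80.0            # a crack: sharp local hot spot
    peaks, _ = find_peaks(T, prominence=30.0) # prominence threshold is illustrative
    crack_positions = z[peaks]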
Multivariate meta-analysis: a robust approach based on the theory of U-statistic.
Ma, Yan; Mazumdar, Madhu
2011-10-30
Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between them. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of the outcomes is large. A multivariate method of moments (MMM) is available and has been shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of the U-statistic and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML due to a non-normal data distribution is marginal and that the estimates from the MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on the U-statistic for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting.
An accurate test for homogeneity of odds ratios based on Cochran's Q-statistic.
Kulinskaya, Elena; Dollinger, Michael B
2015-06-10
A frequently used statistic for testing homogeneity in a meta-analysis of K independent studies is Cochran's Q. For a standard test of homogeneity the Q statistic is referred to a chi-square distribution with K-1 degrees of freedom. For the situation in which the effects of the studies are logarithms of odds ratios, the chi-square distribution is much too conservative for moderate size studies, although it may be asymptotically correct as the individual studies become large. Using a mixture of theoretical results and simulations, we provide formulas to estimate the shape and scale parameters of a gamma distribution to fit the distribution of Q. Simulation studies show that the gamma distribution is a good approximation to the distribution for Q. Use of the gamma distribution instead of the chi-square distribution for Q should eliminate inaccurate inferences in assessing homogeneity in a meta-analysis. (A computer program for implementing this test is provided.) This hypothesis test is competitive with the Breslow-Day test both in accuracy of level and in power.
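The moment-matching idea can be reproduced empirically: simulate Q under homogeneity for log odds ratios, then fit a gamma distribution by the method of moments. The Python sketch below is illustrative only; the paper derives closed-form shape and scale estimates, which are not reproduced here, and the study sizes and event probability are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    K, n, p = 6, 30, 0.3                    # studies, per-arm size, common event probability
    qs = []
    for _ in range(20000):
        a = rng.binomial(n, p, K) + 0.5     # continuity-corrected counts, both arms
        c = rng.binomial(n, p, K) + 0.5
        b, d = n + 1 - a, n + 1 - c
        y = np.log(a * d / (b * c))         # log odds ratios under homogeneity
        w = 1.0 / (1/a + 1/b + 1/c + 1/d)   # inverse-variance weights
        ybar = np.sum(w * y) / np.sum(w)
        qs.append(np.sum(w * (y - ybar) ** 2))
    qs = np.array(qs)
    shape = qs.mean() ** 2 / qs.var()       # moment-matched gamma parameters
    scale = qs.var() / qs.mean()
    # compare gamma(shape, scale) upper quantiles with the chi-square(K-1) reference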
NASA Astrophysics Data System (ADS)
Bidari, Pooya Sobhe; Alirezaie, Javad; Tavakkoli, Jahan
2017-03-01
This paper presents a method for modeling and simulating shear wave generation from a nonlinear Acoustic Radiation Force Impulse (ARFI), which is treated as a distributed force applied at the focal region of a HIFU transducer radiating in the nonlinear regime. The shear wave propagation is simulated by solving Navier's equation with the distributed nonlinear ARFI as the source of the shear wave. Then, the Wigner-Ville Distribution (WVD), a time-frequency analysis method, is used to detect the shear wave at different local points in the region of interest. The WVD yields an estimate of the shear wave's time of arrival, its mean frequency, and the local attenuation, which can be utilized to estimate the medium's shear modulus and shear viscosity using the Voigt model.
Composition and structure of porcine digital flexor tendon-bone insertion tissues.
Chandrasekaran, Sandhya; Pankow, Mark; Peters, Kara; Huang, Hsiao-Ying Shadow
2017-11-01
Tendon-bone insertion is a functionally graded tissue, transitioning from a 200 MPa tensile modulus at the tendon end to a 20 GPa tensile modulus at the bone, across just a few hundred micrometers. In this study, we examine the porcine digital flexor tendon insertion tissue to provide a quantitative description of its collagen orientation and mineral concentration by using Fast Fourier Transform (FFT) based image analysis and mass spectrometry, respectively. Histological results revealed uniformity in global collagen orientation at all depths, indicative of mechanical anisotropy, although at mid-depth the highest fiber density, least dispersion, and least cellular circularity were evident. The collagen orientation distribution obtained through 2D FFT of histological imaging data from fluorescent microscopy agreed with past measurements based on polarized light microscopy. Results revealed that the global fiber orientation across the tendon-bone insertion is preserved along the direction of physiologic tension. Gradation in the fiber distribution orientation index across the insertion reflected a decrease in anisotropy from the tendon to the bone. We provided elemental maps across the fibrocartilage for its organic and inorganic constituents through time-of-flight secondary ion mass spectrometry (TOF-SIMS). The apatite intensity distribution from the tendon to the bone was shown to follow a linear trend, supporting past results based on Raman microprobe analysis. The merit of this study lies in the image-based simplified approach to fiber distribution quantification and in the high spatial resolution of the compositional analysis. In conjunction with the mechanical properties of the insertion tissue, the fiber and mineral distribution results from this study may potentially be incorporated into the development of a structural constitutive approach toward computational modeling. Characterizing the properties of the native insertion tissue would provide the microstructural basis for developing biomimetic scaffolds to recreate the graded morphology of a fibrocartilaginous insertion. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 3050-3058, 2017.
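The FFT-based orientation analysis can be illustrated compactly: the angular distribution of energy in the 2D power spectrum encodes fiber orientation. The sketch below is a generic implementation on a synthetic striped image, not the study's exact processing chain; the masking radii are illustrative assumptions.

```python
import numpy as np

def fiber_orientation_distribution(img, nbins=180):
    """Angular distribution of image energy from the 2D FFT power spectrum.

    Energy at spectral angle theta corresponds to fibers oriented
    perpendicular to theta in the image.
    """
    F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    angles = np.degrees(np.arctan2(y - h // 2, x - w // 2)) % 180
    r = np.hypot(y - h // 2, x - w // 2)
    mask = (r > 2) & (r < min(h, w) // 2)   # drop DC and corner artifacts
    hist, _ = np.histogram(angles[mask], bins=nbins, range=(0, 180),
                           weights=power[mask])
    return hist / hist.sum()

# Synthetic vertical stripes: spectral energy concentrates near 0 degrees.
img = np.sin(2 * np.pi * np.arange(128) / 8)[None, :] * np.ones((128, 1))
dist = fiber_orientation_distribution(img)
print("dominant spectral angle (deg):", dist.argmax())
```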
A descriptive model of resting-state networks using Markov chains.
Xie, H; Pal, R; Mitra, S
2016-08-01
Resting-state functional connectivity (RSFC) studies considering pairwise linear correlations have attracted great interest, while the underlying functional network structure remains poorly understood. To further our understanding of RSFC, this paper presents an analysis of resting-state networks (RSNs) based on steady-state distributions and provides a novel angle for investigating the RSFC of multiple functional nodes. The consistency of two networks is evaluated using the Hellinger distance between the steady-state distributions of the inferred Markov chain models. The results show that the generated steady-state distributions of the default mode network have higher consistency across subjects than random nodes drawn from various RSNs.
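The two quantities at the core of this approach, the steady-state distribution of a Markov chain and the Hellinger distance between two such distributions, are easy to compute. Below is a minimal sketch with two hypothetical three-state transition matrices standing in for inferred network models.

```python
import numpy as np

def steady_state(P):
    """Stationary distribution of a row-stochastic transition matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

def hellinger(p, q):
    """Hellinger distance between two discrete distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

P1 = np.array([[0.8, 0.1, 0.1],
               [0.2, 0.6, 0.2],
               [0.1, 0.3, 0.6]])
P2 = np.array([[0.7, 0.2, 0.1],
               [0.2, 0.6, 0.2],
               [0.2, 0.2, 0.6]])
print("consistency (lower = more similar):",
      hellinger(steady_state(P1), steady_state(P2)))
```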
Measurement and modeling of diameter distributions of particulate matter in terrestrial solutions
NASA Astrophysics Data System (ADS)
Levia, Delphis F.; Michalzik, Beate; Bischoff, Sebastian; NäThe, Kerstin; Legates, David R.; Gruselle, Marie-Cecile; Richter, Susanne
2013-04-01
Particulate matter (PM) plays an important role in the biogeosciences, affecting biosphere-atmosphere interactions and ecosystem health. This is the first known study to quantify and model PM diameter distributions of bulk precipitation, throughfall, stemflow, and organic layer (Oa) solution. Solutions were collected from a European beech (Fagus sylvatica L.) forest during leafed and leafless periods. Following scanning electron microscopy and image analysis, PM distributions were quantified and then modeled with the Box-Cox transformation. Based on an analysis of 43,278 individual particulates, the median PM diameter of all solutions was around 3.0 µm. All PM diameter frequency distributions were significantly right-skewed. Optimal power transformations of the PM diameter distributions were between -1.00 and -1.56. The utility of this model is that large samples with a similar probability density function can be generated for similar forests. Further work on the shape and chemical composition of particulates is warranted.
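A Box-Cox fit of a right-skewed diameter sample takes a few lines with SciPy. The sketch below uses synthetic lognormal "diameters" (median near 3 µm), so the estimated exponent will differ from the paper's reported range of -1.00 to -1.56.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic right-skewed particle diameters (micrometres), median near 3.
diameters = stats.lognorm.rvs(s=0.6, scale=3.0, size=5000, random_state=rng)

# Maximum-likelihood Box-Cox power transformation.
transformed, lam = stats.boxcox(diameters)
print("optimal Box-Cox exponent:", round(lam, 3))
# Check normality of the transformed sample on a subsample.
print("Shapiro-Wilk p-value:", stats.shapiro(transformed[:500]).pvalue)
```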
Effect of the state of internal boundaries on granite fracture nature under quasi-static compression
NASA Astrophysics Data System (ADS)
Damaskinskaya, E. E.; Panteleev, I. A.; Kadomtsev, A. G.; Naimark, O. B.
2017-05-01
Based on an analysis of the spatial distribution of hypocenters of acoustic emission signal sources and of the energy distributions of acoustic emission signals, the effects of the liquid phase and of a weak electric field on the spatiotemporal nature of granite sample fracture are studied. Experiments on uniaxial compression of granite samples at natural moisture showed that the damage accumulation process is two-stage: disperse accumulation of damage is followed by localized accumulation in the region of the forming macrofracture nucleus. In the energy distributions of acoustic emission signals, this transition is accompanied by a change in the distribution shape from exponential to power-law. Water saturation of the granite qualitatively changes the damage accumulation: the process remains delocalized up to macrofracture, with an exponential energy distribution of acoustic emission signals. Exposure to a weak electric field results in a selective change in the damage accumulation within the sample volume.
[Simulation and data analysis of stereological modeling based on virtual slices].
Wang, Hao; Shen, Hong; Bai, Xiao-yan
2008-05-01
To establish a computer-assisted stereological model for simulating the process of slice sectioning and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as a Win32 application based on MFC, with Microsoft Visual Studio as the IDE, to simulate an effectively unlimited sectioning process and to analyze the data derived from the model. The linearity of the model fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters scored very highly (>94.5% and 92%) in homogeneity and independence tests. The density, shape and size data of the sections were tested and found to conform to a normal distribution. The output of the model and that from the image analysis system showed statistical correlation and consistency. The algorithm described can be used for evaluating the stereological parameters of the structure of tissue slices.
A novel multitarget model of radiation-induced cell killing based on the Gaussian distribution.
Zhao, Lei; Mi, Dong; Sun, Yeqing
2017-05-07
The multitarget version of the traditional target theory based on the Poisson distribution is still used to describe the dose-survival curves of cells after ionizing radiation in radiobiology and radiotherapy. However, noting that typical ionizing radiation damage is the result of two sequential stochastic processes, the probability distribution of the damage number per cell should follow a compound Poisson distribution, such as Neyman's type A (N. A.) distribution. Considering that the Gaussian distribution approximates the N. A. distribution in the high-flux limit, a multitarget model based on the Gaussian distribution is proposed to describe cell inactivation effects under low linear energy transfer (LET) radiation at high dose rate. Theoretical analysis and experimental data fitting indicate that the present theory is superior to the traditional multitarget model and similar to the linear-quadratic (LQ) model in describing the biological effects of low-LET radiation at high dose rate, and that the parameter ratio in the present model can be used as an alternative indicator of the radiation damage and radiosensitivity of cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
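For context, the two baseline models the proposed theory is compared against can be fitted directly. The sketch below fits the traditional multitarget survival curve S(D) = 1 - (1 - e^(-D/D0))^n and the LQ curve S(D) = e^(-(alpha*D + beta*D^2)) to hypothetical dose-survival data; the data values are assumptions for illustration only, not the paper's experimental results.

```python
import numpy as np
from scipy.optimize import curve_fit

def multitarget(D, D0, n):
    # Traditional multitarget survival: S = 1 - (1 - exp(-D/D0))^n
    return 1.0 - (1.0 - np.exp(-D / D0)) ** n

def linear_quadratic(D, alpha, beta):
    # LQ survival: S = exp(-(alpha*D + beta*D^2))
    return np.exp(-(alpha * D + beta * D ** 2))

# Hypothetical dose-survival data (dose in Gy, surviving fraction).
D = np.array([0, 1, 2, 4, 6, 8, 10], float)
S = np.array([1.0, 0.85, 0.62, 0.25, 0.08, 0.02, 0.005])

p_mt, _ = curve_fit(multitarget, D, S, p0=[2.0, 2.0])
p_lq, _ = curve_fit(linear_quadratic, D, S, p0=[0.1, 0.02])
print("multitarget D0, n:", p_mt)
print("LQ alpha, beta:", p_lq, " alpha/beta ratio:", p_lq[0] / p_lq[1])
```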
Sensitivity of goodness-of-fit statistics to rainfall data rounding off
NASA Astrophysics Data System (ADS)
Deidda, Roberto; Puliga, Michelangelo
An analysis based on L-moments theory suggests adopting the generalized Pareto distribution to interpret daily rainfall depths recorded by the rain-gauge network of the Hydrological Survey of the Sardinia Region. Nevertheless, a significant and not yet fully resolved problem arises in the estimation of a left-censoring threshold able to assure a good fit of the rainfall data to the generalized Pareto distribution. In order to detect an optimal threshold, keeping the largest possible number of data, we chose to apply a “failure-to-reject” method based on goodness-of-fit tests, as proposed by Choulakian and Stephens [Choulakian, V., Stephens, M.A., 2001. Goodness-of-fit tests for the generalized Pareto distribution. Technometrics 43, 478-484]. Unfortunately, the application of the test, using the percentage points provided by Choulakian and Stephens (2001), did not succeed in detecting a useful threshold value in most of the analyzed time series. A deeper analysis revealed that these failures are mainly due to the presence of large quantities of rounded-off values among the sample data, affecting the distribution of the goodness-of-fit statistics and leading to significant departures from the percentage points expected for continuous random variables. A procedure based on Monte Carlo simulations is thus proposed to overcome these problems.
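The threshold-scanning workflow can be sketched as follows: fit a generalized Pareto distribution to the excesses over each candidate threshold and compute the Anderson-Darling statistic from the fitted CDF. All values below are simulated and the thresholds are arbitrary; the paper's point is that rounded-off data (e.g., applying np.round to the sample) distort the null distribution of such statistics, motivating Monte Carlo percentage points.

```python
import numpy as np
from scipy import stats

def anderson_darling(excesses, c, scale):
    """A^2 statistic for a fitted generalized Pareto distribution."""
    u = np.sort(stats.genpareto.cdf(excesses, c, scale=scale))
    n = len(u)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log1p(-u[::-1])))

rng = np.random.default_rng(3)
rain = stats.genpareto.rvs(0.1, scale=8.0, size=3000, random_state=rng)

for threshold in [0.0, 1.0, 2.0, 5.0]:
    exc = rain[rain > threshold] - threshold
    c, loc, scale = stats.genpareto.fit(exc, floc=0.0)
    print(threshold, len(exc), round(anderson_darling(exc, c, scale), 3))
```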
A data-driven allocation tool for in-kind resources distributed by a state health department.
Peterson, Cora; Kegler, Scott R; Parker, Wende R; Sullivan, David
2016-10-02
The objective of this study was to leverage a state health department's operational data to allocate in-kind resources (children's car seats) to counties, with the proposition that need-based allocation could ultimately improve public health outcomes. This study used a retrospective analysis of administrative data on car seats distributed to counties statewide by the Georgia Department of Public Health and development of a need-based allocation tool (presented as interactive supplemental digital content, adaptable to other types of in-kind public health resources) that relies on current county-level injury and sociodemographic data. Car seat allocation using public health data and a need-based formula resulted in substantially different recommended allocations to individual counties compared to historic distribution. Results indicate that making an in-kind public health resource like car seats universally available results in a less equitable distribution of that resource compared to deliberate allocation according to public health need. Public health agencies can use local data to allocate in-kind resources consistent with health objectives; that is, in a manner offering the greatest potential health impact. Future analysis can determine whether the change to a more equitable allocation of resources is also more efficient, resulting in measurably improved public health outcomes.
Regression analysis using dependent Polya trees.
Schörgendorfer, Angela; Branscum, Adam J
2013-11-30
Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.
Wallace, C.S.A.; Marsh, S.E.
2005-01-01
Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.
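A windowed variogram computation of the kind described can be sketched briefly: estimate semivariance against lag within a window, then fit a spherical model to recover nugget, sill, and range. The sketch below uses a synthetic 25 × 25 window, pools only horizontal lags for brevity, and is not the study's production pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, rng_):
    """Spherical variogram model with nugget, sill, and range parameters."""
    h = np.asarray(h, float)
    return np.where(h < rng_,
                    nugget + (sill - nugget) *
                    (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3),
                    sill)

def empirical_variogram(window, max_lag=12):
    """Semivariance of pixel pairs at integer horizontal lags."""
    lags = np.arange(1, max_lag + 1)
    gamma = [0.5 * np.mean((window[:, lag:] - window[:, :-lag]) ** 2)
             for lag in lags]
    return lags, np.array(gamma)

rng = np.random.default_rng(4)
# Spatially correlated synthetic 25 x 25 window (smoothed white noise).
window = gaussian_filter(rng.normal(size=(25, 25)), sigma=2)
lags, gamma = empirical_variogram(window)
(nugget, sill, rang), _ = curve_fit(spherical, lags, gamma,
                                    p0=[0.01, gamma.max(), 5.0],
                                    bounds=([0, 0, 0.5],
                                            [np.inf, np.inf, np.inf]))
print("nugget, sill, range:", nugget, sill, rang)
```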
Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo
2008-01-01
Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphically oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems in providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. In particular, the utility of the platform Quascade for deploying distributed and parallelized versions of a variety of computationally intensive phylogenetic algorithms has been shown. Secondly, the analysis of the utilized H5N1 neuraminidase datasets at macro and micro levels has clearly indicated a pattern of spatial clustering of the H5N1 viral isolates based on geographical distribution rather than temporal or host-range-based clustering.
Yoon, Jong H.; Tamir, Diana; Minzenberg, Michael J.; Ragland, J. Daniel; Ursu, Stefan; Carter, Cameron S.
2009-01-01
Background: Multivariate pattern analysis is an alternative method of analyzing fMRI data, which is capable of decoding distributed neural representations. We applied this method to test the hypothesis of impaired distributed representations in schizophrenia. We also compared the results of this method with traditional GLM-based univariate analysis. Methods: 19 schizophrenia patients and 15 control subjects viewed two runs of stimuli--exemplars of faces, scenes, objects, and scrambled images. To verify engagement with the stimuli, subjects completed a 1-back matching task. A multi-voxel pattern classifier was trained to identify category-specific activity patterns on one run of fMRI data. Classification testing was conducted on the remaining run. Correlation of voxel-wise activity across runs evaluated the variance over time in activity patterns. Results: Patients performed the task less accurately. This group difference was reflected in the pattern analysis results, with diminished classification accuracy in patients compared to controls (59% and 72%, respectively). In contrast, there was no group difference in GLM-based univariate measures. In both groups, classification accuracy was significantly correlated with behavioral measures. Both groups showed a highly significant correlation between inter-run correlations and classification accuracy. Conclusions: Distributed representations of visual objects are impaired in schizophrenia. This impairment is correlated with diminished task performance, suggesting that decreased integrity of cortical activity patterns is reflected in impaired behavior. Comparisons with the univariate results suggest greater sensitivity of pattern analysis in detecting group differences in neural activity and a reduced likelihood of non-specific factors driving these results. PMID:18822407
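The cross-run classification logic is simple to sketch: train a linear classifier on category-labeled activity patterns from one run and test on the other. The simulation below is a toy stand-in with assumed voxel counts, trial counts, and signal strength, not the study's fMRI pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(5)
n_voxels, n_trials = 200, 40
labels = np.repeat([0, 1, 2, 3], n_trials // 4)  # face/scene/object/scrambled
patterns = rng.normal(size=(4, n_voxels))        # category-specific patterns

def simulate_run(signal):
    """One fMRI run: category pattern scaled by `signal` plus unit noise."""
    return signal * patterns[labels] + rng.normal(size=(n_trials, n_voxels))

run1, run2 = simulate_run(0.5), simulate_run(0.5)
clf = LinearSVC(max_iter=10000).fit(run1, labels)   # train on run 1
print("cross-run accuracy:", (clf.predict(run2) == labels).mean())
# Lowering `signal` (noisier, less stable patterns) lowers cross-run
# accuracy, analogous to the reported patient-vs-control difference.
```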
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bagri, Akbar; Hanson, John P.; Lind, J. P.
We use high-energy X-ray diffraction microscopy (HEDM) to characterize the microstructure of Ni-base alloy 725. HEDM is a non-destructive technique capable of providing three-dimensional reconstructions of grain shapes and orientations in polycrystals. The present analysis yields the grain size distribution in alloy 725 as well as the grain boundary character distribution (GBCD) as a function of lattice misorientation and boundary plane normal orientation. We find that the GBCD of Ni-base alloy 725 is similar to that previously determined in pure Ni and other fcc-base metals. We find an elevated density of Σ9 and Σ3 grain boundaries. We also observe a preponderance of grain boundaries along low-index planes, with those along (1 1 1) planes being the most common, even after Σ3 twins have been excluded from the analysis.
[Research on suitable distribution of Paris yunnanensis based on remote sensing and GIS].
Luo, Yao; Dong, Yong-Bo; Zhu, Cong; Peng, Wen-Fu; Fang, Qing-Mao; Xu, Xin-Liang
2017-11-01
Paris yunnanensis is a rare medicinal herb of very high medicinal value. Studying its suitable ecological conditions can provide a basis for its rational exploitation, artificial cultivation, and sustainable utilization. A practicable method is proposed in this paper to determine the suitable regional distribution of P. yunnanensis in Sichuan Province. In this case study, and according to the related literature, suitable ecological conditions for P. yunnanensis such as altitude, mean annual temperature (MAT), annual precipitation, regional slope, slope ranges, vegetative cover, and soil types were analyzed using remote sensing (RS) and GIS. The appropriate distribution region of P. yunnanensis and its area were extracted based on RS and GIS technology, combined with field validation data. The results showed that the concentrated distribution regions in Sichuan Province were Liangshan prefecture, Aba prefecture, Sertar county of Ganzi prefecture, Panzhihua city, Ya'an city, Chengdu city, Meishan city, Leshan city, Yibin city, Neijiang city, Luzhou city, Bazhong city, Nanchong city, Guangyuan city, and other cities and counties. The suitable distribution area in Sichuan is about 7,338 km², accounting for 3.02% of the total study area. The analysis results are highly consistent with the field validation data, and the research method for the P. yunnanensis distribution region, based on spatial overlay analysis and land-use and ecological-factor information extracted via RS and GIS, is reliable. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Naik, V.; Mauzerall, D. L.; Horowitz, L.; Schwarzkopf, D.; Ramaswamy, V.; Oppenheimer, M.
2004-12-01
The global distribution of tropospheric ozone (O3) depends on the location of emissions of its precursors in addition to chemical and dynamical factors. The global picture of O3 forcing is, therefore, a sum of regional forcings arising from emissions of precursors from different sources. The Kyoto Protocol does not include ozone as a greenhouse gas, and emission reductions of ozone precursors made under Kyoto or any similar agreement would presently receive no credit. In this study, we quantitatively estimate the contribution of emissions of nitrogen oxides (NOx), the primary limiting O3 precursor in the non-urban atmosphere, from specific countries and regions of the world to global O3 concentration distributions. We then estimate radiative forcing resulting from the regional perturbations of NOx emissions. This analysis is intended as an early step towards incorporating O3 into the Kyoto Protocol or any successor agreement. Under such a system countries could obtain credit for improvements in local air quality that result in reductions of O3 concentrations because of the associated reductions in radiative forcing. We use the global chemistry transport model, MOZART-2, to simulate the global O3 distribution for base year 1990 and perturbations to this distribution caused by a 10% percent reduction in the base emissions of NOx from the United States, Europe, East Asia, India, South America, and Africa. We calculate the radiative forcing for the simulated base and perturbed O3 distributions using the GFDL radiative transfer model. The difference between the radiative forcing from O3 for the base and perturbed distributions provides an estimate of the marginal radiative forcing from a region's emissions of NOx. We will present a quantitative analysis of the magnitude, spatial, and temporal distribution of radiative forcing resulting from marginal changes in the NOx emissions from each region.
Analysis of biomolecular solvation sites by 3D-RISM theory.
Sindhikara, Daniel J; Hirata, Fumio
2013-06-06
We derive, implement, and apply an equilibrium solvation site analysis for biomolecules. Our method utilizes 3D-RISM calculations to quickly obtain equilibrium solvent distributions without the need for simulation and without its solvent-sampling limitations. Our analysis of these distributions extracts the highest-likelihood poses of solvent as well as localized entropies, enthalpies, and solvation free energies. We demonstrate our method on a structure of HIV-1 protease for which excellent structural and thermodynamic data are available for comparison. Our results, obtained within minutes, show systematic agreement with the available experimental data. Further, our results are in good agreement with established simulation-based solvent analysis methods. This method can be used not only for visual analysis of active-site solvation but also for virtual screening methods and experimental refinement.
Objective sea level pressure analysis for sparse data areas
NASA Technical Reports Server (NTRS)
Druyan, L. M.
1972-01-01
A computer procedure was used to analyze the pressure distribution over the North Pacific Ocean for eleven synoptic times in February, 1967. Independent knowledge of the central pressures of lows is shown to reduce the analysis errors for very sparse data coverage. The application of planned remote sensing of sea-level wind speeds is shown to make a significant contribution to the quality of the analysis especially in the high gradient mid-latitudes and for sparse coverage of conventional observations (such as over Southern Hemisphere oceans). Uniform distribution of the available observations of sea-level pressure and wind velocity yields results far superior to those derived from a random distribution. A generalization of the results indicates that the average lower limit for analysis errors is between 2 and 2.5 mb based on the perfect specification of the magnitude of the sea-level pressure gradient from a known verification analysis. A less than perfect specification will derive from wind-pressure relationships applied to satellite observed wind speeds.
Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng
2013-05-01
Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that the risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis in which the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and closed-form expressions of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model through simulation studies and illustrate its use with an application to a multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.
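The Sarmanov construction multiplies the product of the marginals by a correction term 1 + omega * phi1(x) * phi2(y), where the mixing functions phi_i have zero mean under the marginals. Below is a minimal sketch with beta marginals and mean-centered mixing functions, checking numerically that the density integrates to one; the admissible range of omega depends on the marginal parameters, and the values used here are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def sarmanov_beta_pdf(x, y, a1, b1, a2, b2, omega):
    """Sarmanov bivariate density with beta marginals and mixing functions
    phi_i(t) = t - E[t] (one common choice)."""
    f1, f2 = stats.beta.pdf(x, a1, b1), stats.beta.pdf(y, a2, b2)
    m1, m2 = a1 / (a1 + b1), a2 / (a2 + b2)
    return f1 * f2 * (1.0 + omega * (x - m1) * (y - m2))

# Numerical check that the density integrates to one on the unit square:
# the grid mean of the density approximates the integral (square has area 1).
g = np.linspace(1e-4, 1 - 1e-4, 400)
X, Y = np.meshgrid(g, g)
Z = sarmanov_beta_pdf(X, Y, 2.0, 5.0, 3.0, 4.0, omega=2.0)
print("integral ≈", Z.mean())
print("density nonnegative:", bool((Z >= 0).all()))
```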
NASA Astrophysics Data System (ADS)
Iino, Shota; Ito, Riho; Doi, Kento; Imaizumi, Tomoyuki; Hikosaka, Shuhei
2017-10-01
In developing countries, urban areas are expanding rapidly, so short-term monitoring of urban changes is important. Constant observation and the creation of high-accuracy, noise-free urban distribution maps are the key issues for such monitoring. SAR satellites, which can observe day or night regardless of weather conditions, are highly suitable for this type of study. The current study presents a methodology for generating high-accuracy urban distribution maps from SAR satellite imagery based on a Convolutional Neural Network (CNN), which has shown outstanding results for image classification. Several improvements in SAR polarization combinations and dataset construction were made to increase the accuracy. As additional data, Digital Surface Models (DSMs), which are useful for classifying land cover, were added to improve the accuracy. From the obtained results, a high-accuracy urban distribution map satisfying the quality requirements for short-term monitoring was generated. For the evaluation, urban changes were extracted by differencing urban distribution maps. The change analysis with a time series of imagery revealed the locations of short-term urban change areas. Comparisons with optical satellites were performed to validate the results. Finally, an analysis of urban changes combining X-band, L-band and C-band SAR satellites was attempted to increase the opportunity of acquiring satellite imagery. Further analysis will be conducted as future work of the present study.
Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.
2009-01-01
A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock return data on the S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction to the S&P500 index data over the usual normal model. PMID:20730043
Astrelin, A V; Sokolov, M V; Behnisch, T; Reymann, K G; Voronin, L L
1997-04-25
A statistical approach to the analysis of amplitude fluctuations of postsynaptic responses is described. This includes (1) using an L1-metric in the space of distribution functions for minimisation, with application of linear programming methods to decompose amplitude distributions into a convolution of Gaussian and discrete distributions; and (2) deconvolution of the resulting discrete distribution with determination of the release probabilities and the quantal amplitude for cases with a small number (<5) of discrete components. The methods were tested against simulated data over a range of sample sizes and signal-to-noise ratios which mimicked those observed in physiological experiments. In computer simulation experiments, comparisons were made with other methods of 'unconstrained' (generalized) and constrained reconstruction of discrete components from convolutions. The simulation results provided additional criteria for improving the solutions to overcome 'over-fitting' phenomena and to constrain the number of components with small probabilities. Application of the programme to recordings from hippocampal neurones demonstrated its usefulness for the analysis of amplitude distributions of postsynaptic responses.
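The L1 decomposition step maps naturally onto a linear program: minimize the sum of absolute residuals between the observed histogram and a weighted sum of Gaussian components with fixed centers. The sketch below is a hedged illustration on synthetic quantal data (quantal size, noise SD and weights are all assumptions), not the authors' programme.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

def l1_decompose(bin_centers, p_obs, centers, noise_sd):
    """Weights of a Gaussian mixture (fixed centers, common noise SD)
    minimizing the L1 distance to an observed amplitude histogram."""
    A = np.column_stack([norm.pdf(bin_centers, mu, noise_sd)
                         for mu in centers])
    A /= A.sum(axis=0, keepdims=True)        # normalize columns like p_obs
    nb, m = A.shape
    # LP variables: [w_1..w_m, e_1..e_nb]; minimize sum(e)
    # subject to  A w - p <= e  and  -(A w - p) <= e,  sum(w) = 1,  w, e >= 0.
    c = np.concatenate([np.zeros(m), np.ones(nb)])
    A_ub = np.block([[A, -np.eye(nb)], [-A, -np.eye(nb)]])
    b_ub = np.concatenate([p_obs, -p_obs])
    A_eq = np.concatenate([np.ones(m), np.zeros(nb)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + nb))
    return res.x[:m]

# Hypothetical quantal analysis: quantal size 10, release weights below.
bins = np.linspace(-10, 60, 71)
true_w = np.array([0.3, 0.4, 0.2, 0.1])      # 0..3 quanta released
p = sum(w * norm.pdf(bins, 10 * k, 4) for k, w in enumerate(true_w))
p /= p.sum()
print(l1_decompose(bins, p, centers=[0, 10, 20, 30], noise_sd=4).round(3))
```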
First Extraction of Transversity from a Global Analysis of Electron-Proton and Proton-Proton Data
NASA Astrophysics Data System (ADS)
Radici, Marco; Bacchetta, Alessandro
2018-05-01
We present the first extraction of the transversity distribution in the framework of collinear factorization based on the global analysis of pion-pair production in deep-inelastic scattering and in proton-proton collisions with a transversely polarized proton. The extraction relies on the knowledge of dihadron fragmentation functions, which are taken from the analysis of electron-positron annihilation data. For the first time, the transversity is extracted from a global analysis similar to what is usually done for the spin-averaged and helicity distributions. The knowledge of transversity is important for, among other things, detecting possible signals of new physics in high-precision low-energy experiments.
Li, Li; Zheng, Xu; Li, Zhengqiang; Li, Zhanhua; Dubovik, Oleg; Chen, Xingfeng; Wendisch, Manfred
2017-08-07
Particle shape is crucial to the properties of light scattered by atmospheric aerosol particles. A method of fluorescence microscopy direct observation was introduced to determine the aspect ratio distribution of aerosol particles. The result is comparable with that of the electron microscopic analysis. The measured aspect ratio distribution has been successfully applied in modeling light scattering and further in simulation of polarization measurements of the sun/sky radiometer. These efforts are expected to improve shape retrieval from skylight polarization by using directly measured aspect ratio distribution.
NASA Astrophysics Data System (ADS)
Lv, Gangming; Zhu, Shihua; Hui, Hui
Multi-cell resource allocation under minimum rate requirements for each user in OFDMA networks is addressed in this paper. Based on Lagrange dual decomposition theory, the joint multi-cell resource allocation problem is decomposed and modeled as a limited-cooperative game, and a distributed multi-cell resource allocation algorithm is thus proposed. Analysis and simulation results show that, compared with the non-cooperative iterative water-filling algorithm, the proposed algorithm can remarkably reduce the inter-cell interference (ICI) level and improve overall system performance.
Flare parameters inferred from a 3D loop model data base
NASA Astrophysics Data System (ADS)
Cuambe, Valente A.; Costa, J. E. R.; Simões, P. J. A.
2018-06-01
We developed a data base of pre-calculated flare images and spectra exploring a set of parameters which describe the physical characteristics of coronal loops and the accelerated electron distribution. Due to the large number of parameters involved in describing the geometry and the flaring atmosphere in the model used, we built a large data base of models (~250,000) to facilitate the flare analysis. The geometry and characteristics of non-thermal electrons are defined on a discrete grid with spatial resolution greater than 4 arcsec. The data base was constructed based on general properties of known solar flares and convolved with instrumental resolution to replicate the observations from the Nobeyama radio polarimeter spectra and Nobeyama radioheliograph (NoRH) brightness maps. Observed spectra and brightness distribution maps are easily compared with the modelled spectra and images in the data base, indicating a possible range of solutions. The parameter search efficiency in this finite data base is discussed. 8 out of 10 parameters analysed for 1000 simulated flare searches were recovered with a relative error of less than 20 per cent on average. In addition, from the analysis of the observed correlation between NoRH flare sizes and intensities at 17 GHz, some statistical properties were derived. From these statistics, the energy spectral index was found to be δ ~ 3, with non-thermal electron densities showing a peak distribution ⪅ 10^7 cm^-3, and B_photosphere ⪆ 2000 G. Some bias toward larger loops with heights as great as ~2.6 × 10^9 cm, and toward looptop events, was noted. An excellent match of the spectrum and the brightness distribution at 17 and 34 GHz of the 2002 May 31 flare is presented as well.
NASA Astrophysics Data System (ADS)
Anggit Maulana, Hiska; Haris, Abdul
2018-05-01
Reservoir and source rock identification has been performed to delineate the reservoir distribution of the Talangakar Formation, South Sumatra Basin. This study is based on integrated geophysical, geological and petrophysical data. The aims of the study are to determine the characteristics of the reservoir and source rock, to differentiate reservoir from source rock within the same Talangakar Formation, and to map the distribution of net-pay reservoir and source rock layers. The geophysical methods included seismic data interpretation using time and depth structure maps, post-stack inversion, and interval velocities; the geological interpretation included the analysis of structures and faults; and the petrophysical processing interpreted well log data from wells penetrating the hydrocarbon-bearing (oil and gas) Talangakar Formation. Based on the seismic interpretation, subsurface mapping was performed on Layer A and Layer I to determine the development of structures in the study area. Based on the geological interpretation, the trap in the study area is an anticline structure trending southwest-northeast and bounded by normal faults to the southwest-southeast. Based on the petrophysical analysis, the main reservoir in the study area is a layer at a depth of 1,375 m with a thickness of 2 to 8.3 meters.
NASA Astrophysics Data System (ADS)
Matsui, Hiroyuki; Mishchenko, Andrei S.; Hasegawa, Tatsuo
2010-02-01
We developed a novel method for obtaining the distribution of trapped carriers over their degree of localization in organic transistors, based on the fine analysis of electron spin resonance spectra at low enough temperatures where all carriers are localized. To apply the method to pentacene thin-film transistors, we proved through continuous wave saturation experiments that all carriers are localized at below 50 K. We analyzed the spectra at 20 K and found that the major groups of traps comprise localized states having wave functions spanning around 1.5 and 5 molecules and a continuous distribution of states with spatial extent in the range between 6 and 20 molecules.
GIS-based spatial statistical analysis of risk areas for liver flukes in Surin Province of Thailand.
Rujirakul, Ratana; Ueng-arporn, Naporn; Kaewpitoon, Soraya; Loyd, Ryan J; Kaewthani, Sarochinee; Kaewpitoon, Natthawut
2015-01-01
It is urgently necessary to be aware of the distribution and risk areas of the liver fluke, Opisthorchis viverrini, for proper allocation of prevention and control measures. This study aimed to investigate the human behavior and environmental factors influencing the distribution in Surin Province of Thailand, and to build a model using stepwise multiple regression analysis with a geographic information system (GIS) on environment and climate data. Human attitudes (<50%; X111), environmental factors such as population density (148-169 pop/km²; X73), and land use as wetland (X64) were correlated with the liver fluke disease distribution at the 0.000, 0.034, and 0.006 levels, respectively. Multiple regression analysis, using the equation OV = -0.599 + 0.005(X73: population density, 148-169 pop/km²) + 0.040(X111: human attitude, <50%) + 0.022(X64: wetland land use), was used to predict the distribution of liver fluke, where OV is the number of liver fluke infection patients (R² = 0.878, adjusted R² = 0.849). By GIS analysis, we found Si Narong, Sangkha, Phanom Dong Rak, Mueang Surin, Non Narai, Samrong Thap, Chumphon Buri, and Rattanaburi to have the highest distributions in Surin Province. In conclusion, the combination of GIS and statistical analysis can help simulate the spatial distribution and risk areas of liver fluke, and thus may be an important tool for future planning of prevention and control measures.
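The fitted regression form is easy to reproduce on toy data with ordinary least squares. In the sketch below the predictors standing in for X73, X111 and X64 are hypothetical random values, and the coefficients merely echo the published equation; it shows only the mechanics of fitting and scoring such a model.

```python
import numpy as np

# Hypothetical district-level data: stand-ins for X73 (population density
# indicator), X111 (attitude score), X64 (wetland land-use fraction).
rng = np.random.default_rng(8)
n = 50
X = rng.random((n, 3))
ov = -0.599 + X @ np.array([0.005, 0.040, 0.022]) + rng.normal(0, 0.01, n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, ov, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((ov - pred) ** 2) / np.sum((ov - ov.mean()) ** 2)
print("intercept and coefficients:", coef.round(3), " R^2:", round(r2, 3))
```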
Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.
Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin
2007-07-01
Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been successfully used in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principal approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending and supplementing these methods, we then propose a supplementary approach for quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the areal-specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.
NASA Astrophysics Data System (ADS)
Koutsodendris, Andreas; Papatheodorou, George; Kougiourouki, Ourania; Georgiadis, Michalis
2008-04-01
The types, abundance, distribution and sources of benthic marine litter found in four Greek Gulfs (Patras, Corinth, Echinades and Lakonikos) were studied using bottom trawl nets. Mean number and weight densities range between 72-437 items/km² and 6.7-47.4 kg/km². Litter items were sorted into material and usage categories. Plastic litter (56%) is the dominant material category, followed by metal (17%) and glass (11%). Beverage packaging (32%) is the dominant usage category, followed by general packaging (28%) and food packaging (21%). Based on the typological results, three dominant litter sources were identified: land-based, vessel-based and fishery-based. Application of factor analysis (R- and Q-mode) to both the material and usage litter datasets confirmed the existence of the three dominant litter sources. Q-mode analysis further quantified the litter sources; land-based sources provided the majority (69%) of the total litter items, followed by vessel-based (26%) and fishery-based (5%) sources. Diverse environmental parameters significantly influence these amounts among the four Gulfs.
Allen, Bruce C; Hack, C Eric; Clewell, Harvey J
2007-08-01
A Bayesian approach, implemented using Markov Chain Monte Carlo (MCMC) analysis, was applied with a physiologically-based pharmacokinetic (PBPK) model of methylmercury (MeHg) to evaluate the variability of MeHg exposure in women of childbearing age in the U.S. population. The analysis made use of the newly available National Health and Nutrition Survey (NHANES) blood and hair mercury concentration data for women of age 16-49 years (sample size, 1,582). Bayesian analysis was performed to estimate the population variability in MeHg exposure (daily ingestion rate) implied by the variation in blood and hair concentrations of mercury in the NHANES database. The measured variability in the NHANES blood and hair data represents the result of a process that includes interindividual variation in exposure to MeHg and interindividual variation in the pharmacokinetics (distribution, clearance) of MeHg. The PBPK model includes a number of pharmacokinetic parameters (e.g., tissue volumes, partition coefficients, rate constants for metabolism and elimination) that can vary from individual to individual within the subpopulation of interest. Using MCMC analysis, it was possible to combine prior distributions of the PBPK model parameters with the NHANES blood and hair data, as well as with kinetic data from controlled human exposures to MeHg, to derive posterior distributions that refine the estimates of both the population exposure distribution and the pharmacokinetic parameters. In general, based on the populations surveyed by NHANES, the results of the MCMC analysis indicate that a small fraction, less than 1%, of the U.S. population of women of childbearing age may have mercury exposures greater than the EPA RfD for MeHg of 0.1 microg/kg/day, and that there are few, if any, exposures greater than the ATSDR MRL of 0.3 microg/kg/day. The analysis also indicates that typical exposures may be greater than previously estimated from food consumption surveys, but that the variability in exposure within the population of U.S. women of childbearing age may be less than previously assumed.
On the properties of stochastic intermittency in rainfall processes.
Molini, A; La, Barbera P; Lanza, L G
2002-01-01
In this work we propose a mixed approach to the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictive power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high-resolution pluviometric time series covering one decade, recorded at the tipping-bucket station of the University of Genova, is analysed by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution both by virtue of their position within the event and of their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels, and its satisfactory agreement with a typical extreme value distribution is shown.
Saravana Kumar, Gurunathan; George, Subin Philip
2017-02-01
This work proposes a methodology involving stiffness optimization for subject-specific cementless hip implant design based on finite element analysis, aimed at reducing the stress-shielding effect. To assess the change in the stress-strain state of the femur and the resulting stress-shielding effect due to insertion of the implant, a finite element analysis of the resected femur with the implant assembly is carried out for a clinically relevant loading condition. Selecting the von Mises stress as the criterion for discriminating regions for elastic modulus difference, a stiffness minimization method is employed by varying the elastic modulus distribution in the custom implant stem. The stiffness minimization problem is formulated as a material distribution problem without explicitly penalizing partial volume elements. This formulation enables designs that could be fabricated using additive manufacturing to produce porous implants with varying levels of porosity. The stress-shielding effect, measured as the difference between the von Mises stress in the intact and implanted femur, decreased as the elastic modulus distribution was optimized.
Medium Caliber Lead-Free Electric Primer. Version 2
2012-09-01
The characterization employed a variety of techniques including Thermogravimetric Analysis (TGA), base hydrolysis, and surface area analysis using the Brunauer-Emmett-Teller (BET) method. [Only acronym-list and citation fragments survive from the remainder of this entry: TSCA, TNR, V, VDC, WSESRB; Johnson, C. E.; Fallis, S.; Chafin, A. P.; Groshens, T. J.; Higa, K. T.; Ismail, I. M. K.; Hawkins, T. W., particle size distribution from thermogravimetric analysis.]
Efficient Blockwise Permutation Tests Preserving Exchangeability
Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.
2014-01-01
In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through a simulated experiment on magnetic resonance imaging (MRI) brain data, specifically a multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113
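The exchangeability-preserving step can be sketched directly: labels are permuted only within blocks, so block-level structure (e.g., site effects) never mixes across blocks. The sketch below runs a full Monte Carlo permutation test for a group mean difference; the paper's contribution is to avoid this brute-force loop by moment-matching a Pearson distribution instead.

```python
import numpy as np

def blockwise_permutation_pvalue(x, y, blocks, n_perm=5000, rng=None):
    """Permutation p-value for a mean difference, permuting group labels
    only within exchangeability blocks (e.g., acquisition site)."""
    if rng is None:
        rng = np.random.default_rng(0)
    observed = x[y == 1].mean() - x[y == 0].mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        y_perm = y.copy()
        for b in np.unique(blocks):
            idx = np.where(blocks == b)[0]
            y_perm[idx] = rng.permutation(y_perm[idx])
        null[i] = x[y_perm == 1].mean() - x[y_perm == 0].mean()
    return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)

rng = np.random.default_rng(6)
blocks = np.repeat([0, 1, 2], 30)                 # three sites
y = np.tile(np.repeat([0, 1], 15), 3)             # balanced groups per site
x = rng.normal(size=90) + 0.5 * y + blocks * 2.0  # site offsets + group effect
print(blockwise_permutation_pvalue(x, y, blocks))
```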
Multifractal analysis of mobile social networks
NASA Astrophysics Data System (ADS)
Zheng, Wei; Zhang, Zifeng; Deng, Yufan
2017-09-01
As Wireless Fidelity (Wi-Fi)-enabled handheld devices have come into wide use, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have also been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. When the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. We therefore introduce a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.
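A crude box-covering estimate of the generalized dimensions D_q illustrates the idea: the partition function Z(q, eps) = sum_i p_i^q scales as eps^tau(q), and D_q = tau(q)/(q - 1). The sketch below works on a synthetic 2D point set rather than a network (the paper's box-covering operates on graph distances), so it is an analogy to the technique, not the paper's algorithm.

```python
import numpy as np

def generalized_dimensions(points, qs, epsilons):
    """Crude multifractal spectrum D_q of a 2D point set via box covering."""
    Dq = []
    for q in qs:
        logZ, logE = [], []
        for eps in epsilons:
            cells = np.floor(points / eps).astype(int)
            _, counts = np.unique(cells, axis=0, return_counts=True)
            p = counts / counts.sum()
            logZ.append(np.log(np.sum(p ** q)))
            logE.append(np.log(eps))
        tau = np.polyfit(logE, logZ, 1)[0]   # tau(q) from the scaling slope
        Dq.append(tau / (q - 1))
    return np.array(Dq)

rng = np.random.default_rng(7)
pts = rng.random((20000, 2)) ** 2            # inhomogeneous point cloud
qs = np.array([0.5, 2.0, 3.0])               # avoid q = 1 (needs a limit)
print(generalized_dimensions(pts, qs, epsilons=[0.2, 0.1, 0.05, 0.025]))
```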
Risk-based maintenance of ethylene oxide production facilities.
Khan, Faisal I; Haddara, Mahmoud R
2004-05-20
This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of the many likely failure scenarios, the most probable ones are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and probability analyses. The calculated risk is compared against known acceptance criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis was also undertaken to study the impact of changing the reliability distribution, as well as of errors in the distribution parameters, on the maintenance interval.
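The risk-minimizing interval selection can be sketched in a few lines: with a lognormal time-to-failure model, choose the largest inspection interval whose estimated annual risk stays below an acceptance criterion. All numbers below (distribution parameters, consequence, criterion) are assumptions for illustration, not the case study's values.

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

# Hypothetical inputs: lognormal time-to-failure (years), a fixed
# consequence per failure, and an acceptable individual risk per year.
ttf = stats.lognorm(s=0.8, scale=12.0)     # failure-time distribution
consequence = 1e-2                         # fatalities per failure (assumed)
acceptable_risk = 1e-4                     # per year (assumed criterion)

def annual_risk(interval):
    # Probability of failing within one inspection interval, per year.
    return consequence * ttf.cdf(interval) / interval

# Largest inspection interval whose estimated risk stays acceptable.
interval = brentq(lambda t: annual_risk(t) - acceptable_risk, 0.1, 12.0)
print("maintenance interval (years):", round(interval, 2))
```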
NASA Technical Reports Server (NTRS)
Palosz, B.; Grzanka, E.; Gierlotka, S.; Stelmakh, S.; Pielaszek, R.; Bismayer, U.; Weber, H.-P.; Palosz, W.
2003-01-01
Two methods for the analysis of powder diffraction patterns of diamond and SiC nanocrystals are presented: (a) examination of changes of the lattice parameters with diffraction vector Q ('apparent lattice parameter', alp), which refers to Bragg scattering, and (b) examination of changes of inter-atomic distances based on the analysis of the atomic Pair Distribution Function (PDF). The application of these methods was studied based on theoretical diffraction patterns computed for models of nanocrystals having (i) a perfect crystal lattice and (ii) a core-shell structure, i.e. constituting a two-phase system. The models are defined by the lattice parameter of the grain core, the thickness of the surface shell, and the magnitude and distribution of the strain field in the shell. X-ray and neutron experimental diffraction data of nanocrystalline SiC and diamond powders with grain diameters from 4 nm up to micrometers were used. The effects of the internal pressure and strain at the grain surface on the structure are discussed based on the experimentally determined dependence of the alp values on the Q-vector, and on the changes of the interatomic distances with grain size determined experimentally by PDF analysis. The experimental results lend strong support to the concept of a two-phase, core and surface shell structure of nanocrystalline diamond and SiC.
Inhomogeneity Based Characterization of Distribution Patterns on the Plasma Membrane
Paparelli, Laura; Corthout, Nikky; Wakefield, Devin L.; Sannerud, Ragna; Jovanovic-Talisman, Tijana; Annaert, Wim; Munck, Sebastian
2016-01-01
Cell surface protein and lipid molecules are organized in various patterns: randomly, along gradients, or clustered when segregated into discrete micro- and nano-domains. Their distribution is tightly coupled to events such as polarization, endocytosis, and intracellular signaling, but challenging to quantify using traditional techniques. Here we present a novel approach to quantify the distribution of plasma membrane proteins and lipids. This approach describes spatial patterns in degrees of inhomogeneity and incorporates an intensity-based correction to analyze images with a wide range of resolutions; we have termed it Quantitative Analysis of the Spatial distributions in Images using Mosaic segmentation and Dual parameter Optimization in Histograms (QuASIMoDOH). We tested its applicability using simulated microscopy images and images acquired by widefield microscopy, total internal reflection microscopy, structured illumination microscopy, and photoactivated localization microscopy. We validated QuASIMoDOH, successfully quantifying the distribution of protein and lipid molecules detected with several labeling techniques, in different cell model systems. We also used this method to characterize the reorganization of cell surface lipids in response to disrupted endosomal trafficking and to detect dynamic changes in the global and local organization of epidermal growth factor receptors across the cell surface. Our findings demonstrate that QuASIMoDOH can be used to assess protein and lipid patterns, quantifying distribution changes and spatial reorganization at the cell surface. An ImageJ/Fiji plugin of this analysis tool is provided. PMID:27603951
Analysis of digital communication signals and extraction of parameters
NASA Astrophysics Data System (ADS)
Al-Jowder, Anwar
1994-12-01
The signal classification performance of four types of electronic support measures (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, one of which is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNRs).
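The broadband-versus-narrowband comparison is easy to reproduce: at low SNR, total-power detection is swamped by noise, while an FFT concentrates a narrowband signal into a few bins that stand well above the noise floor. Below is a minimal sketch with an assumed carrier tone and -10 dB SNR; the sampling parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
fs, N = 1000.0, 4096
t = np.arange(N) / fs
signal = np.cos(2 * np.pi * 100 * t)   # narrowband tone (carrier stand-in)
snr_db = -10
# Noise power chosen so that signal power (0.5) / noise power = 10^(snr/10).
noise = rng.normal(0, 10 ** (-snr_db / 20) / np.sqrt(2), N)
x = signal + noise

# Broadband energy detector: total power, dominated by noise at low SNR.
broadband = np.mean(x ** 2)
# FFT-based narrowband detector: the peak bin stands out because noise
# power is spread across all bins.
spectrum = np.abs(np.fft.rfft(x)) ** 2 / N
print("broadband power:", round(broadband, 2),
      " peak-bin / median-bin ratio:",
      round(spectrum.max() / np.median(spectrum), 1))
```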
2012-08-01
interior, and carbides and borides at the grain boundaries. Blocky carbide particles can also be seen in the grain interior (Figure 1b: higher-magnification image of a typical grain boundary decorated with carbide and boride particles, showing a bi-modal distribution).
Harrison, Luke B; Larsson, Hans C E
2015-03-01
Likelihood-based methods are commonplace in phylogenetic systematics. Although much effort has been directed toward likelihood-based models for molecular data, comparatively less work has addressed models for discrete morphological character (DMC) data. Among-character rate variation (ACRV) may confound phylogenetic analysis, but there have been few analyses of the magnitude and distribution of rate heterogeneity among DMCs. Using 76 data sets covering a range of plants and invertebrate and vertebrate animals, we used a modified version of MrBayes to test equal, gamma-distributed, and lognormally distributed models of ACRV, integrating across phylogenetic uncertainty using Bayesian model selection. We found that in approximately 80% of data sets, unequal-rates models outperformed equal-rates models, especially among larger data sets. Moreover, although most data sets were equivocal, more data sets favored the lognormal rate distribution over the gamma rate distribution, lending some support to more complex character correlations than in molecular data. Parsimony estimation of the underlying rate distributions in several data sets suggests that the lognormal distribution is preferred when there are many slowly evolving characters and fewer quickly evolving characters. The four rate category discrete approximation commonly adopted for molecular data was found to be sufficient to approximate a gamma rate distribution with discrete characters. However, for the two data sets tested that favored a lognormal rate distribution, the continuous distribution was better approximated with at least eight discrete rate categories. Although the effect of the rate model on the estimation of topology was difficult to assess across all data sets, it appeared relatively minor between the unequal-rates models for the one data set examined carefully. As in molecular analyses, we argue that researchers should test and adopt the most appropriate model of rate variation for the data set in question. As discrete characters are increasingly used in more sophisticated likelihood-based phylogenetic analyses, it is important that these studies be built on the most appropriate and carefully selected underlying models of evolution. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
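Where a reader wants to see what the discrete approximation of a rate distribution looks like in practice, the following is a minimal sketch of the standard equal-probability discretization of a mean-one gamma distribution into k categories; the paper's own machinery lives inside its modified MrBayes and is not reproduced here.

```python
# Sketch: discretize a mean-one gamma(alpha, alpha) rate distribution into k
# equal-probability categories; each category rate is the mean of its
# quantile bin (the discrete-gamma construction commonly used for ACRV).
import numpy as np
from scipy.stats import gamma

def discrete_gamma_rates(alpha, k=4):
    # Bin edges at quantiles 0, 1/k, ..., 1 of Gamma(shape=alpha, scale=1/alpha).
    edges = gamma.ppf(np.linspace(0.0, 1.0, k + 1), a=alpha, scale=1.0 / alpha)
    # Mean within each bin via the incomplete-gamma identity:
    # E[X; a < X < b] = F_{alpha+1}(b) - F_{alpha+1}(a) for a mean-one gamma.
    upper = gamma.cdf(edges[1:], a=alpha + 1, scale=1.0 / alpha)
    lower = gamma.cdf(edges[:-1], a=alpha + 1, scale=1.0 / alpha)
    return (upper - lower) * k  # each bin carries probability 1/k

print(discrete_gamma_rates(0.5, k=4))  # four rates averaging 1.0
print(discrete_gamma_rates(0.5, k=8))  # finer grid, as needed for lognormal-like shapes
```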
Fernando, M Rohan; Jiang, Chao; Krzyzanowski, Gary D; Ryan, Wayne L
2018-04-12
Plasma cell-free DNA (cfDNA) fragment size distribution provides important information required for diagnostic assay development. We have developed and optimized droplet digital PCR (ddPCR) assays that quantify short and long DNA fragments. These assays were used to analyze plasma cfDNA fragment size distribution in human blood. Assays were designed to amplify 76, 135, 490 and 905 base pair fragments of the human β-actin gene. These assays were used for fragment size analysis of plasma cell-free, exosome and apoptotic body DNA obtained from normal and pregnant donors. The relative percentages for 76, 135, 490 and 905 bp fragments from non-pregnant plasma and exosome DNA were 100%, 39%, 18%, 5.6% and 100%, 40%, 18%, 3.3%, respectively. The relative percentages for pregnant plasma and exosome DNA were 100%, 34%, 14%, 23%, and 100%, 30%, 12%, 18%, respectively. The relative percentages for the non-pregnant plasma pellet (obtained after the 2nd centrifugation step) were 100%, 100%, 87% and 83%, respectively. Non-pregnant plasma cell-free and exosome DNA share a unique fragment distribution pattern that differs from pregnant donor plasma and exosome DNA fragment distributions, indicating the effect of physiological status on cfDNA fragment size distribution. The fragment distribution pattern for the plasma pellet, which includes apoptotic bodies and nuclear DNA, was greatly different from plasma cell-free and exosome DNA. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
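As a small worked example of the percentages quoted above, assuming hypothetical ddPCR copy counts (the real counts are in the paper), the relative percentage for each amplicon is simply its count normalized to the shortest (76 bp) amplicon:

```python
# Sketch: express ddPCR copy counts of the 76/135/490/905 bp beta-actin
# amplicons relative to the shortest amplicon (set to 100%). The input
# counts below are hypothetical copies/uL, not the paper's data.
def relative_percentages(copies_by_size):
    baseline = copies_by_size[76]
    return {size: round(100.0 * c / baseline, 1)
            for size, c in sorted(copies_by_size.items())}

counts = {76: 1520.0, 135: 590.0, 490: 270.0, 905: 85.0}  # hypothetical
print(relative_percentages(counts))  # {76: 100.0, 135: 38.8, 490: 17.8, 905: 5.6}
```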
Lognormal Approximations of Fault Tree Uncertainty Distributions.
El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P
2018-01-26
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
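A minimal sketch of the comparison described above, for an invented three-event tree with top-event expression p1*p2 + p3 and lognormal basic events; the closed-form route here is simple moment matching of a lognormal to the top event, not the article's exact derivation:

```python
# Sketch: Monte Carlo top-event uncertainty vs. a moment-matched lognormal
# approximation, for P(top) = p1*p2 + p3 (an OR of an AND gate with a basic
# event, rare-event approximation). All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([-7.0, -6.0, -9.0])      # lognormal location parameters
sigma = np.array([0.7, 0.5, 1.0])      # lognormal shape parameters

p = rng.lognormal(mu, sigma, size=(200_000, 3))
top = p[:, 0] * p[:, 1] + p[:, 2]      # multilinear top-event expression

# Moment-match a lognormal to the sampled mean/variance of the top event.
m, v = top.mean(), top.var()
s2 = np.log(1.0 + v / m**2)
mu_t, sig_t = np.log(m) - 0.5 * s2, np.sqrt(s2)

q95 = np.exp(mu_t + 1.6449 * sig_t)    # approximate 95th percentile
print(q95, np.quantile(top, 0.95))     # closed-form vs. sampled percentile
```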
Buchin, Kevin; Sijben, Stef; van Loon, E Emiel; Sapir, Nir; Mercier, Stéphanie; Marie Arseneau, T Jean; Willems, Erik P
2015-01-01
The Brownian bridge movement model (BBMM) provides a biologically sound approximation of the movement path of an animal based on discrete location data, and is a powerful method to quantify utilization distributions. Computing the utilization distribution based on the BBMM while calculating movement parameters directly from the location data may result in inconsistent and misleading results. We show how the BBMM can be extended to also calculate derived movement parameters. Furthermore, we demonstrate how to integrate environmental context into a BBMM-based analysis. We develop a computational framework to analyze animal movement based on the BBMM. In particular, we demonstrate how a derived movement parameter (relative speed) and its spatial distribution can be calculated in the BBMM. We show how to integrate our framework with the conceptual framework of the movement ecology paradigm in two related but distinctly different ways, focusing on the influence that the environment has on animal movement. First, we demonstrate an a posteriori approach, in which the spatial distribution of average relative movement speed as obtained from a "contextually naïve" model is related to the local vegetation structure within the monthly ranging area of a group of wild vervet monkeys. Without a model like the BBMM it would not be possible to estimate such a spatial distribution of a parameter in a sound way. Second, we introduce an a priori approach in which atmospheric information is used to calculate a crucial parameter of the BBMM to investigate flight properties of migrating bee-eaters. This analysis shows significant differences in the characteristics of flight modes, which would not have been detected without using the BBMM. Our algorithm is the first of its kind to allow BBMM-based computation of movement parameters beyond the utilization distribution, and we present two case studies that demonstrate two fundamentally different ways in which our algorithm can be applied to estimate the spatial distribution of average relative movement speed, while interpreting it in a biologically meaningful manner, across a wide range of environmental scenarios and ecological contexts. Therefore, movement parameters derived from the BBMM can provide a powerful method for movement ecology research.
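The core of the BBMM is the Brownian bridge between consecutive fixes; a minimal sketch of its position distribution (mean and variance at an intermediate time, including location error), with illustrative parameter values:

```python
# Sketch of the Brownian bridge used by the BBMM: between two observed fixes
# z0 (at t=0) and z1 (at t=T), the position at time t is normal with mean on
# the straight line between fixes and variance that vanishes at both fixes.
# sigma2 is the Brownian motion variance, delta2 the location error variance;
# all numbers here are illustrative.
import numpy as np

def bridge_mean_var(z0, z1, T, t, sigma2, delta2=0.0):
    a = t / T
    mean = (1 - a) * np.asarray(z0) + a * np.asarray(z1)
    var = T * a * (1 - a) * sigma2 + ((1 - a) ** 2 + a ** 2) * delta2
    return mean, var

mean, var = bridge_mean_var(z0=(0.0, 0.0), z1=(100.0, 50.0),
                            T=600.0, t=150.0, sigma2=0.5, delta2=25.0)
print(mean, var)  # position estimate and spatial uncertainty at t = 150 s
```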
Hernández-Saz, J; Herrera, M; Delgado, F J; Duguay, S; Philippe, T; Gonzalez, M; Abell, J; Walters, R J; Molina, S I
2016-07-29
The analysis by atom probe tomography (APT) of InAlAsSb layers with applications in triple junction solar cells (TJSCs) has shown the existence of In- and Sb-rich regions in the material. The composition variation found is not evident from direct observation of the 3D atomic distribution, and a statistical analysis has therefore been required. From previous analysis of these samples, it is shown that the small compositional fluctuations determined have a strong effect on the optical properties of the material and ultimately on the performance of TJSCs.
NASA Technical Reports Server (NTRS)
Scudder, Nathan F
1934-01-01
This report presents the results of an investigation of the spinning characteristics of an NY-1 naval training biplane. The results of flight tests and an analysis based on wind-tunnel test data are given and compared. The primary purpose of the investigation was the determination in flight of the effect of changes in mass distribution along the longitudinal axis, without change of mass quantity or centroid. Other effects were also investigated, such as those due to wing loading, center-of-gravity position, dihedral of wings, control setting, and the removal of a large portion of the fabric from the fin and rudder. The wind tunnel test results used in the numerical analysis were obtained in the 7 by 10 foot wind tunnel through a range of angles of attack.
Foglia, L.; Hill, Mary C.; Mehl, Steffen W.; Burlando, P.
2009-01-01
We evaluate the utility of three interrelated means of using data to calibrate the fully distributed rainfall‐runoff model TOPKAPI as applied to the Maggia Valley drainage area in Switzerland. The use of error‐based weighting of observation and prior information data, local sensitivity analysis, and single‐objective function nonlinear regression provides quantitative evaluation of sensitivity of the 35 model parameters to the data, identification of data types most important to the calibration, and identification of correlations among parameters that contribute to nonuniqueness. Sensitivity analysis required only 71 model runs, and regression required about 50 model runs. The approach presented appears to be ideal for evaluation of models with long run times or as a preliminary step to more computationally demanding methods. The statistics used include composite scaled sensitivities, parameter correlation coefficients, leverage, Cook's D, and DFBETAS. Tests suggest predictive ability of the calibrated model typical of hydrologic models.
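For readers unfamiliar with the statistics listed, here is a sketch of one of them, the composite scaled sensitivity, computed by central finite differences on a toy two-parameter model; it follows the usual form css_j = sqrt(mean over observations of ((dy_i/dp_j) p_j w_i^(1/2))^2), and the model is invented:

```python
# Sketch: composite scaled sensitivity (CSS) of a model output vector to one
# parameter, via central finite differences. The toy model is hypothetical.
import numpy as np

def css(model, p, j, weights, h=1e-3):
    p_hi, p_lo = p.copy(), p.copy()
    p_hi[j] *= 1 + h
    p_lo[j] *= 1 - h
    dy_dpj = (model(p_hi) - model(p_lo)) / (2 * h * p[j])  # sensitivity vector
    scaled = dy_dpj * p[j] * np.sqrt(weights)              # scale by parameter and weights
    return np.sqrt(np.mean(scaled ** 2))

model = lambda p: np.array([2.0 * p[0], p[0] + p[1] ** 2, 3.0 * p[1]])  # toy model
p = np.array([1.5, 0.8])
print([css(model, p, j, weights=np.ones(3)) for j in range(2)])
```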
Scaling and allometry in the building geometries of Greater London
NASA Astrophysics Data System (ADS)
Batty, M.; Carvalho, R.; Hudson-Smith, A.; Milton, R.; Smith, D.; Steadman, P.
2008-06-01
Many aggregate distributions of urban activities such as city sizes reveal scaling but hardly any work exists on the properties of spatial distributions within individual cities, notwithstanding considerable knowledge about their fractal structure. We redress this here by examining scaling relationships in a world city using data on the geometric properties of individual buildings. We first summarise how power laws can be used to approximate the size distributions of buildings, in analogy to city-size distributions which have been widely studied as rank-size and lognormal distributions following Zipf [ Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, 1949)] and Gibrat [ Les Inégalités Économiques (Librarie du Recueil Sirey, Paris, 1931)]. We then extend this analysis to allometric relationships between buildings in terms of their different geometric size properties. We present some preliminary analysis of building heights from the Emporis database, which suggests very strong scaling in world cities. The database for Greater London is then introduced, from which we extract 3.6 million buildings whose scaling properties we explore. We examine key allometric relationships between these different properties illustrating how building shape changes according to size, and we extend this analysis to the classification of buildings according to land use types. We conclude with an analysis of two-point correlation functions of building geometries which supports our non-spatial analysis of scaling.
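A minimal sketch of the rank-size fitting step, using synthetic Pareto draws in place of building footprint areas; the slope of log rank against log size estimates the Zipf exponent:

```python
# Sketch: rank-size (Zipf) exponent via a log-log fit of rank against size.
# Synthetic Pareto draws stand in for building sizes here.
import numpy as np

rng = np.random.default_rng(0)
sizes = np.sort(rng.pareto(1.5, size=5000) + 1.0)[::-1]  # descending order
ranks = np.arange(1, sizes.size + 1)

slope, intercept = np.polyfit(np.log(sizes), np.log(ranks), 1)
print("rank-size exponent:", -slope)  # ~1.5 for this synthetic sample
```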
Convergence analysis of surrogate-based methods for Bayesian inverse problems
NASA Astrophysics Data System (ADS)
Yan, Liang; Zhang, Yuan-Xiang
2017-12-01
The major challenges in Bayesian inverse problems arise from the need for repeated evaluations of the forward model, as required by Markov chain Monte Carlo (MCMC) methods for posterior sampling. Many attempts at accelerating Bayesian inference have relied on surrogates for the forward model, typically constructed through repeated forward simulations that are performed in an offline phase. Although such approaches can be quite effective at reducing computation cost, there has been little analysis of the effect of the approximation on posterior inference. In this work, we prove error bounds on the Kullback-Leibler (KL) distance between the true posterior distribution and the approximation based on surrogate models. Our rigorous error analysis shows that if the forward model approximation converges at a certain rate in the prior-weighted L2 norm, then the posterior distribution generated by the approximation converges to the true posterior at least two times faster in the KL sense. An error bound on the Hellinger distance is also provided. To provide concrete examples of surrogate-model-based methods, we present an efficient technique for constructing stochastic surrogate models to accelerate Bayesian inference. Christoffel least squares algorithms, based on generalized polynomial chaos, are used to construct a polynomial approximation of the forward solution over the support of the prior distribution. The numerical strategy and the predicted convergence rates are then demonstrated on nonlinear inverse problems involving the inference of parameters appearing in partial differential equations.
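To make the offline-surrogate idea concrete, here is a toy 1D sketch under invented assumptions (a cheap forward model, flat prior, single datum): fit a polynomial surrogate from a modest number of forward runs, then compare the grid-normalized posteriors through their KL divergence:

```python
# Sketch: posterior distortion from replacing a forward model by a polynomial
# surrogate fitted offline. Everything here is a toy setup for illustration.
import numpy as np

G = lambda u: np.sin(np.pi * u) + 0.5 * u**2     # "expensive" forward model
u_grid = np.linspace(-1.0, 1.0, 2001)            # prior support: U(-1, 1)
du = u_grid[1] - u_grid[0]

# Offline phase: degree-8 polynomial surrogate from 50 forward simulations.
u_train = np.linspace(-1.0, 1.0, 50)
G_surr = np.poly1d(np.polyfit(u_train, G(u_train), deg=8))

y_obs, noise = 0.7, 0.1                          # synthetic datum

def posterior(model):
    like = np.exp(-0.5 * ((y_obs - model(u_grid)) / noise) ** 2)
    return like / (like.sum() * du)              # normalize under the flat prior

p_true, p_surr = posterior(G), posterior(G_surr)
kl = np.sum(p_true * np.log(p_true / p_surr)) * du
print("KL(true posterior || surrogate posterior):", kl)
```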
Upper Kalamazoo watershed land cover inventory. [based on remote sensing
NASA Technical Reports Server (NTRS)
Richason, B., III; Enslin, W.
1973-01-01
Approximately 1000 square miles of the eastern portion of the watershed were inventoried based on remote sensing imagery. The classification scheme, imagery and interpretation procedures, and a cost analysis are discussed. The distributions of land cover within the area are tabulated.
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on the epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and verify its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine the multi-parameter analysis with the tail distribution analysis, the modified rescaled range analysis, and the multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
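A sketch of one of the listed diagnostics, classical rescaled range (R/S) analysis for the Hurst exponent; the modified version used in such studies adds finite-sample corrections that are omitted here:

```python
# Sketch: classical R/S estimate of the Hurst exponent for a return series.
# H ~ 0.5 for i.i.d. noise; H > 0.5 indicates long-range dependence.
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    rs = []
    for n in window_sizes:
        vals = []
        for i in range(0, len(x) - n + 1, n):
            c = x[i:i + n]
            z = np.cumsum(c - c.mean())      # cumulative deviations
            r, s = z.max() - z.min(), c.std()
            if s > 0:
                vals.append(r / s)           # rescaled range of this window
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

print(hurst_rs(np.random.default_rng(2).standard_normal(4096)))
```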
NASA Astrophysics Data System (ADS)
Mishchenko, Andrey S.; Matsui, Hiroyuki; Hasegawa, Tatsuo
2012-02-01
We develop an analytical method for the processing of electron spin resonance (ESR) spectra. The goal is to obtain the distributions of trapped carriers over both their degree of localization and their binding energy in semiconductor crystals or films composed of regularly aligned organic molecules [Phys. Rev. Lett. 104, 056602 (2010)]. Our method has two steps. We first carry out a fine analysis of the shape of the ESR spectra due to the trapped carriers; this reveals the distribution of the trap density of states over the degree of localization. This analysis is based on the reasonable assumption that the linewidth of the trapped carriers is predetermined by their degree of localization because of the hyperfine mechanism. We then transform the distribution over the degree of localization into a distribution over the binding energies. The transformation uses the relationships between the binding energies and the localization parameters of the trapped carriers. The particular relation for the system under study is obtained by the Holstein model for trapped polarons using a diagrammatic Monte Carlo analysis. We illustrate the application of the method to pentacene organic thin-film transistors.
Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering
NASA Technical Reports Server (NTRS)
Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)
2001-01-01
Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
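A rough sketch of the processing chain, with a short-time Fourier spectrogram standing in for the wavelet-based time-frequency distribution: form an energy density matrix, take its SVD, and compute moments of the discrete density formed from the singular values:

```python
# Sketch: time-frequency energy density -> SVD -> discrete density over
# singular values -> moments that summarize the dominant features. A
# spectrogram substitutes for the multiscale wavelet distribution here.
import numpy as np
from scipy.signal import spectrogram, chirp

t = np.linspace(0, 10, 8192)
x = chirp(t, f0=5, f1=60, t1=10) \
    + 0.2 * np.random.default_rng(3).standard_normal(t.size)

_, _, Sxx = spectrogram(x, fs=819.2, nperseg=256)   # energy density matrix
U, s, Vt = np.linalg.svd(Sxx, full_matrices=False)

density = s / s.sum()                 # discrete density over singular values
moments = [np.sum(density * np.arange(1, s.size + 1) ** k) for k in (1, 2)]
print("energy in first 3 components:", density[:3].sum(), "moments:", moments)
```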
NASA Astrophysics Data System (ADS)
Haneef, Shahna M.; Srijith, K.; Venkitesh, D.; Srinivasan, B.
2017-04-01
We propose and demonstrate the use of cross recurrence plot analysis (CRPA) to accurately determine the Brillouin shift due to strain and temperature in a Brillouin distributed fiber sensor. This signal processing technique, implemented in Brillouin sensors for the first time, relies on a priori data, i.e., the lineshape of the Brillouin gain spectrum and its similarity with the spectral features measured at different locations along the fiber. Analytical and experimental investigation of the proposed scheme is presented in this paper.
Price Based Local Power Distribution Management System (Local Power Distribution Manager) v1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BROWN, RICHARD E.; CZARNECKI, STEPHEN; SPEARS, MICHAEL
2016-11-28
A trans-active energy micro-grid controller is implemented in the VOLTTRON distributed control platform. The system uses the price of electricity as the mechanism for conducting transactions that are used to manage energy use and to balance supply and demand. In order to allow testing and analysis of the control system, the implementation is designed to run completely as a software simulation, while allowing the inclusion of selected hardware that physically manages power. Equipment to be integrated with the micro-grid controller must have an IP (Internet Protocol)-based network connection and a software "driver" must exist to translate data communications between the device and the controller.
Mathur, Sunil; Sadana, Ajit
2015-12-01
We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
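A hedged illustration in the same spirit (this is a signed-rank-type permutation test, not the authors' distance-measure statistic): ranks of absolute paired differences give a distribution-free p-value:

```python
# Sketch: distribution-free paired comparison using ranks of absolute
# differences in a sign-flip permutation test. Data below are invented.
import numpy as np

def rank_perm_test(x, y, n_perm=10_000, rng=np.random.default_rng(11)):
    d = np.asarray(x) - np.asarray(y)
    ranks = np.argsort(np.argsort(np.abs(d))) + 1.0   # ranks of |differences|
    stat = np.sum(np.sign(d) * ranks)                 # signed-rank-type statistic
    signs = rng.choice([-1.0, 1.0], size=(n_perm, d.size))
    null = (signs * ranks).sum(axis=1)                # null from random sign flips
    return np.mean(np.abs(null) >= abs(stat))         # two-sided p-value

control = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.2, 6.1, 5.0])
treated = np.array([6.2, 5.0, 6.8, 6.0, 5.6, 5.9, 6.5, 5.4])
print("p ~", rank_perm_test(treated, control))
```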
Modelling and analysis of solar cell efficiency distributions
NASA Astrophysics Data System (ADS)
Wasmer, Sven; Greulich, Johannes
2017-08-01
We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.
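A toy sketch of the metamodel Monte Carlo plus variance-based sensitivity step, with an invented quadratic "efficiency" metamodel; first-order sensitivity indices are estimated by binning on each input:

```python
# Sketch: propagate process scatter through a cheap metamodel by Monte Carlo,
# then estimate first-order variance-based sensitivities by conditioning on
# one input at a time. The metamodel below is hypothetical.
import numpy as np

rng = np.random.default_rng(4)

def eta(x):  # invented metamodel: efficiency in % vs. 3 standardized inputs
    return 17.6 + 0.30 * x[..., 0] - 0.20 * x[..., 1] ** 2 + 0.10 * x[..., 2]

X = rng.standard_normal((100_000, 3))     # standardized input scatter
y = eta(X)
print("mean +/- std: %.2f +/- %.2f" % (y.mean(), y.std()))

# First-order indices: Var(E[y | x_j]) / Var(y), estimated by binning on x_j.
for j in range(3):
    edges = np.quantile(X[:, j], np.linspace(0, 1, 51)[1:-1])
    bins = np.digitize(X[:, j], edges)
    labels = np.unique(bins)
    cond_means = np.array([y[bins == b].mean() for b in labels])
    counts = np.array([(bins == b).sum() for b in labels])
    s1 = np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()
    print("S1[x%d] ~ %.2f" % (j, s1))
```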
Bao, Yi; Chen, Yizheng; Hoehler, Matthew S.; Smith, Christopher M.; Bundy, Matthew; Chen, Genda
2016-01-01
This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building code recommended material parameters to enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements, with less than 4.7% average difference at a 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at a 95% confidence level and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C. PMID:28239230
Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming
2008-12-01
One hundred and eighteen surface soil samples were collected from Dongguan City and analyzed for concentrations of Cu, Zn, Ni, Cr, Pb, Cd, As and Hg, as well as pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS techniques. The results indicated that concentrations of Cu, Zn, Ni, Pb, Cd and Hg exceeded the soil background levels of Guangdong Province, with Pb, Cd and Hg greatly exceeding them. Factor analysis grouped Cu, Zn, Ni, Cr and As in Factor 1, Pb and Hg in Factor 2, and Cd in Factor 3. The spatial maps based on geostatistical analysis show a definite association of Factor 1 with the soil parent material, while Factor 2 was mainly affected by industry. The spatial distribution of Factor 3 was attributed to anthropogenic influence.
Kato, Hirotomo; Gomez, Eduardo A.; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa
2016-01-01
A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas. PMID:27410039
NASA Astrophysics Data System (ADS)
Han, Keesook J.; Hodge, Matthew; Ross, Virginia W.
2011-06-01
For monitoring network traffic, there is an enormous cost in collecting, storing, and analyzing network traffic datasets. Data mining based network traffic analysis is of growing interest in the cyber security community, but is computationally expensive for finding correlations between attributes in massive network traffic datasets. To lower the cost and reduce computational complexity, it is desirable to perform feasible statistical processing on effective reduced datasets instead of on the original full datasets. Because of the dynamic behavior of network traffic, traffic traces exhibit mixtures of heavy tailed statistical distributions or overdispersion. Heavy tailed network traffic characterization and visualization are important and essential tasks for measuring network performance for Quality of Service. However, heavy tailed distributions are limited in their ability to characterize real-time network traffic due to the difficulty of parameter estimation. The Entropy-Based Heavy Tailed Distribution Transformation (EHTDT) was developed to convert the heavy tailed distribution into a transformed distribution in order to find a linear approximation. The EHTDT linearization has the advantage of being amenable to characterizing and aggregating overdispersion of network traffic in real time. Results of applying the EHTDT for innovative visual analytics to real network traffic data are presented.
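The EHTDT itself is defined in the paper and is not reproduced here; as a generic illustration of exposing linear structure in a heavy tail, the complementary CDF of a heavy-tailed sample is approximately linear in log-log coordinates:

```python
# Illustration only (not the EHTDT): a heavy-tailed sample is approximately
# linear in log-log complementary-CDF coordinates, the kind of linear
# structure such transformations aim to expose. Flow sizes are synthetic.
import numpy as np

rng = np.random.default_rng(5)
flows = (rng.pareto(1.2, 20_000) + 1.0) * 40.0   # synthetic flow sizes

x = np.sort(flows)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size   # empirical P(X > x)
tail = x > np.quantile(x, 0.9)                   # fit the upper tail only
slope, _ = np.polyfit(np.log(x[tail][:-1]), np.log(ccdf[tail][:-1]), 1)
print("tail exponent estimate:", -slope)         # ~1.2 for this sample
```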
A Review of DIMPACK Version 1.0: Conditional Covariance-Based Test Dimensionality Analysis Package
ERIC Educational Resources Information Center
Deng, Nina; Han, Kyung T.; Hambleton, Ronald K.
2013-01-01
DIMPACK Version 1.0 for assessing test dimensionality based on a nonparametric conditional covariance approach is reviewed. This software was originally distributed by Assessment Systems Corporation and now can be freely accessed online. The software consists of Windows-based interfaces of three components: DIMTEST, DETECT, and CCPROX/HAC, which…
NASA Astrophysics Data System (ADS)
Kim, Jin-Young; Kwon, Hyun-Han; Kim, Hung-Soo
2015-04-01
The existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics in estimating areal rainfall. In this regard, this study aims to develop a hierarchical Bayesian model based nonstationary regional frequency analysis in which spatial patterns of the design rainfall with geographical information (e.g. latitude, longitude and altitude) are explicitly incorporated. This study assumes that the parameters of the Gumbel (or GEV) distribution are a function of geographical characteristics within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the distributions using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model showed similar results to L-moment based regional frequency analysis. In addition, the model showed an advantage in terms of quantifying the uncertainty of the design rainfall and estimating areal rainfall considering geographical information. Finally, a comprehensive discussion on design rainfall in the context of nonstationarity is presented. KEYWORDS: Regional frequency analysis, Nonstationary, Spatial information, Bayesian. Acknowledgement: This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
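As a sketch of the final interpolation step, assuming hypothetical regression coefficients linking Gumbel parameters to latitude, longitude and altitude (not the study's fitted values), the T-year design rainfall at any location follows from the Gumbel quantile function:

```python
# Sketch: location-dependent Gumbel parameters -> T-year design rainfall.
# The regression coefficients below are invented for illustration.
import numpy as np

beta_loc = np.array([40.0, 0.8, -0.5, 0.02])    # intercept, lat, lon, alt (made up)
beta_scl = np.array([12.0, 0.2, -0.1, 0.005])   # same structure for the scale

def design_rainfall(lat, lon, alt, T):
    x = np.array([1.0, lat, lon, alt])
    mu, sigma = beta_loc @ x, beta_scl @ x      # interpolated Gumbel parameters
    return mu - sigma * np.log(-np.log(1.0 - 1.0 / T))  # Gumbel quantile

print(design_rainfall(lat=37.5, lon=127.0, alt=120.0, T=100))  # mm, illustrative
```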
Zhang, Ming-cai; Lü, Si-zhe; Cheng, Ying-wu; Gu, Li-xu; Zhan, Hong-sheng; Shi, Yin-yu; Wang, Xiang; Huang, Shi-rong
2011-02-01
To study the effect of vertebral semi-dislocation on the stress distribution in the facet joints and intervertebral discs of patients with cervical syndrome using a three-dimensional finite element model. A patient with cervical spondylosis, male, 28 years old, diagnosed with cervical vertebra semi-dislocation by dynamic and static palpation and X-ray, was chosen and scanned from C(1) to C(7) by CT at 0.75 mm slice thickness. Based on the CT data, software was used to construct a three-dimensional finite element model of the cervical vertebra semi-dislocation (C(4)-C(6)). Based on the model, virtual manipulation was used to correct the vertebra semi-dislocation, and the stress distribution was analyzed. The finite element analysis showed that the stress distributions of the C(5-6) facet joint and intervertebral disc changed after virtual manipulation. Vertebral semi-dislocation leads to abnormal stress distribution in the facet joint and intervertebral disc.
A New Seismic Hazard Model for Mainland China
NASA Astrophysics Data System (ADS)
Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.
2017-12-01
We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
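A small sketch of the tapered Gutenberg-Richter ingredient, written in seismic moment space with illustrative threshold and corner magnitudes:

```python
# Sketch: tapered Gutenberg-Richter survival function in seismic moment,
# Phi(M) = (Mt/M)^beta * exp((Mt - M)/Mc) for M >= Mt, where the corner
# moment Mc can be constrained by a geodetic moment rate. Numbers are
# illustrative, not the model's fitted values.
import numpy as np

def tgr_survival(M, Mt, beta, Mc):
    return (Mt / M) ** beta * np.exp((Mt - M) / Mc)

def moment_from_mw(mw):                # standard Mw -> seismic moment (N m)
    return 10.0 ** (1.5 * mw + 9.05)

Mt, Mc = moment_from_mw(5.0), moment_from_mw(8.0)
for mw in (6.0, 7.0, 8.0):
    frac = tgr_survival(moment_from_mw(mw), Mt, beta=0.67, Mc=Mc)
    print("fraction of events above Mw %.1f: %.2e" % (mw, frac))
```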
Statistical population based estimates of water ingestion play a vital role in many types of exposure and risk analysis. A significant large scale analysis of water ingestion by the population of the United States was recently completed and is documented in the report titled ...
NASA Astrophysics Data System (ADS)
Yang, Xin; Zhong, Shiquan; Sun, Han; Tan, Zongkun; Li, Zheng; Ding, Meihua
Clouds, like temperature, precipitation and solar radiation, are a very important climatic resource, given their physical characteristics and their importance in agricultural production and the national economy, and cloud cover plays a very important role in agro-climatic division. This paper reviews methods of cloud detection based on MODIS data in China and abroad. The results suggest that the Quanjun He method is suitable for cloud detection in Guangxi, and a map of the state of cloud cover in Guangxi was produced using this method. An approach for calculating the cloud cover rate using frequency spectrum analysis was established, and the cloud cover rate over Guangxi was thereby obtained. Taking Rongxian County, Guangxi as an example, this article presents a preliminary application of the cloud cover rate to the distribution of Rong Shaddock pomelo. The analysis results indicate that the cloud cover rate is closely related to the quality of Rong Shaddock pomelo.
NASA Astrophysics Data System (ADS)
Dona, Obie Mario; Ibrahim, Eddy; Susilo, Budhi Kuswan
2017-11-01
The research objectives are to describe the potential of the deposit; to analyze the quality and quantity of the limestone; and to determine the distribution limits of the rocks based on resistivity values, the distribution pattern of the rocks from drilling, the influence of minerals growing in the rock on resistivity values, the limestone deposition model based on rock resistivity values and drilling, and the comparison between the interpretation of resistivity values and petrographic studies from drilling. The geologic formations of the study area consist of altered sandstone, phyllite, slate, siltstone, greywacke, and limestone inserts, with local quartz sandstone, schist and gneiss; the Mersip Member of the Stylists Formation consists of limestone that formed in shallow seas. The Stylists Formation consists of slate, shale, siltstone and sandstone. The research methodology is quantitative, using experimental observation by survey, and is descriptive-analytical in nature.
CO Component Estimation Based on the Independent Component Analysis
NASA Astrophysics Data System (ADS)
Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki; Takeuchi, Tsutomu T.; Fukui, Yasuo
2014-01-01
Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.
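A minimal sketch of the same kurtosis-driven separation using scikit-learn's FastICA on a synthetic mixture, with toy fields standing in for the CMB and CO maps:

```python
# Sketch: three synthetic "frequency maps" are linear mixtures of a Gaussian
# CMB-like field and a sparse, highly non-Gaussian CO-like field; FastICA
# recovers the components without knowing the mixing matrix, and the CO-like
# one is identified by its large kurtosis.
import numpy as np
from sklearn.decomposition import FastICA
from scipy.stats import kurtosis

rng = np.random.default_rng(6)
cmb = rng.standard_normal(10_000)                              # Gaussian field
co = rng.exponential(1.0, 10_000) * (rng.random(10_000) < 0.05)  # sparse field
A = np.array([[1.0, 0.8], [1.0, 0.3], [1.0, 1.5]])             # unknown mixing
maps = (A @ np.vstack([cmb, co])).T                            # 3 mock channels

sources = FastICA(n_components=2, random_state=0).fit_transform(maps)
print("kurtosis of recovered components:", kurtosis(sources, axis=0))
```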
Fornasaro, Stefano; Vicario, Annalisa; De Leo, Luigina; Bonifacio, Alois; Not, Tarcisio; Sergo, Valter
2018-05-14
Raman hyperspectral imaging is an emerging practice in biological and biomedical research for label-free analysis of tissues and cells. Using this method, both the spatial distribution and spectral information of analyzed samples can be obtained. The current study reports the first Raman microspectroscopic characterisation of colon tissues from patients with Coeliac Disease (CD). The aim was to assess whether Raman imaging coupled with hyperspectral multivariate image analysis is capable of detecting the alterations in the biochemical composition of intestinal tissues associated with CD. The analytical approach was based on a multi-step methodology: duodenal biopsies from healthy and coeliac patients were measured and processed with Multivariate Curve Resolution Alternating Least Squares (MCR-ALS). From the distribution maps and the pure spectra of the image constituents obtained by MCR-ALS, interesting biochemical differences between healthy and coeliac patients were derived. Notably, a reduced distribution of complex lipids in the pericryptic space, and a different distribution and abundance of proteins rich in beta-sheet structures, were found in CD patients. The output of the MCR-ALS analysis was then used as a starting point for two clustering algorithms (k-means clustering and hierarchical clustering). Both methods converged to similar results, providing precise segmentation over multiple Raman images of the studied tissues.
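A bare-bones sketch of the MCR-ALS decomposition itself, with non-negativity enforced by simple clipping; real analyses use dedicated, better-constrained implementations:

```python
# Sketch: factor a matrix of spectra D (pixels x wavenumbers) into
# concentration maps C and pure component spectra S, D ~ C @ S.T, by
# alternating non-negative least squares. Synthetic data for illustration.
import numpy as np

def mcr_als(D, k, n_iter=100, rng=np.random.default_rng(0)):
    C = rng.random((D.shape[0], k))
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S  # C: distribution maps (flattened pixels), S: pure spectra

rng = np.random.default_rng(10)
S_true = np.abs(rng.standard_normal((300, 2)))   # 300 wavenumber channels
C_true = np.abs(rng.standard_normal((400, 2)))   # 400 pixels
D = C_true @ S_true.T + 0.01 * rng.standard_normal((400, 300))

C, S = mcr_als(D, k=2)
print(C.shape, S.shape)  # (400, 2) and (300, 2)
```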
Song, Shanshan; Liu, Xuehua; Bai, Xueliang; Jiang, Yanbin; Zhang, Xianzhou; Yu, Chengqun; Shao, Xiaoming
2015-01-01
Tibet makes up the majority of the Qinghai-Tibet Plateau, often referred to as the roof of the world. Its complex landforms, physiognomy, and climate create a special heterogeneous environment for mosses. Each moss species inhabits its own habitat and ecological niche. This, in combination with its sensitivity to environmental change, makes moss species distribution a useful indicator of vegetation alteration and climate change. This study aimed to characterize the diversity and distribution of Didymodon (Pottiaceae) in Tibet, and model the potential distribution of its species. A total of 221 sample plots, each with a size of 10 × 10 m and located at different altitudes, were investigated across all vegetation types. Of these, the 181 plots in which Didymodon species were found were used to conduct analyses and modeling. Three noteworthy results were obtained. First, a total of 22 species of Didymodon were identified. Among these, Didymodon rigidulus var. subulatus had not previously been recorded in China, and Didymodon constrictus var. constrictus was the dominant species. Second, analysis of the relationships between species distributions and environmental factors using canonical correspondence analysis revealed that vegetation cover and altitude were the main factors affecting the distribution of Didymodon in Tibet. Third, based on the environmental factors of bioclimate, topography and vegetation, the distribution of Didymodon was predicted throughout Tibet at a spatial resolution of 1 km, using the presence-only MaxEnt model. Climatic variables were the key factors in the model. We conclude that the environment plays a significant role in moss diversity and distribution. Based on our research findings, we recommend that future studies should focus on the impacts of climate change on the distribution and conservation of Didymodon.
NASA Astrophysics Data System (ADS)
Ghiaei, Farhad; Kankal, Murat; Anilan, Tugce; Yuksek, Omer
2018-01-01
The analysis of rainfall frequency is an important step in hydrology and water resources engineering. However, a lack of measuring stations, short statistical record lengths, and unreliable outliers are among the most important problems when designing hydrology projects. In this study, regional rainfall analysis based on L-moments was used to overcome these problems in the Eastern Black Sea Basin (EBSB) of Turkey. The L-moments technique was applied at all stages of the regional analysis, including determining homogeneous regions and fitting and estimating parameters of appropriate distribution functions in each homogeneous region. We studied annual maximum rainfall heights of various durations (5 min to 24 h) from seven rain gauge stations located in the EBSB in Turkey, with gauging periods of 39 to 70 years. Homogeneity of the region was evaluated using L-moments. Goodness of fit for each candidate distribution was assessed with the ZDIST statistic for several distributions: generalized logistic (GLO), generalized extreme value (GEV), generalized normal (GNO), Pearson type 3 (PE3), and generalized Pareto (GPA). GLO and GEV were found to be the best distributions for short (5 to 30 min) and long (1 to 24 h) durations, respectively. Based on these distribution functions, governing equations were extracted for calculating intensities for return periods (T) of 2, 5, 25, 50, 100, 250, and 500 years. Subsequently, rainfall intensities for these return periods were estimated from the maximum rainfall data, and duration, altitude, latitude, and longitude were used as independent variables in a regression model of the results. The determination coefficient (R²) value indicated that the model yields suitable results for the regional intensity-duration-frequency (IDF) relationship, which is necessary for the design of hydraulic structures in small and medium-sized catchments.
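A sketch of the first computational step of such an analysis: sample L-moments and the L-skewness ratio from probability-weighted moments, the quantities on which the homogeneity and ZDIST statistics are built:

```python
# Sketch: unbiased probability-weighted moments b0, b1, b2 of an ordered
# sample, converted to the first three L-moments and the L-skewness t3.
import numpy as np

def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2 = b0, 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2          # mean, L-scale, L-skewness (t3)

rain = np.random.default_rng(7).gumbel(50, 15, size=60)  # synthetic annual maxima
print(sample_l_moments(rain))       # a Gumbel sample has t3 ~ 0.17
```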
Ali, Syed Shujait; Yu, Yan; Pfosser, Martin; Wetschnig, Wolfgang
2012-01-01
Background and Aims Subfamily Hyacinthoideae (Hyacinthaceae) comprises more than 400 species. Members are distributed in sub-Saharan Africa, Madagascar, India, eastern Asia, the Mediterranean region and Eurasia. Hyacinthoideae, like many other plant lineages, show disjunct distribution patterns. The aim of this study was to reconstruct the biogeographical history of Hyacinthoideae based on phylogenetic analyses, to find the possible ancestral range of Hyacinthoideae and to identify factors responsible for the current disjunct distribution pattern. Methods Parsimony and Bayesian approaches were applied to obtain phylogenetic trees, based on sequences of the trnL-F region. Biogeographical inferences were obtained by applying statistical dispersal-vicariance analysis (S-DIVA) and Bayesian binary MCMC (BBM) analysis implemented in RASP (Reconstruct Ancestral State in Phylogenies). Key Results S-DIVA and BBM analyses suggest that the Hyacinthoideae clade seem to have originated in sub-Saharan Africa. Dispersal and vicariance played vital roles in creating the disjunct distribution pattern. Results also suggest an early dispersal to the Mediterranean region, and thus the northward route (from sub-Saharan Africa to Mediterranean) of dispersal is plausible for members of subfamily Hyacinthoideae. Conclusions Biogeographical analyses reveal that subfamily Hyacinthoideae has originated in sub-Saharan Africa. S-DIVA indicates an early dispersal event to the Mediterranean region followed by a vicariance event, which resulted in Hyacintheae and Massonieae tribes. By contrast, BBM analysis favours dispersal to the Mediterranean region, eastern Asia and Europe. Biogeographical analysis suggests that sub-Saharan Africa and the Mediterranean region have played vital roles as centres of diversification and radiation within subfamily Hyacinthoideae. In this bimodal distribution pattern, sub-Saharan Africa is the primary centre of diversity and the Mediterranean region is the secondary centre of diversity. Sub-Saharan Africa was the source area for radiation toward Madagascar, the Mediterranean region and India. Radiations occurred from the Mediterranean region to eastern Asia, Europe, western Asia and India. PMID:22039008
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dréan, Gaël; Acosta, Oscar, E-mail: Oscar.Acosta@univ-rennes1.fr; Simon, Antoine
2016-06-15
Purpose: Recent studies revealed a trend toward voxelwise population analysis in order to understand the local dose/toxicity relationships in prostate cancer radiotherapy. Such approaches require, however, an accurate interindividual mapping of the anatomies and 3D dose distributions toward a common coordinate system. This step is challenging due to the high interindividual variability. In this paper, the authors propose a method designed for interindividual nonrigid registration of the rectum and dose mapping for population analysis. Methods: The method is based on the computation of a normalized structural description of the rectum using a Laplacian-based model. This description takes advantage of the tubular structure of the rectum and its centerline to be embedded in a nonrigid registration-based scheme. The performance of the method was evaluated on 30 individuals treated for prostate cancer in a leave-one-out cross validation. Results: Performance was measured using classical metrics (Dice score and Hausdorff distance), along with new metrics devised to better assess dose mapping in relation to structural deformation (dose-organ overlap). Considering these scores, the proposed method outperforms intensity-based and distance-map-based registration methods. Conclusions: The proposed method allows for accurately mapping interindividual 3D dose distributions toward a single anatomical template, opening the way for further voxelwise statistical analysis.
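For reference, the Dice overlap used to score the registration, sketched for two binary organ masks on a common voxel grid:

```python
# Sketch: Dice overlap of two binary masks, 2|A∩B| / (|A| + |B|).
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

a = np.zeros((50, 50), bool); a[10:40, 15:35] = True   # template organ mask
b = np.zeros((50, 50), bool); b[12:42, 17:37] = True   # mapped organ mask
print("Dice:", dice(a, b))  # 1.0 means perfect overlap
```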
Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer
2016-01-01
Wasting is a major public health issue throughout the developing world. Of the estimated 6.9 million annual deaths among children under five, over 800,000 (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or, since 2005, low Mid-Upper Arm Circumference (MUAC). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH, but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 cross-sectional nutrition survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) surveys using the Shapiro-Wilk test. Of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different from that of the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size: of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both worked well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised", and 59.7 % after LOESS. Box-Cox power transformation had similar results, with 57 % of such distributions approximating "normal" after transformation. Applying the Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4 and 82.7 %, respectively. This suggests that statistical approaches relying on the normal distribution assumption can be successfully applied to MUAC. In light of this promising finding, further research is ongoing to evaluate the performance of a normal-distribution-based approach to estimating the prevalence of wasting using MUAC.
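A sketch of the test-then-transform chain described above (smoothing steps omitted), on a synthetic, mildly skewed MUAC-like sample:

```python
# Sketch: Shapiro-Wilk normality test, Box-Cox power transformation, re-test.
# The sample is synthetic; real MUAC data would come from survey datasets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
muac = rng.gamma(shape=20, scale=6.5, size=500)   # synthetic, skewed, in mm

w, p_raw = stats.shapiro(muac)                    # test before transformation
transformed, lam = stats.boxcox(muac)             # Box-Cox (requires positive data)
w2, p_bc = stats.shapiro(transformed)             # test after transformation
print("p before/after Box-Cox: %.3f / %.3f (lambda=%.2f)" % (p_raw, p_bc, lam))
```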
Kapucu, Fikret E.; Välkki, Inkeri; Mikkonen, Jarno E.; Leone, Chiara; Lenk, Kerstin; Tanskanen, Jarno M. A.; Hyttinen, Jari A. K.
2016-01-01
Synchrony and asynchrony are essential aspects of the functioning of interconnected neuronal cells and networks. New information on neuronal synchronization can be expected to aid in understanding these systems. Synchronization provides insight into the functional connectivity and the spatial distribution of the information processing in the networks. Synchronization is generally studied with time domain analysis of neuronal events, or using direct frequency spectrum analysis, e.g., in specific frequency bands. However, these methods have their pitfalls. Thus, we have previously proposed a method to analyze temporal changes in the complexity of the frequency of signals originating from different network regions. The method is based on the correlation of time-varying spectral entropies (SEs). SE assesses the regularity, or complexity, of a time series by quantifying the uniformity of the frequency spectrum distribution. It has previously been employed, e.g., in electroencephalogram analysis. Here, we revisit our correlated spectral entropy method (CorSE), providing evidence of its justification, usability, and benefits. CorSE is assessed with simulations and in vitro microelectrode array (MEA) data. CorSE is first demonstrated with a specifically tailored toy simulation to illustrate how it can identify synchronized populations. To provide a form of validation, the method was tested with simulated data from integrate-and-fire model based computational neuronal networks. To demonstrate the analysis of real data, CorSE was applied to in vitro MEA data measured from rat cortical cell cultures, and the results were compared with three known event-based synchronization measures. Finally, we show the usability by tracking the development of networks in dissociated mouse cortical cell cultures. The results show that temporal correlations in frequency spectrum distributions reflect the network relations of neuronal populations. In the simulated data, CorSE unraveled the synchronizations. With the real in vitro MEA data, CorSE produced biologically plausible results. Since CorSE analyses continuous data, it is not affected by possibly poor spike or other event detection quality. We conclude that CorSE can reveal neuronal network synchronization based on in vitro MEA field potential measurements. CorSE is expected to be equally applicable to the analysis of corresponding in vivo and ex vivo data. PMID:27803660
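A compact sketch of the CorSE idea under simplifying assumptions (Welch spectra in fixed windows, two channels sharing an on/off narrowband drive): correlate the two windowed spectral-entropy time series:

```python
# Sketch: correlated spectral entropies of two channels. Spectral entropy is
# the Shannon entropy of the normalized power spectrum, here normalized to
# [0, 1]; the synthetic channels share a common narrowband burst drive.
import numpy as np
from scipy.signal import welch

def spectral_entropy(seg, fs):
    f, pxx = welch(seg, fs=fs, nperseg=min(256, len(seg)))
    p = pxx / pxx.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(p.size)

def corse_like(x, y, fs, win):
    n = len(x) // win
    se_x = [spectral_entropy(x[i * win:(i + 1) * win], fs) for i in range(n)]
    se_y = [spectral_entropy(y[i * win:(i + 1) * win], fs) for i in range(n)]
    return np.corrcoef(se_x, se_y)[0, 1]

rng = np.random.default_rng(9)
t = np.arange(50_000) / 1000.0
burst = (np.sin(2 * np.pi * 0.05 * t) > 0).astype(float)  # shared on/off drive
x = rng.standard_normal(t.size) + 3 * burst * np.sin(2 * np.pi * 40 * t)
y = rng.standard_normal(t.size) + 3 * burst * np.sin(2 * np.pi * 25 * t)
print("CorSE-like correlation:", corse_like(x, y, fs=1000.0, win=1000))
```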
Kouloulias, Vassilis; Karanasiou, Irene; Giamalaki, Melina; Matsopoulos, George; Kouvaris, John; Kelekis, Nikolaos; Uzunoglu, Nikolaos
2015-02-01
A hyperthermia system using a folded loop antenna applicator at 27 MHz for soft tissue treatment was investigated both theoretically and experimentally to evaluate its clinical value. The electromagnetic analysis of a 27-MHz folded loop antenna for use in human tissue was based on a customised software tool and led to the design and development of the proposed hyperthermia system. The system was experimentally validated using specific absorption rate (SAR) distribution estimations obtained from temperature distribution measurements of a muscle tissue phantom after electromagnetic exposure. Various scenarios for optimal antenna positioning were also examined. Comparison of the theoretical and experimental analysis results shows satisfactory agreement. The 50% SAR level reaches a depth of 8 cm in the tissue phantom. Thus, based on the maximum observed SAR values, which were of the order of 100 W/kg, the antenna specified is suitable for deep tumour heating. Theoretical and experimental SAR distribution results derived from this study are in agreement. The proposed folded loop antenna seems appropriate for use in hyperthermia treatment, achieving proper planning and local treatment of deeply seated affected areas and lesions.
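The link between the temperature maps and the quoted SAR values is the standard initial-rise relation SAR = c dT/dt; a small sketch with hypothetical phantom readings:

```python
# Sketch: SAR from the initial temperature rise of a phantom, SAR = c*dT/dt,
# with c the specific heat capacity. The readings below are hypothetical.
import numpy as np

c_muscle = 3500.0                               # J/(kg K), typical muscle phantom
times = np.array([0.0, 10.0, 20.0, 30.0])       # s, early in the exposure
temps = np.array([37.00, 37.28, 37.55, 37.84])  # deg C at one measurement point

dT_dt = np.polyfit(times, temps, 1)[0]          # initial slope, K/s
print("SAR ~ %.0f W/kg" % (c_muscle * dT_dt))   # of the order of 100 W/kg
```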
iMARS--mutation analysis reporting software: an analysis of spontaneous cII mutation spectra.
Morgan, Claire; Lewis, Paul D
2006-01-31
The sensitivity of any mutational assay is determined by the level at which spontaneous mutations occur in the corresponding untreated controls. Establishing the type and frequency at which mutations occur naturally within a test system is essential if one is to draw scientifically sound conclusions regarding chemically induced mutations. Currently, mutation-spectrum analysis is laborious and time-consuming. Thus, we have developed iMARS, a comprehensive mutation-spectrum analysis package that utilises routinely used methodologies and visualisation tools. To demonstrate the use and capabilities of iMARS, we have analysed the distribution, types and sequence context of spontaneous base substitutions derived from the cII gene mutation assay in transgenic animals. Analysis of spontaneous mutation spectra revealed variation both within and between the transgenic rodent test systems Big Blue Mouse, MutaMouse and Big Blue Rat. The most common spontaneous base substitutions were G:C-->A:T transitions and G:C-->T:A transversions. All Big Blue Mouse spectra were significantly different from each other by distribution and nearly all by mutation type, whereas the converse was true for the other test systems. Twenty-eight mutation hotspots were observed across all spectra, generally occurring in CG, GA/TC, GG and GC dinucleotides. A mutation hotspot at nucleotide 212 occurred at a higher frequency in MutaMouse and Big Blue Rat. In addition, CG dinucleotides were the most mutable in all spectra except two Big Blue Mouse spectra. Thus, spontaneous base-substitution spectra showed more variation in distribution, type and sequence context in Big Blue Mouse than in spectra derived from MutaMouse and Big Blue Rat. The results of our analysis provide a baseline reference for mutation studies utilising the cII gene in transgenic rodent models. The potential differences in spontaneous base-substitution spectra should be considered when making comparisons between these test systems. The ease with which iMARS has allowed us to carry out an exhaustive investigation of mutation distribution, mutation type, strand bias, target sequences and motifs, as well as to predict mutation hotspots, makes it a valuable tool in helping to distinguish true chemically induced hotspots from background mutations and in giving a true reflection of mutation frequency.
nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kusina, A.; Kovarik, Karol; Jezo, T.
2015-09-01
We present the first official release of the nCTEQ nuclear parton distribution functions with errors. The main addition to the previous nCTEQ PDFs is the introduction of PDF uncertainties based on the Hessian method. Another important addition is the inclusion of pion production data from RHIC that give us a handle on constraining the gluon PDF. This contribution summarizes our results from arXiv:1509.00792 and concentrates on the comparison with other groups providing nuclear parton distributions.
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
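The likelihood construction described above — density terms for observed failures plus survival terms for type I (time-censored) suspensions — can be sketched as follows. This is a minimal illustration with made-up failure times, not the NASA software itself.

```python
# Sketch: two-parameter Weibull MLE with type I (time) censoring.
# Failure and suspension times below are hypothetical.
import numpy as np
from scipy.optimize import minimize

failures = np.array([97.0, 130.0, 205.0, 260.0, 320.0])  # observed failures
censored = np.array([350.0, 350.0, 350.0])               # still running at cutoff

def neg_log_lik(params):
    beta, eta = params                                    # shape, scale
    if beta <= 0 or eta <= 0:
        return np.inf
    # log f(t) for failures, log S(t) for censored suspensions
    log_f = (np.log(beta / eta) + (beta - 1) * np.log(failures / eta)
             - (failures / eta) ** beta)
    log_S = -(censored / eta) ** beta
    return -(log_f.sum() + log_S.sum())

res = minimize(neg_log_lik, x0=[1.5, 250.0], method="Nelder-Mead")
beta_hat, eta_hat = res.x
print(f"shape = {beta_hat:.2f}, scale = {eta_hat:.1f}")
```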
NASA Astrophysics Data System (ADS)
Wang, Hao; Zhong, Guoxin
2018-03-01
Optical communication networks are the mainstream communication technique for distribution automation, and self-healing technologies can significantly improve their reliability. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access layer, the backbone layer and the core layer of optical communication networks for distribution automation. Based on this comparative analysis, the paper gives application suggestions for these self-healing technologies.
Study of heavy-ion induced fission for heavy-element synthesis
NASA Astrophysics Data System (ADS)
Nishio, K.; Ikezoe, H.; Hofmann, S.; Heßberger, F. P.; Ackermann, D.; Antalic, S.; Aritomo, Y.; Comas, V. F.; Düllman, Ch. E.; Gorshkov, A.; Graeger, R.; Heinz, S.; Heredia, J. A.; Hirose, K.; Khuyagbaatar, J.; Kindler, B.; Kojouharov, I.; Lommel, B.; Makii, H.; Mann, R.; Mitsuoka, S.; Nagame, Y.; Nishinaka, I.; Ohtsuki, T.; Popeko, A. G.; Saro, S.; Schädel, M.; Türler, A.; Wakabayashi, Y.; Watanabe, Y.; Yakushev, A.; Yeremin, A. V.
2014-03-01
Fission fragment mass distributions were measured in heavy-ion induced fission using a 238U target nucleus. The measured mass distributions changed drastically with incident energy. The results are explained by a change in the ratio between fusion and quasifission with nuclear orientation. A calculation based on a fluctuation-dissipation model reproduced the mass distributions and their incident energy dependence. The fusion probability was determined in the analysis, and the values were consistent with those determined from the evaporation residue cross sections.
The calculation of downwash behind supersonic wings with an application to triangular plan forms
NASA Technical Reports Server (NTRS)
Lomax, Harvard; Sluder, Loma; Heaslet, Max A
1950-01-01
A method is developed consistent with the assumptions of small perturbation theory which provides a means of determining the downwash behind a wing in supersonic flow for a known load distribution. The analysis is based upon the use of supersonic doublets which are distributed over the plan form and wake of the wing in a manner determined from the wing loading. The equivalence in subsonic and supersonic flow of the downwash at infinity corresponding to a given load distribution is proved.
Double Fourier analysis for Emotion Identification in Voiced Speech
NASA Astrophysics Data System (ADS)
Sierra-Sosa, D.; Bastidas, M.; Ortiz P., D.; Quintero, O. L.
2016-04-01
We propose a novel analysis alternative, based on two Fourier transforms, for emotion recognition from speech. Fourier analysis allows different signals to be displayed and synthesized in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier transform with Gaussian windows; this spectrogram portrays frequency-related features, such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The time-frequency representation from the spectrogram is then treated as an image and processed through a two-dimensional Fourier transform to perform spatial Fourier analysis on it. Finally, features related to emotions in voiced speech are extracted and presented.
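A minimal sketch of the two-stage pipeline just described: a Gaussian-windowed short-time Fourier transform produces the spectrogram, which is then treated as an image and passed through a 2-D FFT. The chirp test signal, sampling rate and window sizes are assumptions for illustration, not the authors' settings.

```python
# Two-stage Fourier analysis: Gaussian-window STFT, then 2-D FFT of the
# spectrogram treated as an image. A chirp stands in for voiced speech.
import numpy as np
from scipy.signal import stft, get_window, chirp

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
x = chirp(t, f0=120, t1=1.0, f1=400)            # toy "voiced" signal

win = get_window(("gaussian", 64), 256)          # Gaussian analysis window
f, tt, Z = stft(x, fs=fs, window=win, nperseg=256, noverlap=192)
spec = np.abs(Z) ** 2                            # power spectrogram

F2 = np.fft.fftshift(np.fft.fft2(spec))          # spatial Fourier analysis
features = np.abs(F2)                            # magnitudes feed the extractor
print(spec.shape, features.shape)
```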
Observed and theoretical variations of atmospheric ozone
NASA Technical Reports Server (NTRS)
London, J.
1976-01-01
Results are summarized from three areas of ozone research: (1) continued analysis of the global distribution of total ozone to extend the global ozone atlas to cover 15 years (1957-72) of ground-based observations; (2) analysis of balloon-borne ozonesonde observations for Arosa, Switzerland, and Hohenpeissenberg, Germany (FRG); (3) continued processing of Orbiting Geophysical Observatory (OGO-4) satellite data to complete the analysis of the stratospheric ozone distribution from the available OGO-4 data. Analysis of the total ozone observations indicated that the long-term ozone variations have marked regional patterns and tend to alternate with season and hemisphere. It is becoming increasingly clear that these long-period changes are associated with large-scale variations in the general upper-atmosphere circulation patterns.
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
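To see why the mean-based second-moment summary can fail for non-monotonic performance functions (the case the AMV method addresses), compare it with a Monte Carlo reference on a toy function; this sketch illustrates the motivation only and is not the AMV algorithm itself.

```python
# Mean-based first-order second-moment estimate vs Monte Carlo for the
# non-monotonic performance function g(X) = (X - 1)^2, X ~ N(1, 0.5^2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 0.5

g = lambda x: (x - 1.0) ** 2
dg = lambda x: 2.0 * (x - 1.0)

# First-order expansion at the mean: degenerate here because dg(mu) = 0.
mv_mean, mv_std = g(mu), abs(dg(mu)) * sigma

mc = g(rng.normal(mu, sigma, 100_000))          # Monte Carlo reference
print("second-moment estimate:", mv_mean, mv_std)      # 0.0, 0.0
print("Monte Carlo mean/std:  ", mc.mean(), mc.std())  # ~0.25, ~0.35
```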
Remote rainfall sensing for landslide hazard analysis
Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay
2001-01-01
Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.
Payo, Dioli Ann; Leliaert, Frederik; Verbruggen, Heroen; D'hondt, Sofie; Calumpong, Hilconida P.; De Clerck, Olivier
2013-01-01
We investigated species diversity and distribution patterns of the marine red alga Portieria in the Philippine archipelago. Species boundaries were tested based on mitochondrial, plastid and nuclear encoded loci, using a general mixed Yule-coalescent (GMYC) model-based approach and a Bayesian multilocus species delimitation method. The outcome of the GMYC analysis of the mitochondrial encoded cox2-3 dataset was highly congruent with the multilocus analysis. In stark contrast with the current morphology-based assumption that the genus includes a single, widely distributed species in the Indo-West Pacific (Portieria hornemannii), DNA-based species delimitation resulted in the recognition of 21 species within the Philippines. Species distributions were found to be highly structured, with most species restricted to island groups within the archipelago. These extremely narrow species ranges and high levels of intra-archipelagic endemism contrast with the widely held belief that marine organisms generally have large geographical ranges and that endemism is at most restricted to the archipelagic level. Our results indicate that speciation in the marine environment may occur at spatial scales smaller than 100 km, comparable with some terrestrial systems. Our finding of fine-scale endemism has important consequences for marine conservation and management. PMID:23269854
NASA Astrophysics Data System (ADS)
Langousis, Andreas; Kaleris, Vassilios; Xeygeni, Vagia; Magkou, Foteini
2017-04-01
Assessing the availability of groundwater reserves at a regional level requires accurate and robust hydraulic head estimation at multiple locations of an aquifer. To that end, one needs groundwater observation networks that can provide sufficient information to estimate the hydraulic head at unobserved locations. The density of such networks is largely influenced by the spatial distribution of the hydraulic conductivity in the aquifer, and it is usually determined through trial and error, by solving the groundwater flow for a properly selected set of alternative but physically plausible geologic structures. In this work, we use (a) dimensional analysis and (b) a pulse-based stochastic model for simulation of synthetic aquifer structures, to calculate the distribution of the absolute error in hydraulic head estimation as a function of the standardized distance from the nearest measuring locations. The resulting distributions are shown to encompass all possible small-scale structural dependencies, exhibiting characteristics (bounds, multi-modal features, etc.) that can be explained using simple geometric arguments. The obtained results are promising, pointing toward the establishment of design criteria based on large-scale geologic maps.
Likelihood-based confidence intervals for estimating floods with given return periods
NASA Astrophysics Data System (ADS)
Martins, Eduardo Sávio P. R.; Clarke, Robin T.
1993-06-01
This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
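A compact sketch of the likelihood-based interval idea for the Gumbel case: fix the T-year quantile, profile the likelihood over the remaining parameter, and keep quantile values whose deviance stays under the chi-square cutoff. The synthetic record, return period and search bounds are assumptions; the paper's constrained Nelder-Mead details are not reproduced.

```python
# Profile-likelihood 95% CI for a Gumbel T-year quantile (sketch).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gumbel_r, chi2

x = gumbel_r.rvs(loc=100, scale=30, size=40,
                 random_state=np.random.default_rng(1))  # annual maxima
T = 50
k = -np.log(-np.log(1 - 1 / T))      # Gumbel quantile: x_T = mu + k * beta

def loglik(mu, beta):
    return gumbel_r.logpdf(x, loc=mu, scale=beta).sum()

mu_hat, beta_hat = gumbel_r.fit(x)   # unconstrained MLE
xT_hat = mu_hat + k * beta_hat
L_max = loglik(mu_hat, beta_hat)

def profile(q):                      # fix x_T = q, maximize over beta
    r = minimize_scalar(lambda b: -loglik(q - k * b, b),
                        bounds=(1e-3, 200.0), method="bounded")
    return -r.fun

crit = chi2.ppf(0.95, df=1) / 2
qs = np.linspace(xT_hat - 60, xT_hat + 100, 400)
inside = [q for q in qs if L_max - profile(q) <= crit]
print(f"x_T = {xT_hat:.1f}, 95% CI = [{min(inside):.1f}, {max(inside):.1f}]")
```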
Li, Heng; Cheng, Hui-Yu; Chen, Wei-Liang; Huang, Yi-Hsin; Li, Chi-Kang; Chang, Chiao-Yun; Wu, Yuh-Renn; Lu, Tien-Chang; Chang, Yu-Ming
2017-01-01
We performed depth-resolved PL and Raman spectral mappings of a GaN-based LED structure grown on a patterned sapphire substrate (PSS). Our results showed that the Raman mapping at the PSS-GaN heterointerface and the PL mapping in the InxGa1−xN/GaN MQWs active layer are spatially correlated. Based on the 3D construction of the E2(high) Raman peak intensity and frequency shift, V-shaped pits in the MQWs can be traced down to dislocations originating in the cone-tip area of the PSS. Detailed analysis of the PL peak distribution further revealed that the indium composition in the MQWs is related to the residual strain propagating from the PSS-GaN heterointerface toward the LED surface. Numerical simulation based on the indium composition distribution also yielded a radiative recombination rate distribution that agrees with the experimental PL intensity distribution in the InxGa1−xN/GaN MQWs active layer. PMID:28358119
Pion distribution amplitude and quasidistributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radyushkin, Anatoly V.
2017-03-27
We extend our analysis of quasidistributions onto the pion distribution amplitude. Using the formalism of parton virtuality distribution amplitudes, we establish a connection between the pion transverse momentum dependent distribution amplitude Ψ(x, k⊥²) and the pion quasidistribution amplitude (QDA) Qπ(y, p₃). We build models for the QDAs from the virtuality-distribution-amplitude-based models for soft transverse momentum dependent distribution amplitudes, and analyze the p₃ dependence of the resulting QDAs. As there are many models claimed to describe the primordial shape of the pion distribution amplitude, we present the p₃-evolution patterns for models producing some popular proposals: the Chernyak-Zhitnitsky, flat, and asymptotic distribution amplitudes. Finally, our results may be used as a guide for future studies of the pion distribution amplitude on the lattice using the quasidistribution approach.
Modeling and experiments of the adhesion force distribution between particles and a surface.
You, Siming; Wan, Man Pun
2014-06-17
Due to the existence of surface roughness in real surfaces, the adhesion force between particles and the surface where the particles are deposited exhibits certain statistical distributions. Despite the importance of adhesion force distribution in a variety of applications, the current understanding of modeling adhesion force distribution is still limited. In this work, an adhesion force distribution model based on integrating the root-mean-square (RMS) roughness distribution (i.e., the variation of RMS roughness on the surface in terms of location) into recently proposed mean adhesion force models was proposed. The integration was accomplished by statistical analysis and Monte Carlo simulation. A series of centrifuge experiments were conducted to measure the adhesion force distributions between polystyrene particles (146.1 ± 1.99 μm) and various substrates (stainless steel, aluminum and plastic, respectively). The proposed model was validated against the measured adhesion force distributions from this work and another previous study. Based on the proposed model, the effect of RMS roughness distribution on the adhesion force distribution of particles on a rough surface was explored, showing that both the median and standard deviation of adhesion force distribution could be affected by the RMS roughness distribution. The proposed model could predict both van der Waals force and capillary force distributions and consider the multiscale roughness feature, greatly extending the current capability of adhesion force distribution prediction.
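The integration idea can be mimicked by Monte Carlo: draw local RMS roughness values from an assumed spatial distribution and push each draw through a mean-adhesion-force model. The lognormal roughness spread and the attenuation law below are placeholders for illustration, not the authors' models.

```python
# Monte Carlo: RMS roughness distribution -> adhesion force distribution.
import numpy as np

rng = np.random.default_rng(2)
A_H = 1e-19          # Hamaker constant [J] (assumed)
R = 73e-6            # particle radius [m] (~146 um diameter particles)
z0 = 4e-10           # minimum separation [m] (typical value)

F_smooth = A_H * R / (6 * z0**2)     # smooth-contact van der Waals force

# RMS roughness varies over the surface; a lognormal spread is assumed.
rms = rng.lognormal(mean=np.log(50e-9), sigma=0.5, size=100_000)

# Placeholder attenuation: rougher spots -> larger effective separation.
F = F_smooth / (1 + rms / z0) ** 2

print(f"median = {np.median(F):.3e} N, "
      f"log-std factor = {np.exp(np.std(np.log(F))):.2f}")
```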
ERIC Educational Resources Information Center
Zawacki-Richter, Olaf; Alturki, Uthman; Aldraiweesh, Ahmed
2017-01-01
This paper presents a review of distance education literature published in the "International Review of Research in Open and Distance/Distributed Learning" (IRRODL) to describe the status thereof and to identify gaps and priority areas in distance education research based on a validated classification of research areas. All articles (N =…
Holocene Changes in the Distribution and Abundance of Oaks in California
Roger Byrne; Eric Edlund; Scott Mensing
1991-01-01
Our knowledge of the long-term history of oaks is primarily based on biogeographical analysis of anomalous distribution patterns and paleobotanical macrofossil evidence. Neither of these provide a continuous record of change. In this paper, we present fossil pollen evidence which records significant changes in oak abundance over the last 10,000 years. Between 10,000-5,...
S. Wang; Z. Zhang; G. Sun; P. Strauss; J. Guo; Y. Tang; A. Yao
2012-01-01
Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped...
NASA Astrophysics Data System (ADS)
García, T.; Velo, A.; Fernandez-Bastero, S.; Gago-Duport, L.; Santos, A.; Alejo, I.; Vilas, F.
2005-02-01
This paper examines the linkages between the space-distribution of grain sizes and the relative percentage of the amount of mineral species that result from the mixing process of siliciclastic and carbonate sediments at the Ria de Vigo (NW of Spain). The space-distribution of minerals was initially determined, starting from a detailed mineralogical study based on XRD-Rietveld analysis of the superficial sediments. Correlations between the maps obtained for grain sizes, average fractions of either siliciclastic or carbonates, as well as for individual-minerals, were further stabilised. From this analysis, spatially organized patterns were found between carbonates and several minerals involved in the siliciclastic fraction. In particular, a coupled behaviour is observed between plagioclases and carbonates, in terms of their relative percentage amounts and the grain size distribution. In order to explain these results a conceptual model is proposed, based on the interplay between chemical processes at the seawater-sediment interface and hydrodynamical factors. This model suggests the existence of chemical control mechanisms that, by selective processes of dissolution-crystallization, constrain the mixed environment's long-term evolution, inducing the formation of self-organized sedimentary patterns.
Distributed memory parallel Markov random fields using graph partitioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heinemann, C.; Perciano, T.; Ushizima, D.
Markov random fields (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
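The data-distribution pattern (not MPI-PMRF itself) can be sketched with mpi4py: the root rank scatters row strips of the image, each rank runs its local step, and the root gathers the labeled pieces. Simple thresholding stands in for the per-strip MRF optimization.

```python
# Run with: mpiexec -n 4 python strips.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

strips = None
if rank == 0:
    image = np.random.default_rng(0).random((512, 512))
    strips = np.array_split(image, size, axis=0)   # one strip per rank

strip = comm.scatter(strips, root=0)

labels = (strip > 0.5).astype(np.uint8)            # stand-in for MRF step

pieces = comm.gather(labels, root=0)
if rank == 0:
    print("segmented image:", np.vstack(pieces).shape)
```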
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (called DCFRM) is proposed, integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM streamlines probabilistic analysis for multi-failure structures and improves computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.
Robustness of fit indices to outliers and leverage observations in structural equation modeling.
Yuan, Ke-Hai; Zhong, Xiaoling
2013-06-01
Normal-distribution-based maximum likelihood (NML) is the most widely used method in structural equation modeling (SEM), although practical data tend to be nonnormally distributed. The effect of nonnormally distributed data or data contamination on the normal-distribution-based likelihood ratio (LR) statistic is well understood due to many analytical and empirical studies. In SEM, fit indices are used as widely as the LR statistic. In addition to NML, robust procedures have been developed for more efficient and less biased parameter estimates with practical data. This article studies the effect of outliers and leverage observations on fit indices following NML and two robust methods. Analysis and empirical results indicate that good leverage observations following NML and one of the robust methods lead most fit indices to give more support to the substantive model. While outliers tend to make a good model superficially bad according to many fit indices following NML, they have little effect on those following the two robust procedures. Implications of the results to data analysis are discussed, and recommendations are provided regarding the use of estimation methods and interpretation of fit indices. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Analysis of DNS cache effects on query distribution.
Wang, Zheng
2013-01-01
This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out malformed DNS queries, in six categories, to remove log data pollution. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is inferred quantitatively.
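The basic cache effect is easy to simulate: with Poisson client arrivals (an assumption) and a fixed TTL, only the first query after each expiry reaches the TLD server, so the upstream load is roughly one query per TTL window.

```python
# Toy resolver-cache simulation: hit ratio seen by the TLD server.
import numpy as np

rng = np.random.default_rng(3)
rate = 5.0          # client queries per second for one name
ttl = 60.0          # record TTL [s]
horizon = 3600.0    # simulated hour

t, expiry, upstream, total = 0.0, -1.0, 0, 0
while t < horizon:
    t += rng.exponential(1.0 / rate)   # next client query
    total += 1
    if t >= expiry:                    # cache miss: query the TLD server
        upstream += 1
        expiry = t + ttl

print(f"queries={total}, reached TLD={upstream}, "
      f"hit ratio={1 - upstream / total:.3f}")
```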
NASA Astrophysics Data System (ADS)
WANG, Qingrong; ZHU, Changfeng
2017-06-01
Integration of distributed heterogeneous data sources is a key issue in big data applications. In this paper, the variable precision strategy is introduced into the concept lattice, and a one-to-one mapping between the variable precision concept lattice and the ontology concept lattice is constructed to produce a local ontology by building a variable precision concept lattice for each subsystem; a distributed generation algorithm for variable precision concept lattices over heterogeneous ontology databases is then proposed, drawing on the close relationship between concept lattices and ontology construction. Finally, based on the main concept lattice generated from the existing heterogeneous databases, a case study is carried out to test the feasibility and validity of the algorithm, and the differences between the main concept lattice and the standard concept lattice are compared. The analysis results show that the algorithm can automatically carry out the construction of distributed concept lattices over heterogeneous data sources.
Structure and Soot Properties of Nonbuoyant Ethylene/Air Laminar Jet Diffusion Flames. Appendix I
NASA Technical Reports Server (NTRS)
Urban, D. L.; Yuan, Z.-G.; Sunderland, P. B.; Linteris, G. T.; Voss, J. E.; Lin, K.-C.; Dai, Z.; Sun, K.; Faeth, G. M.; Ross, Howard D. (Technical Monitor)
2000-01-01
The structure and soot properties of round, soot-emitting, nonbuoyant, laminar jet diffusion flames are described, based on long-duration (175-230 s) experiments at microgravity carried out on orbit in the Space Shuttle Columbia. Experimental conditions included ethylene-fueled flames burning in still air at nominal pressures of 50 and 100 kPa and an ambient temperature of 300 K, with luminous flame lengths of 49-64 mm. Measurements included luminous flame shapes using color video imaging, soot concentration (volume fraction) distributions using deconvoluted laser extinction imaging, soot temperature distributions using deconvoluted multiline emission imaging, gas temperature distributions at fuel-lean (plume) conditions using thermocouple probes, soot structure distributions using thermophoretic sampling and analysis by transmission electron microscopy, and flame radiation using a radiometer. The present flames were larger, and emitted soot more readily, than comparable flames observed during ground-based microgravity experiments, due to the closer approach to steady conditions resulting from the longer test times and the reduced gravitational disturbances of the space-based experiments.
A note on generalized Genome Scan Meta-Analysis statistics
Koziol, James A; Feng, Anne C
2005-01-01
Background: Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies can be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results: We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the effect of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion: Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930
An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.
2009-07-01
A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
Distributed Secure Coordinated Control for Multiagent Systems Under Strategic Attacks.
Feng, Zhi; Wen, Guanghui; Hu, Guoqiang
2017-05-01
This paper studies a distributed secure consensus tracking control problem for multiagent systems subject to strategic cyber attacks modeled by a random Markov process. A hybrid stochastic secure control framework is established for designing a distributed secure control law such that mean-square exponential consensus tracking is achieved. A connectivity restoration mechanism is considered, and properties of the attack frequency and attack length rate are investigated, respectively. Based on the solutions of an algebraic Riccati equation and an algebraic Riccati inequality, a procedure to select the control gains is provided, and stability is analyzed using Lyapunov's method. The effect of strategic attacks on discrete-time systems is also investigated. Finally, numerical examples are provided to illustrate the effectiveness of the theoretical analysis.
NASA Astrophysics Data System (ADS)
Domino, Krzysztof
2017-02-01
Cumulant analysis plays an important role in the analysis of non-Gaussian distributed data, and share price returns are a good example of such data. The purpose of this research is to develop a cumulant-based algorithm and use it to determine eigenvectors that represent investment portfolios with low variability. The algorithm is based on the Alternating Least Squares method and involves the simultaneous minimisation of the 2nd-6th cumulants of a multidimensional random variable (the percentage returns of many companies' shares). The algorithm was tested during the recent crash on the Warsaw Stock Exchange. To detect the incoming crash and provide entry and exit signals for the investment strategy, the Hurst exponent was calculated using local DFA. It was shown that the introduced algorithm is on average better than the benchmark and other portfolio determination methods, but only within an examination window determined by low values of the Hurst exponent. Remarkably, the algorithm is based on cumulant tensors up to the 6th order calculated for a multidimensional random variable, which is a novel idea. It can be expected that the algorithm will be useful in financial data analysis worldwide, as well as in the analysis of other types of non-Gaussian distributed data.
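A minimal detrended fluctuation analysis (DFA) sketch for the Hurst-type exponent used as the crash indicator; the scales and the simulated uncorrelated returns (which should give H ≈ 0.5) are illustrative choices.

```python
# DFA-1: fluctuation function vs window size on a log-log scale.
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        segs = y[: m * n].reshape(m, n)
        t = np.arange(n)
        rms = [np.sqrt(np.mean((s - np.polyval(np.polyfit(t, s, 1), t))**2))
               for s in segs]                # linear detrend per segment
        F.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

returns = np.random.default_rng(4).standard_normal(4096)
print(f"H = {dfa_exponent(returns):.2f}")    # ~0.5 for uncorrelated data
```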
NASA Astrophysics Data System (ADS)
Kumar Sharma, A.; Murty, V. V. S. N.
2014-12-01
The distribution system is the final link between the bulk power system and the consumer. A distinctive load flow solution method, based on Kirchhoff's current law (KCL) and Kirchhoff's voltage law (KVL), is used for load flow analysis of radial and weakly meshed networks. This method has excellent convergence characteristics for both radial and weakly meshed structures and is based on the bus-injection-to-branch-current and branch-current-to-bus-voltage matrices. The main contributions of the paper are: (i) an analysis of a weakly meshed network considering the number of loops added and their impact on losses, kW and kVAr requirements from the system, and voltage profile; (ii) the impact of different load models, a realistic ZIP load model, and load growth on losses, voltage profile, and kVA and kVAr requirements; (iii) the impact of loop additions on losses, voltage profile, and kVA and kVAr requirements from the substation; and (iv) a comparison of system performance with a radial distribution system. Voltage stability is a major concern in the planning and operation of power systems. This paper also identifies the critical bus most sensitive to voltage collapse in radial distribution networks: the node with the minimum value of the voltage stability index is the most sensitive. Voltage stability index values are computed for the meshed network as loops are added to the system. Results have been obtained for the IEEE 33- and 69-bus test systems, as well as for the radial distribution system for comparison.
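A minimal backward/forward sweep showing the KCL/KVL (bus-injection-to-branch-current) idea on a toy 4-bus radial feeder; the line impedances and loads are invented, not the IEEE 33- or 69-bus data.

```python
# Backward/forward sweep load flow on a 4-bus radial feeder (bus 0 = slack).
import numpy as np

Vbase = 12.66e3
Z = np.array([0.5 + 0.3j, 0.4 + 0.25j, 0.6 + 0.35j])           # branch impedances
S = np.array([0, 100e3 + 60e3j, 90e3 + 40e3j, 120e3 + 80e3j])  # bus loads [VA]

V = np.full(4, Vbase + 0j)
for _ in range(50):
    I_load = np.conj(S / V)                 # load currents from injections (KCL)
    I_branch = np.zeros(3, dtype=complex)
    for b in range(2, -1, -1):              # backward sweep: accumulate currents
        I_branch[b] = I_load[b + 1] + (I_branch[b + 1] if b < 2 else 0)
    for b in range(3):                      # forward sweep: voltage drops (KVL)
        V[b + 1] = V[b] - Z[b] * I_branch[b]

print(np.abs(V) / Vbase)                    # per-unit voltage profile
```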
Analysis of Dynamic Characteristics of the 21st Century Maritime Silk Road
NASA Astrophysics Data System (ADS)
Zhang, Xudong; Zhang, Jie; Fan, Chenqing; Meng, Junmin; Wang, Jing; Wan, Yong
2018-06-01
The 21st century Maritime Silk Road (MSR) proposed by China strongly promotes the maritime industry. In this paper, we use wind and ocean wave datasets from 1979 to 2014 to analyze the spatial and temporal distributions of the wind speed, significant wave height (SWH), mean wave direction (MWD), and mean wave period (MWP) in the MSR. The analysis indicates that the Luzon Strait and Gulf of Aden have the most pronounced seasonal variations and that the central Indian Ocean is relatively stable. We analyzed the distributions of the maximum wind speed and SWH in the MSR over this 36-year period, and mapped the monthly average frequency of SWH exceeding 4 m (huge waves) and of wind speeds exceeding 13.9 m s-1 (high winds). The occurrence frequencies of huge waves and high winds in regions east of the Gulf of Aden are as high as 56% and 80%, respectively. We also assessed the wave and wind energies in different seasons. Based on our analyses, we propose a risk factor (RF), based on the wind speed and SWH, for determining navigation safety levels. We determine the spatial and temporal RF distributions for different seasons and analyze the corresponding impact on four major sea routes. Finally, we determine the spatial distribution of tropical cyclones from 2000 to 2015 and analyze the corresponding impact on the four sea routes. This analysis of the dynamic characteristics of the MSR provides a reference for ship navigation as well as ocean engineering.
Valenzuela, Carlos Y
2010-01-01
Analysis of the homogeneity of the distribution of the second base of dinucleotides relative to the first, with bases separated by 0, 1, 2, ... 21 nucleotide sites, was performed on the HIV-1 genome (cDNA), the Drosophila mtDNA, the Drosophila Torso gene and the human β-globin gene. These four DNA segments showed highly significant heterogeneities of base distributions that cannot be accounted for by neutral or nearly neutral evolution or by the "neighbor influence" of nucleotides on mutation rates. High correlations are found between the bases of dinucleotides separated by 0, 1 and more sites. A periodicity of three consecutive significance values (measured by the χ² statistic with 9 degrees of freedom) was found only in Drosophila mtDNA. This periodicity may be due to an unknown structure or organization of mtDNA. This non-random distribution of the two bases of dinucleotides, widespread throughout these DNA segments, is rather compatible with panselective evolution and generalized internucleotide co-adaptation.
Distributed Computing Architecture for Image-Based Wavefront Sensing and 2 D FFTs
NASA Technical Reports Server (NTRS)
Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan
2006-01-01
Image-based wavefront sensing (WFS) provides significant advantages over interferometric wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing it. The development and testing of these high-performance computing architectures are essential to missions such as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and Spherical Primary Optical Telescope (SPOT). These specialized computing architectures require numerous two-dimensional Fourier transforms, which necessitate an all-to-all communication when applied on a distributed computational architecture. Several solutions for distributed computing are presented, with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis are presented. The solutions offered could be applied to other all-to-all communication problems and to other computationally complex scientific problems.
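The all-to-all requirement arises because a 2-D FFT factors into 1-D FFTs along one axis, a global transpose (the communication-heavy exchange), and 1-D FFTs along the other axis; serial numpy stands in for the DSP cluster in this sketch.

```python
# Row-column decomposition of a 2-D FFT; the transpose is the all-to-all.
import numpy as np

img = np.random.default_rng(5).standard_normal((256, 256))

rows = np.fft.fft(img, axis=1)          # each node transforms its local rows
exchanged = rows.T                      # all-to-all: columns become rows
full = np.fft.fft(exchanged, axis=1).T  # second 1-D pass, undo the transpose

assert np.allclose(full, np.fft.fft2(img))  # matches the direct 2-D FFT
```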
Huang, Xiao-Lei; Qiao, Ge-Xia; Lei, Fu-Min
2010-01-01
Parsimony analysis of endemicity (PAE) was used to identify areas of endemism (AOEs) for Chinese birds at the subregional level. Four AOEs were identified based on a distribution database of 105 endemic species and using 18 avifaunal subregions as the operating geographical units (OGUs). The four AOEs are the Qinghai-Zangnan Subregion, the Southwest Mountainous Subregion, the Hainan Subregion and the Taiwan Subregion. Cladistic analysis of subregions generally supports the division of China’s avifauna into Palaearctic and Oriental realms. Two PAE area trees were produced from two different distribution datasets (year 1976 and 2007). The 1976 topology has four distinct subregional branches; however, the 2007 topology has three distinct branches. Moreover, three Palaearctic subregions in the 1976 tree clustered together with the Oriental subregions in the 2007 tree. Such topological differences may reflect changes in the distribution of bird species through circa three decades. PMID:20559504
Quantum key distribution for composite dimensional finite systems
NASA Astrophysics Data System (ADS)
Shalaby, Mohamed; Kamal, Yasser
2017-06-01
The application of quantum mechanics contributes to the field of cryptography the very important advantage of offering a mechanism for detecting the eavesdropper. The pioneering work on quantum key distribution uses mutually unbiased bases (MUBs) to prepare and measure qubits (or qudits). Weak mutually unbiased bases (WMUBs) have weaker properties than MUBs; however, unlike MUBs, a complete set of WMUBs can be constructed for systems with composite dimensions. In this paper, we study the use of WMUBs in quantum key distribution for composite dimensional finite systems. We prove that the security analysis of using a complete set of WMUBs to prepare and measure the quantum states in the generalized BB84 protocol gives better results than using the maximum number of MUBs that can be constructed, when analyzed against the intercept-and-resend attack.
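For orientation only, a toy sifting simulation of ordinary two-basis (qubit MUB) BB84; the WMUB generalization for composite dimensions analyzed in the paper is not reproduced here.

```python
# BB84 sifting with no eavesdropper: bases agree -> bit kept.
import numpy as np

rng = np.random.default_rng(6)
n = 20
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)    # 0 = computational, 1 = Hadamard
bob_bases = rng.integers(0, 2, n)

match = alice_bases == bob_bases       # announced publicly after measurement
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

key = alice_bits[match]                # sifted key
print("sifted key:", key, "| agree:", bool(np.all(key == bob_bits[match])))
```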
NASA Astrophysics Data System (ADS)
Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan
2017-09-01
Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by focal elements and basic probability assignments (BPA) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. To alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
Using independent component analysis for electrical impedance tomography
NASA Astrophysics Data System (ADS)
Yan, Peimin; Mo, Yulong
2004-05-01
Independent component analysis (ICA) is a way to resolve signals into independent components based on the statistical characteristics of the signals. It is a method for factoring the probability densities of measured signals into a set of densities that are as statistically independent as possible under the assumptions of a linear model. Electrical impedance tomography (EIT) is used to detect variations of the electric conductivity within the human body. Because the conductivity distribution varies inside the body, EIT produces multi-channel data. In order to obtain the information contained at different tissue locations, it is necessary to image the individual conductivity distributions. In this paper we consider applying ICA to EIT on the signal subspace (the individual conductivity distributions). Using ICA, the signal subspace is decomposed into statistically independent components. The individual conductivity distribution is then reconstructed via the sensitivity theorem. Computer simulations show that the full information contained in the multi-conductivity distribution can be obtained by this method.
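A minimal FastICA sketch of the separation step on synthetic mixtures; real EIT channel data would replace the toy sources, and the reconstruction via the sensitivity theorem is not shown.

```python
# FastICA: unmix multi-channel observations into independent components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t),               # toy source 1
          np.sign(np.sin(3 * t)),      # toy source 2
          rng.laplace(size=2000)]      # toy source 3
A = rng.random((3, 3))                 # unknown mixing matrix
X = S @ A.T                            # observed multi-channel data

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)           # recovered independent components
print(S_est.shape, ica.mixing_.shape)
```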
Predictive analysis of thermal distribution and damage in thermotherapy on biological tissue
NASA Astrophysics Data System (ADS)
Fanjul-Vélez, Félix; Arce-Diego, José Luis
2007-05-01
The use of optical techniques is increasing the possibilities and success of medical praxis in certain cases, in both tissue characterization and treatment. Photodynamic therapy (PDT) and low intensity laser treatment (LILT) are two examples of the latter. Another very interesting application is thermotherapy, which consists of controlling the temperature increase in pathological biological tissue. With this method it is possible to achieve improvement in specific diseases, but a prior analysis of the treatment is needed so that the patient does not suffer collateral damage, an essential point given the safety margins required in medical procedures. In this work, a predictive analysis of the thermal distribution in a biological tissue irradiated by an optical source is presented. Optical propagation is based on a Radiation Transport Theory (RTT) model solved via a numerical Monte Carlo method in a multi-layered tissue. The data obtained are included in a bio-heat equation that models heat transfer, taking into account conduction, convection, radiation, blood perfusion and vaporization, depending on the specific problem. The spatial-temporal differential bio-heat equation is solved via a numerical finite difference approach. Experimental temperature distributions in animal tissue irradiated by laser radiation are shown. From the thermal distribution in tissue, thermal damage is studied based on an Arrhenius analysis, as a way of predicting harmful effects. The complete model can be used for concrete treatment proposals, as a way of predicting treatment effects and consequently deciding which optical source parameters, mainly wavelength and optical power, are appropriate for the specific disease, with reasonable safety margins in the process.
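A sketch of the thermal half of the pipeline: an explicit finite-difference solution of the 1-D Pennes bio-heat equation with an exponentially decaying laser source, plus the Arrhenius damage integral. All parameter values are typical literature numbers assumed for illustration, and the Monte Carlo optical step is replaced by the analytic source term.

```python
# 1-D Pennes bio-heat equation (explicit FD) + Arrhenius damage integral.
import numpy as np

k, rho, c = 0.5, 1000.0, 3600.0     # tissue conductivity, density, heat capacity
wb, cb, Tb = 0.5, 4000.0, 37.0      # perfusion [kg/m3/s], blood c, arterial T
dx, dt, N = 1e-4, 0.01, 200         # 2 cm of tissue
alpha = k / (rho * c)
assert alpha * dt / dx**2 < 0.5     # explicit-scheme stability

T = np.full(N, 37.0)
q = 4e6 * np.exp(-np.arange(N) * dx / 2e-3)  # absorbed laser power [W/m3]

A_f, Ea, Rg = 3.1e98, 6.28e5, 8.314          # Arrhenius (protein) parameters
omega = np.zeros(N)

for _ in range(int(60 / dt)):       # 60 s exposure
    lap = np.zeros(N)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T += dt * (alpha * lap + (q - wb * cb * (T - Tb)) / (rho * c))
    T[0], T[-1] = T[1], 37.0        # insulated surface, body core at 37 C
    omega += dt * A_f * np.exp(-Ea / (Rg * (T + 273.15)))

print(f"peak T = {T.max():.1f} C, depth with Omega > 1: "
      f"{(omega > 1).sum() * dx * 1e3:.2f} mm")
```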
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Deriving injury risk curves using survival analysis from biomechanical experiments.
Yoganandan, Narayan; Banerjee, Anjishnu; Hsu, Fang-Chi; Bass, Cameron R; Voo, Liming; Pintar, Frank A; Gayzik, F Scott
2016-10-03
Injury risk curves from biomechanical experimental data analysis are used in automotive studies to improve crashworthiness and advance occupant safety. Metrics such as acceleration and deflection, coupled with outcomes such as fractures and anatomical disruptions from impact tests, are used in simple binary regression models. As an improvement, the International Standards Organization suggested a different approach based on survival analysis. While probability curves for side-impact-induced thorax and abdominal injuries and frontal-impact-induced foot-ankle-leg injuries have been developed using this approach, deficiencies are apparent. The objective of this study is to present an improved, robust and generalizable methodology that attempts to resolve these issues. It includes: (a) statistical identification of the most appropriate independent variable (metric) from a pool of candidate metrics, measured and/or derived during the experimentation and analysis processes, based on the highest area under the receiver operating characteristic curve; (b) quantitative determination of the most optimal probability distribution based on the lowest Akaike information criterion; (c) supplementing the qualitative/visual method of comparing the selected distribution with a non-parametric distribution with objective measures; (d) identification of overly influential observations using different methods; and (e) estimation of confidence intervals using techniques more appropriate to the underlying survival statistical model. These clear and quantified details can be easily implemented with commercial/open source packages. They can be used in retrospective analysis and prospective design of experiments, and in applications to different loading scenarios such as underbody blast events. The feasibility of the methodology is demonstrated using post mortem human subject experiments and 24 metrics associated with thoracic/abdominal injuries in side impacts. Published by Elsevier Ltd.
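Steps (b) and (c) above — ranking candidate probability distributions by AIC — can be sketched as below; censoring is ignored in this toy version, whereas the paper's survival-model treatment retains it.

```python
# Rank candidate distributions for an injury metric by AIC (no censoring).
import numpy as np
from scipy import stats

x = stats.weibull_min.rvs(2.2, scale=60, size=80,
                          random_state=np.random.default_rng(8))

candidates = {"weibull": stats.weibull_min,
              "lognormal": stats.lognorm,
              "normal": stats.norm}
for name, dist in candidates.items():
    params = dist.fit(x)                               # MLE fit
    aic = 2 * len(params) - 2 * dist.logpdf(x, *params).sum()
    print(f"{name:10s} AIC = {aic:.1f}")
```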
Belli, Sema; Eraslan, Oğuz; Eskitascioglu, Gürcan
2016-01-01
Endodontic-periodontal (EP) lesions require both endodontic and periodontal therapies. Impermeable sealing of the root canal system after cleaning and shaping is essential for successful endodontic treatment. However, complete healing of the hard and soft tissue lesions takes time, and diseased bone, periodontal ligament, and tooth fibrous joints are reported to have an increased failure risk for a given load. Considering that EP lesions may affect the biomechanics of teeth, this finite element analysis study aimed to test the effect of root fillings on stress distribution in premolars with EP lesions. Three finite element models representing 3 different types of EP lesions (primary endodontic disease [PED], PED with secondary periodontic involvement, and true combined) were created. The root canals were assumed to be nonfilled or filled with gutta-percha, gutta-percha with an apical mineral trioxide aggregate (MTA) plug, or an MTA-based sealer. The materials used were assumed to be homogeneous and isotropic. A 300-N load was applied at the buccal cusp of the crown at a 135° angle. The Cosmoworks structural-analysis program (SolidWorks Corp, Waltham, MA) was used for analysis. Results were presented according to the von Mises criterion. Stresses at the root apex increased with lesion size. Root filling did not affect stress distribution in the PED model. An MTA plug or MTA-based sealer created more stress areas within the root compared with the others in the models representing PED with periodontic involvement and true combined lesions. Stresses at the apical end of the root increase with lesion size. MTA-based sealers or an MTA plug create more stress when there is periodontic involvement or a true combined lesion. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Huang, D.; Wang, G.
2014-12-01
Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by the wavelet-packet parameters proposed by Yamamoto and Baker (2013). The wavelet-packet parameters fully characterize ground-motion time histories in terms of energy content, time- and frequency-domain characteristics, and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground-motion records from eight well-recorded earthquakes in California, Mexico, Japan and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and the cokriging technique, wavelet-packet parameters at unmeasured locations can be best estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrate that the simulated ground motions agree well with the actual recorded data if the influence of regional site conditions is considered. The developed method has great potential for use in computational seismic analysis and loss estimation at a regional scale.
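A wavelet-packet characterization sketch using PyWavelets: decompose a record to level 4 and summarize the energy distribution across frequency-ordered packets. The synthetic accelerogram and the 'db4'/level-4 choices are assumptions, not the Yamamoto-Baker parameterization itself.

```python
# Wavelet-packet energy distribution of a (synthetic) accelerogram.
import numpy as np
import pywt

rng = np.random.default_rng(9)
accel = rng.standard_normal(2048) * np.exp(-np.linspace(0, 4, 2048))

wp = pywt.WaveletPacket(data=accel, wavelet="db4", maxlevel=4)
nodes = wp.get_level(4, order="freq")        # frequency-ordered leaf packets
energy = np.array([np.sum(n.data**2) for n in nodes])
share = energy / energy.sum()                # energy share per packet
print(f"{len(nodes)} packets, max energy share = {share.max():.3f}")
```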
A Noncentral "t" Regression Model for Meta-Analysis
ERIC Educational Resources Information Center
Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi
2010-01-01
In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…
Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei
2010-01-01
This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…
Space station systems analysis study. Part 3: Documentation. Volume 5: Cost and schedule data
NASA Technical Reports Server (NTRS)
1977-01-01
Cost estimates for the space station systems analysis were recorded. Space construction base costs and characteristics were cited as well as mission hardware costs and characteristics. Also delineated were cost ground rules, the program schedule, and a detail cost estimate and funding distribution.
Probability model for atmospheric sulfur dioxide concentrations in the area of Venice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttazzoni, C.; Lavagnini, I.; Marani, A.
1986-09-01
This paper deals with a comparative screening of existing air quality models based on their ability to simulate the distribution of sulfur dioxide data in the Venetian area. Investigations have been carried out on sulfur dioxide dispersion in the atmosphere of the Venetian area. The studies have mainly focused on transport models (Gaussian plume and K-models) aiming at meaningful correlations of sources and receptors. Among the results, a noteworthy disagreement between simulated and experimental data has been shown, due to the lack of thorough knowledge of source field conditions and of the local meteorology of the sea-land transition area. Investigations with receptor-oriented models (based, e.g., on time series analysis, Fourier analysis, or statistical distributions) have also been performed.
Shi, Xiao-Jun; Zhang, Ming-Li
2015-03-01
Zygophyllum xanthoxylon, a desert species displaying a broad, continuous east-west distribution in arid Northwestern China, can be considered a model species for investigating the biogeographical history of this region. We sequenced two chloroplast DNA spacers (psbK-psbI and rpl32-trnL) in 226 individuals from 31 populations to explore the phylogeographical structure. A median-joining network was constructed, and AMOVA, SAMOVA, neutrality tests, and mismatch distribution analysis were used to examine genetic structure and potential range expansion. Using species distribution modeling, the geographical distribution of Z. xanthoxylon was modeled for the present and for the Last Glacial Maximum (LGM). Among 26 haplotypes, one was widely distributed, but most were restricted to either the eastern or the western region. The populations with the highest levels of haplotype diversity were found in the Tianshan Mountains and their surroundings in the west, and the Helan Mountains and Alxa Plateau in the east. AMOVA and SAMOVA showed that, over all populations, the species lacks phylogeographical structure, which is speculated to result from its specific biology. Neutrality tests and mismatch distribution analysis support past range expansions of the species. Compared with its current distribution, Z. xanthoxylon had a smaller and more fragmented range under the cold, dry conditions of the LGM. Based on the evidence from phylogeographical patterns, the distribution of genetic variability, and paleodistribution modeling, Z. xanthoxylon most likely originated in the east and migrated westward via the Hexi Corridor.
NASA Astrophysics Data System (ADS)
Jones, G. T.; Jones, R. W. L.; Kennedy, B. W.; Klein, H.; Morrison, D. R. O.; Wachsmuth, H.; Miller, D. B.; Mobayyen, M. M.; Wainstein, S.; Aderholz, M.; Hantke, D.; Katz, U. F.; Kern, J.; Schmitz, N.; Wittek, W.; Borner, H. P.; Myatt, G.; Cooper-Sarkar, A. M.; Guy, J.; Venus, W.; Bullock, F. W.; Burke, S.
1994-12-01
Based on a QCD analysis of the parton momentum distributions in the proton, the ratio r_v = d_v/u_v of the d and u valence quark distributions is determined as a function of x in the range 0.01 < x < 0.7. The analysis uses data from neutrino and antineutrino charged-current interactions on hydrogen and deuterium, obtained with BEBC in the (anti)neutrino wideband beam of the CERN SPS. Since r_v mainly depends on the deuterium/hydrogen ratios of the normalised x-y-Q^2 distributions, many systematic effects cancel. It is found that r_v decreases with increasing x and drops below the naive SU(6) expectation of 0.5 for x ≳ 0.3. An extrapolation of r_v to x = 1 is consistent with the hypothesis r_v(1) = 0.
DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.
Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien
2017-09-01
Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
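The key-value pair paradigm mentioned above can be illustrated with a toy map/reduce-style sketch. The documents and entities below are invented, and ConceptSearch itself is not reproduced; this shows only the general shape of the distributed linking step.

```python
from collections import defaultdict

# Hedged toy sketch: grow candidate storylines by reducing on shared entities.
docs = {
    "d1": {"acme corp", "j. smith"},
    "d2": {"j. smith", "river flood"},
    "d3": {"river flood", "city council"},
}

# map phase: emit (entity, doc) pairs
pairs = [(e, d) for d, ents in docs.items() for e in ents]

# shuffle/reduce phase: group documents by shared entity
by_entity = defaultdict(set)
for e, d in pairs:
    by_entity[e].add(d)

# chain documents that share an entity into candidate storyline links
links = {(a, b) for ds in by_entity.values()
         for a in ds for b in ds if a < b}
print(sorted(links))  # [('d1', 'd2'), ('d2', 'd3')]
```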
NASA Astrophysics Data System (ADS)
Weng, Lingyan; Han, Xugao
2018-01-01
Understanding the spatial-temporal distribution pattern of fog and haze is the basis for addressing them with measures adapted to local conditions. Taking the 31 provinces of mainland China as the research areas, this paper collected Baidu index data on the network attention paid to fog and haze in these areas from 2011 to 2016 and analyzed their spatial-temporal distribution pattern using autocorrelation analysis. The results show that network attention to fog and haze has an overall spatial distribution pattern of “higher in eastern and central China, lower in western China”, with regional differences among provinces. Network attention to fog and haze shows an obvious geographical agglomeration phenomenon: the agglomeration areas of higher values are gradually enlarging, while the lower-value agglomeration areas are slightly shrinking.
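A minimal sketch of the autocorrelation analysis referenced above, using global Moran's I; the attention values and the province adjacency matrix are illustrative assumptions rather than the study's Baidu-index data.

```python
import numpy as np

# Hedged sketch: global Moran's I on provincial attention scores.
x = np.array([10.0, 12.0, 11.0, 3.0, 2.5, 2.0])   # attention by province
W = np.array([[0, 1, 1, 0, 0, 0],                 # made-up adjacency
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

n = len(x)
z = x - x.mean()
I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print("Moran's I:", round(I, 3))  # > 0 indicates clustering of similar values
```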
Discovering Semantic Patterns in Bibliographically Coupled Documents.
ERIC Educational Resources Information Center
Qin, Jian
1999-01-01
An example of semantic pattern analysis, based on keywords selected from documents grouped by bibliographical coupling, is used to demonstrate the methodological aspects of knowledge discovery in bibliographic databases. Frequency distribution patterns suggest the existence of a common intellectual base with a wide range of specialties and…
Trajectory-Based Loads for the Ares I-X Test Flight Vehicle
NASA Technical Reports Server (NTRS)
Vause, Roland F.; Starr, Brett R.
2011-01-01
In trajectory-based loads, the structural engineer treats each point on the trajectory as a load case. Distributed aero, inertial, and propulsion forces are developed for the structural model which are equivalent to the integrated values of the trajectory model. Free-body diagrams are then used to solve for the internal forces, or loads, that keep the applied aero, inertial, and propulsion forces in dynamic equilibrium. There are several advantages to using trajectory-based loads. First, consistency is maintained between the integrated equilibrium equations of the trajectory analysis and the distributed equilibrium equations of the structural analysis. Second, the structural loads equations are tied to the uncertainty model for the trajectory systems analysis model. Atmosphere, aero, propulsion, mass property, and controls uncertainty models all feed into the dispersions that are generated for the trajectory systems analysis model. Changes in any of these input models will affect structural loads response. The trajectory systems model manages these inputs as well as the output from the structural model over thousands of dispersed cases. Large structural models with hundreds of thousands of degrees of freedom would execute too slowly to be an efficient part of several thousand system analyses. Trajectory-based loads provide a means for the structures discipline to be included in the integrated systems analysis. Successful applications of trajectory-based loads methods for the Ares I-X vehicle are covered in this paper. Preliminary design loads were based on 2000 trajectories using Monte Carlo dispersions. Range safety loads were tied to 8423 malfunction turn trajectories. In addition, active control system loads were based on 2000 preflight trajectories using Monte Carlo dispersions.
Percentiles of the null distribution of 2 maximum lod score tests.
Ulgen, Ayse; Yoo, Yun Joo; Gordon, Derek; Finch, Stephen J; Mendell, Nancy R
2004-01-01
We here consider the null distribution of the maximum lod score (LOD-M) obtained upon maximizing over transmission model parameters (penetrance values, dominance, and allele frequency) as well as the recombination fraction. Also considered is the lod score maximized over a fixed choice of genetic model parameters and recombination-fraction values set prior to the analysis (MMLS), as proposed by Hodge et al. The objective is to fit parametric distributions to MMLS and LOD-M. Our results are based on 3,600 simulations of samples of n = 100 nuclear families ascertained for having one affected member and at least one other sibling available for linkage analysis. Each null distribution is approximately a mixture p·χ²(0) + (1 − p)·χ²(ν). The values of MMLS appear to fit the mixture 0.20·χ²(0) + 0.80·χ²(1.6). The mixture distribution 0.13·χ²(0) + 0.87·χ²(2.8) appears to describe the null distribution of LOD-M. From these results we derive a simple method for obtaining critical values of LOD-M and MMLS.
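From fitted mixtures of this form, critical values can be computed directly. The sketch below solves for the cutoff at a given significance level; the conversion to the lod scale via 2·ln(10) is a common convention and is an assumption here, not something stated in the abstract.

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import brentq

def mixture_sf(x, p0, df):
    # P(X > x) for the mixture p0*chi2(0) + (1-p0)*chi2(df);
    # the chi2(0) component is a point mass at zero.
    return (1.0 - p0) * chi2.sf(x, df)

def critical_value(alpha, p0, df):
    # Solve mixture_sf(c) = alpha for the cutoff c.
    return brentq(lambda c: mixture_sf(c, p0, df) - alpha, 1e-9, 100.0)

for name, p0, df in [("MMLS", 0.20, 1.6), ("LOD-M", 0.13, 2.8)]:
    c = critical_value(0.05, p0, df)
    # If the fitted statistic is 2*ln(10)*LOD (an assumption here),
    # the corresponding lod-score cutoff follows by rescaling:
    print(name, "chi-square-scale cutoff:", round(c, 3),
          "lod-scale cutoff:", round(c / (2 * np.log(10)), 3))
```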
Experimental and numerical studies of micro PEM fuel cell
NASA Astrophysics Data System (ADS)
Peng, Rong-Gui; Chung, Chen-Chung; Chen, Chiun-Hsun
2011-10-01
A single micro proton exchange membrane fuel cell (PEMFC) has been produced using micro-electromechanical systems (MEMS) technology, with an active area of 2.5 cm² and a channel depth of about 500 µm. A theoretical analysis is performed in this study for a novel MEMS-based design of a micro PEMFC. The model consists of the conservation equations of mass, momentum, species, and electric current in a fully integrated finite-volume solver using the CFD-ACE+ commercial code. The simulated polarization curves correlate well with experimental data. Three-dimensional simulations are carried out to predict and analyze micro PEMFC temperature, current density, and water distributions at two fuel flow rates (15 cm³/min and 40 cm³/min). Simulation results show that the temperature distribution within the micro PEMFC is affected by the water distribution in the membrane, and indicate that a low, uniform temperature distribution in the membrane at low fuel flow rates increases membrane water content and yields a superior current density distribution at a 0.4-V operating voltage. Model predictions are well within known experimental behavior.
Ávila-Jiménez, María Luisa; Coulson, Stephen James
2011-01-01
We aimed to describe the main Arctic biogeographical patterns of the Collembola, and to analyze the historical factors and current climatic regimes determining Arctic collembolan species distribution. Furthermore, we aimed to identify possible dispersal routes, colonization sources, and glacial refugia for Arctic Collembola. We implemented a Gaussian mixture clustering method on species distribution ranges and applied a distance-based parametric bootstrap test to presence-absence collembolan species distribution data. Additionally, multivariate analysis was performed considering species distributions, biodiversity, cluster distribution, and environmental factors (temperature and precipitation). No clear relation was found between current climatic regimes and species distribution in the Arctic. Gaussian mixture clustering found common elements within Siberian areas, Atlantic areas, the Canadian Arctic, a mid-Siberian cluster, and specific Beringian elements, following the same pattern previously described, using a variety of molecular methods, for Arctic plants. Species distributions hence indicate the influence of recent glacial history, as LGM glacial refugia (mid-Siberia and Beringia) and major dispersal routes to high Arctic island groups can be identified. Endemic species are found in the high Arctic, but no specific biogeographical pattern can be clearly identified as a sign of high Arctic glacial refugia. Ocean current patterns are suggested as an important factor shaping the distribution of Arctic Collembola, which is consistent with Antarctic studies of collembolan biogeography. The clear relations between cluster distribution and geographical areas in light of their recent glacial history, the lack of relationship between species distribution and current climatic regimes, and the consistency with previously described Arctic patterns in a series of organisms inferred using a variety of methods suggest that the historical phenomena shaping contemporary collembolan distribution can be inferred through biogeographical analysis. PMID:26467728
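A minimal sketch of model-based clustering in the spirit of the Gaussian mixture step described above, using scikit-learn with BIC to choose the number of clusters; the input matrix is a random stand-in for the species range descriptors, which are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hedged sketch: Gaussian mixture clustering with BIC-based model selection.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 2))  # stand-in for species range descriptors

best_model, best_bic = None, np.inf
for k in range(1, 8):
    gm = GaussianMixture(n_components=k, covariance_type="full",
                         random_state=0).fit(X)
    bic = gm.bic(X)
    if bic < best_bic:
        best_model, best_bic = gm, bic

labels = best_model.predict(X)
print("chosen k:", best_model.n_components,
      "cluster sizes:", np.bincount(labels))
```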
Design challenges in nanoparticle-based platforms: Implications for targeted drug delivery systems
NASA Astrophysics Data System (ADS)
Mullen, Douglas Gurnett
Characterization and control of heterogeneous distributions of nanoparticle-ligand components are major design challenges for nanoparticle-based platforms. This dissertation begins with an examination of a poly(amidoamine) (PAMAM) dendrimer-based targeted delivery platform. A folic acid-targeted modular platform was developed to target human epithelial cancer cells. Although active targeting was observed in vitro, it was not found in vivo using a mouse tumor model. A major flaw of this platform design was that it did not provide for characterization or control of the component distribution. Motivated by the problems experienced with the modular design, the actual composition of nanoparticle-ligand distributions was examined using a model dendrimer-ligand system. High-pressure liquid chromatography (HPLC) resolved the distribution of components in samples with mean ligand/dendrimer ratios ranging from 0.4 to 13. A peak-fitting analysis enabled quantification of the component distribution. The quantified distributions were found to be significantly more heterogeneous than commonly expected, and standard analytical parameters, namely the mean ligand/nanoparticle ratio, failed to adequately represent the component heterogeneity. The distribution of components was also found to be sensitive to particle modifications that preceded the ligand conjugation. With the knowledge gained from this detailed distribution analysis, a new platform design was developed to provide a system with dramatically improved control over the number of components and improved batch reproducibility. Using semi-preparative HPLC, individual dendrimer-ligand components were isolated. The isolated dendrimers with precise numbers of ligands were characterized by NMR and analytical HPLC. In total, nine different dendrimer-ligand components were obtained with degrees of purity ≥80%. This system has the potential to serve as a platform to which a precise number of functional molecules can be attached and to dramatically improve platform efficacy. An additional investigation of reproducibility challenges for current dendrimer-based platform designs is also described. The quality of mass transport during the partial acetylation of the dendrimer was found to have a major impact on subsequent dendrimer-ligand distributions that cannot be detected by standard analytical techniques. Consequently, this reaction should be eliminated from the platform design. Finally, optimized protocols for the purification and characterization of PAMAM dendrimers are detailed.
Photoacoustic imaging to assess pixel-based sO2 distributions in experimental prostate tumors.
Bendinger, Alina L; Glowa, Christin; Peter, Jörg; Karger, Christian P
2018-03-01
A protocol for photoacoustic imaging (PAI) has been developed to assess pixel-based oxygen saturation (sO2) distributions of experimental tumor models. The protocol was applied to evaluate the dependence of PAI results on measurement settings, reproducibility of PAI, and for the characterization of the oxygenation status of experimental prostate tumor sublines (Dunning R3327-H, -HI, -AT1) implanted subcutaneously in male Copenhagen rats. The three-dimensional (3-D) PA data employing two wavelengths were used to estimate sO2 distributions. If the PA signal was sufficiently strong, the distributions were independent from signal gain, threshold, and positioning of animals. Reproducibility of sO2 distributions with respect to shape and median values was demonstrated over several days. The three tumor sublines were characterized by the shapes of their sO2 distributions and their temporal response after external changes of the oxygen supply (100% O2 or air breathing and clamping of tumor-supplying artery). The established protocol showed to be suitable for detecting temporal changes in tumor oxygenation as well as differences in oxygenation between tumor sublines. PA results were in accordance with histology for hypoxia, perfusion, and vasculature. The presented protocol for the assessment of pixel-based sO2 distributions provides more detailed information as compared to conventional region-of-interest-based analysis of PAI, especially with respect to the detection of temporal changes and tumor heterogeneity.
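The two-wavelength estimation of sO2 can be sketched as a per-pixel linear unmixing problem. In the sketch below the wavelengths and extinction coefficients are illustrative textbook-style values, not the settings or calibration used in the study, and fluence effects are ignored.

```python
import numpy as np

# Hedged sketch: mu_a(lambda) ~ eps_HbO2(lambda)*C_HbO2 + eps_Hb(lambda)*C_Hb,
# solved per pixel from PA amplitudes at two wavelengths.
eps = np.array([[518.0, 1405.0],    # ~750 nm: [eps_HbO2, eps_Hb] (illustrative)
                [1058.0, 691.0]])   # ~850 nm

def so2_from_pa(pa1, pa2):
    """Per-pixel sO2 estimate from PA amplitudes at the two wavelengths."""
    c = np.linalg.solve(eps, np.array([pa1, pa2]))  # [C_HbO2, C_Hb]
    return c[0] / c.sum()

print(round(so2_from_pa(2.0, 1.5), 3))
```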
Application of short-data methods on extreme surge levels
NASA Astrophysics Data System (ADS)
Feng, X.
2014-12-01
Tropical cyclone-induced storm surges are among the most destructive natural hazards impacting the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short. The limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets shorter than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. The verified water levels from water gauges along the Southwest and Southeast Florida coasts, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically for short data sets. Incorporating the influence of global warming, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced levels of extreme surge activity are observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict extreme surge levels under warming sea surface temperatures. The non-stationary GEV distribution indicates that with 1 °C of warming in sea surface temperature from the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The statistical approaches considered for extreme surge estimation from short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and the hurricane and storm surge forecasting and warning community.
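A minimal peaks-over-threshold sketch of the GPD-based return-level estimate discussed above; the synthetic record, threshold choice, and exceedance rate are assumptions for illustration, not the study's verified gauge data.

```python
import numpy as np
from scipy.stats import genpareto

# Hedged sketch: 100-year return level from a short (synthetic) surge record.
rng = np.random.default_rng(1)
surges = rng.gumbel(loc=0.5, scale=0.3, size=18 * 4)  # ~18 yr, 4 events/yr

u = np.quantile(surges, 0.90)                 # threshold for excesses (assumed)
exc = surges[surges > u] - u
c, loc, scale = genpareto.fit(exc, floc=0.0)  # GPD shape c and scale

lam = len(exc) / 18.0                         # exceedances per year
T = 100.0                                     # return period, years
# Return level: u + GPD quantile at probability 1 - 1/(lam*T)
x_T = u + genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0, scale=scale)
print("100-yr return surge (synthetic):", round(x_T, 2))
```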
Analysis of off-axis tension test of wood specimens
Jen Y. Liu
2002-01-01
This paper presents a stress analysis of the off-axis tension test of clear wood specimens based on orthotropic elasticity theory. The effects of Poisson's ratio and shear coupling coefficient on stress distribution are analyzed in detail. The analysis also provides a theoretical foundation for the selection of a 10° grain angle in wood specimens for the...
2004-06-01
suitable form of organizational adaptation is effective organizational diagnosis and analysis. The organizational diagnosis and analysis involve... related to the mission environment, organizational structure, and strategy is imperative for an effective and efficient organizational diagnosis. The... not easily articulated nor expressed otherwise. These displays are crucial to facilitate effective organizational diagnosis and analysis, and
Bohn, Justin; Eddings, Wesley; Schneeweiss, Sebastian
2017-03-15
Distributed networks of health-care data sources are increasingly being utilized to conduct pharmacoepidemiologic database studies. Such networks may contain data that are not physically pooled but instead are distributed horizontally (separate patients within each data source) or vertically (separate measures within each data source) in order to preserve patient privacy. While multivariable methods for the analysis of horizontally distributed data are frequently employed, few practical approaches have been put forth to deal with vertically distributed health-care databases. In this paper, we propose 2 propensity score-based approaches to vertically distributed data analysis and test their performance using 5 example studies. We found that these approaches produced point estimates close to what could be achieved without partitioning. We further found a performance benefit (i.e., lower mean squared error) for sequentially passing a propensity score through each data domain (called the "sequential approach") as compared with fitting separate domain-specific propensity scores (called the "parallel approach"). These results were validated in a small simulation study. This proof-of-concept study suggests a new multivariable analysis approach to vertically distributed health-care databases that is practical, preserves patient privacy, and warrants further investigation for use in clinical research applications that rely on health-care databases.
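A toy sketch of the sequential approach as described: domain 1 fits a propensity score on its own covariates and shares only the score, which domain 2 then refines. The synthetic data, the logit-passing detail, and all variable names are assumptions; the paper's exact estimation steps may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged sketch: sequential propensity scores across two vertical partitions.
rng = np.random.default_rng(2)
n = 5000
x1 = rng.normal(size=(n, 3))           # covariates held only by domain 1
x2 = rng.normal(size=(n, 3))           # covariates held only by domain 2
treat = rng.binomial(1, 1 / (1 + np.exp(-(x1[:, 0] + x2[:, 0]))))

# Domain 1 fits a PS on its own covariates and shares only the score.
ps1 = LogisticRegression().fit(x1, treat).predict_proba(x1)[:, 1]

# Domain 2 refines the score using its covariates plus the shared score,
# never seeing domain 1's raw data.
z2 = np.column_stack([x2, np.log(ps1 / (1 - ps1))])
ps_seq = LogisticRegression().fit(z2, treat).predict_proba(z2)[:, 1]
print("final PS range:", ps_seq.min().round(3), ps_seq.max().round(3))
```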
NASA Astrophysics Data System (ADS)
Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.
2005-03-01
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
A Complex Network Approach to Distributional Semantic Models
Utsumi, Akira
2015-01-01
A number of studies on network analysis have focused on language networks based on free word association, which reflects human lexical knowledge, and have demonstrated the small-world and scale-free properties in the word association network. Nevertheless, there have been very few attempts at applying network analysis to distributional semantic models, despite the fact that these models have been studied extensively as computational or cognitive models of human lexical knowledge. In this paper, we analyze three network properties, namely, small-world, scale-free, and hierarchical properties, of semantic networks created by distributional semantic models. We demonstrate that the created networks generally exhibit the same properties as word association networks. In particular, we show that the distribution of the number of connections in these networks follows the truncated power law, which is also observed in an association network. This indicates that distributional semantic models can provide a plausible model of lexical knowledge. Additionally, the observed differences in the network properties of various implementations of distributional semantic models are consistently explained or predicted by considering the intrinsic semantic features of a word-context matrix and the functions of matrix weighting and smoothing. Furthermore, to simulate a semantic network with the observed network properties, we propose a new growing network model based on the model of Steyvers and Tenenbaum. The idea underlying the proposed model is that both preferential and random attachments are required to reflect different types of semantic relations in the network growth process. We demonstrate that this model provides a better explanation of network behaviors generated by distributional semantic models. PMID:26295940
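A minimal sketch of the kind of network construction and analysis described above: words are connected to their nearest neighbors in an embedding space and basic network statistics are computed. Random vectors stand in for a real distributional semantic model, which is what would actually produce the truncated power-law degree distribution reported.

```python
import numpy as np
import networkx as nx

# Hedged sketch: k-nearest-neighbor semantic network from word vectors.
rng = np.random.default_rng(3)
vecs = rng.normal(size=(500, 50))          # stand-in word embeddings
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)

sims = vecs @ vecs.T                       # cosine similarities
np.fill_diagonal(sims, -np.inf)            # exclude self-similarity
k = 5
G = nx.Graph()
for i in range(len(vecs)):
    for j in np.argpartition(-sims[i], k)[:k]:
        G.add_edge(i, int(j))

degs = np.array([d for _, d in G.degree()])
print("mean degree:", degs.mean().round(2),
      "clustering:", round(nx.average_clustering(G), 3))
```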
NASA Astrophysics Data System (ADS)
Candela, A.; Brigandì, G.; Aronica, G. T.
2014-07-01
In this paper a procedure to derive synthetic flood design hydrographs (SFDH) using a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describes and models the correlation between two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model, is presented. Rainfall-runoff modelling (R-R modelling) for estimating the hydrological response at the outlet of a catchment was performed by using a conceptual fully distributed procedure based on the Soil Conservation Service - Curve Number method as an excess rainfall model and on a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the distributed unit hydrograph definition, was performed by implementing a procedure based on flow paths, determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the primary return period of the SFDH, which provides the probability of occurrence of a hydrograph flood, peaks and flow volumes obtained through R-R modelling were treated statistically using copulas. Finally, the shapes of hydrographs have been generated on the basis of historically significant flood events, via cluster analysis. An application of the procedure described above has been carried out and results presented for the case study of the Imera catchment in Sicily, Italy.
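The copula step can be illustrated with a short sketch. Below, a Gumbel-Hougaard copula (one common choice; the paper's fitted family is not specified in the abstract) converts marginal non-exceedance probabilities of flood peak and volume into joint "OR" and "AND" return periods; θ and the mean interarrival time are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: joint return periods of flood peak Q and volume V.
def gumbel_copula(u, v, theta=2.0):
    # Gumbel-Hougaard copula C(u, v)
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta)
                    ** (1.0 / theta)))

mu = 1.0           # mean interarrival time of flood events, years (assumed)
u, v = 0.99, 0.99  # marginal non-exceedance probabilities of (Q, V)

# "OR" return period: exceeded when Q > q or V > v
T_or = mu / (1.0 - gumbel_copula(u, v))
# "AND" return period: both exceeded
T_and = mu / (1.0 - u - v + gumbel_copula(u, v))
print(round(T_or, 1), round(T_and, 1))   # ~70.9 and ~169.8 years
```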
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to probabilistically predict the range of ionospheric parameters. This problem is solved in this paper. Time series of the F2-layer critical frequency, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order were calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At sufficiently small probability levels, there are arbitrarily large deviations from the normal-process model. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. The analysis supports the applicability of a model based on the Poisson random process for the statistical description and probabilistic estimation of the variations {δfoF2} during heliogeophysical disturbances.
Cost Risk Analysis Based on Perception of the Engineering Process
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.
1986-01-01
In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project, as well as a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his or her area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources: the historical cost data may be in error by some unknown amount; the manager's evaluation of the new project and its similarity to the old project may be in error; and the factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible costs. The usual problem is not just to produce an estimate of the cost of a project, but to predict the range of values into which the cost may fall and the level of confidence with which the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus or minus some amount usually fail to account for the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. They are converted by a parametric cost model into a cost estimate. The method makes no assumptions about the distribution underlying possible costs and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
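A minimal Monte Carlo sketch of the parametric approach described: technical factors elicited as (low, likely, high) ranges are propagated through a stand-in cost estimating relationship to produce a cost risk curve. The factors and the cost equation are invented for illustration and are not LaRC's models.

```python
import numpy as np

# Hedged sketch: cost risk curve from elicited engineering-process parameters.
rng = np.random.default_rng(4)
n = 20000

weight   = rng.triangular(800, 1000, 1400, n)   # kg, elicited range
power    = rng.triangular(1.5, 2.0, 3.0, n)     # kW, elicited range
maturity = rng.triangular(0.8, 1.0, 1.5, n)     # technology factor

# Stand-in parametric cost estimating relationship ($M)
cost = 0.04 * weight ** 0.93 * power ** 0.25 * maturity

for p in (10, 30, 50, 70, 90):
    print(f"P{p}: {np.percentile(cost, p):6.1f} $M")
```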
Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.
Hougaard, P; Lee, M L; Whitmore, G A
1997-12-01
Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
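The gamma-mixed Poisson construction mentioned above can be verified numerically: mixing Poisson rates over a gamma distribution reproduces the negative binomial mean and variance. The shape and mean values below are arbitrary illustrations.

```python
import numpy as np
from scipy.stats import nbinom

# Hedged sketch: gamma-mixed Poisson vs the equivalent negative binomial.
rng = np.random.default_rng(5)
shape, mean = 2.0, 4.0                              # gamma shape, target mean
lam = rng.gamma(shape, mean / shape, size=100_000)  # random per-subject rates
counts = rng.poisson(lam)

# Equivalent negative binomial: r = shape, p = r / (r + mean)
r, p = shape, shape / (shape + mean)
print("simulated mean/var:", counts.mean().round(2), counts.var().round(2))
print("NB mean/var:       ", nbinom.mean(r, p).round(2),
      nbinom.var(r, p).round(2))   # expect mean 4, variance 12
```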
Shuttle Electrical Power Analysis Program (SEPAP); single string circuit analysis report
NASA Technical Reports Server (NTRS)
Murdock, C. R.
1974-01-01
An evaluation is reported of the data obtained from an analysis of the shuttle's distribution-network characteristics during a Spacelab mission. The approach used in developing the computer program and data base is described, and conclusions are drawn from the analysis of the data. Data sheets are provided to support the detailed discussion of each computer run.
NASA Astrophysics Data System (ADS)
Katzensteiner, H.; Bell, R.; Petschko, H.; Glade, T.
2012-04-01
The prediction and forecasting of widespread landsliding for a given triggering event is an open research question. Numerous studies have tried to link spatial rainfall and landslide distributions. This study focuses on analysing the relationship between intensive precipitation and rainfall-triggered shallow landslides in the year 2009 in Lower Austria. Landslide distributions were obtained from the building ground register maintained by the Geological Survey of Lower Austria, which contains detailed information on landslides registered through damage reports. Spatially distributed rainfall estimates were extracted from the INCA (Integrated Nowcasting through Comprehensive Analysis) precipitation analysis, a combination of station-data interpolation and radar data at a spatial resolution of 1 km developed by the Central Institute for Meteorology and Geodynamics (ZAMG), Vienna, Austria. The importance of the data source is shown by comparing rainfall data based on reference gauges, spatial interpolation, and the INCA analysis for a given storm period. INCA precipitation data can detect precipitating cells that do not hit a station but might trigger a landslide, which is an advantage over using reference stations to define rainfall thresholds. Empirical thresholds at the regional scale were determined from rainfall intensity and duration in the year 2009 together with the landslide information. These thresholds depend on the criteria that separate landslide-triggering from non-triggering precipitation events, and different approaches for defining thresholds alter the shape of the threshold as well. A preliminary threshold I = 8.8263·D^(-0.672) for extreme summer rainfall events in Lower Austria was defined. Verification of the threshold with similar events from other years, as well as further analyses based on a larger landslide database, are in progress.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of the data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread use in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and computation of the related confidence limits. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
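A minimal sketch of the K-M approach for left-censored data: values are flipped about a constant so nondetects become right-censored, a Kaplan-Meier survival curve is computed, and the result is flipped back into an empirical CDF. Tie-handling is simplified and the data are invented; this is not the authors' S-language implementation.

```python
import numpy as np

# Hedged sketch: Kaplan-Meier for left-censored (nondetect) data via flipping.
def km_left_censored(values, censored, flip=None):
    flip = flip if flip is not None else values.max() + 1.0
    t = flip - values                       # flipped "survival times"
    order = np.argsort(t)
    t, d = t[order], ~censored[order]       # d: event (detected) indicator
    s, surv = 1.0, {}
    n = len(t)
    for i in range(n):
        if d[i]:
            s *= 1.0 - 1.0 / (n - i)        # K-M step at each event time
        surv[t[i]] = s
    # S(flip - x) estimates P(X < x): the empirical CDF of the data
    return {flip - ti: si for ti, si in surv.items()}

vals = np.array([0.5, 0.5, 1.2, 2.0, 3.1, 0.1])    # 0.5, 0.1 are detection limits
cens = np.array([True, True, False, False, False, True])
cdf = km_left_censored(vals, cens)
for x in sorted(cdf):
    print(f"P(X < {x:.1f}) ~ {cdf[x]:.3f}")
```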
Hodson, Mark E; Benning, Liane G; Demarchi, Bea; Penkman, Kirsty E H; Rodriguez-Blanco, Juan D; Schofield, Paul F; Versteegh, Emma A A
Many biominerals form from amorphous calcium carbonate (ACC), but this phase is highly unstable when synthesised in its pure form inorganically. Several species of earthworm secrete calcium carbonate granules which contain highly stable ACC. We analysed the milky fluid from which granules form, and solid granules, for amino acid (by liquid chromatography) and functional group (by Fourier transform infrared (FTIR) spectroscopy) compositions. Granule elemental composition was determined using inductively coupled plasma-optical emission spectroscopy (ICP-OES) and electron microprobe analysis (EMPA). The mass of ACC present in solid granules was quantified using FTIR and compared to granule elemental and amino acid compositions. Bulk analysis of granules was performed on powdered bulk material; spatially resolved analysis was performed on thin sections of granules using synchrotron-based μ-FTIR and EMPA. The milky fluid from which granules form is amino acid-rich (≤136 ± 3 nmol mg−1 (n = 3; ± std dev) per individual amino acid); the CaCO3 phase present is ACC. Even four years after production, granules contain ACC. No correlation exists between the mass of ACC present and granule elemental composition. Granule amino acid concentrations correlate well with ACC content (r ≥ 0.7, p ≤ 0.05), consistent with a role for amino acids (or the proteins they make up) in ACC stabilisation. Intra-granule variation in ACC (RSD = 16%) and amino acid concentration (RSD = 22-35%) was high for granules produced by the same earthworm. Maps of ACC distribution produced using synchrotron-based μ-FTIR mapping of granule thin sections and the relative intensity of the ν2:ν4 peak ratio, cluster analysis, and component regression using ACC and calcite standards showed similar spatial distributions of likely ACC-rich and calcite-rich areas. We could not identify organic peaks in the μ-FTIR spectra and thus could not determine whether ACC-rich domains also had relatively high amino acid concentrations. No correlation exists between ACC distribution and elemental concentrations determined by EMPA. The ACC present in earthworm CaCO3 granules is highly stable, and our results suggest a role for amino acids (or proteins) in this stability. We see no evidence for stabilisation of ACC by incorporation of inorganic components. Graphical abstract: synchrotron-based μ-FTIR mapping was used to determine the spatial distribution of amorphous calcium carbonate in earthworm-produced CaCO3 granules.
Simões, Pedro Ivo; Kaefer, Igor Luis; Farias, Izeni Pires; Lima, Albertina Pimentel
2013-12-12
We describe the advertisement calls and color in life of Allobates sumtuosus (Morales 2002) based on specimens recorded and collected at its type locality in Reserva Biológica do Rio Trombetas, Brazilian Amazonia. We also improve the species diagnosis by adding information on character states frequently used in current Allobates taxonomy. Finally, we analyze genetic distances and the evolutionary relationships between typical A. sumtuosus and other Allobates species distributed in Brazil and along the Guiana Shield region using a fragment of the 16S rDNA mitochondrial gene. Based on this integrative analysis, we propose the synonymy of Allobates spumaponens Kok & Ernst 2007 with A. sumtuosus and provide an updated geographic distribution for the species.
Thermographic Analysis of Stress Distribution in Welded Joints
NASA Astrophysics Data System (ADS)
Piršić, T.; Krstulović Opara, L.; Domazet, Ž.
2010-06-01
The prediction of fatigue life for welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surfaces, and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis, and FEM. The obtained results show good agreement: TSA confirmed both the FEM model and the stresses measured by strain gauges. According to these results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.
Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun
2016-09-21
As data on HIV-1 accumulate, molecular models describing the details of HIV-1 genetic recombination require periodic refinement. An incomplete structural understanding of the copy-choice mechanism, along with several other unresolved issues in the field, led us to analyze the correlation between breakpoint distributions and (1) the probability of base pairing and (2) intersubtype genetic similarity, to further explore structural mechanisms. Near-full-length sequences of URFs from Asia, Europe, and Africa (one sequence per patient), and representative sequences of worldwide CRFs, were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed in detail with jpHMM. The relationships between breakpoint distributions and (1) the probability of base pairing and (2) intersubtype genetic similarities were then investigated. The Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and unexplained limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which was further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors, in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate that regions with high pairing probability may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our findings will be relevant to furthering the understanding of HIV-1 recombination mechanisms.
Chen, Hao; Lu, Xinwei; Li, Loretta Y; Gao, Tianning; Chang, Yuyu
2014-06-15
The concentrations of As, Ba, Co, Cr, Cu, Mn, Ni, Pb, V and Zn in campus dust from kindergartens, elementary schools, middle schools and universities of Xi'an, China were determined by X-ray fluorescence spectrometry. Correlation coefficient analysis, principal component analysis (PCA) and cluster analysis (CA) were used to analyze the data and to identify possible sources of these metals in the dust. The spatial distributions of metals in urban dust of Xi'an were analyzed based on the metal concentrations in campus dusts using the geostatistics method. The results indicate that dust samples from campuses have elevated metal concentrations, especially for Pb, Zn, Co, Cu, Cr and Ba, with the mean values of 7.1, 5.6, 3.7, 2.9, 2.5 and 1.9 times the background values for Shaanxi soil, respectively. The enrichment factor results indicate that Mn, Ni, V, As and Ba in the campus dust were deficiently to minimally enriched, mainly affected by nature and partly by anthropogenic sources, while Co, Cr, Cu, Pb and Zn in the campus dust and especially Pb and Zn were mostly affected by human activities. As and Cu, Mn and Ni, Ba and V, and Pb and Zn had similar distribution patterns. The southwest high-tech industrial area and south commercial and residential areas have relatively high levels of most metals. Three main sources were identified based on correlation coefficient analysis, PCA, CA, as well as spatial distribution characteristics. As, Ni, Cu, Mn, Pb, Zn and Cr have mixed sources - nature, traffic, as well as fossil fuel combustion and weathering of materials. Ba and V are mainly derived from nature, but partly also from industrial emissions, as well as construction sources, while Co principally originates from construction.
NASA Astrophysics Data System (ADS)
Vauchy, Romain; Robisson, Anne-Charlotte; Martin, Philippe M.; Belin, Renaud C.; Aufore, Laurence; Scheinost, Andreas C.; Hodaj, Fiqiri
2015-01-01
The impact of the homogeneity of the cation distribution in the U0.54Pu0.45Am0.01O2-x mixed oxide on the americium oxidation state was studied by coupling X-ray diffraction (XRD), electron probe microanalysis (EPMA), and X-ray absorption spectroscopy (XAS). Oxygen-hypostoichiometric Am-bearing uranium-plutonium mixed oxide pellets were fabricated by two different co-milling-based processes in order to obtain different cation distribution homogeneities. The americium was generated from β− decay of 241Pu. XRD analysis of the obtained compounds did not reveal any structural difference between the samples. EPMA, however, revealed a highly homogeneous cation distribution for one sample and substantial heterogeneity of the U-Pu (and hence Am) distribution for the other. The difference in cation distribution was linked to a difference in Am chemistry as investigated by XAS, with Am present in a mixed +III/+IV oxidation state in the heterogeneous compound, whereas only Am(IV) was observed in the homogeneous compound. Previously reported discrepancies in Am oxidation states can hence be explained by cation-distribution homogeneity effects.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount instead of a fixed time window. We analyse the statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters, and flashiness indicators to the sampling scale. We do this for streamflow time series from 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results show that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low-flow and peak-flow values in the statistical distribution. While conventional sampling gives much weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high-flow values, since a given flow amount is accumulated in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low-flow periods. We present results of statistical analyses across a range of subdaily to seasonal scales and highlight insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
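The IAT sampling idea can be sketched in a few lines: instead of sampling flow at fixed time steps, one records the time taken to accumulate each fixed flow amount. The synthetic flow series and the unit amount below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: inter-amount time (IAT) sampling of a flow time series.
rng = np.random.default_rng(6)
dt = 0.25                                  # hours per conventional sample
flow = rng.gamma(0.3, 2.0, 4000)           # m3/s, flashy synthetic stand-in

amount_step = 5000.0                       # m3, the fixed flow amount
cumulative = np.cumsum(flow * dt * 3600.0) # m3 accumulated over time
targets = np.arange(amount_step, cumulative[-1], amount_step)

# time (in hours) at which each target amount is first reached
idx = np.searchsorted(cumulative, targets)
times = idx * dt
iats = np.diff(times)                      # inter-amount times

print("n IATs:", len(iats),
      "median IAT (h):", round(float(np.median(iats)), 2))
```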
Model-based Bayesian inference for ROC data analysis
NASA Astrophysics Data System (ADS)
Lei, Tianhu; Bae, K. Ty
2013-03-01
This paper presents a study of model-based Bayesian inference applied to receiver operating characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a zero-one covariate to express the two binormal distributions in a single formula, and it also includes a scale parameter. Bayesian inference is implemented by the Markov chain Monte Carlo (MCMC) method carried out with Bayesian analysis Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, the posterior distributions of the parameters, and hence the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires that the probability density function (pdf) from which samples are drawn be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, the ARS requirement is satisfied, and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets, with 20 simulations from each data set, is used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures illustrate our practice within the framework of model-based Bayesian inference using the MCMC method.
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating reduces the insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identification and warning of cable overheating risk are therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system to improve the impedance parameter estimation accuracy. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from the distribution SCADA system's forecast data. Thirdly, a library of cable-overheating risk warning rules is established; the cable impedance forecast value is calculated and the rate of change of impedance analysed, and the overheating risk of the cable line is then flagged according to the rules library, based on the relationship between impedance variation and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in the distribution network. The overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
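A minimal sketch of the least-squares impedance estimation step, assuming a standard linearised voltage-drop relation ΔV ≈ (P·R + Q·X)/V; the SCADA-like measurements are synthetic and the model form is an assumption, since the paper's exact formulation is not given in the abstract.

```python
import numpy as np

# Hedged sketch: estimate feeder series impedance R + jX by least squares.
rng = np.random.default_rng(7)
n = 200
P = rng.uniform(0.5e6, 2.0e6, n)       # active power, W
Q = rng.uniform(0.1e6, 0.8e6, n)       # reactive power, var
Vs = np.full(n, 10.0e3)                # sending-end voltage, V (10-kV feeder)
R_true, X_true = 0.8, 1.1              # ohms, synthetic ground truth

# "Measured" voltage drop with noise, from the assumed linearised relation
dV = (P * R_true + Q * X_true) / Vs + rng.normal(0, 2.0, n)

A = np.column_stack([P / Vs, Q / Vs])
(R_est, X_est), *_ = np.linalg.lstsq(A, dV, rcond=None)
print("R, X estimates:", round(R_est, 3), round(X_est, 3))
```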
ReSTART: A Novel Framework for Resource-Based Triage in Mass-Casualty Events.
Mills, Alex F; Argon, Nilay T; Ziya, Serhan; Hiestand, Brian; Winslow, James
2014-01-01
Current guidelines for mass-casualty triage do not explicitly use information about resource availability. Even though this limitation has been widely recognized, how it should be addressed remains largely unexplored. The authors present a novel framework developed using operations research methods to account for resource limitations when determining priorities for transportation of critically injured patients. To illustrate how this framework can be used, they also develop two specific example methods, named ReSTART and Simple-ReSTART, both of which extend the widely adopted triage protocol Simple Triage and Rapid Treatment (START) by using a simple calculation to determine priorities based on the relative scarcity of transportation resources. The framework is supported by three techniques from operations research: mathematical analysis, optimization, and discrete-event simulation. The authors' algorithms were developed using mathematical analysis and optimization and then extensively tested using 9,000 discrete-event simulations on three distributions of patient severity (representing low, random, and high acuity). For each incident, the expected number of survivors was calculated under START, ReSTART, and Simple-ReSTART. A web-based decision support tool was constructed to help providers make prioritization decisions in the aftermath of mass-casualty incidents based on ReSTART. In simulations, ReSTART resulted in significantly lower mortality than START regardless of which severity distribution was used (paired t test, p < .01). The mean decrease in critical mortality, the percentage of immediate and delayed patients who die, was 8.5% for the low-acuity distribution (range −2.2% to 21.1%), 9.3% for the random distribution (range −0.2% to 21.2%), and 9.1% for the high-acuity distribution (range −0.7% to 21.1%). Although the critical mortality improvement due to ReSTART differed across the three severity distributions, the variation was less than 1 percentage point, indicating that the ReSTART policy is relatively robust to different severity distributions. Taking resource limitations into account in mass-casualty triage has the potential to increase the expected number of survivors. Further validation is required before field implementation; however, the framework proposed here can serve as the foundation for future work in this area.
Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko
2014-01-01
The objective was to derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was selected based on the Akaike information criterion. 95% confidence intervals were obtained for the optimum injury probability distribution, and the relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution provided the best fit of the three distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-olds at the 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because procedures used in the present survival analysis are accepted by international automotive communities, the current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines.
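As a worked illustration of the final step, the sketch below inverts a Weibull injury-probability curve to recover peak forces at chosen risk levels. The shape and scale values are hypothetical placeholders, not the paper's fitted, age-adjusted parameters.

```python
from scipy.stats import weibull_min

# Hypothetical Weibull parameters for one age group (not the paper's values).
shape, scale = 3.5, 9.0   # k (dimensionless), lambda (kN)

# Injury probability is the CDF p(f) = 1 - exp(-(f/lambda)^k);
# the force at a given risk level is the corresponding quantile.
for risk in (0.05, 0.25, 0.50):
    force = weibull_min.ppf(risk, shape, scale=scale)
    print(f"{int(risk * 100):>2d}% risk: peak force = {force:.1f} kN")
```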
Social Climate Indicators for the U.S. Army
1991-08-01
Based on a secondary analysis of the 1978 Army Life study. Sample: subjects from 8,140 personnel assigned to 60 different battalions. The report also describes a related Naval Postgraduate School study of disillusionment and its effects in the United States Navy (AD-A127 311; Monterey, CA). Approved for public release; distribution is unlimited.
Jacob Gibson; Gretchen Moisen; Tracey Frescino; Thomas C. Edwards
2013-01-01
Species distribution models (SDMs) were built with US Forest Inventory and Analysis (FIA) publicly available plot coordinates, which are altered for plot security purposes, and compared with SDMs built with true plot coordinates. Six species endemic to the western US, including four junipers (Juniperus deppeana var. deppeana, J. monosperma, J. occidentalis, J....
Using Honeynets and the Diamond Model for ICS Threat Analysis
2016-05-11
Y. Chen; S. J. Seybold
2013-01-01
Instar determination of field-collected insect larvae has generally been based on the analysis of head capsule width frequency distributions or bivariate plotting, but few studies have tested the validity of such methods. We used head capsules from exuviae of known instars of the beet armyworm, Spodoptera exigua (Hübner) (Lepidoptera: Noctuidae),...
Simple 2.5 GHz time-bin quantum key distribution
NASA Astrophysics Data System (ADS)
Boaron, Alberto; Korzh, Boris; Houlmann, Raphael; Boso, Gianluca; Rusca, Davide; Gray, Stuart; Li, Ming-Jun; Nolan, Daniel; Martin, Anthony; Zbinden, Hugo
2018-04-01
We present a 2.5 GHz quantum key distribution setup with the emphasis on a simple experimental realization. It features a three-state time-bin protocol based on a pulsed diode laser and a single intensity modulator. Implementing an efficient one-decoy scheme and finite-key analysis, we achieve record-breaking secret key rates of 1.5 kbps over 200 km of standard optical fibers.
Distributed measurement of high electric current by means of polarimetric optical fiber sensor.
Palmieri, Luca; Sarchi, Davide; Galtarossa, Andrea
2015-05-04
A novel distributed optical fiber sensor for spatially resolved monitoring of high direct electric current is proposed and analyzed. The sensor exploits Faraday rotation and is based on the polarization analysis of the Rayleigh backscattered light. Preliminary laboratory tests, performed on a section of electric cable for currents up to 2.5 kA, have confirmed the viability of the method.
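A back-of-the-envelope sketch of the underlying Faraday relation: for a fiber enclosing the conductor N times, the accumulated polarization rotation is proportional to the enclosed current. The effective Verdet constant below is an assumed nominal value for silica fiber, not a figure from the paper, and the distributed (spatially resolved) aspect is not reproduced.

```python
# Assumed effective Verdet constant of silica fiber, rad per ampere-turn
# (order of magnitude typical near 1550 nm; illustrative only).
V_A = 0.54e-6

def current_from_rotation(theta_rad, n_turns):
    """Invert the Faraday relation theta = V_A * N * I for the enclosed current."""
    return theta_rad / (V_A * n_turns)

# Example: 27 mrad of accumulated rotation over 20 fiber turns -> 2.5 kA.
print(f"I = {current_from_rotation(27e-3, 20):.0f} A")
```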
Hai Ren; Qianmei Zhang; Zhengfeng Wang; Qinfeng Guo; June Wang; Nan Liu; Kaiming Liang
2010-01-01
The distribution of the rare and endangered perennial herb Primulina tabacum Hance is restricted to eight karst caves in southern China. To conserve P. tabacum and to evaluate possible reintroduction, we studied its historical distribution and conducted field surveys of both its biotic and physical environment. We used detrended...
Matsumoto, Masatoshi; Koike, Soichi; Kashima, Saori; Awai, Kazuo
2015-01-01
Japan has the most CT and MRI scanners per unit population in the world, and as these technologies spread, their geographic distribution is becoming equalized. In contrast, the number of radiologists per unit population in Japan is the lowest among OECD countries, and their geographic distribution is unknown. Likewise, little is known about the use of teleradiology, which can compensate for the uneven distribution of radiologists. Based on the Survey of Physicians, Dentists and Pharmacists and the Static Survey of Medical Institutions by the Ministry of Health, Labour and Welfare, a dataset of radiologists and CT and MRI utilizations in each of Japan's 1811 municipalities was created. The inter-municipality equity of the number of radiologists was evaluated using the Gini coefficient. Logistic regression analysis, based on Static Survey data, was performed to evaluate the association between hospital location and teleradiology use. Between 2006 and 2012 the number of radiologists increased by 21.7%, but the Gini coefficient remained unchanged. The number of radiologists per 1,000 CT (MRI) utilizations decreased by 17.9% (1.0%); the number was highest in metropolises and lowest in towns/villages, and the disparity widened from 1.9 to 2.2 (1.6 to 2.0) times. The number of hospitals and clinics using teleradiology has increased (by 69.6% and 18.1%, respectively). Hospitals located in towns/villages (odds ratio 1.61; 95% confidence interval 1.26-2.07) were more likely to use teleradiology than those in metropolises. Contrary to the CT and MRI distributions, the radiologist distribution has not been evened out by the increase in their number; in other words, the distribution of radiologists was not affected by market-derived spatial competition forces. As a consequence, the gap in radiologist shortage between urban and rural areas is increasing. Teleradiology, which is one way to ameliorate this gap, should be encouraged.
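The Gini coefficient used here is straightforward to compute from per-municipality values; a minimal sketch over synthetic per-capita counts standing in for the 1811 municipalities (the study's population weighting, if any, is not reproduced):

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative array (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Rank-weighted formula: G = sum((2i - n - 1) * x_i) / (n * sum(x)).
    return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

# Hypothetical radiologists-per-100k values across municipalities.
rng = np.random.default_rng(1)
per_capita = rng.lognormal(mean=1.0, sigma=0.8, size=1811)
print(f"Gini = {gini(per_capita):.3f}")
```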
NASA Astrophysics Data System (ADS)
Hwang, Jae-Sang; Seong, Jae-Kyu; Shin, Woo-Ju; Lee, Jong-Geon; Cho, Jeon-Wook; Ryoo, Hee-Suk; Lee, Bang-Wook
2013-11-01
High temperature superconducting (HTS) cable has received much attention due to its high efficiency and high current transport capability, and it is also regarded as an eco-friendly power cable for the next generation. DC HTS cable in particular has more sustainable and stable properties than AC HTS cable owing to the absence of AC loss. Recently, DC HTS cable has been investigated competitively all over the world, and one of its key components to be developed is a cable joint box suited to the HVDC environment. In order to achieve an optimum insulation design of the joint box, analysis of the DC electric field distribution in the joint box is a fundamental step in developing DC HTS cable. Generally, the AC electric field distribution depends on the relative permittivity of the dielectric materials, but in the DC case the electrical conductivity of the dielectric material is the dominant factor determining the field distribution. In this study, in order to evaluate the DC electric field characteristics of the joint box for DC HTS cable, a polypropylene laminated paper (PPLP) specimen was prepared and its DC electric field distribution was analyzed based on measurements of the electrical conductivity of PPLP in liquid nitrogen (LN2). The electrical conductivity of PPLP in LN2 had not been reported previously, yet it must be measured for DC electric field analysis; the experimental work for measuring it is presented in this paper. Based on these experiments, the DC electric field distribution of the PPLP specimen was fully analyzed for both the steady state and the transient state of DC. Consequently, it was possible to determine the electric field distribution characteristics under different DC operating stages, including DC switch-on, DC switch-off and polarity reversal conditions.
School Technology Leadership: Artifacts in Systems of Practice
ERIC Educational Resources Information Center
Dexter, Sara
2011-01-01
A cross-case analysis of five case studies of team-based technology leadership in middle schools with laptop programs identifies systems of practice that organize teams' distributed leadership. These cases suggest that successfully implementing a complex improvement effort warrants a team-based leadership approach, especially for an improvement…
A Three-Level Analysis of Collaborative Learning in Dual-Interaction Spaces
ERIC Educational Resources Information Center
Lonchamp, Jacques
2009-01-01
CSCL systems which follow the dual-interaction spaces paradigm support the synchronous construction and discussion of shared artifacts by distributed or colocated small groups of learners. The most recent generic dual-interaction space environments, either model based or component based, can be deeply customized by teachers for supporting…
Clustering, randomness, and regularity in cloud fields: 2. Cumulus cloud fields
NASA Astrophysics Data System (ADS)
Zhu, T.; Lee, J.; Weger, R. C.; Welch, R. M.
1992-12-01
During the last decade a major controversy has been brewing concerning the proper characterization of cumulus convection. The prevailing view has been that cumulus clouds form in clusters, in which cloud spacing is closer than that found for the overall cloud field and which maintains its identity over many cloud lifetimes. This "mutual protection hypothesis" of Randall and Huffman (1980) has been challenged by the "inhibition hypothesis" of Ramirez and Bras (1990), which strongly suggests that the spatial distribution of cumuli must tend toward a regular distribution. A dilemma has resulted because observations have been reported to support both hypotheses. The present work reports a detailed analysis of cumulus cloud field spatial distributions based upon Landsat, Advanced Very High Resolution Radiometer, and Skylab data. Both nearest-neighbor and point-to-cloud cumulative distribution function statistics are investigated. The results show unequivocally that when both large and small clouds are included in the cloud field distribution, the cloud field always has a strong clustering signal. The strength of clustering is largest at cloud diameters of about 200-300 m, diminishing with increasing cloud diameter. In many cases, clusters of small clouds are found which are not closely associated with large clouds. As the small clouds are eliminated from consideration, the cloud field typically tends towards regularity. Thus it would appear that the "inhibition hypothesis" of Ramirez and Bras (1990) has been verified for the large clouds. However, these results are based upon the analysis of point processes. A more exact analysis is also made which takes into account the cloud size distributions. Since distinct clouds are by definition nonoverlapping, cloud size effects place a restriction upon the possible locations of clouds in the cloud field. The net effect of this analysis is that the large clouds appear to be randomly distributed, with only weak tendencies towards regularity. For clouds less than 1 km in diameter, the average nearest-neighbor distance is equal to 3-7 cloud diameters. For larger clouds, the ratio of cloud nearest-neighbor distance to cloud diameter increases sharply with increasing cloud diameter. This demonstrates that large clouds inhibit the growth of other large clouds in their vicinity. Nevertheless, this leads to random distributions of large clouds, not regularity.
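The nearest-neighbor statistics at the heart of this debate can be summarized by the classic Clark-Evans ratio; below is a minimal sketch for a synthetic point pattern, ignoring the edge corrections and cloud-size effects that the paper shows are important.

```python
import numpy as np
from scipy.spatial import cKDTree

# Illustrative point pattern standing in for cloud centers in a unit square.
rng = np.random.default_rng(2)
pts = rng.random((500, 2))

# Mean nearest-neighbor distance (k=2: the first neighbor is the point itself).
d, _ = cKDTree(pts).query(pts, k=2)
r_obs = d[:, 1].mean()

# Expected mean NN distance under complete spatial randomness (CSR).
density = len(pts) / 1.0          # points per unit area
r_csr = 0.5 / np.sqrt(density)

R = r_obs / r_csr                 # Clark-Evans ratio
print(f"R = {R:.2f}  (<1 clustered, ~1 random, >1 regular)")
```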
Spain, S L; Pedroso, I; Kadeva, N; Miller, M B; Iacono, W G; McGue, M; Stergiakouli, E; Smith, G D; Putallaz, M; Lubinski, D; Meaburn, E L; Plomin, R; Simpson, M A
2016-01-01
Although individual differences in intelligence (general cognitive ability) are highly heritable, molecular genetic analyses to date have had limited success in identifying specific loci responsible for its heritability. This study is the first to investigate exome variation in individuals of extremely high intelligence. Under the quantitative genetic model, sampling from the high extreme of the distribution should provide increased power to detect associations. We therefore performed a case–control association analysis with 1409 individuals drawn from the top 0.0003 (IQ >170) of the population distribution of intelligence and 3253 unselected population-based controls. Our analysis focused on putative functional exonic variants assayed on the Illumina HumanExome BeadChip. We did not observe any individual protein-altering variants that are reproducibly associated with extremely high intelligence and within the entire distribution of intelligence. Moreover, no significant associations were found for multiple rare alleles within individual genes. However, analyses using genome-wide similarity between unrelated individuals (genome-wide complex trait analysis) indicate that the genotyped functional protein-altering variation yields a heritability estimate of 17.4% (s.e. 1.7%) based on a liability model. In addition, investigation of nominally significant associations revealed fewer rare alleles associated with extremely high intelligence than would be expected under the null hypothesis. This observation is consistent with the hypothesis that rare functional alleles are more frequently detrimental than beneficial to intelligence. PMID:26239293
Addressing data privacy in matched studies via virtual pooling.
Saha-Chaudhuri, P; Weinberg, C R
2017-09-07
Data confidentiality and shared use of research data are two desirable but sometimes conflicting goals in research with multi-center studies and distributed data. While ideal for straightforward analysis, confidentiality restrictions forbid creation of a single dataset that includes covariate information of all participants. Current approaches such as aggregate data sharing, distributed regression, meta-analysis and score-based methods can have important limitations. We propose a novel application of an existing epidemiologic tool, specimen pooling, to enable confidentiality-preserving analysis of data arising from a matched case-control, multi-center design. Instead of pooling specimens prior to assay, we apply the methodology to virtually pool (aggregate) covariates within nodes. Such virtual pooling retains most of the information used in an analysis with individual data and since individual participant data is not shared externally, within-node virtual pooling preserves data confidentiality. We show that aggregated covariate levels can be used in a conditional logistic regression model to estimate individual-level odds ratios of interest. The parameter estimates from the standard conditional logistic regression are compared to the estimates based on a conditional logistic regression model with aggregated data. The parameter estimates are shown to be similar to those without pooling and to have comparable standard errors and confidence interval coverage. Virtual data pooling can be used to maintain confidentiality of data from multi-center study and can be particularly useful in research with large-scale distributed data.
NASA Astrophysics Data System (ADS)
Holá, Markéta; Kalvoda, Jiří; Nováková, Hana; Škoda, Radek; Kanický, Viktor
2011-01-01
LA-ICP-MS and solution-based ICP-MS in combination with electron microprobe are presented as methods for determining the elemental spatial distribution in fish scales, which represent an example of a heterogeneous layered bone structure. Two different LA-ICP-MS techniques were tested on recent common carp (Cyprinus carpio) scales. The first was a line scan through the whole fish scale perpendicular to the growth rings: an ablation crater of 55 μm width and 50 μm depth allowed analysis of the elemental distribution in the external layer, while ablation conditions providing a deeper crater gave average values from the external HAP layer and the collagen basal plate. The second, depth profiling using spot analysis, was tested on fish scales for the first time; spot analysis provides information about the depth profile of the elements at a selected position on the sample. The combination of all the mentioned laser ablation techniques provides complete information about the elemental distribution in the fish scale samples. The results were compared with solution-based ICP-MS and EMP analyses. The fact that the depth-profiling results are in good agreement both with EMP and PIXE results and with the assumed modes of incorporation of the studied elements in the HAP structure suggests a very good potential for this method.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
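A minimal sketch of the inter-amount-time idea: accumulate flow, record the time needed for each successive fixed amount, and use the variability of those times as a flashiness measure. The hydrograph and the target amount below are synthetic assumptions, not the paper's data.

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Times needed to accumulate each successive fixed `amount` of flow.

    t : sample times (e.g., hours); q : nonnegative flow at those times.
    Returns the sequence of inter-amount times (same units as t).
    """
    # Trapezoidal cumulative volume, then interpolate the crossing times.
    cum = np.concatenate([[0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(t))])
    targets = np.arange(amount, cum[-1], amount)
    crossing_times = np.interp(targets, cum, t)   # cum is nondecreasing
    return np.diff(np.concatenate([[t[0]], crossing_times]))

# Illustrative flashy hydrograph (synthetic).
t = np.linspace(0, 240, 2401)                     # hours
flashy = 1.0 + 9.0 * np.sin(t / 6) ** 8
iat = inter_amount_times(t, flashy, amount=50.0)
print(f"n={iat.size}, mean={iat.mean():.1f} h, CV={iat.std() / iat.mean():.2f}")
```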
Imaging Analysis of Near-Field Recording Technique for Observation of Biological Specimens
NASA Astrophysics Data System (ADS)
Moriguchi, Chihiro; Ohta, Akihiro; Egami, Chikara; Kawata, Yoshimasa; Terakawa, Susumu; Tsuchimori, Masaaki; Watanabe, Osamu
2006-07-01
We present an analysis of the properties of imaging based on a near-field recording technique, in comparison with simulation results. In the system, the optical field distributions localized near the specimens are recorded as surface topographic distributions on a photosensitive film. It is possible to observe both soft and moving specimens, because the system does not require a scanning probe to obtain the image. The imaging properties are evaluated using the fine structures of a paramecium, and we demonstrate that it is possible to observe minute differences in refractive index.
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D
NASA Astrophysics Data System (ADS)
Bales, Ben; Pollock, Tresa; Petzold, Linda
2017-06-01
Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and the subsequent analysis process. The downside is that computing micrograph segmentations from data on morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
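The histogram-of-oriented-gradients descriptor the authors build on is available in scikit-image; here is a minimal sketch on a synthetic two-phase image (the cell and block parameters are illustrative choices, not the paper's):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import hog

# Synthetic two-phase micrograph stand-in: thresholded, smoothed noise.
rng = np.random.default_rng(3)
img = (gaussian_filter(rng.random((128, 128)), sigma=4) > 0.5).astype(float)

# HOG summarizes the orientation statistics of phase boundaries directly,
# with no segmentation step in between.
features = hog(img, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2), feature_vector=True)
print(features.shape)
```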
NASA Astrophysics Data System (ADS)
Ochoa, Diego Alejandro; García, Jose Eduardo
2016-04-01
The Preisach model is a classical method for describing nonlinear behavior in hysteretic systems. According to this model, a hysteretic system contains a collection of simple bistable units which are characterized by an internal field and a coercive field. This set of bistable units exhibits a statistical distribution that depends on these fields as parameters. Thus, nonlinear response depends on the specific distribution function associated with the material. This model is satisfactorily used in this work to describe the temperature-dependent ferroelectric response in PZT- and KNN-based piezoceramics. A distribution function expanded in Maclaurin series considering only the first terms in the internal field and the coercive field is proposed. Changes in coefficient relations of a single distribution function allow us to explain the complex temperature dependence of hard piezoceramic behavior. A similar analysis based on the same form of the distribution function shows that the KNL-NTS properties soften around its orthorhombic to tetragonal phase transition.
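A minimal numerical Preisach sketch: a grid of bistable hysterons weighted by an illustrative Gaussian distribution in the internal-field/coercive-field coordinates (not the paper's fitted Maclaurin-series form), swept through one field cycle to produce a hysteresis loop.

```python
import numpy as np

# Discretize the Preisach plane: hysterons with "up" switching field alpha
# and "down" switching field beta (alpha >= beta), weighted by mu.
n = 60
a = np.linspace(-1, 1, n)
A, B = np.meshgrid(a, a, indexing="ij")            # A = alpha, B = beta
mask = A >= B
h_i, h_c = (A + B) / 2, (A - B) / 2                # internal field, coercive field
mu = np.exp(-(h_i / 0.3) ** 2 - ((h_c - 0.25) / 0.2) ** 2) * mask
mu /= mu.sum()

state = -np.ones_like(mu)                          # all hysterons start "down"
out = []
H = np.concatenate([np.linspace(-1, 1, 200), np.linspace(1, -1, 200)])
for h in H:
    state[(h >= A) & mask] = 1                     # switch up once field exceeds alpha
    state[(h <= B) & mask] = -1                    # switch down once field drops below beta
    out.append((mu * state).sum())                 # weighted response traces the loop
print(f"loop opening near H=0: {out[299] - out[99]:.3f}")
```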
Janković, Bojan; Marinović-Cincović, Milena; Janković, Marija
2017-09-01
Kinetics of degradation of Aronia melanocarpa fresh fruits in argon and air atmospheres were investigated. The investigation was based on probability distributions of the apparent activation energy (ε_a) of the counterparts. Isoconversional analysis results indicated that the degradation process in an inert atmosphere was governed by decomposition reactions of esterified compounds. Based on the same kinetic approach, it was assumed that in an air atmosphere the primary compounds in the degradation pathways could be anthocyanins, which undergo rapid chemical reactions. A new model of reactivity demonstrated that, under an inert atmosphere, expectation values of ε_a occurred at levels of statistical probability corresponding to decomposition processes in which polyphenolic compounds might be involved; the ε_a values obeyed the laws of the binomial distribution. It was established that, for thermo-oxidative degradation, the Poisson distribution represented a very successful approximation for ε_a values where there was additional mechanistic complexity and the binomial distribution was no longer valid.
Wijetunge, Chalini D; Saeed, Isaam; Boughton, Berin A; Spraggins, Jeffrey M; Caprioli, Richard M; Bacic, Antony; Roessner, Ute; Halgamuge, Saman K
2015-10-01
Matrix Assisted Laser Desorption Ionization-Imaging Mass Spectrometry (MALDI-IMS) in 'omics' data acquisition generates detailed information about the spatial distribution of molecules in a given biological sample. Various data processing methods have been developed for exploring the resultant high-volume data. However, most of these methods process data in the spectral domain and do not make the most of the important spatial information available through this technology. Therefore, we propose a novel streamlined data analysis pipeline specifically developed for MALDI-IMS data that utilizes significant spatial information for identifying hidden significant molecular distribution patterns in these complex datasets. The proposed unsupervised algorithm uses Sliding Window Normalization (SWN) and a new spatial-distribution-based peak picking method built on Gray level Co-Occurrence (GCO) matrices, followed by clustering of biomolecules. We also use gist descriptors and an improved version of GCO matrices to extract features from molecular images, and minimum medoid distance to automatically estimate the number of possible groups. We evaluated our algorithm using a new MALDI-IMS metabolomics dataset of a plant (Eucalypt) leaf. The algorithm revealed hidden significant molecular distribution patterns in the dataset, which the current Component Analysis and Segmentation Map based approaches failed to extract. We further demonstrate the performance of our peak picking method over other traditional approaches by using a publicly available MALDI-IMS proteomics dataset of a rat brain. Although SWN did not show any significant improvement as compared with using no normalization, the visual assessment showed an improvement as compared to using median normalization. The source code and sample data are freely available at http://exims.sourceforge.net/.
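As a sketch of the gray-level co-occurrence building block (not the authors' full spatial peak-picking pipeline), the following computes a normalized GLCM and one Haralick-style statistic for a stand-in ion image:

```python
import numpy as np

def glcm(img, levels=8, offset=(0, 1)):
    """Gray-level co-occurrence matrix for one pixel offset (symmetric, normalized)."""
    q = np.minimum((img * levels).astype(int), levels - 1)   # quantize intensities
    dy, dx = offset
    a = q[max(0, -dy):q.shape[0] - max(0, dy), max(0, -dx):q.shape[1] - max(0, dx)]
    b = q[max(0, dy):, max(0, dx):][:a.shape[0], :a.shape[1]]
    m = np.zeros((levels, levels))
    np.add.at(m, (a.ravel(), b.ravel()), 1)                  # count co-occurring pairs
    m += m.T                                                 # make symmetric
    return m / m.sum()

rng = np.random.default_rng(4)
ion_image = rng.random((64, 64))           # stand-in for one m/z intensity image
p = glcm(ion_image)
i, j = np.indices(p.shape)
contrast = ((i - j) ** 2 * p).sum()        # Haralick-style texture statistic
print(f"contrast = {contrast:.3f}")
```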
Research of working pulsation in closed angle based on rotating-sleeve distributing-flow system
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Zhang, Hongxin; Zhao, Qinghai; Jiang, Xiaotian; Cheng, Qianchang
2017-08-01
In order to reduce negative effects including hydraulic impact, noise and mechanical vibration, the compression and expansion of the piston pump in the closed volume are used to optimize the angle between the valve port and the chamber. In addition, mathematical models of pressurization and depressurization in the pump chamber are analyzed based on the distributing-flow characteristic, and the simulation software Fluent is used to simulate the distributing-flow fluid model so as to select the most suitable closed angle. As a result, when the compression angle is 3°, the behavior is closest to the theoretical analysis and has the minimum influence on the flow and pump pressure characteristics. Meanwhile, cavitation appears in the pump chamber to different degrees at different closed angles, and the flow pulsation becomes increasingly smaller with increasing expansion angle. Thus an expansion angle of 2° is most suitable for the distributing-flow system.
Analysis of fault-tolerant neurocontrol architectures
NASA Technical Reports Server (NTRS)
Troudet, T.; Merrill, W.
1992-01-01
The fault-tolerance of analog parallel distributed implementations of a multivariable aircraft neurocontroller is analyzed by simulating weight and neuron failures in a simplified scheme of analog processing based on the functional architecture of the ETANN chip (Electrically Trainable Artificial Neural Network). The neural information processing is found to be only partially distributed throughout the set of weights of the neurocontroller synthesized with the backpropagation algorithm. Although the degree of distribution of the neural processing, and consequently the fault-tolerance of the neurocontroller, could be enhanced using Locally Distributed Weight and Neuron Approaches, a satisfactory level of fault-tolerance could only be obtained by retraining the degraded VLSI neurocontroller. The possibility of maintaining neurocontrol performance and stability in the presence of single weight or neuron failures was demonstrated through an automated retraining procedure of the neurocontroller based on a pre-programmed choice and sequence of the training parameters.
Distributed Adaptive Fuzzy Control for Nonlinear Multiagent Systems Via Sliding Mode Observers.
Shen, Qikun; Shi, Peng; Shi, Yan
2016-12-01
In this paper, the problem of distributed adaptive fuzzy control is investigated for high-order uncertain nonlinear multiagent systems on a directed graph with a fixed topology. It is assumed that only the outputs of each follower and its neighbors are available in the design of its distributed controllers. Equivalent output injection sliding mode observers are proposed for each follower to estimate the states of itself and its neighbors, and an observer-based distributed adaptive controller is designed for each follower to guarantee that it asymptotically synchronizes to a leader with tracking errors that are semi-globally uniformly ultimately bounded, in which fuzzy logic systems are utilized to approximate unknown functions. Based on algebraic graph theory and a Lyapunov function approach, the closed-loop system stability analysis is conducted within the Filippov framework. Finally, numerical simulations are provided to illustrate the effectiveness and potential of the developed design techniques.
New vistas in the determination of hydrogen in aerospace engine metal alloys
NASA Technical Reports Server (NTRS)
Danford, M. D.
1986-01-01
The application of diffusion theory to the analysis of hydrogen desorption data has been studied. From these analyses, important information concerning hydrogen solubilities and the nature of the hydrogen distributions in the metal has been obtained. Two nickel base alloys, Rene' 41 and Waspaloy, and one ferrous alloy, 4340 steel, were studied in this work. For the nickel base alloys, it was found that the hydrogen distributions after electrolytic charging conformed closely to those which would be predicted by diffusion theory. The hydrogen distributions in electrolytically charged 4340 steel, on the other hand, were essentially uniform in nature, which would not be predicted by diffusion theory. Finally, it has been found that the hydrogen desorption is completely explained by the nature of the hydrogen distribution in the metal, and that the 'fast' hydrogen is not due to surface and subsurface hydride formation, as was originally proposed.
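The diffusion-theory desorption curve referred to here has a classic closed form for a plane sheet with uniform initial concentration and constant diffusivity (Crank's series solution); below is a short sketch with illustrative numbers, not the paper's alloy data.

```python
import numpy as np

def fraction_desorbed(t, D, L, terms=50):
    """Fractional desorption from a plane sheet of thickness L with uniform
    initial concentration and constant diffusivity D (Crank's series):
    M_t/M_inf = 1 - sum 8/((2n+1)^2 pi^2) * exp(-(2n+1)^2 pi^2 D t / L^2)."""
    n = np.arange(terms)[:, None]
    k = (2 * n + 1) * np.pi
    series = (8 / k**2) * np.exp(-(k / L) ** 2 * D * np.asarray(t))
    return 1.0 - series.sum(axis=0)

# Illustrative values only: D in cm^2/s, L in cm, t over one day in seconds.
t = np.linspace(0, 3600 * 24, 5)
print(fraction_desorbed(t, D=1e-7, L=0.1))
```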
Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A
2015-09-01
This article presents ongoing research that focuses on efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network such as a pipeline, water system, or power distribution system from an attack by an active adversary, recognizing the fundamental difference between preparing for natural disasters such as hurricanes, earthquakes, or even accidental systems failures and the problem of allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg 'leader-follower' game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionary stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach.
Time frequency analysis for automated sleep stage identification in fullterm and preterm neonates.
Fraiwan, Luay; Lweesy, Khaldon; Khasawneh, Natheer; Fraiwan, Mohammad; Wenz, Heinrich; Dickhaus, Hartmut
2011-08-01
This work presents a new methodology for automated sleep stage identification in neonates based on the time-frequency distribution of a single electroencephalogram (EEG) recording and artificial neural networks (ANN). Wigner-Ville distribution (WVD), Hilbert-Huang spectrum (HHS) and continuous wavelet transform (CWT) time-frequency distributions were used to represent the EEG signal, from which features were extracted using time-frequency entropy. The classification of features was done using a feed-forward back-propagation ANN. The system was trained and tested using data taken from neonates of post-conceptual age of 40 weeks for both preterm (14 recordings) and fullterm (15 recordings). The identification of sleep stages was successfully implemented, and the classification based on the WVD outperformed the approaches based on CWT and HHS. The accuracy and kappa coefficient were found to be 0.84 and 0.65, respectively, for the fullterm neonates' recordings and 0.74 and 0.50, respectively, for the preterm neonates' recordings.
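The paper extracts time-frequency entropy from WVD, CWT and HHS representations; as a simpler stand-in, the sketch below computes the same entropy feature from an ordinary STFT spectrogram of a synthetic EEG-like signal (all parameters are illustrative).

```python
import numpy as np
from scipy.signal import spectrogram

def tf_entropy(x, fs, nperseg=256):
    """Shannon entropy of the normalized time-frequency distribution, per frame."""
    f, t, S = spectrogram(x, fs=fs, nperseg=nperseg)
    p = S / (S.sum(axis=0, keepdims=True) + 1e-12)        # normalize each time slice
    return t, -(p * np.log2(p + 1e-12)).sum(axis=0)

# Illustrative "EEG-like" signal: noise plus a slow rhythm.
fs = 256
rng = np.random.default_rng(5)
x = rng.normal(size=fs * 30) + 2 * np.sin(2 * np.pi * 1.5 * np.arange(fs * 30) / fs)
t, H = tf_entropy(x, fs)
print(f"mean entropy = {H.mean():.2f} bits")
```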
Moderation analysis with missing data in the predictors.
Zhang, Qian; Wang, Lijuan
2017-12-01
The most widely used statistical model for conducting moderation analysis is the moderated multiple regression (MMR) model. In MMR modeling, missing data can pose a challenge, mainly because the interaction term is a product of two or more variables and thus is a nonlinear function of the involved variables. In this study, we consider a simple MMR model, where the effect of the focal predictor X on the outcome Y is moderated by a moderator U. The primary interest is to find ways of estimating and testing the moderation effect when there are missing data in X. We mainly focus on cases when X is missing completely at random (MCAR) and missing at random (MAR). Three methods are compared: (a) normal-distribution-based maximum likelihood estimation (NML); (b) normal-distribution-based multiple imputation (NMI); and (c) Bayesian estimation (BE). Via simulations, we found that NML and NMI could lead to biased estimates of moderation effects under the MAR missingness mechanism. The BE method outperformed NMI and NML for MMR modeling with missing data in the focal predictor, missingness depending on the moderator and/or auxiliary variables, and correctly specified distributions for the focal predictor. In addition, more robust BE methods are needed to address mis-specification of the focal predictor's distribution. An empirical example was used to illustrate the applications of the methods with a simple sensitivity analysis.
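A minimal sketch of the MMR model itself, with MCAR missingness imposed on the focal predictor. The formula interface drops incomplete rows (listwise deletion), which is unbiased under MCAR but not, in general, under MAR, the gap the paper's Bayesian method targets. All data are simulated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate the simple MMR model Y = b0 + b1*X + b2*U + b3*X*U + e.
rng = np.random.default_rng(6)
n = 500
U = rng.normal(size=n)
X = 0.4 * U + rng.normal(size=n)
Y = 1.0 + 0.5 * X + 0.3 * U + 0.25 * X * U + rng.normal(size=n)

# Impose MCAR missingness on the focal predictor X.
X_obs = X.copy()
X_obs[rng.random(n) < 0.3] = np.nan

df = pd.DataFrame({"Y": Y, "X": X_obs, "U": U})
# "X * U" expands to X + U + X:U; rows with NaN are dropped by default.
fit = smf.ols("Y ~ X * U", data=df).fit()
print(fit.params)
```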
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
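For intuition, here is a crude Monte Carlo counterpart to the reliability and sensitivity calculations (the paper uses perturbation and response-surface methods instead); the limit state and distribution parameters are invented for illustration.

```python
import numpy as np

# Failure occurs when the limit state g = strength - stress < 0.
rng = np.random.default_rng(7)
N = 1_000_000
strength = rng.lognormal(mean=np.log(300), sigma=0.08, size=N)   # MPa
stress = rng.normal(loc=220, scale=20, size=N)                   # MPa

pf = np.mean(strength - stress < 0)                 # probability of failure
# Crude reliability sensitivity to the stress mean: finite difference with
# common random numbers (same draws, mean shifted by +1 MPa).
pf_shift = np.mean(strength - (stress + 1.0) < 0)
print(f"Pf = {pf:.2e}, dPf/dmu_stress ~ {pf_shift - pf:.2e} per MPa")
```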
Self spectrum window method in wigner-ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. In the SSW algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results, and a satisfactory time-frequency distribution was obtained.
Optical disk processing of solar images.
NASA Astrophysics Data System (ADS)
Title, A.; Tarbell, T.
The current generation of space and ground-based experiments in solar physics produces many megabyte-sized image data arrays. Optical disk technology is the leading candidate for convenient analysis, distribution, and archiving of these data. The authors have been developing data analysis procedures which use both analog and digital optical disks for the study of solar phenomena.
The objective of our study was to characterize the trophic connections of the dominant fishes of the deep-pelagic region of the northern Mid-Atlantic Ridge (MAR) with respect to vertical distribution using carbon (C) and nitrogen (N) stable isotope analysis. Our goals were to id...
Previous studies have shown that culture-based methods tend to underestimate the densities and diversity of bacterial populations inhabiting water distribution systems (WDS). In this study, the phylogenetic diversity of drinking water bacteria was assessed using sequence analysis...
Legal, regulatory & institutional issues facing distributed resources development
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report describes legal, regulatory, and institutional considerations likely to shape the development and deployment of distributed resources. It is based on research co-sponsored by the National Renewable Energy Laboratory (NREL) and four investor-owned utilities (Central & South West Services, Cinergy Corp., Florida Power Corporation, and San Diego Gas & Electric Company). The research was performed between August 1995 and March 1996 by a team of four consulting firms experienced in energy and utility law, regulation, and economics. It is the survey phase of a project known as the Distributed Resources Institutional Analysis Project.
NASA Technical Reports Server (NTRS)
Kuriyan, J. G.; Phillips, D. H.; Willson, R. C.
1974-01-01
This paper describes the theoretical analysis that is required to infer, from polarimeter measurements of skylight, the size distribution, refractive index and abundance of particulates in the atmosphere. To illustrate the viability of the method, some data obtained at UCLA is analyzed and the atmospheric parameters are derived. The explicit demonstration of the redundancy in the description of aerosol distributions suggests that radiation field measurements will not uniquely determine the modal radius of the size distribution. In spite of this nonuniqueness information useful to heat budget calculations can be derived.
A Bayesian approach to parameter and reliability estimation in the Poisson distribution.
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1972-01-01
For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
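The conjugate structure makes this easy to reproduce: with a Gamma(a, b) prior, the posterior for the Poisson intensity is Gamma(a + Σx, b + n). Below is a quick Monte Carlo comparison of mean-squared errors in the spirit of the abstract, with made-up prior settings.

```python
import numpy as np

rng = np.random.default_rng(8)
lam_true, n = 3.0, 10
a, b = 2.0, 1.0                       # illustrative prior shape and rate

reps = 20_000
mse_bayes = mse_mle = 0.0
for _ in range(reps):
    x = rng.poisson(lam_true, n)
    lam_bayes = (a + x.sum()) / (b + n)   # posterior mean estimator
    lam_mle = x.mean()                    # MLE / minimum-variance unbiased estimator
    mse_bayes += (lam_bayes - lam_true) ** 2
    mse_mle += (lam_mle - lam_true) ** 2
print(f"MSE Bayes: {mse_bayes / reps:.4f}  MSE MLE: {mse_mle / reps:.4f}")
```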
Research on the Rural Express Alliance based on ANP improved profit Allocation
NASA Astrophysics Data System (ADS)
Zhuang, Yufeng; Zhang, Bin
2018-01-01
Online shopping platforms face distribution difficulties in rural areas, so the rural online shopping and logistics markets develop slowly. At present, neither China Post nor private courier companies can solve this alone, so distribution alliances need to be built. A reasonable profit allocation mechanism is the key to the stable development of such a distribution alliance. We therefore propose the Shapley value method combined with an ANP-improved model to allocate profits. Finally, the rationality of the method is demonstrated by numerical analysis before and after applying the corrected Shapley value.
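The classical Shapley value that the paper starts from averages each member's marginal contribution over all join orders; here is a minimal sketch for a hypothetical three-member alliance (the characteristic-function profits are invented, and the paper's ANP correction is not reproduced).

```python
from itertools import permutations

# Characteristic function of an illustrative delivery alliance: standalone
# profits and super-additive coalition profits (keys are sorted tuples).
v = {(): 0, ("coop",): 2, ("courier",): 5, ("post",): 4,
     ("coop", "courier"): 9, ("coop", "post"): 8, ("courier", "post"): 12,
     ("coop", "courier", "post"): 17}

def val(coalition):
    return v[tuple(sorted(coalition))]

players = ("coop", "courier", "post")
shapley = dict.fromkeys(players, 0.0)
perms = list(permutations(players))
for order in perms:
    seen = []
    for p in order:
        # Average marginal contribution over all join orders.
        shapley[p] += (val(seen + [p]) - val(seen)) / len(perms)
        seen.append(p)
print(shapley)   # allocations sum to the grand-coalition profit of 17
```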
Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.
Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip
2018-02-01
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures, in support of risk management decisions based both on past data and anticipated incidents.
Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy
NASA Astrophysics Data System (ADS)
Sharma, Sanjib
2017-08-01
Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
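The workhorse behind much of this is the random-walk Metropolis algorithm; here is a self-contained 1-D sketch on a toy posterior (not code from the bmcmc package referenced above).

```python
import numpy as np

def metropolis(logpost, x0, steps=50_000, scale=0.5, seed=0):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logpost(x0)
    chain = np.empty(steps)
    for i in range(steps):
        xp = x + scale * rng.normal()
        lpp = logpost(xp)
        if np.log(rng.random()) < lpp - lp:   # accept with prob min(1, ratio)
            x, lp = xp, lpp
        chain[i] = x
    return chain

# Toy target: posterior of a mean with a N(0, 10^2) prior and 5 observations.
data = np.array([1.2, 0.8, 1.9, 0.4, 1.1])
logpost = lambda m: -0.5 * (m / 10) ** 2 - 0.5 * ((data - m) ** 2).sum()
chain = metropolis(logpost, x0=0.0)
print(chain[1000:].mean(), chain[1000:].std())   # discard burn-in
```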
SU-G-TeP3-11: Radiobiological-Cum-Dosimetric Quality Assurance of Complex Radiotherapy Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, N; Narayanasamy, G; Zhang, X
2016-06-15
Purpose: Dosimetric gamma-analysis used for QA of complex radiotherapy plans tests the dosimetric equivalence of a delivered plan with the treatment planning system (TPS) optimized plan. It does not examine whether a dosimetric difference results in any radiobiological difference. This study introduces a method to test the radiobiological and dosimetric equivalence between a delivered and the TPS optimized plan. Methods: Six head and neck and seven lung cancer VMAT or IMRT plans optimized for patient treatment were calculated and delivered to an ArcCheck phantom. ArcCheck measured dose distributions were compared with the TPS calculated dose distributions using a 2-D gamma-analysis. Dose volume histograms (DVHs) for various patient structures were obtained by using measured data in 3DVH software and compared against the TPS calculated DVHs using 3-D gamma analysis. DVH data were used in the Poisson model to calculate tumor control probability (TCP) for the treatment targets and in the sigmoid dose response model to calculate normal tissue complication probability (NTCP) for the normal structures. Results: Two-D and three-D gamma passing rates among six H&N patient plans differed by 0 to 2.7% and among seven lung plans by 0.1 to 4.5%. Average ± SD TCPs based on measurement and TPS were 0.665±0.018 and 0.674±0.044 for H&N, and 0.791±0.027 and 0.733±0.031 for lung plans, respectively. Differences in NTCPs were usually negligible. The differences in dosimetric results, TCPs and NTCPs were insignificant. Conclusion: The 2-D and 3-D gamma-analysis based agreement between measured and planned dose distributions may indicate their dosimetric equivalence. Small and insignificant differences in TCPs and NTCPs based on measured and planned dose distributions indicate the radiobiological equivalence between the measured and optimized plans. However, patient plans showing larger differences between 2-D and 3-D gamma-analysis can help us make a more definite conclusion through our ongoing research with a larger number of patients.
Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity
NASA Astrophysics Data System (ADS)
Narulita, Ida; Ningrum, Widya
2018-02-01
Indonesia is very vulnerable to flood disasters because it has high rainfall throughout the year. Flood is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) preceding flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall dataset at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.
Analytical method for predicting the pressure distribution about a nacelle at transonic speeds
NASA Technical Reports Server (NTRS)
Keith, J. S.; Ferguson, D. R.; Merkle, C. L.; Heck, P. H.; Lahti, D. J.
1973-01-01
The formulation and development of a computer analysis for the calculation of streamlines and pressure distributions around two-dimensional (planar and axisymmetric) isolated nacelles at transonic speeds are described. The computerized flow field analysis is designed to predict the transonic flow around long and short high-bypass-ratio fan duct nacelles with inlet flows and with exhaust flows having appropriate aerothermodynamic properties. The flow field boundaries are located as far upstream and downstream as necessary to obtain minimum disturbances at the boundary. The far-field lateral flow field boundary is analytically defined to exactly represent free-flight conditions or solid wind tunnel wall effects. The inviscid solution technique is based on a Streamtube Curvature Analysis. The computer program utilizes an automatic grid refinement procedure and solves the flow field equations with a matrix relaxation technique. The boundary layer displacement effects and the onset of turbulent separation are included, based on the compressible turbulent boundary layer solution method of Stratford and Beavers and on the turbulent separation prediction method of Stratford.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji, Haojie; Dhomkar, Siddharth; Roy, Bidisha
2014-10-28
For submonolayer quantum dot (QD) based photonic devices, the size and density of QDs are critical parameters, the probing of which requires indirect methods. We report the determination of the lateral size distribution of type-II ZnTe/ZnSe stacked submonolayer QDs, based on spectral analysis of the optical signature of Aharonov-Bohm (AB) excitons, complemented by photoluminescence studies, secondary-ion mass spectroscopy, and numerical calculations. Numerical calculations are employed to determine the AB transition magnetic field as a function of the type-II QD radius. The study of four samples grown with different tellurium fluxes shows that the lateral size of QDs increases by just 50%, even though the tellurium concentration increases 25-fold. Detailed spectral analysis of the emission of the AB exciton shows that the QD radii take on only certain values due to vertical correlation and the stacked nature of the QDs.
Comprehensive evaluation of impacts of distributed generation integration in distribution network
NASA Astrophysics Data System (ADS)
Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu
2018-04-01
Distributed generation (DG), as a supplement to centralized renewable energy utilization, is becoming the focus of renewable energy development. With the increasing proportion of DG in distribution networks, the network power structure, power flow distribution, operation plans and protection are all affected to some extent. According to the main impacts of DG, a comprehensive evaluation model of distribution networks with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities are calculated based on the IEEE RBTS-Bus 6 system, and the overall evaluation result is obtained by the analytic hierarchy process (AHP). The proposed model and method are verified as effective through the case study.
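The AHP step turns pairwise importance judgments into index weights via the principal eigenvector; below is a minimal sketch with an invented 3x3 comparison matrix (standing in for the paper's 7 aspects) and Saaty's consistency check.

```python
import numpy as np

# Illustrative pairwise-comparison matrix on Saaty's 1-9 scale.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # Perron (principal) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # priority weights

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI (RI = 0.58 for n = 3).
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
print(f"weights = {np.round(w, 3)}, CR = {CI / 0.58:.3f}")
```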
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2007-04-01
Various q-type distributions have appeared in the physics literature in recent years; see, e.g., L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, distributions long known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above.
Yu, Hao; Dick, Andrew W
2012-10-01
Given the rapid growth of health care costs, some experts have been concerned about erosion of employment-based private insurance (EBPI). This empirical analysis aims to quantify that erosion. Using the National Health Account, we generated a cost index to represent state-level annual cost growth. We merged it with the 1996-2003 Medical Expenditure Panel Survey. The unit of analysis is the family. We conducted both bivariate and multivariate logistic analyses. The bivariate analysis found a significant inverse association between the cost index and the proportion of families receiving an offer of EBPI. The multivariate analysis showed that the cost index was significantly negatively associated with the likelihood of receiving an EBPI offer for the entire sample and for families in the first, second, and third quartiles of the income distribution. The cost index was also significantly negatively associated with the proportion of families with EBPI for the entire year for each family member (EBPI-EYEM). The multivariate analysis confirmed the significance of this relationship for the entire sample and for families in the second and third quartiles of the income distribution. Among the families with EBPI-EYEM, there was a positive relationship between the cost index and this group's likelihood of having out-of-pocket expenditures exceeding 10 percent of family income. The multivariate analysis confirmed the significance of this relationship for the entire group and for families in the second and third quartiles of the income distribution. Rising health costs reduce EBPI availability and enrollment, and weaken the financial protection it provides, especially for middle-class families. © Health Research and Educational Trust.
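The MEPS microdata are not reproduced here, so the following is only a schematic of the kind of multivariate logistic specification described, fitted to synthetic data with statsmodels; the variable names, coefficients, and sample are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
cost_index = rng.normal(0.0, 1.0, n)   # hypothetical state-level cost growth
income_q = rng.integers(1, 5, n)       # income quartile, 1..4

# Synthetic outcome: probability of an EBPI offer falls with the cost index
logit_p = 0.8 - 0.5 * cost_index + 0.2 * income_q
offer = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

X = sm.add_constant(np.column_stack([cost_index, income_q]))
fit = sm.Logit(offer.astype(float), X).fit(disp=0)
print(fit.params)  # recovers a negative cost-index coefficient, as in the study
```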
Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W
2011-01-01
Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2–M1 of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2–M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions of the M1 at selected occlusal stages were analyzed in Strand7, using occlusal information taken from the OFA results for the individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress across the whole tooth. Grooves and fissures on the occlusal surface are critical locations, as tensile stresses concentrate at these features. Properly accounting for the power-stroke kinematics of occluding teeth yields quite different outcomes (lower tensile stresses in the crown) than the usual loading scenarios based on forces parallel to the long axis of the tooth. This leads to the conclusion that functional studies considering the kinematics of teeth are important to understand biomechanics and to interpret the morphological adaptation of teeth. PMID:21615398
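The 3D crown models and Strand7 analyses cannot be reproduced here; purely as an illustration of the assemble-solve-recover pipeline that any such FEA rests on, the sketch below solves a toy 1D two-element bar under an axial load and recovers element stresses. The material constants and load are placeholders, not enamel or dentine properties.

```python
import numpy as np

# Minimal 1D linear FEA: a bar fixed at node 0, axial tip load at node 2.
E = 2.0e10    # hypothetical Young's modulus, Pa
A = 1.0e-6    # cross-sectional area, m^2 (1 mm^2)
L = 0.005     # element length, m
n_el, n_nd = 2, 3

k = E * A / L * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
K = np.zeros((n_nd, n_nd))
for e in range(n_el):                                  # global assembly
    K[e:e+2, e:e+2] += k

F = np.zeros(n_nd)
F[2] = 100.0                                           # 100 N tip load

# Impose the fixed boundary condition at node 0 and solve K u = F
u = np.zeros(n_nd)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])

stress = E * np.diff(u) / L                            # stress = E * strain
print("nodal displacements (m):", u)
print("element stresses (Pa):", stress)                # both equal F/A = 1e8 Pa
```

In the full 3D problem the same three steps (assemble stiffness, solve for displacements, recover stresses) are repeated for each occlusal stage, with the load direction and contact area at each time-step supplied by the OFA results.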
Yang, Zengling; Mei, Jiaqi; Liu, Zhiqiang; Huang, Guangqun; Huang, Guan; Han, Lujia
2018-06-19
Understanding the biochemical heterogeneity of plant tissue linked to crop straw anatomy is attractive to plant researchers and to researchers in the field of biomass refining. This study provides an in situ analysis and semiquantitative visualization of the distribution of major components in internodal transverse sections of wheat straw, based on Fourier transform infrared (FTIR) microspectroscopic imaging with a fast non-negativity-constrained least squares (fast NNLS) fitting. The paper investigates changes in the biochemical components of the tissue during the elongation, booting, heading, flowering, grain-filling, milk-ripening, dough, and full-ripening stages. Visualization analysis was carried out with reference spectra for five components (microcrystalline cellulose, xylan, lignin, pectin, and starch) of wheat straw. Our results showed that (a) the cellulose and lignin distribution is consistent with that from tissue-dyeing with safranin O-fast green and (b) the distribution of cellulose, lignin, and starch is consistent with chemical images at the characteristic wavenumbers 1432, 1507, and 987 cm-1, respectively, showing no interference from the other components analyzed. With this validation from characteristic-wavenumber biochemical images and tissue-dyeing techniques, a further semiquantitative analysis of local tissues based on fast NNLS was carried out, showing that (a) the cellulose content varies greatly among tissues, being highest in parenchyma tissue and lowest in the epidermis, and (b) during plant development, the fluctuation of each component in the tissues follows nearly the same trend, especially within vascular bundles and parenchyma tissue. Thus, FTIR microspectroscopic imaging combined with suitable chemometric methods can be successfully applied to study chemical distributions within internodal transverse sections of wheat straw, providing semiquantitative chemical information.
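SciPy's plain NNLS solver (standing in here for the fast NNLS variant used in the study) is enough to illustrate the fitting step: each pixel spectrum is decomposed as a non-negative combination of the five reference spectra. The reference spectra below are random stand-ins, not measured FTIR data.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_wavenumbers, n_refs = 200, 5

# Hypothetical reference spectra (columns), standing in for measured
# cellulose, xylan, lignin, pectin, and starch spectra
refs = np.abs(rng.normal(size=(n_wavenumbers, n_refs)))

true_conc = np.array([0.5, 0.1, 0.25, 0.05, 0.1])   # pixel composition
pixel = refs @ true_conc + 0.01 * rng.normal(size=n_wavenumbers)

conc, residual = nnls(refs, pixel)                   # non-negative fit
print("recovered:", np.round(conc, 3), "residual:", round(residual, 4))
```

Repeating this fit pixel by pixel across the imaged section yields one semiquantitative concentration map per component, which is what the study visualizes.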
Analysis of critical operating conditions for LV distribution networks with microgrids
NASA Astrophysics Data System (ADS)
Zehir, M. A.; Batman, A.; Sonmez, M. A.; Font, A.; Tsiamitros, D.; Stimoniaris, D.; Kollatou, T.; Bagriyanik, M.; Ozdemir, A.; Dialynas, E.
2016-11-01
Increasing penetration of Distributed Generation (DG) in distribution networks raises the risk of voltage limit violations while contributing to line losses. In low voltage (LV) distribution networks (secondary distribution networks) in particular, the impact of active power flows on bus voltages and network losses is dominant. As network operators must meet regulatory limits, they have to take into account the most critical operating conditions in their systems. This study presents the impact of the worst-case operating conditions of LV distribution networks comprising microgrids. Simulation studies are performed on a virtual test-bed based on field data. The simulations are repeated for several cases consisting of different microgrid connection points under different network loading and microgrid supply/demand conditions.
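A two-bus back-of-the-envelope calculation (not the paper's test-bed) illustrates why maximum-generation/minimum-load cases are critical: with the usual |ΔV| ≈ (RP + XQ)/V approximation, net export from a microgrid pushes the remote bus voltage above nominal. The impedances and loads below are hypothetical.

```python
# Two-bus illustration of DG-driven voltage rise on an LV feeder,
# using the approximation |dV| ~ (R*P + X*Q) / V.
R, X = 0.4, 0.25     # hypothetical feeder resistance/reactance, ohms
V_NOM = 400.0        # nominal voltage at the feeding substation, V

def bus_voltage(p_load_kw: float, p_dg_kw: float, q_kvar: float = 0.0) -> float:
    """Approximate remote-bus voltage; net export (DG > load) raises it."""
    p_net = (p_load_kw - p_dg_kw) * 1e3
    q_net = q_kvar * 1e3
    return V_NOM - (R * p_net + X * q_net) / V_NOM

for dg in (0.0, 20.0, 60.0):   # kW of microgrid generation vs. a 30 kW load
    print(f"DG = {dg:5.1f} kW -> V ~ {bus_voltage(30.0, dg):.1f} V")
```

With no DG the bus sags below nominal; once generation exceeds the local load, the voltage rises above it, which is exactly the limit-violation risk the worst-case screening targets.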
Carbon Fibers Conductivity Studies
NASA Technical Reports Server (NTRS)
Yang, C. Y.; Butkus, A. M.
1980-01-01
In an attempt to understand the process of electrical conduction in polyacrylonitrile (PAN)-based carbon fibers, calculations were carried out on cluster models of the fiber, consisting of carbon, nitrogen, and hydrogen atoms, using the modified intermediate neglect of differential overlap (MINDO) molecular orbital (MO) method. The models were developed based on the assumption that PAN carbon fibers obtained with heat treatment temperatures (HTT) below 1000 °C retain nitrogen in a graphite-like lattice. For clusters modeling an edge nitrogen site, analysis of the occupied MOs indicated an electron distribution similar to that of graphite. A similar analysis for the somewhat less stable interior nitrogen site revealed a partially localized π electron distribution around the nitrogen atom. The differences in bonding trends and structural stability between edge and interior nitrogen clusters led to a proposed two-step process for nitrogen evolution with increasing HTT.