Sample records for distribution analysis techniques

  1. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.

  2. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  3. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because it is difficult to evaluate them through analysis and time consuming to evaluate them through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  4. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because it is difficult to evaluate them through analysis and time consuming to evaluate them through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  5. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
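
    The empirical-orthogonal-function step described above can be sketched in a few lines: the EOFs of a space-time field are the singular vectors of its anomaly matrix. The sketch below (Python/NumPy) uses synthetic data in place of the ozone record; the spherical-harmonic modeling and data-fill steps are not shown.

      # Minimal EOF sketch; the field is a synthetic stand-in for the
      # gridded ozone data.
      import numpy as np

      rng = np.random.default_rng(0)
      field = rng.standard_normal((120, 72))      # 120 times x 72 grid points

      anom = field - field.mean(axis=0)           # remove the time mean
      U, s, Vt = np.linalg.svd(anom, full_matrices=False)
      var_frac = s**2 / np.sum(s**2)              # variance per EOF
      print(var_frac[:3])                         # leading modes' share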

  6. Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.

    PubMed

    Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko

    2017-11-01

    A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within these granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, whereas FS showed narrower and mono-modal distributions. However, both techniques gave good estimates of mean granule size. Overall, SA was a time-consuming but accurate technique that provided reliable information on the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size. In general, FS was two to three orders of magnitude faster than SA.

  7. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, along with the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.

  8. Comparison of time-frequency distribution techniques for analysis of spinal somatosensory evoked potential.

    PubMed

    Hu, Y; Luk, K D; Lu, W W; Holmes, A; Leong, J C

    2001-05-01

    Spinal somatosensory evoked potential (SSEP) has been employed to monitor the integrity of the spinal cord during surgery. To detect both temporal and spectral changes in SSEP waveforms, an investigation of the application of time-frequency analysis (TFA) techniques was conducted. SSEP signals from 30 scoliosis patients were analysed using different techniques: the short-time Fourier transform (STFT), Wigner-Ville distribution (WVD), Choi-Williams distribution (CWD), cone-shaped distribution (CSD), and adaptive spectrogram (ADS). The time-frequency distributions (TFD) computed using these methods were assessed and compared with each other. WVD, ADS, CSD, and CWD showed better resolution than STFT. Comparing normalised peak widths, CSD showed the sharpest peak width (0.13 ± 0.10) in the frequency dimension and a mean peak width of 0.70 ± 0.12 in the time dimension. Both WVD and CWD produced cross-term interference, distorting the TFA distribution, but this was not seen with CSD and ADS. CSD appeared to give a lower mean peak power bias (10.3% ± 6.2%) than ADS (41.8% ± 19.6%). Application of the CSD algorithm showed both good resolution and accurate spectrograms, and CSD is therefore recommended as the most appropriate TFA technique for the analysis of SSEP signals.
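
    As a rough illustration of two of the techniques compared above, the sketch below computes an STFT spectrogram and a discrete Wigner-Ville distribution for a synthetic chirp (Python/NumPy/SciPy). The signal and all parameters are invented stand-ins for SSEP data, and the CWD/CSD/ADS variants are not implemented.

      # Sketch: STFT vs. a discrete Wigner-Ville distribution.
      import numpy as np
      from scipy.signal import stft, hilbert

      fs = 1000.0                                     # sampling rate, Hz
      t = np.arange(0, 1.0, 1.0 / fs)
      x = np.sin(2 * np.pi * (50.0 + 100.0 * t) * t)  # linear chirp

      # STFT: stable but resolution-limited by the window length.
      f_stft, t_stft, Zxx = stft(x, fs=fs, nperseg=128)

      # Discrete WVD of the analytic signal: sharper, but bilinear, so
      # multicomponent signals produce the cross-terms noted above.
      z = hilbert(x)
      N = len(z)
      wvd = np.zeros((N, N))
      for n in range(N):
          taumax = min(n, N - 1 - n)
          tau = np.arange(-taumax, taumax + 1)
          kernel = z[n + tau] * np.conj(z[n - tau])
          acf = np.zeros(N, dtype=complex)
          acf[:taumax + 1] = kernel[taumax:]        # lags 0..taumax
          if taumax > 0:
              acf[N - taumax:] = kernel[:taumax]    # negative lags, wrapped
          wvd[:, n] = np.real(np.fft.fft(acf))

      print(Zxx.shape, wvd.shape)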

  9. Elimination of interference component in Wigner-Ville distribution for the signal with 1/f spectral characteristic.

    PubMed

    Chan, H L; Lin, J L; Huang, H H; Wu, C P

    1997-09-01

    A new technique for interference-term suppression in the Wigner-Ville distribution (WVD) is proposed for signals with a 1/f spectrum shape. The spectral characteristic of the signal is altered by f^α filtering before time-frequency analysis and compensated after analysis. With the utilization of the proposed technique in the smoothed pseudo-Wigner-Ville distribution, an excellent suppression of the interference component can be achieved.
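
    A minimal sketch of the pre-whitening idea, under the assumption that the filter simply scales the spectrum by f^α (the paper's exact filter design may differ):

      # Scale the spectrum by f**alpha to flatten a 1/f signal before
      # time-frequency analysis; after computing the TFD, dividing the
      # power at frequency f by f**(2*alpha) compensates.
      import numpy as np

      def f_alpha_filter(x, fs, alpha):
          X = np.fft.rfft(x)
          f = np.fft.rfftfreq(len(x), 1.0 / fs)
          gain = np.ones_like(f)
          gain[1:] = f[1:] ** alpha    # skip DC to avoid 0**negative
          return np.fft.irfft(X * gain, n=len(x))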

  10. A study of the feasibility of statistical analysis of airport performance simulation

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.
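
    The power computation described above can be mimicked with a short Monte Carlo loop; the gamma model for capacity and all parameter values below are assumptions for illustration, not the study's data.

      # Monte Carlo power check under a non-Gaussian capacity model.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      def power(delta, n=30, reps=2000, alpha=0.05):
          hits = 0
          for _ in range(reps):
              a = rng.gamma(4.0, 10.0, size=n)           # condition A
              b = rng.gamma(4.0, 10.0, size=n) + delta   # shifted condition B
              _, p = stats.f_oneway(a, b)                # one-way ANOVA
              hits += p < alpha
          return hits / reps

      print(power(delta=5.0))   # fraction of runs detecting the shift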

  11. Analysis Using Bi-Spectral Related Technique

    DTIC Science & Technology

    1993-11-17

    Filtering is employed as the data is processed (equation 1). Earlier results have shown that, in contrast to the Wigner-Ville distribution (WVD), no spectral... Ralph Hippenstiel, November 17, 1993. Prepared for: Naval Command Control... Approved for public release; distribution unlimited.

  12. Amino acid distribution in meteorites: diagenesis, extraction methods, and standard metrics in the search for extraterrestrial biosignatures.

    PubMed

    McDonald, Gene D; Storrie-Lombardi, Michael C

    2006-02-01

    The relative abundance of the protein amino acids has been previously investigated as a potential marker for biogenicity in meteoritic samples. However, these investigations were executed without a quantitative metric to evaluate distribution variations, and they did not account for the possibility of interdisciplinary systematic error arising from inter-laboratory differences in extraction and detection techniques. Principal component analysis (PCA), hierarchical cluster analysis (HCA), and stochastic probabilistic artificial neural networks (ANNs) were used to compare the distributions for nine protein amino acids previously reported for the Murchison carbonaceous chondrite, Mars meteorites (ALH84001, Nakhla, and EETA79001), prebiotic synthesis experiments, and terrestrial biota and sediments. These techniques allowed us (1) to identify a shift in terrestrial amino acid distributions secondary to diagenesis; (2) to detect differences in terrestrial distributions that may be systematic differences between extraction and analysis techniques in biological and geological laboratories; and (3) to determine that distributions in meteoritic samples appear more similar to prebiotic chemistry samples than they do to the terrestrial unaltered or diagenetic samples. Both diagenesis and putative interdisciplinary differences in analysis complicate interpretation of meteoritic amino acid distributions. We propose that the analysis of future samples from such diverse sources as meteoritic influx, sample return missions, and in situ exploration of Mars would be less ambiguous with adoption of standardized assay techniques, systematic inclusion of assay standards, and the use of a quantitative, probabilistic metric. We present here one such metric determined by sequential feature extraction and normalization (PCA), information-driven automated exploration of classification possibilities (HCA), and prediction of classification accuracy (ANNs).
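
    A compact sketch of the PCA-then-cluster pipeline named above (Python/scikit-learn/SciPy), run on synthetic nine-component relative-abundance vectors standing in for the amino acid data; the ANN classification stage is omitted.

      # PCA feature extraction followed by hierarchical clustering.
      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(1)
      X = rng.dirichlet(np.ones(9), size=40)          # 40 samples, 9 components

      scores = PCA(n_components=3).fit_transform(X)   # feature extraction
      Z = linkage(scores, method="ward")              # cluster the scores
      labels = fcluster(Z, t=3, criterion="maxclust") # e.g. 3 candidate classes
      print(labels)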

  13. The Use of Neutron Analysis Techniques for Detecting The Concentration And Distribution of Chloride Ions in Archaeological Iron

    PubMed Central

    Watkinson, D; Rimmer, M; Kasztovszky, Z; Kis, Z; Maróti, B; Szentmiklósi, L

    2014-01-01

    Chloride (Cl) ions diffuse into iron objects during burial and drive corrosion after excavation. Located under corrosion layers, Cl is inaccessible to many analytical techniques. Neutron analysis offers non-destructive avenues for determining Cl content and distribution in objects. A pilot study used prompt gamma activation analysis (PGAA) and prompt gamma activation imaging (PGAI) to analyse the bulk concentration and longitudinal distribution of Cl in archaeological iron objects. This correlated with the object corrosion rate measured by oxygen consumption, and compared well with Cl measurement using a specific ion meter. High-Cl areas were linked with visible damage to the corrosion layers and attack of the iron core. Neutron techniques have significant advantages in the analysis of archaeological metals, including penetration depth and low detection limits. PMID:26028670

  14. Use of density equalizing map projections (DEMP) in the analysis of childhood cancer in four California counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, D.W.; Selvin, S.; Close, E.R.

    In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g., census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.

  15. Analysis and synthesis of distributed-lumped-active networks by digital computer

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

  16. Integral-moment analysis of the BATSE gamma-ray burst intensity distribution

    NASA Technical Reports Server (NTRS)

    Horack, John M.; Emslie, A. Gordon

    1994-01-01

    We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V(sub max). If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides for the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.
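
    The flavor of a moment analysis with uncertainties can be sketched as follows; the toy power-law intensity sample and bootstrap error bars are illustrative only, not the paper's integral-moment formalism.

      # Toy moment analysis of a peak-flux sample with bootstrap errors.
      import numpy as np

      rng = np.random.default_rng(2)
      flux = rng.pareto(1.5, size=260) + 1.0   # toy intensity sample

      def moments(x):
          m, v = x.mean(), x.var()
          return m, v, ((x - m) ** 3).mean() / v ** 1.5   # mean, var, skew

      boot = np.array([moments(rng.choice(flux, size=flux.size))
                       for _ in range(1000)])
      print("mean/var/skew:", moments(flux))
      print("bootstrap std:", boot.std(axis=0))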

  17. BATSE analysis techniques for probing the GRB spatial and luminosity distributions

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Meegan, Charles A.

    1992-01-01

    The Burst And Transient Source Experiment (BATSE) has measured homogeneity and isotropy parameters from an increasingly large sample of observed gamma-ray bursts (GRBs), while also maintaining a summary of the way in which the sky has been sampled. Measurement of both is necessary for any statistical study of the BATSE data, as they take into account the most serious observational selection effects known in the study of GRBs: beam-smearing and inhomogeneous, anisotropic sky sampling. Knowledge of these effects is important to the analysis of GRB angular and intensity distributions. In addition to determining whether the bursts are local, it is hoped that analysis of such distributions will allow boundaries to be placed on the true GRB spatial distribution and luminosity function. The technique for studying GRB spatial and luminosity distributions is direct: results of BATSE analyses are compared to Monte Carlo models parameterized by a variety of spatial and luminosity characteristics.

  18. Hamiltonian Analysis of Subcritical Stochastic Epidemic Dynamics

    PubMed Central

    2017-01-01

    We extend a technique of approximation of the long-term behavior of a supercritical stochastic epidemic model, using the WKB approximation and a Hamiltonian phase space, to the subcritical case. The limiting behavior of the model and approximation are qualitatively different in the subcritical case, requiring a novel analysis of the limiting behavior of the Hamiltonian system away from its deterministic subsystem. This yields a novel, general technique of approximation of the quasistationary distribution of stochastic epidemic and birth-death models and may lead to techniques for analysis of these models beyond the quasistationary distribution. For a classic SIS model, the approximation found for the quasistationary distribution is very similar to published approximations but not identical. For a birth-death process without depletion of susceptibles, the approximation is exact. Dynamics on the phase plane similar to those predicted by the Hamiltonian analysis are demonstrated in cross-sectional data from trachoma treatment trials in Ethiopia, in which declining prevalences are consistent with subcritical epidemic dynamics. PMID:28932256

  19. Quantitative characterization of the spatial distribution of particles in materials: Application to materials processing

    NASA Technical Reports Server (NTRS)

    Parse, Joseph B.; Wert, J. A.

    1991-01-01

    Inhomogeneities in the spatial distribution of second-phase particles in engineering materials are known to affect certain mechanical properties. Progress in this area has been hampered by the lack of a convenient method for quantitative description of the spatial distribution of the second phase. This study intends to develop a broadly applicable method for the quantitative analysis and description of the spatial distribution of second-phase particles. The method was designed to operate on a desktop computer. The Dirichlet tessellation technique (a geometrical method for dividing an area containing an array of points into a set of polygons uniquely associated with the individual particles) was selected as the basis of an analysis technique implemented on a PC. This technique is being applied to the production of Al sheet by PM processing methods: vacuum hot pressing, forging, and rolling. The effect of varying hot-working parameters on the spatial distribution of aluminum oxide particles in consolidated sheet is being studied. Changes in distributions of properties such as through-thickness near-neighbor distance correlate with hot-working reduction.
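
    The tessellation step is straightforward with standard libraries; the sketch below builds Dirichlet (Voronoi) cells around synthetic particle centres and uses the spread of the finite cell areas as a rough clustering measure.

      # Dirichlet (Voronoi) tessellation of synthetic particle centres.
      import numpy as np
      from scipy.spatial import Voronoi

      rng = np.random.default_rng(3)
      vor = Voronoi(rng.random((200, 2)))

      def cell_area(region):
          p = vor.vertices[region]                 # polygon vertices
          x, y = p[:, 0], p[:, 1]                  # shoelace formula
          return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

      # Keep only bounded cells (unbounded regions contain index -1).
      areas = [cell_area(r) for r in vor.regions if r and -1 not in r]
      print(np.mean(areas), np.std(areas))         # dispersion ~ clustering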

  20. An analysis of fracture trace patterns in areas of flat-lying sedimentary rocks for the detection of buried geologic structure. [Kansas and Texas

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.

    1974-01-01

    Two study areas in a cratonic platform underlain by flat-lying sedimentary rocks were analyzed to determine if a quantitative relationship exists between fracture trace patterns and their frequency distributions and subsurface structural closures which might contain petroleum. Fracture trace lengths and frequency (number of fracture traces per unit area) were analyzed by trend surface analysis and length frequency distributions also were compared to a standard Gaussian distribution. Composite rose diagrams of fracture traces were analyzed using a multivariate analysis method which grouped or clustered the rose diagrams and their respective areas on the basis of the behavior of the rays of the rose diagram. Analysis indicates that the lengths of fracture traces are log-normally distributed according to the mapping technique used. Fracture trace frequency appeared higher on the flanks of active structures and lower around passive reef structures. Fracture trace log-mean lengths were shorter over several types of structures, perhaps due to increased fracturing and subsequent erosion. Analysis of rose diagrams using a multivariate technique indicated lithology as the primary control for the lower grouping levels. Groupings at higher levels indicated that areas overlying active structures may be isolated from their neighbors by this technique while passive structures showed no differences which could be isolated.

  1. Noise distribution and denoising of current density images

    PubMed Central

    Beheshti, Mohammadali; Foomany, Farbod H.; Magtibay, Karl; Jaffray, David A.; Krishnan, Sridhar; Nanthakumar, Kumaraswamy; Umapathy, Karthikeyan

    2015-01-01

    Current density imaging (CDI) is a magnetic resonance (MR) imaging technique that can be used to study current pathways inside tissue. The current distribution is measured indirectly as phase changes. The inherent noise in the MR imaging technique degrades the accuracy of phase measurements, leading to imprecise current variations. The outcome can be affected significantly, especially at a low signal-to-noise ratio (SNR). We show that the residual noise distribution of the phase is Gaussian-like and that the noise in CDI images can be approximated as Gaussian; this finding matches experimental results. We further investigated this finding by performing comparative analysis with denoising techniques, using two CDI datasets with two different currents (20 and 45 mA). We found that the block-matching and three-dimensional (BM3D) technique outperforms other techniques when applied to current density (J). The minimum gain in noise power by BM3D applied to J, compared with the next-best technique in the analysis, was found to be around 2 dB per pixel. We characterize the noise profile in CDI images and provide insights on the performance of different denoising techniques when applied at two different stages of current density reconstruction. PMID:26158100

  2. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. By using the developed techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct the inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
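
    The structural-model approach can be sketched as regularized least squares: given a sensitivity matrix mapping nodal loads to strains, the distributed load is recovered from the measured strains. The matrix below is random stand-in data, not a real finite element model.

      # Regularized least-squares load identification (sketch).
      import numpy as np

      rng = np.random.default_rng(4)
      S = rng.random((60, 20))                       # strain = S @ load
      true_load = np.sin(np.linspace(0, np.pi, 20))  # assumed distribution
      eps = S @ true_load + 1e-3 * rng.standard_normal(60)

      lam = 1e-2                                     # Tikhonov weight
      load_hat = np.linalg.solve(S.T @ S + lam * np.eye(20), S.T @ eps)
      print(np.max(np.abs(load_hat - true_load)))    # recovery error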

  3. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
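
    The Monte Carlo combination of factor distributions into a CCDF can be sketched in a few lines; the lognormal and gamma factor models below are invented placeholders, not the report's source terms.

      # Monte Carlo CCDF sketch.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000
      release = rng.lognormal(0.0, 1.0, size=n)     # source-term factor
      dispersion = rng.gamma(2.0, 0.5, size=n)      # weather factor
      consequence = release * dispersion            # combined effect

      c = np.sort(consequence)
      ccdf = 1.0 - np.arange(1, n + 1) / n          # P(consequence >= c)
      print(c[int(0.99 * n)])                       # 99th percentile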

  4. Active distribution network planning considering linearized system loss

    NASA Astrophysics Data System (ADS)

    Li, Xiao; Wang, Mingqiang; Xu, Hao

    2018-02-01

    In this paper, various distribution network planning techniques with DGs are reviewed, and a new distribution network planning method is proposed. It assumes that the locations of DGs and the topology of the network are fixed. The proposed model optimizes the capacities of DG and the optimal distribution line capacity simultaneously through a cost/benefit analysis, where the benefit is quantified by the reduction of the expected interruption cost. In addition, the network loss is explicitly analyzed. For simplicity, the network loss is approximated as a quadratic function of the voltage phase-angle difference and then piecewise linearized; a piecewise linearization technique with different segment lengths is proposed. To validate its effectiveness and superiority, the proposed distribution network planning model with the elaborated linearization technique is tested on the IEEE 33-bus distribution network system.
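
    A sketch of piecewise linearization with unequal segment lengths, applied to a quadratic loss term in the phase-angle difference; the coefficient and breakpoints are illustrative values, not from the paper.

      # Piecewise linearization of L(d) = k*d**2 with uneven segments,
      # finer near zero where the curvature matters most.
      import numpy as np

      k = 0.8
      bp = np.array([0.0, 0.05, 0.15, 0.35, 0.7])   # breakpoints, radians
      slopes = np.diff(k * bp**2) / np.diff(bp)     # secant slopes

      def loss_pwl(d):
          total, prev = 0.0, 0.0
          for b, s in zip(bp[1:], slopes):
              seg = min(abs(d), b) - prev
              if seg <= 0:
                  break
              total += s * seg
              prev = b
          return total

      print(loss_pwl(0.3), k * 0.3**2)              # PWL estimate vs exact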

  5. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness, and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, to be a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
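
    The centered log-ratio route that the authors compare against can be sketched as follows, with synthetic compositions standing in for the laser-diffraction PSDs:

      # clr transform + PCA on compositional particle-size data.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(6)
      psd = rng.dirichlet(np.ones(32) * 2.0, size=50)      # 50 samples, 32 bins

      g = np.exp(np.log(psd).mean(axis=1, keepdims=True))  # geometric mean
      clr = np.log(psd / g)                                # centered log-ratio
      print(PCA(n_components=2).fit_transform(clr).shape)  # retained scores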

  6. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
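
    A naive Ripley's K(t) estimator (no edge correction) fits in a few lines; stem-map coordinates would replace the synthetic points below.

      # Naive Ripley's K(t) in the unit square.
      import numpy as np

      rng = np.random.default_rng(7)
      pts = rng.random((100, 2))
      n, area = len(pts), 1.0
      d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

      def ripley_k(t):
          return area * ((d > 0) & (d <= t)).sum() / (n * (n - 1))

      for t in (0.05, 0.1, 0.2):
          print(t, ripley_k(t), np.pi * t**2)   # compare with the CSR value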

  7. Performance of Modified Test Statistics in Covariance and Correlation Structure Analysis under Conditions of Multivariate Nonnormality.

    ERIC Educational Resources Information Center

    Fouladi, Rachel T.

    2000-01-01

    Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…

  8. Nondestructive evaluation of turbine blades vibrating in resonant modes

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Ahmadshahi, Mansour A.

    1991-12-01

    The paper presents an analysis of the strain distribution in turbine blades. The holographic moire technique is used in conjunction with computer analysis of the fringes. The application of the computer fringe analysis technique reduces the number of holograms to be recorded to two. Stroboscopic illumination is used to record the patterns. Strains and stresses are computed.

  9. Spatially distributed modal signals of free shallow membrane shell structronic system

    NASA Astrophysics Data System (ADS)

    Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

    2008-11-01

    Based on the smart material and structronics technology, distributed sensor and control of shell structures have been rapidly developed for the last 20 years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring technique and its resulting global spatially distributed sensing signals of shallow paraboloidal membrane shells are not clearly understood. In this paper, modeling of free flexible paraboloidal shell with spatially distributed sensor, micro-sensing signal characteristics, and location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.

  10. Comparison of photon correlation spectroscopy with photosedimentation analysis for the determination of aqueous colloid size distributions

    USGS Publications Warehouse

    Rees, Terry F.

    1990-01-01

    Colloidal materials, dispersed phases with dimensions between 0.001 and 1 μm, are potential transport media for a variety of contaminants in surface and ground water. Characterization of these colloids, and identification of the parameters that control their movement, are necessary before transport simulations can be attempted. Two techniques that can be used to determine the particle-size distribution of colloidal materials suspended in natural waters are compared. Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, their important underlying assumptions, and their limitations is given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS.
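
    The size calculation underlying PCS is a one-line application of the Stokes-Einstein relation, d = kT / (3 π η D); the values below are illustrative, not from the paper.

      # Stokes-Einstein sizing from a measured diffusion coefficient.
      import math

      kB = 1.380649e-23    # Boltzmann constant, J/K
      T = 298.15           # temperature, K
      eta = 8.9e-4         # viscosity of water at 25 C, Pa*s
      D = 4.0e-12          # diffusion coefficient from PCS, m^2/s

      d = kB * T / (3 * math.pi * eta * D)
      print(d * 1e9, "nm")   # about 123 nm for these values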

  11. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    ...harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC).

  12. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  13. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  14. Evaluation of specimen preparation techniques for micro-PIXE localisation of elements in hyperaccumulating plants

    NASA Astrophysics Data System (ADS)

    Kachenko, Anthony G.; Siegele, Rainer; Bhatia, Naveen P.; Singh, Balwant; Ionescu, Mihail

    2008-04-01

    Hybanthus floribundus subsp. floribundus, a rare Australian Ni-hyperaccumulating shrub, and Pityrogramma calomelanos var. austroamericana, an Australian naturalized As-hyperaccumulating fern, are promising species for use in phytoremediation of contaminated sites. Micro-proton-induced X-ray emission (μ-PIXE) spectroscopy was used to map the elemental distribution of the accumulated metal(loid)s, Ca, and K in leaf or pinnule tissues of the two plant species. Samples were prepared by two contrasting specimen preparation techniques: freeze-substitution in tetrahydrofuran (THF) and freeze-drying. The specimens were analysed to compare the suitability of each technique in preserving (i) the spatial elemental distribution and (ii) the tissue structure of the specimens. Further, the μ-PIXE results were compared with the concentrations of elements in the bulk tissue obtained by ICP-AES analysis. In H. floribundus subsp. floribundus, μ-PIXE analysis revealed that Ni, Ca, and K concentrations in freeze-dried leaf tissues were on par with bulk tissue concentrations. Elemental distribution maps illustrated that Ni was preferentially localised in the adaxial epidermal tissues (1% DW), with the lowest concentration found in spongy mesophyll tissues (0.53% DW). Conversely, elemental distribution maps of THF freeze-substituted tissues indicated significantly lower Ni, Ca, and K concentrations than freeze-dried specimens and bulk tissue concentrations. Moreover, Ni concentrations were uniform across the whole specimen and no localisation was observed. In P. calomelanos var. austroamericana freeze-dried pinnule tissues, μ-PIXE revealed statistically similar As, Ca, and K concentrations compared to bulk tissue concentrations. Elemental distribution maps showed that As localisation was relatively uniform across the whole specimen. Once again, THF freeze-substituted tissues revealed a significant loss of As compared to freeze-dried specimens and the concentrations obtained by bulk tissue analysis. The results demonstrate that freeze-drying is a suitable sample preparation technique for studying the elemental distribution of ions in H. floribundus and P. calomelanos plant tissues using μ-PIXE spectroscopy. Furthermore, cellular structure was preserved in samples prepared using this technique.

  15. SU-F-T-380: Comparing the Effect of Respiration On Dose Distribution Between Conventional Tangent Pair and IMRT Techniques for Adjuvant Radiotherapy in Early Stage Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M; Ramaseshan, R

    2016-06-15

    Purpose: In this project, we compared the conventional tangent pair technique to the IMRT technique by analyzing the dose distribution. We also investigated the effect of respiration on planning target volume (PTV) dose coverage in both techniques. Methods: In order to implement the IMRT technique, a template-based planning protocol, dose constraints, and a treatment process were developed. Two open fields with optimized field weights were combined with two beamlet optimization fields in IMRT plans. We compared the dose distribution between the standard tangential pair and IMRT. The improvement in dose distribution was measured by parameters such as the conformity index, homogeneity index, and coverage index. Another end point was whether the IMRT technique would reduce the planning time for staff. The effect of the patient's respiration on dose distribution was also estimated. Four-dimensional computed tomography (4DCT) of different phases of the breathing cycle was used to evaluate the effect of respiration on the IMRT-planned dose distribution. Results: We accumulated 10 patients who underwent 4DCT and were planned by both techniques. Based on the preliminary analysis, the dose distribution in the IMRT technique was better than in the conventional tangent pair technique. Furthermore, the effect of respiration in the IMRT plan was not significant, as evident from the 95% isodose line coverage of the PTV drawn on all phases of 4DCT. Conclusion: Based on the 4DCT images, the breathing effect on dose distribution was smaller than we expected. We suspect that there are two reasons. First, the PTV movement due to respiration was not significant, possibly because we used a tilted breast board to set up patients. Second, the open fields with optimized field weights in the IMRT technique might reduce the breathing effect on dose distribution. A further investigation is necessary.

  16. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  17. Spatial Signal Characteristics of Shallow Paraboloidal Shell Structronic Systems

    NASA Astrophysics Data System (ADS)

    Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

    Based on the smart material and structronics technology, distributed sensor and control of shell structures have been rapidly developed for the last twenty years. This emerging technology has been utilized in aerospace, telecommunication, micro-electromechanical systems and other engineering applications. However, distributed monitoring technique and its resulting global spatially distributed sensing signals of thin flexible membrane shells are not clearly understood. In this paper, modeling of free thin paraboloidal shell with spatially distributed sensor, micro-sensing signal characteristics, and location of distributed piezoelectric sensor patches are investigated based on a new set of assumed mode shape functions. Parametric analysis indicates that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, when all bending strains vanish in membrane shells. This study provides a modeling and analysis technique for distributed sensors laminated on lightweight paraboloidal flexible structures and identifies critical components and regions that generate significant signals.

  18. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable alone is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less desirable and less flexible analytical techniques, such as linear interpolation.
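
    The Box-Cox remedy is readily available in standard libraries; the sketch below estimates the transform parameter on toy count-like dose-response data (a Poisson GLM would be the other route named above; all values are invented).

      # Box-Cox variance stabilization on positive response data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      conc = np.repeat([0.0, 1.0, 2.0, 4.0, 8.0], 10)          # doses
      resp = np.maximum(rng.poisson(50 * np.exp(-0.4 * conc)), 1)

      y_bc, lam = stats.boxcox(resp.astype(float))             # fit lambda
      print("Box-Cox lambda:", lam)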

  19. Feature Extraction for Bearing Prognostics and Health Management (PHM) - A Survey (Preprint)

    DTIC Science & Technology

    2008-05-01

    Techniques surveyed include envelope analysis, cepstrum analysis, higher-order spectra, the short-time Fourier transform (STFT), the Wigner-Ville distribution (WVD), and empirical mode decomposition. Among time-frequency techniques, the paper covers the STFT, the Wigner-Ville distribution, and the wavelet transform, and categorizes wavelet methods for bearing diagnosis; the STFT is noted as conceptually simple, motivating the more advanced distributions.

  20. An Estimation of the Gamma-Ray Burst Afterglow Apparent Optical Brightness Distribution Function

    NASA Astrophysics Data System (ADS)

    Akerlof, Carl W.; Swan, Heather F.

    2007-12-01

    By using recent publicly available observational data obtained in conjunction with the NASA Swift gamma-ray burst (GRB) mission and a novel data analysis technique, we have been able to make some rough estimates of the GRB afterglow apparent optical brightness distribution function. The results suggest that 71% of all burst afterglows have optical magnitudes with m_R < 22.1 at 1000 s after the burst onset, the dimmest detected object in the data sample. There is a strong indication that the apparent optical magnitude distribution function peaks at m_R ≈ 19.5. Such estimates may prove useful in guiding future plans to improve GRB counterpart observation programs. The employed numerical techniques might find application in a variety of other data analysis problems in which the intrinsic distributions must be inferred from a heterogeneous sample.

  1. A deep learning approach to estimate stress distribution: a fast and accurate surrogate of finite-element analysis.

    PubMed

    Liang, Liang; Liu, Minliang; Martin, Caitlin; Sun, Wei

    2018-01-01

    Structural finite-element analysis (FEA) has been widely used to study the biomechanics of human tissues and organs, as well as tissue-medical device interactions and treatment strategies. However, patient-specific FEA models usually require complex procedures to set up and long computing times to obtain final simulation results, preventing prompt feedback to clinicians in time-sensitive clinical applications. In this study, using machine learning techniques, we developed a deep learning (DL) model to directly estimate the stress distributions of the aorta. The DL model was designed and trained to take the input of FEA and directly output the aortic wall stress distributions, bypassing the FEA calculation process. The trained DL model is capable of predicting the stress distributions with average errors of 0.492% and 0.891% in the von Mises stress distribution and peak von Mises stress, respectively. This is, to our knowledge, the first study to demonstrate the feasibility and great potential of using the DL technique as a fast and accurate surrogate of FEA for stress analysis.
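
    The surrogate idea can be sketched with a small multi-output regressor mapping model inputs to a stress field; the network, shapes, and data below are invented stand-ins for the paper's deep architecture.

      # Small multi-output regressor as a toy FEA surrogate.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(9)
      X = rng.random((500, 12))              # geometry/material features
      Y = X @ rng.random((12, 100))          # toy "stress field", 100 nodes

      model = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=500)
      model.fit(X[:400], Y[:400])            # train on 400 cases
      print(np.abs(model.predict(X[400:]) - Y[400:]).mean())  # holdout error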

  2. Simulation studies of wide and medium field of view earth radiation data analysis

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1978-01-01

    A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.

  3. High-resolution synchrotron X-ray analysis of bioglass-enriched hydrogels.

    PubMed

    Gorodzha, Svetlana; Douglas, Timothy E L; Samal, Sangram K; Detsch, Rainer; Cholewa-Kowalska, Katarzyna; Braeckmans, Kevin; Boccaccini, Aldo R; Skirtach, Andre G; Weinhardt, Venera; Baumbach, Tilo; Surmeneva, Maria A; Surmenev, Roman A

    2016-05-01

    Enrichment of hydrogels with inorganic particles improves their suitability for bone regeneration by enhancing their mechanical properties, mineralizability, and bioactivity, as well as the adhesion, proliferation, and differentiation of bone-forming cells, while maintaining injectability. Low aggregation and homogeneous distribution maximize particle surface area, promoting mineralization, cell-particle interactions, and homogeneous tissue regeneration. Hence, determination of the size and distribution of particles/particle agglomerates in the hydrogel is desirable. Commonly used techniques have drawbacks: high-resolution techniques (e.g., SEM) require drying, and distribution in the dry state is not representative of the wet state, while techniques in the wet state (histology, µCT) are of lower resolution. Here, self-gelling, injectable composites of Gellan Gum (GG) hydrogel and two different types of sol-gel-derived bioactive glass (bioglass) particles were analyzed in the wet state using synchrotron X-ray radiation, enabling high-resolution determination of particle size and spatial distribution. The lower detection limit volume was 9 × 10^-5 mm^3. Bioglass particle suspensions were also studied using zeta potential measurements and Coulter analysis. Aggregation of bioglass particles in the GG hydrogels occurred, and aggregate distribution was inhomogeneous. Bioglass promoted attachment of rat mesenchymal stem cells (rMSC) and mineralization.

  4. Improvements in surface singularity analysis and design methods. [applicable to airfoils

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1979-01-01

    The coupling of the combined source vortex distribution of Green's potential flow function with contemporary numerical techniques is shown to provide accurate, efficient, and stable solutions to subsonic inviscid analysis and design problems for multi-element airfoils. The analysis problem is solved by direct calculation of the surface singularity distribution required to satisfy the flow tangency boundary condition. The design or inverse problem is solved by an iteration process. In this process, the geometry and the associated pressure distribution are iterated until the pressure distribution most nearly corresponding to the prescribed design distribution is obtained. Typically, five iteration cycles are required for convergence. A description of the analysis and design method is presented, along with supporting examples.

  5. Geometric parameter analysis to predetermine optimal radiosurgery technique for the treatment of arteriovenous malformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mestrovic, Ante; Clark, Brenda G.; Department of Medical Physics, British Columbia Cancer Agency, Vancouver, British Columbia

    2005-11-01

    Purpose: To develop a method of predicting the values of dose distribution parameters of different radiosurgery techniques for treatment of arteriovenous malformation (AVM) based on internal geometric parameters. Methods and Materials: For each of 18 previously treated AVM patients, four treatment plans were created: circular collimator arcs, dynamic conformal arcs, fixed conformal fields, and intensity-modulated radiosurgery. An algorithm was developed to characterize the target and critical structure shape complexity and the position of the critical structures with respect to the target. Multiple regression was employed to establish the correlation between the internal geometric parameters and the dose distribution for different treatment techniques. The results from the model were applied to predict the dosimetric outcomes of different radiosurgery techniques and select the optimal radiosurgery technique for a number of AVM patients. Results: Several internal geometric parameters showing statistically significant correlation (p < 0.05) with the treatment planning results for each technique were identified. The target volume and the average minimum distance between the target and the critical structures were the most effective predictors for normal tissue dose distribution. The structure overlap volume with the target and the mean distance between the target and the critical structure were the most effective predictors for critical structure dose distribution. The predicted values of dose distribution parameters of different radiosurgery techniques were in close agreement with the original data. Conclusions: A statistical model has been described that successfully predicts the values of dose distribution parameters of different radiosurgery techniques and may be used to predetermine the optimal technique on a patient-to-patient basis.
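
    The regression step reduces to fitting dose metrics against geometric predictors; the sketch below uses synthetic values for 18 patients, with placeholder columns standing in for quantities such as target volume, overlap volume, and mean distance.

      # Linear fit of a dose metric against geometric predictors.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(10)
      G = rng.random((18, 3))          # e.g. volume, overlap, distance
      ci = 0.6 + 0.3 * G[:, 0] - 0.2 * G[:, 2] + 0.02 * rng.standard_normal(18)

      fit = LinearRegression().fit(G, ci)
      print(fit.coef_, fit.score(G, ci))   # effect sizes and R^2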

  6. Distributed condition monitoring techniques of optical fiber composite power cable in smart grid

    NASA Astrophysics Data System (ADS)

    Sun, Zhihui; Liu, Yuan; Wang, Chang; Liu, Tongyu

    2011-11-01

    Optical fiber composite power cables such as the optical phase conductor (OPPC) are significant for the development of the smart grid. This paper discusses distributed condition monitoring techniques for the OPPC, which adopts embedded single-mode fiber as the sensing medium. By applying optical time-domain reflectometry and laser Raman scattering, high-resolution spatial positioning and high-precision distributed temperature measurement are achieved, and OPPC cable condition parameters, including temperature and its location, current-carrying capacity, and the location of fractures and losses, can be monitored online. An OPPC distributed condition monitoring experimental system is set up, and the main parts, including the pulsed fiber laser, weak Raman signal reception, high-speed acquisition and cumulative-average processing, temperature demodulation, and current-carrying capacity analysis, are introduced. The distributed condition monitoring techniques for the OPPC are significant for power transmission management and security.
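
    Temperature demodulation in Raman-based distributed sensing typically inverts the anti-Stokes/Stokes power ratio; a sketch under that assumed model, with illustrative constants:

      # Raman-ratio demodulation under the standard model
      # R(T) = (lam_s/lam_as)**4 * exp(-h*dnu/(kB*T)), solved for T.
      import math

      h, kB = 6.626e-34, 1.381e-23       # Planck, Boltzmann constants
      dnu = 1.32e13                      # Raman shift of silica, Hz
      lam_s, lam_as = 1663e-9, 1450e-9   # Stokes / anti-Stokes wavelengths

      def temperature(ratio):
          return h * dnu / (kB * (4 * math.log(lam_s / lam_as) - math.log(ratio)))

      print(temperature(0.18))   # ratio at one fibre position -> ~280 K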

  7. Imaging Analysis of Near-Field Recording Technique for Observation of Biological Specimens

    NASA Astrophysics Data System (ADS)

    Moriguchi, Chihiro; Ohta, Akihiro; Egami, Chikara; Kawata, Yoshimasa; Terakawa, Susumu; Tsuchimori, Masaaki; Watanabe, Osamu

    2006-07-01

    We present an analysis of the properties of an imaging system based on a near-field recording technique, in comparison with simulation results. In the system, the optical field distributions localized near the specimens are recorded as surface topographic distributions on a photosensitive film. It is possible to observe both soft and moving specimens, because the system does not require a scanning probe to obtain the observed image. The imaging properties are evaluated using fine structures of paramecium, and we demonstrate that it is possible to observe minute differences in refractive indices.

  8. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks accruing to the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.
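
    As an illustration of the factor-combination step, the sketch below propagates hypothetical factor distributions through a Monte Carlo simulation with numpy; the distributions and their parameters are invented, not the Ulysses safety-analysis values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000   # Monte Carlo trials

    # Invented factor distributions for one accident scenario.
    p_release = rng.beta(2, 50, N)                   # probability fuel is released
    source_term = rng.lognormal(0.0, 1.0, N)         # activity released, given release
    dose_factor = rng.triangular(0.1, 0.5, 2.0, N)   # consequence per unit activity

    # Combine the factor effects to get the consequence distribution.
    released = rng.random(N) < p_release
    consequence = np.where(released, source_term * dose_factor, 0.0)

    print("mean consequence:", consequence.mean())
    print("99th percentile :", np.percentile(consequence, 99))
    ```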

  9. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  10. Analysis of thrips distribution: application of spatial statistics and Kriging

    Treesearch

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
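
    A minimal ordinary-kriging sketch in the spirit of the abstract, using the third-party PyKrige package (an assumption; the authors' software is not named) on invented thrips counts.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging   # third-party PyKrige package

    # Invented thrips counts at sampled field coordinates (metres).
    x = np.array([0.0, 10.0, 20.0, 30.0, 5.0, 25.0])
    y = np.array([0.0, 5.0, 10.0, 0.0, 20.0, 25.0])
    counts = np.array([3.0, 7.0, 12.0, 5.0, 9.0, 15.0])

    # Ordinary kriging with a spherical variogram; ss is the kriging variance,
    # which quantifies prediction uncertainty away from the sample points.
    ok = OrdinaryKriging(x, y, counts, variogram_model="spherical")
    gridx = np.arange(0.0, 31.0, 1.0)
    gridy = np.arange(0.0, 26.0, 1.0)
    z_pred, ss = ok.execute("grid", gridx, gridy)

    print("predicted surface shape:", z_pred.shape)   # (len(gridy), len(gridx))
    ```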

  11. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    PubMed

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value-pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  12. Vibration monitoring of a helicopter blade model using the optical fiber distributed strain sensing technique.

    PubMed

    Wada, Daichi; Igawa, Hirotaka; Kasai, Tokio

    2016-09-01

    We demonstrate a dynamic distributed monitoring technique using a long-length fiber Bragg grating (FBG) interrogated by optical frequency domain reflectometry (OFDR) that measures strain at a rate of 150 Hz with a spatial resolution of 1 mm over a measurement range of 20 m. A 5 m FBG is bonded to a 5.5 m helicopter blade model, and vibration is applied by the step relaxation method. The time domain responses of the strain distributions are measured, and the blade deflections are calculated based on the strain distributions. Frequency response functions are obtained using the time domain responses of the calculated deflection induced by the preload release, and the modal parameters are retrieved. Experimental results demonstrate the dynamic monitoring performance of the OFDR-FBG technique and its applicability to modal analysis.

  13. Discussion of NAEG distribution and inventory program sampling data in preparation for initiation of phase III

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brady, D.N.; Church, B.W.; White, M.G.

    Soil sampling activities during 1974 were concentrated in Area 5 of the Nevada Test Site (NTS). Area 5 has been assigned the highest priority because of the number of atmospheric test events held and a wide distribution of contaminants. Improved sampling techniques are described. Preliminary data analysis aided in designing a program to infer ²³⁹⁻²⁴⁰Pu results by Ge(Li) scanning techniques. (auth)

  14. Determination of a Limited Scope Network's Lightning Detection Efficiency

    NASA Technical Reports Server (NTRS)

    Rompala, John T.; Blakeslee, R.

    2008-01-01

    This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information regarding site signal detection thresholds, the type of solution algorithm used, and range attenuation to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
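
    The paper's exact attenuation and solution models are not given; the sketch below illustrates the general idea with a toy 1/r attenuation law, invented site thresholds, and a requirement that at least two sites trigger for a location solution.

    ```python
    import numpy as np

    # Hypothetical sparse network: sensor coordinates (km) and trigger thresholds.
    sites = np.array([[0.0, 0.0], [120.0, 10.0], [60.0, 90.0]])
    thresholds = np.array([1.0, 1.2, 1.1])      # minimum received amplitude units
    N_REQUIRED = 2                              # sites needed for a solution

    # Empirical peak-current distribution recast as event probabilities.
    currents = np.arange(5.0, 205.0, 5.0)       # kA bins
    pcd = np.exp(-currents / 30.0)
    pcd /= pcd.sum()

    def detection_efficiency(fx, fy):
        """Fraction of the PCD detectable for a flash at (fx, fy),
        assuming a simple 1/r range-attenuation model."""
        r = np.hypot(sites[:, 0] - fx, sites[:, 1] - fy)   # km to each site
        received = currents[:, None] / np.maximum(r, 1.0)[None, :]
        solved = (received >= thresholds[None, :]).sum(axis=1) >= N_REQUIRED
        return pcd[solved].sum()

    for pt in [(30.0, 30.0), (200.0, 200.0)]:
        print(pt, "DE = %.2f" % detection_efficiency(*pt))
    ```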

  15. Using classification tree analysis to predict oak wilt distribution in Minnesota and Texas

    Treesearch

    Marla c. Downing; Vernon L. Thomas; Jennifer Juzwik; David N. Appel; Robin M. Reich; Kim Camilli

    2008-01-01

    We developed a methodology and compared results for predicting the potential distribution of Ceratocystis fagacearum (causal agent of oak wilt), in both Anoka County, MN, and Fort Hood, TX. The Potential Distribution of Oak Wilt (PDOW) utilizes a binary classification tree statistical technique that incorporates: geographical information systems (GIS...

  16. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.
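
    As a sketch of one listed technique, here is a challenge/response exchange built on a keyed hash; the single shared key is a simplification (a fielded system would derive per-card keys from a master key), and all names are illustrative.

    ```python
    import hmac, hashlib, os

    # Shared secret personalised into the card at issuance (illustrative only;
    # a fielded system would derive per-card keys from a master key).
    CARD_KEY = b"per-card-secret-key"

    def card_response(challenge: bytes) -> bytes:
        """Card-side computation: a keyed MAC over the terminal's challenge."""
        return hmac.new(CARD_KEY, challenge, hashlib.sha256).digest()

    def terminal_accepts(card) -> bool:
        """Terminal-side check; needs no online host or card-number database."""
        challenge = os.urandom(16)     # fresh nonce defeats replayed responses
        expected = hmac.new(CARD_KEY, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, card(challenge))

    print("authentic card accepted:", terminal_accepts(card_response))
    print("counterfeit rejected:", not terminal_accepts(lambda c: b"\x00" * 32))
    ```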

  17. Nondestructive ultrasonic characterization of armor grade silicon carbide

    NASA Astrophysics Data System (ADS)

    Portune, Andrew Richard

    Ceramic materials have traditionally been chosen for armor applications for their superior mechanical properties and low densities. At high strain rates seen during ballistic events, the behavior of these materials relies upon the total volumetric flaw concentration more so than any single anomalous flaw. In this context, flaws can be defined as any microstructural feature that degrades the performance of the material, potentially including secondary phases, pores, or unreacted sintering additives. Predicting the performance of armor grade ceramic materials depends on knowledge of the absolute and relative concentration and size distribution of bulk heterogeneities. Ultrasound was chosen as a nondestructive technique for characterizing the microstructure of dense silicon carbide ceramics. Acoustic waves interact elastically with grains and inclusions in large sample volumes, and were well suited to determine concentration and size distribution variations for solid inclusions. Methodology was developed for rapid acquisition and analysis of attenuation coefficient spectra. Measurements were conducted at individual points and over large sample areas using a novel technique termed scanning acoustic spectroscopy. Loss spectra were split into absorption- and scattering-dominant frequency regimes to simplify analysis. The primary absorption mechanism in polycrystalline silicon carbide was identified as thermoelastic in nature. Correlations between microstructural conditions and parameters within the absorption equation were established through study of commercial and custom engineered SiC materials. Nonlinear least squares regression analysis was used to estimate the size distributions of boron carbide and carbon inclusions within commercial SiC materials. This technique was shown to additionally be capable of approximating grain size distributions in engineered SiC materials which did not contain solid inclusions. Comparisons to results from electron microscopy exhibited favorable agreement between predicted and observed distributions. Developed techniques were applied to large sample areas using scanning acoustic spectroscopy to map variations in the size distribution and concentration of grains and solid inclusions within the bulk microstructure. The experiments performed in this thesis form the foundation of a novel characterization technique capable of mapping variations in sample composition which could be extended to a wide range of dense polycrystalline heterogeneous materials.

  18. Two dimensional Fourier transform methods for fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Bhat, G.

    An overview of the use of FFTs for fringe pattern analysis is presented, with emphasis on fringe patterns containing displacement information. The techniques are illustrated via analysis of the displacement and strain distributions in the direction perpendicular to the loading, in a disk under diametral compression. The experimental strain distribution is compared to the theoretical, and the agreement is found to be excellent in regions where the elasticity solution models well the actual problem.
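
    A compact sketch of the Fourier-transform fringe-analysis idea (Takeda-style carrier demodulation) on a synthetic fringe pattern; the carrier frequency and filter width are arbitrary choices, not values from the paper.

    ```python
    import numpy as np

    # Synthetic fringe pattern: a spatial carrier modulated by a smooth phase map.
    N = 256
    x, y = np.meshgrid(np.arange(N), np.arange(N))
    phi = 2.4 * np.exp(-((x - 128)**2 + (y - 128)**2) / 4000.0)   # true phase, rad
    f0 = 16.0 / N                                    # carrier, cycles per pixel
    fringes = 1 + np.cos(2 * np.pi * f0 * x + phi)

    # FFT, isolate one carrier sideband, inverse FFT to get the analytic signal.
    F = np.fft.fftshift(np.fft.fft2(fringes))
    cx = N // 2 + int(round(f0 * N))                 # sideband column index
    mask = np.zeros_like(F)
    mask[:, cx - 8:cx + 8] = 1.0
    analytic = np.fft.ifft2(np.fft.ifftshift(F * mask))

    # Demodulate by the conjugate carrier; the argument approximates the phase,
    # from which displacement and strain fields would follow.
    recovered = np.angle(analytic * np.exp(-2j * np.pi * f0 * x))
    print("true peak %.2f rad, recovered peak %.2f rad" % (phi.max(), recovered.max()))
    ```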

  19. Computing distance distributions from dipolar evolution data with overtones: RIDME spectroscopy with Gd(iii)-based spin labels.

    PubMed

    Keller, Katharina; Mertens, Valerie; Qi, Mian; Nalepa, Anna I; Godt, Adelheid; Savitsky, Anton; Jeschke, Gunnar; Yulikov, Maxim

    2017-07-21

    Extraction of distance distributions between high-spin paramagnetic centers from relaxation induced dipolar modulation enhancement (RIDME) data is affected by the presence of overtones of dipolar frequencies. As previously proposed, we account for these overtones by using a modified kernel function in Tikhonov regularization analysis. This paper analyzes the performance of such an approach on a series of model compounds with the Gd(iii)-PyMTA complex serving as paramagnetic high-spin label. We describe the calibration of the overtone coefficients for the RIDME kernel, demonstrate the accuracy of distance distributions obtained with this approach, and show that for our series of Gd-rulers RIDME technique provides more accurate distance distributions than Gd(iii)-Gd(iii) double electron-electron resonance (DEER). The analysis of RIDME data including harmonic overtones can be performed using the MATLAB-based program OvertoneAnalysis, which is available as open-source software from the web page of ETH Zurich. This approach opens a perspective for the routine use of the RIDME technique with high-spin labels in structural biology and structural studies of other soft matter.
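
    A schematic Tikhonov inversion with an overtone-bearing kernel; the overtone coefficients below are placeholders rather than the calibrated values reported for Gd(iii)-PyMTA, and no non-negativity constraint is imposed, unlike production tools such as the one named in the abstract.

    ```python
    import numpy as np

    def dipolar_kernel(t, r, overtones=(0.7, 0.2, 0.1)):
        """Powder-averaged dipolar kernel with harmonic overtones:
        K(t, r) = sum_n c_n <cos(n w_dd(r) (1 - 3 cos^2 theta) t)>_theta.
        The overtone coefficients c_n are placeholders, not calibrated values."""
        w = 2 * np.pi * 52.04 / r**3            # dipolar frequency, rad/us (r in nm)
        cos_th = np.linspace(0, 1, 201)
        K = np.zeros((t.size, r.size))
        for n, c in enumerate(overtones, start=1):
            arg = (n * w[None, None, :] *
                   (1 - 3 * cos_th[:, None, None]**2) * t[None, :, None])
            K += c * np.cos(arg).mean(axis=0)   # average over powder orientations
        return K

    t = np.linspace(0, 2.0, 200)                # evolution time, microseconds
    r = np.linspace(1.5, 6.0, 120)              # distance axis, nm

    # Synthetic data from a Gaussian distance distribution plus noise.
    p_true = np.exp(-0.5 * ((r - 3.0) / 0.25)**2)
    K = dipolar_kernel(t, r)
    data = K @ p_true + 0.01 * np.random.default_rng(1).normal(size=t.size)

    # Tikhonov regularization: minimise ||K p - d||^2 + alpha^2 ||L p||^2
    # with L the second-derivative operator, via one stacked least squares.
    alpha = 1.0
    L = np.diff(np.eye(r.size), n=2, axis=0)
    A = np.vstack([K, alpha * L])
    b = np.concatenate([data, np.zeros(L.shape[0])])
    p_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("distance of max P(r): %.2f nm" % r[np.argmax(p_fit)])
    ```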

  20. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

    This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems to develop new equations for the stability and performance analysis of such systems that are of lower dimension than those currently available. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example where the communication links are allowed to fail randomly.
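
    The paper works with the lower-dimensional symmetric Kronecker product; the sketch below instead uses the plain Kronecker form of the standard mean-square stability test for i.i.d. jump linear systems (stability iff the spectral radius of E[A⊗A] is below one), with invented mode matrices.

    ```python
    import numpy as np

    # Two hypothetical closed-loop modes: nominal dynamics and a degraded
    # mode in which a communication link drops out.
    A_nominal = np.array([[0.9, 0.2],
                          [0.0, 0.7]])
    A_dropout = np.array([[1.1, 0.2],
                          [0.0, 0.8]])
    p = [0.95, 0.05]        # i.i.d. mode probabilities per step

    # Second-moment (mean-square) stability test for an i.i.d. jump linear
    # system: the spectral radius of E[A kron A] must be strictly below 1.
    M = p[0] * np.kron(A_nominal, A_nominal) + p[1] * np.kron(A_dropout, A_dropout)
    rho = max(abs(np.linalg.eigvals(M)))
    print("spectral radius of E[A kron A]: %.4f -> %s"
          % (rho, "mean-square stable" if rho < 1 else "unstable"))
    ```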

  1. Distributed intelligent data analysis in diabetic patient management.

    PubMed Central

    Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.

    1996-01-01

    This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655

  2. Elemental imaging at the nanoscale: NanoSIMS and complementary techniques for element localisation in plants.

    PubMed

    Moore, Katie L; Lombi, Enzo; Zhao, Fang-Jie; Grovenor, Chris R M

    2012-04-01

    The ability to locate and quantify elemental distributions in plants is crucial to understanding plant metabolisms, the mechanisms of uptake and transport of minerals and how plants cope with toxic elements or elemental deficiencies. High-resolution secondary ion mass spectrometry (SIMS) is emerging as an important technique for the analysis of biological material at the subcellular scale. This article reviews recent work using the CAMECA NanoSIMS to determine elemental distributions in plants. The NanoSIMS is able to map elemental distributions at high resolution, down to 50 nm, and can detect very low concentrations (milligrams per kilogram) for some elements. It is also capable of mapping almost all elements in the periodic table (from hydrogen to uranium) and can distinguish between stable isotopes, which allows the design of tracer experiments. In this review, particular focus is placed upon studying the same or similar specimens with both the NanoSIMS and a wide range of complementary techniques, showing how the advantages of each technique can be combined to provide a fuller data set to address complex scientific questions. Techniques covered include optical microscopy, synchrotron techniques, including X-ray fluorescence and X-ray absorption spectroscopy, transmission electron microscopy, electron probe microanalysis, particle-induced X-ray emission and inductively coupled plasma mass spectrometry. Some of the challenges associated with sample preparation of plant material for SIMS analysis, the artefacts and limitations of the technique and future trends are also discussed.

  3. Flood frequency analysis using optimization techniques : final report.

    DOT National Transportation Integrated Search

    1992-10-01

    This study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...
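
    A minimal LP3 fit under the usual convention of fitting a Pearson type 3 distribution to log-transformed flows, using scipy; the flow values are invented and the study's optimization-based estimation details are not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    # Invented annual peak flows (cfs); LP3 works on the log-transformed data.
    flows = np.array([12000, 8500, 23000, 15500, 9800, 31000, 14200,
                      11000, 19800, 7600, 26500, 13400, 17900, 10100])
    logq = np.log10(flows)

    # Fit a Pearson type 3 distribution to log10(Q); this is the LP3 model.
    skew, loc, scale = stats.pearson3.fit(logq)

    # 100-year flood estimate: the 99th percentile of the fitted distribution.
    q100 = 10 ** stats.pearson3.ppf(0.99, skew, loc=loc, scale=scale)
    print("estimated 100-year flood: %.0f cfs" % q100)
    ```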

  4. Probabilistic thermal-shock strength testing using infrared imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wereszczak, A.A.; Scheidt, R.A.; Ferber, M.K.

    1999-12-01

    A thermal-shock strength-testing technique has been developed that uses a high-resolution, high-temperature infrared camera to capture a specimen's surface temperature distribution at fracture. Aluminum nitride (AlN) substrates are thermally shocked to fracture to demonstrate the technique. The surface temperature distribution for each test and AlN's thermal expansion are used as input in a finite-element model to determine the thermal-shock strength for each specimen. An uncensored thermal-shock strength Weibull distribution is then determined. The test and analysis algorithm show promise as a means to characterize thermal shock strength of ceramic materials.
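
    A sketch of the final statistical step, fitting an uncensored two-parameter Weibull distribution to back-calculated strengths with scipy; the strength values are invented.

    ```python
    import numpy as np
    from scipy import stats

    # Invented thermal-shock strengths (MPa) from finite-element back-calculation.
    strengths = np.array([312, 287, 345, 298, 330, 276, 301, 318, 292, 339,
                          284, 307, 326, 295, 310], dtype=float)

    # Two-parameter Weibull fit (location fixed at zero, as is usual for
    # ceramic strength): returns the modulus m and characteristic strength.
    m, loc, sigma0 = stats.weibull_min.fit(strengths, floc=0)
    print("Weibull modulus m = %.1f, characteristic strength = %.0f MPa"
          % (m, sigma0))

    # Failure probability at a 250 MPa thermal-shock stress.
    print("P(failure at 250 MPa) = %.3f"
          % stats.weibull_min.cdf(250, m, loc=0, scale=sigma0))
    ```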

  5. Comparative analysis of ferroelectric domain statistics via nonlinear diffraction in random nonlinear materials.

    PubMed

    Wang, B; Switowski, K; Cojocaru, C; Roppo, V; Sheng, Y; Scalora, M; Kisielewski, J; Pawlak, D; Vilaseca, R; Akhouayri, H; Krolikowski, W; Trull, J

    2018-01-22

    We present an indirect, non-destructive optical method for domain statistic characterization in disordered nonlinear crystals having homogeneous refractive index and spatially random distribution of ferroelectric domains. This method relies on the analysis of the wave-dependent spatial distribution of the second harmonic, in the plane perpendicular to the optical axis in combination with numerical simulations. We apply this technique to the characterization of two different media, Calcium Barium Niobate and Strontium Barium Niobate, with drastically different statistical distributions of ferroelectric domains.

  6. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  7. Development of a laser-induced heat flux technique for measurement of convective heat transfer coefficients in a supersonic flowfield

    NASA Technical Reports Server (NTRS)

    Porro, A. Robert; Keith, Theo G., Jr.; Hingst, Warren R.; Chriss, Randall M.; Seablom, Kirk D.

    1991-01-01

    A technique is developed to measure the local convective heat transfer coefficient on a model surface in a supersonic flow field. The technique uses a laser to apply a discrete local heat flux at the model test surface, and an infrared camera system determines the local temperature distribution due to heating. From this temperature distribution and an analysis of the heating process, a local convective heat transfer coefficient is determined. The technique was used to measure the local surface convective heat transfer coefficient distribution on a flat plate at nominal Mach numbers of 2.5, 3.0, 3.5, and 4.0. The flat plate boundary layer initially was laminar and became transitional in the measurement region. The experimental results agreed reasonably well with theoretical predictions of convective heat transfer of flat plate laminar boundary layers. The results indicate that this non-intrusive optical measurement technique has the potential to obtain high quality surface convective heat transfer measurements in high speed flowfields.
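
    At its core the data reduction is a surface energy balance, h = q'' / (T_heated - T_adiabatic); the sketch below shows that arithmetic with invented numbers and ignores the lateral-conduction corrections that the full analysis of the heating process would include.

    ```python
    # Invented values: absorbed laser heat flux and IR-camera temperatures.
    q_laser = 2.0e4            # absorbed heat flux at the heated spot, W/m^2
    T_heated = 338.0           # steady surface temperature with laser on, K
    T_adiabatic = 320.0        # surface (recovery) temperature with laser off, K

    # Newton's law of cooling at the surface: q'' = h * (T_s - T_aw), so the
    # local convective coefficient follows from the measured temperature rise.
    h = q_laser / (T_heated - T_adiabatic)
    print("local h = %.0f W/(m^2 K)" % h)

    # Express as a Stanton number for comparison with flat-plate correlations.
    rho_u_cp = 1.2 * 600.0 * 1005.0        # rho*U*cp for an assumed freestream
    print("Stanton number = %.2e" % (h / rho_u_cp))
    ```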

  8. A laser-induced heat flux technique for convective heat transfer measurements in high speed flows

    NASA Technical Reports Server (NTRS)

    Porro, A. R.; Keith, T. G., Jr.; Hingst, W. R.

    1991-01-01

    A technique is developed to measure the local convective heat transfer coefficient on a model surface in a supersonic flow field. The technique uses a laser to apply a discrete local heat flux at the model test surface, and an infrared camera system determines the local temperature distribution due to the heating. From this temperature distribution and an analysis of the heating process, a local convective heat transfer coefficient is determined. The technique was used to measure the local surface convective heat transfer coefficient distribution on a flat plate at nominal Mach numbers of 2.5, 3.0, 3.5, and 4.0. The flat plate boundary layer initially was laminar and became transitional in the measurement region. The experimentally determined convective heat transfer coefficients were generally higher than the theoretical predictions for flat plate laminar boundary layers. However, the results indicate that this nonintrusive optical measurement technique has the potential to measure surface convective heat transfer coefficients in high speed flow fields.

  10. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  11. Application of Image Analysis for Characterization of Spatial Arrangements of Features in Microstructure

    NASA Technical Reports Server (NTRS)

    Louis, Pascal; Gokhale, Arun M.

    1995-01-01

    A number of microstructural processes are sensitive to the spatial arrangements of features in microstructure. However, very little attention has been given in the past to the experimental measurements of the descriptors of microstructural distance distributions due to the lack of practically feasible methods. We present a digital image analysis procedure to estimate the micro-structural distance distributions. The application of the technique is demonstrated via estimation of K function, radial distribution function, and nearest-neighbor distribution function of hollow spherical carbon particulates in a polymer matrix composite, observed in a metallographic section.
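
    A sketch of one of the named descriptors, the nearest-neighbor distance distribution function G(r), computed for a simulated point pattern and compared against the complete-spatial-randomness expectation; scipy's cKDTree handles the neighbor queries.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 100, size=(500, 2))   # particle centers on a section, um

    # Nearest-neighbor distances: query k=2 because the closest hit of each
    # point is the point itself at distance zero.
    tree = cKDTree(pts)
    d, _ = tree.query(pts, k=2)
    nn = d[:, 1]

    # Empirical nearest-neighbor distribution function G(r).
    r = np.linspace(0, nn.max(), 50)
    G = (nn[None, :] <= r[:, None]).mean(axis=1)

    # For complete spatial randomness, G(r) = 1 - exp(-lambda * pi * r^2).
    lam = len(pts) / (100.0 * 100.0)
    G_csr = 1 - np.exp(-lam * np.pi * r**2)
    print("max |G - G_csr| =", np.abs(G - G_csr).max())
    ```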

  12. Pulsed Laser Ablation-Induced Green Synthesis of TiO2 Nanoparticles and Application of Novel Small Angle X-Ray Scattering Technique for Nanoparticle Size and Size Distribution Analysis.

    PubMed

    Singh, Amandeep; Vihinen, Jorma; Frankberg, Erkka; Hyvärinen, Leo; Honkanen, Mari; Levänen, Erkki

    2016-12-01

    This paper aims to introduce small angle X-ray scattering (SAXS) as a promising technique for measuring the size and size distribution of TiO2 nanoparticles. In this manuscript, pulsed laser ablation in liquids (PLAL) has been demonstrated as a quick and simple technique for synthesizing TiO2 nanoparticles directly into deionized water as a suspension from titanium targets. Spherical TiO2 nanoparticles with diameters in the range 4-35 nm were observed with transmission electron microscopy (TEM). X-ray diffraction (XRD) showed highly crystalline nanoparticles that comprised two main photoactive phases of TiO2: anatase and rutile. However, the presence of minor amounts of brookite was also reported. The traditional methods for nanoparticle size and size distribution analysis, such as electron microscopy-based methods, are time-consuming. In this study, we have proposed and validated SAXS as a promising method for characterizing the size and size distribution of laser-ablated TiO2 nanoparticles by comparing SAXS- and TEM-measured nanoparticle size and size distribution. SAXS- and TEM-measured size distributions closely followed each other for each sample, and the size distributions in both showed maxima at the same nanoparticle size. The SAXS-measured nanoparticle diameters were slightly larger than the respective diameters measured by TEM. This is because SAXS measures an agglomerate consisting of several particles as one large particle, which slightly increases the mean diameter. TEM- and SAXS-measured mean diameters, when plotted together, showed a similar trend in size variation as the laser power was changed, which, along with the closely matching TEM and SAXS size distributions, validates the application of SAXS for size-distribution measurement of the synthesized TiO2 nanoparticles.

  13. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a data-driven method, based on the K-means clustering algorithm, for generating planning scenarios for the location and size of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China and the results are discussed.
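
    The TOPSIS ranking step is straightforward to sketch; the candidate plans, criteria, and weights below are invented, and the K-means scenario generation and GA search are omitted.

    ```python
    import numpy as np

    # Invented Pareto solutions x criteria matrix: rows are candidate PV
    # siting/sizing plans; columns are [losses, cost, profit, voltage offset].
    X = np.array([[1.2, 400.0,  90.0, 0.8],
                  [1.0, 520.0, 110.0, 1.1],
                  [1.5, 350.0,  70.0, 0.6],
                  [0.9, 610.0, 130.0, 1.4]])
    benefit = np.array([False, False, True, False])   # profit is the only benefit
    w = np.array([0.3, 0.3, 0.2, 0.2])                # criterion weights

    # TOPSIS: vector-normalise, weight, measure distances to the ideal and
    # anti-ideal points, and rank by relative closeness.
    V = w * X / np.linalg.norm(X, axis=0)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)
    print("ranking (best first):", np.argsort(-closeness))
    ```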

  14. Poster — Thur Eve — 74: Distributed, asynchronous, reactive dosimetric and outcomes analysis using DICOMautomaton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.

    2014-08-15

    Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created, currently available analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.

  15. Actinide bioimaging in tissues: Comparison of emulsion and solid track autoradiography techniques with the iQID camera

    PubMed Central

    Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.

    2017-01-01

    This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based, plastic-based autoradiography and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performances of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution and feasibility of activity quantification. The obtained autoradiographs presented similar actinide distribution at low magnification. Out of the three techniques, emulsion autoradiography is the only one to provide a highly-resolved image of the actinide distribution inherently superimposed on the biological sample. Emulsion autoradiography is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled the differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy. The most recent technique, the iQID camera, presents several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary and the choice of the technique depends on the purpose of the imaging experiment. PMID:29023595

  16. Variability Extraction and Synthesis via Multi-Resolution Analysis using Distribution Transformer High-Speed Power Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Mather, Barry A

    A library of load variability classes is created to produce scalable synthetic data sets using historical high-speed raw data. These data are collected from distribution monitoring units connected at the secondary side of a distribution transformer. Because of the irregular patterns and large volume of historical high-speed data sets, the utilization of current load characterization and modeling techniques are challenging. Multi-resolution analysis techniques are applied to extract the necessary components and eliminate the unnecessary components from the historical high-speed raw data to create the library of classes, which are then utilized to create new synthetic load data sets. A validation is performed to ensure that the synthesized data sets contain the same variability characteristics as the training data sets. The synthesized data sets are intended to be utilized in quasi-static time-series studies for distribution system planning studies on a granular scale, such as detailed PV interconnection studies.
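
    A minimal multi-resolution decomposition/synthesis sketch using the third-party PyWavelets package (an assumption; the report's tooling is not named), with a synthetic load trace standing in for distribution-transformer data.

    ```python
    import numpy as np
    import pywt   # PyWavelets, assumed available

    rng = np.random.default_rng(3)
    t = np.arange(4096)

    # Synthetic high-speed transformer load: slow trend + fast variability.
    load = 50 + 20 * np.sin(2 * np.pi * t / 4096) + 3 * rng.standard_normal(t.size)

    # Multi-resolution analysis: decompose, then reconstruct the slow
    # (approximation) and fast (detail) components separately.
    coeffs = pywt.wavedec(load, "db4", level=6)
    approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    detail_only = [np.zeros_like(coeffs[0])] + coeffs[1:]
    slow = pywt.waverec(approx_only, "db4")
    fast = pywt.waverec(detail_only, "db4")

    # A synthetic trace pairs the slow component with variability drawn
    # from a matching class in the extracted-variability library.
    synthetic = slow + np.std(fast) * rng.standard_normal(slow.size)
    print("variance: original %.1f, synthetic %.1f" % (load.var(), synthetic.var()))
    ```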

  17. Research on the degradation of tropical arable land soil: Part II. The distribution of soil nutrients in eastern part of Hainan Island

    NASA Astrophysics Data System (ADS)

    Wang, Dengfeng; Wei, Zhiyuan; Qi, Zhiping

    Research on the temporal and spatial distribution of soil nutrients in tropical arable land is important for promoting sustainable tropical agriculture. Taking the eastern part of Hainan as the research area, GIS spatial analysis techniques were applied to analyze the temporal and spatial variation of soil N, P, and K contents in arable land. The results indicate that the contents of soil N, P, and K were 0.28%, 0.20%, and 1.75%, respectively, in 2005. The concentrations of total N and P in arable land soil increased significantly from the 1980s to 2005. The variations in soil nutrient contents were closely related to the application of chemical fertilizers in recent years, and the uneven distribution of soil nutrient contents reflected fertilizer application in the research area. Fertilization can be planned based on the distribution of soil nutrients and spatial analysis techniques, so as to sustain the balance of soil nutrient contents.

  18. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
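
    Velocity-vector mapping of this kind can be sketched with dense optical flow; the example below uses OpenCV's Farneback routine (an assumed tool, not necessarily the study's) on a synthetic frame pair, then averages the flow over patches to form a vector map.

    ```python
    import numpy as np
    import cv2   # OpenCV, assumed available

    # Synthetic bone-image frame pair: smoothed noise, shifted 2 px caudally.
    rng = np.random.default_rng(6)
    raw = rng.uniform(0, 255, (256, 256)).astype(np.uint8)
    prev = cv2.GaussianBlur(raw, (0, 0), 5)
    curr = np.roll(prev, 2, axis=0)

    # Dense optical flow (Farneback) gives a velocity vector at every pixel;
    # averaging over patches yields a local-area vector map.
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    p = 32   # patch size in pixels
    vy = flow[..., 1].reshape(256 // p, p, 256 // p, p).mean(axis=(1, 3))
    print("mean vertical velocity per patch (px/frame):")
    print(np.round(vy, 1))
    ```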

  19. Visualization and Analysis for Near-Real-Time Decision Making in Distributed Workflows

    DOE PAGES

    Pugmire, David; Kress, James; Choi, Jong; ...

    2016-08-04

    Data-driven science is becoming increasingly common and complex, and is placing tremendous stresses on visualization and analysis frameworks. Data sources producing 10 GB per second (and more) are becoming increasingly commonplace in the simulation, sensor, and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query, and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware, and workflow systems. This paper discusses initial research into the visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions about large volumes of time-varying data.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Son, Young-Sun; Yoon, Wang-Jung

    The purpose of this study is to map pyrophyllite distribution at the surface of the Nohwa deposit, Korea, by using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data. For this, a combined Spectral Angle Mapper (SAM) and Matched Filtering (MF) technique was applied. The regional distribution of high-grade and low-grade pyrophyllite in the Nohwa deposit area could be differentiated by this method. The results of this study show that ASTER data analysis using a combination of SAM and MF techniques will assist in exploration of pyrophyllite at the exposed surface.
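
    The SAM core is just the angle between each pixel spectrum and a reference spectrum; the sketch below shows that computation on an invented cube and reference spectrum (the MF step and real ASTER band values are omitted).

    ```python
    import numpy as np

    def spectral_angle(cube, reference):
        """Spectral Angle Mapper: angle (radians) between each pixel spectrum
        in cube (rows x cols x bands) and a reference spectrum (bands,)."""
        dot = np.tensordot(cube, reference, axes=([2], [0]))
        norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
        return np.arccos(np.clip(dot / norms, -1.0, 1.0))

    # Invented 9-band ASTER-like cube and a pyrophyllite reference spectrum.
    rng = np.random.default_rng(7)
    cube = rng.uniform(0.1, 0.6, size=(100, 100, 9))
    pyrophyllite = np.array([0.42, 0.45, 0.47, 0.30, 0.44, 0.46, 0.25, 0.40, 0.41])

    angles = spectral_angle(cube, pyrophyllite)
    match = angles < 0.10     # radians; threshold chosen per scene statistics
    print("pixels classified as pyrophyllite:", int(match.sum()))
    ```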

  1. Effect of restoration technique on stress distribution in roots with flared canals: an FEA study.

    PubMed

    Belli, Sema; Eraslan, Öznur; Eraslan, Oğuz; Eskitaşcıoğlu, Gürcan

    2014-04-01

    The aim of this finite element analysis (FEA) study was to test the effect of different restorative techniques on stress distribution in roots with flared canals. Five three-dimensional (3D) FEA models that simulated a maxillary incisor with excessive structure loss and flared root canals were created and restored with the following techniques/materials: 1) a prefabricated post; 2) one main and two accessory posts; 3) i-TFC post-core (Sun Medical); 4) the thickness of the root was increased by using composite resin and the root was then restored using a prefabricated post; 5) an anatomic post was created by using composite resin and a prefabricated glass-fiber post. Composite cores and ceramic crowns were created. A 300-N static load was applied at the center of the palatal surface of the tooth to calculate stress distributions. SolidWorks/Cosmosworks structural analysis programs were used for FEA analysis. The analysis of the von Mises and tensile stress values revealed that prefabricated post, accessory post, and i-TFC post systems showed similar stress distributions. They all showed high stress areas at the buccal side of the root (3.67 MPa) and in the cervical region of the root (> 3.67 MPa) as well as low stress accumulation within the post space (0 to 1 MPa). The anatomic post kept the stress within its body and directed less stress towards the remaining tooth structure. The creation of an anatomic post may save the remaining tooth structure in roots with flared canals by reducing the stress levels.

  2. Possibilities of LA-ICP-MS technique for the spatial elemental analysis of the recent fish scales: Line scan vs. depth profiling

    NASA Astrophysics Data System (ADS)

    Holá, Markéta; Kalvoda, Jiří; Nováková, Hana; Škoda, Radek; Kanický, Viktor

    2011-01-01

    LA-ICP-MS and solution based ICP-MS in combination with electron microprobe are presented as a method for the determination of the elemental spatial distribution in fish scales which represent an example of a heterogeneous layered bone structure. Two different LA-ICP-MS techniques were tested on recent common carp (Cyprinus carpio) scales: a line scan through the whole fish scale perpendicular to the growth rings. The ablation crater of 55 μm width and 50 μm depth allowed analysis of the elemental distribution in the external layer. Suitable ablation conditions providing a deeper ablation crater gave average values from the external HAP layer and the collagen basal plate. Depth profiling using spot analysis was tested in fish scales for the first time. Spot analysis allows information to be obtained about the depth profile of the elements at the selected position on the sample. The combination of all mentioned laser ablation techniques provides complete information about the elemental distribution in the fish scale samples. The results were compared with the solution based ICP-MS and EMP analyses. The fact that the results of depth profiling are in a good agreement both with EMP and PIXE results and, with the assumed ways of incorporation of the studied elements in the HAP structure, suggests a very good potential for this method.

  3. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
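
    As a sketch of the elicitation-and-encoding step, subjective low/most-likely/high estimates can be encoded as triangular distributions and rolled up by Monte Carlo; all values below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Elicited subjective cost estimates for two program tasks (invented):
    # each expert gives low / most-likely / high values, encoded as
    # triangular distributions.
    task_a = rng.triangular(1.2, 2.0, 4.5, N)   # $M
    task_b = rng.triangular(0.8, 1.5, 2.2, N)   # $M

    # Program-level risk roll-up: sum independent task costs and read off
    # percentiles of the resulting distribution.
    total = task_a + task_b
    p50, p90 = np.percentile(total, [50, 90])
    print("P50 = %.2f $M, P90 = %.2f $M" % (p50, p90))
    ```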

  4. Surface mapping of spike potential fields: experienced EEGers vs. computerized analysis.

    PubMed

    Koszer, S; Moshé, S L; Legatt, A D; Shinnar, S; Goldensohn, E S

    1996-03-01

    An EEG epileptiform spike focus recorded with scalp electrodes is clinically localized by visual estimation of the point of maximal voltage and the distribution of its surrounding voltages. We compared such estimated voltage maps, drawn by experienced electroencephalographers (EEGers), with a computerized spline interpolation technique employed in the commercially available software package FOCUS. Twenty-two spikes were recorded from 15 patients during long-term continuous EEG monitoring. Maps of voltage distribution from the 28 electrodes surrounding the points of maximum change in slope (the spike maximum) were constructed by the EEGer. The same points of maximum spike and voltage distributions at the 29 electrodes were mapped by computerized spline interpolation and a comparison between the two methods was made. The findings indicate that the computerized spline mapping techniques employed in FOCUS construct voltage maps with similar maxima and distributions as the maps created by experienced EEGers. The dynamics of spike activity, including correlations, are better visualized using the computerized technique than by manual interpretation alone. Its use as a technique for spike localization is accurate and adds information of potential clinical value.
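
    A sketch of the computerized mapping step: scattered electrode voltages interpolated to a smooth surface with scipy's griddata (a stand-in for the commercial package's spline interpolation), using an invented electrode layout.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    # Invented flattened 2D electrode positions (cm) and spike voltages (uV).
    pos = np.array([[0, 0], [4, 0], [8, 0], [0, 4], [4, 4], [8, 4],
                    [0, 8], [4, 8], [8, 8]], dtype=float)
    v = np.array([-5, -18, -7, -12, -60, -15, -6, -20, -8], dtype=float)

    # Interpolate the voltage field on a fine grid; 'cubic' gives a smooth
    # spline-like surface over the scalp region between electrodes.
    gx, gy = np.mgrid[0:8:200j, 0:8:200j]
    field = griddata(pos, v, (gx, gy), method="cubic")

    i, j = np.unravel_index(np.nanargmin(field), field.shape)
    print("interpolated maximum negativity at (%.2f, %.2f) cm" % (gx[i, j], gy[i, j]))
    ```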

  5. An inexpensive active optical remote sensing instrument for assessing aerosol distributions.

    PubMed

    Barnes, John E; Sharma, Nimmi C P

    2012-02-01

    Air quality studies on a broad variety of topics, from health impacts to source/sink analyses, require information on the distributions of atmospheric aerosols over both altitude and time. An inexpensive, simple-to-implement, ground-based optical remote sensing technique has been developed to assess aerosol distributions. The technique, called CLidar (Charge Coupled Device Camera Light Detection and Ranging), provides aerosol altitude profiles over time. In the CLidar technique a relatively low-power laser transmits light vertically into the atmosphere. The transmitted laser light scatters off of air molecules, clouds, and aerosols. The entire beam from ground to zenith is imaged using a CCD camera and wide-angle (100 degree) optics which are a few hundred meters from the laser. The CLidar technique is optimized for low-altitude (boundary layer and lower troposphere) measurements where most aerosols are found and where many other profiling techniques face difficulties. Currently the technique is limited to nighttime measurements. Using the CLidar technique aerosols may be mapped over both altitude and time. The instrumentation required is portable and can easily be moved to locations of interest (e.g. downwind from factories or power plants, near highways). This paper describes the CLidar technique, implementation and data analysis and offers specifics for users wishing to apply the technique for aerosol profiles.
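
    The altitude mapping in this side-viewing geometry is simple trigonometry; the sketch below assumes a vertical beam and an invented camera-laser baseline, and ignores refraction and geometric-overlap corrections.

    ```python
    import numpy as np

    D = 300.0                                     # camera-laser baseline, m (assumed)
    elev = np.deg2rad(np.linspace(5, 85, 400))    # elevation angle of each pixel row

    # A pixel at elevation angle theta images the vertical beam at altitude
    # z = D * tan(theta).
    z = D * np.tan(elev)

    # The scattering angle off the upward-travelling beam is 90 deg + theta,
    # needed when converting pixel brightness to aerosol scattering.
    scatter_angle = np.rad2deg(np.pi / 2 + elev)
    print("altitude range: %.0f m to %.0f m" % (z.min(), z.max()))
    print("scattering angles: %.0f to %.0f deg" % (scatter_angle.min(), scatter_angle.max()))
    ```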

  6. Application of the variational-asymptotical method to composite plates

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Lee, Bok W.; Atilgan, Ali R.

    1992-01-01

    A method is developed for the 3D analysis of laminated plate deformation which is an extension of a variational-asymptotical method by Atilgan and Hodges (1991). Both methods are based on the treatment of plate deformation by splitting the 3D analysis into linear through-the-thickness analysis and 2D plate analysis. Whereas the first technique tackles transverse shear deformation in the second asymptotical approximation, the present method simplifies its treatment and restricts it to the first approximation. Both analytical techniques are applied to the linear cylindrical bending problem, and the strain and stress distributions are derived and compared with those of the exact solution. The present theory provides more accurate results than those of the classical laminated-plate theory for the transverse displacement of 2-, 3-, and 4-layer cross-ply laminated plates. The method can give reliable estimates of the in-plane strain and displacement distributions.

  7. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Porosity characterization for heterogeneous shales using integrated multiscale microscopy

    NASA Astrophysics Data System (ADS)

    Rassouli, F.; Andrew, M.; Zoback, M. D.

    2016-12-01

    Pore size distribution analysis plays a critical role in gas storage capacity and fluid transport characterization of shales. Study of the diverse distribution of pore size and structure in such low-permeability rocks is hindered by the lack of tools to visualize the microstructural properties of shale rocks. In this paper we use multiple techniques to investigate the full pore size range at different sample scales. Modern imaging techniques are combined with routine analytical investigations (x-ray diffraction, thin section analysis and mercury porosimetry) to describe pore size distribution of shale samples from Haynesville formation in East Texas to generate a more holistic understanding of the porosity structure in shales, ranging from standard core plug down to nm scales. Standard 1" diameter core plug samples were first imaged using a Versa 3D x-ray microscope at lower resolutions. Then we pick several regions of interest (ROIs) with various micro-features (such as micro-cracks and high organic-matter content) in the rock samples to run higher resolution CT scans using non-destructive interior tomography. After this step, we cut the samples and drill 5 mm diameter cores out of the selected ROIs. Then we rescan the samples to measure porosity distribution of the 5 mm cores. We repeat this step for 1 mm diameter samples cut out of the 5 mm cores using a laser cutting machine. After comparing the pore structure and distribution of the samples measured from micro-CT analysis, we move to nano-scale imaging to capture the ultra-fine pores within the shale samples. At this stage, the diameter of the 1 mm samples will be milled down to 70 microns using the laser beam. We scan these samples in a nano-CT Ultra x-ray microscope and calculate the porosity of the samples by image segmentation methods. Finally, we use images collected from focused ion beam scanning electron microscopy (FIB-SEM) to compare the results of porosity measurements from all different imaging techniques. These multi-scale characterization techniques are then compared with traditional analytical techniques such as mercury porosimetry.

  9. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they would need reliable data. In order to facilitate reliable computational intelligence, we seek to explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass frequency shift keying (FSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed for their effects on targeting performance.
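
    Textbook BER curves give a feel for the modulation tradeoff; the sketch below evaluates coherent BPSK and noncoherent binary FSK error rates versus Eb/N0, plus a crude message-success proxy (the 800-bit message length is invented, and this is not the paper's Link 16 simulation).

    ```python
    import numpy as np
    from scipy.special import erfc

    ebn0_db = np.arange(0, 13, 2)
    ebn0 = 10 ** (ebn0_db / 10.0)

    # Theoretical bit error rates: coherent BPSK vs. noncoherent binary FSK.
    ber_bpsk = 0.5 * erfc(np.sqrt(ebn0))
    ber_fsk = 0.5 * np.exp(-ebn0 / 2.0)

    # Crude targeting-effect proxy: probability an 800-bit track report
    # arrives error-free (no coding), per modulation and Eb/N0.
    bits = 800
    for db, pb, pf in zip(ebn0_db, ber_bpsk, ber_fsk):
        print("Eb/N0=%2d dB  BER bpsk=%.1e fsk=%.1e  P(clean msg) %.3f / %.3f"
              % (db, pb, pf, (1 - pb) ** bits, (1 - pf) ** bits))
    ```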

  10. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
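
    As a sketch of the distributed-testing idea, single observations scattered across the factor space suffice to fit a quadratic response surface by least squares; the factors, true model, and noise level below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Distributed test plan: one observation at each of many factor settings,
    # rather than many repeats at a few clustered settings.
    x1 = rng.uniform(-1, 1, 30)     # coded factor 1 (e.g., Mach number)
    x2 = rng.uniform(-1, 1, 30)     # coded factor 2 (e.g., angle of attack)
    y = 5 + 2*x1 - 1.5*x2 + 0.8*x1*x2 + 1.2*x1**2 + rng.normal(0, 0.2, 30)

    # Fit a full quadratic response surface by least squares (the RSM step).
    X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("fitted RSM coefficients:", np.round(beta, 2))
    ```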

  11. Detection of Frauds and Other Non-technical Losses in Power Utilities using Smart Meters: A Review

    NASA Astrophysics Data System (ADS)

    Ahmad, Tanveer; Ul Hasan, Qadeer

    2016-06-01

    Analysis of losses in power distribution systems and techniques to mitigate them are two active areas of research, especially in energy-scarce countries like Pakistan, as a way to increase the availability of power without installing new generation. Total energy losses comprise both technical losses (TL) and non-technical losses (NTLs). Utility companies in developing countries incur major financial losses due to NTLs, which also lead to a series of additional problems, such as damage to network infrastructure and reduced network reliability. The purpose of this paper is to perform an introductory investigation of non-technical losses in power distribution systems. Additionally, an analysis of NTLs using consumer energy consumption data and linear regression has been carried out. The data cover the low-voltage (LV) distribution network, which includes residential, commercial, agricultural, and industrial consumers, using monthly kWh interval data acquired with smart meters. Different techniques to prevent the illegal use of electricity in the power distribution system are also discussed.
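
    A minimal version of the regression screen: fit a trend to a consumer's historical kWh and flag months with large negative residuals; the consumption series and the 3-sigma threshold are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    months = np.arange(24)

    # Invented monthly kWh for one LV consumer from smart-meter data;
    # consumption drops abruptly after month 15 (possible meter tampering).
    kwh = 300 + 2.0 * months + rng.normal(0, 12, months.size)
    kwh[16:] *= 0.55

    # Fit a linear trend to the pre-drop history and flag months whose
    # residuals fall below a 3-sigma threshold.
    slope, intercept = np.polyfit(months[:16], kwh[:16], 1)
    expected = slope * months + intercept
    resid = kwh - expected
    sigma = resid[:16].std()
    flagged = np.where(resid < -3 * sigma)[0]
    print("months flagged for NTL inspection:", flagged)
    ```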

  12. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
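
    One of the named tests is easy to sketch: a chi-square test of azimuthal uniformity, plus low-order Fourier amplitudes of the angular distribution; the synthetic angles below inject an artificial anisotropy.

    ```python
    import numpy as np
    from scipy.stats import chisquare

    rng = np.random.default_rng(5)

    # Synthetic azimuthal angles (radians) of secondaries from one event,
    # with an injected anisotropic component on top of a uniform background.
    phi = np.concatenate([rng.uniform(0, 2 * np.pi, 300),
                          rng.normal(np.pi / 2, 0.3, 60) % (2 * np.pi)])

    # Null-hypothesis test: are the angles consistent with a uniform
    # (structureless) azimuthal distribution?
    observed, _ = np.histogram(phi, bins=12, range=(0, 2 * np.pi))
    stat, p = chisquare(observed)      # uniform expectation by default
    print("chi2 = %.1f, p = %.2e" % (stat, p))

    # Complementary look: low-order Fourier amplitudes of the distribution.
    for n in (1, 2):
        vn = np.abs(np.mean(np.exp(1j * n * phi)))
        print("|v%d| = %.3f" % (n, vn))
    ```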

  13. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    PubMed

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Of the 6.9 million estimated annual deaths among children under five, over 800,000 (11.6%) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC), the latter since 2005. Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH, but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 cross-sectional nutrition survey datasets of children from 6 to 59 months old and examines different approaches to normalising "non-normal" distributions. The distribution of MUAC showed no departure from normality in 319 (37.7%) of the distributions using the Shapiro-Wilk test. Of the 533 surveys showing departure from normality, 183 (34.3%) were skewed (D'Agostino test) and 196 (36.8%) had a kurtosis different from that of the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size: of the 533 surveys showing departure from normality, 294 (55.2%) showed high digit preference, 164 (30.8%) had a large design effect, and 204 (38.3%) a large sample size. Spline and LOESS smoothing techniques were explored and both performed well: after spline smoothing, 56.7% of the MUAC distributions showing departure from normality were "normalised", and 59.7% after LOESS. Box-Cox power transformation had similar results, with 57% of such distributions approximating normal after transformation. Applying the Box-Cox transformation after spline or LOESS smoothing increased that proportion to 82.4% and 82.7%, respectively. This suggests that statistical approaches relying on the normal distribution assumption can be successfully applied to MUAC. In light of this promising finding, further research is ongoing to evaluate the performance of a normal-distribution-based approach to estimating the prevalence of wasting using MUAC.
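
    The tests and the transformation named above are all available in scipy.stats; a minimal sketch on synthetic (not survey) data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    muac = rng.gamma(shape=60, scale=2.2, size=500)  # mock right-skewed MUAC values, mm

    print(stats.shapiro(muac))        # Shapiro-Wilk: overall departure from normality
    print(stats.skewtest(muac))       # D'Agostino skewness test
    print(stats.kurtosistest(muac))   # Anscombe-Glynn kurtosis test

    transformed, lam = stats.boxcox(muac)  # Box-Cox power transformation
    print(f"lambda={lam:.2f}", stats.shapiro(transformed))
    ```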

  14. Practical application of the benchmarking technique to increase reliability and efficiency of power installations and main heat-mechanic equipment of thermal power plants

    NASA Astrophysics Data System (ADS)

    Rimov, A. A.; Chukanova, T. I.; Trofimov, Yu. V.

    2016-12-01

    Approaches to comparative quality analysis of power installations (benchmarking) applied in the power industry are systematized. It is shown that the most efficient implementation of the benchmarking technique is the analysis of statistical distributions of indicators within a homogeneous group of comparable power installations. Building on this approach, a benchmarking technique is developed that is aimed at revealing the available reserves for improving the reliability and heat-efficiency indicators of power installations at thermal power plants. The technique makes it possible to reliably compare the quality of power installations within a homogeneous group of limited size and to adopt adequate decisions on improving particular technical characteristics of a given installation. It structures the list of comparison indicators and the internal factors affecting them, represented according to the requirements of the sectoral standards and taking into account price-formation characteristics in the Russian power industry; this structuring ensures traceability of the reasons why the internal influencing factors deviate from their specified values. The starting point for detailed analysis of the lag of a given installation's indicators behind best practice, expressed in specific monetary terms, is the positioning of that installation on the distribution of the key indicator, a convolution of the comparison indicators. The distribution of the key indicator is simulated by the Monte Carlo method from the actual distributions of the comparison indicators: specific lost profit due to short supply of electric energy and power, specific cost of losses due to non-optimal expenditures for repairs, and specific cost of excess fuel-equivalent consumption. Quality-loss indicators are developed to facilitate analysis of the benchmarking results; they represent the quality loss of a given installation as the difference between the actual value of the key indicator (or a comparison indicator) and the best quartile of the existing distribution. The uncertainty of the obtained quality-loss values was evaluated by transforming the standard uncertainties of the input values into expanded uncertainties of the output values at a confidence level of 95%. The efficiency of the technique is demonstrated by benchmarking the main thermal and mechanical equipment of T-250 extraction power-generating units and of thermal power plant installations with a main steam pressure of 130 atm.
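
    A schematic illustration of the Monte Carlo convolution step (all distributions and numbers are invented, not taken from the paper): sample the three comparison indicators, sum them into the key indicator, and measure quality loss against the best quartile.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 100_000
    # Hypothetical empirical distributions of the three comparison indicators
    lost_profit = rng.lognormal(mean=2.0, sigma=0.4, size=N)  # short supply of energy/power
    repair_loss = rng.lognormal(mean=1.5, sigma=0.5, size=N)  # non-optimal repair spending
    excess_fuel = rng.lognormal(mean=1.0, sigma=0.3, size=N)  # excess fuel-equivalent use

    key = lost_profit + repair_loss + excess_fuel   # convolution via sampling
    best_quartile = np.percentile(key, 25)          # best-practice boundary (lower is better)

    unit_actual = 18.4                              # this installation's key indicator (invented)
    quality_loss = unit_actual - best_quartile      # money-equivalent loss vs best quartile
    print(f"best quartile={best_quartile:.1f}, quality loss={quality_loss:.1f}")
    ```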

  15. Laboratory characterization of shale pores

    NASA Astrophysics Data System (ADS)

    Nur Listiyowati, Lina

    2018-02-01

    To estimate the potential of a shale gas reservoir, one needs to understand the characteristics of its pore structures. Characterization of shale gas reservoir microstructure is still a challenge due to the ultra-fine grained micro-fabric and micro-level heterogeneity of these sedimentary rocks. The sample used in any analysis is only a small portion of the reservoir, and each measurement technique gives a different result, which raises the question of which methods are suitable for characterizing shale pores. The goal of this paper is to summarize some of the microstructure analysis tools for shale rock that give near-realistic results. Methods for analyzing pore structure fall into two groups: indirect measurement (MIP, He, NMR, LTNA) and direct observation (SEM, TEM, X-ray CT). Shale rocks have high heterogeneity and thus require multiscale quantification techniques to understand their pore structures. To describe the complex pore system of shale, several measurement techniques are needed to characterize the surface area and pore size distribution (LTNA, MIP); the shapes, sizes and distribution of pores (FIB-SEM, TEM, X-ray CT); and the total porosity (He pycnometer, NMR). The choice of techniques and methods should take into account the purpose of the analysis as well as the time and budget available.

  16. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved of the output pressure events in both magnitude and distribution. To address this need, a novel pressure measurement method, Profile Indentation Pressure Evaluation (PIPE), has been developed that systematically analyzes indentation patterns created by impulse events. Correlation with quasi-static test data and the use of software-assisted analysis techniques allows colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided the introduction of a design method for electromagnetic path actuator systems, in which key geometrical variables are considered using a newly developed analysis method called the Path Actuator Proximal Array (PAPA) pressure model. This model accounts for key current distribution and proximity effects and interprets generated pressure by treating the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow pressure requirements to be computed for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall, testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Together, these studies comprehensively explore the effects of geometrical parameters on the magnitude and distribution of pressure generated by impulse manufacturing, establishing key guidelines and models for continued development and implementation in commercial applications.

  17. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
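
    Basic usage, following the package's documented interface (the data here are synthetic):

    ```python
    import numpy as np
    import powerlaw

    data = np.random.zipf(2.5, 10_000).astype(float)  # synthetic heavy-tailed sample
    fit = powerlaw.Fit(data)                          # estimates xmin and alpha
    print(fit.power_law.alpha, fit.power_law.xmin)

    # Likelihood-ratio comparison against a candidate alternative distribution
    R, p = fit.distribution_compare('power_law', 'lognormal')
    print(R, p)   # R > 0 favours the power law; p gives the significance
    ```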

  18. Point pattern analysis of FIA data

    Treesearch

    Chris Woodall

    2002-01-01

    Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...
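
    One simple point-pattern statistic that could be computed from such stem maps is the Clark-Evans nearest-neighbour index; a sketch on synthetic coordinates (not FIA data, and not necessarily the statistic used in the study):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    pts = rng.uniform(0, 50, size=(120, 2))       # stem map: tree coordinates on a 50 m plot

    tree = cKDTree(pts)
    d, _ = tree.query(pts, k=2)                   # k=2: the first neighbour is the point itself
    r_obs = d[:, 1].mean()                        # observed mean nearest-neighbour distance
    density = len(pts) / 50.0**2
    r_exp = 0.5 / np.sqrt(density)                # expectation under complete spatial randomness
    print(f"Clark-Evans R = {r_obs / r_exp:.2f}") # R<1 clustered, ~1 random, >1 regular
    ```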

  19. A comparison of solute-transport solution techniques based on inverse modelling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2000-01-01

    Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.

  20. Usefulness of thermographic analysis to control temperature homogeneity in the development and implementation of a closed recirculating CO2 chemohyperthermia model.

    PubMed

    Padilla-Valverde, David; Sanchez-Garcia, Susana; García-Santos, Esther; Marcote-Ibañez, Carlos; Molina-Robles, Mercedes; Martín-Fernández, Jesús; Villarejo-Campos, Pedro

    2016-09-30

    To determine the effectiveness of thermography for controlling the distribution of abdominal temperature in the development of a closed chemohyperthermia model, we divided the abdominopelvic cavity into nine regions for thermographic analysis according to a modification of the peritoneal carcinomatosis index. A difference of 2.5 °C between and within the quadrants, and the thermographic colours, were used as asymmetry criteria. Preclinical study - rat model: six male athymic nude rats (rnu/rnu), treated with both the closed and the open technique; porcine model: 12 female Large White pigs, four treated with the open technique and eight with the closed CO2-recirculation technique. Clinical pilot study (EUDRACT 2011-006319-69): 18 patients with ovarian cancer were treated with cytoreductive surgery and hyperthermic intraperitoneal chemotherapy (HIPEC) using a closed recirculating CO2 system. Thermographic control and intra-abdominal temperature assessment were performed at baseline, when the outflow temperature reached 41 °C, and at 30 min. The thermographic images showed higher homogeneity of the intra-abdominal temperature in the closed model than with the open technique, and the thermogram showed a homogeneous temperature distribution once circulation of the chemotherapy started. The thermographic temperature maps in the closed porcine model and in the pilot study were correlated, with inflow/outflow temperatures at the half-time of HIPEC of 42/41.4 °C and 42 ± 0.2/41 ± 0.8 °C, respectively. There was no significant impact on the core temperature of patients after the homogeneous temperature distribution was reached. Controlling the homogeneity of the temperature distribution is feasible using infrared digital imaging in closed HIPEC with CO2 recirculation.

  1. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future versions of the PDEMOD code.

  2. Analysis and Modeling of Complex Geomorphic Systems: Technique Development, Data Collection, and Application to Rangeland Terrain

    DTIC Science & Technology

    2008-10-01

    attempts to measure the long-term distribution of storage time have relied on unrealistic assumptions, but two recent studies suggest a new approach. As... sediment age. Everitt (1968) mapped the age distribution of cottonwoods along a 34 km stretch of the Little Missouri River in North Dakota. Dietrich et al. (1982) applied Eriksson's (1971) method to estimate the residence time distribution from Everitt's age distribution. Somewhat mysteriously

  3. Analyzing railroad dispatchers' strategies : a cognitive task analysis of a distributed planning task

    DOT National Transportation Integrated Search

    1998-10-11

    This paper describes a preliminary cognitive task analysis (CTA) that is being conducted to examine how experienced train dispatchers manage and schedule trains. The CTA uses ethnographic field observations and structured interview techniques. The ob...

  4. A novel method for the investigation of liquid/liquid distribution coefficients and interface permeabilities applied to the water-octanol-drug system.

    PubMed

    Stein, Paul C; di Cagno, Massimiliano; Bauer-Brandl, Annette

    2011-09-01

    In this work a new, accurate and convenient technique for the measurement of distribution coefficients and membrane permeabilities, based on nuclear magnetic resonance (NMR), is described. The method is a novel implementation of localized NMR spectroscopy and enables the simultaneous analysis of the drug content in the octanol and water phases without separation. For validation of the method, the distribution coefficients at pH 7.4 of four active pharmaceutical ingredients (APIs), namely ibuprofen, ketoprofen, nadolol, and paracetamol (acetaminophen), were determined using a classical approach. These results were compared to the NMR experiments described in this work; for all substances, the distribution coefficients found with the two techniques coincided very well. Furthermore, the NMR experiments make it possible to follow the distribution of the drug between the phases as a function of position and time. Our results show that the technique, which is available on any modern NMR spectrometer, is well suited to the measurement of distribution coefficients. The experiments also offer new insight into the dynamics of the water-octanol interface itself and permit measurement of the interface permeability.
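
    The underlying arithmetic is straightforward; a hypothetical worked example (invented integrals and volumes, with calibration factors assumed to cancel):

    ```python
    import numpy as np

    def log_d(i_oct, i_water, v_oct, v_water):
        # Integrals of the same resonance in each phase are proportional to the
        # amount of drug there; dividing by phase volume gives concentrations,
        # and the identical proportionality constants cancel in the ratio.
        return np.log10((i_oct / v_oct) / (i_water / v_water))

    print(log_d(i_oct=8.2e4, i_water=1.3e4, v_oct=0.5, v_water=0.5))  # ~0.80
    ```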

  5. Imaging of conductivity distributions using audio-frequency electromagnetic data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ki Ha; Morrison, H.F.

    1990-10-01

    The objective of this study has been to develop mathematical methods for mapping conductivity distributions between boreholes using low-frequency electromagnetic (em) data. In relation to this objective, this paper presents two recent developments in high-resolution crosshole em imaging techniques: (1) audio-frequency diffusion tomography, and (2) a transform method in which low-frequency data are first transformed into a wave-like field. The idea in the second approach is that the transformed field can then be treated using conventional techniques designed for wave-field analysis.

  6. Application of fluorescence resonance energy transfer techniques to the study of lectin-binding site distribution on Paramecium primaurelia (Protista, Ciliophora) cell surface.

    PubMed

    Locatelli, D; Delmonte Corrado, M U; Politi, H; Bottiroli, G

    1998-01-01

    Fluorescence resonance energy transfer (FRET) is a photophysical phenomenon occurring between the molecules of two fluorochromes with suitable spectral characteristics (a donor-acceptor dye pair) and consisting of excitation energy migration through a non-radiative process. Since the efficiency of the process is strictly dependent on the distance and reciprocal orientation of the donor and acceptor molecules, FRET-based techniques can be successfully applied to the study of the organisation and distribution of biomolecules and cell components. These techniques have been employed in studying the Paramecium primaurelia surface membrane for the reciprocal distribution of N-acetylneuraminic acid (NeuAc) and N-acetylglucosamine (GlcNAc) glycosidic residues, which were found to be involved in mating cell pairing. NeuAc and GlcNAc were detected by their specific binding lectins, Limulus polyphemus agglutinin (LPA) and wheat germ agglutinin (WGA), respectively. Microspectrofluorometric analysis supported the choice of fluorescein isothiocyanate and Texas red, conjugated with LPA and WGA respectively, as a suitable donor-acceptor couple efficiently activating FRET processes. Studies performed both in solution and in cells made it possible to define experimental conditions favourable for FRET analysis. The comparative study carried out on both the conjugating and the non-conjugating region of the surface membrane indicates that the FRET distribution appears quite homogeneous in mating-competent mating type (mt) I cells, whereas in mating-competent mt II cells it seems to be preferentially localised on the conjugating region functionally involved in mating cell pairing. This difference in the distribution of lectin-binding sites is suggested to be related to the acquisition of mating competence.
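
    The distance and orientation dependence mentioned above is, in standard Förster theory (quoted here for context, not taken from the paper), given by

    ```latex
    E = \frac{R_0^{6}}{R_0^{6} + r^{6}}, \qquad R_0^{6} \propto \kappa^{2}\, Q_D\, J(\lambda)\, n^{-4}
    ```

    where E is the transfer efficiency, r the donor-acceptor distance, R0 the Förster radius, κ² the orientation factor, Q_D the donor quantum yield, J(λ) the spectral overlap integral, and n the refractive index; the steep r⁻⁶ falloff is what makes FRET sensitive to nanometre-scale proximity.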

  7. Break Point Distribution on Chromosome 3 of Human Epithelial Cells exposed to Gamma Rays, Neutrons and Fe Ions

    NASA Technical Reports Server (NTRS)

    Hada, M.; Saganti, P. B.; Gersey, B.; Wilkins, R.; Cucinotta, F. A.; Wu, H.

    2007-01-01

    Most of the reported studies of break point distribution on damaged chromosomes from radiation exposure were carried out with the G-banding technique or determined based on the relative length of the broken chromosomal fragments. However, these techniques lack accuracy in comparison with the later-developed multicolor banding in situ hybridization (mBAND) technique that is generally used for analysis of intrachromosomal aberrations such as inversions. Using mBAND, we studied chromosome aberrations in human epithelial cells exposed in vitro to low or high dose rate gamma rays in Houston, low dose rate secondary neutrons at Los Alamos National Laboratory, and high dose rate 600 MeV/u Fe ions at the NASA Space Radiation Laboratory. Detailed analysis of the inversion type revealed that all three radiation types induced a low incidence of simple inversions. Half of the inversions observed after neutron or Fe ion exposure, and the majority of inversions in gamma-irradiated samples, were accompanied by other types of intrachromosomal aberrations. In addition, neutrons and Fe ions induced a significant fraction of inversions that involved complex rearrangements of both inter- and intrachromosome exchanges. We further compared the distribution of break points on chromosome 3 for the three radiation types. The break points were found to be randomly distributed on chromosome 3 after neutron or Fe ion exposure, whereas a non-random distribution with clustered break points was observed for gamma rays. The break point distribution may serve as a potential fingerprint of high-LET radiation exposure.

  8. Evaluation of Uranium-235 Measurement Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaspar, Tiffany C.; Lavender, Curt A.; Dibert, Mark W.

    2017-05-23

    Monolithic U-Mo fuel plates are rolled to final fuel element form from the original cast ingot, and thus any inhomogeneities in 235U distribution present in the cast ingot are maintained, and potentially exaggerated, in the final fuel foil. The tolerance for inhomogeneities in the 235U concentration in the final fuel element foil is very low. A near-real-time, nondestructive technique to evaluate the 235U distribution in the cast ingot is required in order to provide feedback to the casting process. Based on the technical analysis herein, gamma spectroscopy has been recommended to provide a near-real-time measure of the 235U distribution in U-Mo cast plates.

  9. Coupling Analysis of Heat Island Effects, Vegetation Coverage and Urban Flood in Wuhan

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Liu, Q.; Fan, W.; Wang, G.

    2018-04-01

    In this paper, satellite imagery, remote sensing, and geographic information system (GIS) techniques form the technical basis, with comprehensive analysis of spectral and other factors plus visual interpretation as the main methods. We use GF-1 and Landsat 8 remote sensing satellite images of Wuhan as the data source, from which we extract the vegetation distribution, the urban heat island relative intensity distribution map, and the urban flood submergence range. Based on the extracted information, spatial analysis and regression analysis reveal correlations among the heat island effect, vegetation coverage and urban flood. The results show a high degree of overlap between urban heat islands and urban flood areas. Urban heat island areas contain buildings with little vegetation cover, which may be one of the reasons for local heavy rainstorms. Furthermore, urban heat island intensity is negatively correlated with vegetation coverage, so the heat island effect can be alleviated by vegetation to a certain extent. It is therefore easy to understand that the new industrial zones and commercial areas under construction throughout the city, whose land surfaces are bare or have low vegetation coverage, can easily form new heat islands.

  10. DATMAN: A reliability data analysis program using Bayesian updating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M.; Feltus, M.A.

    1996-12-31

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
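
    A minimal sketch of the kind of conjugate Bayesian update that DATMAN automates (hypothetical numbers; DATMAN itself offers several prior and posterior families):

    ```python
    from scipy import stats

    # Beta prior on a component's per-demand failure probability, Binomial likelihood.
    a0, b0 = 2.0, 50.0           # prior: roughly 2 failures in ~52 demands of prior experience
    failures, demands = 3, 120   # newly acquired plant data

    # Conjugate update: the posterior is again a Beta distribution.
    a1, b1 = a0 + failures, b0 + (demands - failures)
    post = stats.beta(a1, b1)
    print(f"posterior mean failure probability = {post.mean():.4f}")
    print(f"95% credible interval = {post.interval(0.95)}")
    ```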

  11. Direct mass spectrometry approaches to characterize polyphenol composition of complex samples.

    PubMed

    Fulcrand, Hélène; Mané, Carine; Preys, Sébastien; Mazerolles, Gérard; Bouchut, Claire; Mazauric, Jean-Paul; Souquet, Jean-Marc; Meudec, Emmanuelle; Li, Yan; Cole, Richard B; Cheynier, Véronique

    2008-12-01

    Lower molecular weight polyphenols including proanthocyanidin oligomers can be analyzed after HPLC separation on either reversed-phase or normal phase columns. However, these techniques are time consuming and can have poor resolution as polymer chain length and structural diversity increase. The detection of higher molecular weight compounds, as well as the determination of molecular weight distributions, remain major challenges in polyphenol analysis. Approaches based on direct mass spectrometry (MS) analysis that are proposed to help overcome these problems are reviewed. Thus, direct flow injection electrospray ionization mass spectrometry analysis can be used to establish polyphenol fingerprints of complex extracts such as in wine. This technique enabled discrimination of samples on the basis of their phenolic (i.e. anthocyanin, phenolic acid and flavan-3-ol) compositions, but larger oligomers and polymers were poorly detectable. Detection of higher molecular weight proanthocyanidins was also restricted with matrix-assisted laser desorption ionization (MALDI) MS, suggesting that they are difficult to desorb as gas-phase ions. The mass distribution of polymeric fractions could, however, be determined by analyzing the mass distributions of bovine serum albumin/proanthocyanidin complexes using MALDI-TOF-MS.

  12. Linear prediction and single-channel recording.

    PubMed

    Carter, A A; Oswald, R E

    1995-08-01

    The measurement of individual single-channel events arising from the gating of ion channels provides a detailed data set from which the kinetic mechanism of a channel can be deduced. In many cases, the pattern of dwells in the open and closed states is very complex, and the kinetic mechanism and parameters are not easily determined. Assuming a Markov model for channel kinetics, the probability density function for open and closed time dwells should consist of a sum of decaying exponentials. One method of approaching the kinetic analysis of such a system is to determine the number of exponentials and the corresponding parameters which comprise the open and closed dwell time distributions. These can then be compared to the relaxations predicted from the kinetic model to determine, where possible, the kinetic constants. We report here the use of a linear technique, linear prediction/singular value decomposition, to determine the number of exponentials and the exponential parameters. Using simulated distributions and comparing with standard maximum-likelihood analysis, the singular value decomposition techniques provide advantages in some situations and are a useful adjunct to other single-channel analysis techniques.
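
    The rank-revealing idea behind the linear prediction/SVD approach can be sketched as follows (synthetic data, not the authors' implementation): the Hankel matrix of a uniformly sampled sum of K decaying exponentials has rank K, so the number of exponential components appears as the number of dominant singular values.

    ```python
    import numpy as np
    from scipy.linalg import hankel, svd

    t = np.arange(200) * 0.1
    y = 2.0 * np.exp(-t / 1.5) + 0.7 * np.exp(-t / 8.0)    # two exponential components
    y += np.random.default_rng(0).normal(0, 1e-3, t.size)  # small measurement noise

    H = hankel(y[:100], y[99:])            # Hankel matrix built from the sampled signal
    s = svd(H, compute_uv=False)
    print(s[:5] / s[0])                    # two singular values stand above the noise floor
    ```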

  13. Two dimensional distribution measurement of electric current generated in a polymer electrolyte fuel cell using 49 NMR surface coils.

    PubMed

    Ogawa, Kuniyasu; Sasaki, Tatsuyoshi; Yoneda, Shigeki; Tsujinaka, Kumiko; Asai, Ritsuko

    2018-05-17

    In order to increase the current density generated in a PEFC (polymer electrolyte fuel cell), a method for measuring the spatial distribution of both the current and the water content of the MEA (membrane electrode assembly) is necessary. Based on the frequency shifts of NMR (nuclear magnetic resonance) signals acquired from the water contained in the MEA using 49 NMR coils in a 7 × 7 arrangement inserted in the PEFC, a method was devised for measuring the two-dimensional spatial distribution of electric current generated in a unit cell with a power generation area of 140 mm × 160 mm. We also developed an inverse analysis method to determine the two-dimensional electric current distribution that can be applied to actual PEFC connections. Two analytical techniques, namely coarse graining of segments and stepwise search, were used to shorten the calculation time required for inverse analysis of the electric current map. Using this method and these techniques, spatial distributions of electric current and water content in the MEA were obtained while the PEFC generated electric power at 100 A. Copyright © 2018 Elsevier Inc. All rights reserved.
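
    The inverse step can be illustrated schematically as a regularized linear least-squares problem (the sensitivity matrix below is a random placeholder, not a physical model of the cell, and this is not the paper's coarse-graining/stepwise method):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_coils, n_segments = 49, 49
    A = rng.normal(size=(n_coils, n_segments))     # placeholder sensitivity matrix
    i_true = rng.uniform(1.0, 3.0, n_segments)     # "true" segment currents
    b = A @ i_true + rng.normal(0, 0.01, n_coils)  # simulated coil frequency-shift data

    lam = 1e-2                                     # Tikhonov regularization weight
    A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n_segments)])
    b_aug = np.concatenate([b, np.zeros(n_segments)])
    i_est, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    print(np.abs(i_est - i_true).max())            # recovery error on the mock problem
    ```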

  14. Analysis of security of optical encryption with spatially incoherent illumination technique

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Shifrina, Anna V.

    2017-03-01

    Applications of optical methods for encryption purposes have been attracting the interest of researchers for decades. The first and most popular is the double random phase encoding (DRPE) technique, and many optical encryption techniques are based on it. The main advantage of DRPE-based techniques is high security, owing to the transformation of the spectrum of the image to be encrypted into a white spectrum by the first random phase mask, which yields encrypted images with white spectra. Downsides are the necessity of a holographic registration scheme, needed to register not only the light intensity distribution but also its phase distribution, and the speckle noise that occurs under coherent illumination. These disadvantages can be eliminated by using incoherent instead of coherent illumination. In this case phase registration no longer matters, so there is no need for a holographic setup, and the speckle noise is gone. However, because only the light intensity distribution is registered, the mean value of the image to be encrypted is always above zero, which leads to an intense zero-spatial-frequency peak in the image spectrum. Consequently, in the case of spatially incoherent illumination, neither the image spectrum nor the encryption key spectrum can be white. This might be used to crack the encryption system: if the encryption key is very sparse, the encrypted image might contain parts of, or even the whole, unhidden original image. Therefore, in this paper an analysis of the security of optical encryption with spatially incoherent illumination is conducted as a function of encryption key size and density.

  15. General methodology: Costing, budgeting, and techniques for benefit-cost and cost-effectiveness analysis

    NASA Technical Reports Server (NTRS)

    Stretchberry, D. M.; Hein, G. F.

    1972-01-01

    The general concepts of costing, budgeting, benefit-cost ratio analysis and cost-effectiveness analysis are discussed. The three common methods of costing are presented. Budgeting distributions are discussed. The use of discounting procedures is outlined. Benefit-cost ratio and cost-effectiveness analysis are defined, and their current application to NASA planning is pointed out. Specific practices and techniques are discussed, and actual costing and budgeting procedures are outlined. The recommended method of calculating benefit-cost ratios is described. A standardized method of cost-effectiveness analysis and long-range planning is also discussed.

  16. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  17. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu eq). In waste assay, calibrations are made for representative waste matrices and source distributions; the actual waste, however, may have quite different matrices and source distributions compared to the calibration samples, which often results in a bias of the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique that includes an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are intended to be measured by a PC-based fast time-interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu eq mass. The presented theory, which will be referred to as Time Interval Analysis (TIA), is complementary to Time Correlation Analysis (TCA) theories developed in the past, but is much simpler from the theoretical point of view and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
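
    A sketch of how a one-dimensional Rossi-alpha distribution is accumulated from a time-stamped pulse train: for every trigger pulse, histogram the arrival times of subsequent pulses inside a fixed window. The timestamps here are synthetic and purely random, so only the flat accidental baseline appears; correlated (real) coincidences would add a decaying component at short intervals.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    ts = np.sort(rng.uniform(0, 1.0, 20_000))   # pulse timestamps, seconds (mock data)

    window, nbins = 512e-6, 64
    hist = np.zeros(nbins)
    for i, t0 in enumerate(ts):                 # each pulse acts as a trigger
        j = np.searchsorted(ts, t0 + window)
        dt = ts[i + 1:j] - t0                   # intervals to all pulses in the window
        hist += np.histogram(dt, bins=nbins, range=(0, window))[0]
    print(hist[:8])                             # flat here; reals would decay with dt
    ```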

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugmire, David; Kress, James; Choi, Jong

    Data driven science is becoming increasingly more common, complex, and is placing tremendous stresses on visualization and analysis frameworks. Data sources producing 10GB per second (and more) are becoming increasingly commonplace in both simulation, sensor and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware and workflow systems. This paper discusses initial research into visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions over large volumes of time-varying data.

  19. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    NASA Technical Reports Server (NTRS)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
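
    A sketch of the coefficient-extraction step (grid quadrature of a mock angular distribution; not the PEACE/Cluster pipeline): expand f(theta, phi) in spherical harmonics and read off the low-order spectral coefficients.

    ```python
    import numpy as np
    from scipy.special import sph_harm

    nth, nph = 64, 128
    theta = (np.arange(nth) + 0.5) * np.pi / nth          # polar grid
    phi = (np.arange(nph) + 0.5) * 2 * np.pi / nph        # azimuthal grid
    TH, PH = np.meshgrid(theta, phi, indexing="ij")

    f = 1.0 + 0.5 * np.cos(TH)**2                         # mock velocity-space anisotropy

    dOmega = np.sin(TH) * (np.pi / nth) * (2 * np.pi / nph)
    for l in range(3):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, PH, TH)   # scipy convention: sph_harm(m, n, azimuth, polar)
            c = np.sum(f * np.conj(Y) * dOmega)           # c_lm by quadrature
            print(l, m, np.round(c, 3))
    ```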

  20. Three-dimensional autoradiographic localization of quench-corrected glycine receptor specific activity in the mouse brain using 3H-strychnine as the ligand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, W.F.; O'Gorman, S.; Roe, A.W.

    1990-03-01

    The autoradiographic analysis of neurotransmitter receptor distribution is a powerful technique that provides extensive information on the localization of neurotransmitter systems. Computer methodologies are described for the analysis of autoradiographic material which include quench correction, 3-dimensional display, and quantification based on anatomical boundaries determined from the tissue sections. These methodologies are applied to the problem of the distribution of glycine receptors measured by 3H-strychnine binding in the mouse CNS. The most distinctive feature of this distribution is its marked caudorostral gradient. The highest densities of binding sites within this gradient were seen in somatic motor and sensory areas; high densities of binding were seen in branchial efferent and special sensory areas. Moderate levels were seen in nuclei related to visceral function. Densities within the reticular formation paralleled the overall gradient with high to moderate levels of binding. The colliculi had low and the diencephalon had very low levels of binding. No binding was seen in the cerebellum or the telencephalon with the exception of the amygdala, which had very low levels of specific binding. This distribution of glycine receptors correlates well with the known functional distribution of glycine synaptic function. These data are illustrated in 3 dimensions and discussed in terms of the significance of the analysis techniques on this type of data as well as the functional significance of the distribution of glycine receptors.

  1. Characteristic vector analysis of inflection ratio spectra: New technique for analysis of ocean color data

    NASA Technical Reports Server (NTRS)

    Grew, G. W.

    1985-01-01

    Characteristic vector analysis applied to inflection ratio spectra is a new approach to analyzing spectral data. Applied to remote data collected with the multichannel ocean color sensor (MOCS), a passive sensor, the technique simultaneously maps the distribution of two different phytopigments, chlorophyll a and phycoerythrin, in the ocean. The data set presented is from a series of warm-core-ring missions conducted during 1982. The data compare favorably with a theoretical model and with data collected on the same missions by an active sensor, the airborne oceanographic lidar (AOL).

  2. Discrete Deterministic and Stochastic Petri Nets

    NASA Technical Reports Server (NTRS)

    Zijal, Robert; Ciardo, Gianfranco

    1996-01-01

    Petri nets augmented with timing specifications gained a wide acceptance in the area of performance and reliability evaluation of complex systems exhibiting concurrency, synchronization, and conflicts. The state space of time-extended Petri nets is mapped onto its basic underlying stochastic process, which can be shown to be Markovian under the assumption of exponentially distributed firing times. The integration of exponentially and non-exponentially distributed timing is still one of the major problems for the analysis and was first attacked for continuous time Petri nets at the cost of structural or analytical restrictions. We propose a discrete deterministic and stochastic Petri net (DDSPN) formalism with no imposed structural or analytical restrictions where transitions can fire either in zero time or according to arbitrary firing times that can be represented as the time to absorption in a finite absorbing discrete time Markov chain (DTMC). Exponentially distributed firing times are then approximated arbitrarily well by geometric distributions. Deterministic firing times are a special case of the geometric distribution. The underlying stochastic process of a DDSPN is then also a DTMC, from which the transient and stationary solution can be obtained by standard techniques. A comprehensive algorithm and some state space reduction techniques for the analysis of DDSPNs are presented comprising the automatic detection of conflicts and confusions, which removes a major obstacle for the analysis of discrete time models.
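
    The "standard techniques" for the resulting DTMC can be illustrated in a few lines (hypothetical one-step transition matrix): transient distributions by repeated vector-matrix multiplication, and the stationary distribution as the left eigenvector for eigenvalue 1.

    ```python
    import numpy as np

    P = np.array([[0.90, 0.10, 0.00],
                  [0.00, 0.80, 0.20],
                  [0.30, 0.00, 0.70]])   # hypothetical DTMC transition matrix (rows sum to 1)

    pi = np.array([1.0, 0.0, 0.0])       # initial marking distribution
    for _ in range(50):                  # transient solution after 50 steps
        pi = pi @ P
    print(pi)

    w, v = np.linalg.eig(P.T)            # stationary: left eigenvector for eigenvalue 1
    s = np.real(v[:, np.argmin(np.abs(w - 1))])
    print(s / s.sum())
    ```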

  3. The use of interpractive graphic displays for interpretation of surface design parameters

    NASA Technical Reports Server (NTRS)

    Talcott, N. A., Jr.

    1981-01-01

    An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Data Display method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.

  4. Understanding a Normal Distribution of Data (Part 2).

    PubMed

    Maltenfort, Mitchell

    2016-02-01

    Completing the discussion of data normality, advanced techniques for analysis of non-normal data are discussed including data transformation, Generalized Linear Modeling, and bootstrapping. Relative strengths and weaknesses of each technique are helpful in choosing a strategy, but help from a statistician is usually necessary to analyze non-normal data using these methods.
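
    A minimal sketch of one of these techniques, a bootstrap percentile confidence interval for the mean of skewed (non-normal) data:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    x = rng.lognormal(0.0, 1.0, 80)                # skewed sample

    boot = np.array([rng.choice(x, x.size, replace=True).mean()
                     for _ in range(10_000)])      # resampled means
    lo, hi = np.percentile(boot, [2.5, 97.5])      # 95% percentile interval
    print(f"mean={x.mean():.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
    ```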

  5. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty first century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of individual and combinational effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help in addressing JBI analysis challenges. The DIEMS team has been tasked utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  6. Analysis of Learning Curve Fitting Techniques.

    DTIC Science & Technology

    1987-09-01

    1986. 15. Neter, John, and others. Applied Linear Regression Models. Homewood, IL: Irwin. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston... lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter, et al., Applied
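
    A sketch of the ordinary least-squares fit discussed in the fragment above: a learning curve y = a*x^b becomes linear in log-log space (the unit-hour data here are hypothetical).

    ```python
    import numpy as np

    units = np.arange(1, 21)                        # cumulative unit number
    hours = 100 * units**-0.152                     # hypothetical ~90% learning curve
    hours *= np.exp(np.random.default_rng(4).normal(0, 0.02, units.size))  # noise

    b, log_a = np.polyfit(np.log(units), np.log(hours), 1)  # OLS in log-log space
    a = np.exp(log_a)
    slope_pct = 2.0**b * 100                        # conventional learning-curve "slope"
    print(f"a={a:.1f} h, b={b:.3f}, slope={slope_pct:.1f}%")
    ```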

  7. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; it does not require manual adjustment of the algorithm parameters; it can quantitatively capture differences between surfaces; and it yields more realistic structure boundaries compared to other methods.

  8. Automated analysis and classification of melanocytic tumor on skin whole slide images.

    PubMed

    Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal

    2018-06-01

    This paper presents a computer-aided technique for automated analysis and classification of melanocytic tumor on skin whole slide biopsy images. The proposed technique consists of four main modules. First, skin epidermis and dermis regions are segmented by a multi-resolution framework. Next, epidermis analysis is performed, where a set of epidermis features reflecting nuclear morphologies and spatial distributions is computed. In parallel with epidermis analysis, dermis analysis is also performed, where dermal cell nuclei are segmented and a set of textural and cytological features are computed. Finally, the skin melanocytic image is classified into different categories such as melanoma, nevus or normal tissue by using a multi-class support vector machine (mSVM) with extracted epidermis and dermis features. Experimental results on 66 skin whole slide images indicate that the proposed technique achieves more than 95% classification accuracy, which suggests that the technique has the potential to be used for assisting pathologists on skin biopsy image analysis and classification. Copyright © 2018 Elsevier Ltd. All rights reserved.
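
    A schematic of the final classification stage only (synthetic features and labels, not the paper's data or its tuned model): a multi-class SVM over per-image feature vectors.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(12)
    X = rng.normal(size=(66, 24))      # 66 slides, 24 extracted features (placeholders)
    y = rng.integers(0, 3, 66)         # 0=normal, 1=nevus, 2=melanoma (random labels)

    clf = SVC(kernel="rbf", C=10.0, gamma="scale")  # multi-class handled internally
    # Chance-level accuracy here, since the labels are random; real features would not be.
    print(cross_val_score(clf, X, y, cv=5).mean())
    ```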

  9. Comparative forensic soil analysis of New Jersey state parks using a combination of simple techniques with multivariate statistics.

    PubMed

    Bonetti, Jennifer; Quarino, Lawrence

    2014-05-01

    This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
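
    A sketch of the multivariate step with scikit-learn (synthetic measurements standing in for the particle-size, pH and loss-on-ignition data; CDA is not shown):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(6)
    X = np.column_stack([rng.normal(40, 5, 60),     # % sand
                         rng.normal(35, 4, 60),     # % silt
                         rng.normal(6.1, 0.4, 60),  # pH (water)
                         rng.normal(5.5, 0.4, 60),  # pH (1 M CaCl2)
                         rng.normal(8.0, 1.5, 60)]) # % loss on ignition

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    print(scores[:5])   # plot these by sampling location to compare samples visually
    ```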

  10. Interpretation of aeromagnetic data over Abeokuta and its environs, Southwest Nigeria, using spectral analysis (Fourier transform technique)

    NASA Astrophysics Data System (ADS)

    Olurin, Oluwaseun T.; Ganiyu, Saheed A.; Hammed, Olaide S.; Aluko, Taiwo J.

    2016-10-01

    This study presents the results of spectral analysis of magnetic data over the Abeokuta area, Southwestern Nigeria, using the fast Fourier transform (FFT) in Microsoft Excel. The study deals with the quantitative interpretation of airborne magnetic data (Sheet No. 260) acquired by the Nigerian Geological Survey Agency in 2009. In order to minimise aliasing error, the aeromagnetic data were gridded at a spacing of 1 km. Spectral analysis was used to estimate magnetic basement depths, and the interpretation shows that the magnetic sources are mainly distributed at two levels. The shallow sources (minimum depth) range from 0.103 to 0.278 km below ground level and are inferred to be due to intrusions within the region. The deeper sources (maximum depth) range from 2.739 to 3.325 km below ground and are attributed to the underlying basement.
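
    The depth-estimation principle (a Spector-Grant-type analysis, sketched here on a self-generated synthetic profile rather than the survey data): the log power spectrum of magnetic data decays roughly as exp(-2*h*k), so its slope against angular wavenumber k gives the source depth h.

    ```python
    import numpy as np

    dx = 1.0                        # 1 km grid spacing, as in the survey
    n, h_true = 512, 3.0            # synthetic profile with sources at 3 km depth
    k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)       # angular wavenumber, rad/km
    rng = np.random.default_rng(10)
    spec_amp = np.exp(-h_true * k) * rng.uniform(0.5, 1.5, k.size)   # |F| ~ e^{-hk}
    profile = np.fft.irfft(spec_amp * np.exp(1j * rng.uniform(0, 2*np.pi, k.size)))

    power = np.abs(np.fft.rfft(profile))**2
    sel = (k > 0.1) & (k < 1.0)     # fit a linear segment of ln(power) vs k
    slope, _ = np.polyfit(k[sel], np.log(power[sel]), 1)
    print(f"estimated depth: {-slope / 2:.2f} km (true {h_true} km)")
    ```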

  11. Simulation of mixture microstructures via particle packing models and their direct comparison with real mixtures

    NASA Astrophysics Data System (ADS)

    Gulliver, Eric A.

    The objective of this thesis is to identify and develop techniques providing direct comparison between simulated and real packed-particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop-and-roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation; it was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine-scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent-shaped pores below large particles but otherwise realistic-looking mixture microstructures. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations. Control chart analysis showed that Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from real mixtures matched the simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality, high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
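
    A discrete (pixel-based) Voronoi tessellation of a cross-section can be sketched as follows (synthetic particle centres; the Johnson-Mehl tessellation, which the thesis found more sensitive, is not shown):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(8)
    centres = rng.uniform(0, 1, (150, 2))   # particle centres in a unit cross-section

    nx = 512
    xs = (np.arange(nx) + 0.5) / nx
    X, Y = np.meshgrid(xs, xs)
    pix = np.column_stack([X.ravel(), Y.ravel()])

    _, owner = cKDTree(centres).query(pix)  # assign each pixel to its nearest centre
    areas = np.bincount(owner, minlength=len(centres)) / nx**2
    print(areas.mean(), areas.std())        # tile-area population statistics for control charts
    ```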

  12. The analysis of a rocket tomography measurement of the N2+3914A emission and N2 ionization rates in an auroral arc

    NASA Technical Reports Server (NTRS)

    Mcdade, Ian C.

    1991-01-01

    Techniques were developed for recovering two-dimensional distributions of auroral volume emission rates from rocket photometer measurements made in a tomographic spin scan mode. These tomographic inversion procedures are based upon an algebraic reconstruction technique (ART) and utilize two different iterative relaxation techniques for solving the problems associated with noise in the observational data. One of the inversion algorithms is based upon a least squares method and the other on a maximum probability approach. The performance of the inversion algorithms, and the limitations of the rocket tomography technique, were critically assessed using various factors such as (1) statistical and non-statistical noise in the observational data, (2) rocket penetration of the auroral form, (3) background sources of emission, (4) smearing due to the photometer field of view, and (5) temporal variations in the auroral form. These tests show that the inversion procedures may be successfully applied to rocket observations made in medium intensity aurora with standard rocket photometer instruments. The inversion procedures have been used to recover two-dimensional distributions of auroral emission rates and ionization rates from an existing set of N2+3914A rocket photometer measurements which were made in a tomographic spin scan mode during the ARIES auroral campaign. The two-dimensional distributions of the 3914A volume emission rates recovered from the inversion of the rocket data compare very well with the distributions that were inferred from ground-based measurements using triangulation-tomography techniques, and the N2 ionization rates derived from the rocket tomography results are in very good agreement with the in situ particle measurements that were made during the flight. Three preprints describing the tomographic inversion techniques and the tomographic analysis of the ARIES rocket data are included as appendices.
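
    A minimal sketch of the ART idea the inversions are built on: sweep the ray equations and project the current image toward each one, with a relaxation factor damping the corrections so observational noise is not amplified. The geometry matrix and data here are synthetic placeholders, not the flight data.

      import numpy as np

      def art(A, b, n_iter=50, relax=0.25):
          """Kaczmarz-style ART: sweep rows, project x toward each ray equation.
          `relax` damps each correction, which tempers noise amplification."""
          x = np.zeros(A.shape[1])
          row_norm2 = (A * A).sum(axis=1)
          for _ in range(n_iter):
              for i in range(A.shape[0]):
                  if row_norm2[i] == 0:
                      continue
                  residual = b[i] - A[i] @ x
                  x += relax * residual / row_norm2[i] * A[i]
              x = np.clip(x, 0, None)  # emission rates cannot be negative
          return x

      # Tiny synthetic test: 60 "rays" through a 10x10 emission field.
      rng = np.random.default_rng(2)
      x_true = rng.uniform(0, 1, 100)
      A = (rng.uniform(size=(60, 100)) < 0.1).astype(float)  # ray/pixel overlaps
      b = A @ x_true + rng.normal(0, 0.05, 60)               # noisy line integrals
      x_rec = art(A, b)
      print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))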

  13. Selenium Metabolism in Cancer Cells: The Combined Application of XAS and XFM Techniques to the Problem of Selenium Speciation in Biological Systems

    PubMed Central

    Weekley, Claire M.; Aitken, Jade B.; Finney, Lydia; Vogt, Stefan; Witting, Paul K.; Harris, Hugh H.

    2013-01-01

    Determining the speciation of selenium in vivo is crucial to understanding the biological activity of this essential element, which is a popular dietary supplement due to its anti-cancer properties. Hyphenated techniques that combine separation and detection methods are traditionally and effectively used in selenium speciation analysis, but require extensive sample preparation that may affect speciation. Synchrotron-based X-ray absorption and fluorescence techniques offer an alternative approach to selenium speciation analysis that requires minimal sample preparation. We present a brief summary of some key HPLC-ICP-MS and ESI-MS/MS studies of the speciation of selenium in cells and rat tissues. We review the results of a top-down approach to selenium speciation in human lung cancer cells that aims to link the speciation and distribution of selenium to its biological activity using a combination of X-ray absorption spectroscopy (XAS) and X-ray fluorescence microscopy (XFM). The results of this approach highlight the distinct fates of selenomethionine, methylselenocysteine and selenite in terms of their speciation and distribution within cells: organic selenium metabolites were widely distributed throughout the cells, whereas inorganic selenium metabolites were compartmentalized and associated with copper. New data from the XFM mapping of electrophoretically-separated cell lysates show the distribution of selenium in the proteins of selenomethionine-treated cells. Future applications of this top-down approach are discussed. PMID:23698165

  14. Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Churmakov, D. Y.; Meglinski, I. V.; Piletsky, S. A.; Greenhalgh, D. A.

    2003-07-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores that would arise from the structure of collagen fibres, in contrast to the epidermis and stratum corneum, where the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an `effective' depth.
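
    The flavour of such a simulation can be conveyed with a deliberately reduced 1-D sketch: excitation light is absorbed at exponentially distributed depths, only a fluorophore layer re-emits, and emitted photons must survive the return trip to the surface. All coefficients and the layer depth are illustrative assumptions, far simpler than the layered skin model of the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      mu_ex, mu_em = 2.0, 0.5   # absorption coefficients, 1/mm (hypothetical)
      layer = (0.1, 0.3)        # fluorophore layer depth range, mm (hypothetical)

      n = 200_000
      z_abs = rng.exponential(1.0 / mu_ex, n)             # where excitation is absorbed
      in_layer = (z_abs > layer[0]) & (z_abs < layer[1])  # only the layer fluoresces
      escape = rng.random(n) < np.exp(-mu_em * z_abs)     # survival back to surface

      detected = np.mean(in_layer & escape)
      print(f"fraction of launched photons detected as fluorescence: {detected:.4f}")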

  15. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  16. Provenance Establishment of Stingless Bee Honey Using Multi-element Analysis in Combination with Chemometrics Techniques.

    PubMed

    Shadan, Aidil Fahmi; Mahat, Naji A; Wan Ibrahim, Wan Aini; Ariffin, Zaiton; Ismail, Dzulkiflee

    2018-01-01

    As consumption of stingless bee honey has been gaining popularity in many countries, including Malaysia, the ability to accurately identify its geographical origin is pertinent to investigating fraud and protecting consumers. Because a chemical signature can be location-specific, multi-element distribution patterns may prove useful for provenancing such a product. Using inductively coupled plasma-optical emission spectrometry together with principal component analysis (PCA) and linear discriminant analysis (LDA), the distributions of multiple elements in stingless bee honey collected at four different geographical locations (North, West, East, and South) in Johor, Malaysia, were investigated. While cross-validation using PCA demonstrated an 87.0% correct classification rate, this was improved (96.2%) with the use of LDA, indicating that discrimination between the different geographical regions was possible. Therefore, the use of multi-element analysis coupled with chemometrics techniques for assigning the provenance of stingless bee honeys in forensic applications is supported. © 2017 American Academy of Forensic Sciences.
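
    A minimal sketch of the chemometrics step, assuming a samples-by-elements concentration table such as ICP-OES would provide; the data below are synthetic placeholders and the class separations are artificial.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in for multi-element concentrations:
      # rows = honey samples, columns = elements, labels = region.
      rng = np.random.default_rng(4)
      X = np.vstack([rng.normal(loc=m, scale=1.0, size=(20, 8))
                     for m in (0.0, 0.8, 1.6, 2.4)])
      y = np.repeat(["North", "West", "East", "South"], 20)

      # LDA on standardized elemental profiles, cross-validated.
      lda = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
      print("LDA CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())

      # PCA is unsupervised; its scores can be inspected for regional grouping.
      scores = make_pipeline(StandardScaler(), PCA(n_components=2)).fit_transform(X)
      print("first two PC scores:", scores[:2])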

  17. Smoothed Particle Inference Analysis of SNR RCW 103

    NASA Astrophysics Data System (ADS)

    Frank, Kari A.; Burrows, David N.; Dwarkadas, Vikram

    2016-04-01

    We present preliminary results of applying a novel analysis method, Smoothed Particle Inference (SPI), to an XMM-Newton observation of SNR RCW 103. SPI is a Bayesian modeling process that fits a population of gas blobs ("smoothed particles") such that their superposed emission reproduces the observed spatial and spectral distribution of photons. Emission-weighted distributions of plasma properties, such as abundances and temperatures, are then extracted from the properties of the individual blobs. This technique has important advantages over analysis techniques which implicitly assume that remnants are two-dimensional objects in which each line of sight encompasses a single plasma. By contrast, SPI allows superposition of as many blobs of plasma as are needed to match the spectrum observed in each direction, without the need to bin the data spatially. This RCW 103 analysis is part of a pilot study for the larger SPIES (Smoothed Particle Inference Exploration of SNRs) project, in which SPI will be applied to a sample of 12 bright SNRs.

  18. A Technical Survey on Optimization of Processing Geo Distributed Data

    NASA Astrophysics Data System (ADS)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With growing cloud services and technology, geographically distributed data centers are increasingly used to store large amounts of data. Many services depend on analysis of these geo-distributed data for data processing, storage of essential information, and related tasks, yet processing geo-distributed data and performing analytics on them is challenging: distributed data processing raises issues in storage, computation, and communication, and the key concerns are time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, that use techniques such as MapReduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues, and analyzes the methods and techniques used. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving quality of service and reducing computation and communication costs; and SAGE achieves performance improvement in processing geo-distributed data sets.

  19. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in laser shock-loaded dynamic fragmentation. In the shock experiments, the ejecta from a tin sample with a V-shaped groove etched into the free surface were collected by a soft recovery technique. The produced fragments were then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin samples are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison shows that the proposed model provides a far more reasonable fit for laser shock-loaded tin.
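
    The "linear combination of exponentials" form can be fitted directly to an empirical fragment-size distribution. A minimal sketch with synthetic sizes, fitting the complementary CDF of a two-component exponential mixture (weight and two scales are the free parameters):

      import numpy as np
      from scipy.optimize import curve_fit

      def two_exp(s, w, l1, l2):
          """Complementary CDF of a two-component exponential mixture:
          fraction of fragments with size > s."""
          return w * np.exp(-s / l1) + (1 - w) * np.exp(-s / l2)

      # Synthetic fragment sizes drawn from a known two-scale mixture.
      rng = np.random.default_rng(5)
      sizes = np.concatenate([rng.exponential(5.0, 700),    # fine fragments, um
                              rng.exponential(40.0, 300)])  # coarse fragments, um

      s = np.sort(sizes)
      ccdf = 1.0 - np.arange(1, len(s) + 1) / len(s)  # empirical complementary CDF

      popt, _ = curve_fit(two_exp, s, ccdf, p0=(0.5, 2.0, 20.0),
                          bounds=([0, 0.1, 0.1], [1, 1e3, 1e3]))
      print("weight, scale1 (um), scale2 (um):", popt)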

  20. Distributed support vector machine in master-slave mode.

    PubMed

    Chen, Qingguo; Cao, Feilong

    2018-05-01

    It is well known that the support vector machine (SVM) is an effective learning algorithm, and the alternating direction method of multipliers (ADMM) has emerged as a powerful technique for solving distributed optimisation models. This paper proposes a distributed SVM algorithm in a master-slave mode (MS-DSVM), which integrates a distributed SVM and ADMM acting in a master-slave configuration where the master node and slave nodes are connected, meaning the results can be broadcast. The distributed SVM is regarded as a regularised optimisation problem and modelled as a series of convex optimisation sub-problems that are solved by ADMM. Additionally, an over-relaxation technique is utilised to accelerate the convergence rate of the proposed MS-DSVM. Our theoretical analysis demonstrates that the proposed MS-DSVM has linear convergence, the fastest convergence rate among existing standard distributed ADMM algorithms. Numerical examples demonstrate that the convergence and accuracy of the proposed MS-DSVM are superior to those of existing methods under the ADMM framework. Copyright © 2018 Elsevier Ltd. All rights reserved.
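
    To convey the consensus-ADMM structure in a master-slave layout, here is a minimal sketch with an over-relaxation factor alpha; for a closed-form local update it uses a least-squares (LS-SVM-style) loss rather than the paper's hinge loss, so it illustrates the iteration pattern, not the MS-DSVM algorithm itself.

      import numpy as np

      rng = np.random.default_rng(6)
      w_true = rng.normal(size=5)
      data = []                                   # one (A, y) block per slave node
      for _ in range(4):
          A = rng.normal(size=(50, 5))
          y = np.sign(A @ w_true + 0.1 * rng.normal(size=50))
          data.append((A, y))

      rho, lam, alpha = 1.0, 0.1, 1.6             # alpha in (1, 2): over-relaxation
      z = np.zeros(5)
      u = [np.zeros(5) for _ in data]
      for _ in range(100):
          # Slave nodes: local closed-form minimisation of the quadratic loss.
          w = [np.linalg.solve(2 * A.T @ A + rho * np.eye(5),
                               2 * A.T @ y + rho * (z - ui))
               for (A, y), ui in zip(data, u)]
          # Over-relaxed local iterates, combined at the master node.
          w_hat = [alpha * wi + (1 - alpha) * z for wi in w]
          z = rho * sum(wh + ui for wh, ui in zip(w_hat, u)) / (lam + len(data) * rho)
          # Dual updates, broadcast back to the slaves.
          u = [ui + wh - z for ui, wh in zip(u, w_hat)]

      acc = np.mean([np.mean(np.sign(A @ z) == y) for A, y in data])
      print("consensus training accuracy:", acc)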

  1. Design and characterization of planar capacitive imaging probe based on the measurement sensitivity distribution

    NASA Astrophysics Data System (ADS)

    Yin, X.; Chen, G.; Li, W.; Hutchins, D. A.

    2013-01-01

    Previous work indicated that the capacitive imaging (CI) technique is a useful NDE tool which can be used on a wide range of materials, including metals, glass/carbon fibre composite materials and concrete. The imaging performance of the CI technique for a given application is determined by design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole probe sensitivity distribution based on the finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained. Influencing factors on sensitivity distribution were studied. Comparisons between CI probes with point-to-point triangular electrode pair and back-to-back triangular electrode pair were made based on the analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.

  2. Computerized evaluation of holographic interferograms for fatigue crack detection in riveted lap joints

    NASA Astrophysics Data System (ADS)

    Zhou, Xiang

    Using an innovative portable holographic inspection and testing system (PHITS) developed at the Australian Defence Force Academy, fatigue cracks in riveted lap joints can be detected by visually inspecting the abnormal fringe changes recorded on holographic interferograms. In this thesis, for automatic crack detection, some modern digital image processing techniques are investigated and applied to holographic interferogram evaluation. Fringe analysis algorithms are developed for identification of the crack-induced fringe changes. Theoretical analysis of PHITS and riveted lap joints and two typical experiments demonstrate that fatigue cracks in lightly-clamped joints induce two characteristic fringe changes: local fringe discontinuities at the cracking sites, and a global crescent fringe distribution near the edge of the rivet hole. Both fringe features are used for crack detection in this thesis. As a basis of the fringe feature extraction, an algorithm for local fringe orientation calculation is proposed. For high orientation accuracy and computational efficiency, Gaussian gradient filtering and neighboring direction averaging are used to minimize the effects of image background variations and random noise. The neighboring direction averaging is also used to approximate the fringe directions in centerlines of bright and dark fringes. Experimental results indicate that for high orientation accuracy the scales of the Gaussian filter and neighboring direction averaging should be chosen according to the local fringe spacings. The orientation histogram technique is applied to detect the local fringe discontinuities due to fatigue cracks. The Fourier descriptor technique is used to characterize the global fringe distribution change from a circular to a crescent distribution with fatigue crack growth. Experiments and computer simulations are conducted to analyze the detectability and reliability of crack detection using the two techniques. Results demonstrate that the Fourier descriptor technique is more promising in the detection of short cracks near the edge of the rivet head. However, it is not as reliable as the fringe orientation technique for detection of long through cracks. For reliability, both techniques should be used in practical crack detection. Neither the Fourier descriptor technique nor the orientation histogram technique has been previously applied to holographic interferometry. While this work related primarily to interferograms of cracked rivets, the techniques could readily be applied to other areas of fringe pattern analysis.
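
    The orientation step can be sketched compactly: Gaussian-derivative gradients are combined in double-angle form so that opposite gradient directions reinforce, and neighborhood averaging of the tensor components smooths noise before the angle is read off. The fringe pattern below is synthetic.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def fringe_orientation(img, sigma_grad=2.0, sigma_avg=8.0):
          """Local orientation via Gaussian gradients plus neighborhood
          averaging of doubled angles (so opposite gradients reinforce)."""
          gx = gaussian_filter(img, sigma_grad, order=(0, 1))  # d/dx
          gy = gaussian_filter(img, sigma_grad, order=(1, 0))  # d/dy
          # Average the double-angle tensor components to suppress noise.
          j_xx = gaussian_filter(gx * gx - gy * gy, sigma_avg)
          j_xy = gaussian_filter(2.0 * gx * gy, sigma_avg)
          return 0.5 * np.arctan2(j_xy, j_xx)   # gradient angle in (-pi/2, pi/2]

      # Synthetic fringe pattern: straight fringes, gradient at 30 degrees.
      y, x = np.mgrid[0:128, 0:128]
      theta = np.deg2rad(30)
      img = np.cos(0.4 * (x * np.cos(theta) + y * np.sin(theta)))
      ori = fringe_orientation(img)
      print("median orientation (deg):", np.rad2deg(np.median(ori)))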

  3. Sizing aerosolized fractal nanoparticle aggregates through Bayesian analysis of wide-angle light scattering (WALS) data

    NASA Astrophysics Data System (ADS)

    Huber, Franz J. T.; Will, Stefan; Daun, Kyle J.

    2016-11-01

    Inferring the size distribution of aerosolized fractal aggregates from the angular distribution of elastically scattered light is a mathematically ill-posed problem. This paper presents a procedure for analyzing Wide-Angle Light Scattering (WALS) data using Bayesian inference. The outcome is probability densities for the recovered size distribution and aggregate morphology parameters. This technique is applied to both synthetic data and experimental data collected on soot-laden aerosols, using a measurement equation derived from Rayleigh-Debye-Gans fractal aggregate (RDG-FA) theory. In the case of experimental data, the recovered aggregate size distribution parameters are generally consistent with TEM-derived values, but the accuracy is impaired by the well-known limited accuracy of RDG-FA theory. Finally, we show how this bias could potentially be avoided using the approximation error technique.

  4. Compositional analysis and depth profiling of thin film CrO{sub 2} by heavy ion ERDA and standard RBS: a comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khamlich, S.

    2012-08-15

    Chromium dioxide (CrO{sub 2}) thin film has generated considerable interest in applied research due to the wide variety of its technological applications. It has been extensively investigated in recent years, attracting the attention of researchers working on spintronic heterostructures and in the magnetic recording industry. However, its synthesis is usually a difficult task due to its metastable nature, and various synthesis techniques are being investigated. In this work a polycrystalline thin film of CrO{sub 2} was prepared by electron beam vaporization of Cr{sub 2}O{sub 3} onto a Si substrate. The polycrystalline structure was confirmed through XRD analysis. The stoichiometry and elemental depth distribution of the deposited film were measured by the ion beam nuclear analytical techniques of heavy ion elastic recoil detection analysis (ERDA) and Rutherford backscattering spectrometry (RBS), both of which have an advantage over non-nuclear spectrometries in that they can readily provide quantitative information about the concentration and distribution of different atomic species in a layer. Moreover, the analysis highlights the importance of the complementary use of the two techniques to obtain a more complete description of elemental content and depth distribution in thin films. Highlights: Thin films of CrO{sub 2} were grown by e-beam evaporation of a Cr{sub 2}O{sub 3} target in vacuum. The composition was determined by heavy ion ERDA and RBS. HI-ERDA and RBS provided information on the light and heavy elements, respectively.

  5. A Study to Investigate the Sleeping Comfort of Mattress using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Kamijo, Masayoshi; Shimizu, Yoshio

    Sleep is an essential physiological activity for human beings, and many studies have investigated the sleeping comfort of mattresses. Appropriate measurement of the stress distribution within the human body would provide valuable information. Numerical analysis is considered one of the most suitable techniques for estimating internal stress distribution, and the Finite Element Method (FEM), widely accepted as a useful numerical technique, was utilized in this study. Since human body dimensions differ between individuals, however, the internal stress distribution is also presumed to vary with these differences, and mattress preference is presumed to vary among body forms. We therefore developed three human FEM models reproducing the body forms of three types of male subjects and investigated the sleeping comfort of a mattress based on the relationship between FEM analysis findings and sensory testing results. Comparing the results of FEM analysis and sensory testing in the neck region, we found that the sensory testing results corresponded to the FEM analysis findings and that subjects' preferences of mattress and comfort in the neck region could be estimated using the FEM analysis. We believe this study shows that FEM analysis can quantify subjects' mattress preferences and is a valuable tool for examining the sleeping comfort of mattresses.

  6. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.

  7. A comparison of dynamic and static economic models of uneven-aged stand management

    Treesearch

    Robert G. Haight

    1985-01-01

    Numerical techniques have been used to compute the discrete-time sequence of residual diameter distributions that maximize the present net worth (PNW) of harvestable volume from an uneven-aged stand. Results contradicted optimal steady-state diameter distributions determined with static analysis. In this paper, optimality conditions for solutions to dynamic and static...

  8. Funding Mechanisms, Cost Drivers, and the Distribution of Education Funds in Alberta: A Case Study.

    ERIC Educational Resources Information Center

    Neu, Dean; Taylor, Alison

    2000-01-01

    Critical analysis of historical financial data of the Calgary Board of Education (CBE) examined the impact of Alberta's 1994 funding changes on the CBE and the distribution of Alberta's education funding. Findings illustrate how funding mechanisms are used to govern from a distance and how seemingly neutral accounting/funding techniques function…

  9. Composition analysis of a polymer electrolyte membrane fuel cell microporous layer using scanning transmission X-ray microscopy and near edge X-ray absorption fine structure analysis

    NASA Astrophysics Data System (ADS)

    George, Michael G.; Wang, Jian; Banerjee, Rupak; Bazylak, Aimy

    2016-03-01

    The novel application of scanning transmission X-ray microscopy (STXM) to the microporous layer (MPL) of a polymer electrolyte membrane fuel cell is investigated. A spatially resolved chemical component distribution map is obtained for the MPL of a commercially available SGL 25 BC sample. This is achieved with near edge X-ray absorption fine structure spectroscopic analysis. Prior to analysis the sample is embedded in non-reactive epoxy and ultra-microtomed to a thickness of 100 nm. Polytetrafluoroethylene (PTFE), carbon particle agglomerates, and supporting epoxy resin distributions are identified and reconstructed for a scanning area of 6 μm × 6 μm. It is observed that the spatial distribution of PTFE is strongly correlated to the carbon particle agglomerations. Additionally, agglomerate structures of PTFE are identified, possibly indicating the presence of a unique mesostructure in the MPL. STXM analysis is presented as a useful technique for the investigation of chemical species distributions in the MPL.

  10. Mathematical Methods for Optical Physics and Engineering

    NASA Astrophysics Data System (ADS)

    Gbur, Gregory J.

    2011-01-01

    1. Vector algebra; 2. Vector calculus; 3. Vector calculus in curvilinear coordinate systems; 4. Matrices and linear algebra; 5. Advanced matrix techniques and tensors; 6. Distributions; 7. Infinite series; 8. Fourier series; 9. Complex analysis; 10. Advanced complex analysis; 11. Fourier transforms; 12. Other integral transforms; 13. Discrete transforms; 14. Ordinary differential equations; 15. Partial differential equations; 16. Bessel functions; 17. Legendre functions and spherical harmonics; 18. Orthogonal functions; 19. Green's functions; 20. The calculus of variations; 21. Asymptotic techniques; Appendices; References; Index.

  11. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used for drilling sheet metal in various applications. It involves rotating a conical tool at high speed to penetrate the sheet metal, forming a hole with a bushing below the surface of the sheet. This article investigates finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out with the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because of the contribution of high-temperature deformation in this technique, output performance measures that are difficult to obtain experimentally can be successfully predicted by the finite element method. Therefore, modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution, and temperature of the workpiece.

  12. Practical issues of hyperspectral imaging analysis of solid dosage forms.

    PubMed

    Amigo, José Manuel

    2010-09-01

    Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.

  13. An Overview Of Wideband Signal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Speiser, Jeffrey M.; Whitehouse, Harper J.

    1989-11-01

    This paper provides a unifying perspective for several narrowband and wideband signal processing techniques. It considers narrowband ambiguity functions and Wigner-Ville distributions, together with the wideband ambiguity function and several proposed approaches to a wideband version of the Wigner-Ville distribution (WVD). A unifying perspective is provided by the methodology of unitary representations and ray representations of transformation groups.

  14. Sequential updating of multimodal hydrogeologic parameter fields using localization and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta

    2009-07-01

    Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and a Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.
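
    A minimal sketch of one ingredient, a serial ensemble Kalman analysis step with distance-based covariance localization (a Gaussian taper standing in for the paper's grid-based localization); the ensemble, observations, and length scale are synthetic placeholders, and the GMM clustering layer is omitted.

      import numpy as np

      rng = np.random.default_rng(7)
      n_grid, n_ens, L = 50, 30, 5.0
      x = np.linspace(0, 49, n_grid)

      ens = rng.normal(size=(n_grid, n_ens))   # synthetic prior ensemble
      obs_idx = np.array([10, 25, 40])         # observation locations
      obs = rng.normal(size=obs_idx.size)      # observed values
      r = 0.1                                  # observation error variance

      for i_obs, d in zip(obs_idx, obs):                     # serial assimilation
          hx = ens[i_obs]                                    # forecast at obs point
          cov = (ens - ens.mean(1, keepdims=True)) @ (hx - hx.mean()) / (n_ens - 1)
          taper = np.exp(-((x - x[i_obs]) / L) ** 2)         # localization weights
          gain = taper * cov / (hx.var(ddof=1) + r)
          perturbed = d + rng.normal(0, np.sqrt(r), n_ens)   # perturbed observations
          ens += gain[:, None] * (perturbed - hx)[None, :]

      print("posterior mean at obs points:", ens[obs_idx].mean(axis=1))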

  15. Medium Caliber Lead-Free Electric Primer. Version 2

    DTIC Science & Technology

    2012-09-01

    The lead-free primer formulations were characterized using a variety of techniques, including Thermogravimetric Analysis (TGA), base hydrolysis, and surface area analysis by the Brunauer-Emmett-Teller (BET) method.

  16. Methods and apparatus for analysis of chromatographic migration patterns

    DOEpatents

    Stockham, Thomas G.; Ives, Jeffrey T.

    1993-01-01

    A method and apparatus for sharpening signal peaks in a signal representing the distribution of biological or chemical components of a mixture separated by a chromatographic technique such as, but not limited to, electrophoresis. A key step in the method is the use of a blind deconvolution technique, presently embodied as homomorphic filtering, to reduce the contribution of a blurring function to the signal encoding the peaks of the distribution. The invention further includes steps and apparatus directed to determination of a nucleotide sequence from a set of four such signals representing DNA sequence data derived by electrophoretic means.

  17. Thermokarst Characteristics and Distribution in a Transitional Arctic Biome: New Discoveries and Possible Monitoring Directions in a Climate Change Scenario

    NASA Astrophysics Data System (ADS)

    Balser, A. W.; Gooseff, M. N.; Jones, J. B.; Bowden, W. B.; Sanzone, D. M.; Allen, A.; Larouche, J. R.

    2006-12-01

    In arctic regions, climate warming is leading to permafrost melting and wide-scale ecosystem alteration. A prominent pathway of permafrost loss is through thermokarst, which includes the catastrophic loss of soil structure and rapid subsidence. The regional-scale distribution of thermokarst is poorly documented throughout arctic regions. Remote landscapes and a lack of reliable, regional-scale detection techniques severely hamper our understanding of past prevalence and present distribution patterns. Intensive field campaigns are providing key data to bolster our understanding of the distribution and the characteristics of thermokarst formation, and enabling comprehensive method studies to develop remotely-sensed detection techniques. The Noatak Valley in northwestern Alaska's Brooks Range mountains harbors a transitional landscape from arctic and alpine tundra to boreal forest, all contained in a single 7,000,000 acre watershed. Preliminary field investigations augmented by photogrammetric measurements in 2006 revealed consistent patterns in the distribution of classifiable thermokarst feature types in a 2300 square-mile study area in the middle Noatak basin. Four distinct classes of thermokarst show remarkably tight relationships with ambient slope and local landcover. These investigations tie to larger efforts to document past and present regional distribution, testing remotely sensed data analysis techniques for baseline metrics and a future monitoring scheme.

  18. The application of novel nano-thermal and imaging techniques for monitoring drug microstructure and distribution within PLGA microspheres.

    PubMed

    Yang, Fan; Chen, De; Guo, Zhe-Fei; Zhang, Yong-Ming; Liu, Yi; Askin, Sean; Craig, Duncan Q M; Zhao, Min

    2017-04-30

    Poly (d,l-lactic-co-glycolic) acid (PLGA) based microspheres have been extensively used as controlled drug release systems. However, the burst effect has been a persistent issue associated with such systems, especially those prepared by the double emulsion technique. An effective approach to preventing the burst effect and achieving a more ideal drug release profile is to improve the drug distribution within the polymeric matrix. It is therefore of great importance to establish a rapid and robust tool for screening and optimizing the drug distribution during pre-formulation. Transition Temperature Microscopy (TTM), a novel nano-thermal imaging technique, is an extension of nano-thermal analysis (nano-TA): a transition temperature is detected at a localized region of a sample and assigned a color from a temperature/color palette, and a series of nano-TA measurements across the surface of the sample yields a map coded by the detected transition temperatures. In this study, we investigate the feasibility of applying this technique, combined with other thermal, imaging, and structural techniques, for monitoring the drug microstructure and spatial distribution within bovine serum albumin (BSA) loaded and nimodipine loaded PLGA microspheres, with a view to better predicting the in vitro drug release performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. A controls engineering approach for analyzing airplane input-output characteristics

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. Douglas

    1991-01-01

    An engineering approach for analyzing airplane control and output characteristics is presented. State-space matrix equations describing the linear perturbation dynamics are transformed from physical coordinates into scaled coordinates. The scaling is accomplished by applying various transformations to the system to employ prior engineering knowledge of the airplane physics. Two different analysis techniques are then explained. Modal analysis techniques calculate the influence of each system input on each fundamental mode of motion and the distribution of each mode among the system outputs. The optimal steady state response technique computes the blending of steady state control inputs that optimize the steady state response of selected system outputs. Analysis of an example airplane model is presented to demonstrate the described engineering approach.
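
    The modal-analysis idea can be sketched with a toy state-space model: eigendecompose the state matrix, then gauge how each input drives each mode (left eigenvectors times B) and how each mode is distributed among the outputs (C times right eigenvectors). The matrices below are illustrative, not an airplane model.

      import numpy as np

      A = np.array([[-0.02,  1.0], [-1.5, -0.8]])   # 2-state linear model (toy)
      B = np.array([[0.0], [1.2]])                  # one control input
      C = np.array([[1.0, 0.0]])                    # one measured output

      eigvals, V = np.linalg.eig(A)     # columns of V: right eigenvectors (modes)
      W = np.linalg.inv(V)              # rows of W: left eigenvectors

      mode_controllability = W @ B      # input-to-mode influence
      mode_observability = C @ V        # mode-to-output distribution
      for k, lam in enumerate(eigvals):
          print(f"mode {k}: lambda={lam:.3f}, "
                f"|ctrl|={abs(mode_controllability[k, 0]):.3f}, "
                f"|obsv|={abs(mode_observability[0, k]):.3f}")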

  20. Time is Money

    NASA Astrophysics Data System (ADS)

    Ausloos, Marcel; Vandewalle, Nicolas; Ivanova, Kristinka

    Specialized topics on financial data analysis from a numerical and physical point of view are discussed when pertaining to the analysis of coherent and random sequences in financial fluctuations within (i) the extended detrended fluctuation analysis method, (ii) multi-affine analysis technique, (iii) mobile average intersection rules and distributions, (iv) sandpile avalanches models for crash prediction, (v) the (m,k)-Zipf method and (vi) the i-variability diagram technique for sorting out short range correlations. The most baffling result that needs further thought from mathematicians and physicists is recalled: the crossing of two mobile averages is an original method for measuring the "signal" roughness exponent, but why it is so is not understood up to now.

  1. TU-CD-BRA-11: Application of Bone Suppression Technique to Inspiratory/expiratory Chest Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanaka, R; Sanada, S; Sakuta, K

    Purpose: The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating soft tissue images normally obtained by the dual-energy subtraction technique. This study was performed to investigate the usefulness of the bone suppression technique in quantitative analysis of pulmonary function in inspiratory/expiratory chest radiography. Methods: Commercial bone suppression image processing software (ClearRead; Riverain Technologies) was applied to paired inspiratory/expiratory chest radiographs of 107 patients (normal, 33; abnormal, 74) to create corresponding bone suppression images. The abnormal subjects had been diagnosed with pulmonary diseases such as pneumothorax, pneumonia, emphysema, asthma, and lung cancer. After recognition of the lung area, the vectors of respiratory displacement were measured in all local lung areas using a cross-correlation technique. The measured displacement in each area was visualized as a displacement color map. The distribution pattern of respiratory displacement was assessed by comparison with the findings of lung scintigraphy. Results: Respiratory displacement of pulmonary markings (soft tissues) could be quantified separately from the rib movements on bone suppression images. The resulting displacement map showed a left-right symmetric distribution increasing from the lung apex to the bottom region of the lung in many cases. However, patients with ventilatory impairments showed a nonuniform distribution caused by decreased displacement of pulmonary markings, which was confirmed to correspond to areas with ventilatory impairments found on the lung scintigrams. Conclusion: The bone suppression technique was useful for quantitative analysis of respiratory displacement of pulmonary markings without any interruption by the rib shadows. Abnormal areas could be detected as decreased displacement of pulmonary markings. Inspiratory/expiratory chest radiography combined with the bone suppression technique has potential for predicting local lung function on the basis of dynamic analysis of pulmonary markings. This work was partially supported by a Grant-in-Aid for Scientific Research (C) of the Ministry of Education, Culture, Sports, Science and Technology, Japan (Grant number: 24601007), the Nakatani Foundation, the Mitsubishi Foundation, and the Mitani Foundation for Research and Development. Yasushi Kishitani is a staff member of TOYO Corporation.
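
    The cross-correlation displacement measurement can be sketched as block matching: take a patch around a lung location in one frame, correlate it against a larger search window in the other frame, and read the displacement off the correlation peak. The images below are synthetic with a known shift.

      import numpy as np
      from scipy.signal import correlate

      rng = np.random.default_rng(9)
      insp = rng.normal(size=(128, 128))
      exp_img = np.roll(insp, shift=(4, -2), axis=(0, 1))   # known displacement

      r0, c0, half = 64, 64, 8
      patch = insp[r0-half:r0+half, c0-half:c0+half]
      window = exp_img[r0-3*half:r0+3*half, c0-3*half:c0+3*half]

      # Cross-correlate the mean-removed patch with the search window.
      corr = correlate(window - window.mean(), patch - patch.mean(), mode="valid")
      dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
      offset = 2 * half   # where zero displacement lands in the "valid" output
      print("estimated displacement (rows, cols):", dr - offset, dc - offset)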

  2. Analysis and prediction of Multiple-Site Damage (MSD) fatigue crack growth

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.

    1992-01-01

    A technique was developed to calculate the stress intensity factor for multiple interacting cracks. The analysis was verified through comparison with accepted methods of calculating stress intensity factors. The technique was incorporated into a fatigue crack growth prediction model and used to predict the fatigue crack growth life for multiple-site damage (MSD). The analysis was verified through comparison with experiments conducted on uniaxially loaded flat panels with multiple cracks. Configurations with nearly equal and unequal crack distributions were examined. The fatigue crack growth predictions agreed within 20 percent of the experimental lives for all crack configurations considered.

  3. A comparison of sample preparation strategies for biological tissues and subsequent trace element analysis using LA-ICP-MS.

    PubMed

    Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas

    2017-03-01

    Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been made in recent years. Nevertheless, sample preparation is also of major importance, and the optimal sample preparation strategy has still not been defined. While conventional histology offers a number of sample pre-treatment strategies, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations in cryo-cut tissues and FFPE samples were compared. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give results comparable to snap-frozen tissues. Graphical abstract: Sample preparation strategies for biological tissues are compared with regard to the elemental distributions and average trace element concentrations.

  4. A COMPARISON OF EXPERIMENTS AND THREE-DIMENSIONAL ANALYSIS TECHNIQUES. PART I. UNPOISONED UNIFORM SLAB CORE WITH A PARTIALLY INSERTED HAFNIUM ROD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renzi, N.E.; Roseberry, R.J.

    The experimental measurements and nuclear analysis of a uniformly loaded, unpoisoned slab core with a partially inserted hafnium rod are described. Comparisons of experimental data with calculated results of the UFO code and flux synthesis techniques are given. It was concluded that one of the flux synthesis techniques and the UFO code are able to predict flux distributions to within approximately 5% of experiment for most cases. An error of approximately 10% was found in the synthesis technique for a channel near the partially inserted rod. The various calculations were able to predict neutron pulsed shutdowns to only approximately 30%.

  5. Cost Benefit Analysis and Other Fun and Games.

    ERIC Educational Resources Information Center

    White, Herbert S.

    1985-01-01

    Discussion of application of cost benefit analysis (CBA) accounting techniques to libraries highlights user willingness to be charged for services provided, reasons why CBA will not work in library settings, libraries and budgets, cost distribution on basis of presumed or expected use, implementation of information-seeking behavior control, and…

  6. Statistical Analysis For Nucleus/Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1989-01-01

    Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles confirming existence of quark/gluon plasma.

  7. A comparison of solute-transport solution techniques and their effect on sensitivity analysis and inverse modeling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2001-01-01

    Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
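
    The numerical-dispersion point can be illustrated in a few lines: a first-order upwind discretization of the advection term adds an artificial dispersion of roughly v*dx/2 on top of the physical coefficient, which is exactly the kind of solver-dependent smearing that can leak into estimated dispersivities. A minimal 1-D sketch with illustrative parameters:

      import numpy as np

      nx, dx, dt = 200, 0.5, 0.05      # grid cells, cell size, time step
      v, D = 1.0, 0.05                 # velocity and physical dispersion coefficient
      c = np.zeros(nx)
      c[0] = 1.0                       # constant-concentration inlet

      for _ in range(1500):
          adv = -v * (c[1:-1] - c[:-2]) / dx                   # first-order upwind
          disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # central differences
          c[1:-1] += dt * (adv + disp)
          c[-1] = c[-2]                # zero-gradient outlet

      print("apparent numerical dispersion ~ v*dx/2 =", v * dx / 2)
      print("breakthrough position (c=0.5):", np.argmin(np.abs(c - 0.5)) * dx)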

  8. Coherent optical determination of the leaf angle distribution of corn

    NASA Technical Reports Server (NTRS)

    Ulaby, F. T. (Principal Investigator); Pihlman, M.

    1981-01-01

    A coherent optical technique for the diffraction analysis of an image is presented. Developments in radar remote sensing show a need to understand plant geometry and its relationship to plant moisture, soil moisture, and the radar backscattering coefficient. A corn plant changes its leaf angle distribution, as a function of time, from a uniform distribution to one that is strongly vertical. It is shown that plant and soil moisture may have an effect on plant geometry.

  9. Inhomogeneity Based Characterization of Distribution Patterns on the Plasma Membrane

    PubMed Central

    Paparelli, Laura; Corthout, Nikky; Wakefield, Devin L.; Sannerud, Ragna; Jovanovic-Talisman, Tijana; Annaert, Wim; Munck, Sebastian

    2016-01-01

    Cell surface protein and lipid molecules are organized in various patterns: randomly, along gradients, or clustered when segregated into discrete micro- and nano-domains. Their distribution is tightly coupled to events such as polarization, endocytosis, and intracellular signaling, but challenging to quantify using traditional techniques. Here we present a novel approach to quantify the distribution of plasma membrane proteins and lipids. This approach describes spatial patterns in degrees of inhomogeneity and incorporates an intensity-based correction to analyze images with a wide range of resolutions; we have termed it Quantitative Analysis of the Spatial distributions in Images using Mosaic segmentation and Dual parameter Optimization in Histograms (QuASIMoDOH). We tested its applicability using simulated microscopy images and images acquired by widefield microscopy, total internal reflection microscopy, structured illumination microscopy, and photoactivated localization microscopy. We validated QuASIMoDOH, successfully quantifying the distribution of protein and lipid molecules detected with several labeling techniques, in different cell model systems. We also used this method to characterize the reorganization of cell surface lipids in response to disrupted endosomal trafficking and to detect dynamic changes in the global and local organization of epidermal growth factor receptors across the cell surface. Our findings demonstrate that QuASIMoDOH can be used to assess protein and lipid patterns, quantifying distribution changes and spatial reorganization at the cell surface. An ImageJ/Fiji plugin of this analysis tool is provided. PMID:27603951

  10. Particle size analysis of amalgam powder and handpiece generated specimens.

    PubMed

    Drummond, J L; Hathorn, R M; Cailas, M D; Karuhn, R

    2001-07-01

    The increasing interest in the elimination of amalgam particles from the dental waste (DW) stream requires efficient devices to remove these particles. The major objective of this project was to perform a comparative evaluation of five basic methods of particle size analysis in terms of each instrument's ability to quantify the size distribution of the various components within the DW stream. The analytical techniques chosen were image analysis via scanning electron microscopy, standard wire mesh sieves, X-ray sedigraphy, laser diffraction, and electrozone analysis. The DW particle stream components were represented by amalgam powders and handpiece/diamond bur generated specimens of enamel, dentin, whole tooth, and condensed amalgam. Each analytical method quantified the examined DW particle stream components. However, X-ray sedigraphy, electrozone, and laser diffraction particle analyses provided similar results for determining particle distributions of DW samples. These three methods were able to more clearly quantify the properties of the examined powder and condensed amalgam samples. Furthermore, these methods indicated that a significant fraction of the DW stream contains particles smaller than 20 microns. The findings of this study indicated that the electrozone method is likely to be the most effective technique for quantifying the particle size distribution in the DW particle stream. This method required a relatively small volume of sample, was not affected by density, shape factors, or optical properties, and measured a sufficient number of particles to provide a reliable representation of the particle size distribution curve.

  11. Variability of residual stresses and superposition effect in multipass grinding of high-carbon high-chromium steel

    NASA Astrophysics Data System (ADS)

    Karabelchtchikova, Olga; Rivero, Iris V.

    2005-02-01

    The distribution of residual stresses (RS) and surface integrity generated in heat treatment and subsequent multipass grinding was investigated in this experimental study to examine the sources of variability and the nature of the interactions of the experimental factors. A nested experimental design was implemented (a) to compare the sources of RS variability, (b) to examine the RS distribution and tensile peak location due to experimental factors, and (c) to analyze the superposition relationship in the RS distribution due to the multipass grinding technique. To characterize the material responses, several techniques were used, including microstructural analysis, hardness-toughness and roughness examinations, and retained austenite and RS measurements using X-ray diffraction. The causality of the RS was explained through the strong correlation between the surface integrity characteristics and RS patterns. The main sources of variation were the depth of the RS distribution and the multipass grinding technique. The grinding effect on the RS was statistically significant; however, it was mostly predetermined by the preexisting RS induced in heat treatment. Regardless of the preceding treatments, the effect of the multipass grinding technique exhibited similar RS patterns, which suggests the existence of a superposition relationship and orthogonal memory between the passes of the grinding operation.

  12. Estimation of Microbial Concentration in Food Products from Qualitative, Microbiological Test Data with the MPN Technique.

    PubMed

    Fujikawa, Hiroshi

    2017-01-01

    Microbial concentration in samples of a food product lot has been generally assumed to follow the log-normal distribution in food sampling, but this distribution cannot accommodate the concentration of zero. In the present study, first, a probabilistic study with the most probable number (MPN) technique was done for a target microbe present at a low (or zero) concentration in food products. Namely, based on the number of target pathogen-positive samples in the total samples of a product found by a qualitative, microbiological examination, the concentration of the pathogen in the product was estimated by means of the MPN technique. The effects of the sample size and the total sample number of a product were then examined. Second, operating characteristic (OC) curves for the concentration of a target microbe in a product lot were generated on the assumption that the concentration of a target microbe could be expressed with the Poisson distribution. OC curves for Salmonella and Cronobacter sakazakii in powdered formulae for infants and young children were successfully generated. The present study suggested that the MPN technique and the Poisson distribution would be useful for qualitative microbiological test data analysis for a target microbe whose concentration in a lot is expected to be low.
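
    For a single-dilution scheme the MPN estimate has a closed form under the Poisson assumption: if k of n samples of mass m test positive, the fraction of negatives estimates exp(-lambda*m). A minimal sketch with illustrative sample numbers, including the corresponding operating-characteristic probability that a lot at concentration lambda yields no positives:

      import numpy as np

      def mpn(n_samples, n_positive, sample_mass_g):
          """Single-dilution MPN estimate (CFU/g) under a Poisson assumption."""
          neg_frac = (n_samples - n_positive) / n_samples
          if neg_frac == 0:
              raise ValueError("all samples positive: concentration unbounded")
          return -np.log(neg_frac) / sample_mass_g

      print("MPN estimate (CFU/g):",
            mpn(n_samples=20, n_positive=3, sample_mass_g=25))

      # Operating-characteristic curve: probability a lot at concentration lam
      # passes the test (no positives among n samples), P = exp(-lam * m * n).
      lam = np.array([1e-4, 1e-3, 1e-2])   # CFU/g
      print("P(accept):", np.exp(-lam * 25 * 20))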

  13. Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates

    NASA Astrophysics Data System (ADS)

    Moore, Christopher J.; Gair, Jonathan R.

    2014-12-01

    Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over using a prior distribution constructed by using Gaussian process regression to interpolate the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.
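
    The interpolation ingredient can be sketched in a few lines: train a Gaussian process on the waveform difference at the few parameter points where accurate templates exist, then use its mean and variance as the prior on model error elsewhere. The parameter grid and difference values below are synthetic placeholders.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      # Synthetic training set: parameter values with accurate templates and
      # the corresponding (placeholder) template-mismatch values.
      theta_train = np.array([[0.1], [0.3], [0.5], [0.7], [0.9]])
      diff_train = np.sin(6 * theta_train).ravel() * 0.1

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8)
      gp.fit(theta_train, diff_train)

      theta_query = np.array([[0.42]])
      mean, std = gp.predict(theta_query, return_std=True)
      print(f"interpolated waveform difference: {mean[0]:.4f} +/- {std[0]:.4f}")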

  14. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios and in particular interoperability are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  15. Elemental profiling of laser cladded multilayer coatings by laser induced breakdown spectroscopy and energy dispersive X-ray spectroscopy

    NASA Astrophysics Data System (ADS)

    Lednev, V. N.; Sdvizhenskii, P. A.; Filippov, M. N.; Grishin, M. Ya.; Filichkina, V. A.; Stavertiy, A. Ya.; Tretyakov, R. S.; Bunkin, A. F.; Pershin, S. M.

    2017-09-01

    Multilayer tungsten carbide wear resistant coatings were analyzed by laser induced breakdown spectroscopy (LIBS) and energy dispersive X-ray (EDX) spectroscopy. Coaxial laser cladding technique was utilized to produce tungsten carbide coating deposited on low alloy steel substrate with additional inconel 625 interlayer. EDX and LIBS techniques were used for elemental profiling of major components (Ni, W, C, Fe, etc.) in the coating. A good correlation between EDX and LIBS data was observed while LIBS provided additional information on light element distribution (carbon). A non-uniform distribution of tungsten carbide grains along coating depth was detected by both LIBS and EDX. In contrast, horizontal elemental profiling showed a uniform tungsten carbide particles distribution. Depth elemental profiling by layer-by-layer LIBS analysis was demonstrated to be an effective method for studying tungsten carbide grains distribution in wear resistant coating without any sample preparation.

  16. Efficient dynamic events discrimination technique for fiber distributed Brillouin sensors.

    PubMed

    Galindez, Carlos A; Madruga, Francisco J; Lopez-Higuera, Jose M

    2011-09-26

    A technique to detect real-time variations of temperature or strain in Brillouin-based distributed fiber sensors is proposed and investigated in this paper. The technique is based on anomaly detection methods such as the RX algorithm. Detection and isolation of dynamic events from static ones are demonstrated by proper processing of the Brillouin gain values obtained using a standard BOTDA system. Results also suggest that better signal-to-noise ratio, dynamic range and spatial resolution can be obtained. For a pump pulse of 5 ns the spatial resolution is enhanced (from 0.541 m obtained by direct gain measurement to 0.418 m obtained with the proposed technique), since the analysis concentrates on the variation of the Brillouin gain rather than only on time-averaging of the signal. © 2011 Optical Society of America
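
    At its core the RX algorithm scores each observation by its squared Mahalanobis distance from the background statistics. A minimal sketch, assuming each acquisition is one Brillouin gain trace stored as a row of a matrix:

```python
import numpy as np

def rx_scores(frames):
    """RX anomaly score for each row of `frames` (n_obs, n_features):
    squared Mahalanobis distance from the background mean/covariance."""
    mu = frames.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(frames, rowvar=False))  # pinv guards
    diff = frames - mu                                      # against singular cov
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Frames whose score exceeds a threshold (e.g. a high percentile of scores
# from a static reference period) are flagged as dynamic events.
```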

  17. A real time study on condition monitoring of distribution transformer using thermal imager

    NASA Astrophysics Data System (ADS)

    Mariprasath, T.; Kirubakaran, V.

    2018-05-01

    The transformer is one of the most critical apparatus in the power system; even a few minutes of outage severely affects the power system. Hence, preventive maintenance techniques are essential. Continuous condition monitoring significantly increases the life span of the transformer and reduces the maintenance cost, so monitoring the transformer's temperature is very important. In this paper, a critical review is made of various condition monitoring techniques, and a new hot-spot indication technique is discussed. The transformer's operating condition is monitored using a thermal imager. The thermal analysis shows that the major hotspots appear at the connection lead-outs and that the bushing is the hottest spot in the transformer, so monitoring the oil level is essential. In addition, real-time power quality analysis was carried out using a power analyzer. It shows that industrial drives inject current harmonics into the distribution network, causing power quality problems on the grid; moreover, the current harmonics exceed the IEEE standard limit. Hence, an adequate harmonic suppression technique is the need of the hour.

  18. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
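
    A minimal sketch of this idea with a generic feedforward classifier; the nominal response, fault offset and noise levels below are invented for illustration and are not taken from the paper's software system.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
nominal = np.linspace(1.0, 2.0, 8)          # hypothetical 8-point test response
good = nominal + 0.05 * rng.standard_normal((200, 8))
faulty = nominal + 0.3 + 0.05 * rng.standard_normal((200, 8))
X = np.vstack([good, faulty])
y = np.array([0] * 200 + [1] * 200)

# A small feedforward network learns the decision rule directly from
# samples, with no parametric assumption about the measurement distribution.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict((nominal + 0.3)[None, :]))    # expected: [1] (faulty)
```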

  19. Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.

    PubMed

    Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar

    2010-09-01

    A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDFs of the compound states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of the vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by the patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
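
    The moment computation itself is a one-liner per moment; a minimal sketch for a sampled current record (the function name is ours):

```python
import numpy as np
from scipy.stats import skew, kurtosis

def current_moments(i_t):
    """Mean, variance, skewness and excess kurtosis of an ion-current
    record; departures of the last two from their Gaussian values (0, 0)
    carry the extra channel information exploited in the paper."""
    i_t = np.asarray(i_t, dtype=float)
    return i_t.mean(), i_t.var(ddof=1), skew(i_t), kurtosis(i_t)
```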

  20. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For methods of system stability analysis based on statistics, it is difficult to resolve the problems of unknown probability distributions and small samples. Therefore, a novel method is proposed in this paper to resolve these problems. This method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound and the upper bound of the system using fuzzy-set theory. The empirical distribution function is then investigated to ensure a confidence level above 95%, and the degree of similarity is presented to evaluate the stability of the system. Computer simulations investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems, validating the proposed method of stability analysis.

  1. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... that a price is fair and reasonable. 3. The Promotion of Competition Comment: One respondent believed..., distributive impacts, and equity). E.O. 13563 emphasizes the importance of quantifying both costs and benefits...

  2. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT PLN. The dispatching center of a power grid company is also the data center of the grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing with online analytical processing (OLAP) has been used to manage and analyze this large volume of data. The specific outputs of the online analytical information system resulting from data warehouse processing with OLAP are chart and query reports. The chart reports consist of load distribution charts over time, distribution charts by area, substation region charts and electric load usage charts. The results of the OLAP process show the development of electric load distribution, provide analysis of electric power consumption loads, and become an alternative way of presenting information related to peak load.
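
    As a rough illustration of the roll-up such a system performs, the sketch below builds a small OLAP-style cube with pandas; the column names and values are invented, not taken from the PLN data warehouse:

```python
import pandas as pd

# Hypothetical load records: one row per (timestamp, area) reading.
df = pd.DataFrame({
    "timestamp": pd.to_datetime(["2017-01-01 00:00", "2017-01-01 01:00",
                                 "2017-01-01 00:00", "2017-01-01 01:00"]),
    "area": ["North", "North", "South", "South"],
    "load_mw": [120.5, 98.2, 87.0, 91.4],
})

# OLAP-style roll-up: slice by hour, dice by area, aggregate the load.
cube = df.pivot_table(index=df["timestamp"].dt.hour,
                      columns="area", values="load_mw", aggfunc="sum")
print(cube)
```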

  3. Execution models for mapping programs onto distributed memory parallel computers

    NASA Technical Reports Server (NTRS)

    Sussman, Alan

    1992-01-01

    The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.

  4. Location estimation in wireless sensor networks using spring-relaxation technique.

    PubMed

    Zhang, Qing; Foh, Chuan Heng; Seet, Boon-Chong; Fong, A C M

    2010-01-01

    Accurate and low-cost autonomous self-localization is a critical requirement of various applications of a large-scale distributed wireless sensor network (WSN). Because of the massive deployment of sensors, explicit measurement based on specialized localization hardware such as the Global Positioning System (GPS) is not practical. In this paper, we propose a low-cost WSN localization solution. Our design uses received signal strength indicators for ranging, lightweight distributed algorithms based on the spring-relaxation technique for location computation, and a cooperative approach to achieve a given location estimation accuracy with a low number of nodes with known locations. We provide analysis to show the suitability of the spring-relaxation technique for WSN localization with the cooperative approach, and perform simulation experiments to illustrate its accuracy.
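
    A minimal sketch of the spring-relaxation update for a single unknown node with ideal ranges to anchors at known positions (the fully distributed, multi-node version of the paper is not reproduced):

```python
import numpy as np

def spring_relax(pos, anchors, dists, n_iter=500, step=0.1):
    """Each measured range acts as a spring of rest length dists[i]
    anchored at anchors[i]; the node moves along the resultant force
    until the springs relax. pos: (2,), anchors: (k, 2), dists: (k,)."""
    for _ in range(n_iter):
        vec = anchors - pos
        d = np.linalg.norm(vec, axis=1)
        force = ((d - dists) / np.maximum(d, 1e-9))[:, None] * vec
        pos = pos + step * force.sum(axis=0)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([4.0, 3.0])
dists = np.linalg.norm(anchors - true_pos, axis=1)   # ideal RSSI-derived ranges
print(spring_relax(np.array([9.0, 9.0]), anchors, dists))   # ~ [4. 3.]
```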

  5. Electron microprobe analysis and histochemical examination of the calcium distribution in human bone trabeculae: a methodological study using biopsy specimens from post-traumatic osteopenia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obrant, K.J.; Odselius, R.

    1984-01-01

    Energy dispersive X-ray microanalysis (EDX) (or electron microprobe analysis) of the relative intensity for calcium in different bone trabeculae from the tibia epiphysis, and in different parts of one and the same trabecula, was performed on 3 patients who had earlier had a fracture of the ipsilateral tibia-diaphysis. The variation in intensity was compared with the histochemical patterns obtained with both the Goldner and the von Kossa staining techniques for detecting calcium in tissues. Previously reported calcium distribution features, found to be typical for posttraumatic osteopenia, such as striated mineralization patterns in individual trabeculae and large differences in mineralization level between different trabeculae, could be verified both by means of the two histochemical procedures and from the electron microprobe analysis. A pronounced difference was observed, however, between the two histochemical staining techniques as regards their sensitivity to detect calcium. To judge from the values obtained from the EDX measurements, the sensitivity of the Goldner technique should be more than ten times higher than that of von Kossa. The EDX measurements gave more detailed information than either of the two histochemical techniques: great variations in the intensity of the calcium peak were found in trabeculae stained as unmineralized as well as mineralized.

  6. Acoustic mode measurements in the inlet of a model turbofan using a continuously rotating rake: Data collection/analysis techniques

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Heidelberg, Laurence; Konno, Kevin

    1993-01-01

    The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.

  7. Application of a novel new multispectral nanoparticle tracking technique

    NASA Astrophysics Data System (ADS)

    McElfresh, Cameron; Harrington, Tyler; Vecchio, Kenneth S.

    2018-06-01

    Fast, reliable, and accurate particle size analysis techniques must meet the demands of evolving industrial and academic research in areas of functionalized nanoparticle synthesis, advanced materials development, and other nanoscale-enabled technologies. In this study, a new multispectral particle tracking analysis (m-PTA) technique enabled by the ViewSizer™ 3000 (MANTA Instruments, USA) was evaluated using solutions of monomodal and multimodal gold and polystyrene latex nanoparticles, as well as a spark-eroded polydisperse 316L stainless steel nanopowder and large (non-Brownian) borosilicate particles. It was found that m-PTA performed comparably to dynamic light scattering (DLS) in the evaluation of monomodal particle size distributions. When measuring bimodal, trimodal and polydisperse solutions, the m-PTA technique overwhelmingly outperformed traditional DLS in both peak detection and relative particle concentration analysis. It was also observed that the m-PTA technique is less susceptible to large-particle overexpression errors. The ViewSizer™ 3000 was also found to be successful in accurately evaluating sizes and concentrations of monomodal and bimodal sinking borosilicate particles.

  8. In Situ Distribution Guided Analysis and Visualization of Transonic Jet Engine Simulations.

    PubMed

    Dutta, Soumya; Chen, Chun-Ming; Heinlein, Gregory; Shen, Han-Wei; Chen, Jen-Ping

    2017-01-01

    Study of flow instability in turbine engine compressors is crucial to understanding the inception and evolution of engine stall. Aerodynamics experts have been working on detecting the early signs of stall in order to devise novel stall suppression technologies. A state-of-the-art Navier-Stokes based, time-accurate computational fluid dynamics simulator, TURBO, has been developed at NASA to enhance the understanding of flow phenomena undergoing rotating stall. Despite the proven high modeling accuracy of TURBO, the sheer volume of simulation data makes post-hoc analysis prohibitive in both storage and I/O time. To address these issues and allow the expert to perform scalable stall analysis, we have designed an in situ distribution guided stall analysis technique. Our method summarizes statistics of important properties of the simulation data in situ using a probabilistic data modeling scheme. This data summarization enables statistical anomaly detection for flow instability in post analysis, which reveals the spatiotemporal trends of rotating stall for the expert to conceive new hypotheses. Furthermore, verification of the hypotheses and exploratory visualization using the summarized data are realized using probabilistic visualization techniques such as uncertain isocontouring. Positive feedback from the domain scientist has indicated the efficacy of our system in exploratory stall analysis.

  9. XRD measurement of mean thickness, thickness distribution and strain for illite and illite-smectite crystallites by the Bertaut-Warren-Averbach technique

    USGS Publications Warehouse

    Drits, Victor A.; Eberl, Dennis D.; Środoń, Jan

    1998-01-01

    A modified version of the Bertaut-Warren-Averbach (BWA) technique (Bertaut 1949, 1950; Warren and Averbach 1950) has been developed to measure coherent scattering domain (CSD) sizes and strains in minerals by analysis of X-ray diffraction (XRD) data. This method is used to measure CSD thickness distributions for calculated and experimental XRD patterns of illites and illite-smectites (I-S). The method almost exactly recovers CSD thickness distributions for calculated illite XRD patterns. Natural I-S samples contain swelling layers that lead to nonperiodic structures in the c* direction and to XRD peaks that are broadened and made asymmetric by mixed layering. Therefore, these peaks cannot be analyzed by the BWA method. These difficulties are overcome by K-saturation and heating prior to X-ray analysis in order to form 10-Å periodic structures. BWA analysis yields the thickness distribution of mixed-layer crystals (coherently diffracting stacks of fundamental illite particles). For most I-S samples, CSD thickness distributions can be approximated by lognormal functions. Mixed-layer crystal mean thickness and expandability then can be used to calculate fundamental illite particle mean thickness. Analyses of the dehydrated, K-saturated samples indicate that basal XRD reflections are broadened by symmetrical strain that may be related to local variations in smectite interlayers caused by dehydration, and that the standard deviation of the strain increases regularly with expandability. The 001 and 002 reflections are affected only slightly by this strain and therefore are suited for CSD thickness analysis. Mean mixed-layer crystal thicknesses for dehydrated I-S measured by the BWA method are very close to those measured by an integral peak width method.

  10. Analytical method for predicting the pressure distribution about a nacelle at transonic speeds

    NASA Technical Reports Server (NTRS)

    Keith, J. S.; Ferguson, D. R.; Merkle, C. L.; Heck, P. H.; Lahti, D. J.

    1973-01-01

    The formulation and development of a computer analysis for the calculation of streamlines and pressure distributions around two-dimensional (planar and axisymmetric) isolated nacelles at transonic speeds are described. The computerized flow field analysis is designed to predict the transonic flow around long and short high-bypass-ratio fan duct nacelles with inlet flows and with exhaust flows having appropriate aerothermodynamic properties. The flow field boundaries are located as far upstream and downstream as necessary to obtain minimum disturbances at the boundary. The far-field lateral flow field boundary is analytically defined to exactly represent free-flight conditions or solid wind tunnel wall effects. The inviscid solution technique is based on a Streamtube Curvature Analysis. The computer program utilizes an automatic grid refinement procedure and solves the flow field equations with a matrix relaxation technique. The boundary layer displacement effects and the onset of turbulent separation are included, based on the compressible turbulent boundary layer solution method of Stratford and Beavers and on the turbulent separation prediction method of Stratford.

  11. A New Approach to X-ray Analysis of SNRs

    NASA Astrophysics Data System (ADS)

    Frank, Kari A.; Burrows, David; Dwarkadas, Vikram

    2016-06-01

    We present preliminary results of applying a novel analysis method, Smoothed Particle Inference (SPI), to XMM-Newton observations of SNR RCW 103 and Tycho. SPI is a Bayesian modeling process that fits a population of gas blobs ("smoothed particles") such that their superposed emission reproduces the observed spatial and spectral distribution of photons. Emission-weighted distributions of plasma properties, such as abundances and temperatures, are then extracted from the properties of the individual blobs. This technique has important advantages over analysis techniques which implicitly assume that remnants are two-dimensional objects in which each line of sight encompasses a single plasma. By contrast, SPI allows superposition of as many blobs of plasma as are needed to match the spectrum observed in each direction, without the need to bin the data spatially. The analyses of RCW 103 and Tycho are part of a pilot study for the larger SPIES (Smoothed Particle Inference Exploration of SNRs) project, in which SPI will be applied to a sample of 12 bright SNRs.

  12. Extending unbiased stereology of brain ultrastructure to three-dimensional volumes

    NASA Technical Reports Server (NTRS)

    Fiala, J. C.; Harris, K. M.; Koslow, S. H. (Principal Investigator)

    2001-01-01

    OBJECTIVE: Analysis of brain ultrastructure is needed to reveal how neurons communicate with one another via synapses and how disease processes alter this communication. In the past, such analyses have usually been based on single or paired sections obtained by electron microscopy. Reconstruction from multiple serial sections provides a much needed, richer representation of the three-dimensional organization of the brain. This paper introduces a new reconstruction system and new methods for analyzing in three dimensions the location and ultrastructure of neuronal components, such as synapses, which are distributed non-randomly throughout the brain. DESIGN AND MEASUREMENTS: Volumes are reconstructed by defining transformations that align the entire area of adjacent sections. Whole-field alignment requires rotation, translation, skew, scaling, and second-order nonlinear deformations. Such transformations are implemented by a linear combination of bivariate polynomials. Computer software for generating transformations based on user input is described. Stereological techniques for assessing structural distributions in reconstructed volumes are the unbiased bricking, disector, unbiased ratio, and per-length counting techniques. A new general method, the fractional counter, is also described. This unbiased technique relies on the counting of fractions of objects contained in a test volume. A volume of brain tissue from stratum radiatum of hippocampal area CA1 is reconstructed and analyzed for synaptic density to demonstrate and compare the techniques. RESULTS AND CONCLUSIONS: Reconstruction makes practicable volume-oriented analysis of ultrastructure using such techniques as the unbiased bricking and fractional counter methods. These analysis methods are less sensitive to the section-to-section variations in counts and section thickness, factors that contribute to the inaccuracy of other stereological methods. In addition, volume reconstruction facilitates visualization and modeling of structures and analysis of three-dimensional relationships such as synaptic connectivity.

  13. Accurate determination of Brillouin frequency based on cross recurrence plot analysis in Brillouin distributed fiber sensor

    NASA Astrophysics Data System (ADS)

    Haneef, Shahna M.; Srijith, K.; Venkitesh, D.; Srinivasan, B.

    2017-04-01

    We propose and demonstrate the use of cross recurrence plot analysis (CRPA) to accurately determine the Brillouin shift due to strain and temperature in a Brillouin distributed fiber sensor. This signal processing technique, implemented in Brillouin sensors for the first time, relies on a priori data, i.e., the lineshape of the Brillouin gain spectrum and its similarity with the spectral features measured at different locations along the fiber. Analytical and experimental investigations of the proposed scheme are presented in this paper.

  14. Research of the self-healing technologies in the optical communication network of distribution automation

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Zhong, Guoxin

    2018-03-01

    Optical communication networks are the mainstream communication technique for distribution automation, and self-healing technologies can significantly improve their reliability. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access layer, the backbone layer and the core layer of optical communication networks for distribution automation. Based on a comparative analysis, the paper gives application suggestions for these self-healing technologies.

  15. Methods and apparatus for analysis of chromatographic migration patterns

    DOEpatents

    Stockham, T.G.; Ives, J.T.

    1993-12-28

    A method and apparatus are presented for sharpening signal peaks in a signal representing the distribution of biological or chemical components of a mixture separated by a chromatographic technique such as, but not limited to, electrophoresis. A key step in the method is the use of a blind deconvolution technique, presently embodied as homomorphic filtering, to reduce the contribution of a blurring function to the signal encoding the peaks of the distribution. The invention further includes steps and apparatus directed to determination of a nucleotide sequence from a set of four such signals representing DNA sequence data derived by electrophoretic means. 16 figures.
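
    In the spirit of the homomorphic-filtering embodiment, the sketch below lifters out the low-quefrency part of the log-magnitude spectrum, where a smooth blurring function concentrates; the cutoff and the exact scheme are illustrative, not the patented procedure:

```python
import numpy as np

def homomorphic_sharpen(trace, cutoff=20):
    """Convolutional blur becomes additive in the log-magnitude spectrum,
    and slowly varying there, so it lands at low quefrency in the
    cepstrum; zeroing that band sharpens the peaks. Assumes
    len(trace) > 2 * cutoff."""
    n = len(trace)
    spec = np.fft.rfft(trace)
    ceps = np.fft.irfft(np.log(np.abs(spec) + 1e-12), n=n)
    ceps[:cutoff] = 0.0          # remove the smooth (blur) component
    ceps[-cutoff:] = 0.0         # ...and its mirror half
    sharp_log_mag = np.fft.rfft(ceps).real
    return np.fft.irfft(np.exp(sharp_log_mag) * np.exp(1j * np.angle(spec)), n=n)
```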

  16. Terahertz spectral unmixing based method for identifying gastric cancer

    NASA Astrophysics Data System (ADS)

    Cao, Yuqi; Huang, Pingjie; Li, Xian; Ge, Weiting; Hou, Dibo; Zhang, Guangxin

    2018-02-01

    At present, many researchers are exploring biological tissue inspection using terahertz time-domain spectroscopy (THz-TDS) techniques. In this study, based on a modified hard modeling factor analysis method, terahertz spectral unmixing was applied to investigate the relationships between the absorption spectra in THz-TDS and certain biomarkers of gastric cancer in order to systematically identify gastric cancer. A probability distribution and box plot were used to extract the distinctive peaks that indicate carcinogenesis, and the corresponding weight distributions were used to discriminate the tissue types. The results of this work indicate that terahertz techniques have the potential to detect different levels of cancer, including benign tumors and polyps.

  17. Local coexistence of VO2 phases revealed by deep data analysis

    DOE PAGES

    Strelcov, Evgheni; Ievlev, Anton; Tselev, Alexander; ...

    2016-07-07

    We report a synergistic approach of micro-Raman spectroscopic mapping and deep data analysis to study the distribution of crystallographic phases and ferroelastic domains in a defected Al-doped VO2 microcrystal. Bayesian linear unmixing revealed an uneven distribution of the T phase, stabilized by surface defects and uneven local doping, that went undetected by other classical analysis techniques such as PCA and SIMPLISMA. This work demonstrates the impact of information recovery via statistical analysis and full mapping in spectroscopic studies of vanadium dioxide systems, which are commonly substituted by averaging or single-point-probing approaches, both of which suffer from information misinterpretation due to low resolving power.

  18. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
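
    Of the numerical indicators named above, the Modal Assurance Criterion is the simplest to state; a sketch under the assumption of real-valued mode shapes stored column-wise:

```python
import numpy as np

def mac_matrix(phi_a, phi_b):
    """MAC[i, j] = |phi_a_i . phi_b_j|^2 / (|phi_a_i|^2 |phi_b_j|^2)
    for mode-shape matrices of shape (n_dof, n_modes)."""
    num = (phi_a.T @ phi_b) ** 2
    den = np.outer((phi_a**2).sum(axis=0), (phi_b**2).sum(axis=0))
    return num / den

# Tracking: pair each mode of model A with the model-B mode of highest MAC.
```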

  19. Environmental Justice Challenges for Ecosystem Service Valuation

    EPA Science Inventory

    In pursuing improved ecosystem services management, there is also an opportunity to work towards environmental justice. The practice of environmental valuation can assist with both goals, but as typically employed obscures distributional analysis. Furthermore, valuation technique...

  1. Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution

    NASA Technical Reports Server (NTRS)

    Zoladz, T. F.; Jones, J. H.; Jong, J.

    1992-01-01

    A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time-frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications, which more often than not involve multicomponent signals.
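
    For orientation, the sketch below computes the classic discrete WD, whose cross terms the MWD is designed to suppress (the modification itself is not reproduced here):

```python
import numpy as np
from scipy.signal import hilbert

def wigner(x):
    """Classic discrete Wigner distribution: for each time t, FFT the
    local autocorrelation x[t+tau] * conj(x[t-tau]) over the lag tau.
    The analytic signal (Hilbert transform) limits aliasing."""
    z = hilbert(np.asarray(x, dtype=float))
    n = len(z)
    acf = np.zeros((n, n), dtype=complex)
    for t in range(n):
        taumax = min(t, n - 1 - t)
        tau = np.arange(-taumax, taumax + 1)
        acf[t, tau % n] = z[t + tau] * np.conj(z[t - tau])
    return np.fft.fft(acf, axis=1).real    # rows: time, columns: frequency
```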

  2. Sensor failure and multivariable control for airbreathing propulsion systems. Ph.D. Thesis - Dec. 1979 Final Report

    NASA Technical Reports Server (NTRS)

    Behbehani, K.

    1980-01-01

    A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation, are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed and the effects of model degradation are studied.
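
    A minimal sketch of a GLR detector for one such failure mode, a hard-over (sustained bias) fault in a zero-mean Gaussian residual sequence; the simplification to a known, constant noise variance is ours:

```python
import numpy as np

def glr_bias(residuals, sigma):
    """For each candidate onset k, the bias MLE is the mean of r[k:],
    and 2*ln(likelihood ratio) = m * mean^2 / sigma^2 with m = n - k.
    A failure is declared when the maximum exceeds a threshold."""
    r = np.asarray(residuals, dtype=float)
    n = len(r)
    glr = np.empty(n)
    for k in range(n):
        m = n - k
        glr[k] = m * r[k:].mean() ** 2 / sigma**2
    return glr
```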

  3. Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania

    NASA Astrophysics Data System (ADS)

    Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu

    2017-09-01

    This paper identifies the salient factors that characterize the inequality of the income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (the Theil index). A decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive data set (11.1 million records for 2014) of the total personal gross income of Romanian citizens.
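
    A minimal sketch of the Theil index and its between/within-group decomposition on invented incomes (the 11.1-million-record data set is, of course, not reproduced):

```python
import numpy as np

def theil_t(x):
    """Theil T index of a positive income array."""
    r = np.asarray(x, dtype=float) / np.mean(x)
    return np.mean(r * np.log(r))

def theil_decompose(x, groups):
    """T = sum_g s_g ln(mu_g/mu) + sum_g s_g T_g (between + within),
    where s_g is the income share of group g."""
    x, groups = np.asarray(x, dtype=float), np.asarray(groups)
    mu = x.mean()
    between = within = 0.0
    for g in np.unique(groups):
        xg = x[groups == g]
        s_g = xg.sum() / x.sum()
        between += s_g * np.log(xg.mean() / mu)
        within += s_g * theil_t(xg)
    return between, within

incomes = np.array([1200.0, 1500.0, 900.0, 4000.0, 5200.0, 3100.0])
labels = np.array(["rural", "rural", "rural", "urban", "urban", "urban"])
b, w = theil_decompose(incomes, labels)
assert np.isclose(b + w, theil_t(incomes))   # exact decomposition
```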

  4. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  5. A new model to predict weak-lensing peak counts. II. Parameter constraint strategies

    NASA Astrophysics Data System (ADS)

    Lin, Chieh-An; Kilbinger, Martin

    2015-11-01

    Context. Peak counts have been shown to be an excellent tool for extracting the non-Gaussian part of the weak lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analysis. Aims: In this work, we explore and compare various strategies for constraining a parameter using our model, focusing on the matter density Ωm and the density fluctuation amplitude σ8. Methods: First, we examine the impact from the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique that makes a weaker assumption than does the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. Results: We find that neglecting the CDC effect enlarges parameter contours by 22% and that the covariance-varying copula likelihood is a very good approximation to the true likelihood. The direct techniques work well in spite of noisier contours. Concerning ABC, the iterative process converges quickly to a posterior distribution that is in excellent agreement with results from our other analyses. The time cost for ABC is reduced by two orders of magnitude. Conclusions: The stochastic nature of our weak-lensing peak count model allows us to use various techniques that approach the true underlying probability distribution of observables, without making simplifying assumptions. Our work can be generalized to other observables where forward simulations provide samples of the underlying distribution.
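
    The ABC step is, at heart, a simple accept-reject loop; a minimal sketch with an invented Poisson toy model standing in for the peak-count simulator:

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_draws):
    """Likelihood-free inference: draw theta from the prior, run the
    forward simulator, keep theta when the summary statistic lands
    within eps of the observation."""
    kept = [theta for theta in (prior_sample() for _ in range(n_draws))
            if distance(simulate(theta), observed) < eps]
    return np.array(kept)     # samples from the approximate posterior

rng = np.random.default_rng(2)
posterior = abc_rejection(observed=12,
                          simulate=lambda lam: rng.poisson(lam),
                          prior_sample=lambda: rng.uniform(0.0, 30.0),
                          distance=lambda a, b: abs(a - b),
                          eps=1, n_draws=20000)
```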

  6. Combined point and distributed techniques for multidimensional estimation of spatial groundwater-stream water exchange in a heterogeneous sand bed-stream.

    NASA Astrophysics Data System (ADS)

    Gaona Garcia, J.; Lewandowski, J.; Bellin, A.

    2017-12-01

    Groundwater-stream water interactions in rivers determine water balances, as well as chemical and biological processes in the streambed at different spatial and temporal scales. Because gaining, neutral and losing conditions are difficult to identify and quantify, it is necessary to combine techniques with complementary capabilities and scale ranges. We applied this concept to a study site at the River Schlaube, East Brandenburg, Germany, a sand-bed stream with intense sediment heterogeneity and complex environmental conditions. In our approach, point techniques such as streambed temperature profiles, together with vertical hydraulic gradients, provide data for estimating fluxes between groundwater and surface water with the numerical model 1DTempPro. Among distributed techniques, fiber-optic distributed temperature sensing identifies the spatial patterns of neutral, down-welling and up-welling areas through analysis of changes in the thermal patterns at the streambed interface under given flow conditions. The study finally links point and surface temperatures to provide a method for upscaling fluxes. Point techniques provide point flux estimates with the depth detail essential for inferring streambed structures, but the results hardly represent the spatial distribution of fluxes caused by the heterogeneity of streambed properties. Fiber optics proved capable of providing spatial thermal patterns with enough resolution to observe distinct hyporheic thermal footprints at multiple scales. Relating the thermal footprint patterns and their temporal behavior to the flux results from point techniques enabled spatial flux estimates. The lack of detailed information on the spatial distribution of the physical drivers restricts the spatial flux estimation to the T-proxy method, whose highly uncertain results provide only coarse spatial flux estimates. The study concludes that upscaling groundwater-stream water interactions from thermal measurements with combined point and distributed techniques requires the integration of physical drivers because of the heterogeneity of the flux patterns. Combined experimental and modeling approaches may help to obtain a more reliable understanding of groundwater-surface water interactions at multiple scales.

  7. Contribution of multiple inert gas elimination technique to pulmonary medicine. 1. Principles and information content of the multiple inert gas elimination technique.

    PubMed Central

    Roca, J.; Wagner, P. D.

    1994-01-01

    This introductory review summarises four different aspects of the multiple inert gas elimination technique (MIGET). Firstly, the historical background that facilitated, in the mid 1970s, the development of the MIGET as a tool to obtain more information about the entire spectrum of VA/Q distribution in the lung by measuring the exchange of six gases of different solubility in trace concentrations. Its principle is based on the observation that the retention (or excretion) of any gas is dependent on the solubility (lambda) of that gas and the VA/Q distribution. A second major aspect is the analysis of the information content and limitations of the technique. During the last 15 years a substantial amount of clinical research using the MIGET has been generated by several groups around the world. The technique has been shown to be adequate in understanding the mechanisms of hypoxaemia in different forms of pulmonary disease and the effects of therapeutic interventions, but also in separately determining the quantitative role of each extrapulmonary factor on systemic arterial PO2 when they change between two conditions of MIGET measurement. This information will be extensively reviewed in the forthcoming articles of this series. Next, the different modalities of the MIGET, practical considerations involved in the measurements and the guidelines for quality control have been indicated. Finally, a section has been devoted to the analysis of available data in healthy subjects under different conditions. The lack of systematic information on the VA/Q distributions of older healthy subjects is emphasised, since it will be required to fully understand the changes brought about by diseases that affect the older population. PMID:8091330

  8. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1988-01-01

    The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
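
    For the maximum likelihood step alone, standard library routines now suffice; a minimal sketch on invented strength data (the report's censoring, confidence-interval and outlier machinery is not reproduced):

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical fracture strengths (MPa) of a complete ceramic sample set.
strengths = np.array([312.0, 335.0, 348.0, 360.0, 371.0,
                      383.0, 395.0, 410.0, 428.0, 451.0])

# MLE of the two-parameter Weibull distribution: location fixed at zero,
# `m` the shape (Weibull modulus), `s0` the characteristic strength.
m, _, s0 = weibull_min.fit(strengths, floc=0.0)
mean, var = weibull_min.stats(m, scale=s0, moments="mv")
print(f"shape m = {m:.2f}, scale = {s0:.1f} MPa, mean = {mean:.1f} MPa")
```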

  9. X-ray elemental mapping techniques for elucidating the ecophysiology of hyperaccumulator plants.

    PubMed

    van der Ent, Antony; Przybyłowicz, Wojciech J; de Jonge, Martin D; Harris, Hugh H; Ryan, Chris G; Tylko, Grzegorz; Paterson, David J; Barnabas, Alban D; Kopittke, Peter M; Mesjasz-Przybyłowicz, Jolanta

    2018-04-01

    Hyperaccumulators are attractive models for studying metal(loid) homeostasis, and probing the spatial distribution and coordination chemistry of metal(loid)s in their tissues is important for advancing our understanding of their ecophysiology. X-ray elemental mapping techniques are unique in providing in situ information, and with appropriate sample preparation offer results true to biological conditions of the living plant. The common platform of these techniques is a reliance on characteristic X-rays of elements present in a sample, excited either by electrons (scanning/transmission electron microscopy), protons (proton-induced X-ray emission) or X-rays (X-ray fluorescence microscopy). Elucidating the cellular and tissue-level distribution of metal(loid)s is inherently challenging, and accurate X-ray analysis places strict demands on sample collection, preparation and analytical conditions to avoid elemental redistribution, chemical modification or ultrastructural alterations. We compare the merits and limitations of the individual techniques, and focus on the optimal field of applications for inferring ecophysiological processes in hyperaccumulator plants. X-ray elemental mapping techniques can play a key role in answering questions at every level of metal(loid) homeostasis in plants, from the rhizosphere interface to uptake pathways in the roots and shoots. Further improvements in technological capabilities offer exciting perspectives for the study of hyperaccumulator plants into the future. © 2017 University of Queensland. New Phytologist © 2017 New Phytologist Trust.

  10. Revealing the influence of water-cement ratio on the pore size distribution in hydrated cement paste by using cyclohexane

    NASA Astrophysics Data System (ADS)

    Bede, Andrea; Ardelean, Ioan

    2017-12-01

    Varying the amount of water in a concrete mix will influence its final properties considerably due to the changes in the capillary porosity. That is why a non-destructive technique is necessary for revealing the capillary pore distribution inside hydrated cement based materials and linking the capillary porosity with the macroscopic properties of these materials. In the present work, we demonstrate a simple approach for revealing the differences in capillary pore size distributions introduced by the preparation of cement paste with different water-to-cement ratios. The approach relies on monitoring the nuclear magnetic resonance transverse relaxation distribution of cyclohexane molecules confined inside the cement paste pores. The technique reveals the whole spectrum of pores inside the hydrated cement pastes, allowing a qualitative and quantitative analysis of different pore sizes. The cement pastes with higher water-to-cement ratios show an increase in capillary porosity, while for all the samples the intra-C-S-H and inter-C-S-H pores (also known as gel pores) remain unchanged. The technique can be applied to various porous materials with internal mineral surfaces.

  11. Measuring phonon mean free path distributions by probing quasiballistic phonon transport in grating nanostructures

    DOE PAGES

    Zeng, Lingping; Collins, Kimberlee C.; Hu, Yongjie; ...

    2015-11-27

    Heat conduction in semiconductors and dielectrics depends upon their phonon mean free paths that describe the average travelling distance between two consecutive phonon scattering events. Nondiffusive phonon transport is being exploited to extract phonon mean free path distributions. Here, we describe an implementation of a nanoscale thermal conductivity spectroscopy technique that allows for the study of mean free path distributions in optically absorbing materials with relatively simple fabrication and a straightforward analysis scheme. We pattern 1D metallic gratings of various line widths but fixed gap size on sample surfaces. The metal lines serve as both heaters and thermometers in time-domain thermoreflectance measurements and simultaneously act as wire-grid polarizers that protect the underlying substrate from direct optical excitation and heating. We demonstrate the viability of this technique by studying length-dependent thermal conductivities of silicon at various temperatures. The thermal conductivities measured with different metal line widths are analyzed using suppression functions calculated from the Boltzmann transport equation to extract the phonon mean free path distributions with no calibration required. Furthermore, this table-top ultrafast thermal transport spectroscopy technique enables the study of mean free path spectra in a wide range of technologically important materials.

  12. Characteristic Lifelength of Coherent Structure in the Turbulent Boundary Layer

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    2006-01-01

    A characteristic lifelength is defined by fitting a Gaussian distribution to data correlated over a three-sensor array sampling streamwise sidewall pressure. The data were acquired at subsonic, transonic and supersonic speeds aboard a Tu-144. Lifelengths are estimated using the cross spectrum and are shown to compare favorably with Efimtsov's prediction of correlation space scales. Lifelength distributions are computed in the time/frequency domain using an interval correlation technique on the continuous wavelet transform of the original time data. The median values of the lifelength distributions are found to be very close to the frequency-averaged result. The interval correlation technique is shown to allow the retrieval and inspection of the original time data of each event in the lifelength distribution, thus providing a means to locate and study the nature of coherent structure in the turbulent boundary layer. The lifelength data can be converted to lifetimes using the convection velocity. The lifetimes of events in the time/frequency domain are displayed in Lifetime Maps. The primary purpose of the paper is to validate these new analysis techniques so that they can be used with confidence to further characterize coherent structure in the turbulent boundary layer.

  13. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  14. Adaptive distributed source coding.

    PubMed

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  15. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves the 3D distribution in space, energy and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been flown on past and current satellite missions, and the global morphology of energetic ions was revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINAMA mission.

  16. A technique for estimating the absolute gain of a photomultiplier tube

    NASA Astrophysics Data System (ADS)

    Takahashi, M.; Inome, Y.; Yoshii, S.; Bamba, A.; Gunji, S.; Hadasch, D.; Hayashida, M.; Katagiri, H.; Konno, Y.; Kubo, H.; Kushida, J.; Nakajima, D.; Nakamori, T.; Nagayoshi, T.; Nishijima, K.; Nozaki, S.; Mazin, D.; Mashuda, S.; Mirzoyan, R.; Ohoka, H.; Orito, R.; Saito, T.; Sakurai, S.; Takeda, J.; Teshima, M.; Terada, Y.; Tokanai, F.; Yamamoto, T.; Yoshida, T.

    2018-06-01

    Detection of low-intensity light relies on the conversion of photons to photoelectrons, which are then multiplied and detected as an electrical signal. To measure the actual intensity of the light, one must know the factor by which the photoelectrons have been multiplied. To obtain this amplification factor, we have developed a procedure for estimating precisely the signal caused by a single photoelectron. The method utilizes the fact that the photoelectrons conform to a Poisson distribution. The average signal produced by a single photoelectron can then be estimated from the number of noise events, without requiring analysis of the distribution of the signal produced by a single photoelectron. The signal produced by one or more photoelectrons can be estimated experimentally without any assumptions. This technique, and an example of the analysis of a signal from a photomultiplier tube, are described in this study.
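
    The core of the procedure is the Poisson zero-event relation: if a fraction N0/N of pulsed-light events contains no photoelectron, the mean photoelectron number is mu = -ln(N0/N). A minimal sketch, assuming pedestal-subtracted charges and an illustrative noise cut:

```python
import numpy as np

def single_pe_signal(charges, noise_cut):
    """Mean single-photoelectron signal from a pulsed-light charge record.
    Events below `noise_cut` are counted as zero-photoelectron (noise-only)
    events; Poisson statistics give mu = -ln(N0/N), and the single-p.e.
    signal follows as <charge>/mu, with no model of the 1-p.e. shape."""
    charges = np.asarray(charges, dtype=float)
    n0 = np.count_nonzero(charges < noise_cut)
    mu = -np.log(n0 / len(charges))      # mean photoelectrons per event
    return charges.mean() / mu           # charges assumed pedestal-subtracted
```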

  17. Ultrasonic non invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity or particle size distribution of the medium where they take place, which makes their analysis possible using ultrasonic techniques. In this work, two different ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for detecting microbiological growth in milk is shown. Such a system may improve quality control strategies in food production factories, decreasing the time required to detect possible contamination in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques in combination with other conventional measurements, such as optical density, provides complementary information about the metabolism of these bacteria.

  18. Changes of the elemental distributions in marine diatoms as a reporter of sample preparation artefacts. A nuclear microscopy application

    NASA Astrophysics Data System (ADS)

    Godinho, R. M.; Cabrita, M. T.; Alves, L. C.; Pinheiro, T.

    2015-04-01

    Studies of the elemental composition of whole marine diatoms cells have high interest as they constitute a direct measurement of environmental changes, and allow anticipating consequences of anthropogenic alterations to organisms, ecosystems and global marine geochemical cycles. Nuclear microscopy is a powerful tool allowing direct measurement of whole cells giving qualitative imaging of distribution, and quantitative determination of intracellular concentration. Major obstacles to the analysis of marine microalgae are high medium salinity and the recurrent presence of extracellular exudates produced by algae to maintain colonies in natural media and in vitro. The objective of this paper was to optimize the methodology of sample preparation of marine unicellular algae for elemental analysis with nuclear microscopy, allowing further studies on cellular response to metals. Primary cultures of Coscinodiscus wailesii maintained in vitro were used to optimize protocols for elemental analysis with nuclear microscopy techniques. Adequate cell preparation procedures to isolate the cells from media components and exudates were established. The use of chemical agents proved to be inappropriate for elemental determination and for intracellular morphological analysis. The assessment of morphology and elemental partitioning in cell compartments obtained with nuclear microscopy techniques enabled to infer their function in natural environment and imbalances in exposure condition. Exposure to metal affected C. wailesii morphology and internal elemental distribution.

  19. Multiple Method Analysis of TiO2 Nanoparticle Uptake in Rice (Oryza sativa L.) Plants.

    PubMed

    Deng, Yingqing; Petersen, Elijah J; Challis, Katie E; Rabb, Savelas A; Holbrook, R David; Ranville, James F; Nelson, Bryant C; Xing, Baoshan

    2017-09-19

    Understanding the translocation of nanoparticles (NPs) into plants is challenging because qualitative and quantitative methods are still being developed and the comparability of results among different methods is unclear. In this study, uptake of titanium dioxide NPs and larger bulk particles (BPs) in rice plant (Oryza sativa L.) tissues was evaluated using three orthogonal techniques: electron microscopy, single-particle inductively coupled plasma mass spectrometry (spICP-MS) with two different plant digestion approaches, and total elemental analysis using ICP optical emission spectroscopy. In agreement with electron microscopy results, total elemental analysis of plants exposed to TiO2 NPs and BPs at 5 and 50 mg/L concentrations revealed that TiO2 NPs penetrated into the plant root and resulted in Ti accumulation in above-ground tissues at a higher level than BPs. spICP-MS analyses revealed that the size distributions of internalized particles differed between the NPs and BPs, with the NPs showing a distribution of smaller particles. Acid digestion resulted in higher particle numbers and the detection of a broader range of particle sizes than the enzymatic digestion approach, highlighting the need for development of robust plant digestion procedures for NP analysis. Overall, there was agreement among the three techniques regarding NP and BP penetration into rice plant roots, and spICP-MS made a unique contribution by providing size distribution information.

  20. Millisecond Microwave Spikes: Statistical Study and Application for Plasma Diagnostics

    NASA Astrophysics Data System (ADS)

    Rozhansky, I. V.; Fleishman, G. D.; Huang, G.-L.

    2008-07-01

    We analyze a dense cluster of solar radio spikes registered at 4.5-6 GHz by the Purple Mountain Observatory spectrometer (Nanjing, China), operating in the 4.5-7.5 GHz range with 5 ms temporal resolution. To handle the data from the spectrometer, we developed a new technique that uses a nonlinear multi-Gaussian spectral fit based on χ2 criteria to extract individual spikes from the originally recorded spectra. Applying this method to the raw experimental data, we identified about 3000 spikes for this event, which allows a detailed statistical analysis. Various statistical characteristics of the spikes have been evaluated, including the intensity distributions, the spectral bandwidth distributions, and the distribution of the spike mean frequencies. The most striking finding of this analysis is that the distributions of the spike bandwidth are remarkably asymmetric. To reveal the underlying microphysics, we explore the local-trap model with the renormalized theory of spectral profiles of the electron cyclotron maser (ECM) emission peak in a source with random magnetic irregularities. The distribution of the solar spike relative bandwidths calculated within the local-trap model represents an excellent fit to the experimental data. Accordingly, the developed technique may offer a new tool with which to study very low levels of magnetic turbulence in spike sources, once the ECM mechanism of the spike cluster is confirmed.
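
    A multi-Gaussian spectral fit of the kind described can be sketched with a standard least-squares routine; the snippet below is a simplified stand-in for the authors' χ2 pipeline, using synthetic data and illustrative initial guesses rather than the real spectrometer records.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def multi_gauss(f, *params):
        """Sum of Gaussian components; params = (amp, centre, width) triples."""
        model = np.zeros_like(f)
        for a, f0, w in zip(params[0::3], params[1::3], params[2::3]):
            model += a * np.exp(-0.5 * ((f - f0) / w) ** 2)
        return model

    # Synthetic stand-in for one recorded spectrum in the 4.5-7.5 GHz band
    rng = np.random.default_rng(0)
    freq = np.linspace(4.5, 7.5, 600)
    truth = [12.0, 4.9, 0.03, 7.0, 5.6, 0.05]
    flux = multi_gauss(freq, *truth) + rng.normal(0.0, 0.3, freq.size)

    p0 = [10.0, 4.8, 0.05, 6.0, 5.5, 0.05]      # initial (amp, centre, width)
    popt, _ = curve_fit(multi_gauss, freq, flux, p0=p0)
    print("fitted spectral widths:", popt[2::3])  # per-spike bandwidths
    ```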

  1. A study of residence time distribution using radiotracer technique in the large scale plant facility

    NASA Astrophysics Data System (ADS)

    Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.

    2017-06-01

    As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which can provide fast, online and effective detection of plant problems, have been continually developed. One promising application of radiotracers for troubleshooting in a process plant is the analysis of Residence Time Distribution (RTD). In this paper, a study of RTD in a large-scale plant facility using a radiotracer technique is presented. The objective of this work is to gain experience with RTD analysis using the radiotracer technique in a "larger than laboratory" scale plant setup comparable to a real industrial application. The experiment was carried out at the sedimentation tank in the water treatment facility of the Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected for this work because of its chemical properties, its suitable half-life and its on-site availability. NH4Br in the form of an aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank were calculated from the measured data. The experience and knowledge attained from this study are important for extending this technique to industrial facilities in the future.
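
    The RTD and MRT computation mentioned above follows the standard definitions E(t) = C(t)/∫C(t)dt and MRT = ∫t·E(t)dt. Below is a minimal numerical sketch of these formulas, with an idealized tracer curve standing in for the NaI detector data; it is not the facility's actual analysis code.

    ```python
    import numpy as np

    def mean_residence_time(t, c):
        """Mean residence time from a tracer response curve C(t):
        E(t) = C(t) / int C dt;  MRT = int t*E(t) dt  (trapezoidal rule)."""
        t, c = np.asarray(t, float), np.asarray(c, float)
        e = c / np.trapz(c, t)            # normalised residence time distribution
        return np.trapz(t * e, t)

    # Illustrative detector response downstream of the tank (assumed values)
    t = np.linspace(0, 120, 241)                    # minutes
    c = np.exp(-(t - 35.0) ** 2 / (2 * 8.0 ** 2))   # idealised tracer pulse
    print("MRT [min]:", mean_residence_time(t, c))
    ```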

  2. MANCOVA for one way classification with homogeneity of regression coefficient vectors

    NASA Astrophysics Data System (ADS)

    Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.

    2017-11-01

    MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional, or vector-valued, observations. The assumption of a Gaussian distribution is replaced with a multivariate Gaussian distribution for the observation vectors and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences between groups after adjusting for the covariates. When randomized assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting the dependent variables as if all subjects had scored the same on the covariates. In this research article, the MANCOVA technique is extended to a larger number of covariates, and the homogeneity of the regression coefficient vectors is also tested.
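
    A MANCOVA of this general kind can be run with standard statistical software by including the covariate(s) alongside the grouping factor in a multivariate model. The sketch below uses statsmodels on toy data and is only an illustration of the technique, not the authors' extended procedure; adding a group-by-covariate interaction term would be one way to probe the homogeneity-of-regression assumption the article tests.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Toy data: two dependent variables, one grouping factor, one covariate
    rng = np.random.default_rng(0)
    n = 90
    df = pd.DataFrame({
        "group": np.repeat(["A", "B", "C"], n // 3),
        "x": rng.normal(size=n),                 # covariate
    })
    df["y1"] = 0.8 * df["x"] + rng.normal(size=n)
    df["y2"] = 0.5 * df["x"] + rng.normal(size=n)

    # Including the covariate alongside the factor makes this a MANCOVA
    fit = MANOVA.from_formula("y1 + y2 ~ group + x", data=df)
    print(fit.mv_test())    # Wilks' lambda, Pillai's trace, etc. per term
    ```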

  3. The application of STEP-technology® for particle and protein dispersion detection studies in biopharmaceutical research.

    PubMed

    Gross-Rother, J; Herrmann, N; Blech, M; Pinnapireddy, S R; Garidel, P; Bakowsky, U

    2018-05-30

    Particle detection and analysis techniques are essential in the biopharmaceutical industry to evaluate the quality of parenteral formulations with regard to product safety and quality and to meet the regulations set by regulatory agencies. Several particle analysis systems are available on the market, but it is quite challenging for the operator to identify the suitable method for a given sample. At the same time, these techniques are the basis for a better understanding of biophysical processes, e.g. protein interaction and aggregation. STEP-Technology® (Space and Time resolved Extinction Profiles), as used in the analytical photocentrifuge LUMiSizer®, has been shown to be an effective and promising technique for investigating particle suspensions and emulsions in various fields. In this study, we evaluated the potential and limitations of this technique for biopharmaceutical model samples. As a first experimental approach, we measured silica and polystyrene (PS) particle standard suspensions with known particle density and refractive index (RI). The subsequent evaluation was performed using a variety of relevant data sets to demonstrate the significant influence of the particle density on the final particle size distribution (PSD). Turbidity was identified as the most challenging property for successful detection, and limits were set based on the absorbance value at 320 nm (A320). Furthermore, we produced chemically cross-linked protein particle suspensions to model physically "stable" protein aggregates. The results of the LUMiSizer® analysis were compared with the orthogonal methods of nanoparticle tracking analysis (NTA), dynamic light scattering (DLS) and micro-flow imaging (MFI). Sedimentation velocity distributions showed similar tendencies, but the PSDs and absolute size values could not be obtained. In conclusion, we demonstrated several applications as well as limitations of this technique for biopharmaceutical samples. In comparison with orthogonal methods, this technique is a valuable complementary approach if particle properties such as density or refractive index can be determined. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Moisture analysis from radiosonde and microwave spectrometer data

    NASA Technical Reports Server (NTRS)

    Haydu, K. J.; Krishnamurti, T. N.

    1981-01-01

    A method for analyzing the horizontal and vertical distributions of the moisture field utilizing satellite, upper-air and surface data is proposed in this paper. A brief overview of the microwave sensors on board Nimbus 5 and 6 is also presented. A technique is provided that utilizes the radiosonde data sets to calibrate the satellite field of total precipitable water. Next, the calibrated satellite-derived field is utilized, along with ship and coastal reports of moisture and a vertical structure function, to generate the vertical distribution of moisture and thus provide a mapping of specific humidity at several levels in the troposphere. Using these procedures, analyses for several case studies were performed. The resultant maps show the detailed distribution of specific humidity along with some interesting climatological features. A reasonable acceptance of the available aerological data sets by the analysis scheme is demonstrated.

  5. Principal Components Analysis on the spectral Bidirectional Reflectance Distribution Function of ceramic colour standards.

    PubMed

    Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A

    2011-09-26

    The Bidirectional Reflectance Distribution Function (BRDF) is essential for characterizing an object's reflectance properties. This function depends both on the illumination-observation geometry and on the wavelength. As a result, comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariate analysis technique of Principal Components Analysis (PCA) applied to experimental BRDF data of a ceramic colour standard. It is shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample. © 2011 Optical Society of America
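
    A minimal sketch of PCA applied to a geometry-by-wavelength BRDF matrix is given below, using a mean-centred singular value decomposition; the synthetic data and the three-process structure are illustrative assumptions, not the measured ceramic-standard data.

    ```python
    import numpy as np

    # brdf: rows = illumination/observation geometries, cols = wavelengths
    # (synthetic stand-in; real data would come from gonio-spectrophotometry)
    rng = np.random.default_rng(1)
    geom, wl = 200, 36
    spectra = rng.random((3, wl))               # three underlying "processes"
    weights = rng.random((geom, 3))             # geometry-dependent mixing
    brdf = weights @ spectra + 0.01 * rng.normal(size=(geom, wl))

    mean = brdf.mean(axis=0)
    u, s, vt = np.linalg.svd(brdf - mean, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("variance captured by first 3 PCs:", explained[:3].sum())
    # rows of vt are the principal spectral components; u*s gives the
    # geometry-dependent scores that can be interpolated across angles
    ```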

  6. Combined evaluation of grazing incidence X-ray fluorescence and X-ray reflectivity data for improved profiling of ultra-shallow depth distributions

    PubMed Central

    Ingerle, D.; Meirer, F.; Pepponi, G.; Demenev, E.; Giubertoni, D.; Wobrauschek, P.; Streli, C.

    2014-01-01

    The continuous downscaling of the process size for semiconductor devices pushes the junction depths, and consequently the implantation depths, to the top few nanometers of the Si substrate. This motivates the need for sensitive methods capable of analyzing dopant distribution, total dose and possible impurities. X-ray techniques utilizing the external reflection of X-rays are very surface sensitive, hence providing a non-destructive tool for process analysis and control. X-ray reflectometry (XRR) is an established technique for the characterization of single- and multi-layered thin film structures with layer thicknesses in the nanometer range. XRR spectra are acquired by varying the incident angle in the grazing incidence regime while measuring the specularly reflected X-ray beam. The shape of the resulting angle-dependent curve is correlated with changes of the electron density in the sample, but does not provide direct information on the presence or distribution of chemical elements in the sample. Grazing incidence XRF (GIXRF) measures the X-ray fluorescence induced by an X-ray beam incident at grazing angles. The resulting angle-dependent intensity curves are correlated with the depth distribution and mass density of the elements in the sample. GIXRF provides information on contaminations, total implanted dose and, to some extent, on the depth of the dopant distribution, but is ambiguous with regard to the exact distribution function. Both techniques use similar measurement procedures and data evaluation strategies, i.e. optimization of a sample model by fitting measured and calculated angle curves. Moreover, the applied sample models can be derived from the same physical properties, like atomic scattering/form factors and elemental concentrations; a simultaneous analysis is therefore a straightforward approach. This combined analysis in turn reduces the uncertainties of the individual techniques, allowing a determination of the dose and depth profile of the implanted elements with a drastically increased confidence level. Silicon wafers implanted with arsenic at different implantation energies were measured by XRR and GIXRF using a combined, simultaneous measurement and data evaluation procedure. The data were processed using a self-developed software package (JGIXA), designed for simultaneous fitting of GIXRF and XRR data. The results were compared with depth profiles obtained by Secondary Ion Mass Spectrometry (SIMS). PMID:25202165

  7. Testing of stack-unit/aquifer sensitivity analysis using contaminant plume distribution in the subsurface of Savannah River Site, South Carolina, USA

    USGS Publications Warehouse

    Rine, J.M.; Shafer, J.M.; Covington, E.; Berg, R.C.

    2006-01-01

    Published information on the correlation and field-testing of the technique of stack-unit/aquifer sensitivity mapping with documented subsurface contaminant plumes is rare. The inherent characteristic of stack-unit mapping, which makes it a superior technique to other analyses that amalgamate data, is the ability to deconstruct the sensitivity analysis on a unit-by-unit basis. An aquifer sensitivity map, delineating the relative sensitivity of the Crouch Branch aquifer of the Administrative/Manufacturing Area (A/M) at the Savannah River Site (SRS) in South Carolina, USA, incorporates six hydrostratigraphic units, surface soil units, and relevant hydrologic data. When this sensitivity map is compared with the distribution of the contaminant tetrachloroethylene (PCE), PCE is present within the Crouch Branch aquifer within an area classified as highly sensitive, even though the PCE was primarily released on the ground surface within areas classified with low aquifer sensitivity. This phenomenon is explained through analysis of the aquifer sensitivity map, the groundwater potentiometric surface maps, and the plume distributions within the area on a unit-by-unit basis. The results of this correlation show how the paths of the PCE plume are influenced by both the geology and the groundwater flow. © Springer-Verlag 2006.

  8. Determination of morphological parameters of biological cells by analysis of scattered-light distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, D.E.

    1979-11-01

    The extraction of morphological parameters from biological cells by analysis of light-scatter patterns is described. A light-scattering measurement system has been designed and constructed that allows one to visually examine and photographically record biological cells or cell models and measure the light-scatter pattern of an individual cell or cell model. Using a laser or conventional illumination, the imaging system consists of a modified microscope with a 35 mm camera attached to record the cell image or light-scatter pattern. Models of biological cells were fabricated. The dynamic range and angular distributions of light scattered from these models were compared to calculated distributions. Spectrum analysis techniques applied to the light-scatter data give the sought-after morphological cell parameters. These results compared favorably with the shape parameters of the fabricated cell models, confirming the mathematical modeling procedure. For nucleated biological material, correct nuclear and cell eccentricity as well as the nuclear and cytoplasmic diameters were determined. A method for comparing the flow equivalent of nuclear and cytoplasmic size to the actual dimensions is shown. This light-scattering experiment provides baseline information for automated cytology. In its present application, it involves correlating average size as measured in flow cytology to the actual dimensions determined from this technique. (ERB)

  9. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage of this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This is therefore a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general-purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
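
    The core idea of comparing new measurements against a self-built, nonparametric background distribution can be sketched very simply; the snippet below is a deliberately reduced illustration (a single channel and empirical p-values) rather than the full multi-channel Bayesian algorithm the record describes, and all names and numbers are assumptions.

    ```python
    import numpy as np

    class BackgroundModel:
        """Empirical background distribution: accumulate count rates in a
        'learning mode', then score new measurements in a 'detection mode'."""

        def __init__(self):
            self.samples = []

        def learn(self, count_rate):
            self.samples.append(count_rate)       # update background model

        def p_value(self, count_rate):
            """Fraction of stored background at or above the new measurement."""
            return float(np.mean(np.asarray(self.samples) >= count_rate))

    rng = np.random.default_rng(1)
    bg = BackgroundModel()
    for r in rng.poisson(50, size=5000):          # learning mode
        bg.learn(r)
    print("p-value of 65 cps:", bg.p_value(65))   # small value flags a source
    ```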

  10. Particle sizing of pharmaceutical aerosols via direct imaging of particle settling velocities.

    PubMed

    Fishler, Rami; Verhoeven, Frank; de Kruijf, Wilbur; Sznitman, Josué

    2018-02-15

    We present a novel method for characterizing, in near real time, the aerodynamic particle size distributions from pharmaceutical inhalers. The proposed method is based on direct imaging of airborne particles followed by a particle-by-particle measurement of settling velocities using image analysis and particle tracking algorithms. Owing to the simplicity of its principle of operation, this method has the potential to circumvent biases of current real-time particle analyzers (e.g. Time-of-Flight analysis) while offering a cost-effective solution. The simple device can also be constructed in laboratory settings from off-the-shelf materials for research purposes. To demonstrate the feasibility and robustness of the measurement technique, we conducted benchmark experiments in which aerodynamic particle size distributions were obtained from several commercially available dry powder inhalers (DPIs). Our measurements yield size distributions (i.e. MMAD and GSD) that are closely in line with those obtained from Time-of-Flight analysis and cascade impactors, suggesting that our imaging-based method may embody an attractive methodology for rapid inhaler testing and characterization. In a final step, we discuss some of the ongoing limitations of the current prototype and conceivable routes for improving the technique. Copyright © 2017 Elsevier B.V. All rights reserved.
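
    In the Stokes regime, the aerodynamic diameter follows from the measured settling velocity as d_a = sqrt(18 μ v_s / (ρ0 g)) with unit density ρ0 = 1000 kg/m³. The sketch below applies this textbook relation to illustrative tracked velocities; it omits the Cunningham slip correction and is not the authors' processing code.

    ```python
    import numpy as np

    MU_AIR = 1.81e-5   # dynamic viscosity of air [Pa s] at ~20 C
    RHO_0 = 1000.0     # unit density [kg/m^3] defining aerodynamic diameter
    G = 9.81           # gravitational acceleration [m/s^2]

    def aerodynamic_diameter(v_settle):
        """Stokes-regime aerodynamic diameter [m] from settling velocity [m/s].
        Neglects the Cunningham slip correction, which matters below ~1 um."""
        return np.sqrt(18.0 * MU_AIR * np.asarray(v_settle) / (RHO_0 * G))

    v = np.array([2.7e-4, 1.1e-3, 4.3e-3])       # tracked velocities [m/s]
    print(aerodynamic_diameter(v) * 1e6, "um")   # roughly 3, 6, 12 um
    ```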

  11. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    NASA Astrophysics Data System (ADS)

    Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A.; Ghosh, Sujit K.; Montet, Benjamin T.; Newton, Elisabeth R.

    2017-05-01

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.

  12. High-spatial resolution multispectral and panchromatic satellite imagery for mapping perennial desert plants

    NASA Astrophysics Data System (ADS)

    Alsharrah, Saad A.; Bruce, David A.; Bouabid, Rachid; Somenahalli, Sekhar; Corcoran, Paul A.

    2015-10-01

    The use of remote sensing techniques to extract vegetation cover information for the assessment and monitoring of land degradation in arid environments has gained increased interest in recent years. However, such a task can be challenging, especially for medium-spatial-resolution satellite sensors, due to soil background effects and the distribution and structure of perennial desert vegetation. In this study, we utilised Pleiades high-spatial-resolution multispectral (2 m) and panchromatic (0.5 m) imagery and focused on mapping small shrubs and low-lying trees using three classification techniques: 1) vegetation index (VI) threshold analysis, 2) pre-built object-oriented image analysis (OBIA), and 3) a purpose-developed vegetation shadow model (VSM). We evaluated the success of each approach using a root of the sum of the squares (RSS) metric, which incorporated field data as control and three error metrics relating to commission, omission, and percent cover. Results showed that the best-performing VIs returned good vegetation cover estimates at certain thresholds but failed to accurately map the distribution of the desert plants. Using the pre-built IMAGINE Objective OBIA approach, we improved the vegetation distribution mapping accuracy, but this came at the cost of over-classification, similar to the results of lowering VI thresholds. We further introduced the VSM, which takes shadow into account to further refine the vegetation cover classification derived from VIs. The results showed significant improvements in vegetation cover and distribution accuracy compared with the other techniques. We argue that the VSM approach using high-spatial-resolution imagery provides a more accurate representation of desert landscape vegetation and should be considered in assessments of desertification.

  13. Photoelastic analysis of mandibular full-arch implant-supported fixed dentures made with different bar materials and manufacturing techniques.

    PubMed

    Zaparolli, Danilo; Peixoto, Raniel Fernandes; Pupim, Denise; Macedo, Ana Paula; Toniollo, Marcelo Bighetti; Mattos, Maria da Glória Chiarello de

    2017-12-01

    To compare the stress distribution of implant-supported mandibular full dentures according to the bar materials and manufacturing techniques using a qualitative photoelastic analysis. An acrylic master model simulating the mandibular arch was fabricated with four Morse taper implant analogs of 4.5 × 6 mm. Four different bars were manufactured according to different materials and techniques: fiber-reinforced resin (G1, Trinia, CAD/CAM), commercially pure titanium (G2, cpTi, CAD/CAM), cobalt‑chromium (G3, Co-Cr, CAD/CAM) and cobalt‑chromium (G4, Co-Cr, conventional casting). Standard clinical and laboratory procedures were used by an experienced dental technician to fabricate 4 mandibular implant-supported dentures. The photoelastic model was created based on the acrylic master model. A load simulation (150 N) was performed in total occlusion against the antagonist. Dentures with the fiber-reinforced resin bar (G1) exhibited a better stress distribution. Dentures with the machined Co-Cr bar (G3) exhibited the worst pattern of stress distribution, with an overload on the distal part of the posterior implants, followed by dentures with the cast Co-Cr bar (G4) and the machined cpTi bar (G2). The fiber-reinforced resin bar exhibited an adequate stress distribution and can serve as a viable alternative for oral rehabilitation with implant-supported mandibular full dentures. Moreover, the use of the G1 bar offered advantages including reduced weight and less possible overload of the implant components, leading to the preservation of the supporting structure. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Social Data Analysis by Non-Linear Imbedding

    DTIC Science & Technology

    2013-09-20

    Fig. 1 shows this dimension-reduced galaxy. This example is chosen to illustrate how our "history independent" techniques can infer major historical events. [The remainder of the record is fragmentary; a partial table of topic frequencies per year (1989/1990/1991) survives: Middle East 2/7/6, Weapon Nonproliferation 2/6/5, Anti-Apartheid & Human Rights ..., with references to the Treaty on the Non-Proliferation of Nuclear Weapons and resolution #3570 (Status of the International Convention on the Suppression and Punishment of the Crime of Apartheid).]

  15. Pipe and Solids Analysis: What Can I Learn?

    EPA Science Inventory

    This presentation gives a brief overview of techniques that regulators, utilities and consultants might want to request from laboratories to anticipate or solve water treatment and distribution system water quality problems. Actual examples will be given from EPA collaborations,...

  16. Image analysis for the automated estimation of clonal growth and its application to the growth of smooth muscle cells.

    PubMed

    Gavino, V C; Milo, G E; Cornwell, D G

    1982-03-01

    Image analysis was used for the automated measurement of colony frequency (f) and colony diameter (d) in cultures of smooth muscle cells. Initial studies with the inverted microscope showed that the number of cells (N) in a colony varied directly with d: log N = 1.98 log d − 3.469. Image analysis generated the complement of a cumulative distribution for f as a function of d. The number of cells in each segment of the distribution function was calculated by multiplying f by the average N for the segment. These data were displayed as a cumulative distribution function. The total number of colonies (fT) and the total number of cells (NT) were used to calculate the average colony size (NA). Population doublings (PD) were then expressed as log2(NA). Image analysis confirmed previous studies in which colonies were sized and counted with an inverted microscope. Thus, image analysis is a rapid and automated technique for the measurement of clonal growth.
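
    The reported fit log N = 1.98 log d − 3.469 makes the downstream arithmetic easy to reproduce: convert each diameter bin to a cell count, weight by colony frequency, and take log2 of the average colony size to obtain population doublings. The sketch below uses hypothetical bin data purely for illustration.

    ```python
    import numpy as np

    def cells_per_colony(d):
        """Cells per colony from colony diameter, using the reported fit
        log N = 1.98 log d - 3.469 (log base 10; d in the authors' units)."""
        return 10.0 ** (1.98 * np.log10(d) - 3.469)

    # f[i] colonies observed in each diameter bin with mean diameter d[i]
    d = np.array([100.0, 200.0, 400.0, 800.0])    # hypothetical bin centres
    f = np.array([40, 25, 10, 3])                 # hypothetical frequencies

    n_total = np.sum(f * cells_per_colony(d))     # total cells over all colonies
    n_avg = n_total / f.sum()                     # average colony size N_A
    doublings = np.log2(n_avg)                    # population doublings PD
    print(f"N_A = {n_avg:.1f}, PD = {doublings:.2f}")
    ```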

  17. An Integrated Approach for the Assessment of the Natural and Anthropogenic Controls on Land Subsidence in the Kingdom of Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Othman, A.; Sultan, M.; Ahmed, M.; Alharbi, T.; Gebremichael, E.; Emil, M.

    2015-12-01

    Recent land subsidence incidents in the Kingdom of Saudi Arabia (KSA) have resulted in losses of life and property. In this study, an integrated approach is adopted to accomplish the following: (1) map the spatial distribution of areas that are witnessing land subsidence, (2) quantify the rates of land subsidence, and (3) identify the factors causing the observed subsidence. A three-fold approach is applied: (1) use of interferometric techniques to assess the spatial distribution of land subsidence and to quantify the rates of subsidence, (2) generation of a GIS database to encompass all relevant data and derived products, and (3) correlation of findings from the radar exercise with relevant spatial and temporal datasets (e.g., remote sensing, geology, fluid extraction rates, distribution of urban areas, etc.). Three main areas were selected: (1) the central and northern parts of the KSA, (2) areas surrounding the Ghawar oil/gas field, and (3) the Harrat Lunayyir volcanic field. Applications of two-pass, three-pass, and SBAS radar interferometric techniques over central KSA revealed the following: (1) subsidence rates of up to -15 mm/yr were detected; the spatial distributions of the subsided areas extracted using the various interferometric techniques are similar, (2) subsided areas correlated spatially with the distribution of: (a) areas with high groundwater extraction rates, as evidenced from the analysis of field and Gravity Recovery and Climate Experiment (GRACE) data, (b) agricultural plantations, as evidenced from the analysis of field and temporal Landsat data, (c) urban areas (e.g., Buraydah City), and (d) outcrops of carbonate and anhydrite formations (e.g., Khuff and Jilh formations), and (3) subsidence could be related to more than one parameter. Similar research activities are underway in northern KSA and in areas surrounding the Ghawar oil/gas field and the Harrat Lunayyir volcanic field to assess the distribution of, and factors controlling, land deformation in those areas.

  18. Iterative Monte Carlo analysis of spin-dependent parton distributions

    DOE PAGES

    Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; ...

    2016-04-05

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  19. Dual FIB-SEM 3D imaging and lattice Boltzmann modeling of porosimetry and multiphase flow in chalk.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rinehart, Alex; Petrusak, Robin; Heath, Jason E.

    2010-12-01

    Mercury intrusion porosimetry (MIP) is an often-applied technique for determining pore throat distributions and for seal analysis of fine-grained rocks. Due to closure effects, potential pore collapse, and complex pore network topologies, MIP data interpretation can be ambiguous, and often biased toward smaller pores in the distribution. We apply 3D imaging techniques and lattice Boltzmann modeling in interpreting MIP data for samples of the Cretaceous Selma Group Chalk. In the Mississippi Interior Salt Basin, the Selma Chalk is the apparent seal for oil and gas fields in the underlying Eutaw Fm., and, where unfractured, the Selma Chalk is one of the regional-scale seals identified by the Southeast Regional Carbon Sequestration Partnership for CO2 injection sites. Dual focused ion beam - scanning electron beam and laser scanning confocal microscopy methods are used for 3D imaging of nanometer-to-micron scale microcrack and pore distributions in the Selma Chalk. A combination of image analysis software is used to obtain geometric pore body and throat distributions and other topological properties, which are compared to MIP results. 3D data sets of pore-microfracture networks are used in lattice Boltzmann simulations of drainage (wetting fluid displaced by non-wetting fluid via the Shan-Chen algorithm), which in turn are used to model MIP procedures. The results are used in interpreting MIP data, understanding microfracture-matrix interaction during multiphase flow, and seal analysis for underground CO2 storage.

  20. Column ratio mapping: a processing technique for atomic resolution high-angle annular dark-field (HAADF) images.

    PubMed

    Robb, Paul D; Craven, Alan J

    2008-12-01

    An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated measurement of atomic column intensity ratios in high-resolution HAADF images. It was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique, as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende structured AlAs/GaAs superlattice using the 1 angstrom-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.

  1. Applications of Bayesian Statistics to Problems in Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1997-01-01

    This presentation will describe two applications of Bayesian statistics to Gamma-Ray Bursts (GRBs). The first attempts to quantify the evidence for a cosmological versus galactic origin of GRBs using only the observations of the dipole and quadrupole moments of the angular distribution of bursts. The cosmological hypothesis predicts isotropy, while the galactic hypothesis is assumed to produce a uniform probability distribution over positive values for these moments. The observed isotropic distribution indicates that the Bayes factor for the cosmological hypothesis over the galactic hypothesis is about 300. Another application of Bayesian statistics is in the estimation of chance associations of optical counterparts with galaxies. The Bayesian approach is preferred to frequentist techniques here because it easily accounts for galaxy mass distributions and because one can incorporate three disjoint hypotheses: (1) bursts come from galactic centers, (2) bursts come from galaxies in proportion to luminosity, and (3) bursts do not come from external galaxies. This technique was used in the analysis of the optical counterpart to GRB 970228.

  2. Probabilistic distance-based quantizer design for distributed estimation

    NASA Astrophysics Data System (ADS)

    Kim, Yoon Hak

    2016-12-01

    We consider an iterative design of independently operating local quantizers at nodes that must cooperate without interaction to achieve application objectives for distributed estimation systems. As a new cost function we suggest a probabilistic distance between the posterior distribution and its quantized version, expressed as the Kullback-Leibler (KL) divergence. We first show that minimizing the KL divergence in the cyclic generalized Lloyd design framework is equivalent to maximizing the logarithm of the quantized posterior distribution on average, which can be further simplified computationally in our iterative design. We propose an iterative design algorithm that seeks to maximize this simplified version of the quantized posterior distribution, and we show that our algorithm converges to a global optimum, due to the convexity of the cost function, and generates the most informative quantized measurements. We also provide an independent encoding technique that enables minimization of the cost function and can be efficiently simplified for practical use in power-constrained nodes. Finally, we demonstrate through extensive experiments a clear advantage in estimation performance compared with typical designs and with novel design techniques previously published.

  3. Elemental Analysis in Biological Matrices Using ICP-MS.

    PubMed

    Hansen, Matthew N; Clogston, Jeffrey D

    2018-01-01

    The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.

  4. A comprehensive analysis of the performance characteristics of the Mount Laguna solar photovoltaic installation

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Sollock, S. G.

    1981-01-01

    This paper represents the first comprehensive survey of the Mount Laguna Photovoltaic Installation. The novel techniques used for performing the field tests have been effective in locating and characterizing defective modules. A comparative analysis on the two types of modules used in the array indicates that they have significantly different failure rates, different distributions in degradational space and very different failure modes. A life cycle model is presented to explain a multimodal distribution observed for one module type. A statistical model is constructed and it is shown to be in good agreement with the field data.

  5. Progress in characterizing submonolayer island growth: Capture-zone distributions, growth exponents, & hot precursors

    NASA Astrophysics Data System (ADS)

    Einstein, Theodore L.; Pimpinelli, Alberto; González, Diego Luis; Morales-Cifuentes, Josue R.

    2015-09-01

    In studies of epitaxial growth, analysis of the distribution of the areas of capture zones (i.e. proximity polygons or Voronoi tessellations with respect to island centers) is often the best way to extract the critical nucleus size i. For non-random nucleation the normalized areas s of these Voronoi cells are well described by the generalized Wigner distribution (GWD) Pβ(s) = a s^β exp(−b s²), particularly in the central region 0.5 < s < 2 where data are least noisy. Extensive Monte Carlo simulations reveal inadequacies of our earlier mean-field analysis, suggesting β = i + 2 for diffusion-limited aggregation (DLA). Since simulations generate orders of magnitude more data than experiments, they permit close examination of the tails of the distribution, which differ from the simple GWD form. One refinement is based on a fragmentation model. We also compare island-size distributions, contrasting analysis by island-size distribution with analysis by scaling of island density with flux. Modifications appear for attach-limited aggregation (ALA). We focus on the experimental system para-hexaphenyl on amorphous mica, comparing the results of the three analysis techniques and reconciling them via a novel model of hot precursors based on rate equations, pointing out the existence of intermediate scaling regimes between DLA and ALA.
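
    Fitting the GWD to measured capture-zone areas is straightforward by maximum likelihood once the constants are fixed by the unit-area and unit-mean constraints (a standard choice, assumed here: b = [Γ((β+2)/2)/Γ((β+1)/2)]² and a = 2b^((β+1)/2)/Γ((β+1)/2)). The sketch below fits β on synthetic stand-in data; it illustrates the distribution only and is not the authors' simulation analysis.

    ```python
    import numpy as np
    from scipy.special import gammaln
    from scipy.optimize import minimize_scalar

    def gwd_nll(beta, s):
        """Negative log-likelihood of the generalized Wigner distribution
        P(s) = a s^beta exp(-b s^2), normalised to unit area and unit mean."""
        lg1 = gammaln((beta + 1) / 2.0)
        lg2 = gammaln((beta + 2) / 2.0)
        b = np.exp(2.0 * (lg2 - lg1))          # unit-mean constraint
        ln_a = np.log(2.0) + (beta + 1) / 2.0 * np.log(b) - lg1
        return -np.sum(ln_a + beta * np.log(s) - b * s**2)

    # s: capture-zone areas scaled to unit mean (synthetic placeholder here)
    rng = np.random.default_rng(3)
    s = rng.gamma(4.0, 0.25, size=20000)       # stand-in for measured areas
    res = minimize_scalar(gwd_nll, bounds=(0.5, 10.0), args=(s,),
                          method="bounded")
    print("fitted beta:", res.x)               # compare with beta = i + 2
    ```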

  6. Application of Monte Carlo techniques to transient thermal modeling of cavity radiometers having diffuse-specular surfaces

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.; Eskin, L. D.

    1981-01-01

    A viable alternative to the net exchange method of radiative analysis, equally applicable to diffuse and diffuse-specular enclosures, is presented. It is particularly advantageous compared with the net exchange method in the case of a transient thermal analysis involving conduction and storage of energy as well as radiative exchange. A new quantity, called the distribution factor, is defined, which replaces the angle factor and the configuration factor. Once obtained, the array of distribution factors for an ensemble of surface elements defining an enclosure permits the instantaneous net radiative heat fluxes to all of the surfaces to be computed directly in terms of the known surface temperatures at that instant. The formulation of the thermal model is described, as is the determination of distribution factors by application of a Monte Carlo analysis. The results show that when fewer than 10,000 packets are emitted, an unsatisfactory approximation for the distribution factors is obtained, but that 10,000 packets is sufficient.

  7. Improving alpine-region spectral unmixing with optimal-fit snow endmembers

    NASA Technical Reports Server (NTRS)

    Painter, Thomas H.; Roberts, Dar A.; Green, Robert O.; Dozier, Jeff

    1995-01-01

    Surface albedo and snow-covered-area (SCA) are crucial inputs to the hydrologic and climatologic modeling of alpine and seasonally snow-covered areas. Because the spectral albedo and thermal regime of pure snow depend on grain size, areal distribution of snow grain size is required. Remote sensing has been shown to be an effective (and necessary) means of deriving maps of grain size distribution and snow-covered-area. Developed here is a technique whereby maps of grain size distribution improve estimates of SCA from spectral mixture analysis with AVIRIS data.

  8. Counterfactual Quantum Deterministic Key Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng; Wang, Jian; Tang, Chao-Jing

    2013-01-01

    We propose a new counterfactual quantum cryptography protocol concerned with distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended; therefore, one can distribute both deterministic and random keys using our protocol. We also give a simple proof of the security of our protocol using the technique we previously applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.

  9. Temporal Methods to Detect Content-Based Anomalies in Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.

    Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
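
    The information-theoretic component can be illustrated with a smoothed Kullback-Leibler divergence between the term distribution of a candidate time window and that of the normal background; the snippet below is a toy version of that idea, with made-up token streams, not the authors' Twitter pipeline.

    ```python
    import numpy as np
    from collections import Counter

    def kl_score(window_terms, background_terms, alpha=1.0):
        """KL divergence D(window || background) between additively smoothed
        term frequency distributions; larger values flag anomalous periods."""
        vocab = set(window_terms) | set(background_terms)
        wc, bc = Counter(window_terms), Counter(background_terms)
        p = np.array([wc[t] + alpha for t in vocab], float)
        q = np.array([bc[t] + alpha for t in vocab], float)
        p /= p.sum()
        q /= q.sum()
        return float(np.sum(p * np.log(p / q)))

    normal = "the game starts tonight go team".split() * 50
    window = "breaking outage outage server down down down".split() * 10
    print(kl_score(window, normal))   # large divergence -> candidate anomaly
    ```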

  10. Automated selection of BI-RADS lesion descriptors for reporting calcifications in mammograms

    NASA Astrophysics Data System (ADS)

    Paquerault, Sophie; Jiang, Yulei; Nishikawa, Robert M.; Schmidt, Robert A.; D'Orsi, Carl J.; Vyborny, Carl J.; Newstead, Gillian M.

    2003-05-01

    We are developing an automated computer technique to describe calcifications in mammograms according to the BI-RADS lexicon. We evaluated this technique by its agreement with radiologists' description of the same lesions. Three expert mammographers reviewed our database of 90 cases of digitized mammograms containing clustered microcalcifications and described the calcifications according to BI-RADS. In our study, the radiologists used only 4 of the 5 calcification distribution descriptors and 5 of the 14 calcification morphology descriptors contained in BI-RADS. Our computer technique was therefore designed specifically for these 4 calcification distribution descriptors and 5 calcification morphology descriptors. For calcification distribution, 4 linear discriminant analysis (LDA) classifiers were developed using 5 computer-extracted features to produce scores of how well each descriptor describes a cluster. Similarly, for calcification morphology, 5 LDAs were designed using 10 computer-extracted features. We trained the LDAs using only the BI-RADS data reported by the first radiologist and compared the computer output to the descriptor data reported by all 3 radiologists (for the first radiologist, the leave-one-out method was used). The computer output consisted of the best calcification distribution descriptor and the best 2 calcification morphology descriptors. The results of the comparison with the data from each radiologist, respectively, were: for calcification distribution, percent agreement, 74%, 66%, and 73%, kappa value, 0.44, 0.36, and 0.46; for calcification morphology, percent agreement, 83%, 77%, and 57%, kappa value, 0.78, 0.70, and 0.44. These results indicate that the proposed computer technique can select BI-RADS descriptors in good agreement with radiologists.

  11. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    PubMed

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.

  12. A review of second law techniques applicable to basic thermal science research

    NASA Astrophysics Data System (ADS)

    Drost, M. Kevin; Zamorski, Joseph R.

    1988-11-01

    This paper reports the results of a review of second law analysis techniques which can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and location of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information of the source and nature of irreversibilities on a macroscopic scale. This information will help to identify new research topics and will support the evaluation of current research efforts. Second, the differential approach can provide information on the causes and spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.

  13. Single-phase power distribution system power flow and fault analysis

    NASA Technical Reports Server (NTRS)

    Halpin, S. M.; Grigsby, L. L.

    1992-01-01

    Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
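
    For contrast with the generalized formulation described above, a conventional nodal admittance solve for a tiny single-phase feeder looks like the sketch below; note that it assumes the common voltage reference that the generalized admittance matrix avoids, and all branch impedances and injections are made-up values.

    ```python
    import numpy as np

    # Small single-phase feeder: node 0 = source, branches (from, to, impedance)
    branches = [(0, 1, 0.05 + 0.10j), (1, 2, 0.08 + 0.15j), (1, 3, 0.06 + 0.12j)]
    n = 4
    Y = np.zeros((n, n), complex)
    for i, j, z in branches:
        y = 1.0 / z
        Y[i, i] += y; Y[j, j] += y   # diagonal: sum of connected admittances
        Y[i, j] -= y; Y[j, i] -= y   # off-diagonal: negative branch admittance

    V0 = 1.0 + 0j                                         # slack-node voltage
    I_load = np.array([0.0, -0.3 + 0.1j, -0.2 + 0.05j])   # injections, nodes 1-3
    # Partition out the slack node and solve Y_rr V_r = I_r - Y_r0 V0
    V_rest = np.linalg.solve(Y[1:, 1:], I_load - Y[1:, 0] * V0)
    print("node voltages:", V_rest)
    ```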

  14. The use of various X-ray fluorescence analysis modalities for the investigation of historical paintings: The case study on the Late Gothic panel painting

    NASA Astrophysics Data System (ADS)

    Bártová, H.; Trojek, T.; Čechák, T.; Šefců, R.; Chlumská, Š.

    2017-10-01

    Heavy chemical elements in old pigments can be identified in historical paintings using X-ray fluorescence analysis (XRF). This is a non-destructive analytical method frequently used in the examination of objects that require in situ analysis, where it is necessary to avoid damaging the object by taking samples. Different modalities are available, such as microanalysis, scanning of selected areas, or depth profiling techniques. Surface scanning is particularly valuable since 2D element distribution maps are much easier to interpret than the results of individual analyses. Information on the layered structure of the painting can also be obtained with handheld portable systems. The results presented in our paper combine 2D element distribution maps obtained by scanning analysis with depth profiling using conventional XRF. The latter is very suitable for objects of art, as it can be evaluated from data measured with a portable XRF device. Depth profiling by conventional XRF is based on the differences in X-ray absorption in paint layers. The XRF technique was applied to the analysis of panel paintings of the Master of the St George Altarpiece, who was active in Prague in the 1470s and 1480s. The results were evaluated by taking micro-samples and performing a material analysis.

  15. Ambient ionisation mass spectrometry for in situ analysis of intact proteins

    PubMed Central

    Kocurek, Klaudia I.; Griffiths, Rian L.

    2018-01-01

    Ambient surface mass spectrometry is an emerging field which shows great promise for the analysis of biomolecules directly from their biological substrate. In this article, we describe ambient ionisation mass spectrometry techniques for the in situ analysis of intact proteins. As a broad approach, the analysis of intact proteins offers unique advantages for the determination of primary sequence variations and posttranslational modifications, as well as interrogation of tertiary and quaternary structure and protein‐protein/ligand interactions. In situ analysis of intact proteins offers the potential to couple these advantages with information relating to their biological environment, for example, their spatial distributions within healthy and diseased tissues. Here, we describe the techniques most commonly applied to in situ protein analysis (liquid extraction surface analysis, continuous flow liquid microjunction surface sampling, nano desorption electrospray ionisation, and desorption electrospray ionisation), their advantages and limitations, and describe their applications to date. We also discuss the incorporation of ion mobility spectrometry techniques (high field asymmetric waveform ion mobility spectrometry and travelling wave ion mobility spectrometry) into ambient workflows. Finally, future directions for the field are discussed. PMID:29607564

  16. Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.

    PubMed

    Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A

    2012-03-01

    Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolation of real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.

  17. Survey of aircraft electrical power systems

    NASA Technical Reports Server (NTRS)

    Lee, C. H.; Brandner, J. J.

    1972-01-01

    Areas investigated include: (1) load analysis; (2) power distribution, conversion techniques and generation; (3) design criteria and performance capabilities of hydraulic and pneumatic systems; (4) system control and protection methods; (5) component and heat transfer systems cooling; and (6) electrical system reliability.

  18. Study of the fragment size distribution in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Chu, Genbai; Shui, Min; Xi, Tao; Zhao, Yongqiang; Gu, Yuqiu

    2017-06-01

    Characterizing the distribution of fragment sizes produced in a dynamic fragmentation process is very important for fundamental science, such as predicting material dynamic response, and for a variety of engineering applications. However, only limited data on fragment mass or size have been obtained, owing to the great challenge of dynamic measurement. This paper focuses on the fragment size distribution from the dynamic fragmentation of laser shock-loaded metal. The material ejected from a tin sample with a wedge-shaped groove in the free surface is collected with a soft recovery technique. Via fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method, the fragments can be well detected. To characterize their size distributions, a random geometric statistics method based on Poisson mixtures was derived for the dynamic heterogeneous fragmentation problem, which leads to a linear combination of exponential distributions. Finally, we examined the size distribution of laser shock-loaded tin with the derived model and provided comparisons with other state-of-the-art models. The resulting comparisons show that our proposed model provides a more reasonable fit for laser shock-loaded metal.
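
    A linear combination of exponential distributions of the kind derived here can be fitted to fragment size data by maximum likelihood; the sketch below fits a two-component mixture to synthetic sizes and illustrates the distribution family only, not the authors' Poisson-mixture derivation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def mix_exp_nll(params, x):
        """Negative log-likelihood of a two-component exponential mixture:
        p(x) = w/t1 * exp(-x/t1) + (1-w)/t2 * exp(-x/t2)."""
        w, t1, t2 = params
        pdf = w / t1 * np.exp(-x / t1) + (1 - w) / t2 * np.exp(-x / t2)
        return -np.sum(np.log(pdf))

    rng = np.random.default_rng(4)                    # synthetic fragment sizes
    x = np.concatenate([rng.exponential(5.0, 3000),   # fine fragments [um]
                        rng.exponential(40.0, 1000)]) # coarse fragments [um]
    res = minimize(mix_exp_nll, x0=[0.5, 2.0, 20.0], args=(x,),
                   bounds=[(1e-3, 1 - 1e-3), (1e-3, None), (1e-3, None)])
    print("weight, scale1, scale2:", res.x)
    ```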

  19. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction of the GMAW process. On-line welding quality control and prediction have several disadvantages, such as high cost, low efficiency, complexity and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances, such as the Euclidean distance, because the covariance matrix used for calculating MD takes into account correlations and scaling in the data. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution. The normal distribution has two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to µ+3σ are regarded as "good". Two experiments, involving changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality control and prediction, which is of great importance for designing novel equipment for weld quality detection.
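
    The screening rule described (flagging samples whose MD exceeds µ+3σ of a reference distribution) can be sketched directly; the snippet below uses made-up current-voltage statistics and is only an illustration of the rule, not the WQT implementation.

    ```python
    import numpy as np

    def mahalanobis(x, mean, cov_inv):
        """Mahalanobis distance of row vectors x from a reference mean/covariance."""
        d = x - mean
        return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

    rng = np.random.default_rng(5)          # reference "good weld" samples:
    ref = rng.multivariate_normal([200.0, 24.0],        # current [A], voltage [V]
                                  [[16.0, 2.0], [2.0, 1.0]], size=2000)
    mean = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

    md_ref = mahalanobis(ref, mean, cov_inv)
    limit = md_ref.mean() + 3.0 * md_ref.std()   # "good" if MD <= mu + 3 sigma

    new = np.array([[205.0, 23.5], [230.0, 30.0]])   # incoming samples
    print(mahalanobis(new, mean, cov_inv) <= limit)  # -> [ True False ]
    ```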

  20. Automated quantification of the synchrogram by recurrence plot analysis.

    PubMed

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by synchrogram analysis, a graphical tool for investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study introduces a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, a well-established tool for characterizing recurring patterns and nonstationarities in experimental data. We applied the technique to detect synchronization in simulated and measured infant cardiorespiratory data. Our results suggest that the proposed technique can systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding the phase information of the synchrogram into phase space, the phase-locking ratio is automatically revealed as the number of attractors.
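
    A minimal sketch of the underlying recurrence-plot machinery (not the authors' algorithm) is given below: a 1-D phase series is time-delay embedded, and recurrences are marked wherever two embedded states fall within a tolerance eps.

    ```python
    # Minimal recurrence-plot sketch for a 1-D phase series.
    import numpy as np

    def recurrence_plot(x, dim=3, delay=1, eps=0.1):
        # Time-delay embedding of the series x.
        n = len(x) - (dim - 1) * delay
        emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
        # Pairwise distances; R[i, j] = 1 where states i and j recur.
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        return (dists < eps).astype(int)

    t = np.linspace(0, 20 * np.pi, 400)
    phases = np.mod(t, 2 * np.pi) / (2 * np.pi)   # synthetic wrapped-phase series
    R = recurrence_plot(phases, dim=2, delay=2, eps=0.05)
    print(R.shape, R.mean())   # recurrence rate as a crude quantifier
    ```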

  1. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in guaranteeing proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis appears to be one of the most effective techniques for this purpose. However, quantitative assessment of the morphological parameters directly from the reconstructed images is a nontrivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry; the AM technique instead produces scaffolds with square-shaped interconnected pores of regular dimension. The final morphology of the AM scaffolds can therefore be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. We report an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.

  2. SU-E-T-138: Dosimetric Verification For Volumetric Modulated Arc Therapy Cranio-Spinal Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goksel, E; Bilge, H; Yildiz, Yarar

    2014-06-01

    Purpose: The dosimetric feasibility of cranio-spinal irradiation with the volumetric modulated arc therapy (VMAT-CSI) technique, in terms of dose distribution accuracy, was investigated using a humanlike phantom. Methods: The OAR and PTV volumes for the Rando phantom were generated on supine CT images. The Eclipse (version 8.6) TPS with the AAA algorithm was used to create the treatment plan with the VMAT-CSI technique. The RapidArc plan consisted of cranial, upper spinal (US), and lower spinal (LS) regions that were optimized in the same plan. The US field overlapped the cranial and LS fields by 3 cm. Three partial arcs for the cranium and one full arc for each of the US and LS regions were used. The VMAT-CSI dose distribution inside the Rando phantom was measured with thermoluminescent detectors (TLD) and film dosimetry, and was compared to the calculated doses for the field junctions, target, and OARs. TLDs were placed at 24 positions throughout the phantom, and the measured TLD doses were compared to the calculated point doses. Planar doses for field junctions were verified with Gafchromic films, which were analyzed in the PTW Verisoft software using the gamma analysis method with 4 mm distance-to-agreement (DTA) and 4% dose agreement criteria. Results: TLD readings demonstrated accurate dose delivery, with a median dose difference of -0.3% (range: -8% to 12%) compared with calculated doses for areas inside the treatment portal. The dose difference reached 12% higher in the testicles, which lie outside the treatment region, and 8% lower in the lungs, where the heterogeneity is greater. All planar dose verifications for field junctions passed the gamma analysis, and measured planar dose distributions showed on average 97% agreement with calculated doses. Conclusion: The dosimetric data verified with TLD and film dosimetry show that the VMAT-CSI technique provides an accurate dose distribution and can be delivered safely.

  3. A Comparative Analysis of the Attitudes of Primary School Students and Teachers Regarding the Use of Games in Teaching

    ERIC Educational Resources Information Center

    Andic, Branko; Kadic, Srdan; Grujicic, Rade; Malidžan, Desanka

    2018-01-01

    This paper provides an overview of the attitudes of students and teachers toward the use of educational games in the teaching process. The study encompassed a didactic experiment, and adopted interviewing techniques and theoretical analysis. Likert distributions of attitudes to particular game types are presented in tables and the arithmetic means…

  4. Research and Development on Titanium Alloys

    DTIC Science & Technology

    1949-10-31

    OCR fragments (recoverable content only): evaluation of experimental titanium-base alloys, including binary alloys of titanium and titanium-silver alloys and their mechanical properties; a melting technique designed to give a more uniform distribution of the alloying additions; samples sent to Dr. Derge for analysis; and oxygen standards for analysis (Table 28). Battelle Memorial Institute.

  5. Forest Fire History... A Computer Method of Data Analysis

    Treesearch

    Romain M. Meese

    1973-01-01

    A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...

  6. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
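
    A minimal sketch of this log-Pearson Type III procedure, assuming a hypothetical record of annual peaks, follows; the logarithms' mean, standard deviation, and skew are the method-of-moments parameter estimates.

    ```python
    # Minimal sketch: log-Pearson Type III fit by the method of moments.
    import numpy as np
    from scipy import stats

    # Hypothetical annual peak discharges (cfs), not real gauge data.
    peaks = np.array([1200., 3400., 890., 2100., 5600., 1500., 980.,
                      4300., 2700., 1900., 3100., 760.])

    logq = np.log10(peaks)
    mean, std = logq.mean(), logq.std(ddof=1)
    skew = stats.skew(logq, bias=False)

    # 100-year flood = 99th percentile of the fitted log-Pearson III.
    q100 = 10 ** stats.pearson3.ppf(0.99, skew, loc=mean, scale=std)
    print(f"estimated 100-year peak discharge: {q100:.0f}")
    ```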

  7. Uncertainty quantification in flux balance analysis of spatially lumped and distributed models of neuron-astrocyte metabolism.

    PubMed

    Calvetti, Daniela; Cheng, Yougan; Somersalo, Erkki

    2016-12-01

    Identifying feasible steady state solutions of a brain energy metabolism model is an inverse problem that allows infinitely many solutions. The characterization of the non-uniqueness, or the uncertainty quantification of the flux balance analysis, is tantamount to identifying the degrees of freedom of the solution. The degrees of freedom of multi-compartment mathematical models for energy metabolism of a neuron-astrocyte complex may offer a key to understand the different ways in which the energetic needs of the brain are met. In this paper we study the uncertainty in the solution, using techniques of linear algebra to identify the degrees of freedom in a lumped model, and Markov chain Monte Carlo methods in its extension to a spatially distributed case. The interpretation of the degrees of freedom in metabolic terms, more specifically, glucose and oxygen partitioning, is then leveraged to derive constraints on the free parameters to guarantee that the model is energetically feasible. We demonstrate how the model can be used to estimate the stoichiometric energy needs of the cells as well as the household energy based on the measured oxidative cerebral metabolic rate of glucose and glutamate cycling. Moreover, our analysis shows that in the lumped model the net direction of lactate dehydrogenase (LDH) in the cells can be deduced from the glucose partitioning between the compartments. The extension of the lumped model to a spatially distributed multi-compartment setting that includes diffusion fluxes from capillary to tissue increases the number of degrees of freedom, requiring the use of statistical sampling techniques. The analysis of the distributed model reveals that some of the conclusions valid for the spatially lumped model, e.g., concerning the LDH activity and glucose partitioning, may no longer hold.
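
    The lumped-model linear algebra step can be illustrated compactly: the degrees of freedom of a steady-state flux balance are the null-space directions of the stoichiometric matrix. The matrix below is a toy example, not the authors' neuron-astrocyte network.

    ```python
    # Toy sketch: degrees of freedom of a steady-state flux balance, S @ v = 0.
    import numpy as np
    from scipy.linalg import null_space

    S = np.array([[ 1, -1,  0,  0],    # each row: mass balance of one metabolite
                  [ 0,  1, -1, -1],
                  [ 0,  0,  1, -1]], dtype=float)

    N = null_space(S)                  # columns span all steady-state flux solutions
    print("degrees of freedom:", N.shape[1])
    v = N @ np.ones(N.shape[1])        # one particular feasible flux direction
    print("residual:", np.abs(S @ v).max())
    ```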

  8. Nonlinear earthquake analysis of reinforced concrete frames with fiber and Bernoulli-Euler beam-column element.

    PubMed

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for the nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames, and a predictor-corrector form of the Bossak-α method is applied as the dynamic integration scheme. Experimental data for an RC column element are compared with numerical results obtained from the proposed solution technique to verify the numerical solutions. Furthermore, nonlinear cyclic analysis results for a portal reinforced concrete frame are obtained to compare the proposed solution technique with a fibre element based on the flexibility method. In addition, seismic damage analyses of an 8-story RC frame structure with a soft story are performed for lumped and distributed mass and load cases, and the damage regions, propagation, and intensities from both approaches are compared.

  9. Evaluation of C60 secondary ion mass spectrometry for the chemical analysis and imaging of fingerprints.

    PubMed

    Sisco, Edward; Demoranville, Leonard T; Gillen, Greg

    2013-09-10

    The feasibility of using C60(+) cluster primary ion bombardment secondary ion mass spectrometry (C60(+) SIMS) for analyzing the chemical composition of fingerprints is evaluated. It was found that C60(+) SIMS could be used to detect and image the spatial localization of a number of sebaceous and eccrine components in fingerprints. These analyses were also found not to be hindered by the use of common latent print powder development techniques. Finally, monitoring the depth distribution of fingerprint constituents was shown to be possible, a capability that has not been demonstrated with other chemical imaging techniques. This paper illustrates a number of strengths and potential weaknesses of C60(+) SIMS as an additional or complementary technique for the chemical analysis of fingerprints. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. X-ray focal spot reconstruction by circular penumbra analysis-Application to digital radiography systems.

    PubMed

    Di Domenico, Giovanni; Cardarelli, Paolo; Contillo, Adriano; Taibi, Angelo; Gambaccini, Mauro

    2016-01-01

    The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. The measurement of this size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique for obtaining an image of the focal spot through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. The radiograph of a magnified sharp edge is a well-established method for evaluating the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to simultaneously extract the radial profiles of several sharp edges with different orientations. The authors propose a technique for obtaining an image of the focal spot by processing these radial profiles with a pseudo-CT reconstruction technique. To validate this technique, the reconstruction was applied to simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method was applied to the focal spot of a commercially available mammography unit. For the simulated radiographs, the reconstructions were compared to the original distributions, showing excellent agreement in both the overall distribution and the full width at half maximum measurements. For the experimental test, the focal spot images obtained with the method were compared with results from standard techniques, namely the pin-hole camera and the slit camera. The method proved effective for simulated images, and the experimental results suggest that it can be considered an alternative technique for evaluating the focal spot distribution. The method offers the possibility of measuring the actual focal spot size and emission distribution at the same exposure conditions as clinical routine, avoiding the high tube loading required by the pin-hole imaging technique.

  11. Spatial analysis techniques applied to uranium prospecting in Chihuahua State, Mexico

    NASA Astrophysics Data System (ADS)

    Hinojosa de la Garza, Octavio R.; Montero Cabrera, María Elena; Sanín, Luz H.; Reyes Cortés, Manuel; Martínez Meyer, Enrique

    2014-07-01

    To estimate the distribution of uranium minerals in Chihuahua, the advanced statistical model "Maximum Entropy Method" (MaxEnt) was applied. A distinguishing feature of this method is that it can fit complex models even with small datasets (x and y data), as is the case for the locations of uranium ores in the State of Chihuahua. For georeferencing uranium ores, a database from the United States Geological Survey and a workgroup of experts in Mexico was used. The main contribution of this paper is the proposal of maximum entropy techniques for obtaining the mineral's potential distribution. The model used 24 environmental layers, such as topography, gravimetry, climate (WorldClim), and soil properties, to project the uranium distribution across the study area. To validate the areas predicted by the model, comparisons were made with other research by the Mexican Geological Survey, with direct exploration of specific areas, and through interviews with former exploration workers of the company "Uranio de Mexico". Results: new uranium areas predicted by the model were validated, and some relationship was found between the model predictions and geological faults. Conclusions: modeling by spatial analysis provides additional information to the energy and mineral resources sectors.

  12. Visual Exploration of Semantic Relationships in Neural Word Embeddings

    DOE PAGES

    Liu, Shusen; Bremer, Peer-Timo; Thiagarajan, Jayaraman J.; ...

    2017-08-29

    Constructing distributed representations for words through neural language models and using the resulting vector spaces for analysis has become a crucial component of natural language processing (NLP). But, despite their widespread application, little is known about the structure and properties of these spaces. To gain insights into the relationship between words, the NLP community has begun to adapt high-dimensional visualization techniques. Particularly, researchers commonly use t-distributed stochastic neighbor embeddings (t-SNE) and principal component analysis (PCA) to create two-dimensional embeddings for assessing the overall structure and exploring linear relationships (e.g., word analogies), respectively. Unfortunately, these techniques often produce mediocre or even misleading results and cannot address domain-specific visualization challenges that are crucial for understanding semantic relationships in word embeddings. We introduce new embedding techniques for visualizing semantic and syntactic analogies, and the corresponding tests to determine whether the resulting views capture salient structures. Additionally, we introduce two novel views for a comprehensive study of analogy relationships. Finally, we augment t-SNE embeddings to convey uncertainty information in order to allow a reliable interpretation. Combined, the different views address a number of domain-specific tasks difficult to solve with existing tools.
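
    For readers unfamiliar with the two baseline projections mentioned above, the following sketch (with a random stand-in embedding matrix, not real word vectors) shows the standard PCA and t-SNE calls.

    ```python
    # Minimal sketch of the two common 2-D projections of an embedding space.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(2)
    vectors = rng.normal(size=(500, 100))   # stand-in for a (words x dims) matrix

    pca_2d = PCA(n_components=2).fit_transform(vectors)        # linear structure
    tsne_2d = TSNE(n_components=2, perplexity=30.0,
                   init="pca", random_state=0).fit_transform(vectors)  # local structure
    print(pca_2d.shape, tsne_2d.shape)
    ```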

  13. Visual Exploration of Semantic Relationships in Neural Word Embeddings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Shusen; Bremer, Peer-Timo; Thiagarajan, Jayaraman J.

    Constructing distributed representations for words through neural language models and using the resulting vector spaces for analysis has become a crucial component of natural language processing (NLP). But, despite their widespread application, little is known about the structure and properties of these spaces. To gain insights into the relationship between words, the NLP community has begun to adapt high-dimensional visualization techniques. Particularly, researchers commonly use t-distributed stochastic neighbor embeddings (t-SNE) and principal component analysis (PCA) to create two-dimensional embeddings for assessing the overall structure and exploring linear relationships (e.g., word analogies), respectively. Unfortunately, these techniques often produce mediocre or even misleading results and cannot address domain-specific visualization challenges that are crucial for understanding semantic relationships in word embeddings. We introduce new embedding techniques for visualizing semantic and syntactic analogies, and the corresponding tests to determine whether the resulting views capture salient structures. Additionally, we introduce two novel views for a comprehensive study of analogy relationships. Finally, we augment t-SNE embeddings to convey uncertainty information in order to allow a reliable interpretation. Combined, the different views address a number of domain-specific tasks difficult to solve with existing tools.

  14. Ensembles-based predictions of climate change impacts on bioclimatic zones in Northeast Asia

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Jeon, S. W.; Lim, C. H.; Ryu, J.

    2017-12-01

    Biodiversity is rapidly declining globally, and efforts are needed to mitigate this continually increasing loss of species. Clustering of areas with similar habitats can be used to prioritize protected areas and distribute resources for the conservation of species, to select representative sample areas for research, and to evaluate impacts of environmental changes. In this study, Northeast Asia (NEA) was classified into 14 bioclimatic zones using statistical techniques, namely correlation analysis and principal component analysis (PCA), together with the iterative self-organizing data analysis technique algorithm (ISODATA). Based on this bioclimatic classification, we predicted shifts of the bioclimatic zones due to climate change. The input variables include the current climatic data (1960-1990) and the future climatic data of the HadGEM2-AO model (RCP 4.5 and RCP 8.5, for 2050 and 2070) provided by WorldClim. Using these data, multi-modeling methods including maximum likelihood classification, random forest, and species distribution modelling were used to project the impact of climate change on the spatial distribution of bioclimatic zones within NEA, and the results of the various models were compared and analyzed by overlaying them. As a result, significant changes in bioclimatic conditions can be expected throughout NEA by the 2050s and 2070s: overall, the zones moved upward, and some zones were predicted to disappear. This analysis provides a basis for understanding potential impacts of climate change on biodiversity and ecosystems, and could be used to support decision making on climate change adaptation.
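
    A hedged sketch of this style of classification pipeline is shown below, with k-means standing in for ISODATA (which adds cluster split/merge steps) and random data standing in for the climate layers.

    ```python
    # Sketch: decorrelate climate variables with PCA, then cluster pixels into zones.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    pixels = rng.normal(size=(10000, 19))   # hypothetical: 19 bioclim variables per pixel

    scores = PCA(n_components=5).fit_transform(pixels)   # keep leading components
    zones = KMeans(n_clusters=14, n_init=10, random_state=0).fit_predict(scores)
    print(np.bincount(zones))               # pixel count per bioclimatic zone
    ```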

  15. Pitfalls in Persuasion: How Do Users Experience Persuasive Techniques in a Web Service?

    NASA Astrophysics Data System (ADS)

    Segerståhl, Katarina; Kotro, Tanja; Väänänen-Vainio-Mattila, Kaisa

    Persuasive technologies are designed by utilizing a variety of interactive techniques that are believed to promote target behaviors. This paper describes a field study in which the aim was to discover possible pitfalls of persuasion, i.e., situations in which persuasive techniques do not function as expected. The study investigated persuasive functionality of a web service targeting weight loss. A qualitative online questionnaire was distributed through the web service and a total of 291 responses were extracted for interpretative analysis. The Persuasive Systems Design model (PSD) was used for supporting systematic analysis of persuasive functionality. Pitfalls were identified through situations that evoked negative user experiences. The primary pitfalls discovered were associated with manual logging of eating and exercise behaviors, appropriateness of suggestions and source credibility issues related to social facilitation. These pitfalls, when recognized, can be addressed in design by applying functional and facilitative persuasive techniques in meaningful combinations.

  16. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background: Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version, so that different studies can be ascribed different weights in the analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results: We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the relevance of the pairwise correlations of the GSMA statistics across different bins to this limiting distribution. We also remark on aggregate criteria and multiple testing for determining the significance of GSMA results. Conclusion: The theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930

  17. Vibrational study and Natural Bond Orbital analysis of serotonin in monomer and dimer states by density functional theory

    NASA Astrophysics Data System (ADS)

    Borah, Mukunda Madhab; Devi, Th. Gomti

    2018-06-01

    The vibrational spectral analysis of serotonin and its dimer was carried out using Fourier transform infrared (FTIR) and Raman techniques. The equilibrium geometrical parameters, harmonic vibrational wavenumbers, frontier orbitals, Mulliken atomic charges, natural bond orbitals, first-order hyperpolarizability, and some optimized energy parameters were computed by density functional theory with the 6-31G(d,p) basis set. A detailed analysis of the vibrational spectra was carried out by computing the potential energy distribution (PED, %) with the help of the Vibrational Energy Distribution Analysis (VEDA) program. The second-order delocalization energies E(2) confirm the occurrence of intramolecular charge transfer (ICT) within the molecule. The computed wavenumbers of the serotonin monomer and dimer were found to be in good agreement with the experimental Raman and IR values.

  18. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  19. Structural and spectroscopic characterization, reactivity study and charge transfer analysis of the newly synthetized 2-(6-hydroxy-1-benzofuran-3-yl) acetic acid

    NASA Astrophysics Data System (ADS)

    Murthy, P. Krishna; Krishnaswamy, G.; Armaković, Stevan; Armaković, Sanja J.; Suchetan, P. A.; Desai, Nivedita R.; Suneetha, V.; SreenivasaRao, R.; Bhargavi, G.; Aruna Kumar, D. B.

    2018-06-01

    The title compound 2-(6-hydroxy-1-benzofuran-3-yl) acetic acid (abbreviated HBFAA) has been synthesized and characterized by FT-IR, FT-Raman, and NMR spectroscopic techniques. The solid-state crystal structure of HBFAA was determined by the single-crystal X-ray diffraction technique. The crystal structure features O-H⋯O and C-H⋯O intermolecular interactions, resulting in a two-dimensional supramolecular architecture; the presence of the various intermolecular interactions is well supported by Hirshfeld surface analysis. The molecular properties of HBFAA were computed by density functional theory (DFT) using the B3LYP/6-311++G(d,p) method for the ground state in the gas phase; the results were compared with experimental values and show mutual agreement. Vibrational spectral analysis was carried out using FT-IR and FT-Raman spectroscopic techniques, and the assignment of each vibrational wavenumber was made on the basis of the potential energy distribution (PED). Frontier molecular orbital (FMO) analysis, global reactivity descriptors, non-linear optical (NLO) properties, and natural bond orbital (NBO) analysis of HBFAA were also computed with the same method. Efforts were made to understand the global and local reactivity properties of the title compound through calculations of MEP, ALIE, BDE, and Fukui function surfaces in the gas phase, together with thermodynamic properties. Molecular dynamics simulations and radial distribution functions were also used to understand the influence of water on the stability of the title compound. Charge transfer between molecules of HBFAA was investigated through a combination of MD simulations and DFT calculations.

  20. Testing a laser-induced breakdown spectroscopy technique on the Arctic sediments

    NASA Astrophysics Data System (ADS)

    Han, D.; Nam, S. I.

    2017-12-01

    Physical and geochemical investigations coupled with Laser-Induced Breakdown Spectroscopy (LIBS) were performed on surface sediment cores (ARA03B/24BOX, ARA02B/01(A)MUC, ARA02B/02MUC and ARA02B/03(A)MUC) recovered from the western Arctic Ocean (Chukchi Sea) during IBRV ARON expeditions in 2012. The LIBS technique was applied to carry out elemental chemical analysis of the Arctic sediments, and the results were compared with those measured by ITRAX X-ray fluorescence (XRF) core scanning. LIBS and XRF showed similar elemental compositions within each sediment core. In this study, mineral composition (XRD), grain size distribution, and organic carbon content, as well as elemental composition (LIBS), were all considered to understand paleoenvironmental changes (ocean circulation, sea-ice drift, iceberg discharge, etc.) recorded in the Arctic Holocene sediment. Quantitative LIBS analysis shows a gradually varying distribution of the elements along the sampled core and clear separation between the cores. The cores are geochemically characterized by elevated Mn profiles. The gradients of mineral composition and grain size among the cores show regional variation in sedimentary conditions due to the geological distance between East Siberia and North America. The present study reveals that the LIBS technique can be employed for in-situ sediment analyses in the Arctic Ocean. Furthermore, LIBS does not require costly equipment, trained operators, or complicated sample pre-treatment processes compared to atomic absorption spectroscopy (AAS) and inductively coupled plasma emission spectroscopy (ICP), and it is also known to show higher sensitivity, precision, and discrimination than XRF analysis, scanning electron microscopy-energy dispersive spectrometry (SEM-EDS), and electron probe X-ray microanalysis (EPMA).

  1. Deformation structure analysis of material at fatigue on the basis of the vector field

    NASA Astrophysics Data System (ADS)

    Kibitkin, Vladimir V.; Solodushkin, Andrey I.; Pleshanov, Vasily S.

    2017-12-01

    In the paper, the spatial distributions of deformation, circulation, shear amplitudes, and shear angles are obtained from the displacement vector field measured by the DIC technique. The vector field and its shear and vortex characteristics are given as an example of this approach, and the basic formulae are provided. The experiment shows that honeycomb deformation structures can arise in the center of a macrovortex at developed plastic flow. A spatial distribution of local circulation and shears is discovered that coincides with the deformation structure, although their amplitudes differ. The analysis proves that the spatial distribution of shear angles results from the maximum tangential and normal stresses. The anticlockwise circulation of most local vortices obeys a normal Gaussian law in the area of interest.
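
    The circulation and shear fields in this kind of analysis can be sketched from a displacement field via its gradient; the example below uses a synthetic vortex on a regular grid, not the authors' DIC data.

    ```python
    # Minimal sketch: local circulation (curl) and shear from a displacement field.
    import numpy as np

    rng = np.random.default_rng(4)
    h = 1.0                                     # grid spacing
    y, x = np.mgrid[0:64, 0:64].astype(float)
    # Synthetic vortex displacement components with a little noise.
    u = -0.01 * (y - 32) + rng.normal(0, 1e-4, (64, 64))
    v = 0.01 * (x - 32) + rng.normal(0, 1e-4, (64, 64))

    du_dy, du_dx = np.gradient(u, h)            # axis 0 is y, axis 1 is x
    dv_dy, dv_dx = np.gradient(v, h)

    curl = dv_dx - du_dy                        # local circulation density (vorticity)
    shear = 0.5 * (du_dy + dv_dx)               # shear component of the strain tensor
    print(curl.mean(), np.abs(shear).max())
    ```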

  2. Using Dual Regression to Investigate Network Shape and Amplitude in Functional Connectivity Analyses

    PubMed Central

    Nickerson, Lisa D.; Smith, Stephen M.; Öngür, Döst; Beckmann, Christian F.

    2017-01-01

    Independent Component Analysis (ICA) is one of the most popular techniques for the analysis of resting state FMRI data because it has several advantageous properties when compared with other techniques. Most notably, in contrast to a conventional seed-based correlation analysis, it is model-free and multivariate, thus switching the focus from evaluating the functional connectivity of single brain regions identified a priori to evaluating brain connectivity in terms of all brain resting state networks (RSNs) that simultaneously engage in oscillatory activity. Furthermore, typical seed-based analysis characterizes RSNs in terms of spatially distributed patterns of correlation (typically by means of simple Pearson's coefficients) and thereby confounds together amplitude information of oscillatory activity and noise. ICA and other regression techniques, on the other hand, retain magnitude information and therefore can be sensitive to both changes in the spatially distributed nature of correlations (differences in the spatial pattern or “shape”) as well as the amplitude of the network activity. Furthermore, motion can mimic amplitude effects so it is crucial to use a technique that retains such information to ensure that connectivity differences are accurately localized. In this work, we investigate the dual regression approach that is frequently applied with group ICA to assess group differences in resting state functional connectivity of brain networks. We show how ignoring amplitude effects and how excessive motion corrupts connectivity maps and results in spurious connectivity differences. We also show how to implement the dual regression to retain amplitude information and how to use dual regression outputs to identify potential motion effects. Two key findings are that using a technique that retains magnitude information, e.g., dual regression, and using strict motion criteria are crucial for controlling both network amplitude and motion-related amplitude effects, respectively, in resting state connectivity analyses. We illustrate these concepts using realistic simulated resting state FMRI data and in vivo data acquired in healthy subjects and patients with bipolar disorder and schizophrenia. PMID:28348512
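
    The two regression stages described above can be sketched compactly; this is the general idea only, not FSL's implementation, and all array sizes are hypothetical.

    ```python
    # Compact dual-regression sketch: group maps -> time series -> subject maps.
    import numpy as np

    rng = np.random.default_rng(5)
    n_vox, n_time, n_comp = 5000, 200, 10
    group_maps = rng.normal(size=(n_vox, n_comp))    # from group ICA (assumed given)
    subject_data = rng.normal(size=(n_vox, n_time))  # one subject's data, flattened

    # Stage 1: regress group maps against the data -> component time series.
    ts, *_ = np.linalg.lstsq(group_maps, subject_data, rcond=None)   # (n_comp, n_time)

    # Stage 2: regress those time series against the data -> subject maps.
    maps, *_ = np.linalg.lstsq(ts.T, subject_data.T, rcond=None)     # (n_comp, n_vox)
    subject_maps = maps.T

    # Note: skipping variance normalization of `ts` before stage 2 keeps
    # amplitude information in the subject maps, per the point made above.
    print(subject_maps.shape)
    ```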

  3. Laser ektacytometry and evaluation of statistical characteristics of inhomogeneous ensembles of red blood cells

    NASA Astrophysics Data System (ADS)

    Nikitin, S. Yu.; Priezzhev, A. V.; Lugovtsov, A. E.; Ustinov, V. D.; Razgulin, A. V.

    2014-10-01

    The paper is devoted to the development of the laser ektacytometry technique for evaluating the statistical characteristics of inhomogeneous ensembles of red blood cells (RBCs). We have theoretically analyzed laser beam scattering by inhomogeneous ensembles of elliptical discs modeling red blood cells in the ektacytometer. The analysis shows that laser ektacytometry allows quantitative evaluation of such population characteristics of RBCs as the mean cell shape, the variance of cell deformability, and the asymmetry of the cell deformability distribution. Moreover, we show that the deformability distribution itself can be retrieved by solving a specific Fredholm integral equation of the first kind. At this stage we do not take into account the scatter in RBC sizes.
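
    Numerically, a Fredholm equation of the first kind of this type is typically discretized and solved with regularization. The sketch below uses Tikhonov-regularized least squares on an illustrative smoothing kernel, not the ektacytometry kernel itself.

    ```python
    # Sketch: recover a distribution f from g = K f via Tikhonov regularization.
    import numpy as np

    n = 100
    s = np.linspace(0.0, 1.0, n)                  # deformability variable
    t = np.linspace(0.0, 1.0, n)                  # measured variable
    # Illustrative smoothing kernel, discretized with the quadrature weight ds.
    K = np.exp(-20.0 * (t[:, None] - s[None, :])**2) * (s[1] - s[0])

    f_true = np.exp(-0.5 * ((s - 0.4) / 0.08)**2)  # "unknown" distribution, for testing
    g = K @ f_true + np.random.default_rng(6).normal(0, 1e-3, n)  # noisy data

    lam = 1e-3                                     # regularization strength (hand-tuned)
    f_rec = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
    print(np.max(np.abs(f_rec - f_true)))          # reconstruction error
    ```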

  4. Potential utilization of the absolute point cumulative semivariogram technique for the evaluation of distribution coefficient.

    PubMed

    Külahci, Fatih; Sen, Zekâi

    2009-09-15

    The classical solid/liquid distribution coefficient, K(d), for radionuclides in water-sediment systems depends on many regional parameters such as flow, geology, pH, acidity, alkalinity, total hardness, and radioactivity concentration. Consideration of all these effects requires a regional analysis with an effective methodology, which in this paper is based on the cumulative semivariogram concept. Classical K(d) calculations are point estimates and cannot represent regional patterns; here, a regional calculation methodology is suggested through the use of the Absolute Point Cumulative SemiVariogram (APCSV) technique. The application of the methodology is presented for (137)Cs and (90)Sr measurements at a set of points in the Keban Dam reservoir, Turkey.
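
    A minimal sketch of a point cumulative semivariogram, the building block of the APCSV idea (not necessarily the authors' exact formulation), is given below with hypothetical site data.

    ```python
    # Sketch: point cumulative semivariogram for one reference site.
    import numpy as np

    rng = np.random.default_rng(7)
    coords = rng.uniform(0, 10, size=(50, 2))   # hypothetical site locations
    conc = rng.lognormal(0.0, 0.5, 50)          # e.g. 137Cs measurements (stand-in)

    def point_cumulative_semivariogram(i, coords, values):
        d = np.linalg.norm(coords - coords[i], axis=1)
        order = np.argsort(d)[1:]               # neighbors sorted by distance (skip self)
        half_sq = 0.5 * (values[i] - values[order])**2
        return d[order], np.cumsum(half_sq)     # distance vs. cumulative semivariance

    dist, csv = point_cumulative_semivariogram(0, coords, conc)
    print(dist[:5], csv[:5])
    ```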

  5. An analysis of short pulse and dual frequency radar techniques for measuring ocean wave spectra from satellites

    NASA Technical Reports Server (NTRS)

    Jackson, F. C.

    1980-01-01

    Scanning beam microwave radars were used to measure ocean wave directional spectra from satellites. In principle, surface wave spectral resolution in wave number can be obtained using either short pulse (SP) or dual frequency (DF) techniques; in either case, directional resolution arises naturally as a consequence of Bragg-like wave front matching. A four-frequency moment characterization of backscatter from the near vertical, using physical optics in the high-frequency limit, was applied to an analysis of the SP and DF measurement techniques. The intrinsic electromagnetic modulation spectrum is, to first order in wave steepness, proportional to the large-wave directional slope spectrum. Harmonic distortion is small and is a minimum near 10 deg incidence. Non-Gaussian wave statistics can have an effect comparable to that of second-order scattering from a normally distributed sea surface. The SP technique is superior to the DF technique in terms of measurement signal-to-noise ratio and contrast ratio.

  6. A COMPARISON OF EXPERIMENTS AND THREE-DIMENSIONAL ANALYSIS TECHNIQUES. PART II. UNPOISONED UNIFORM SLAB CORE WITH A PARTIALLY INSERTED HAFNIUM ROD AND A PARTIALLY INSERTED WATER GAP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roseberry, R.J.

    The experimental measurements and nuclear analysis of a uniformly loaded, unpoisoned slab core with a partially inserted hafnium rod and/or a partially inserted water gap are described. Comparisons of experimental data with calculated results of the UFO code and flux synthesis techniques are given. It is concluded that one of the flux synthesis techniques and the UFO code are able to predict flux distributions to within approximately -5% of experiment for most cases, with a maximum error of approximately -10% for a channel at the core-reflector boundary. The second synthesis technique failed to give comparable agreement with experiment even when various refinements were used, e.g. increasing the number of mesh points, performing the flux synthesis technique of iteration, and spectrum-weighting the appropriate calculated fluxes through the use of the SWAKRAUM code. These results are comparable to those reported in Part I of this study. (auth)

  7. A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network

    NASA Astrophysics Data System (ADS)

    Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan

    2016-07-01

    Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most of the reported results were obtained by retrospective analysis after the earthquakes, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the detection technique is implemented in a causal setup over the available data set in a test phase, which enables real-time implementation. The performance of the developed earthquake prediction technique is evaluated using 10-fold cross-validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the technique detects precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.

  8. Heat transfer monitoring by means of the hot wire technique and finite element analysis software.

    PubMed

    Hernández Wong, J; Suarez, V; Guarachi, J; Calderón, A; Rojas-Trigos, J B; Juárez, A G; Marín, E

    2014-01-01

    We report a study of radial heat transfer in a homogeneous, isotropic substance with a linear heat source along its axial axis. For this purpose, the hot wire characterization technique was used to obtain the temperature distribution as a function of radial distance from the axis and exposure time. In addition, the solution of the transient heat transport equation for this problem was obtained under appropriate boundary conditions by means of the finite element technique. A comparison among the experimental results, the conventional theoretical model, and the numerical simulations demonstrates the utility of the finite element analysis simulation methodology in investigating the thermal response of substances. Copyright © 2013 Elsevier Ltd. All rights reserved.
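
    For reference, the conventional theoretical model mentioned above is the ideal line-source solution, ΔT = (q / 4πk) E1(r² / 4αt). A short sketch with illustrative (assumed) property values follows.

    ```python
    # Sketch of the ideal line-source ("hot wire") temperature rise.
    import numpy as np
    from scipy.special import exp1

    q = 10.0        # heat input per unit length, W/m (assumed)
    k = 0.6         # thermal conductivity, W/(m K) (assumed)
    alpha = 1.4e-7  # thermal diffusivity, m^2/s (assumed)

    r = np.linspace(1e-3, 2e-2, 50)     # radial distance from the wire, m
    t = 60.0                            # exposure time, s
    dT = q / (4.0 * np.pi * k) * exp1(r**2 / (4.0 * alpha * t))
    print(dT[:3])                        # temperature rise vs. radius
    ```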

  9. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    NASA Astrophysics Data System (ADS)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs by contact with the urine, blood, or tissues of the rodent, or by contact with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites, and high densities of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyze the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density in the census sector, and terrain slope in São Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and permitted identification of the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. The study also showed that leptospirosis cases may be more strongly related to census sectors located in areas of higher river density and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandes, P. A.; Lynch, K. A.

    Here, we define the observational parameter regime necessary for observing low-altitude ionospheric origins of high-latitude ion upflow/outflow. We present measurement challenges and identify a new analysis technique which mitigates these impediments. To probe the initiation of auroral ion upflow, it is necessary to examine the thermal ion population at 200-350 km, where typical thermal energies are tenths of eV. Interpretation of the thermal ion distribution function measurement requires removal of payload sheath and ram effects. We use a 3-D Maxwellian model to quantify how observed ionospheric parameters such as density, temperature, and flows affect in situ measurements of the thermal ion distribution function. We define the viable acceptance window of a typical top-hat electrostatic analyzer in this regime and show that the instrument's energy resolution prohibits it from directly observing the shape of the particle spectra. To extract detailed information about the measured particle population, we define two intermediate parameters from the measured distribution function, then use a Maxwellian model to replicate possible measured parameters for comparison to the data. Liouville's theorem and the thin-sheath approximation allow us to couple the measured and modeled intermediate parameters such that measurements inside the sheath provide information about plasma outside the sheath. We apply this technique to sounding rocket data to show that careful windowing of the data and Maxwellian models allows for extraction of the best choice of geophysical parameters. More widespread use of this analysis technique will help our community expand its observational database of the seed regions of ionospheric outflows.

  11. Multiple locus VNTR analysis highlights that geographical clustering and distribution of Dichelobacter nodosus, the causal agent of footrot in sheep, correlates with inter-country movements

    PubMed Central

    Russell, Claire L.; Smith, Edward M.; Calvo-Bado, Leonides A.; Green, Laura E.; Wellington, Elizabeth M.H.; Medley, Graham F.; Moore, Lynda J.; Grogono-Thomas, Rosemary

    2014-01-01

    Dichelobacter nodosus is a Gram-negative, anaerobic bacterium and the causal agent of footrot in sheep. Multiple locus variable number tandem repeat (VNTR) analysis (MLVA) is a portable technique that involves the identification and enumeration of polymorphic tandem repeats across the genome. The aims of this study were to develop an MLVA scheme for D. nodosus suitable for use as a molecular typing tool, and to apply it to a global collection of isolates. Seventy-seven isolates selected from regions with a long history of footrot (GB, Australia) and regions where footrot has recently been reported (India, Scandinavia), were characterised. From an initial 61 potential VNTR regions, four loci were identified as usable and in combination had the attributes required of a typing method for use in bacterial epidemiology: high discriminatory power (D > 0.95), typeability and reproducibility. Results from the analysis indicate that D. nodosus appears to have evolved via recombinational exchanges and clonal diversification. This has resulted in some clonal complexes that contain isolates from multiple countries and continents; and others that contain isolates from a single geographic location (country or region). The distribution of alleles between countries matches historical accounts of sheep movements, suggesting that the MLVA technique is sufficiently specific and sensitive for an epidemiological investigation of the global distribution of D. nodosus. PMID:23748018

  12. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.

  13. Particle size distribution of brown and white rice during gastric digestion measured by image analysis.

    PubMed

    Bornhorst, Gail M; Kostlan, Kevin; Singh, R Paul

    2013-09-01

    The particle size distribution of foods during gastric digestion indicates the amount of physical breakdown that occurred due to the peristaltic movement of the stomach walls in addition to the breakdown that initially occurred during oral processing. The objective of this study was to present an image analysis technique that was rapid, simple, and could distinguish between food components (that is, rice kernel and bran layer in brown rice). The technique was used to quantify particle breakdown of brown and white rice during gastric digestion in growing pigs (used as a model for an adult human) over 480 min of digestion. The particle area distributions were fit to a Rosin-Rammler distribution function. Brown and white rice exhibited considerable breakdown as the number of particles per image decreased over time. The median particle area (x(50)) increased during digestion, suggesting a gastric sieving phenomenon, where small particles were emptied and larger particles were retained for additional breakdown. Brown rice breakdown was further quantified by an examination of the bran layer fragments and rice grain pieces. The percentage of total particle area composed of bran layer fragments was greater in the distal stomach than the proximal stomach in the first 120 min of digestion. The results of this study showed that image analysis may be used to quantify particle breakdown of a soft food product during gastric digestion, discriminate between different food components, and help to clarify the role of food structure and processing in food breakdown during gastric digestion. © 2013 Institute of Food Technologists®
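
    A minimal sketch of fitting the Rosin-Rammler function to measured particle areas (with synthetic stand-in data, not the study's measurements) follows.

    ```python
    # Sketch: fit the Rosin-Rammler CDF, F(x) = 1 - exp(-(x/x0)^n), and get x50.
    import numpy as np
    from scipy.optimize import curve_fit

    def rosin_rammler_cdf(x, x0, n):
        return 1.0 - np.exp(-(x / x0) ** n)

    rng = np.random.default_rng(8)
    areas = np.sort(rng.weibull(1.5, 300) * 4.0)         # mm^2, hypothetical
    ecdf = np.arange(1, len(areas) + 1) / len(areas)      # empirical CDF

    (x0, n), _ = curve_fit(rosin_rammler_cdf, areas, ecdf, p0=[2.0, 1.0])
    x50 = x0 * np.log(2.0) ** (1.0 / n)                   # median particle area
    print(f"x0={x0:.2f}, n={n:.2f}, x50={x50:.2f}")
    ```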

  14. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and permit greater ease of use. The code was rewritten for an interactive computer environment. Furthermore, a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include the use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta-scale phenomena.
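
    A compact sketch of a two-pass Barnes-type analysis, illustrating the general idea behind such a scheme (not the PROAM code itself), is given below; station data and weight parameters are hypothetical.

    ```python
    # Sketch: Barnes objective analysis with one correction pass.
    import numpy as np

    def barnes(obs_xy, obs_val, grid_xy, kappa, gamma=0.3):
        def gauss_pass(targets, values, points, k):
            d2 = ((targets[:, None, :] - points[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / k)
            return (w * values).sum(1) / w.sum(1)

        # First pass: Gaussian-weighted average of observations onto the grid.
        g = gauss_pass(grid_xy, obs_val, obs_xy, kappa)
        # Correction pass: analyze station residuals with a sharper weight.
        at_obs = gauss_pass(obs_xy, obs_val, obs_xy, kappa)
        g += gauss_pass(grid_xy, obs_val - at_obs, obs_xy, gamma * kappa)
        return g

    rng = np.random.default_rng(9)
    obs_xy = rng.uniform(0, 100, (40, 2))             # station locations, km
    obs_val = 15.0 + 0.05 * obs_xy[:, 0]              # hypothetical temperatures
    gx, gy = np.meshgrid(np.linspace(0, 100, 20), np.linspace(0, 100, 20))
    grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
    analysis = barnes(obs_xy, obs_val, grid_xy, kappa=400.0)
    print(analysis.reshape(20, 20)[0, :3])
    ```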

  15. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments, thereby addressing typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying the uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of results from different diagnostics. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities for nonlinear error propagation, the inclusion of systematic effects, and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with next-step fusion device requirements, appropriate techniques are explored for fast analysis applications.

  16. Analysis of Photonic Phase-Shifting Technique Employing Amplitude-Controlled Fiber-Optic Delay Lines

    DTIC Science & Technology

    2012-01-13

    Report documentation fragments (recoverable content only): Meredith N. Draa, Vincent J. Urick, and Keith J. Williams, Naval Research Laboratory, Code 5652, 4555 Overlook Avenue, SW, Washington, DC 20375-5320; report NRL/MR/5650--12...9376, January 13, 2012. Approved for public release; distribution is unlimited. Subject terms: fiber optics.

  17. Investigation into the Use of Normal and Half-Normal Plots for Interpreting Results from Screening Experiments.

    DTIC Science & Technology

    1987-03-25

    OCR fragments (recoverable content only): extensions by Lloyd (1952) using generalized least squares instead of ordinary least squares, and by Wilk, Gnanadesikan, and Freeny (1963) using a maximum-likelihood approach; the half-normal distribution is a special case of the gamma distribution proposed by Wilk, Gnanadesikan, and Huyett (1962). Cited: Gnanadesikan, R., "Probability plotting methods for the analysis of data," Biometrika, 1968, 55, 1-17, which describes and discusses graphical techniques.

  18. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
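
    An illustrative sketch of the Monte Carlo side of such an approach follows; the weather persistence matrix and leg count are assumptions for illustration, not the study's data.

    ```python
    # Sketch: ferry flight duration under a persistent two-state weather chain.
    import numpy as np

    P = np.array([[0.8, 0.2],     # P[i, j]: prob. of weather state i -> j per day
                  [0.4, 0.6]])    # state 0 = flyable, 1 = inclement (persistent)
    n_legs = 5                    # ferry legs between stopover points (assumed)

    rng = np.random.default_rng(10)

    def ferry_days():
        state, day, legs = 0, 0, 0
        while legs < n_legs:
            if state == 0:
                legs += 1         # one leg flown per flyable day
            day += 1
            state = rng.choice(2, p=P[state])
        return day

    times = np.array([ferry_days() for _ in range(10000)])
    print(times.mean(), np.percentile(times, [5, 50, 95]))
    ```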

  19. Mesh morphing for finite element analysis of implant positioning in cementless total hip replacements.

    PubMed

    Bah, Mamadou T; Nair, Prasanth B; Browne, Martin

    2009-12-01

    Finite element (FE) analysis of the effect of implant positioning on the performance of cementless total hip replacements (THRs) requires the generation of multiple meshes to account for positioning variability. This process can be labour intensive and time consuming as CAD operations are needed each time a specific orientation is to be analysed. In the present work, a mesh morphing technique is developed to automate the model generation process. The volume mesh of a baseline femur with the implant in a nominal position is deformed as the prosthesis location is varied. A virtual deformation field, obtained by solving a linear elasticity problem with appropriate boundary conditions, is applied. The effectiveness of the technique is evaluated using two metrics: the percentages of morphed elements exceeding an aspect ratio of 20 and an angle of 165 degrees between the adjacent edges of each tetrahedron. Results show that for 100 different implant positions, the first and second metrics never exceed 3% and 3.5%, respectively. To further validate the proposed technique, FE contact analyses are conducted using three selected morphed models to predict the strain distribution in the bone and the implant micromotion under joint and muscle loading. The entire bone strain distribution is well captured and both percentages of bone volume with strain exceeding 0.7% and bone average strains are accurately computed. The results generated from the morphed mesh models correlate well with those for models generated from scratch, increasing confidence in the methodology. This morphing technique forms an accurate and efficient basis for FE based implant orientation and stability analysis of cementless hip replacements.

  20. Techniques for assessing the socio-economic effects of vehicle mileage fees.

    DOT National Transportation Integrated Search

    2008-06-01

    The purpose of this study was to develop tools for assessing the distributional effects of alternative highway user fees for light vehicles in Oregon. The analysis focused on a change from the current gasoline tax to a VMT fee structure for collect...

  1. CFD application to subsonic inlet airframe integration [computational fluid dynamics (CFD)]

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.

    1988-01-01

    The fluid dynamics of curved diffuser duct flows in military aircraft is discussed. Three-dimensional parabolized Navier-Stokes analysis and experimental techniques are reviewed. Flow measurements and pressure distributions are shown. Velocity vectors and the effects of vortex generators are considered.

  2. Atmospheric aerosols: A literature summary of their physical characteristics and chemical composition

    NASA Technical Reports Server (NTRS)

    Harris, F. S., Jr.

    1976-01-01

    This report contains a summary of 199 recent references on the characterization of atmospheric aerosols with respect to their composition, sources, size distribution, and time changes, and with particular reference to the chemical elements measured by modern techniques, especially activation analysis.

  3. Imaging the Transport of Silver Nanoparticles Through Soil With Synchrotron X-ray Microtomography

    NASA Astrophysics Data System (ADS)

    Molnar, I. L.; Gerhard, J.; O'Carroll, D. M.; Willson, C. S.

    2012-12-01

    Synchrotron x-ray computed microtomography (SXCMT) offers the ability to examine the spatial distribution of contaminants within the pore space of a porous medium; examples include the distribution of nonaqueous phase liquids (NAPLs) and micron-sized colloids. A method based upon the application of the Beer-Lambert law and K-edge imaging was recently presented for using SXCMT to accurately determine the distribution of silver nanoparticles in a porous medium (Molnar et al., AGU Fall Meeting, H53B-1418, 2011). By capturing a series of SXCMT images of a single sample evolving over time, this technique can follow the changing distribution of nanoparticles throughout the pore network and even within individual pores. While previous work on this method focused on its accuracy, precision and potential applications, this study provides an in-depth analysis of the results of multiple silver nanoparticle transport experiments imaged using the new technique. SXCMT images were collected at various stages of silver nanoparticle injection into columns packed with well-graded and poorly graded quartz sand, iron oxide sand and glass bead porous media. The collected images were used to explore the influences of grain type, size and shape on the transport of silver nanoparticles through soil. The results of this analysis illustrate how SXCMT can collect hitherto unobtainable data which can yield valuable insights into the factors affecting nanoparticle transport through soil.

  4. [Spectra and thermal analysis of the arc in activating flux plasma arc welding].

    PubMed

    Chai, Guo-Ming; Zhu, Yi-Feng

    2010-04-01

    In activating flux plasma arc welding, the welding arc was analyzed by spectral analysis, and the welding arc temperature field was measured by infrared sensing and computer imaging techniques. Distribution models of the welding arc heat flow density in activating flux PAW welding were developed. The effect of the activating flux on the composition of the welding arc was studied, as was the welding arc temperature field. The results show that the spectral lines of the argon atom and of singly ionized argon are the main lines of the conventional plasma welding arc, while the spectral lines of the weld metal are negligible. Gas particles therefore dominate the conventional plasma welding arc, which is essentially a gas arc. In the activating flux plasma welding arc, the lines of the argon atom and of singly ionized argon are intensified, and lines of the Ti, Cr and Fe elements appear. The welding arc temperature distribution in activating flux plasma arc welding is compact: the outline of the temperature field is narrow, the temperature distribution is concentrated, and the radial temperature gradient is large, following a normal (Gaussian) distribution.

  5. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    NASA Astrophysics Data System (ADS)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produce highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on the development of tandem mobility-mass measurement techniques, coupled with appropriate data inversion routines, to facilitate measurement of two-dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In Chapter 2, the development of an inversion routine is described which is employed to determine two-dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two-dimensional distribution function to compute cumulative mass distribution functions and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on the application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub-2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, Chapter 6 concludes the studies performed in this thesis and discusses future avenues of research.

  6. Observation of three-dimensional elemental distributions of a Si device using a 360-degree-tilt FIB and the cold field-emission STEM system.

    PubMed

    Yaguchi, Toshie; Konno, Mitsuru; Kamino, Takeo; Watanabe, Masashi

    2008-11-01

    A technique for preparation of a pillar-shaped specimen and its multidirectional observation using a combination of a scanning transmission electron microscope (STEM) and a focused ion beam (FIB) instrument has been developed. The system employs an FIB/STEM compatible holder with a specially designed tilt mechanism, which allows the specimen to be tilted through 360 degrees [T. Yaguchi, M. Konno, T. Kamino, T. Hashimoto, T. Ohnishi, K. Umemura, K. Asayama, Microsc. Microanal. 9 (Suppl. 2) (2003) 118; T. Yaguchi, M. Konno, T. Kamino, T. Hashimoto, T. Ohnishi, M. Watanabe, Microsc. Microanal. 10 (Suppl. 2) (2004) 1030]. This technique was applied to obtain the three-dimensional (3D) elemental distributions around a contact plug of a Si device used in a 90-nm technology. A specimen containing only one contact plug was prepared in the shape of a pillar with a diameter of 200 nm and a length of 5 μm. Elemental maps were obtained from the pillar specimen using a 200-kV cold field-emission gun (FEG) STEM model HD-2300C equipped with the EDAX genesis X-ray energy-dispersive spectrometry (XEDS) system through a spectrum imaging technique. In this study, elemental distributions of minor elements with weak signals were enhanced by applying principal component analysis (PCA), which is a superior technique to extract weak signals from a large dataset. The distributions of elements, especially the metallization component Ti and the minor dopant As in this particular device, were successfully extracted by PCA. Finally, the 3D elemental distributions around the contact plug could be visualized by reconstruction from the tilt series of maps.
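
    As a rough illustration of the PCA step, the Python sketch below (using scikit-learn, with invented array sizes and synthetic counts) treats a spectrum image as one spectrum per pixel and maps component scores back to image form; weak but spatially correlated signals, such as a minor dopant, concentrate in the score maps.

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical spectrum image: ny x nx pixels, each with an XEDS spectrum
        ny, nx, n_channels = 64, 64, 1024
        cube = np.random.poisson(lam=1.0, size=(ny, nx, n_channels)).astype(float)

        X = cube.reshape(ny * nx, n_channels)    # one spectrum per row
        pca = PCA(n_components=8)
        scores = pca.fit_transform(X)            # component weights per pixel

        # Reshaping each component's scores back to image form gives maps in
        # which weak, spatially correlated signals stand out
        score_maps = scores.reshape(ny, nx, 8)
        print(pca.explained_variance_ratio_[:4])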

  7. Metabolomic evaluation of ginsenosides distribution in Panax genus (Panax ginseng and Panax quinquefolius) using multivariate statistical analysis.

    PubMed

    Pace, Roberto; Martinelli, Ernesto Marco; Sardone, Nicola; D E Combarieu, Eric

    2015-03-01

    Ginseng is any one of eleven species belonging to the genus Panax of the family Araliaceae, found in North America and eastern Asia, and is characterized by the presence of ginsenosides. Panax ginseng and Panax quinquefolius in particular are adaptogenic herbs commonly distributed in health food markets. In the present study, high performance liquid chromatography was used to identify and quantify ginsenosides in the two subject species and in the different parts of the plant (roots, neck, leaves, flowers, fruits). The power of this chromatographic technique to verify the identity of botanical material and to distinguish different parts of the plant was investigated with a metabolomic technique, principal component analysis. Metabolomics provides a good opportunity for mining useful chemical information from the chromatographic data set, making it an important tool for evaluating the authenticity, consistency and efficacy of medicinal plants. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Operando analysis of lithium profiles in Li-ion batteries using nuclear microanalysis

    NASA Astrophysics Data System (ADS)

    Surblé, S.; Paireau, C.; Martin, J.-F.; Tarnopolskiy, V.; Gauthier, M.; Khodja, H.; Daniel, L.; Patoux, S.

    2018-07-01

    A wide variety of analytical methods are used for studying the behavior of lithium-ion batteries, and particularly the lithium-ion distribution in the electrodes. The development of in situ/operando techniques has proved powerful for understanding the mechanisms responsible for lithium trapping and, in turn, aging. Herein, we report the design of an electrochemical cell to profile operando the lithium concentration in LiFePO4 electrodes using Ion Beam Analysis techniques. The specificity of the cell resides in its ability not only to provide qualitative information about the elements present but, above all, to measure their content in the electrode quantitatively at different states of charge of the battery. The nuclear methods give direct information about the degradation of the electrolyte and in particular reveal inhomogeneous distributions of lithium and fluorine along the entire thickness of the electrode. Higher concentrations of fluorine are detected near the electrode/electrolyte interface, while a depletion of lithium is observed near the current collector at high states of charge.

  9. Post hoc interlaboratory comparison of single particle ICP-MS size measurements of NIST gold nanoparticle reference materials.

    PubMed

    Montoro Bustos, Antonio R; Petersen, Elijah J; Possolo, Antonio; Winchester, Michael R

    2015-09-01

    Single particle inductively coupled plasma-mass spectrometry (spICP-MS) is an emerging technique that enables simultaneous measurement of nanoparticle size and number quantification of metal-containing nanoparticles at realistic environmental exposure concentrations. Such measurements are needed to understand the potential environmental and human health risks of nanoparticles. Before spICP-MS can be considered a mature methodology, additional work is needed to standardize this technique including an assessment of the reliability and variability of size distribution measurements and the transferability of the technique among laboratories. This paper presents the first post hoc interlaboratory comparison study of the spICP-MS technique. Measurement results provided by six expert laboratories for two National Institute of Standards and Technology (NIST) gold nanoparticle reference materials (RM 8012 and RM 8013) were employed. The general agreement in particle size between spICP-MS measurements and measurements by six reference techniques demonstrates the reliability of spICP-MS and validates its sizing capability. However, the precision of the spICP-MS measurement was better for the larger 60 nm gold nanoparticles and evaluation of spICP-MS precision indicates substantial variability among laboratories, with lower variability between operators within laboratories. Global particle number concentration and Au mass concentration recovery were quantitative for RM 8013 but significantly lower and with a greater variability for RM 8012. Statistical analysis did not suggest an optimal dwell time, because this parameter did not significantly affect either the measured mean particle size or the ability to count nanoparticles. Finally, the spICP-MS data were often best fit with several single non-Gaussian distributions or mixtures of Gaussian distributions, rather than the more frequently used normal or log-normal distributions.
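
    The closing observation, that size data are often best fit by mixtures of Gaussians rather than a single normal or log-normal, can be reproduced with standard tools. A hedged Python sketch using scikit-learn and synthetic sizes (all values invented) selects the number of mixture components by BIC:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Synthetic particle sizes (nm): a main mode plus a small second mode
        sizes = np.concatenate([rng.normal(27, 2.5, 900), rng.normal(33, 1.5, 100)])
        X = sizes.reshape(-1, 1)

        # Fit mixtures with 1..4 components and keep the lowest-BIC model
        fits = [GaussianMixture(n_components=k, random_state=0).fit(X)
                for k in range(1, 5)]
        best = min(fits, key=lambda m: m.bic(X))
        print("components:", best.n_components, "means:", best.means_.ravel())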

  10. Nonlinear truncation error analysis of finite difference schemes for the Euler equations

    NASA Technical Reports Server (NTRS)

    Klopfer, G. H.; Mcrae, D. S.

    1983-01-01

    It is pointed out that, in general, dissipative finite difference integration schemes have been found to be quite robust when applied to the Euler equations of gas dynamics. The present investigation considers a modified equation analysis of both implicit and explicit finite difference techniques as applied to the Euler equations. The analysis is used to identify those error terms which contribute most to the observed solution errors. A technique for analytically removing the dominant error terms is demonstrated, resulting in a greatly improved solution for the explicit Lax-Wendroff schemes. It is shown that the nonlinear truncation errors are quite large and distributed quite differently for each of the three conservation equations as applied to a one-dimensional shock tube problem.
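
    For readers unfamiliar with the scheme family analyzed here, the sketch below applies Lax-Wendroff to the simpler linear advection equation u_t + a u_x = 0 (not the Euler system of the paper); its modified equation carries a leading dispersive error term, the kind of truncation error the analysis above isolates. Grid and pulse parameters are arbitrary.

        import numpy as np

        a, L, nx, cfl = 1.0, 1.0, 200, 0.8       # wave speed, domain, cells, CFL
        dx = L / nx
        dt = cfl * dx / a
        x = np.arange(nx) * dx
        u = np.exp(-200 * (x - 0.3) ** 2)        # smooth initial pulse

        for _ in range(int(0.4 / dt)):           # advance to t ~ 0.4
            up = np.roll(u, -1)                  # u_{i+1} (periodic grid)
            um = np.roll(u, 1)                   # u_{i-1}
            u = u - 0.5 * cfl * (up - um) + 0.5 * cfl**2 * (up - 2 * u + um)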

  11. Quantization error of CCD cameras and their influence on phase calculation in fringe pattern analysis.

    PubMed

    Skydan, Oleksandr A; Lilley, Francis; Lalor, Michael J; Burton, David R

    2003-09-10

    We present an investigation into the phase errors that occur in fringe pattern analysis that are caused by quantization effects. When acquisition devices with a limited value of camera bit depth are used, there are a limited number of quantization levels available to record the signal. This may adversely affect the recorded signal and adds a potential source of instrumental error to the measurement system. Quantization effects also determine the accuracy that may be achieved by acquisition devices in a measurement system. We used the Fourier fringe analysis measurement technique. However, the principles can be applied equally well for other phase measuring techniques to yield a phase error distribution that is caused by the camera bit depth.

  12. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
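
    The histogram-interval question is the same one addressed by classical bin-count heuristics. As a point of comparison only (the paper proposes its own heuristic, not reproduced here), NumPy exposes several standard rules:

        import numpy as np

        data = np.random.default_rng(0).normal(size=500)

        # Compare standard interval-count rules on the same sample
        for rule in ("sturges", "scott", "fd"):
            edges = np.histogram_bin_edges(data, bins=rule)
            print(rule, len(edges) - 1, "intervals")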

  13. Predicting neuropathic ulceration: analysis of static temperature distributions in thermal images

    NASA Astrophysics Data System (ADS)

    Kaabouch, Naima; Hu, Wen-Chen; Chen, Yi; Anderson, Julie W.; Ames, Forrest; Paulson, Rolf

    2010-11-01

    Foot ulcers affect millions of Americans annually. Conventional methods used to assess skin integrity, including inspection and palpation, may be valuable approaches, but they usually do not detect changes in skin integrity until an ulcer has already developed. We analyze the feasibility of thermal imaging as a technique to assess the integrity of the skin and its many layers. Thermal images are analyzed using an asymmetry analysis, combined with a genetic algorithm, to examine the infrared images for early detection of foot ulcers. Preliminary results show that the proposed technique can reliably and efficiently detect inflammation and hence effectively predict potential ulceration.

  14. Verification of mesoscale objective analyses of VAS and rawinsonde data using the March 1982 AVE/VAS special network data

    NASA Technical Reports Server (NTRS)

    Doyle, James D.; Warner, Thomas T.

    1987-01-01

    Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques is discussed.

  15. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc., are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
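
    As a concrete instance of the static analysis mentioned, the Liu-Layland utilization bound for rate-monotonic scheduling can be checked in a few lines. The task set below is hypothetical, not drawn from the DMS design:

        def rma_schedulable(tasks):
            """Liu & Layland sufficient test for rate-monotonic scheduling.
            `tasks` is a list of (worst_case_execution_time, period) pairs."""
            n = len(tasks)
            utilization = sum(c / t for c, t in tasks)
            bound = n * (2 ** (1.0 / n) - 1)
            return utilization <= bound, utilization, bound

        # Three hypothetical periodic tasks: (C, T) in milliseconds
        ok, u, bound = rma_schedulable([(10, 50), (15, 100), (20, 200)])
        print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable: {ok}")

    The test is sufficient but not necessary: a set that fails the bound may still be schedulable, which is one reason the paper argues for dynamic modeling alongside RMA.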

  16. Particle size and X-ray analysis of Feldspar, Calvert, Ball, and Jordan soils

    NASA Technical Reports Server (NTRS)

    Chapman, R. S.

    1977-01-01

    Pipette analysis and X-ray diffraction techniques were employed to characterize the particle size distribution and clay mineral content of the Feldspar, Calvert, Ball, and Jordan soils. In general, the Ball, Calvert, and Jordan soils consisted primarily of clay-size particles composed of kaolinite and illite, whereas the Feldspar soil consisted primarily of silt-size particles composed of quartz and feldspar minerals.

  17. A method of using cluster analysis to study statistical dependence in multivariate data

    NASA Technical Reports Server (NTRS)

    Borucki, W. J.; Card, D. H.; Lyle, G. C.

    1975-01-01

    A technique is presented that uses both cluster analysis and a Monte Carlo significance test of clusters to discover associations between variables in multidimensional data. The method is applied to an example of a noisy function in three-dimensional space, to a sample from a mixture of three bivariate normal distributions, and to the well-known Fisher's Iris data.

  18. Integrated Efforts for Analysis of Geophysical Measurements and Models.

    DTIC Science & Technology

    1997-09-26

    This contract supported investigations of integrated applications of physics, ephemerides... Table-of-contents fragments: ...REGIONS AND GPS DATA VALIDATIONS; PL-SCINDA: VISUALIZATION AND ANALYSIS TECHNIQUES (View Controls; Map Selection). ...and IR data, about cloudy pixels. Clustering and maximum likelihood classification algorithms categorize up to four cloud layers into stratiform or

  19. The Outlier Detection for Ordinal Data Using Scaling Technique of Regression Coefficients

    NASA Astrophysics Data System (ADS)

    Adnan, Arisman; Sugiarto, Sigit

    2017-06-01

    The aim of this study is to detect outliers using the coefficients of Ordinal Logistic Regression (OLR) for the case of k category responses, where scores run from 1 (the best) to 8 (the worst). We detect them by using the sum of moduli of the ordinal regression coefficients calculated by the jackknife technique. The technique is improved by scaling the regression coefficients to their means. The R language has been used on a set of ordinal data from a reference distribution. Furthermore, we compare this approach with studentised residual plots of the jackknife technique for ANOVA (Analysis of Variance) and OLR. This study shows that the jackknifing technique, along with proper scaling, can reveal outliers in ordinal regression reasonably well.
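
    A minimal Python sketch of the jackknife-and-scale idea follows, with ordinary least squares standing in for the ordinal logistic model (an intentional simplification) and all data invented. Cases whose deletion moves the scaled coefficients far from 1 are flagged as candidate outliers:

        import numpy as np

        def jackknife_outlier_scores(X, y):
            """Leave-one-out regression coefficients, scaled to their means;
            the score is the sum of moduli of the deviations from 1."""
            n = len(y)
            coefs = []
            for i in range(n):
                keep = np.arange(n) != i
                beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
                coefs.append(beta)
            coefs = np.asarray(coefs)
            scaled = coefs / coefs.mean(axis=0)
            return np.abs(scaled - 1.0).sum(axis=1)

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(30), rng.normal(size=(30, 2))])
        y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.1, size=30)
        y[5] += 4.0                              # planted outlier
        print(jackknife_outlier_scores(X, y).argmax())   # ideally prints 5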

  20. Parallel object-oriented, denoising system using wavelet multiresolution analysis

    DOEpatents

    Kamath, Chandrika; Baldwin, Chuck H.; Fodor, Imola K.; Tang, Nu A.

    2005-04-12

    The present invention provides a data de-noising system utilizing multiple processors and wavelet denoising techniques. Data is read and displayed in different formats. The data is partitioned into regions, and the regions are distributed onto the processors. Communication requirements among the processors are determined according to the wavelet denoising technique and the partitioning of the data. The data is transformed onto different multiresolution levels with the wavelet transform according to the wavelet denoising technique and the communication requirements, with the transformed data containing the wavelet coefficients. The denoised data is then transformed back into its original format for reading and display.
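
    The serial core of such a system (decompose, shrink the detail coefficients, reconstruct) can be sketched with the PyWavelets library; the patent's region partitioning and inter-processor communication are omitted here, and the universal-threshold rule is one common choice, not necessarily the patented one:

        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 1024)
        noisy = np.sin(8 * np.pi * t) + 0.4 * rng.normal(size=t.size)

        coeffs = pywt.wavedec(noisy, "db4", level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise estimate
        thresh = sigma * np.sqrt(2 * np.log(noisy.size))  # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")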

  1. Determination of optimum allocation and pricing of distributed generation using genetic algorithm methodology

    NASA Astrophysics Data System (ADS)

    Mwakabuta, Ndaga Stanslaus

    Electric power distribution systems play a significant role in providing continuous and "quality" electrical energy to different classes of customers. In the context of the present restrictions on transmission system expansions and the new paradigm of "open and shared" infrastructure, new approaches to distribution system analysis and to economic and operational decision-making need investigation. This dissertation includes three layers of distribution system investigations. At the basic level, improved linear models are shown to offer significant advantages over previous models for advanced analysis. At the intermediate level, the improved model is applied to solve the traditional problem of operating cost minimization using capacitors and voltage regulators. At the advanced level, an artificial intelligence technique is applied to minimize cost under Distributed Generation injection from private vendors. Soft computing techniques are finding increasing application in solving optimization problems in large and complex practical systems, and the dissertation focuses on the genetic algorithm for investigating the economic aspects of distributed generation penetration without compromising the operational security of the distribution system. The work presents a methodology for determining the optimal pricing of distributed generation that would help utilities decide how to operate their system economically. This would enable modular and flexible investments that have real benefits to the electric distribution system: improved reliability for both customers and the distribution system in general, reduced environmental impacts, increased efficiency of energy use, and reduced costs of energy services.
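
    The genetic-algorithm loop at the heart of such a study is compact. The Python sketch below is a toy version with a placeholder cost function (a real study would evaluate a power flow at each candidate); the feeder size, cost shape, and GA parameters are all invented:

        import random

        random.seed(1)
        N_BUS = 33                               # hypothetical feeder size

        def cost(bus, size_mw):
            """Placeholder cost model, shaped so that mid-feeder placement
            and a moderate DG size win."""
            return (bus - 18) ** 2 * 0.01 + (size_mw - 2.0) ** 2

        def evolve(pop_size=30, generations=60, mutation=0.2):
            pop = [(random.randrange(N_BUS), random.uniform(0.0, 5.0))
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda g: cost(*g))
                survivors = pop[: pop_size // 2]         # truncation selection
                children = []
                while len(children) < pop_size - len(survivors):
                    a, b = random.sample(survivors, 2)
                    child = [a[0], b[1]]                 # one-point crossover
                    if random.random() < mutation:
                        child[0] = random.randrange(N_BUS)
                    if random.random() < mutation:
                        child[1] = random.uniform(0.0, 5.0)
                    children.append(tuple(child))
                pop = survivors + children
            return min(pop, key=lambda g: cost(*g))

        print(evolve())                          # converges near (18, 2.0)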

  2. Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis

    PubMed Central

    Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.

    2015-01-01

    Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242

  3. Laser-induced plasma spectroscopy (LIPS): use of a geological tool in assessing bone mineral content.

    PubMed

    Andrássy, László; Gomez, Izabella; Horváth, Ágnes; Gulyás, Katalin; Pethö, Zsófia; Juhász, Balázs; Bhattoa, Harjit Pal; Szekanecz, Zoltan

    2018-02-17

    Bone may be similar to geological formations in many ways. Therefore, it may be logical to apply laser-based geological techniques in bone research. The mineral and element oxide composition of bioapatite can be estimated by mathematical models. Laser-induced plasma spectrometry (LIPS) has long been used in geology. This method may provide a possibility to determine the composition and concentration of the element oxides forming the inorganic part of bones. In this study, we wished to standardize the LIPS technique and use mathematical calculations and models in order to determine CaO distribution and bone homogeneity using bovine shin bone samples. We used polished slices of five bovine shin bones. A portable LIPS instrument using high-power Nd:YAG laser pulses has been developed (OpLab, Budapest). Analysis of CaO distribution was carried out in a 10 × 10 sampling matrix applying 300-μm sampling intervals. We assessed both cortical and trabecular bone areas. Regions of interest (ROI) were determined under a microscope. CaO peaks were identified in the 200-500 nm wavelength range. A mathematical formula was used to calculate the element oxide composition (wt%) of inorganic bone. We also applied two accepted mathematical approaches, Bartlett's test and frequency distribution curve-based analysis, to determine the homogeneity of CaO distribution in bones. We were able to standardize the LIPS technique for bone research. CaO concentrations in the cortical and trabecular regions of the B1-5 bones were 33.11 ± 3.99% (range 24.02-40.43%) and 27.60 ± 7.44% (range 3.58-39.51%), respectively. CaO concentrations highly corresponded to those routinely determined by ICP-OES. We were able to graphically demonstrate CaO distribution in both 2D and 3D. We also determined possible interrelations between laser-induced craters and bone structure units, which may reflect the bone structure and may influence the heterogeneity of CaO distributions. By using two different statistical methods, we could confirm whether bone samples were homogeneous with respect to CaO concentration distribution. LIPS, a technique previously used in geology, may thus be included in bone research. Assessment of element oxide concentrations in the inorganic part of bone, together with these mathematical calculations, may be useful to determine the content of CaO and other element oxides in bone, to further analyze bone structure and homogeneity, and possibly to apply this research to normal as well as diseased bones.
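
    Bartlett's test, one of the two homogeneity checks mentioned, is available in SciPy. The sketch below applies it to invented CaO readings from three sub-areas of a sample; a small p-value indicates unequal variances, i.e. an inhomogeneous CaO distribution:

        from scipy import stats

        # Hypothetical CaO wt% readings from three sub-areas of one bone slice
        area1 = [33.2, 34.0, 32.5, 33.8, 33.1]
        area2 = [27.9, 29.4, 26.8, 28.5, 28.0]
        area3 = [33.5, 32.9, 41.0, 25.1, 30.2]   # more scattered sub-area

        stat, p = stats.bartlett(area1, area2, area3)
        print(f"Bartlett statistic = {stat:.2f}, p = {p:.3f}")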

  4. Distributed Sleep Scheduling in Wireless Sensor Networks via Fractional Domatic Partitioning

    NASA Astrophysics Data System (ADS)

    Schumacher, André; Haanpää, Harri

    We consider setting up sleep scheduling in sensor networks. We formulate the problem as an instance of the fractional domatic partition problem and obtain a distributed approximation algorithm by applying linear programming approximation techniques. Our algorithm is an application of the Garg-Könemann (GK) scheme that requires solving an instance of the minimum weight dominating set (MWDS) problem as a subroutine. Our two main contributions are a distributed implementation of the GK scheme for the sleep-scheduling problem and a novel asynchronous distributed algorithm for approximating MWDS based on a primal-dual analysis of Chvátal's set-cover algorithm. We evaluate our algorithm with ns2 simulations.
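
    The subroutine at the centre of the algorithm is an MWDS approximation. A centralized greedy baseline (the classical set-cover heuristic, which the paper's distributed primal-dual algorithm refines) can be sketched as follows; the graph and weights are invented:

        def greedy_mwds(neighbors, weight):
            """Repeatedly pick the node with the best weight per newly
            dominated node until every node is dominated."""
            uncovered = set(neighbors)
            chosen = []
            while uncovered:
                def gain(v):
                    return len((set(neighbors[v]) | {v}) & uncovered)
                v = min((u for u in neighbors if gain(u) > 0),
                        key=lambda u: weight[u] / gain(u))
                chosen.append(v)
                uncovered -= set(neighbors[v]) | {v}
            return chosen

        adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
        w = {0: 2.0, 1: 1.0, 2: 2.0, 3: 1.5, 4: 2.0}
        print(greedy_mwds(adj, w))               # -> [1, 3]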

  5. The application of structural reliability techniques to plume impingement loading of the Space Station Freedom Photovoltaic Array

    NASA Technical Reports Server (NTRS)

    Yunis, Isam S.; Carney, Kelly S.

    1993-01-01

    A new aerospace application of structural reliability techniques is presented, where the applied forces depend on many probabilistic variables. This application is the plume impingement loading of the Space Station Freedom Photovoltaic Arrays. When the space shuttle berths with Space Station Freedom it must brake and maneuver towards the berthing point using its primary jets. The jet exhaust, or plume, may cause high loads on the photovoltaic arrays. The many parameters governing this problem are highly uncertain and random. An approach, using techniques from structural reliability, as opposed to the accepted deterministic methods, is presented which assesses the probability of failure of the array mast due to plume impingement loading. A Monte Carlo simulation of the berthing approach is used to determine the probability distribution of the loading. A probability distribution is also determined for the strength of the array. Structural reliability techniques are then used to assess the array mast design. These techniques are found to be superior to the standard deterministic dynamic transient analysis, for this class of problem. The results show that the probability of failure of the current array mast design, during its 15 year life, is minute.
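
    In outline, the reliability computation reduces to sampling a load distribution against a strength distribution and counting crossings. The Python sketch below uses invented lognormal/normal parameters in arbitrary units; the study derived its distributions from berthing simulations and test data:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Hypothetical distributions: lognormal peak plume load vs. normal
        # mast strength (arbitrary units)
        load = rng.lognormal(mean=np.log(40.0), sigma=0.35, size=n)
        strength = rng.normal(loc=120.0, scale=10.0, size=n)

        p_fail = np.mean(load > strength)        # fraction of sampled crossings
        print(f"estimated P(failure) ~ {p_fail:.2e}")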

  6. Relationships of damaged starch granules and particle size distribution with pasting and thermal profiles of milled MR263 rice flour.

    PubMed

    Asmeda, R; Noorlaila, A; Norziah, M H

    2016-01-15

    This research was conducted to investigate the effects of different grinding techniques (dry, semi-wet and wet) of milled rice grains on the damaged starch and particle size distribution of flour produced from a new variety, MR263, specifically in relation to the pasting and thermal profiles. The results indicated that grinding techniques significantly (p<0.05) affected the starch damage content and particle size distribution of rice flour. The wet grinding process yields flour with the lowest percentage of starch damage (7.37%) and the finest average particle size (8.52 μm). Pasting and gelatinization temperatures were found in the ranges of 84.45-89.63°C and 59.86-75.31°C, respectively. Dry-ground flour attained the lowest pasting and gelatinization temperatures, as shown by the thermal and pasting profiles. Correlation analysis revealed that the percentage of damaged starch granules had a significant negative relationship with pasting temperature, while average particle size had a significant, strong negative relationship with gelatinization temperature. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, an irregular welded surface and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined by three different methods: strain gauge measurement, thermal stress analysis and FEM. The results show good agreement: TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to these results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

  8. A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.

    PubMed

    Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W

    2005-01-01

    We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.

  9. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
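
    The flavor of the secret-sharing machinery can be conveyed with plain additive sharing over a prime field (a simplification; the competition protocols are considerably more involved). Each site splits its local count into shares, and only the aggregate is ever reconstructed:

        import secrets

        P = 2**61 - 1                            # prime modulus

        def share(value, n_parties=3):
            """Split `value` into n additive shares mod P; any n-1 shares
            reveal nothing about the value."""
            shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        def reconstruct(shares):
            return sum(shares) % P

        # Each site shares a local minor-allele count; parties sum the share
        # vectors componentwise and publish only the aggregate
        site_counts = [42, 17, 5]
        all_shares = [share(c) for c in site_counts]
        aggregate = [sum(col) % P for col in zip(*all_shares)]
        print(reconstruct(aggregate))            # -> 64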

  10. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  11. Laser absorption-scattering technique applied to asymmetric evaporating fuel sprays for simultaneous measurement of vapor/liquid mass distributions

    NASA Astrophysics Data System (ADS)

    Gao, J.; Nishida, K.

    2010-10-01

    This paper describes an Ultraviolet-Visible Laser Absorption-Scattering (UV-Vis LAS) imaging technique applied to asymmetric fuel sprays. Building on previous studies, the detailed measurement principle is derived. It is demonstrated that, by means of this technique, cumulative masses and mass distributions of the vapor and liquid phases can be quantitatively measured regardless of the spray's shape. A systematic uncertainty analysis was performed, and the measurement accuracy was verified through a series of experiments on completely vaporized fuel sprays. The results show that the Molar Absorption Coefficient (MAC) of the test fuel, which is typically pressure- and temperature-dependent, is the major error source. The measurement error in the vapor determination is approximately 18% under the assumption of a constant MAC of the test fuel. Two application examples of the extended LAS technique are presented for exploring the dynamics and physical insight of evaporating fuel sprays: diesel sprays injected by group-hole nozzles and gasoline sprays impinging on an inclined wall.

  12. A wireless data acquisition system for acoustic emission testing

    NASA Astrophysics Data System (ADS)

    Zimmerman, A. T.; Lynch, J. P.

    2013-01-01

    As structural health monitoring (SHM) systems have seen increased demand due to lower costs and greater capabilities, wireless technologies have emerged that enable the dense distribution of transducers and the distributed processing of sensor data. In parallel, ultrasonic techniques such as acoustic emission (AE) testing have become increasingly popular in the non-destructive evaluation of materials and structures. These techniques, which involve the analysis of frequency content between 1 kHz and 1 MHz, have proven effective in detecting the onset of cracking and other early-stage failure in active structures such as airplanes in flight. However, these techniques typically involve the use of expensive and bulky monitoring equipment capable of accurately sensing AE signals at sampling rates greater than 1 million samples per second. In this paper, a wireless data acquisition system is presented that is capable of collecting, storing, and processing AE data at rates of up to 20 MHz. Processed results can then be wirelessly transmitted in real-time, creating a system that enables the use of ultrasonic techniques in large-scale SHM systems.

  13. Latin-square three-dimensional gage master

    DOEpatents

    Jones, L.

    1981-05-12

    A gage master for coordinate measuring machines has an n×n array of objects distributed in the Z coordinate utilizing the concept of a Latin square experimental design. Using analysis of variance techniques, the invention may be used to identify sources of error in machine geometry and quantify machine accuracy.
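
    A cyclic construction generates such a square directly; the sketch below (illustrative only, not the patented layout) assigns each of n Z-heights exactly once per row and per column, which is what lets analysis of variance separate row, column, and height effects:

        def latin_square(n):
            """Cyclic n x n Latin square: entry (i, j) is the index of the
            Z height assigned to the object in row i, column j."""
            return [[(i + j) % n for j in range(n)] for i in range(n)]

        for row in latin_square(4):
            print(row)                           # each index once per row/column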

  14. Latin square three dimensional gage master

    DOEpatents

    Jones, Lynn L.

    1982-01-01

    A gage master for coordinate measuring machines has an n×n array of objects distributed in the Z coordinate utilizing the concept of a Latin square experimental design. Using analysis of variance techniques, the invention may be used to identify sources of error in machine geometry and quantify machine accuracy.

  15. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
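
    Of the two interpolators named, inverse distance weighting is the simpler; a minimal NumPy sketch with invented sample coordinates and concentrations follows:

        import numpy as np

        def idw(xy_known, values, xy_query, power=2.0):
            """Inverse-distance-weighted estimate at each query point from
            scattered soil-sample measurements."""
            d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
            d = np.maximum(d, 1e-12)             # guard against zero distance
            w = 1.0 / d ** power
            return (w @ values) / w.sum(axis=1)

        samples = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        pb = np.array([12.0, 30.0, 18.0, 55.0])  # hypothetical Pb (mg/kg)
        grid = np.array([[0.5, 0.5], [0.1, 0.1]])
        print(idw(samples, pb, grid))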

  16. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    DOE PAGES

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; ...

    2016-07-08

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. Additionally, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  17. Visualization and analysis of pulsed ion beam energy density profile with infrared imaging

    NASA Astrophysics Data System (ADS)

    Isakova, Y. I.; Pushkarev, A. I.

    2018-03-01

    An infrared imaging technique was used as a surface temperature-mapping tool to characterize the energy density distribution of intense pulsed ion beams on a thin metal target. The technique enables measurement of the total ion beam energy and of the energy density distribution across the beam cross section, and allows one to optimize the operation of an ion diode and control the target irradiation mode. The diagnostic was tested on the TEMP-4M accelerator at TPU, Tomsk, Russia and on the TEMP-6 accelerator at DUT, Dalian, China. It was applied in studies of the dynamics of target cooling in vacuum after irradiation and in experiments with target ablation. Errors caused by target ablation and target cooling during measurements have been analyzed. For Fluke Ti10 and Fluke Ti400 infrared cameras, the technique can achieve a surface energy density sensitivity of 0.05 J/cm² and a spatial resolution of 1-2 mm. The thermal imaging diagnostic does not require expensive consumables. The measurement time does not exceed 0.1 s; therefore, this diagnostic can be used for prompt evaluation of the energy density distribution of a pulsed ion beam and for automation of the irradiation process.

  18. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy.

    PubMed

    Tremsin, Anton S; Gao, Yan; Dial, Laura C; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  19. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    NASA Astrophysics Data System (ADS)

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with 100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  20. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. Additionally, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  1. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    PubMed Central

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components. PMID:27877885

  2. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space. While this approach helps address conceptual and parametric uncertainties, the ensemble datasets produced by this technique present a special challenge to visualization researchers as the ensemble dataset records a distribution of possible values for each location in the domain. Contemporary visualization approaches that rely solely on summary statistics (e.g., mean and variance) cannot convey the detailed information encoded in ensemble distributions that are paramount to ensemble analysis; summary statistics provide no information about modality classification and modality persistence. To address this problem, we propose a novel technique that classifies high-variance locations based on the modality of the distribution of ensemble predictions. Additionally, we develop a set of confidence metrics to inform the end-user of the quality of fit between the distribution at a given location and its assigned class. We apply a similar method to time-varying ensembles to illustrate the relationship between peak variance and bimodal or multimodal behavior. These classification schemes enable a deeper understanding of the behavior of the ensemble members by distinguishing between distributions that can be described by a single tendency and distributions which reflect divergent trends in the ensemble.

  3. Experimental Study and Modeling of Anaerobic Digestion of Residual Organic Matter under Hyperthermophilic Conditions

    NASA Astrophysics Data System (ADS)

    Altamirano, Felipe Ignacio Castro

This dissertation focuses on the problem of designing rates in the utility sector. It is motivated by recent developments in the electricity industry, where renewable generation technologies and distributed energy resources are becoming increasingly relevant. Both technologies disrupt the sector in unique ways. While renewables make grid operations more complex, and potentially more expensive, distributed energy resources enable consumers to interact with the grid in both directions. Both developments present challenges and opportunities for regulators, who must adapt their techniques for evaluating policies to the emerging technological conditions. The first two chapters of this work make the case for updating existing techniques for evaluating tariff structures. They also propose new methods that are more appropriate given the prospective technological characteristics of the sector. The first chapter constructs an analytic tool based on a model that captures the interaction between pricing and investment. In contrast to previous approaches, this technique allows consistent comparison of portfolios of rates while enabling researchers to model the supply side of the sector in significantly greater detail. A key theoretical implication of the model that underlies this technique is that, by properly updating the portfolio of tariffs, a regulator could induce the welfare-maximizing adoption of distributed energy resources and enrollment in rate structures. We develop an algorithm to find globally optimal solutions of this model, which is a nonlinear mathematical program. The results of a computational experiment show that the performance of the algorithm dominates that of commercial nonlinear solvers. In addition, to illustrate the practical relevance of the method, we conduct a cost-benefit analysis of implementing time-variant tariffs in two electricity systems, California and Denmark. Although portfolios with time-varying rates create value in both systems, these improvements differ enough to advise very different policies. While time-varying tariffs appear unattractive in Denmark, they deserve at least further review in California. This conclusion is beyond the reach of previous techniques for analyzing rates, as they do not capture the interplay between an intermittent supply and a price-responsive demand. While useful, the method we develop in the first chapter has two important limitations. One is the lack of transparency of the parameters that determine demand substitution patterns and demand heterogeneity; the other is the narrow range of rate structures that can be studied with the technique. Both limitations stem from taking a demand function as a primitive. Following an alternative path, in the second chapter we develop a technique based on a pricing model whose fundamental building block is the consumer utility maximization problem. Because researchers do not have to limit themselves to problems with unique solutions, this approach significantly increases the flexibility of the model and, in particular, addresses the limitations of the technique developed in the first chapter. This gain in flexibility decreases the practicality of our method, since the underlying model becomes a bilevel problem. To be able to handle realistic instances, we develop a decomposition method based on a nonlinear variant of the Alternating Direction Method of Multipliers, which combines conic and mixed-integer programming. A numerical experiment shows that the performance of the solution technique is robust to instance sizes and a wide combination of parameters. We illustrate the relevance of the new method with another applied analysis of rate structures. Our results highlight the value of being able to model distributed energy resources in detail. They also show that ignoring transmission constraints can have meaningful impacts on the analysis of rate structures. In addition, we conduct a distributional analysis, which portrays how our method permits regulators and policy makers to study the impacts of a rate update on a heterogeneous population. While a switch in rates could have a positive impact on households in the aggregate, it could benefit some more than others, and even harm some customers. Our technique makes it possible to anticipate these impacts, letting regulators decide among rate structures with considerably more information than would be available with alternative approaches. In the third chapter, we conduct an empirical analysis of rate structures in California, which is currently undergoing a rate reform. To contribute to the ongoing regulatory debate about the future of rates, we analyze in depth a set of plausible tariff alternatives. In our analysis, we focus on a scenario in which advanced metering infrastructure and home energy management systems are widely adopted. Our modeling approach allows us to capture a wide variety of temporal and spatial demand substitution patterns without the need to estimate a large number of parameters. (Abstract shortened by ProQuest.).
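To make the decomposition step concrete, the following is a minimal sketch of a scaled-form ADMM loop, applied to a small lasso problem rather than to the dissertation's conic/mixed-integer subproblems; every name and datum here is invented for illustration.

```python
import numpy as np

# Generic scaled-form ADMM skeleton for: minimize f(x) + g(z) s.t. x - z = 0.
# The dissertation's decomposition uses a nonlinear ADMM variant on far richer
# subproblems; this only shows the algorithmic template.
def admm(prox_f, prox_g, n, rho=1.0, iters=200):
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(iters):
        x = prox_f(z - u, rho)   # local subproblem
        z = prox_g(x + u, rho)   # coupling / consensus step
        u = u + x - z            # scaled dual update
    return x

# Worked toy instance: lasso, minimize 0.5*||Ax-b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.normal(size=(30, 10)), rng.normal(size=30), 0.1
AtA, Atb = A.T @ A, A.T @ b
prox_f = lambda v, rho: np.linalg.solve(AtA + rho * np.eye(10), Atb + rho * v)
prox_g = lambda v, rho: np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
print(admm(prox_f, prox_g, 10))
```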

  4. Intervention Techniques Used With Autism Spectrum Disorder by Speech-Language Pathologists in the United States and Taiwan: A Descriptive Analysis of Practice in Clinical Settings.

    PubMed

    Hsieh, Ming-Yeh; Lynch, Georgina; Madison, Charles

    2018-04-27

This study examined intervention techniques used with children with autism spectrum disorder (ASD) by speech-language pathologists (SLPs) in the United States and Taiwan working in clinic/hospital settings. The research questions addressed intervention techniques used with children with ASD, intervention techniques used with different age groups (under and over 8 years old), and training received before using the intervention techniques. The survey was distributed through the American Speech-Language-Hearing Association to selected SLPs across the United States. In Taiwan, the survey (Chinese version) was distributed through the Taiwan Speech-Language Pathologist Union (2018) to certified SLPs. Results revealed that SLPs in the United States and Taiwan used four common intervention techniques: Social Skill Training, Augmentative and Alternative Communication, the Picture Exchange Communication System, and Social Stories. Taiwanese SLPs reported SLP preparation program training across these common intervention strategies. In the United States, SLPs reported training via SLP preparation programs, peer therapists, and self-teaching. Most SLPs reported using established or emerging evidence-based practices as defined by the National Professional Development Center (2014) and the National Standards Report (2015). Future research should address comparison of SLP preparation programs to examine the impact of preprofessional training on the use of evidence-based practices to treat ASD.

  5. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    NASA Astrophysics Data System (ADS)

    Clough, Emily; Bell, Derek

    2016-02-01

This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education, and we compare conditions before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Community Survey and combine those data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those living further away, before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit-sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells has not been transformed since shale gas development.
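The buffer-containment step can be sketched with plain geometry; the toy below uses hypothetical well coordinates, block centroids, and attributes (the letter's actual analysis operates on dasymetrically remapped census data).

```python
from shapely.geometry import Point

# Hypothetical wells and a fixed-radius proximity buffer around each.
wells = [Point(0.0, 0.0), Point(5.0, 5.0)]
buffers = [w.buffer(2.5) for w in wells]   # 2.5 map units, purely illustrative

# Hypothetical census-block centroids with a single attribute (median income).
blocks = {"A": (Point(1.0, 1.0), 41000),
          "B": (Point(8.0, 8.0), 52000)}

near = {k for k, (pt, _) in blocks.items()
        if any(buf.contains(pt) for buf in buffers)}
print("near wells:", near, "| far from wells:", set(blocks) - near)
```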

  6. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

A new parallel-vector finite element analysis software package, MPFEA (Massively Parallel-vector Finite Element Analysis), is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme along with vector-unrolling techniques is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  7. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

A comprehensive expert-judgment elicitation methodology has been developed to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis. The ten-phase methodology seeks to obtain expert opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and they provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
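As a hedged illustration of the aggregation step only, the sketch below pools several experts' elicited distributions with a weighted linear opinion pool; the equal weights and normal pdfs are assumptions made for the example, not the report's calibrated weights.

```python
import numpy as np

grid = np.linspace(0.0, 100.0, 501)        # support of the uncertain parameter
dx = grid[1] - grid[0]

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Three hypothetical elicited expert distributions.
experts = [normal_pdf(grid, 40, 5), normal_pdf(grid, 55, 8), normal_pdf(grid, 48, 4)]
weights = np.full(len(experts), 1.0 / len(experts))   # calibration would set these

pooled = sum(w * p for w, p in zip(weights, experts))
pooled /= pooled.sum() * dx                # renormalize to a proper pdf
print("pooled mean:", (grid * pooled).sum() * dx)
```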

  8. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: the design phase, the prototype phase, and the operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
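Since the review introduces importance sampling as an accelerator for Monte Carlo simulation, a minimal sketch may help; the rare-event target and shifted proposal below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Estimate the rare probability P(X > 6) for X ~ N(0,1) by drawing from a
# shifted proposal N(6,1) and reweighting with the likelihood ratio.
rng = np.random.default_rng(1)
n, shift = 100_000, 6.0
y = rng.normal(loc=shift, size=n)                       # proposal samples
w = np.exp(-0.5 * y**2 + 0.5 * (y - shift) ** 2)        # ratio N(0,1)/N(6,1) at y
est = np.mean((y > 6.0) * w)
print(f"IS estimate: {est:.3e}")   # analytic value is roughly 1e-9;
                                   # crude Monte Carlo would need ~1e11 draws
```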

  9. Utilization of nuclear structural proteins for targeted therapy and detection of proliferative and differentiation disorders

    DOEpatents

    Lelievre, Sophie; Bissell, Mina

    2001-01-01

The localization of the nuclear mitotic apparatus protein (NuMA) is used to identify tumor cells and different stages in the tumor progression and differentiation processes. There is a characteristic organization of NuMA in tumor cells and in phenotypically normal cells. NuMA distribution patterns are significantly less diffuse in proliferating non-malignant cells than in malignant cells. The technique encompasses cell immunostaining using a NuMA-specific antibody and microscopic analysis of the NuMA distribution within each nucleus.

  10. Complex degree of mutual anisotropy in diagnostics of biological tissues physiological changes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Dubolazov, O. V.; Karachevtcev, A. O.; Zabolotna, N. I.

    2011-05-01

To characterize the degree of consistency of the parameters of the optically uniaxial birefringent protein nets of blood plasma, a new parameter, the complex degree of mutual anisotropy, is suggested. A technique for polarization measurement of the coordinate distributions of the complex degree of mutual anisotropy of blood plasma is developed. It is shown that a statistical approach to the analysis of the complex degree of mutual anisotropy distributions of blood plasma is effective in the diagnosis and differentiation of acute inflammation (acute and gangrenous appendicitis).

  11. Complex degree of mutual anisotropy in diagnostics of biological tissues physiological changes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Dubolazov, A. V.; Karachevtcev, A. O.; Zabolotna, N. I.

    2011-09-01

To characterize the degree of consistency of the parameters of the optically uniaxial birefringent protein nets of blood plasma, a new parameter, the complex degree of mutual anisotropy, is suggested. A technique for polarization measurement of the coordinate distributions of the complex degree of mutual anisotropy of blood plasma is developed. It is shown that a statistical approach to the analysis of the complex degree of mutual anisotropy distributions of blood plasma is effective in the diagnosis and differentiation of acute inflammation (acute and gangrenous appendicitis).

  12. Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.

    DTIC Science & Technology

    1985-12-01

of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an ... distribution (tape replacement) systems; database distribution; on-line mass storage; videogame ROM (juke-box). Media cost: $2-10/GB, $10-50/GB ... training of the ... great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level

  13. Measurement and analysis of operating system fault tolerance

    NASA Technical Reports Server (NTRS)

    Lee, I.; Tang, D.; Iyer, R. K.

    1992-01-01

    This paper demonstrates a methodology to model and evaluate the fault tolerance characteristics of operational software. The methodology is illustrated through case studies on three different operating systems: the Tandem GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Measurements are made on these systems for substantial periods to collect software error and recovery data. In addition to investigating basic dependability characteristics such as major software problems and error distributions, we develop two levels of models to describe error and recovery processes inside an operating system and on multiple instances of an operating system running in a distributed environment. Based on the models, reward analysis is conducted to evaluate the loss of service due to software errors and the effect of the fault-tolerance techniques implemented in the systems. Software error correlation in multicomputer systems is also investigated.
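A tiny sketch of the reward-analysis idea, assuming an invented three-state error/recovery model with illustrative rates (not the measured GUARDIAN, VMS, or MVS parameters):

```python
import numpy as np

# States: OK, error, recovery. Q is a CTMC generator (rows sum to zero);
# the reward vector gives the fraction of full service delivered per state.
Q = np.array([[-0.01,  0.01,   0.00],
              [ 0.00, -2.00,   2.00],
              [10.00,  0.00, -10.00]])
reward = np.array([1.0, 0.0, 0.2])

# Steady-state probabilities: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("expected steady-state service level:", pi @ reward)
```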

  14. Deriving Lifetime Maps in the Time/Frequency Domain of Coherent Structures in the Turbulent Boundary Layer

    NASA Technical Reports Server (NTRS)

    Palumbo, Dan

    2008-01-01

The lifetimes of coherent structures are derived from data correlated over a 3-sensor array sampling streamwise sidewall pressure at high Reynolds number (> 10^8). The data were acquired at subsonic, transonic and supersonic speeds aboard a Tupolev Tu-144. The lifetimes are computed from a variant of the correlation length termed the lifelength. Characteristic lifelengths are estimated by fitting a Gaussian distribution to the sensors' cross spectra and are shown to compare favorably with Efimtsov's prediction of correlation space scales. Lifelength distributions are computed in the time/frequency domain using an interval correlation technique on the continuous wavelet transform of the original time data. The median values of the lifelength distributions are found to be very close to the frequency-averaged result. The interval correlation technique is shown to allow the retrieval and inspection of the original time data of each event in the lifelength distributions, thus providing a means to locate and study the nature of the coherent structures in the turbulent boundary layer. The lifelength data are converted to lifetimes using the convection velocity. The lifetimes of events in the time/frequency domain are displayed in Lifetime Maps. The primary purpose of the paper is to validate these new analysis techniques so that they can be used with confidence to further characterize the behavior of coherent structures in the turbulent boundary layer.
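A minimal sketch of the lifelength estimate under stated assumptions: fit a Gaussian decay to (synthetic) cross-spectral coherence versus sensor separation and read off a characteristic length.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_decay(xi, L):
    return np.exp(-(xi / L) ** 2)   # assumed Gaussian model of coherence decay

sep = np.array([0.05, 0.10, 0.15])   # sensor separations (m), illustrative
coh = np.array([0.82, 0.45, 0.16])   # coherence at one frequency, synthetic
(L_hat,), _ = curve_fit(gaussian_decay, sep, coh, p0=[0.1])
print(f"characteristic lifelength ~ {L_hat:.3f} m")
```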

  15. Simplified Parallel Domain Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson III, David J

    2011-01-01

Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.

  16. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  17. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  18. Color image analysis technique for measuring of fat in meat: an application for the meat industry

    NASA Astrophysics Data System (ADS)

    Ballerini, Lucia; Hogberg, Anders; Lundstrom, Kerstin; Borgefors, Gunilla

    2001-04-01

Intramuscular fat content in meat influences some important meat quality characteristics. The aim of the present study was to develop and apply image processing techniques to quantify intramuscular fat content in beef together with the visual appearance of fat in meat (marbling). Color images of M. longissimus dorsi meat samples with variable intramuscular fat content and marbling were captured. Image analysis software was specially developed for the interpretation of these images. In particular, a segmentation algorithm (i.e., classification of the different substances: fat, muscle and connective tissue) was optimized in order to obtain a proper classification and perform subsequent analysis. Segmentation of muscle from fat was achieved based on their characteristics in the 3D color space and on the intrinsic fuzzy nature of these structures. The method is fully automatic and combines a fuzzy clustering algorithm, the Fuzzy c-Means Algorithm, with a Genetic Algorithm. The percentages of the various colors (i.e., substances) within the sample are then determined; the number, size distribution, and spatial distribution of the extracted fat flecks are measured. Measurements are correlated with chemical and sensory properties. Results so far show that advanced image analysis is useful for quantifying the visual appearance of meat.
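A compact fuzzy c-means sketch on toy color data follows; the paper couples FCM with a genetic algorithm, which is omitted here, and the RGB values are invented.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                                   # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
    return centers, U

rng = np.random.default_rng(1)
fat = rng.normal([220, 215, 205], 8, size=(100, 3))      # pale "fat" pixels
muscle = rng.normal([150, 60, 70], 8, size=(100, 3))     # red "muscle" pixels
centers, U = fcm(np.vstack([fat, muscle]))
print(np.round(centers))
```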

  19. Concentration Regimes of Biopolymers Xanthan, Tara, and Clairana, Comparing Dynamic Light Scattering and Distribution of Relaxation Time

    PubMed Central

    Oliveira, Patrícia D.; Michel, Ricardo C.; McBride, Alan J. A.; Moreira, Angelita S.; Lomba, Rosana F. T.; Vendruscolo, Claire T.

    2013-01-01

The aim of this work was to evaluate the use of analysis of the distribution of relaxation time (DRT) from a dynamic light back-scattering technique as an alternative method for determining the concentration regimes in aqueous solutions of biopolymers (xanthan, clairana and tara gums), through an analysis of the overlap (c*) and aggregation (c**) concentrations. The diffusion coefficients were obtained over a range of concentrations for each biopolymer using two methods. The first method analysed the behaviour of the diffusion coefficient as a function of the concentration of the gum solution, based on the diffusion coefficient versus concentration curve. Using the slope of the curves, it was possible to determine c* and c** for xanthan and tara gum. However, it was not possible to determine the concentration regimes for clairana using this method. The second method was based on an analysis of the DRTs, which showed different numbers of relaxation modes. It was observed that the concentrations at which the number of modes changed corresponded to c* and c**. Thus, the DRT technique provided an alternative method for the determination of the critical concentrations of biopolymers. PMID:23671627

  20. Application of Raman spectroscopy for direct analysis of Carlina acanthifolia subsp. utzka root essential oil.

    PubMed

    Strzemski, Maciej; Wójciak-Kosior, Magdalena; Sowa, Ireneusz; Agacka-Mołdoch, Monika; Drączkowski, Piotr; Matosiuk, Dariusz; Kurach, Łukasz; Kocjan, Ryszard; Dresler, Sławomir

    2017-11-01

Plants of the Carlina genus, e.g. Carlina acanthifolia subsp. utzka, are still used in the folk medicine of many European countries, and their biological activity is mostly associated with root essential oils. In the present paper, Raman spectroscopy (RS) was applied for the first time to evaluate the distribution of essential oil in the root of C. acanthifolia subsp. utzka and to identify the root structures containing the essential oil. Furthermore, the RS technique was applied to assess the chemical stability of the oil during drying of the plant material and during the distillation process. Gas chromatography-mass spectrometry was used for qualitative and quantitative analysis of the essential oil. The identity of the compounds was confirmed using Raman, ATR-IR and NMR spectroscopy. Carlina oxide was found to be the main component of the oil (98.96% ± 0.15). The spectroscopic study showed the high stability of the essential oil, and the Raman distribution analysis indicated that the oil reservoirs were localized mostly in the structures of the outer layer of the root, while the inner part showed nearly no signal assigned to the oil. The Raman spectroscopy technique enabled rapid, non-destructive direct analysis of plant material with minimal sample preparation and allowed straightforward, unambiguous identification of the essential oil in the sample. Copyright © 2017. Published by Elsevier B.V.

  1. Resistivity analysis of epitaxially grown, doped semiconductors using energy dependent secondary ion mass spectroscopy

    NASA Astrophysics Data System (ADS)

    Burnham, Shawn D.; Thomas, Edward W.; Doolittle, W. Alan

    2006-12-01

A characterization technique is discussed that allows quantitative optimization of doping in epitaxially grown semiconductors. This technique uses relative changes in the host atom secondary ion (HASI) energy distribution from secondary ion mass spectroscopy (SIMS) to indicate relative changes in the conductivity of the material. Since SIMS is a destructive process due to sputtering through a film, a depth profile of the energy distribution of sputtered HASIs in a matrix will contain information on the conductivity of the layers of the film as a function of depth. This process is demonstrated with Mg-doped GaN, with the Mg flux slowly increased through the film. Three distinct regions of conductivity were observed: one with a Mg concentration high enough to cause compensation and thus high resistivity, a second with a moderate Mg concentration and low resistivity, and a third with little to no Mg doping, causing high resistivity due to the lack of free carriers. During SIMS analysis of the first region, the energy distributions of sputtered Ga HASIs were fairly uniform and unchanging for a Mg flux above the saturation, or compensation, limit. For the second region, the Ga HASI energy distributions shifted and went through a region of inconsistent energy distributions for a Mg flux slightly below the critical flux for saturation, or compensation. Finally, for the third region, the Ga HASI energy distributions settled back into another fairly unchanging, uniform pattern. These three distinct regions were analyzed further through growth of Mg-doped step profiles and bulk growth of material at representative Mg fluxes. The materials grown in the two unchanging, uniform regions of the energy distributions yielded highly resistive material, due to too high a Mg concentration and little to no Mg concentration, respectively. However, material grown in the transient energy distribution region, with a Mg concentration between that of the two highly resistive regions, yielded low-resistivity (0.59 Ω cm), highly p-type (1.2×10^18 cm^-3 holes) Mg-doped GaN.

  2. Modeling Photo-Bleaching Kinetics to Create High Resolution Maps of Rod Rhodopsin in the Human Retina

    PubMed Central

    Ehler, Martin; Dobrosotskaya, Julia; Cunningham, Denise; Wong, Wai T.; Chew, Emily Y.; Czaja, Wojtek; Bonner, Robert F.

    2015-01-01

We introduce and describe a novel non-invasive in-vivo method for mapping the local rod rhodopsin distribution in the human retina over a 30-degree field. Our approach is based on analyzing the brightening of detected lipofuscin autofluorescence within small pixel clusters in registered imaging sequences taken with a commercial 488 nm confocal scanning laser ophthalmoscope (cSLO) over a 1 minute period. We modeled the kinetics of rhodopsin bleaching by applying variational optimization techniques from applied mathematics. The physical model and the numerical analysis with its implementation are outlined in detail. This new technique enables the creation of spatial maps of the retinal rhodopsin and retinal pigment epithelium (RPE) bisretinoid distribution with ≈50 μm resolution. PMID:26196397

  3. Near surface geophysics techniques and geomorphological approach to reconstruct the hazard cave map in historical and urban areas

    NASA Astrophysics Data System (ADS)

    Lazzari, M.; Loperte, A.; Perrone, A.

    2010-03-01

This work, carried out with an integrated methodological approach, focuses on the use of near surface geophysics techniques, such as ground penetrating radar and electrical resistivity tomography (ERT), together with geomorphological analysis, in order to reconstruct cave distribution and geometry in an urban context and, in particular, in historical centres. The interaction during recent centuries between human activity (cave excavation, the birth and growth of an urban area) and the character of the natural environment has been the reason for a progressive increase in the hazard and vulnerability levels of several sites. The reconstruction of a detailed cave distribution map is the first step in defining the anthropic and geomorphological hazard in urban areas, and a fundamental basis for planning and assessing risk.

  4. Future technology insight: mass spectrometry imaging as a tool in drug research and development

    PubMed Central

    Cobice, D F; Goodwin, R J A; Andren, P E; Nilsson, A; Mackay, C L; Andrew, R

    2015-01-01

In pharmaceutical research, understanding the biodistribution, accumulation and metabolism of drugs in tissue plays a key role during drug discovery and development. In particular, information regarding the pharmacokinetics, pharmacodynamics and transport properties of compounds in tissues is crucial during early screening. Historically, the abundance and distribution of drugs have been assessed by well-established techniques such as quantitative whole-body autoradiography (WBA) or tissue homogenization with LC/MS analysis. However, WBA does not distinguish active drug from its metabolites, and LC/MS, while highly sensitive, does not report spatial distribution. Mass spectrometry imaging (MSI) can discriminate a drug from its metabolites and endogenous compounds, while simultaneously reporting their distribution. MSI data are influencing drug development and are currently used in investigational studies in areas such as compound toxicity. In in-vivo studies, MSI results may soon be used to support new drug regulatory applications, although clinical trial MSI data will take longer to be validated for incorporation into submissions. We review the current and future applications of MSI, focussing on applications for drug discovery and development, with examples to highlight the impact of this promising technique in early drug screening. Recent sample preparation and analysis methods that enable effective MSI, including quantitative analysis of drugs from tissue sections, are summarized, and key aspects of methodological protocols to increase the effectiveness of MSI analysis for previously undetectable targets are addressed. These examples highlight how MSI has become a powerful tool in drug research and development and offers great potential in streamlining the drug discovery process. PMID:25766375

  5. AN ANALYSIS OF THE INFLUENCE OF ANNUAL THERMAL VARIABLES ON THE OCCURRENCE OF WARM WATER FISHES

    EPA Science Inventory

    A potential effect of climate change is modification of the geographic distribution of fish species. To predict this modification it is necessary to estimate temperature tolerances of fishes and then relate these tolerances to the changing environment. This technique will allow...

  6. Regional-specific Stochastic Simulation of Spatially-distributed Ground-motion Time Histories using Wavelet Packet Analysis

    NASA Astrophysics Data System (ADS)

    Huang, D.; Wang, G.

    2014-12-01

Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by the wavelet-packet parameters proposed by Yamamoto and Baker (2013). The wavelet-packet parameters fully characterize ground-motion time histories in terms of energy content, time-frequency-domain characteristics and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground-motion records from eight well-recorded earthquakes in California, Mexico, Japan and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and a cokriging technique, wavelet-packet parameters at unmeasured locations can be optimally estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrated that the simulated ground motions generally agree well with the actual recorded data if the influence of regional site conditions is considered. The developed method has great potential to be used in computational seismic analysis and loss estimation at a regional scale.
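A minimal wavelet-packet sketch using PyWavelets, assuming a synthetic accelerogram; it extracts per-node energies of the kind such a parameterization builds on (this is not the Yamamoto and Baker parameter set itself).

```python
import numpy as np
import pywt

fs = 100.0                                   # sampling rate (Hz), illustrative
t = np.arange(0.0, 20.0, 1.0 / fs)
acc = np.exp(-0.2 * t) * np.sin(2 * np.pi * 3.0 * t)   # toy ground motion

wp = pywt.WaveletPacket(data=acc, wavelet='db4', mode='symmetric', maxlevel=4)
nodes = wp.get_level(4, order='freq')        # terminal nodes, low to high band
energy = np.array([np.sum(n.data ** 2) for n in nodes])
print("relative energy per packet:", np.round(energy / energy.sum(), 3))
```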

  7. Validation Tests of Fiber Optic Strain-Based Operational Shape and Load Measurements

    NASA Technical Reports Server (NTRS)

    Bakalyar, John A.; Jutte, Christine

    2012-01-01

Aircraft design has been progressing toward reduced structural weight to improve fuel efficiency, increase performance, and reduce cost. Lightweight aircraft structures are more flexible than conventional designs and require new design considerations. Intelligent sensing allows for enhanced control and monitoring of aircraft, which enables increased structural efficiency. The NASA Dryden Flight Research Center (DFRC) has developed an instrumentation system and analysis techniques that combine to make distributed structural measurements practical for lightweight vehicles. Dryden's Fiber Optic Strain Sensing (FOSS) technology enables a multitude of lightweight, distributed surface strain measurements. The analysis techniques, referred to as the Displacement Transfer Functions (DTF) and Load Transfer Functions (LTF), use surface strain values to calculate structural deflections and operational loads. The combined system is useful for real-time monitoring of aeroelastic structures, along with many other applications. This paper describes how the capabilities of the measurement system were demonstrated using subscale test articles that represent simple aircraft structures. Empirical FOSS strain data were used within the DTF to calculate the displacement of the article and within the LTF to calculate bending moments due to loads acting on the article. The results of the tests, the accuracy of the measurements, and a sensitivity analysis are presented.
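A hedged sketch of the strain-to-deflection idea, using textbook beam kinematics (curvature = strain / c, integrated twice along the span) rather than the DTF formulation itself; the strain profile and section half-depth are invented.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)      # FOSS-like strain stations along span (m)
c = 0.01                            # distance from neutral axis to surface (m)
strain = 800e-6 * (1 - x)           # toy surface-strain profile (tip-load shape)

curvature = strain / c
# Cantilever fixed at x=0: integrate curvature to slope, slope to deflection
# using cumulative trapezoids.
slope = np.concatenate(([0.0],
        np.cumsum(0.5 * (curvature[1:] + curvature[:-1]) * np.diff(x))))
defl = np.concatenate(([0.0],
        np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
print(f"estimated tip deflection: {defl[-1] * 1000:.1f} mm")
```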

  8. Structured Analysis of the Logistic Support Analysis (LSA) Task, ’Integrated Logistic Support (ILS) Assessment Maintenance Planning E-1 Element’ (APJ 966-204)

    DTIC Science & Technology

    1988-10-01

Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Structured Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for... in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.

  9. Studies of transverse momentum dependent parton distributions and Bessel weighting

    DOE PAGES

    Aghasyan, M.; Avakian, H.; De Sanctis, E.; ...

    2015-03-01

In this paper we present a new technique for the analysis of transverse momentum dependent parton distribution functions, based on the Bessel weighting formalism. The procedure is applied to studies of the double longitudinal spin asymmetry in semi-inclusive deep inelastic scattering using a new dedicated Monte Carlo generator which includes quark intrinsic transverse momentum within the generalized parton model. Using a fully differential cross section for the process, the effect of four-momentum conservation is analyzed using various input models for transverse momentum distributions and fragmentation functions. We observe a few percent systematic offset of the Bessel-weighted asymmetry obtained from the Monte Carlo extraction compared to input model calculations, which is due to the limitations imposed by energy and momentum conservation at the given energy/Q^2. We find that the Bessel weighting technique provides a powerful and reliable tool to study the Fourier transform of TMDs with controlled systematics due to experimental acceptances and resolutions with different TMD model inputs.
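A schematic sketch of the weighting on synthetic events: each event is weighted by J0(b_T p_T) and a weighted asymmetry is formed. The event model and estimator details are illustrative assumptions, not the paper's generator.

```python
import numpy as np
from scipy.special import j0

rng = np.random.default_rng(2)
pT = rng.rayleigh(scale=0.4, size=50_000)      # transverse momenta (GeV), toy
spin = rng.choice([-1, 1], size=pT.size)       # helicity product per event
# Toy spin-dependent acceptance producing a small pT-dependent asymmetry.
keep = rng.random(pT.size) < 0.5 * (1 + 0.1 * spin * np.exp(-pT ** 2))
pT, spin = pT[keep], spin[keep]

b_T = 1.5                                      # chosen Fourier conjugate (GeV^-1)
w = j0(b_T * pT)                               # Bessel weight per event
A = np.sum(spin * w) / np.sum(w)               # weighted asymmetry estimator
print(f"Bessel-weighted asymmetry at b_T={b_T}: {A:.4f}")
```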

  10. Spacecraft thermal balance testing using infrared sources

    NASA Technical Reports Server (NTRS)

    Tan, G. B. T.; Walker, J. B.

    1982-01-01

A thermal balance test (with controlled flux intensity) on a simple black dummy spacecraft using IR lamps was performed and evaluated, the evaluation being aimed specifically at thermal mathematical model (TMM) verification. For reference purposes, the model was also subjected to a solar simulation test (SST). The results show that the temperature distributions measured during IR testing for two different model attitudes under steady-state conditions are reproducible with a TMM. The TMM test-data correlation is not as accurate for the IR test as for the SST. Based on the standard deviation of the temperature-difference distribution (analysis minus test), the SST data correlation is better by a factor of 1.8 to 2.5; the lower figure applies to the measured and the higher to the computer-generated IR flux intensity distribution. Techniques of lamp power control are presented. A continuing work program is described which is aimed at quantifying the differences between solar simulation and infrared techniques for a model representing the thermal radiating surfaces of a large communications spacecraft.

  11. Studies of transverse momentum dependent parton distributions and Bessel weighting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghasyan, M.; Avakian, H.; De Sanctis, E.

In this paper we present a new technique for the analysis of transverse momentum dependent parton distribution functions, based on the Bessel weighting formalism. The procedure is applied to studies of the double longitudinal spin asymmetry in semi-inclusive deep inelastic scattering using a new dedicated Monte Carlo generator which includes quark intrinsic transverse momentum within the generalized parton model. Using a fully differential cross section for the process, the effect of four-momentum conservation is analyzed using various input models for transverse momentum distributions and fragmentation functions. We observe a few percent systematic offset of the Bessel-weighted asymmetry obtained from the Monte Carlo extraction compared to input model calculations, which is due to the limitations imposed by energy and momentum conservation at the given energy/Q^2. We find that the Bessel weighting technique provides a powerful and reliable tool to study the Fourier transform of TMDs with controlled systematics due to experimental acceptances and resolutions with different TMD model inputs.

  12. A New Femtosecond Laser-Based Three-Dimensional Tomography Technique

    NASA Astrophysics Data System (ADS)

    Echlin, McLean P.

    2011-12-01

Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and for data collection. Femtosecond laser-based tomographic techniques have been developed both in standard atmosphere (the femtosecond laser-based serial sectioning technique - FSLSS) and in vacuum (the Tri-Beam System) for the fast collection (10^5 μm^3/s) of mm^3-sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove layer-by-layer areas of material with low collateral damage and a negligible heat-affected zone. To the author's knowledge, femtosecond lasers had never before been used for serial sectioning, and these techniques were developed entirely by the author and his collaborators at the University of Michigan and the University of California Santa Barbara. The FSLSS was applied to measure the 3D distribution of TiN particles in a 4330 steel. Single-pulse ablation morphologies and rates were measured and collected from the literature. Simultaneous two-phase ablation of TiN and the steel matrix was shown to occur at fluences of 0.9-2 J/cm^2. Laser scanning protocols were developed that minimize surface roughness to 0.1-0.4 μm for laser-based sectioning. The FSLSS technique was used to section and 3D-reconstruct TiN-containing 4330 steel. Statistical analysis of 3D TiN particle sizes, distribution parameters, and particle density was performed. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed, and the variability in the toughness data agreed well with the bulk toughness measurements of Ruggieri et al. The Tri-Beam system combines the benefits of laser-based material removal (speed, low damage, automation) with detectors that collect chemical, structural, and topological information. Multi-modal sectioning information was collected after many laser scanning passes, demonstrating the capability of the Tri-Beam system.

  13. Strain-energy release rate analysis of a laminate with a postbuckled delamination

    NASA Technical Reports Server (NTRS)

    Whitcomb, John D.; Shivakumar, K. N.

    1987-01-01

The objectives are to present the derivation of the new virtual crack closure technique, evaluate the accuracy of the technique, and finally present the results of a limited parametric study of laminates with a postbuckled delamination. Although the new virtual crack closure technique is general, only homogeneous, isotropic laminates were analyzed. This was to eliminate the variation of flexural stiffness with orientation, which occurs even for quasi-isotropic laminates, and made it easier to identify the effect of geometrical parameters on G. The new virtual crack closure technique is derived. Then the specimen configurations are described. Next, the stress analysis is discussed. Finally, the virtual crack closure technique is evaluated and then used to calculate the distribution of the strain-energy release rate G along the delamination front of several laminates with a postbuckled delamination.

  14. Diagnosis of Misalignment in Overhung Rotor using the K-S Statistic and A2 Test

    NASA Astrophysics Data System (ADS)

    Garikapati, Diwakar; Pacharu, RaviKumar; Munukurthi, Rama Satya Satyanarayana

    2018-02-01

Vibration measurement at the bearings of rotating machinery has become a useful technique for diagnosing incipient fault conditions. In particular, vibration measurement can be used to detect unbalance in a rotor, bearing failure, gear problems or misalignment between a motor shaft and a coupled shaft. This is a particular problem encountered in turbines, ID fans and FD fans used for power generation. For successful fault diagnosis, it is important to adopt motor current signature analysis (MCSA) techniques capable of identifying the faults. It is also useful to develop techniques for inferring information such as the severity of a fault. It is proposed that modeling the cumulative distribution function of motor current signals with respect to appropriate theoretical distributions, and quantifying the goodness of fit with the Kolmogorov-Smirnov (K-S) statistic and the A2 test, offers a suitable signal feature for diagnosis. This paper demonstrates the successful comparison of the K-S feature and the A2 test for discriminating the misalignment fault from normal operation.
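A brief sketch of the proposed feature on simulated signals, using SciPy's Kolmogorov-Smirnov and Anderson-Darling tests against a standardized normal reference; the signal model is invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
t = np.arange(0.0, 1.0, 1.0 / 5000.0)
healthy = np.sin(2 * np.pi * 50 * t) + 0.05 * rng.normal(size=t.size)
faulty = healthy + 0.4 * np.sin(2 * np.pi * 100 * t)   # toy misalignment harmonic

for name, sig in [("healthy", healthy), ("faulty", faulty)]:
    z = (sig - sig.mean()) / sig.std()              # standardize before testing
    ks = stats.kstest(z, 'norm').statistic          # K-S distance to N(0,1)
    a2 = stats.anderson(z, dist='norm').statistic   # Anderson-Darling A^2
    print(f"{name}: KS={ks:.3f}  A2={a2:.1f}")
```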

  15. Measurement of impinging butane flame using combined optical system with digital speckle tomography

    NASA Astrophysics Data System (ADS)

    Ko, Han Seo; Ahn, Seong Soo; Kim, Hyun Jung

    2011-11-01

Three-dimensional density distributions of an impinging and eccentric flame were measured experimentally using a combined optical system with digital speckle tomography. In addition, a three-dimensional temperature distribution of the flame was reconstructed from the ideal gas equation based on the reconstructed density data. The flame was formed by the ignition of premixed butane/air from air holes and impinged upward against a plate located 24 mm from the burner nozzle. In order to verify the reconstruction process for the experimental measurements, numerically synthesized phantoms of impinging and eccentric flames were derived and reconstructed using a newly developed three-dimensional multiplicative algebraic reconstruction technique (MART). A new scanning technique was developed for the accurate analysis of the speckle displacements necessary for investigating the wall jet regions of the impinging flame, in which sharp variations of the flow direction and pressure gradient occur. The temperatures reconstructed by digital speckle tomography were applied as the boundary condition for numerical analysis of the flame-impinged plate. The numerically calculated temperature distribution of the upper side of the plate was then compared to temperature data taken by an infrared camera. The absolute average uncertainty between the numerical and infrared camera data was 3.7%.
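A minimal MART sketch on a random toy system matrix (not a speckle-deflection model), showing the multiplicative row-by-row update such reconstructions rely on.

```python
import numpy as np

rng = np.random.default_rng(4)
n_rays, n_cells = 60, 25
W = rng.random((n_rays, n_cells))       # ray-cell weights, toy geometry
f_true = rng.random(n_cells) + 0.5      # unknown field (e.g. density)
p = W @ f_true                          # simulated projections

f = np.ones(n_cells)                    # positive initial guess
lam = 0.2                               # relaxation factor
for _ in range(200):
    for i in range(n_rays):
        ratio = p[i] / (W[i] @ f)       # measured over predicted ray sum
        f *= ratio ** (lam * W[i] / W[i].max())   # multiplicative update
print("max relative error:", np.max(np.abs(f - f_true) / f_true))
```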

  16. Application of flow field-flow fractionation for the characterization of macromolecules of biological interest: a review

    PubMed Central

    Qureshi, Rashid Nazir

    2010-01-01

    An overview is given of the recent literature on (bio) analytical applications of flow field-flow fractionation (FlFFF). FlFFF is a liquid-phase separation technique that can separate macromolecules and particles according to size. The technique is increasingly used on a routine basis in a variety of application fields. In food analysis, FlFFF is applied to determine the molecular size distribution of starches and modified celluloses, or to study protein aggregation during food processing. In industrial analysis, it is applied for the characterization of polysaccharides that are used as thickeners and dispersing agents. In pharmaceutical and biomedical laboratories, FlFFF is used to monitor the refolding of recombinant proteins, to detect aggregates of antibodies, or to determine the size distribution of drug carrier particles. In environmental studies, FlFFF is used to characterize natural colloids in water streams, and especially to study trace metal distributions over colloidal particles. In this review, first a short discussion of the state of the art in instrumentation is given. Developments in the coupling of FlFFF to various detection modes are then highlighted. Finally, application studies are discussed and ordered according to the type of (bio) macromolecules or bioparticles that are fractionated. PMID:20957473

  17. Techniques for spatio-temporal analysis of vegetation fires in the tropical belt of Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brivio, P.A.; Ober, G.; Koffi, B.

    1995-12-31

Biomass burning of forests and savannas is a phenomenon of continental or even global proportions, capable of causing large-scale environmental changes. Satellite observations, in particular from NOAA-AVHRR GAC data, are the only source of information allowing one to document burning patterns at regional and continental scale and over long periods of time. This paper presents some techniques, such as clustering and rose diagrams, useful in the spatio-temporal analysis of satellite-derived fire maps for characterizing the evolution of spatial patterns of vegetation fires at regional scale. An automatic clustering approach is presented which enables one to describe and parameterize the spatial distribution of fire patterns at different scales. The problem of the geographical distribution of vegetation fires with respect to some location of interest, point or line, is also considered. In particular, rose diagrams are used to relate fire patterns to a reference point, such as experimental sites of tropospheric chemistry measurements. Temporal data sets in the tropical belt of Africa, covering both Northern and Southern Hemisphere dry seasons, were analyzed using these techniques and showed very promising results when compared with data from rain chemistry studies at different sampling sites in the equatorial forest.

  18. The application of the Wigner Distribution to wave type identification in finite length beams

    NASA Technical Reports Server (NTRS)

    Wahl, T. J.; Bolton, J. Stuart

    1994-01-01

The object of the research described in this paper was to develop a means of identifying the wave-types propagating between two points in a finite length beam. It is known that different structural wave-types possess different dispersion relations: i.e., their group speeds and the frequency dependence of their group speeds differ. As a result of those distinct dispersion relationships, different wave-types may be associated with characteristic features when structural responses are examined in the time-frequency domain. Previously, the time-frequency character of analytically generated structural responses of both single-element and multi-element structures was examined by using the Wigner Distribution (WD) along with filtering techniques designed to detect the wave-types present in the responses. In the work described here, the measured time-frequency response of a finite length beam is examined using the WD and filtering procedures. This paper is organized as follows. First, the concept of time-frequency analysis of structural responses is explained. The WD is then introduced, along with a description of the implementation of a discrete version. The time-frequency filtering techniques are then presented and explained. The results of applying the WD and the filtering techniques to the analysis of a transient response are then presented.
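A compact discrete pseudo-Wigner Distribution sketch follows, applied to a synthetic analytic chirp; the simple lag truncation here is a simplification relative to the paper's discrete WD implementation.

```python
import numpy as np

def wigner(x):
    """Pseudo-WD: FFT over lag of the local autocorrelation at each time."""
    x = np.asarray(x, dtype=complex)
    N = x.size
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)               # largest symmetric lag at time n
        k = np.arange(-kmax, kmax + 1)
        acf = np.zeros(N, dtype=complex)
        acf[k % N] = x[n + k] * np.conj(x[n - k])
        W[:, n] = np.real(np.fft.fft(acf))     # frequency slice at time n
    return W

t = np.arange(256) / 256.0
chirp = np.exp(1j * 2 * np.pi * (20 * t + 40 * t ** 2))   # analytic test signal
print(wigner(chirp).shape)   # energy concentrates along 20 + 80 t (cycles/unit)
```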

  19. Inter-cohort growth for three tropical resources: tilapia, octopus and lobster.

    PubMed

    Velázquez-Abunader, Iván; Gómez-Muñoz, Victor Manuel; Salas, Silvia; Ruiz-Velazco, Javier M J

    2015-09-01

Growth parameters are an important component of stock assessment for exploited aquatic species. However, it is often difficult to apply direct methods to estimate growth and to analyse the differences between males and females, particularly in tropical areas. The objective of this study was to analyse the inter-cohort growth of three tropical resources and discuss the possible fisheries management implications. A simple method was used to compare individual growth curves obtained from length-frequency distribution analysis, illustrated by case studies of three tropical species from different aquatic environments: tilapia (Oreochromis aureus), red octopus (Octopus maya) and the Caribbean spiny lobster (Panulirus argus). The analysis compared the size distributions of males and females of a given cohort through modal progression analysis. The technique used proved useful for highlighting the differences in growth between females and males of a specific cohort. The potential effects of extrinsic and intrinsic factors on the organisms' development, as reflected in the size distributions of the cohorts, are discussed.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, S.D.; Smith, S.; Swank, P.R.

Visual cell profiles were used to analyze the distribution of atypical bronchial cells in sputum specimens from cigarette-smoking volunteers, cigarette-smoking asbestos workers and cigarette-smoking uranium miners. The preliminary results of these sputum visual cell profile studies have demonstrated distinctive distributions of bronchial cell atypias in progressive patterns of squamous metaplasia; mild, moderate and severe atypias; and carcinoma, similar to those the authors have previously reported using cell image analysis techniques to determine an atypia status index (ASI). The information gained from this study will be helpful in further validating this ASI and subsequently achieving the ultimate goal of employing cell image analysis for the rapid and precise identification of premalignant atypias in sputum.

  1. Fusion of UAV photogrammetry and digital optical granulometry for detection of structural changes in floodplains

    NASA Astrophysics Data System (ADS)

    Langhammer, Jakub; Lendzioch, Theodora; Mirijovsky, Jakub

    2016-04-01

Granulometric analysis is a traditional and important method for the description of sedimentary material, with various applications in sedimentology, hydrology and geomorphology. However, conventional granulometric field survey methods are time consuming, laborious, costly and invasive to the surface being sampled, which can be a limiting factor for their applicability in protected areas. Optical granulometry has recently emerged as an image analysis technique enabling non-invasive survey; it employs semi-automated identification of clasts from calibrated digital imagery, taken on site with a conventional high-resolution digital camera and a calibrated frame. The image processing allows detection and measurement of mixed-size natural grains, their sorting, and quantitative analysis using standard granulometric approaches. Despite known limitations, the technique today presents a reliable tool that significantly eases and speeds the field survey in fluvial geomorphology. However, such a survey still has limitations in the spatial coverage of the sites and in its applicability to research at a multitemporal scale. In our study, we present a novel approach based on the fusion of two image analysis techniques, optical granulometry and UAV-based photogrammetry, bridging the gap between the need for high-resolution structural information for granulometric analysis and spatially accurate data coverage. We have developed and tested a workflow that, using a UAV imaging platform, delivers seamless, high-resolution and spatially accurate imagery of the study site from which the granulometric properties of the sedimentary material can be derived. We set up a workflow modeling chain providing (i) the optimum flight parameters for UAV imagery to balance the two key divergent requirements, imagery resolution and seamless spatial coverage, (ii) the workflow for processing the UAV-acquired imagery by means of optical granulometry and (iii) the workflow for analysis of the spatial distribution and temporal changes of granulometric properties across the point bar. The proposed technique was tested in a case study of an active point bar of a mid-latitude mountain stream in the Sumava mountains, Czech Republic, exposed to repeated flooding. UAV photogrammetry was used to acquire very high resolution imagery to build high-precision digital terrain models and an orthoimage. The orthoimage was then analyzed using the digital optical granulometric tool BaseGrain. This approach allowed us (i) to analyze the spatial distribution of grain size in seamless transects over an active point bar and (ii) to assess the multitemporal changes of the granulometric properties of the point bar material resulting from flooding. The tested framework proved the applicability of the proposed method for granulometric analysis with accuracy comparable to field optical granulometry. The seamless nature of the data enables study of the spatial distribution of granulometric properties across the study sites, as well as analysis of multitemporal changes resulting from repeated imaging.

  2. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oesterling, Patrick; Heine, Christian; Weber, Gunther H.

    2012-05-04

Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second, local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  3. Nonlinear Earthquake Analysis of Reinforced Concrete Frames with Fiber and Bernoulli-Euler Beam-Column Element

    PubMed Central

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained by using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predictor-corrector form of the Bossak-α method is applied as the dynamic integration scheme. Experimental data for an RC column element are compared with the numerical results obtained from the proposed solution technique to verify the numerical solutions. Furthermore, nonlinear cyclic analysis results for a portal reinforced concrete frame are obtained to compare the proposed solution technique with a fiber element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. Damage regions, propagation, and intensities according to both approaches are examined. PMID:24578667

  4. Wavelet analysis methods for radiography of multidimensional growth of planar mixing layers

    DOE PAGES

    Merritt, Elizabeth Catherine; Doss, Forrest William

    2016-07-06

    The counter-propagating shear campaign is examining instability growth and its transition to turbulence in the high-energy-density physics regime using a laser-driven counter-propagating flow platform. In these experiments, we observe consistent complex break-up of and structure growth in a tracer layer placed at the shear flow interface during the instability growth phase. We present a wavelet-transform based analysis technique capable of characterizing the scale- and directionality-resolved average intensity perturbations in static radiographs of the experiment. This technique uses the complete spatial information available in each radiograph to describe the structure evolution. We designed this analysis technique to generate a two-dimensional power spectrum for each radiograph from which we can recover information about structure widths, amplitudes, and orientations. Lastly, the evolution of the distribution of power in the spectra for an experimental series is a potential metric for quantifying the structure size evolution as well as a system’s evolution towards isotropy.

  5. Atomic characterization of Si nanoclusters embedded in SiO2 by atom probe tomography

    PubMed Central

    2011-01-01

    Silicon nanoclusters are of prime interest for a new generation of optoelectronic and microelectronic components. The physical properties (light emission, carrier storage, etc.) of systems using such nanoclusters are strongly dependent on nanostructural characteristics. These characteristics (size, composition, distribution, and interface nature) have until now been obtained using conventional high-resolution analytical methods, such as high-resolution transmission electron microscopy, EFTEM, or EELS. In this article, a complementary technique, atom probe tomography, was used to study a multilayer (ML) system containing silicon clusters. This technique and its analysis give information on the structure at the atomic level and provide complementary information with respect to other techniques. The different steps of such an analysis -- sample preparation, atom probe analysis, and data treatment -- are detailed. An atomic-scale description of the Si nanoclusters/SiO2 ML is given. The system is composed of 3.8-nm-thick SiO layers and 4-nm-thick SiO2 layers annealed for 1 h at 900°C. PMID:21711666

  6. Wavelet analysis methods for radiography of multidimensional growth of planar mixing layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merritt, E. C., E-mail: emerritt@lanl.gov; Doss, F. W.

    2016-07-15

    The counter-propagating shear campaign is examining instability growth and its transition to turbulence in the high-energy-density physics regime using a laser-driven counter-propagating flow platform. In these experiments, we observe consistent complex break-up of and structure growth in a tracer layer placed at the shear flow interface during the instability growth phase. We present a wavelet-transform based analysis technique capable of characterizing the scale- and directionality-resolved average intensity perturbations in static radiographs of the experiment. This technique uses the complete spatial information available in each radiograph to describe the structure evolution. We designed this analysis technique to generate a two-dimensional power spectrum for each radiograph from which we can recover information about structure widths, amplitudes, and orientations. The evolution of the distribution of power in the spectra for an experimental series is a potential metric for quantifying the structure size evolution as well as a system’s evolution towards isotropy.

  7. Discrete geometric analysis of message passing algorithm on graphs

    NASA Astrophysics Data System (ADS)

    Watanabe, Yusuke

    2010-04-01

    We often encounter probability distributions given as unnormalized products of non-negative functions. The factorization structures are represented by hypergraphs called factor graphs. Such distributions appear in various fields, including statistics, artificial intelligence, statistical physics, error correcting codes, etc. Given such a distribution, computation of the marginal distributions and of the normalization constant is often required, but is in general intractable because of its computational cost. One successful approximation method is the Loopy Belief Propagation (LBP) algorithm. The focus of this thesis is an analysis of the LBP algorithm. If the factor graph is a tree, i.e., has no cycles, the algorithm gives the exact quantities. If the factor graph has cycles, however, the LBP algorithm does not give exact results and can exhibit oscillatory and non-convergent behavior. The thematic question of this thesis is: how is the behavior of the LBP algorithm affected by the discrete geometry of the factor graph? The primary contribution of this thesis is the discovery of a formula that establishes the relation between the LBP, the Bethe free energy and the graph zeta function. This formula provides new techniques for the analysis of the LBP algorithm, connecting properties of the graph with properties of the LBP and the Bethe free energy. We demonstrate applications of these techniques to several problems, including the (non)convexity of the Bethe free energy and the uniqueness and stability of the LBP fixed point. We also discuss the loop series initiated by Chertkov and Chernyak. The loop series is a subgraph expansion of the normalization constant, or partition function, and reflects the graph geometry. We investigate the theoretical nature of this series. Moreover, we show a partial connection between the loop series and the graph zeta function.
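
    As an illustration of the algorithm analyzed in this thesis, the following minimal sum-product sketch runs LBP on a three-node cycle with made-up pairwise potentials; because the graph has a loop, the resulting beliefs are only approximations of the true marginals, which is exactly the regime the thesis studies.

    ```python
    import numpy as np

    # Pairwise model: p(x) ∝ prod_i phi[i](x_i) * prod_{(i,j)} psi[(i,j)](x_i, x_j).
    # Three binary variables on a cycle; all potential values are invented.
    states = 2
    phi = {i: np.ones(states) for i in range(3)}          # uniform unary factors
    psi = {(0, 1): np.array([[1.0, 0.5], [0.5, 1.0]]),
           (1, 2): np.array([[1.0, 0.3], [0.3, 1.0]]),
           (0, 2): np.array([[1.0, 0.7], [0.7, 1.0]])}
    neighbors = {i: [j for e in psi for j in e if i in e and j != i] for i in range(3)}

    def pot(i, j):
        """Pairwise potential oriented as [x_i, x_j]."""
        return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

    # Messages m[(i, j)](x_j), initialised uniform.
    m = {(i, j): np.full(states, 1.0 / states) for i in range(3) for j in neighbors[i]}

    for _ in range(50):                       # parallel ("flooding") updates
        new_m = {}
        for (i, j) in m:
            incoming = np.prod([m[(k, i)] for k in neighbors[i] if k != j], axis=0)
            msg = pot(i, j).T @ (phi[i] * incoming)
            new_m[(i, j)] = msg / msg.sum()   # normalise for numerical stability
        m = new_m

    for i in range(3):                        # approximate marginals (beliefs)
        b = phi[i] * np.prod([m[(k, i)] for k in neighbors[i]], axis=0)
        print(f"belief at node {i}:", b / b.sum())
    ```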

  8. Control of the Low-energy X-rays by Using MCNP5 and Numerical Analysis for a New Concept Intra-oral X-ray Imaging System

    NASA Astrophysics Data System (ADS)

    Huh, Jangyong; Ji, Yunseo; Lee, Rena

    2018-05-01

    An X-ray control algorithm to modulate the X-ray intensity distribution over the FOV (field of view) has been developed by using numerical analysis and MCNP5, a particle transport simulation code based on the Monte Carlo method. X-rays, which are widely used in medical diagnostic imaging, should be controlled in order to maximize the performance of the X-ray imaging system. However, unlike a liquid or a gas conveyed through a physical form such as a pipe, X-rays cannot be transported along an arbitrary path. In the present study, an X-ray control algorithm and technique to make the X-ray intensity projected on the image sensor uniform were developed using a flattening filter and a collimator, in order to alleviate the anisotropy of the X-ray distribution due to intrinsic features of the X-ray generator. The proposed method, which combines MCNP5 modeling and numerical analysis, aimed to optimize a flattening filter and a collimator for a uniform distribution of X-rays; their size and shape were estimated from the method. The simulation and the experimental results both showed that the method yielded an intensity distribution over an X-ray field of 6×4 cm2 at an SID (source to image-receptor distance) of 5 cm with a uniformity of more than 90% when the flattening filter and the collimator were mounted on the system. The proposed algorithm and technique are not confined to flattening filter development but can also be applied to other X-ray related research and development efforts.

  9. Verification of mesoscale objective analyses of VAS and rawinsonde data using the March 1982 AVE/VAS special network data. [Atmospheric Variability Experiment/Visible-Infrared Spin-Scan Radiometer Atmospheric Sounder]

    NASA Technical Reports Server (NTRS)

    Doyle, James D.; Warner, Thomas T.

    1988-01-01

    Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis were used in successive-correction and variational objective-analysis procedures. Analyses were produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques is discussed.

  10. Electrostatic analyzer measurements of ionospheric thermal ion populations

    DOE PAGES

    Fernandes, P. A.; Lynch, K. A.

    2016-07-09

    Here, we define the observational parameter regime necessary for observing low-altitude ionospheric origins of high-latitude ion upflow/outflow. We present measurement challenges and identify a new analysis technique which mitigates these impediments. To probe the initiation of auroral ion upflow, it is necessary to examine the thermal ion population at 200-350 km, where typical thermal energies are tenths of eV. Interpretation of the thermal ion distribution function measurement requires removal of payload sheath and ram effects. We use a 3-D Maxwellian model to quantify how observed ionospheric parameters such as density, temperature, and flows affect in situ measurements of the thermal ion distribution function. We define the viable acceptance window of a typical top-hat electrostatic analyzer in this regime and show that the instrument's energy resolution prohibits it from directly observing the shape of the particle spectra. To extract detailed information about measured particle population, we define two intermediate parameters from the measured distribution function, then use a Maxwellian model to replicate possible measured parameters for comparison to the data. Liouville's theorem and the thin-sheath approximation allow us to couple the measured and modeled intermediate parameters such that measurements inside the sheath provide information about plasma outside the sheath. We apply this technique to sounding rocket data to show that careful windowing of the data and Maxwellian models allows for extraction of the best choice of geophysical parameters. More widespread use of this analysis technique will help our community expand its observational database of the seed regions of ionospheric outflows.

  11. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will encourage the development of additional model drivers by third-party scientists.

  12. Methods to estimate distribution and range extent of grizzly bears in the Greater Yellowstone Ecosystem

    USGS Publications Warehouse

    Haroldson, Mark A.; Schwartz, Charles C.; Thompson, Daniel J.; Bjornlie, Daniel D.; Gunther, Kerry A.; Cain, Steven L.; Tyers, Daniel B.; Frey, Kevin L.; Aber, Bryan C.

    2014-01-01

    The distribution of the Greater Yellowstone Ecosystem grizzly bear (Ursus arctos) population has expanded into areas unoccupied since the early 20th century. Up-to-date information on the area and extent of this distribution is crucial for federal, state, and tribal wildlife and land managers to make informed decisions regarding grizzly bear management. The most recent estimate of grizzly bear distribution (2004) utilized fixed-kernel density estimators to describe distribution. This method was complex and computationally time consuming and excluded observations of unmarked bears. Our objective was to develop a technique to estimate grizzly bear distribution that would allow for the use of all verified grizzly bear location data, as well as provide the simplicity to be updated more frequently. We placed all verified grizzly bear locations from all sources from 1990 to 2004 and 1990 to 2010 onto a 3-km × 3-km grid and used zonal analysis and ordinary kriging to develop a predicted surface of grizzly bear distribution. We compared the area and extent of the 2004 kriging surface with the previous 2004 effort and evaluated changes in grizzly bear distribution from 2004 to 2010. The 2004 kriging surface was 2.4% smaller than the previous fixed-kernel estimate, but more closely represented the data. Grizzly bear distribution increased 38.3% from 2004 to 2010, with most expansion in the northern and southern regions of the range. This technique can be used to provide a current estimate of grizzly bear distribution for management and conservation applications.
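
    For illustration, ordinary kriging of gridded occurrence data can be set up directly from a variogram model. The sketch below is a minimal implementation assuming an exponential variogram with hypothetical parameters, not the authors' exact configuration or software.

    ```python
    import numpy as np

    def ordinary_kriging(xy, z, grid_xy, sill=1.0, rng=30.0, nugget=0.0):
        """Ordinary kriging with an assumed exponential variogram model.

        xy: (n, 2) observation coordinates; z: (n,) observed values;
        grid_xy: (m, 2) prediction locations. Variogram parameters are
        placeholders and would normally be fitted to the data."""
        def gamma(h):
            return nugget + sill * (1.0 - np.exp(-h / rng))

        n = len(xy)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        A = np.ones((n + 1, n + 1))          # kriging system with Lagrange row
        A[:n, :n] = gamma(d)
        A[n, n] = 0.0
        preds = []
        for g in grid_xy:
            h0 = np.linalg.norm(xy - g, axis=1)
            b = np.append(gamma(h0), 1.0)
            w = np.linalg.solve(A, b)        # weights plus Lagrange multiplier
            preds.append(w[:n] @ z)
        return np.array(preds)
    ```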

  13. Optical skin friction measurement technique in hypersonic wind tunnel

    NASA Astrophysics Data System (ADS)

    Chen, Xing; Yao, Dapeng; Wen, Shuai; Pan, Junjie

    2016-10-01

    Shear-sensitive liquid-crystal coatings (SSLCCs) have an optical response that is sensitive to the applied shear stress. Based on this, a novel technique was developed to measure the shear stress on a model surface, in both magnitude and direction, in hypersonic flow. The optical skin friction measurement system was built at the China Academy of Aerospace Aerodynamics (CAAA). A series of experiments on a hypersonic vehicle was performed in a CAAA wind tunnel. The global skin friction distribution of the model, which shows complicated flow structures, is discussed, and a brief mechanism analysis and an evaluation of the optical measurement technique are given.

  14. Integrated analysis of particle interactions at hadron colliders Report of research activities in 2010-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nadolsky, Pavel M.

    2015-08-31

    The report summarizes research activities of the project "Integrated analysis of particle interactions" at Southern Methodist University, funded by 2010 DOE Early Career Research Award DE-SC0003870. The goal of the project is to provide state-of-the-art predictions in quantum chromodynamics in order to achieve objectives of the LHC program for studies of electroweak symmetry breaking and new physics searches. We published 19 journal papers focusing on in-depth studies of proton structure and integration of advanced calculations from different areas of particle phenomenology: multi-loop calculations, accurate long-distance hadronic functions, and precise numerical programs. Methods for factorization of QCD cross sections were advanced in order to develop new generations of CTEQ parton distribution functions (PDFs), CT10 and CT14. These distributions provide the core theoretical input for multi-loop perturbative calculations by LHC experimental collaborations. A novel "PDF meta-analysis" technique was invented to streamline applications of PDFs in numerous LHC simulations and to combine PDFs from various groups using multivariate stochastic sampling of PDF parameters. The meta-analysis will help to bring the LHC perturbative calculations to the new level of accuracy, while reducing computational efforts. The work on parton distributions was complemented by development of advanced perturbative techniques to predict observables dependent on several momentum scales, including production of massive quarks and transverse momentum resummation at the next-to-next-to-leading order in QCD.

  15. Biometric analysis of the palm vein distribution by means of two different techniques of feature extraction

    NASA Astrophysics Data System (ADS)

    Castro-Ortega, R.; Toxqui-Quitl, C.; Solís-Villarreal, J.; Padilla-Vivanco, A.; Castro-Ramos, J.

    2014-09-01

    Vein patterns can be used for access, identification, and authentication purposes, and are more reliable than classical identification methods. Furthermore, these patterns can be used for venipuncture in health fields, to locate the veins of patients when they cannot be seen with the naked eye. In this paper, an image acquisition system is implemented in order to acquire digital images of people's hands in the near infrared. The image acquisition system consists of a CCD camera and a light source with peak emission at 880 nm. This radiation can penetrate the skin and is strongly absorbed by the deoxyhemoglobin present in the blood of the veins. Our method of analysis is composed of several steps, the first of which is the enhancement of the acquired images, implemented with spatial filters. After that, adaptive thresholding and mathematical morphology operations are used in order to obtain the distribution of the vein patterns. This process is focused on recognizing people through images of their palm-dorsal vein distributions obtained under near infrared light. This work compares two different feature extraction techniques: moments and veincode. The classification task is achieved using Artificial Neural Networks. Two databases are used to analyze the performance of the algorithms: the first is owned by the Hong Kong Polytechnic University, and the second is our own database.
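
    The enhancement-thresholding-morphology pipeline described here can be sketched with OpenCV. The file name, the CLAHE enhancement choice, and all parameter values below are assumptions for illustration; the paper specifies only spatial filtering, adaptive thresholding, and morphological operations.

    ```python
    import cv2  # requires the opencv-python package
    import numpy as np

    img = cv2.imread("palm_nir.png", cv2.IMREAD_GRAYSCALE)  # hypothetical NIR image

    # Contrast enhancement (CLAHE is one plausible spatial-domain choice)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(img)

    # Adaptive thresholding: veins absorb 880 nm light and appear dark
    binary = cv2.adaptiveThreshold(enhanced, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, blockSize=35, C=5)

    # Morphological opening/closing to remove speckle and bridge small gaps
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    pattern = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    pattern = cv2.morphologyEx(pattern, cv2.MORPH_CLOSE, kernel)

    cv2.imwrite("vein_pattern.png", pattern)
    ```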

  16. Joining X-Ray to Lensing: An Accurate Combined Analysis of MACS J0416.1-2403

    NASA Astrophysics Data System (ADS)

    Bonamigo, M.; Grillo, C.; Ettori, S.; Caminha, G. B.; Rosati, P.; Mercurio, A.; Annunziatella, M.; Balestra, I.; Lombardi, M.

    2017-06-01

    We present a novel approach for a combined analysis of X-ray and gravitational lensing data and apply this technique to the merging galaxy cluster MACS J0416.1-2403. The method exploits the information on the intracluster gas distribution that comes from a fit of the X-ray surface brightness and then includes the hot gas as a fixed mass component in the strong-lensing analysis. With our new technique, we can separate the collisional from the collision-less diffuse mass components, thus obtaining a more accurate reconstruction of the dark matter distribution in the core of a cluster. We introduce an analytical description of the X-ray emission coming from a set of dual pseudo-isothermal elliptical mass distributions, which can be directly used in most lensing software packages. By combining Chandra observations with Hubble Frontier Fields imaging and Multi Unit Spectroscopic Explorer spectroscopy in MACS J0416.1-2403, we measure a projected gas-to-total mass fraction of approximately 10% at 350 kpc from the cluster center. Compared to the results of a more traditional cluster mass model (diffuse halos plus member galaxies), we find a significant difference in the cumulative projected mass profile of the dark matter component and that the dark matter over total mass fraction is almost constant, out to more than 350 kpc. In the coming era of large surveys, these results show the need of multiprobe analyses for detailed dark matter studies in galaxy clusters.

  17. Extreme event distribution in Space Weather: Characterization of heavy tail distribution using Hurst exponents

    NASA Astrophysics Data System (ADS)

    Setty, V.; Sharma, A.

    2013-12-01

    Characterization of extreme space weather conditions is essential for potential mitigation strategies. The non-equilibrium nature of the magnetosphere complicates such efforts, and new techniques are required to understand its extreme event distribution. The heavy tail distribution in such systems can be modeled using a stable distribution, whose stability parameter is a measure of scaling in the cumulative distribution and is related to the Hurst exponent. This exponent can be readily measured in stationary time series using several techniques; detrended fluctuation analysis (DFA) is widely used in the presence of non-stationarities. However, DFA has severe limitations in cases with nonlinear and atypical trends. We propose a new technique that uses nonlinear dynamical predictions as a measure of trends and estimates the Hurst exponents. Furthermore, such a measure provides a new way to characterize predictability, as perfectly detrended data have no long-term memory, akin to Gaussian noise. Ab initio calculation of weekly Hurst exponents using the auroral electrojet index AL over a span of a few decades shows that these exponents are time varying, and so is the fractal structure. Such time series with time-varying Hurst exponents are modeled well by multifractional Brownian motion, and it is shown that DFA estimates a single time-averaged value of the Hurst exponent for such data. Our results show that using the time-varying Hurst exponent structure, we can (a) estimate the stability parameter, a measure of scaling in heavy tails, (b) define and identify epochs when the magnetosphere switches between regimes with and without extreme events, and (c) study the dependence of the Hurst exponents on solar activity.
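
    For reference, the standard DFA estimator that the authors contrast with their method can be written compactly. This is a minimal first-order DFA sketch on synthetic data (a stand-in for the AL index); the scale choices are hypothetical.

    ```python
    import numpy as np

    def dfa(x, scales):
        """First-order detrended fluctuation analysis.

        Returns the fluctuation function F(s) for each window size s; the Hurst
        exponent is the slope of log F(s) against log s."""
        y = np.cumsum(x - np.mean(x))                        # integrated profile
        F = []
        for s in scales:
            n_seg = len(y) // s
            sq_err = []
            for i in range(n_seg):
                seg = y[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear trend
                sq_err.append(np.mean((seg - trend) ** 2))
            F.append(np.sqrt(np.mean(sq_err)))
        return np.array(F)

    # Example: white noise should give a Hurst exponent near 0.5
    x = np.random.default_rng(0).normal(size=4096)
    scales = np.array([16, 32, 64, 128, 256])
    F = dfa(x, scales)
    hurst = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(f"estimated Hurst exponent: {hurst:.2f}")
    ```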

  18. Updating Landsat-derived land-cover maps using change detection and masking techniques

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.

    1982-01-01

    The California Integrated Remote Sensing System's San Bernardino County Project was devised to study the utilization of a data base at a number of jurisdictional levels. The present paper discusses the implementation of change-detection and masking techniques in the updating of Landsat-derived land-cover maps. A baseline landcover classification was first created from a 1976 image, then the adjusted 1976 image was compared with a 1979 scene by the techniques of (1) multidate image classification, (2) difference image-distribution tails thresholding, (3) difference image classification, and (4) multi-dimensional chi-square analysis of a difference image. The union of the results of methods 1, 3 and 4 was used to create a mask of possible change areas between 1976 and 1979, which served to limit analysis of the update image and reduce comparison errors in unchanged areas. The techniques of spatial smoothing of change-detection products, and of combining results of difference change-detection algorithms are also shown to improve Landsat change-detection accuracies.
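
    A minimal sketch of two of the listed ideas - per-band difference-image tails thresholding and a chi-square-style multi-band statistic, with their union as a change mask - might look like the following. The thresholds and the simplified chi-square cutoff are illustrative, not the project's calibrated values.

    ```python
    import numpy as np

    def change_mask(img_t1, img_t2, k=2.0):
        """Union of two simple change detectors on co-registered multiband
        images shaped (bands, rows, cols)."""
        diff = img_t2.astype(float) - img_t1.astype(float)
        mu = diff.mean(axis=(1, 2), keepdims=True)
        sd = diff.std(axis=(1, 2), keepdims=True)
        tails = np.abs(diff - mu) > k * sd          # per-band tail threshold
        z = (diff - mu) / sd
        chi2 = (z ** 2).sum(axis=0)                 # multiband squared z-score
        return tails.any(axis=0) | (chi2 > k * k * diff.shape[0])
    ```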

  19. Estimating air chemical emissions from research activities using stack measurement data.

    PubMed

    Ballinger, Marcel Y; Duchsherer, Cheryl J; Woodruff, Rodger K; Larson, Timothy V

    2013-03-01

    Current methods of estimating air emissions from research and development (R&D) activities use a wide range of release fractions or emission factors with bases ranging from empirical to semi-empirical. Although considered conservative, the uncertainties and confidence levels of the existing methods have not been reported. Chemical emissions were estimated from sampling data taken from four research facilities over 10 years. The approach was to use a Monte Carlo technique to create distributions of annual emission estimates for target compounds detected in source test samples. Distributions were created for each year and building sampled for compounds with sufficient detection frequency to qualify for the analysis. The results using the Monte Carlo technique without applying a filter to remove negative emission values showed almost all distributions spanning zero, and 40% of the distributions having a negative mean. This indicates that emissions are so low as to be indistinguishable from building background. Application of a filter to allow only positive values in the distribution provided a more realistic value for emissions and increased the distribution mean by an average of 16%. Release fractions were calculated by dividing the emission estimates by a building chemical inventory quantity. Two variations were used for this quantity: chemical usage, and chemical usage plus one-half standing inventory. Filters were applied so that only release fraction values from zero to one were included in the resulting distributions. Release fractions had a wide range among chemicals and among data sets for different buildings and/or years for a given chemical. Regressions of release fractions to molecular weight and vapor pressure showed weak correlations. Similarly, regressions of mean emissions to chemical usage, chemical inventory, molecular weight, and vapor pressure also gave weak correlations. These results highlight the difficulties in estimating emissions from R&D facilities using chemical inventory data. Air emissions from research operations are difficult to estimate because of the changing nature of research processes and the small quantity and wide variety of chemicals used. Analysis of stack measurements taken over multiple facilities and a 10-year period using a Monte Carlo technique provided a method to quantify the low emissions and to estimate release fractions based on chemical inventories. The variation in release fractions did not correlate well with factors investigated, confirming the complexities in estimating R&D emissions.
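
    The positive-filter step can be illustrated with a toy Monte Carlo calculation. All numbers below are hypothetical; the point is that when the sampled mean is near building background, truncating negative draws raises the distribution mean, as the study reports.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical stack-sampling statistics for one compound: mean concentration
    # (mg/m^3), its standard error, and an annual stack flow (m^3/yr).
    conc_mean, conc_se = 0.012, 0.020
    flow = 5.0e7

    # Monte Carlo draws of the annual emission estimate (mg/yr); near-zero means
    # produce many negative, physically meaningless draws.
    draws = rng.normal(conc_mean, conc_se, size=100_000) * flow

    positive = draws[draws > 0]            # filter: keep only positive emissions
    print(f"unfiltered mean: {draws.mean():.3e} mg/yr")
    print(f"filtered mean:   {positive.mean():.3e} mg/yr "
          f"({100 * (positive.mean() / draws.mean() - 1):.0f}% higher)")
    ```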

  20. Knowledge-based system for detailed blade design of turbines

    NASA Astrophysics Data System (ADS)

    Goel, Sanjay; Lamson, Scott

    1994-03-01

    A design optimization methodology that couples optimization techniques to CFD analysis for design of airfoils is presented. This technique optimizes 2D airfoil sections of a blade by minimizing the deviation of the actual Mach number distribution on the blade surface from a smooth fit of the distribution. The airfoil is not reverse engineered by specification of a precise distribution of the desired Mach number plot, only general desired characteristics of the distribution are specified for the design. Since the Mach number distribution is very complex, and cannot be conveniently represented by a single polynomial, it is partitioned into segments, each of which is characterized by a different order polynomial. The sum of the deviation of all the segments is minimized during optimization. To make intelligent changes to the airfoil geometry, it needs to be associated with features observed in the Mach number distribution. Associating the geometry parameters with independent features of the distribution is a fairly complex task. Also, for different optimization techniques to work efficiently the airfoil geometry needs to be parameterized into independent parameters, with enough degrees of freedom for adequate geometry manipulation. A high-pressure, low reaction steam turbine blade section was optimized using this methodology. The Mach number distribution was partitioned into pressure and suction surfaces and the suction surface distribution was further subdivided into leading edge, mid section and trailing edge sections. Two different airfoil representation schemes were used for defining the design variables of the optimization problem. The optimization was performed by using a combination of heuristic search and numerical optimization. The optimization results for the two schemes are discussed in the paper. The results are also compared to a manual design improvement study conducted independently by an experienced airfoil designer. The turbine blade optimization system (TBOS) is developed using the described methodology of coupling knowledge engineering with multiple search techniques for blade shape optimization. TBOS removes a major bottleneck in the design cycle by performing multiple design optimizations in parallel, and improves design quality at the same time. TBOS not only improves the design but also the designers' quality of work by taking the mundane repetitive task of design iterations away and leaving them more time for innovative design.
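
    The objective described - the summed deviation of the surface Mach number distribution from smooth per-segment polynomial fits - can be sketched as follows. The segment breakpoints and polynomial orders are placeholders standing in for the leading edge / mid section / trailing edge split; the toy data are invented.

    ```python
    import numpy as np

    def smoothness_deviation(s, mach, breakpoints, orders):
        """Sum over segments of the RMS deviation of the surface Mach number
        distribution from a per-segment polynomial fit.

        s, mach: arc-length positions and Mach numbers along the blade surface;
        breakpoints: indices splitting the distribution into segments;
        orders: polynomial fit order for each segment."""
        total = 0.0
        edges = [0, *breakpoints, len(s)]
        for (i, j), order in zip(zip(edges[:-1], edges[1:]), orders):
            coef = np.polyfit(s[i:j], mach[i:j], order)
            resid = mach[i:j] - np.polyval(coef, s[i:j])
            total += np.sqrt(np.mean(resid ** 2))
        return total

    # Toy usage: three segments fitted with different-order polynomials
    s = np.linspace(0.0, 1.0, 120)
    mach = 0.6 + 0.3 * np.sin(2.5 * s) + 0.01 * np.random.default_rng(1).normal(size=120)
    print(smoothness_deviation(s, mach, breakpoints=[40, 90], orders=[2, 3, 2]))
    ```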

  1. Four lateral mass screw fixation techniques in lower cervical spine following laminectomy: a finite element analysis study of stress distribution.

    PubMed

    Song, Mingzhi; Zhang, Zhen; Lu, Ming; Zong, Junwei; Dong, Chao; Ma, Kai; Wang, Shouyu

    2014-08-09

    Lateral mass screw fixation (LSF) techniques have been widely used for reconstructing and stabilizing the cervical spine; however, complications may result depending on the surgeon's choice of technique. There are only a few reports on LSF applications, even though fixation fracture has become a severe complication. This study establishes a three-dimensional finite element model of the lower cervical spine and compares the stress distribution of four LSF techniques (Magerl, Roy-Camille, Anderson, and An) following laminectomy, to explore the risk of rupture after fixation. CT scans were performed on a healthy adult female volunteer, and Digital Imaging and Communications in Medicine (DICOM) data were obtained. The Mimics 10.01, Geomagic Studio 12.0, Solidworks 2012, HyperMesh 10.1 and Abaqus 6.12 software programs were used to establish the intact model of the lower cervical spine (C3-C7), a postoperative model after laminectomy, and a reconstructive model after applying the LSF techniques. A compressive preload of 74 N combined with a pure moment of 1.8 Nm was applied to the intact and reconstructive models, simulating normal flexion, extension, lateral bending, and axial rotation. The stress distribution of the four LSF techniques was compared by analyzing the maximum von Mises stress. The three-dimensional finite element model of the intact C3-C7 vertebrae was successfully established. The model consists of 503,911 elements and 93,390 nodes. During flexion, extension, lateral bending, and axial rotation, the intact model's angular intersegmental range of motion was in good agreement with results reported in the literature. The postoperative model after the three-segment laminectomy and the reconstructive model after applying the four LSF techniques were established based on the validated intact model. The stress distributions for the Magerl and Roy-Camille groups were more dispersed, and the maximum von Mises stress levels were lower than in the other two groups under the various loading conditions. The LSF techniques of Magerl and Roy-Camille are safer methods for stabilizing the lower cervical spine. Therefore, these methods potentially have a lower risk of fixation fracture.

  2. Four lateral mass screw fixation techniques in lower cervical spine following laminectomy: a finite element analysis study of stress distribution

    PubMed Central

    2014-01-01

    Background Lateral mass screw fixation (LSF) techniques have been widely used for reconstructing and stabilizing the cervical spine; however, complications may result depending on the surgeon's choice of technique. There are only a few reports on LSF applications, even though fixation fracture has become a severe complication. This study establishes a three-dimensional finite element model of the lower cervical spine and compares the stress distribution of four LSF techniques (Magerl, Roy-Camille, Anderson, and An) following laminectomy, to explore the risk of rupture after fixation. Method CT scans were performed on a healthy adult female volunteer, and Digital Imaging and Communications in Medicine (DICOM) data were obtained. The Mimics 10.01, Geomagic Studio 12.0, Solidworks 2012, HyperMesh 10.1 and Abaqus 6.12 software programs were used to establish the intact model of the lower cervical spine (C3-C7), a postoperative model after laminectomy, and a reconstructive model after applying the LSF techniques. A compressive preload of 74 N combined with a pure moment of 1.8 Nm was applied to the intact and reconstructive models, simulating normal flexion, extension, lateral bending, and axial rotation. The stress distribution of the four LSF techniques was compared by analyzing the maximum von Mises stress. Result The three-dimensional finite element model of the intact C3-C7 vertebrae was successfully established. The model consists of 503,911 elements and 93,390 nodes. During flexion, extension, lateral bending, and axial rotation, the intact model’s angular intersegmental range of motion was in good agreement with results reported in the literature. The postoperative model after the three-segment laminectomy and the reconstructive model after applying the four LSF techniques were established based on the validated intact model. The stress distributions for the Magerl and Roy-Camille groups were more dispersed, and the maximum von Mises stress levels were lower than in the other two groups under the various loading conditions. Conclusion The LSF techniques of Magerl and Roy-Camille are safer methods for stabilizing the lower cervical spine. Therefore, these methods potentially have a lower risk of fixation fracture. PMID:25106498

  3. Validation of helicopter noise prediction techniques

    NASA Technical Reports Server (NTRS)

    Succi, G. P.

    1981-01-01

    The current techniques of helicopter rotor noise prediction attempt to describe the details of the noise field precisely and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The purpose of this paper is to review those techniques in general and the Farassat/Nystrom analysis in particular. The predictions of the Farassat/Nystrom noise computer program, using both measured and calculated blade surface pressure data, are compared to measured noise level data. This study is based on a contract from NASA to Bolt Beranek and Newman Inc. with measured data from the AH-1G Helicopter Operational Loads Survey flight test program supplied by Bell Helicopter Textron.

  4. Analyzing the field of bioinformatics with the multi-faceted topic modeling technique.

    PubMed

    Heo, Go Eun; Kang, Keun Young; Song, Min; Lee, Jeong-Hoon

    2017-05-31

    Bioinformatics is an interdisciplinary field at the intersection of molecular biology and computing technology. To characterize the field as a convergent domain, researchers have used bibliometrics augmented with text-mining techniques for content analysis. In previous studies, Latent Dirichlet Allocation (LDA) was the most representative topic modeling technique for identifying the topic structure of subject areas. However, as opposed to revealing the topic structure in relation to metadata such as authors, publication date, and journals, LDA only reveals a simple topic structure. In this paper, we adopt Tang et al.'s Author-Conference-Topic (ACT) model to study the field of bioinformatics from the perspective of keyphrases, authors, and journals. The ACT model is capable of incorporating the paper, author, and conference into the topic distribution simultaneously. To obtain more meaningful results, we use journals and keyphrases instead of conferences and bag-of-words. For analysis, we used PubMed to collect forty-six bioinformatics journals from the MEDLINE database. We conducted time series topic analysis over four periods from 1996 to 2015 to further examine the interdisciplinary nature of bioinformatics, and analyzed the ACT model results in each period. Additionally, for further integrated analysis, we conducted a time series analysis among the top-ranked keyphrases, journals, and authors according to their frequency. We also examined the patterns in the top journals by simultaneously identifying the topical probability in each period, as well as the top authors and keyphrases. The results indicate that in recent years diversified topics have become more prevalent and convergent topics have become more clearly represented. The results of our analysis imply that over time the field of bioinformatics has become more interdisciplinary, with a steady increase in peripheral fields such as conceptual, mathematical, and systems biology. These results are confirmed by an integrated analysis of topic distribution as well as top-ranked keyphrases, authors, and journals.
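
    For contrast with the ACT model, a plain LDA baseline over keyphrase "documents" can be set up in a few lines with gensim. This sketch is not the ACT model itself, and the toy keyphrase lists are invented for illustration.

    ```python
    from gensim import corpora, models

    # Hypothetical pre-extracted keyphrase lists, one per paper
    docs = [["gene", "expression", "network"],
            ["protein", "structure", "prediction"],
            ["gene", "network", "inference"],
            ["protein", "folding", "structure"]]

    dictionary = corpora.Dictionary(docs)                 # keyphrase vocabulary
    corpus = [dictionary.doc2bow(d) for d in docs]        # bag-of-keyphrases

    lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                          random_state=0, passes=10)
    for topic_id, words in lda.print_topics():
        print(topic_id, words)
    ```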

  5. Biomechanical symmetry in elite rugby union players during dynamic tasks: an investigation using discrete and continuous data analysis techniques.

    PubMed

    Marshall, Brendan; Franklyn-Miller, Andrew; Moran, Kieran; King, Enda; Richter, Chris; Gore, Shane; Strike, Siobhán; Falvey, Éanna

    2015-01-01

    While measures of asymmetry may provide a means of identifying individuals predisposed to injury, normative asymmetry values for challenging sport specific movements in elite athletes are currently lacking in the literature. In addition, previous studies have typically investigated symmetry using discrete point analyses alone. This study examined biomechanical symmetry in elite rugby union players using both discrete point and continuous data analysis techniques. Twenty elite injury free international rugby union players (mean ± SD: age 20.4 ± 1.0 years; height 1.86 ± 0.08 m; mass 98.4 ± 9.9 kg) underwent biomechanical assessment. A single leg drop landing, a single leg hurdle hop, and a running cut were analysed. Peak joint angles and moments were examined in the discrete point analysis while analysis of characterising phases (ACP) techniques were used to examine the continuous data. Dominant side was compared to non-dominant side using dependent t-tests for normally distributed data or Wilcoxon signed-rank test for non-normally distributed data. The significance level was set at α = 0.05. The majority of variables were found to be symmetrical with a total of 57/60 variables displaying symmetry in the discrete point analysis and 55/60 in the ACP. The five variables that were found to be asymmetrical were hip abductor moment in the drop landing (p = 0.02), pelvis lift/drop in the drop landing (p = 0.04) and hurdle hop (p = 0.02), ankle internal rotation moment in the cut (p = 0.04) and ankle dorsiflexion angle also in the cut (p = 0.01). The ACP identified two additional asymmetries not identified in the discrete point analysis. Elite injury free rugby union players tended to exhibit bi-lateral symmetry across a range of biomechanical variables in a drop landing, hurdle hop and cut. This study provides useful normative values for inter-limb symmetry in these movement tests. When examining symmetry it is recommended to incorporate continuous data analysis techniques rather than a discrete point analysis alone; a discrete point analysis was unable to detect two of the five asymmetries identified.

  6. Development of indirect EFBEM for radiating noise analysis including underwater problems

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun

    2013-09-01

    For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is an analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). Fundamental solutions representing the spherical wave property of radiating noise problems in the open field, and accounting for the free surface effect underwater, are developed. A directivity factor is also developed to express the directivity patterns of waves in medium-to-high frequency ranges. Indirect EFBEM using the fundamental solutions and fictitious sources was applied to open field and underwater noise problems successfully. Through numerical applications, the acoustic energy density distributions due to vibration of a simple plate model and a sphere model were compared with those of a commercial code, and the comparison showed good agreement in the level and pattern of the energy density distributions.

  7. A critical examination of stresses in an elastic single lap joint

    NASA Technical Reports Server (NTRS)

    Cooper, P. A.; Sawyer, J. W.

    1979-01-01

    The results of an approximate nonlinear finite-element analysis of a single lap joint are presented and compared with the results of a linear finite-element analysis, and the geometric nonlinear effects caused by the load-path eccentricity on the adhesive stress distributions are determined. The results from finite-element, Goland-Reissner, and photoelastic analyses show that for a single lap joint the effect of the geometric nonlinear behavior of the joint has a sizable effect on the stresses in the adhesive. The Goland-Reissner analysis is sufficiently accurate in the prediction of stresses along the midsurface of the adhesive bond to be used for qualitative evaluation of the influence of geometric or material parametric variations. Detailed stress distributions in both the adherend and adhesive obtained from the finite-element analysis are presented to provide a basis for comparison with other solution techniques.

  8. Evaluation of the environmental contamination at an abandoned mining site using multivariate statistical techniques--the Rodalquilar (Southern Spain) mining district.

    PubMed

    Bagur, M G; Morales, S; López-Chicano, M

    2009-11-15

    Unsupervised and supervised pattern recognition techniques such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of the mining activity on waters was monitored by determining the concentration of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation was used to bring the data set into approximately normal form, in order to minimize the effect of the non-normal distribution of the geochemical data. The environmental impact is driven mainly by the mining activity developed in the zone, by the acid drainage and, finally, by the chemical treatment used for gold beneficiation.
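
    The Box-Cox step can be reproduced with SciPy, which also estimates the transformation parameter lambda by maximum likelihood. A minimal sketch on synthetic positively skewed concentrations follows; the data here are simulated, not the Rodalquilar measurements.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical positively skewed element concentrations; Box-Cox requires
    # strictly positive values.
    conc = rng.lognormal(mean=2.0, sigma=1.0, size=200)

    transformed, lam = stats.boxcox(conc)   # lambda estimated by maximum likelihood
    print(f"estimated lambda: {lam:.3f}")
    print(f"skewness before: {stats.skew(conc):.2f}, "
          f"after: {stats.skew(transformed):.2f}")
    ```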

  9. Vibration Signature Analysis of a Faulted Gear Transmission System

    NASA Technical Reports Server (NTRS)

    Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.

    1994-01-01

    A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time synchronous averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques, including time domain and frequency domain analysis methods. Using photographs of the gear teeth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
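
    A minimal discrete Wigner-Ville implementation, applied to the analytic signal of a vibration record, can be written as follows. This is a basic form without the smoothing windows a production gear-diagnostics code would typically add, and the chirp example is a generic stand-in for real vibration data.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def wigner_ville(signal):
        """Discrete Wigner-Ville distribution (rows: frequency bins, cols: time)."""
        x = hilbert(signal)                      # analytic signal reduces aliasing
        n_samples = len(x)
        wvd = np.zeros((n_samples, n_samples))
        for n in range(n_samples):
            tau_max = min(n, n_samples - 1 - n)  # symmetric lags that stay in range
            taus = np.arange(-tau_max, tau_max + 1)
            kernel = np.zeros(n_samples, dtype=complex)
            kernel[taus % n_samples] = x[n + taus] * np.conj(x[n - taus])
            wvd[:, n] = np.fft.fft(kernel).real  # instantaneous autocorrelation -> spectrum
        return wvd

    # Example: a linear chirp appears as a tilted ridge in the time-frequency plane
    t = np.linspace(0, 1, 256, endpoint=False)
    chirp = np.sin(2 * np.pi * (10 * t + 40 * t ** 2))
    tfr = wigner_ville(chirp)
    ```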

  10. Frequent Statement and Dereference Elimination for Imperative and Object-Oriented Distributed Programs

    PubMed Central

    El-Zawawy, Mohamed A.

    2014-01-01

    This paper introduces new approaches for the analysis of frequent statement and dereference elimination for imperative and object-oriented distributed programs running on parallel machines equipped with hierarchical memories. The paper uses languages whose address spaces are globally partitioned. Distributed programs allow defining data layouts and threads that write to and read from other threads' memories. Three type systems (for imperative distributed programs) are the tools of the proposed techniques. The first type system defines for every program point a set of calculated (ready) statements and memory accesses. The second type system uses an enriched version of the types of the first system and determines which of the ready statements and memory accesses are used later in the program. The third type system uses the information gathered so far to eliminate unnecessary statement computations and memory accesses (the analysis of frequent statement and dereference elimination). Extensions of these type systems to cover object-oriented distributed programs are also presented. Two advantages of our work over related work are the following: the hierarchical style of concurrent parallel computers is similar to the memory model used in this paper, and in our approach each analysis result is assigned a type derivation that serves as a correctness proof. PMID:24892098

  11. A System for the Semantic Multimodal Analysis of News Audio-Visual Content

    NASA Astrophysics Data System (ADS)

    Mezaris, Vasileios; Gidaros, Spyros; Papadopoulos, Georgios Th.; Kasper, Walter; Steffen, Jörg; Ordelman, Roeland; Huijbregts, Marijn; de Jong, Franciska; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2010-12-01

    News-related content is nowadays among the most popular types of content for users in everyday applications. Although the generation and distribution of news content has become commonplace, due to the availability of inexpensive media capturing devices and the development of media sharing services targeting both professional and user-generated news content, the automatic analysis and annotation that is required for supporting intelligent search and delivery of this content remains an open issue. In this paper, a complete architecture for knowledge-assisted multimodal analysis of news-related multimedia content is presented, along with its constituent components. The proposed analysis architecture employs state-of-the-art methods for the analysis of each individual modality (visual, audio, text) separately and proposes a novel fusion technique based on the particular characteristics of news-related content for the combination of the individual modality analysis results. Experimental results on news broadcast video illustrate the usefulness of the proposed techniques in the automatic generation of semantic annotations.

  12. Measuring grain boundary character distributions in Ni-base alloy 725 using high-energy diffraction microscopy

    DOE PAGES

    Bagri, Akbar; Hanson, John P.; Lind, J. P.; ...

    2016-10-25

    We use high-energy X-ray diffraction microscopy (HEDM) to characterize the microstructure of Ni-base alloy 725. HEDM is a non-destructive technique capable of providing three-dimensional reconstructions of grain shapes and orientations in polycrystals. The present analysis yields the grain size distribution in alloy 725 as well as the grain boundary character distribution (GBCD) as a function of lattice misorientation and boundary plane normal orientation. We find that the GBCD of Ni-base alloy 725 is similar to that previously determined in pure Ni and other fcc-base metals. We find an elevated density of Σ9 and Σ3 grain boundaries. We also observe a preponderance of grain boundaries along low-index planes, with those along (1 1 1) planes being the most common, even after Σ3 twins have been excluded from the analysis.

  13. Agricultural Aircraft Aid

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Farmers are increasingly turning to aerial application of pesticides, fertilizers and other materials. Uneven distribution of the chemicals is sometimes caused by worn nozzles, improper alignment of spray nozzles or system leaks. If this happens, the job must be redone, with added expense to both the pilot and the customer. Traditional pattern analysis techniques take days or weeks. Utilizing NASA's wind tunnel and computer validation technology, Dr. Roth of Oklahoma State University (OSU) developed a system for providing answers within minutes. Called the Rapid Distribution Pattern Evaluation System, the OSU system consists of a 100-foot measurement frame tied in to computerized analysis and readout equipment. The system is mobile, delivered by trailer to airfields in agricultural areas where OSU conducts educational "fly-ins." A fly-in typically draws 50 to 100 aerial applicators, researchers, chemical suppliers and regulatory officials. An applicator can have his spray pattern checked, and a computerized readout, available in five to 12 minutes, provides information for correcting shortcomings in the distribution pattern.

  14. Analysis of a simulation algorithm for direct brain drug delivery

    PubMed Central

    Rosenbluth, Kathryn Hammond; Eschermann, Jan Felix; Mittermeyer, Gabriele; Thomson, Rowena; Mittermeyer, Stephan; Bankiewicz, Krystof S.

    2011-01-01

    Convection enhanced delivery (CED) achieves targeted delivery of drugs with a pressure-driven infusion through a cannula placed stereotactically in the brain. This technique bypasses the blood brain barrier and gives precise distributions of drugs, minimizing off-target effects of compounds such as viral vectors for gene therapy or toxic chemotherapy agents. The exact distribution is affected by the cannula positioning, flow rate and underlying tissue structure. This study presents an analysis of a simulation algorithm for predicting the distribution using baseline MRI images acquired prior to inserting the cannula. The MRI images included diffusion tensor imaging (DTI) to estimate the tissue properties. The algorithm was adapted for the devices and protocols identified for upcoming trials and validated with direct MRI visualization of Gadolinium in 20 infusions in non-human primates. We found strong agreement between the size and location of the simulated and gadolinium volumes, demonstrating the clinical utility of this surgical planning algorithm. PMID:21945468

  15. Preparation and characterization of 'green' hybrid clay-dye nanopigments

    NASA Astrophysics Data System (ADS)

    Kaya, Mehmet; Onganer, Yavuz; Tabak, Ahmet

    2015-03-01

    We obtained a low-cost and abundant nanopigment material composed of the organic dye Rhodamine B (Rh-B) and Unye bentonite (UB) clay from Turkey. The nanopigment was characterized using scanning electron microscopy (SEM), particle size distribution, powder X-ray diffraction (PXRD), Fourier transform infrared spectroscopy (FT-IR) and thermal analysis techniques. According to the texture analyses, the particle size distribution (d(0.5) of the distribution) of the Rh-B/UB nanopigment material was around 100 nm in diameter. SEM images also demonstrated that the samples had a particle size around nm in diameter. As seen in the PXRD and thermal analysis, there is a difference in basal spacing of 1.46° (2θ) and a higher mass loss of 7.80% in the temperature range 200-500 °C compared to the raw bentonite.

  16. The global impact distribution of Near-Earth objects

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.

    2016-02-01

    Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases with objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking into account the impact probabilities introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.

  17. LA-iMageS: a software for elemental distribution bioimaging using LA-ICP-MS data.

    PubMed

    López-Fernández, Hugo; de S Pessôa, Gustavo; Arruda, Marco A Z; Capelo-Martínez, José L; Fdez-Riverola, Florentino; Glez-Peña, Daniel; Reboiro-Jato, Miguel

    2016-01-01

    The spatial distribution of chemical elements in different types of samples is an important field in several research areas such as biology, paleontology or biomedicine, among others. Elemental distribution imaging by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is an effective technique for qualitative and quantitative imaging due to its high spatial resolution and sensitivity. By applying this technique, vast amounts of raw data are generated to obtain high-quality images, essentially making the use of specific LA-ICP-MS imaging software that can process such data absolutely mandatory. Since existing solutions are usually commercial or hard-to-use for average users, this work introduces LA-iMageS, an open-source, free-to-use multiplatform application for fast and automatic generation of high-quality elemental distribution bioimages from LA-ICP-MS data in the PerkinElmer Elan XL format, whose results can be directly exported to external applications for further analysis. A key strength of LA-iMageS is its substantial added value for users, with particular regard to the customization of the elemental distribution bioimages, which allows, among other features, the ability to change color maps, increase image resolution or toggle between 2D and 3D visualizations.

  18. Hybrid approach combining multiple characterization techniques and simulations for microstructural analysis of proton exchange membrane fuel cell electrodes

    NASA Astrophysics Data System (ADS)

    Cetinbas, Firat C.; Ahluwalia, Rajesh K.; Kariuki, Nancy; De Andrade, Vincent; Fongalland, Dash; Smith, Linda; Sharman, Jonathan; Ferreira, Paulo; Rasouli, Somaye; Myers, Deborah J.

    2017-03-01

    The cost and performance of proton exchange membrane fuel cells strongly depend on the cathode electrode, owing to the use of expensive platinum (Pt)-group metal catalysts and sluggish reaction kinetics. Developing low-Pt-content, high-performance cathodes requires a comprehensive understanding of the electrode microstructure. In this study, a new approach is presented to characterize the detailed cathode electrode microstructure from nm to μm length scales by combining information from different experimental techniques. In this context, nano-scale X-ray computed tomography (nano-CT) is performed to extract the secondary pore space of the electrode. Transmission electron microscopy (TEM) is employed to determine primary carbon particle and Pt particle size distributions. X-ray scattering, with its ability to provide size distributions from orders of magnitude more particles than TEM, is used to confirm the TEM-determined size distributions. The number of primary pores that cannot be resolved by nano-CT is approximated using mercury intrusion porosimetry. An algorithm is developed to incorporate all these experimental data in one geometric representation. After validation of the pore size distribution against gas adsorption and mercury intrusion porosimetry data, the reconstructed ionomer size distribution is reported. In addition, transport-related characteristics and effective properties are computed by performing simulations on the hybrid microstructure.

  19. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    PubMed Central

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several types of experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increase. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate whether the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed; otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher sensitivity and lower bias than can be attained using standard and invariant normalization methods. PMID:22132175
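
    As a hedged illustration of step (2) of this work-flow: the DSE-test itself is specific to the paper, so the sketch below substitutes an ordinary sample-skewness test on the log-ratios (synthetic toy data) to decide whether re-normalization might be warranted.

        import numpy as np
        from scipy.stats import skew, skewtest

        def looks_skewed(log_ratios, alpha=0.05):
            # Stand-in for the DSE-test: flag an experiment whose
            # fold-change distribution departs significantly from symmetry.
            _, p = skewtest(log_ratios)
            return p < alpha, skew(log_ratios)

        rng = np.random.default_rng(0)
        m_values = rng.gamma(2.0, 1.0, 10000) - 2.0  # skewed toy log-ratios
        flagged, g1 = looks_skewed(m_values)
        print(flagged, round(g1, 2))  # True for this skewed sample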

  20. SEPARATION OF THE RIBBON FROM GLOBALLY DISTRIBUTED ENERGETIC NEUTRAL ATOM FLUX USING THE FIRST FIVE YEARS OF IBEX OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwadron, N. A.; Moebius, E.; Kucharek, H.

    2014-11-01

    The Interstellar Boundary Explorer (IBEX) observes the IBEX ribbon, which stretches across much of the sky observed in energetic neutral atoms (ENAs). The ribbon covers a narrow (∼20°-50°) region that is believed to be roughly perpendicular to the interstellar magnetic field. Superimposed on the IBEX ribbon is the globally distributed flux that is controlled by the processes and properties of the heliosheath. This is a second study that utilizes a previously developed technique to separate ENA emissions in the ribbon from the globally distributed flux. A transparency mask is applied over the ribbon and regions of high emissions. We then solve for the globally distributed flux using an interpolation scheme. Previously, ribbon separation techniques were applied to the first year of IBEX-Hi data at and above 0.71 keV. Here we extend the separation analysis down to 0.2 keV and to five years of IBEX data, enabling first maps of the ribbon and the globally distributed flux across the full sky of ENA emissions. Our analysis shows the broadening of the ribbon peak at energies below 0.71 keV and demonstrates the apparent deformation of the ribbon in the nose and heliotail. We show global asymmetries of the heliosheath, including both deflection of the heliotail and differing widths of the lobes, in the context of the direction, draping, and compression of the heliospheric magnetic field. We discuss implications of the ribbon maps for the wide array of concepts that attempt to explain the ribbon's origin. Thus, we present the five-year separation of the IBEX ribbon from the globally distributed flux in preparation for a formal IBEX data release of ribbon and globally distributed flux maps to the heliophysics community.
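
    A minimal sketch of the masking-and-interpolation idea, assuming the ENA sky map is a 2D array with the ribbon region flagged by a boolean mask; the actual IBEX separation scheme is considerably more elaborate.

        import numpy as np
        from scipy.interpolate import griddata

        def fill_masked(ena_map, ribbon_mask):
            # Estimate the globally distributed flux by interpolating
            # across the pixels hidden by the transparency mask.
            ny, nx = ena_map.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            known = ~ribbon_mask
            return griddata((yy[known], xx[known]), ena_map[known],
                            (yy, xx), method="cubic")

        # Subtracting the filled map from the original then isolates the ribbon:
        # ribbon = ena_map - fill_masked(ena_map, ribbon_mask)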

  1. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing

    PubMed Central

    da Rocha, Armando Freitas; Foz, Flávia Benevides; Pereira, Alfredo

    2015-01-01

    Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism, which proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using the electroencephalogram while volunteers were listening to or reading small texts and had to select pictures that translate the meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (s_i) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(e_i) provided by each electrode of the 10/20 system about the identified s_i. Principal Component Analysis (PCA) of H(e_i) was used to study the temporal and spatial activation of these sources s_i. This analysis evidenced four different patterns of H(e_i) covariation that are generated by neurons located at different cortical locations. These results show that the distributed character of language processing is clearly evidenced by combining available EEG technologies. PMID:26713089

  2. Quadrantal multi-scale distribution entropy analysis of heartbeat interval series based on a modified Poincaré plot

    NASA Astrophysics Data System (ADS)

    Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao

    2013-09-01

    The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigations are still needed into techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measure named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative description of scatter distribution patterns in various regions and at various temporal scales. We apply this method to heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discrimination between them is most significant in the first quadrant, which implies that CHF has a significant impact on vagal regulation. We also investigate the day-night differences of young healthy people, and the results present a clear circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate in different trends as the scale factor varies. The same phenomenon also appears in the circadian rhythm investigations of young healthy subjects, which implies that the cardiac dynamic system is affected differently at various temporal scales by physiological or pathological factors.
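
    A hedged sketch of the underlying idea, assuming RR intervals in seconds: the paper's quadrantal, multi-scale variant partitions a modified Poincaré plot, whereas this toy computes a plain histogram entropy of the first-quadrant points.

        import numpy as np

        def first_quadrant_entropy(rr, bins=16):
            # Shannon entropy of the Poincare-plot point distribution,
            # restricted to points above the mean on both axes.
            x, y = rr[:-1], rr[1:]
            q1 = (x > rr.mean()) & (y > rr.mean())
            hist, _, _ = np.histogram2d(x[q1], y[q1], bins=bins)
            p = hist[hist > 0] / hist.sum()
            return -(p * np.log(p)).sum()

        rr = 0.8 + 0.05 * np.random.default_rng(1).standard_normal(3000)
        print(round(first_quadrant_entropy(rr), 3))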

  3. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing.

    PubMed

    Rocha, Armando Freitas da; Foz, Flávia Benevides; Pereira, Alfredo

    2015-01-01

    Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism, which proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using the electroencephalogram while volunteers were listening to or reading small texts and had to select pictures that translate the meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (s_i) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(e_i) provided by each electrode of the 10/20 system about the identified s_i. Principal Component Analysis (PCA) of H(e_i) was used to study the temporal and spatial activation of these sources s_i. This analysis evidenced four different patterns of H(e_i) covariation that are generated by neurons located at different cortical locations. These results show that the distributed character of language processing is clearly evidenced by combining available EEG technologies.

  4. Continuous distributions of specific ventilation recovered from inert gas washout

    NASA Technical Reports Server (NTRS)

    Lewis, S. M.; Evans, J. W.; Jalowayski, A. A.

    1978-01-01

    A new technique is described for recovering continuous distributions of ventilation as a function of the tidal ventilation/volume ratio from the nitrogen washout. The analysis represents the recovered distribution as fractional ventilations of 50 compartments plus dead space. The procedure was verified by recovering known distributions from data to which noise had been added. Using an apparatus to control the subject's tidal volume and FRC, mixed expired N2 data gave the following results: (a) the distributions of young, normal subjects were narrow and unimodal; (b) those of subjects over age 40 were broader, with more poorly ventilated units; (c) patients with pulmonary disease of all descriptions showed enlarged dead space; (d) patients with cystic fibrosis showed multimodal distributions with the bulk of the ventilation going to overventilated units; and (e) patients with obstructive lung disease fell into several classes, three of which are illustrated.
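
    A hedged sketch of this kind of recovery, assuming each compartment empties exponentially with its turnover ratio and that fractional ventilations must be non-negative; the design matrix and noise level are illustrative, not the authors' exact procedure.

        import numpy as np
        from scipy.optimize import nnls

        breaths = np.arange(1, 41)                 # breath number
        ratios = np.logspace(-2, 0.5, 50)          # tidal ventilation/volume ratios
        # Washout kernel: compartment j retains a fraction 1/(1+r_j) per breath.
        A = np.array([[(1.0 / (1.0 + r))**n for r in ratios] for n in breaths])

        true_w = np.exp(-0.5 * ((np.log(ratios) + 1.0) / 0.4)**2)  # toy distribution
        y = A @ true_w + 1e-3 * np.random.default_rng(2).standard_normal(breaths.size)

        w, _ = nnls(A, y)   # recovered non-negative fractional ventilations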

  5. Application of Markov Models for Analysis of Development of Psychological Characteristics

    ERIC Educational Resources Information Center

    Kuravsky, Lev S.; Malykh, Sergey B.

    2004-01-01

    A technique to study the combined influence of environmental and genetic factors on the basis of changes in phenotype distributions is presented. Histograms are used as the basic analyzed characteristics. A continuous-time, discrete-state Markov process with piecewise-constant interstate transition rates is associated with the evolution of each histogram.…

  6. Guidelines for collecting and maintaining archives for genetic monitoring

    Treesearch

    Jennifer A. Jackson; Linda Laikre; C. Scott Baker; Katherine C. Kendall; F. W. Allendorf; M. K. Schwartz

    2011-01-01

    Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time...

  7. Infrared imaging of cotton fiber bundles using a focal plane array detector and a single reflectance accessory

    USDA-ARS?s Scientific Manuscript database

    Infrared imaging is gaining attention as a technique used in the examination of cotton fibers. This type of imaging combines spectral analysis with spatial resolution to create visual images that examine sample composition and distribution. Herein, we report the use of an infrared instrument equippe...

  8. Compression of Index Term Dictionary in an Inverted-File-Oriented Database: Some Effective Algorithms.

    ERIC Educational Resources Information Center

    Wisniewski, Janusz L.

    1986-01-01

    Discussion of a new method of index term dictionary compression in an inverted-file-oriented database highlights a technique of word coding, which generates short fixed-length codes obtained from the index terms themselves by analysis of monogram and bigram statistical distributions. Substantial savings in communication channel utilization are…

  9. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). The uncertainties involved are quantified through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, the existing fuzzy lambda-tau (FLT) technique yields reliability parameters with a wide range of predictions, so the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory quantifies the uncertainties, a fault tree models the system, the lambda-tau method formulates mathematical expressions for the failure/repair rates of the system, and a genetic algorithm solves the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow the exponential distribution, i.e., they have constant failure rates. A sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying the other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
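
    A hedged sketch of the fuzzification step alone, assuming exponential components so that MTBF = 1/λ; alpha-cut interval arithmetic on a triangular fuzzy failure rate stands in for the paper's full fault tree/lambda-tau/genetic algorithm hybrid, and the numbers are illustrative.

        def alpha_cut(tri, alpha):
            # Interval of a triangular fuzzy number (low, mode, high)
            # at membership level alpha.
            lo, mode, hi = tri
            return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

        lam = (4e-4, 5e-4, 6e-4)   # fuzzy failure rate per hour (illustrative)
        for a in (0.0, 0.5, 1.0):
            l_lo, l_hi = alpha_cut(lam, a)
            # MTBF = 1/lambda is decreasing, so the interval endpoints swap.
            print(f"alpha={a}: MTBF in [{1/l_hi:.0f}, {1/l_lo:.0f}] h")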

  10. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    PubMed

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. We then visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including Twitter data, maritime search and rescue events, and syndromic surveillance.
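
    A hedged sketch of the gravity-model idea, assuming the discrete events have already been kernel-smoothed onto a 2D density grid; each cell is attracted to every other cell with weight proportional to mass over squared distance, a simplification of the paper's estimator.

        import numpy as np

        def gravity_flow(density):
            # Flow field: each cell is pulled toward every other cell with
            # weight density / distance^2 (O(n^4); fine for small grids).
            ny, nx = density.shape
            yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
            u, v = np.zeros_like(yy), np.zeros_like(yy)
            for j in range(ny):
                for i in range(nx):
                    dy, dx = yy - j, xx - i
                    r2 = dx**2 + dy**2
                    r2[j, i] = np.inf            # no self-attraction
                    w = density / r2
                    u[j, i] = (w * dx / np.sqrt(r2)).sum()
                    v[j, i] = (w * dy / np.sqrt(r2)).sum()
            return u, v

        # Applying this to the difference of two time slices gives a
        # temporal flow field as well: u, v = gravity_flow(d_t1 - d_t0)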

  11. Predictive modelling of grain-size distributions from marine electromagnetic profiling data using end-member analysis and a radial basis function network

    NASA Astrophysics Data System (ADS)

    Baasch, B.; Müller, H.; von Dobeneck, T.

    2018-07-01

    In this work, we present a new methodology to predict grain-size distributions from geophysical data. Specifically, electric conductivity and magnetic susceptibility of seafloor sediments recovered from electromagnetic profiling data are used to predict grain-size distributions along shelf-wide survey lines. Field data from the NW Iberian shelf are investigated and reveal a strong relation between the electromagnetic properties and grain-size distribution. The workflow presented here combines unsupervised and supervised machine-learning techniques. Non-negative matrix factorization is used to determine grain-size end-members from sediment surface samples. Four end-members were found, which represent the variety of sediments in the study area well. A radial basis function network modified for the prediction of compositional data is then used to estimate the abundances of these end-members from the electromagnetic properties. The end-members, together with their predicted abundances, are finally back-transformed to grain-size distributions. A minimum spatial variation constraint is implemented in the training of the network to avoid overfitting and to respect the spatial distribution of sediment patterns. The predicted models are tested via leave-one-out cross-validation, revealing high prediction accuracy with coefficients of determination (R2) between 0.76 and 0.89. The predicted grain-size distributions represent the well-known sediment facies and patterns on the NW Iberian shelf and provide new insights into their distribution, transition and dynamics. This study suggests that electromagnetic benthic profiling in combination with machine-learning techniques is a powerful tool for estimating the grain-size distribution of marine sediments.
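
    A hedged sketch of the two-stage idea in scikit-learn, on synthetic data; kernel ridge regression with an RBF kernel is used as a stand-in for the paper's modified radial basis function network, and compositional closure is handled by simple renormalization rather than the authors' constraint.

        import numpy as np
        from sklearn.decomposition import NMF
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(3)
        G = rng.dirichlet(np.ones(32), size=200)   # grain-size spectra (toy)

        nmf = NMF(n_components=4, max_iter=500, random_state=0)
        A = nmf.fit_transform(G)                   # end-member abundances
        E = nmf.components_                        # end-member distributions

        X = rng.normal(size=(200, 2))              # conductivity, susceptibility (toy)
        model = KernelRidge(kernel="rbf", alpha=1e-2).fit(X, A)

        A_pred = np.clip(model.predict(X), 0.0, None)
        A_pred /= A_pred.sum(axis=1, keepdims=True)   # closure to unit sum
        G_pred = A_pred @ E                        # back-transformed distributions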

  12. Predictive modelling of grain size distributions from marine electromagnetic profiling data using end-member analysis and a radial basis function network

    NASA Astrophysics Data System (ADS)

    Baasch, B.; Müller, H.; von Dobeneck, T.

    2018-04-01

    In this work we present a new methodology to predict grain-size distributions from geophysical data. Specifically, electric conductivity and magnetic susceptibility of seafloor sediments recovered from electromagnetic profiling data are used to predict grain-size distributions along shelf-wide survey lines. Field data from the NW Iberian shelf are investigated and reveal a strong relation between the electromagnetic properties and grain-size distribution. The workflow presented here combines unsupervised and supervised machine learning techniques. Non-negative matrix factorisation is used to determine grain-size end-members from sediment surface samples. Four end-members were found, which represent the variety of sediments in the study area well. A radial-basis function network modified for the prediction of compositional data is then used to estimate the abundances of these end-members from the electromagnetic properties. The end-members, together with their predicted abundances, are finally back-transformed to grain-size distributions. A minimum spatial variation constraint is implemented in the training of the network to avoid overfitting and to respect the spatial distribution of sediment patterns. The predicted models are tested via leave-one-out cross-validation, revealing high prediction accuracy with coefficients of determination (R2) between 0.76 and 0.89. The predicted grain-size distributions represent the well-known sediment facies and patterns on the NW Iberian shelf and provide new insights into their distribution, transition and dynamics. This study suggests that electromagnetic benthic profiling in combination with machine learning techniques is a powerful tool for estimating the grain-size distribution of marine sediments.

  13. Multidimensional chromatographic techniques for hydrophilic copolymers II. Analysis of poly(ethylene glycol)-poly(vinyl acetate) graft copolymers.

    PubMed

    Knecht, Daniela; Rittig, Frank; Lange, Ronald F M; Pasch, Harald

    2006-10-13

    A large variety of hydrophilic copolymers is applied in different fields of the chemical industry, including biological and pharmaceutical applications. For example, poly(ethylene glycol)-poly(vinyl alcohol) graft copolymers that are used as tablet coatings are responsible for the controlled release of the active compounds. These copolymers are produced by grafting vinyl acetate onto polyethylene glycol (PEG), followed by hydrolysis of the poly(ethylene glycol)-poly(vinyl acetate) graft copolymers. The poly(ethylene glycol)-poly(vinyl acetate) copolymers are distributed with regard to molar mass and chemical composition. In addition, they frequently contain the homopolymers polyethylene glycol and polyvinyl acetate. The comprehensive analysis of such complex systems requires hyphenated analytical techniques, including two-dimensional liquid chromatography and combined LC and nuclear magnetic resonance spectroscopy. The development and application of these techniques are discussed in the present paper.

  14. Non-invasive characterisation of six Japanese hand-guards (tsuba)

    NASA Astrophysics Data System (ADS)

    Barzagli, Elisa; Grazzi, Francesco; Civita, Francesco; Scherillo, Antonella; Pietropaolo, Antonino; Festa, Giulia; Zoppi, Marco

    2013-12-01

    In this work we present a systematic study of Japanese sword hand-guards (tsuba) carried out by means of non-invasive neutron techniques. Several tsuba from different periods, belonging to the Japanese Section of the Stibbert Museum, were analysed using an innovative approach to characterise the bulk of the samples, coupling two neutron techniques, namely Time-of-Flight Neutron Diffraction (ToF-ND) and Nuclear Resonance Capture Analysis (NRCA). The measurements were carried out on the same instrument: the INES beam-line at the ISIS spallation pulsed neutron source (UK). NRCA identifies the elements present in the sampled gauge volume, while neutron diffraction is exploited to quantify the phase distribution and other micro-structural parameters of the metal specimen. The results show that all samples are made of high-quality metal, either steel or copper alloy, with noticeable changes in composition and working techniques depending on the place and time of manufacture.

  15. Nanostructured surfaces for analysis of anticancer drug and cell diagnosis based on electrochemical and SERS tools.

    PubMed

    El-Said, Waleed A; Yoon, Jinho; Choi, Jeong-Woo

    2018-01-01

    Discovering new anticancer drugs and screening their efficacy requires a huge amount of resources and time-consuming processes. Fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms have been developed to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been applied based on nanostructured surfaces. Research on electrochemical cell chips and the SERS technique is reviewed here. Electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and show a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should therefore be good tools to detect the effects and action mechanisms of anticancer drugs.

  16. Nanostructured surfaces for analysis of anticancer drug and cell diagnosis based on electrochemical and SERS tools

    NASA Astrophysics Data System (ADS)

    El-Said, Waleed A.; Yoon, Jinho; Choi, Jeong-Woo

    2018-04-01

    Discovering new anticancer drugs and screening their efficacy requires a huge amount of resources and time-consuming processes. Fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms have been developed to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been applied based on nanostructured surfaces. Research on electrochemical cell chips and the SERS technique is reviewed here. Electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and show a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should therefore be good tools to detect the effects and action mechanisms of anticancer drugs.

  17. Single-shot coherent diffraction imaging of microbunched relativistic electron beams for free-electron laser applications.

    PubMed

    Marinelli, A; Dunning, M; Weathersby, S; Hemsing, E; Xiang, D; Andonian, G; O'Shea, F; Miao, Jianwei; Hast, C; Rosenzweig, J B

    2013-03-01

    With the advent of coherent x rays provided by the x-ray free-electron laser (FEL), strong interest has been kindled in sophisticated diffraction imaging techniques. In this Letter, we exploit such techniques for the diagnosis of the density distribution of the intense electron beams typically utilized in an x-ray FEL itself. We have implemented this method by analyzing the far-field coherent transition radiation emitted by an inverse-FEL microbunched electron beam. This analysis utilizes an oversampling phase retrieval method on the transition radiation angular spectrum to reconstruct the transverse spatial distribution of the electron beam. This application of diffraction imaging represents a significant advance in electron beam physics, having critical applications to the diagnosis of high-brightness beams, as well as the collective microbunching instabilities afflicting these systems.

  18. Application of Weibull analysis to SSME hardware

    NASA Technical Reports Server (NTRS)

    Gray, L. A. B.

    1986-01-01

    It has generally been documented that the wear of engine parts produces a failure distribution which can be approximated by the function developed by Weibull. The purpose here is to examine to what extent the Weibull distribution approximates failure data for designated engine parts of the Space Shuttle Main Engine (SSME). The current testing certification requirements will be examined in order to establish confidence levels. The failure history of SSME parts/assemblies that are limited in usage by time or starts (turbine blades, the main combustion chamber, and high-pressure fuel pump first-stage impellers) will be examined using updated Weibull techniques. Efforts will be made by the investigator to predict failure trends using Weibull techniques for SSME parts that are not severely limited by time or starts (turbine temperature sensors, chamber pressure transducers, actuators, and controllers).
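
    A minimal sketch of a two-parameter Weibull fit to failure times with SciPy; the failure data are synthetic, and pinning the location parameter to zero is an assumption.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(4)
        failures = weibull_min.rvs(c=2.3, scale=1500.0, size=60, random_state=rng)

        # floc=0 pins the location parameter, giving the usual 2-parameter form.
        shape, _, scale = weibull_min.fit(failures, floc=0)
        print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f}")
        # beta > 1 indicates wear-out behaviour, as expected for aging parts.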

  19. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.

  20. Lidar investigations of ozone in the upper troposphere - lower stratosphere: technique and results of measurements

    NASA Astrophysics Data System (ADS)

    Romanovskii, O. A.; Burlakov, V. D.; Dolgii, S. I.; Nevzorov, A. A.; Nevzorov, A. V.; Kharchenko, O. V.

    2016-12-01

    Prediction of the atmospheric ozone layer, a valuable and irreplaceable geophysical asset, is an important scientific and engineering problem. The relevance of the research stems from the need to develop laser remote sensing methods for ozone to address problems in environmental monitoring and climatology. The main aim of the research is to develop a technique for laser remote sensing of ozone in the upper troposphere - lower stratosphere by the differential absorption method, with temperature and aerosol correction, and to analyze the measurement results. The report introduces the technique for retrieving profiles of the vertical ozone distribution, accounting for temperature and aerosol correction, in lidar sounding of the atmosphere by the differential absorption method. Temperature correction of the ozone absorption coefficients is included in the software to reduce retrieval errors. The authors have determined wavelengths promising for measuring ozone profiles in the upper troposphere - lower stratosphere. We present the results of DIAL measurements of the vertical ozone distribution at the Siberian lidar station in Tomsk. Sensing is performed by the differential absorption method at the wavelength pair 299/341 nm; these are, respectively, the first and second Stokes components of SRS conversion in hydrogen of the 4th harmonic of an Nd:YAG laser (266 nm). A lidar with a 0.5 m diameter receiving mirror is used for sensing the vertical ozone distribution in the altitude range of 6-18 km. The retrieved ozone profiles were compared with IASI satellite data and the Kruger model. The results of applying the developed technique to retrieve vertical ozone distribution profiles, with temperature and aerosol correction, in the 6-18 km altitude range confirm the promise of the selected sensing wavelengths of 341 and 299 nm for the ozone lidar.
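
    A minimal sketch of the retrieval at the heart of any DIAL measurement, assuming range-resolved returns at the on-line (299 nm) and off-line (341 nm) wavelengths and a known differential absorption cross-section; the temperature and aerosol corrections discussed above are omitted.

        import numpy as np

        def dial_number_density(z, p_on, p_off, dsigma):
            # n(z) = 1/(2*dsigma) * d/dz ln(P_off(z) / P_on(z))
            return np.gradient(np.log(p_off / p_on), z) / (2.0 * dsigma)

        # Usage (illustrative units): z in m, dsigma in m^2 -> n in m^-3.
        # n = dial_number_density(z, p_on, p_off, dsigma=4.5e-23)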

  1. Glyph-based analysis of multimodal directional distributions in vector field ensembles

    NASA Astrophysics Data System (ADS)

    Jarema, Mihaela; Demir, Ismail; Kehrer, Johannes; Westermann, Rüdiger

    2015-04-01

    Ensemble simulations are increasingly often performed in the geosciences in order to study the uncertainty and variability of model predictions. Describing ensemble data by mean and standard deviation can be misleading in the case of multimodal distributions. We present first results of a glyph-based visualization of multimodal directional distributions in 2D and 3D vector ensemble data. Directional information on the circle/sphere is modeled using mixtures of probability density functions (pdfs), which enables us to characterize the distributions with relatively few parameters. The resulting mixture models are represented by 2D and 3D lobular glyphs showing the direction, spread and strength of each principal mode of the distributions. A 3D extension of our approach is realized by means of an efficient GPU rendering technique. We demonstrate our method in the context of ensemble weather simulations.
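
    A hedged sketch of the modeling step for 2D directional data, fitting a single von Mises density with SciPy; the paper fits mixtures of such pdfs and derives each glyph lobe's direction, spread and strength from one mixture component.

        import numpy as np
        from scipy.stats import vonmises

        rng = np.random.default_rng(5)
        angles = vonmises.rvs(kappa=4.0, loc=0.8, size=500, random_state=rng)

        kappa, loc, _ = vonmises.fit(angles, fscale=1)   # fscale=1: unit circle
        print(f"mode direction = {loc:.2f} rad, concentration kappa = {kappa:.1f}")
        # Larger kappa means a narrower lobe; 1/sqrt(kappa) is a rough spread.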

  2. In situ KPFM imaging of local photovoltaic characteristics of structured organic photovoltaic devices.

    PubMed

    Watanabe, Satoshi; Fukuchi, Yasumasa; Fukasawa, Masako; Sassa, Takafumi; Kimoto, Atsushi; Tajima, Yusuke; Uchiyama, Masanobu; Yamashita, Takashi; Matsumoto, Mutsuyoshi; Aoyama, Tetsuya

    2014-02-12

    Here, we discuss the local photovoltaic characteristics of structured bulk heterojunction organic photovoltaic devices fabricated with a liquid carbazole and a fullerene derivative, based on analysis by scanning Kelvin probe force microscopy (KPFM). Periodic photopolymerization induced by an interference pattern from two laser beams formed surface relief gratings (SRGs) in the structured films. The surface potential distribution in the SRGs indicates the formation of a donor and acceptor spatial distribution. Under illumination, the surface potential reversibly changed because of the generation of fullerene anions and hole transport from the films to the substrates, which indicates that we successfully imaged the local photovoltaic characteristics of the structured photovoltaic devices. Using atomic force microscopy, we confirmed the formation of the SRGs by material migration to the photopolymerized regions of the films, which was induced by light exposure through photomasks. The structuring technique allows for the direct fabrication and control of donor and acceptor spatial distributions in organic photonic and electronic devices with minimized material consumption. This in situ KPFM technique is indispensable for the fabrication of nanoscale electron donor and electron acceptor spatial distributions in such devices.

  3. Distribution of Health Resource Allocation in the Fars Province Using the Scalogram Analysis Technique in 2011.

    PubMed

    Hatam, Nahid; Kafashi, Shahnaz; Kavosi, Zahra

    2015-07-01

    The importance of health indicators in recent years has created challenges in resource allocation. Balanced and fair distribution of health resources is one of the main principles in achieving equity. The goal of this cross-sectional descriptive study, conducted in 2010, was to classify health structural indicators in the Fars province using the scalogram technique. Health structural indicators were selected and classified in three categories, namely institutional, human resources, and rural health. The data were obtained from the statistical yearbook of Iran and were analyzed according to the scalogram technique. The distribution map of the Fars province was drawn using ArcGIS (geographic information system). The results showed an informative health structural indicator map across the province. Our findings revealed that the city of Mohr, with a score of 85, and Zarindasht, with a score of 36, had the highest and the lowest scores, respectively. This information is valuable to provincial health policymakers for planning appropriately based on factual data and minimizing disorder in allocating health resources. Based on such data, and reflecting on local needs, one could develop equity-based resource allocation policies and prevent inequality. It is concluded that, as a top priority, the provincial policymakers should put dedicated deprivation programs in place for the Farashband, Eghlid and Zarindasht regions.

  4. Ultrafine particle and fiber production in micro-gravity

    NASA Technical Reports Server (NTRS)

    Webb, George W.

    1987-01-01

    The technique of evaporation and condensation of material in an inert gas is investigated for the purpose of preparing ultrafine particles (on the order of 10 nm in diameter) with a narrow distribution of sizes. Gravity-driven convection increases the rate of coalescence of the particles, leading to larger sizes and a broader distribution. Analysis and experimental efforts to investigate coalescence of the particles are presented. The possibility of reducing coalescence in microgravity is discussed. An experimental test in reduced gravity to be performed in a KC-135 aircraft is described briefly.

  5. A limiting analysis for edge effects in angle-ply laminates

    NASA Technical Reports Server (NTRS)

    Hsu, P. W.; Herakovich, C. T.

    1976-01-01

    A zeroth-order solution for edge effects in angle-ply composite laminates is developed using perturbation techniques and a limiting free-body approach. The general method of solution for laminates is developed and then applied to the special case of a graphite/epoxy laminate. Interlaminar stress distributions are obtained as a function of the laminate thickness-to-width ratio h/b and compared to existing numerical results. The solution predicts stable, continuous stress distributions, determines finite maximum tensile interlaminar normal stresses for two laminates, and provides mathematical evidence for singular interlaminar shear stresses.

  6. A Dasymetric-Based Monte Carlo Simulation Approach to the Probabilistic Analysis of Spatial Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; Piburn, Jesse O; McManamay, Ryan A

    2017-01-01

    Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.
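
    A hedged sketch of the framework's core idea on toy numbers: a dasymetric step disaggregates a known regional count to cells using ancillary weights, and the cell estimates then parameterize the inputs of a Monte Carlo run. The weights, the Poisson choice and the per-capita model are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        region_pop = 120_000
        land_use_weight = np.array([0.5, 0.3, 0.15, 0.05])   # ancillary data

        # Dasymetric step: expected population in each cell.
        cell_mean = region_pop * land_use_weight / land_use_weight.sum()

        # Monte Carlo step: sample cell counts, then push through a model.
        draws = rng.poisson(cell_mean, size=(10_000, cell_mean.size))
        output = draws.sum(axis=1) * 1.3e-3   # e.g. a per-capita demand model
        print(np.percentile(output, [5, 50, 95]))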

  7. Mapping of chlorophyll a distributions in coastal zones

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1978-01-01

    It is pointed out that chlorophyll a is an important environmental parameter for monitoring water quality, nutrient loads, and pollution effects in coastal zones. High chlorophyll a concentrations occur in areas which have high nutrient inflows from sources such as sewage treatment plants and industrial wastes. Low chlorophyll a concentrations may be due to the addition of toxic substances from industrial wastes or other sources. Remote sensing provides an opportunity to assess distributions of water quality parameters, such as chlorophyll a. A description is presented of the chlorophyll a analysis and a quantitative mapping of the James River, Virginia. An approach considered by Johnson (1977) was used in the analysis. An application of the multiple regression analysis technique to a data set collected over the New York Bight, an environmentally different area of the coastal zone, is also discussed.

  8. Processing and statistical analysis of soil-root images

    NASA Astrophysics Data System (ADS)

    Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

    2016-04-01

    The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography to obtain a higher resolution of enzyme activities, we aimed to (1) quantify the images and (2) determine whether the pattern (e.g. the distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the distribution of hotspots in space is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), whereas the control box without a plant showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such differing patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
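
    One standard statistic for the clumped-versus-dispersed question is the Clark-Evans nearest-neighbour index, sketched below; the abstract does not state which point-pattern statistic was used, so this choice is an assumption.

        import numpy as np
        from scipy.spatial import cKDTree

        def clark_evans(points, area):
            # R < 1: clumped, R ~ 1: random, R > 1: dispersed.
            d, _ = cKDTree(points).query(points, k=2)   # k=1 is the point itself
            mean_nn = d[:, 1].mean()
            expected = 0.5 / np.sqrt(len(points) / area)
            return mean_nn / expected

        pts = np.random.default_rng(7).random((200, 2))   # unit square
        print(round(clark_evans(pts, area=1.0), 2))       # ~1 for random points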

  9. Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods

    PubMed Central

    Punshon, Tracy

    2015-01-01

    Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal- and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012

  10. Lidar investigations of ozone in the upper troposphere - lower stratosphere: technique and results of measurements

    NASA Astrophysics Data System (ADS)

    Romanovskii, Oleg A.; Nevzorov, Alexey A.; Nevzorov, Alexey V.; Kharchenko, Olga V.

    2018-04-01

    The main aim of the research is to develop a technique for laser remote sensing of ozone in the upper troposphere - lower stratosphere by the differential absorption method, with temperature and aerosol correction, and to analyze the measurement results. The authors have determined wavelengths promising for measuring ozone profiles in the upper troposphere - lower stratosphere. We present the results of DIAL measurements of the vertical ozone distribution at the Siberian lidar station in Tomsk. The retrieved ozone profiles were compared with IASI satellite data and the Kruger model.

  11. A quantum radar detection protocol for fringe visibility enhancement

    NASA Astrophysics Data System (ADS)

    Koltenbah, Benjamin; Parazzoli, Claudio; Capron, Barbara

    2016-05-01

    We present an analysis of a radar detection technique using a Photon Addition Homodyne Receiver (PAHR) that improves the SNR of the interferometer fringes and reduces the uncertainty of the phase measurement. This system uses the concept of Photon Addition (PA), in which the coherent photon distribution is altered. We discuss this process first as a purely mathematical concept to introduce PA and illustrate its effect on the coherent photon distribution. We then present a notional proof-of-concept experiment involving a parametric down converter (PDC) and probabilistic post-selection of the results. We end with the presentation of a more deterministic PAHR concept that is more suitable for development into a working system. Coherent light illuminates a target and the return signal interferes with the local oscillator reference photons to create the desired fringes. The PAHR alters the photon probability distribution of the returned light via interaction between the return photons and atoms; we refer to this technique as "Atom Interaction" or AI. The returning photons are focused at a properly prepared atomic system, with the injected atoms prepared in the desired quantum state. During the interaction time, the initial quantum state evolves in such a way that the photon distribution function changes, resulting in a higher photon count, lower phase noise and an increase in fringe SNR. The result is a 3-5X increase in fringe SNR. This method is best suited for low light intensity (low photon count, 0.1-5) applications. The detection protocol could extend the range of existing systems without loss of accuracy, or conversely enhance a system's accuracy for a given range. We present a quantum mathematical analysis of the method to illustrate how both range and angular resolution improve in comparison with standard measurement techniques. We also suggest an experimental path to validate the method, which would also lead toward deployment in the field.

  12. Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.

    PubMed

    De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik

    2018-01-01

    The proton-induced (p,α) reaction is one type of nuclear reaction analysis (NRA) especially suitable for light-element quantification. In the case of the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the 7Li(p,α)4He reaction in standard reference and geological samples such as tourmaline and other Li minerals. It is shown that this technique for lithium quantification allowed for the measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for the simultaneous mapping of elements heavier than sodium.

  13. Fluorescence recovery after photobleaching reveals regulation and distribution of connexin36 gap junction coupling within mouse islets of Langerhans

    PubMed Central

    Farnsworth, Nikki L; Hemmati, Alireza; Pozzoli, Marina; Benninger, Richard K P

    2014-01-01

    The pancreatic islets are central to the maintenance of glucose homeostasis through insulin secretion. Glucose-stimulated insulin secretion is tightly linked to electrical activity in β cells within the islet. Gap junctions, composed of connexin36 (Cx36), form intercellular channels between β cells, synchronizing electrical activity and insulin secretion. Loss of gap junction coupling leads to altered insulin secretion dynamics and disrupted glucose homeostasis. Gap junction coupling is known to be disrupted in mouse models of pre-diabetes. Although approaches to measure gap junction coupling have been devised, they either lack cell specificity, suitable quantification of coupling or spatial resolution, or are invasive. The purpose of this study was to develop fluorescence recovery after photobleaching (FRAP) as a technique to accurately and robustly measure gap junction coupling in the islet. The cationic dye Rhodamine 123 was used with FRAP to quantify dye diffusion between islet β cells as a measure of Cx36 gap junction coupling. Measurements in islets with reduced Cx36 verified the accuracy of this technique in distinguishing between distinct levels of gap junction coupling. Analysis of individual cells revealed that the distribution of coupling across the islet is highly heterogeneous. Analysis of several modulators of gap junction coupling revealed glucose- and cAMP-dependent modulation of gap junction coupling in islets. Finally, FRAP was used to determine cell population specific coupling, where no functional gap junction coupling was observed between α cells and β cells in the islet. The results of this study show FRAP to be a robust technique which provides the cellular resolution to quantify the distribution and regulation of Cx36 gap junction coupling in specific cell populations within the islet. Future studies utilizing this technique may elucidate the role of gap junction coupling in the progression of diabetes and identify mechanisms of gap junction regulation for potential therapies. PMID:25172942
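
    A hedged sketch of the usual quantification for such FRAP data: fitting a single-exponential recovery to the bleached-cell fluorescence, whose rate reflects dye exchange through gap junctions. The model, parameters and data are generic illustrations, not the authors' exact analysis.

        import numpy as np
        from scipy.optimize import curve_fit

        def recovery(t, f_inf, tau):
            # Single-exponential FRAP recovery from a full bleach.
            return f_inf * (1.0 - np.exp(-t / tau))

        t = np.linspace(0.0, 120.0, 60)   # seconds after photobleaching
        rng = np.random.default_rng(8)
        f = recovery(t, 0.8, 30.0) + 0.02 * rng.standard_normal(t.size)

        (f_inf, tau), _ = curve_fit(recovery, t, f, p0=(1.0, 20.0))
        print(f"mobile fraction ~ {f_inf:.2f}, recovery rate ~ {1.0/tau:.3f} /s")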

  14. Fluorescence recovery after photobleaching reveals regulation and distribution of connexin36 gap junction coupling within mouse islets of Langerhans.

    PubMed

    Farnsworth, Nikki L; Hemmati, Alireza; Pozzoli, Marina; Benninger, Richard K P

    2014-10-15

    The pancreatic islets are central to the maintenance of glucose homeostasis through insulin secretion. Glucose-stimulated insulin secretion is tightly linked to electrical activity in β cells within the islet. Gap junctions, composed of connexin36 (Cx36), form intercellular channels between β cells, synchronizing electrical activity and insulin secretion. Loss of gap junction coupling leads to altered insulin secretion dynamics and disrupted glucose homeostasis. Gap junction coupling is known to be disrupted in mouse models of pre-diabetes. Although approaches to measure gap junction coupling have been devised, they either lack cell specificity, suitable quantification of coupling or spatial resolution, or are invasive. The purpose of this study was to develop fluorescence recovery after photobleaching (FRAP) as a technique to accurately and robustly measure gap junction coupling in the islet. The cationic dye Rhodamine 123 was used with FRAP to quantify dye diffusion between islet β cells as a measure of Cx36 gap junction coupling. Measurements in islets with reduced Cx36 verified the accuracy of this technique in distinguishing between distinct levels of gap junction coupling. Analysis of individual cells revealed that the distribution of coupling across the islet is highly heterogeneous. Analysis of several modulators of gap junction coupling revealed glucose- and cAMP-dependent modulation of gap junction coupling in islets. Finally, FRAP was used to determine cell population specific coupling, where no functional gap junction coupling was observed between α cells and β cells in the islet. The results of this study show FRAP to be a robust technique which provides the cellular resolution to quantify the distribution and regulation of Cx36 gap junction coupling in specific cell populations within the islet. Future studies utilizing this technique may elucidate the role of gap junction coupling in the progression of diabetes and identify mechanisms of gap junction regulation for potential therapies.

  15. A computational method for estimating the dosimetric effect of intra-fraction motion on step-and-shoot IMRT and compensator plans

    NASA Astrophysics Data System (ADS)

    Waghorn, Ben J.; Shah, Amish P.; Ngwa, Wilfred; Meeks, Sanford L.; Moore, Joseph A.; Siebers, Jeffrey V.; Langen, Katja M.

    2010-07-01

    Intra-fraction organ motion during intensity-modulated radiation therapy (IMRT) treatment can cause differences between the planned and the delivered dose distribution. To investigate the extent of these dosimetric changes, a computational model was developed and validated. The computational method allows for calculation of the rigid motion perturbed three-dimensional dose distribution in the CT volume and therefore a dose volume histogram-based assessment of the dosimetric impact of intra-fraction motion on a rigidly moving body. The method was developed and validated for both step-and-shoot IMRT and solid compensator IMRT treatment plans. For each segment (or beam), fluence maps were exported from the treatment planning system. Fluence maps were shifted according to the target position deduced from a motion track. These shifted, motion-encoded fluence maps were then re-imported into the treatment planning system and were used to calculate the motion-encoded dose distribution. To validate the accuracy of the motion-encoded dose distribution the treatment plan was delivered to a moving cylindrical phantom using a programmed four-dimensional motion phantom. Extended dose response (EDR-2) film was used to measure a planar dose distribution for comparison with the calculated motion-encoded distribution using a gamma index analysis (3% dose difference, 3 mm distance-to-agreement). A series of motion tracks incorporating both inter-beam step-function shifts and continuous sinusoidal motion were tested. The method was shown to accurately predict the film's dose distribution for all of the tested motion tracks, both for the step-and-shoot IMRT and compensator plans. The average gamma analysis pass rate for the measured dose distribution with respect to the calculated motion-encoded distribution was 98.3 ± 0.7%. For static delivery the average film-to-calculation pass rate was 98.7 ± 0.2%. In summary, a computational technique has been developed to calculate the dosimetric effect of intra-fraction motion. This technique has the potential to evaluate a given plan's sensitivity to anticipated organ motion. With knowledge of the organ's motion it can also be used as a tool to assess the impact of measured intra-fraction motion after dose delivery.
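
    A minimal sketch of the gamma index used in the validation, reduced to 1D profiles on a common grid; clinical gamma tools work in 2D/3D and interpolate the evaluated dose much more carefully.

        import numpy as np

        def gamma_1d(x, d_ref, d_eval, dd=0.03, dta=3.0):
            # Global gamma: for each reference point, minimize
            # sqrt((dr/dta)^2 + (dD/(dd*Dmax))^2) over evaluated points.
            dmax = d_ref.max()
            gam = np.empty_like(d_ref)
            for i, (xi, di) in enumerate(zip(x, d_ref)):
                dist2 = ((x - xi) / dta)**2
                dose2 = ((d_eval - di) / (dd * dmax))**2
                gam[i] = np.sqrt((dist2 + dose2).min())
            return gam

        x = np.linspace(0.0, 100.0, 201)          # position in mm
        ref = np.exp(-((x - 50.0) / 15.0)**2)     # reference profile
        ev = np.exp(-((x - 51.0) / 15.0)**2)      # evaluated, shifted 1 mm
        print((gamma_1d(x, ref, ev) <= 1.0).mean())   # pass rate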

  16. Pedicle screw cement augmentation. A mechanical pullout study on different cement augmentation techniques.

    PubMed

    Costa, Francesco; Ortolina, Alessandro; Galbusera, Fabio; Cardia, Andrea; Sala, Giuseppe; Ronchi, Franco; Uccelli, Carlo; Grosso, Rossella; Fornari, Maurizio

    2016-02-01

    Pedicle screws with polymethyl methacrylate (PMMA) cement augmentation have been shown to significantly improve fixation strength in the severely osteoporotic spine. However, the efficacy of screw fixation for different cement augmentation techniques remains unknown. This study aimed to determine the difference in pullout strength between different cement augmentation techniques. Uniform synthetic bones simulating severe osteoporosis were used to provide a platform for each augmentation technique. In all cases a polyaxial screw and acrylic cement (PMMA) at medium viscosity were used. Five groups were analyzed: I) screw only, without PMMA (control group); II) retrograde cement pre-filling of the tapped area; III) cannulated and fenestrated screw with cement injection through the perforations; IV) injection of PMMA using a standard trocar (vertebroplasty) plus retrograde pre-filling of the tapped area; V) injection through a fenestrated trocar plus retrograde pre-filling of the tapped area. Standard X-rays were taken in order to visualize cement distribution in each group. Pedicle screws at full insertion were then tested for axial pullout failure using a mechanical testing machine. A total of 30 screws were tested. The pullout analysis revealed that all augmented groups performed better than the control group. In particular, the statistical analysis showed a significant difference between Group V and all other groups (p = 0.001). These results confirm that cement augmentation improves resistance to axial pullout forces. Moreover, they suggest that axial load resistance is greatest when the PMMA is distributed along the entire screw, combining the fenestration and pre-filling augmentation techniques. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  17. Error determination of a successive correction type objective analysis scheme [for surface meteorological data]

    NASA Technical Reports Server (NTRS)

    Smith, D. R.; Leslie, F. W.

    1984-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.
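
    PROAM's exact weighting is not given in this record; the sketch below shows a generic successive-correction objective analysis of the same family, using Cressman-type weights and multiple passes with decreasing influence radii (all names, radii, and the nearest-grid-point back-interpolation are illustrative assumptions).

        import numpy as np

        def successive_correction(grid_xy, obs_xy, obs_val,
                                  radii=(500.0, 250.0, 125.0)):
            """Generic successive-correction objective analysis (Cressman weights).

            grid_xy : (G, 2) analysis grid point coordinates (km)
            obs_xy  : (M, 2) observation coordinates (km)
            obs_val : (M,)  observed values
            radii   : influence radius per correction pass, decreasing
            """
            analysis = np.full(len(grid_xy), obs_val.mean())   # first guess
            for R in radii:
                # Interpolate the current analysis back to observation sites
                # (nearest grid point, for brevity) and form residuals.
                d_og = np.linalg.norm(obs_xy[:, None] - grid_xy[None], axis=2)
                resid = obs_val - analysis[d_og.argmin(axis=1)]
                # Cressman weight w = (R^2 - r^2)/(R^2 + r^2), zero beyond R.
                r = np.linalg.norm(grid_xy[:, None] - obs_xy[None], axis=2)
                w = np.where(r < R, (R**2 - r**2) / (R**2 + r**2), 0.0)
                norm = w.sum(axis=1)
                analysis += np.where(norm > 0,
                                     (w * resid).sum(axis=1) / np.maximum(norm, 1e-12),
                                     0.0)
            return analysis

        # Illustrative use on a synthetic temperature field.
        rng = np.random.default_rng(0)
        gx, gy = np.meshgrid(np.linspace(0, 1000, 21), np.linspace(0, 1000, 21))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        obs = rng.uniform(0, 1000, (40, 2))
        vals = 15.0 + 0.005 * obs[:, 0]
        print(successive_correction(grid, obs, vals).round(1)[:5])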

  18. Monte Carlo analysis for the determination of the conic constant of an aspheric micro lens based on a scanning white light interferometric measurement

    NASA Astrophysics Data System (ADS)

    Gugsa, Solomon A.; Davies, Angela

    2005-08-01

    Characterizing an aspheric micro lens is critical for understanding its performance and providing feedback to the manufacturing process. We describe a method to find the best-fit conic of an aspheric micro lens using a least squares minimization and Monte Carlo analysis. Our analysis is based on scanning white light interferometry measurements, and we compare the standard rapid technique, where a single measurement is taken of the apex of the lens, to the more time-consuming stitching technique, where more of the surface area is measured. Both are corrected for tip/tilt based on a planar fit to the substrate. Four major parameters and their uncertainties are estimated from the measurement, and a chi-square minimization is carried out to determine the best-fit conic constant. The four parameters are the base radius of curvature, the aperture of the lens, the lens center, and the sag of the lens. A probability distribution is chosen for each of the four parameters based on the measurement uncertainties, and a Monte Carlo process is used to iterate the minimization. Eleven measurements were taken, and data are also chosen randomly from the group during the Monte Carlo simulation to capture the measurement repeatability. A distribution of best-fit conic constants results, where the mean is a good estimate of the best-fit conic and the distribution width represents the combined measurement uncertainty. We also compare the Monte Carlo process for the stitched and unstitched data. Our analysis allows us to analyze the residual surface error in terms of Zernike polynomials and determine uncertainty estimates for each coefficient.
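
    A minimal sketch of the described procedure (with illustrative distributions and numbers, not the authors' measured uncertainties): draw the base radius and sag noise from assumed probability distributions, and repeat a least-squares fit of the conic constant to build up its Monte Carlo distribution.

        import numpy as np
        from scipy.optimize import least_squares

        def conic_sag(r, R, k):
            """Sag of a conic: z(r) = r^2 / (R (1 + sqrt(1 - (1+k) r^2/R^2)))."""
            return r**2 / (R * (1.0 + np.sqrt(1.0 - (1.0 + k) * r**2 / R**2)))

        def best_fit_conic(r, z, R0, k0=-1.0):
            """Least-squares best-fit conic constant (R held at its value)."""
            res = least_squares(lambda k: conic_sag(r, R0, k) - z, x0=[k0])
            return res.x[0]

        rng = np.random.default_rng(1)
        r = np.linspace(0, 0.24, 200)                  # mm, inside the aperture
        z_meas = conic_sag(r, R=0.5, k=-0.8)           # stand-in for SWLI profile
        ks = []
        for _ in range(2000):
            R_i = rng.normal(0.5, 0.002)               # radius +/- assumed sigma
            z_i = z_meas + rng.normal(0, 5e-5, r.size) # assumed sag noise
            ks.append(best_fit_conic(r, z_i, R_i))
        ks = np.asarray(ks)
        # Mean = best-fit conic; spread = combined measurement uncertainty.
        print(f"conic constant k = {ks.mean():.3f} +/- {ks.std():.3f}")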

  19. A New Reassigned Spectrogram Method in Interference Detection for GNSS Receivers.

    PubMed

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-09-02

    Interference detection is very important for Global Navigation Satellite System (GNSS) receivers. Current work on interference detection in GNSS receivers has mainly focused on time-frequency (TF) analysis techniques, such as spectrogram and Wigner-Ville distribution (WVD), where the spectrogram approach presents the TF resolution trade-off problem, since the analysis window is used, and the WVD method suffers from the very serious cross-term problem, due to its quadratic TF distribution nature. In order to solve the cross-term problem and to preserve good TF resolution in the TF plane at the same time, in this paper, a new TF distribution by using a reassigned spectrogram has been proposed in interference detection for GNSS receivers. This proposed reassigned spectrogram method efficiently combines the elimination of the cross-term provided by the spectrogram itself according to its inherent nature and the improvement of the TF aggregation property achieved by the reassignment method. Moreover, a notch filter has been adopted in interference mitigation for GNSS receivers, where receiver operating characteristics (ROCs) are used as metrics for the characterization of interference mitigation performance. The proposed interference detection method by using a reassigned spectrogram is evaluated by experiments on GPS L1 signals in the disturbing scenarios in comparison to the state-of-the-art TF analysis approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-term problem and also keeps good TF localization properties, which has been proven to be valid and effective to enhance the interference detection performance; in addition, the adoption of the notch filter in interference mitigation has shown a significant acquisition performance improvement in terms of ROC curves for GNSS receivers in jamming environments.

  20. A New Reassigned Spectrogram Method in Interference Detection for GNSS Receivers

    PubMed Central

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-01-01

    Interference detection is very important for Global Navigation Satellite System (GNSS) receivers. Current work on interference detection in GNSS receivers has mainly focused on time-frequency (TF) analysis techniques, such as spectrogram and Wigner–Ville distribution (WVD), where the spectrogram approach presents the TF resolution trade-off problem, since the analysis window is used, and the WVD method suffers from the very serious cross-term problem, due to its quadratic TF distribution nature. In order to solve the cross-term problem and to preserve good TF resolution in the TF plane at the same time, in this paper, a new TF distribution by using a reassigned spectrogram has been proposed in interference detection for GNSS receivers. This proposed reassigned spectrogram method efficiently combines the elimination of the cross-term provided by the spectrogram itself according to its inherent nature and the improvement of the TF aggregation property achieved by the reassignment method. Moreover, a notch filter has been adopted in interference mitigation for GNSS receivers, where receiver operating characteristics (ROCs) are used as metrics for the characterization of interference mitigation performance. The proposed interference detection method by using a reassigned spectrogram is evaluated by experiments on GPS L1 signals in the disturbing scenarios in comparison to the state-of-the-art TF analysis approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-term problem and also keeps good TF localization properties, which has been proven to be valid and effective to enhance the interference detection performance; in addition, the adoption of the notch filter in interference mitigation has shown a significant acquisition performance improvement in terms of ROC curves for GNSS receivers in jamming environments. PMID:26364637
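
    A reassigned spectrogram can be computed with the off-the-shelf librosa routine shown below; this is not the authors' implementation, and the synthetic chirp jammer, detection threshold, and parameters are illustrative.

        import numpy as np
        import librosa

        # Synthetic baseband snapshot: noise plus a swept-frequency jammer.
        fs = 8000
        t = np.arange(0, 1.0, 1.0 / fs)
        jammer = np.sin(2 * np.pi * (500 * t + 1500 * t**2))   # linear chirp
        x = jammer + 0.5 * np.random.default_rng(2).normal(size=t.size)

        # Reassigned spectrogram: STFT energy is relocated to its local center
        # of gravity in time and frequency, sharpening the chirp ridge without
        # the cross-terms a Wigner-Ville distribution would produce.
        freqs, times, mags = librosa.reassigned_spectrogram(
            y=x.astype(np.float32), sr=fs, n_fft=256)

        # Simple detector: flag TF cells whose energy exceeds a threshold.
        power_db = librosa.amplitude_to_db(mags, ref=np.max)
        detected = power_db > -20.0
        print(f"fraction of TF cells flagged as interference: {detected.mean():.3f}")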

  1. A comparison of two micro-beam X-ray emission techniques for actinide elemental distribution in microscopic particles originating from the hydrogen bombs involved in the Palomares (Spain) and Thule (Greenland) accidents

    NASA Astrophysics Data System (ADS)

    Jimenez-Ramos, M. C.; Eriksson, M.; García-López, J.; Ranebo, Y.; García-Tenorio, R.; Betti, M.; Holm, E.

    2010-09-01

    A comparative study has been performed in order to validate and gain confidence in two micro-beam techniques for the characterization of microscopic particles containing actinide elements (mixed plutonium and uranium): particle induced X-ray emission with the nuclear microprobe technique (μ-PIXE) and synchrotron radiation induced X-ray fluorescence in a confocal alignment (confocal SR μ-XRF). Inter-comparison of the two techniques is essential, as the X-ray production cross-sections for U and Pu are different for protons and photons and not well defined in the open literature, especially for Pu. The particles studied consisted of nuclear weapons material and originate either in the so-called Palomares accident in Spain, 1966, or in the Thule accident in Greenland, 1968. The average Pu/U mass ratios (not corrected for self-absorption) determined in the analysed microscopic particles show very good agreement between the two techniques. In addition, the suitability of both techniques for analysing the Pu/U distribution within the particles with good resolution (down to a few μm) has been demonstrated. The set of results obtained with both techniques has provided important information on the characterization of the remaining fissile material in the areas affected by the aircraft accidents. This type of information is essential for long-term impact assessments of contaminated sites.

  2. Interaction chromatography for characterization and large-scale fractionation of chemically heterogeneous copolymers

    NASA Astrophysics Data System (ADS)

    Han, Junwon

    The remarkable development of polymer synthesis techniques for making complex polymers with controlled chain architectures has inevitably demanded the advancement of polymer characterization tools to analyze the molecular dispersity in polymeric materials beyond size exclusion chromatography (SEC). In particular, synthetic copolymers that consist of more than one monomer type are disperse mixtures of polymer chains that have distributions in terms of both chemical heterogeneity and chain length (molar mass). While the molecular weight distribution has been quite reliably estimated by SEC, it is still challenging to properly characterize the chemical composition distribution in copolymers. Here, I have developed and applied adsorption-based interaction chromatography (IC) techniques as a promising tool to characterize and fractionate polystyrene-based block, random and branched copolymers in terms of their chemical heterogeneity. The first part of this thesis is focused on the adsorption-desorption based purification of PS-b-PMMA diblock copolymers using nanoporous silica. The liquid chromatography analysis and large-scale purification are discussed for PS-b-PMMA block copolymers synthesized by sequential anionic polymerization. SEC and IC are compared to critically analyze the content of PS homopolymer in the as-synthesized block copolymers. In addition, I have developed an IC technique to provide faster and more reliable information on the chemical heterogeneity in the as-synthesized block copolymers. Finally, a large-scale (multi-gram) separation technique is developed to obtain "homopolymer-free" block copolymers via a simple chromatographic filtration technique. By taking advantage of the large specific surface area of nanoporous silica (≈300 m²/g), large-scale purification of neat PS-b-PMMA has successfully been achieved by controlling adsorption and desorption of the block copolymers on the silica gel surface using a gravity column. The second part of this thesis is focused on the liquid chromatography analysis and fractionation of RAFT-polymerized PS-b-PMMA diblock copolymers and on AFM studies. In this study, PS-b-PMMA block copolymers were synthesized by a RAFT free radical polymerization process; the PMMA block with a phenyldithiobenzoate end group was synthesized first. The contents of unreacted PS and PMMA homopolymers in the as-synthesized PS-b-PMMA block copolymers were quantitatively analyzed by the solvent gradient interaction chromatography (SGIC) technique employing bare silica and C18-bonded silica columns, respectively. In addition, using a two-dimensional large-scale IC fractionation method, an atomic force microscopy (AFM) study of the fractionated samples revealed various morphologies with respect to the chemical composition of each fraction. The third part of this thesis analyzes random copolymers with tunable monomer sequence distributions using interaction chromatography. Here, IC was used for characterizing the composition and monomer sequence distribution in statistical copolymers of poly(styrene-co-4-bromostyrene) (PBrxS). The PBrxS copolymers were synthesized by the bromination of monodisperse polystyrenes; the degree of bromination (x) and the sequence distribution were adjusted by varying the bromination time and the solvent quality, respectively. Both normal-phase (bare silica) and reversed-phase (C18-bonded silica) columns were used with different combinations of solvents and non-solvents to monitor the content of the 4-bromostyrene units in the copolymers and their average monomer sequence distribution. The fourth part of this thesis analyzes and fractionates highly branched polymers such as dendronized polymers and star-shaped homo- and copolymers. I have developed an interaction chromatography technique to separate polymers with nonlinear chain architectures. Specifically, the IC technique has been used to separate dendronized polymers and PS-based highly branched copolymers and, ultimately, to obtain well-defined dendronized or branched copolymers with a low polydispersity. The effects of excess arm polymers on (1) the micellar self-assembly of dendronized polymers and (2) the regularity of the pore morphology in low-k applications by the sol-gel process have been studied.

  3. Distribution of siderophile and other trace elements in melt rock at the Chicxulub impact structure

    NASA Technical Reports Server (NTRS)

    Schuraytz, B. C.; Lindstrom, D. J.; Martinez, R. R.; Sharpton, V. L.; Marin, L. E.

    1994-01-01

    Recent isotopic and mineralogical studies have demonstrated a temporal and chemical link between the Chicxulub multiring impact basin and ejecta at the Cretaceous-Tertiary boundary. A fundamental problem yet to be resolved, however, is identification of the projectile responsible for this cataclysmic event. Drill core samples of impact melt rock from the Chicxulub structure contain Ir and Os abundances and Re-Os isotopic ratios indicating the presence of up to approximately 3 percent meteoritic material. We have used a technique involving microdrilling and high-sensitivity instrumental neutron activation analysis (INAA), in conjunction with electron microprobe analysis, to further characterize the distribution of siderophile and other trace elements among phases within the C1-N10 melt rock.

  4. Characterization of polypropylene–polyethylene blends by temperature rising elution and crystallization analysis fractionation

    PubMed Central

    del Hierro, Pilar

    2010-01-01

    The introduction of single-site catalysts in the polyolefins industry opens new routes to design resins with improved performance through multicatalyst-multireactor processes. Physical combination of various polyolefin types in a secondary extrusion process is also a common practice to achieve new products with improved properties. The new resins have complex structures, especially in terms of composition distribution, and their characterization is not always an easy task. Techniques like temperature rising elution fractionation (TREF) or crystallization analysis fractionation (CRYSTAF) are currently used to characterize the composition distribution of these resins. It has been shown that certain combinations of polyolefins may yield equivocal results if TREF or CRYSTAF alone is used for their characterization. PMID:20730530

  5. Real time thermal imaging for analysis and control of crystal growth by the Czochralski technique

    NASA Technical Reports Server (NTRS)

    Wargo, M. J.; Witt, A. F.

    1992-01-01

    A real time thermal imaging system with temperature resolution better than +/- 0.5 C and spatial resolution of better than 0.5 mm has been developed. It has been applied to the analysis of melt surface thermal field distributions in both Czochralski and liquid encapsulated Czochralski growth configurations. The sensor can provide single/multiple point thermal information; a multi-pixel averaging algorithm has been developed which permits localized, low noise sensing and display of optical intensity variations at any location in the hot zone as a function of time. Temperature distributions are measured by extraction of data along a user selectable linear pixel array and are simultaneously displayed, as a graphic overlay, on the thermal image.

  6. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1979-01-01

    Work towards establishing a vector wind profile gust model for Space Transportation System flight operations and trade studies is reported. To date, all of the required statistical and computational techniques have been established and partially implemented. An analysis of wind profile gusts at Cape Kennedy within the theoretical framework is presented. The variability of theoretical and observed gust magnitude with filter type, altitude, and season is described. Various examples are presented which illustrate agreement between theoretical and observed gust percentiles. The preliminary analysis of the gust data indicates a strong variability with altitude, season, and wavelength regime. An extension of the analyses to include conditional distributions of gust magnitude given gust length, distributions of gust modulus, and phase differences between gust components has begun.

  7. SAXS Combined with UV-vis Spectroscopy and QELS: Accurate Characterization of Silver Sols Synthesized in Polymer Matrices.

    PubMed

    Bulavin, Leonid; Kutsevol, Nataliya; Chumachenko, Vasyl; Soloviov, Dmytro; Kuklin, Alexander; Marynin, Andrii

    2016-12-01

    The present work demonstrates a validation of small-angle X-ray scattering (SAXS) combined with ultraviolet-visible (UV-vis) spectroscopy and quasi-elastic light scattering (QELS) analysis for the characterization of silver sols synthesized in polymer matrices. The internal structure and chemical nature of the polymer matrix controlled the size characteristics of the sols. It was shown that, for precise analysis of the nanoparticle size distribution, these techniques should be used simultaneously. All applied methods were in good agreement for the characterization of the size distribution of small particles (less than 60 nm) in the sols. Some deviations of the theoretical curves from the experimental ones were observed; the most probable cause is that the nanoparticles were not entirely spherical in form.

  8. Three-dimensional characterization of pigment dispersion in dried paint films using focused ion beam-scanning electron microscopy.

    PubMed

    Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John

    2012-04-01

    The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging providing high quality serial cross-section images for 3D reconstruction. Application of Euclidean distance map and ultimate eroded points image analysis methods can provide quantitative characterization of 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective to characterize the order of pigment dispersion in dried paint films.
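
    The Euclidean distance map and ultimate-eroded-points steps mentioned above can be sketched with standard scientific Python tools; the synthetic binary slice, min_distance value, and spacing metric below are illustrative assumptions, not the paper's pipeline.

        import numpy as np
        from scipy import ndimage
        from scipy.spatial import cKDTree
        from skimage.feature import peak_local_max

        def ultimate_eroded_points(binary_particles):
            """Locate particle centers via the Euclidean distance map (EDM).

            Ultimate eroded points are the local maxima of the EDM: the last
            pixels that would survive repeated erosion of each particle.
            """
            edm = ndimage.distance_transform_edt(binary_particles)
            centers = peak_local_max(edm, min_distance=3,
                                     labels=binary_particles)
            return edm, centers

        # Illustrative synthetic binary slice (1 = pigment, 0 = binder).
        rng = np.random.default_rng(3)
        img = np.zeros((128, 128), dtype=int)
        yy, xx = np.ogrid[:128, :128]
        for cy, cx in rng.integers(10, 118, size=(20, 2)):
            img[(yy - cy)**2 + (xx - cx)**2 <= 16] = 1   # disks of radius 4

        edm, centers = ultimate_eroded_points(img)
        # Nearest-neighbor spacing of the centers quantifies dispersion quality.
        d, _ = cKDTree(centers).query(centers, k=2)
        print(f"median center-to-center spacing: {np.median(d[:, 1]):.1f} px")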

  9. Vortex energy landscape from real space imaging analysis of YBa2Cu3O7 with different defect structures

    NASA Astrophysics Data System (ADS)

    Luccas, R. F.; Granados, X.; Obradors, X.; Puig, T.

    2014-10-01

    A methodology based on real-space vortex image analysis is presented that is able to estimate semi-quantitatively the relevant energy densities of an arbitrary array of vortices, map the interaction energy distributions, and evaluate the pinning energy associated with particular defects. The combined use of nanostructuration tools, a vortex visualization technique, and the energy method is seen as an opportunity to estimate the strengths of vortex pinning potentials. In particular, the spatial distributions of vortex energy densities induced by surface nanoindented scratches are evaluated and compared to those of twin boundaries. This comparative study underlines the remarkable role of surface nanoscratches in pinning vortices and their potential in the design of novel devices for pinning and guiding vortex motion.

  10. Wavelet analysis of polarization azimuths maps for laser images of myocardial tissue for the purpose of diagnosing acute coronary insufficiency

    NASA Astrophysics Data System (ADS)

    Wanchuliak, O. Ya.; Peresunko, A. P.; Bakko, Bouzan Adel; Kushnerick, L. Ya.

    2011-09-01

    This paper presents the foundations of a large-scale, localized wavelet polarization analysis of inhomogeneous laser images of histological sections of myocardial tissue. Relations between the structures of the wavelet coefficients and the causes of death were identified. The optical model of polycrystalline networks of myocardium protein fibrils is presented. A technique for determining the coordinate distribution of the polarization azimuth at the points of laser images of myocardium histological sections is suggested. Results are presented on the interrelation between the statistical parameters (statistical moments of the 1st to 4th order) that characterize the distributions of the wavelet coefficients of polarization maps of myocardium layers and the causes of death.

  11. The Impact of Economic Policies on Poverty and Income Distribution: Evaluation Techniques and Tools.

    ERIC Educational Resources Information Center

    Bourguignon, Francois, Ed.; Pereira da Silva, Luiz A., Ed.

    This book, a collection of articles and papers, reviews techniques and tools that can be used to evaluate the poverty and distributional impact of economic policy choices. Following are its contents: "Evaluating the Poverty and Distributional Impact of Economic Policies: A Compendium of Existing Techniques" (Francois Bourguignon and Luiz A.…

  12. MTF Analysis of LANDSAT-4 Thematic Mapper

    NASA Technical Reports Server (NTRS)

    Schowengerdt, R.

    1984-01-01

    A research program to measure the LANDSAT 4 Thematic Mapper (TM) modulation transfer function (MTF) is described. Measurement of a satellite sensor's MTF requires the use of a calibrated ground target, i.e., the spatial radiance distribution of the target must be known to a resolution at least four to five times greater than that of the system under test. A small reflective mirror or a dark-light linear pattern such as a line or an edge, together with relatively high resolution underflight imagery, is used to calibrate the target. A technique that utilizes an analytical model for the scene spatial frequency power spectrum will be investigated as an alternative to calibration of the scene. The test sites and analysis techniques are also described.
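
    For the edge-target case, the standard reduction from edge spread function (ESF) to MTF can be sketched as follows; this is a generic method sketch, not the program's actual processing, and the synthetic edge and 30 m spacing are illustrative.

        import numpy as np

        def mtf_from_edge(edge_profile, dx):
            """Estimate the MTF from a measured edge spread function (ESF).

            The line spread function (LSF) is the derivative of the ESF, and
            the MTF is the normalized magnitude of the LSF's Fourier transform.
            """
            lsf = np.gradient(edge_profile, dx)
            lsf *= np.hanning(lsf.size)             # taper to suppress leakage
            mtf = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(lsf.size, d=dx) # cycles per unit distance
            return freqs, mtf / mtf[0]

        # Illustrative use: a blurred synthetic edge at TM-like 30 m sampling.
        x = np.arange(-15, 16) * 30.0
        esf = 0.5 * (1 + np.tanh(x / 45.0))         # smooth stand-in for an ESF
        freqs, mtf = mtf_from_edge(esf, dx=30.0)
        print(f"MTF at {freqs[-1]:.5f} cy/m: {mtf[-1]:.2f}")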

  13. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of and concern about bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
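
    A minimal example of the technique being reviewed, for the unmeasured-confounder case: bias parameters are drawn from explicitly stated prior distributions, and a Bross-type external adjustment is applied per iteration. All numbers and distributions below are illustrative, not drawn from any reviewed study.

        import numpy as np

        rng = np.random.default_rng(4)
        n_sim = 100_000
        rr_observed = 1.50                   # hypothetical observed risk ratio

        # Bias parameters drawn from explicitly stated prior distributions:
        #   rr_cd : confounder-outcome risk ratio
        #   p1,p0 : confounder prevalence among exposed / unexposed
        rr_cd = rng.lognormal(mean=np.log(2.0), sigma=0.2, size=n_sim)
        p1 = rng.beta(8, 12, size=n_sim)     # ~40% prevalence in exposed
        p0 = rng.beta(4, 16, size=n_sim)     # ~20% prevalence in unexposed

        # Bross bias factor for an unmeasured binary confounder, then the
        # confounding-adjusted risk ratio for each simulated iteration.
        bias = (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)
        rr_adjusted = rr_observed / bias

        lo, mid, hi = np.percentile(rr_adjusted, [2.5, 50, 97.5])
        print(f"bias-adjusted RR: {mid:.2f} "
              f"(95% simulation interval {lo:.2f}-{hi:.2f})")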

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovel, Harold; Prettyman, Kevin

    A side-by-side analysis was performed on the technology available at the time, along with roadmaps to push each particular option forward. Variations in turnkey line processes can and do result in variations in finished solar device performance. Together with variations in starting material quality, the result is a distribution of efficiencies. Forensic analysis and characterization of each crystalline-Si-based technology will determine the most promising approach with respect to cost, efficiency and reliability. Forensic analysis will also shed light on the causes of binning variations. Si solar cells from each turnkey supplier were forensically analyzed using a host of techniques.

  15. Distributed Contour Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Weber, Gunther H.

    2014-03-31

    Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to the analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.
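
    The chapter's local-global representation is not reproduced here, but the sequential building block it parallelizes can be sketched: a join tree (one half of a contour tree) computed with a union-find sweep from maxima downward. The graph encoding and example values are illustrative.

        import numpy as np

        def join_tree(values, edges):
            """Join tree of a scalar field on a graph (augmented form).

            values : (n,) scalar value per vertex
            edges  : iterable of (u, v) vertex index pairs
            Returns a dict mapping each vertex to its parent in the join
            tree, sweeping vertices from high to low value.
            """
            n = len(values)
            parent = list(range(n))            # union-find forest
            cur = list(range(n))               # latest processed vertex per component

            def find(a):
                while parent[a] != a:
                    parent[a] = parent[parent[a]]   # path halving
                    a = parent[a]
                return a

            adj = [[] for _ in range(n)]
            for u, v in edges:
                adj[u].append(v)
                adj[v].append(u)

            tree, seen = {}, [False] * n
            for v in np.argsort(values)[::-1]: # sweep from maxima downward
                seen[v] = True
                for u in adj[v]:
                    if seen[u]:
                        ru, rv = find(u), find(v)
                        if ru != rv:
                            tree[int(cur[ru])] = int(v)  # branch joins at v
                            parent[ru] = rv
                            cur[rv] = v
                cur[find(v)] = v
            return tree

        # Tiny example: a path graph with two peaks joined at a saddle.
        vals = np.array([1.0, 5.0, 2.0, 4.0, 0.5])
        print(join_tree(vals, [(0, 1), (1, 2), (2, 3), (3, 4)]))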

  16. Proceedings of the Annual Conference of the Prognostics and Health Management Society (PHM 2014) Held in Fort Worth, TX on September 29 - October 2, 2014. Invited Session on Corrosion Monitoring, Sensing, Detection and Prediction

    DTIC Science & Technology

    2014-12-23

    This record consists of table-of-contents fragments rather than an abstract. Recoverable titles include "Using Johnson Distribution for Automatic Threshold Setting in Wind Turbine Condition Monitoring System" (Kun S. Marhadi and Georgios Alexandros...) and "Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis"; Victoria M. Catterson, Craig Love, and Andrew Robb also appear among the listed authors. A body-text fragment notes the use of such techniques in wind turbine gearbox analysis (Zappalà et al., 2012) and that various other techniques for frequency domain analysis have been explored.

  17. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. The stability of a simple slope is evaluated with and without pore-water pressure on the basis of the variation of soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to perform the analysis of the critical slip surface. A sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the safety factor of the slope.
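
    As a simplified stand-in for the SLOPE/W workflow (which searches for a critical slip surface with a method of slices), the sketch below runs the same style of Monte Carlo analysis on the closed-form infinite-slope factor of safety; all parameter means and standard deviations are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200_000

        # Normally distributed soil properties (means and spreads illustrative):
        gamma = rng.normal(18.0, 1.0, n)             # unit weight, kN/m^3
        c = rng.normal(10.0, 2.0, n)                 # cohesion, kPa
        phi = np.radians(rng.normal(30.0, 3.0, n))   # internal friction angle

        beta = np.radians(25.0)                      # slope angle
        z = 4.0                                      # slip plane depth, m

        # Infinite-slope factor of safety (dry case, no pore-water pressure):
        #   FS = (c + gamma z cos^2(beta) tan(phi)) / (gamma z sin(beta) cos(beta))
        fs = (c + gamma * z * np.cos(beta)**2 * np.tan(phi)) \
             / (gamma * z * np.sin(beta) * np.cos(beta))

        print(f"mean FS = {fs.mean():.2f}, P(failure) = {(fs < 1).mean():.4f}")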

  18. Analysis of Bi Distribution in Epitaxial GaAsBi by Aberration-Corrected HAADF-STEM

    NASA Astrophysics Data System (ADS)

    Baladés, N.; Sales, D. L.; Herrera, M.; Tan, C. H.; Liu, Y.; Richards, R. D.; Molina, S. I.

    2018-04-01

    The Bi content in GaAs/GaAs1 - x Bi x /GaAs heterostructures grown by molecular beam epitaxy at a substrate temperature close to 340 °C is investigated by aberration-corrected high-angle annular dark-field techniques. The analysis at low magnification of high-angle annular dark-field scanning transmission electron microscopy images, corroborated by EDX analysis, revealed layers free of planar defects and a non-homogeneous Bi distribution at the interfaces and within the GaAsBi layer. At high magnification, the qHAADF analysis confirmed the inhomogeneous distribution, with Bi segregation at the GaAsBi/GaAs interface at low Bi flux and distorted dumbbell shapes in areas with higher Bi content. At higher Bi flux, the size of the Bi clusters increases, leading to roughly equiaxial Bi-rich particles faceted along zinc blende {111} and uniformly dispersed throughout the matrix and interfaces. FFT analysis confirms the coexistence of two phases in some clusters: a rhombohedral pure Bi (rh-Bi) phase surrounded by the zinc blende GaAs1 - x Bi x matrix. The clusters may affect the local lattice relaxation, leading to a partially relaxed GaAsBi/GaAs system, in good agreement with XRD analysis.

  19. Thermal photons in heavy ion collisions at 158 A GeV

    NASA Astrophysics Data System (ADS)

    Dutt, Sunil

    2018-05-01

    The essence of experimental ultra-relativistic heavy ion collision physics is the production and study of strongly interacting matter at extreme energy densities and temperatures, and the consequent search for the equation of state of nuclear matter. The focus of the analysis has been to examine pseudo-rapidity distributions obtained for the γ-like particles in a pre-shower photon multiplicity detector. This allows the extension of scaled factorial moment analysis to bin sizes smaller than those accessible to other experimental techniques. Scaled factorial moments are calculated using both the corrected horizontal analysis and the vertical analysis. The results are compared with a simulation analysis using the VENUS event generator.
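
    The scaled factorial moment at the heart of this analysis has a compact definition, F_q = <n(n-1)...(n-q+1)> / <n>^q; the sketch below computes the horizontal (bin- and event-averaged) version on synthetic multiplicities, where uncorrelated Poisson counts give F_q near 1.

        import numpy as np

        def scaled_factorial_moment(counts, q):
            """Horizontal scaled factorial moment F_q.

            counts : (n_events, n_bins) particle multiplicities per bin
            F_q = < n(n-1)...(n-q+1) > / < n >^q, averaged over bins and events.
            """
            n = counts.astype(float)
            fact = np.ones_like(n)
            for i in range(q):
                fact *= np.clip(n - i, 0, None)   # falling factorial, 0 if n < q
            return fact.mean() / (n.mean() ** q)

        # Poissonian (uncorrelated) multiplicities give F_q ~ 1 at every bin
        # size; intermittency would show F_q rising as the bins shrink.
        rng = np.random.default_rng(6)
        for n_bins in (8, 16, 32, 64):
            counts = rng.poisson(64 / n_bins, size=(5000, n_bins))
            print(n_bins, round(scaled_factorial_moment(counts, q=2), 3))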

  20. X-Ray Processing of ChaMPlane Fields: Methods and Initial Results for Selected Anti-Galactic Center Fields

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; van den Berg, Maureen; Schlegel, Eric M.; Grindlay, Jonathan E.; Koenig, Xavier; Laycock, Silas; Zhao, Ping

    2005-12-01

    We describe the X-ray analysis procedure of the ongoing Chandra Multiwavelength Plane (ChaMPlane) Survey and report the initial results from the analysis of 15 selected anti-Galactic center observations (90deg

  1. Propagation of population pharmacokinetic information using a Bayesian approach: comparison with meta-analysis.

    PubMed

    Dokoumetzidis, Aristides; Aarons, Leon

    2005-08-01

    We investigated the propagation of population pharmacokinetic information across clinical studies by applying Bayesian techniques. The aim was to summarize the population pharmacokinetic estimates of a study in appropriate statistical distributions in order to use them as Bayesian priors in subsequent population pharmacokinetic analyses. Various data sets of simulated and real clinical data were fitted with WinBUGS, with and without informative priors. The posterior estimates of fittings with non-informative priors were used to build parametric informative priors, and the whole procedure was carried out sequentially. The posterior distributions of the fittings with informative priors were compared to those of the meta-analysis fittings of the respective combinations of data sets. Good agreement was found for the simulated and experimental datasets when the populations were exchangeable, with the posterior distributions from the fittings with the prior being nearly identical to those estimated by meta-analysis. However, when the populations were not exchangeable, an alternative parametric form for the prior, the natural conjugate prior, had to be used in order to obtain consistent results. In conclusion, the results of a population pharmacokinetic analysis may be summarized in Bayesian prior distributions that can be used sequentially with other analyses. The procedure is an alternative to meta-analysis and gives comparable results. It has the advantage that it is faster than the meta-analysis, due to the large datasets used with the latter, and can be performed when the data included in the prior are not actually available.
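
    The core idea, that a posterior summarized as a parametric prior and updated sequentially reproduces the pooled analysis when populations are exchangeable, can be shown in miniature with a conjugate normal model (a toy stand-in for the WinBUGS fits; all values are illustrative):

        import numpy as np

        # Sequential Bayesian updating of a population PK parameter (e.g.,
        # log clearance) with a conjugate normal prior, versus pooling.
        rng = np.random.default_rng(7)
        true_mu, sigma = np.log(5.0), 0.3          # population log-CL, spread

        def posterior(prior_mu, prior_var, data, sigma):
            """Conjugate normal update for a mean with known variance."""
            n = data.size
            post_var = 1.0 / (1.0 / prior_var + n / sigma**2)
            post_mu = post_var * (prior_mu / prior_var + data.sum() / sigma**2)
            return post_mu, post_var

        studies = [rng.normal(true_mu, sigma, n) for n in (20, 35, 50)]

        # Study 1 is fitted with a vague prior; its posterior becomes the
        # prior for study 2, and so on.
        mu, var = 0.0, 1e6
        for data in studies:
            mu, var = posterior(mu, var, data, sigma)

        # This matches a one-shot "meta-analysis" of all data combined.
        mu_meta, _ = posterior(0.0, 1e6, np.concatenate(studies), sigma)
        print(f"sequential: {mu:.4f}, pooled: {mu_meta:.4f}")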

  2. Statistical detection of patterns in unidimensional distributions by continuous wavelet transforms

    NASA Astrophysics Data System (ADS)

    Baluev, R. V.

    2018-04-01

    Objective detection of specific patterns in statistical distributions, like groupings or gaps or abrupt transitions between different subsets, is a task with a rich range of applications in astronomy: Milky Way stellar population analysis, investigations of the exoplanets diversity, Solar System minor bodies statistics, extragalactic studies, etc. We adapt the powerful technique of the wavelet transforms to this generalized task, making a strong emphasis on the assessment of the patterns detection significance. Among other things, our method also involves optimal minimum-noise wavelets and minimum-noise reconstruction of the distribution density function. Based on this development, we construct a self-closed algorithmic pipeline aimed to process statistical samples. It is currently applicable to single-dimensional distributions only, but it is flexible enough to undergo further generalizations and development.
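
    A minimal version of the idea, not the authors' pipeline: estimate the distribution density by histogram, convolve it with Mexican-hat wavelets over a range of scales, and look for strong scale-localized responses (a gap shows up as a strongly negative coefficient). The kernel, scales, and detection rule are illustrative.

        import numpy as np

        def mexican_hat(scale, length):
            t = np.arange(length) - length // 2
            x = t / scale
            return (1 - x**2) * np.exp(-x**2 / 2) / np.sqrt(scale)

        def cwt_1d(signal, scales):
            """Continuous wavelet transform by direct convolution."""
            out = np.empty((len(scales), signal.size))
            for i, s in enumerate(scales):
                kernel = mexican_hat(s, min(10 * int(s) + 1, signal.size))
                out[i] = np.convolve(signal, kernel, mode="same")
            return out

        # Bimodal sample with a gap at zero; the gap appears as a strong
        # negative wavelet response at the matching scale and position.
        rng = np.random.default_rng(8)
        sample = np.concatenate([rng.normal(-2, 0.7, 4000),
                                 rng.normal(2, 0.7, 4000)])
        density, edges = np.histogram(sample, bins=200, range=(-5, 5),
                                      density=True)
        coeffs = cwt_1d(density, scales=np.arange(2, 30))
        i, j = np.unravel_index(np.argmin(coeffs), coeffs.shape)
        print(f"strongest gap response at x ~ {edges[j]:.2f}, scale index {i}")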

  3. Analysis on unevenness of skin color using the melanin and hemoglobin components separated by independent component analysis of skin color image

    NASA Astrophysics Data System (ADS)

    Ojima, Nobutoshi; Fujiwara, Izumi; Inoue, Yayoi; Tsumura, Norimichi; Nakaguchi, Toshiya; Iwata, Kayoko

    2011-03-01

    Uneven distribution of skin color is one of the biggest concerns about facial skin appearance. Recently, several techniques to analyze skin color have been introduced that separate skin color information into chromophore components, such as melanin and hemoglobin. However, there are not many reports on the quantitative analysis of unevenness of skin color that consider the type of chromophore, clusters of different sizes, and the concentration of each chromophore. We propose a new image analysis and simulation method based on chromophore analysis and spatial frequency analysis. This method is mainly composed of three techniques: independent component analysis (ICA) to extract hemoglobin and melanin chromophores from a single skin color image; an image pyramid technique which decomposes each chromophore into multi-resolution images, which can be used for identifying different sizes of clusters or spatial frequencies; and analysis of the histogram obtained from each multi-resolution image to extract unevenness parameters. As an application of the method, we also introduce an image processing technique to change the unevenness of the melanin component. As a result, the method showed a high capability to analyze the unevenness of each skin chromophore: 1) Vague unevenness on skin could be discriminated from noticeable pigmentation such as freckles or acne. 2) By analyzing the unevenness parameters obtained from each multi-resolution image for Japanese women, age-related changes were observed in the parameters of the middle spatial frequencies. 3) An image processing system modulating the parameters was proposed to change the unevenness of skin images along the axis of the obtained age-related change in real time.
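
    The ICA step can be sketched with scikit-learn: following the standard skin chromophore model, the per-pixel optical density is treated as a linear mixture of two chromophore components and unmixed with FastICA. The density model, synthetic patch, and component count are assumptions for illustration, not the authors' exact procedure.

        import numpy as np
        from sklearn.decomposition import FastICA

        def separate_chromophores(rgb_image):
            """Unmix melanin- and hemoglobin-like components of a skin image.

            Pixel optical density (-log reflectance) is approximately a
            linear mixture of two chromophore densities, so ICA can unmix it.
            """
            rgb = np.clip(rgb_image.reshape(-1, 3).astype(float) / 255.0,
                          1e-3, 1.0)
            density = -np.log(rgb)                  # optical density per channel
            ica = FastICA(n_components=2, random_state=0)
            sources = ica.fit_transform(density)    # (n_pixels, 2) maps
            h, w = rgb_image.shape[:2]
            return sources.reshape(h, w, 2)

        # Illustrative use on a synthetic "skin patch" with a darker spot.
        rng = np.random.default_rng(9)
        base = np.full((64, 64, 3), 200.0)
        base[20:30, 20:30] -= (60, 40, 30)          # melanin-like region
        base += rng.normal(0, 5, base.shape)        # sensor noise
        maps = separate_chromophores(base.clip(0, 255).astype(np.uint8))
        print("component map shape:", maps.shape)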

  4. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis

    PubMed Central

    Turner, Rebecca M; Jackson, Dan; Wei, Yinghui; Thompson, Simon G; Higgins, Julian P T

    2015-01-01

    Numerous meta-analyses in healthcare research combine results from only a small number of studies, for which the variance representing between-study heterogeneity is estimated imprecisely. A Bayesian approach to estimation allows external evidence on the expected magnitude of heterogeneity to be incorporated. The aim of this paper is to provide tools that improve the accessibility of Bayesian meta-analysis. We present two methods for implementing Bayesian meta-analysis, using numerical integration and importance sampling techniques. Based on 14 886 binary outcome meta-analyses in the Cochrane Database of Systematic Reviews, we derive a novel set of predictive distributions for the degree of heterogeneity expected in 80 settings depending on the outcomes assessed and comparisons made. These can be used as prior distributions for heterogeneity in future meta-analyses. The two methods are implemented in R, for which code is provided. Both methods produce equivalent results to standard but more complex Markov chain Monte Carlo approaches. The priors are derived as log-normal distributions for the between-study variance, applicable to meta-analyses of binary outcomes on the log odds-ratio scale. The methods are applied to two example meta-analyses, incorporating the relevant predictive distributions as prior distributions for between-study heterogeneity. We have provided resources to facilitate Bayesian meta-analysis, in a form accessible to applied researchers, which allow relevant prior information on the degree of heterogeneity to be incorporated. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25475839
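
    The paper's numerical-integration method can be sketched compactly: place a log-normal prior on the between-study variance tau^2, multiply by the profile likelihood of a random-effects model, and normalize on a grid. The prior hyperparameters below (log-mean -2.56, log-sd 1.74) follow a published general-setting value from this line of work but should be treated as illustrative, as should the toy data.

        import numpy as np

        y = np.array([0.42, 0.08, 0.65, 0.30])   # study log odds ratios (toy)
        v = np.array([0.09, 0.12, 0.20, 0.07])   # within-study variances

        # Log-normal prior density on tau^2 (constants cancel on normalizing).
        tau2 = np.linspace(1e-4, 2.0, 4000)
        prior = np.exp(-(np.log(tau2) + 2.56)**2 / (2 * 1.74**2)) / tau2

        def profile_loglik(t2):
            """Random-effects log-likelihood, mean effect profiled out."""
            w = 1.0 / (v + t2)
            mu_hat = np.sum(w * y) / np.sum(w)
            return 0.5 * (np.log(w).sum() - np.log(w.sum())
                          - np.sum(w * (y - mu_hat)**2))

        loglik = np.array([profile_loglik(t2) for t2 in tau2])
        post = prior * np.exp(loglik - loglik.max())   # stabilized
        dt = tau2[1] - tau2[0]
        post /= post.sum() * dt                        # normalize on the grid
        cdf = np.cumsum(post) * dt
        print(f"posterior median tau^2 = {tau2[np.searchsorted(cdf, 0.5)]:.3f}")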

  5. Fluorescence analysis of ubiquinone and its application in quality control of medical supplies

    NASA Astrophysics Data System (ADS)

    Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.

    2017-02-01

    The presence of antioxidant issues such as redox potential imbalance in the human body is a very important question for modern clinical diagnostics. Implementation of fluorescence analysis in the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, is one of the steps towards the development of a device for the clinical diagnostics of redox potential. Fluorescence was recorded with a spectrometer using a narrow-band UV irradiation source (maxima at 287 and 330 nm) as the excitation radiation. Ubiquinone concentrations from 0.25 to 2.5 mmol/l in the explored samples were used for the investigation. The recorded data were processed using correlation analysis and a differential analytical technique. The fourth derivative of the fluorescence spectrum provided the basis for a multicomponent analysis of the solutions. As a clinical diagnostic technique, fluorescence analysis with a processing method that includes differential spectrophotometry is a step forward towards redox potential calculation, and towards quality control in pharmacy for better health care.
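
    The fourth-derivative step can be sketched with a Savitzky-Golay filter, which smooths and differentiates in one pass; the window length, polynomial order, and synthetic two-band spectrum below are illustrative choices, not the authors' settings.

        import numpy as np
        from scipy.signal import savgol_filter

        def fourth_derivative_spectrum(intensity, wl_step_nm):
            """Fourth derivative of a spectrum via Savitzky-Golay filtering.

            Derivative spectrophotometry sharpens overlapping bands: a broad
            band in the raw spectrum becomes a narrow feature in the fourth
            derivative, aiding multicomponent analysis of mixtures.
            """
            return savgol_filter(intensity, window_length=21, polyorder=5,
                                 deriv=4, delta=wl_step_nm)

        # Illustrative use: two overlapping Gaussian emission bands.
        wl = np.arange(400, 600, 0.5)                       # nm
        spec = (np.exp(-((wl - 480) / 18)**2)
                + 0.6 * np.exp(-((wl - 505) / 15)**2))
        d4 = fourth_derivative_spectrum(spec, wl_step_nm=0.5)
        print("strongest 4th-derivative features near:",
              wl[np.argsort(d4)[-3:]], "nm")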

  6. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  7. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of the 2015 Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results obtained |Z-scores| ≤ 3.

  8. Impact of different satellite soil moisture products on the predictions of a continuous distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Laiolo, P.; Gabellani, S.; Campo, L.; Silvestro, F.; Delogu, F.; Rudari, R.; Pulvirenti, L.; Boni, G.; Fascetti, F.; Pierdicca, N.; Crapolicchio, R.; Hasenauer, S.; Puca, S.

    2016-06-01

    The reliable estimation of hydrological variables in space and time is of fundamental importance in operational hydrology for improving flood predictions and the description of the hydrological cycle. Nowadays, remotely sensed data offer a chance to improve hydrological models, especially in environments with scarce ground-based data. The aim of this work is to update the state variables of a physically based, distributed and continuous hydrological model using four different satellite-derived data sets (three soil moisture products and a land surface temperature measurement) and one soil moisture analysis, to evaluate, even with a non-optimal technique, the impact on the hydrological cycle. The experiments were carried out for a small catchment in the northern part of Italy for the period July 2012-June 2013. The products were pre-processed according to their own characteristics and then assimilated into the model using a simple nudging technique. The benefits for the model's discharge predictions were tested against observations. The analysis showed a general improvement of the model discharge predictions, even with a simple assimilation technique, for all the assimilation experiments; the Nash-Sutcliffe model efficiency coefficient increased from 0.6 (for the model without assimilation) to 0.7, and errors in discharge were reduced by up to 10%. An added value was found in the rainfall season (autumn), when all the assimilation experiments reduced the errors by up to 20%. This demonstrates that the discharge prediction of a distributed hydrological model, which works at fine spatial resolution in a small basin, can be improved with the assimilation of coarse-scale satellite-derived data.
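
    The nudging update used here has a very simple core: relax the model state toward the observation by a fraction of the innovation. The sketch below is a generic illustration (the gain, error values, and rescaled retrieval are assumptions, not the paper's configuration).

        import numpy as np

        def nudge(model_sm, obs_sm, gain=0.5, obs_err=0.04, model_err=0.04):
            """Nudge a model soil moisture state toward an observation.

            The state is relaxed toward the (rescaled) satellite retrieval
            by a fraction of the innovation; the weight is a fixed gain
            blended by the relative error variances (values illustrative).
            """
            k = gain * model_err**2 / (model_err**2 + obs_err**2)
            return model_sm + k * (obs_sm - model_sm)

        # Illustrative update of a gridded state with a coarse retrieval.
        model_sm = np.array([0.22, 0.25, 0.31, 0.28])  # volumetric soil moisture
        obs_sm = np.array([0.27, 0.27, 0.27, 0.27])    # one coarse pixel, resampled
        print(nudge(model_sm, obs_sm))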

  9. Synthesis of visible light driven cobalt tailored Ag{sub 2}O/TiON nanophotocatalyst by reverse micelle processing for degradation of Eriochrome Black T

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussain, Syed Tajammul, E-mail: dr_tajammul@yahoo.ca; Rashid; Department of Chemistry, Quaid-i-Azam University, Islamabad

    2013-02-15

    Graphical abstract: A cobalt-tailored Ag{sub 2}O/TiON nanophotocatalyst is synthesized using the reverse micelle technique and shows extraordinary photocatalytic activity. Highlights: A TiON/Ag{sub 2}O/Co nanophotocatalyst is synthesized using the microemulsion technique. A low-temperature anatase phase and outstanding photocatalytic activity are observed. The effect of temperature and inert atmosphere on the material phase is investigated. Homogeneous dopant distribution and oxygen vacancies are examined. Enhancement in surface area, quantum efficiency and optical properties is observed. Abstract: An ultra-efficient cobalt-tailored silver and nitrogen co-doped titania (TiON/Ag{sub 2}O/Co) visible nanophotocatalyst is successfully synthesized using modified reverse micelle processing. The composition, phase, distribution of dopants, functional group analysis, optical properties and morphology of the synthesized materials are investigated by means of X-ray diffraction (XRD), transmission electron microscopy (TEM) based techniques and others. The charge states of titanium (Ti) and silver are explored through core-loss electron energy loss spectroscopy (EELS) analysis and X-ray photoelectron spectroscopy (XPS). Our characterization results showed that the synthesized nanophotocatalyst consisted of anatase-phase quasispherical nanoparticles that exhibited homogeneous distribution of dopants, large surface area, high quantum efficiency and enhanced optical properties. At a low content of doped Co ions, the TiON/Ag{sub 2}O responded with extraordinary photocatalytic properties. The cobalt-tailored nanophotocatalyst showed remarkable activity against Eriochrome Black T (EBT). Moreover, the comparative degradation behavior of EBT with TiON, Ag{sub 2}O/TiON and Co/Ag{sub 2}O/TiON is also investigated.

  10. A Computer Program for Practical Semivariogram Modeling and Ordinary Kriging: A Case Study of Porosity Distribution in an Oil Field

    NASA Astrophysics Data System (ADS)

    Mert, Bayram Ali; Dag, Ahmet

    2017-12-01

    In this study, a practical and educational geostatistical program (JeoStat) was developed, and an example analysis of porosity parameter distribution using oilfield data is presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the user. These theoretical models can be easily and quickly fitted to the experimental models using a mouse. JeoStat uses the ordinary kriging interpolation technique for the computation of point or block estimates, and cross-validation tests for validation of the fitted theoretical model. All the results obtained by the analysis, as well as all the graphics such as histograms, variograms and kriging estimation maps, can be saved to the hard drive, including digitised graphics and maps, and the numerical value of any point in a map can be inspected using the mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size free of charge. The JeoStat software package and source code are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
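
    The two computations JeoStat automates, experimental semivariogram estimation with a fitted spherical model and ordinary kriging, can be sketched as follows (a generic textbook implementation, not JeoStat's source; data and model parameters are illustrative):

        import numpy as np

        def experimental_variogram(xy, z, lags, tol):
            """Isotropic experimental semivariogram: half the mean squared
            difference of pairs separated by approximately each lag h."""
            d = np.linalg.norm(xy[:, None] - xy[None], axis=2)
            dz2 = (z[:, None] - z[None]) ** 2
            gam = []
            for h in lags:
                m = (np.abs(d - h) < tol) & (d > 0)
                gam.append(0.5 * dz2[m].mean() if m.any() else np.nan)
            return np.array(gam)

        def spherical(h, nugget, sill, a):
            """Spherical variogram model with range a."""
            h = np.asarray(h, float)
            g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
            return np.where(h >= a, sill, np.where(h == 0, 0.0, g))

        def ordinary_kriging(xy, z, x0, model):
            """Ordinary kriging estimate at x0 (Lagrange-multiplier system)."""
            n = len(z)
            A = np.ones((n + 1, n + 1))
            A[:n, :n] = model(np.linalg.norm(xy[:, None] - xy[None], axis=2))
            A[n, n] = 0.0
            b = np.ones(n + 1)
            b[:n] = model(np.linalg.norm(xy - x0, axis=1))
            w = np.linalg.solve(A, b)
            return w[:n] @ z, w @ b        # estimate, kriging variance

        # Illustrative porosity-like data at random well locations.
        rng = np.random.default_rng(10)
        xy = rng.uniform(0, 100, (60, 2))
        z = 0.20 + 0.03 * np.sin(xy[:, 0] / 15) + rng.normal(0, 0.005, 60)
        gam = experimental_variogram(xy, z, np.arange(5.0, 60.0, 5.0), tol=2.5)
        model = lambda h: spherical(h, nugget=1e-5, sill=4e-4, a=40.0)
        est, var = ordinary_kriging(xy, z, np.array([50.0, 50.0]), model)
        print(f"kriged porosity at (50,50): {est:.3f} (variance {var:.2e})")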

  11. Detecting and Attributing the Effects of Climate Change on the Distributions of Snake Species Over the Past 50 Years.

    PubMed

    Wu, Jianguo

    2016-01-01

    It is unclear whether the distributions of snakes have changed in association with climate change over the past years. We detected the distribution changes of snakes over the past 50 years and determined whether the changes could be attributed to recent climate change in China. Long-term records of the distribution of nine snake species in China, grey relational analysis, fuzzy sets classification techniques, the consistency index, and attribution methods were used. Over the past 50 years, the distributions of the snake species have changed in multiple directions, primarily shifting northwards, and most of the changes were related to the thermal index. Driven by climatic factors over the past 50 years, the distribution boundaries and distribution centers of some species changed with the climatic fluctuations. The observed and predicted changes in distribution were highly consistent for some snake species. The changes in the northern limits of the distributions of nearly half of the species, as well as in the southern and eastern limits and the distribution centers of some snake species, can be attributed to climate change.

  12. Application of InSAR and GIS techniques to ground subsidence assessment in the Nobi Plain, Central Japan.

    PubMed

    Zheng, Minxue; Fukuyama, Kaoru; Sanga-Ngoie, Kazadi

    2013-12-31

    Spatial variation and temporal changes in ground subsidence over the Nobi Plain, Central Japan, are assessed using GIS techniques and ground-level measurement data taken over this area since the 1970s. Notwithstanding the general slowing trend observed in ground subsidence over the plains, we have detected ground rise at some locations, most likely due to ground expansion caused by recovering groundwater levels and the tilting of the Nobi land mass. The problem of the non-availability of upper-air meteorological information, especially the 3-dimensional water vapor distribution, during the JERS-1 observational period (1992-1998) was solved by applying the AWC (analog weather charts) method to the high-precision GPV-MSM (Grid Point Value of Meso-Scale Model) water-vapor data to find the matching meteorological data for that period. From the selected JERS-1 interferometry pair and the matching GPV-MSM meteorological data, the atmospheric path delay generated by water vapor inhomogeneity was then quantitatively evaluated. A highly uniform spatial distribution of the atmospheric delay, with a maximum deviation of approximately 38 mm in its horizontal distribution, was found over the Plain. This confirms the effectiveness of using GPV-MSM data for SAR differential interferometric analysis, and thus sheds some new light on the possibility of improving InSAR analysis results for land subsidence applications.

  13. Application of InSAR and GIS Techniques to Ground Subsidence Assessment in the Nobi Plain, Central Japan

    PubMed Central

    Zheng, Minxue; Fukuyama, Kaoru; Sanga-Ngoie, Kazadi

    2014-01-01

    Spatial variation and temporal changes in ground subsidence over the Nobi Plain, Central Japan, are assessed using GIS techniques and ground-level measurement data taken over this area since the 1970s. Notwithstanding the general slowing trend observed in ground subsidence over the plains, we have detected ground rise at some locations, most likely due to ground expansion caused by recovering groundwater levels and the tilting of the Nobi land mass. The problem of the non-availability of upper-air meteorological information, especially the 3-dimensional water vapor distribution, during the JERS-1 observational period (1992–1998) was solved by applying the AWC (analog weather charts) method to the high-precision GPV-MSM (Grid Point Value of Meso-Scale Model) water-vapor data to find the matching meteorological data for that period. From the selected JERS-1 interferometry pair and the matching GPV-MSM meteorological data, the atmospheric path delay generated by water vapor inhomogeneity was then quantitatively evaluated. A highly uniform spatial distribution of the atmospheric delay, with a maximum deviation of approximately 38 mm in its horizontal distribution, was found over the Plain. This confirms the effectiveness of using GPV-MSM data for SAR differential interferometric analysis, and thus sheds some new light on the possibility of improving InSAR analysis results for land subsidence applications. PMID:24385028

  14. Application of radar chart array analysis to visualize effects of formulation variables on IgG1 particle formation as measured by multiple analytical techniques

    PubMed Central

    Kalonia, Cavan; Kumru, Ozan S.; Kim, Jae Hyun; Middaugh, C. Russell; Volkin, David B.

    2013-01-01

    This study presents a novel method to visualize protein aggregate and particle formation data in order to rapidly evaluate the effect of solution and stress conditions on the physical stability of an IgG1 monoclonal antibody (mAb). Radar chart arrays were designed so that hundreds of Microflow Digital Imaging (MFI) solution measurements, evaluating different mAb formulations under varying stresses, could be presented in a single figure with minimal loss of data resolution. These MFI radar charts show measured changes in subvisible particle number, size and morphology distribution as a change in the shape of polygons. Radar charts were also created to visualize mAb aggregate and particle formation across a wide size range by combining data sets from size exclusion chromatography (SEC), Archimedes resonant mass measurements, and MFI. We found that the environmental/mechanical stress condition (e.g., heat vs. agitation) was the most important factor influencing the particle size and morphology distribution of this IgG1 mAb. Additionally, the presence of NaCl exhibited a pH- and stress-dependent behavior resulting in promotion or inhibition of mAb particle formation. This data visualization technique provides a comprehensive analysis of the aggregation tendencies of this IgG1 mAb in different formulations under varying stresses as measured by different analytical techniques. PMID:24122556
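
    A radar chart of the kind described is straightforward to generate with matplotlib; the axes, metrics, and values below are hypothetical stand-ins for the paper's MFI-derived parameters.

        import numpy as np
        import matplotlib.pyplot as plt

        def radar_chart(ax, labels, values, color="tab:blue"):
            """Draw one closed radar (spider) polygon on a polar axis."""
            angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False)
            vals = np.concatenate([values, values[:1]])   # close the polygon
            angs = np.concatenate([angles, angles[:1]])
            ax.plot(angs, vals, color=color)
            ax.fill(angs, vals, color=color, alpha=0.25)
            ax.set_xticks(angles)
            ax.set_xticklabels(labels)

        # Hypothetical normalized particle metrics for two stress conditions.
        labels = ["2-10 μm count", "10-25 μm count", ">25 μm count",
                  "aspect ratio", "intensity", "SEC monomer loss"]
        control = np.array([0.2, 0.1, 0.05, 0.3, 0.25, 0.1])
        agitated = np.array([0.8, 0.6, 0.4, 0.7, 0.5, 0.3])

        fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
        radar_chart(ax, labels, control, "tab:blue")
        radar_chart(ax, labels, agitated, "tab:red")
        ax.set_ylim(0, 1)
        plt.savefig("radar.png", dpi=150)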

  15. Large-Scale CTRW Analysis of Push-Pull Tracer Tests and Other Transport in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Hansen, S. K.; Berkowitz, B.

    2014-12-01

    Recently, we developed an alternative CTRW formulation which uses a "latching" upscaling scheme to rigorously map continuous or fine-scale stochastic solute motion onto discrete transitions on an arbitrarily coarse lattice (with spacing potentially on the meter scale or more). This approach enables model simplification, among many other things. Under advection, for example, we see that many relevant anomalous transport problems may be mapped into 1D, with latching to a sequence of successive, uniformly spaced planes. On this formulation (which we term RP-CTRW), the spatial transition vector may generally be made deterministic, with CTRW waiting time distributions encapsulating all the stochastic behavior. We demonstrate the excellent performance of this technique alongside Pareto-distributed waiting times in explaining experiments across a variety of scales using only two degrees of freedom. An interesting new application of the RP-CTRW technique is the analysis of radial (push-pull) tracer tests. Given modern computational power, random walk simulations are a natural fit for the inverse problem of inferring subsurface parameters from push-pull test data, and we propose them as an alternative to the classical type curve approach. In particular, we explore the visibility of heterogeneity through non-Fickian behavior in push-pull tests, and illustrate the ability of a radial RP-CTRW technique to encapsulate this behavior using a sparse parameterization which has predictive value.
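
    The RP-CTRW construction described above reduces, in its simplest 1D form, to deterministic unit transitions between successive planes with all stochasticity in the waiting times; a sketch with Pareto-distributed waits is below, with all parameter values illustrative rather than fitted.

        import numpy as np

        def rp_ctrw_breakthrough(n_particles, n_planes, alpha,
                                 t_min=1.0, seed=11):
            """1D RP-CTRW-style walk: deterministic transitions between
            uniformly spaced planes, Pareto-distributed waiting times.

            Tail exponent alpha < 2 gives the heavy-tailed waits responsible
            for anomalous (non-Fickian) late arrivals.
            """
            rng = np.random.default_rng(seed)
            u = 1.0 - rng.random((n_particles, n_planes))  # uniform in (0, 1]
            waits = t_min * u ** (-1.0 / alpha)            # Pareto waits
            return waits.sum(axis=1)                       # arrival times

        arrivals = rp_ctrw_breakthrough(100_000, 50, alpha=1.4)
        print(f"median arrival: {np.median(arrivals):.0f}, "
              f"99th percentile: {np.percentile(arrivals, 99):.0f}")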

  16. Measurement of the spatially distributed temperature and soot loadings in a laminar diffusion flame using a Cone-Beam Tomography technique

    NASA Astrophysics Data System (ADS)

    Zhao, Huayong; Williams, Ben; Stone, Richard

    2014-01-01

    A new low-cost optical diagnostic technique, called Cone-Beam Tomographic Three-Colour Spectrometry (CBT-TCS), has been developed to measure the planar distributions of temperature, soot particle size, and soot volume fraction in a co-flow axisymmetric laminar diffusion flame. The image of a flame is recorded by a colour camera; then, using colour interpolation and a cone-beam tomography algorithm, a colour map corresponding to a diametral plane can be reconstructed. Look-up tables calculated using Planck's law and different scattering models are then employed to deduce the temperature, approximate average soot particle size, and soot volume fraction in each voxel (volumetric pixel). A sensitivity analysis of the look-up tables shows that the results have high temperature resolution but relatively low soot-particle-size resolution. The assumptions underlying the technique are discussed in detail. Sample data from an ethylene laminar diffusion flame are compared with data in the literature for similar flames; the comparison shows very consistent temperature and soot volume fraction profiles. Further analysis indicates that the differences from published results are within the measurement uncertainties. The methodology is ready to be extended to 3D measurements by capturing multiple flame images from different angles for non-axisymmetric flames.
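
    The look-up-table step can be sketched for the simpler two-colour (ratio) case: tabulate the ratio of Planck radiances at two effective camera wavelengths against temperature, then invert by interpolation. The wavelengths and greybody assumption below are ours; the paper's tables additionally fold in scattering models and a third colour.

```python
import numpy as np

# Physical constants (SI)
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wl_m, t_k):
    """Planck spectral radiance at wavelength wl_m (m), temperature t_k (K)."""
    return (2.0 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * KB * t_k))

# Tabulate the red/green radiance ratio against temperature, assuming
# greybody emission at two effective camera wavelengths (illustrative
# values, not calibrated to any real sensor).
WL_RED, WL_GREEN = 620e-9, 540e-9
temps = np.linspace(1200.0, 2400.0, 601)
ratios = planck(WL_RED, temps) / planck(WL_GREEN, temps)

def temperature_from_ratio(r):
    """Invert the monotonic ratio(T) table; the ratio falls as T rises."""
    return np.interp(r, ratios[::-1], temps[::-1])

print(temperature_from_ratio(ratios[300]))  # recovers temps[300] = 1800 K
```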

  17. Combining knowledge discovery from databases (KDD) and case-based reasoning (CBR) to support diagnosis of medical images

    NASA Astrophysics Data System (ADS)

    Stranieri, Andrew; Yearwood, John; Pham, Binh

    1999-07-01

    The development of data warehouses for the storage and analysis of very large corpora of medical image data represents a significant trend in health care and research. Amongst other benefits, the trend toward warehousing enables the use of techniques for automatically discovering knowledge from large and distributed databases. In this paper, we present an application design in which knowledge discovery from databases (KDD) techniques enhance the performance of the problem-solving strategy known as case-based reasoning (CBR) for the diagnosis of radiological images. The problem of diagnosing abnormalities of the cervical spine is used to illustrate the method. The design of a case-based medical image diagnostic support system has three essential characteristics. The first is a case representation comprising textual descriptions of the image, visual features known to be useful for indexing images, and additional visual features to be discovered by data mining many existing images. The second is the development of a case base with an optimal number and distribution of cases. The third is the automatic discovery, using KDD techniques, of adaptation knowledge to enhance the performance of the case-based reasoner. Together, these three characteristics can overcome the real-time efficiency obstacles that otherwise hinder the application of CBR to medical image analysis.
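
    A toy sketch of the retrieval step only, with invented features and diagnoses: cases are stored as feature vectors paired with outcomes, and the nearest stored cases are returned for a query image. The KDD-mined indexing features and adaptation knowledge described above are assumed to exist upstream.

```python
import numpy as np

# A "case" pairs an image feature vector with its stored diagnosis.
# Feature extraction (texture and shape metrics, plus mined features)
# is assumed to have happened before this point.
case_features = np.array([
    [0.12, 0.80, 0.33],
    [0.90, 0.15, 0.47],
    [0.11, 0.76, 0.40],
])
case_diagnoses = ["normal", "abnormal", "normal"]

def retrieve(query, k=2):
    """Return the k most similar stored cases by Euclidean distance."""
    d = np.linalg.norm(case_features - query, axis=1)
    nearest = np.argsort(d)[:k]
    return [(case_diagnoses[i], float(d[i])) for i in nearest]

print(retrieve(np.array([0.10, 0.78, 0.35])))
# KDD-mined adaptation rules would then adjust the retrieved
# diagnoses to fit the new case.
```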

  18. Adaptive Multi-Layer LMS Controller Design and Application to Active Vibration Suppression on a Truss and Proposed Impact Analysis Technique

    DTIC Science & Technology

    2001-06-01

    …system for the sake of testing and simplicity. The Adaptive Multi-Layered LMS Controller was developed one piece at a time. After initial experimental…
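
    The controller named in the title builds on the standard LMS adaptive filter. The sketch below shows that single-channel building block, with an arbitrary tonal disturbance; it is not the report's multi-layer design, and all names and parameters are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def lms_cancel(reference, disturbance, n_taps=8, mu=0.01):
    """Single-channel LMS adaptive filter: FIR weights adapt so the
    filtered reference tracks the disturbance; the residual e is what
    an active vibration controller drives toward zero."""
    w = np.zeros(n_taps)
    e = np.zeros(len(disturbance))
    for n in range(n_taps, len(disturbance)):
        x = reference[n - n_taps:n][::-1]  # most recent sample first
        y = w @ x                          # filter output
        e[n] = disturbance[n] - y          # residual vibration
        w += 2.0 * mu * e[n] * x           # LMS weight update
    return e

# Demo: cancel a phase-shifted tone (frequency and amplitudes arbitrary).
fs = 1000
t = np.arange(0, 5, 1 / fs)
ref = np.sin(2 * np.pi * 19.0 * t)
dist = 0.8 * np.sin(2 * np.pi * 19.0 * t + 0.6) + 0.05 * rng.standard_normal(len(t))
e = lms_cancel(ref, dist)
print("residual power over the last second:", np.mean(e[-fs:] ** 2))
```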

  19. High and low density development in Puerto Rico

    Treesearch

    William A. Gould; Sebastian Martinuzzi; Olga M. Ramos Gonzalez

    2008-01-01

    This map shows the distribution of high- and low-density developed lands in Puerto Rico (Martinuzzi et al. 2007). The map was created using a mosaic of Landsat ETM+ images ranging from the years 2000 to 2003. The developed land cover was classified using the Iterative Self-Organizing Data Analysis Technique (ISODATA) unsupervised classification (ERDAS 2003)....
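
    ISODATA shares its assign/update core with k-means and adds split/merge heuristics on top. The sketch below shows that core plus a simple merge rule, run on fake two-band "imagery"; it is an illustration of the technique, not the ERDAS implementation used for the map.

```python
import numpy as np

rng = np.random.default_rng(1)

def isodata_core(pixels, k=4, n_iter=20, merge_dist=0.05):
    """Assign/update core of ISODATA-style unsupervised classification,
    plus a simple merge of nearly coincident cluster centres. (Full
    ISODATA also splits high-variance clusters; omitted for brevity.)"""
    centres = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centres = np.array([pixels[labels == j].mean(axis=0)
                            if np.any(labels == j) else centres[j]
                            for j in range(len(centres))])
        if len(centres) > 2:
            dc = np.linalg.norm(centres[:, None] - centres[None, :], axis=2)
            np.fill_diagonal(dc, np.inf)
            i, j = np.unravel_index(dc.argmin(), dc.shape)
            if dc[i, j] < merge_dist:
                centres[i] = 0.5 * (centres[i] + centres[j])
                centres = np.delete(centres, j, axis=0)
    d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
    return d.argmin(axis=1), centres

# Demo on fake two-band reflectance data: developed pixels vs. vegetation.
veg = rng.normal([0.2, 0.6], 0.05, size=(500, 2))
dev = rng.normal([0.7, 0.3], 0.05, size=(500, 2))
labels, centres = isodata_core(np.vstack([veg, dev]), k=2)
print(np.round(centres, 2))
```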

  20. Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…
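
    The phenomenon under study is easy to reproduce: appending non-informative noise variables to well-separated data degrades cluster recovery. The sketch below is our own construction using scikit-learn, not one of the eight procedures compared; it scores recovery with the adjusted Rand index.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(2)

# Two well-separated clusters on two informative variables.
true_labels = np.repeat([0, 1], 200)
X_info = rng.normal(0.0, 0.5, (400, 2))
X_info[true_labels == 1] += 4.0

# Append increasing numbers of pure-noise variables; recovery degrades.
for n_noise in (0, 2, 8, 32):
    X = np.hstack([X_info, rng.normal(0.0, 3.0, (400, n_noise))])
    found = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(f"{n_noise:2d} noise variables: ARI = "
          f"{adjusted_rand_score(true_labels, found):.2f}")
```

    Changing the scale of the noise (the 3.0 above) also changes how quickly recovery collapses, echoing the abstract's point that the distribution of the random noise matters, not just its presence.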
