Sample records for quantitative element analysis

  1. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle-induced X-ray emission (PIXE) methods have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E0 ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest in a quantitative sample investigation. Using a multielemental monazite mineral sample from Malaysia, the sensitivity of REIXE is compared to well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements is about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules picked up from the Pacific deep sea is discussed; it showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for aliquots of Pt did not show any measurable content within an upper limit of 250 ppm.

  2. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of the gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements, with a precision as good as possible, to match the data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of the blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption effects and the secondary excitation effects which are frequently sources of relevant errors in quantitative EDXRF analysis.

  3. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  4. Quantitative Analysis of Trace Element Impurity Levels in Some Gem-Quality Diamonds

    NASA Astrophysics Data System (ADS)

    McNeill, J. C.; Klein-Bendavid, O.; Pearson, D. G.; Nowell, G. M.; Ottley, C. J.; Chinn, I.; Malarkey, J.

    2009-05-01

    Perhaps the most important information required to understand the origin of diamonds is the nature of the fluid from which they crystallise. Constraining the identity of the diamond-forming fluid for high-purity gem diamonds is hampered by analytical challenges because of the very low analyte levels involved. Here we use a new ultra-low blank 'off-line' laser ablation method coupled to sector-field ICPMS for the quantitative analysis of fluid-poor gem diamonds. Ten diamonds comprising both E- and P-type parageneses, from the Premier Mine, South Africa, were analysed for trace element abundances. We assume that the elemental signatures arise from low densities of sub-microscopic fluid inclusions that are analogous to the much higher densities of fluid inclusions commonly found within fluid-rich diamonds exhibiting fibrous growth. Repeatability of multiple (>20) blanks yielded consistently low values, so that with the current procedure our limits of quantitation (10-σ blank) are <1 pg for most trace elements, except for Sr, Zr and Ba, at 2-9 pg, and Pb at ~30 pg. Trace element patterns of the Premier diamond suite show enrichment of LREE over HREE. Abundances broadly decrease with increasing elemental compatibility. As a suite, the chondrite-normalised diamond patterns show negative Sr, Zr, Ti and Y anomalies and positive U and Pb anomalies. All sample abundances are very depleted relative to chondrites (0.1 to 0.001x chondrite). HREE range from 0.1 to 1 ppb, as do Y, Nb and Cs. Other lighter elements vary from 2-30 ppb. Pb reaches several ppb and Ti ranges from ppb values up to 2 ppm. No significant differences were observed between the trace element systematics of the eclogitic and peridotitic diamonds. Overall, these initial data have inter-element fractionation patterns similar to those evident from fluid-rich fibrous diamonds and can be used to infer that both types of diamond-forming fluids share a common origin.
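
    The "10-σ blank" limit of quantitation quoted above follows a standard convention: ten times the standard deviation of repeated procedural blanks. The short sketch below illustrates that arithmetic in Python with invented blank values; it is not the authors' data.

    ```python
    import statistics

    # Invented procedural-blank results for one trace element, in picograms.
    blank_pg = [0.04, 0.06, 0.05, 0.07, 0.05, 0.06, 0.04, 0.05]

    # Limit of quantitation taken as 10 x the standard deviation of the blanks
    # (the "10-sigma blank" criterion cited in the abstract).
    loq_pg = 10 * statistics.stdev(blank_pg)
    print(f"LOQ ~ {loq_pg:.2f} pg")  # sub-picogram, consistent with the reported <1 pg
    ```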

  5. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
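
    As a quick consistency check of the figures above, the detection limit in solution converts to a weight fraction in coal once an extract volume is assumed; the abstract does not state the volume, so 0.3 L is used here purely for illustration.

    ```python
    # Back-of-envelope conversion of the HPLC detection limit to wt% sulfur in coal.
    detection_limit_g_per_L = 4e-4   # lower detection limit quoted above
    extract_volume_L = 0.3           # assumed extract volume (not given in this record)
    coal_mass_g = 20.0               # coal sample mass quoted above

    sulfur_wt_percent = detection_limit_g_per_L * extract_volume_L / coal_mass_g * 100
    print(f"{sulfur_wt_percent:.4f} wt% S")  # ~0.0006%, matching the abstract
    ```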

  6. Qualitative and quantitative analysis of an additive element in metal oxide nanometer film using laser induced breakdown spectroscopy.

    PubMed

    Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili

    2018-01-20

    The photoelectric performance of metal-ion-doped TiO2 films improves as the composition and concentration of the additive elements change. In this work, TiO2 films doped with different Sn concentrations were obtained with the hydrothermal method. Qualitative and quantitative analysis of the Sn element in the TiO2 films was achieved with laser-induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of the TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifts the optical absorption and improves the photoelectric properties of the TiO2 films. When the Sn doping concentration in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was the largest, indicating the best photoelectric performance. This indicates that LIBS is a potential and feasible measurement method for the qualitative and quantitative analysis of additive elements in metal oxide nanometer films.
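
    The calibration-curve step described above amounts to a linear fit of line intensity against known dopant concentration, which is then inverted for unknown samples. The sketch below uses invented numbers, not the paper's calibration data.

    ```python
    import numpy as np

    # Hypothetical calibration data: known Sn doping levels (mmol/L) versus
    # background-corrected Sn line intensity (a.u.); values are invented.
    conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
    intensity = np.array([120.0, 610.0, 1105.0, 1590.0, 2080.0])

    slope, intercept = np.polyfit(conc, intensity, 1)   # linear calibration curve

    # Invert the curve to estimate an unknown doping level from its measured intensity.
    unknown_intensity = 1290.0
    estimated_conc = (unknown_intensity - intercept) / slope
    print(f"Estimated Sn concentration ~ {estimated_conc:.1f} mmol/L")  # ~11.9 here
    ```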

  7. Effects of Scan Resolutions and Element Sizes on Bovine Vertebral Mechanical Parameters from Quantitative Computed Tomography-Based Finite Element Analysis

    PubMed Central

    Zhang, Meng; Gao, Jiazi; Huang, Xu; Zhang, Min; Liu, Bei

    2017-01-01

    Quantitative computed tomography-based finite element analysis (QCT/FEA) has been developed to predict vertebral strength. However, QCT/FEA models may differ with scan resolution and element size. The aim of this study was to explore the effects of scan resolutions and element sizes on QCT/FEA outcomes. Nine bovine vertebral bodies were scanned using a clinical CT scanner and reconstructed from datasets with two slice thicknesses, namely 0.6 mm (PA resolution) and 1 mm (PB resolution). There were significant linear correlations between the predicted and measured principal strains (R2 > 0.7, P < 0.0001), and the predicted vertebral strength and stiffness were modestly correlated with the experimental values (R2 > 0.6, P < 0.05). Two different resolutions and six different element sizes were combined in pairs, and finite element (FE) models of bovine vertebral cancellous bone were obtained for the 12 cases. The mechanical parameters of FE models with the PB resolution were similar to those with the PA resolution. The computational accuracy of FE models with element sizes of 0.41 × 0.41 × 0.6 mm3 and 0.41 × 0.41 × 1 mm3 was higher, as judged by comparison of the apparent elastic modulus and yield strength. Therefore, scan resolution and element size should be chosen optimally to improve the accuracy of QCT/FEA. PMID:29065624

  8. Elemental analysis of scorpion venoms.

    PubMed

    Al-Asmari, AbdulRahman K; Kunnathodi, Faisal; Al Saadon, Khalid; Idris, Mohammed M

    2016-01-01

    Scorpion venom is a rich source of biomolecules, which can perturb the physiological activity of the host on envenomation and may also have therapeutic potential. Scorpion venoms, produced by the columnar cells of the venom gland, are complex mixtures of mucopolysaccharides, neurotoxic peptides and other components. This study was aimed at cataloguing the elemental composition of venoms obtained from medically important scorpions found in the Arabian peninsula. The global elemental composition of the crude venom obtained from Androctonus bicolor, Androctonus crassicauda and Leiurus quinquestriatus scorpions was estimated using an ICP-MS analyzer. The study catalogued several chemical elements present in the scorpion venoms using ICP-MS total quant analysis, with nine elements quantified exclusively using appropriate standards. Fifteen chemical elements, including sodium, potassium and calcium, were found abundantly in the scorpion venoms at ppm concentrations. Thirty-six chemical elements of different mass ranges were detected in the venoms at ppb levels. Quantitative analysis of the venoms revealed copper to be the most abundant element in Androctonus sp. venom but present at a lower level in Leiurus quinquestriatus venom, whereas zinc and manganese were found at higher levels in Leiurus sp. venom but at lower levels in Androctonus sp. venom. These data and the concentrations of the other elements present in the various venoms are likely to increase our understanding of the mechanisms of venom activity and their pharmacological potential.

  9. CCQM Pilot Study CCQM-P140: Quantitative surface analysis of multi-element alloy films

    NASA Astrophysics Data System (ADS)

    Kim, Kyung Joong; Jang, Jong Shik; Kim, An Soon; Suh, Jung Ki; Chung, Yong-Duck; Hodoroaba, Vasile-Dan; Wirth, Thomas; Unger, Wolfgang; Kang, Hee Jae; Popov, Oleg; Popov, Inna; Kuselman, Ilya; Lee, Yeon Hee; Sykes, David E.; Wang, Meiling; Wang, Hai; Ogiwara, Toshiya; Nishio, Mitsuaki; Tanuma, Shigeo; Simons, David; Szakal, Christopher; Osborn, William; Terauchi, Shinya; Ito, Mika; Kurokawa, Akira; Fujimoto, Toshiyuki; Jordaan, Werner; Jeong, Chil Seong; Havelund, Rasmus; Spencer, Steve; Shard, Alex; Streeck, Cornelia; Beckhoff, Burkhard; Eicke, Axel; Terborg, Ralf

    2015-01-01

    A pilot study for the quantitative surface analysis of multi-element alloy films has been performed by the Surface Analysis Working Group (SAWG) of the Consultative Committee for Amount of Substance (CCQM). The aim of this pilot study was to evaluate a protocol for a key comparison to demonstrate the equivalence of measurements by National Metrology Institutes (NMIs) and Designated Institutes (DIs) of the mole fractions of multi-element alloy films. A Cu(In,Ga)Se2 (CIGS) film with non-uniform depth distribution was chosen as a representative multi-element alloy film. The mole fractions of the reference and test CIGS films were certified by isotope dilution inductively coupled plasma mass spectrometry. A total number counting (TNC) method was used to determine the signal intensities of the constituent elements acquired in SIMS, XPS and AES depth profiling. The TNC method is compatible with the certification process because the certified mole fractions are the average values over the films. The mole fractions of the CIGS films were measured by Secondary Ion Mass Spectrometry (SIMS), Auger Electron Spectroscopy (AES), X-ray Photoelectron Spectroscopy (XPS), X-Ray Fluorescence (XRF) Analysis and Electron Probe Micro Analysis (EPMA) with Energy Dispersive X-ray Spectrometry (EDX). Fifteen laboratories from eight NMIs, one DI, and six non-NMIs participated in this pilot study. The average mole fractions of the reported data showed relative standard deviations from 5.5% to 6.8% and average relative expanded uncertainties in the range from 4.52% to 4.86% for the four test CIGS specimens. These values are smaller than those in the key comparison CCQM-K67 for the measurement of mole fractions of Fe-Ni alloy films. It can therefore be stated that SIMS, XPS and AES protocols relying on the quantification of CIGS films using the TNC method are sufficiently mature to be used in a CCQM key comparison.
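
    The reported inter-laboratory spread is summarized by the relative standard deviation of the mole fractions submitted by the participants. The snippet below shows that calculation on invented values, not the pilot-study results.

    ```python
    import numpy as np

    # Invented mole fractions of one element reported by several laboratories
    # (not the pilot-study data).
    reported = np.array([0.224, 0.236, 0.219, 0.241, 0.228, 0.232])

    mean = reported.mean()
    rsd_percent = reported.std(ddof=1) / mean * 100   # relative standard deviation
    print(f"mean = {mean:.3f}, RSD = {rsd_percent:.1f}%")
    ```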

  10. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand for rare earth elements (REEs) stems from the central role they play in advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical and textural complexity of the ores, so there is a need for a better understanding of their salient properties. This is essential not only for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy- and cost-efficient processing of REE ores depends heavily on information about REE deportment that can be made available by automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for the acquisition of backscattered electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four commercially available SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates the capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce the performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example, from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied to the automated discrimination of apatite formed in a breakdown reaction of monazite from apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for the characterization of complex unconventional REE ores that will become

  11. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    NASA Astrophysics Data System (ADS)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behavior and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgical quality control and material classification. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out a combined correction of plasma temperature and spectral intensity using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in copper-lead alloys are reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements is improved substantially.
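
    For context, conventional CF-LIBS (the baseline the authors improve on) starts from a Boltzmann plot: for each species, ln(Iλ/gA) versus the upper-level energy is a straight line whose slope gives the plasma temperature and whose intercept scales with concentration. A minimal sketch with invented line data follows; the paper's iterative temperature and intensity correction is not reproduced.

    ```python
    import numpy as np

    K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

    # Invented spectral-line data for one species: upper-level energy (eV), measured
    # intensity (a.u.), wavelength (nm) and g*A (statistical weight x transition probability).
    E_upper = np.array([3.0, 4.0, 5.0, 6.0])
    intensity = np.array([441.0, 110.0, 30.0, 7.9])
    lam = np.array([510.0, 480.0, 460.0, 440.0])
    gA = np.array([4.0e8, 3.0e8, 2.5e8, 2.0e8])

    # Boltzmann plot: ln(I*lambda/(g*A)) vs E_upper is linear with slope -1/(k*T);
    # in full CF-LIBS the intercepts of all species, the partition functions and the
    # closure condition (concentrations summing to 100%) then give the composition.
    y = np.log(intensity * lam / gA)
    slope, intercept = np.polyfit(E_upper, y, 1)
    T = -1.0 / (slope * K_B_EV)
    print(f"Plasma temperature ~ {T:.0f} K")  # roughly 1e4 K, typical of LIBS plasmas
    ```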

  12. Calibration-free quantitative analysis of elemental ratios in intermetallic nanoalloys and nanocomposites using Laser Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Davari, Seyyed Ali; Hu, Sheng; Mukherjee, Dibyendu

    2017-03-01

    Intermetallic nanoalloys (NAs) and nanocomposites (NCs) have increasingly gained prominence as efficient catalytic materials in electrochemical energy conversion and storage systems. However, their morphology and chemical composition play a critical role in tuning their catalytic activities and precious metal contents. While advanced microscopy techniques facilitate morphological characterization, traditional chemical characterizations are either qualitative or extremely involved. In this study, we apply laser-induced breakdown spectroscopy (LIBS) for the quantitative compositional analysis of NAs and NCs synthesized with varied elemental ratios by our in-house built pulsed laser ablation technique. Specifically, elemental ratios of binary PtNi and PdCo NAs and PtCo NCs of different compositions are determined from LIBS measurements employing an internal calibration scheme that uses the bulk matrix species as internal standards. The morphology and qualitative elemental compositions of the aforesaid NAs and NCs are confirmed from transmission electron microscopy (TEM) images and energy-dispersive X-ray spectroscopy (EDX) measurements. LIBS experiments are carried out in ambient conditions with the NA and NC samples drop-cast on silicon wafers after centrifugation to increase their concentrations. The technique does not call for cumbersome sample preparation, including acid digestion and external calibration standards, commonly required in inductively coupled plasma-optical emission spectroscopy (ICP-OES). Yet the quantitative LIBS results are in good agreement with the results from ICP-OES measurements. Our results indicate the feasibility of using LIBS in the future for rapid and in situ quantitative chemical characterization of wide classes of synthesized NAs and NCs. Copyright © 2016 Elsevier B.V. All rights reserved.
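
    An internal-calibration scheme of the general kind described above can be reduced to scaling an intensity ratio by a response factor measured on a sample of known composition. The sketch below is a simplified illustration with invented numbers, not the authors' exact procedure.

    ```python
    # Invented line intensities for a reference nanoalloy of known 1:1 Pt:Ni composition.
    I_Pt_ref, I_Ni_ref = 8400.0, 5600.0
    response_factor = I_Pt_ref / I_Ni_ref   # relative sensitivity of the two lines

    # Intensities measured on an unknown sample are scaled by the response factor.
    I_Pt, I_Ni = 6300.0, 8400.0
    pt_ni_ratio = (I_Pt / I_Ni) / response_factor
    print(f"Estimated Pt:Ni atomic ratio ~ {pt_ni_ratio:.2f}")   # 0.50 for these numbers
    ```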

  13. Single cell elemental analysis using nuclear microscopy

    NASA Astrophysics Data System (ADS)

    Ren, M. Q.; Thong, P. S. P.; Kara, U.; Watt, F.

    1999-04-01

    The use of particle-induced X-ray emission (PIXE), Rutherford backscattering spectrometry (RBS) and scanning transmission ion microscopy (STIM) to provide quantitative elemental analysis of single cells is an area of high potential, particularly when trace elements such as Ca, Fe, Zn and Cu can be monitored. We describe the methodology of sample preparation for two cell types, the procedures of cell imaging using STIM, and the quantitative elemental analysis of single cells using RBS and PIXE. Recent work on single cells at the Nuclear Microscopy Research Centre, National University of Singapore has centred around two research areas: (a) apoptosis (programmed cell death), which has recently been implicated in a wide range of pathological conditions such as cancer and Parkinson's disease, and (b) malaria (infection of red blood cells by the malaria parasite). Firstly we present results on the elemental analysis of human Chang liver cells (ATCC CCL 13) where vanadium ions were used to trigger apoptosis, and demonstrate that nuclear microscopy has the capability of monitoring vanadium loading within individual cells. Secondly we present the results of elemental changes taking place in individual mouse red blood cells which have been infected with the malaria parasite and treated with the anti-malaria drug Qinghaosu (QHS).

  14. Artificial neural networks applied to quantitative elemental analysis of organic material using PIXE

    NASA Astrophysics Data System (ADS)

    Correa, R.; Chesta, M. A.; Morales, J. R.; Dinator, M. I.; Requena, I.; Vila, I.

    2006-08-01

    An artificial neural network (ANN) has been trained with real-sample PIXE (particle-induced X-ray emission) spectra of organic substances. Following the training stage, the ANN was applied to a subset of similar samples, obtaining the elemental concentrations in muscle, liver and gills of Cyprinus carpio. Concentrations obtained with the ANN method are in full agreement with results from a standard analytical procedure, showing the high potential of ANNs in quantitative PIXE analysis.
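
    The approach amounts to supervised regression from spectral features to concentrations. A minimal sketch using scikit-learn's MLPRegressor on synthetic data follows; the authors' network architecture and training spectra are not reproduced here.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic training set: 200 "spectra", each reduced to 8 peak-area features,
    # with a concentration that depends linearly on two of the features plus noise.
    rng = np.random.default_rng(0)
    X_train = rng.uniform(0.0, 1.0, size=(200, 8))
    y_train = 3.0 * X_train[:, 0] + 0.5 * X_train[:, 3] + rng.normal(0.0, 0.02, size=200)

    # Small multilayer perceptron standing in for the ANN described above.
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)

    X_new = rng.uniform(0.0, 1.0, size=(1, 8))
    print("Predicted concentration:", model.predict(X_new)[0])
    ```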

  15. Finite Element Analysis of Quantitative Percussion Diagnostics for Evaluating the Strength of Bonds Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott; Malcolm, Doug; Earthman, James

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex-shaped surfaces. To overcome current NDT limitations, a new technology was adopted based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber-reinforced composite materials. Results indicate that this technology is capable of detecting weak (`kiss') bonds between flat composite laminates. Specifically, the local value of the probe force determined from quantitative percussion testing was predicted to be significantly lower for a laminate that contained a `kiss' bond compared to that for a well-bonded sample, which is in agreement with experimental findings. Experimental results were compared to a finite element analysis (FEA) using MSC PATRAN/NASTRAN to understand the viscoelastic behavior of the laminates during percussion testing. The dynamic FEA models were used to directly predict changes in the probe force, as well as effective stress distributions across the bonded panels as a function of time.

  16. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  17. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
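
    Combining independent relative uncertainty components "mathematically" typically means adding them in quadrature. The snippet below illustrates this with invented component values; the paper reports only the combined figure of at most 35%.

    ```python
    import math

    # Invented relative uncertainty components (%); not the paper's values.
    components_percent = {
        "type of microorganism": 20.0,
        "pharmaceutical product": 18.0,
        "reading/interpreting errors": 15.0,
    }

    # Independent relative uncertainties combine in quadrature.
    combined = math.sqrt(sum(u ** 2 for u in components_percent.values()))
    print(f"Combined relative uncertainty ~ {combined:.0f}%")   # ~31% for these numbers
    ```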

  18. Prospects for higher spatial resolution quantitative X-ray analysis using transition element L-lines

    NASA Astrophysics Data System (ADS)

    Statham, P.; Holland, J.

    2014-03-01

    Lowering the electron beam kV reduces electron scattering and improves the spatial resolution of X-ray analysis. However, a previous round-robin analysis of steels at 5-6 kV using Lα lines for the first-row transition elements gave poor accuracies. Our experiments on SS63 steel using Lα lines show similar biases in Cr and Ni that cannot be corrected with changes to self-absorption coefficients or carbon coating. The inaccuracy may be caused by different probabilities for emission and anomalous self-absorption of the Lα line between the specimen and the pure element standard. Analysis using Ll (L3-M1) lines gives more accurate results for SS63, plausibly because the M1 shell is not as vulnerable to the atomic environment as the unfilled M4,5 shell. However, Ll intensities are very weak and WDS analysis may be impractical for some applications. EDS with a large-area SDD offers orders of magnitude faster analysis and achieves similar results to WDS analysis with Lα lines, but its poorer energy resolution precludes the use of Ll lines in most situations. EDS analysis of K lines at low overvoltage is an alternative strategy for improving spatial resolution that could give higher accuracy. The trade-off between low kV and low overvoltage is explored in terms of sensitivity of element detection for different elements.

  19. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    PubMed

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the image reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  20. Quantitative Effects of P Elements on Hybrid Dysgenesis in Drosophila Melanogaster

    PubMed Central

    Rasmusson, K. E.; Simmons, M. J.; Raymond, J. D.; McLarnon, C. F.

    1990-01-01

    Genetic analyses involving chromosomes from seven inbred lines derived from a single M' strain were used to study the quantitative relationships between the incidence and severity of P-M hybrid dysgenesis and the number of genomic P elements. In four separate analyses, the mutability of sn(w), a P element-insertion mutation of the X-linked singed locus, was found to be inversely related to the number of autosomal P elements. Since sn(w) mutability is caused by the action of the P transposase, this finding supports the hypothesis that genomic P elements titrate the transposase present within a cell. Other analyses demonstrated that autosomal transmission ratios were distorted by P element action. In these analyses, the amount of distortion against an autosome increased more or less linearly with the number of P elements carried by the autosome. Additional analyses showed that the magnitude of this distortion was reduced when a second P element-containing autosome was present in the genome. This reduction could adequately be explained by transposase titration; there was no evidence that it was due to repressor molecules binding to P elements and inhibiting their movement. The influence of genomic P elements on the incidence of gonadal dysgenesis was also investigated. Although no simple relationship between the number of P elements and the incidence of the trait could be discerned, it was clear that even a small number of elements could increase the incidence markedly. The failure to find a quantitative relationship between P element number and the incidence of gonadal dysgenesis probably reflects the complex etiology of this trait. PMID:2155853

  1. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, using a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  2. Quantitative analysis of trace element concentrations in some gem-quality diamonds

    NASA Astrophysics Data System (ADS)

    McNeill, J.; Pearson, D. G.; Klein-Ben David, O.; Nowell, G. M.; Ottley, C. J.; Chinn, I.

    2009-09-01

    The geochemical signature of diamond-forming fluids can be used to unravel diamond-forming processes and is of potential use in the detection of so-called 'conflict' diamonds. While fluid-rich fibrous diamonds can be analyzed by a variety of techniques, very few data have been published for fluid-poor, gem-quality diamonds because of their very low impurity levels. Here we present a new ICPMS-based (ICPMS: inductively coupled plasma mass spectrometry) method for the analysis of trace element concentrations within fluid-poor, gem-quality diamonds. The method employs a closed-system laser ablation cell. Diamonds are ablated and the products trapped for later pre-concentration into solutions that are analyzed by sector-field ICPMS. We show that our limits of quantification for a wide range of elements are at the sub-pg to low pg level. The method is applied to a suite of 10 diamonds from the Cullinan Mine (previously known as Premier), South Africa, along with other diamonds from Siberia (Mir and Udachnaya) and Venezuela. The concentrations of a wide range of elements for all the samples (expressed by weight in the solid) are very low, with rare earth elements along with Y, Nb, Cs ranging from 0.01 to 2 ppb. Large ion lithophile elements (LILE) such as Rb and Ba vary from 1 to 30 ppb. Ti ranges from ppb levels up to 2 ppm. From the combined, currently small data set we observe two kinds of diamond-forming fluids within gem diamonds. One group has enrichments in LILE over Nb, whereas a second group has normalized LILE abundances more similar to those of Nb. These two groups bear some similarity to different groups of fluid-rich diamonds, providing some supporting evidence of a link between the parental fluids for both fluid-inclusion-rich and gem diamonds.

  3. Quantitative determination of low-Z elements in single atmospheric particles on boron substrates by automated scanning electron microscopy-energy-dispersive X-ray spectrometry.

    PubMed

    Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René

    2005-09-01

    Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be

  4. Quantitative analysis of major elements in silicate minerals and glasses by micro-PIXE

    USGS Publications Warehouse

    Campbell, J.L.; Czamanske, G.K.; MacDonald, L.; Teesdale, W.J.

    1997-01-01

    The Guelph micro-PIXE facility has been modified to accommodate a second Si(Li) X-ray detector which records the spectrum due to light major elements (11 ≤ Z ≤ 20) with no deleterious effects from scattered 3 MeV protons. Spectra have been recorded from 30 well-characterized materials, including a broad range of silicate minerals and both natural and synthetic glasses. Sodium is mobile in some of the glasses, but not in the studied mineral lattices. The mean value of the instrumental constant H for each of the elements Mg, Al, and Si in these materials is systematically 6-8% lower than the H-value measured for the pure metals. Normalization factors are derived which permit the matrix corrections requisite for trace-element measurements in silicates to be based upon pure metal standards for Mg, Al and Si, supplemented by well-established, silicate mineral standards for the elements Na, K and Ca. Rigorous comparisons of electron microprobe and micro-PIXE analyses for the entire, 30-sample suite demonstrate the ability of micro-PIXE to produce accurate analysis for the light major elements in silicates. © 1997 Elsevier Science B.V.

  5. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). The normalization of PIXE results is usually expressed in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. The results indicated the reliability of STIM mass measurements performed on biological samples and suggested that STIM should be performed for PIXE normalization. Further information deriving from the direct confrontation of AFM and STIM analysis could also be obtained, such as in situ measurement of the cell specific gravity within cell compartments (nucleolus and cytoplasm).
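
    The normalization described above reduces to dividing a PIXE-derived elemental areal density by the STIM-derived dry-mass areal density. A short illustration with invented values follows.

    ```python
    # Illustrative PIXE/STIM normalization (invented values).
    fe_areal_density_ng_cm2 = 45.0        # elemental areal density from PIXE, ng/cm^2
    dry_mass_areal_density_ug_cm2 = 85.0  # dry-mass areal density from STIM, ug/cm^2

    # 1 ng/ug = 1000 ug/g, so the ratio times 1e3 gives a concentration in ppm of dry mass.
    fe_ppm_dry_mass = fe_areal_density_ng_cm2 / dry_mass_areal_density_ug_cm2 * 1e3
    print(f"Fe ~ {fe_ppm_dry_mass:.0f} ppm (dry mass)")
    ```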

  6. Quantitative analysis of major and trace elements in NH4HF2-modified silicate rock powders by laser ablation - inductively coupled plasma mass spectrometry.

    PubMed

    Zhang, Wen; Hu, Zhaochu; Liu, Yongsheng; Yang, Wenwu; Chen, Haihong; Hu, Shenghong; Xiao, Hongyan

    2017-08-29

    In this paper, we describe an NH4HF2 digestion method as sample preparation for the rapid determination of major and trace elements in silicate rocks using laser ablation-inductively coupled plasma mass spectrometry (LA-ICP-MS). Sample powders digested by NH4HF2 at 230 °C for 3 h form ultrafine powders with a typical grain size d80 < 8.5 μm, and various silicate rocks have a consistent grain morphology and size, allowing us to produce pressed powder pellets that have excellent cohesion and homogeneity suitable for laser ablation micro-analysis without the addition of a binder. The influences of the digestion parameters were investigated and optimized, including the evaporation stage for removing residual NH4HF2, sample homogenization, selection of the digestion vessel and the calibration strategy for quantitative analysis. The optimized NH4HF2 digestion method was applied to dissolve six silicate rock reference materials (BCR-2, BHVO-2, AGV-2, RGM-2, GSP-2, GSR-1) covering a wide range of rock types. Ten major elements and thirty-five trace elements were simultaneously analyzed by LA-ICP-MS. The analytical results for the six reference materials generally agreed with the recommended values, with discrepancies of less than 10% for most elements. The analytical precision is within 5% for most major elements and within 10% for most trace elements. Compared with previous methods of LA-ICP-MS bulk analysis, our method enables the complete dissolution of refractory minerals, such as zircon, in intermediate-acidic intrusive rocks and limits contamination as well as the loss of volatile elements. Moreover, the new technique has many advantages, including reduced matrix effects between reference materials and samples, simple and feasible spiking of the internal standard, and sample batch processing. The applicability field of the new technique in this study was focused on the whole-rock analysis of igneous rock samples, which are from basic rocks to acid

  7. Accuracy improvement of quantitative analysis by spatial confinement in laser-induced breakdown spectroscopy.

    PubMed

    Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F

    2013-07-29

    To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy, the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements present at low concentrations in the steel samples, such as vanadium (V), chromium (Cr), and manganese (Mn), were investigated. After optimization of the cavity dimensions and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of the V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm(2) using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the V I 440.85 nm/Fe I 438.35 nm calibration was increased from 0.946 (without the cavity) to 0.981 (with the cavity), and similar results were obtained for Cr I 425.43 nm/Fe I 425.08 nm and Mn I 476.64 nm/Fe I 492.05 nm. It was therefore demonstrated that the accuracy of quantitative analysis of low-concentration elements in steel samples was improved, because the plasma became more uniform under spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative LIBS analysis.

  8. Method for quantitative determination and separation of trace amounts of chemical elements in the presence of large quantities of other elements having the same atomic mass

    DOEpatents

    Miller, C.M.; Nogar, N.S.

    1982-09-02

    Photoionization via autoionizing atomic levels combined with conventional mass spectroscopy provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single photon process permits greater photon utilization efficiency because of its greater absorption cross section than bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.

  9. Elemental analysis of granite by instrumental neutron activation analysis (INAA) and X-ray fluorescence analysis (XRF).

    PubMed

    El-Taher, A

    2012-01-01

    The instrumental neutron activation analysis (INAA) technique was used for the qualitative and quantitative analysis of granite samples collected from four locations in the Aswan area of southern Egypt. The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10(11) n/cm(2)s in the TRIGA Mainz research reactor. Gamma-ray spectra from a hyper-pure germanium detector were analyzed. The present study provides basic data on the elemental concentrations of granite rocks. The following elements were determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U. X-ray fluorescence (XRF) was used for comparison and to detect elements which can be detected only by XRF, such as F, S, Cl, Co, Cu, Mo, Ni, Pb, Se and V. The data presented here are our contribution to understanding the elemental composition of the granite rocks. Because there are no existing databases for the elemental analysis of granite, our results are a start toward establishing a database for Egyptian granite. It is hoped that the data presented here will be useful to those dealing with geochemistry, granite chemistry and related fields. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. ICP-MS as a novel detection system for quantitative element-tagged immunoassay of hidden peanut allergens in foods.

    PubMed

    Careri, Maria; Elviri, Lisa; Mangia, Alessandro; Mucchino, Claudio

    2007-03-01

    A novel ICP-MS-based ELISA immunoassay via element-tagged determination was devised for quantitative analysis of hidden allergens in food. The method was able to detect low amounts of peanuts (down to approximately 2 mg peanuts kg(-1) cereal-based matrix) by using a europium-tagged antibody. Selectivity was proved by the lack of detectable cross-reaction with a number of protein-rich raw materials.

  11. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    NASA Astrophysics Data System (ADS)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for the simulation of particle-induced X-ray emission and Rutherford backscattering spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters, such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines, over a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers using PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained and in this way save a lot of expensive machine time.

  12. Signatures of Evolutionary Adaptation in Quantitative Trait Loci Influencing Trace Element Homeostasis in Liver

    PubMed Central

    Sabidó, Eduard; Bosch, Elena

    2016-01-01

    Essential trace elements possess vital functions at the molecular, cellular, and physiological levels in health and disease, and they are tightly regulated in the human body. In order to assess variability and potential adaptive evolution of trace element homeostasis, we quantified 18 trace elements in 150 liver samples, together with the expression levels of 90 genes and the abundances of 40 proteins involved in their homeostasis. Additionally, we genotyped 169 single nucleotide polymorphisms (SNPs) in the same sample set. We detected significant associations for 8 protein quantitative trait loci (pQTLs), 10 expression quantitative trait loci (eQTLs), and 15 micronutrient quantitative trait loci (nutriQTLs). Six of these exceeded the false discovery rate cutoff and were related to essential trace elements: 1) one pQTL for GPX2 (rs10133290); 2) two previously described eQTLs for HFE (rs12346) and SELO (rs4838862) expression; and 3) three nutriQTLs: the pathogenic C282Y mutation at HFE affecting iron (rs1800562), and two SNPs within several clustered metallothionein genes determining selenium concentration (rs1811322 and rs904773). Within the complete set of significant QTLs (which involved 30 SNPs and 20 gene regions), we identified 12 SNPs with extreme patterns of population differentiation (FST values in the top 5% percentile in at least one HapMap population pair) and significant evidence for selective sweeps involving QTLs at GPX1, SELENBP1, GPX3, SLC30A9, and SLC39A8. Overall, this detailed study of various molecular phenotypes illustrates the role of regulatory variants in explaining differences in trace element homeostasis among populations and in the human adaptive response to environmental pressures related to micronutrients. PMID:26582562

  13. Organic Elemental Analysis.

    ERIC Educational Resources Information Center

    Ma, T. S.; Gutterson, Milton

    1980-01-01

    Reviews general developments in the computerization and data processing of organic elemental analyses; carbon, hydrogen, and nitrogen analyzers; and procedures for determining oxygen, sulfur, and halogens, as well as other nonmetallic elements and organometallics. Selected papers on trace analysis of nonmetals and determination of metallic elements are…

  14. Elemental analysis of occupational and environmental lung diseases by electron probe microanalyzer with wavelength dispersive spectrometer.

    PubMed

    Takada, Toshinori; Moriyama, Hiroshi; Suzuki, Eiichi

    2014-01-01

    Occupational and environmental lung diseases are a group of pulmonary disorders caused by inhalation of harmful particles, mists, vapors or gases. Mineralogical analysis is not generally required in the diagnosis of most cases of these diseases. Apart from minerals that are encountered rarely or only in specific occupations, small quantities of mineral dusts are present in the healthy lung. As such when mineralogical analysis is required, quantitative or semi-quantitative methods must be employed. An electron probe microanalyzer with wavelength dispersive spectrometer (EPMA-WDS) enables analysis of human lung tissue for deposits of elements by both qualitative and semi-quantitative methods. Since 1993, we have analyzed 162 cases of suspected occupational and environmental lung diseases using an EPMA-WDS. Our institute has been accepting online requests for elemental analysis of lung tissue samples by EPMA-WDS since January 2011. Hard metal lung disease is an occupational interstitial lung disease that primarily affects workers exposed to the dust of tungsten carbide. The characteristic pathological findings of the disease are giant cell interstitial pneumonia (GIP) with centrilobular fibrosis, surrounded by mild alveolitis with giant cells within the alveolar space. EPMA-WDS analysis of biopsied lung tissue from patients with GIP has demonstrated that tungsten and/or cobalt is distributed in the giant cells and centrilobular fibrosing lesion in GIP. Pneumoconiosis, caused by amorphous silica, and acute interstitial pneumonia, associated with the giant tsunami, were also elementally analyzed by EPMA-WDS. The results suggest that commonly found elements, such as silicon, aluminum, and iron, may cause occupational and environmental lung diseases. Copyright © 2013 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  15. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  16. Quantitative analysis of trace metal accumulation in teeth using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Samek, O.; Beddows, D. C. S.; Telle, H. H.; Morris, G. W.; Liska, M.; Kaiser, J.

    The technique of laser ablation is receiving increasing attention for applications in dentistry, specifically for the treatment of teeth (e.g. drilling of micro-holes and plaque removal). In the process of ablation a luminous micro-plasma is normally generated which may be exploited for elemental analysis. Here we report on quantitative Laser-Induced Breakdown Spectroscopy (LIBS) analysis to study the presence of trace minerals in teeth. A selection of teeth of different age groups has been investigated, ranging from the first teeth of infants, through the second teeth of children, to adults to trace the influence of environmental factors on the accumulation of a number of elements in teeth. We found a close link between elements detected in tooth fillings and toothpastes with those present in teeth.

  17. Determination of elements in hospital waste with neutron activation analysis method

    NASA Astrophysics Data System (ADS)

    Dwijananti, P.; Astuti, B.; Alwiyah; Fianti

    2018-03-01

    Hospitals are the largest producers of B3 (hazardous) waste, which arises from medical and laboratory activities. The purpose of this study is to determine the elements contained in liquid waste from a hospital and to calculate the levels of these elements. The research was carried out by neutron activation analysis at BATAN Yogyakarta. The neutron activation analysis is divided into two stages: activation of the samples using the neutron source of the Kartini reactor, followed by counting with a gamma spectrometer with an HPGe detector. Qualitative and quantitative analyses were done by matching the gamma spectrum peaks to the Neutron Activation Table. The samples were taken from four points of the wastewater treatment plant (WWTP) of the Bhakti Wira Tamtama Semarang hospital. The results showed that the samples contain the elements Cr, Zn, Fe, Co, and Na, with levels of Cr (0.033-0.075) mg/L, Zn (0.090-1.048) mg/L, Fe (2.937-37.743) mg/L, Co (0.005-0.023) mg/L, and Na (61.088-116.330) mg/L. Compared to the standard values, the liquid is safe for the environment.

  18. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS. In-depth analysis shows that the background spectrum and the characteristic line spectrum follow approximately the same trend as the plasma temperature changes, so measuring the signal-to-background ratio (S/B) and applying regression analysis can compensate for spectral line intensity changes caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measured data were limited and nonlinear, a support vector machine (SVM) was used as the regression algorithm. The experimental results showed that the method improves the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and the average relative error of the test set were 4.7% and 9.5%, respectively. A data fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements and background spectrum variations, and provides a data processing reference for real-time online quantitative LIBS analysis.
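
    The abstract does not give implementation details, so the following is only a minimal sketch of the S/B-plus-regression idea, assuming invented peak and background intensities and concentration values rather than data from the paper.

```python
# Minimal sketch of S/B-based calibration with support vector regression.
# All numbers are illustrative placeholders, not data from the paper.
import numpy as np
from sklearn.svm import SVR

# Hypothetical training set: peak and background intensities for one analyte line
peak = np.array([1520., 2840., 4310., 5650., 7020., 8460.])
background = np.array([310., 350., 405., 430., 470., 510.])
concentration = np.array([5., 10., 20., 30., 40., 50.])   # mg/L, assumed values

sb_ratio = (peak / background).reshape(-1, 1)             # signal-to-background ratio

# SVM regression copes with the limited, nonlinear data set
model = SVR(kernel='rbf', C=100.0, epsilon=0.5)
model.fit(sb_ratio, concentration)

# Predict an unknown sample from its measured S/B ratio
unknown_sb = np.array([[12.3]])
print("Predicted concentration (mg/L):", model.predict(unknown_sb)[0])
```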

  19. Quantitative Chromatographic Determination of Dissolved Elemental Sulfur in the Non-aqueous Electrolyte for Lithium-Sulfur Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Dong; Yang, Xiao-Qing; Zhang, Xuran

    A fast and reliable analytical method is reported for the quantitative determination of dissolved elemental sulfur in non-aqueous electrolytes for Li-S batteries. By using high performance liquid chromatography with a UV detector, the solubility of S in 12 different pure solvents and in 22 different electrolytes was determined. It was found that the solubility of elemental sulfur depends on the Lewis basicity and polarity of the solvents and on the salt concentration in the electrolytes. In addition, the S content in the electrolyte recovered from a discharged Li-S battery was successfully determined by the proposed HPLC/UV method, demonstrating the feasibility of the method for online analysis of Li-S batteries. Interestingly, the S was found to be supersaturated in the electrolyte recovered from a discharged Li-S cell.

  20. Quantitative Chromatographic Determination of Dissolved Elemental Sulfur in the Non-aqueous Electrolyte for Lithium-Sulfur Batteries

    DOE PAGES

    Zheng, Dong; Yang, Xiao-Qing; Zhang, Xuran; ...

    2014-12-02

    A fast and reliable analytical method is reported for the quantitative determination of dissolved elemental sulfur in non-aqueous electrolytes for Li-S batteries. By using high performance liquid chromatography with a UV detector, the solubility of S in 12 different pure solvents and in 22 different electrolytes was determined. It was found that the solubility of elemental sulfur depends on the Lewis basicity and polarity of the solvents and on the salt concentration in the electrolytes. In addition, the S content in the electrolyte recovered from a discharged Li-S battery was successfully determined by the proposed HPLC/UV method, demonstrating the feasibility of the method for online analysis of Li-S batteries. Interestingly, the S was found to be supersaturated in the electrolyte recovered from a discharged Li-S cell.

  1. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for the implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  2. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory

  3. Quantitative investigation of ligament strains during physical tests for sacroiliac joint pain using finite element analysis.

    PubMed

    Kim, Yoon Hyuk; Yao, Zhidong; Kim, Kyungsoo; Park, Won Man

    2014-06-01

    It may be assumed that joint stability is affected when some ligaments are injured or loosened, and that this instability causes sacroiliac joint pain. Several physical examinations have been used to diagnose sacroiliac pain and to isolate the source of the pain. However, more quantitative and objective information may be necessary to identify unstable or injured ligaments during these tests, due to the lack of understanding of the quantitative relationship between the physical tests and the biomechanical parameters that may be related to pain in the sacroiliac joint and the surrounding ligaments. In this study, a three-dimensional finite element model of the sacroiliac joint was developed and the biomechanical conditions for six typical physical tests were modelled: the compression test, distraction test, sacral apex pressure test, thigh thrust test, Patrick's test, and Gaenslen's test. The sacroiliac joint contact pressure and ligament strain were investigated for each test. The values of contact pressure and the combination of most highly strained ligaments differed markedly among the tests. Therefore, these findings in combination with the physical tests would be helpful to identify the pain source and to understand the pain mechanism. Moreover, the technology provided in this study might be a useful tool to evaluate the physical tests, to improve the present test protocols, or to develop a new physical test protocol. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    NASA Astrophysics Data System (ADS)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method to quantitatively determine the concentrations of elements in a sample under local thermodynamic equilibrium conditions without using matrix-matched calibration standards. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. TiO2 powder was mixed with polyvinyl alcohol and pressed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The plasma spectrum was recorded with a spectrometer and compared to the NIST spectral line database to identify the transitions and their parameters. The plasma temperature obtained from the Boltzmann plot is 8127.29 K and the calculated electron density is 2.49 × 10^16 cm^-3. Finally, the concentration of Ti in the TiO2 sample from this study is 97%, in close agreement with the sample certificate.
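
    The Boltzmann-plot step mentioned above can be illustrated with a short sketch: for emission lines of one species, ln(Iλ/gA) plotted against the upper-level energy has slope -1/(k_B·T). The line data below are invented placeholders, not the Ti lines used in the study.

```python
# Sketch of the Boltzmann-plot step of CF-LIBS: the slope of
# ln(I*lambda/(g*A)) versus upper-level energy E_k equals -1/(k_B*T).
# Line data are placeholders, not the Ti lines from the paper.
import numpy as np

k_B = 8.617333e-5                       # Boltzmann constant, eV/K

# Hypothetical lines: intensity (a.u.), wavelength (nm), g*A (s^-1), E_upper (eV)
intensity = np.array([12000., 8100., 11900., 5600.])
wavelength = np.array([498.17, 499.11, 453.32, 375.93])
gA = np.array([2.6e8, 1.8e8, 3.4e8, 2.1e8])
E_upper = np.array([3.32, 3.34, 3.58, 3.90])

y = np.log(intensity * wavelength / gA)     # Boltzmann-plot ordinate
slope, intercept = np.polyfit(E_upper, y, 1)

T = -1.0 / (k_B * slope)                    # plasma temperature in K
print(f"Estimated plasma temperature: {T:.0f} K")
```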

  5. Remote quantitative analysis of minerals based on multispectral line-calibrated laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Wan, Xiong; Wang, Peng

    2014-01-01

    Laser-induced breakdown spectroscopy (LIBS) is a feasible remote sensing technique for mineral analysis in unapproachable places where direct probing is not possible, such as the analysis of radioactive elements after a nuclear leak or the detection of the elemental compositions and contents of minerals on planetary and lunar surfaces. Here a compact custom optical system with a 15 m focal distance, combining a 6× beam expander with a telescope, has been built, with which the beam of a 1064 nm Nd:YAG laser is focused onto remote minerals. The excited LIBS signals that reveal the elemental compositions of the minerals are collected by another compact single-lens signal acquisition system. In our remote LIBS investigations, the LIBS spectra of an unknown ore have been recorded, from which the metal compositions are obtained. In addition, a multi-spectral line calibration (MSLC) method is proposed for the quantitative analysis of elements. The feasibility of the MSLC and its superiority over a single-wavelength determination have been confirmed by comparison with traditional chemical analysis of the copper content in the ore.
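
    The paper does not spell out the MSLC formula; one plausible reading, sketched below with invented Cu line intensities and concentrations, is a joint regression of several emission lines of the analyte against known concentrations rather than a single-line calibration curve.

```python
# Sketch of a multi-spectral-line calibration: intensities of several Cu lines
# are regressed jointly against known concentrations. Values are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows = calibration samples, columns = intensities of Cu I 324.7, 327.4, 510.5 nm (assumed lines)
line_intensities = np.array([
    [1200.,  980.,  300.],
    [2500., 2100.,  640.],
    [3900., 3300., 1010.],
    [5300., 4500., 1350.],
])
cu_content = np.array([0.5, 1.1, 1.7, 2.3])    # wt%, assumed reference values

mslc = LinearRegression().fit(line_intensities, cu_content)

# Remote measurement of the unknown ore: the same three lines
unknown = np.array([[3100., 2650., 820.]])
print("Estimated Cu content (wt%):", mslc.predict(unknown)[0])
```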

  6. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate the distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single-cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of the Parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than in the control; in particular, Zn was less than 40 ppm in SN tissue from the PDC case, compared with 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  7. Boundary element analysis of corrosion problems for pumps and pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyasaka, M.; Amaya, K.; Kishimoto, K.

    1995-12-31

    Three-dimensional (3D) and axi-symmetric boundary element methods (BEM) were developed to quantitatively estimate cathodic protection and macro-cell corrosion. For 3D analysis, a multiple-region method (MRM) was developed in addition to a single-region method (SRM). The validity and usefulness of the BEMs were demonstrated by comparing numerical results with experimental data from galvanic corrosion systems of a cylindrical model and a seawater pipe, and from a cathodic protection system of an actual seawater pump. It was shown that a highly accurate analysis could be performed for fluid machines handling seawater with complex 3D fields (e.g. a seawater pump) by taking into account the flow rate and time dependencies of the polarization curve. Compared to the 3D BEM, the axi-symmetric BEM permitted large reductions in the numbers of elements and nodes, which greatly simplified the analysis of axi-symmetric fields such as pipes. Computational accuracy and CPU time were compared between analyses using two approximation methods for polarization curves: a logarithmic-approximation method and a linear-approximation method.

  8. Matrix-Assisted Plasma Atomization Emission Spectrometry for Surface Sampling Elemental Analysis

    PubMed Central

    Yuan, Xin; Zhan, Xuefang; Li, Xuemei; Zhao, Zhongjun; Duan, Yixiang

    2016-01-01

    An innovative technology has been developed involving a simple and sensitive optical spectrometric method termed matrix-assisted plasma atomization emission spectrometry (MAPAES) for surface sampling elemental analysis, using a piece of filter paper (FP) for sample introduction. MAPAES was carried out by direct interaction of the plasma tail plume with the matrix surface. The FP absorbs energy from the plasma source and releases combustion heating to the analytes originally present on its surface, thus promoting the atomization and excitation process. The matrix-assisted plasma atomization excitation phenomenon was observed for multiple elements. The FP matrix served as a partial energy producer and also as the sample substrate that adsorbs the sample solution. Qualitative and quantitative determinations of metal ions were achieved by atomic emission measurements for the elements Ba, Cu, Eu, In, Mn, Ni, Rh and Y. The detection limits were down to the pg level with linear correlation coefficients better than 0.99. The proposed MAPAES provides a new approach to atomic spectrometry which offers the advantages of fast analysis, low sample consumption, minimal sample pretreatment, small size, and low cost. PMID:26762972

  9. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  10. Downhole Elemental Analysis with LIBS

    NASA Technical Reports Server (NTRS)

    Moreschini, Paolo; Zacny, Kris; Rickman, Doug

    2011-01-01

    In this paper we discuss a novel instrument, currently under development at Honeybee Robotics with SBIR funding from NASA. The device is designed to characterize elemental composition as a function of depth in non-terrestrial geological formations. The instrument consists of a miniaturized laser-induced breakdown spectrometer (LIBS) analyzer integrated in a 2" diameter drill string. While the drill provides subsurface access, the LIBS analyzer provides information on the elemental composition of the borehole wall. This instrument has a variety of space applications, ranging from exploration of the Moon, for which it was originally designed, to Mars, as well as a variety of terrestrial applications. Subsurface analysis is usually performed by sample acquisition through a drill or excavator, followed by sample preparation and subsequent sample presentation to an instrument or suite of instruments. An alternative approach, bringing a miniaturized version of the instrument to the sample, has many advantages over the traditional methodology, as it allows faster response, reduced probability of cross-contamination and a simplification of the sampling mechanisms. LIBS functions by focusing a high-energy laser on a material, inducing a plasma consisting of a small fraction of the material under analysis. Optical emission from the plasma, analyzed by a spectrometer, can be used to determine elemental composition. A triangulation sensor located in the sensor head determines the distance of the sensor from the borehole wall. An actuator modifies the position of the sensor accordingly, in order to compensate for changes due to the profile of the borehole walls. This is necessary because LIBS measurements are negatively affected by changes in the relative position of the focus of the laser with respect to the position of the sample (commonly referred to as the "lens to sample distance"). Profiling the borehole is done by adjusting the position of the sensor with a

  11. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    A technique of multivariate quantitative chemical analysis was devised for determining the relative proportions of two components mixed and sprayed together onto an object to form thermally insulating foam. It is potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses. In addition to chemical composition, it can also be used to determine such physical properties as densities and strengths.

  12. Quantitative Modelling of Trace Elements in Hard Coal.

    PubMed

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades, and coal is expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method in modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal and on coal ash components. The study focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The data set comprised 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three of the models. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration in modeling the content of trace elements in coal, and in this way it contributes to the development of useful tools for coal quality assessment.
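
    As a minimal sketch of the modeling step (not the authors' robust PLS variant), a Partial Least Squares regression with a cross-validated prediction error might look like the following; the data here are random placeholders standing in for the 132 samples × 24 parameters.

```python
# Minimal sketch of PLS modeling of a trace-element concentration from
# coal/ash parameters. Arrays are random placeholders, not the study's data,
# and the robust PLS variant used in the paper is not shown.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(132, 24))           # physical/chemical coal and ash parameters
y = X @ rng.normal(size=24) + rng.normal(scale=0.2, size=132)   # e.g. As concentration

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))   # root mean square error of cross-validation
print(f"RMSECV: {rmsecv:.3f}")
```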

  13. Quantitative Modelling of Trace Elements in Hard Coal

    PubMed Central

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades, and coal is expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method in modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal and on coal ash components. The study focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The data set comprised 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three of the models. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration in modeling the content of trace elements in coal, and in this way it contributes to the development of useful tools for coal quality assessment. PMID:27438794

  14. A quantitative metric to identify critical elements within seafood supply networks.

    PubMed

    Plagányi, Éva E; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
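
    The abstract does not give the exact formula for the Supply Chain Index; the sketch below is only one illustrative reading, in which each element is scored by throughput weighted by connectivity and the scores are summed. Apart from the processors, airports and Chinese consumers mentioned above, the element names and all numbers are invented.

```python
# Illustrative reading of the Supply Chain Index idea: score each element
# by throughput weighted by connectivity, then sum the scores.
# The formula and the numbers below are assumptions, not from the paper.
lobster_chain = {
    # element             (throughput in tonnes/yr, number of connections)
    "fishers":            (1500, 3),
    "processors":         (1400, 5),
    "airports":           (1300, 6),
    "exporters":          (1200, 4),
    "chinese_consumers":  (1100, 5),
}

scores = {name: tput * links for name, (tput, links) in lobster_chain.items()}
sci = sum(scores.values())   # single metric for the whole chain

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} {score:>8d}")
print("Supply Chain Index (sum of element scores):", sci)
```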

  15. A Quantitative Metric to Identify Critical Elements within Seafood Supply Networks

    PubMed Central

    Plagányi, Éva E.; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J.; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H.; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical. PMID:24633147

  16. Quantitative analysis of eyes and other optical systems in linear optics.

    PubMed

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogenous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  17. pXRF quantitative analysis of the Otowi Member of the Bandelier Tuff: Generating large, robust data sets to decipher trace element zonation in large silicic magma chambers

    NASA Astrophysics Data System (ADS)

    Van Hoose, A. E.; Wolff, J.; Conrey, R.

    2013-12-01

    Advances in portable X-Ray fluorescence (pXRF) analytical technology have made it possible for high-quality, quantitative data to be collected in a fraction of the time required by standard, non-portable analytical techniques. Not only do these advances reduce analysis time, but data may also be collected in the field in conjunction with sampling. Rhyolitic pumice, being primarily glass, is an excellent material to be analyzed with this technology. High-quality, quantitative data for elements that are tracers of magmatic differentiation (e.g. Rb, Sr, Y, Nb) can be collected for whole, individual pumices and subsamples of larger pumices in 4 minutes. We have developed a calibration for powdered rhyolite pumice from the Otowi Member of the Bandelier Tuff analyzed with the Bruker Tracer IV pXRF using Bruker software and influence coefficients for pumice, which measures the following 19 oxides and elements: SiO2, TiO2, Al2O3, FeO*, MnO, CaO, K2O, P2O5, Zn, Ga, Rb, Sr, Y, Zr, Nb, Ba, Ce, Pb, and Th. With this calibration for the pXRF and thousands of individual powdered pumice samples, we have generated an unparalleled data set for any single eruptive unit with known trace element zonation. The Bandelier Tuff of the Valles-Toledo Caldera Complex, Jemez Mountains, New Mexico, is divided into three main eruptive events. For this study, we have chosen the 1.61 Ma, 450 km3 Otowi Member as it is primarily unwelded and pumice samples are easily accessible. The eruption began with a plinian phase from a single source located near center of the current caldera and deposited the Guaje Pumice Bed. The initial Unit A of the Guaje is geochemically monotonous, but Units B through E, co-deposited with ignimbrite show very strong chemical zonation in trace elements, progressing upwards through the deposits from highly differentiated compositions (Rb ~350 ppm, Nb ~200 ppm) to less differentiated (Rb ~100 ppm, Nb ~50 ppm). Co-erupted ignimbrites emplaced during column collapse show

  18. Quantitative elemental imaging of heterogeneous catalysts using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Trichard, F.; Sorbier, L.; Moncayo, S.; Blouët, Y.; Lienemann, C.-P.; Motto-Ros, V.

    2017-07-01

    Currently, the use of catalysis is widespread in almost all industrial processes; it improves productivity, synthesis yields and waste treatment and decreases energy costs. Increasingly stringent requirements, in terms of reaction selectivity and environmental standards, demand progressively greater accuracy and control of operations. Meanwhile, the development of characterization techniques has been challenging, and the techniques often require highly complex equipment. In this paper, we demonstrate a novel elemental approach for performing quantitative space-resolved analysis with ppm-scale quantification limits and μm-scale resolution. This approach, based on laser-induced breakdown spectroscopy (LIBS), is distinguished by its simplicity, all-optical design, and speed of operation. This work analyzes palladium-based porous alumina catalysts, which are commonly used in the selective hydrogenation process, using the LIBS method. We report an exhaustive study of the quantification capability of LIBS and its ability to perform imaging measurements over a large dynamic range, typically from a few ppm to wt%. These results offer new insight into the use of LIBS-based imaging in industry and pave the way for innumerable applications.

  19. Semi-quantitative spectrographic analysis and rank correlation in geochemistry

    USGS Publications Warehouse

    Flanagan, F.J.

    1957-01-01

    The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements. © 1957.
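
    A brief sketch of how grouped or partly ranked spectrographic results can be correlated in practice, using Spearman's coefficient as implemented in SciPy; the concentration classes below are invented.

```python
# Sketch of the rank-correlation approach: semi-quantitative spectrographic
# results reported as concentration classes (partly ranked data) can still be
# correlated with Spearman's r_s. The class values below are invented.
from scipy.stats import spearmanr

# Reported concentration classes for two elements in the same set of samples
ni_class = [1, 1, 2, 3, 3, 4, 5, 5, 6, 7]
cr_class = [1, 2, 2, 3, 4, 4, 4, 6, 6, 7]

r_s, p_value = spearmanr(ni_class, cr_class)   # ties are handled automatically
print(f"Spearman rank correlation r_s = {r_s:.2f} (p = {p_value:.3f})")
```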

  20. Application of Standards-Based Quantitative SEM-EDS Analysis to Oxide Minerals

    NASA Astrophysics Data System (ADS)

    Mengason, M. J.; Ritchie, N. W.; Newbury, D. E.

    2016-12-01

    SEM and EPMA analysis are powerful tools for documenting and evaluating the relationships between minerals in thin sections and for determining chemical compositions in-situ. The time and costs associated with determining major, minor, and some trace element concentrations in geologic materials can be reduced due to advances in EDS spectrometer performance and the availability of software tools such as NIST DTSA II to perform multiple linear least squares (MLLS) fitting of energy spectra from standards to the spectra from samples recorded under the same analytical conditions. MLLS fitting is able to overcome spectral peak overlaps among the transition-metal elements that commonly occur in oxide minerals, which had previously been seen as too difficult for EDS analysis, allowing for rapid and accurate determination of concentrations. The quantitative use of EDS is demonstrated in the chemical analysis of magnetite (NMNH 114887) and ilmenite (NMNH 96189) from the Smithsonian Natural History Museum Microbeam Standards Collection. Average concentrations from nine total spots over three grains are given in mass % listed as (recommended; measured concentration ± one standard deviation). Spectra were collected for sixty seconds live time at 15 kV and 10 nA over a 12 micrometer wide scan area. Analysis of magnetite yielded Magnesium (0.03; 0.04 ± 0.01), Aluminum (none given; 0.040 ± 0.006), Titanium (0.10; 0.11 ± 0.02), Vanadium (none given; 0.16 ± 0.01), Chromium (0.17; 0.14 ± 0.02), and Iron (70.71, 71.4 ± 0.2). Analysis of ilmenite yielded Magnesium (0.19; 0.183 ± 0.008), Aluminum (none given; 0.04 ± 0.02), Titanium (27.4, 28.1 ± 0.1), Chromium (none given; 0.04 ± 0.01), Manganese (3.69; 3.73 ± 0.03), Iron (36.18; 35.8 ± 0.1), and Niobium (0.64; 0.68 ± 0.03). The analysis of geologic materials by standards-based quantitative EDS can be further illustrated with chemical analyses of oxides from ocean island basalts representing several locations globally to
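
    The MLLS fitting idea can be sketched as an ordinary least-squares problem: the unknown spectrum is modeled as a linear combination of reference spectra recorded from standards under the same conditions. The synthetic Gaussian "spectra" below are purely illustrative and are not DTSA II output.

```python
# Sketch of multiple linear least squares (MLLS) fitting of an EDS spectrum:
# the unknown spectrum is modelled as a linear combination of reference
# spectra measured from standards under identical conditions.
# Spectra here are synthetic Gaussian peaks, purely for illustration.
import numpy as np

energy = np.linspace(0.0, 10.0, 2048)                  # keV energy axis

def peak(center, height, width=0.07):
    return height * np.exp(-0.5 * ((energy - center) / width) ** 2)

# Reference spectra: Ti (Ka 4.51 keV), Mn (Ka 5.90 keV), Fe (Ka 6.40, Kb 7.06 keV)
ref_ti = peak(4.51, 1.0)
ref_mn = peak(5.90, 1.0)
ref_fe = peak(6.40, 1.0) + peak(7.06, 0.13)

# "Measured" ilmenite-like spectrum: overlapping contributions plus noise
rng = np.random.default_rng(1)
sample = 0.55 * ref_ti + 0.07 * ref_mn + 0.70 * ref_fe + rng.normal(0, 0.005, energy.size)

A = np.column_stack([ref_ti, ref_mn, ref_fe])
coeffs, *_ = np.linalg.lstsq(A, sample, rcond=None)    # MLLS fit coefficients
print("Fitted Ti, Mn, Fe scale factors:", np.round(coeffs, 3))
```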

  1. Improved EPMA Trace Element Accuracy Using a Matrix Iterated Quantitative Blank Correction

    NASA Astrophysics Data System (ADS)

    Donovan, J. J.; Wark, D. A.; Jercinovic, M. J.

    2007-12-01

    contribution from the systematic quantitative offset from a known (usually zero level) blank standard. Preliminary results from this new matrix-iterated trace element blank correction demonstrate that systematic errors can be reduced to single-digit ppm levels in many situations. [1] B.W. Robinson, N.G. Ware and D.G.W. Smith, 1998. "Modern Electron-Microprobe Trace-Element Analysis in Mineralogy". In Cabri, L.J. and Vaughan, D.J., Eds., "Modern Approaches to Ore and Environmental Mineralogy", Short Course 27. Mineralogical Association of Canada, Ottawa, 153-180. [2] Remond, G., Myklebust, R., Fialin, M., Nockolds, C., Phillips, M., Roques-Carmes, C., "Decomposition of Wavelength Dispersive X-ray Spectra", Journal of Research of the National Institute of Standards and Technology (J. Res. Natl. Inst. Stand. Technol.), v. 107, 509-529 (2002). [3] Self, P.G., Norrish, K., Milnes, A.R., Graham, J. & Robinson, B.W. (1990): Holes in the Background in XRS. X-ray Spectrom. 19 (2), 59-61. [4] Wark, D.A., and Watson, E.B., 2006, TitaniQ: A Titanium-in-Quartz geothermometer: Contributions to Mineralogy and Petrology, 152:743-754, doi: 10.1007/s00410-006-0132-308

  2. An x ray scatter approach for non-destructive chemical analysis of low atomic numbered elements

    NASA Technical Reports Server (NTRS)

    Ross, H. Richard

    1993-01-01

    A non-destructive x-ray scatter (XRS) approach has been developed, along with a rapid atomic scatter algorithm, for the detection and analysis of low atomic-numbered elements in solids, powders, and liquids. The present method of energy-dispersive x-ray fluorescence spectroscopy (EDXRF) makes the analysis of light elements (i.e., those lighter than sodium, atomic number less than 11) extremely difficult. Detection and measurement become progressively worse as atomic numbers become smaller due to a competing process called Auger emission, which reduces fluorescent intensity; coupled with the high mass absorption coefficients exhibited by low-energy x-rays, this limits the detection and determination of low atomic-numbered elements by x-ray spectrometry. However, an indirect approach based on the intensity ratio of Compton to Rayleigh scattered radiation has been used to determine light-element components in alloys, plastics and other materials. This XRS technique provides qualitative and quantitative information about the overall constituents of a variety of samples.
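
    A minimal sketch of the scatter-ratio idea: the Compton-to-Rayleigh intensity ratio falls as the effective atomic number rises, so a calibration built from standards of known composition can be inverted for unknowns. All numbers below are invented, and this is not the rapid atomic scatter algorithm of the paper.

```python
# Sketch of a Compton/Rayleigh scatter-ratio calibration for effective
# atomic number. Calibration points and the unknown ratio are invented.
import numpy as np

# Calibration standards: (Compton/Rayleigh ratio, known effective Z)
ratios = np.array([9.8, 7.6, 6.1, 4.9, 3.8])
eff_z  = np.array([5.9, 6.6, 7.4, 8.1, 9.0])

coeffs = np.polyfit(ratios, eff_z, 1)          # simple linear calibration

unknown_ratio = 5.5
z_est = np.polyval(coeffs, unknown_ratio)
print(f"Estimated effective atomic number: {z_est:.2f}")
```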

  3. Non-destructive elemental analysis of vertebral body trabecular bone using muonic X-rays.

    PubMed

    Hosoi, Y; Watanabe, Y; Sugita, R; Tanaka, Y; Nagamine, K; Ono, T; Sakamoto, K

    1995-12-01

    Non-destructive elemental analysis with muonic X-rays was performed on human vertebral bone and lumbar torso phantoms. It can provide quantitative information on all elements in small, deep-seated, localized volumes. The experiment was carried out using the superconducting muon channel at TRIUMF in Vancouver, Canada and a lithium-drifted germanium detector with an active area of 18.5 cm2. The muon channel produced backward-decayed negative muons with a wide kinetic energy range from 0.5 to 54.2 MeV. The muon beam was collimated to a diameter of 18 mm. The number of incoming muons was about 4 x 10(6) to 5 x 10(7) per data point. In the measurements with human vertebral bones fixed with neutralized formaldehyde, the correlation coefficient between calcium content measured by muons and by atomic absorption analysis was 0.99 and the level of significance was 0.0003. In the measurements with lumbar torso phantoms, the correlation coefficient between calcium content measured by muons and by atomic absorption analysis was 0.99 and the level of significance was 0.02. The results suggest that elemental analysis of vertebral body trabecular bone using muonic X-rays closely correlates with measurements by atomic absorption analysis.

  4. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  5. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
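
    A toy sketch of the prioritization idea, assuming simple 1-5 ordinal scales for severity, likelihood, and modeling difficulty; the scenario names and scores are invented, not taken from the paper.

```python
# Sketch of the prioritization idea: rank hazard scenarios by risk
# (severity * likelihood), breaking ties by modeling difficulty, to pick
# the ones most suitable for quantitative analysis. All values invented.
scenarios = [
    # (name, severity 1-5, likelihood 1-5, modeling difficulty 1-5: lower = easier)
    ("wake encounter on parallel approach", 5, 2, 2),
    ("runway incursion",                    5, 1, 4),
    ("missed-approach conflict",            4, 2, 3),
    ("communication breakdown",             3, 3, 5),
]

# High risk and low modeling difficulty float to the top of the list
ranked = sorted(scenarios, key=lambda s: (-(s[1] * s[2]), s[3]))
for name, sev, lik, diff in ranked:
    print(f"{name:38s} risk={sev * lik:2d}  difficulty={diff}")
```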

  6. Quantitative analysis of Si1-xGex alloy films by SIMS and XPS depth profiling using a reference material

    NASA Astrophysics Data System (ADS)

    Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong

    2018-02-01

    Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means to convert intensities to compositions instead of relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, the results obtained with a cesium ion beam were highly quantitative. The quantitative SIMS results using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI and TNC methods.
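
    The intensity-to-composition conversion factor idea can be sketched for a two-component film: factors are derived from a reference film of known composition and then applied to the intensities measured on an unknown film. The intensities and the reference composition below are invented, and the sketch does not reproduce the AI or TNC variants described in the paper.

```python
# Sketch of the intensity-to-composition conversion factor (ICF) idea for a
# two-component Si(1-x)Ge(x) film. All numbers are illustrative placeholders.
ref_intensity = {"Si": 8.2e4, "Ge": 6.1e4}     # e.g. MCs2+ counts from a SIMS depth profile
ref_fraction = {"Si": 0.70, "Ge": 0.30}        # certified composition of the reference film

# ICF maps a measured intensity to a (relative) composition
icf = {el: ref_fraction[el] / ref_intensity[el] for el in ref_fraction}

unknown_intensity = {"Si": 5.9e4, "Ge": 9.4e4}
weighted = {el: icf[el] * unknown_intensity[el] for el in unknown_intensity}
total = sum(weighted.values())
fractions = {el: w / total for el, w in weighted.items()}   # normalize to unity
print("Estimated atomic fractions:", {el: round(f, 3) for el, f in fractions.items()})
```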

  7. On the elemental analysis of different cigarette brands using laser induced breakdown spectroscopy and laser-ablation time of flight mass spectrometry

    NASA Astrophysics Data System (ADS)

    Ahmed, Nasar; Umar, Zeshan A.; Ahmed, Rizwan; Aslam Baig, M.

    2017-10-01

    We present qualitative and quantitative analysis of the trace elements present in different brands of tobacco available in Pakistan using laser-induced breakdown spectroscopy (LIBS) and a laser-ablation time-of-flight mass spectrometer (LA-TOF-MS). The compositional analysis using the calibration-free LIBS technique is based on the emission spectra of the laser-produced plasma plume, whereas the elemental composition analysis using LA-TOF-MS is based on the mass spectra of the ions produced by laser ablation. The optical emission spectra of these samples contain spectral lines of calcium, magnesium, sodium, potassium, silicon, strontium, barium, lithium and aluminum with varying intensities. The corresponding mass spectra yield the concentrations of these elements. The analysis of different brands of cigarettes demonstrates that LIBS coupled with LA-TOF-MS is a powerful technique for the analysis of trace elements in any solid sample.

  8. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  9. [Correlation analysis and evaluation of inorganic elements in Angelica sinensis and its correspondence soil from different regions].

    PubMed

    Yan, Hui; Duan, Jin-ao; Qian, Da-wei; Su, Shu-lan; Song, Bing-sheng; He, Zi-qing

    2011-04-01

    To evaluate the relationship between the inorganic elements and the geo-authenticity and invigorating efficacy of this medicinal material, qualitative and quantitative analyses of the inorganic elements in Angelica sinensis and its corresponding soil were performed. The contents of 14 inorganic elements in 40 samples from 4 main habitats of Angelica sinensis in China were determined by ICP-AES. In Angelica sinensis and its corresponding soil, significant positive correlations existed for each of Ca, Na and Ni. The enrichment coefficient of Mg in Angelica sinensis showed a certain peculiarity. The analysis showed that Zn, Cu, Mn and Mg were more distinctive indicators of the geo-authenticity of Angelica sinensis than the other elements. The results support the traditional view that Mingui is the geo-authentic crude drug. The inorganic elements in Angelica sinensis may indeed be correlated with its geo-authenticity. These results provide a scientific basis for understanding the geo-authentic nature of Angelica sinensis and its active material base.

  10. Failure Behavior Characterization of Mo-Modified Ti Surface by Impact Test and Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Ma, Yong; Qin, Jianfeng; Zhang, Xiangyu; Lin, Naiming; Huang, Xiaobo; Tang, Bin

    2015-07-01

    Using impact tests and finite element simulation, the failure behavior of a Mo-modified layer on pure Ti was investigated. In the impact tests, four loads of 100, 300, 500, and 700 N and 10^4 impacts were adopted. The three-dimensional residual impact dents were examined using an optical microscope (Olympus-DSX500i), indicating that the impact resistance of the Ti surface was improved. Two failure modes, cohesive failure and wear, were elucidated by electron backscatter diffraction and energy-dispersive spectrometry performed in a field-emission scanning electron microscope. Through finite element forward analysis performed at a typical impact load of 300 N, the stress-strain distributions in the Mo-modified Ti were quantitatively determined. In addition, the failure behavior of the Mo-modified layer was determined and an ideal failure model was proposed for high-load impact, based on the experimental and finite element forward analysis results.

  11. Trace element mapping in Parkinsonian brain by quantitative ion beam microscopy

    NASA Astrophysics Data System (ADS)

    Barapatre, Nirav; Morawski, Markus; Butz, Tilman; Reinert, Tilo

    2010-06-01

    The role of iron in the pathogenesis of Parkinson's disease (PD) is a current subject of research in neurochemistry, since an abnormal increase in iron is reported in the substantia nigra (SN) of Parkinsonian patients. A severe loss of the dopamine-containing cells in the SN in PD has also drawn attention to the function of a brownish-black pigment called neuromelanin, which accumulates predominantly in these dopaminergic neurons. Neuromelanin has the ability to chelate metal ions which, in the free state, may cause considerable damage to cells by reacting with their lipid-rich membranes; however, it could also potentiate free radical production if it releases the bound metal ions. The highly sensitive and non-destructive micro-PIXE method is well suited to quantifying and mapping the trace elements in the SN. The accuracy of charge measurement for such microanalysis studies is of utmost importance for quantitative analysis. Since a Faraday cup is usually placed behind the thin biological sample to measure the charge, the primary and secondary electrons knocked out of the sample by the traversing ion beam hamper an exact charge determination. Hence, a new non-interceptive technique was developed for precise charge measurement and for continuous monitoring of the beam current.

  12. Finite element analysis of helicopter structures

    NASA Technical Reports Server (NTRS)

    Rich, M. J.

    1978-01-01

    Application of the finite element analysis is now being expanded to three dimensional analysis of mechanical components. Examples are presented for airframe, mechanical components, and composite structure calculations. Data are detailed on the increase of model size, computer usage, and the effect on reducing stress analysis costs. Future applications for use of finite element analysis for helicopter structures are projected.

  13. Comparison of different types of phacoemulsification tips. I. Quantitative analysis of elemental composition and tip surface microroughness.

    PubMed

    Tsaousis, Konstantinos T; Werner, Liliana; Perez, Jesus Paulo; Li, He J; Reiter, Nicholas; Guan, Jia J; Mamalis, Nick

    2016-09-01

    To evaluate the elemental composition of phacoemulsification tips and their surface roughness at the microscale. John A. Moran Eye Center and Utah Nanofab, College of Engineering, University of Utah, Salt Lake City, Utah, USA. Experimental study. Seven types of phacoemulsification tips were studied. The phaco tips were examined by energy-dispersive x-ray spectroscopy (EDS) and x-ray photoelectron spectroscopy (XPS) for elemental composition. In addition, the roughness of the opening of each tip was assessed by 3-dimensional white-light interferometry. Elemental analysis showed considerable differences in the surface layers between manufacturers. Alcon tips had a thinner oxidized titanium (Ti) layer on their surface. By XPS, vanadium was not detected in the superficial layers of any tip, but only at deeper levels. The surface microroughness analysis showed comparable results for the root-mean-square (RMS) metric. Maximum peak-to-valley distance values varied and appeared to depend on the quality of material processing rather than the material itself. Phacoemulsification tips are made of Ti alloys and showed differences between models, especially regarding the composition of their superficial layers. The roughness of the opening end showed an overall appropriate RMS value of less than 1.0 μm in all cases. The existence of small defective areas highlights the importance of adequate quality control of these critical surgical instruments. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  15. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  16. Quantitative analysis of Al-Si alloy using calibration free laser induced breakdown spectroscopy (CF-LIBS)

    NASA Astrophysics Data System (ADS)

    Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali

    2017-06-01

    The quantitative analysis of a standard aluminum-silicon alloy has been performed using calibration-free laser-induced breakdown spectroscopy (CF-LIBS). The plasma was produced using the fundamental harmonic (1064 nm) of an Nd:YAG laser and the emission spectra were recorded at a 3.5 μs detector gate delay. Qualitative analysis of the emission spectra confirms the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background-subtracted and self-absorption-corrected emission spectra were used to estimate the plasma temperature as 10 100 ± 300 K. The plasma temperature and the self-absorption-corrected emission lines of each element were then used to determine the concentration of each species present in the alloy. The use of corrected emission intensities and an accurate evaluation of the plasma temperature yield reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentrations.

  17. Optimum element density studies for finite-element thermal analysis of hypersonic aircraft structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Olona, Timothy; Muramoto, Kyle M.

    1990-01-01

    Different finite element models previously set up for thermal analysis of the space shuttle orbiter structure are discussed and their shortcomings identified. Element density criteria are established for the finite element thermal modeling of space shuttle orbiter-type large, hypersonic aircraft structures. These criteria are based on rigorous studies of solution accuracy using different finite element models with different element densities set up for one cell of the orbiter wing. Also, a method for optimization of the transient thermal analysis computer central processing unit (CPU) time is discussed. Based on the newly established element density criteria, the orbiter wing midspan segment was modeled to examine thermal analysis solution accuracy and the extent of the CPU time requirements. The results showed that the distributions of the structural temperatures and the thermal stresses obtained from this wing segment model were satisfactory and that the computation CPU time was at an acceptable level. The studies suggest that modeling large, hypersonic aircraft structures with high-density elements for transient thermal analysis is feasible if a CPU optimization technique is used.

  18. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    NASA Astrophysics Data System (ADS)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images. It was done using tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts with a cross-correlation technique to find the visibility of the latent symbols in the color deficiency plates. In this way a quantitative measure is obtained for each diagnostic plate for three color deficiency types - protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows the CIE xyz color coordinates of the pseudoisochromatic plate design elements to be determined and statistical analysis of these data to be performed, in order to compare the color quality of the available color deficiency test books.
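
    A minimal sketch of the two computational steps described above - weighting the spectral stack by a cone sensitivity function and scoring latent-symbol visibility by cross-correlation - is given below. The array shapes, the sensitivity callable and the symbol template are illustrative assumptions, not the authors' data or code.

```python
import numpy as np
from scipy.signal import fftconvolve

def cone_activity(multispectral_cube, wavelengths_nm, cone_sensitivity):
    """Retinal activity chart for one cone class: weight each spectral band by
    the cone sensitivity at its wavelength and sum over wavelength.
    multispectral_cube : array of shape (bands, H, W), e.g. 420-720 nm in 10 nm steps
    cone_sensitivity   : callable mapping wavelength in nm to relative sensitivity."""
    weights = np.array([cone_sensitivity(w) for w in wavelengths_nm])
    return np.tensordot(weights, multispectral_cube, axes=(0, 0))

def latent_symbol_visibility(activity_chart, symbol_template):
    """Visibility score: peak of the normalized cross-correlation between the
    activity chart and a template of the plate's latent symbol."""
    a = (activity_chart - activity_chart.mean()) / activity_chart.std()
    t = (symbol_template - symbol_template.mean()) / symbol_template.std()
    corr = fftconvolve(a, t[::-1, ::-1], mode="same") / t.size
    return float(corr.max())
```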

  19. Finite element analysis (FEA) analysis of the preflex beam

    NASA Astrophysics Data System (ADS)

    Wan, Lijuan; Gao, Qilang

    2017-10-01

    Finite element analysis (FEA) is now a relatively mature technology and one of the most important means of structural analysis. It removes the former need to investigate complex structures purely through large numbers of experiments: numerical simulation of a structure supports a wide range of static and dynamic analyses of its mechanical behavior and makes parametric studies convenient, so that only a limited number of experiments is needed to verify the simulation model. In this work, the nonlinear finite element method is used to simulate the flexural behavior of prestressed composite beams with corrugated steel webs, and the finite element analysis is used to understand the mechanical properties of the structure under bending load.

  20. The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions

    NASA Astrophysics Data System (ADS)

    Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.

    2018-04-01

    The gas cluster ion beam (GCIB) technique was used to smooth a silicon carbide crystal surface. The effects of processing with two inert cluster-ion species, argon and xenon, were compared quantitatively. While argon is the standard working gas for GCIB, results for xenon clusters had not previously been reported. Scanning probe microscopy and high-resolution transmission electron microscopy were used to analyze the surface roughness and the quality of the surface crystal layer. Gas cluster ion beam processing smooths the surface relief down to an average roughness of about 1 nm for both species. Xenon proved to be the more effective working gas: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High-resolution transmission electron microscopy analysis of the surface defect layer gives thicknesses of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.

  1. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
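
    As an illustration of the map-comparison idea, the sketch below defines one possible "output" function (the distribution of column-density values) and an L1 distance between such functions. This is a generic example of a pseudometric on maps under the assumption of maps normalized to the unit range, not the specific output functions used by the authors.

```python
import numpy as np

def output_function(column_density, bins=64):
    """One possible 'output' function: the cumulative distribution of column
    density values in a map (values assumed normalized to the range 0-1)."""
    hist, _ = np.histogram(column_density, bins=bins, range=(0.0, 1.0), density=True)
    return np.cumsum(hist) / hist.sum()

def map_distance(map_a, map_b, bins=64):
    """L1 distance between the output functions of two maps. Two different maps
    with identical density distributions have zero distance, which is exactly
    what makes this a pseudometric rather than a metric."""
    return float(np.abs(output_function(map_a, bins) - output_function(map_b, bins)).mean())
```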

  2. Energy dispersive X-ray fluorescence spectrometry for the direct multi-element analysis of dried blood spots

    NASA Astrophysics Data System (ADS)

    Marguí, E.; Queralt, I.; García-Ruiz, E.; García-González, E.; Rello, L.; Resano, M.

    2018-01-01

    Home-based collection protocols for clinical specimens are actively pursued as a means of improving patients' quality of life. In this sense, dried blood spots (DBS) are proposed as a non-invasive and even self-administered alternative to sampling whole venous blood. This contribution explores the potential of energy dispersive X-ray fluorescence spectrometry for the simultaneous and direct determination of some major (S, Cl, K, Na), minor (P, Fe) and trace (Ca, Cu, Zn) elements in blood, after its deposition onto clinical filter papers, thus giving rise to DBS. For quantification purposes the best strategy was to use matrix-matched blood samples of known analyte concentrations. The accuracy and precision of the method were evaluated by analysis of a blood reference material (Seronorm™ trace elements whole blood L3). Quantitative results were obtained for the determination of P, S, Cl, K and Fe, and limits of detection for these elements were adequate, taking into account their typical concentrations in real blood samples. Determination of Na, Ca, Cu and Zn was hampered by high blank levels arising from the sample support (Na, Ca) and from the instrument (Cu, Zn). Therefore, the quantitative determination of these elements at the levels expected in blood samples was not feasible. The methodology developed was applied to the analysis of several blood samples and the results obtained were compared with those reported by standard techniques. Overall, the performance of the method developed is promising and it could be used to determine the aforementioned elements in blood samples in a simple, fast and economic way. Furthermore, its non-destructive nature enables further analyses by means of complementary techniques to be carried out.

  3. Quantitative statistical analysis of cis-regulatory sequences in ABA/VP1- and CBF/DREB1-regulated genes of Arabidopsis.

    PubMed

    Suzuki, Masaharu; Ketterling, Matthew G; McCarty, Donald R

    2005-09-01

    We have developed a simple quantitative computational approach for objective analysis of cis-regulatory sequences in promoters of coregulated genes. The program, designated MotifFinder, identifies oligo sequences that are overrepresented in promoters of coregulated genes. We used this approach to analyze promoter sequences of Viviparous1 (VP1)/abscisic acid (ABA)-regulated genes and cold-regulated genes, respectively, of Arabidopsis (Arabidopsis thaliana). We detected significantly enriched sequences in up-regulated genes but not in down-regulated genes. This result suggests that gene activation but not repression is mediated by specific and common sequence elements in promoters. The enriched motifs include several known cis-regulatory sequences as well as previously unidentified motifs. With respect to known cis-elements, we dissected the flanking nucleotides of the core sequences of Sph element, ABA response elements (ABREs), and the C repeat/dehydration-responsive element. This analysis identified the motif variants that may correlate with qualitative and quantitative differences in gene expression. While both VP1 and cold responses are mediated in part by ABA signaling via ABREs, these responses correlate with unique ABRE variants distinguished by nucleotides flanking the ACGT core. ABRE and Sph motifs are tightly associated uniquely in the coregulated set of genes showing a strict dependence on VP1 and ABA signaling. Finally, analysis of distribution of the enriched sequences revealed a striking concentration of enriched motifs in a proximal 200-base region of VP1/ABA and cold-regulated promoters. Overall, each class of coregulated genes possesses a discrete set of the enriched motifs with unique distributions in their promoters that may account for the specificity of gene regulation.
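
    The core computational idea, testing whether short oligomers are over-represented in the promoters of coregulated genes relative to a background set, can be sketched as below. This is a generic presence/absence enrichment test, not the MotifFinder program itself; the promoter sequence sets are assumed inputs.

```python
from itertools import product
from scipy.stats import fisher_exact

def promoters_with_motif(promoters, motif):
    """Number of promoter sequences containing at least one copy of `motif`."""
    return sum(motif in seq for seq in promoters)

def enriched_kmers(coreg, background, k=6, max_p=1e-3):
    """k-mers over-represented in coregulated promoters relative to a background
    promoter set, scored with a one-sided Fisher exact test on presence/absence."""
    hits = []
    for kmer in ("".join(p) for p in product("ACGT", repeat=k)):
        a = promoters_with_motif(coreg, kmer)
        b = len(coreg) - a
        c = promoters_with_motif(background, kmer)
        d = len(background) - c
        _, p = fisher_exact([[a, b], [c, d]], alternative="greater")
        if p < max_p:
            hits.append((kmer, a, c, p))
    return sorted(hits, key=lambda hit: hit[-1])
```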

  4. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
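
    As a simple illustration of the external-calibration approach mentioned above, the sketch below fits an analyte/internal-standard intensity ratio against standard concentrations and back-calculates a sample concentration. The element choice, count rates and concentrations are hypothetical, not taken from the review.

```python
import numpy as np

def fit_calibration(conc_std, analyte_cps, istd_cps):
    """External calibration with internal-standard correction: linear fit of the
    analyte/internal-standard intensity ratio against standard concentration."""
    ratio = np.asarray(analyte_cps) / np.asarray(istd_cps)
    slope, intercept = np.polyfit(conc_std, ratio, 1)
    return slope, intercept

def quantify(analyte_cps, istd_cps, slope, intercept):
    """Back-calculate a sample concentration from the fitted calibration line."""
    return (analyte_cps / istd_cps - intercept) / slope

# Hypothetical Cu standards (ug/L) ratioed to an In internal standard
conc = [0.0, 1.0, 5.0, 10.0, 50.0]
cu_cps = [120.0, 5300.0, 26100.0, 52400.0, 260900.0]
in_cps = [50000.0, 49800.0, 50200.0, 50100.0, 49900.0]
m, b = fit_calibration(conc, cu_cps, in_cps)
print(f"sample concentration ~ {quantify(31000.0, 50050.0, m, b):.2f} ug/L")
```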

  5. P Element Transposition Contributes Substantial New Variation for a Quantitative Trait in Drosophila Melanogaster

    PubMed Central

    Torkamanzehi, A.; Moran, C.; Nicholas, F. W.

    1992-01-01

    The P-M system of transposition in Drosophila melanogaster is a powerful mutator for many visible and lethal loci. Experiments using crosses between unrelated P and M stocks to assess the importance of transposition-mediated mutations affecting quantitative loci and response to selection have yielded unrepeatable or ambiguous results. In a different approach, we have used a P stock produced by microinjection of the ry(506) M stock. Selection responses were compared between transposition lines that were initiated by crossing M strain females with males from the ``co-isogenic'' P strain, and ry(506) M control lines. Unlike previous attempts to quantify the effects of P element transposition, there is no possibility of P transposition in the controls. During 10 generations of selection for the quantitative trait abdominal bristle number, none of the four control lines showed any response to selection, indicative of isogenicity for those loci affecting abdominal bristle number. In contrast, three of the four transposition lines showed substantial response, with regression of cumulative response on cumulative selection differential ranging from 15% to 25%. Transposition of P elements has produced new additive genetic variance at a rate which is more than 30 times greater than the rate expected from spontaneous mutation. PMID:1317317
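
    The regression of cumulative response on cumulative selection differential quoted above is the standard estimator of realized heritability. A minimal sketch follows, with made-up ten-generation numbers: the 0.2 slope and the per-generation selection differential are illustrative, not the paper's data.

```python
import numpy as np

def realized_heritability(cum_selection_diff, cum_response):
    """Realized heritability: slope of the regression of cumulative selection
    response on cumulative selection differential."""
    slope, _intercept = np.polyfit(cum_selection_diff, cum_response, 1)
    return slope

# Made-up 10-generation example: constant selection differential of 2 bristles
# per generation and a true slope of 0.2 plus noise (not the paper's data).
cum_S = np.cumsum(np.full(10, 2.0))
cum_R = 0.2 * cum_S + np.random.default_rng(0).normal(0.0, 0.3, 10)
print(f"realized h^2 ~ {realized_heritability(cum_S, cum_R):.2f}")
```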

  6. Elements and elasmobranchs: hypotheses, assumptions and limitations of elemental analysis.

    PubMed

    McMillan, M N; Izzo, C; Wade, B; Gillanders, B M

    2017-02-01

    Quantifying the elemental composition of elasmobranch calcified cartilage (hard parts) has the potential to answer a range of ecological and biological questions, at both the individual and population level. Few studies, however, have employed elemental analyses of elasmobranch hard parts. This paper provides an overview of the range of applications of elemental analysis in elasmobranchs, discussing the assumptions and potential limitations in cartilaginous fishes. It also reviews the available information on biotic and abiotic factors influencing patterns of elemental incorporation into hard parts of elasmobranchs and provides some comparative elemental assays and mapping in an attempt to fill knowledge gaps. Directions for future experimental research are highlighted to better understand fundamental elemental dynamics in elasmobranch hard parts. © 2016 The Fisheries Society of the British Isles.

  7. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    A molecular analysis has three major roles in modern oncopathology--as an aid in the differential diagnosis, in molecular monitoring of diseases, and in estimation of the potential prognosis. In this report we review the application of the molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Cyclin D1 quantitative monitoring is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the disease development and it predicts the clinical course. We employed the molecular analysis for a precise quantitative detection of proliferation markers, Ki-67, topoisomerase IIalpha, and TPX2, that are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way which is an essential prerequisite for using the proliferation activity as a routine clinical tool. Comparing with immunophenotyping we may conclude that the quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections quantitative PCR is less technically demanding and less time-consuming and furthermore it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology which provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.
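
    Quantitative PCR monitoring of a marker such as cyclin D1 mRNA against an earlier bone-marrow specimen is commonly reported as a relative expression level. The 2^-ddCt scheme below is one widely used formulation shown purely as an illustration, not the exact normalization used in this study, and the Ct values are hypothetical.

```python
def relative_expression(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    """2^-ddCt relative quantification: the target transcript is normalized to a
    reference gene and expressed relative to a calibrator sample (assumes
    approximately 100% PCR efficiency for both assays)."""
    delta_ct_sample = ct_target - ct_reference
    delta_ct_calibrator = ct_target_cal - ct_reference_cal
    return 2.0 ** -(delta_ct_sample - delta_ct_calibrator)

# Hypothetical follow-up versus diagnostic bone-marrow sample for cyclin D1
print(relative_expression(ct_target=26.1, ct_reference=18.0,
                          ct_target_cal=22.4, ct_reference_cal=18.2))
```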

  8. A New Material Mapping Procedure for Quantitative Computed Tomography-Based, Continuum Finite Element Analyses of the Vertebra

    PubMed Central

    Unnikrishnan, Ginu U.; Morgan, Elise F.

    2011-01-01

    Inaccuracies in the estimation of material properties and errors in the assignment of these properties into finite element models limit the reliability, accuracy, and precision of quantitative computed tomography (QCT)-based finite element analyses of the vertebra. In this work, a new mesh-independent, material mapping procedure was developed to improve the quality of predictions of vertebral mechanical behavior from QCT-based finite element models. In this procedure, an intermediate step, called the material block model, was introduced to determine the distribution of material properties based on bone mineral density, and these properties were then mapped onto the finite element mesh. A sensitivity study was first conducted on a calibration phantom to understand the influence of the size of the material blocks on the computed bone mineral density. It was observed that varying the material block size produced only marginal changes in the predictions of mineral density. Finite element (FE) analyses were then conducted on a square column-shaped region of the vertebra and also on the entire vertebra in order to study the effect of material block size on the FE-derived outcomes. The predicted values of stiffness for the column and the vertebra decreased with decreasing block size. When these results were compared to those of a mesh convergence analysis, it was found that the influence of element size on vertebral stiffness was less than that of the material block size. This mapping procedure allows the material properties in a finite element study to be determined based on the block size required for an accurate representation of the material field, while the size of the finite elements can be selected independently and based on the required numerical accuracy of the finite element solution. The mesh-independent, material mapping procedure developed in this study could be particularly helpful in improving the accuracy of finite element analyses of
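
    A simplified sketch of the block-based mapping idea follows: voxel BMD values are averaged into material blocks, a density-modulus relation converts block density to an elastic modulus, and each finite element picks up the modulus of the block containing its centroid. The power-law coefficients and array layouts are placeholder assumptions, not the calibration or implementation used in the paper.

```python
import numpy as np

def block_average_density(voxel_bmd, voxel_size, block_size):
    """Average a 3-D voxel BMD array (g/cm^3) into cubic material blocks of edge
    length block_size; this block grid is the intermediate 'material block model'."""
    n = int(round(block_size / voxel_size))
    nx, ny, nz = (s // n for s in voxel_bmd.shape)
    trimmed = voxel_bmd[:nx * n, :ny * n, :nz * n]
    return trimmed.reshape(nx, n, ny, n, nz, n).mean(axis=(1, 3, 5))

def modulus_from_density(rho, a=8920.0, b=1.83):
    """Generic density-modulus power law E = a * rho^b in MPa; the coefficients
    are placeholders, not the calibration used in the study."""
    return a * np.power(rho, b)

def map_blocks_to_elements(element_centroids, block_density, block_size, origin):
    """Assign each finite element the modulus of the material block that
    contains its centroid (element_centroids: (N, 3) array, origin: (3,) array)."""
    idx = np.floor((element_centroids - origin) / block_size).astype(int)
    idx = np.clip(idx, 0, np.array(block_density.shape) - 1)
    rho = block_density[idx[:, 0], idx[:, 1], idx[:, 2]]
    return modulus_from_density(rho)
```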

  9. Trace elemental analysis of school chalk using energy dispersive X-ray florescence spectroscopy (ED-XRF)

    NASA Astrophysics Data System (ADS)

    Maruthi, Y. A.; Das, N. Lakshmana; Ramprasad, S.; Ram, S. S.; Sudarshan, M.

    2015-08-01

    The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si) and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in another set of unused chalk samples collected from the local market (CD-2) using energy dispersive X-ray fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirmed that it is an irritant and an occupational hazard. It is suggested that protective equipment, such as filtered masks for the mouth and nose and chalk holders, be used. The study also suggests using alternatives such as digital boards, marker boards and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.

  10. Elemental analysis with external-beam PIXE

    NASA Astrophysics Data System (ADS)

    Lin, E. K.; Wang, C. W.; Teng, P. K.; Huang, Y. M.; Chen, C. Y.

    1992-05-01

    A beamline system and experimental setup have been established for elemental analysis using PIXE with an external beam. Experiments to study the elemental composition of ancient Chinese potsherds (from the Min and Ching ages) were performed. Continuum X-ray spectra from the samples bombarded by 3 MeV protons were measured with a Si(Li) detector. From the analysis of the PIXE data, the concentrations of the main elements (Al, Si, K, and Ca) and of more than ten trace elements in the matrices and glazed surfaces were determined. Results for two different potsherds are presented, and those obtained from the glaze colorants are compared with the results of measurements on a Ching blue-and-white porcelain vase.

  11. Analysis of Rare Earth Elements in Uranium Using Handheld Laser-Induced Breakdown Spectroscopy (HH LIBS)

    DOE PAGES

    Manard, Benjamin T.; Wylie, E. Miller; Willson, Stephen P.

    2018-05-22

    In this paper, a portable handheld laser-induced breakdown spectroscopy (HH LIBS) instrument was evaluated as a rapid method to qualitatively analyze rare earth elements in a uranium oxide matrix. This research is motivated by the need for development of a method to perform rapid, at-line chemical analysis in a nuclear facility, particularly to provide a rapid first pass analysis to determine if additional actions or measurements are warranted. This will result in the minimization of handling and transport of radiological and nuclear material and subsequent exposure to their associated hazards. In this work, rare earth elements (Eu, Nd, and Yb) were quantitatively spiked into a uranium oxide powder and analyzed by the HH LIBS instrumentation. This method demonstrates the ability to rapidly identify elemental constituents in sub-percent levels in a uranium matrix. Preliminary limits of detection (LODs) were determined with values on the order of hundredths of a percent. Validity of this methodology was explored by employing a National Institute of Standards and Technology (NIST) standard reference materials (SRM) 610 and 612 (Trace Elements in Glass). Finally, it was determined that the HH LIBS method was able to clearly discern the rare earths elements of interest in the glass or uranium matrices.

  12. Analysis of Rare Earth Elements in Uranium Using Handheld Laser-Induced Breakdown Spectroscopy (HH LIBS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manard, Benjamin T.; Wylie, E. Miller; Willson, Stephen P.

    In this paper, a portable handheld laser-induced breakdown spectroscopy (HH LIBS) instrument was evaluated as a rapid method to qualitatively analyze rare earth elements in a uranium oxide matrix. This research is motivated by the need for development of a method to perform rapid, at-line chemical analysis in a nuclear facility, particularly to provide a rapid first pass analysis to determine if additional actions or measurements are warranted. This will result in the minimization of handling and transport of radiological and nuclear material and subsequent exposure to their associated hazards. In this work, rare earth elements (Eu, Nd, and Yb) were quantitatively spiked into a uranium oxide powder and analyzed by the HH LIBS instrumentation. This method demonstrates the ability to rapidly identify elemental constituents in sub-percent levels in a uranium matrix. Preliminary limits of detection (LODs) were determined with values on the order of hundredths of a percent. Validity of this methodology was explored by employing a National Institute of Standards and Technology (NIST) standard reference materials (SRM) 610 and 612 (Trace Elements in Glass). Finally, it was determined that the HH LIBS method was able to clearly discern the rare earths elements of interest in the glass or uranium matrices.

  13. Analysis of Rare Earth Elements in Uranium Using Handheld Laser-Induced Breakdown Spectroscopy (HH LIBS).

    PubMed

    Manard, Benjamin T; Wylie, E Miller; Willson, Stephen P

    2018-01-01

    A portable handheld laser-induced breakdown spectroscopy (HH LIBS) instrument was evaluated as a rapid method to qualitatively analyze rare earth elements in a uranium oxide matrix. This research is motivated by the need for development of a method to perform rapid, at-line chemical analysis in a nuclear facility, particularly to provide a rapid first pass analysis to determine if additional actions or measurements are warranted. This will result in the minimization of handling and transport of radiological and nuclear material and subsequent exposure to their associated hazards. In this work, rare earth elements (Eu, Nd, and Yb) were quantitatively spiked into a uranium oxide powder and analyzed by the HH LIBS instrumentation. This method demonstrates the ability to rapidly identify elemental constituents in sub-percent levels in a uranium matrix. Preliminary limits of detection (LODs) were determined with values on the order of hundredths of a percent. Validity of this methodology was explored by employing a National Institute of Standards and Technology (NIST) standard reference materials (SRM) 610 and 612 (Trace Elements in Glass). It was determined that the HH LIBS method was able to clearly discern the rare earths elements of interest in the glass or uranium matrices.

  14. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  15. Quantitative elemental analysis of an industrial mineral talc, using accelerator-based analytical technique

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.

    2005-10-01

    Accelerator-based technique of PIXE was employed for the determination of the elemental concentration of an industrial mineral, talc. Talc is a very versatile mineral in industries with several applications. Due to this, there is a need to know its constituents to ensure that the workers are not exposed to health risks. Besides, microscopic tests on some talc samples in Nigeria confirm that they fall within the BP British Pharmacopoeia standard for tablet formation. However, for these samples to become a local source of raw material for pharmaceutical grade talc, the precise elemental compositions should be established which is the focus of this work. Proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results which show the concentration of different elements in the talc samples, their health implications and metabolic roles are presented and discussed.

  16. Generic element processor (application to nonlinear analysis)

    NASA Technical Reports Server (NTRS)

    Stanley, Gary

    1989-01-01

    The focus here is on one aspect of the Computational Structural Mechanics (CSM) Testbed: finite element technology. The approach involves a Generic Element Processor: a command-driven, database-oriented software shell that facilitates introduction of new elements into the testbed. This shell features an element-independent corotational capability that upgrades linear elements to geometrically nonlinear analysis, and corrects the rigid-body errors that plague many contemporary plate and shell elements. Specific elements that have been implemented in the Testbed via this mechanism include the Assumed Natural-Coordinate Strain (ANS) shell elements, developed with Professor K. C. Park (University of Colorado, Boulder), a new class of curved hybrid shell elements, developed by Dr. David Kang of LPARL (formerly a student of Professor T. Pian), other shell and solid hybrid elements developed by NASA personnel, and recently a repackaged version of the workhorse shell element used in the traditional STAGS nonlinear shell analysis code. The presentation covers: (1) user and developer interfaces to the generic element processor, (2) an explanation of the built-in corotational option, (3) a description of some of the shell-elements currently implemented, and (4) application to sample nonlinear shell postbuckling problems.

  17. Mapping and validation of quantitative trait loci associated with concentrations of 16 elements in unmilled rice grain

    USDA-ARS?s Scientific Manuscript database

    In this study, quantitative trait loci (QTLs) affecting the concentrations of 16 elements in whole, unmilled rice (Oryza sativa L.) grain were identified. Two rice mapping populations, the ‘Lemont’ x ‘TeQing’ recombinant inbred lines (LT-RILs), and the TeQing-into-Lemont backcross introgression lin...

  18. [Quantitative data analysis for live imaging of bone].

    PubMed

    Seno, Shigeto

    Because bone is a hard tissue, it has long been difficult to observe the interior of living bone tissue. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the activities of the various cells that make up bone tissue. At the same time, the growing quantity of data and the increasing diversity and complexity of the images make quantitative analysis by visual inspection difficult, and methodologies for microscopic image processing and data analysis have therefore been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline the basic image processing technology for quantitative analysis of live imaging data of bone.

  19. Studies of finite element analysis of composite material structures

    NASA Technical Reports Server (NTRS)

    Douglas, D. O.; Holzmacher, D. E.; Lane, Z. C.; Thornton, E. A.

    1975-01-01

    Research in the area of finite element analysis is summarized. Topics discussed include finite element analysis of a picture frame shear test, BANSAP (a bandwidth reduction program for SAP IV), FEMESH (a finite element mesh generation program based on isoparametric zones), and finite element analysis of a composite bolted joint specimens.

  20. Quantitative trait nucleotide analysis using Bayesian model selection.

    PubMed

    Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D

    2005-10-01

    Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
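
    As a rough illustration of the model-selection idea, and not the BQTN machinery implemented in SOLAR, the sketch below scores each variant by an approximate posterior probability of effect derived from the BIC difference between a variant model and an intercept-only model, assuming equal prior odds and unrelated individuals. The simulated genotype matrix and effect size are invented for the example.

```python
import numpy as np

def bic(y, X):
    """BIC of an ordinary least-squares fit of y on the design matrix X."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + X.shape[1] * np.log(n)

def posterior_effect_probabilities(y, genotypes):
    """Approximate posterior probability that each variant affects the trait,
    comparing 'intercept + variant' against 'intercept only' through the BIC
    approximation to the Bayes factor, with equal prior odds."""
    n = len(y)
    intercept = np.ones((n, 1))
    bic_null = bic(y, intercept)
    probs = []
    for g in genotypes.T:                        # genotypes: (n, n_variants), coded 0/1/2
        bic_alt = bic(y, np.column_stack([intercept, g]))
        bf = np.exp((bic_null - bic_alt) / 2.0)  # approximate Bayes factor, effect vs none
        probs.append(bf / (1.0 + bf))
    return np.array(probs)

# Simulated check: variant 2 carries a true effect
rng = np.random.default_rng(1)
G = rng.integers(0, 3, size=(200, 5)).astype(float)
y = 0.5 * G[:, 2] + rng.normal(size=200)
print(posterior_effect_probabilities(y, G).round(3))
```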

  1. X-ray fluorescence analysis of K, Al and trace elements in chloroaluminate melts

    NASA Astrophysics Data System (ADS)

    Shibitko, A. O.; Abramov, A. V.; Denisov, E. I.; Lisienko, D. G.; Rebrin, O. I.; Bunkov, G. M.; Rychkov, V. N.

    2017-09-01

    Energy dispersive x-ray fluorescence spectrometry was applied to the quantitative determination of K, Al, Cr, Fe and Ni in chloroaluminate melts. To implement the external standard calibration method, an unconventional sample preparation procedure was suggested. A mixture of metal chlorides was melted in a quartz cell at 350-450 °C under a slightly excessive pressure of purified argon (99.999%). The composition of the calibration samples (CSs) prepared was controlled by means of inductively coupled plasma atomic emission spectrometry (ICP-AES). The optimal conditions for excitation of the analytical lines were determined, and calibration curves for the analytes were obtained. Some influence of matrix effects in the synthesized samples on the analytical signal of certain elements was observed. The CSs are to be stored in an inert gas atmosphere. The precision, accuracy and reproducibility of the quantitative chemical analysis were assessed.

  2. Trace elemental analysis of school chalk using energy dispersive X-ray florescence spectroscopy (ED-XRF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maruthi, Y. A., E-mail: ymjournal2014@gmail.com; Das, N. Lakshmana, E-mail: nldas9@gmail.com; Ramprasad, S., E-mail: ramprasadsurakala@gmail.com

    The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si) and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in another set of unused chalk samples collected from the local market (CD-2) using energy dispersive X-ray fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirmed that it is an irritant and an occupational hazard. It is suggested that protective equipment, such as filtered masks for the mouth and nose and chalk holders, be used. The study also suggests using alternatives such as digital boards, marker boards and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.

  3. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
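
    A compact sketch of the two morphology metrics named above, box-counting fractal dimension and gliding-box lacunarity, applied to a binary collagen mask is given below. The box sizes and the binary-mask input are generic assumptions rather than the study's imaging pipeline.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def fractal_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Box-counting fractal dimension of a non-empty binary image (True = collagen):
    the slope of log(occupied box count) versus log(1 / box size)."""
    counts = []
    for s in box_sizes:
        nx, ny = mask.shape[0] // s, mask.shape[1] // s
        blocks = mask[:nx * s, :ny * s].reshape(nx, s, ny, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

def lacunarity(mask, box=8):
    """Gliding-box lacunarity: second moment of box mass over squared first moment."""
    masses = sliding_window_view(mask.astype(float), (box, box)).sum(axis=(-2, -1)).ravel()
    return masses.var() / masses.mean() ** 2 + 1.0
```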

  4. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.

  5. Finite element analysis of osteoporosis models based on synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of an aging society, China, like the rest of the world, faces an increasing population of osteoporosis patients. Synchrotron radiation has recently become an essential tool for biomedical investigation owing to its high resolution and high stability. In order to study the characteristic changes occurring in different stages of primary osteoporosis, this research examined different periods of osteoporosis in rats using synchrotron radiation. Bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models, and the changes in bone tissue across the different periods were compared quantitatively. Histomorphometry showed that the trabecular structure degraded as the bone volume decreased. For the femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%, leading to an increase in trabecular separation (from 45.05 μm to 97.09 μm) and a reduction in trabecular number (from 7.99 mm-1 to 5.97 mm-1). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, as osteoporosis progressed, the bones' resistance to compression, bending and torsion gradually weakened. The compression stiffness of the femurs decreased from 1770.96 F μm-1 to 697.41 F μm-1, while the bending and torsion stiffness decreased from 1390.80 F μm-1 to 566.11 F μm-1 and from 2957.28 N·m/° to 691.31 N·m/°, respectively, indicating a loss of bone strength consistent with the histomorphometry results. This study suggests that FEA based on synchrotron radiation imaging, combined with histomorphometry analysis, is an excellent approach for analysing bone strength.

  6. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  7. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

    Two problems must be solved if the finite element method is to become a reliable and affordable blackbox engineering tool. Finite element meshes must be generated automatically from computer aided design databases and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  8. Quantitation of absorbed or deposited materials on a substrate that measures energy deposition

    DOEpatents

    Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham

    2005-01-18

    This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the mass of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.

  9. Two-Dimensional Nonlinear Finite Element Analysis of CMC Microstructures

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Goldberg, Robert K.; Bonacuse, Peter J.

    2012-01-01

    A research program has been developed to quantify the effects of the microstructure of a woven ceramic matrix composite and its variability on the effective properties and response of the material. In order to characterize and quantify the variations in the microstructure of a five harness satin weave, chemical vapor infiltrated (CVI) SiC/SiC composite material, specimens were serially sectioned and polished to capture images that detailed the fiber tows, matrix, and porosity. Open source quantitative image analysis tools were then used to isolate the constituents, from which two dimensional finite element models were generated which approximated the actual specimen section geometry. A simplified elastic-plastic model, wherein all stress above yield is redistributed to lower stress regions, is used to approximate the progressive damage behavior for each of the composite constituents. Finite element analyses under in-plane tensile loading were performed to examine how the variability in the local microstructure affected the macroscopic stress-strain response of the material as well as the local initiation and progression of damage. The macroscopic stress-strain response appeared to be minimally affected by the variation in local microstructure, but the locations where damage initiated and propagated appeared to be linked to specific aspects of the local microstructure.

  10. Finite element modeling and analysis of tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.

    1983-01-01

    Predicting the response of tires under various loading conditions using finite element technology is addressed. Some of the recent advances in finite element technology which have high potential for application to tire modeling problems are reviewed. The analysis and modeling needs for tires are identified. Reduction methods for large-scale nonlinear analysis, with particular emphasis on treatment of combined loads, displacement-dependent and nonconservative loadings; development of simple and efficient mixed finite element models for shell analysis, identification of equivalent mixed and purely displacement models, and determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation are included.

  11. Integrated transient thermal-structural finite element analysis

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Dechaumphai, P.; Wieting, A. R.; Tamma, K. K.

    1981-01-01

    An integrated thermal-structural finite element approach for efficient coupling of transient thermal and structural analysis is presented. Integrated thermal-structural rod elements and one-dimensional axisymmetric elements accounting for conduction and convection are developed and used in transient thermal-structural applications. The improved accuracy of the integrated approach is illustrated by comparisons with exact transient heat-conduction and elasticity solutions and with conventional, separately performed thermal and structural finite element analyses.

  12. A Quantitative Analysis Method Of Trabecular Pattern In A Bone

    NASA Astrophysics Data System (ADS)

    Idesawa, Masanor; Yatagai, Toyohiko

    1982-11-01

    The orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and diseases of bone appear as changes in the orientation and/or density distribution of its trabecular patterns. These have so far been treated only from a qualitative point of view, because a quantitative analysis method has not been established. In this paper, the authors propose and investigate quantitative methods for analysing the density and orientation of trabecular patterns observed in a bone. These methods provide an index for quantitatively evaluating the orientation of a trabecular pattern; they have been applied to the analysis of trabecular patterns observed in a femoral head, and their usefulness is confirmed. Key Words: Index of pattern orientation, Trabecular pattern, Pattern density, Quantitative analysis

  13. Global quantitative analysis of phosphorylation underlying phencyclidine signaling and sensorimotor gating in the prefrontal cortex.

    PubMed

    McClatchy, D B; Savas, J N; Martínez-Bartolomé, S; Park, S K; Maher, P; Powell, S B; Yates, J R

    2016-02-01

    Prepulse inhibition (PPI) is an example of sensorimotor gating and deficits in PPI have been demonstrated in schizophrenia patients. Phencyclidine (PCP) suppression of PPI in animals has been studied to elucidate the pathological elements of schizophrenia. However, the molecular mechanisms underlying PCP treatment or PPI in the brain are still poorly understood. In this study, quantitative phosphoproteomic analysis was performed on the prefrontal cortex from rats that were subjected to PPI after being systemically injected with PCP or saline. PCP downregulated phosphorylation events were significantly enriched in proteins associated with long-term potentiation (LTP). Importantly, this data set identifies functionally novel phosphorylation sites on known LTP-associated signaling molecules. In addition, mutagenesis of a significantly altered phosphorylation site on xCT (SLC7A11), the light chain of system xc-, the cystine/glutamate antiporter, suggests that PCP also regulates the activity of this protein. Finally, new insights were also derived on PPI signaling independent of PCP treatment. This is the first quantitative phosphorylation proteomic analysis providing new molecular insights into sensorimotor gating.

  14. Analysis of concrete beams using applied element method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, as in FEM, the structure is analysed by dividing it into several elements; however, in AEM the elements are connected by springs instead of by nodes as in FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate its application, AEM is used to analyse a plain concrete beam with fixed supports, with the spring stiffness defined as sketched below. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
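
    The springs connecting adjacent elements are usually assigned normal and shear stiffnesses from the material moduli and the tributary area each spring represents. The sketch below uses the commonly quoted K_n = E d t / a and K_s = G d t / a relations with hypothetical concrete values, as an illustration rather than a reproduction of the paper's formulation.

```python
def aem_spring_stiffness(E, G, d, t, a):
    """Normal and shear stiffness of one connecting spring pair in AEM.
    E, G : Young's and shear moduli of the material (MPa)
    d    : tributary width of the spring along the element face (mm)
    t    : element thickness out of plane (mm)
    a    : length of the representative area, i.e. distance between element centroids (mm)
    Returns (k_normal, k_shear) in N/mm, following K_n = E d t / a, K_s = G d t / a."""
    return E * d * t / a, G * d * t / a

# Hypothetical concrete element pair: E = 30 GPa, G ~ 12.5 GPa, 10 mm spring spacing,
# 200 mm thickness, 100 mm between element centroids (illustrative values only).
k_n, k_s = aem_spring_stiffness(E=30e3, G=12.5e3, d=10.0, t=200.0, a=100.0)
print(k_n, k_s)
```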

  15. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?. We hope that our guide to good practices for conducting and presenting bias analyses will encourage
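
    For readers unfamiliar with what a simple quantitative bias analysis looks like in practice, the sketch below back-corrects a 2x2 case-control table for non-differential exposure misclassification under assumed sensitivity and specificity. The counts and bias parameters are invented for illustration, and the correction is a textbook one rather than a method introduced by this paper.

```python
def correct_misclassification(a_obs, b_obs, se, sp):
    """Back-calculate true exposed/unexposed counts in one study arm from
    observed counts, given assumed sensitivity and specificity of exposure
    classification (simple non-differential misclassification model)."""
    n = a_obs + b_obs
    a_true = (a_obs - (1.0 - sp) * n) / (se + sp - 1.0)
    return a_true, n - a_true

def bias_adjusted_odds_ratio(cases, controls, se, sp):
    """Odds ratio recomputed from the bias-corrected cell counts."""
    a, b = correct_misclassification(*cases, se=se, sp=sp)
    c, d = correct_misclassification(*controls, se=se, sp=sp)
    return (a * d) / (b * c)

# Invented example: (exposed, unexposed) counts among cases and controls
print(bias_adjusted_odds_ratio(cases=(45, 55), controls=(25, 75), se=0.90, sp=0.95))
```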

  16. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations help to identify current gaps and future needs in vulnerability assessment, including the estimation of uncertainty propagation, the transferability of methods and the development of visualization tools, and also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  17. Evaluation of different strategies for quantitative depth profile analysis of Cu/NiCu layers and multilayers via pulsed glow discharge - Time of flight mass spectrometry

    NASA Astrophysics Data System (ADS)

    Muñiz, Rocío; Lobo, Lara; Németh, Katalin; Péter, László; Pereiro, Rosario

    2017-09-01

    There is still a lack of approaches for quantitative depth-profiling when dealing with glow discharges (GD) coupled to mass spectrometric detection. The purpose of this work is to develop quantification procedures using pulsed GD (PGD) - time of flight mass spectrometry. In particular, research was focused towards the depth profile analysis of Cu/NiCu nanolayers and multilayers electrodeposited on Si wafers. PGDs are characterized by three different regions due to the temporal application of power: prepeak, plateau and afterglow. This last region is the most sensitive and so it is convenient for quantitative analysis of minor components; however, major elements are often saturated, even at 30 W of applied radiofrequency power for these particular samples. For such cases, we have investigated two strategies based on a multimatrix calibration procedure: (i) using the afterglow region for all the sample components except for the major element (Cu) that was analyzed in the plateau, and (ii) using the afterglow region for all the elements measuring the ArCu signal instead of Cu. Seven homogeneous certified reference materials containing Si, Cr, Fe, Co, Ni and Cu have been used for quantification. Quantitative depth profiles obtained with these two strategies for samples containing 3 or 6 multilayers (of a few tens of nanometers each layer) were in agreement with the expected values, both in terms of thickness and composition of the layers.

  18. Transient analysis using conical shell elements

    NASA Technical Reports Server (NTRS)

    Yang, J. C. S.; Goeller, J. E.; Messick, W. T.

    1973-01-01

    The use of the NASTRAN conical shell element in static, eigenvalue, and direct transient analyses is demonstrated. The results of a NASTRAN static solution of an externally pressurized ring-stiffened cylinder agree well with a theoretical discontinuity analysis. Good agreement is also obtained between the NASTRAN direct transient response of a uniform cylinder to a dynamic end load and one-dimensional solutions obtained using a method of characteristics stress wave code and a standing wave solution. Finally, a NASTRAN eigenvalue analysis is performed on a hydroballistic model idealized with conical shell elements.

  19. Mineral Analysis of Whole Grain Total Cereal

    ERIC Educational Resources Information Center

    Hooker, Paul

    2005-01-01

    The quantitative analysis of elemental iron in Whole Grain Total Cereal using visible spectroscopy is suitable for a general chemistry course for science or nonscience majors. The more extensive mineral analysis, specifically for the elements iron, calcium and zinc, is suitable for an instrumental or quantitative analysis chemistry course.

  20. A finite element analysis of the vibrational behaviour of the intra-operatively manufactured prosthesis-femur system.

    PubMed

    Pastrav, L C; Devos, J; Van der Perre, G; Jaecques, S V N

    2009-05-01

    In total hip replacement (THR) a good initial stability of the prosthetic stem in the femur, which corresponds to a good overall initial contact, will help assure a good long-term result. During the insertion the implant stability increases and, as a consequence, the resonance frequencies increase, allowing the assessment of the implant fixation by vibration analysis. The influence of changing contact conditions on the resonance frequencies was however not yet quantitatively understood and therefore a finite element analysis (FEA) was set up. Modal analyses on the hip stem-femur system were performed in various contact situations. By modelling the contact changes by means of the contact tolerance options in the finite element software, contact could be varied over the entire hip stem surface or only in specific zones (proximal, central, distal) while keeping other system parameters constant. The results are in agreement with previous observations: contact increase causes positive resonance frequency shifts and the dynamic behaviour is most influenced by contact changes in the proximal zone. Although the finite element analysis did not establish a monotonous relationship between the vibrational mode number and the magnitude of the resonance frequency shift, in general the higher modes are more sensitive to the contact change.

  1. Development of user customized smart keyboard using Smart Product Design-Finite Element Analysis Process in the Internet of Things.

    PubMed

    Kim, Jung Woo; Sul, Sang Hun; Choi, Jae Boong

    2018-06-07

    In a hyper-connected society and IoT environment, markets are changing rapidly as smartphones penetrate the global market. As smartphones are applied to various digital media, the development of novel smart products is required. In this paper, a Smart Product Design-Finite Element Analysis Process (SPD-FEAP) is developed to accommodate fast-changing trends and user requirements that can be visually verified. The user requirements are derived and quantitatively evaluated from Smart Quality Function Deployment (SQFD) using WebData. Then the usage scenarios are created according to the priority of the functions derived from SQFD. A 3D shape analysis was conducted by Finite Element Analysis (FEA) and printed out through Rapid Prototyping (RP) technology to identify any possible errors. Thus, a User Customized Smart Keyboard has been developed using SPD-FEAP. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Quantitative prediction of the bitterness suppression of elemental diets by various flavors using a taste sensor.

    PubMed

    Miyanaga, Yohko; Inoue, Naoko; Ohnishi, Ayako; Fujisawa, Emi; Yamaguchi, Maki; Uchida, Takahiro

    2003-12-01

    The purpose of the study was to develop a method for the quantitative prediction of the bitterness suppression of elemental diets by various flavors and to predict the optimum composition of such elemental diets for oral administration using a multichannel taste sensor. We examined the effects of varying the volume of water used for dilution and of adding varying quantities of five flavors (pineapple, apple, milky coffee, powdered green tea, and banana) on the bitterness of the elemental diet, Aminoreban EN. Gustatory sensation tests with human volunteers (n = 9) and measurements using the artificial taste sensor were performed on 50 g Aminoreban EN dissolved in various volumes (140, 180, 220, 260, 300, 420, 660, 1140, and 2100 ml) of water, and on 50 g Aminoreban EN dissolved in 180 ml of water with the addition of 3-9 g of various flavors for taste masking. In gustatory sensation tests, the relationship between the logarithmic values of the volumes of water used for dilution and the bitterness intensity scores awarded by the volunteers proved to be linear. The addition of flavors also reduced the bitterness of elemental diets in gustatory sensation tests; the magnitude of this effect was, in decreasing order, apple, pineapple, milky coffee, powdered green tea, and banana. With the artificial taste sensor, large changes of membrane potential in channel 1, caused by adsorption (CPA values, corresponding to a bitter aftertaste), were observed for Aminoreban EN but not for any of the flavors. There was a good correlation between the CPA values in channel 1 and the results of the human gustatory tests, indicating that the taste sensor is capable of evaluating not only the bitterness of Aminoreban EN itself but also the bitterness-suppressing effect of the five flavors, which contained many components such as organic acids and flavor compounds, and the effect of dilution (by water) on this bitterness. Using regression analysis of data derived from the taste sensor and
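
    The reported log-linear relationship between dilution volume and bitterness can be illustrated with a short regression sketch in Python; the bitterness scores below are invented for illustration and are not the panel data from the study.

    ```python
    import numpy as np

    # Hypothetical bitterness scores versus dilution volume, illustrating the type
    # of log-linear relationship reported in the study (values are made up).
    volume_ml = np.array([140, 180, 220, 260, 300, 420, 660, 1140, 2100], dtype=float)
    bitterness = np.array([3.2, 3.0, 2.8, 2.6, 2.4, 2.0, 1.5, 0.9, 0.4])

    x = np.log10(volume_ml)
    slope, intercept = np.polyfit(x, bitterness, 1)   # linear fit in log(volume)
    r = np.corrcoef(x, bitterness)[0, 1]              # Pearson correlation

    print(f"bitterness ~ {slope:.2f} * log10(volume) + {intercept:.2f}, r = {r:.3f}")
    ```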

  3. Discovery and analysis of an active long terminal repeat-retrotransposable element in Aspergillus oryzae.

    PubMed

    Jie Jin, Feng; Hara, Seiichi; Sato, Atsushi; Koyama, Yasuji

    2014-01-01

    Wild-type Aspergillus oryzae RIB40 contains two copies of the AO090005001597 gene. We previously constructed an A. oryzae RIB40-derived strain, RKuAF8B, with multiple chromosomal deletions, in which the AO090005001597 copy number was found to be increased significantly. Sequence analysis indicated that AO090005001597 is part of a putative 6,000-bp retrotransposable element, flanked by two long terminal repeats (LTRs) of 669 bp, with characteristics of retroviruses and retrotransposons, and thus designated AoLTR (A. oryzae LTR-retrotransposable element). AoLTR comprised putative reverse transcriptase, RNase H, and integrase domains. The deduced amino acid sequence of AoLTR showed 94% overall identity with AFLAV, an A. flavus Tf1/sushi retrotransposon. Quantitative real-time RT-PCR showed that AoLTR gene expression was significantly increased in RKuAF8B, in accordance with the increased copy number. Inverse PCR indicated that the full-length retrotransposable element was randomly integrated into multiple genomic locations. However, no obvious phenotypic changes were associated with the increased AoLTR gene copy number.

  4. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally aimed for to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have a great potential due to their ability for in-field usage. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  5. Forensic Discrimination of Concrete Pieces by Elemental Analysis of Acid-soluble Component with Inductively Coupled Plasma-Mass Spectrometry.

    PubMed

    Kasamatsu, Masaaki; Igawa, Takao; Suzuki, Shinichi; Suzuki, Yasuhiro

    2018-01-01

    Since fragments of concrete can be evidence of crime, a determination of whether or not they come from the same origin is required. The authors focused on nitric acid-soluble components in the fragments of concrete. As a result of qualitative analysis with ICP-MS, it was confirmed that elements such as Cu, Zn, Rb, Sr, Zr, Ba, La, Ce, Nd, and Pb were contained in the fragments. After the nitric acid-soluble components in the fragments of concrete were separated by dissolving them in nitric acid, the concentrations of these elements in the dissolved solution were quantitatively determined by ICP-MS. The concentration ratios of nine elements compared to La were used as indicators. By comparing these indicators, it was possible to discriminate between the fragments of concrete.
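
    A minimal sketch of the comparison step, assuming hypothetical ICP-MS concentrations: the nine indicator ratios relative to La are computed for two fragments and compared by their relative differences. The comparison criterion is illustrative, not the discrimination rule used by the authors.

    ```python
    import numpy as np

    # Hypothetical concentrations (ng/g) of the acid-soluble fraction of two
    # concrete fragments; the element list follows the abstract, the values do not.
    ELEMENTS = ["Cu", "Zn", "Rb", "Sr", "Zr", "Ba", "Ce", "Nd", "Pb", "La"]
    fragment_a = dict(zip(ELEMENTS, [35, 120, 60, 410, 85, 520, 48, 21, 30, 24]))
    fragment_b = dict(zip(ELEMENTS, [33, 125, 58, 400, 90, 510, 50, 22, 28, 25]))

    def la_ratios(conc):
        """Concentration ratios of the nine indicator elements relative to La."""
        return np.array([conc[el] / conc["La"] for el in ELEMENTS if el != "La"])

    ra, rb = la_ratios(fragment_a), la_ratios(fragment_b)
    # A simple comparison criterion (purely illustrative): relative difference per ratio.
    rel_diff = np.abs(ra - rb) / ((ra + rb) / 2)
    print("maximum relative difference between fragments:", rel_diff.max())
    ```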

  6. Quantitative analysis and feature recognition in 3-D microstructural data sets

    NASA Astrophysics Data System (ADS)

    Lewis, A. C.; Suh, C.; Stukowski, M.; Geltmacher, A. B.; Spanos, G.; Rajan, K.

    2006-12-01

    A three-dimensional (3-D) reconstruction of an austenitic stainless-steel microstructure was used as input for an image-based finite-element model to simulate the anisotropic elastic mechanical response of the microstructure. The quantitative data-mining and data-warehousing techniques used to correlate regions of high stress with critical microstructural features are discussed. Initial analysis of elastic stresses near grain boundaries due to mechanical loading revealed low overall correlation with their location in the microstructure. However, the use of data-mining and feature-tracking techniques to identify high-stress outliers revealed that many of these high-stress points are generated near grain boundaries and grain edges (triple junctions). These techniques also allowed for the differentiation between high stresses due to boundary conditions of the finite volume reconstructed, and those due to 3-D microstructural features.

  7. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  8. An interactive graphics system to facilitate finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Burk, R. C.; Held, F. H.

    1973-01-01

    The characteristics of an interactive graphics systems to facilitate the finite element method of structural analysis are described. The finite element model analysis consists of three phases: (1) preprocessing (model generation), (2) problem solution, and (3) postprocessing (interpretation of results). The advantages of interactive graphics to finite element structural analysis are defined.

  9. Elemental analysis of sunflower cataract in Wilson's disease: a study using scanning transmission electron microscopy and energy dispersive spectroscopy.

    PubMed

    Jang, Hyo Ju; Kim, Joon Mo; Choi, Chul Young

    2014-04-01

    Signature ophthalmic characteristics of Wilson's disease (WD) are regarded as diagnostically important manifestations of the disease. Previous studies have proved the common occurrence of copper accumulation in the liver of patients with WD. However, in the case of sunflower cataracts, one of the rare diagnostic signs of WD, no study has demonstrated copper accumulation in the lens capsules of sunflower cataracts in WD patients. To investigate the nanostructure and elemental composition of sunflower cataracts in WD, transmission electron microscopy (TEM) was done on the capsulorhexised anterior lens capsule of sunflower cataracts in WD in order to evaluate anatomical variation and elemental changes. We utilized energy dispersive X-ray spectroscopy (EDS) to investigate the elemental composition of the lens capsule using both point and mapping spectroscopy. Quantitative analysis was performed for relative comparison of the elements. TEM showed the presence of granular deposits of varying size (20-350 nm), appearing mainly in the posterior one third of the anterior capsule. The deposits appeared in linear patterns with scattered dots. There were no electron-dense particles in the epithelial cell layer of the lens. Copper and sulfur peaks were consistently revealed in electron-dense granular deposits. In contrast, copper and sulfur peaks were absent in other tissues, including granule-free lens capsules and epithelial tissue. Most copper was exclusively located in clusters of electron-dense particles, and the copper distribution overlapped with sulfur on mapping spectroscopy. Quantitative analysis presented inconsistent ratios of copper to sulfur in each electron-dense granule. The mean ratio of copper to sulfur was about 3.25 (with a range of 2.39-3.78). This is the first elemental analysis of single electron particles in sunflower cataracts using EDS in the ophthalmic area. Sunflower cataracts with WD are assumed to be the result of accumulation of heterogeneous

  10. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  11. Contact Stress Analysis of Spiral Bevel Gears Using Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.

    1995-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  12. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major- and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
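
    As an illustration of the kind of quantitative map such a workflow can produce, the sketch below converts hypothetical quantified Mg and Fe weight-percent maps into a molar Mg/(Mg+Fe) ratio map; it is an independent toy example, not code from the Quack software.

    ```python
    import numpy as np

    # Hypothetical quantified element maps (weight percent), e.g. from an EDS map
    # of a thin section; 2x2 arrays stand in for full-resolution maps.
    mg_wt = np.array([[12.0, 10.5], [8.0, 0.0]])
    fe_wt = np.array([[ 6.0,  7.5], [9.0, 0.0]])

    ATOMIC_MASS = {"Mg": 24.305, "Fe": 55.845}

    def molar_ratio_map(mg_wt, fe_wt, eps=1e-12):
        """Return a map of molar Mg/(Mg+Fe), a ratio commonly used for mafic phases."""
        mg_mol = mg_wt / ATOMIC_MASS["Mg"]
        fe_mol = fe_wt / ATOMIC_MASS["Fe"]
        return mg_mol / (mg_mol + fe_mol + eps)   # eps avoids division by zero in holes

    print(molar_ratio_map(mg_wt, fe_wt))
    ```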

  13. Design sensitivity analysis of boundary element substructures

    NASA Technical Reports Server (NTRS)

    Kane, James H.; Saigal, Sunil; Gallagher, Richard H.

    1989-01-01

    The ability to reduce or condense a three-dimensional model exactly, and then iterate on this reduced-size model representing the parts of the design that are allowed to change in an optimization loop, is discussed. The discussion presents the results obtained from an ongoing research effort to exploit the concept of substructuring within the structural shape optimization context using a Boundary Element Analysis (BEA) formulation. The first part contains a formulation for the exact condensation of portions of the overall boundary element model designated as substructures. The use of reduced boundary element models in shape optimization requires that structural sensitivity analysis can be performed. A reduced sensitivity analysis formulation is then presented that allows for the calculation of structural response sensitivities of both the substructured (reduced) and unsubstructured parts of the model. It is shown that this approach produces significant computational economy in the design sensitivity analysis and reanalysis process by facilitating the block triangular factorization and forward reduction and backward substitution of smaller matrices. The implementation of this formulation is discussed, and timings and accuracies of representative test cases are presented.

  14. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  15. Quantitative chemical imaging of the intracellular spatial distribution of fundamental elements and light metals in single cells.

    PubMed

    Malucelli, Emil; Iotti, Stefano; Gianoncelli, Alessandra; Fratini, Michela; Merolle, Lucia; Notargiacomo, Andrea; Marraccini, Chiara; Sargenti, Azzurra; Cappadone, Concettina; Farruggia, Giovanna; Bukreeva, Inna; Lombardo, Marco; Trombini, Claudio; Maier, Jeanette A; Lagomarsino, Stefano

    2014-05-20

    We report a method that allows a complete quantitative characterization of whole single cells, assessing the total amount of carbon, nitrogen, oxygen, sodium, and magnesium and providing submicrometer maps of element molar concentration, cell density, mass, and volume. This approach allows quantifying elements down to 10⁶ atoms/μm³. This result was obtained by applying a multimodal fusion approach that combines synchrotron radiation microscopy techniques with off-line atomic force microscopy. The method proposed permits us to find the element concentration in addition to the mass fraction and provides a deeper and more complete knowledge of cell composition. We performed measurements on LoVo human colon cancer cells sensitive (LoVo-S) and resistant (LoVo-R) to doxorubicin. The comparison of LoVo-S and LoVo-R revealed different patterns in the maps of Mg concentration with higher values within the nucleus in LoVo-R and in the perinuclear region in LoVo-S cells. This feature was not so evident for the other elements, suggesting that Mg compartmentalization could be a significant trait of the drug-resistant cells.
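
    For readers who want the quoted sensitivity in more familiar units, the conversion from an atomic density to a molar concentration is a one-liner (1 μm³ = 10⁻¹⁵ L); the function below is a generic helper, not part of the authors' pipeline.

    ```python
    AVOGADRO = 6.02214076e23  # mol^-1

    def atoms_per_um3_to_molar(n_atoms_per_um3):
        """Convert an atomic density (atoms/μm³) into a molar concentration (mol/L).

        Since 1 μm³ = 1e-15 L, c = n / (N_A * 1e-15).
        """
        return n_atoms_per_um3 / (AVOGADRO * 1e-15)

    # The sensitivity quoted in the abstract, about 10⁶ atoms/μm³:
    print(atoms_per_um3_to_molar(1e6))   # ≈ 1.7e-3 mol/L, i.e. low-millimolar
    ```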

  16. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the quantitative analysis of the smoothness of motion data. We investigated values of movement units, fluidity and jerk for the healthy and the paralyzed arm of patients with hemiparesis after stroke, with patients performing a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
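
    One common jerk-based smoothness metric can be computed directly from a sampled position trace, as in the hedged Python sketch below; the exact metrics and conventions used in the paper may differ, and the trajectory here is synthetic rather than patient data.

    ```python
    import numpy as np

    def dimensionless_jerk(position, dt):
        """A common jerk-based smoothness metric (one of several conventions):
        (duration^5 / amplitude^2) * integral of squared jerk, computed from a 1-D
        position trace sampled at interval dt. Lower values indicate smoother movement.
        """
        velocity = np.gradient(position, dt)
        acceleration = np.gradient(velocity, dt)
        jerk = np.gradient(acceleration, dt)
        duration = dt * (len(position) - 1)
        amplitude = np.abs(position[-1] - position[0])
        return duration**5 / amplitude**2 * np.sum(jerk**2) * dt

    # Minimal usage example with a synthetic reach-like trajectory (not patient data):
    t = np.linspace(0.0, 1.0, 201)
    reach = 0.3 * (10 * t**3 - 15 * t**4 + 6 * t**5)   # minimum-jerk profile, 0.3 m reach
    print(dimensionless_jerk(reach, t[1] - t[0]))
    ```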

  17. Parallel processing in finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1987-01-01

    A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).

  18. Skeletal assessment with finite element analysis: relevance, pitfalls and interpretation.

    PubMed

    Campbell, Graeme Michael; Glüer, Claus-C

    2017-07-01

    Finite element models simulate the mechanical response of bone under load, enabling noninvasive assessment of strength. Models generated from quantitative computed tomography (QCT) incorporate the geometry and spatial distribution of bone mineral density (BMD) to simulate physiological and traumatic loads as well as orthopaedic implant behaviour. The present review discusses the current strengths and weaknesses of finite element models for application to skeletal biomechanics. In cadaver studies, finite element models provide better estimations of strength compared to BMD. Data from clinical studies are encouraging; however, the superiority of finite element models over BMD measures for fracture prediction has not been shown conclusively, and may be sex and site dependent. Therapeutic effects on bone strength are larger than for BMD; however, model validation has only been performed on untreated bone. High-resolution modalities and novel image processing methods may enhance the structural representation and predictive ability. Despite extensive use of finite element models to study orthopaedic implant stability, accurate simulation of the bone-implant interface and fracture progression remains a significant challenge. Skeletal finite element models provide noninvasive assessments of strength and implant stability. Improved structural representation and implant surface interaction may enable more accurate models of fragility in the future.

  19. Experimental parameters optimization of instrumental neutron activation analysis in order to determine selected elements in some industrial soils in Turkey

    NASA Astrophysics Data System (ADS)

    Haciyakupoglu, Sevilay; Nur Esen, Ayse; Erenturk, Sema

    2014-08-01

    The purpose of this study is the optimization of experimental parameters for the analysis of a soil matrix by instrumental neutron activation analysis and the quantitative determination of barium, cerium, lanthanum, rubidium, scandium and thorium in soil samples collected from industrialized urban areas near Istanbul. Samples were irradiated in the TRIGA MARK II research reactor of Istanbul Technical University. Two types of reference materials were used to check the accuracy of the applied method. The achieved results were found to be in compliance with the certified values of the reference materials. The calculated En numbers for the mentioned elements were found to be less than 1. The presented data on element concentrations in soil samples will help to trace pollution as an impact of urbanization and industrialization, as well as providing a database for future studies.
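
    The En score used to judge agreement with the certified values follows the standard definition; a minimal sketch with illustrative numbers (not the study's results) is given below.

    ```python
    import math

    def e_n(value_lab, u_lab, value_ref, u_ref):
        """E_n score comparing a measured value with a certified reference value:
        E_n = (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2), with expanded uncertainties.
        |E_n| < 1 is usually taken as satisfactory agreement.
        """
        return (value_lab - value_ref) / math.sqrt(u_lab**2 + u_ref**2)

    # Hypothetical example for Ba in a soil reference material (illustrative values):
    print(abs(e_n(value_lab=452.0, u_lab=30.0, value_ref=440.0, u_ref=25.0)) < 1)  # True
    ```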

  20. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.

  1. Finite Element Analysis (FEA) in Design and Production.

    ERIC Educational Resources Information Center

    Waggoner, Todd C.; And Others

    1995-01-01

    Finite element analysis (FEA) enables industrial designers to analyze complex components by dividing them into smaller elements, then assessing stress and strain characteristics. Traditionally mainframe based, FEA is being increasingly used in microcomputers. (SK)

  2. Association of glass fragments by their trace elemental content using ICP-MS and LA-ICP-MS in the analysis scheme

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Montero, Shirly; Furton, Kenneth G.

    2002-08-01

    The importance of glass as evidence of association between a crime event and a suspect has been recognized for some time. Glass is a fragile material that is often found at the scenes of crimes such as burglaries, hit-and-run accidents and violent crime offenses. The physical and chemical properties of glass can be used to differentiate between possible sources and as evidence of association between two fragments of glass thought to originate from the same source. Refractive index (RI) comparisons have been used for this purpose but due to the improved control over glass manufacturing processes, RI values often cannot differentiate glasses, even if the glass originates from different sources. Elemental analysis methods such as NAA, XRF, ICP-AES, and Inductively Coupled Plasma Mass Spectrometry (ICP-MS) have also been used for the comparison of trace elemental compositions and these techniques have been shown to provide an improvement in the discrimination of glass fragments over RI comparisons alone. The multi-element capability and the sensitivity of ICP-MS combined with the simplified sample introduction of laser ablation prior to ion detection provides for an excellent and relatively non-destructive technique for elemental analysis of glass fragments. The methodology for solution analysis (digestion procedure) and solid sample analysis (laser ablation) of glass is reported and the analytical results are compared. An isotope dilution method is also reported as a high precision technique for elemental analysis of glass fragments. The optimum sampling parameters for laser ablation, for semi-quantitative analysis and element ratio comparisons are also presented. Finally, the results of a case involving the breaking of 15 vehicle windows in an airport parking lot and the association of a suspect to the breakings by the glass fragments found on his person are also presented.
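
    The isotope dilution approach mentioned above rests on the classical two-isotope equation; the sketch below implements that generic equation with made-up abundances and amounts, and is not the authors' calibration procedure.

    ```python
    def idms_amount(n_spike, r_blend, a_sample, b_sample, a_spike, b_spike):
        """Classical isotope dilution equation for two isotopes a and b.

        n_spike  : amount of analyte element added with the spike (mol)
        r_blend  : measured isotope ratio a/b in the sample-spike blend
        a_*, b_* : isotopic abundances (fractions) of isotopes a and b in the
                   sample and in the spike, respectively.
        Returns the amount of analyte element originally in the sample (mol).
        """
        return n_spike * (a_spike - r_blend * b_spike) / (r_blend * b_sample - a_sample)

    # Hypothetical example: a natural-abundance sample spiked with an enriched spike
    # (abundance values are illustrative, not certified values).
    n_x = idms_amount(n_spike=1.0e-9, r_blend=0.8,
                      a_sample=0.90, b_sample=0.10,
                      a_spike=0.05, b_spike=0.95)
    print(n_x)
    ```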

  3. Quantitative Analysis of the Efficiency of OLEDs.

    PubMed

    Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo

    2016-12-07

    We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
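
    The factorisation underlying such an efficiency analysis is often written as EQE = γ · η_exciton · q_eff · η_out; the toy function below encodes this textbook decomposition with illustrative numbers and is not the full current-density-dependent model of the paper.

    ```python
    def external_quantum_efficiency(gamma, eta_exciton, q_eff, eta_out):
        """Commonly used factorisation of OLED external quantum efficiency:
        EQE = gamma * eta_exciton * q_eff * eta_out, where gamma is the charge-carrier
        balance, eta_exciton the fraction of excitons able to emit (including
        quenching losses), q_eff the effective radiative quantum efficiency and
        eta_out the out-coupling efficiency. This is a generic decomposition, not
        the paper's full model.
        """
        return gamma * eta_exciton * q_eff * eta_out

    # Illustrative numbers for a phosphorescent emitter at low current density:
    print(external_quantum_efficiency(gamma=0.95, eta_exciton=1.0, q_eff=0.9, eta_out=0.25))
    ```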

  4. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
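
    The paired comparison of the two methods can be reproduced in outline with SciPy's paired t-test; the γ-oryzanol values below are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical γ-oryzanol contents (mg/g) of the same rice bran oil samples
    # determined by the two methods; values are illustrative only.
    densitometric = np.array([2.51, 2.48, 2.60, 2.43, 2.55, 2.49])
    image_based   = np.array([2.53, 2.46, 2.58, 2.45, 2.54, 2.50])

    t_stat, p_value = stats.ttest_rel(densitometric, image_based)  # paired t-test
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
    ```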

  5. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  6. Organic Elemental Analysis.

    ERIC Educational Resources Information Center

    Ma, T. S.; Wang, C. Y.

    1984-01-01

    Presents a literature review on methods used to analyze organic elements. Topic areas include methods for: (1) analyzing carbon, hydrogen, and nitrogen; (2) analyzing oxygen, sulfur, and halogens; (3) analyzing other elements; (4) simultaneously determining several elements; and (5) determining trace elements. (JN)

  7. A comparison of sample preparation strategies for biological tissues and subsequent trace element analysis using LA-ICP-MS.

    PubMed

    Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas

    2017-03-01

    Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been achieved in recent years. Nevertheless, sample preparation is also of major importance, and the optimal sample preparation strategy has still not been defined. While conventional histology offers a number of sample pre-treatment strategies, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations were compared in cryo-cut tissues and FFPE samples. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give results comparable to snap-frozen tissues. Graphical abstract: Sample preparation strategies for biological tissues are compared with regard to the elemental distributions and average trace element concentrations.

  8. Prediction and phylogenetic analysis of mammalian short interspersed elements (SINEs).

    PubMed

    Rogozin, I B; Mayorov, V I; Lavrentieva, M V; Milanesi, L; Adkison, L R

    2000-09-01

    The presence of repetitive elements can create serious problems for sequence analysis, especially in the case of homology searches in nucleotide sequence databases. Repetitive elements should be treated carefully by using special programs and databases. In this paper, various aspects of SINE (short interspersed repetitive element) identification, analysis and evolution are discussed.

  9. Elemental Impurities in Pharmaceutical Excipients.

    PubMed

    Li, Gang; Schoneker, Dave; Ulman, Katherine L; Sturm, Jason J; Thackery, Lisa M; Kauffman, John F

    2015-12-01

    Control of elemental impurities in pharmaceutical materials is currently undergoing a transition from control based on concentrations in components of drug products to control based on permitted daily exposures in drug products. Within the pharmaceutical community, there is uncertainty regarding the impact of these changes on manufacturers of drug products. This uncertainty is fueled in part by a lack of publicly available information on elemental impurity levels in common pharmaceutical excipients. This paper summarizes a recent survey of elemental impurity levels in common pharmaceutical excipients as well as some drug substances. A widely applicable analytical procedure was developed and was shown to be suitable for analysis of elements that are subject to United States Pharmacopoeia Chapter <232> and the International Conference on Harmonization's Q3D Guideline on Elemental Impurities. The procedure utilizes microwave-assisted digestion of pharmaceutical materials and inductively coupled plasma mass spectrometry for quantitative analysis of these elements. The procedure was applied to 190 samples from 31 different excipients and 15 samples from eight drug substances provided through the International Pharmaceutical Excipient Council of the Americas. The results of the survey indicate that, for the materials included in the study, relatively low levels of elemental impurities are present. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.

  10. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  11. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and the system without repair, with perfect and imperfect repair, and under CBM, with an absorbing set plotted by differential equations and verified. Through referring forward, the reliability value of the control unit is determined in different kinds of modes. Finally, weak nodes are noted in the control unit. PMID:29765629
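
    A minimal sketch of the Markov part of such a model: a three-state degrading element with an absorbing failed state, whose state probabilities are obtained from the matrix exponential of an assumed generator matrix. The rates are illustrative, and the sketch omits the repair and DBN layers described in the paper.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical generator matrix Q of a three-state degrading element
    # (0 = as-good-as-new, 1 = degraded, 2 = failed, absorbing); rates are per hour
    # and purely illustrative. Rows sum to zero, as required for a Markov generator.
    Q = np.array([[-2e-3,  2e-3,  0.0 ],
                  [ 0.0,  -1e-3,  1e-3],
                  [ 0.0,   0.0,   0.0 ]])

    def state_probabilities(q, p0, t):
        """State probability vector at time t for a continuous-time Markov chain."""
        return p0 @ expm(q * t)

    p0 = np.array([1.0, 0.0, 0.0])          # start in the fully working state
    for t in (100.0, 1000.0, 5000.0):
        p = state_probabilities(Q, p0, t)
        print(f"t = {t:6.0f} h  P = {np.round(p, 4)}  reliability = {1 - p[2]:.4f}")
    ```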

  12. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and the system without repair, with perfect and imperfect repair, and under CBM, with an absorbing set plotted by differential equations and verified. Through referring forward, the reliability value of the control unit is determined in different kinds of modes. Finally, weak nodes are noted in the control unit.

  13. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.

  14. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
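
    A common way to quantify such interactions is the multiplicative epistasis score, sketched below with hypothetical phenotype values normalised to wild type; the statistical treatment in the paper is more elaborate.

    ```python
    def epistasis(w_ab, w_a, w_b):
        """Multiplicative epistasis score for quantitative phenotypes normalised to
        wild type: eps = W_ab - W_a * W_b. Zero means no interaction; the sign and
        magnitude indicate aggravating or alleviating interactions. This is the
        generic definition used in quantitative epistasis studies, not necessarily
        the exact statistic of the paper.
        """
        return w_ab - w_a * w_b

    # Hypothetical body-length phenotypes relative to wild type (= 1.0):
    print(epistasis(w_ab=0.55, w_a=0.90, w_b=0.80))  # -0.17 -> aggravating interaction
    ```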

  15. Alphavirus replicon approach to promoterless analysis of IRES elements.

    PubMed

    Kamrud, K I; Custer, M; Dudek, J M; Owens, G; Alterson, K D; Lee, J S; Groebner, J L; Smith, J F

    2007-04-10

    Here we describe a system for promoterless analysis of putative internal ribosome entry site (IRES) elements using an alphavirus (family Togaviridae) replicon vector. The system uses the alphavirus subgenomic promoter to produce transcripts that, when modified to contain a spacer region upstream of an IRES element, allow analysis of cap-independent translation of genes of interest (GOI). If the IRES element is removed, translation of the subgenomic transcript can be reduced >95% compared to the same transcript containing a functional IRES element. Alphavirus replicons, used in this manner, offer an alternative to standard dicistronic DNA vectors or in vitro translation systems currently used to analyze putative IRES elements. In addition, protein expression levels varied depending on the spacer element located upstream of each IRES. The ability to modulate the level of expression from alphavirus vectors should extend the utility of these vectors in vaccine development.

  16. Alphavirus Replicon Approach to Promoterless Analysis of IRES Elements

    PubMed Central

    Kamrud, K.I.; Custer, M.; Dudek, J.M.; Owens, G.; Alterson, K.D.; Lee, J.S.; Groebner, J.L.; Smith, J.F.

    2007-01-01

    Here we describe a system for promoterless analysis of putative internal ribosome entry site (IRES) elements using an alphavirus (Family Togaviridae) replicon vector. The system uses the alphavirus subgenomic promoter to produce transcripts that, when modified to contain a spacer region upstream of an IRES element, allow analysis of cap-independent translation of genes of interest (GOI). If the IRES element is removed, translation of the subgenomic transcript can be reduced > 95 % compared to the same transcript containing a functional IRES element. Alphavirus replicons, used in this manner, offer an alternative to standard dicistronic DNA vectors or in-vitro translation systems currently used to analyze putative IRES elements. In addition, protein expression levels varied depending on the spacer element located upstream of each IRES. The ability to modulate the level of expression from alphavirus vectors should extend the utility of these vectors in vaccine development. PMID:17156813

  17. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
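
    As an independent illustration of one of the listed measures, the sketch below fits a Zipf's-law exponent to the n-gram frequencies of a protein sequence; it does not call the Quantiprot API, whose exact function names are not reproduced here.

    ```python
    import numpy as np
    from collections import Counter

    def zipf_coefficient(sequence, n=2):
        """Fit the Zipf's-law exponent of n-gram frequencies in a protein sequence:
        log(frequency) ~ -s * log(rank) + c, returning s. This is an independent
        illustration of the kind of statistic Quantiprot computes, not a call into
        the Quantiprot API.
        """
        ngrams = Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))
        freqs = np.array(sorted(ngrams.values(), reverse=True), dtype=float)
        ranks = np.arange(1, len(freqs) + 1)
        slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
        return -slope

    # Usage on an arbitrary (made-up) amino acid sequence:
    print(zipf_coefficient("MKVLAAGIVALLAAGCSSNAKIDQLSSDVQTLNAKVDQLSNDVNAMRSDVQAAKDDAARANQRLDNMA"))
    ```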

  18. Determination of trace element mineral/liquid partition coefficients in melilite and diopside by ion and electron microprobe techniques

    NASA Technical Reports Server (NTRS)

    Kuehner, S. M.; Laughlin, J. R.; Grossman, L.; Johnson, M. L.; Burnett, D. S.

    1989-01-01

    The applicability of ion microprobe (IMP) for quantitative analysis of minor elements (Sr, Y, Zr, La, Sm, and Yb) in the major phases present in natural Ca-, Al-rich inclusions (CAIs) was investigated by comparing IMP results with those of an electron microprobe (EMP). Results on three trace-element-doped glasses indicated that it is not possible to obtain precise quantitative analysis by using IMP if there are large differences in SiO2 content between the standards used to derive the ion yields and the unknowns.

  19. Quantitation and detection of vanadium in biologic and pollution materials

    NASA Technical Reports Server (NTRS)

    Gordon, W. A.

    1974-01-01

    A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.

  20. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  1. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  2. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  3. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    PubMed

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    The quantitative computed tomography-based finite element modeling technique is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models have been created from image datasets with different image voxel sizes. The aim of this study was to investigate whether there is an influence of image voxel size on the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs. 0.18 × 0.18 × 0.3 mm³). Finite element models with a cuboid volume of interest extracted from the vertebral cancellous part were created, and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size was detected on the apparent bone mineral density for either the in situ or the in vitro case. However, the apparent modulus and yield strength showed significant differences between the two voxel-size groups (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications. © IMechE 2014.

  4. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    Only fragments of this record's abstract are available: the excerpts note that some Marines are evaluated before the end of their initial service commitment, that the research utilizes quantitative variables for the analysis of high-quality officer selection, and that the photograph analysis in this research is strictly limited to a quantitative analysis (Naval Postgraduate School thesis, Monterey, California; approved for public release, distribution unlimited).

  5. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses.

  6. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  7. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  8. Nonlinear Finite Element Analysis of Shells with Large Aspect Ratio

    NASA Technical Reports Server (NTRS)

    Chang, T. Y.; Sawamiphakdi, K.

    1984-01-01

    A higher order degenerated shell element with nine nodes was selected for large deformation and post-buckling analysis of thick or thin shells. Elastic-plastic material properties are also included. The post-buckling analysis algorithm is given. Using a square plate, it was demonstrated that the nine-node element does not exhibit a shear locking effect even when its aspect ratio is increased to the order of 10 to the 8th power. Two sample problems are given to illustrate the analysis capability of the shell element.

  9. Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.

    PubMed

    O'Connor, William; Runquist, Elizabeth A

    2008-07-01

    Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to by-pass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10% in comparison with 4 to 34% for the threshold method. In contrast, ratios determined by an extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
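    A rough Python sketch of the general idea follows: fit straight lines to windows of log-transformed fluorescence to locate the exponential phase, place a common threshold inside it, and interpolate Ct values for target and reference. Function names are invented for illustration, and the simple 2^(ΔCt) ratio assumes ideal, equal efficiencies, which Q-Anal itself does not; this is not the published algorithm.

    import numpy as np

    def exponential_window(log_fluorescence, min_points=4):
        # Slide a window over log-transformed fluorescence and keep the stretch
        # whose linear fit has the smallest RMS residual -- a crude stand-in for
        # Q-Anal's iterative identification of the exponential phase.
        n = len(log_fluorescence)
        cycles = np.arange(1, n + 1)
        best = None
        for start in range(n - min_points):
            for stop in range(start + min_points, n + 1):
                x, y = cycles[start:stop], log_fluorescence[start:stop]
                slope, intercept = np.polyfit(x, y, 1)
                rmse = np.sqrt(np.mean((y - (slope * x + intercept)) ** 2))
                if best is None or rmse < best[0]:
                    best = (rmse, slope, intercept)
        return best

    def expression_ratio(target_raw, reference_raw, threshold):
        # Interpolate Ct for both amplicons at a common fluorescence threshold
        # and return a simple 2**(Ct_ref - Ct_target) ratio.
        def ct_at(raw):
            _, slope, intercept = exponential_window(np.log10(np.asarray(raw, float)))
            return (np.log10(threshold) - intercept) / slope
        return 2.0 ** (ct_at(reference_raw) - ct_at(target_raw))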

  10. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    PubMed

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
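    As a sketch of the semi-quantitative step itself, the specific binding ratio reduces to a few lines once the ROI voxel values are available; the 75th-percentile whole-brain reference is the combination recommended above, while the function name and the toy voxel data are invented for illustration.

    import numpy as np

    def specific_binding_ratio(target_voxels, reference_voxels, percentile=75):
        # SBR = (mean uptake in target ROI - reference estimate) / reference,
        # with the reference estimated as a percentile of the voxel intensities
        # in the reference region (here: whole brain without striata, thalamus
        # and brainstem, as recommended above).
        ref = np.percentile(np.asarray(reference_voxels, float), percentile)
        return (np.mean(target_voxels) - ref) / ref

    # toy usage with random intensities standing in for ROI voxel values
    rng = np.random.default_rng(0)
    print(specific_binding_ratio(rng.normal(3.0, 0.2, 500),
                                 rng.normal(1.0, 0.3, 50000)))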

  11. Macroscopic X-ray Powder Diffraction Scanning: Possibilities for Quantitative and Depth-Selective Parchment Analysis.

    PubMed

    Vanmeert, Frederik; De Nolf, Wout; Dik, Joris; Janssens, Koen

    2018-06-05

    At or below the surface of painted works of art, valuable information is present that provides insights into an object's past, such as the artist's technique and the creative process that was followed or its conservation history, but also into its current state of preservation. Various noninvasive techniques have been developed over the past 2 decades that can probe this information either locally (via point analysis) or on a macroscopic scale (e.g., full-field imaging and raster scanning). Recently macroscopic X-ray powder diffraction (MA-XRPD) mapping using laboratory X-ray sources was developed. This method can visualize highly specific chemical distributions at the macroscale (dm²). In this work we demonstrate the synergy between the quantitative aspects of powder diffraction and the noninvasive scanning capability of MA-XRPD, highlighting the potential of the method to reveal new types of information. Quantitative data derived from a 15th/16th century illuminated sheet of parchment revealed three lead white pigments with different hydrocerussite-cerussite compositions in specific pictorial elements, while quantification analysis of impurities in the blue azurite pigment revealed two distinct azurite types: one rich in barite and one in quartz. Furthermore, on the same artifact, the depth-selective possibilities of the method that stem from an exploitation of the shift of the measured diffraction peaks with respect to reference data are highlighted. The influence of different experimental parameters on the depth-selective analysis results is briefly discussed. Promising stratigraphic information could be obtained, even though the analysis is hampered by not completely understood variations in the unit cell dimensions of the crystalline pigment phases.

  12. The Applications of Finite Element Analysis in Proximal Humeral Fractures.

    PubMed

    Ye, Yongyu; You, Wei; Zhu, Weimin; Cui, Jiaming; Chen, Kang; Wang, Daping

    2017-01-01

    Proximal humeral fractures are common and among the most challenging to manage, owing to the complexity of the glenohumeral joint, especially impacted fractures in the geriatric population; the development of implants continues because the problems with their fixation have not yet been solved. Pre-, intra-, and postoperative assessments are crucial in the management of these patients. Finite element analysis, as one of the valuable tools, has been implemented as an effective and noninvasive method to analyze proximal humeral fractures, providing solid evidence for the management of troublesome patients. However, no review article about the applications and effects of finite element analysis in assessing proximal humeral fractures has been published yet. This review article summarizes the applications, contribution, and clinical significance of finite element analysis in assessing proximal humeral fractures. Furthermore, the limitations of finite element analysis, the difficulties of more realistic simulation, and the validation and creation of validated FE models are discussed. We conclude that although some advances in proximal humeral fracture research have been made by using finite element analysis, the utility of this powerful tool for routine clinical management and adequate simulation requires more state-of-the-art studies to provide evidence and bases.

  13. GIS-based multielement source analysis of dustfall in Beijing: A study of 40 major and trace elements.

    PubMed

    Luo, Nana; An, Li; Nara, Atsushi; Yan, Xing; Zhao, Wenji

    2016-06-01

    Dust, an important carrier of inorganic and organic pollutants, exposes humans daily without any protection, and its chemical elements and ions in particular affect our health adversely. In this research, we investigated the chemical characteristics of dustfall in Beijing, specifically in terms of 40 major and trace elements, and presented semi-quantitative evaluations of the relative local and remote contributions. In total, 58 samples were collected in Beijing and nearby cities during the 2013-2014 winter heating period. Using multiple statistical methods and GIS techniques, we obtained the relative similarities among certain elements and identified their pollution sources (local or from nearby cities). More interestingly, the relative contributions of nearby cities could be calculated with the HYSPLIT4 backward-trajectory model. In addition, the correlation analysis of the 40 elements in dust and soil indicated that traffic restricted interchange between them; the city center, with the heaviest traffic, had the most significant influence. Finally, the resulting source apportionment was examined and modified using land-use data and terrain information. We hope this work can provide a strong basis for environmental protection and risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.

  15. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

    A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-acoustic interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simplistic model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.

  16. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  17. Quantitative Proteomic Analysis of the Hfq-Regulon in Sinorhizobium meliloti 2011

    PubMed Central

    Sobrero, Patricio; Schlüter, Jan-Philip; Lanner, Ulrike; Schlosser, Andreas; Becker, Anke; Valverde, Claudio

    2012-01-01

    Riboregulation stands for RNA-based control of gene expression. In bacteria, small non-coding RNAs (sRNAs) are a major class of riboregulatory elements, most of which act at the post-transcriptional level by base-pairing with target mRNAs. The RNA chaperone Hfq facilitates antisense interactions between target mRNAs and regulatory sRNAs, thus influencing mRNA stability and/or translation rate. In the α-proteobacterium Sinorhizobium meliloti strain 2011, the identification and detection of multiple sRNA genes and the broadly pleiotropic phenotype associated with the absence of a functional Hfq protein both support the existence of riboregulatory circuits controlling gene expression to ensure the fitness of this bacterium in both free-living and symbiotic conditions. In order to identify target mRNAs subject to Hfq-dependent riboregulation, we have compared the proteome of an hfq mutant and the wild type S. meliloti by quantitative proteomics following protein labelling with 15N. Among 2139 univocally identified proteins, a total of 195 proteins showed a differential abundance between the Hfq mutant and the wild type strain; 65 proteins accumulated ≥2-fold whereas 130 were downregulated (≤0.5-fold) in the absence of Hfq. This profound proteomic impact implies a major role for Hfq in the regulation of diverse physiological processes in S. meliloti, from transport of small molecules to homeostasis of iron and nitrogen. Changes in the cellular levels of proteins involved in transport of nucleotides, peptides and amino acids, and in iron homeostasis, were confirmed with phenotypic assays. These results represent the first quantitative proteomic analysis in S. meliloti. The comparative analysis of the hfq mutant proteome allowed identification of novel strongly Hfq-regulated genes in S. meliloti. PMID:23119037

  18. Quantitative proteomic analysis of the Hfq-regulon in Sinorhizobium meliloti 2011.

    PubMed

    Sobrero, Patricio; Schlüter, Jan-Philip; Lanner, Ulrike; Schlosser, Andreas; Becker, Anke; Valverde, Claudio

    2012-01-01

    Riboregulation stands for RNA-based control of gene expression. In bacteria, small non-coding RNAs (sRNAs) are a major class of riboregulatory elements, most of which act at the post-transcriptional level by base-pairing with target mRNAs. The RNA chaperone Hfq facilitates antisense interactions between target mRNAs and regulatory sRNAs, thus influencing mRNA stability and/or translation rate. In the α-proteobacterium Sinorhizobium meliloti strain 2011, the identification and detection of multiple sRNA genes and the broadly pleiotropic phenotype associated with the absence of a functional Hfq protein both support the existence of riboregulatory circuits controlling gene expression to ensure the fitness of this bacterium in both free-living and symbiotic conditions. In order to identify target mRNAs subject to Hfq-dependent riboregulation, we have compared the proteome of an hfq mutant and the wild type S. meliloti by quantitative proteomics following protein labelling with (15)N. Among 2139 univocally identified proteins, a total of 195 proteins showed a differential abundance between the Hfq mutant and the wild type strain; 65 proteins accumulated ≥2-fold whereas 130 were downregulated (≤0.5-fold) in the absence of Hfq. This profound proteomic impact implies a major role for Hfq in the regulation of diverse physiological processes in S. meliloti, from transport of small molecules to homeostasis of iron and nitrogen. Changes in the cellular levels of proteins involved in transport of nucleotides, peptides and amino acids, and in iron homeostasis, were confirmed with phenotypic assays. These results represent the first quantitative proteomic analysis in S. meliloti. The comparative analysis of the hfq mutant proteome allowed identification of novel strongly Hfq-regulated genes in S. meliloti.
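    A minimal sketch of the final classification step (the ≥2-fold / ≤0.5-fold cut applied to mutant/wild-type abundance ratios) might look as follows; the protein names in the example are arbitrary placeholders, not results from the study.

    def classify_by_fold_change(ratios, up=2.0, down=0.5):
        # Split hfq-mutant / wild-type abundance ratios into up-regulated,
        # down-regulated and unchanged sets using the thresholds above.
        up_set = {p for p, r in ratios.items() if r >= up}
        down_set = {p for p, r in ratios.items() if r <= down}
        unchanged = set(ratios) - up_set - down_set
        return up_set, down_set, unchanged

    print(classify_by_fold_change({"proteinA": 3.1, "proteinB": 0.2, "proteinC": 1.1}))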

  19. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  20. [Principal component analysis and cluster analysis of inorganic elements in sea cucumber Apostichopus japonicus].

    PubMed

    Liu, Xiao-Fang; Xue, Chang-Hu; Wang, Yu-Ming; Li, Zhao-Jie; Xue, Yong; Xu, Jie

    2011-11-01

    The present study investigates the feasibility of multi-element analysis for determining the geographical origin of the sea cucumber Apostichopus japonicus, and selects effective tracers for assessing the geographical origin of A. japonicus. The contents of the elements Al, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Mo, Cd, Hg and Pb in sea cucumber samples from seven places of geographical origin were determined by means of ICP-MS. The results were used for the development of an elements database. Cluster analysis (CA) and principal component analysis (PCA) were applied to differentiate the geographical origins of the sea cucumbers. Three principal components which accounted for over 89% of the total variance were extracted from the standardized data. The results of Q-type cluster analysis showed that the 26 samples could be clustered reasonably into five groups, and the classification results were significantly associated with the marine distribution of the samples. CA and PCA were effective methods for elemental analysis of sea cucumber samples. The contents of the mineral elements in the samples were good chemical descriptors for differentiating their geographical origins.
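    The PCA-plus-cluster-analysis workflow described above can be sketched in a few lines of Python with scikit-learn and SciPy; the toy concentration table and the choice of Ward linkage are assumptions made for illustration, not details taken from the study.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    def origin_groups(element_table, n_components=3, n_groups=5):
        # Standardize a (samples x elements) concentration table, extract
        # principal components, and cut a hierarchical (Q-type) clustering
        # tree into groups.
        z = StandardScaler().fit_transform(element_table)
        pca = PCA(n_components=n_components)
        scores = pca.fit_transform(z)
        labels = fcluster(linkage(z, method="ward"), t=n_groups, criterion="maxclust")
        return scores, pca.explained_variance_ratio_.sum(), labels

    # toy data: 26 samples x 15 elements of random concentrations
    rng = np.random.default_rng(1)
    scores, explained, labels = origin_groups(rng.lognormal(size=(26, 15)))
    print(f"variance explained by 3 PCs: {explained:.0%}", labels)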

  1. Concentrations of platinum group elements in 122 U.S. coal samples

    USGS Publications Warehouse

    Oman, C.L.; Finkelman, R.B.; Tewalt, S.J.

    1997-01-01

    Analysis of more than 13,000 coal samples by semi-quantitative optical emission spectroscopy (OES) indicates that concentrations of the platinum group elements (iridium, palladium, platinum, osmium, rhodium, and ruthenium) are less than 1 ppm in the ash, the limit of detection for this method of analysis. In order to accurately determine the concentration of the platinum group elements (PGE) in coal, additional data were obtained by inductively coupled plasma mass spectroscopy, an analytical method having part-per-billion (ppb) detection limits for these elements. These data indicate that the PGE in coal occur in concentrations on the order of 1 ppb or less.

  2. Integral finite element analysis of turntable bearing with flexible rings

    NASA Astrophysics Data System (ADS)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper suggests a method to calculate the internal load distribution and contact stress of a thrust angular contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and of the plastic deformation of the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. Based on this, the nonlinear contact between the rolling elements and the inner and outer ring raceways is treated as a nonlinear compression spring, and an integral finite element analysis model of the bearing including its support structure was established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the load distribution, inner and outer ring stress, contact stress and other finite element analysis results with traditional bearing theory, which provides guidance for improving the design of slewing bearings.

  3. Analysis of Brick Masonry Wall using Applied Element Method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure as in the case of Finite Element Method (FEM). In AEM, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry wall can be effectively analyzed in the frame of AEM. The composite nature of masonry wall can be easily modelled using springs. The brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and failure load is determined for different loading cases. The results were used to find the best aspect ratio of brick to strengthen brick masonry wall.
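    The series combination of brick and mortar springs mentioned above reduces to elementary formulas; the sketch below uses the common AEM normal-spring form k = E·d·t/a and purely illustrative material and geometry values, so it should be read as an assumption-laden example rather than the authors' model.

    def normal_spring_stiffness(E, d, t, a):
        # One AEM normal connecting spring: Young's modulus E, tributary face
        # width d, element thickness t, representative length a (common AEM form).
        return E * d * t / a

    def series_stiffness(k_brick, k_mortar):
        # Brick spring and mortar spring connected in series:
        #   1/k_eq = 1/k_brick + 1/k_mortar
        return k_brick * k_mortar / (k_brick + k_mortar)

    k_eq = series_stiffness(normal_spring_stiffness(2.0e10, 0.05, 0.10, 0.05),
                            normal_spring_stiffness(1.0e9, 0.05, 0.10, 0.01))
    print(f"equivalent brick+mortar spring stiffness: {k_eq:.3e} N/m")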

  4. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
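    A schematic of the calibration idea, assuming absorbance spectra that obey the Beer-Lambert-Bouguer law, is sketched below with scikit-learn's FastICA: ICA on the transposed calibration matrix resolves component profiles, least squares gives per-mixture scores, and a linear map links scores to the known calibration concentrations. The matrix orientation and every variable name are assumptions; this is not the authors' implementation.

    import numpy as np
    from sklearn.decomposition import FastICA

    def build_ica_model(calib_spectra, calib_conc, n_components):
        # calib_spectra: (n_mixtures, n_points); calib_conc: (n_mixtures, n_components)
        ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
        profiles = ica.fit_transform(calib_spectra.T)            # (n_points, n_comp)
        scores, *_ = np.linalg.lstsq(profiles, calib_spectra.T, rcond=None)
        slopes, *_ = np.linalg.lstsq(scores.T, calib_conc, rcond=None)
        return profiles, slopes

    def quantify(profiles, slopes, spectra):
        # Estimate analyte concentrations of "unknown" mixture spectra (rows)
        # without reference solutions, via the resolved component profiles.
        scores, *_ = np.linalg.lstsq(profiles, spectra.T, rcond=None)
        return scores.T @ slopes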

  6. Analyzing For Light Elements By X-Ray Scattering

    NASA Technical Reports Server (NTRS)

    Ross, H. Richard

    1993-01-01

    Nondestructive method of determining concentrations of low-atomic-number elements in liquids and solids involves measurements of Compton and Rayleigh scattering of x rays. Applied in quantitative analysis of low-atomic-number constituents of alloys, of contaminants and corrosion products on surfaces of alloys, and of fractions of hydrogen in plastics, oils, and solvents.

  7. Completely non-destructive elemental analysis of bulky samples by PGAA

    NASA Astrophysics Data System (ADS)

    Oura, Y.; Nakahara, H.; Sueki, K.; Sato, W.; Saito, A.; Tomizawa, T.; Nishikawa, T.

    1999-01-01

    NBAA (neutron beam activation analysis), which is a combination of PGAA and INAA with a single neutron irradiation, using an internal monostandard method, is proposed as a very unique and promising method for the elemental analysis of voluminous and invaluable archaeological samples which do not allow even a scrape of the surface. It was applied to chinawares, Sueki ware, and bronze mirrors, and proved to be a very effective method for nondestructive analysis of not only major elements but also some minor elements such as boron that help solve archaeological problems of the eras and sites of their production.

  8. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
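    The batch-integration idea itself is easy to illustrate; the toy routine below integrates the same ppm windows across many spectra and writes one CSV row per spectrum for downstream spreadsheet or Matlab analysis. It is an independent sketch and bears no relation to ImatraNMR's actual Java implementation; the data layout and names are assumptions.

    import csv
    import numpy as np

    def batch_integrate(spectra, regions, out_csv="integrals.csv"):
        # spectra: dict name -> (ppm_axis, intensity); regions: list of (lo, hi) ppm.
        with open(out_csv, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["spectrum"] + [f"{lo}-{hi} ppm" for lo, hi in regions])
            for name, (ppm, intensity) in spectra.items():
                row = [name]
                for lo, hi in regions:
                    mask = (ppm >= lo) & (ppm <= hi)
                    # trapezoidal area; abs(diff) handles high-to-low ppm axes
                    seg = 0.5 * np.abs(np.diff(ppm[mask])) * (intensity[mask][1:] + intensity[mask][:-1])
                    row.append(float(seg.sum()))
                writer.writerow(row)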

  9. A 3,000-year quantitative drought record derived from XRF element data from a south Texas playa

    NASA Astrophysics Data System (ADS)

    Livsey, D. N.; Simms, A.; Hangsterfer, A.; Nisbet, R.; DeWitt, R.

    2013-12-01

    Recent droughts throughout the central United States highlight the need for a better understanding of the past frequency and severity of drought. Current records of past drought for the south Texas coast are derived from tree-ring data that span approximately the last 900 years before present (BP). In this study we use a supervised learning routine to create a transfer function between X-ray fluorescence (XRF)-derived elemental data from Laguna Salada, Texas core LS10-02 and a locally derived tree-ring drought record. From this transfer function the 900 BP tree-ring drought record was extended to 3,000 BP. The supervised learning routine was trained on the first 100 years of XRF element data and tree-ring drought data to create the transfer function and the training data set output. The model was then projected from the XRF elemental data for the next 800 years to create a deployed data set output and to test the transfer function parameters. The coefficients of determination between the model output and observed values are 0.77 and 0.70 for the 100-year training data set and the 900-year deployed data set, respectively. Given the relatively high coefficients of determination for both data sets, we interpret that the model parameters are fairly robust and that a high-resolution drought record can be derived from the XRF element data. These results indicate that XRF element data can be used as a quantitative tool to reconstruct past drought records.
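    The abstract does not state which supervised learner was used, so the sketch below substitutes a plain least-squares regression purely to show the train-on-overlap / deploy-on-the-rest structure: fit on the first 100 years where XRF counts and the tree-ring index overlap, then score the deployed span.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    def build_transfer_function(xrf_elements, drought_index, n_train=100):
        # xrf_elements: (n_years, n_elements) counts; drought_index: (n_years,)
        model = LinearRegression().fit(xrf_elements[:n_train], drought_index[:n_train])
        r2_train = r2_score(drought_index[:n_train],
                            model.predict(xrf_elements[:n_train]))
        r2_deploy = r2_score(drought_index[n_train:],
                             model.predict(xrf_elements[n_train:]))
        return model, r2_train, r2_deploy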

  10. Multivariate calibration in Laser-Induced Breakdown Spectroscopy quantitative analysis: The dangers of a 'black box' approach and how to avoid them

    NASA Astrophysics Data System (ADS)

    Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.

    2018-06-01

    The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement of LIBS analytical performance, since a multivariate approach allows exploitation of the redundancy of elemental information that is typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widely used commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The other side of the coin of the availability and ease of use of such packages is the (perceived) difficulty in assessing the reliability of the results obtained, which often leads to the multivariate algorithms being treated as 'black boxes' whose inner mechanism is supposed to remain hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis, and how to overcome them using the chemical-physical knowledge that is at the base of any LIBS quantitative analysis.

  11. Using laser-induced breakdown spectroscopy on vacuum alloys-production process for elements concentration analysis

    NASA Astrophysics Data System (ADS)

    Zhao, Tianzhuo; Fan, Zhongwei; Lian, Fuqiang; Liu, Yang; Lin, Weiran; Mo, Zeqiang; Nie, Shuzhen; Wang, Pu; Xiao, Hong; Li, Xin; Zhong, Qixiu; Zhang, Hongbo

    2017-11-01

    Laser-induced breakdown spectroscopy (LIBS) utilizing an echelle spectrograph-ICCD system is employed for on-line analysis of elemental concentrations in a vacuum induction melting workshop. Active temperature stabilization of the echelle spectrometer is implemented specially for industrial environment applications. The measurement precision is further improved by monitoring laser parameters, such as pulse energy and spatial and temporal profiles, in real time, and post-selecting laser pulses with specific pulse energies. Experimental results show that the major components of nickel-based alloys are stable and can be well detected. Using the internal standard method, calibration curves for chromium and aluminum are obtained for quantitative determination, with determination coefficients (relative standard deviations) of 0.9559 (< 2.2%) and 0.9723 (< 2.8%), respectively.

  12. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

    In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis capability of laser-induced breakdown spectroscopy (LIBS) for liquid sample detection, using solutions of the heavy metal element Pb as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal. A series of experiments were carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the pulse energy influence and the temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm the LTE state of the plasma. Normalizing the intensities by the background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0-4,150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD of Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparison of the real and predicted concentrations of the samples. The lowest relative error was 0.043% and the highest was no more than 7.1%.
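    The background normalization, linear calibration and detection-limit estimate described above follow standard practice and can be sketched as below; the 3σ/slope criterion and the requirement of replicate blank measurements are assumptions of this sketch rather than details quoted from the paper.

    import numpy as np

    def calibration_and_lod(concentrations, line_intensities, backgrounds):
        # Normalize the Pb line intensity by the background, fit a linear
        # calibration curve, and estimate LOD = 3 * sigma(blank) / slope.
        conc = np.asarray(concentrations, float)
        signal = np.asarray(line_intensities, float) / np.asarray(backgrounds, float)
        slope, intercept = np.polyfit(conc, signal, 1)
        r = np.corrcoef(conc, signal)[0, 1]
        sigma_blank = np.std(signal[conc == 0], ddof=1)   # needs replicate blanks
        return slope, intercept, r, 3 * sigma_blank / slope

    def predict_concentration(slope, intercept, signal):
        return (signal - intercept) / slope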

  13. Contact stress analysis of spiral bevel gears using nonlinear finite element static analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.

    1993-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  14. Homogeneity testing and quantitative analysis of manganese (Mn) in vitrified Mn-doped glasses by laser-induced breakdown spectroscopy (LIBS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unnikrishnan, V. K.; Nayak, Rajesh; Kartha, V. B.

    2014-09-15

    Laser-induced breakdown spectroscopy (LIBS), an atomic emission spectroscopy method, has rapidly grown as one of the best elemental analysis techniques over the past two decades. Homogeneity testing and quantitative analysis of manganese (Mn) in manganese-doped glasses have been carried out using an optimized LIBS system employing a nanosecond ultraviolet Nd:YAG laser as the source of excitation. The glass samples have been prepared using conventional vitrification methods. The laser pulse irradiance on the surface of the glass samples placed in air at atmospheric pressure was about 1.7×10⁹ W/cm². The spatially integrated plasma emission was collected and imaged onto the spectrograph slit using an optical-fiber-based collection system. Homogeneity was checked by recording LIBS spectra from different sites on the sample surface and analyzing the elemental emission intensities for concentration determination. Validation of the observed LIBS results was done by comparison with scanning electron microscope-energy dispersive X-ray spectroscopy (SEM-EDX) surface elemental mapping. The analytical performance of the LIBS system has been evaluated through the correlation of the LIBS-determined concentrations of Mn with its certified values. The results are found to be in very good agreement with the certified concentrations.

  15. Elemental analysis of cotton by laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schenk, Emily R.; Almirall, Jose R.

    Laser-induced breakdown spectroscopy (LIBS) has been applied to the elemental characterization of unprocessed cotton. This research is important in forensic and fraud detection applications to establish an elemental fingerprint of U.S. cotton by region, which can be used to determine the source of the cotton. To the best of our knowledge, this is the first report of a LIBS method for the elemental analysis of cotton. The experimental setup consists of a Nd:YAG laser that operates at the fundamental wavelength as the LIBS excitation source and an echelle spectrometer equipped with an intensified CCD camera. The relative concentrations of the elements Al, Ba, Ca, Cr, Cu, Fe, Mg, and Sr from both nutrients and environmental contributions were determined by LIBS. Principal component analysis was used to visualize the differences between cotton samples based on the elemental composition by region in the U.S. Linear discriminant analysis of the LIBS data resulted in the correct classification of >97% of the cotton samples by U.S. region and >81% correct classification by state of origin.

  16. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    NASA Astrophysics Data System (ADS)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides a complete depth scan of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response using a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier transform based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore contribute a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates this enhanced depth resolution using non-stationary thermal wave imaging and offers a first and unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
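    The spectral zooming step can be illustrated with SciPy's chirp z-transform (scipy.signal.czt, available from SciPy 1.8): the transform is evaluated on a dense grid confined to the modulation band of interest, giving a much finer frequency spacing than an FFT of the same record. The toy signal and band limits below are invented for the example and are unrelated to the experiments above.

    import numpy as np
    from scipy.signal import czt   # SciPy >= 1.8

    def zoom_spectrum(x, fs, f1, f2, m=2048):
        # Evaluate the spectrum of x on m points spanning [f1, f2] only.
        w = np.exp(-2j * np.pi * (f2 - f1) / (fs * (m - 1)))   # step along the arc
        a = np.exp(2j * np.pi * f1 / fs)                       # start of the arc
        freqs = f1 + np.arange(m) * (f2 - f1) / (m - 1)
        return freqs, czt(x, m=m, w=w, a=a)

    fs = 10.0
    t = np.arange(0, 200, 1 / fs)
    x = np.cos(2 * np.pi * 0.0503 * t)                         # toy thermal response
    freqs, spec = zoom_spectrum(x, fs, 0.04, 0.06)
    print(f"peak at {freqs[np.argmax(np.abs(spec))]:.4f} Hz "
          f"(plain FFT bins are {fs / len(x):.4f} Hz apart)")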

  17. Finite element analysis of a composite wheelchair wheel design

    NASA Technical Reports Server (NTRS)

    Ortega, Rene

    1994-01-01

    The finite element analysis of a composite wheelchair wheel design is presented. The design is the result of a technology utilization request. The designer's intent is to soften the riding feeling by incorporating a mechanism attaching the wheel rim to the spokes that would allow considerable deflection upon compressive loads. A finite element analysis was conducted to verify proper structural function. Displacement and stress results are presented and conclusions are provided.

  18. Evaluation of the finite element fuel rod analysis code (FRANCO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K.; Feltus, M.A.

    1994-12-31

    Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing the temperature distribution and mechanical deformation of a single light water reactor fuel rod.

  19. Quantitative analysis of pork and chicken products by droplet digital PCR.

    PubMed

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by amplification efficiency and relies on Ct values based on standard curves, which restricts the detection and quantification of low-copy-number target DNA, as in some complex mixed meat products. By using the dPCR method, we found that the relationships between the raw meat weight and DNA weight and between the DNA weight and DNA copy number were both close to linear. This enabled us to establish formulae to calculate the raw meat weight based on the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises.
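    Because both relationships reported above are close to linear, the back-calculation from copy number to raw meat weight reduces to two conversion factors. The factors in the sketch below are invented placeholders; in practice they would come from the species-specific calibration described in the abstract.

    def raw_meat_weight_mg(copies, copies_per_ng_dna, ng_dna_per_mg_meat):
        # copy number -> DNA weight -> raw meat weight, using two linear factors
        dna_ng = copies / copies_per_ng_dna
        return dna_ng / ng_dna_per_mg_meat

    pork_mg = raw_meat_weight_mg(125000, copies_per_ng_dna=250.0, ng_dna_per_mg_meat=2.0)
    chicken_mg = raw_meat_weight_mg(60000, copies_per_ng_dna=300.0, ng_dna_per_mg_meat=2.5)
    print(f"pork fraction of the mixture: {pork_mg / (pork_mg + chicken_mg):.1%}")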

  20. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  1. Binary tree eigen solver in finite element analysis

    NASA Technical Reports Server (NTRS)

    Akl, F. A.; Janetzke, D. C.; Kiraly, L. J.

    1993-01-01

    This paper presents a transputer-based binary tree eigensolver for the solution of the generalized eigenproblem in linear elastic finite element analysis. The algorithm is based on the method of recursive doubling, in which the parallel implementation of a number of associative operations on an arbitrary set having N elements is of the order of O(log₂N), compared to (N-1) steps if implemented sequentially. The hardware used in the implementation of the binary tree consists of 32 transputers. The algorithm is written in OCCAM, a high-level language developed with the transputers to address parallel programming constructs and to provide the communications between processors. The algorithm can be replicated to match the size of the binary tree transputer network. Parallel and sequential finite element analysis programs have been developed to solve for the set of lowest-order eigenpairs using the modified subspace method. The speed-up obtained for a typical analysis problem indicates close agreement with the theoretical prediction given by the method of recursive doubling.
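    The O(log₂N) behaviour of recursive doubling is easy to see in a serial sketch: each of the ceil(log₂N) steps combines every element with its neighbour 2^s positions away, and all combinations within one step are independent, which is what the transputer tree executes in parallel. The example below is a generic illustration, not the OCCAM eigensolver.

    import math

    def recursive_doubling(values, op=lambda a, b: a + b):
        # After ceil(log2 N) steps, entry i holds the combination of values[0..i];
        # the last entry is the full reduction. Each step reads only the previous
        # state, as the parallel hardware would.
        x = list(values)
        n = len(x)
        for s in range(math.ceil(math.log2(max(n, 2)))):
            stride = 2 ** s
            prev = x[:]
            for i in range(stride, n):
                x[i] = op(prev[i - stride], prev[i])
        return x

    print(recursive_doubling([1, 2, 3, 4, 5, 6, 7, 8]))   # [1, 3, 6, 10, 15, 21, 28, 36]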

  2. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  4. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    In order to improve the accuracy of quantitative analysis by AES, we combined XPS with AES and studied methods to reduce the error of AES quantitative analysis. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative analysis results by adjusting the Auger sensitivity factors so that the quantitative results of the two techniques became more consistent. We then verified the accuracy of AES quantitative analysis with the revised sensitivity factors on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error in AES quantitative analysis to less than 10%. Peak definition is difficult in the integral form of the AES spectrum, since choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To make the analysis easier, we also processed the data in the form of the differential spectrum, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of the quantitative analysis on the other samples with different composition ratios. The result showed that the analytical error in AES quantitative analysis was reduced to less than 9%. This shows that the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are taken into account. Good consistency was obtained, proving the feasibility of this method.
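    The quantification step that the corrected sensitivity factors feed into is the usual relative-sensitivity-factor formula, sketched below; the element intensities and factors in the example are placeholders, not values from the paper.

    def atomic_fractions(intensities, sensitivity_factors):
        # C_i = (I_i / S_i) / sum_j (I_j / S_j), with S_i the (corrected)
        # relative sensitivity factor of element i.
        weighted = {el: intensities[el] / sensitivity_factors[el] for el in intensities}
        total = sum(weighted.values())
        return {el: w / total for el, w in weighted.items()}

    print(atomic_fractions({"Cu": 1.20, "Au": 0.85}, {"Cu": 0.76, "Au": 0.55}))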

  5. Multi-element analysis of emeralds and associated rocks by k0 neutron activation analysis

    PubMed

    Acharya; Mondal; Burte; Nair; Reddy; Reddy; Reddy; Manohar

    2000-12-01

    Multi-element analysis was carried out in natural emeralds, their associated rocks and one sample of beryl obtained from Rajasthan, India. The concentrations of 21 elements were assayed by Instrumental Neutron Activation Analysis using the k0 method (k0 INAA method) and high-resolution gamma ray spectrometry. The data reveal the segregation of some elements from associated (trapped and host) rocks to the mineral beryl forming the gemstones. A reference rock standard of the US Geological Survey (USGS BCR-1) was also analysed as a control of the method.

  6. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high frequency vibroacoustic analysis software founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). Energy Finite Element Analysis (EFEA) was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. Statistical Energy Analysis (SEA) predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  7. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the stiffness matrix trace (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.

  8. Microwave plasma monitoring system for the elemental composition analysis of high temperature process streams

    DOEpatents

    Woskov, Paul P.; Cohn, Daniel R.; Titus, Charles H.; Surma, Jeffrey E.

    1997-01-01

    Microwave-induced plasma for continuous, real time trace element monitoring under harsh and variable conditions. The sensor includes a source of high power microwave energy and a shorted waveguide made of a microwave conductive, high temperature capability refractory material communicating with the source of the microwave energy to generate a plasma. The high power waveguide is constructed to be robust in a hot, hostile environment. It includes an aperture for the passage of gases to be analyzed and a spectrometer is connected to receive light from the plasma. Provision is made for real time in situ calibration. The spectrometer disperses the light, which is then analyzed by a computer. The sensor is capable of making continuous, real time quantitative measurements of desired elements, such as the heavy metals lead and mercury. The invention may be incorporated into a high temperature process device and implemented in situ for example, such as with a DC graphite electrode plasma arc furnace. The invention further provides a system for the elemental analysis of process streams by removing particulate and/or droplet samples therefrom and entraining such samples in the gas flow which passes through the plasma flame. Introduction of and entraining samples in the gas flow may be facilitated by a suction pump, regulating gas flow, gravity or combinations thereof.

  9. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

    Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in the better quantitative assessment of protein levels as well as to understand post-translational modifications and protein complexes and networks. Nowadays, the focus of quantitative proteomics shifted from the relative determination of proteins (ie, differential expression between two or more cellular states) to absolute quantity determination, required for a more-thorough characterization of biological models and comprehension of the proteome dynamism, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species affects strongly the ionization efficiency in most mass spectrometry (MS) types, which thereby require the use of specially designed standardization approaches to provide absolute quantifications. Most common of such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues to the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift, can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are not obtained through the use of any kind of labeling, but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to provide directly absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight from the Analytical Chemistry perspective of the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and

  10. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    NASA Astrophysics Data System (ADS)

    Dahing, Lahasen@Normanshah; Yahya, Redzuan; Yahya, Roslan; Hassan, Hearie

    2014-09-01

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source (Cf-252) with an HPGe detector and a multichannel analyzer (MCA). Concrete blocks with sizes of 10×10×10 cm³ and 15×15×15 cm³ were analysed as samples. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study show that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing the respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques as a reference and for validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in the concrete sample are discussed.

  11. Quantitative Imaging of Young's Modulus of Soft Tissues from Ultrasound Water Jet Indentation: A Finite Element Study

    PubMed Central

    Lu, Min-Hua; Mao, Rui; Lu, Yin; Liu, Zheng; Wang, Tian-Fu; Chen, Si-Ping

    2012-01-01

    Indentation testing is a widely used approach to evaluate the mechanical characteristics of soft tissues quantitatively. Young's modulus of soft tissue can be calculated from force-deformation data with known tissue thickness and Poisson's ratio using Hayes' equation. Our group previously developed a noncontact indentation system using a water jet as a soft indenter as well as the coupling medium for the propagation of high-frequency ultrasound. The novel system has shown its ability to detect the early degeneration of articular cartilage. However, there is still a lack of a quantitative method to extract the intrinsic mechanical properties of soft tissue from water jet indentation. The purpose of this study is to investigate the relationship between the loading-unloading curves and the mechanical properties of soft tissues to provide an imaging technique for tissue mechanical properties. A 3D finite element model of water jet indentation was developed with consideration of the finite deformation effect. An improved Hayes' equation has been derived by introducing a new scaling factor which is dependent on the Poisson's ratio v, aspect ratio a/h (the radius of the indenter/the thickness of the test tissue), and deformation ratio d/h. With this model, the Young's modulus of soft tissue can be quantitatively evaluated and imaged with an error of no more than 2%. PMID:22927890
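
    As a point of reference for the quantity being imaged, the following is a minimal sketch of the classic Hayes relation rearranged for Young's modulus, not the authors' improved formulation; the scaling factor kappa must be supplied externally (e.g. from Hayes' tables or a finite element fit), and all example values are hypothetical.

      # Minimal sketch: E from the classic Hayes indentation relation,
      # E = P * (1 - nu^2) / (2 * a * kappa * w), where P is force, w the
      # indentation depth, a the indenter radius, nu Poisson's ratio, and
      # kappa the scaling factor (in the improved equation described above it
      # also depends on the deformation ratio d/h).

      def youngs_modulus_hayes(P, w, a, nu, kappa):
          """Young's modulus from force, indentation depth and geometry."""
          return P * (1.0 - nu**2) / (2.0 * a * kappa * w)

      # Hypothetical example: 0.1 N force, 0.2 mm indentation, 1 mm radius.
      print(youngs_modulus_hayes(P=0.1, w=0.2e-3, a=1.0e-3, nu=0.45, kappa=1.8))
      # ~1.1e5 Pa, i.e. about 110 kPa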

  12. Predicting Radiated Noise With Power Flow Finite Element Analysis

    DTIC Science & Technology

    2007-02-01

    Defence R&D Canada – Atlantic technical report by D.P. Brennan, T.S. Koko, L. Jiang and J.C. Wallace (Martec Limited), 2007. The record notes that the power flow finite element model requires validation against model- or full-scale data before it is available for general use.

  13. Structural weights analysis of advanced aerospace vehicles using finite element analysis

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.; Lentz, Christopher A.; Rehder, John J.; Naftel, J. Chris; Cerro, Jeffrey A.

    1989-01-01

    A conceptual/preliminary level structural design system has been developed for structural integrity analysis and weight estimation of advanced space transportation vehicles. The system includes a three-dimensional interactive geometry modeler, a finite element pre- and post-processor, a finite element analyzer, and a structural sizing program. Inputs to the system include the geometry, surface temperature, material constants, construction methods, and aerodynamic and inertial loads. The results are a sized vehicle structure capable of withstanding the static loads incurred during assembly, transportation, operations, and missions, and a corresponding structural weight. An analysis of the Space Shuttle external tank is included in this paper as a validation and benchmark case of the system.

  14. Scanning Electron Microscope-Cathodoluminescence Analysis of Rare-Earth Elements in Magnets.

    PubMed

    Imashuku, Susumu; Wagatsuma, Kazuaki; Kawai, Jun

    2016-02-01

    Scanning electron microscope-cathodoluminescence (SEM-CL) analysis was performed for neodymium-iron-boron (NdFeB) and samarium-cobalt (Sm-Co) magnets to analyze the rare-earth elements present in the magnets. We examined the advantages of SEM-CL analysis over conventional analytical methods such as SEM-energy-dispersive X-ray (EDX) spectroscopy and SEM-wavelength-dispersive X-ray (WDX) spectroscopy for elemental analysis of rare-earth elements in NdFeB magnets. Luminescence spectra of chloride compounds of elements in the magnets were measured by the SEM-CL method. Chloride compounds were obtained by the dropwise addition of hydrochloric acid on the magnets followed by drying in vacuum. Neodymium, praseodymium, terbium, and dysprosium were separately detected in the NdFeB magnets, and samarium was detected in the Sm-Co magnet by the SEM-CL method. In contrast, it was difficult to distinguish terbium and dysprosium in the NdFeB magnet with a dysprosium concentration of 1.05 wt% by conventional SEM-EDX analysis. Terbium with a concentration of 0.02 wt% in an NdFeB magnet was detected by SEM-CL analysis, but not by conventional SEM-WDX analysis. SEM-CL analysis is advantageous over conventional SEM-EDX and SEM-WDX analyses for detecting trace rare-earth elements in NdFeB magnets, particularly dysprosium and terbium.

  15. Probabilistic finite elements for transient analysis in nonlinear continua

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Mani, A.

    1985-01-01

    The probabilistic finite element method (PFEM), which is a combination of finite element methods and second-moment analysis, is formulated for linear and nonlinear continua with inhomogeneous random fields. Analogous to the discretization of the displacement field in finite element methods, the random field is also discretized. The formulation is simplified by transforming the correlated variables to a set of uncorrelated variables through an eigenvalue orthogonalization. Furthermore, it is shown that a reduced set of the uncorrelated variables is sufficient for the second-moment analysis. Based on the linear formulation of the PFEM, the method is then extended to transient analysis in nonlinear continua. The accuracy and efficiency of the method are demonstrated by application to a one-dimensional, elastic/plastic wave propagation problem. The moments calculated compare favorably with those obtained by Monte Carlo simulation. Also, the procedure is amenable to implementation in deterministic FEM based computer programs.
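
    The eigenvalue orthogonalization step described above can be illustrated with a short NumPy sketch; the exponential correlation model, node coordinates and 95% variance cutoff below are illustrative assumptions, not taken from the paper.

      # Minimal sketch: transform correlated nodal random variables of a
      # discretized random field into a reduced set of uncorrelated variables
      # via an eigenvalue decomposition of the covariance matrix.
      import numpy as np

      x = np.linspace(0.0, 1.0, 20)                  # nodes of the discretized field
      sigma, corr_len = 0.1, 0.3                     # std. dev. and correlation length
      C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

      vals, vecs = np.linalg.eigh(C)                 # eigenvalue orthogonalization
      order = np.argsort(vals)[::-1]
      vals, vecs = vals[order], vecs[:, order]

      m = np.searchsorted(np.cumsum(vals) / vals.sum(), 0.95) + 1
      print(f"{m} uncorrelated variables capture 95% of the field variance")

      # One realization of the field from m independent standard normal variables:
      z = np.random.default_rng(0).standard_normal(m)
      field = vecs[:, :m] @ (np.sqrt(vals[:m]) * z)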

  16. Preserving elemental content in adherent mammalian cells for analysis by synchrotron-based x-ray fluorescence microscopy

    DOE PAGES

    Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry; ...

    2016-08-31

    Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and it does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying was proved to be more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how detailed quantitative conclusions can be affected by the choice of cell preparation method.

  17. Preserving elemental content in adherent mammalian cells for analysis by synchrotron-based x-ray fluorescence microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry

    Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and it does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying was proved to be more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how detailed quantitative conclusions can be affected by the choice of cell preparation method.

  18. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  19. Non-destructive geochemical analysis and element mapping using bench-top μ-XRF: applications and uses for geoscience problems

    NASA Astrophysics Data System (ADS)

    Flude, Stephanie; Haschke, Michael; Tagle, Roald; Storey, Michael

    2013-04-01

    X-Ray Fluorescence (XRF) has long been used to provide valuable geochemical analysis of bulk rock samples in geological studies. However, it is a destructive technique, requiring samples to be homogenised by grinding to a fine powder and formed into a compacted pellet or fused glass disk, and the resulting sample has to be completely flat for reliable analysis. Until recently, non-destructive, high spatial resolution µ-XRF analysis was possible only at specialised Synchrotron radiation facilities, where high excitation beam energies are possible and specialised X-ray focussing optical systems are available. Recently, a number of bench-top µ-XRF systems have become available, allowing easy, rapid and non-destructive geochemical analysis of various materials. We present a number of examples of how the new bench-top M4 Tornado µ-XRF system, developed by Bruker Nano, can be used to provide valuable geochemical information on geological samples. Both quantitative and qualitative (in the form of X-Ray area-maps) data can be quickly and easily acquired for a wide range of elements (as light as Na, using a vacuum), with minimal sample preparation, using an X-Ray spot size as low as 25 µm. Large specimens up to 30 cm and 5 kg in weight can be analysed due to the large sample chamber, allowing non-destructive characterisation of rare or valuable materials. This technique is particularly useful in characterising heterogeneous samples, such as drill cores, sedimentary and pyroclastic rocks containing a variety of clasts, lavas sourced from mixed and mingled magmas, mineralised samples and fossils. An obvious application is the ability to produce element maps or line-scans of minerals, allowing zoning of major and trace elements to be identified and thus informing on crystallisation histories. An application of particular interest to 40Ar/39Ar geochronologists is the ability to screen and assess the purity of mineral separates, or to characterise polished slabs for

  20. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  1. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Error analysis and correction of discrete solutions from finite element codes

    NASA Technical Reports Server (NTRS)

    Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.

    1984-01-01

    Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.

  3. Quantitative High-Resolution Genomic Analysis of Single Cancer Cells

    PubMed Central

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A.; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics. PMID:22140428

  4. Quantitative high-resolution genomic analysis of single cancer cells.

    PubMed

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  5. Elemental misinterpretation in automated analysis of LIBS spectra.

    PubMed

    Hübert, Waldemar; Ankerhold, Georg

    2011-07-01

    In this work, the Stark effect is shown to be mainly responsible for wrong elemental allocation by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line position of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target and furthermore on a brass sample in air at atmospheric pressure. After laser pulse excitation, we have measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s(1)S-3p(1)P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral line positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, a time-resolving software analysis can lead to an elemental misinterpretation. To avoid a wrong interpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays incorporating Stark broadening parameters and using a range of tolerance, which is non-symmetric around the measured line center. These suggestions may help to improve time-resolving LIBS software promising a smaller probability of wrong elemental identification and making LIBS more attractive for industrial applications.
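
    The quoted wavelength and frequency shifts are mutually consistent; the short check below uses only the numbers given in the abstract and the standard relation between a small wavelength shift and the corresponding frequency shift, and is offered as an illustration rather than part of the original work.

      # Consistency check: a 130 pm red-shift at 281.6 nm corresponds to a
      # frequency shift of approximately c * d_lambda / lambda^2.
      c = 2.998e8                     # speed of light, m/s
      lam = 281.6e-9                  # Al(II) line position, m
      d_lam = 130e-12                 # reported red-shift, m
      d_nu = c * d_lam / lam**2       # frequency shift, Hz
      print(f"{d_nu / 1e9:.0f} GHz")  # ~491 GHz, consistent with the quoted 490 GHz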

  6. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved there were four primary focus areas: 1. The eigenvalue problem for the source free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  7. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies.

    PubMed

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm; Saidha, Shiv; Martinez-Lapiscina, Elena H; Lagreze, Wolf A; Schuman, Joel S; Villoslada, Pablo; Calabresi, Peter; Balcer, Laura; Petzold, Axel; Green, Ari J; Paul, Friedemann; Brandt, Alexander U; Albrecht, Philipp

    2016-06-14

    To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. The Advised Protocol for OCT Study Terminology and Elements recommendations include core items to standardize and improve quality of reporting in quantitative OCT studies. The recommendations will make reporting of quantitative OCT studies more consistent and in line with existing standards for reporting research in other biomedical areas. The recommendations originated from expert consensus and thus represent Class IV evidence. They will need to be regularly adjusted according to new insights and practices. © 2016 American Academy of Neurology.

  8. Benefit-risk analysis: a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
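
    The following is a minimal sketch of the decision logic described above, under the assumption that the relative-value adjustment acts as a simple multiplicative utility weight on the number needed to harm; this is an illustration only, not Holden's exact RV-NNT formulation, and all rates and the relative value are hypothetical.

      # Minimal sketch: compare the number needed to treat (benefit) with a
      # relative-value adjusted number needed to harm (risk).

      def nnt(control_event_rate, treated_event_rate):
          """Number needed to treat = 1 / absolute risk reduction."""
          return 1.0 / (control_event_rate - treated_event_rate)

      def rv_nnh(adverse_rate_treated, adverse_rate_control, relative_value):
          """Relative-value adjusted number needed to harm (assumed form)."""
          return relative_value / (adverse_rate_treated - adverse_rate_control)

      # Hypothetical rheumatoid-arthritis-like numbers:
      benefit = nnt(control_event_rate=0.40, treated_event_rate=0.25)          # ~6.7
      harm = rv_nnh(adverse_rate_treated=0.08, adverse_rate_control=0.03,
                    relative_value=0.5)                                        # 10.0
      print("favourable benefit-risk" if benefit < harm else "unfavourable benefit-risk")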

  9. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  10. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  11. Accelerator-based chemical and elemental analysis of atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Mentes, Besim

    Aerosol particles have always been present in the atmosphere, arising from natural sources. But it was not until recently, when emissions from anthropogenic (man-made) sources began to dominate, that atmospheric aerosols came into focus and aerosol science in the environmental perspective started to grow. These sources emit or produce particles with different elemental and chemical compositions, as well as different sizes of the individual aerosols. The effects of increased pollution of the atmosphere are many, and have different time scales. One of the effects known today is acid rain, which causes problems for vegetation. Pollution is also a direct human health risk; in many cities traffic driven by combustion engines is forbidden at certain times when the meteorological conditions are unfavourable. Aerosols play an important role in the climate, and may have both direct and indirect effects which cause cooling of the planet surface, in contrast to the so-called greenhouse gases. During this work a technique for chemical and elemental analysis of atmospheric aerosols and an elemental analysis methodology for upper tropospheric aerosols have been developed. The elemental analysis is performed by the ion beam analysis (IBA) techniques PIXE (elements heavier than Al), PESA (C, N and O), cPESA (H) and pNRA (Mg and Na). The chemical speciation of atmospheric aerosols is obtained by ion beam thermography (IBT). During thermography the sample temperature is stepwise increased and the IBA techniques are used to continuously monitor the elemental concentration. A thermogram is obtained for each element. The vaporisation of the compounds in the sample appears as a concentration decrease in the thermograms at characteristic vaporisation temperatures (CVTs). Different aspects of IBT have been examined in Papers I to IV. The features of IBT are: almost total elemental speciation of the aerosol mass, chemical speciation of the inorganic compounds, carbon content

  12. Finite element dynamic analysis on CDC STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
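
    The central difference explicit scheme mentioned above can be illustrated with a short sketch for an undamped system M*a + K*u = f(t); the small spring-mass chain, load and time step below are illustrative choices, not the shell model or STAR-100 implementation from the paper.

      # Minimal sketch: central difference explicit time integration of
      # M*a + K*u = f for a small lumped-mass chain.
      import numpy as np

      n, dt, steps = 4, 1.0e-3, 200
      M = np.eye(n)                                            # lumped mass matrix
      K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # stiffness matrix
      f = np.zeros(n); f[-1] = 1.0                             # constant tip load

      u_prev = np.zeros(n)                                     # u at t - dt
      u = np.zeros(n)                                          # u at t
      Minv = np.linalg.inv(M)
      for _ in range(steps):
          a = Minv @ (f - K @ u)                               # acceleration
          u_next = 2.0 * u - u_prev + dt**2 * a                # central difference update
          u_prev, u = u, u_next

      print(u)   # displacement after 200 explicit steps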

  13. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
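
    To illustrate the "fragility crossed with hazard" step in general terms, the following sketch combines an illustrative hazard curve with a lognormal fragility curve to estimate an annual failure probability; the curve shapes, median capacity, dispersion and constants are all assumptions and do not reproduce the paper's data or code.

      # Minimal sketch: annual failure probability of a tank from a PSHA hazard
      # curve combined with an empirical lognormal fragility curve.
      import numpy as np
      from scipy.stats import norm

      pga = np.linspace(0.05, 2.0, 400)                 # peak ground acceleration grid, g
      annual_exceedance = 1.0e-4 * pga**(-2.0)          # illustrative hazard curve
      fragility = norm.cdf((np.log(pga) - np.log(0.6)) / 0.5)   # median 0.6 g, beta 0.5

      # Convolve fragility with the hazard density (negative slope of exceedance):
      hazard_density = -np.gradient(annual_exceedance, pga)
      p_fail_annual = np.trapz(fragility * hazard_density, pga)
      print(f"annual failure probability ~ {p_fail_annual:.2e}")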

  14. Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes*

    PubMed Central

    Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.

    2011-01-01

    The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state in various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and we show the need for advanced computational models to predict peptide detection probabilities. In order to address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP associated proteins including the RNAPII elongation factor complexes DSIF and TFIIF. Our data show that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure low level associated protein abundance across biological replicates. In addition, our quantitative data supports a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion in agreement with previously published reports.
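
    For readers unfamiliar with spectral counting, the following is a minimal sketch of one common label-free normalization, the normalized spectral abundance factor (NSAF); the paper's own analysis may use different or additional normalizations, and the counts and protein lengths below are hypothetical.

      # Minimal sketch: NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j), where SpC
      # is the spectral count and L the protein length in residues.

      def nsaf(spectral_counts, protein_lengths):
          ratios = [c / l for c, l in zip(spectral_counts, protein_lengths)]
          total = sum(ratios)
          return [r / total for r in ratios]

      # Hypothetical counts and lengths for three subunits in one purification:
      print(nsaf(spectral_counts=[120, 45, 300], protein_lengths=[1733, 1224, 318]))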

  15. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    PubMed

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the used UPLC fingerprint were verified for its similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
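
    Fingerprint similarity values of the kind quoted above are commonly computed as a cosine (congruence) coefficient between equally sampled chromatogram vectors; the sketch below shows that general idea with hypothetical data and does not reproduce the SFDA software's exact algorithm.

      # Minimal sketch: cosine similarity between two chromatographic fingerprints.
      import numpy as np

      def fingerprint_similarity(chrom_a, chrom_b):
          """Cosine of the angle between two equally sampled chromatogram vectors."""
          a, b = np.asarray(chrom_a, float), np.asarray(chrom_b, float)
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      # Hypothetical, coarsely sampled chromatograms of two batches:
      batch1 = [0.0, 0.20, 1.00, 0.30, 0.00, 0.60, 0.10]
      batch2 = [0.0, 0.25, 0.95, 0.35, 0.05, 0.55, 0.10]
      print(round(fingerprint_similarity(batch1, batch2), 3))   # close to 1 for similar batches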

  16. Comparative study of standard space and real space analysis of quantitative MR brain data.

    PubMed

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissues classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  17. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    PubMed

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two population of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Role Of Social Networks In Resilience Of Naval Recruits: A Quantitative Analysis

    DTIC Science & Technology

    2016-06-01

    Master's thesis by Andrea M. Watling, June 2016; thesis advisor: Edward H. Powley. The study comprises 1,297 total surveys from eight divisions of recruits at two different time periods. Quantitative analyses using survey and network data examine the effects of social networks on the resilience of naval recruits.

  19. Finite Element Analysis of Particle Ionization within Carbon Nanotube Ion Micro Thruster

    DTIC Science & Technology

    2017-12-01

    Master's thesis, Naval Postgraduate School, Monterey, California; approved for public release, distribution unlimited. Subject terms: simulation, carbon nanotube simulation, microsatellite, finite element analysis, electric field, particle tracing.

  20. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  1. Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.

    PubMed

    Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N

    2013-11-05

    Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting.

  2. Finite element analysis of a pseudoelastic compression-generating intramedullary ankle arthrodesis nail.

    PubMed

    Anderson, Ryan T; Pacaccio, Douglas J; Yakacki, Christopher M; Carpenter, R Dana

    2016-09-01

    Tibio-talo-calcaneal (TTC) arthrodesis is an end-stage treatment for patients with severe degeneration of the ankle joint. This treatment consists of using an intramedullary nail (IM) to fuse the calcaneus, talus, and tibia bones together into one construct. Poor bone quality within the joint prior to surgery is common and thus the procedure has shown complications due to non-union. However, a new FDA-approved IM nail has been released that houses a nickel titanium (NiTi) rod that uses its inherent pseudoelastic material properties to apply active compression across the fusion site. Finite element analysis was performed to model the mechanical response of the NiTi within the device. A bone model was then developed based on a quantitative computed tomography (QCT) image for anatomical geometry and bone material properties. A total bone and device system was modeled to investigate the effect of bone quality change and gather load-sharing properties during gait loading. It was found that during the highest magnitude loading of gait, the load taken by the bone was more than 50% higher than the load taken by the nail. When comparing the load distribution during gait, results from this study would suggest that the device helps to prevent stress shielding by allowing a more even distribution of load between bone and nail. In conditions where bone quality may vary patient-to-patient, the model indicates that a 10% decrease in overall bone modulus (i.e. material stiffness) due to reduced bone mineral density would result in higher stresses in the nail (3.4%) and a marginal decrease in stress for the bone (0.5%). The finite element model presented in this study can be used as a quantitative tool to further understand the stress environment of both bone and device for a TTC fusion. Furthermore, the methodology presented gives insight on how to computationally program and use the unique material properties of NiTi in an active compression state useful for bone fracture healing

  3. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  4. Capillary Optics Based X-Ray Micro-Imaging Elemental Analysis

    NASA Astrophysics Data System (ADS)

    Hampai, D.; Dabagov, S. B.; Cappuccio, G.; Longoni, A.; Frizzi, T.; Cibin, G.

    2010-04-01

    Micro-X-ray fluorescence spectrometry (μXRF), rapidly developed during the last few years, is a promising multi-elemental technique for non-destructive analysis. Typically it is rather hard to perform laboratory μXRF analysis because of the difficulty of producing a small-size X-ray beam as well as of focusing it. Polycapillary optics, recently developed for X-ray beam focusing, offers laboratory X-ray micro probes. The combination of a polycapillary lens and a fine-focused micro X-ray tube can provide a high intensity radiation flux on a sample, which is necessary in order to perform the elemental analysis. In comparison to a pinhole, an optimized "X-ray source-optics" system can result in a radiation density gain of more than 3 orders of magnitude. The most advanced way to get that result is to use the confocal configuration based on two X-ray lenses, one for the fluorescence excitation and the other for the detection of secondary emission from the sample studied. In the case of X-ray capillary microfocusing, a μXRF instrument designed in the confocal scheme allows us to obtain 3D elemental mapping. In this work we will show preliminary results obtained with our prototype, a portable X-ray microscope for both X-ray imaging and fluorescence analysis; it enables μXRF elemental mapping simultaneously with X-ray imaging. A prototype of a compact XRF spectrometer with a spatial resolution of less than 100 μm has been designed.

  5. Quantitative analysis of diffusion tensor orientation: theoretical framework.

    PubMed

    Wu, Yu-Chien; Field, Aaron S; Chung, Moo K; Badie, Benham; Alexander, Andrew L

    2004-11-01

    Diffusion-tensor MRI (DT-MRI) yields information about the magnitude, anisotropy, and orientation of water diffusion of brain tissues. Although white matter tractography and eigenvector color maps provide visually appealing displays of white matter tract organization, they do not easily lend themselves to quantitative and statistical analysis. In this study, a set of visual and quantitative tools for the investigation of tensor orientations in the human brain was developed. Visual tools included rose diagrams, which are spherical coordinate histograms of the major eigenvector directions, and 3D scatterplots of the major eigenvector angles. A scatter matrix of major eigenvector directions was used to describe the distribution of major eigenvectors in a defined anatomic region. A measure of eigenvector dispersion was developed to describe the degree of eigenvector coherence in the selected region. These tools were used to evaluate directional organization and the interhemispheric symmetry of DT-MRI data in five healthy human brains and two patients with infiltrative diseases of the white matter tracts. In normal anatomical white matter tracts, a high degree of directional coherence and interhemispheric symmetry was observed. The infiltrative diseases appeared to alter the eigenvector properties of affected white matter tracts, showing decreased eigenvector coherence and interhemispheric symmetry. This novel approach distills the rich, 3D information available from the diffusion tensor into a form that lends itself to quantitative analysis and statistical hypothesis testing. (c) 2004 Wiley-Liss, Inc.
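
    The scatter matrix and dispersion measure described above can be illustrated with a short sketch; the exact dispersion definition used in the paper may differ, and the synthetic eigenvector sample below is purely hypothetical.

      # Minimal sketch: scatter matrix of major-eigenvector directions in an ROI
      # and a simple dispersion measure from its largest eigenvalue. Vectors are
      # treated as axes (v and -v equivalent), which the outer-product scatter
      # matrix respects.
      import numpy as np

      rng = np.random.default_rng(1)
      v = rng.normal([0.0, 0.0, 1.0], 0.15, size=(500, 3))   # hypothetical major eigenvectors
      v /= np.linalg.norm(v, axis=1, keepdims=True)

      scatter = (v[:, :, None] * v[:, None, :]).mean(axis=0)  # 3x3 scatter matrix
      lam_max = np.linalg.eigvalsh(scatter)[-1]                # in [1/3, 1]
      dispersion = 1.0 - lam_max                               # 0 = coherent, 2/3 = isotropic
      print(f"dispersion = {dispersion:.3f}")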

  6. A Six-Node Curved Triangular Element and a Four-Node Quadrilateral Element for Analysis of Laminated Composite Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Martin, C. Wayne; Breiner, David M.; Gupta, Kajal K. (Technical Monitor)

    2004-01-01

    Mathematical development and some computed results are presented for Mindlin plate and shell elements, suitable for analysis of laminated composite and sandwich structures. These elements use the conventional 3 (plate) or 5 (shell) nodal degrees of freedom, have no communicable mechanisms, have no spurious shear energy (no shear locking), have no spurious membrane energy (no membrane locking) and do not require arbitrary reduction of out-of-plane shear moduli or under-integration. Artificial out-of-plane rotational stiffnesses are added at the element level to avoid convergence problems or singularity due to flat spots in shells. This report discusses a 6-node curved triangular element and a 4-node quadrilateral element. Findings show that in regular rectangular meshes, the Martin-Breiner 6-node triangular curved shell (MB6) is approximately equivalent to the conventional 8-node quadrilateral with integration. The 4-node quadrilateral (MB4) has very good accuracy for a 4-node element, and may be preferred in vibration analysis because of narrower bandwidth. The mathematical developments used in these elements, those discussed in the seven appendices, have been applied to elements with 3, 4, 6, and 10 nodes and can be applied to other nodal configurations.

  7. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with their inherent invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, one for each compound. The correlation coefficients (R²) for the training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
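
    The general pipeline (spectrum treated as a grayscale image, Zernike moment magnitudes as features, linear model to concentration) can be sketched as below; this is not the authors' code, it assumes the third-party mahotas and scikit-learn packages are available, uses all moments rather than a stepwise-selected subset, and the image size, radius, degree and training data are illustrative.

      # Minimal sketch: Zernike moment features of a 3D HPLC-DAD spectrum image
      # feeding a linear quantitative model.
      import numpy as np
      import mahotas
      from sklearn.linear_model import LinearRegression

      def zernike_features(image_2d, radius=60, degree=8):
          """Zernike moment magnitudes of a grayscale image (rotation invariant)."""
          return mahotas.features.zernike_moments(image_2d, radius, degree=degree)

      # Hypothetical training data: time x wavelength intensity maps and known
      # concentrations of one target compound.
      rng = np.random.default_rng(0)
      images = [rng.random((128, 128)) * c for c in (0.5, 1.0, 1.5, 2.0)]
      concentrations = np.array([0.5, 1.0, 1.5, 2.0])

      X = np.array([zernike_features(im) for im in images])
      model = LinearRegression().fit(X, concentrations)
      print(model.predict(X[:1]))   # in-sample check of the linear quantitative model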

  8. Reproducibility and validity of a semi-quantitative FFQ for trace elements.

    PubMed

    Lee, Yujin; Park, Kyong

    2016-09-01

    The aim of this study was to test the reproducibility and validity of a self-administered FFQ for the Trace Element Study of Korean Adults in the Yeungnam area (SELEN). Study subjects were recruited from the SELEN cohort selected from rural and urban areas in Yeungnam, Korea. A semi-quantitative FFQ with 146 items was developed considering the dietary characteristics of cohorts in the study area. In a validation study, seventeen men and forty-eight women aged 38-62 years completed 3-d dietary records (DR) and two FFQ over a 3-month period. The validity was examined with the FFQ and DR, and the reproducibility was estimated using partial correlation coefficients, the Bland-Altman method and cross-classification. There were no significant differences between the mean intakes of selected nutrients as estimated from FFQ1, FFQ2 and DR. The median correlation coefficients for all nutrients were 0·47 and 0·56 in the reproducibility and validity tests, respectively. Bland-Altman's index and cross-classification showed acceptable agreement between FFQ1 and FFQ2 and between FFQ2 and DR. Ultimately, 78 % of the subjects were classified into the same and adjacent quartiles for most nutrients. In addition, the weighted κ value indicated that the two methods agreed fairly. In conclusion, this newly developed FFQ was a suitable dietary assessment method for the SELEN cohort study.
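
    The Bland-Altman agreement calculation used in the validation can be summarised by a short sketch; the nutrient values below are hypothetical and do not come from the SELEN data.

      # Minimal sketch: Bland-Altman bias and 95% limits of agreement between
      # two dietary assessment methods.
      import numpy as np

      def bland_altman(method_a, method_b):
          """Return mean difference (bias) and 95% limits of agreement."""
          a, b = np.asarray(method_a, float), np.asarray(method_b, float)
          diff = a - b
          bias, sd = diff.mean(), diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      ffq = [12.1, 9.8, 15.3, 11.0, 13.4]   # e.g. zinc intake, mg/day, from the FFQ
      dr = [11.5, 10.2, 14.1, 11.8, 12.9]   # matching 3-day dietary record estimates
      print(bland_altman(ffq, dr))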

  9. Modeling Intracochlear Magnetic Stimulation: A Finite-Element Analysis.

    PubMed

    Mukesh, S; Blake, D T; McKinnon, B J; Bhatti, P T

    2017-08-01

    This study models induced electric fields, and their gradient, produced by pulsatile current stimulation of submillimeter inductors for cochlear implantation. Using finite-element analysis, the lower chamber of the cochlea, scala tympani, is modeled as a cylindrical structure filled with perilymph bounded by tissue, bone, and cochlear neural elements. Single inductors as well as an array of inductors are modeled. The coil strength (~100 nH) and excitation parameters (peak current of 1-5 A, voltages of 16-20 V) are based on a formative feasibility study conducted by our group. In that study, intracochlear micromagnetic stimulation achieved auditory activation as measured through the auditory brainstem response in a feline model. With respect to the finite element simulations, axial symmetry of the inductor geometry is exploited to improve computation time. It is verified that the inductor coil orientation greatly affects the strength of the induced electric field and thereby the ability to affect the transmembrane potential of nearby neural elements. Furthermore, upon comparing an array of micro-inductors with a typical multi-site electrode array, magnetically excited arrays retain greater focus in terms of the gradient of induced electric fields. Once combined with further in vivo analysis, this modeling study may enable further exploration of the mechanism of magnetically induced, and focused neural stimulation.

  10. A Study on Urban Road Traffic Safety Based on Matter Element Analysis

    PubMed Central

    Hu, Qizhou; Zhou, Zhuping; Sun, Xu

    2014-01-01

    This paper presents a new evaluation of urban road traffic safety based on matter element analysis, avoiding the difficulties found in other traffic safety evaluations. Urban road traffic safety is investigated through matter element analysis theory, with the chief aim of characterizing its main features. Emphasis was placed on the construction of a criterion function by which traffic safety is evaluated against a hierarchical system of objectives. Matter element analysis theory was used to create a comprehensive appraisal model of urban road traffic safety, employing a newly developed and versatile matter element analysis algorithm. The matter element matrix resolves the uncertainty and incompatibility of the factors used to assess urban road traffic safety. The application results showed the superiority of the evaluation model, and a didactic example is included to illustrate the computational procedure. PMID:25587267
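
    A minimal sketch of the correlation (dependent) function commonly used in matter element (extension) analysis is given below; the grading intervals and the handling of the degenerate case are illustrative assumptions rather than the authors' actual traffic-safety criteria.

        # Sketch only: the elementary correlation (dependent) function of matter
        # element (extension) analysis; the grading intervals are invented.
        def rho(x, a, b):
            """Extension distance of point x from the interval [a, b]."""
            return abs(x - (a + b) / 2.0) - (b - a) / 2.0

        def correlation(x, classical, joint):
            """K(x) for a classical interval nested inside a joint (limiting) interval."""
            d0 = rho(x, *classical)
            d = rho(x, *joint)
            if d == d0:                 # degenerate case; conventions vary in the literature
                return -d0 - 1.0
            return d0 / (d - d0)

        # Example: an accident-rate indicator graded against a "safe" interval [0, 2]
        # within a joint interval [0, 10] (both hypothetical).
        print(correlation(1.5, (0, 2), (0, 10)))    # positive value: belongs to the grade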

  11. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…
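
    The record above names data envelopment analysis (DEA); as a minimal illustration, the sketch below computes an input-oriented CCR efficiency score with scipy, using invented university inputs and outputs purely for illustration.

        # Sketch only: an input-oriented CCR DEA efficiency score via linear
        # programming; the university inputs/outputs below are invented.
        import numpy as np
        from scipy.optimize import linprog

        def dea_efficiency(X, Y, k):
            """X: (m inputs x n DMUs), Y: (s outputs x n DMUs); returns theta for DMU k."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]                 # minimize theta
            A_in = np.c_[-X[:, [k]], X]                 # sum_j lam_j * x_ij <= theta * x_ik
            A_out = np.c_[np.zeros((s, 1)), -Y]         # sum_j lam_j * y_rj >= y_rk
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -Y[:, k]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
            return res.x[0]

        X = np.array([[100.0, 80.0, 120.0]])            # e.g., academic staff (hypothetical)
        Y = np.array([[1500.0, 1400.0, 1600.0]])        # e.g., graduates (hypothetical)
        print([round(dea_efficiency(X, Y, k), 3) for k in range(3)])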

  12. Trace element analysis of coal by neutron activation.

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.

    1973-01-01

    The irradiation, counting, and data reduction scheme is described for an analysis capability of 1000 samples per year. Up to 56 elements are reported on each sample. The precision and accuracy of the method are shown for 25 elements designated as hazardous by the Environmental Protection Agency (EPA). The interference corrections for selenium and ytterbium on mercury and ytterbium on selenium are described. The effect of bromine and antimony on the determination of arsenic is also mentioned. The use of factorial design techniques to evaluate interferences in the determination of mercury, selenium, and arsenic is shown. Some typical trace element results for coal, fly ash, and bottom ash are given.

  14. Trace elements as quantitative probes of differentiation processes in planetary interiors

    NASA Technical Reports Server (NTRS)

    Drake, M. J.

    1980-01-01

    The characteristic trace element signature that each mineral in the source region imparts on the magma constitutes the conceptual basis for trace element modeling. It is shown that abundances of trace elements in extrusive igneous rocks may be used as petrological and geochemical probes of the source regions of the rocks if differentiation processes, partition coefficients, phase equilibria, and initial concentrations in the source region are known. Although compatible and incompatible trace elements are useful in modeling, the present review focuses primarily on examples involving the rare-earth elements.
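
    A worked sketch of the simplest such model, equilibrium (batch) partial melting, is shown below; the modal mineralogy and cerium partition coefficients are rough illustrative values, not data from the review.

        # Sketch only: the equilibrium (batch) melting relation that underlies this
        # kind of trace-element modeling, C_L / C_0 = 1 / (D + F * (1 - D)).
        # Modal proportions and partition coefficients are illustrative values.
        def batch_melting(c0, D, F):
            """Liquid concentration for source concentration c0, bulk D, melt fraction F."""
            return c0 / (D + F * (1.0 - D))

        modes = {"olivine": 0.60, "opx": 0.25, "cpx": 0.15}
        kd_ce = {"olivine": 0.0005, "opx": 0.003, "cpx": 0.08}   # Ce mineral/melt Kd (approximate)
        D_ce = sum(modes[m] * kd_ce[m] for m in modes)           # bulk partition coefficient
        print(batch_melting(c0=1.0, D=D_ce, F=0.05))             # ~16x enrichment of Ce in the melt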

  15. Rolling-Element Fatigue Testing and Data Analysis - A Tutorial

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.

    2011-01-01

    In order to rank bearing materials, lubricants and other design variables using rolling-element bench-type fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs and those obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis, and makes recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
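
    A minimal sketch of the Weibull step is shown below: fitting a two-parameter Weibull distribution to a set of fatigue lives and reporting the L10 life. The sample lives are invented, and scipy's maximum-likelihood fit stands in for the Weibull/Johnson median-rank procedures discussed in the paper.

        # Sketch only: a two-parameter Weibull fit to rolling-element fatigue lives
        # and the resulting L10 estimate; the sample lives are invented.
        import numpy as np
        from scipy import stats

        lives = np.array([12.1, 18.4, 25.0, 31.7, 40.2, 55.9, 71.3, 98.5])  # millions of revolutions

        shape, loc, scale = stats.weibull_min.fit(lives, floc=0)   # location fixed at zero
        l10 = scale * (-np.log(0.90)) ** (1.0 / shape)             # life at 90% survival probability
        print(f"Weibull slope = {shape:.2f}, L10 = {l10:.1f} Mrev")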

  16. Finite Element Analysis of a Dynamically Loaded Flat Laminated Plate

    DTIC Science & Technology

    1980-07-01

    Elements are stacked in the thickness direction to represent the various material layers. The analysis allows for orthotropic, elastic-plastic or elastic-viscoplastic yielding. The loading varies with time; consequently, the materials are assumed to be represented by elastic-plastic and elastic-viscoplastic models in the finite element model.

  17. Unveiling the Third Secret of Fátima: μ-XRF quantitative characterization and 2D elemental mapping

    NASA Astrophysics Data System (ADS)

    Manso, M.; Pessanha, S.; Guerra, M.; Figueirinhas, J. L.; Santos, J. P.; Carvalho, M. L.

    2017-04-01

    A set of five manuscripts written by Sister Lúcia between 1941 and 1944 were studied. Among them is the one that contains the description of the third part of the Secret of Fátima, also known as the Third Secret of Fátima. In this work, a characterization of the paper and the ink used in these documents was achieved using micro-X-ray fluorescence spectrometry. Quantitative results were obtained for P, K, Ca, Fe, Cu and Zn, revealing different paper compositions and the presence of Zn in the inks. 2D elemental maps confirmed that Zn was present in the ink of all five documents and that the manuscript revealing the Third Secret of Fátima contained no erasures or attempts to alter the original text.

  18. Elemental content of Vietnamese rice. Part 2. Multivariate data analysis.

    PubMed

    Kokot, S; Phuong, T D

    1999-04-01

    Rice samples were obtained from the Red River region and some other parts of Vietnam as well as from Yanco, Australia. These samples were analysed for 14 elements (P, K, Mg, Ca, Mn, Zn, Fe, Cu, Al, Na, Ni, As, Mo and Cd) by ICP-AES, ICP-MS and FAAS as described in Part 1. This data matrix was then submitted to multivariate data analysis by principal component analysis to investigate the influences of environmental and crop cultivation variables on the elemental content of rice. Results revealed that geographical location, grain variety, seasons and soil conditions are the most likely significant factors causing changes in the elemental content between the rice samples. To assess rice quality according to its elemental content and physio-biological properties, a multicriteria decision making method (PROMETHEE) was applied. With the Vietnamese rice, the sticky rice appeared to contain somewhat higher levels of nutritionally significant elements such as P, K and Mg than the non-sticky rice. Also, rice samples grown during the wet season have better levels of nutritionally significant mineral elements than those of the dry season, but in general, the wet season seemed to provide better overall elemental and physio-biological rice quality.
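
    The principal component step can be sketched as below; the CSV file name and layout are assumptions, and the subsequent grouping of scores by location, variety, season and soil conditions would follow the study design described above.

        # Sketch only: principal component analysis of a standardized element
        # concentration matrix; the CSV layout is an assumption, not the paper's data.
        import pandas as pd
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        elements = ["P", "K", "Mg", "Ca", "Mn", "Zn", "Fe", "Cu", "Al", "Na", "Ni", "As", "Mo", "Cd"]
        df = pd.read_csv("rice_elements.csv")   # hypothetical file, one row per rice sample

        X = StandardScaler().fit_transform(df[elements])
        pca = PCA(n_components=3).fit(X)
        print("Explained variance ratios:", pca.explained_variance_ratio_)
        scores = pca.transform(X)               # sample scores, grouped by location, variety, season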

  19. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

    Because of self-absorption by major elements, the scarcity of observable spectral lines from trace elements, and the need to correct for the relative efficiency of the experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in practice not easy. In order to overcome these difficulties, a standard reference line (SRL) combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and the Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL method with OPC, and the intercept method with OPC. The final results show that the latter two methods can effectively improve the overall accuracy of the quantitative analysis and the detection limits of trace elements.
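
    The temperature-determination step can be sketched with a simple Boltzmann plot, as below; the Fe line intensities and spectroscopic constants are placeholders, and the full Saha-Boltzmann treatment and one-point calibration of the paper are not reproduced.

        # Sketch only: the Boltzmann-plot step, where the excitation temperature comes
        # from the slope of ln(I*lambda/(g*A)) versus upper-level energy for Fe lines.
        import numpy as np

        k_B = 8.617333e-5                     # Boltzmann constant, eV/K

        # Columns: intensity I (a.u.), wavelength (nm), g_k*A_ki (s^-1), E_k (eV); placeholder values
        fe_lines = np.array([
            [1.00e4, 371.99, 1.62e8, 3.33],
            [6.50e3, 385.99, 9.70e7, 3.21],
            [2.10e3, 404.58, 4.30e8, 4.55],
        ])

        I, lam, gA, E_k = fe_lines.T
        slope, intercept = np.polyfit(E_k, np.log(I * lam / gA), 1)
        T = -1.0 / (k_B * slope)              # excitation temperature in K
        print(f"Boltzmann-plot temperature ~ {T:.0f} K")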

  20. [Correspondence analysis between traditional commercial specifications and quantitative quality indices of Notopterygii Rhizoma et Radix].

    PubMed

    Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi

    2016-03-01

    This paper aims to explore a comprehensive assessment method combined traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected on market and at producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and quantitative chemical indices for comprehensive evaluating quality of medicinal materials, and quantitative classification of commercial grade and quality grade. A herb quality index (HQI) including traditional commercial specifications and chemical components for quantitative grade classification were established, and corresponding discriminant function were figured out for precise determination of quality grade and sub-grade of Notopterygii Rhizoma et Radix. The result showed that notopterol, isoimperatorin and volatile oil were the major components for determination of chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. According to the result, essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can be examined by K-mean cluster, Fisher discriminant analysis and correspondence analysis, which provide a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality integrated traditional commodity specifications and quantitative modern chemical index. Copyright© by the Chinese Pharmaceutical Association.

  1. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    PubMed

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < 0.035). Indistinct margins and heterogeneous enhancement were independent predictors (AUC, 0.822). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < 0.05; AUC, 0.682-0.716). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
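
    A minimal sketch of one first-order texture metric (entropy) and its ROC evaluation is given below; voi_list and tumour_types are hypothetical stand-ins for the segmented volumes of interest and their pathology labels.

        # Sketch only: a first-order entropy texture metric per volume of interest and
        # an ROC comparison of the two PRCC types; voi_list / tumour_types are hypothetical.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        def first_order_entropy(voxels, bins=64):
            hist, _ = np.histogram(voxels, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        entropies = np.array([first_order_entropy(v) for v in voi_list])
        labels = np.array(tumour_types)            # 0 = type 1 PRCC, 1 = type 2 PRCC
        print("AUC:", roc_auc_score(labels, entropies))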

  2. Quantification of rare earth elements using laser-induced breakdown spectroscopy

    DOE PAGES

    Martin, Madhavi; Martin, Rodger C.; Allman, Steve; ...

    2015-10-21

    In this paper, a study of the optical emission as a function of concentration of laser-ablated yttrium (Y) and of six rare earth elements, europium (Eu), gadolinium (Gd), lanthanum (La), praseodymium (Pr), neodymium (Nd), and samarium (Sm), has been carried out using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methodology based on multivariate analysis has been used to obtain the sampling errors, coefficients of regression, calibration, and cross-validation of measurements as they relate to the LIBS analysis of graphite-matrix pellets doped with the elements at several concentrations. Each element (in oxide form) was mixed in the graphite matrix in percentages ranging from 1% to 50% by weight, and LIBS spectra were obtained for each composition as well as for pure oxide samples. A single pellet was also mixed with all the elements in equal oxide masses to determine whether the elemental peaks can be identified in a mixed pellet. This dataset is relevant for future application to studies of fission product content and distribution in irradiated nuclear fuels. These results demonstrate that the LIBS technique is inherently well suited for the future challenge of in situ analysis of nuclear materials. Finally, these studies also show that LIBS spectral analysis using statistical methodology can provide quantitative results and suggest a future approach to the far more challenging multi-element analysis of ~20 primary elements in high-burnup nuclear reactor fuel.
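
    A sketch of a cross-validated multivariate calibration is shown below, using partial least squares as a stand-in since the abstract does not name the specific multivariate method; the spectra and concentration files are hypothetical.

        # Sketch only: cross-validated multivariate calibration of concentration against
        # LIBS spectra; the regression method and file names are assumptions.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        spectra = np.load("libs_spectra.npy")            # hypothetical: (n_pellets, n_channels)
        conc = np.load("rare_earth_wt_percent.npy")      # hypothetical: (n_pellets,)

        pls = PLSRegression(n_components=5)
        r2_cv = cross_val_score(pls, spectra, conc, cv=5, scoring="r2")
        print("Cross-validated R^2:", r2_cv.mean())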

  3. Application of Elements of TPM Strategy for Operation Analysis of Mining Machine

    NASA Astrophysics Data System (ADS)

    Brodny, Jaroslaw; Tutak, Magdalena

    2017-12-01

    The Total Productive Maintenance (TPM) strategy comprises a group of activities and actions intended to keep machines in a failure-free state, without breakdowns, by limiting failures, unplanned shutdowns, defects and unplanned servicing of machines. These actions are aimed at increasing the effective utilization of the devices and machines a company possesses. A very significant element of this strategy is the combination of technical actions with changes in how they are perceived by employees, and the fundamental aim of introducing the strategy is to improve the economic efficiency of the enterprise. Increasing competition and the need to reduce production costs mean that mining enterprises are also forced to introduce this strategy. The paper presents examples of the use of the OEE model for the quantitative evaluation of selected mining machines. The OEE model is a quantitative tool of the TPM strategy and can be the basis for further work connected with its introduction. The OEE indicator is the product of three components: the availability and performance of the studied machine and the quality of the obtained product. The paper presents the results of an effectiveness analysis of a set of mining machines included in the longwall system, which is the first and most important link in the technological line of coal production. The set of analyzed machines included the longwall shearer, the armored face conveyor and the crusher. From a reliability point of view, the analyzed set of machines is a system with a serial structure. The analysis was based on data recorded by the industrial automation system used in the mines. This method of data acquisition ensured high data credibility and full time synchronization. Conclusions from the research and analyses should be used to reduce breakdowns, failures and unplanned downtime, to increase performance and to improve production quality.
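
    The OEE calculation referred to above reduces to a product of three ratios, as in the sketch below; the shift figures are invented and serve only to show the arithmetic.

        # Sketch only: the OEE product of availability, performance and quality;
        # the shift figures below are invented, not the mine's recorded data.
        planned_time = 480.0      # minutes in the shift
        downtime = 70.0           # breakdowns and unplanned stops, minutes
        ideal_rate = 10.0         # nominal output per minute of run time
        actual_output = 3200.0    # units actually produced
        good_output = 3100.0      # units meeting the quality requirement

        availability = (planned_time - downtime) / planned_time
        performance = actual_output / (ideal_rate * (planned_time - downtime))
        quality = good_output / actual_output
        oee = availability * performance * quality
        print(f"A={availability:.2f}, P={performance:.2f}, Q={quality:.2f}, OEE={oee:.2f}")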

  4. Finite element stress, vibration, and buckling analysis of laminated beams with the use of refined elements

    NASA Astrophysics Data System (ADS)

    Borovkov, Alexei I.; Avdeev, Ilya V.; Artemyev, A.

    1999-05-01

    In the present work, stress, vibration and buckling finite element analysis of laminated beams is performed. A review of the equivalent single-layer (ESL) laminate theories is given. Finite element algorithms and procedures, integrated into the original FEA program system and based on the classical laminated plate theory (CLPT), first-order shear deformation theory (FSDT), third-order theory of Reddy (TSDT-R) and third-order theory of Kant (TSDT-K), with the Lanczos method used for solving the eigenproblem, are developed. Several numerical tests and examples of bending, free vibration and buckling of multilayered and sandwich beams with various material and geometric properties and boundary conditions are solved. A new, effective higher-order hierarchical element for the accurate calculation of transverse shear stress is proposed. A comparative analysis of the results obtained with the considered models and with solutions of 2D heterogeneous anisotropic elasticity problems is carried out.

  5. Comparison of hexahedral and tetrahedral elements in finite element analysis of the foot and footwear.

    PubMed

    Tadepalli, Srinivas C; Erdemir, Ahmet; Cavanagh, Peter R

    2011-08-11

    Finite element analysis has been widely used in the field of foot and footwear biomechanics to determine plantar pressures as well as stresses and strains within soft tissue and footwear materials. When dealing with anatomical structures such as the foot, hexahedral mesh generation accounts for most of the model development time due to geometric complexities imposed by branching and embedded structures. Tetrahedral meshing, which can be more easily automated, has been the approach of choice to date in foot and footwear biomechanics. Here we use the nonlinear finite element program Abaqus (Simulia, Providence, RI) to examine the advantages and disadvantages of tetrahedral and hexahedral elements under compression and shear loading, material incompressibility, and frictional contact conditions, which are commonly seen in foot and footwear biomechanics. This study demonstrated that for a range of simulation conditions, hybrid hexahedral elements (Abaqus C3D8H) consistently performed well while hybrid linear tetrahedral elements (Abaqus C3D4H) performed poorly. On the other hand, enhanced quadratic tetrahedral elements with improved stress visualization (Abaqus C3D10I) performed as well as the hybrid hexahedral elements in terms of contact pressure and contact shear stress predictions. Although the enhanced quadratic tetrahedral element simulations were computationally expensive compared to hexahedral element simulations in both barefoot and footwear conditions, the enhanced quadratic tetrahedral element formulation seems to be very promising for foot and footwear applications as a result of decreased labor and expedited model development, all related to facilitated mesh generation. Copyright © 2011. Published by Elsevier Ltd.

  6. Finite element analysis of maxillary bone stress caused by Aramany Class IV obturator prostheses.

    PubMed

    Miyashita, Elcio Ricardo; Mattos, Beatriz Silva Câmara; Noritomi, Pedro Yoshito; Navarro, Hamilton

    2012-05-01

    The retention of an Aramany Class IV removable partial dental prosthesis can be compromised by a lack of support. The biomechanics of this obturator prosthesis result in an unusual stress distribution on the residual maxillary bone. This study evaluated the biomechanics of an Aramany Class IV obturator prosthesis with finite element analysis and a digital 3-dimensional (3-D) model developed from a computed tomography scan; bone stress was evaluated according to the load placed on the prosthesis. A 3-D model of an Aramany Class IV maxillary resection and prosthesis was constructed. This model was used to develop a finite element mesh. A 120 N load was applied to the occlusal and incisal platforms corresponding to the prosthetic teeth. Qualitative analysis was based on the scale of maximum principal stress; values obtained through quantitative analysis were expressed in MPa. Under posterior load, tensile and compressive stresses were observed; the tensile stress was greater than the compressive stress, regardless of the bone region, and the greatest compressive stress was observed on the anterior palate near the midline. Under an anterior load, tensile stress was observed in all of the evaluated bone regions; the tensile stress was greater than the compressive stress, regardless of the bone region. The Aramany Class IV obturator prosthesis tended to rotate toward the surgical resection when subjected to posterior or anterior loads. The amount of tensile and compressive stress caused by the Aramany Class IV obturator prosthesis did not exceed the physiological limits of the maxillary bone tissue. (J Prosthet Dent 2012;107:336-342). Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  7. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B-Cell Responses

    DTIC Science & Technology

    2017-05-10

    Through the use of appropriate statistical analyses, the repertoire profiles and repertoire-wide properties can be quantitatively compared. The study characterizes the B-cell response to eVLP and quantitatively compares GC B-cell repertoires across immunization conditions by partitioning the resulting clonotypes. Authors: Ilja V. Khavrutskii, Sidhartha Chaudhury.

  8. Probabilistic finite elements for fracture and fatigue analysis

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in the element is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. The performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.

  9. Quantitative real-time monitoring of multi-elements in airborne particulates by direct introduction into an inductively coupled plasma mass spectrometer

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshinari; Sato, Hikaru; Hiyoshi, Katsuhiro; Furuta, Naoki

    2012-10-01

    A new calibration system for real-time determination of trace elements in airborne particulates was developed. Airborne particulates were directly introduced into an inductively coupled plasma mass spectrometer, and the concentrations of 15 trace elements were determined by means of an external calibration method. External standard solutions were nebulized by an ultrasonic nebulizer (USN) coupled with a desolvation system, and the resulting aerosol was introduced into the plasma. The efficiency of sample introduction via the USN was calculated by two methods: (1) the introduction of a Cr standard solution via the USN was compared with introduction of a Cr(CO)6 standard gas via a standard gas generator and (2) the aerosol generated by the USN was trapped on filters and then analyzed. The Cr introduction efficiencies obtained by the two methods were the same, and the introduction efficiencies of the other elements were equal to the introduction efficiency of Cr. Our results indicated that our calibration method for introduction efficiency worked well for the 15 elements (Ti, V, Cr, Mn, Co, Ni, Cu, Zn, As, Mo, Sn, Sb, Ba, Tl and Pb). The real-time data and the filter-collection data agreed well for elements with low-melting oxides (V, Co, As, Mo, Sb, Tl, and Pb). In contrast, the real-time data were smaller than the filter-collection data for elements with high-melting oxides (Ti, Cr, Mn, Ni, Cu, Zn, Sn, and Ba). This result implies that the oxides of these 8 elements were not completely fused, vaporized, atomized, and ionized in the initial radiation zone of the inductively coupled plasma. However, quantitative real-time monitoring can be realized after correction for the element recoveries which can be calculated from the ratio of real-time data/filter-collection data.
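
    The recovery correction described in the last sentence amounts to a simple element-wise scaling, sketched below with invented numbers.

        # Sketch only: scaling real-time results for refractory-oxide elements by the
        # element-specific ratio of real-time to filter-collection data. All numbers
        # are illustrative, not measurements from the paper.
        recovery = {"Ti": 0.55, "Cr": 0.60, "Mn": 0.70}        # real-time / filter ratios
        realtime_ng_m3 = {"Ti": 12.0, "Cr": 3.1, "Mn": 8.4}    # uncorrected real-time data

        corrected = {el: realtime_ng_m3[el] / recovery[el] for el in realtime_ng_m3}
        print(corrected)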

  10. A new methodology for free wake analysis using curved vortex elements

    NASA Technical Reports Server (NTRS)

    Bliss, Donald B.; Teske, Milton E.; Quackenbush, Todd R.

    1987-01-01

    A method using curved vortex elements was developed for helicopter rotor free wake calculations. The Basic Curve Vortex Element (BCVE) is derived from the approximate Biot-Savart integration for a parabolic arc filament. When used in conjunction with a scheme to fit the elements along a vortex filament contour, this method has a significant advantage in overall accuracy and efficiency when compared to the traditional straight-line element approach. A theoretical and numerical analysis shows that free wake flows involving close interactions between filaments should utilize curved vortex elements in order to guarantee a consistent level of accuracy. The curved element method was implemented into a forward flight free wake analysis, featuring an adaptive far wake model that utilizes free wake information to extend the vortex filaments beyond the free wake regions. The curved vortex element free wake, coupled with this far wake model, exhibited rapid convergence, even in regions where the free wake and far wake turns are interlaced. Sample calculations are presented for tip vortex motion at various advance ratios for single and multiple blade rotors. Cross-flow plots reveal that the overall downstream wake flow resembles a trailing vortex pair. A preliminary assessment shows that the rotor downwash field is insensitive to element size, even for relatively large curved elements.
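
    For reference, the quantity a curved element approximates is the Biot-Savart integral along the arc; the sketch below evaluates it numerically for a parabolic filament with invented geometry and circulation.

        # Sketch only: the induced velocity of a parabolic-arc vortex filament obtained
        # by numerically integrating the Biot-Savart law, i.e. the quantity the Basic
        # Curved Vortex Element approximates analytically. Geometry and circulation
        # are invented.
        import numpy as np

        def biot_savart_arc(p, gamma, ctrl, n=200):
            """Velocity at p induced by the arc r(t) = (t, ctrl*t*(1-t), 0), t in [0, 1]."""
            t = np.linspace(0.0, 1.0, n)
            r = np.column_stack([t, ctrl * t * (1.0 - t), np.zeros_like(t)])
            dl = np.diff(r, axis=0)                       # filament segments
            mid = 0.5 * (r[1:] + r[:-1])                  # segment midpoints
            rel = p - mid                                 # vector from filament to field point
            dist = np.linalg.norm(rel, axis=1, keepdims=True)
            dv = gamma / (4.0 * np.pi) * np.cross(dl, rel) / dist**3
            return dv.sum(axis=0)

        print(biot_savart_arc(np.array([0.5, 0.5, 0.2]), gamma=1.0, ctrl=0.3))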

  11. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  12. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    The aim was to evaluate a special kind of ultrasound (US) shear wave elastography for the differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of the shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Receiver operating characteristic curves were plotted and the areas under them (AUROC) used to evaluate the diagnostic performance of the qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis in this study obtained the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.

  13. Quantitative Confocal Microscopy Analysis as a Basis for Search and Study of Potassium Kv1.x Channel Blockers

    NASA Astrophysics Data System (ADS)

    Feofanov, Alexey V.; Kudryashova, Kseniya S.; Nekrasova, Oksana V.; Vassilevski, Alexander A.; Kuzmenkov, Alexey I.; Korolkova, Yuliya V.; Grishin, Eugene V.; Kirpichnikov, Mikhail P.

    Artificial KcsA-Kv1.x (x = 1, 3) receptors were recently designed by transferring the ligand-binding site from human Kv1.x voltage-gated potassium channels into the corresponding domain of the bacterial KcsA channel. We found that KcsA-Kv1.x receptors expressed in E. coli cells are embedded into the cell membrane and bind ligands when the cells are converted to spheroplasts. We proposed that E. coli spheroplasts with membrane-embedded KcsA-Kv1.x and the fluorescently labeled ligand agitoxin-2 (R-AgTx2) can be used as elements of an advanced analytical system for the search for and study of Kv1-channel blockers. To realize this idea, special procedures were developed for the measurement and quantitative treatment of fluorescence signals obtained from the spheroplast membrane using confocal laser scanning microscopy (CLSM). The resulting "mix and read" analytical systems, supported by quantitative CLSM analysis, were demonstrated to be a reliable alternative to radioligand and electrophysiology techniques in the search for and study of selective Kv1.x channel blockers of high scientific and medical importance.

  14. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled-porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is achieved by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described.

  15. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  16. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  17. A 2-D Interface Element for Coupled Analysis of Independently Modeled 3-D Finite Element Subdomains

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.

    1998-01-01

    Over the past few years, the development of the interface technology has provided an analysis framework for embedding detailed finite element models within finite element models which are less refined. This development has enabled the use of cascading substructure domains without the constraint of coincident nodes along substructure boundaries. The approach used for the interface element is based on an alternate variational principle often used in deriving hybrid finite elements. The resulting system of equations exhibits a high degree of sparsity but gives rise to a non-positive definite system which causes difficulties with many of the equation solvers in general-purpose finite element codes. Hence the global system of equations is generally solved using a decomposition procedure with pivoting. The research reported to date for the interface element includes the one-dimensional line interface element and the two-dimensional surface interface element. Several large-scale simulations, including geometrically nonlinear problems, have been reported using the one-dimensional interface element technology; however, only limited applications are available for the surface interface element. In the applications reported to date, the geometries of the interfaced domains exactly match each other even though the spatial discretization within each domain may be different. As such, the spatial modeling of each domain, the interface elements and the assembled system is still laborious. The present research is focused on developing a rapid modeling procedure based on a parametric interface representation of independently defined subdomains which are also independently discretized.

  18. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis

    PubMed Central

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    ABSTRACT Introduction/Background: Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Material and Methods: n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. Results: The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Conclusion: Both Ki-67 and MCM-2 are

  19. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis.

    PubMed

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they

  20. Quantitative characterization of nanoscale polycrystalline magnets with electron magnetic circular dichroism.

    PubMed

    Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M

    2014-01-01

    Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre-scale magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably extends the reach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.

  1. Finite element analysis on a medical implant.

    PubMed

    Semenescu, Augustin; Radu-Ioniță, Florentina; Mateș, Ileana Mariana; Bădică, Petre; Batalu, Nicolae Dan; Negoita, Olivia Doina; Purcarea, Victor Lorin

    2016-01-01

    Several studies have shown a tight connection between several ocular pathologies and an increased risk of hip fractures due to falling, especially among elderly patients. Total replacement of the hip joint is a major surgical intervention that aims to restore the function of a hip affected by various factors, such as arthritis, injuries, and others. A corkscrew-like femoral stem was designed in order to preserve the bone stock and to prevent the occurrence of iatrogenic fractures during the hammering of the implant. In this paper, finite element analysis of the proposed design was performed, considering different loads and three types of materials. Finite element analysis is a powerful tool to simulate, optimize, design, and select suitable materials for new medical implants. The results showed that the best scenario was for the Ti6Al4V alloy, although Ti and 316L stainless steel had a reasonably high safety factor.

  2. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    PubMed

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial

  3. Quantitative determination of selenium and mercury, and an ICP-MS semi-quantitative scan of other elements in samples of eagle tissues collected from the Pacific Northwest--Summer 2011

    USGS Publications Warehouse

    May, Thomas; Walther, Mike; Brumbaugh, William

    2013-01-01

    Eagle tissues from dead eagle carcasses were collected by U.S. Fish and Wildlife Service personnel at various locations in the Pacific Northwest as part of a study to document the occurrence of metal and metalloid contaminants. A group of 182 eagle tissue samples, consisting of liver, kidney, brain, talon, feather, femur, humerus, and stomach contents, were quantitatively analyzed for concentrations of selenium and mercury by atomic absorption techniques, and for other elements by semi-quantitative scan with an inductively coupled plasma-mass spectrometer. For the various tissue matrices analyzed by an ICP-MS semiquantitative scan, some elemental concentrations (micrograms per gram dry weight) were quite variable within a particular matrix; notable observations were as follows: lead concentrations ranged from 0.2 to 31 in femurs, 0.1 to 29 in humeri, 0.1 to 54 in talons, less than (<) 0.05 to 120 in livers, <0.05 to 34 in kidneys, and 0.05 to 8 in brains; copper concentrations ranged from 5 to 9 in feathers, 8 to 47 in livers, 7 to 43 in kidneys, and 7 to 28 in brains; cadmium concentrations ranged from 0.1 to 10 in kidneys. In stomach contents, concentrations of vanadium ranged from 0.08 to 5, chromium 2 to 34, manganese 1 to 57, copper 2 to 69, arsenic <0.05 to 6, rubidium 1 to 13, and barium <0.5 to 18. Selenium concentrations from highest to lowest based on the matrix mean were as follows: kidney, liver, feather, brain, stomach content, talon, femur, and humerus. For mercury, the highest to lowest concentrations were feather, liver, talon, brain, stomach content, femur, and humerus.

  4. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, with the ability to map minor and trace elements very accurately due to the larger detector area and higher count rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical back scatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow users to predict and verify where they are likely to have problems in their images, and are especially helpful for examining possible interface artefacts. The post-processing techniques used to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.

  5. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    PubMed

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the gestational age estimated by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
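
    The three modelling steps listed above (data splitting, feature transformation, regression model computation) can be sketched as a generic pipeline, as below; the feature files and the choice of ridge regression are assumptions, not the authors' actual model.

        # Sketch only: data splitting, feature transformation and regression as a
        # generic scikit-learn pipeline; the feature and target files are hypothetical.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.linear_model import Ridge

        X = np.load("cervical_texture_features.npy")     # (n_images, n_features)
        y = np.load("gestational_age_weeks.npy")         # (n_images,)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(), Ridge(alpha=1.0)).fit(X_tr, y_tr)
        print("R^2 on held-out images:", model.score(X_te, y_te))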

  6. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  7. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  8. Key Elements of a Family Intervention for Schizophrenia: A Qualitative Analysis of an RCT.

    PubMed

    Grácio, Jaime; Gonçalves-Pereira, Manuel; Leff, Julian

    2018-03-01

    Schizophrenia is a complex biopsychosocial condition in which expressed emotion in family members is a robust predictor of relapse. Not surprisingly, family interventions are remarkably effective and thus recommended in current treatment guidelines. Their key elements seem to be common therapeutic factors, followed by education and coping skills training. However, few studies have explored these key elements and the process of the intervention itself. We conducted a qualitative and quantitative analysis of the records from a pioneering family intervention trial addressing expressed emotion, published by Leff and colleagues four decades ago. Records were analyzed into categories and data explored using descriptive statistics. This was complemented by a narrative evaluation using an inductive approach based on emotional markers and markers of change. The most used strategies in the intervention were addressing needs, followed by coping skills enhancement, advice, and emotional support. Dealing with overinvolvement and reframing were the next most frequent. Single-family home sessions seemed to augment the therapeutic work conducted in family groups. Overall the intervention seemed to promote cognitive and emotional change in the participants, and therapists were sensitive to the emotional trajectory of each subject. On the basis of our findings, we developed a longitudinal framework for better understanding the process of this treatment approach. © 2016 Family Process Institute.

  9. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  10. MicroCT parameters for multimaterial elements assessment

    NASA Astrophysics Data System (ADS)

    de Araújo, Olga M. O.; Silva Bastos, Jaqueline; Machado, Alessandra S.; dos Santos, Thaís M. P.; Ferreira, Cintia G.; Rosifini Alves Claro, Ana Paula; Lopes, Ricardo T.

    2018-03-01

    Microtomography is a non-destructive testing technique for quantitative and qualitative analysis. The investigation of multimaterial samples with large differences in density can produce artifacts that degrade image quality, depending on the additional filtration used. The aim of this study is to select the acquisition parameters most appropriate for the analysis of bone tissue containing a metallic implant. The results present MCNPX simulations of the energy distribution without an additional filter and with aluminum, copper, and brass filters, together with the corresponding reconstructed images, showing the importance of these parameter choices in the image acquisition process in computed microtomography.
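
    The role of the additional filter can be illustrated with a short beam-hardening sketch: a polychromatic spectrum attenuated by a filter (Beer-Lambert law) shifts toward a higher mean energy, which is why Al, Cu, or brass filtration reduces artifacts around dense implants. The spectrum shape, attenuation coefficients, and filter thicknesses below are placeholder assumptions, not values from the study or from MCNPX.

      import numpy as np

      # Energy grid (keV) and a toy polychromatic tube spectrum; placeholder shape,
      # not the MCNPX-simulated spectrum from the study.
      energies = np.linspace(10, 100, 91)
      spectrum = np.maximum(0.0, (100 - energies) * (energies - 10))

      # Rough linear attenuation coefficients, mu(E) ~ E**-3 scaling; illustrative only.
      def mu(e_kev, mu_at_30kev):
          return mu_at_30kev * (30.0 / e_kev) ** 3

      filters = {                          # (mu at 30 keV in 1/mm, thickness in mm); assumed
          "no filter": (0.0, 0.0),
          "Al 1 mm":   (0.3, 1.0),
          "Cu 0.1 mm": (3.0, 0.1),
      }

      for name, (mu30, t) in filters.items():
          transmitted = spectrum * np.exp(-mu(energies, mu30) * t)   # Beer-Lambert
          mean_e = np.sum(energies * transmitted) / np.sum(transmitted)
          print(f"{name:>10s}: mean energy = {mean_e:5.1f} keV")

    Running the sketch shows the mean transmitted energy increasing with filtration, the qualitative effect that the reconstructed images in the study illustrate.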

  11. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
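
    As a rough illustration of the storage idea, the sketch below assembles a tiny 1-D bar model into plain skyline (column-profile) storage, keeping only the entries below each column's skyline. This is a simplified stand-in for MPFEA's block-skyline scheme, using made-up element data.

      import numpy as np

      # Toy mesh: two bar elements connecting nodes (0-1) and (1-2), one dof per node.
      elements = [(0, 1), (1, 2)]
      ndof = 3
      k_e = np.array([[1.0, -1.0], [-1.0, 1.0]])       # unit-stiffness bar element

      # Column profile: for column j, the highest row that can be nonzero is the
      # smallest dof of any element touching j.
      first_row = np.arange(ndof)
      for dofs in elements:
          for j in dofs:
              first_row[j] = min(first_row[j], min(dofs))

      heights = np.arange(ndof) - first_row + 1        # stored entries per column
      addr = np.concatenate(([0], np.cumsum(heights))) # start of each packed column
      skyline = np.zeros(addr[-1])                     # packed upper triangle

      # Assemble each element matrix directly into the packed array.
      for dofs in elements:
          for a, i in enumerate(dofs):
              for b, j in enumerate(dofs):
                  if i <= j:                           # upper triangle only
                      skyline[addr[j] + (j - i)] += k_e[a, b]

      print("column heights:", heights)
      print("packed upper triangle:", skyline)         # [ 1.  2. -1.  1. -1.]

    Storing columns this way keeps memory proportional to the profile rather than the full matrix; the block variant used in MPFEA applies the same idea to groups of columns so that the vector units stay busy.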

  12. Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.

    PubMed

    Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse

    2018-05-01

    Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator for quantitative Doppler analysis of data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates of sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated from spectrograms estimated on a commercial ultrasound scanner.
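
    A minimal sketch of a Capon (minimum-variance) Doppler spectrum is given below for a synthetic packet of slow-time IQ samples. The data layout (spatial ensemble by packet), the diagonal loading, and the signal model are assumptions for illustration; only the estimator itself, P(f) = 1 / (a(f)^H R^-1 a(f)) with R estimated by spatial averaging, follows the approach described.

      import numpy as np

      def capon_spectrum(iq, n_freq=128, diag_load=0.01):
          """Minimal Capon (minimum-variance) Doppler spectrum sketch.

          iq : complex array, shape (n_space, packet_size); slow-time samples from
               neighbouring range/beam positions used for spatial averaging
               (illustrative layout, not the acquisition format of the paper).
          """
          n_space, n = iq.shape
          # Covariance estimated by spatial averaging only, so the full packet
          # length is kept for velocity resolution.
          R = iq.conj().T @ iq / n_space
          R += diag_load * np.trace(R).real / n * np.eye(n)   # diagonal loading
          R_inv = np.linalg.inv(R)

          freqs = np.linspace(-0.5, 0.5, n_freq, endpoint=False)  # cycles/sample
          spectrum = np.empty(n_freq)
          for k, f in enumerate(freqs):
              a = np.exp(2j * np.pi * f * np.arange(n))           # steering vector
              spectrum[k] = 1.0 / np.real(a.conj() @ R_inv @ a)
          return freqs, spectrum

      # Example: a single scatterer at normalized Doppler frequency 0.2 plus noise.
      rng = np.random.default_rng(0)
      n_space, packet = 16, 12
      t = np.arange(packet)
      iq = (np.exp(2j * np.pi * 0.2 * t)[None, :]
            + 0.1 * (rng.standard_normal((n_space, packet))
                     + 1j * rng.standard_normal((n_space, packet))))
      freqs, P = capon_spectrum(iq)
      print("peak at f =", freqs[np.argmax(P)])

    In line with the abstract, the covariance here is averaged only over space, so the full packet length remains available for velocity resolution; clutter rejection would then be applied to the resulting spectrum.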

  13. High-throughput quantitative analysis by desorption electrospray ionization mass spectrometry.

    PubMed

    Manicke, Nicholas E; Kistler, Thomas; Ifa, Demian R; Cooks, R Graham; Ouyang, Zheng

    2009-02-01

    A newly developed high-throughput desorption electrospray ionization (DESI) source was characterized in terms of its performance in quantitative analysis. A 96-sample array, containing pharmaceuticals in various matrices, was analyzed in a single run with a total analysis time of 3 min. These solution-phase samples were examined from a hydrophobic PTFE ink printed on glass. The quantitative accuracy, precision, and limit of detection (LOD) were characterized. Chemical background-free samples of propranolol (PRN) with PRN-d7 as internal standard (IS) and carbamazepine (CBZ) with CBZ-d10 as IS were examined. So were two other sample sets consisting of PRN/PRN-d7 at varying concentration in a biological milieu of 10% urine or porcine brain total lipid extract, total lipid concentration 250 ng/μL. The background-free samples, examined in a total analysis time of 1.5 s/sample, showed good quantitative accuracy and precision, with a relative error (RE) and relative standard deviation (RSD) generally less than 3% and 5%, respectively. The samples in urine and the lipid extract required a longer analysis time (2.5 s/sample) and showed RSD values of around 10% for the samples in urine and 4% for the lipid extract samples, and RE values of less than 3% for both sets. The LOD for PRN and CBZ when analyzed without chemical background was 10 and 30 fmol, respectively. The LOD of PRN increased to 400 fmol when analyzed in 10% urine, and 200 fmol when analyzed in the brain lipid extract.
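
    The accuracy and precision figures reported here follow directly from the internal-standard ratios. A small sketch of that arithmetic is shown below, using made-up calibration and replicate values rather than the paper's data.

      import numpy as np

      # Hypothetical calibration data: analyte/IS intensity ratios measured at known
      # propranolol amounts (values are illustrative, not from the paper).
      conc_std  = np.array([0.1, 0.5, 1.0, 5.0, 10.0])      # e.g. pmol per spot
      ratio_std = np.array([0.09, 0.48, 1.02, 4.9, 10.3])   # I(PRN) / I(PRN-d7)

      slope, intercept = np.polyfit(conc_std, ratio_std, 1)  # linear calibration

      # Replicate measurements of one "unknown" sample (true value assumed = 2.0).
      ratio_obs = np.array([2.05, 1.96, 2.10, 1.99, 2.03])
      conc_obs = (ratio_obs - intercept) / slope

      true_conc = 2.0
      re  = 100 * (conc_obs.mean() - true_conc) / true_conc   # relative error, %
      rsd = 100 * conc_obs.std(ddof=1) / conc_obs.mean()      # precision, %RSD
      print(f"RE = {re:.1f}%  RSD = {rsd:.1f}%")

    RE measures closeness to the nominal amount, while the RSD across replicates captures the spot-to-spot precision quoted above.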

  14. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    PubMed

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Finite element analysis of unnotched charpy impact tests

    DOT National Transportation Integrated Search

    2008-10-01

    This paper describes nonlinear finite element analysis (FEA) to examine the energy to fracture unnotched Charpy specimens under pendulum impact loading. An oversized, nonstandard pendulum impactor, called the Bulk Fracture Charpy Machine (BFCM), ...

  16. Stress distribution of the foot during mid-stance to push-off in barefoot gait: a 3-D finite element analysis.

    PubMed

    Chen, W P; Tang, F T; Ju, C W

    2001-08-01

    The aims were to quantify the stress distribution of the foot during mid-stance to push-off in barefoot gait using 3-D finite element analysis, and to simulate the foot structure so as to facilitate later consideration of footwear. A finite element model was generated, and a loading condition simulating barefoot gait during mid-stance to push-off was used to quantify the stress distributions. A computational model can provide the overall stress distribution of the foot subject to various loading conditions. A preliminary 3-D finite element foot model was generated based on the computed tomography data of a male subject, and the bone and soft tissue structures were modeled. Analysis was performed for a loading condition simulating barefoot gait during mid-stance to push-off. The peak plantar pressure ranged from 374 to 1003 kPa and the peak von Mises stress in the bone ranged from 2.12 to 6.91 MPa at different instants. The plantar pressure patterns were similar to measurement results from the previous literature. The present study provides a preliminary computational model that is capable of estimating the overall plantar pressure and bone stress distributions. It can also provide quantitative analysis for normal and pathological foot motion. This model can identify areas of increased pressure and correlate the pressure with foot pathology. Potential applications can be found in the study of foot deformities, footwear, and surgical interventions. It may assist pre-treatment planning and the design of pedorthotic appliances, and predict the treatment effect of foot orthoses.

  17. Quantitative analysis of professionally trained versus untrained voices.

    PubMed

    Siupsinskiene, Nora

    2003-01-01

    The aim of this study was to compare healthy trained and untrained voices, as well as healthy and dysphonic trained voices, in adults using combined voice range profile and aerodynamic tests; to define the normal-range limiting values of quantitative voice parameters; and to select the most informative quantitative voice parameters for separating healthy from dysphonic trained voices. Three groups of persons were evaluated. One hundred eighty-six healthy volunteers were divided into two groups according to voice training: the non-professional speakers group consisted of 106 persons with untrained voices (36 males and 70 females) and the professional speakers group of 80 persons with trained voices (21 males and 59 females). The clinical group consisted of 103 dysphonic professional speakers (23 males and 80 females) with various voice disorders. Eighteen quantitative voice parameters from the combined voice range profile (VRP) test were analyzed: 8 voice range profile parameters, 8 speaking voice parameters, the overall vocal dysfunction degree and the coefficient of sound, together with the aerodynamic maximum phonation time. Analysis showed that healthy professional speakers demonstrated expanded vocal abilities in comparison to healthy non-professional speakers. Quantitative voice range profile parameters (pitch range, high frequency limit, area of high frequencies, and coefficient of sound) differed significantly between healthy professional and non-professional voices, and were more informative than speaking voice or aerodynamic parameters in reflecting voice training. Logistic stepwise regression revealed that the VRP area in high frequencies was sufficient to discriminate between healthy and dysphonic professional speakers for male subjects (overall discrimination accuracy 81.8%), and a combination of three quantitative parameters (VRP high frequency limit, maximum voice intensity, and slope of the speaking curve) for female subjects (overall model discrimination accuracy 75.4%). We concluded that quantitative voice assessment

  18. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students' to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  19. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  20. Quantitative genetic analysis of brain copper and zinc in BXD recombinant inbred mice.

    PubMed

    Jones, Leslie C; McCarthy, Kristin A; Beard, John L; Keen, Carl L; Jones, Byron C

    2006-01-01

    Copper and zinc are trace nutrients essential for normal brain function, yet an excess of these elements can be toxic. It is important therefore that these metals be closely regulated. We recently conducted a quantitative trait loci (QTL) analysis to identify chromosomal regions in the mouse containing possible regulatory genes. The animals came from 15 strains of the BXD/Ty recombinant inbred (RI) strain panel and the brain regions analyzed were frontal cortex, caudate-putamen, nucleus accumbens and ventral midbrain. Several QTL were identified for copper and/or zinc, most notably on chromosomes 1, 8, 16 and 17. Genetic correlational analysis also revealed associations between these metals and dopamine, cocaine responses, saccharine preference, immune response and seizure susceptibility. Notably, the QTL on chromosome 17 is also associated with seizure susceptibility and contains the histocompatibility H2 complex. This work shows that regulation of zinc and copper is under polygenic influence and is intimately related to CNS function. Future work will reveal genes underlying the QTL and how they interact with other genes and the environment. More importantly, revelation of the genetic underpinnings of copper and zinc brain homeostasis will aid our understanding of neurological diseases that are related to copper and zinc imbalance.

  1. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that can capture the characteristics of eye movement are proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP), and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.

  2. Electroencephalography reactivity for prognostication of post-anoxic coma after cardiopulmonary resuscitation: A comparison of quantitative analysis and visual analysis.

    PubMed

    Liu, Gang; Su, Yingying; Jiang, Mengdi; Chen, Weibi; Zhang, Yan; Zhang, Yunzhou; Gao, Daiquan

    2016-07-28

    Electroencephalogram reactivity (EEG-R) is a positive predictive factor for assessing outcomes in comatose patients. Most studies assess the prognostic value of EEG-R using visual analysis; however, this method is prone to subjectivity. We sought to categorize EEG-R with a quantitative approach. We retrospectively studied consecutive comatose patients who had an EEG-R recording performed 1-3 days after cardiopulmonary resuscitation (CPR) or during normothermia after therapeutic hypothermia. EEG-R was assessed via visual analysis and quantitative analysis separately. Clinical outcomes were followed up at 3 months and dichotomized as recovery of awareness or no recovery of awareness. A total of 96 patients met the inclusion criteria, and 38 (40%) patients had recovered awareness at the 3-month follow-up. Of the 27 patients with EEG-R on visual analysis, 22 recovered awareness; of the 69 patients who did not demonstrate EEG-R, 16 recovered awareness. The sensitivity and specificity of visually measured EEG-R were 58% and 91%, respectively. The area under the receiver operating characteristic curve for the quantitative analysis was 0.92 (95% confidence interval, 0.87-0.97), with a best cut-off value of 0.10. EEG-R assessed through quantitative analysis might be a good method for predicting the recovery of awareness in patients with post-anoxic coma after CPR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
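
    For readers who want the diagnostic arithmetic spelled out, the sketch below computes sensitivity, specificity, and ROC AUC for a quantitative reactivity index against a dichotomized outcome. The index values are simulated for illustration; only the group sizes (38 recovered of 96) and the 0.10 cut-off echo the abstract.

      import numpy as np

      def roc_auc(scores, labels):
          """Empirical ROC AUC via the rank-sum (Mann-Whitney) identity."""
          scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
          pos, neg = scores[labels], scores[~labels]
          # Probability that a random positive scores higher than a random negative.
          greater = (pos[:, None] > neg[None, :]).sum()
          ties    = (pos[:, None] == neg[None, :]).sum()
          return (greater + 0.5 * ties) / (len(pos) * len(neg))

      # Hypothetical data: quantitative EEG-R index per patient and outcome
      # (True = recovered awareness). Values are illustrative only.
      rng = np.random.default_rng(1)
      outcome = np.r_[np.ones(38, bool), np.zeros(58, bool)]
      index = np.r_[rng.normal(0.18, 0.06, 38), rng.normal(0.07, 0.04, 58)]

      print("AUC =", round(roc_auc(index, outcome), 2))

      # Sensitivity / specificity at the reported cut-off of 0.10.
      pred = index >= 0.10
      sens = (pred & outcome).sum() / outcome.sum()
      spec = (~pred & ~outcome).sum() / (~outcome).sum()
      print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")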

  3. A finite element analysis of viscoelastically damped sandwich plates

    NASA Astrophysics Data System (ADS)

    Ma, B.-A.; He, J.-F.

    1992-01-01

    A finite element analysis associated with an asymptotic solution method for the harmonic flexural vibration of viscoelastically damped unsymmetrical sandwich plates is given. The element formulation is based on generalization of the discrete Kirchhoff theory (DKT) element formulation. The results obtained with the first order approximation of the asymptotic solution presented here are the same as those obtained by means of the modal strain energy (MSE) method. By taking more terms of the asymptotic solution, with successive calculations and use of the Padé approximants method, accuracy can be improved. The finite element computation has been verified by comparison with an analytical exact solution for rectangular plates with simply supported edges. Results for the same plates with clamped edges are also presented.
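
    The Padé step mentioned here, turning a truncated series into a rational approximant to extend its range of validity, can be illustrated with a toy series (not the sandwich-plate asymptotics). The sketch below uses scipy's pade helper on the Taylor coefficients of ln(1 + x):

      import numpy as np
      from scipy.interpolate import pade

      # Toy illustration of Pade acceleration: approximate f(x) = ln(1 + x)
      # from its Taylor coefficients 0, 1, -1/2, 1/3, ...
      n_terms = 6
      coeffs = [0.0] + [(-1.0) ** (k + 1) / k for k in range(1, n_terms)]

      p, q = pade(coeffs, 2)           # [3/2] Pade approximant p(x)/q(x)

      x = 0.9                          # near the edge of the series' useful range
      taylor = sum(c * x**k for k, c in enumerate(coeffs))
      print("exact :", np.log(1 + x))
      print("taylor:", taylor)
      print("pade  :", p(x) / q(x))

    Near x = 0.9 the truncated series is noticeably off while the [3/2] Padé approximant stays close to the exact value, which is the kind of accuracy gain the authors obtain by resumming additional asymptotic terms.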

  4. Quantitative comparison of cognitive behavioral therapy and music therapy research: a methodological best-practices analysis to guide future investigation for adult psychiatric patients.

    PubMed

    Silverman, Michael J

    2008-01-01

    While the music therapy profession is relatively young and small in size, it can treat a variety of clinical populations and has established a diverse research base. However, although the profession originated in work with persons diagnosed with mental illnesses, there is a considerable lack of quantitative research concerning the effects of music therapy with this population. Music therapy clinicians and researchers have reported on this lack of evidence and the difficulty of conducting psychosocial research on their interventions (Choi, 1997; Silverman, 2003a). While published studies have provided suggestions for future research, no studies have provided detailed propositions for the methodology and design of meticulous, high-quality randomized controlled psychiatric music therapy research. How have other psychotherapies built their evidence bases, and could the music therapy field borrow from their rigorous "methodological best practices" to strengthen its own literature base? Therefore, as the National Institutes of Mental Health state that the treatment of choice for evidence-based psychotherapy is cognitive behavioral therapy (CBT), aspects of this psychotherapy's literature base were analyzed. The purpose of this literature analysis was to (a) analyze and identify components of high-quality quantitative CBT research for adult psychiatric consumers, (b) analyze and identify the variables and other elements of existing quantitative psychiatric music therapy research for adult consumers, and (c) compare the two data sets to identify the best methodological designs and variables for future quantitative music therapy research with the mental health population. A table analyzing randomized and thoroughly controlled studies involving the use of CBT for persons with severe mental illnesses is included to determine the chief components of high-quality experimental research designs and the implementation of quantitative clinical research. The table also shows the same analyzed

  5. Finite Element Analysis of the LOLA Receiver Telescope Lens

    NASA Technical Reports Server (NTRS)

    Matzinger, Elizabeth

    2007-01-01

    This paper presents the finite element stress and distortion analysis completed on the Receiver Telescope lens of the Lunar Orbiter Laser Altimeter (LOLA). LOLA is one of six instruments on the Lunar Reconnaissance Orbiter (LRO), scheduled to launch in 2008. LOLA's main objective is to produce a high-resolution global lunar topographic model to aid in safe landings and enhance surface mobility in future exploration missions. The Receiver Telescope captures the laser pulses transmitted through a diffractive optical element (DOE) and reflected off the lunar surface. The largest lens of the Receiver Telescope, Lens 1, is a 150 mm diameter aspheric lens originally designed to be made of BK7 glass. The finite element model of the Receiver Telescope Lens 1 is comprised of solid elements and constrained in a manner consistent with the behavior of the mounting configuration of the Receiver Telescope tube. Twenty-one temperature load cases were mapped to the nodes based on thermal analysis completed by LOLA's lead thermal analyst, and loads were applied to simulate the preload applied from the ring flexure. The thermal environment of the baseline design (uncoated BK7 lens with no baffle) produces large radial and axial gradients in the lens. These large gradients create internal stresses that may lead to part failure, as well as significant bending that degrades optical performance. The high stresses and large distortions shown in the analysis precipitated a design change from BK7 glass to sapphire.

  6. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on the von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating that the implementation of the elastic-plastic mixed-iterative analysis is appropriate.
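
    The core of such an algorithm is the return-mapping update at each integration point: an elastic predictor followed by a plastic corrector that enforces the von Mises yield condition with an associative flow rule. The sketch below is a one-dimensional version with linear isotropic hardening and made-up material constants; it is not the MHOST implementation.

      import numpy as np

      def radial_return(strain, strain_p_old, alpha_old, E=200e3, sigma_y=250.0, H=1e3):
          """Minimal 1-D return-mapping sketch for von Mises plasticity with linear
          isotropic hardening (illustrative only).

          strain        : total strain at the new load step
          strain_p_old  : plastic strain from the previous step
          alpha_old     : accumulated plastic strain (hardening variable)
          """
          sigma_trial = E * (strain - strain_p_old)            # elastic predictor
          f_trial = abs(sigma_trial) - (sigma_y + H * alpha_old)

          if f_trial <= 0.0:                                   # still elastic
              return sigma_trial, strain_p_old, alpha_old

          # Plastic corrector: associative flow, consistency f = 0 at the new state.
          dgamma = f_trial / (E + H)
          sigma = sigma_trial - E * dgamma * np.sign(sigma_trial)
          strain_p = strain_p_old + dgamma * np.sign(sigma_trial)
          alpha = alpha_old + dgamma
          return sigma, strain_p, alpha

      # Drive a single material point through a strain ramp.
      sig, ep, a = 0.0, 0.0, 0.0
      for eps in np.linspace(0.0, 0.004, 9):
          sig, ep, a = radial_return(eps, ep, a)
          print(f"strain {eps:.4f} -> stress {sig:7.1f} MPa")

    In a mixed-iterative shell code the same predictor-corrector logic runs on the full stress tensor (deviatoric radial return), with the resulting stresses fed back into the global iteration.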

  7. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Element-by-element Solution Procedures for Nonlinear Structural Analysis

    NASA Technical Reports Server (NTRS)

    Hughes, T. J. R.; Winget, J. M.; Levit, I.

    1984-01-01

    Element-by-element approximate factorization procedures are proposed for solving the large finite element equation systems which arise in nonlinear structural mechanics. Architectural and data base advantages of the present algorithms over traditional direct elimination schemes are noted. Results of calculations suggest considerable potential for the methods described.
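
    A matrix-free sketch of the element-by-element idea is given below: the global stiffness matrix is never assembled, and the iterative solver only needs an element-loop matrix-vector product. The mesh, the element stiffness, and the plain conjugate-gradient driver are toy assumptions; the approximate-factorization preconditioner proposed in the paper is not reproduced.

      import numpy as np

      # Matrix-free, element-by-element product y = K @ x for a 1-D chain of bar
      # elements; this is the kind of kernel an EBE iterative solver is built around.
      n_nodes = 6
      elements = [(i, i + 1) for i in range(n_nodes - 1)]
      k_e = np.array([[1.0, -1.0], [-1.0, 1.0]])        # element stiffness

      def ebe_matvec(x):
          y = np.zeros_like(x)
          for dofs in elements:                          # loop is trivially parallel
              y[list(dofs)] += k_e @ x[list(dofs)]       # gather, multiply, scatter
          return y

      # Simple conjugate gradient using only ebe_matvec.
      def cg(b, fixed=0, tol=1e-10, maxit=100):
          x = np.zeros_like(b)
          r = b - ebe_matvec(x); r[fixed] = 0.0          # crude Dirichlet handling
          p = r.copy()
          for _ in range(maxit):
              Ap = ebe_matvec(p); Ap[fixed] = 0.0
              alpha = r @ r / (p @ Ap)
              x += alpha * p
              r_new = r - alpha * Ap
              if np.linalg.norm(r_new) < tol:
                  break
              p = r_new + (r_new @ r_new) / (r @ r) * p
              r = r_new
          return x

      f = np.zeros(n_nodes); f[-1] = 1.0                 # unit load at the free end
      print(cg(f))                                       # expect 0, 1, 2, 3, 4, 5

    Because the element loop touches only local data, it parallelizes naturally, which is the architectural advantage over global direct elimination noted in the abstract.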

  9. An emulator for minimizing finite element analysis implementation resources

    NASA Technical Reports Server (NTRS)

    Melosh, R. J.; Utku, S.; Salama, M.; Islam, M.

    1982-01-01

    A finite element analysis emulator providing a basis for efficiently establishing an optimum computer implementation strategy when many calculations are involved is described. The SCOPE emulator determines the computer resources required as a function of the structural model, the structural load-deflection equation characteristics, the storage allocation plan, and the computer hardware capabilities. It thereby provides data for trading off analysis implementation options to arrive at the best strategy. The models contained in SCOPE lead to micro-operation computer counts of each finite element operation as well as overall computer resource cost estimates. Application of SCOPE to the Memphis-Arkansas bridge analysis provides measures of the accuracy of resource assessments. Data indicate that predictions are within 17.3 percent for calculation times and within 3.2 percent for peripheral storage resources for the ELAS code.

  10. Finite-element reentry heat-transfer analysis of space shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Quinn, Robert D.; Gong, Leslie

    1986-01-01

    A structural performance and resizing (SPAR) finite-element thermal analysis computer program was used in the heat-transfer analysis of the space shuttle orbiter subjected to reentry aerodynamic heating. Three wing cross sections and one midfuselage cross section were selected for the thermal analysis. The predicted thermal protection system temperatures were found to agree well with flight-measured temperatures. The calculated aluminum structural temperatures also agreed reasonably well with the flight data from reentry to touchdown. The effects of internal radiation and of internal convection were found to be significant. The SPAR finite-element solutions agreed reasonably well with those obtained from the conventional finite-difference method.

  11. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  12. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  13. In situ semi-quantitative analysis of polluted soils by laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Ismaël, Amina; Bousquet, Bruno; Michel-Le Pierrès, Karine; Travaillé, Grégoire; Canioni, Lionel; Roy, Stéphane

    2011-05-01

    Time-saving, low-cost analyses of soil contamination are required to ensure fast and efficient pollution removal and remedial operations. In this work, laser-induced breakdown spectroscopy (LIBS) has been successfully applied to in situ analyses of polluted soils, providing direct semi-quantitative information about the extent of pollution. A field campaign has been carried out in Brittany (France) on a site presenting high levels of heavy metal concentrations. Results on iron as a major component as well as on lead and copper as minor components are reported. Soil samples were dried and prepared as pressed pellets to minimize the effects of moisture and density on the results. LIBS analyses were performed with a Nd:YAG laser operating at 1064 nm, 60 mJ per 10 ns pulse, at a repetition rate of 10 Hz with a diameter of 500 μm on the sample surface. Good correlations were obtained between the LIBS signals and the values of concentrations deduced from inductively coupled plasma atomic emission spectroscopy (ICP-AES). This result proves that LIBS is an efficient method for optimizing sampling operations. Indeed, "LIBS maps" were established directly on-site, providing valuable assistance in optimizing the selection of the most relevant samples for future expensive and time-consuming laboratory analysis and avoiding useless analyses of very similar samples. Finally, it is emphasized that in situ LIBS is not described here as an alternative quantitative analytical method to the usual laboratory measurements but simply as an efficient time-saving tool to optimize sampling operations and to drastically reduce the number of soil samples to be analyzed, thus reducing costs. The detection limits of 200 ppm for lead and 80 ppm for copper reported here are compatible with the thresholds of toxicity; thus, this in situ LIBS campaign was fully validated for these two elements. Consequently, further experiments are planned to extend this study to other chemical elements and other
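
    The semi-quantitative use described here amounts to calibrating LIBS line intensity against laboratory concentrations. The sketch below shows that arithmetic with invented numbers; the paper's actual intensities, concentrations, and the reported 200 and 80 ppm detection limits are not reproduced.

      import numpy as np

      # Hypothetical paired data: ICP-AES concentrations (ppm) and background-corrected
      # LIBS line intensities for the same soil pellets (values are illustrative).
      conc_icp  = np.array([50, 120, 300, 650, 1200, 2500])      # Pb, ppm
      intensity = np.array([0.9, 2.3, 5.8, 12.5, 23.0, 48.5])    # a.u.

      slope, intercept = np.polyfit(conc_icp, intensity, 1)
      r = np.corrcoef(conc_icp, intensity)[0, 1]
      print(f"sensitivity = {slope:.4f} a.u./ppm, r^2 = {r**2:.3f}")

      # Semi-quantitative map value for a new on-site measurement.
      i_new = 8.0
      print("estimated concentration ~", round((i_new - intercept) / slope), "ppm")

      # 3-sigma detection limit from replicate blank readings (assumed values).
      blank = np.array([0.05, 0.07, 0.04, 0.06, 0.05])
      lod = 3 * blank.std(ddof=1) / slope
      print(f"LOD ~ {lod:.0f} ppm (with these toy numbers)")

    A strong correlation between the two columns is what justifies using the LIBS maps to rank sampling locations before committing to full laboratory analysis.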

  14. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

    The aims were to prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup of all patients with urolithiasis, stone samples from 50 patients satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered, in 35 patients (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium hexahydrate and xanthine stones. Fourier transform infrared spectroscopy allows an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of the precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities. This may prevent and/or delay stone recurrence.

  15. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  16. Development and validation of an ICP-OES method for quantitation of elemental impurities in tablets according to coming US pharmacopeia chapters.

    PubMed

    Støving, Celina; Jensen, Henrik; Gammelgaard, Bente; Stürup, Stefan

    2013-10-01

    On May 1, 2014, the United States Pharmacopeia (USP) will implement two new chapters stating limit concentrations of elemental impurities in pharmaceuticals, applying inductively coupled plasma methods. In the present work an inductively coupled plasma optical emission spectrometry (ICP-OES) method for quantitation of As, Cd, Cu, Cr, Fe, Hg, Ir, Mn, Mo, Ni, Os, Pb, Pd, Pt, Rh, Ru, V and Zn in tablets according to the new USP chapters was developed. Sample preparation was performed by microwave-assisted acid digestion using a mixture of 65% HNO3 and 37% HCl (3:1, v/v). Limits of detection and quantitation were at least a factor of ten below the USP limit concentrations, showing that the ICP-OES technique is well suited for quantitation of elemental impurities. Excluding Os, spike recoveries in the range of 85.3-103.8% were obtained, with relative standard deviations (%RSD) ranging from 1.3 to 3.2%. Due to memory effects, the spike recovery and %RSD of Os were 161.5% and 13.7%, respectively; thus the method will need further development with respect to elimination of the Os memory effect. The method was proven to be specific, but with potential spectral interferences for Ir, Os, Pb, Pt and Rh necessitating visual examination of the spectra. The Hg memory effect was handled by using lower spike levels combined with rinsing with 0.1 M HCl. The tablets had a content of Fe and Pt of 182.8 ± 18.1 and 2.8 ± 0.2 μg/g, respectively, and therefore did not exceed the limit concentrations defined by USP. It is suggested that the developed method is applicable to pharmaceutical products with a composition and maximal daily intake (g drug product/day) similar to the tablets used in this work. Copyright © 2013 Elsevier B.V. All rights reserved.
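
    The validation quantities quoted here (spike recovery, %RSD, and detection/quantitation limits) are straightforward to reproduce. The sketch below uses invented replicate values and an assumed calibration sensitivity, not the paper's measurements.

      import numpy as np

      # Hypothetical replicate results for one element (e.g. Pb), all in ug/g;
      # numbers are illustrative, not the paper's data.
      unspiked = np.array([0.42, 0.45, 0.43])          # native content of the digest
      spiked   = np.array([5.31, 5.18, 5.40])          # after adding a 5.0 ug/g spike
      spike_added = 5.0

      recovery = 100 * (spiked.mean() - unspiked.mean()) / spike_added
      rsd      = 100 * spiked.std(ddof=1) / spiked.mean()
      print(f"spike recovery = {recovery:.1f}%, RSD = {rsd:.1f}%")

      # LOD / LOQ from the standard deviation of blank readings and the
      # calibration sensitivity (counts per ug/g); both values assumed here.
      blank_sd    = 12.0      # counts
      sensitivity = 900.0     # counts per ug/g
      lod = 3.3 * blank_sd / sensitivity
      loq = 10  * blank_sd / sensitivity
      print(f"LOD = {lod:.3f} ug/g, LOQ = {loq:.3f} ug/g")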

  17. A multi-element screening method to identify metal targets for blood biomonitoring in green sea turtles (Chelonia mydas).

    PubMed

    Villa, C A; Finlayson, S; Limpus, C; Gaus, C

    2015-04-15

    Biomonitoring of blood is commonly used to identify and quantify occupational or environmental exposure to chemical contaminants. Increasingly, this technique has been applied to wildlife contaminant monitoring, including for green turtles, allowing for the non-lethal evaluation of chemical exposure in their nearshore environment. The sources, composition, bioavailability and toxicity of metals in the marine environment are, however, often unknown and influenced by numerous biotic and abiotic factors. These factors can vary considerably across time and space making the selection of the most informative elements for biomonitoring challenging. This study aimed to validate an ICP-MS multi-element screening method for green turtle blood in order to identify and facilitate prioritisation of target metals for subsequent fully quantitative analysis. Multi-element screening provided semiquantitative results for 70 elements, 28 of which were also determined through fully quantitative analysis. Of the 28 comparable elements, 23 of the semiquantitative results had an accuracy between 67% and 112% relative to the fully quantified values. In lieu of any available turtle certified reference materials (CRMs), we evaluated the use of human blood CRMs as a matrix surrogate for quality control, and compared two commonly used sample preparation methods for matrix related effects. The results demonstrate that human blood provides an appropriate matrix for use as a quality control material in the fully quantitative analysis of metals in turtle blood. An example for the application of this screening method is provided by comparing screening results from blood of green turtles foraging in an urban and rural region in Queensland, Australia. Potential targets for future metal biomonitoring in these regions were identified by this approach. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Fourier analysis of finite element preconditioned collocation schemes

    NASA Technical Reports Server (NTRS)

    Deville, Michel O.; Mund, Ernest H.

    1990-01-01

    The spectrum of the iteration operator of some finite element preconditioned Fourier collocation schemes is investigated. The first part of the paper analyses one-dimensional elliptic and hyperbolic model problems and the advection-diffusion equation. Analytical expressions of the eigenvalues are obtained with use of symbolic computation. The second part of the paper considers the set of one-dimensional differential equations resulting from Fourier analysis (in the transverse direction) of the 2-D Stokes problem. All results agree with previous conclusions on the numerical efficiency of finite element preconditioning schemes.

  19. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implementation into a representative current-generation code. The ADINA code was selected because of prior use of it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  20. Quantitative bioimaging of trace elements in the human lens by LA-ICP-MS.

    PubMed

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; González-Iglesias, Héctor; Coca-Prados, Miguel; Sanz-Medel, Alfredo

    2014-04-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) was used for the quantitative imaging of Fe, Cu and Zn in cryostat sections of human eye lenses and for depth profiling analysis in bovine lenses. To ensure a tight temperature control throughout the experiments, a new Peltier-cooled laser ablation cell was employed. For quantification purposes, matrix-matched laboratory standards were prepared from a pool of human lenses from eye donors and spiked with standard solutions containing different concentrations of natural abundance Fe, Cu and Zn. A normalisation strategy was also carried out to correct matrix effects, lack of tissue homogeneity and/or instrumental drifts using a thin gold film deposited on the sample surface. Quantitative images of cryo-sections of human eye lenses analysed by LA-ICP-MS revealed a homogeneous distribution of Fe, Cu and Zn in the nuclear region and a slight increase in Fe concentration in the outer cell layer (i.e. lens epithelium) at the anterior pole. These results were assessed also by isotope dilution mass spectrometry, and Fe, Cu and Zn concentrations determined by ID-ICP-MS in digested samples of lenses and lens capsules.

  1. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    PubMed Central

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived American Heart Association (AHA) 17-segment model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contributions of this work are a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential to increase the accuracy of perfusion analysis. Evaluation of the diagnostic accuracy of quantitative perfusion analysis with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  2. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  3. Finite element analysis of wrinkling membranes

    NASA Technical Reports Server (NTRS)

    Miller, R. K.; Hedgepeth, J. M.; Weingarten, V. I.; Das, P.; Kahyai, S.

    1984-01-01

    The development of a nonlinear numerical algorithm for the analysis of stresses and displacements in partly wrinkled flat membranes, and its implementation on the SAP VII finite-element code are described. A comparison of numerical results with exact solutions of two benchmark problems reveals excellent agreement, with good convergence of the required iterative procedure. An exact solution of a problem involving axisymmetric deformations of a partly wrinkled shallow curved membrane is also reported.

  4. PIXE analysis of caries related trace elements in tooth enamel

    NASA Astrophysics Data System (ADS)

    Annegarn, H. J.; Jodaikin, A.; Cleaton-Jones, P. E.; Sellschop, J. P. F.; Madiba, C. C. P.; Bibby, D.

    1981-03-01

    PIXE analysis has been applied to a set of twenty human teeth to determine trace element concentrations in enamel from areas susceptible to dental caries (mesial and distal contact points) and in areas less susceptible to the disease (buccal surfaces), with the aim of determining the possible roles of trace elements in the carious process. The samples were caries-free anterior incisors extracted for periodontal reasons from subjects 10-30 years of age. Prior to extraction of the sample teeth, a detailed dental history and examination was carried out for each individual. PIXE analysis, using a 3 MeV proton beam of 1 mm diameter, allowed the determination of Ca, Mn, Fe, Cu, Zn, Sr and Pb above detection limits. As demonstrated in this work, the enhanced sensitivity of PIXE analysis over electron microprobe analysis, and the capability of localised surface analysis compared with the pooled samples required for neutron activation analysis, make it a powerful and useful technique in dental analysis.

  5. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
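
    To make the idea concrete, the sketch below builds a generalized Morse wavelet in the frequency domain, Psi(omega) proportional to omega^beta * exp(-omega^gamma) for omega > 0, transforms a synthetic signal containing two isolated events, and reports the largest transform maximum. The peak normalization, the choice beta = gamma = 3, the Gaussian-shaped events, and the noise level are all illustrative assumptions; the paper's significance test and region-of-influence criterion are not reproduced.

      import numpy as np

      def morse_wavelet_freq(omega, beta=3.0, gamma=3.0):
          """Generalized Morse wavelet in the frequency domain (analytic, omega > 0).
          Peak-normalized for simplicity rather than using the standard constant."""
          psi = np.zeros_like(omega)
          pos = omega > 0
          psi[pos] = (omega[pos] ** beta) * np.exp(-omega[pos] ** gamma)
          return psi / psi.max()

      def morse_transform(x, scales, beta=3.0, gamma=3.0):
          """Continuous wavelet transform by frequency-domain multiplication."""
          n = len(x)
          omega = 2 * np.pi * np.fft.fftfreq(n)          # radian frequency per sample
          X = np.fft.fft(x)
          W = np.empty((len(scales), n), complex)
          for i, s in enumerate(scales):
              W[i] = np.fft.ifft(X * morse_wavelet_freq(s * omega, beta, gamma))
          return W

      # Synthetic signal: two isolated "events" in white noise (illustrative only).
      rng = np.random.default_rng(2)
      n = 1024
      t = np.arange(n)
      x = rng.normal(0, 0.3, n)
      for t0, width in [(300, 20), (700, 40)]:
          x += np.exp(-0.5 * ((t - t0) / width) ** 2)

      scales = np.geomspace(5, 100, 40)
      W = morse_transform(x, scales)
      i, j = np.unravel_index(np.argmax(np.abs(W)), W.shape)
      print("largest transform maximum at time", j, "scale", round(scales[i], 1))

    The transform modulus peaks near the event centres at scales matched to the event durations, which is the property the method exploits when it reads event parameters directly off the transform maxima.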

  6. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.

  7. Energy Finite Element Analysis Developments for Vibration Analysis of Composite Aircraft Structures

    NASA Technical Reports Server (NTRS)

    Vlahopoulos, Nickolas; Schiller, Noah H.

    2011-01-01

    The Energy Finite Element Analysis (EFEA) has been utilized successfully for modeling complex structural-acoustic systems with isotropic structural material properties. In this paper, a formulation for modeling structures made out of composite materials is presented. An approach based on spectral finite element analysis is utilized first for developing the equivalent material properties for the composite material. These equivalent properties are employed in the EFEA governing differential equations for representing the composite materials and deriving the element level matrices. The power transmission characteristics at connections between members made out of non-isotropic composite material are considered for deriving suitable power transmission coefficients at junctions of interconnected members. These coefficients are utilized for computing the joint matrix that is needed to assemble the global system of EFEA equations. The global system of EFEA equations is solved numerically and the vibration levels within the entire system can be computed. The new EFEA formulation for modeling composite laminate structures is validated through comparison to test data collected from a representative composite aircraft fuselage that is made out of a composite outer shell and composite frames and stiffeners. NASA Langley constructed the composite cylinder and conducted the test measurements utilized in this work.

  8. Impeller deflection and modal finite element analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Nathan A.

    2013-10-01

    Deflections of an impeller due to centripetal forces are calculated using finite element analysis. The lateral, or out of plane, deflections are an important design consideration for this particular impeller because it incorporates an air bearing with critical gap tolerances. The target gap distance is approximately 10 microns at a rotational velocity of 2500 rpm. The centripetal forces acting on the impeller cause it to deflect in a concave fashion, decreasing the initial gap distance as a function of radial position. This deflection is characterized for a previous and updated impeller design for comparative purposes. The impact of design options such as material selection, geometry dimensions, and operating rotational velocity are also explored, followed by a sensitivity study with these parameters bounded by specific design values. A modal analysis is also performed to calculate the impeller's natural frequencies, which are desired to be avoided during operation. The finite element modeling techniques continue to be exercised by the impeller design team to address specific questions and evaluate conceptual designs, some of which are included in the Appendix.

  9. Verification of finite element analysis of fixed partial denture with in vitro electronic strain measurement.

    PubMed

    Wang, Gaoqi; Zhang, Song; Bian, Cuirong; Kong, Hui

    2016-01-01

    The purpose of the study was to verify the finite element analysis model of a three-unit fixed partial denture with in vitro electronic strain analysis, and to analyze the clinical situation with the verified model. First, strain gauges were attached to the critical areas of a three-unit fixed partial denture. Strain values were measured under a 300 N load perpendicular to the occlusal plane. Secondly, a three-dimensional finite element model in accordance with the electronic strain analysis experiment was constructed from the scanning data, and the strain values obtained by finite element analysis and in vitro measurements were compared. Finally, the clinical destruction of the fixed partial denture was evaluated with the verified finite element analysis model. There was good agreement and consistency between the finite element analysis results and the experimental data. The finite element analysis revealed that failure will occur in the veneer layer on the buccal surface of the connector under an occlusal force of 570 N. The results indicate that electronic strain analysis is an appropriate and cost-saving method to verify the finite element model. The veneer layer on the buccal surface of the connector is the weakest area in the fixed partial denture. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  10. Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms

    NASA Technical Reports Server (NTRS)

    Kurdila, Andrew J.; Sharpley, Robert C.

    1999-01-01

    This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) Wavelet based methodologies for the compression, transmission, decoding, and visualization of three dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) Methodologies for interactive algorithm steering for the acceleration of large scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet based Particle Image Velocity algorithms and reduced order input-output models for nonlinear systems by utilizing wavelet approximations.

  11. Biomechanical investigation of naso-orbitoethmoid trauma by finite element analysis.

    PubMed

    Huempfner-Hierl, Heike; Schaller, Andreas; Hemprich, Alexander; Hierl, Thomas

    2014-11-01

    Naso-orbitoethmoid fractures account for 5% of all facial fractures. We used data derived from a white 34-year-old man to make a transient dynamic finite element model, which consisted of about 740 000 elements, to simulate fist-like impacts to this anatomically complex area. Finite element analysis showed a pattern of von Mises stresses beyond the yield criterion of bone that corresponded with fractures commonly seen clinically. Finite element models can be used to simulate injuries to the human skull, and provide information about the pathogenesis of different types of fracture. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  12. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Rapid Commun. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  13. ICP-MS: Analytical Method for Identification and Detection of Elemental Impurities.

    PubMed

    Mittal, Mohini; Kumar, Kapil; Anghore, Durgadas; Rawal, Ravindra K

    2017-01-01

    The aim of this article is to review and discuss ICP-MS, a quantitative analytical method currently used for quality control of pharmaceutical products. The ICP-MS technique has several applications, such as the determination of single elements, multi-element analysis of synthetic drugs, heavy metals in environmental water, and the trace element content of selected fertilizers and dairy manures. ICP-MS is also used for the determination of toxic and essential elements in different varieties of food samples and of metal pollutants present in the environment. Pharmaceuticals may acquire impurities at various stages of development, transportation and storage, which can make them unsafe to administer. Thus, it is essential that these impurities be detected and quantified. ICP-MS plays an important role in the identification and detection of elemental impurities. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  14. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  15. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  16. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial step. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
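
    As an illustration of the selection idea described above, the sketch below runs a differential evolution search over a band mask and scores each candidate with a cross-validated calibration model. The synthetic spectra, the ridge-regression calibrator, the 0.5 selection threshold and the use of SciPy's optimizer are all assumptions made for the example; they are not taken from the paper.

```python
# Hypothetical synthetic example of DE-based wavelength selection; not the authors' data.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 60
X = rng.normal(size=(n_samples, n_wavelengths))            # absorbance spectra (toy)
true_coef = np.zeros(n_wavelengths)
true_coef[:8] = 1.0                                        # only a few informative bands
y = X @ true_coef + 0.1 * rng.normal(size=n_samples)       # component concentration

def objective(weights):
    """Negative cross-validated R^2 of a calibration built on the selected bands."""
    mask = weights > 0.5                                   # threshold the continuous DE variables
    if mask.sum() < 2:
        return 1.0                                         # penalise near-empty selections
    score = cross_val_score(Ridge(alpha=1.0), X[:, mask], y, cv=5).mean()
    return -score

result = differential_evolution(objective, bounds=[(0, 1)] * n_wavelengths,
                                maxiter=30, polish=False, seed=0)
selected = np.flatnonzero(result.x > 0.5)
print("selected wavelength indices:", selected)
```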

  17. Faults and foibles of quantitative scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS)

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2012-06-01

    Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers >= 4 (Be) present as major constituents (where the concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 <= C <= 0.1) and trace constituents (C < 0.01, with a minimum detectable limit of approximately 0.0005-0.001 under routine measurement conditions, a level which is analyte and matrix dependent). SEM/EDS can select specimen volumes with linear dimensions from ~500 nm to 5 μm depending on composition (masses ranging from ~10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in X-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open-source analytical software, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations to SEM/EDS.

  18. Finite element analysis of a composite crash box subjected to low velocity impact

    NASA Astrophysics Data System (ADS)

    Shaik Dawood, M. S. I.; Ghazilan, A. L. Ahmad; Shah, Q. H.

    2017-03-01

    In this work, finite element analyses using LS-DYNA were carried out to investigate the energy absorption capability of a composite crash box. The analysed design incorporates grooves in the cross-sectional shape and E-Glass/Epoxy as the design material. The effects of groove depth, ridge lines, plane width, material properties, wall thickness and fibre orientation were quantitatively analysed and found to significantly enhance the energy absorption capability of the crash box.

  19. Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood

    NASA Astrophysics Data System (ADS)

    Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver

    2016-09-01

    Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.

  20. Finite Element Analysis of Adaptive-Stiffening and Shape-Control SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Gao, Xiujie; Burton, Deborah; Turner, Travis L.; Brinson, Catherine

    2005-01-01

    Shape memory alloy hybrid composites with adaptive-stiffening or morphing functions are simulated using finite element analysis. The composite structure is a laminated fiber-polymer composite beam with embedded SMA ribbons at various positions with respect to the neutral axis of the beam. Adaptive stiffening or morphing is activated via selective resistance heating of the SMA ribbons or uniform thermal loads on the beam. The thermomechanical behavior of these composites was simulated in ABAQUS using user-defined SMA elements. The examples demonstrate the usefulness of the methods for the design and simulation of SMA hybrid composites. Keywords: shape memory alloys, Nitinol, ABAQUS, finite element analysis, post-buckling control, shape control, deflection control, adaptive stiffening, morphing, constitutive modeling, user element

  1. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    NASA Astrophysics Data System (ADS)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early 20th century building of high historic and cultural value in Getxo (Basque Country, north of Spain). This quantification strategy can be considered faster than traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to avoid the acid extraction step of the materials and their degradation products, the direct TXRF measurement of powdered solid suspensions in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to other more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was difficult to obtain stable and repeatable suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of

  2. Multivariate analysis of elemental chemistry as a robust biosignature

    NASA Astrophysics Data System (ADS)

    Storrie-Lombardi, M.; Nealson, K.

    2003-04-01

    The robotic detection of life in extraterrestrial settings (i.e., Mars, Europa, etc.) would be greatly simplified if analysis could be accomplished in the absence of direct mechanical manipulation of a sample. It would also be preferable to employ a fundamental physico-chemical phenomenon as a biosignature and depend less on the particular manifestations of life on Earth (i.e. to employ non-earthcentric methods). One such approach, which we put forward here, is that of elemental composition, a reflection of the use of specific chemical elements for the construction of living systems. Using appropriate analyses (over the proper spatial scales), it should be possible to see deviations from the geological background (mineral and geochemical composition of the crust), and identify anomalies that would indicate sufficient deviation from the norm as to indicate a possible living system. To this end, over the past four decades elemental distributions have been determined for the sun, the interstellar medium, seawater, the crust of the Earth, carbonaceous chondrite meteorites, bacteria, plants, animals, and human beings. Such data can be relatively easily obtained for samples of a variety of types using a technique known as laser-induced breakdown spectroscopy (LIBS), which employs a high-energy laser to ablate a portion of a sample and then determines elemental composition using remote optical spectroscopy. However, the elements commonly associated with living systems (H, C, O, and N), while useful for detecting extant life, are relatively volatile and are not easily constrained across geological time scales. This minimizes their utility as fossil markers of ancient life. We have investigated the possibility of distinguishing the distributions of less volatile elements in a variety of biological materials from the distributions found in carbonaceous chondrites and the Earth’s crust using principal component analysis (PCA), a classical multivariate analysis technique.

  3. Application of artificial neural network in precise prediction of cement elements percentages based on the neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Eftekhari Zadeh, E.; Feghhi, S. A. H.; Roshani, G. H.; Rezaei, A.

    2016-05-01

    Due to the variation of the neutron energy spectrum in the target sample during the activation process, and due to peak overlapping caused by the Compton effect of gamma radiation emitted from activated elements, the background changes and the measured gamma spectrum becomes complex, which ultimately makes quantitative analysis problematic. Since there is no simple analytical correlation between peak counts and element concentrations, an artificial neural network for analyzing spectra can be a helpful tool. This work describes a study on the application of a neural network to determine the percentages of cement elements (mainly Ca, Si, Al, and Fe), using as patterns the neutron capture delayed gamma-ray spectra emitted by the activated nuclei, which were simulated with the Monte Carlo N-Particle transport code, version 2.7. A Radial Basis Function (RBF) network was developed, with four specific peaks related to Ca, Si, Al and Fe extracted as inputs. The proposed RBF model was developed and trained with MATLAB 7.8 software. To obtain the optimal RBF model, several structures were constructed and tested. The comparison between simulated and predicted values using the proposed RBF model shows good agreement between them.
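
    The sketch below illustrates the kind of RBF network the abstract refers to: radial basis features centred by clustering and a linear least-squares readout mapping four peak intensities to four element percentages. The synthetic data, the choice of 20 centres and the Gaussian width are illustrative assumptions; the authors' MATLAB model is not reproduced.

```python
# Hypothetical synthetic example; not the simulated MCNP spectra used in the paper.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))                  # net counts of the 4 characteristic peaks (toy)
W_true = rng.uniform(size=(4, 4))
Y = X @ W_true                                  # percentages of Ca, Si, Al and Fe (toy targets)

n_centres, sigma = 20, 0.5
centres = KMeans(n_clusters=n_centres, n_init=10, random_state=0).fit(X).cluster_centers_

def rbf_features(samples):
    """Gaussian radial basis expansion around the KMeans centres."""
    d2 = ((samples[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

Phi = rbf_features(X)
W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)     # linear readout fitted by least squares

Y_pred = rbf_features(X[:5]) @ W
print("residuals on a few training samples:\n", np.round(Y_pred - Y[:5], 3))
```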

  4. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    PubMed

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used to analyze the binary and quantitative traits, respectively. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. An application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
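
    The sketch below illustrates the combination step named in the abstract: Fisher's statistic built from two correlated p-values, with its null distribution approximated by a moment-matched Gamma (scaled chi-squared) fit to simulated null draws. The assumed correlation of 0.4 and the observed statistic are invented for the example and do not come from the paper's asymptotic derivation.

```python
# Hypothetical simulation; the correlation value and the observed statistic are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rho = 0.4                                        # assumed correlation between the two tests
cov = [[1.0, rho], [rho, 1.0]]

# Null simulation: correlated Z statistics -> two-sided p-values -> Fisher statistic
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
p = 2.0 * stats.norm.sf(np.abs(z))
T_null = -2.0 * np.log(p).sum(axis=1)

# Method-of-moments Gamma (scaled chi-squared) fit to the simulated null
m, v = T_null.mean(), T_null.var()
shape, scale = m ** 2 / v, v / m

# P-value of a hypothetical observed combined statistic under the fitted null
T_obs = 18.0
p_joint = stats.gamma.sf(T_obs, a=shape, scale=scale)
print(f"Gamma shape={shape:.2f}, scale={scale:.2f}, joint p-value={p_joint:.4g}")
```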

  5. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  6. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
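
    A minimal Monte Carlo sketch of a risk-adjusted net present value calculation of the kind described above follows; every number in it (investment cost, fire probability, loss distribution, mitigation factor, discount rate) is an invented assumption, not a figure from the case study.

```python
# All numbers below are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
years, rate, investment = 20, 0.05, 100_000.0    # horizon, discount rate, system cost
p_fire = 0.02                                    # assumed annual probability of a fire

loss_without = rng.lognormal(mean=13.0, sigma=1.0, size=100_000)   # simulated fire losses
loss_with = 0.3 * loss_without                   # assumed mitigation by the safety system

annual_risk_reduction = p_fire * (loss_without.mean() - loss_with.mean())
discount_sum = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
risk_adjusted_npv = -investment + annual_risk_reduction * discount_sum
print(f"risk-adjusted NPV of the investment: {risk_adjusted_npv:,.0f}")
```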

  7. Errors in quantitative backscattered electron analysis of bone standardized by energy-dispersive x-ray spectrometry.

    PubMed

    Vajda, E G; Skedros, J G; Bloebaum, R D

    1998-10-01

    Backscattered electron (BSE) imaging has proven to be a useful method for analyzing the mineral distribution in microscopic regions of bone. However, an accepted method of standardization has not been developed, limiting the utility of BSE imaging for truly quantitative analysis. Previous work has suggested that BSE images can be standardized by energy-dispersive x-ray spectrometry (EDX). Unfortunately, EDX-standardized BSE images tend to underestimate the mineral content of bone when compared with traditional ash measurements. The goal of this study is to investigate the nature of the deficit between EDX-standardized BSE images and ash measurements. A series of analytical standards, ashed bone specimens, and unembedded bone specimens were investigated to determine the source of the deficit previously reported. The primary source of error was found to be inaccurate ZAF corrections to account for the organic phase of the bone matrix. Conductive coatings, methylmethacrylate embedding media, and minor elemental constituents in bone mineral introduced negligible errors. It is suggested that the errors would remain constant and an empirical correction could be used to account for the deficit. However, extensive preliminary testing of the analysis equipment is essential.

  8. Dynamic Analysis of Geared Rotors by Finite Elements

    NASA Technical Reports Server (NTRS)

    Kahraman, A.; Ozguven, H. Nevzat; Houser, D. R.; Zakrajsek, J. J.

    1992-01-01

    A finite element model of a geared rotor system on flexible bearings has been developed. The model includes the rotary inertia of shaft elements, the axial loading on shafts, flexibility and damping of bearings, material damping of shafts, and the stiffness and damping of the gear mesh. The coupling between the torsional and transverse vibrations of the gears was considered in the model. A constant mesh stiffness was assumed. The analysis procedure can be used for forced vibration analysis of geared rotors by calculating the critical speeds and determining the response of any point on the shafts to mass unbalances, geometric eccentricities of gears, and displacement transmission error excitation at the mesh point. The dynamic mesh forces due to these excitations can also be calculated. The model has been applied to several systems for the demonstration of its accuracy and for studying the effect of bearing compliances on system dynamics.

  9. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.

  10. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
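
    The sketch below shows the general shape of such a modelling step: a Cox proportional hazards fit on CA19-9 plus CT texture features, evaluated by a concordance index on a held-out set. The synthetic data, the hypothetical feature names and the use of the lifelines package are assumptions for illustration; the study's actual feature set and software are not reproduced.

```python
# Hypothetical synthetic data; feature names and the lifelines package are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(4)
n = 161
df = pd.DataFrame({
    "ca19_9": rng.lognormal(3.0, 1.0, n),            # preoperative serum CA19-9
    "texture_entropy": rng.normal(5.0, 1.0, n),      # hypothetical CT texture feature
    "texture_contrast": rng.normal(2.0, 0.5, n),     # hypothetical CT texture feature
    "months": rng.exponential(24.0, n),              # overall survival time
    "event": rng.integers(0, 2, n),                  # 1 = death observed, 0 = censored
})

train, test = df.iloc[:113], df.iloc[113:]           # 70/30 split as in the abstract
cph = CoxPHFitter()
cph.fit(train, duration_col="months", event_col="event")

# Concordance index on the held-out set (higher partial hazard = worse prognosis)
c_index = concordance_index(test["months"],
                            -cph.predict_partial_hazard(test),
                            test["event"])
print("test concordance index:", round(c_index, 3))
```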

  11. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, which is based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without artificial processes. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models utilizing the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, which showed prediction performance comparable to GA and SPA.

  12. Is scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) quantitative?

    PubMed

    Newbury, Dale E; Ritchie, Nicholas W M

    2013-01-01

    Scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) is a widely applied elemental microanalysis method capable of identifying and quantifying all elements in the periodic table except H, He, and Li. By following the "k-ratio" (unknown/standard) measurement protocol developed for electron-excited wavelength dispersive spectrometry (WDS), SEM/EDS can achieve accuracy and precision equivalent to WDS and at substantially lower electron dose, even when severe X-ray peak overlaps occur, provided sufficient counts are recorded. Achieving this level of performance is now much more practical with the advent of the high-throughput silicon drift detector energy dispersive X-ray spectrometer (SDD-EDS). However, three measurement issues continue to diminish the impact of SEM/EDS: (1) In the qualitative analysis (i.e., element identification) that must precede quantitative analysis, at least some current and many legacy software systems are vulnerable to occasional misidentification of major constituent peaks, with the frequency of misidentifications rising significantly for minor and trace constituents. (2) The use of standardless analysis, which is subject to much broader systematic errors, leads to quantitative results that, while useful, do not have sufficient accuracy to solve critical problems, e.g. determining the formula of a compound. (3) EDS spectrometers have such a large volume of acceptance that apparently credible spectra can be obtained from specimens with complex topography that introduce uncontrolled geometric factors that modify X-ray generation and propagation, resulting in very large systematic errors, often a factor of ten or more. © Wiley Periodicals, Inc.
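
    For reference, the k-ratio protocol mentioned above is conventionally written as follows (standard microanalysis notation, not an equation quoted from this paper):

```latex
\[
  k = \frac{I_{\mathrm{unknown}}}{I_{\mathrm{standard}}}, \qquad
  C_{\mathrm{unknown}} \approx k \, C_{\mathrm{standard}} \, [ZAF],
\]
where $I$ is the measured characteristic X-ray intensity of the element, $C$ its mass fraction,
and $[ZAF]$ the combined atomic-number ($Z$), absorption ($A$) and fluorescence ($F$) matrix
correction evaluated for the composition of the unknown.
```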

  13. Study on Edge Thickening Flow Forming Using the Finite Elements Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Young Jin; Park, Jin Sung; Cho, Chongdu

    2011-08-01

    This study examines the forming features of the flow stress property and an incremental forming method for increasing the thickness of the material. Recently, optimized forming methods have been widely studied through finite element analysis to optimize forming process conditions in many different forming fields. The optimal forming method should be adopted to meet geometric requirements such as the reduction in volume per unit length of material in forging, rolling, spinning, etc. However, conventional studies have not dealt with issues regarding volume per unit length. For the study we use the finite element method and model a gear part of an automotive engine flywheel, which is a weld assembly of a plate and a gear with different thicknesses. In the simulation of the present study, an optimized forming condition for gear machining, considering the thickness of the outer edge of the flywheel, is studied using finite element analysis of the thickness-increasing forming method. It is concluded from the study that a forming method that increases the thickness per unit length for gear machining is feasible, based on the finite element analysis and forming tests.

  14. Elements of healthy death: a thematic analysis.

    PubMed

    Estebsari, Fatemeh; Taghdisi, Mohammad Hossein; Mostafaei, Davood; Rahimi, Zahra

    2017-01-01

    Background: Death is a natural and frightening phenomenon, which is inevitable. Previous studies on death, which presented a negative and tedious image of this process, are now being revised and directed towards acceptable death and good death. One of the proposed terms about death and dying is "healthy death", which encourages dealing with death positively and leading a lively and happy life until the last moment. This study aimed to explain the views of Iranians about the elements of healthy death. Methods: This qualitative study was conducted for 12 months in two general hospitals in Tehran (capital of Iran), using the thematic analysis method. After conducting 23 in-depth interviews with 21 participants, transcription of content, and data immersion and analysis, themes, as the smallest meaningful units were extracted, encoded and classified. Results: One main category of healthy death with 10 subthemes, including dying at the right time, dying without hassle, dying without cost, dying without dependency and control, peaceful death, not having difficulty at dying, not dying alone and dying at home, inspired death, preplanned death, and presence of a clergyman or a priest, were extracted as the elements of healthy death from the perspective of the participants in this study. Conclusion: The study findings well explained the elements of healthy death. Paying attention to the conditions and factors causing healthy death by professionals and providing and facilitating quality services for patients in the end stage of life make it possible for patients to experience a healthy death.

  15. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  16. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  17. Advanced Software for Analysis of High-Speed Rolling-Element Bearings

    NASA Technical Reports Server (NTRS)

    Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.

    2003-01-01

    COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses: thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.

  18. Contact stresses, pressure and area in a fixed-bearing total ankle replacement: a finite element analysis.

    PubMed

    Martinelli, Nicolo; Baretta, Silvia; Pagano, Jenny; Bianchi, Alberto; Villa, Tomaso; Casaroli, Gloria; Galbusera, Fabio

    2017-11-25

    Mobile-bearing ankle implants with good clinical results have continued to increase the popularity of total ankle arthroplasty to address end-stage ankle osteoarthritis while preserving joint movement. Alternative solutions use fixed-bearing designs, which increase stability and reduce the risk of bearing dislocation, but with a theoretical increase in contact stresses leading to higher polyethylene wear. The purpose of this study was to investigate the contact stresses, pressure and area in the polyethylene component of a new total ankle replacement with a fixed-bearing design, using 3D finite element analysis. A three-dimensional finite element model of the Zimmer Trabecular Metal Total Ankle was developed and assembled based on computed tomography images. Three different sizes of the polyethylene insert were modeled, and a finite element analysis was conducted to investigate the contact pressure, the von Mises stresses and the contact area of the polyethylene component during the stance phase of the gait cycle. The peak value of pressure was found in the anterior region of the articulating surface, where it reached 19.8 MPa at 40% of the gait cycle. The average contact pressure during the stance phase was 6.9 MPa. The maximum von Mises stress of 14.1 MPa was reached at 40% of the gait cycle in the anterior section. In the central section, the maximum von Mises stress of 10.8 MPa was reached at 37% of the gait cycle, whereas in the posterior section the maximum stress of 5.4 MPa was reached at the end of the stance phase. The new fixed-bearing total ankle replacement showed safe mechanical behavior and several clinical advantages. However, advanced models to quantitatively estimate the wear are needed. In light of the clinical advantages, we conclude that the presented prosthesis is a good alternative to the other products present on the market.

  19. Biomechanical 3-Dimensional Finite Element Analysis of Obturator Protheses Retained with Zygomatic and Dental Implants in Maxillary Defects

    PubMed Central

    Akay, Canan; Yaluğ, Suat

    2015-01-01

    Background The objective of this study was to investigate the stress distribution in the bone around zygomatic and dental implants for 3 different implant-retained obturator prosthesis designs in an Aramany class IV maxillary defect using 3-dimensional finite element analysis (FEA). Material/Methods A 3-dimensional finite element model of an Aramany class IV defect was created. Three different implant-retained obturator prostheses were modeled: model 1 with 1 zygomatic implant and 1 dental implant, model 2 with 1 zygomatic implant and 2 dental implants, and model 3 with 2 zygomatic implants. Locator attachments were used as a superstructure. A 150-N load was applied in 3 different ways. Qualitative analysis was based on the scale of maximum principal stress; values obtained through quantitative analysis are expressed in MPa. Results In all loading conditions, model 3 showed the lowest maximum principal stress value when compared with models 1 and 2. Model 3 is the most appropriate reconstruction for Aramany class IV maxillary defects. Two zygomatic implants can reduce the stresses in model 3. The distribution of stresses on the prostheses was more rational with the help of zygomatic implants, which can distribute the stresses to each part of the maxilla. Conclusions For Aramany class IV obturator prostheses, placement of 2 zygomatic implants on each side of the maxilla is more advantageous than placement of dental implants. On the non-defective side, increasing the number of dental implants is not as suitable as zygomatic implants. PMID:25714086

  20. Finite element methodology for integrated flow-thermal-structural analysis

    NASA Technical Reports Server (NTRS)

    Thornton, Earl A.; Ramakrishnan, R.; Vemaganti, G. R.

    1988-01-01

    Papers entitled, An Adaptive Finite Element Procedure for Compressible Flows and Strong Viscous-Inviscid Interactions, and An Adaptive Remeshing Method for Finite Element Thermal Analysis, were presented at the June 27 to 29, 1988, meeting of the AIAA Thermophysics, Plasma Dynamics and Lasers Conference, San Antonio, Texas. The papers describe research work supported under NASA/Langley Research Grant NsG-1321, and are submitted in fulfillment of the progress report requirement on the grant for the period ending February 29, 1988.

  1. Nonlinear solid finite element analysis of mitral valves with heterogeneous leaflet layers

    NASA Astrophysics Data System (ADS)

    Prot, V.; Skallerud, B.

    2009-02-01

    An incompressible transversely isotropic hyperelastic material for solid finite element analysis of a porcine mitral valve response is described. The material model implementation is checked in single element tests and compared with a membrane implementation in an out-of-plane loading test to study how the layered structures modify the stress response for a simple geometry. Three different collagen layer arrangements are used in finite element analysis of the mitral valve. When the leaflets are arranged in two layers with the collagen on the ventricular side, the stress in the fibre direction through the thickness in the central part of the anterior leaflet is homogenized and the peak stress is reduced. A simulation using membrane elements is also carried out for comparison with the solid finite element results. Compared to echocardiographic measurements, the finite element models bulge too much in the left atrium. This may be due to evidence of active muscle fibres in some parts of the anterior leaflet, whereas our constitutive modelling is based on passive material.

  2. Band-limited Green's Functions for Quantitative Evaluation of Acoustic Emission Using the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Leser, William P.; Yuan, Fuh-Gwo; Leser, William P.

    2013-01-01

    A method of numerically estimating dynamic Green's functions using the finite element method is proposed. These Green's functions are accurate in a limited frequency range dependent on the mesh size used to generate them. This range can often match or exceed the frequency sensitivity of the traditional acoustic emission sensors. An algorithm is also developed to characterize an acoustic emission source by obtaining information about its strength and temporal dependence. This information can then be used to reproduce the source in a finite element model for further analysis. Numerical examples are presented that demonstrate the ability of the band-limited Green's functions approach to determine the moment tensor coefficients of several reference signals to within seven percent, as well as accurately reproduce the source-time function.

  3. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
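
    The sketch below computes the two markers named above from the 2D Fourier transform of an image: preferred orientation from the angle of the dominant spectral component, and maximum spatial frequency from the radial extent of significant power. The synthetic striped image, the Hann window and the 1% power threshold are illustrative assumptions, not the authors' processing chain.

```python
# Synthetic image and thresholds are illustrative assumptions, not the paper's data.
import numpy as np

n = 256
yy, xx = np.mgrid[0:n, 0:n]
theta = np.deg2rad(30.0)
img = np.sin(2.0 * np.pi * (xx * np.cos(theta) + yy * np.sin(theta)) / 16.0)  # "fibres" at 30 deg

win = np.hanning(n)                                   # window to suppress spectral leakage
F = np.fft.fftshift(np.fft.fft2(img * win[:, None] * win[None, :]))
power = np.abs(F) ** 2
ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

# Preferred orientation: angle of the strongest non-DC spectral component
power_noDC = power.copy()
power_noDC[n // 2, n // 2] = 0.0
iy, ix = np.unravel_index(np.argmax(power_noDC), power.shape)
orientation = np.degrees(np.arctan2(ky[iy, ix], kx[iy, ix])) % 180.0

# Maximum spatial frequency: largest radius still carrying >1% of the peak power
radius = np.hypot(kx, ky)
max_freq = radius[power_noDC > 0.01 * power_noDC.max()].max()

print(f"preferred orientation ~ {orientation:.1f} deg, "
      f"maximum spatial frequency ~ {max_freq:.1f} cycles/image")
```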

  4. Single cell versus large population analysis: cell variability in elemental intracellular concentration and distribution.

    PubMed

    Malucelli, Emil; Procopio, Alessandra; Fratini, Michela; Gianoncelli, Alessandra; Notargiacomo, Andrea; Merolle, Lucia; Sargenti, Azzurra; Castiglioni, Sara; Cappadone, Concettina; Farruggia, Giovanna; Lombardo, Marco; Lagomarsino, Stefano; Maier, Jeanette A; Iotti, Stefano

    2018-01-01

    The quantification of elemental concentrations in cells is usually performed by analytical assays on large populations, missing peculiar but important rare cells. The present article aims at comparing elemental quantification in single cells and in the cell population for three different cell types, using a new approach for single-cell elemental analysis performed at sub-micrometer scale that combines X-ray fluorescence microscopy and atomic force microscopy. The attention is focused on the light element Mg, exploiting the opportunity to compare the single-cell quantification with the cell-population analysis carried out by a highly Mg-selective fluorescent chemosensor. The results show that the single-cell analysis reveals the same Mg differences found in large populations of the different cell strains studied. However, in one of the cell strains, single-cell analysis reveals two cells with an exceptionally high intracellular Mg content compared with the other cells of the same strain. The single-cell analysis allows mapping Mg and other light elements in whole cells at sub-micrometer scale. A detailed intensity correlation analysis on the two cells with the highest Mg content reveals that the Mg subcellular localization correlates with oxygen in a different fashion with respect to the other sister cells of the same strain. Graphical abstract: Single cells or large population analysis, that is the question!

  5. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  6. [Content of mineral elements of Gastrodia elata by principal components analysis].

    PubMed

    Li, Jin-ling; Zhao, Zhi; Liu, Hong-chang; Luo, Chun-li; Huang, Ming-jin; Luo, Fu-lai; Wang, Hua-lei

    2015-03-01

    To study the content of mineral elements and the principal components in Gastrodia elata, mineral elements were determined by ICP and the data were analyzed with SPSS. The K element had the highest content, with an average of 15.31 g·kg(-1). The average content of the N element was 8.99 g·kg(-1), second only to K. The coefficients of variation of K and N were small, while that of Mn was the largest, at 51.39%. A highly significant positive correlation was found among N, P and K. Three principal components were selected by principal components analysis to evaluate the quality of G. elata. P, B, N, K, Cu, Mn, Fe and Mg were the characteristic elements of G. elata. The content of the K and N elements was higher and relatively stable. The variation of the Mn content was the largest. From the perspective of mineral elements, the quality of G. elata from Guizhou and Yunnan was better.
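
    A minimal sketch of the analysis workflow described above, assuming randomly generated concentrations (only the element list follows the abstract): standardize the element matrix, extract three principal components, then inspect the loadings for characteristic elements.

```python
# Concentration values are randomly generated for illustration only.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
elements = ["N", "P", "K", "Ca", "Mg", "Fe", "Mn", "Cu", "Zn", "B"]
data = pd.DataFrame(rng.lognormal(0.0, 0.5, size=(30, len(elements))), columns=elements)

X = StandardScaler().fit_transform(data)          # standardise each element
pca = PCA(n_components=3).fit(X)                  # keep three principal components

print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
loadings = pd.DataFrame(pca.components_.T, index=elements, columns=["PC1", "PC2", "PC3"])
print(loadings.round(2))                          # large loadings flag characteristic elements
```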

  7. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near-field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
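
    For context, enhancement factors of the kind quoted above are conventionally defined by the standard substrate enhancement factor (common textbook definition, not an equation reproduced from the paper):

```latex
\[
  \mathrm{EF} \;=\; \frac{I_{\mathrm{SERS}}/N_{\mathrm{SERS}}}{I_{\mathrm{ref}}/N_{\mathrm{ref}}},
\]
where $I_{\mathrm{SERS}}$ and $I_{\mathrm{ref}}$ are the Raman intensities measured on the
nanostructure and in a normal (non-enhanced) reference measurement, and $N_{\mathrm{SERS}}$,
$N_{\mathrm{ref}}$ are the corresponding numbers of probed reporter molecules.
```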

  8. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  9. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a particular incidence on the qualitative and quantitative performance of the DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were shown to be affected by particle size. As a result, a proper sample treatment which includes sieving of suspended particles is particularly required for this analytical approach.
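
    The mass-quantitation idea can be summarized by a simple calibration relation; the linear form below is common DSC practice and an assumption here, not the paper's exact model:

```latex
\[
  m_{\mathrm{polymer}} \;\approx\; \frac{\Delta H_{\mathrm{sample}}}{\Delta h_{\mathrm{cal}}},
\]
where $\Delta H_{\mathrm{sample}}$ (J) is the integrated heat flow of the melting endotherm and
$\Delta h_{\mathrm{cal}}$ (J/g) is the slope of a calibration of peak area against known mass for
the same polymer and a comparable particle-size range, since particle size affects both the
onset temperature and the heat-flow signal.
```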

  10. Quantitative analysis of a brass alloy using CF-LIBS and a laser ablation time-of-flight mass spectrometer

    NASA Astrophysics Data System (ADS)

    Ahmed, Nasar; Abdullah, M.; Ahmed, Rizwan; Piracha, N. K.; Aslam Baig, M.

    2018-01-01

    We present a quantitative analysis of a brass alloy using laser-induced breakdown spectroscopy, energy dispersive X-ray spectroscopy (EDX) and laser ablation time-of-flight mass spectrometry (LA-TOF-MS). The emission lines of copper (Cu I) and zinc (Zn I), the constituent elements of the brass alloy, were used to calculate the plasma parameters. The plasma temperature was calculated from the Boltzmann plot as (10 000 ± 1000) K, and the electron number density was determined as (2.0 ± 0.5) × 10(17) cm(-3) from the Stark-broadened Cu I line as well as using the Saha-Boltzmann equation. The elemental composition was deduced using these techniques: the Boltzmann plot method (70% Cu and 30% Zn), internal reference self-absorption correction (63.36% Cu and 36.64% Zn), EDX (61.75% Cu and 38.25% Zn), and LA-TOF (62% Cu and 38% Zn), whereas the certified composition is (62% Cu and 38% Zn). It was observed that the internal reference self-absorption correction method yields analytical results comparable to those of EDX and LA-TOF-MS.
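
    The Boltzmann plot referred to above is the standard relation (textbook form of the method, not an equation copied from the abstract):

```latex
\[
  \ln\!\left(\frac{I_{ki}}{g_k\,A_{ki}}\right)
  = -\frac{E_k}{k_B T} + \ln\!\left(\frac{F\,C_s}{U_s(T)}\right),
\]
where $I_{ki}$ is the integrated intensity of the line for the transition $k \to i$, $g_k$, $E_k$
and $A_{ki}$ the upper-level degeneracy, energy and transition probability, $U_s(T)$ the species
partition function, $C_s$ its concentration and $F$ an experimental constant; plotting the left
side against $E_k$ for several lines gives $T$ from the slope $-1/(k_B T)$.
```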

  11. A Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) Quantitative Analysis Method Based on the Auto-Selection of an Internal Reference Line and Optimized Estimation of Plasma Temperature.

    PubMed

    Yang, Jianhong; Li, Xiaomeng; Xu, Jinwu; Ma, Xianghong

    2018-01-01

    The quantitative analysis accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is severely affected by the self-absorption effect and by the estimation of the plasma temperature. Herein, a CF-LIBS quantitative analysis method based on the auto-selection of an internal reference line and an optimized estimation of the plasma temperature is proposed. The internal reference line of each species is automatically selected from the analytical lines by a programmable procedure using easily accessible parameters. Furthermore, the self-absorption effect of the internal reference line is considered during the correction procedure. To improve the analysis accuracy of CF-LIBS, the particle swarm optimization (PSO) algorithm is introduced to estimate the plasma temperature based on the calculation results from the Boltzmann plot. Thereafter, the species concentrations of a sample can be calculated according to the classical CF-LIBS method. A total of 15 certified alloy steel standard samples of known compositions and elemental weight percentages were used in the experiment. Using the proposed method, the average relative errors of the calculated Cr, Ni, and Fe concentrations were 4.40%, 6.81%, and 2.29%, respectively. The quantitative results demonstrated an improvement over the classical CF-LIBS method and the promising potential of in situ and real-time application.

  12. A sensitive and quantitative element-tagged immunoassay with ICPMS detection.

    PubMed

    Baranov, Vladimir I; Quinn, Zoë; Bandura, Dmitry R; Tanner, Scott D

    2002-04-01

    We report a set of novel immunoassays in which proteins of interest can be detected using specific element-tagged antibodies. These immunoassays are directly coupled with an inductively coupled plasma mass spectrometer (ICPMS) to quantify the elemental (in this work, metal) component of the reacted tagged antibodies. It is demonstrated that these methods can detect levels of target proteins as low as 0.1-0.5 ng/mL and yield a linear response to protein concentration over 3 orders of magnitude.

  13. PLANS; a finite element program for nonlinear analysis of structures. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Pifko, A.; Armen, H., Jr.; Levy, A.; Levine, H.

    1977-01-01

    The PLANS system, rather than being one comprehensive computer program, is a collection of finite element programs used for the nonlinear analysis of structures. This collection of programs evolved from an organizational philosophy in which classes of analyses are treated individually according to the physical problem class to be analyzed. Each of the independent finite element computer programs of PLANS, with an associated element library, can be individually loaded and used to solve the problem class of interest. A number of programs have been developed for material nonlinear behavior alone and for combined geometric and material nonlinear behavior. The usage, capabilities, and element libraries of the current programs include: (1) plastic analysis of built-up structures where bending and membrane effects are significant, (2) three-dimensional elastic-plastic analysis, (3) plastic analysis of bodies of revolution, and (4) material and geometric nonlinear analysis of built-up structures.

  14. Identification and apportionment of hazardous elements in the sediments in the Yangtze River estuary.

    PubMed

    Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao

    2015-12-01

    In this study, positive matrix factorization (PMF) and principal components analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments in the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: agricultural/industrial sewage mixed (18.6 %), mining wastewater (15.9 %), agricultural fertilizer (14.5 %), atmospheric deposition (12.8 %), agricultural nonpoint (10.6 %), industrial wastewater (9.8 %), marine activity (9.0 %), and nickel plating industry (8.8 %). Overall, the hazardous element content appears to be connected more to anthropogenic activity than to natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources. PMF resolved more factors with a higher explained variance than PCA and provided both the source identification and the quantitative apportionment. The combination of the two methods can provide more reasonable and reliable results.
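
    A minimal sketch of this two-step workflow is given below, assuming a hypothetical matrix of element concentrations (samples × elements). PCA provides the general grouping of co-varying elements, and scikit-learn's non-negative matrix factorization is used as a simple stand-in for PMF (true PMF additionally weights each data point by its uncertainty, which this sketch omits).

```python
import numpy as np
from sklearn.decomposition import PCA, NMF
from sklearn.preprocessing import StandardScaler

# Hypothetical concentration matrix: rows = sediment samples, columns = elements
rng = np.random.default_rng(1)
elements = ["Al", "Fe", "Mn", "Cr", "Ni", "As", "Cu", "Zn", "Pb", "Sb", "Cd", "Hg"]
X = rng.lognormal(mean=1.0, sigma=0.5, size=(60, len(elements)))

# Step 1: PCA on standardized data for a general grouping of elements
pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
print("PCA explained variance ratios:", pca.explained_variance_ratio_)

# Step 2: NMF (stand-in for PMF) on the non-negative concentrations
nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
contributions = nmf.fit_transform(X)        # sample-by-factor contributions
profiles = nmf.components_                  # factor-by-element source profiles

# Apportion each factor's share of the total modeled concentration
factor_totals = (contributions @ profiles).sum()
shares = (contributions.sum(axis=0) * profiles.sum(axis=1)) / factor_totals
print("Estimated source shares:", shares.round(3))
```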

  15. Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols

    EPA Science Inventory

    In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

  16. Elemental investigation of Syrian medicinal plants using PIXE analysis

    NASA Astrophysics Data System (ADS)

    Rihawy, M. S.; Bakraji, E. H.; Aref, S.; Shaban, R.

    2010-09-01

    The particle induced X-ray emission (PIXE) technique has been employed to perform elemental analysis of K, Ca, Mn, Fe, Cu, Zn, Br and Sr in Syrian medicinal plants used traditionally to enhance the body's immunity. Plant samples were prepared in a simple dried base. The results were verified by comparison with those obtained from both IAEA-359 and IAEA-V10 reference materials. Relative standard deviations, mostly within ±5-10%, suggest good precision. A correlation between the elemental content of each medicinal plant and its traditional remedial usage has been proposed. Both K and Ca are found to be the major elements in the samples. Fe, Mn and Zn have been detected at appreciable levels in most of these plants, clarifying their possible contribution to keeping the body's immune system in good condition. The contribution of the elements in these plants to the dietary recommended intakes (DRI) has been evaluated. Advantages and limitations of the PIXE analytical technique in this investigation have been reviewed.

  17. Design of an elemental analysis system for CELSS research

    NASA Technical Reports Server (NTRS)

    Schwartzkopf, Steven H.

    1987-01-01

    The results of experiments conducted with higher plants in tightly sealed growth chambers provide definite evidence that the physical closure of a chamber has significant effects on many aspects of a plant's biology. One of these effects is seen in the change in rates of uptake, distribution, and re-release of nutrient elements by the plant (mass balance). Experimental data indicate that these rates are different from those recorded for plants grown in open field agriculture or in open growth chambers. Since higher plants are a crucial component of a controlled ecological life support system (CELSS), it is important that the consequences of these rate differences be understood with regard to the growth and yield of the plants. A description of a system for elemental analysis which can be used to monitor the mass balance of nutrient elements in CELSS experiments is given. Additionally, data on the uptake of nutrient elements by higher plants grown in a growth chamber are presented.

  18. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. have proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = \overline{R_L \cdot \Psi}, where Ψ is the profile structure factor and the overline denotes averaging over the section profiles. This method is based on classical stereological principles and has been verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces with any arbitrary complexity and anisotropy. In order to extend the applications of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
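
    As a rough illustration of the profile parameters involved, the sketch below computes the lineal roughness R_L of digitized fracture profiles (true profile length divided by projected length) and applies the averaging relation quoted above, with the profile structure factor Ψ treated as a hypothetical input since its evaluation is not detailed in the abstract.

```python
import numpy as np

def lineal_roughness(x, z):
    """R_L = true profile length / projected length for a digitized profile z(x)."""
    seg = np.hypot(np.diff(x), np.diff(z))
    return seg.sum() / (x[-1] - x[0])

# Hypothetical fracture profiles from three vertical sections (120 deg apart)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1000.0, 2001)                    # micrometres
profiles = [np.cumsum(rng.normal(0.0, 0.3, x.size)) for _ in range(3)]

R_L = np.array([lineal_roughness(x, z) for z in profiles])
psi = 1.27                                            # hypothetical profile structure factor
R_S = np.mean(R_L * psi)                              # R_S = mean(R_L * Psi) over the sections
print(f"Profile roughness R_L: {R_L.round(3)}, estimated surface roughness R_S: {R_S:.3f}")
```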

  19. A new discrete Kirchhoff-Mindlin element based on Mindlin-Reissner plate theory and assumed shear strain fields. I - An extended DKT element for thick-plate bending analysis. II - An extended DKQ element for thick-plate bending analysis

    NASA Astrophysics Data System (ADS)

    Katili, Irwan

    1993-06-01

    A new three-node nine-degree-of-freedom triangular plate bending element is proposed which is valid for the analysis of both thick and thin plates. The element, called the discrete Kirchhoff-Mindlin triangle (DKMT), has a proper rank, passes the patch test for thin and thick plates in an arbitrary mesh, and is free of shear locking. As an extension of the DKMT element, a four-node element with 3 degrees of freedom per node is developed. The element, referred to as DKMQ (discrete Kirchhoff-Mindlin quadrilateral) is found to provide good results for both thin and thick plates without any compatibility problems.

  20. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  1. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    NASA Astrophysics Data System (ADS)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned with a spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes of wet gluten content (<24%, 24-30%, >30%) were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
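
    A minimal version of the quantitative modelling step might look like the sketch below, which fits a partial least squares regression to hypothetical NIR spectra and reference wet-gluten values; the genetic-algorithm wavelength selection described in the abstract is replaced by a fixed placeholder subset for brevity.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical data: 54 samples x 256 short-wave NIR absorbance variables
rng = np.random.default_rng(0)
X = rng.normal(size=(54, 256))
y = 27.0 + 2.0 * X[:, 40] + rng.normal(scale=1.0, size=54)   # synthetic wet gluten (%)

# Placeholder for GA variable selection: keep a fixed subset of wavelengths
selected = np.arange(30, 47)          # 17 "sensitive" variables (illustrative only)
X_sel = X[:, selected]

X_cal, X_val, y_cal, y_val = train_test_split(X_sel, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()
print(f"R2V = {r2_score(y_val, y_pred):.2f}, "
      f"RMSEV = {np.sqrt(mean_squared_error(y_val, y_pred)):.2f}")
```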

  2. Material nonlinear analysis via mixed-iterative finite element method

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1992-01-01

    The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.

  3. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  4. Quantitative Analysis of Food and Feed Samples with Droplet Digital PCR

    PubMed Central

    Morisset, Dany; Štebih, Dejan; Milavec, Mojca; Gruden, Kristina; Žel, Jana

    2013-01-01

    In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed. PMID:23658750
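
    The absolute quantification behind ddPCR rests on Poisson statistics: the fraction of positive droplets gives the mean number of target copies per droplet and hence the concentration. The sketch below applies that standard correction to hypothetical droplet counts for the transgene and reference assays; the droplet volume and counts are illustrative assumptions, not values from the study.

```python
import numpy as np

def copies_per_microlitre(n_positive, n_total, droplet_volume_nl=0.85):
    """Poisson-corrected target concentration from droplet counts."""
    lam = -np.log(1.0 - n_positive / n_total)        # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)          # copies per microlitre

# Hypothetical droplet counts from a duplex assay
transgene = copies_per_microlitre(n_positive=610, n_total=15000)
reference = copies_per_microlitre(n_positive=5400, n_total=15000)

print(f"transgene: {transgene:.1f} copies/uL, reference gene: {reference:.1f} copies/uL")
print(f"GM content (copy ratio): {100.0 * transgene / reference:.2f} %")
```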

  5. Finite element analysis of end notch flexure specimen

    NASA Technical Reports Server (NTRS)

    Mall, S.; Kochhar, N. K.

    1986-01-01

    A finite element analysis of the end notch flexure specimen for mode II interlaminar fracture toughness measurement was conducted. The effects of friction between the crack faces and of large deflection on the evaluation of G(IIc) from this specimen were investigated. Results of this study are presented in this paper.

  6. Finite-element analysis of end-notch flexure specimens

    NASA Technical Reports Server (NTRS)

    Mall, S.; Kochhar, N. K.

    1986-01-01

    A finite-element analysis of the end-notch flexure specimen for Mode II interlaminar fracture toughness measurement was conducted. The effects of friction between the crack faces and large deflection on the evaluation of G(IIc) from this specimen were investigated. Results of this study are presented in this paper.

  7. Ultra-Sensitive Elemental Analysis Using Plasmas 7.Application to Criminal Investigation

    NASA Astrophysics Data System (ADS)

    Suzuki, Yasuhiro

    This paper describes the application of trace elemental analysis using ICP-AES and ICP-MS to criminal investigation. The comparison of trace elements, such as Rb, Sr, Zr, and so on, is effective for the forensic discrimination of glass fragments, which can be important physical evidence connecting a suspect to a crime scene or to a victim. This procedure can also be applied to lead shotgun pellets by removing the matrix lead as a sulfate precipitate after dissolution of a pellet sample. The determination of a toxic element in biological samples is required to prove that a victim ingested this element. Arsenous acids produced in Japan, China, Germany and Switzerland show trace-element patterns characteristic of each country.

  8. Stability analysis of flexible wind turbine blades using finite element method

    NASA Technical Reports Server (NTRS)

    Kamoulakos, A.

    1982-01-01

    Static, vibration and flutter analysis of a straight elastic axis blade was performed based on a finite element method solution. The total potential energy functional was formulated according to linear beam theory. The inertia and aerodynamic loads were formulated according to the blade absolute acceleration and absolute velocity vectors. In the vibration analysis, the direction of motion of the blade during the first out-of-plane and first in-plane modes was examined; numerical results involve the NASA/DOE Mod-0, the McCauley propeller, the north wind turbine and flat plate behavior. In the flutter analysis, comparison cases were examined involving several references. Vibration analysis of a nonstraight elastic axis blade based on a finite element method solution was performed in a similar manner to the straight elastic axis blade, since it was recognized that a curved blade can be approximated by an assembly of a sufficient number of straight blade elements at different inclinations with respect to a common system of axes. Numerical results involve a comparison between the behavior of a straight and a curved cantilever beam during the lowest two in-plane and out-of-plane modes.

  9. Finite element analysis on the bending condition of truck frame before and after opening

    NASA Astrophysics Data System (ADS)

    Cai, Kaiwu; Cheng, Wei; Lu, Jifu

    2018-05-01

    Based on the design parameters of a truck frame, the structure of the frame was designed and modeled. Based on finite element theory, the loads, the type of fatigue and the material parameters of the frame were combined with those of the semi-trailer. Using finite element analysis software, a comparative finite element analysis of the truck frame under the bending condition before and after opening the hole was carried out. The analysis found that the frame with the hole can still meet the strength requirements under the bending condition, which is very helpful for improving the design of the truck frame.

  10. Race and Older Mothers’ Differentiation: A Sequential Quantitative and Qualitative Analysis

    PubMed Central

    Sechrist, Jori; Suitor, J. Jill; Riffin, Catherine; Taylor-Watson, Kadari; Pillemer, Karl

    2011-01-01

    The goal of this paper is to demonstrate a process by which qualitative and quantitative approaches are combined to reveal patterns in the data that are unlikely to be detected and confirmed by either method alone. Specifically, we take a sequential approach to combining qualitative and quantitative data to explore race differences in how mothers differentiate among their adult children. We began with a standard multivariate analysis examining race differences in mothers’ differentiation among their adult children regarding emotional closeness and confiding. Finding no race differences in this analysis, we conducted an in-depth comparison of the Black and White mothers’ narratives to determine whether there were underlying patterns that we had been unable to detect in our first analysis. Using this method, we found that Black mothers were substantially more likely than White mothers to emphasize interpersonal relationships within the family when describing differences among their children. In our final step, we developed a measure of familism based on the qualitative data and conducted a multivariate analysis to confirm the patterns revealed by the in-depth comparison of the mothers’ narratives. We conclude that using such a sequential mixed methods approach to data analysis has the potential to shed new light on complex family relations. PMID:21967639

  11. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  12. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  13. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  14. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online.
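
    The core ASE test described here, counting RNA-seq reads per allele at a heterozygous site and testing the null hypothesis of a 1:1 ratio, can be illustrated with a plain binomial test; QuASAR itself additionally models genotype uncertainty, base-call error and over-dispersion, which this sketch does not.

```python
from scipy.stats import binomtest

# Hypothetical read counts at one heterozygous site
ref_reads, alt_reads = 78, 42

# Test the null hypothesis of a balanced 1:1 allelic ratio
result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
print(f"reference allele fraction = {ref_reads / (ref_reads + alt_reads):.2f}, "
      f"p-value = {result.pvalue:.4f}")
```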

  15. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375

  16. Finite element modeling and analysis of reinforced-concrete bridge.

    DOT National Transportation Integrated Search

    2000-09-01

    Despite its long history, the finite element method continues to be the predominant strategy employed by engineers to conduct structural analysis. A reliable method is needed for analyzing structures made of reinforced concrete, a complex but common ...

  17. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325
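
    In the same spirit, the sketch below illustrates the basic idea of detecting time-localized events as maxima of a wavelet transform. It uses a real-valued Morlet-style wavelet built with NumPy as a simplified stand-in for the generalized Morse wavelets of element analysis, and a fixed amplitude threshold in place of the noise-based false-detection-rate criterion.

```python
import numpy as np
from scipy.signal import find_peaks

def morlet_like(scale):
    """Real Morlet-style wavelet at a given scale (a simplified stand-in for a Morse wavelet)."""
    n = int(10 * scale)
    t = (np.arange(n) - n / 2) / scale
    return np.exp(-0.5 * t**2) * np.cos(5.0 * t) / np.sqrt(scale)

# Synthetic signal: two isolated wavelet-shaped events buried in noise,
# mimicking the model of events as rescaled copies of the analysing wavelet.
rng = np.random.default_rng(0)
signal = rng.normal(scale=0.5, size=2000)
for centre, scale in [(500, 20.0), (1400, 40.0)]:
    w = 10.0 * morlet_like(scale)
    signal[centre - w.size // 2:centre + (w.size - w.size // 2)] += w

# Wavelet transform at several scales; events taken as strong maxima of the magnitude
scales = [10.0, 20.0, 40.0, 80.0]
transform = np.array([np.convolve(signal, morlet_like(s), mode="same") for s in scales])
magnitude = np.abs(transform).max(axis=0)
events, _ = find_peaks(magnitude, height=5.0, distance=100)   # fixed threshold (an assumption)
print("Detected event locations:", events)
```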

  18. Quantitative analysis of NMR spectra with chemometrics

    NASA Astrophysics Data System (ADS)

    Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.

    2008-01-01

    The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest in quantitative NMR spectroscopy, e.g. in the pharmaceutical and food industries. This paper gives an analysis of the advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 1H 400 MHz spectra of simple alcohol mixtures (propanol, butanol and pentanol). The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures, which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure component NMR spectra from mixtures when certain conditions are met.
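
    As a toy illustration of the curve-resolution idea, the sketch below implements a bare-bones alternating least squares loop with non-negativity clipping to resolve pure-component spectra from synthetic mixtures; real MCR applications require proper initialization, convergence criteria and constraint handling beyond this minimal version.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 231 mixture spectra (rows) over 500 spectral points,
# generated from 3 Gaussian-shaped "pure component" spectra.
axis = np.linspace(0.0, 10.0, 500)
S_true = np.array([np.exp(-0.5 * ((axis - c) / 0.3) ** 2) for c in (2.0, 5.0, 8.0)])
C_true = rng.dirichlet(np.ones(3), size=231)
D = C_true @ S_true + rng.normal(scale=0.01, size=(231, 500))

# MCR-ALS: alternate least squares for concentrations C and spectra S, clipping negatives
S = S_true + rng.normal(scale=0.05, size=S_true.shape)        # rough initial guess
for _ in range(50):
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0.0, None)
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0.0, None)

print("Relative reconstruction error:", np.linalg.norm(D - C @ S) / np.linalg.norm(D))
```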

  19. A semi-automatic method for analysis of landscape elements using Shuttle Radar Topography Mission and Landsat ETM+ data

    NASA Astrophysics Data System (ADS)

    Ehsani, Amir Houshang; Quiel, Friedrich

    2009-02-01

    In this paper, we demonstrate artificial neural networks—self-organizing map (SOM)—as a semi-automatic method for extraction and analysis of landscape elements in the man and biosphere reserve "Eastern Carpathians". The Shuttle Radar Topography Mission (SRTM) collected data to produce generally available digital elevation models (DEM). Together with Landsat Thematic Mapper data, this provides a unique, consistent and nearly worldwide data set. To integrate the DEM with Landsat data, it was re-projected from geographic coordinates to UTM with 28.5 m spatial resolution using cubic convolution interpolation. To provide quantitative morphometric parameters, first-order (slope) and second-order derivatives of the DEM—minimum curvature, maximum curvature and cross-sectional curvature—were calculated by fitting a bivariate quadratic surface with a window size of 9×9 pixels. These surface curvatures are strongly related to landform features and geomorphological processes. Four morphometric parameters and seven Landsat-enhanced thematic mapper (ETM+) bands were used as input for the SOM algorithm. Once the network weights have been randomly initialized, different learning parameter sets, e.g. initial radius, final radius and number of iterations, were investigated. An optimal SOM with 20 classes using 1000 iterations and a final neighborhood radius of 0.05 provided a low average quantization error of 0.3394 and was used for further analysis. The effect of randomization of initial weights for optimal SOM was also studied. Feature space analysis, three-dimensional inspection and auxiliary data facilitated the assignment of semantic meaning to the output classes in terms of landform, based on morphometric analysis, and land use, based on spectral properties. Results were displayed as thematic map of landscape elements according to form, cover and slope. Spectral and morphometric signature analysis with corresponding zoom samples superimposed by contour lines were
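
    A compact version of the clustering step could be set up as below, assuming the third-party minisom package and a hypothetical pixel-by-feature matrix of four morphometric and seven ETM+ layers. The 20 output classes and 1000 iterations echo the abstract, but the 20 × 1 layer layout and all data values are assumptions.

```python
import numpy as np
from minisom import MiniSom   # third-party package, assumed available

# Hypothetical input: 4 morphometric + 7 ETM+ bands for 10,000 pixels, scaled to [0, 1]
rng = np.random.default_rng(0)
pixels = rng.random((10_000, 11))

# A 20 x 1 output layer is assumed here to yield 20 landscape-element classes
som = MiniSom(20, 1, input_len=11, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(pixels)
som.train_random(pixels, 1000)

# Assign each pixel to its best-matching unit (class label 0..19)
labels = np.array([som.winner(p)[0] for p in pixels])
print("Average quantization error:", som.quantization_error(pixels))
```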

  20. [Three dimensional mathematical model of tooth for finite element analysis].

    PubMed

    Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka

    2010-01-01

    The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programs for solid modeling. The aim was to form the mathematical model of an abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object and is suitable for modeling in the solid-modeling program SolidWorks. After analysing the literature data on the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid,...). By connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational program for finite element analysis. The data transfer was done using ACIS SAT, a standard file format for transferring 3D models. Using the solid-modeling program SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information for understanding biomechanical problems and gives guidance for clinical research.

  1. Comparison of Gap Elements and Contact Algorithm for 3D Contact Analysis of Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Tiku, K.; Kumar, A.; Handschuh, R.

    1994-01-01

    Three dimensional stress analysis of spiral bevel gears in mesh using the finite element method is presented. A finite element model is generated by solving equations that identify tooth surface coordinates. Contact is simulated by the automatic generation of nonpenetration constraints. This method is compared to a finite element contact analysis conducted with gap elements.

  2. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  3. Slave finite elements: The temporal element approach to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Gellin, S.

    1984-01-01

    A formulation method for finite elements in space and time incorporating nonlinear geometric and material behavior is presented. The method uses interpolation polynomials for approximating the behavior of various quantities over the element domain, and only explicit integration over space and time. While applications are general, the plate and shell elements that are currently being programmed are appropriate to model turbine blades, vanes, and combustor liners.

  4. A Finite Element Analysis of a Class of Problems in Elasto-Plasticity with Hidden Variables.

    DTIC Science & Technology

    1985-09-01

    [DTIC record: the available text is OCR of the report documentation page rather than an abstract. Recoverable details: Final Report, Texas Institute for Computational Mechanics, Austin; J. T. Oden. Keywords: elastoplasticity, finite deformations, non-convex analysis, finite element methods, metal forming.]

  5. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

    [DTIC record: the available text is OCR of the report documentation page rather than an abstract. Recoverable details: Technical Note BN-962, by I. Babuška and W. G. Szymczak, March 1981, University of Maryland College Park, Institute for Physical Science.]

  6. Trace-Element Analysis by Use of PIXE Technique on Agricultural Products

    NASA Astrophysics Data System (ADS)

    Takagi, A.; Yokoyama, R.; Makisaka, K.; Kisamori, K.; Kuwada, Y.; Nishimura, D.; Matsumiya, R.; Fujita, Y.; Mihara, M.; Matsuta, K.; Fukuda, M.

    2009-10-01

    In order to examine whether trace-element analysis by PIXE (Particle Induced X-ray Emission) gives a clue to identifying the production area of agricultural products, we carried out a study on soy beans as an example. In the present study, a proton beam at an energy of 2.3 MeV was provided by the Van de Graaff accelerator at Osaka University. We used a Ge detector with a Be window to measure X-ray spectra. We prepared sample soy beans from China, Thailand, Taiwan, and 7 different areas in Japan. As a result of the PIXE analysis, 5 elements, potassium, iron, zinc, arsenic and rubidium, have been identified. There are clear differences in the relative amounts of trace elements between samples from different international regions. Chinese beans contain much more Rb than the others, while there are significant differences in Fe and Zn between beans from Thailand and Taiwan. There are relatively smaller differences among Japanese beans. This result shows that trace elements provide practical information on the region where a product was grown.

  7. Direct trace-elemental analysis of urine samples by laser ablation-inductively coupled plasma mass spectrometry after sample deposition on clinical filter papers.

    PubMed

    Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín

    2012-10-16

    Collection of biological fluids on clinical filter papers shows important advantages from a logistic point of view, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is the aim, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making it quite difficult to obtain reliable quantitative results. In this paper, a novel approach for urine collection is proposed which circumvents many of these problems. This methodology consists of the use of precut filter paper discs on which large amounts of sample can be retained upon a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, therefore making it suitable for a strategy based on unsupervised sample collection and subsequent analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit for purpose for clinical control analysis.

  8. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  9. Distributed Finite Element Analysis Using a Transputer Network

    NASA Technical Reports Server (NTRS)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.

  10. Laser induced breakdown spectroscopy for elemental analysis in environmental, cultural heritage and space applications: a review of methods and results.

    PubMed

    Gaudiuso, Rosalba; Dell'Aglio, Marcella; De Pascale, Olga; Senesi, Giorgio S; De Giacomo, Alessandro

    2010-01-01

    Analytical applications of Laser Induced Breakdown Spectroscopy (LIBS), namely optical emission spectroscopy of laser-induced plasmas, have been constantly growing thanks to its intrinsic conceptual simplicity and versatility. Qualitative and quantitative analysis can be performed by LIBS both by drawing calibration lines and by using calibration-free methods, and some of its features, such as fast multi-elemental response, micro-destructiveness, and instrumentation portability, have rendered it particularly suitable for analytical applications in the fields of environmental science, space exploration and cultural heritage. This review reports and discusses LIBS achievements in these areas and results obtained for soils and aqueous samples, meteorites and terrestrial samples simulating extraterrestrial planets, and cultural heritage samples, including buildings and objects of various kinds.

  11. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

    Introduction: The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods: In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results: Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions: A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and
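
    The mAs comparison above relies on Bland-Altman agreement analysis, which reduces to a few lines of code; the sketch below computes the bias and 95% limits of agreement for hypothetical paired quantitative CT measurements (e.g. percentage of nonaerated tissue at 60 mAs and 15 mAs).

```python
import numpy as np

# Hypothetical paired measurements (% nonaerated tissue) at two tube current-time settings
rng = np.random.default_rng(0)
reference = rng.uniform(5.0, 60.0, 14)                   # 60 mAs scans
low_dose = reference + rng.normal(0.5, 2.0, 14)          # 15 mAs scans

# Bland-Altman statistics: bias and 95% limits of agreement
diff = low_dose - reference
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} %, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}] %")
```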

  12. Oufti: An integrated software package for high-accuracy, high-throughput quantitative microscopy analysis

    PubMed Central

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-01-01

    With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  13. Finite-Element Analysis of a Mach-8 Flight Test Article Using Nonlinear Contact Elements

    NASA Technical Reports Server (NTRS)

    Richards, W. Lance

    1997-01-01

    A flight test article, called a glove, is required for a Mach-8 boundary-layer experiment to be conducted on a flight mission of the air-launched Pegasus(reg) space booster. The glove is required to provide a smooth, three-dimensional, structurally stable, aerodynamic surface and includes instrumentation to determine when and where boundary-layer transition occurs during the hypersonic flight trajectory. A restraint mechanism has been invented to attach the glove to the wing of the space booster. The restraint mechanism securely attaches the glove to the wing in directions normal to the wing/glove interface surface, but allows the glove to thermally expand and contract to alleviate stresses in directions parallel to the interface surface. A finite-element analysis has been performed using nonlinear contact elements to model the complex behavior of the sliding restraint mechanism. This paper provides an overview of the glove design and presents details of the analysis that were essential to demonstrate the flight worthiness of the wing-glove test article. Results show that all glove components are well within the allowable stress and deformation requirements to satisfy the objectives of the flight research experiment.

  14. Quantitatively probing propensity for structural transitions in engineered virus nanoparticles by single-molecule mechanical analysis

    NASA Astrophysics Data System (ADS)

    Castellanos, Milagros; Carrillo, Pablo J. P.; Mateu, Mauricio G.

    2015-03-01

    Viruses are increasingly being studied from the perspective of fundamental physics at the nanoscale as biologically evolved nanodevices with many technological applications. In viral particles of the minute virus of mice (MVM), folded segments of the single-stranded DNA genome are bound to the capsid inner wall and act as molecular buttresses that increase locally the mechanical stiffness of the particle. We have explored whether a quantitative linkage exists in MVM particles between their DNA-mediated stiffening and impairment of a heat-induced, virus-inactivating structural change. A series of structurally modified virus particles with disrupted capsid-DNA interactions and/or distorted capsid cavities close to the DNA-binding sites were engineered and characterized, both in classic kinetics assays and by single-molecule mechanical analysis using atomic force microscopy. The rate constant of the virus inactivation reaction was found to decrease exponentially with the increase in elastic constant (stiffness) of the regions closer to DNA-binding sites. The application of transition state theory suggests that the height of the free energy barrier of the virus-inactivating structural transition increases linearly with local mechanical stiffness. From a virological perspective, the results indicate that infectious MVM particles may have acquired the biological advantage of increased survival under thermal stress by evolving architectural elements that rigidify the particle and impair non-productive structural changes. From a nanotechnological perspective, this study provides proof of principle that determination of mechanical stiffness and its manipulation by protein engineering may be applied for quantitatively probing and tuning the conformational dynamics of virus-based and other protein-based nanoassemblies.
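
    The transition-state argument in this abstract amounts to fitting ln(k) against stiffness and reading the slope as a change in activation free energy. The sketch below does exactly that for hypothetical inactivation rate constants measured on particles of different local stiffness; the numbers are illustrative only.

```python
import numpy as np

k_B_T = 4.11e-21   # thermal energy at ~298 K, in joules

# Hypothetical data: local elastic constant (N/m) and inactivation rate constant (1/s)
stiffness = np.array([0.55, 0.65, 0.75, 0.85, 0.95])
rate = np.array([3.2e-4, 1.4e-4, 6.5e-5, 2.8e-5, 1.3e-5])

# If ln(k) decreases linearly with stiffness, the activation free energy barrier
# increases linearly: delta(dG) = -k_B*T * slope * delta(stiffness)
slope, _ = np.polyfit(stiffness, np.log(rate), 1)
dG_per_stiffness = -k_B_T * slope                 # J per (N/m) of added stiffness
print(f"slope of ln(k) vs stiffness: {slope:.2f} per (N/m)")
print(f"barrier increase: {dG_per_stiffness / k_B_T:.2f} k_B*T per (N/m) of stiffness")
```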

  15. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis using Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this noise with zero mean dominates in the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effect of noise and bias error in using CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data with different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on OTV; and (2) the major factor that influences OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra, and methane/toluene mixtures gas spectra as measured using FT-IR spectrometry and CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors
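
    A simplified numerical sketch of the selective scheme is given below: wavenumbers with absorbance below a threshold keep an ordinary (CLS-like) weight, while those above it receive inverse-variance weights. The spectra, noise model, threshold and combination rule are all assumptions standing in for the quantities discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pure-component spectra (2 analytes x 400 wavenumbers) and a noisy mixture
K = np.vstack([np.exp(-0.5 * ((np.arange(400) - c) / 15.0) ** 2) for c in (120, 260)])
c_true = np.array([0.7, 0.3])
noise_sd = 0.002 + 0.02 * (K.T @ c_true)          # heteroscedastic noise model (assumed)
a = K.T @ c_true + rng.normal(scale=noise_sd)

def cls_fit(K, a):
    """Classical least squares: unweighted fit of a = K.T c."""
    return np.linalg.lstsq(K.T, a, rcond=None)[0]

def wls_fit(K, a, w):
    """Weighted least squares with per-wavenumber weights w."""
    Kw = K * w
    return np.linalg.solve(Kw @ K.T, Kw @ a)

# Selective scheme: ordinary weight below an absorbance threshold, inverse-variance above it
# (one plausible reading of SWLS; the paper's exact combination rule may differ)
threshold = 0.3
w = np.where(a < threshold, 1.0, (noise_sd.mean() / noise_sd) ** 2)

print("true:", c_true)
print("CLS :", cls_fit(K, a).round(3), " SWLS-style:", wls_fit(K, a, w).round(3))
```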

  16. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  17. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  18. 3D analysis of semiconductor devices: A combination of 3D imaging and 3D elemental analysis

    NASA Astrophysics Data System (ADS)

    Fu, Bianzhu; Gribelyuk, Michael A.

    2018-04-01

    3D analysis of semiconductor devices using a combination of scanning transmission electron microscopy (STEM) Z-contrast tomography and energy dispersive spectroscopy (EDS) elemental tomography is presented. 3D STEM Z-contrast tomography is useful in revealing the depth information of the sample. However, it suffers from contrast problems between materials with similar atomic numbers. Examples of EDS elemental tomography are presented using an automated EDS tomography system with batch data processing, which greatly reduces the data collection and processing time. 3D EDS elemental tomography reveals more in-depth information about the defect origin in semiconductor failure analysis. The influence of detector shadowing and X-ray absorption on the EDS tomography results is also discussed.

  19. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16-month period.
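
    The exact definition of the quantification confidence score is not given in this abstract; the sketch below only illustrates the outlier-filtering step with an assumed score based on total reporter-ion signal, not the authors' metric.

      import numpy as np

      def filter_psms(reporter_intensities, score_cutoff=0.5):
          """reporter_intensities: (n_psms, n_channels) iTRAQ reporter-ion intensities.
          Returns the retained rows and a boolean mask of which PSMs were kept."""
          total = reporter_intensities.sum(axis=1)
          score = total / np.median(total)          # crude confidence proxy (assumption)
          keep = score >= score_cutoff              # drop low-confidence quantifications
          return reporter_intensities[keep], keep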

  20. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
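
    A minimal sketch of the sub-model idea follows, assuming callable regression models and a simple distance-based blending rule; the actual ChemCam blending procedure may differ.

      import numpy as np

      def blended_prediction(spectrum, full_model, sub_models):
          """full_model: callable spectrum -> composition estimate over the full range.
          sub_models: list of (lo, hi, model) with composition ranges and callables."""
          first_pass = full_model(spectrum)          # full-range estimate picks the regime
          preds, weights = [], []
          for lo, hi, model in sub_models:
              center = 0.5 * (lo + hi)
              half = max(0.5 * (hi - lo), 1e-9)
              # weight falls off with distance of the first-pass estimate from the range
              w = max(0.0, 1.0 - abs(first_pass - center) / (2.0 * half))
              preds.append(model(spectrum))
              weights.append(w)
          weights = np.array(weights)
          if weights.sum() == 0:
              return first_pass
          return float(np.dot(weights, preds) / weights.sum())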

  1. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  2. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  3. Trace element analysis of soil type collected from the Manjung and central Perak

    NASA Astrophysics Data System (ADS)

    Azman, Muhammad Azfar; Hamzah, Suhaimi; Rahman, Shamsiah Abdul; Elias, Md Suhaimi; Abdullah, Nazaratul Ashifa; Hashim, Azian; Shukor, Shakirah Abd; Kamaruddin, Ahmad Hasnulhadi Che

    2015-04-01

    Trace elements in soils primarily originate from their parent materials. Parent material is the underlying geological material that has undergone different types of chemical weathering and leaching processes. Soil trace element concentrations may increase as a result of continuous input from various human activities, including power generation, agriculture, mining and manufacturing. This paper describes the Neutron Activation Analysis (NAA) method used for the determination of trace element concentrations, in parts per million (ppm), present in the terrestrial environment soil in Perak. The data may indicate any contamination by trace elements contributed by human activities in the area. Enrichment factors were used to check whether there is any contamination due to human activities (power plants, agriculture, mining, etc.); otherwise the values serve as baseline data for future study. The samples were collected from 27 locations of different soil series in the area at two different depths: the top soil (0-15cm) and the sub soil (15-30cm). The collected soil samples were air dried at 60°C and passed through a 2 µm sieve. Instrumental Neutron Activation Analysis (NAA) was used for the determination of trace elements. Samples were activated in the Nuclear Malaysia TRIGA Mark II reactor, followed by gamma spectrometric analysis. By activating the stable elements in the samples, the elements can be determined from the intensities of the gamma energies emitted by the respective radionuclides.
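
    The enrichment-factor check mentioned above follows the standard geochemical definition; a minimal sketch with an assumed reference element (Fe) and purely illustrative numbers:

      def enrichment_factor(c_element, c_ref, bg_element, bg_ref):
          """EF = (C_x / C_ref)_sample / (C_x / C_ref)_background.
          EF near 1 suggests a crustal origin; EF >> 1 suggests anthropogenic input."""
          return (c_element / c_ref) / (bg_element / bg_ref)

      # e.g. Pb normalised to Fe against local background values (illustrative numbers only)
      ef_pb = enrichment_factor(c_element=45.0, c_ref=30000.0,
                                bg_element=20.0, bg_ref=35000.0)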

  4. Trace element analysis of soil type collected from the Manjung and central Perak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azman, Muhammad Azfar, E-mail: m-azfar@nuclearmalaysia.gov.my; Hamzah, Suhaimi; Rahman, Shamsiah Abdul

    2015-04-29

    Trace elements in soils primarily originate from their parent materials. Parent material is the underlying geological material that has undergone different types of chemical weathering and leaching processes. Soil trace element concentrations may increase as a result of continuous input from various human activities, including power generation, agriculture, mining and manufacturing. This paper describes the Neutron Activation Analysis (NAA) method used for the determination of trace element concentrations, in parts per million (ppm), present in the terrestrial environment soil in Perak. The data may indicate any contamination by trace elements contributed by human activities in the area. Enrichment factors were used to check whether there is any contamination due to human activities (power plants, agriculture, mining, etc.); otherwise the values serve as baseline data for future study. The samples were collected from 27 locations of different soil series in the area at two different depths: the top soil (0-15cm) and the sub soil (15-30cm). The collected soil samples were air dried at 60°C and passed through a 2 µm sieve. Instrumental Neutron Activation Analysis (NAA) was used for the determination of trace elements. Samples were activated in the Nuclear Malaysia TRIGA Mark II reactor, followed by gamma spectrometric analysis. By activating the stable elements in the samples, the elements can be determined from the intensities of the gamma energies emitted by the respective radionuclides.

  5. Reliability analysis of dispersion nuclear fuel elements

    NASA Astrophysics Data System (ADS)

    Ding, Shurong; Jiang, Xin; Huo, Yongzhong; Li, Lin an

    2008-03-01

    Treating a dispersion fuel element as a special particulate composite, a representative volume element is chosen as the object of study. The fuel swelling is simulated through a temperature increase. A large-strain elastoplastic analysis of the mechanical behavior is carried out using FEM. The results indicate that the fission swelling is simulated successfully and that the thickness increments grow linearly with burnup. With increasing burnup, (1) the first principal stresses in the fuel particles change from tensile to compressive, and (2) the maximum Mises stresses in the particles move from the particle centers toward the particle-matrix interfaces, with values that increase with burnup; the maximum Mises stresses in the matrix occur midway between two particles near the mid-plane along the length (or width) direction, where the maximum plastic strains are also found.

  6. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  7. Quantitative simultaneous multi-element microprobe analysis using combined wavelength and energy dispersive systems

    NASA Technical Reports Server (NTRS)

    Walter, L. S.; Doan, A. S., Jr.; Wood, F. M., Jr.; Bredekamp, J. H.

    1972-01-01

    A combined WDS-EDS system obviates the severe X-ray peak overlap problems encountered with Na, Mg, Al and Si that are common to pure EDS systems. By applying easily measured empirical correction factors for pulse pile-up and the peak overlaps normally observed in the analysis of silicate minerals, the accuracy of analysis is comparable with that expected for WDS electron microprobe analyses. The continuum backgrounds are subtracted from the spectra by a spline-fitting technique based on integrated intensities between the peaks. The preprocessed data are then reduced to chemical analyses by existing data reduction programs.

  8. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Astrophysics Data System (ADS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; McCleary, S. L.

    1991-05-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  9. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Technical Reports Server (NTRS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; Mccleary, S. L.

    1991-01-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  10. Power flows and Mechanical Intensities in structural finite element analysis

    NASA Technical Reports Server (NTRS)

    Hambric, Stephen A.

    1989-01-01

    The identification of power flow paths in dynamically loaded structures is an important, but currently unavailable, capability for the finite element analyst. For this reason, methods for calculating power flows and mechanical intensities in finite element models are developed here. Formulations for calculating input and output powers, power flows, mechanical intensities, and power dissipations for beam, plate, and solid element types are derived. NASTRAN is used to calculate the required velocity, force, and stress results of an analysis, which a post-processor then uses to calculate power flow quantities. The SDRC I-deas Supertab module is used to view the final results. Test models include a simple truss and a beam-stiffened cantilever plate. Both test cases showed reasonable power flow fields over low to medium frequencies, with accurate power balances. Future work will include testing with more complex models, developing an interactive graphics program to view the analysis results easily and efficiently, applying shape optimization methods to the problem with power flow variables as design constraints, and adding the power flow capability to NASTRAN.
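
    The core post-processing step amounts to forming time-averaged power quantities from the complex force and velocity results of a frequency-domain solution. A minimal sketch of that standard formula, with assumed variable names, is shown below.

      import numpy as np

      def time_averaged_power(force, velocity):
          """force, velocity: complex amplitudes at one frequency for the same node/DOF.
          Returns the time-averaged power, P = 0.5 * Re(F * conj(v))."""
          return 0.5 * np.real(force * np.conj(velocity))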

  11. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing significant differences (p = 0.009 for minimal vs. mild, p < 0.001 for the other comparisons). Spearman's correlation test showed a strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% in the fourth week. The difference between the second and fourth weeks was significant (p = 0.005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not in the other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
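
    A voxel-wise implementation of the BPE equation quoted above might look as follows; this is a Python rendering for illustration (the array names and the clipping of the baseline are assumptions), not the authors' in-house MATLAB software.

      import numpy as np

      def background_parenchymal_enhancement(baseline, post_contrast):
          """baseline, post_contrast: signal-intensity arrays covering the whole breast.
          BPE(%) = (SI_post - SI_baseline) / SI_baseline * 100, evaluated per voxel."""
          baseline = baseline.astype(float)
          bpe = (post_contrast - baseline) / np.clip(baseline, 1e-6, None) * 100.0
          return bpe, float(np.mean(bpe))   # voxel-wise map and whole-breast mean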

  12. STARS: A general-purpose finite element computer program for analysis of engineering structures

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1984-01-01

    STARS (Structural Analysis Routines) is primarily an interactive, graphics-oriented, finite-element computer program for analyzing the static, stability, free vibration, and dynamic responses of damped and undamped structures, including rotating systems. The element library consists of one-dimensional (1-D) line elements, two-dimensional (2-D) triangular and quadrilateral shell elements, and three-dimensional (3-D) tetrahedral and hexahedral solid elements. These elements enable the solution of structural problems that include truss, beam, space frame, plane, plate, shell, and solid structures, or any combination thereof. Zero, finite, and interdependent deflection boundary conditions can be implemented by the program. The associated dynamic response analysis capability provides for initial deformation and velocity inputs, whereas the transient excitation may be either forces or accelerations. An effective in-core or out-of-core solution strategy is automatically employed by the program, depending on the size of the problem. Data input may be at random within a data set, and the program offers certain automatic data-generation features. Input data are formatted as an optimal combination of free and fixed formats. Interactive graphics capabilities enable convenient display of nodal deformations, mode shapes, and element stresses.

  13. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    PubMed

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four approaches: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline position. Visual grading was done by inspecting the shape of the diagram and classifying it into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
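
    Two of the quantitative indices named above (the spread of the phase-space coordinates and the fundamental-peak height A1) can be sketched as follows; the signal conditioning is deliberately simplified and the variable names are assumptions, not the study's exact processing chain.

      import numpy as np

      def respiration_indices(signal, dt):
          """signal: respiratory amplitude samples; dt: sampling interval in seconds."""
          x = signal - np.mean(signal)
          v = np.gradient(x, dt)                  # velocity coordinate of the phase space
          sd_x, sd_v = np.std(x), np.std(v)       # spread of the phase-space/Poincaré points
          spectrum = np.abs(np.fft.rfft(x)) / len(x)
          freqs = np.fft.rfftfreq(len(x), dt)
          a1 = spectrum[1:].max()                 # height of the dominant (fundamental) peak
          f1 = freqs[1:][np.argmax(spectrum[1:])]
          return sd_x, sd_v, a1, f1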

  14. [Distribution Characteristics and Source Analysis of Dustfall Trace Elements During Winter in Beijing].

    PubMed

    Xiong, Qiu-lin; Zhao, Wen-ji; Guo, Xiao-yu; Chen, Fan-tao; Shu, Tong-tong; Zheng, Xiao-xia; Zhao, Wen-hui

    2015-08-01

    The dustfall content is one of the evaluation indexes of atmospheric pollution. Trace elements, especially heavy metals, in dustfall can pose risks to the ecological environment and human health. In order to study the distribution characteristics of trace elements, heavy metal pollution and their sources in winter atmospheric dust, 49 dustfall samples were collected in Beijing City and its surroundings from November 2013 to March 2014. The contents (mass percentages) of 40 trace elements were then measured by an ELAN DRC II inductively coupled plasma mass spectrometer (ICP-MS). Test results showed that more than half of the trace elements in the dust were below 10 mg x kg(-1); about a quarter were between 10 and 100 mg x kg(-1); while 7 elements (Pb, Zr, Cr, Cu, Zn, Sr and Ba) exceeded 100 mg x kg(-1). The contents of Pb, Cu, Zn, Bi, Cd and Mo in winter dustfall in Beijing City were, respectively, 4.18, 4.66, 5.35, 6.31, 6.62, and 8.62 times as high as those of the corresponding elements in the surface soil during the same period, exceeding the soil background values by more than 300%. The contribution of human activities to the trace heavy metal content of dustfall in Beijing City was larger than that in the surrounding region. Source analysis of the dustfall and its 20 main trace elements (Cd, Mo, Nb, Ga, Co, Y, Nd, Li, La, Ni, Rb, V, Ce, Pb, Zr, Cr, Cu, Zn, Sr, Ba) was then conducted through a multi-method analysis, including Pearson correlation analysis, Kendall correlation coefficient analysis and principal component analysis. The results indicated that the sources of winter dustfall in Beijing City mainly comprised crustal sources (including road dust, construction dust and long-range transported dust) and the burning of fossil fuels (vehicle emissions, coal combustion, biomass combustion and industrial processes).

  15. X-ray STM: Nanoscale elemental analysis & Observation of atomic track.

    PubMed

    Saito, Akira; Furudate, Y; Kusui, Y; Saito, T; Akai-Kasaya, M; Tanaka, Y; Tamasaku, K; Kohmura, Y; Ishikawa, T; Kuwahara, Y; Aono, M

    2014-11-01

    Scanning tunneling microscopy (STM) combined with brilliant X-rays from synchrotron radiation (SR) offers a range of original and important applications, such as elemental analysis on solid surfaces at the atomic scale. The principle of the elemental analysis is based on the inner-shell excitation of an element-specific energy level "under STM observation". The key to obtaining atomic locality is to extract the element-specific modulation of the local tunneling current (not the emission, which can degrade the spatial resolution) derived from the inner-shell excitation [1]. For this purpose, we developed a dedicated SR-STM system and a smart tip. To overcome the tiny core-excitation efficiency of hard X-rays, we focused the incident beam two-dimensionally to the highest photon density at SPring-8. After successful elemental analyses by SR-STM [1,2] on a semiconductor hetero-interface (Ge on Si) and a metal-semiconductor interface (Cu on Ge), we succeeded in obtaining elemental contrast between Co nano-islands and an Au substrate. The results on the metallic substrate suggest the generality of the method and give important implications for the principle of contrast. For all three samples, the spatial resolution of the analysis was estimated to be ∼1 nm or less, and it is worth noting that the measured surface domains had a deposition thickness of less than one atomic layer (Fig. 1, left and center). [Fig. 1: (left) topographic image and (center) beam-induced tip-current image of Ge(111)-Cu (-2 V, 0.2 nA); (right) X-ray-induced atomic-motion tracks on Ge(111), newly imaged by the X-ray STM.] On the other hand, we found that X-ray-induced atomic motion can be observed directly at the atomic scale using the SR-STM system under an incident photon density of ∼2 x 10(15) photons/sec/mm(2) [3]. SR-STM successfully visualized the track of the atomic motion (Fig. 1, right

  16. PROTON MICROPROBE ANALYSIS OF TRACE-ELEMENT VARIATIONS IN VITRINITES IN THE SAME AND DIFFERENT COAL BEDS.

    USGS Publications Warehouse

    Minkin, J.A.; Chao, E.C.T.; Blank, Herma; Dulong, F.T.

    1987-01-01

    The PIXE (proton-induced X-ray emission) microprobe can be used for nondestructive, in-situ analyses of areas as small as those analyzed by the electron microprobe, and has a sensitivity of detection as much as two orders of magnitude better than the electron microprobe. Preliminary studies demonstrated that PIXE provides a capability for quantitative determination of elemental concentrations in individual coal maceral grains with a detection limit of 1-10 ppm for most elements analyzed. Encouraged by the earlier results, we carried out the analyses reported below to examine trace element variations laterally (over a km range) as well as vertically (cm to m) in the I and J coal beds in the Upper Cretaceous Ferron Sandstone Member of the Mancos Shale in central Utah, and to compare the data with the data from two samples of eastern coals of Pennsylvanian age.

  17. Geochemical variations of rare earth elements in Marcellus shale flowback waters and multiple-source cores in the Appalachian Basin

    NASA Astrophysics Data System (ADS)

    Noack, C.; Jain, J.; Hakala, A.; Schroeder, K.; Dzombak, D. A.; Karamalidis, A.

    2013-12-01

    Rare earth elements (REE) - encompassing the naturally occurring lanthanides, yttrium, and scandium - are potential tracers for subsurface groundwater-brine flows and geochemical processes. Application of these elements as naturally occurring tracers during shale gas development is reliant on accurate quantitation of trace metals in hypersaline brines. We have modified and validated a liquid-liquid technique for extraction and pre-concentration of REE from saline produced waters from shale gas extraction wells with quantitative analysis by ICP-MS. This method was used to analyze time-series samples of Marcellus shale flowback and produced waters. Additionally, the total REE content of core samples of various strata throughout the Appalachian Basin were determined using HF/HNO3 digestion and ICP-MS analysis. A primary goal of the study is to elucidate systematic geochemical variations as a function of location or shale characteristics. Statistical testing will be performed to study temporal variability of inter-element relationships and explore associations between REE abundance and major solution chemistry. The results of these analyses and discussion of their significance will be presented.

  18. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This supports that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of under developed and developing countries for cost reduction and productivity

  19. Trace elements by instrumental neutron activation analysis for pollution monitoring

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.

    1975-01-01

    Methods and technology were developed to analyze 1000 samples/yr of coal and other pollution-related samples. The complete trace element analysis of 20-24 samples/wk averaged 3-3.5 man-hours/sample. The computerized data reduction scheme could identify and report data on as many as 56 elements. In addition to coal, samples of fly ash, bottom ash, crude oil, fuel oil, residual oil, gasoline, jet fuel, kerosene, filtered air particulates, ore, stack scrubber water, clam tissue, crab shells, river sediment and water, and corn were analyzed. Precision of the method was plus or minus 25% based on all elements reported in coal and other sample matrices. Overall accuracy was estimated at 50%.

  20. Refinement of Out of Circularity and Thickness Measurements of a Cylinder for Finite Element Analysis

    DTIC Science & Technology

    2016-09-01

    ...significant effect on the collapse strength and must be accurately represented in finite element analysis to obtain accurate results. Often it is necessary...to interpolate measurements from a relatively coarse grid to a refined finite element model and methods that have wide general acceptance are

  1. Laser ablation-miniature mass spectrometer for elemental and isotopic analysis of rocks.

    PubMed

    Sinha, M P; Neidholdt, E L; Hurowitz, J; Sturhahn, W; Beard, B; Hecht, M H

    2011-09-01

    A laser ablation-miniature mass spectrometer (LA-MMS) for the chemical and isotopic measurement of rocks and minerals is described. In the LA-MMS method, neutral atoms ablated by a pulsed laser are led into an electron impact ionization source, where they are ionized by a 70 eV electron beam. This results in a secondary ion pulse typically 10-100 μs wide, compared to the original 5-10 ns laser pulse duration. Ions of different masses are then spatially dispersed along the focal plane of the magnetic sector of the miniature mass spectrometer (MMS) and measured in parallel by a modified CCD array detector capable of detecting ions directly. Compared to conventional scanning techniques, simultaneous measurement of the ion pulse along the focal plane effectively offers a 100% duty cycle over a wide mass range. LA-MMS offers a more quantitative assessment of elemental composition than techniques that detect ions generated directly by the ablation process, because the latter can be strongly influenced by matrix effects that vary with the structure and geometry of the surface, the wavelength of the laser beam, and the poorly characterized ionization efficiencies of the elements in the process. These problems attendant to direct ion analysis are minimized in the LA-MMS by analyzing the ablated neutral species after post-ionization by electron impact. These neutral species are much more abundant than the directly ablated ions in the ablated vapor plume and are, therefore, expected to be characteristic of the chemical composition of the solid. Also, the electron impact ionization of elements is well studied, and their ionization cross sections are known and easy to find in databases. Currently, the LA-MMS limit of detection is 0.4 wt.%. Here we describe LA-MMS elemental composition measurements of various minerals including microcline, lepidolite, anorthoclase, and USGS BCR-2G samples. The measurements of high precision isotopic ratios including (41)K

  2. Analysis of aircraft tires via semianalytic finite elements

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Kim, Kyun O.; Tanner, John A.

    1990-01-01

    A computational procedure is presented for the geometrically nonlinear analysis of aircraft tires. The tire was modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The four key elements of the procedure are: (1) semianalytic finite elements in which the shell variables are represented by Fourier series in the circumferential direction and piecewise polynomials in the meridional direction; (2) a mixed formulation with the fundamental unknowns consisting of strain parameters, stress-resultant parameters, and generalized displacements; (3) multilevel operator splitting to effect successive simplifications, and to uncouple the equations associated with different Fourier harmonics; and (4) multilevel iterative procedures and reduction techniques to generate the response of the shell.

  3. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
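
    A hedged sketch of the multivariate reference idea, assuming scikit-learn is available: a PLS regression maps the spectral region containing the fused-silica and sapphire reference peaks to the delivered laser power. Array names, the number of components, and the choice of spectral region are assumptions for illustration.

      from sklearn.cross_decomposition import PLSRegression

      # X_ref: (n_spectra, n_channels) intensities in the reference-peak region (assumed data)
      # power: (n_spectra,) measured laser excitation power in mW
      def fit_power_model(X_ref, power, n_components=3):
          pls = PLSRegression(n_components=n_components)
          pls.fit(X_ref, power)
          return pls    # pls.predict(new_reference_spectra) estimates power in real time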

  4. Development of quantitative exposure data for a pooled exposure-response analysis of 10 silica cohorts.

    PubMed

    Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa

    2002-08-01

    Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.

  5. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; SEA methods, for their part, can only predict an averaged response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to verify the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  6. Quantitative assessment of metal elements using moss species as biomonitors in downwind area of lead-zinc mine.

    PubMed

    Balabanova, Biljana; Stafilov, Trajče; Šajn, Robert; Andonovska, Katerina Bačeva

    2017-02-23

    Distributions of a total of 21 elements were monitored in a significantly lead-zinc polluted area using the moss species Hypnum cupressiforme and Camptothecium lutescens interchangeably, covering a denser sampling network. Interspecies comparison was conducted using Box-Cox transformed values because of their skewed distribution. The median concentrations of trace elements in both mosses examined decreased in the following order: Fe>Mn>Zn>Pb>Cu>Ni∼Cr∼As>Co>Cd>Hg. For almost all analyzed elements, H. cupressiforme revealed higher bio-accumulative abilities; for arsenic content, the ER value favored C. lutescens. The ER of the element contents as a function of distance from the pollution source in selected areas was significantly enriched for the anthropogenically introduced elements As, Cd, Cu, Pb and Zn. After Box-Cox transformation of the content values, TB was significantly different for As (4.82), Cd (3.84), Cu (2.95), Pb (4.38), and Zn (4.23). Multivariate factor analysis singled out four elemental associations: F1 (Al-Co-Cr-Fe-Li-Ni-V), F2 (Cd-Pb-Zn), F3 (Ca-Mg-Na-P) and F4 (Cu), with a total variance of 89%. Spatial distribution maps visualized hazardously high "hot spot" contents of Cd > 1.30 mg/kg, Cu > 22 mg/kg, Pb > 130 mg/kg and Zn > 160 mg/kg. Therefore, the main approach in moss biomonitoring should be based on managing the element-distribution data by reducing the effect of extreme values (considering Box-Cox data transformation); the interspecies variation in the sampling media does not deviate between H. cupressiforme and C. lutescens.
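
    The Box-Cox step mentioned above can be reproduced with SciPy's standard implementation; the sketch below is illustrative only and assumes strictly positive element contents as input.

      import numpy as np
      from scipy import stats

      def boxcox_contents(concentrations):
          """concentrations: positive element contents (e.g. mg/kg) across sampling sites."""
          transformed, lam = stats.boxcox(np.asarray(concentrations, dtype=float))
          return transformed, lam   # lam is the fitted Box-Cox exponent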

  7. Inorganic elemental determinations of marine traditional Chinese Medicine Meretricis concha from Jiaozhou Bay: The construction of inorganic elemental fingerprint based on chemometric analysis

    NASA Astrophysics Data System (ADS)

    Shao, Mingying; Li, Xuejie; Zheng, Kang; Jiang, Man; Yan, Cuiwei; Li, Yantuan

    2016-04-01

    The goal of this paper is to explore the relationship between the inorganic elemental fingerprint and the geographical origin identification of Meretricis concha, which is a commonly used marine traditional Chinese medicine (TCM) for the treatment of asthma and scald burns. For that, the inorganic elemental contents of Meretricis concha from five sampling points in Jiaozhou Bay were determined by means of inductively coupled plasma optical emission spectrometry, and comparative investigations based on the contents of 14 inorganic elements (Al, As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Mo, Ni, Pb, Se and Zn) in the samples from Jiaozhou Bay and the previously reported Rushan Bay were performed. It was found that the samples from the two bays are approximately classified into two groups using hierarchical cluster analysis, a four-factor model based on principal component analysis could explain approximately 75% of the detection data, and linear discriminant analysis can be used to develop a prediction model to distinguish the samples from Jiaozhou Bay and Rushan Bay with an accuracy of about 93%. The results of the present investigation suggest that the inorganic elemental fingerprint, based on the combination of measured elemental content and chemometric analysis, is a promising approach for verifying the geographical origin of Meretricis concha, and this strategy should be valuable for the authenticity discrimination of some marine TCM.
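
    A sketch of the discriminant-analysis step, assuming scikit-learn and leave-one-out cross-validation (the original validation scheme is not specified in the abstract); array names are assumptions.

      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score, LeaveOneOut

      def origin_model(element_contents, bay_labels):
          """element_contents: (n_samples, 14) element contents; bay_labels: 0 = Jiaozhou, 1 = Rushan."""
          lda = LinearDiscriminantAnalysis()
          acc = cross_val_score(lda, element_contents, bay_labels, cv=LeaveOneOut()).mean()
          lda.fit(element_contents, bay_labels)     # final model trained on all samples
          return lda, acc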

  8. Finite element analysis of thrust angle contact ball slewing bearing

    NASA Astrophysics Data System (ADS)

    Deng, Biao; Guo, Yuan; Zhang, An; Tang, Shengjin

    2017-12-01

    Because a large, heavy slewing bearing no longer follows the rigid-ring hypothesis under load, a solid finite element model of a thrust angular-contact ball bearing was established using the finite element analysis software ANSYS. The boundary conditions of the model were set according to the actual service conditions of the slewing bearing, the internal stress state of the slewing bearing was obtained by solution, and the calculated results were compared with numerical results based on the rigid-ring assumption. The results show that more balls are loaded in the finite element solution, and the maximum contact stresses between ball and raceway are somewhat reduced. This is because the finite element method treats the ring as an elastic body: the ring undergoes structural deformation in the radial plane when a heavily loaded slewing bearing is subjected to external loads. The results of the finite element method are therefore more in line with the actual behavior of the slewing bearing in engineering practice.

  9. Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel

    NASA Astrophysics Data System (ADS)

    Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying

    2018-05-01

    Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.

  10. Characterization of breast lesion using T1-perfusion magnetic resonance imaging: Qualitative vs. quantitative analysis.

    PubMed

    Thakran, S; Gupta, P K; Kabra, V; Saha, I; Jain, P; Gupta, R K; Singh, A

    2018-06-14

    The objective of this study was to quantify the hemodynamic parameters using first-pass analysis of T1-perfusion magnetic resonance imaging (MRI) data of the human breast and to compare these parameters with the existing tracer kinetic parameters and with semi-quantitative and qualitative T1-perfusion analysis in terms of lesion characterization. MRI of the breast was performed in 50 women (mean age, 44±11 [SD] years; range: 26-75 years) with a total of 15 benign and 35 malignant breast lesions. After pre-processing, the T1-perfusion MRI data were analyzed using a qualitative approach by two radiologists (visual inspection of the kinetic curve into types I, II or III), a semi-quantitative approach (characterization of kinetic curve types using empirical parameters), a generalized tracer kinetic model (tracer kinetic parameters) and first-pass analysis (hemodynamic parameters). The chi-squared test, t-test, one-way analysis of variance (ANOVA) with Bonferroni post-hoc test, and receiver operating characteristic (ROC) curves were used for statistical analysis. All quantitative parameters except leakage volume (Ve), qualitative curves (types I and III) and semi-quantitative curves (types I and III) provided significant differences (P<0.05) between benign and malignant lesions. Kinetic parameters, particularly the volume transfer coefficient (Ktrans), provided a significant difference (P<0.05) between all grades except grade II vs. III. The hemodynamic parameter relative leakage-corrected breast blood volume (rBBVcorr) provided a statistically significant difference (P<0.05) between all grades. It also provided the highest sensitivity and specificity among all parameters in differentiating between different grades of malignant breast lesions. Quantitative parameters, particularly rBBVcorr and Ktrans, provided similar sensitivity and specificity in differentiating benign from malignant breast lesions for this cohort. Moreover, rBBVcorr provided better differentiation between different grades of malignant breast

  11. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    PubMed

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  12. Quantitative 3D investigation of Neuronal network in mouse spinal cord model

    NASA Astrophysics Data System (ADS)

    Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.

    2017-01-01

    The investigation of the neuronal network in mouse spinal cord models represents the basis for the research on neurodegenerative diseases. In this framework, the quantitative analysis of the single elements in different districts is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow for a quantitative investigation of the neuronal network. Exploiting the high coherence and the high flux of synchrotron sources, X-ray Phase-Contrast multiscale-Tomography allows for the 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal-network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis on the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mice spinal cord. Then, we compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a “database” for the characterization of the neuronal network main features for a comparative investigation of neurodegenerative diseases and therapies.

  13. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
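
    The rank-agreement check described above corresponds to a single call in SciPy; a minimal sketch with assumed rank vectors (one from image analysis, one from a reviewer) is shown below.

      from scipy import stats

      def ranking_agreement(quantitative_rank, reviewer_rank):
          """Returns Kendall's tau and its p-value for two rankings of the same images."""
          tau, p_value = stats.kendalltau(quantitative_rank, reviewer_rank)
          return tau, p_value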

  14. Challenges in Integrating Nondestructive Evaluation and Finite Element Methods for Realistic Structural Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Zagidulin, Dmitri; Rauser, Richard W.

    2000-01-01

    Capabilities and expertise related to the development of links between nondestructive evaluation (NDE) and finite element analysis (FEA) at Glenn Research Center (GRC) are demonstrated. Current tools for analyzing data produced by computed tomography (CT) scans are exercised to help assess the damage state in high-temperature structural composite materials. A utility translator was written to convert Velocity (an image-processing software) STL data files to a suitable CAD-FEA file format. Finite element analyses are carried out with MARC, a commercial nonlinear finite element code, and the analytical results are discussed. Modeling was established by building a model with MSC/Patran (a pre- and post-processing finite element package) and comparing it to a model generated by Velocity in conjunction with MSC/Patran Graphics. Modeling issues and results are discussed in this paper. The entire process that outlines the tie between the data extracted via NDE and the finite element modeling and analysis is fully described.

  15. Quantitative analysis of dinuclear manganese(II) EPR spectra

    NASA Astrophysics Data System (ADS)

    Golombek, Adina P.; Hendrich, Michael P.

    2003-11-01

    A quantitative method for the analysis of EPR spectra from dinuclear Mn(II) complexes is presented. The complex [(Me3TACN)2Mn(II)2(μ-OAc)3]BPh4 (1) (Me3TACN = N,N',N''-trimethyl-1,4,7-triazacyclononane; OAc = acetate(1-); BPh4 = tetraphenylborate(1-)) was studied with EPR spectroscopy at X- and Q-band frequencies, for both perpendicular and parallel polarizations of the microwave field, and with variable temperature (2-50 K). Complex 1 is an antiferromagnetically coupled dimer which shows signals from all excited spin manifolds, S=1 to 5. The spectra were simulated with diagonalization of the full spin Hamiltonian which includes the Zeeman and zero-field splittings of the individual manganese sites within the dimer, the exchange and dipolar coupling between the two manganese sites of the dimer, and the nuclear hyperfine coupling for each manganese ion. All possible transitions for all spin manifolds were simulated, with the intensities determined from the calculated probability of each transition. In addition, the non-uniform broadening of all resonances was quantitatively predicted using a lineshape model based on D- and r-strain. As the temperature is increased from 2 K, an 11-line hyperfine pattern characteristic of dinuclear Mn(II) is first observed from the S=3 manifold. D- and r-strain are the dominant broadening effects that determine where the hyperfine pattern will be resolved. A single unique parameter set was found to simulate all spectra arising for all temperatures, microwave frequencies, and microwave modes. The simulations are quantitative, allowing for the first time the determination of species concentrations directly from EPR spectra. Thus, this work describes the first method for the quantitative characterization of EPR spectra of dinuclear manganese centers in model complexes and proteins. The exchange coupling parameter J for complex 1 was determined (J = -1.5±0.3 cm-1; Hex = -2J S1·S2) and found to be in agreement with a previous
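
    For orientation, the terms listed in the abstract correspond to a spin Hamiltonian of the generic form below (schematic only; apart from the exchange convention quoted above, the exact conventions and parameter values of the paper are not reproduced here). In LaTeX notation:

        \hat{H} = \sum_{i=1}^{2}\left[\mu_B\,\mathbf{B}\cdot\mathbf{g}_i\cdot\hat{\mathbf{S}}_i
                  + \hat{\mathbf{S}}_i\cdot\mathbf{D}_i\cdot\hat{\mathbf{S}}_i
                  + \hat{\mathbf{S}}_i\cdot\mathbf{A}_i\cdot\hat{\mathbf{I}}_i\right]
                  - 2J\,\hat{\mathbf{S}}_1\cdot\hat{\mathbf{S}}_2
                  + \hat{\mathbf{S}}_1\cdot\mathbf{D}_{12}\cdot\hat{\mathbf{S}}_2

    Here the Zeeman, single-site zero-field and hyperfine terms appear once per manganese ion, the isotropic exchange term follows the abstract's convention Hex = -2J S1·S2 with the fitted J quoted above, and D12 collects the intersite dipolar coupling.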

  16. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    NASA Astrophysics Data System (ADS)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of digital holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  17. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing-element discretization, producing strains and first strain gradients of superior accuracy. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution. The recovered strain gradients are subsequently employed in the integration of equilibrium equations to obtain accurate interlaminar shear stresses. The problem studied is a simply supported rectangular plate under a doubly sinusoidal load, which has an exact analytic solution that serves as a measure of goodness of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.
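
    A schematic of the kind of combined functional described above (an assumed form for illustration, not the paper's exact expression) is, in LaTeX notation:

        \Phi(\varepsilon,\boldsymbol{\theta}) \;=\; \tfrac{1}{2}\sum_{k=1}^{N}\left[\varepsilon(x_k,y_k)-\varepsilon_k^{\mathrm{FE}}\right]^{2}
        \;+\; \frac{\lambda}{2}\int_{\Omega}\left\|\nabla\varepsilon-\boldsymbol{\theta}\right\|^{2}\,d\Omega ,

    where the epsilon_k^FE are the discrete finite element strains at the N sampling points, epsilon is the smoothed strain field, theta is an independently interpolated gradient field enforced through the penalty term with parameter lambda, and minimization over the smoothing-element discretization yields the near-C1-continuous strains and gradients that are then integrated through the thickness to recover the interlaminar shear stresses.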

  18. Finite element analysis of the cyclic indentation of bilayer enamel

    NASA Astrophysics Data System (ADS)

    Jia, Yunfei; Xuan, Fu-zhen; Chen, Xiaoping; Yang, Fuqian

    2014-04-01

    Tooth enamel is often subjected to repeated contact and often experiences contact deformation in daily life. The mechanical strength of the enamel determines the biofunctionality of the tooth. Considering the variation of the rod arrangement in outer and inner enamel, we approximate enamel as a bilayer structure and perform finite element analysis of the cyclic indentation of the bilayer structure, to mimic the repeated contact of enamel during mastication. The dynamic deformation behaviour of both the inner enamel and the bilayer enamel is examined. The material parameters of the inner and outer enamel used in the analysis are obtained by fitting the finite element results with the experimental nanoindentation results. The penetration depth per cycle at the quasi-steady state is used to describe the depth propagation speed, which exhibits a two-stage power-law dependence on the maximum indentation load and the amplitude of the cyclic load, respectively. The continuous penetration of the indenter reflects the propagation of the plastic zone during cyclic indentation, which is related to the energy dissipation. The outer enamel serves as a protective layer due to its great resistance to contact deformation in comparison to the inner enamel. The larger equivalent plastic strain and lower stresses in the inner enamel during cyclic indentation, as calculated from the finite element analysis, indicate better crack/fracture resistance of the inner enamel.
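
    The two-stage power-law dependence mentioned above can be written schematically as follows (the exponents and prefactors are placeholders, not values from the paper):

        \left.\frac{\Delta h}{\Delta N}\right|_{\mathrm{steady}} \;\propto\; P_{\max}^{\,m_1}
        \qquad\text{and}\qquad
        \left.\frac{\Delta h}{\Delta N}\right|_{\mathrm{steady}} \;\propto\; \Delta P^{\,m_2},

    where Delta h / Delta N is the penetration depth per cycle at the quasi-steady state, P_max the maximum indentation load, Delta P the cyclic load amplitude, and m_1, m_2 stage-dependent exponents.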

  19. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber-bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best-performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved an average sensitivity and specificity of 87% and 61%, respectively. The best-performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
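
    A minimal sketch of this type of two-feature classification and its evaluation against histopathology labels is given below; the feature values, class sizes and linear-discriminant classifier are hypothetical and stand in for the study's actual textural features and algorithm.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(0)
        X_benign    = rng.normal([0.3, 0.4], 0.1, size=(40, 2))   # two textural features per imaged site
        X_neoplasia = rng.normal([0.6, 0.7], 0.1, size=(40, 2))
        X = np.vstack([X_benign, X_neoplasia])
        y = np.array([0] * 40 + [1] * 40)                         # 1 = neoplastic by histopathology

        clf = LinearDiscriminantAnalysis().fit(X, y)
        tn, fp, fn, tp = confusion_matrix(y, clf.predict(X)).ravel()
        print("sensitivity =", tp / (tp + fn), "specificity =", tn / (tn + fp))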

  20. Trace element analysis by PIXE in several biomedical fields

    NASA Astrophysics Data System (ADS)

    Weber, G.; Robaye, G.; Bartsch, P.; Collignon, A.; Beguin, Y.; Roelandts, I.; Delbrouck, J. M.

    1984-04-01

    Since 1980, trace element analysis by PIXE has been developed in several directions at the University of Liège, among them the elemental composition of lung parenchyma, hilar lymph nodes, and blood in hematological disorders and renal insufficiency. The trace element content of lung tumors and surrounding tissue is measured and compared to similar data previously obtained on unselected patients of comparable ages. The normalization of the bromine deficiency observed in hemodialyzed patients is achieved by using a dialyzing bath doped with NaBr in order to obtain a normal bromine level of 5.7 μg/ml. The content of Cu, Zn, Br and Se in blood serum from more than 100 patients suffering from malignant hemopathy has been measured. The results are compared with a reference group. These oligoelements have also been measured sequentially in patients under intensive chemotherapy for acute myeloid leukemia.

  1. Laser-Induced Breakdown Spectroscopy Instrument for Element Analysis of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Blacic, J.; Pettit, D.; Cremers, D.; Roessler, N.

    1993-01-01

    One of the most fundamental pieces of information about any planetary body is the elemental and mineralogical composition of its surface materials. We are developing an instrument to obtain such data at ranges of up to several hundreds of meters using the technique of Laser-Induced Breakdown Spectroscopy (LIBS). We envision our instrument being used from a spacecraft in close rendezvous with small bodies such as comets and asteroids, or deployed on surface-rover vehicles on large bodies such as Mars and the Moon. The elemental analysis is based on atomic emission spectroscopy of a laser-induced plasma or spark. A pulsed, diode pumped Nd:YAG laser of several hundred millijoules optical energy is used to vaporize and electronically excite the constituent elements of a rock surface remotely located from the laser. Light emitted from the excited plasma is collected and introduced to the entrance slit of a small grating spectrometer. The spectrally dispersed spark light is detected with either a linear photo diode array or area CCD array. When the latter detector is used, the optical and spectrometer components of the LIBS instrument can also be used in a passive imaging mode to collect and integrate reflected sunlight from the same rock surface. Absorption spectral analysis of this reflected light gives mineralogical information that provides a remote geochemical characterization of the rock surface. We performed laboratory calibrations in air and in vacuum on standard rock powders to quantify the LIBS analysis. We performed preliminary field tests using commercially available components to demonstrate remote LIBS analysis of terrestrial rock surfaces at ranges of over 25 m, and we have demonstrated compatibility with a six-wheeled Russian robotic rover vehicle. Based on these results, we believe that all major and most minor elements expected on planetary surfaces can be measured with absolute accuracy of 10-15 percent and much higher relative accuracy. We have

  2. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints, and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
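
    The grouping step of such a workflow can be sketched as follows; the peak-area matrix is synthetic and the preprocessing choices are assumptions, not those of the original study.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(1)
        peaks = rng.random((19, 18))                               # 19 batches x 18 common fingerprint peak areas
        peaks = (peaks - peaks.mean(axis=0)) / peaks.std(axis=0)   # standardize each peak

        scores = PCA(n_components=2).fit_transform(peaks)          # PCA scores for visual grouping
        groups = fcluster(linkage(peaks, method="ward"), t=3, criterion="maxclust")  # HCA into 3 groups
        print(groups)                                              # cluster label assigned to each batch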

  3. The MHOST finite element program: 3-D inelastic analysis methods for hot section components. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Nakazawa, Shohei

    1989-01-01

    The user options available for running the MHOST finite element analysis package are described. MHOST is a solid and structural analysis program based on mixed finite element technology and is specifically designed for 3-D inelastic analysis. A family of 2- and 3-D continuum elements, along with beam and shell structural elements, can be utilized, and many options are available in the constitutive equation library, the solution algorithms, and the analysis capabilities. The solution algorithms are outlined along with the data input and output, the analysis options including the user subroutines, and the definition of the finite elements implemented in the program package.

  4. Usefulness of a Dual Macro- and Micro-Energy-Dispersive X-Ray Fluorescence Spectrometer to Develop Quantitative Methodologies for Historic Mortar and Related Materials Characterization.

    PubMed

    García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel

    2018-05-01

    Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads and the whole volume of sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. This new ED-XRF quantitative method also demonstrated the importance of matrix effects on the accuracy of the results, making it necessary to use certified reference materials as standards. The results obtained with the ED-XRF quantitative method were compared with those obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work prove that μ-ED-XRF spectrometers can be used not only for acquiring high-resolution elemental map distributions, but also to perform accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as destructive pretreatment in inductively coupled plasma mass spectrometry based procedures.

  5. Forensic Comparison of Soil Samples Using Nondestructive Elemental Analysis.

    PubMed

    Uitdehaag, Stefan; Wiarda, Wim; Donders, Timme; Kuiper, Irene

    2017-07-01

    Soil can play an important role in forensic cases by linking suspects or objects to a crime scene through comparison of samples from the crime scene with samples derived from items. This study uses an adapted ED-XRF analysis (sieving instead of grinding, to prevent destruction of microfossils) to produce elemental composition data for 20 elements. Different data processing techniques and statistical distances were evaluated using data from 50 samples and the log-LR cost (Cllr). The best-performing combination (Canberra distance on relative data with square-root-transformed values) is used to construct a discriminative model. Examples of the spatial resolution of the method in crime scenes are shown for three locations, and sampling strategy is discussed. Twelve test cases were analyzed, and the results showed that the method is applicable. The study shows how the combination of an analysis technique, a database, and a discriminative model can be used to compare multiple soil samples quickly. © 2016 American Academy of Forensic Sciences.
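
    The distance computation described above (relative data, square-root values, Canberra distance) can be sketched as below; the element concentrations are invented, only a handful of the 20 elements are shown, and the Cllr-based evaluation is not reproduced.

        import numpy as np
        from scipy.spatial.distance import canberra

        def preprocess(concentrations):
            rel = concentrations / concentrations.sum()    # relative (normalized) composition
            return np.sqrt(rel)                            # square-root transform

        crime_scene = preprocess(np.array([12.1, 3.4, 0.8, 45.0, 7.7]))
        questioned  = preprocess(np.array([11.5, 3.9, 0.7, 44.1, 8.0]))
        print("Canberra distance:", round(canberra(crime_scene, questioned), 4))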

  6. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative CEUS indicators for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The regression equation generated from the CEUS qualitative indicators contained three indicators, namely enhancement homogeneity, diameter-line expansion and peak intensity grading, and its prediction accuracy for benign and malignant breast tumor lumps was 91.8%; the regression equation generated from the quantitative indicators contained only one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for the qualitative and quantitative analyses were 91.3% and 75.7%, respectively, a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.
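
    A minimal sketch of the modelling step (logistic regression on CEUS indicators followed by an ROC area) is shown below with entirely synthetic data; the Z test comparing the two areas under the curve is only noted, not implemented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 73
        # hypothetical qualitative indicators: enhancement homogeneity, diameter-line expansion, peak-intensity grade
        X_qual = rng.integers(0, 3, size=(n, 3)).astype(float)
        y = np.array([0] * 36 + [1] * 37)                  # 1 = malignant by pathology (synthetic labels)

        model = LogisticRegression().fit(X_qual, y)
        auc = roc_auc_score(y, model.predict_proba(X_qual)[:, 1])
        print("area under the ROC curve (qualitative model):", round(auc, 3))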

  7. Strength Analysis on Ship Ladder Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    In designing a ship's structure, the designer must follow the rules of the applicable classification standards. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads experienced during ship operations, both while sailing and during port operations. The classification rules for ship design prescribe the calculation of the structural components, and these can be analysed using the finite element method. The classification regulations used in the design of the ferry ship were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties of the materials should conform to the classification of the vessel in use. The structural analysis was carried out with a software package based on the finite element method. The structural analysis of the ladder showed that the structure can withstand a 140 kg load under static, dynamic, and impact conditions. The resulting safety factors indicate that the structure is kept safe while its strength is not excessive.

  8. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    NASA Astrophysics Data System (ADS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-05-01

    This paper is focused on the determination of the mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which allows the tube thickness and the state of stress to be evaluated at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters, such as the thickness variation at the pole of the free bulge region with bulge height, the tube thickness variation with the z axial coordinate, and the von Mises stress variation with plastic strain. Finally, the influence of deviations in the geometrical parameters on the flow stress curve is examined using the analytical model: deviations of the tube outer diameter, its initial thickness and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress.

  9. Elemental distribution analysis of urinary crystals.

    PubMed

    Fazil Marickar, Y M; Lekshmi, P R; Varma, Luxmi; Koshy, Peter

    2009-10-01

    Various crystals are seen in human urine. Some of them, particularly calcium oxalate dihydrate, are seen normally. Pathological crystals indicate crystal formation initiating urinary stones. Unfortunately, many of the relevant crystals are not recognized in the light microscopic analysis of the urinary deposit performed in most clinical laboratories, and many are not clearly identifiable under ordinary light microscopy. The objective of the present study was to perform scanning electron microscopic (SEM) assessment of various urinary deposits and confirm their identity by elemental distribution analysis (EDAX). Fifty samples of urinary deposits were collected from a urinary stone clinic. Deposits containing significant crystalluria (more than 10 per HPF) were collected under liquid paraffin in special containers and taken up for SEM studies. The deposited crystals were retrieved with appropriate Pasteur pipettes and placed on micropore filter paper discs. The fluid was absorbed by thicker layers of filter paper underneath and the discs were fixed to brass studs. They were then gold sputtered to 100 Å and examined under SEM (Jeol JSM 35C microscope). When crystals were seen, their morphology was recorded by taking photographs at different angles. At appropriate magnification, the EDAX probe was pointed at the crystals under study and the wave patterns analyzed. Components of the crystals were recognized by utilizing the data. All the samples analyzed contained a significant number of crystals, and all contained more than one type of crystal. The commonest crystals encountered included calcium oxalate monohydrate (whewellite, 22%), calcium oxalate dihydrate (weddellite, 32%), uric acid (10%) and calcium phosphates, namely apatite (4%), brushite (6%), struvite (6%) and octacalcium phosphate (2%). The morphological appearances of the urinary crystals described were correlated with the wavelengths obtained through elemental distribution analysis. Various urinary crystals that

  10. Chromatographic-ICPMS methods for trace element and isotope analysis of water and biogenic calcite

    NASA Astrophysics Data System (ADS)

    Klinkhammer, G. P.; Haley, B. A.; McManus, J.; Palmer, M. R.

    2003-04-01

    ICP-MS is a powerful technique because of its sensitivity and speed of analysis. This is especially true for refractory elements that are notoriously difficult using TIMS and less energetic techniques. However, as ICP-MS instruments become more sensitive to elements of interest, they also become more sensitive to interferences. This becomes a pressing issue when analyzing samples with high total dissolved solids. This paper describes two trace element methods that overcome these problems by using chromatographic techniques to precondition samples prior to analysis by ICP-MS: separation of rare earth elements (REEs) from seawater using HPLC-ICPMS, and flow-through dissolution of foraminiferal calcite. Using HPLC in combination with ICP-MS it is possible to isolate the REEs from the matrix, from other transition elements, and from each other. This method has been developed for small-volume samples (5 ml), making it possible to analyze sediment pore waters. As another example, subjecting foram shells to flow-through reagent addition followed by time-resolved analysis in the ICP-MS allows for their systematic cleaning and dissolution. This method provides information about the relationship between dissolution tendency and elemental composition. Flow-through is also amenable to automation, thus yielding the high sample throughput required for paleoceanography, and produces a highly resolved elemental matrix that can be statistically analyzed.

  11. Novel quantitative analysis of autofluorescence images for oral cancer screening.

    PubMed

    Huang, Tze-Ta; Huang, Jehn-Shyun; Wang, Yen-Yun; Chen, Ken-Chung; Wong, Tung-Yiu; Chen, Yi-Chun; Wu, Che-Wei; Chan, Leong-Perng; Lin, Yi-Chu; Kao, Yu-Hsun; Nioka, Shoko; Yuan, Shyng-Shiou F; Chung, Pau-Choo

    2017-05-01

    VELscope® was developed to inspect oral mucosa autofluorescence. However, its accuracy is heavily dependent on the examining physician's experience. This study was aimed at the development of a novel quantitative analysis of autofluorescence images for oral cancer screening. Patients with either oral cancer or precancerous lesions and a control group with normal oral mucosa were enrolled in this study. White light images and VELscope® autofluorescence images of the lesions were taken with a digital camera. The lesion in the image was chosen as the region of interest (ROI). The average intensity and heterogeneity of the ROI were calculated. A quadratic discriminant analysis (QDA) was utilized to compute classification boundaries based on sensitivity and specificity. In total, 47 oral cancer lesions, 54 precancerous lesions, and 39 normal oral mucosa controls were analyzed. A boundary with a specificity of 0.923 and a sensitivity of 0.979 between the oral cancer lesions and normal oral mucosae was validated. The oral cancer and precancerous lesions could also be differentiated from normal oral mucosae with a specificity of 0.923 and a sensitivity of 0.970. The novel quantitative analysis of the intensity and heterogeneity of VELscope® autofluorescence images used in this study, in combination with a QDA classifier, can be used to differentiate oral cancer and precancerous lesions from normal oral mucosae. Copyright © 2017 Elsevier Ltd. All rights reserved.
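
    The feature extraction and QDA step can be sketched as follows; the ROI arrays, the use of the standard deviation as the heterogeneity measure, and the class labels are all assumptions for illustration, not the study's definitions.

        import numpy as np
        from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

        def roi_features(roi):
            """Average intensity and a simple heterogeneity measure (standard deviation) of an ROI."""
            roi = np.asarray(roi, dtype=float)
            return roi.mean(), roi.std()

        rng = np.random.default_rng(3)
        rois = [rng.normal(80 + 60 * label, 10 + 15 * label, size=(64, 64))   # two synthetic ROI populations
                for label in (0, 1) for _ in range(30)]
        labels = np.array([0] * 30 + [1] * 30)             # 1 = cancer/precancerous, 0 = normal mucosa

        X = np.array([roi_features(r) for r in rois])
        qda = QuadraticDiscriminantAnalysis().fit(X, labels)
        print("training accuracy:", qda.score(X, labels))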

  12. A conductive grating sensor for online quantitative monitoring of fatigue crack.

    PubMed

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue crack, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability. The experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue crack. A finite element model for the sensor is also developed to optimize the sensitivity of crack monitoring, which is defined by the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensor sensitivity can be enhanced by reducing the number of grating channels and increasing their resistance and reducing the resistance of the conductive bar.

  13. A conductive grating sensor for online quantitative monitoring of fatigue crack

    NASA Astrophysics Data System (ADS)

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue crack, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability. The experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue crack. A finite element model for the sensor is also developed to optimize the sensitivity of crack monitoring, which is defined by the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensor sensitivity can be enhanced by reducing the number of grating channels and increasing their resistance and reducing the resistance of the conductive bar.
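
    One plausible equivalent-circuit idealization of such a sensor (not the paper's finite element model) treats the grating channels as resistors in parallel, in series with the conductive bar; the sketch below estimates the sensitivity defined above, i.e. the relative resistance change when the first channel breaks, for hypothetical resistance values.

        def parallel(resistances):
            return 1.0 / sum(1.0 / r for r in resistances)

        def sensor_resistance(n_channels, r_channel, r_bar):
            # intact grating channels in parallel, in series with the conductive bar
            return parallel([r_channel] * n_channels) + r_bar

        for n in (5, 10, 20):
            r0 = sensor_resistance(n, r_channel=100.0, r_bar=1.0)
            r1 = sensor_resistance(n - 1, r_channel=100.0, r_bar=1.0)   # first channel broken
            print(n, "channels: dR/R =", round((r1 - r0) / r0, 4))      # fewer channels -> larger relative change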

  14. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    PubMed

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease, including hypertrophy and ischemia, are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but they can also be found in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of the intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in the characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
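
    A minimal sketch of the kind of profile statistics mentioned above is given below for a synthetic intensity profile; treating the cell ends as the outer 10% of the cell length and taking the moments of the profile values are simplifying assumptions, not the paper's exact definitions.

        import numpy as np
        from scipy.stats import skew, kurtosis

        x = np.linspace(0.0, 1.0, 200)                     # normalized position along the cell's long axis
        profile = (np.exp(-((x - 0.05) / 0.05) ** 2)
                   + np.exp(-((x - 0.95) / 0.05) ** 2) + 0.05)   # signal concentrated at the cell ends

        ends = (x < 0.1) | (x > 0.9)                       # outer 10% on each side counted as "cell ends"
        polarization = profile[ends].sum() / profile.sum()

        print("end polarization:", round(polarization, 3))
        print("skewness:", round(skew(profile), 3), "kurtosis:", round(kurtosis(profile), 3))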

  15. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  16. Three-dimensional elastic-plastic finite-element analysis of fatigue crack propagation

    NASA Technical Reports Server (NTRS)

    Goglia, G. L.; Chermahini, R. G.

    1985-01-01

    Fatigue cracks are a major problem in designing structures subjected to cyclic loading. Cracks frequently occur in structures such as aircraft and spacecraft. The inspection intervals of many aircraft structures are based on crack-propagation lives. Therefore, improved predictions of propagation lives under flight-load conditions (variable-amplitude loading) are needed to provide more realistic design criteria for these structures. The main thrust was to develop a three-dimensional, nonlinear, elastic-plastic, finite element program capable of extending a crack and changing boundary conditions for the model under consideration. The finite-element model is composed of 8-noded (linear-strain) isoparametric elements. In the analysis, the material is assumed to be elastic-perfectly plastic, and the cyclic stress-strain curve for the material is shown. Zienkiewicz's initial-stress method, von Mises's yield criterion, and Drucker's normality condition under small-strain assumptions are used to account for plasticity. The three-dimensional analysis is capable of extending the crack and changing boundary conditions under cyclic loading.

  17. Multiple-element semiquantitative analysis of one-milligram geochemical samples by D.C. arc emission spectrography

    USGS Publications Warehouse

    Rait, N.

    1981-01-01

    A modified method is described for multi-element semiquantitative spectrographic analysis of 1-mg samples. This method uses a direct-current arc source, carbon instead of graphite electrodes, and an 80% argon-20% oxygen atmosphere instead of air. Although this is a destructive method, an analysis can be made for 68 elements in all mineral and geochemical samples. Carbon electrodes have been an aid in improving the detection limits of many elements: carbon has a greater resistance to heat conduction and develops a better tip, facilitating sample volatilization and counterbalancing the cooling effect of the flow of the argon-oxygen mixture around the anode. Where such an argon-oxygen atmosphere is used instead of air, the cyanogen band lines are greatly diminished in intensity, so more spectral lines of the analysis elements are available for use; the spectral background is also lower. The main advantage of using the carbon electrode and the 80% argon-20% oxygen atmosphere is the improved detection limits for 36 of the 68 elements. The detection limits remain the same for 23 elements, and are not as good for only nine elements. © 1981.

  18. Comparison of Neck Screw and Conventional Fixation Techniques in Mandibular Condyle Fractures Using 3-Dimensional Finite Element Analysis.

    PubMed

    Conci, Ricardo Augusto; Tomazi, Flavio Henrique Silveira; Noritomi, Pedro Yoshito; da Silva, Jorge Vicente Lopes; Fritscher, Guilherme Genehr; Heitz, Claiton

    2015-07-01

    To compare the mechanical stress on the mandibular condyle after the reduction and fixation of mandibular condylar fractures using the neck screw and 2 other conventional techniques, according to 3-dimensional finite element analysis. A 3-dimensional finite element model of a mandible was created and graphically simulated on a computer screen. The model was fixed with 3 different techniques: a 2.0-mm plate with 4 screws, 2 plates (one 1.5-mm plate and one 2.0-mm plate) with 4 screws, and a neck screw. Loads were applied that simulated muscular action, with restriction of the upper movements of the mandible, differentiation of the cortical and medullary bone, and virtual "folds" of the plates and screws so that they could adjust to the condylar surface. Afterward, the data were exported for graphic visualization of the results and quantitative analysis was performed. The 2-plate technique exhibited better stability in regard to displacement of the fracture, deformation of the osteosynthesis materials, and minimum and maximum tension values. The results with the neck screw were satisfactory and similar to those found when a miniplate was used. Although the study shows that 2 isolated plates yielded better results than the other fixation systems and methods, the neck screw could be an option for condylar fracture reduction. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  19. Quantitative analysis of glycerophospholipids by LC-MS: acquisition, data handling, and interpretation

    PubMed Central

    Myers, David S.; Ivanova, Pavlina T.; Milne, Stephen B.; Brown, H. Alex

    2012-01-01

    As technology expands what it is possible to measure accurately, so too do the challenges faced by modern mass spectrometry applications. A high level of accuracy in lipid quantitation across thousands of chemical species simultaneously is demanded. While relative changes in lipid amounts with varying conditions may provide initial insights or point to novel targets, there are many questions that require determination of absolute lipid analyte quantitation. Glycerophospholipids present a significant challenge in this regard, given the headgroup diversity, the large number of possible acyl chain combinations, and the vast range of ionization efficiencies of species. Lipidomic output is being used more often not just for profiling of the masses of species, but also for highly targeted flux-based measurements, which put additional burdens on the quantitation pipeline. These first two challenges bring into sharp focus the need for a robust lipidomics workflow including deisotoping, differentiation from background noise, use of multiple internal standards per lipid class, and the use of a scriptable environment in order to create maximum user flexibility and maintain metadata on the parameters of the data analysis as it occurs. As lipidomics technology develops and delivers more output on a larger number of analytes, so must the sophistication of statistical post-processing also continue to advance. High-dimensional data analysis methods involving clustering, lipid pathway analysis, and false discovery rate limitation are becoming standard practice in a maturing field. PMID:21683157

  20. Nonlinear dynamics of planetary gears using analytical and finite element models

    NASA Astrophysics Data System (ADS)

    Ambarisha, Vijaya Kumar; Parker, Robert G.

    2007-05-01

    Vibration-induced gear noise and dynamic loads remain key concerns in many transmission applications that use planetary gears. Tooth separations at large vibrations introduce nonlinearity in geared systems. The present work examines the complex, nonlinear dynamic behavior of spur planetary gears using two models: (i) a lumped-parameter model, and (ii) a finite element model. The two-dimensional (2D) lumped-parameter model represents the gears as lumped inertias, the gear meshes as nonlinear springs with tooth contact loss and periodically varying stiffness due to changing tooth contact conditions, and the supports as linear springs. The 2D finite element model is developed from a unique finite element-contact analysis solver specialized for gear dynamics. Mesh stiffness variation excitation, corner contact, and gear tooth contact loss are all intrinsically considered in the finite element analysis. The dynamics of planetary gears show a rich spectrum of nonlinear phenomena. Nonlinear jumps, chaotic motions, and period-doubling bifurcations occur when the mesh frequency or any of its higher harmonics are near a natural frequency of the system. Responses from the dynamic analysis using analytical and finite element models are successfully compared qualitatively and quantitatively. These comparisons validate the effectiveness of the lumped-parameter model to simulate the dynamics of planetary gears. Mesh phasing rules to suppress rotational and translational vibrations in planetary gears are valid even when nonlinearity from tooth contact loss occurs. These mesh phasing rules, however, are not valid in the chaotic and period-doubling regions.
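
    As a schematic single-mesh, one-degree-of-freedom analogue of the nonlinearities described above (not the paper's multi-mesh planetary model), the equation of motion can be written in LaTeX notation as:

        m\ddot{x} + c\dot{x} + k(t)\,f(x) = F(t),\qquad
        f(x) = \begin{cases} x - b, & x > b\\ 0, & |x| \le b\\ x + b, & x < -b \end{cases}

    where k(t) is the periodically varying mesh stiffness, f(x) a piecewise-linear contact/backlash function with clearance b that produces tooth contact loss, and F(t) the external and transmission-error excitation; the interaction of the parametric stiffness variation with the contact nonlinearity is what generates the jumps, period-doubling and chaotic responses reported.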

  1. Laser Induced Breakdown Spectroscopy for Elemental Analysis in Environmental, Cultural Heritage and Space Applications: A Review of Methods and Results

    PubMed Central

    Gaudiuso, Rosalba; Dell’Aglio, Marcella; De Pascale, Olga; Senesi, Giorgio S.; De Giacomo, Alessandro

    2010-01-01

    Analytical applications of Laser Induced Breakdown Spectroscopy (LIBS), namely optical emission spectroscopy of laser-induced plasmas, have been constantly growing thanks to its intrinsic conceptual simplicity and versatility. Qualitative and quantitative analysis can be performed by LIBS both by drawing calibration lines and by using calibration-free methods, and some of its features, such as fast multi-elemental response, micro-destructiveness, and instrumentation portability, have rendered it particularly suitable for analytical applications in the fields of environmental science, space exploration and cultural heritage. This review reports and discusses LIBS achievements in these areas and results obtained for soils and aqueous samples, meteorites and terrestrial samples simulating extraterrestrial planets, and cultural heritage samples, including buildings and objects of various kinds. PMID:22163611

  2. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    The direct analysis in real time (DART) ion source is a powerful ionisation technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capacity limits its extensive use in the field of phytochemical analysis. The objective was to devise a new system which utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate and three major lignan compounds were quantitated by three different methods, UV densitometry, TLC-DART-MS and HPLC-UV, to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from the calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.
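
    The calibration-curve quantitation described above follows the usual external-standard pattern; the sketch below uses invented concentrations and peak areas purely to illustrate the arithmetic.

        import numpy as np

        conc_std = np.array([5.0, 10.0, 25.0, 50.0, 100.0])        # standard concentrations (ug/mL, hypothetical)
        area_std = np.array([1.1e4, 2.3e4, 5.4e4, 1.1e5, 2.2e5])   # integrated ion-current peak areas

        slope, intercept = np.polyfit(conc_std, area_std, 1)       # least-squares calibration line
        r2 = np.corrcoef(conc_std, area_std)[0, 1] ** 2            # linearity check

        area_sample = 7.8e4                                        # peak area measured for the sample
        conc_sample = (area_sample - intercept) / slope
        print(f"r^2 = {r2:.4f}, estimated concentration = {conc_sample:.1f} ug/mL")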

  3. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in the Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles of 0.5 milliradians produced by refractive index gradients. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell, and the data gathered with the quantitative schlieren analysis technique are consistent with a diffusion-limited growth process.
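
    The connection between the measured deviation angle and the refractive-index field is the standard schlieren relation (quoted here for orientation, not from the paper). In LaTeX notation:

        \varepsilon_x \;\approx\; \frac{1}{n_0}\int_0^{L}\frac{\partial n}{\partial x}\,dz \;\approx\; \frac{L}{n_0}\,\frac{\partial n}{\partial x},

    where epsilon_x is the beam deviation angle, L the optical path length through the cell, n_0 the ambient refractive index and dn/dx the transverse refractive-index gradient; with a known dn/dC for the solution, the gradient maps convert directly into the solute concentration maps mentioned above.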

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. [Figure: the main window of the program during dynamic analysis of the foot thermal image.] © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Finite element normal mode analysis of resistance welding jointed of dissimilar plate hat structure

    NASA Astrophysics Data System (ADS)

    Nazri, N. A.; Sani, M. S. M.

    2017-10-01

    Structural joints provide connections between structural elements (beam, plate, solid, etc.) in order to build a whole assembled structure. The complex behaviour of the connecting elements plays a valuable role in dynamic characteristics such as natural frequencies and mode shapes. In automotive structures, the reliability of the structure depends strongly on its joints. In this paper, a top hat structure joined by spot welding of dissimilar materials, mild steel 1010 and stainless steel 304, is modelled and designed using finite element software. Different types of connector elements, such as the rigid body element (RBE2), welding joint element (CWELD), and bar element (CBAR), are applied to represent the real connection between the two dissimilar plates. Normal mode analysis is simulated with the different types of joining element in order to determine the modal properties. Natural frequencies obtained using RBE2, CBAR and CWELD are compared to the equivalent rigid body method. The connection that gives the lowest percentage error among these three is selected as the most reliable joining representation for the resistance spot weld. From the analysis, it is shown that CWELD performs better than the others in representing weld joints between dissimilar plate materials. It is expected that joint modelling with finite elements plays a significant role in structural dynamics.

  6. Finite Element Analysis and Optimization of Flexure Bearing for Linear Motor Compressor

    NASA Astrophysics Data System (ADS)

    Khot, Maruti; Gawali, Bajirao

    Nowadays, linear motor compressors are commonly used in miniature cryocoolers instead of rotary compressors, because rotary compressors apply large radial forces to the piston which provide no useful work, cause a large amount of wear, and usually require lubrication. Recent trends favour flexure-supported configurations for long life. The present work aims at the design and geometrical optimization of flexure bearings using finite element analysis and the development of design charts for selection purposes. The work also covers the manufacturing of flexures from different materials and the experimental validation of the finite element analysis results.

  7. Better Finite-Element Analysis of Composite Shell Structures

    NASA Technical Reports Server (NTRS)

    Clarke, Gregory

    2007-01-01

    A computer program implements a finite-element-based method of predicting the deformations of thin aerospace structures made of isotropic materials or anisotropic fiber-reinforced composite materials. The technique and corresponding software are applicable to thin shell structures in general and are particularly useful for analysis of thin beamlike members having open cross-sections (e.g. I-beams and C-channels) in which significant warping can occur.

  8. Effects of P Element Insertions on Quantitative Traits in Drosophila Melanogaster

    PubMed Central

    Mackay, TFC.; Lyman, R. F.; Jackson, M. S.

    1992-01-01

    P element mutagenesis was used to construct 94 third chromosome lines of Drosophila melanogaster which contained on average 3.1 stable P element inserts, in an inbred host strain background previously free of P elements. The homozygous and heterozygous effects of the inserts on viability and abdominal and sternopleural bristle number were ascertained by comparing the chromosome lines with inserts to insert-free control lines of the inbred host strain. P elements reduced average homozygous viability by 12.2% per insert and average heterozygous viability by 5.5% per insert, and induced recessive lethal mutations at a rate of 3.8% per insert. Mutational variation for the bristle traits averaged over both sexes was 0.03V(e) per homozygous P insert and 0.003V(e) per heterozygous P insert, where V(e) is the environmental variance. Mutational variation was greater for the sexes considered separately because inserts had large pleiotropic effects on sex dimorphism of bristle characters. The distributions of homozygous effects of inserts on the bristle traits were asymmetrical, with the largest effects in the direction of reducing bristle number; and highly leptokurtic, with most of the increase in variance contributed by a few lines with large effects. The inserts had partially recessive effects on the bristle traits. Insert lines with extreme bristle effects had on average greatly reduced viability. PMID:1311697

  9. Effects of biochar addition on toxic element concentrations in plants: A meta-analysis.

    PubMed

    Peng, Xin; Deng, Yinger; Peng, Yan; Yue, Kai

    2018-03-01

    Consuming food contaminated by toxic elements (TEs) could pose a substantial risk to human health. Recently, biochar has been extensively studied as an effective in situ soil ameliorant because of its ability to suppress the phytoavailability of TEs. However, despite the research interest, the effects of biochar application to soil on the concentrations of different TEs in different plant parts remain unclear. Here, we synthesize 1813 individual observations collected from 97 articles to evaluate the effects of biochar addition on TE concentrations in plant parts. We found that (1) the experiment type, biochar feedstock and pyrolysis temperature were all associated with significant decreases in TE concentrations in plant parts; (2) the responses of Cd and Pb concentrations in edible and indirectly edible plant parts were significantly more sensitive to the effect of biochar than the Zn, Ni, Mn, Cr, Co and Cu concentrations; and (3) the biochar dosage and surface area significantly influenced certain TE concentrations in plant tissues, as determined via correlation analysis. Moreover, the only exception in this study was found for the metalloid element (i.e., As) concentrations in plants, which were not significantly influenced by biochar addition. Overall, the effects of biochar on TE concentrations in plant tissues were negative, at least on average, and the central trends suggest that biochar has a considerable ability to mitigate the transfer of TEs to food, thereby reducing the associated health risks. Our results provide an initial quantitative determination of the effects of biochar addition on multifarious TEs in different plant parts as well as an assessment of the ability of biochar to reduce TE concentrations in plants. Copyright © 2017 Elsevier B.V. All rights reserved.
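
    Meta-analyses of this kind commonly express each observation as a log response ratio before pooling; the formulas below give that standard metric and its sampling variance for orientation (the abstract does not state which effect-size metric was used, so this is an assumption). In LaTeX notation:

        \ln RR = \ln\!\left(\frac{\bar{X}_{\mathrm{biochar}}}{\bar{X}_{\mathrm{control}}}\right),\qquad
        v = \frac{s_{\mathrm{biochar}}^{2}}{n_{\mathrm{biochar}}\,\bar{X}_{\mathrm{biochar}}^{2}}
          + \frac{s_{\mathrm{control}}^{2}}{n_{\mathrm{control}}\,\bar{X}_{\mathrm{control}}^{2}},

    where the means, standard deviations and replicate numbers refer to the TE concentration in the biochar and control groups; a negative ln RR corresponds to a reduction of the plant TE concentration under biochar addition, and 1/v is typically used as the pooling weight.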

  10. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies, including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult, and novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and the power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  11. Finite Element Analysis of Denosumab Treatment Effects on Vertebral Strength in Ovariectomized Cynomolgus Monkeys.

    PubMed

    Lee, David C; Varela, Aurore; Kostenuik, Paul J; Ominsky, Michael S; Keaveny, Tony M

    2016-08-01

    Finite element analysis has not yet been validated for measuring changes in whole-bone strength at the hip or spine in people after treatment with an osteoporosis agent. Toward that end, we assessed the ability of a clinically approved implementation of finite element analysis to correctly quantify treatment effects on vertebral strength, comparing against direct mechanical testing, in cynomolgus monkeys randomly assigned to one of three 16-month-long treatments: sham surgery with vehicle (Sham-Vehicle), ovariectomy with vehicle (OVX-Vehicle), or ovariectomy with denosumab (OVX-DMAb). After treatment, T12 vertebrae were retrieved, scanned with micro-CT, and mechanically tested to measure compressive strength. Blinded to the strength data and treatment codes, the micro-CT images were coarsened and homogenized to create continuum-type finite element models, without explicit porosity. With clinical translation in mind, these models were then analyzed for strength using the U.S. Food and Drug Administration (FDA)-cleared VirtuOst software application (O.N. Diagnostics, Berkeley, CA, USA), developed for analysis of human bones. We found that vertebral strength by finite element analysis was highly correlated (R(2) = 0.97; n = 52) with mechanical testing, independent of treatment (p = 0.12). Further, the size of the treatment effect on strength (ratio of mean OVX-DMAb to mean OVX-Vehicle, as a percentage) was large and did not differ (p = 0.79) between mechanical testing (+57%; 95% CI [26%, 95%]) and finite element analysis (+51% [20%, 88%]). The micro-CT analysis revealed increases in cortical thickness (+45% [19%, 73%]) and trabecular bone volume fraction (+24% [8%, 42%]). These results show that a preestablished clinical finite element analysis implementation, developed for human bone and clinically validated in fracture-outcome studies, correctly quantified the observed treatment effects of denosumab on vertebral strength in cynomolgus monkeys. One

  12. Incorporating general race and housing flexibility and deadband in rolling element bearing analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. R.; Vallance, C. S.

    1989-01-01

    Methods for including the effects of general race and housing compliance and of outer race-to-housing deadband (clearance) in rolling element bearing mechanics analysis are presented. It is shown that these effects can cause significant changes in bearing stiffness characteristics, which are of major importance in the rotordynamic response of turbomachinery and other rotating systems. Preloading analysis is demonstrated with the finite element/contact mechanics hybrid method applied to a 45 mm angular contact ball bearing.
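    As a loose illustration of the deadband concept, and not the paper's finite element/contact mechanics hybrid method, the fragment below models a point contact that transmits no force until the clearance is closed and then follows a Hertzian-type load-deflection law; the stiffness constant and exponent are placeholders.

```python
# Toy deadband contact law: zero force until the clearance is taken up,
# then a Hertzian-type relation on the remaining overlap.
def deadband_contact_force(deflection, clearance, k_hertz=1.0e9):
    """Contact force (N) for a point contact with deadband; all values are hypothetical."""
    overlap = deflection - clearance
    return k_hertz * overlap ** 1.5 if overlap > 0.0 else 0.0
```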

  13. Determination of major elements by wavelength-dispersive X-ray fluorescence spectrometry and trace elements by inductively coupled plasma mass spectrometry in igneous rocks from the same fused sample (110 mg)

    NASA Astrophysics Data System (ADS)

    Amosova, Alena A.; Panteeva, Svetlana V.; Chubarov, Victor M.; Finkelshtein, Alexandr L.

    2016-08-01

    A fusion technique is proposed for the simultaneous determination of 35 elements from the same sample. Only 110 mg of rock sample was used to obtain fused glasses for the quantitative determination of 10 major elements by wavelength-dispersive X-ray fluorescence analysis and of 16 rare earth elements and some other trace elements by inductively coupled plasma mass spectrometry analysis. Fusion was performed with 1.1 g of lithium metaborate and LiBr solution as the releasing agent, in a platinum crucible in an electric furnace at 1100 °C. Certified reference materials of ultramafic, mafic, intermediate, and felsic igneous rocks were used to obtain the calibration curves for the determination of rock-forming oxides (Na2O, MgO, Al2O3, SiO2, P2O5, K2O, CaO, TiO2, MnO, Fe2O3) and some trace elements (Ba, Sr, Zr) by X-ray fluorescence analysis. The repeatability does not exceed the allowable standard deviation over a wide range of concentrations; in most cases the relative standard deviation was less than 5%. The obtained glasses were then used for the determination of rare earth elements (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu) and some other trace elements (Ba, Sr, Zr, Rb, Cs, Y, Nb, Hf, Ta, Th, and U) by inductively coupled plasma mass spectrometry analysis, with the same certified reference materials employed. The results can mostly be accepted as satisfactory. The proposed procedure substantially reduces expenses in comparison with separate sample preparation for inductively coupled plasma mass spectrometry and X-ray fluorescence analysis.
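    The calibration and repeatability steps implied here can be pictured with the short sketch below, an assumption-laden illustration rather than the authors' software: a linear calibration curve is fitted against certified concentrations, and repeatability is summarized as the relative standard deviation of replicate determinations.

```python
# Sketch of a linear XRF calibration and of the RSD used to report repeatability.
import numpy as np

def fit_calibration(intensities, certified_conc):
    """Least-squares slope/intercept mapping measured intensity to concentration."""
    slope, intercept = np.polyfit(intensities, certified_conc, deg=1)
    return slope, intercept

def relative_std_dev(replicates):
    """RSD (%) of replicate determinations; the abstract reports mostly < 5%."""
    replicates = np.asarray(replicates, dtype=float)
    return 100.0 * replicates.std(ddof=1) / replicates.mean()
```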

  14. Quantitative X-ray Differential Interference Contrast Microscopy

    NASA Astrophysics Data System (ADS)

    Nakamura, Takashi

    Full-field soft x-ray microscopes are widely used in many fields of science. Advances in nanofabrication technology have enabled short-wavelength focusing elements with significantly improved spatial resolution. In the soft x-ray spectral region, features as small as 12 nm can be resolved using micro zone plates as the objective lens. In addition to conventional x-ray microscopy, in which differences in x-ray absorption provide the image contrast, phase contrast mechanisms such as differential interference contrast (DIC) and Zernike phase contrast have also been demonstrated. These phase contrast imaging mechanisms are especially attractive at x-ray wavelengths, where the phase contrast of most materials is typically 10 times stronger than the absorption contrast. With recent progress in plasma-based x-ray sources and increasing accessibility of synchrotron user facilities, x-ray microscopes are quickly becoming standard measurement equipment in the laboratory. To further the usefulness of x-ray DIC microscopy, this thesis explicitly addresses three known issues with this imaging modality by introducing new techniques and devices. First, as opposed to its visible-light counterpart, no quantitative phase imaging technique exists for x-ray DIC microscopy. To address this issue, two nanoscale x-ray quantitative phase imaging techniques, using exclusive OR (XOR) patterns and zone-plate doublets, respectively, are proposed. Unlike existing x-ray quantitative phase imaging techniques such as Talbot interferometry and ptychography, no dedicated experimental setups or stringent illumination coherence are needed for quantitative phase retrieval. Second, to the best of our knowledge, no quantitative performance characterization of DIC microscopy exists to date, so the imaging system's response to the sample's spatial frequencies is not known. In order to gain an in-depth understanding of this imaging modality, the performance of x-ray DIC microscopy is quantified using the modulation transfer function.
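    As a hedged example of the final point, one common way to estimate a modulation transfer function, which may differ from the thesis's actual procedure, is to take the normalized magnitude of the Fourier transform of a measured line spread function:

```python
# Illustrative MTF estimate from a 1-D line spread function (LSF).
import numpy as np

def mtf_from_lsf(line_spread, pixel_size):
    """Return spatial frequencies and the normalized MTF of a measured LSF."""
    lsf = np.asarray(line_spread, dtype=float)
    lsf = lsf / lsf.sum()                       # normalize the LSF area to 1
    mtf = np.abs(np.fft.rfft(lsf))              # magnitude of the Fourier transform
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_size)
    return freqs, mtf / mtf[0]                  # normalize to the zero-frequency value
```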

  15. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis on a large-scale database of artistic paintings and so build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly lower color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
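    The three measures can be approximated along the following lines; the exact definitions used in the paper may differ, so treat this as a sketch: color variety as the Shannon entropy of a quantized color histogram, and the brightness roughness exponent from the scaling of mean-squared brightness increments with pixel separation.

```python
# Illustrative image measures: quantized-color entropy and a roughness exponent.
import numpy as np

def color_variety(rgb_image, bins_per_channel=8):
    """Shannon entropy (bits) of the quantized RGB color distribution of an 8-bit image."""
    q = (np.asarray(rgb_image, dtype=np.int64) // (256 // bins_per_channel)).reshape(-1, 3)
    codes = q[:, 0] * bins_per_channel**2 + q[:, 1] * bins_per_channel + q[:, 2]
    counts = np.bincount(codes, minlength=bins_per_channel**3)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def roughness_exponent(brightness, max_lag=64):
    """Hurst-like exponent from log variance of row-wise brightness increments vs. log lag."""
    b = np.asarray(brightness, dtype=float)
    lags = np.arange(1, max_lag + 1)
    var = [np.mean((b[:, lag:] - b[:, :-lag]) ** 2) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(var), 1)
    return slope / 2.0
```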

  16. Fossil Signatures Using Elemental Abundance Distributions and Bayesian Probabilistic Classification

    NASA Technical Reports Server (NTRS)

    Hoover, Richard B.; Storrie-Lombardi, Michael C.

    2004-01-01

    Elemental abundances (C6, N7, O8, Na11, Mg12, Al13, P15, S16, Cl17, K19, Ca20, Ti22, Mn25, Fe26, and Ni28) were obtained for a set of terrestrial fossils and the rock matrix surrounding them. Principal Component Analysis extracted five factors accounting for 92.5% of the data variance, i.e., the information content, of the elemental abundance data. Hierarchical Cluster Analysis provided unsupervised sample classification distinguishing fossil from matrix samples, on the basis of either raw abundances or PCA input, that agreed strongly with visual classification. A stochastic, non-linear Artificial Neural Network produced a Bayesian probability of correct sample classification. The results provide a quantitative probabilistic methodology for discriminating terrestrial fossils from the surrounding rock matrix using chemical information. To demonstrate the applicability of these techniques to the assessment of meteoritic samples or in situ extraterrestrial exploration, we present preliminary data on samples of the Orgueil meteorite. In both systems, an elemental signature produces target classification decisions remarkably consistent with morphological classification by a human expert using only structural (visual) information. We discuss the possibility of implementing a complexity analysis metric capable of automating certain image analysis and pattern recognition abilities of the human eye using low-magnification optical microscopy images, and we discuss the extension of this technique across multiple scales.
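    The unsupervised part of this pipeline (PCA followed by hierarchical clustering of the component scores) can be sketched as below, assuming a rows-are-samples matrix of elemental abundances; this is not the authors' original code, and the component and cluster counts are illustrative defaults taken from the abstract.

```python
# Sketch of PCA dimensionality reduction followed by hierarchical clustering.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

def classify_abundances(abundances, n_components=5, n_clusters=2):
    """abundances: (n_samples, n_elements) array of elemental abundances."""
    scores = PCA(n_components=n_components).fit_transform(
        StandardScaler().fit_transform(np.asarray(abundances, dtype=float))
    )
    tree = linkage(scores, method="ward")                 # agglomerative clustering
    labels = fcluster(tree, t=n_clusters, criterion="maxclust")
    return scores, labels                                 # e.g., cluster 1 vs. 2 ~ fossil vs. matrix
```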

  17. Analysis of Transformation Plasticity in Steel Using a Finite Element Method Coupled with a Phase Field Model

    PubMed Central

    Cho, Yi-Gil; Kim, Jin-You; Cho, Hoon-Hwe; Cha, Pil-Ryung; Suh, Dong-Woo; Lee, Jae Kon; Han, Heung Nam

    2012-01-01

    An implicit finite element model was developed to analyze the deformation behavior of low carbon steel during phase transformation. The finite element model was coupled hierarchically with a phase field model that could simulate the kinetics and microstructural evolution during the austenite-to-ferrite transformation of low carbon steel. Thermo-elastic-plastic constitutive equations for each phase were adopted to confirm the transformation plasticity due to the weaker-phase yielding proposed by Greenwood and Johnson. From simulations under various possible plastic properties of each phase, a more quantitative understanding of the origin of transformation plasticity was attempted through comparison with experimental observations. PMID:22558295
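    As a back-of-the-envelope illustration of the Greenwood-Johnson picture invoked above, and not the paper's coupled finite element/phase field model, the transformation plasticity strain is often estimated from the applied stress, the transformation volume change, and the yield stress of the weaker phase:

```python
# Classical Greenwood-Johnson estimate of transformation plasticity strain.
def greenwood_johnson_strain(applied_stress, volume_strain, weak_yield_stress):
    """eps_tp ~ (5/6) * (dV/V) * (sigma / sigma_y), for stress well below sigma_y."""
    return (5.0 / 6.0) * volume_strain * (applied_stress / weak_yield_stress)
```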

  18. Consistent linearization of the element-independent corotational formulation for the structural analysis of general shells

    NASA Technical Reports Server (NTRS)

    Rankin, C. C.

    1988-01-01

    A consistent linearization is provided for the element-independent corotational formulation, providing the proper first and second variations of the strain energy. As a result, the warping problem that has plagued flat elements has been overcome, with beneficial effects carried over to linear solutions. True Newton quadratic convergence has been restored to the Structural Analysis of General Shells (STAGS) code for conservative loading using the full corotational implementation. Some implications for general finite element analysis are discussed, including the effect that the automatic frame invariance provided by this work might have on the development of new, improved elements.

  19. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance on large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, and the resulting quantitation data are presented. Linearity, bias, and other metrics are reported, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.
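    As a purely illustrative example of the kind of metrics mentioned (linearity and bias), and not a recommendation from this work, one might compute them from a calibration series as follows:

```python
# Illustrative linearity (R^2) and per-level % bias for a linear calibration.
import numpy as np

def calibration_metrics(nominal_conc, measured_response):
    """Fit a linear calibration curve and report R^2 and back-calculated % bias."""
    nominal_conc = np.asarray(nominal_conc, dtype=float)
    measured_response = np.asarray(measured_response, dtype=float)
    slope, intercept = np.polyfit(nominal_conc, measured_response, 1)
    fitted = slope * nominal_conc + intercept
    ss_res = np.sum((measured_response - fitted) ** 2)
    ss_tot = np.sum((measured_response - measured_response.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    back_calc = (measured_response - intercept) / slope   # back-calculated concentrations
    percent_bias = 100.0 * (back_calc - nominal_conc) / nominal_conc
    return r_squared, percent_bias
```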

  20. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these sub-models into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
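    A conceptual sketch of the sub-model idea, much simplified relative to the actual ChemCam calibration, is shown below: separate PLS regressions are trained on limited composition ranges plus a full-range model, and predictions are blended with weights keyed to where the full-range estimate falls. The concentration edges, component counts, and weighting scheme are assumptions for illustration only.

```python
# Simplified sub-model blending with PLS regression (illustrative, not ChemCam's code).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def train_submodels(spectra, concentrations, edges=(0.0, 10.0, 30.0, 100.0)):
    """Fit a full-range PLS model and one PLS sub-model per concentration range."""
    spectra = np.asarray(spectra, dtype=float)
    concentrations = np.asarray(concentrations, dtype=float)
    full = PLSRegression(n_components=8).fit(spectra, concentrations)
    subs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (concentrations >= lo) & (concentrations <= hi)
        model = PLSRegression(n_components=8).fit(spectra[mask], concentrations[mask])
        subs.append(((lo, hi), model))
    return full, subs

def blended_predict(spectrum, full, subs):
    """Blend sub-model predictions, weighting by where the full-range estimate falls."""
    x = np.asarray(spectrum, dtype=float).reshape(1, -1)
    ref = full.predict(x).item()                         # full-range reference prediction
    preds, weights = [], []
    for (lo, hi), model in subs:
        center, half_width = 0.5 * (lo + hi), 0.5 * (hi - lo)
        w = max(0.0, 1.0 - abs(ref - center) / half_width)   # triangular weight
        preds.append(model.predict(x).item())
        weights.append(w)
    weights = np.asarray(weights)
    if weights.sum() == 0.0:
        return ref                                       # fall back to the full-range model
    return float(np.dot(weights, preds) / weights.sum())
```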