Sample records for true quantitative elemental

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Thomas Martin; Patton, Bruce W.; Weber, Charles F.

    The primary goal of this project is to evaluate x-ray spectra generated within a scanning electron microscope (SEM) to determine elemental composition of small samples. This will be accomplished by performing Monte Carlo simulations of the electron and photon interactions in the sample and in the x-ray detector. The elemental inventories will be determined by an inverse process that progressively reduces the difference between the measured and simulated x-ray spectra by iteratively adjusting composition and geometric variables in the computational model. The intended benefit of this work will be to develop a method to perform quantitative analysis on substandard samples (heterogeneous phases, rough surfaces, small sizes, etc.) without involving standard elemental samples or empirical matrix corrections (i.e., true standardless quantitative analysis).
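The iterative inverse procedure described in this record can be sketched in a few lines. Here a hypothetical linear mix of per-element reference spectra stands in for the full Monte Carlo forward model, and the composition vector is adjusted by gradient steps to shrink the measured-vs-simulated misfit; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_elements = 200, 3
ref_spectra = rng.random((n_elements, n_channels))  # per-element responses (made up)
true_comp = np.array([0.5, 0.3, 0.2])
measured = true_comp @ ref_spectra                  # synthetic "measured" spectrum

comp = np.full(n_elements, 1.0 / n_elements)        # initial guess: equal fractions
for _ in range(2000):
    simulated = comp @ ref_spectra
    grad = 2.0 * ref_spectra @ (simulated - measured)  # gradient of the squared misfit
    comp = np.clip(comp - 1e-3 * grad, 0.0, None)      # keep fractions non-negative

print(np.round(comp, 3))  # converges toward the true composition
```

In the real method each "simulated spectrum" evaluation is a Monte Carlo transport run and geometry variables are adjusted alongside composition, but the shrink-the-residual loop is the same idea.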

  2. Integration of infrared thermography into various maintenance methodologies

    NASA Astrophysics Data System (ADS)

    Morgan, William T.

    1993-04-01

    Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid progress in technical advancements has placed additional strain on maintenance organizations to change progressively. Accompanying the needs for advanced training and documentation is the demand for utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. Popular thought holds that infrared thermography is a Predictive maintenance tool. While this is true, it can also be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.

  3. Quantitative fractography by digital image processing: NIH Image macro tools for stereo pair analysis and 3-D reconstruction.

    PubMed

    Hein, L R

    2001-10-01

    A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on time processing, scanning techniques and programming concepts are also discussed.
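The elevation reconstruction in this record rests on standard SEM stereo-pair photogrammetry: specimen height is recovered from the parallax between two images taken at different tilts. A minimal sketch of that relationship with illustrative numbers (the macro tools' actual calibration conventions may differ):

```python
import math

def elevation_from_parallax(parallax_um, tilt_total_deg):
    """Height from stereo parallax: z = p / (2*sin(alpha/2)),
    for a stereo pair acquired at +/- alpha/2 about the mean tilt."""
    return parallax_um / (2.0 * math.sin(math.radians(tilt_total_deg) / 2.0))

# 1 um of measured parallax with a 10 degree total tilt between the two images
print(elevation_from_parallax(1.0, 10.0))  # ~5.7 um of elevation
```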

  4. Quantitative x-ray photoelectron spectroscopy: Quadrupole effects, shake-up, Shirley background, and relative sensitivity factors from a database of true x-ray photoelectron spectra

    NASA Astrophysics Data System (ADS)

    Seah, M. P.; Gilmore, I. S.

    2006-05-01

    An analysis is provided of the x-ray photoelectron spectroscopy (XPS) intensities measured in the National Physical Laboratory (NPL) XPS database for 46 solid elements. The present analysis does not change our previous conclusions concerning the excellent correlation between experimental intensities, after deconvolution of the spectra with angle-averaged reflection electron energy loss data, and the theoretical intensities involving the dipole approximation using Scofield's cross sections. Here, more recent calculations for cross sections by Trzhaskovskaya involving quadrupole terms are evaluated, and it is shown that their cross sections diverge from the experimental database results by up to a factor of 5. The quadrupole angular terms lead to small corrections that are close to our measurement limit but do appear to be supported in the present analysis. Measurements of the extent of shake-up for the 46 elements broadly agree with the calculations of Yarzhemsky, but not in detail. The constancy of the shake-up contribution predicted by Yarzhemsky implies that use of the Shirley background will lead to a peak area that is a constant fraction of the true peak area including the shake-up intensities. However, the measured variability of the shake-up contribution makes the Shirley background invalid for quantification except in situations where the sensitivity factors come from reference samples similar to those being analyzed.
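The Shirley background discussed in this record is defined by a simple fixed-point iteration: at each energy the background rises from one endpoint level toward the other in proportion to the integrated peak intensity remaining on one side. A minimal sketch (endpoint/direction conventions vary between instruments; here the cumulative peak area is accumulated toward the y[-1] end, and the demo background is constructed to be Shirley-consistent so the recovery can be checked):

```python
import numpy as np

def shirley_background(y, max_iter=100, tol=1e-9):
    """Iterative Shirley background over a peak region.

    The background steps from the y[-1] endpoint level toward the y[0] level
    in proportion to the background-subtracted peak area remaining to the right.
    """
    y = np.asarray(y, dtype=float)
    a, b = y[0], y[-1]
    B = np.full_like(y, b)
    for _ in range(max_iter):
        peak = y - B
        tail = np.cumsum(peak[::-1])[::-1]   # peak area from each point to the end
        total = tail[0]
        if total == 0.0:
            break
        B_new = b + (a - b) * tail / total
        if np.max(np.abs(B_new - B)) < tol:
            B = B_new
            break
        B = B_new
    return B

# Demo: Gaussian peak on a background built with the Shirley construction
# itself, so the iteration should recover it (illustrative numbers).
x = np.linspace(-5.0, 5.0, 201)
peak_true = np.exp(-x ** 2)
tail_true = np.cumsum(peak_true[::-1])[::-1]
B_true = 1.0 + 0.3 * tail_true / tail_true[0]
y = peak_true + B_true

bg = shirley_background(y)
print(np.max(np.abs(bg - B_true)))  # close to zero: background recovered
```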

  5. The qualitative f-ratio method applied to electron channelling-induced x-ray imaging with an annular silicon drift detector in a scanning electron microscope in the transmission mode.

    PubMed

    Brodusch, Nicolas; Gauvin, Raynald

    2017-09-01

    Electron channelling is known to affect x-ray production when an accelerated electron beam is applied to a crystalline material and is highly dependent on the local crystal orientation. This effect, unless very long counting times are used, is barely noticeable in x-ray energy spectra recorded with conventional silicon drift detectors (SDDs) located at a small elevation angle. However, the very high count rates provided by the new commercially available annular SDDs now allow this effect to be observed routinely, and it may, in some circumstances, hide the elemental x-ray variations due to the true local specimen composition. To circumvent this issue, the recently developed f-ratio method was applied to display qualitatively the true net-intensity x-ray variations in a thin specimen of a Ti-6Al-4V alloy in a scanning electron microscope in transmission mode. The diffraction contrast observed in the x-ray images was successfully cancelled through the use of f-ratios, and the true composition variations at the grain boundaries could be observed in relation to the dislocation alignment prior to the β-phase nucleation. The qualitative effectiveness in removing channelling effects demonstrated in this work makes the f-ratio, in its quantitative form, a possible alternative to the ZAF method in channelling conditions. © 2017 The Authors. Journal of Microscopy © 2017 Royal Microscopical Society.
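The reason the f-ratio cancels channelling contrast can be shown in a few lines: diffraction modulates every characteristic line emitted at a pixel by (approximately) the same factor, and a ratio of line intensities divides that factor out. A toy two-line illustration with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels = 1000
c = 0.5 + rng.random(n_pixels)   # per-pixel channelling modulation factor
k_A, k_B = 3.0, 1.0              # composition-dependent line intensities (made up)
I_A, I_B = c * k_A, c * k_B      # raw maps: both lines share the same factor c

f = I_A / (I_A + I_B)            # f-ratio map: c cancels out
print(f.std())                   # ~0: no channelling contrast remains
```

Any genuine composition change alters k_A relative to k_B and therefore survives in the f-ratio, which is exactly the contrast the record's authors want to keep.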

  6. The influence of graphic format on breast cancer risk communication.

    PubMed

    Schapira, Marilyn M; Nattinger, Ann B; McAuliffe, Timothy L

    2006-09-01

    Graphic displays can enhance quantitative risk communication. However, empirical data regarding the effect of graphic format on risk perception are lacking. We evaluate the effect of graphic format elements on perceptions of risk magnitude and perceived truth of data. Preferences for format were also assessed. Participants (254 female primary care patients) viewed a series of hypothetical risk communications regarding the lifetime risk of breast cancer. Identical numeric risk information was presented using different graphic formats. Risk was perceived to be of lower magnitude when communicated with a bar graph as compared with a pictorial display (p < 0.0001), or with consecutively versus randomly highlighted symbols in a pictorial display (p = 0.0001). Data were perceived to be more true when presented with random versus consecutive highlights in a pictorial display (p < 0.01). A pictorial display was preferred to a bar graph format for the presentation of breast cancer risk estimates alone (p = 0.001). When considering breast cancer risk in comparison to heart disease, stroke, and osteoporosis, however, bar graphs were preferred to pictorial displays (p < 0.001). In conclusion, the elements of graphic format used to convey quantitative risk information affect key domains of risk perception. One must be cognizant of these effects when designing risk communication strategies.

  7. Expansion of the gravitational potential with computerized Poisson series

    NASA Technical Reports Server (NTRS)

    Broucke, R.

    1976-01-01

    The paper describes a recursive formulation for the expansion of the gravitational potential valid for both the tesseral and zonal harmonics. The expansion is primarily in rectangular coordinates, but the classical orbit elements or equinoctial orbit elements can be easily substituted. The equations of motion for the zonal harmonics in both classical and equinoctial orbital elements are described in a form which will result in closed-form expressions for the first-order perturbations. In order to achieve this result, the true longitude or true anomaly have to be used as independent variables.
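The zonal part of the expansion discussed in this record has the standard closed form V = -(mu/r)·[1 - Σₙ Jₙ (R/r)ⁿ Pₙ(sin φ)], which is easy to evaluate with the Legendre three-term recursion. A sketch; the J2-only check uses commonly quoted Earth constants, which are assumptions here and not taken from the paper:

```python
import math

def zonal_potential(mu, R, r, sin_phi, J):
    """V = -(mu/r) * (1 - sum_n J_n * (R/r)**n * P_n(sin_phi)) for zonal J_n, n >= 2."""
    x = sin_phi
    p_prev, p = 1.0, x                      # P_0(x), P_1(x)
    total = 0.0
    for n in range(2, max(J) + 1):
        # Legendre recursion: n*P_n = (2n-1)*x*P_{n-1} - (n-1)*P_{n-2}
        p_prev, p = p, ((2 * n - 1) * x * p - (n - 1) * p_prev) / n
        total += J.get(n, 0.0) * (R / r) ** n * p
    return -(mu / r) * (1.0 - total)

# J2-only evaluation at the equator with commonly quoted Earth values
# (mu in km^3/s^2, R in km); illustrative, not from the paper.
mu_earth, r_eq, j2 = 398600.4418, 6378.137, 1.08263e-3
v = zonal_potential(mu_earth, r_eq, r_eq, 0.0, {2: j2})
print(v)
```

At the equator P2(0) = -1/2, so the J2 term slightly deepens the potential relative to the point-mass value, as the printed number shows.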

  8. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  9. Detection methods for biotech cotton MON 15985 and MON 88913 by PCR.

    PubMed

    Lee, Seong-Hun; Kim, Jin-Kug; Yi, Bu-Young

    2007-05-02

    Plants derived through agricultural biotechnology, or genetically modified organisms (GMOs), may affect human health and the ecological environment. A living GMO is also called a living modified organism (LMO). Biotech cotton is a GMO in food or feed and also an LMO in the environment. Recently, two varieties of biotech cotton, MON 15985 and MON 88913, were developed by Monsanto Co. The detection method is an essential element of the GMO labeling system and of LMO management of biotech plants. In this paper, two primer pairs and probes were designed for specific amplification of 116 and 120 bp PCR products from MON 15985 and MON 88913, respectively, with no amplification from any other biotech cotton. Limits of detection of the qualitative method were all 0.05% for MON 15985 and MON 88913. The quantitative method was developed using a TaqMan real-time PCR. A synthetic plasmid, as a reference molecule, was constructed from a taxon-specific DNA sequence of cotton and two construct-specific DNA sequences of MON 15985 and MON 88913. The quantitative method was validated using six samples that contained levels of biotech cotton mixed with conventional cotton ranging from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-20%. Limits of quantitation of the quantitative method were all 0.1%. Consequently, it is reported that the proposed detection methods are applicable for qualitative and quantitative analyses of biotech cotton MON 15985 and MON 88913.
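The acceptance criterion used in the validation above is simple arithmetic: the bias of each quantified level from its true value must stay within ±20%. A sketch of that check; the measured values below are hypothetical stand-ins, not the paper's data:

```python
true_levels = [0.1, 0.5, 1.0, 3.0, 5.0, 10.0]    # % biotech cotton in the mix
measured    = [0.11, 0.46, 1.08, 2.9, 5.4, 9.5]  # hypothetical qPCR estimates

for t, m in zip(true_levels, measured):
    bias_pct = 100.0 * (m - t) / t
    assert abs(bias_pct) <= 20.0                 # the study's acceptance range
    print(f"true {t:5.2f}% -> bias {bias_pct:+.1f}%")
```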

  10. Will Systems Biology Deliver Its Promise and Contribute to the Development of New or Improved Vaccines? What Really Constitutes the Study of "Systems Biology" and How Might Such an Approach Facilitate Vaccine Design.

    PubMed

    Germain, Ronald N

    2017-10-16

    A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  11. Analysis of sustainable leadership for science learning management in the 21st Century under education THAILAND 4.0 framework

    NASA Astrophysics Data System (ADS)

    Jedaman, Pornchai; Buaraphan, Khajornsak; Pimdee, Paitoon; Yuenyong, Chokchai; Sukkamart, Aukkapong; Suksup, Charoen

    2018-01-01

    This article aims to study and analyze sustainable leadership in the 21st Century under the education THAILAND 4.0 framework, and to carry out a factor analysis of sustainable leadership for science learning. The study employed both quantitative and qualitative approaches in collecting data, including a questionnaire survey, a documentary review, and Participatory Action Learning (PAL). The sample was selected purposively and comprised 225 administrators of Primary and Secondary Education Area Offices throughout Thailand; of these, 183 (81.33%) were administrators of Primary Education Offices and 42 (18.67%) of Secondary Education Offices. The quantitative data were analyzed by descriptive statistics, including the mean and standard deviation. Confirmatory Factor Analysis (CFA) was also conducted to analyze the factors associated with sustainable leadership under the education THAILAND 4.0 framework. The qualitative data were analyzed in three main stages: data reduction, data organization, and data interpretation to conclusion. The study revealed that sustainable leadership under the education THAILAND 4.0 framework needs to focus on development, awareness of duty and responsibility, equality, morality, and knowledge. All aspects should be integrated in order to achieve the organizational goals, a good governance culture, and identity. Importantly, there were six "key" elements of sustainable leadership under the education THAILAND 4.0 framework: i) Professional Leadership Role, ii) Leadership Under Change, iii) Leadership Skills 4.0 in the 21st Century, iv) Development in the Pace With Change, v) Creativity and Creative Tension, and vi) Hold True Assessments. The CFA showed that the weights of these six key elements were all significant at the .01 level.

  12. Quantitative determination of band distortions in diamond attenuated total reflectance infrared spectra.

    PubMed

    Boulet-Audet, Maxime; Buffeteau, Thierry; Boudreault, Simon; Daugey, Nicolas; Pézolet, Michel

    2010-06-24

    Due to its unmatched hardness and chemical inertness, diamond offers many advantages over other materials for extreme conditions and routine analysis by attenuated total reflection (ATR) infrared spectroscopy. Its low refractive index can offer up to a 6-fold absorbance increase compared to germanium. Unfortunately, for strong bands it also results in spectral distortions compared to transmission experiments. The aim of this paper is to present a methodological approach to determine quantitatively the degree of spectral distortion in ATR spectra. This approach requires the determination of the optical constants (refractive index and extinction coefficient) of the investigated sample. As a typical example, the optical constants of the fibroin protein of the silkworm Bombyx mori have been determined from polarized ATR spectra obtained using both diamond and germanium internal reflection elements. The positions found for the amide I band by germanium and diamond ATR are respectively 6 and 17 cm(-1) lower than the true value determined from the k(nu) spectrum, which is calculated to be 1659 cm(-1). To determine quantitatively the effect of relevant parameters such as the film thickness and the protein concentration, various spectral simulations have also been performed. The use of a thinner film probed by light polarized in the plane of incidence, and dilution of the protein sample, can help in obtaining ATR spectra that are closer to their transmittance counterparts. To extend this study to any system, the ATR distortion amplitude has been evaluated using spectral simulations performed for bands of various intensities and widths. From these simulations, a simple empirical relationship has been found to estimate the band shift from the experimental band height and width, which could be of practical use for ATR users.
This paper shows that the determination of optical constants provides an efficient way to recover the true spectrum shape and band frequencies of distorted ATR spectra.
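The absorbance gain of diamond over germanium noted in this record follows from the evanescent-wave penetration depth, which scales inversely with the crystal's refractive index. A sketch using the standard two-phase formula; the indices and the 45° incidence angle are illustrative assumptions, not values from the paper:

```python
import math

def penetration_depth(wavenumber_cm, n_crystal, n_sample, theta_deg):
    """d_p = lambda0 / (2*pi*n1*sqrt(sin^2(theta) - (n2/n1)^2)), in micrometres."""
    lam_um = 10000.0 / wavenumber_cm
    theta = math.radians(theta_deg)
    return lam_um / (2.0 * math.pi * n_crystal
                     * math.sqrt(math.sin(theta) ** 2 - (n_sample / n_crystal) ** 2))

dp_diamond = penetration_depth(1659, 2.4, 1.5, 45)   # amide I band, diamond IRE
dp_ge = penetration_depth(1659, 4.0, 1.5, 45)        # same band, germanium IRE
print(dp_diamond / dp_ge)  # ~3: diamond probes considerably deeper at this band
```

The deeper sampling boosts absorbance, but for strong bands it also drives the anomalous-dispersion distortions that the paper quantifies.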

  13. Model of Silicon Refining During Tapping: Removal of Ca, Al, and Other Selected Element Groups

    NASA Astrophysics Data System (ADS)

    Olsen, Jan Erik; Kero, Ida T.; Engh, Thorvald A.; Tranell, Gabriella

    2017-04-01

    A mathematical model for industrial refining of silicon alloys has been developed for the so-called oxidative ladle refining process. It is a lumped (zero-dimensional) model, based on the mass balances of metal, slag, and gas in the ladle, developed to operate with relatively short computational times for the sake of industrial relevance. The model accounts for a semi-continuous process which includes both the tapping and post-tapping refining stages. It predicts the concentrations of Ca, Al, and trace elements, most notably the alkaline metals, alkaline earth metals, and rare earth metals. The predictive power of the model depends on the quality of the model coefficients, the kinetic coefficient, τ, and the equilibrium partition coefficient, L, for a given element. A sensitivity analysis indicates that the model results are most sensitive to L. The model has been compared to industrial measurement data and found to be able to qualitatively, and to some extent quantitatively, predict the data. The model is very well suited for alkaline and alkaline earth metals, which respond relatively fast to the refining process. The model is less well suited for elements such as the lanthanides and Al, which are refined more slowly. A major challenge for the prediction of the behavior of the rare earth metals is that reliable thermodynamic data for true equilibrium conditions relevant to the industrial process are not typically available in the literature.
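The roles of the two model coefficients named in this record can be illustrated with a minimal lumped sketch: the element concentration in the metal relaxes with time constant τ toward a slag/metal equilibrium set by the partition coefficient L. This is a simplified stand-in for the paper's semi-continuous mass-balance model, and every parameter value below is hypothetical:

```python
import numpy as np

def refine(c0, L, tau, m_metal, m_slag, t):
    """Element concentration in the metal for a closed metal+slag batch.

    Mass balance (slag initially clean): c_eq = c0 / (1 + L * m_slag / m_metal)
    First-order kinetics: dc/dt = -(c - c_eq)/tau -> c(t) = c_eq + (c0 - c_eq)*exp(-t/tau)
    """
    c_eq = c0 / (1.0 + L * m_slag / m_metal)
    return c_eq + (c0 - c_eq) * np.exp(-t / tau)

t = np.linspace(0.0, 60.0, 7)  # minutes of refining
fast = refine(400e-6, L=50.0, tau=5.0, m_metal=10e3, m_slag=500.0, t=t)   # Ca-like
slow = refine(400e-6, L=50.0, tau=40.0, m_metal=10e3, m_slag=500.0, t=t)  # lanthanide-like
print(fast[-1] < slow[-1])  # slower kinetics leaves more impurity at the end
```

This reproduces the qualitative point in the abstract: fast-responding elements approach the L-controlled equilibrium within the process time, while slowly refined elements do not.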

  14. Molecules and elements for quantitative bioanalysis: The allure of using electrospray, MALDI, and ICP mass spectrometry side-by-side.

    PubMed

    Linscheid, Michael W

    2018-03-30

    To understand biological processes, not only reliable identification but also quantification of the constituents plays a pivotal role. This is especially true for the proteome: protein quantification must follow protein identification, since sometimes minute changes in abundance tell the real tale. To obtain quantitative data, many sophisticated strategies using electrospray and MALDI mass spectrometry (MS) have been developed in recent years. All of them have advantages and limitations. Several years ago, we started to work on strategies which are in principle capable of overcoming some of these limits. The fundamental idea is to use elemental signals as a measure for quantities. We began by replacing the radioactive 32P with the "cold" natural 31P to quantify modified nucleotides and phosphorylated peptides and proteins, and later used tagging strategies for quantification of proteins more generally. To do this, we introduced Inductively Coupled Plasma Mass Spectrometry (ICP-MS) into the bioanalytical workflows, allowing not only reliable and sensitive detection but also quantification based on isotope-dilution absolute measurements using poly-isotopic elements. The detection capability of ICP-MS becomes particularly attractive with heavy metals. The covalently bound protein tags developed in our group are based on the well-known DOTA chelate complex (1,4,7,10-tetraazacyclododecane-N,N',N″,N‴-tetraacetic acid) carrying ions of lanthanoids as the metal core. In this review, I outline the development of this mutual assistance between molecular and elemental mass spectrometry and discuss the scope and limitations, particularly for peptide and protein quantification. The lanthanoid tags provide low detection limits and offer multiplexing capabilities due to the number of very similar lanthanoids and their isotopes; with isotope dilution comes previously unattainable accuracy.
Separation techniques such as electrophoresis and HPLC were used with only slightly adapted workflows already in use for quantification in bioanalysis. Imaging mass spectrometry (MSI) with MALDI and laser ablation ICP-MS has complemented the range of applications in recent years. © 2018 Wiley Periodicals, Inc.
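The isotope-dilution accuracy highlighted in this record comes from the classic two-isotope equation: spiking the sample with an isotopically enriched standard turns quantification into a ratio measurement. A sketch with made-up abundances (not values from the review):

```python
def isotope_dilution(n_spike, a_spike, b_spike, a_nat, b_nat, r_measured):
    """Moles of natural analyte from a measured isotope ratio r = isotope_a/isotope_b.

    Solving r = (n_sp*a_sp + n*a_nat) / (n_sp*b_sp + n*b_nat) for n gives
    n = n_spike * (a_spike - r*b_spike) / (r*b_nat - a_nat).
    """
    return n_spike * (a_spike - r_measured * b_spike) / (r_measured * b_nat - a_nat)

# Round-trip check: 1 mol of spike (95% isotope a) added to 2 mol of natural
# analyte (20% isotope a); the measured ratio then recovers the 2 mol.
n_spike, n_true = 1.0, 2.0
a_sp, b_sp, a_nat, b_nat = 0.95, 0.05, 0.20, 0.80
r = (n_spike * a_sp + n_true * a_nat) / (n_spike * b_sp + n_true * b_nat)
print(isotope_dilution(n_spike, a_sp, b_sp, a_nat, b_nat, r))  # recovers 2.0
```

Because only the ratio enters, losses during separation affect spike and analyte equally and cancel, which is what makes the approach absolute.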

  15. 43 CFR 4.1307 - Elements; burdens of proof.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true Elements; burdens of proof. 4.1307 Section... Review of Proposed Individual Civil Penalty Assessments Under Section 518(f) of the Act § 4.1307 Elements... individual shall have the ultimate burden of persuasion by a preponderance of the evidence as to the elements...

  16. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction: Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods: A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results: Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision.
Conclusion: Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
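The coverage question studied in this record can be reproduced with a small Monte Carlo sketch: build the no-bias 95% interval y ± 1.96σ around each simulated measurement and count how often it covers the true value when a fixed bias is actually present. The parameter values are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def coverage(fixed_bias_pct, sigma=10.0, true_value=50.0, n=200_000):
    """Empirical coverage of the no-bias 95% CI  y +/- 1.96*sigma."""
    y = true_value * (1.0 + fixed_bias_pct / 100.0) + rng.normal(0.0, sigma, n)
    return np.mean((y - 1.96 * sigma <= true_value) & (true_value <= y + 1.96 * sigma))

print(coverage(0.0))    # ~0.95: nominal when there is no bias
print(coverage(20.0))   # well below nominal once fixed bias is large
```

With these illustrative numbers, coverage stays near nominal while the bias is small relative to the precision and collapses once it is not, mirroring the paper's <12% fixed-bias finding in spirit.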

  17. Structure formation in grade 20 steel during equal-channel angular pressing and subsequent heating

    NASA Astrophysics Data System (ADS)

    Dobatkin, S. V.; Odesskii, P. D.; Raab, G. I.; Tyutin, M. R.; Rybalchenko, O. V.

    2016-11-01

    The structure formation and the mechanical properties of quenched and tempered grade 20 steel after equal-channel angular pressing (ECAP) at various true strains and 400°C are studied. Electron microscopy analysis after ECAP shows a partially submicrocrystalline and partially subgrain structure with a structural element size of 340-375 nm. The structural element size depends on the region in which the elements are formed (polyhedral ferrite, needle-shaped ferrite, tempered martensite, and pearlite). Heating of the steel after ECAP at 400 and 450°C increases the fraction of high-angle boundaries and the structural ferrite element size to 360-450 nm. The fragmentation and spheroidization of cementite lamellae of pearlite and subgrain coalescence in the regions of needle-shaped ferrite and tempered martensite take place at a high ECAP true strain and heating temperature. Structural refinement ensures considerable strengthening, namely, UTS 742-871 MPa at EL 11-15.3%. The strength slightly increases, whereas the plasticity slightly decreases when the true strain increases during ECAP. After ECAP and heating, the strength and plastic properties of the grade 20 steel remain almost the same.

  18. Opto-VLSI-based photonic true-time delay architecture for broadband adaptive nulling in phased array antennas.

    PubMed

    Juswardy, Budi; Xiao, Feng; Alameh, Kamal

    2009-03-16

    This paper proposes a novel Opto-VLSI-based tunable true-time delay generation unit for adaptively steering the nulls of microwave phased array antennas. Arbitrary single or multiple true-time delays can simultaneously be synthesized for each antenna element by slicing an RF-modulated broadband optical source and routing specific sliced wavebands through an Opto-VLSI processor to a high-dispersion fiber. Experimental results are presented, which demonstrate the principle of the true-time delay unit through the generation of 5 arbitrary true-time delays of up to 2.5 ns each. (c) 2009 Optical Society of America
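The delays in this record come from chromatic dispersion: routing a sliced waveband through a high-dispersion fiber yields Δt = D · L · Δλ. A back-of-the-envelope sketch; the fiber and slicing values are illustrative assumptions, not taken from the paper:

```python
D = 100.0   # dispersion, ps/(nm*km) (illustrative high-dispersion fiber)
L = 5.0     # fiber length, km
for d_lambda_nm in (1.0, 2.0, 5.0):   # waveband offset selected by the processor
    dt_ps = D * L * d_lambda_nm
    print(f"{d_lambda_nm:.0f} nm offset -> {dt_ps / 1000:.1f} ns delay")
```

With these numbers a 5 nm waveband offset gives 2.5 ns, the order of magnitude demonstrated in the experiment, and steering different wavebands to different antenna elements gives each its own delay.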

  19. 48 CFR 2937.602 - Elements of performance-based contracting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Elements of performance-based contracting. 2937.602 Section 2937.602 Federal Acquisition Regulations System DEPARTMENT OF LABOR...) 2937.602 Elements of performance-based contracting. (a) Performance-based contracting is defined in FAR...

  20. 40 CFR 123.21 - Elements of a program submission.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 22 2014-07-01 2013-07-01 true Elements of a program submission. 123.21 Section 123.21 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS STATE PROGRAM REQUIREMENTS State Program Submissions § 123.21 Elements of a program submission. (a...

  1. 40 CFR 233.10 - Elements of a program submission.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Elements of a program submission. 233.10 Section 233.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING 404 STATE PROGRAM REGULATIONS Program Approval § 233.10 Elements of a program submission. Any State...

  2. Advances in HPLC-ICP-MS interface techniques for metal speciation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, S.J.

    The relentless demand for lower detection limits is increasingly coupled to the requirement for elemental speciation. This is particularly true in environmental and clinical fields, where total levels are often insufficient for mobility and toxicity studies. This demand for both qualitative and quantitative data on the individual species present in complex samples has led to the development of various interfaces to couple some form of chromatography, usually gas chromatography (GC) or high performance liquid chromatography (HPLC), to an element-specific detector. Today inductively coupled plasma-mass spectrometry is often employed since it offers excellent detection limits, element-specific information (including isotopic data) and the potential for multi-element studies. This presentation will concentrate on HPLC couplings, although the advantages and disadvantages of both GC and HPLC couplings to ICP-MS will be discussed. Particular attention will be given to the optimization of both the chromatography and detection systems. Details will be presented of several successful HPLC interface designs, and ways of facilitating high levels of a range of organic solvents (e.g. methanol and THF) in the HPLC mobile phase will be highlighted. The advantages of using a sheath gas and practical ways of achieving this will also be discussed. Finally, the use of isotope dilution analysis in conjunction with HPLC-ICP-MS will be outlined. In all cases the impact of using the most appropriate approach will be demonstrated using both environmental and clinical samples.

  3. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  4. Estimating true human and animal host source contribution in quantitative microbial source tracking using the Monte Carlo method.

    PubMed

    Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan

    2010-09-01

Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are neither 100% specific nor 100% sensitive for the target sequence in their respective hosts' genomes. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and to help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. The Monte Carlo method is then applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real-world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error; it performed reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). Further improvements in the precision of sample processing and qPCR reactions would greatly improve the performance of the model. This methodology, built upon Bacteroidales assays, is readily transferable to any other microbial source indicator for which a universal assay for fecal sources of that indicator exists. Copyright © 2010 Elsevier Ltd. All rights reserved.
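A minimal sketch of the correction idea in this record: sample assay error rates and measurement noise from assumed distributions, then propagate them through a simplified total-probability relation to get a distribution of true concentrations. All distributions and numbers below are illustrative assumptions standing in for the quantities fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-assay error rates, as if estimated from reference
# fecal samples of known origin (illustrative Beta distributions, not
# the values fitted in the paper).
sens = rng.beta(90, 10, 20000)      # P(signal | marker present)
false_pos = rng.beta(2, 98, 20000)  # P(signal | marker absent)

c_obs = 1e4  # observed marker concentration (copies per reaction)
# Replicate-to-replicate precision error on the log10 scale (sd = 0.1).
noise = rng.normal(0.0, 0.1, 20000)

# Simplified total-probability relation (illustrative):
# E[observed] = sens * c_true + false_pos * background.
background = 1e2  # assumed non-target contribution
c_true = (c_obs * 10**noise - false_pos * background) / sens

lo, hi = np.percentile(c_true, [2.5, 97.5])
print(f"expected true concentration: {c_true.mean():.0f}")
print(f"95% interval: [{lo:.0f}, {hi:.0f}]")
```

The output distribution, not just its mean, is what would feed downstream risk-assessment models.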

  5. Clinical Investigation of the Dopaminergic System with PET and FLUORINE-18-FLUORO-L-DOPA.

    NASA Astrophysics Data System (ADS)

    Oakes, Terrence Rayford

    1995-01-01

Positron Emission Tomography (PET) is a tool that provides quantitative physiological information. It is valuable both in a clinical environment, where information is sought for an individual, and in a research environment, to answer more fundamental questions about physiology and disease states. PET is particularly attractive compared to other nuclear medicine imaging techniques in cases where the anatomical regions of interest are small or when true metabolic rate constants are required. One example with both of these requirements is the investigation of Parkinson's Disease, which is characterized as a presynaptic motor function deficit affecting the striatum. As dopaminergic neurons die, the ability of the striatum to affect motor function decreases. The extent of functional neuronal damage in the small sub-structures may be ascertained by measuring the ability of the caudate and putamen to trap and store dopamine, a neurotransmitter. PET is able to utilize a tracer of dopamine activity, ¹⁸F-L-DOPA, to quantitate the viability of the striatum. This thesis work deals with implementing and optimizing the many different elements that compose a PET study of the dopaminergic system, including: radioisotope production; conversion of aqueous ¹⁸F⁻ into [¹⁸F]F₂; synthesis of ¹⁸F-L-DOPA; details of the PET scan itself; measurements to estimate the radiation dosimetry; accurate measurement of a plasma input function; and the quantitation of dopaminergic activity in normal human subjects as well as in Parkinson's Disease patients.

  6. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    PubMed Central

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-01-01

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283
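The NIST suite referenced above comprises many statistical tests; as a flavor of what such a test checks, here is a sketch of the simplest one, the frequency (monobit) test from NIST SP 800-22 (the significance level alpha = 0.01 follows the NIST convention):

```python
import math

def monobit_pass(bits: str, alpha: float = 0.01) -> bool:
    """NIST SP 800-22 frequency (monobit) test: the proportion of ones
    in a truly random bitstream should be close to 1/2."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value >= alpha

# A heavily biased stream fails; a perfectly balanced alternating
# stream passes this particular test (it would fail the runs test,
# which is why the full suite is needed).
print(monobit_pass("1" * 1000))   # → False
print(monobit_pass("10" * 500))   # → True
```

Passing one test is necessary, not sufficient; PUFKEY's claim is that its bitstream passes all of them.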

  7. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    PubMed

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

Random number generators (RNGs) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  8. Environmental corrections of a dual-induction logging while drilling tool in vertical wells

    NASA Astrophysics Data System (ADS)

    Kang, Zhengming; Ke, Shizhen; Jiang, Ming; Yin, Chengfang; Li, Anzong; Li, Junjian

    2018-04-01

With the development of Logging While Drilling (LWD) technology, dual-induction LWD logging is not only widely applied in deviated and horizontal wells but is also commonly used in vertical wells. Accordingly, it is necessary to simulate the response of LWD tools in vertical wells for logging interpretation. In this paper, the investigation characteristics, the effects of the tool structure, the skin effect and the drilling environment of a dual-induction LWD tool are simulated by the three-dimensional (3D) finite element method (FEM). In order to closely approximate the actual situation, the real structure of the tool is taken into account. The results demonstrate that the influence of the background value of the tool structure can be eliminated. After deducting the tool-structure background, the simulated values and the analytical solution are in quantitative agreement in homogeneous formations. The effect of measurement frequency can be effectively eliminated by a skin effect correction chart. In addition, the measurement environment (borehole size, mud resistivity, shoulder bed, layer thickness and invasion) affects the true resistivity. To eliminate these effects, borehole correction charts, shoulder bed correction charts and tornado charts are computed based on the real tool structure. Based on these correction charts, well logging data can be corrected automatically by a suitable interpolation method, which is convenient and fast. Verified against actual logging data in vertical wells, this method can recover the true resistivity of the formation.
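The automatic chart-based correction step can be sketched as a simple interpolation lookup. The chart values below are invented for illustration; real chart entries come from FEM simulations like those described in the record:

```python
import numpy as np

# Hypothetical borehole correction chart: multiplicative correction
# factor as a function of borehole diameter (m). Illustrative values.
diameters = np.array([0.20, 0.25, 0.30, 0.35])
factors   = np.array([1.00, 1.03, 1.08, 1.15])

def borehole_correct(r_apparent: float, diameter: float) -> float:
    """Correct an apparent resistivity reading (ohm·m) by linear
    interpolation on the chart, as a stand-in for the automatic
    chart lookup described in the paper."""
    f = np.interp(diameter, diameters, factors)
    return r_apparent * f

# 20 ohm·m apparent resistivity in a 0.275 m borehole:
# the factor is interpolated midway between 1.03 and 1.08.
print(borehole_correct(20.0, 0.275))
```

Real tornado charts are two-dimensional (e.g. invasion depth vs. resistivity contrast), so a 2D interpolator would replace `np.interp`, but the workflow is the same.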

  9. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  10. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng

Quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific binding of sub-probes and tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from blinking patterns emitted from partial or nonspecifically bound sub-probes and autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.
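A toy sketch of the thresholding idea: classify a candidate spot by how often it is "on" across a series of exposures. The on-time rates and the acceptance band below are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-exposure on/off states over 200 exposures: a true
# transcript bound by multiple probes is 'on' often (~30% here), while
# a nonspecifically bound probe or autofluorescence is 'on' rarely
# (~5%). Both rates are invented for illustration.
true_spot = rng.random(200) < 0.30
false_spot = rng.random(200) < 0.05

def classify(on_time_fraction: float) -> str:
    # Illustrative acceptance band for a 'true' blinking pattern.
    return "true" if 0.15 < on_time_fraction < 0.60 else "false"

print(classify(true_spot.mean()))
print(classify(false_spot.mean()))
```

The point of the method is that these duty-cycle statistics separate true and false signals even when their raw intensities overlap.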

  11. Noncontact true temperature measurement, 2

    NASA Technical Reports Server (NTRS)

    Lee, Mark C.; Allen, James L.

    1988-01-01

A laser pyrometer was developed for acquiring the true temperature of a levitated sample. The reflectivity is measured by first expanding the laser beam to cover the entire cross-sectional surface of the diffuse target. The reflectivity calibration of this system is determined from the surface emissivity of a target with a blackbody cavity. The emissivity of the real target can then be calculated. The overall system constant is obtained by passively measuring the radiance of the blackbody cavity (emissivity = 1.0) at a known, arbitrary temperature. Since the photosensor used is highly linear over the entire operating temperature range, the true temperature of the target can then be computed. The latest results available from this ongoing research indicate that true temperatures thus obtained are in very good quantitative agreement with thermocouple-measured temperatures.
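The final emissivity-to-temperature step can be sketched with the standard single-wavelength pyrometry relation (Wien approximation of Planck's law). The brightness temperature, emissivity, and wavelength below are illustrative numbers, not values from this instrument:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m·K

def true_temperature(t_brightness: float, emissivity: float,
                     wavelength: float) -> float:
    """Invert the Wien approximation at a single wavelength:
    1/T_true = 1/T_brightness + (lambda / C2) * ln(emissivity).
    Converts a brightness temperature (measured assuming emissivity 1)
    to true temperature for a target of known emissivity."""
    inv_t = 1.0 / t_brightness + (wavelength / C2) * math.log(emissivity)
    return 1.0 / inv_t

# e.g. brightness temperature 1500 K at 0.65 µm with emissivity 0.4:
# the true temperature is higher, since the gray target radiates less.
print(round(true_temperature(1500.0, 0.4, 0.65e-6), 1))
```

With emissivity 1.0 the correction vanishes, which matches the blackbody-cavity calibration described above.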

  12. Monte Carlo evaluation of accuracy and noise properties of two scatter correction methods for ²⁰¹Tl cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.

    1997-12-01

    Two independent scatter correction techniques, transmission dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter and scatter plus primary) were simulated for three numerical phantoms for ²⁰¹Tl. Data were reconstructed with the ordered-subset EM algorithm, including attenuation correction based on noiseless transmission data. The accuracy of the TDCS and TEW scatter corrections was assessed by comparison with simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity with TDCS than TEW in the myocardium; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
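For reference, the TEW estimate has a simple published form: scatter in the photopeak window is approximated by trapezoidal interpolation from counts in two narrow flanking windows. A sketch with invented counts:

```python
# Triple-energy window (TEW) scatter estimate for a photopeak window:
# scatter ≈ (C_lower/W_lower + C_upper/W_upper) * W_main / 2,
# where C are counts and W are window widths (keV).
def tew_scatter(c_lower: float, c_upper: float,
                w_lower: float, w_upper: float, w_main: float) -> float:
    return (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0

def tew_primary(c_main: float, scatter: float) -> float:
    # Scatter-corrected (primary) counts, clipped at zero.
    return max(c_main - scatter, 0.0)

# Illustrative counts (not from the paper): 3 keV side windows around
# a 20 keV wide photopeak window.
s = tew_scatter(c_lower=120.0, c_upper=30.0,
                w_lower=3.0, w_upper=3.0, w_main=20.0)
print(tew_primary(1000.0, s))  # → 500.0
```

The paper's finding is that this per-pixel subtraction, while simple, is noisier and less accurate than the transmission-dependent convolution approach.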

  13. Fast group matching for MR fingerprinting reconstruction.

    PubMed

    Cauley, Stephen F; Setsompop, Kawin; Ma, Dan; Jiang, Yun; Ye, Huihui; Adalsteinsson, Elfar; Griswold, Mark A; Wald, Lawrence L

    2015-08-01

    MR fingerprinting (MRF) is a technique for quantitative tissue mapping using pseudorandom measurements. To estimate tissue properties such as T1, T2, proton density, and B0, the rapidly acquired data are compared against a large dictionary of Bloch simulations. This matching process can be a very computationally demanding portion of MRF reconstruction. We introduce a fast group matching algorithm (GRM) that exploits inherent correlation within MRF dictionaries to create highly clustered groupings of the elements. During matching, a group-specific signature is first used to remove poor matching possibilities. Group principal component analysis (PCA) is used to evaluate all remaining tissue types. In vivo 3 Tesla brain data were used to validate the accuracy of our approach. For a trueFISP sequence with over 196,000 dictionary elements, 1000 MRF samples, and an image matrix of 128 × 128, GRM was able to map MR parameters within 2 s using standard vendor computational resources. This is an order of magnitude faster than global PCA and nearly two orders of magnitude faster than direct matching, with comparable accuracy (1-2% relative error). The proposed GRM method is a highly efficient model reduction technique for MRF matching and should enable clinically relevant reconstruction accuracy and time on standard vendor computational resources. © 2014 Wiley Periodicals, Inc.
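The two-stage matching idea can be sketched on a toy dictionary: first prune groups by correlation with a group signature, then run the full inner-product search only within the surviving groups. The group construction and sizes below are illustrative, and the group-PCA refinement step is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy MRF dictionary: 400 simulated signal evolutions of length 50,
# pre-grouped into 8 clusters around random centers (illustrative;
# real groupings come from clustering Bloch-simulated fingerprints).
centers = rng.normal(size=(8, 50))
dictionary = np.repeat(centers, 50, axis=0) + 0.05 * rng.normal(size=(400, 50))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
groups = np.repeat(np.arange(8), 50)
signatures = np.stack([dictionary[groups == g].mean(axis=0) for g in range(8)])

def group_match(signal: np.ndarray, keep: int = 2) -> int:
    """Two-stage match: rank groups by signature correlation, then do a
    full inner-product search only within the best `keep` groups."""
    signal = signal / np.linalg.norm(signal)
    best_groups = np.argsort(signatures @ signal)[-keep:]
    mask = np.isin(groups, best_groups)
    idx = np.flatnonzero(mask)
    return int(idx[np.argmax(dictionary[mask] @ signal)])

# A noisy copy of dictionary entry 123 should match back into its group.
probe = dictionary[123] + 0.01 * rng.normal(size=50)
match = group_match(probe)
print(groups[match] == groups[123])
```

With `keep` groups of roughly equal size, the full search touches only `keep / n_groups` of the dictionary, which is where the speedup comes from.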

  14. Reassessment of True Core Collapse Differential Pressure Values for Filter Elements in Safety Critical Environments - 13076

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swain, Adam

    2013-07-01

    As the areas of application for diverse filter types increase, the mechanics and material science associated with the hardware, and its relationship with ever more arduous process environments, become critical to the successful and reliable operation of the filtration equipment. Where the filter is the last safe barrier between the process and the living environment, structural integrity and reliability are paramount in both the validation and the ethical acceptability of the designed equipment. Core collapse is a key factor influencing filter element selection, and is an extremely complex issue with a number of variables and failure mechanisms. It is becoming clear that the theory behind core collapse calculations is not always supported by real test data. In exploring this issue we have found that the calculation method is not always reflective of the true as-tested collapse value, with the calculated values typically being in excess of, or even an order of magnitude higher than, the tested values. This claim is supported by a case study performed by the author, which disproves much of what was previously understood to be true. This paper also aims to explore the various failure mechanisms of different filter core configurations, comparing calculated collapse values against real tested values, with a view to establishing a method for calculating their true collapse value. As the technology advances, and filter elements are used in higher temperature, higher pressure, more radioactive and more chemically aggressive environments, confidence in core collapse values and data is crucial. (authors)

  15. Determination of the Elastic Moduli of a Single Cell Cultured on a Rigid Support by Force Microscopy.

    PubMed

    Garcia, Pablo D; Garcia, Ricardo

    2018-06-19

    The elastic response of a living cell is affected by its physiological state. This property provides mechanical fingerprints of a cell's dysfunctionality. The softness (kilopascal range) and thickness (2-15 μm) of mammalian cells imply that the force exerted by the probe might be affected by the stiffness of the solid support. This observation makes infinite sample thickness models unsuitable to describe quantitatively the forces and deformations on a cell. Here, we report a general theory to determine the true Young's moduli of a single cell from a force-indentation curve. Analytical expressions are deduced for common geometries such as flat punches, paraboloids, cones, needles, and nanowires. For a given cell and indentation, the influence of the solid support on the measurements is reduced by using sharp and high aspect ratio tips. The theory is validated by finite element simulations. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
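For context, the classical Hertz relation for a paraboloidal tip on an infinitely thick sample, which the paper's finite-thickness theory corrects, can be sketched and fitted as follows (all numbers are illustrative):

```python
import numpy as np

def hertz_paraboloid_force(delta: np.ndarray, E: float, nu: float,
                           R: float) -> np.ndarray:
    """Classical Hertz force for a paraboloidal tip on an infinitely
    thick sample: F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2).
    (The paper's contribution is a finite-thickness correction to this
    kind of model; that correction is not reproduced here.)"""
    return (4.0 / 3.0) * E / (1.0 - nu**2) * np.sqrt(R) * delta**1.5

# Synthetic force-indentation curve: E = 2 kPa, nu = 0.5 (incompressible
# cell), tip radius 5 µm, indentations up to 1 µm.
delta = np.linspace(0, 1e-6, 50)
f = hertz_paraboloid_force(delta, 2000.0, 0.5, 5e-6)

# Linear least squares of F against delta^(3/2) recovers E.
slope = np.sum(f * delta**1.5) / np.sum(delta**3)
E_fit = slope * 3.0 * (1.0 - 0.5**2) / (4.0 * np.sqrt(5e-6))
print(round(E_fit, 1))
```

On a thin cell over glass, fitting this uncorrected model overestimates E, which is exactly the bias the bottom-effect theory removes.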

  16. Neural differences in the processing of true and false sentences: insights into the nature of 'truth' in language comprehension.

    PubMed

    Marques, J Frederico; Canessa, Nicola; Cappa, Stefano

    2009-06-01

    The inquiry into the nature of truth in language comprehension has a long history of opposing perspectives. These perspectives either hold that there are qualitative differences in the processing of true and false statements, or that these processes are fundamentally the same and differ only in quantitative terms. The present study evaluated the processing nature of true and false statements in terms of patterns of brain activity using event-related functional magnetic resonance imaging (fMRI). We show that when true and false concept-feature statements are controlled for relation strength/ambiguity, their processing is associated with qualitatively different processes. Verifying true statements activates the left inferior parietal cortex and the caudate nucleus, a neural correlate compatible with an extended search and matching process for particular stored information. In contrast, verifying false statements activates the fronto-polar cortex and is compatible with a reasoning process of finding and evaluating a contradiction between the sentence information and stored knowledge.

  17. True Numerical Cognition in the Wild.

    PubMed

    Piantadosi, Steven T; Cantlon, Jessica F

    2017-04-01

    Cognitive and neural research over the past few decades has produced sophisticated models of the representations and algorithms underlying numerical reasoning in humans and other animals. These models make precise predictions for how humans and other animals should behave when faced with quantitative decisions, yet primarily have been tested only in laboratory tasks. We used data from wild baboons' troop movements recently reported by Strandburg-Peshkin, Farine, Couzin, and Crofoot (2015) to compare a variety of models of quantitative decision making. We found that the decisions made by these naturally behaving wild animals rely specifically on numerical representations that have key homologies with the psychophysics of human number representations. These findings provide important new data on the types of problems human numerical cognition was designed to solve and constitute the first robust evidence of true numerical reasoning in wild animals.

  18. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE PAGES

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng; ...

    2017-10-04

    Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as few as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.

  19. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng

    Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as few as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.

  20. Finite element method analysis of cold forging for deformation and densification of Mo alloyed sintered steel

    NASA Astrophysics Data System (ADS)

    Kamakoshi, Y.; Nishida, S.; Kanbe, K.; Shohji, I.

    2017-10-01

    In recent years, powder metallurgy (P/M) materials have been expected to be applied to automobile products. Accordingly, not only high cost performance but also greater strength, wear resistance, long life and so on are required of P/M materials. Densification is expected to be one of the effective processes for improving the mechanical properties of P/M materials. In this study, to examine the densification behaviour of Mo-alloyed sintered steel in a cold-forging process, finite element method (FEM) analysis was performed. First, a columnar specimen was cut from the inner part of a sintered specimen and a load-stroke diagram was obtained by a compression test. 2D FEM analysis was performed using the obtained load-stroke diagram. To correct the stress errors between the porous mode and the rigid-elastic mode of the analysis software, a polynomial approximation analysis was performed. As a result, a modified true stress-true strain diagram was obtained for the sintered steel with densification. Afterwards, 3D FEM analysis of backward extrusion was carried out using the modified true stress-true strain diagram. It was confirmed that both the shape and the density of the sintered steel predicted by the proposed FEM analysis correspond well with the experimental ones.
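The conversion of a load-stroke point from the compression test into true stress and true strain can be sketched as below, under the usual constant-volume idealization (which a densifying porous body violates, hence the correction step described in the record). The specimen dimensions and load are illustrative:

```python
import math

def true_stress_strain(force_n: float, stroke_m: float,
                       l0_m: float, a0_m2: float) -> tuple[float, float]:
    """Convert one point of a compression load-stroke diagram to
    (true stress, true strain), assuming constant volume. For a porous
    sintered body this is only a starting point before a densification
    correction of the kind the paper describes."""
    l = l0_m - stroke_m
    true_strain = math.log(l0_m / l)
    area = a0_m2 * l0_m / l          # volume constancy: A * l = A0 * l0
    true_stress = force_n / area
    return true_stress, true_strain

# Illustrative: 50 kN at 2 mm stroke on a 10 mm tall, 78.5 mm^2 specimen.
stress, strain = true_stress_strain(50e3, 2e-3, 10e-3, 78.5e-6)
print(round(stress / 1e6, 1), round(strain, 3))  # MPa, dimensionless
```

The paper's polynomial correction effectively replaces the constant-volume area term with one that tracks the actual density change.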

  1. [Quantitative determination of blood regurgitation via the mitral valve].

    PubMed

    Sandrikov, V A

    1981-11-01

    A method for the quantitative determination of blood regurgitation through the mitral valve is considered. A verification experiment on 5 animals, determining the correlation coefficient between true and predicted regurgitation, showed it to be 0.855 on average. In addition, observations were made on 621 patients with various heart pathologies. Quantitative characteristics of blood regurgitation in patients with mitral defects are given. The method can be used not only under operating conditions, but also during catheterization of the cardiac cavities without the administration of an opaque substance.

  2. Segmentation and detection of fluorescent 3D spots.

    PubMed

    Ram, Sundaresh; Rodríguez, Jeffrey J; Bosco, Giovanni

    2012-03-01

    The 3D spatial organization of genes and other genetic elements within the nucleus is important for regulating gene expression. Understanding how this spatial organization is established and maintained throughout the life of a cell is key to elucidating the many layers of gene regulation. Quantitative methods for studying nuclear organization will lead to insights into the molecular mechanisms that maintain gene organization as well as serve as diagnostic tools for pathologies caused by loss of nuclear structure. However, biologists currently lack automated and high throughput methods for quantitative and qualitative global analysis of 3D gene organization. In this study, we use confocal microscopy and fluorescence in-situ hybridization (FISH) as a cytogenetic technique to detect and localize the presence of specific DNA sequences in 3D. FISH uses probes that bind to specific targeted locations on the chromosomes, appearing as fluorescent spots in 3D images obtained using fluorescence microscopy. In this article, we propose an automated algorithm for segmentation and detection of 3D FISH spots. The algorithm is divided into two stages: spot segmentation and spot detection. Spot segmentation consists of 3D anisotropic smoothing to reduce the effect of noise, top-hat filtering, and intensity thresholding, followed by 3D region-growing. Spot detection uses a Bayesian classifier with spot features such as volume, average intensity, texture, and contrast to detect and classify the segmented spots as either true or false spots. Quantitative assessment of the proposed algorithm demonstrates improved segmentation and detection accuracy compared to other techniques. Copyright © 2012 International Society for Advancement of Cytometry.
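A toy sketch of the segmentation-then-detection idea on a synthetic 3D stack: threshold the image, then keep only candidates whose local neighborhood supports a real spot. The neighborhood vote below is a crude stand-in for the paper's Bayesian classifier and its volume/intensity/texture/contrast features, and all intensities are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 3D stack: one 3x3x3 bright spot and one isolated hot voxel
# on a noisy background (a stand-in for a FISH image; the real pipeline
# also includes anisotropic smoothing, top-hat filtering and 3D
# region-growing).
img = rng.normal(100.0, 5.0, size=(32, 64, 64))
img[10:13, 20:23, 20:23] += 80.0   # true spot
img[20:21, 40:41, 40:41] += 80.0   # single hot voxel: likely noise

threshold = img.mean() + 6.0 * img.std()
mask = img > threshold

def is_true_spot(z: int, y: int, x: int) -> bool:
    """Keep a candidate only if most of its 3x3x3 neighborhood is also
    above threshold (a crude proxy for the volume feature used in the
    paper's spot classifier)."""
    block = mask[z - 1:z + 2, y - 1:y + 2, x - 1:x + 2]
    return bool(block.sum() >= 14)  # majority of the 27 voxels

print(is_true_spot(11, 21, 21), is_true_spot(20, 40, 40))
```

In the paper this accept/reject decision is made per segmented region by a trained Bayesian classifier rather than a fixed voxel-count rule.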

  3. Quantitative Analyses of Core Promoters Enable Precise Engineering of Regulated Gene Expression in Mammalian Cells.

    PubMed

    Ede, Christopher; Chen, Ximin; Lin, Meng-Yin; Chen, Yvonne Y

    2016-05-20

    Inducible transcription systems play a crucial role in a wide array of synthetic biology circuits. However, the majority of inducible promoters are constructed from a limited set of tried-and-true promoter parts, which are susceptible to common shortcomings such as high basal expression levels (i.e., leakiness). To expand the toolbox for regulated mammalian gene expression and facilitate the construction of mammalian genetic circuits with precise functionality, we quantitatively characterized a panel of eight core promoters, including sequences with mammalian, viral, and synthetic origins. We demonstrate that this selection of core promoters can provide a wide range of basal gene expression levels and achieve a gradient of fold-inductions spanning 2 orders of magnitude. Furthermore, commonly used parts such as minimal CMV and minimal SV40 promoters were shown to achieve robust gene expression upon induction, but also suffer from high levels of leakiness. In contrast, a synthetic promoter, YB_TATA, was shown to combine low basal expression with high transcription rate in the induced state to achieve significantly higher fold-induction ratios compared to all other promoters tested. These behaviors remain consistent when the promoters are coupled to different genetic outputs and different response elements, as well as across different host-cell types and DNA copy numbers. We apply this quantitative understanding of core promoter properties to the successful engineering of human T cells that respond to antigen stimulation via chimeric antigen receptor signaling specifically under hypoxic environments. Results presented in this study can facilitate the design and calibration of future mammalian synthetic biology systems capable of precisely programmed functionality.

  4. Missing heritability in the tails of quantitative traits? A simulation study on the impact of slightly altered true genetic models.

    PubMed

    Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André

    2011-01-01

    Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea to re-analyse the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.
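The central contrast of this record can be reproduced in a toy simulation: when a variant acts only in the upper tail of a bimodal mixture, a whole-sample mean comparison sees almost nothing, while sampling from the extremes shows clear carrier enrichment. All parameters below (effect size, allele frequency, tail threshold) are illustrative, not the paper's:

```python
import random

random.seed(4)

# Toy bimodal-mixture scenario: a variant shifts the phenotype only in
# carriers who are already in the upper tail.
n, effect, maf = 5000, 0.6, 0.2
people = []
for _ in range(n):
    g = 1 if random.random() < maf else 0
    y = random.gauss(0.0, 1.0)
    if g and y > 1.0:              # effect confined to the extreme
        y += effect
    people.append((g, y))

# Whole-sample contrast: carriers vs non-carriers differ only slightly.
carriers = [y for g, y in people if g]
others = [y for g, y in people if not g]
mean_diff = sum(carriers) / len(carriers) - sum(others) / len(others)

# Extreme-sampling contrast: carrier frequency in the upper decile
# ("cases") versus the population allele frequency.
people.sort(key=lambda p: p[1])
top = people[-n // 10:]
freq_top = sum(g for g, _ in top) / len(top)

print(round(mean_diff, 2), round(freq_top, 2))
```

The carrier enrichment in the decile is large relative to its sampling error, while the whole-sample mean shift is tiny, mirroring the power advantage of the extreme case-control design.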

  5. Bone-marrow densitometry: Assessment of marrow space of human vertebrae by single energy high resolution-quantitative computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peña, Jaime A.; Damm, Timo; Bastgen, Jan

    Purpose: Accurate noninvasive assessment of vertebral bone marrow fat fraction is important for the diagnostic assessment of a variety of disorders and therapies known to affect marrow composition. Moreover, it provides a means to correct the fat-induced bias of single energy quantitative computed tomography (QCT) based bone mineral density (BMD) measurements. The authors developed new segmentation and calibration methods to obtain quantitative surrogate measures of marrow-fat density in the axial skeleton. Methods: The authors developed and tested two high resolution-QCT (HR-QCT) based methods which permit segmentation of bone voids in between trabeculae, hypothesizing that these voids are representative of the bone marrow space. The methods permit calculation of marrow content in units of mineral equivalent marrow density (MeMD). The first method is based on global thresholding and peeling (GTP) to define a volume of interest away from the transition between trabecular bone and marrow. The second method, morphological filtering (MF), uses spherical elements of different radii (0.1-1.2 mm) and automatically places them in between trabeculae to identify regions with large trabecular interspace, the bone-void space. To determine their performance, data were compared ex vivo to high-resolution peripheral CT (HR-pQCT) images as the gold standard. The performance of the methods was tested on a set of excised human vertebrae with intact bone marrow tissue, representative of an elderly population with low BMD. Results: 86% (GTP) and 87% (MF) of the voxels identified as true marrow space on HR-pQCT images were correctly identified on HR-QCT images, and thus these volumes of interest can be considered representative of true marrow space. Within this volume, MeMD was estimated with residual errors of 4.8 mg/cm³, corresponding to accuracy errors in fat fraction on the order of 5% for both the GTP and MF methods. Conclusions: The GTP and MF methods on HR-QCT images permit noninvasive localization and densitometric assessment of marrow fat with residual accuracy errors sufficient to study disorders and therapies known to affect bone marrow composition. Additionally, the methods can be used to correct BMD for fat-induced bias. Application and testing in vivo and in longitudinal studies are warranted to determine the clinical performance and value of these methods.

  6. Correlation between uteroplacental three-dimensional power Doppler indices and true uterine blood flow: evaluation in a pregnant sheep model.

    PubMed

    Morel, O; Pachy, F; Chavatte-Palmer, P; Bonneau, M; Gayat, E; Laigre, P; Evain-Brion, D; Tsatsaris, V

    2010-11-01

    Three-dimensional (3D) Doppler quantification within the uteroplacental unit could be of great help in understanding and screening for pre-eclampsia and intrauterine growth restriction. Yet the correlation between 3D Doppler indices and true blood flow has not been confirmed in vivo. The aim of this study was to evaluate this correlation in a pregnant sheep model. A quantitative blood flow sensor and a controllable vascular occlusion system were placed around the common uterine artery in seven sheep in late pregnancy, while all the other arterial supplies were ligated. Several occlusion levels were applied, from 0 to 100%, simultaneously with 3D Doppler acquisitions of several placentomes, using standardized settings. Each placentome was analyzed using VOCAL™ (Virtual Organ Computer-aided AnaLysis) software. The correlation between true blood flow and Doppler indices (vascularization index (VI), flow index (FI) and vascularization flow index (VFI)) was evaluated, together with measurement reproducibility. Forty-eight acquisitions were analyzed. All 3D Doppler indices were significantly correlated with true blood flow. Higher correlations were observed for VI and VFI (r = 0.81 (0.74-0.87), P < 0.0001 and r = 0.75 (0.67-0.82), P < 0.0001) compared with FI (r = 0.53 (0.38-0.64), P < 0.0001). Both intra- and interobserver reproducibility were high, with intraclass correlation coefficients of at least 0.799. This is the first in-vivo experimental study confirming a significant correlation between true blood perfusion and quantitative 3D Doppler indices measured within the uteroplacental unit. These results confirm the potential usefulness of 3D Doppler ultrasound for the assessment of placental vascular insufficiency both in clinical cases and in a research setting. Copyright © 2010 ISUOG. Published by John Wiley & Sons, Ltd.
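
    The index-flow correlations reported above are plain Pearson coefficients. A minimal sketch of that computation on synthetic data (all numbers here are hypothetical, not from the study):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical data: true uterine blood flow (mL/min) across 48
# acquisitions at increasing occlusion levels, and a synthetic
# vascularization index (VI) that tracks it with measurement noise.
flow = np.linspace(400.0, 0.0, 48)
vi = 0.08 * flow + rng.normal(0.0, 3.0, size=48)

def pearson_r(x, y):
    """Pearson correlation coefficient of two 1-D arrays."""
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))

r = pearson_r(flow, vi)
print(f"r = {r:.2f}")
```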

  7. Demonstration Technique to Improve Vocabulary and Grammar Element in Teaching Speaking at EFL Learners

    ERIC Educational Resources Information Center

    Husnu, Muhammad

    2018-01-01

    This study aimed at examining the effectiveness of demonstration technique to improve vocabulary and grammar element in teaching speaking at EFL learners. This research applied true-experimental design. The respondents of the study were 32 students (class IIA) as experimental group and 32 students (class IIB) as control group from the second…

  8. A simple finite element method for non-divergence form elliptic equation

    DOE PAGES

    Mu, Lin; Ye, Xiu

    2017-03-01

    Here, we develop a simple finite element method for solving second-order elliptic equations in non-divergence form by combining the least-squares concept with discontinuous approximations. This simple method has a symmetric and positive definite system and can be easily analyzed and implemented. The method also accommodates general meshes with polytopal elements and hanging nodes. We prove that our finite element solution approaches the true solution as the mesh size approaches zero. Numerical examples demonstrate the robustness and flexibility of the method.
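
    The least-squares idea can be illustrated outside the paper's discontinuous finite element setting. The sketch below (my own construction, not the authors' scheme) discretizes a 1-D non-divergence form problem a(x)·u'' = f with finite differences and solves the resulting normal equations, which are symmetric positive definite, mirroring the property claimed in the abstract:

```python
import numpy as np

# Model problem in non-divergence form: a(x) * u''(x) = f(x) on (0, 1),
# u(0) = u(1) = 0, with a(x) = 1 + x.  Manufactured solution
# u(x) = sin(pi x), so f(x) = -(pi**2) * (1 + x) * sin(pi x).
n = 80
x = np.linspace(0.0, 1.0, n + 1)
h = x[1] - x[0]
a = 1.0 + x

A = np.zeros((n + 1, n + 1))
b = np.zeros(n + 1)
for i in range(1, n):
    # Row i enforces a_i * (u_{i-1} - 2 u_i + u_{i+1}) / h^2 = f_i.
    A[i, i - 1] = A[i, i + 1] = a[i] / h**2
    A[i, i] = -2.0 * a[i] / h**2
    b[i] = -(np.pi**2) * a[i] * np.sin(np.pi * x[i])
A[0, 0] = A[n, n] = 1.0  # Dirichlet boundary rows (b already zero there)

# Least-squares/normal equations: (A^T A) u = A^T b is symmetric
# positive definite, so a standard solver applies.
u = np.linalg.solve(A.T @ A, A.T @ b)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
print(f"max error = {err:.2e}")
```

As the abstract's convergence claim suggests, refining the mesh (larger n) drives the error toward zero at second order for this smooth solution.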

  9. A simple finite element method for non-divergence form elliptic equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mu, Lin; Ye, Xiu

    Here, we develop a simple finite element method for solving second-order elliptic equations in non-divergence form by combining the least-squares concept with discontinuous approximations. This simple method has a symmetric and positive definite system and can be easily analyzed and implemented. The method also accommodates general meshes with polytopal elements and hanging nodes. We prove that our finite element solution approaches the true solution as the mesh size approaches zero. Numerical examples demonstrate the robustness and flexibility of the method.

  10. Finite element analysis of true and pseudo surface acoustic waves in one-dimensional phononic crystals

    NASA Astrophysics Data System (ADS)

    Graczykowski, B.; Alzina, F.; Gomis-Bresco, J.; Sotomayor Torres, C. M.

    2016-01-01

    In this paper, we report a theoretical investigation of surface acoustic waves propagating in a one-dimensional phononic crystal. Using finite element method eigenfrequency and frequency-response studies, we develop two model geometries suitable to distinguish true and pseudo (or leaky) surface acoustic waves and determine their propagation through finite-size phononic crystals, respectively. The novelty of the first model comes from the application of a surface-like criterion and, additionally, a functional damping domain. Exemplary calculated band diagrams show sorted branches of true and pseudo surface acoustic waves and their quantified surface confinement. The second model gives a complementary study of transmission, reflection, and surface-to-bulk losses of Rayleigh surface waves in the case of a phononic crystal with a finite number of periods. Here, we demonstrate that a non-zero transmission within non-radiative band gaps can be carried via leaky modes originating from the coupling of local resonances with propagating waves in the substrate. Finally, we show that the transmission, reflection, and surface-to-bulk losses can be effectively optimised by tuning the geometrical properties of a stripe.

  11. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours.
Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
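
    The effect of a mismatched PSF kernel can be toyed with in one dimension. The sketch below is an illustrative Richardson-Lucy deconvolution on synthetic data (not the paper's OS-EM pipeline or XCAT phantom): a "tumour" profile is blurred by a true Gaussian PSF and then reconstructed with under-, exactly-, and over-estimated kernel widths:

```python
import numpy as np

def gaussian_kernel(sigma, radius=12):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def richardson_lucy(blurred, kernel, iters=100):
    """1-D Richardson-Lucy deconvolution (kernel assumed symmetric)."""
    est = np.full_like(blurred, blurred.mean())
    for _ in range(iters):
        conv = np.convolve(est, kernel, mode="same")
        est = est * np.convolve(blurred / np.maximum(conv, 1e-12),
                                kernel, mode="same")
    return est

# Synthetic 8-voxel "tumour" blurred by a true PSF with sigma = 2.
truth = np.zeros(128)
truth[60:68] = 1.0
blurred = np.convolve(truth, gaussian_kernel(2.0), mode="same")

recovery = {}
for sigma in (1.0, 2.0, 3.0):  # under-estimated, matched, over-estimated PSF
    est = richardson_lucy(blurred, gaussian_kernel(sigma))
    recovery[sigma] = float(est[60:68].mean() / truth[60:68].mean())
    print(f"assumed sigma = {sigma}: mean recovery = {recovery[sigma]:.2f}")
```

Noise is omitted here, so this only illustrates the contrast-recovery side of the trade-off the abstract describes, not the noise-bias side.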

  12. (Congressional Add) Partnership in Innovative Preparation for Educators and Students (PIPES)

    DTIC Science & Technology

    2011-12-30

    ...researchers collected and analyzed both qualitative and quantitative data from students, teachers, and parents related to PIPES program effectiveness... reported an internal reliability of .93 at pretest and .95 at posttest and follow-up. We used a six-point scale from "not at all true" to "definitely

  13. Consistent linearization of the element-independent corotational formulation for the structural analysis of general shells

    NASA Technical Reports Server (NTRS)

    Rankin, C. C.

    1988-01-01

    A consistent linearization is provided for the element-independent corotational formulation, giving the proper first and second variations of the strain energy. As a result, the warping problem that has plagued flat elements has been overcome, with beneficial effects carried over to linear solutions. True Newton quadratic convergence has been restored to the Structural Analysis of General Shells (STAGS) code for conservative loading using the full corotational implementation. Some implications for general finite element analysis are discussed, including what effect the automatic frame invariance provided by this work might have on the development of new, improved elements.

  14. Mineral Analysis of Whole Grain Total Cereal

    ERIC Educational Resources Information Center

    Hooker, Paul

    2005-01-01

    The quantitative analysis of elemental iron in Whole Grain Total Cereal using visible spectroscopy is suitable for a general chemistry course for science or nonscience majors. The more extensive mineral analysis, specifically for the elements iron, calcium and zinc, is suitable for an instrumental or quantitative analysis chemistry course.

  15. Rock-forming and rare elements in lunar surface material from the Sea of Tranquillity and the Ocean of Storms

    NASA Technical Reports Server (NTRS)

    Shevaleyevskiy, I. D.; Chupakhin, M. S.

    1974-01-01

    Methodological and analytical capabilities associated with spark mass spectrometry and X-ray spectroscopy are presented for determining the elemental composition of samples of lunar regolith returned to Earth by Apollo 11 and Apollo 12. The main constituents of the lunar surface material were determined using X-ray spectroscopy, and the main trace admixtures using mass spectrometry. The principal difference between the Apollo 11 and Apollo 12 samples was found in elements present at microconcentrations. This is especially true of the rare earth elements.

  16. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has reduced considerably, with the ability to map minor and trace elements very accurately due to the larger detector area and higher-count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow users to predict and verify where they are likely to have problems in their images, and are especially helpful for examining possible interface artefacts. The post-processing techniques used to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.

  17. Biochemical evaluation of articular cartilage in patients with osteochondrosis dissecans by means of quantitative T2 and T2* mapping at 3T MRI: a feasibility study.

    PubMed

    Marik, W; Apprich, S; Welsch, G H; Mamisch, T C; Trattnig, S

    2012-05-01

    To perform an in vivo evaluation comparing overlying articular cartilage in patients suffering from osteochondrosis dissecans (OCD) in the talocrural joint and healthy volunteers using quantitative T2 and T2* mapping at 3.0 T. Ten patients with OCD of Grade II or lower and nine healthy age-matched volunteers were examined on a 3.0 T whole-body MR scanner using a flexible multi-element coil. In all subjects, MRI included proton-density (PD)-FSE and 3D GRE (TrueFisp) sequences for morphological diagnosis and location of the anatomical site, as well as quantitative T2 and T2* maps. Region of interest (ROI) analysis was performed for the cartilage layer above the OCD and for a morphologically healthy-graded cartilage layer. Mean T2 and T2* values were then statistically analysed. The cartilage layer of healthy volunteers showed mean T2 and T2* values of 29.4 ms (SD 4.9) and 11.8 ms (SD 2.7), respectively. In patients with OCD grade I and II lesions, mean T2 values were 40.9 ms (SD 6.6) and 48.7 ms (SD 11.2), and mean T2* values were 16.1 ms (SD 3.2) and 16.2 ms (SD 4.8), respectively. Statistically significantly higher mean T2 and T2* values were therefore found in patients suffering from OCD compared with healthy volunteers. T2 and T2* mapping can help assess the microstructural composition of cartilage overlying osteochondral lesions. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
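
    Quantitative T2 values like those above come from fitting a mono-exponential decay S(TE) = S0·exp(-TE/T2) to multi-echo signal intensities. A minimal log-linear fit on synthetic, noiseless data (the TE values and T2 = 40 ms are hypothetical, chosen to mimic the OCD-grade range reported above):

```python
import numpy as np

# Mono-exponential T2 decay: S(TE) = S0 * exp(-TE / T2).
TE = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])  # echo times, ms
S0_true, T2_true = 1000.0, 40.0
S = S0_true * np.exp(-TE / T2_true)

# Log-linear least-squares fit: ln S = ln S0 - TE / T2,
# so the slope of ln S against TE is -1 / T2.
slope, intercept = np.polyfit(TE, np.log(S), 1)
T2_fit = -1.0 / slope
print(f"fitted T2 = {T2_fit:.1f} ms")
```

With noisy data, a nonlinear fit weighted by signal intensity is usually preferred over this log-linear shortcut.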

  18. Emergy evaluation of water utilization benefits in water-ecological-economic system based on water cycle process

    NASA Astrophysics Data System (ADS)

    Guo, X.; Wu, Z.; Lv, C.

    2017-12-01

    Water utilization benefits are formed by the material flow, energy flow, information flow, and value stream in the whole water cycle process, and are reflected in the material circulation within the system. Most traditional evaluations of water utilization benefits, however, operate at the macro level: they consider only overall material input and output and energy conversion, and do not characterize, from the formation mechanism, the benefits that accompany the water cycle process. In addition, most studies take an economic perspective, attending only to overall economic output and the economic investment in sewage treatment while neglecting the ecological function benefits of the water cycle. Therefore, from the perspective of internal material circulation in the whole system, and taking the water cycle process as a process of material circulation and energy flow, the circulation and flow of water and other ecological-environmental and socio-economic elements were described, the composition of positive and negative water utilization benefits in the water-ecological-economic system was explored, and the performance of each benefit was analyzed. On this basis, an emergy calculation method for each benefit was proposed using quantitative emergy analysis, which enables unified measurement and evaluation of water utilization benefits in the water-ecological-economic system. Taking Zhengzhou city as an example, the benefits corresponding to different water cycle links were calculated quantitatively by the emergy method. The results showed that the emergy evaluation method can unify the ecosystem and the economic system, achieve uniform quantitative analysis, and comprehensively measure the true value of natural resources and human economic activities.
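
    At its core, an emergy evaluation multiplies each energy (or material) flow by a unit emergy value (transformity) and sums the results in solar emjoules. A minimal sketch with illustrative, made-up flows and transformities (none of these numbers are from the study):

```python
# Hypothetical annual flows (J/yr) paired with assumed unit emergy
# values (seJ/J); both sets of numbers are purely illustrative.
flows = {
    "rainfall_chemical": (3.0e15, 3.1e4),
    "river_inflow":      (8.0e14, 8.1e4),
    "fuels_and_goods":   (2.0e14, 6.6e4),
}

# Total emergy = sum over flows of (energy flow x transformity),
# expressed in solar emjoules per year (seJ/yr).
total_emergy = sum(energy * uev for energy, uev in flows.values())
print(f"total emergy = {total_emergy:.2e} seJ/yr")
```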

  19. GNSS Ephemeris with Graceful Degradation and Measurement Fusion

    NASA Technical Reports Server (NTRS)

    Garrison, James Levi (Inventor); Walker, Michael Allen (Inventor)

    2015-01-01

    A method for providing an extended propagation ephemeris model for a satellite in Earth orbit, the method includes obtaining a satellite's orbital position over a first period of time, applying a least-squares estimation filter to determine coefficients defining osculating Keplerian orbital elements and harmonic perturbation parameters associated with a coordinate system defining an extended propagation ephemeris model that can be used to estimate the satellite's position during the first period, wherein the osculating Keplerian orbital elements include semi-major axis of the satellite (a), eccentricity of the satellite (e), inclination of the satellite (i), right ascension of the ascending node of the satellite (Ω), true anomaly (θ*), and argument of periapsis (ω), applying the least-squares estimation filter to determine a dominant frequency of the true anomaly, and applying a Fourier transform to determine dominant frequencies of the harmonic perturbation parameters.
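
    The last step, extracting a dominant frequency from a perturbation series via a Fourier transform, can be sketched as follows (the signal and its frequencies are synthetic and arbitrary, not from the patent):

```python
import numpy as np

# Hypothetical perturbation time series: a dominant oscillation at
# 1.3 cycles per time unit plus a weaker harmonic at 3.9.
n, dt = 1000, 0.01
t = np.arange(n) * dt
signal = np.sin(2 * np.pi * 1.3 * t) + 0.2 * np.sin(2 * np.pi * 3.9 * t)

# Dominant frequency = frequency of the largest FFT magnitude,
# excluding the DC bin.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n, dt)
dominant = float(freqs[1:][np.argmax(spectrum[1:])])
print(f"dominant frequency = {dominant:.2f}")
```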

  20. PERIODIC CLASSIFICATION AND THE PROUST LAW (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rinck, E.; Feschotte, P.

    1962-04-01

    Progress realized in the knowledge of the solid state has permitted the identification of numerous crystalline phases whose composition is not defined in the sense of the Proust law. Its rigorous validity nevertheless served as a starting point for atomic theory; it continues to be utilized in the measurement of atomic weights and remains valid for the vast region of organic chemistry. The investigation of the limits of validity of the Proust law leads to some peculiarities of the metallic state which are closely connected to the periodic classification of the elements. A new arrangement of the periodic table, permitting for the first time the integration of the rare earths and giving hydrogen a very special place, takes into consideration a distinction between true metals and earth metals. This distinction is imposed by the fact that the Proust law, valid for compounds between metalloids and earth metals, is not always followed when these same metalloids unite with true metals. Finally, this law loses all significance in alloys between true metals. The exceptions to this rule are explained by the specialization of chemical properties which appears when one passes from the short periods to the long periods, hydrogen and the short-period metals being considered undifferentiated elements. This point of view, borrowed from embryology, thus permits the chemical and even the physical properties of these elements to be better connected. (tr-auth)

  1. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Geochemistry of organic carbon and trace elements in boreal stratified lakes during different seasons

    NASA Astrophysics Data System (ADS)

    Moreva, O. Y.; Pokrovsky, O. S.; Shirokova, L. S.; Viers, J.

    2008-12-01

    Our knowledge of chemical fluxes in the rock-soil-river-ocean system of boreal and glacial landscapes is limited by its least studied part, i.e., the transformation of river water between the lake and river systems. Dissolved organic carbon (DOC), nutrients, and major and trace elements are leached from the soil profile to the river, but are subject to chemical transformation in lakes due to phytoplankton and bacterial activity. As a result, many lakes in boreal regions are quite different in chemical composition from the surrounding rivers and demonstrate important chemical stratification. The main processes responsible for chemical stratification in lakes are considered to be i) diffusion fluxes from the sediment to the bottom water, accompanied by sulfate reduction and methanogenesis in the sediments, and ii) dissolution/mineralization of precipitating organic matter (mineral fraction, detritus, plankton pellets) in the bottom horizons under anoxic conditions. Up to the present time, distinguishing between the two processes has remained difficult. This paper aims at filling this gap via detailed geochemical analysis of DOC and trace elements in the water-column profiles of three typical stratified lakes of the Arkhangelsk region in Kenozersky National Park (64° N) in the winter (ice-covered) and summer periods. Concentrations of most trace elements (Li, B, Al, Ti, V, Cr, Ni, Co, Zn, As, Rb, Sr, Y, Zr, Mo, Sb, Ba, REEs, Th, U) are not subject to strong variations along the water column, despite the presence of strong or partial redox stratification. Apparently, these elements are not significantly controlled by production/mineralization processes and redox phenomena in the water column, or the influence of these processes is not pronounced under the control of the allochthonous river water input.
In particular, the stability of titanium and aluminum concentrations along the depth profile, and their independence from iron behavior, suggest important control by dissolved organic matter. Therefore, the organo-ferric colloids controlling petrogenic element speciation in soil and river waters are replaced by autochthonous organic colloids in the lake system. The same observation is true for some heavy metals such as nickel, copper and zinc, whereas cobalt, as a limiting component, is strongly removed from the photic zone or coprecipitates with manganese hydroxide. The results of the present work allow quantitative evaluation of the role of redox processes in the bottom horizons and of organic detritus degradation in the creation of chemical stratification of small lakes with high DOC concentration. Further insights into the geochemical migration of trace elements in lakes require: i) study of colloidal speciation using in-situ dialysis; ii) monitoring of the annual and seasonal dynamics of redox processes and TE concentration variation along the profile; iii) quantitative assessment of bacterial degradation of suspended OM and of Mn and Fe redox reactions along the depth profile; iv) setting sedimentary traps for evaluation of suspended material fluxes; and v) thorough study of the chemical composition of interstitial pore waters.

  3. True random numbers from amplified quantum vacuum.

    PubMed

    Jofre, M; Curty, M; Steinlechner, F; Anzolin, G; Torres, J P; Mitchell, M W; Pruneri, V

    2011-10-10

    Random numbers are essential for applications ranging from secure communications to numerical simulation and quantitative finance. Algorithms can rapidly produce pseudo-random outcomes, series of numbers that mimic most properties of true random numbers, while quantum random number generators (QRNGs) exploit intrinsic quantum randomness to produce true random numbers. Single-photon QRNGs are conceptually simple but produce few random bits per detection. In contrast, vacuum fluctuations are a vast resource for QRNGs: they are broad-band and thus can encode many random bits per second. Direct recording of vacuum fluctuations is possible, but requires shot-noise-limited detectors, at the cost of bandwidth. We demonstrate efficient conversion of vacuum fluctuations to true random bits using optical amplification of vacuum and interferometry. Using commercially available optical components we demonstrate a QRNG at a bit rate of 1.11 Gbps. The proposed scheme has the potential to be extended to 10 Gbps and even up to 100 Gbps by taking advantage of high-speed modulation sources and detectors for optical fiber telecommunication devices.
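
    The generic back end of such a scheme, turning digitized analog noise into unbiased bits, can be sketched as follows. The block substitutes a software Gaussian generator for the amplified vacuum signal (so its output is only pseudo-random) and applies thresholding plus von Neumann debiasing:

```python
import random

random.seed(42)

# Stand-in for digitized noise samples; a real QRNG would digitize
# amplified optical vacuum fluctuations instead.
samples = [random.gauss(0.0, 1.0) for _ in range(10000)]

# Step 1: threshold at the median to get raw (possibly biased) bits.
threshold = sorted(samples)[len(samples) // 2]
raw = [1 if s > threshold else 0 for s in samples]

# Step 2: von Neumann debiasing - examine non-overlapping pairs and
# keep the first bit of each unequal pair, discarding equal pairs.
bits = [a for a, b in zip(raw[0::2], raw[1::2]) if a != b]
print(f"{len(bits)} debiased bits from {len(samples)} samples")
```

Von Neumann extraction removes bias but not correlations; production QRNGs use stronger randomness extractors.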

  4. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
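
    The replicate-number calculation mentioned above is, in its simplest normal-approximation form, a two-sample power formula. A minimal sketch (the effect size and variance are hypothetical planning numbers, not values from the review):

```python
from math import ceil
from statistics import NormalDist

def replicates_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a two-group comparison:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2 per group,
    where delta is the difference to detect and sigma the per-replicate SD.
    """
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)
    z_b = z.inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Detect a 1-unit mean difference with a between-replicate SD of 1 unit
# at the conventional 5% significance level and 80% power.
print(replicates_per_group(delta=1.0, sigma=1.0))
```

In practice a t-distribution correction adds a replicate or two for small n, and blocking reduces the effective sigma.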

  5. On-line hydrogen-isotope measurements of organic samples using elemental chromium: An extension for high temperature elemental-analyzer techniques

    USGS Publications Warehouse

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A.J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ2H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ2H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. 
They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while handling water as a bulk sample. The calibration of organic samples, commonly having high δ2H values, will benefit from the availability of suitably 2H-enriched reference waters, extending the VSMOW-SLAP scale above zero.
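
    Two-point normalization to the VSMOW-SLAP scale is a linear rescaling anchored at the two reference waters (δ2H of VSMOW is 0 and of SLAP is -428.0 mUr by convention). A minimal sketch with hypothetical raw instrument readings:

```python
def normalize_vsmow_slap(delta_measured, meas_vsmow, meas_slap,
                         true_vsmow=0.0, true_slap=-428.0):
    """Two-point linear normalization of raw delta-2H values (mUr)
    to the VSMOW-SLAP scale."""
    scale = (true_slap - true_vsmow) / (meas_slap - meas_vsmow)
    return true_vsmow + (delta_measured - meas_vsmow) * scale

# Hypothetical raw instrument values for the two anchors and a sample.
raw_vsmow, raw_slap, raw_sample = 2.5, -421.0, -60.0
print(f"{normalize_vsmow_slap(raw_sample, raw_vsmow, raw_slap):.1f} mUr")
```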

  6. Usefulness of a Dual Macro- and Micro-Energy-Dispersive X-Ray Fluorescence Spectrometer to Develop Quantitative Methodologies for Historic Mortar and Related Materials Characterization.

    PubMed

    García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel

    2018-05-01

    Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads and the whole volume of sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. This new ED-XRF quantitative method also demonstrated the importance of matrix effects in the accuracy of the results, making it necessary to use certified reference materials as standards. The results obtained with the ED-XRF quantitative method were compared with those obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work prove that μ-ED-XRF spectrometers can be used not only to acquire high-resolution elemental distribution maps, but also to perform accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as destructive pretreatment in inductively coupled plasma mass spectrometry based procedures.

  7. Finite element analysis of true and pseudo surface acoustic waves in one-dimensional phononic crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graczykowski, B., E-mail: bartlomiej.graczykowski@icn.cat; Alzina, F.; Gomis-Bresco, J.

    In this paper, we report a theoretical investigation of surface acoustic waves propagating in a one-dimensional phononic crystal. Using finite element method eigenfrequency and frequency-response studies, we develop two model geometries suitable to distinguish true and pseudo (or leaky) surface acoustic waves and determine their propagation through finite-size phononic crystals, respectively. The novelty of the first model comes from the application of a surface-like criterion and, additionally, a functional damping domain. Exemplary calculated band diagrams show sorted branches of true and pseudo surface acoustic waves and their quantified surface confinement. The second model gives a complementary study of transmission, reflection, and surface-to-bulk losses of Rayleigh surface waves in the case of a phononic crystal with a finite number of periods. Here, we demonstrate that a non-zero transmission within non-radiative band gaps can be carried via leaky modes originating from the coupling of local resonances with propagating waves in the substrate. Finally, we show that the transmission, reflection, and surface-to-bulk losses can be effectively optimised by tuning the geometrical properties of a stripe.

  8. Quantitative Effects of P Elements on Hybrid Dysgenesis in Drosophila Melanogaster

    PubMed Central

    Rasmusson, K. E.; Simmons, M. J.; Raymond, J. D.; McLarnon, C. F.

    1990-01-01

    Genetic analyses involving chromosomes from seven inbred lines derived from a single M' strain were used to study the quantitative relationships between the incidence and severity of P-M hybrid dysgenesis and the number of genomic P elements. In four separate analyses, the mutability of sn(w), a P element-insertion mutation of the X-linked singed locus, was found to be inversely related to the number of autosomal P elements. Since sn(w) mutability is caused by the action of the P transposase, this finding supports the hypothesis that genomic P elements titrate the transposase present within a cell. Other analyses demonstrated that autosomal transmission ratios were distorted by P element action. In these analyses, the amount of distortion against an autosome increased more or less linearly with the number of P elements carried by the autosome. Additional analyses showed that the magnitude of this distortion was reduced when a second P element-containing autosome was present in the genome. This reduction could adequately be explained by transposase titration; there was no evidence that it was due to repressor molecules binding to P elements and inhibiting their movement. The influence of genomic P elements on the incidence of gonadal dysgenesis was also investigated. Although no simple relationship between the number of P elements and the incidence of the trait could be discerned, it was clear that even a small number of elements could increase the incidence markedly. The failure to find a quantitative relationship between P element number and the incidence of gonadal dysgenesis probably reflects the complex etiology of this trait. PMID:2155853

  9. Research related to improved computer aided design software package. [comparative efficiency of finite, boundary, and hybrid element methods in elastostatics

    NASA Technical Reports Server (NTRS)

    Walston, W. H., Jr.

    1986-01-01

    The comparative computational efficiencies of the finite element (FEM), boundary element (BEM), and hybrid boundary element-finite element (HVFEM) analysis techniques are evaluated for representative bounded domain interior and unbounded domain exterior problems in elastostatics. Computational efficiency is carefully defined in this study as the computer time required to attain a specified level of solution accuracy. The study found the FEM superior to the BEM for the interior problem, while the reverse was true for the exterior problem. The hybrid analysis technique was found to be comparable or superior to both the FEM and BEM for both the interior and exterior problems.

  10. Simulation of the M13 life cycle II: Investigation of the control mechanisms of M13 infection and establishment of the carrier state.

    PubMed

    Smeal, Steven W; Schmitt, Margaret A; Pereira, Ronnie Rodrigues; Prasad, Ashok; Fisk, John D

    2017-01-01

    Bacteriophage M13 is a true parasite of bacteria, able to co-opt the infected cell and control the production of progeny across many cellular generations. Here, our genetically-structured simulation of M13 is applied to quantitatively dissect the interplay between the host cellular environment and the controlling interactions governing the phage life cycle during the initial establishment of infection and across multiple cell generations. Multiple simulations suggest that phage-encoded feedback interactions constrain the utilization of host DNA polymerase, RNA polymerase and ribosomes. The simulation reveals the importance of p5 translational attenuation in controlling the production of phage double-stranded DNA and suggests an underappreciated role for p5 translational self-attenuation in resource allocation. The control elements active in a single generation are sufficient to reproduce the experimentally-observed multigenerational curing of the phage infection. Understanding the subtleties of regulation will be important for maximally exploiting M13 particles as scaffolds for nanoscale devices. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Hg0 and HgCl2 Reference Gas Standards: NIST Traceability ...

    EPA Pesticide Factsheets

    EPA and NIST have collaborated to establish the necessary procedures for establishing the required NIST traceability of commercially-provided Hg0 and HgCl2 reference generators. This presentation will discuss the approach of a joint EPA/NIST study to accurately quantify the true concentrations of Hg0 and HgCl2 reference gases produced from high quality, NIST-traceable, commercial Hg0 and HgCl2 generators. This presentation will also discuss the availability of HCl and Hg0 compressed reference gas standards as a result of EPA's recently approved Alternative Methods 114 and 118. Gaseous elemental mercury (Hg0) and oxidized mercury (HgCl2) reference standards are integral to the use of mercury continuous emissions monitoring systems (Hg CEMS) for regulatory compliance emissions monitoring. However, a quantitative disparity of approximately 7-10% has been observed between commercial Hg0 and HgCl2 reference gases which currently limits the use of (HgCl2) reference gas standards. Resolving this disparity would enable the expanded use of (HgCl2) reference gas standards for regulatory compliance purposes.

  12. Quantitation of specific binding ratio in 123I-FP-CIT SPECT: accurate processing strategy for cerebral ventricular enlargement with use of 3D-striatal digital brain phantom.

    PubMed

    Furuta, Akihiro; Onishi, Hideo; Amijima, Hizuru

    2018-06-01

    This study aimed to evaluate the effect of ventricular enlargement on the specific binding ratio (SBR) and to validate the cerebrospinal fluid (CSF)-Mask algorithm for quantitative SBR assessment of 123I-FP-CIT single-photon emission computed tomography (SPECT) images with the use of a 3D-striatum digital brain (SDB) phantom. Ventricular enlargement was simulated by three-dimensional extensions in a 3D-SDB phantom comprising segments representing the striatum, ventricle, brain parenchyma, and skull bone. The Evans Index (EI) was measured in 3D-SDB phantom images of an enlarged ventricle. Projection data sets were generated from the 3D-SDB phantoms with blurring, scatter, and attenuation. Images were reconstructed using the ordered subset expectation maximization (OSEM) algorithm and corrected for attenuation, scatter, and resolution recovery. We bundled DaTView (Southampton method) with the CSF-Mask processing software for SBR and assessed SBR with the use of various coefficients (f factors) of the CSF-Mask. The 3D-SDB phantom simulations used true SBR values of 1, 2, 3, 4, and 5. Measured SBRs were underestimated by more than 50% as the EI increased relative to the true SBR, and this trend was most pronounced at low SBR. The CSF-Mask improved 20% underestimates and brought the measured SBR closer to the true values at an f factor of 1.0, despite increases in EI. We related the EI to the f factor with the linear regression function y = -3.53x + 1.95 (r = 0.95), obtained using root-mean-square error. Processing with CSF-Mask generates accurate quantitative SBR from dopamine transporter SPECT images of patients with ventricular enlargement.
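The reported regression can be applied directly to choose a CSF-Mask coefficient from a measured Evans Index; a minimal sketch, assuming (as the abstract implies) that y is the f factor and x the EI — the function name is ours, not part of the published software:

```python
def f_factor_from_ei(ei):
    """Estimate the CSF-Mask f factor from the Evans Index using the
    reported linear regression y = -3.53x + 1.95 (r = 0.95)."""
    return -3.53 * ei + 1.95

# An Evans Index of 0.25 maps to an f factor close to 1.0
print(round(f_factor_from_ei(0.25), 4))  # → 1.0675
```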

  13. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

    Due to the influence of major elements' self-absorption, scarce observable spectral lines of trace elements, and relative efficiency correction of the experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. In order to overcome these difficulties, the standard reference line (SRL) method combined with one-point calibration (OPC) is used to analyze six elements in three stainless steel and five heat-resistant steel samples. The Stark broadening and the Saha–Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with OPC method, and the intercept with OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of the quantitative analysis and the detection limits of trace elements.
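The plasma-temperature step named above can be illustrated with a standard Boltzmann plot: ln(Iλ/gA) plotted against the upper-level energy has slope −1/(k_B·T). This is a minimal sketch with synthetic, collinear line data (not the authors' Fe lines):

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_plot_temperature(points):
    """Least-squares fit of y = ln(I*lambda/(g*A)) vs. upper-level
    energy E (eV); the slope is -1/(k_B*T), so T = -1/(k_B*slope)."""
    n = len(points)
    mean_e = sum(e for e, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = (sum((e - mean_e) * (y - mean_y) for e, y in points)
             / sum((e - mean_e) ** 2 for e, _ in points))
    return -1.0 / (K_B_EV * slope)

# Synthetic lines generated for T = 10000 K (kT ≈ 0.8617 eV)
kt = K_B_EV * 10000.0
pts = [(e, 5.0 - e / kt) for e in (2.0, 3.5, 4.8)]
print(round(boltzmann_plot_temperature(pts)))  # recovers 10000 K
```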

  14. Improved EPMA Trace Element Accuracy Using a Matrix Iterated Quantitative Blank Correction

    NASA Astrophysics Data System (ADS)

    Donovan, J. J.; Wark, D. A.; Jercinovic, M. J.

    2007-12-01

    At trace element levels below several hundred PPM, accuracy, rather than precision, is more often the limiting factor for EPMA quantification. Modern EPMA instruments equipped with low-noise detectors, counting electronics and large-area analyzing crystals can now routinely achieve sensitivities for most elements at the 10 to 100 PPM level (or even lower). But due to various sample and instrumental artifacts in the x-ray continuum, absolute accuracy is often the limiting factor for ultra-trace element quantification. These artifacts have various mechanisms, but are usually attributed to sample artifacts (e.g., sample matrix absorption edges)1, detector artifacts (e.g., Ar or Xe absorption edges)2 and analyzing-crystal artifacts (extended peak tails preventing accurate determination of the true background, and "negative peaks" or "holes" in the x-ray continuum). The latter were first described3 by Self et al. and recently documented for the Ti Kα-in-quartz geothermometer.4 The general magnitude of these artifacts can be seen in the following analyses (wt.%) of Ti Kα in a synthetic quartz standard, measured on five spectrometer/crystal combinations:

                Ti Kα     Ti Kα     Ti Kα     Ti Kα     Ti Kα        Si         O      Total
    Average:  -0.00146  -0.00031  -0.00180   0.00013   0.00240   46.7430   53.2563   99.9983
    Std Dev:   0.00069   0.00075   0.00036   0.00190   0.00117    0.00000   0.00168    0.00419

    The values for each spectrometer/crystal vary systematically from -18 PPM to +24 PPM. The exact mechanism for these continuum "holes" is not known, but may be related to secondary lattice diffraction occurring at certain Bragg angles, depending on crystal mounting orientation for non-isometric analyzing crystals5. These x-ray continuum artifacts can produce systematic errors at levels up to 100 PPM or more, depending on the particular analytical situation.
In order to correct for these inaccuracies, a "blank" correction has been developed that applies a quantitative correction to the measured x-ray intensities during the matrix iteration, by calculating the intensity contribution from the systematic quantitative offset from a known (usually zero-level) blank standard. Preliminary results from this new matrix-iterated trace element blank correction demonstrate that systematic errors can be reduced to single-digit PPM levels in many situations. 1Robinson, B.W., Ware, N.G. and Smith, D.G.W. (1998) "Modern Electron-Microprobe Trace-Element Analysis in Mineralogy", in Cabri, L.J. and Vaughan, D.J., Eds., "Modern Approaches to Ore and Environmental Mineralogy", Short Course 27, Mineralogical Association of Canada, Ottawa, 153-180. 2Remond, G., Myklebust, R., Fialin, M., Nockolds, C., Phillips, M. and Roques-Carmes, C. (2002) "Decomposition of Wavelength Dispersive X-ray Spectra", J. Res. Natl. Inst. Stand. Technol., 107, 509-529. 3Self, P.G., Norrish, K., Milnes, A.R., Graham, J. and Robinson, B.W. (1990) "Holes in the Background in XRS", X-ray Spectrom., 19(2), 59-61. 4Wark, D.A. and Watson, E.B. (2006) "TitaniQ: A Titanium-in-Quartz geothermometer", Contributions to Mineralogy and Petrology, 152, 743-754, doi: 10.1007/s00410-006-0132-3.

  15. STATISTICAL PROCEDURES FOR DETERMINATION AND VERIFICATION OF MINIMUM REPORTING LEVELS FOR DRINKING WATER METHODS

    EPA Science Inventory

    The United States Environmental Protection Agency's (EPA) Office of Ground Water and Drinking Water (OGWDW) has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which fu...

  16. Virtual Prototyping: Concept to Production

    DTIC Science & Technology

    1994-03-01

    element analysis. Meshing refers to the generation of nodal coordinates and elements... The FEA enables designers to evaluate complex... pants. It is not acceptable to have one weapon system believe it is concealed by a terrain fea... technology. This is especially true when generating... conducted by General Paul F. Gorman, USA (Ret.), who led the... process there is ample opportunity to utilize virtual prototyping and simulation to en

  17. Distribution of Unlinked Transpositions of a Ds Element from a T-DNA Locus on Tomato Chromosome 4

    PubMed Central

    Briza, J.; Carroll, B. J.; Klimyuk, V. I.; Thomas, C. M.; Jones, D. A.; Jones, JDG.

    1995-01-01

    In maize, receptor sites for unlinked transpositions of Activator (Ac) elements are not distributed randomly. To test whether the same is true in tomato, the receptor sites for a Dissociation (Ds) element derived from Ac, were mapped for 26 transpositions unlinked to a donor T-DNA locus on chromosome 4. Four independent transposed Dss mapped to sites on chromosome 4 genetically unlinked to the donor T-DNA, consistent with a preference for transposition to unlinked sites on the same chromosome as opposed to sites on other chromosomes. There was little preference among the nondonor chromosomes, except perhaps for chromosome 2, which carried seven transposed Dss, but these could not be proven to be independent. However, these data, when combined with those from other studies in tomato examining the distribution of transposed Acs or Dss among nondonor chromosomes, suggest there may be absolute preferences for transposition irrespective of the chromosomal location of the donor site. If true, transposition to nondonor chromosomes in tomato would differ from that in maize, where the preference seems to be determined by the spatial arrangement of chromosomes in the interphase nucleus. The tomato lines carrying Ds elements at known locations are available for targeted transposon tagging experiments. PMID:8536985

  18. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major- and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
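As an illustration of the kind of per-pixel computation such quantitative element maps enable, a molar-ratio map such as Mg# = Mg/(Mg+Fe) can be derived from two quantified element maps. This is a hypothetical sketch in the spirit of the QACD/Quack workflow, not the Quack software's actual API:

```python
# Molar masses (g/mol) for converting wt.% maps to molar quantities
MOLAR_MASS = {"Mg": 24.305, "Fe": 55.845}

def molar_ratio_map(mg_map, fe_map):
    """Per-pixel Mg# = Mg/(Mg+Fe) in molar units, from two wt.% maps
    given as nested lists of equal shape."""
    out = []
    for mg_row, fe_row in zip(mg_map, fe_map):
        row = []
        for mg, fe in zip(mg_row, fe_row):
            mg_mol = mg / MOLAR_MASS["Mg"]
            fe_mol = fe / MOLAR_MASS["Fe"]
            total = mg_mol + fe_mol
            row.append(mg_mol / total if total else float("nan"))
        out.append(row)
    return out

# Two pixels: equal molar Mg and Fe, then a Mg-free pixel
mgn = molar_ratio_map([[24.305, 0.0]], [[55.845, 55.845]])
print([round(v, 2) for v in mgn[0]])  # → [0.5, 0.0]
```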

  19. Mapping and validation of quantitative trait loci associated with concentrations of 16 elements in unmilled rice grain

    USDA-ARS?s Scientific Manuscript database

    In this study, quantitative trait loci (QTLs) affecting the concentrations of 16 elements in whole, unmilled rice (Oryza sativa L.) grain were identified. Two rice mapping populations, the ‘Lemont’ x ‘TeQing’ recombinant inbred lines (LT-RILs), and the TeQing-into-Lemont backcross introgression lin...

  20. Improved 206Pb/238U microprobe geochronology by the monitoring of a trace-element-related matrix effect; SHRIMP, ID-TIMS, ELA-ICP-MS and oxygen isotope documentation for a series of zircon standards

    USGS Publications Warehouse

    Black, L.P.; Kamo, S.L.; Allen, C.M.; Davis, D.W.; Aleinikoff, J.N.; Valley, J.W.; Mundil, R.; Campbell, I.H.; Korsch, R.J.; Williams, I.S.; Foudoulis, C.

    2004-01-01

    Precise isotope dilution-thermal ionisation mass spectrometry (ID-TIMS) documentation is given for two new Palaeozoic zircon standards (TEMORA 2 and R33). These data, in combination with results for previously documented standards (AS3, SL13, QGNG and TEMORA 1), provide the basis for a detailed investigation of inconsistencies in 206Pb/238U ages measured by microprobe. Although these ages are normally consistent between any two standards, their relative age offsets are often different from those established by ID-TIMS. This is true for both sensitive high-resolution ion-microprobe (SHRIMP) and excimer laser ablation-inductively coupled plasma-mass spectrometry (ELA-ICP-MS) dating, although the age offsets are in the opposite sense for the two techniques. Various factors have been investigated for possible correlations with age bias, in an attempt to resolve why the accuracy of the method is worse than the indicated precision. Crystallographic orientation, position on the grain-mount and oxygen isotopic composition are unrelated to the bias. There are, however, striking correlations between the 206Pb/238U age offsets and P, Sm and, most particularly, Nd abundances in the zircons. Although these are not believed to be the primary cause of this apparent matrix effect, they indicate that ionisation of 206Pb/238U is influenced, at least in part, by a combination of trace elements. Nd is sufficiently representative of the controlling trace elements that it provides a quantitative means of correcting for the microprobe age bias. This approach has the potential to reduce age biases associated with different techniques, different instrumentation and different standards within and between laboratories. Crown Copyright © 2004 Published by Elsevier B.V. All rights reserved.

  1. Variable angle-of-incidence polarization-sensitive optical coherence tomography: its use to study the 3D collagen structure of equine articular cartilage

    NASA Astrophysics Data System (ADS)

    Ugryumova, Nadya; Gangnus, Sergei V.; Matcher, Stephen J.

    2006-02-01

    Polarization-sensitive optical coherence tomography has been used to spatially map the birefringence of equine articular cartilage. The polar orientation of the collagen fibers relative to the plane of the joint surface must be taken into account if a quantitative measurement of true birefringence is required. Using a series of images taken at different angles of illumination, we determine the fiber polar angle and true birefringence at one site on a sample of equine cartilage, on the assumption that the fibers lie within the plane of imaging. We propose a more general method based on the extended Jones matrix formalism to determine both the polar and azimuthal orientation of the collagen fibers as well as the true birefringence as functions of depth.

  2. Biomimetic wall-shaped hierarchical microstructure for gecko-like attachment.

    PubMed

    Kasem, Haytam; Tsipenyuk, Alexey; Varenberg, Michael

    2015-04-21

    Most biological hairy adhesive systems involved in locomotion rely on spatula-shaped terminal elements, whose operation has been actively studied during the last decade. However, though functional principles underlying their amazing performance are now well understood, due to technical difficulties in manufacturing the complex structure of hierarchical spatulate systems, a biomimetic surface structure featuring true shear-induced dynamic attachment still remains elusive. To try bridging this gap, a novel method of manufacturing gecko-like attachment surfaces is devised based on a laser-micromachining technology. This method overcomes the inherent disadvantages of photolithography techniques and opens wide perspectives for future production of gecko-like attachment systems. Advanced smart-performance surfaces featuring thin-film-based hierarchical shear-activated elements are fabricated and found capable of generating friction force of several tens of times the contact load, which makes a significant step forward towards a true gecko-like adhesive.

  3. Methods of Microcomputer Research in Early Childhood Special Education.

    ERIC Educational Resources Information Center

    Fujiura, Glenn; Johnson, Lawrence J.

    1986-01-01

    The review of some recent studies on use of microcomputers in early childhood special education highlights methodological issues including the qualitative/quantitative distinction and the interdependence of research design and interpretation. Imbedding qualitative methods into quasi- or true-experimental designs can provide more information than…

  4. UK audit of analysis of quantitative parameters from renography data generated using a physical phantom.

    PubMed

    Nijran, Kuldip S; Houston, Alex S; Fleming, John S; Jarritt, Peter H; Heikkinen, Jari O; Skrypniuk, John V

    2014-07-01

    In this second UK audit of quantitative parameters obtained from renography, phantom simulations were used in cases in which the 'true' values could be estimated, allowing the accuracy of the parameters measured to be assessed. A renal physical phantom was used to generate a set of three phantom simulations (six kidney functions) acquired on three different gamma camera systems. A total of nine phantom simulations and three real patient studies were distributed to UK hospitals participating in the audit. Centres were asked to provide results for the following parameters: relative function and time-to-peak (whole kidney and cortical region). As with previous audits, a questionnaire collated information on methodology. Errors were assessed as the root mean square deviation from the true value. Sixty-one centres responded to the audit, with some hospitals providing multiple sets of results. Twenty-one centres provided a complete set of parameter measurements. Relative function and time-to-peak showed a reasonable degree of accuracy and precision in most UK centres. The overall average root mean squared deviation of the results for (i) the time-to-peak measurement for the whole kidney and (ii) the relative function measurement from the true value was 7.7 and 4.5%, respectively. These results showed a measure of consistency in the relative function and time-to-peak that was similar to the results reported in a previous renogram audit by our group. Analysis of audit data suggests a reasonable degree of accuracy in the quantification of renography function using relative function and time-to-peak measurements. However, it is reasonable to conclude that the objectives of the audit could not be fully realized because of the limitations of the mechanical phantom in providing true values for renal parameters.

  5. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
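The simplest of the calibration approaches reviewed, external calibration against pure standards, reduces to a linear fit of signal intensity versus concentration followed by inversion for the unknown. A minimal sketch with hypothetical standards (the concentrations and count rates are illustrative):

```python
def linear_calibration(concs, intensities):
    """Least-squares slope and intercept of an external calibration
    curve (signal intensity vs. standard concentration)."""
    n = len(concs)
    mc = sum(concs) / n
    mi = sum(intensities) / n
    slope = (sum((c - mc) * (i - mi) for c, i in zip(concs, intensities))
             / sum((c - mc) ** 2 for c in concs))
    return slope, mi - slope * mc

def quantify(signal, slope, intercept):
    """Invert the calibration curve to estimate concentration."""
    return (signal - intercept) / slope

# Hypothetical standards: 0, 10, 20 ug/L giving 5, 105, 205 counts/s
slope, intercept = linear_calibration([0.0, 10.0, 20.0], [5.0, 105.0, 205.0])
print(round(quantify(155.0, slope, intercept), 1))  # → 15.0 ug/L
```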

  6. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    USGS Publications Warehouse

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
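The negligible loss implied by the reported rate constant is easy to verify; a minimal sketch assuming simple pseudo-first-order kinetics, C/C0 = exp(−kt), with the rate constant reported for the Illinois No. 6 coal:

```python
import math

K_UPTAKE = 9.7e-7  # s^-1, reported pseudo-first-order rate constant

def sulfur_fraction_remaining(t_seconds, k=K_UPTAKE):
    """Fraction of solubilized elemental sulfur remaining after heating
    for t seconds, assuming pseudo-first-order uptake by the coal."""
    return math.exp(-k * t_seconds)

# Over a 1 h extraction the loss is well under 1%
print(round(sulfur_fraction_remaining(3600), 4))  # → 0.9965
```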

  7. Does Rational Selection of Training and Test Sets Improve the Outcome of QSAR Modeling?

    EPA Science Inventory

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external dataset, the best way to validate the predictive ability of a model is to perform its s...

  8. Cotton defense induction patterns under spatially, temporally and quantitatively varying herbivory levels

    USDA-ARS?s Scientific Manuscript database

    The optimal defense theory (ODT) predicts that plants allocate defense compounds to their tissues depending on its value and the likelihood of herbivore attack. Whereas ODT has been confirmed for static damage levels it remains poorly understood if ODT holds true for defense organization of inducibl...

  9. Fully quantitative pixel-wise analysis of cardiovascular magnetic resonance perfusion improves discrimination of dark rim artifact from perfusion defects associated with epicardial coronary stenosis.

    PubMed

    Ta, Allison D; Hsu, Li-Yueh; Conn, Hannah M; Winkler, Susanne; Greve, Anders M; Shanbhag, Sujata M; Chen, Marcus Y; Patricia Bandettini, W; Arai, Andrew E

    2018-03-08

    Dark rim artifacts in first-pass cardiovascular magnetic resonance (CMR) perfusion images can mimic perfusion defects and affect diagnostic accuracy for coronary artery disease (CAD). We evaluated whether quantitative myocardial blood flow (MBF) can differentiate dark rim artifacts from true perfusion defects in CMR perfusion. Regadenoson perfusion CMR was performed at 1.5 T in 76 patients. Significant CAD was defined by quantitative invasive coronary angiography (QCA) ≥ 50% diameter stenosis. Non-significant CAD (NonCAD) was defined as stenosis by QCA < 50% diameter stenosis or computed tomographic coronary angiography (CTA) < 30% in all major epicardial arteries. Dark rim artifacts had study specific and guideline-based definitions for comparison purposes. MBF was quantified at the pixel-level and sector-level. In a NonCAD subgroup with dark rim artifacts, stress MBF was lower in the subendocardial than midmyocardial and epicardial layers (2.17 ± 0.61 vs. 3.06 ± 0.75 vs. 3.24 ± 0.80 mL/min/g, both p < 0.001) and was also 30% lower than in remote regions (2.17 ± 0.61 vs. 2.83 ± 0.67 mL/min/g, p < 0.001). However, subendocardial stress MBF in dark rim artifacts was 37-56% higher than in true perfusion defects (2.17 ± 0.61 vs. 0.95 ± 0.43 mL/min/g, p < 0.001). Absolute stress MBF differentiated CAD from NonCAD with an accuracy ranging from 86 to 89% (all p < 0.001) using pixel-level analyses. Similar results were seen at a sector level. Quantitative stress MBF is lower in dark rim artifacts than remote myocardium but significantly higher than in true perfusion defects. If confirmed in larger series, this approach may aid the interpretation of clinical stress perfusion exams. ClinicalTrials.gov Identifier: NCT00027170 ; first posted 11/28/2001; updated 11/27/2017.

  10. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. 
    In the per-territory analysis, our results show similar diagnostic accuracy for anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.
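
    The pooling step described above can be illustrated with a minimal sketch: per-study sensitivity and specificity from the extracted 2×2 counts, pooled here by simply summing counts. This is a simplification (a formal review of this kind would fit a bivariate random-effects model), and the two study count vectors below are invented for illustration.

```python
def sens_spec(tp, fp, tn, fn):
    """Per-study sensitivity and specificity from extracted 2x2 counts."""
    return tp / (tp + fn), tn / (tn + fp)

def pooled(studies):
    """Naive pooling by summing counts across studies; a formal meta-analysis
    would use a bivariate random-effects model instead."""
    tp, fp, tn, fn = (sum(s[i] for s in studies) for i in range(4))
    return sens_spec(tp, fp, tn, fn)

# Two invented studies, each as (TP, FP, TN, FN):
studies = [(80, 20, 60, 10), (45, 15, 70, 12)]
sens, spec = pooled(studies)
```

    With these invented counts the pooled sensitivity is 125/147 and the pooled specificity 130/165; real reviews additionally weight studies and model between-study correlation of the two measures.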

  11. Heavy Ion Testing at the Galactic Cosmic Ray Energy Peak

    NASA Technical Reports Server (NTRS)

    Pellish, Jonathan A.; Xapsos, M. A.; LaBel, K. A.; Marshall, P. W.; Heidel, D. F.; Rodbell, K. P.; Hakey, M. C.; Dodd, P. E.; Shaneyfelt, M. R.; Schwank, J. R.; hide

    2009-01-01

    A 1 GeV/u Fe-56 ion beam allows for true 90 deg. tilt irradiations of various microelectronic components and reveals relevant upset trends for an abundant element at the galactic cosmic ray (GCR) flux-energy peak.

  12. Trust It or Trash It?

    MedlinePlus

    Trust It or Trash It? What is Trust It or Trash It? This is a tool ... be true, it may be. (See the second “Trust it” statement above). Click on each element below ...

  13. West Europe Report

    DTIC Science & Technology

    1986-04-10

    elements, based on the somewhat naive belief that political authority would necessarily follow. The first idea was to create a European embryo while...and their own destiny? [Answer] The language of true solidarity in dealing with difficulties, including what this entails in terms of personal

  14. A quantitative study on magnesium alloy stent biodegradation.

    PubMed

    Gao, Yuanming; Wang, Lizhen; Gu, Xuenan; Chu, Zhaowei; Guo, Meng; Fan, Yubo

    2018-06-06

    Insufficient scaffolding time caused by rapid corrosion is the main problem of the magnesium alloy stent (MAS). The finite element method has been used to investigate corrosion of MAS; however, related studies have mostly described all elements as undergoing one-dimensional corrosion. Multi-dimensional corrosion significantly influences the mechanical integrity of MAS structures such as edges and corners. In this study, the effects of multi-dimensional corrosion were quantified experimentally, and a phenomenological corrosion model was then developed to capture these effects. We implemented immersion tests with magnesium alloy (AZ31B) cubes having different numbers of exposed surfaces to analyze the dimensional differences; the cubes also represented the hexahedron elements in simulation. The corrosion rates of the cubes were found to be almost proportional to their exposed-surface numbers, especially when pitting corrosion is not marked. In conclusion, the corrosion rate of each element accelerates with increasing corrosion-surface number in multi-dimensional corrosion, and the damage ratios among elements of the same size are proportional to the ratios of their corrosion-surface numbers under uniform corrosion. The finite element simulation using the proposed model provided more details of the changes in morphology and mechanics over the scaffolding time by removing 25.7% of the elements of the MAS. The proposed corrosion model reflects the effects of multi-dimensionality on corrosion and can be used to predict the degradation process of MAS quantitatively. Copyright © 2018 Elsevier Ltd. All rights reserved.
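
    The model's central assumption, that an element's damage rate scales with its number of exposed corrosion surfaces, can be sketched as follows. The base rate, the linear damage-accumulation rule, and the removal threshold are all illustrative, not the authors' implementation.

```python
def damage_rate(base_rate, n_exposed):
    """Damage rate of an element, assumed proportional to its number of
    exposed (corrosion) surfaces, as the immersion tests indicate."""
    return base_rate * n_exposed

def steps_to_removal(rate, threshold=1.0):
    """Exposure steps until accumulated damage reaches the removal threshold,
    at which point the element would be deleted from the mesh."""
    damage, steps = 0.0, 0
    while damage < threshold:
        damage += rate
        steps += 1
    return steps

# Same-sized face, edge and corner elements expose 1, 2 and 3 surfaces:
face, edge, corner = (damage_rate(0.125, n) for n in (1, 2, 3))
```

    Under this sketch the damage ratios of same-sized elements are exactly the ratios of their corrosion-surface numbers, so corner elements are removed first, reproducing the accelerated loss of edges and corners the abstract describes.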

  15. Metabolic Mapping: Quantitative Enzyme Cytochemistry and Histochemistry to Determine the Activity of Dehydrogenases in Cells and Tissues.

    PubMed

    Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F

    2018-05-26

    Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes, and their activity is heavily regulated at many levels, including the transcriptional, mRNA-stability, translational, post-translational and functional levels. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) provide in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase under investigation performs its enzymatic activity at its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, the technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies.
In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
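
    The quantification step, converting transmitted monochromatic light into formazan absorbance, follows the Beer-Lambert relation; a minimal sketch, with arbitrary intensity values (the actual protocol works on calibrated microscope images):

```python
import math

def absorbance(i_transmitted, i_incident):
    """Absorbance A = -log10(I/I0), proportional to the formazan amount
    and hence to the dehydrogenase activity at that location."""
    return -math.log10(i_transmitted / i_incident)

def mean_absorbance(pixel_intensities, i_incident):
    """Mean absorbance over a region of interest: the quantitative readout
    used to compare activity between cells or treatment groups."""
    return sum(absorbance(p, i_incident) for p in pixel_intensities) / len(pixel_intensities)
```

    For example, a pixel transmitting 10% of the incident light has absorbance 1.0; averaging such values over a region gives the statistic on which group comparisons are based.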

  16. Survey of Army/NASA Rotorcraft Aeroelastic Stability Research

    DTIC Science & Technology

    1988-10-01

    modal analysis of aeroelastic stability of uniform cantilever rotor blades that clearly illustrated the significant influence of the nonlinear bending... reference 8, the Newtonian approach does not necessarily yield a symmetric structural operator and, although the equations from the two methods are not... reference 69 to a true finite-element form so that the generalized coordinates were actual displacements and slopes at the ends of the element. In addition to the

  17. Identification of moving sinusoidal wave loads for sensor structural configuration by finite element inverse method

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Yu, S.

    2018-03-01

    In this paper, a beam structure of composite materials on elastic foundation supports is established as the sensor model, through which moving sinusoidal wave loads propagate. The inverse finite element method (iFEM) is applied to reconstruct the moving wave loads, which are compared with the true wave loads. The results show that iFEM is accurate and robust in determining wave propagation, which supports the search for a suitable new wave-sensing method.

  18. Predictors of the pathogenicity of methicillin-resistant Staphylococcus aureus nosocomial pneumonia.

    PubMed

    Nagaoka, Kentaro; Yanagihara, Katsunori; Harada, Yosuke; Yamada, Koichi; Migiyama, Yohei; Morinaga, Yoshitomo; Izumikawa, Koichi; Kakeya, Hiroshi; Yamamoto, Yoshihiro; Nishimura, Masaharu; Kohno, Shigeru

    2014-05-01

    The clinical characteristics of patients with nosocomial pneumonia (NP) associated with methicillin-resistant Staphylococcus aureus (MRSA) infection are not well characterized. Three hundred and thirty-seven consecutive patients with MRSA isolated from respiratory specimens who attended our hospital between April 2007 and March 2011 were enrolled. The characteristics of patients diagnosed with 'true' MRSA-NP were described with regard to clinical, microbiological and radiological features and the genetic characteristics of the isolates. The diagnosis of 'true' MRSA-NP was confirmed by anti-MRSA treatment effects, Gram staining or bronchoalveolar lavage fluid culture. Thirty-six patients were diagnosed with 'true' MRSA-NP, whereas 34 were diagnosed with NP with MRSA colonization. Patients with 'true' MRSA-NP more often had a Pneumonia Patient Outcomes Research Team score of 5 (58.3% vs 23.5%), single cultivation of MRSA (83.3% vs 38.2%), MRSA quantitative cultivation yielding more than 10^6 CFU/mL (80.6% vs 47.1%), radiological findings other than lobar pneumonia (66.7% vs 26.5%), and a history of head, neck, oesophageal or stomach surgery (30.6% vs 11.8%). These factors were shown to be independent predictors of the pathogenicity of 'true' MRSA-NP by multivariate analysis (P < 0.05). 'True' MRSA-NP shows distinct clinical and radiological features from NP with MRSA colonization. © 2014 Asian Pacific Society of Respirology.
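
    As an illustration of how such predictors are screened before multivariate modeling, an unadjusted odds ratio can be computed from a 2×2 table. The counts below are reconstructed by rounding the reported proportions (83.3% of 36 vs. 38.2% of 34 for single cultivation of MRSA) and are not taken from the paper's tables.

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio (a/b)/(c/d): a, b = predictor present/absent
    among cases; c, d = predictor present/absent among controls."""
    return (a / b) / (c / d)

# Single cultivation of MRSA: ~30/36 (83.3%) in 'true' MRSA-NP
# vs. ~13/34 (38.2%) in colonization (reconstructed, illustrative):
or_single = odds_ratio(30, 6, 13, 21)
```

    An odds ratio well above 1 (here roughly 8) is consistent with the predictor being retained in the multivariate analysis, though the published model additionally adjusts for the other factors.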

  19. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971
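
    One of the calibration approaches the review discusses, external calibration with pure standards (optionally ratioed to an internal standard), reduces to fitting a line of measured intensity against known concentration and inverting it for unknowns. A minimal sketch with invented numbers:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Analyte/internal-standard intensity ratio vs. known concentration
# for a set of pure standards (all numbers invented for illustration):
conc = [0.0, 1.0, 2.0, 5.0]          # ng/mL
ratio = [0.01, 0.52, 1.01, 2.49]     # measured counts ratio
m, b = fit_line(conc, ratio)

def quantify(sample_ratio):
    """Back-calculate an unknown's concentration from its intensity ratio."""
    return (sample_ratio - b) / m
```

    Matrix-matched standards and certified reference materials follow the same arithmetic; the difference lies in how well the standards mimic the sample matrix and thus how trustworthy the fitted slope is.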

  20. Determination of trace element mineral/liquid partition coefficients in melilite and diopside by ion and electron microprobe techniques

    NASA Technical Reports Server (NTRS)

    Kuehner, S. M.; Laughlin, J. R.; Grossman, L.; Johnson, M. L.; Burnett, D. S.

    1989-01-01

    The applicability of the ion microprobe (IMP) to quantitative analysis of minor elements (Sr, Y, Zr, La, Sm, and Yb) in the major phases present in natural Ca-, Al-rich inclusions (CAIs) was investigated by comparing IMP results with those of an electron microprobe (EMP). Results on three trace-element-doped glasses indicated that precise quantitative analysis by IMP is not possible if there are large differences in SiO2 content between the standards used to derive the ion yields and the unknowns.
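
    The ion-yield transfer that the study found to be matrix-sensitive can be sketched as a relative-sensitivity-factor (RSF) quantification. RSF conventions vary between laboratories, so this is an illustration only, with invented values:

```python
def rsf_from_standard(i_element, i_reference, c_element, c_reference):
    """RSF measured on a standard of known composition: the element/reference
    intensity ratio normalized by the true concentration ratio."""
    return (i_element / i_reference) / (c_element / c_reference)

def quantify_unknown(i_element, i_reference, c_reference, rsf):
    """Concentration in an unknown via the standard-derived RSF. Valid only
    when the unknown's matrix (e.g. SiO2 content) matches the standard's --
    the failure mode reported in the study above."""
    return (i_element / i_reference) / rsf * c_reference
```

    The round trip (measure an RSF on a standard, then re-quantify that same material) is exact by construction; the study's point is that applying the RSF across a large SiO2 contrast breaks this assumption.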

  1. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E0 ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental mineral monazite sample from Malaysia, the sensitivity of REIXE is compared with well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules picked up from the Pacific deep sea is discussed; it showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for aliquots of Pt did not show any measurable content within an upper limit of 250 ppm.

  2. Semi-quantitative spectrographic analysis and rank correlation in geochemistry

    USGS Publications Warehouse

    Flanagan, F.J.

    1957-01-01

    The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements. ?? 1957.
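
    A minimal sketch of r_s for grouped (tied) data, using average ranks, which is what makes the coefficient applicable to semi-quantitative spectrographic classes; the input vectors are illustrative:

```python
def ranks(values):
    """Average ranks; tied values (grouped reporting classes) share a rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Rank correlation r_s: Pearson correlation of the two rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Because only ranks enter the computation, r_s requires no assumption of bivariate normality, which is exactly why it suits grouped or partly ranked spectrographic reports.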

  3. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model

    PubMed Central

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software “Kongoh” for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1–4 persons’ contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI’s contribution in true contributors and non-contributors by using 2–4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI’s contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples. PMID:29149210
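
    The core quantity reported, the likelihood ratio, compares the probability of the observed mixture profile under the two competing hypotheses, with independent loci multiplying. A minimal conceptual sketch with invented probabilities; the continuous model itself integrates peak heights, drop-out and artifacts, which is far beyond this illustration:

```python
import math

def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """LR for a POI's contribution: probability of the evidence assuming the
    POI is a contributor vs. assuming the POI is not a contributor."""
    return p_e_given_hp / p_e_given_hd

def combined_lr(per_locus_lrs):
    """Assuming independent loci (e.g. 15 STRs), per-locus LRs multiply;
    summing log10 values keeps the product numerically stable."""
    return 10 ** sum(math.log10(lr) for lr in per_locus_lrs)
```

    An overall LR well above 1 supports the POI's contribution, and an LR well below 1 supports exclusion, which is how the true-contributor and non-contributor validation tests above are read.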

  4. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    PubMed

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  5. Bℓ4 decays and the extraction of |Vub|

    NASA Astrophysics Data System (ADS)

    Kang, Xian-Wei; Kubis, Bastian; Hanhart, Christoph; Meißner, Ulf-G.

    2014-03-01

    The Cabibbo-Kobayashi-Maskawa matrix element |Vub| is not well determined yet. It can be extracted from both inclusive and exclusive decays, like B→π(ρ)ℓν̄ℓ. However, the exclusive determination from B→ρℓν̄ℓ, in particular, suffers from a large model dependence. In this paper, we propose to extract |Vub| from the four-body semileptonic decay B→ππℓν̄ℓ, where the form factors for the pion-pion system are treated in dispersion theory. This is a model-independent approach that takes into account the ππ rescattering effects, as well as the effect of the ρ meson. We demonstrate that both finite-width effects of the ρ meson as well as scalar ππ contributions can be considered completely in this way.

  6. 75 FR 11641 - Community Reinvestment Act; Interagency Questions and Answers Regarding Community Reinvestment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-11

    ... lending test applicable to large institutions, its content may also be applicable to, for example, small... the guidance that describes the lending, investment, and service tests. The Questions and Answers are... received total quantitative CRA consideration. Although this is true if the express, bona fide intent of...

  7. Born to Burnout: A Meta-Analytic Path Model of Personality, Job Burnout, and Work Outcomes

    ERIC Educational Resources Information Center

    Swider, Brian W.; Zimmerman, Ryan D.

    2010-01-01

    We quantitatively summarized the relationship between Five-Factor Model personality traits, job burnout dimensions (emotional exhaustion, depersonalization, and personal accomplishment), and absenteeism, turnover, and job performance. All five of the Five-Factor Model personality traits had multiple true score correlations of 0.57 with emotional…
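
    The "true score correlations" reported in such meta-analyses correct observed correlations for measurement unreliability in both variables; a minimal sketch of the attenuation formula, with illustrative reliability values:

```python
def true_score_r(observed_r, rel_x, rel_y):
    """Disattenuated (true score) correlation: the observed correlation
    divided by the square root of the product of the two reliabilities,
    as used in psychometric meta-analysis."""
    return observed_r / (rel_x * rel_y) ** 0.5
```

    For example, an observed r of 0.40 with reliabilities of 0.80 on both measures corresponds to a true score correlation of 0.50; the 0.57 quoted above is a multiple correlation built from such corrected values.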

  8. Modeling Dynamic Functional Neuroimaging Data Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Price, Larry R.; Laird, Angela R.; Fox, Peter T.; Ingham, Roger J.

    2009-01-01

    The aims of this study were to present a method for developing a path analytic network model using data acquired from positron emission tomography. Regions of interest within the human brain were identified through quantitative activation likelihood estimation meta-analysis. Using this information, a "true" or population path model was then…

  9. Effect of Root Filling on Stress Distribution in Premolars with Endodontic-Periodontal Lesion: A Finite Elemental Analysis Study.

    PubMed

    Belli, Sema; Eraslan, Oğuz; Eskitascioglu, Gürcan

    2016-01-01

    Endodontic-periodontal (EP) lesions require both endodontic and periodontal therapies. Impermeable sealing of the root canal system after cleaning and shaping is essential for a successful endodontic treatment. However, complete healing of the hard and soft tissue lesions takes time, and diseased bone, periodontal ligament, and tooth fibrous joints are reported to have an increased failure risk for a given load. Considering that EP lesions may affect the biomechanics of teeth, this finite elemental analysis study aimed to test the effect of root fillings on stress distribution in premolars with EP lesions. Three finite elemental analysis models representing 3 different types of EP lesions (primary endodontic disease [PED], PED with secondary periodontic involvement, and true combined) were created. The root canals were assumed as nonfilled or filled with gutta-percha, gutta-percha/apical mineral trioxide aggregate (MTA) plug, and MTA-based sealer. Materials used were assumed to be homogenous and isotropic. A 300-N load was applied from the buccal cusp of the crown with a 135° angle. The Cosmoworks structural-analysis program (SolidWorks Corp, Waltham, MA) was used for analysis. Results were presented considering von Mises criteria. Stresses at the root apex increased with an increase in lesion dimensions. Root filling did not affect stress distribution in the PED model. An MTA plug or MTA-based sealer created more stress areas within the root compared with the others in the models representing PED with periodontic involvement and true combined lesions. Stresses at the apical end of the root increase with increases in lesion dimensions. MTA-based sealers or an MTA plug creates more stresses when there is periodontic involvement or a true combined lesion. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
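
    The von Mises criterion used to present these results condenses the three principal stresses at a point into one equivalent stress; a minimal sketch (stress values in MPa are illustrative):

```python
def von_mises(s1, s2, s3):
    """Von Mises equivalent stress from the three principal stresses:
    sqrt(((s1-s2)^2 + (s2-s3)^2 + (s3-s1)^2) / 2)."""
    return (((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2) ** 0.5
```

    Uniaxial loading (one nonzero principal stress) returns that stress unchanged, while pure hydrostatic states return zero, which is why the criterion highlights the distortion-driven stress concentrations discussed above.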

  10. X-Ray and UV Photoelectron Spectroscopy | Materials Science | NREL

    Science.gov Websites

    backsheet material, showing excellent quantitative agreement between measured and predicted peak area ratios. Subtle differences in polymer functionality are assessed by deviations from stoichiometry. Elemental Analysis Uses quantitative identification

  11. Determination of the rCBF in the Amygdala and Rhinal Cortex Using a FAIR-TrueFISP Sequence

    PubMed Central

    Martirosian, Petros; Klose, Uwe; Nägele, Thomas; Schick, Fritz; Ernemann, Ulrike

    2011-01-01

    Objective Brain perfusion can be assessed non-invasively by modern arterial spin labeling MRI. The FAIR (flow-sensitive alternating inversion recovery)-TrueFISP (true fast imaging with steady-state precession) technique was applied for regional assessment of cerebral blood flow in brain areas close to the skull base, since this approach provides low sensitivity to magnetic susceptibility effects. The investigation of the rhinal cortex and the amygdala is a potentially important feature for the diagnosis of and research on dementia in its early stages. Materials and Methods Twenty-three subjects with no structural or psychological impairment were investigated. FAIR-TrueFISP quantitative perfusion data were evaluated in the amygdala on both sides and in the pons. A radiofrequency FOCI (frequency offset corrected inversion) pulse was used for slice-selective inversion. After a time delay of 1.2 sec, data acquisition began. Imaging slice thickness was 5 mm and the inversion slab thickness for slice-selective inversion was 12.5 mm. The image matrix size for perfusion images was 64 × 64 with a field of view of 256 × 256 mm, resulting in a spatial resolution of 4 × 4 × 5 mm. Repetition time was 4.8 ms; echo time was 2.4 ms. Acquisition time for the 50 sets of FAIR images was 6:56 min. Data were compared with perfusion data from the literature. Results Perfusion values in the right amygdala, left amygdala and pons were 65.2 (± 18.2) mL/100 g/minute, 64.6 (± 21.0) mL/100 g/minute, and 74.4 (± 19.3) mL/100 g/minute, respectively. These values were higher than formerly published data using continuous arterial spin labeling but similar to 15O-PET (oxygen-15 positron emission tomography) data. Conclusion The FAIR-TrueFISP approach is feasible for the quantitative assessment of perfusion in the amygdala.
The applied technique provided excellent image quality, even for brain regions located at the skull base in the vicinity of marked susceptibility steps. PMID:21927556
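
    Pulsed-ASL quantification of this kind is commonly reduced to a single-compartment formula relating the label-control difference signal to CBF. The sketch below uses that generic model with typical literature parameter values (inversion delay, blood T1, inversion efficiency, partition coefficient), which are assumptions here, not the values used in this study:

```python
import math

def cbf_pasl(delta_m, m0, ti=1.2, t1_blood=1.65, alpha=0.95, lam=0.9):
    """Simplified single-compartment pulsed-ASL model:
    CBF (mL/100 g/min) = lam * dM / (2 * alpha * M0 * TI * exp(-TI/T1b)),
    converted from per-second per-gram units. delta_m and m0 are the
    difference and equilibrium signals; all defaults are illustrative."""
    f_per_s_per_g = (lam * delta_m) / (2 * alpha * m0 * ti * math.exp(-ti / t1_blood))
    return f_per_s_per_g * 60 * 100  # s^-1 g^-1 -> mL/100 g/min
```

    The model is linear in the difference signal, so doubling the measured ΔM doubles the estimated CBF; absolute values depend strongly on the assumed T1 of blood and inversion efficiency.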

  12. Signatures of Evolutionary Adaptation in Quantitative Trait Loci Influencing Trace Element Homeostasis in Liver

    PubMed Central

    Sabidó, Eduard; Bosch, Elena

    2016-01-01

    Essential trace elements possess vital functions at molecular, cellular, and physiological levels in health and disease, and they are tightly regulated in the human body. In order to assess variability and potential adaptive evolution of trace element homeostasis, we quantified 18 trace elements in 150 liver samples, together with the expression levels of 90 genes and abundances of 40 proteins involved in their homeostasis. Additionally, we genotyped 169 single nucleotide polymorphisms (SNPs) in the same sample set. We detected significant associations for 8 protein quantitative trait loci (pQTLs), 10 expression quantitative trait loci (eQTLs), and 15 micronutrient quantitative trait loci (nutriQTLs). Six of these exceeded the false discovery rate cutoff and were related to essential trace elements: 1) one pQTL for GPX2 (rs10133290); 2) two previously described eQTLs for HFE (rs12346) and SELO (rs4838862) expression; and 3) three nutriQTLs: the pathogenic C282Y mutation at HFE affecting iron (rs1800562), and two SNPs within several clustered metallothionein genes determining selenium concentration (rs1811322 and rs904773). Within the complete set of significant QTLs (which involved 30 SNPs and 20 gene regions), we identified 12 SNPs with extreme patterns of population differentiation (FST values in the top 5% in at least one HapMap population pair) and significant evidence for selective sweeps involving QTLs at GPX1, SELENBP1, GPX3, SLC30A9, and SLC39A8. Overall, this detailed study of various molecular phenotypes illustrates the role of regulatory variants in explaining differences in trace element homeostasis among populations and in the human adaptive response to environmental pressures related to micronutrients. PMID:26582562
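
    The population-differentiation statistic used above, FST, can be sketched for a biallelic SNP between two equally sized populations as the deficit of subpopulation heterozygosity relative to the total; allele frequencies below are illustrative:

```python
def fst_two_pops(p1, p2):
    """Wright's F_ST for a biallelic SNP between two equally sized
    populations: (H_T - H_S) / H_T, with H = 2p(1-p) and H_S the mean
    of the two subpopulation heterozygosities."""
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    return (h_t - h_s) / h_t
```

    Identical frequencies give FST = 0 and fixed alternative alleles give FST = 1; SNPs whose values fall in the top tail of the genome-wide distribution are the candidates for local adaptation flagged in the study.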

  13. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    PubMed

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    Quantitative computed tomography-based finite element modeling is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models have been created from image datasets with different image voxel sizes. The aim of this study was to investigate whether image voxel size influences the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs. 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs. 0.18 × 0.18 × 0.3 mm³). Finite element models with cuboid volumes of interest extracted from the vertebral cancellous part were created, and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size was detected on the apparent bone mineral density for either the in situ or the in vitro case. However, the apparent modulus and yield strength showed significant differences between the two voxel sizes in both group pairs (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications. © IMechE 2014.
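
    The apparent modulus compared across voxel sizes is the standard continuum measure from a simulated axial compression: apparent stress over apparent strain on the cuboid volume of interest. A minimal sketch with invented load and geometry values:

```python
def apparent_modulus(force_n, area_mm2, delta_mm, length_mm):
    """Apparent modulus (MPa) of a cancellous-bone cuboid under simulated
    axial compression: (F/A) / (dL/L), i.e. apparent stress over strain."""
    return (force_n / area_mm2) / (delta_mm / length_mm)
```

    For example, 100 N on a 100 mm² face compressing a 10 mm specimen by 0.01 mm gives an apparent modulus of 1000 MPa; voxel-size effects enter through the reaction force the finite element model predicts for a prescribed displacement.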

  14. Road sign recognition during computer testing versus driving simulator performance for stroke and stroke+aphasia groups.

    DOT National Transportation Integrated Search

    2015-07-01

    Driving is essential to maintaining independence. For most Americans preserving personal mobility is a : key element to retaining jobs, friends, activities and the basic necessities to maintain a household. This : is particularly true for older peopl...

  15. Crustal forensics in arc magmas

    NASA Astrophysics Data System (ADS)

    Davidson, Jon P.; Hora, John M.; Garrison, Jennifer M.; Dungan, Michael A.

    2005-01-01

    The geochemical characteristics of continental crust are present in nearly all arc magmas. These characteristics may reflect a specific source process, such as fluid fluxing, common to both arc magmas and the continental crust, and/or may reflect the incorporation of continental crust into arc magmas either at source via subducted sediment, or via contamination during differentiation. Resolving the relative mass contributions of juvenile, mantle-derived material, versus that derived from pre-existing crust of the upper plate, and providing these estimates on an element-by-element basis, is important because: (1) we want to constrain crustal growth rates; (2) we want to quantitatively track element cycling at convergent margins; and (3) we want to determine the origin of economically important elements and compounds. Traditional geochemical approaches for determining the contributions of various components to arc magmas are particularly successful when applied on a comparative basis. Studies of suites from multiple magmatic systems along arcs, for which differentiation effects can be individually constrained, can be used to extrapolate to potential source compositions. In the Lesser Antilles Arc, for example, differentiation trends from individual volcanoes are consistent with open-system evolution. However, such trends do not project back to a common primitive magma composition, suggesting that differentiation modifies magmas that were derived from distinct mantle sources. We propose that such approaches should now be complemented by petrographically constrained mineral-scale isotope and trace element analysis to unravel the contributing components to arc magmas. 
    This innovative approach can: (1) better constrain true end-member compositions by returning wider ranges in geochemical compositions among constituent minerals than is found in whole rocks; (2) better determine magmatic evolution processes from core-rim isotopic or trace element profiles from the phases contained in magmas; and (3) constrain rates of differentiation by applying diffusion-controlled timescales to element profiles. An example from Ngauruhoe Volcano, New Zealand, underscores the importance of such a microsampling approach, showing that mineral isotopic compositions encompass wide ranges, that whole-rock isotopic compositions are consequently simply element-weighted averages of the heterogeneous crystal cargo, and that open-system evolution is proved by core-rim variations in Sr isotope ratios. Ngauruhoe is just one of many systems examined through microanalytical approaches. The overwhelming conclusion of these studies is that crystal cargoes are not truly phenocrystic, but are inherited from various sources. The implication of this realization is that the interpretation of whole-rock isotopic data, including the currently popular U-series, needs careful evaluation in the context of petrographic observations.
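
    The statement that whole-rock isotopic compositions are element-weighted averages of the crystal cargo is a simple mass balance; a minimal sketch with invented mass fractions, Sr concentrations and 87Sr/86Sr ratios:

```python
def whole_rock_ratio(mass_fractions, concentrations, ratios):
    """Whole-rock isotope ratio as the element-weighted average of its
    constituent phases: sum(f*C*R) / sum(f*C), where f is the phase's mass
    fraction, C its concentration of the element (e.g. Sr, in ppm) and R
    its isotope ratio."""
    num = sum(f * c * r for f, c, r in zip(mass_fractions, concentrations, ratios))
    den = sum(f * c for f, c in zip(mass_fractions, concentrations))
    return num / den
```

    A Sr-rich phase pulls the bulk ratio toward its own value even at equal mass fraction, which is why whole-rock data can mask the wide mineral-scale ranges that microsampling reveals.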

  16. Compositional Diversity of the Vestan Regolith Derived from Howardite Compositions and Dawn VIR Spectra

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, D. W.; Ammannito, E.; Hiroi, T.; DeAngeles, S.; Moriarty, D. P.; DiIorio, T.; Pieters, C. M.; DeSanctis, M. C.

    2014-01-01

    Howardite, eucrite and diogenite meteorites likely come from asteroid 4 Vesta [1]. Howardites - physical mixtures of eucrites and diogenites - are of two subtypes: regolithic howardites were gardened in the true regolith; fragmental howardites are simple polymict breccias [2]. The Dawn spacecraft imaged the howarditic surface of Vesta with the visible and infrared mapping spectrometer (VIR) resulting in qualitative maps of the distributions of distinct diogenite-rich and eucrite-rich terranes [3, 4]. We are developing a robust basis for quantitative mapping of the distribution of lithologic types using spectra acquired on splits of well-characterized howardites [5, 6]. Spectra were measured on sample powders sieved to <75 µm in the laboratories of the Istituto di Astrofisica e Planetologia Spaziali and Brown University. Data reduction was done using the methods developed to process Dawn VIR spectra [4]. The band parameters for the 1 and 2 µm pyroxene absorption features (hereafter BI and BII) can be directly compared to Dawn VIR results. Regolithic howardites have shallower BI and BII absorptions compared to fragmental howardites with similar compositions. However, there are statistically significant correlations between Al or Ca contents and BI or BII center wavelengths regardless of howardite subtype. Diogenites are poor in Al and Ca while eucrites are rich in these elements. The laboratory spectra can thus be directly correlated with the percentage of eucrite material contained in the howardites. We are using these correlations to quantitatively map Al and Ca distributions, and thus the percentage of eucritic material, in the current regolith of Vesta.
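
    Since diogenites are poor in Al and Ca while eucrites are rich in them, the correlation described above amounts to a linear two-endmember mixing estimate. The sketch below assumes such a linear mix; the endmember Ca contents are invented placeholders, not the calibration values of this study:

```python
def eucrite_fraction(ca_sample, ca_diogenite=0.5, ca_eucrite=7.5):
    """Eucrite mass fraction of a howardite from a linear two-endmember mix
    of Ca contents (wt%). Endmember values here are illustrative only."""
    return (ca_sample - ca_diogenite) / (ca_eucrite - ca_diogenite)
```

    A sample at either endmember composition returns a fraction of 0 or 1, and intermediate Ca contents map linearly onto the percentage of eucritic material, which is the quantity being mapped across the Vestan regolith.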

  17. Correlations between axial stiffness and microstructure of a species of bamboo

    PubMed Central

    Mannan, Sayyad; Paul Knox, J.

    2017-01-01

    Bamboo is a ubiquitous monocotyledonous flowering plant and is a member of the true grass family Poaceae. In many parts of the world, it is widely used as a structural material especially in scaffolding and buildings. In spite of its wide use, there is no accepted methodology for standardizing a species of bamboo for a particular structural purpose. The task of developing structure–property correlations is complicated by the fact that bamboo is a hierarchical material whose structure at the nanoscopic level is not very well explored. However, we show that as far as stiffness is concerned, it is possible to obtain reliable estimates of important structural properties like the axial modulus from the knowledge of certain key elements of the microstructure. Stiffness of bamboo depends most sensitively on the size and arrangement of the fibre sheaths surrounding the vascular bundles and the arrangement of crystalline cellulose microfibrils in their secondary cell walls. For the species of bamboo studied in this work, we have quantitatively determined the radial gradation that the arrangement of fibres renders to the structure. The arrangement of the fibres gives bamboo a radially graded property variation across its cross section. PMID:28280545

  18. Flow studies in canine artery bifurcations using a numerical simulation method.

    PubMed

    Xu, X Y; Collins, M W; Jones, C J

    1992-11-01

    Three-dimensional flows through canine femoral bifurcation models were predicted under physiological flow conditions by solving numerically the time-dependent three-dimensional Navier-Stokes equations. In the calculations, two models were assumed for the blood, those of (a) a Newtonian fluid, and (b) a non-Newtonian fluid obeying the power law. The blood vessel wall was assumed to be rigid, this being the only approximation in the prediction model. The numerical procedure utilized a finite volume approach on a finite element mesh to discretize the equations, and the code used (ASTEC) incorporated the SIMPLE velocity-pressure algorithm in performing the calculations. The predicted velocity profiles were in good qualitative agreement with the in vivo measurements recently obtained by Jones et al. The non-Newtonian effects on the bifurcation flow field were also investigated, and no great differences in velocity profiles were observed. This indicated that the non-Newtonian characteristics of the blood might not be an important factor in determining the general flow patterns for these bifurcations, but could have local significance. Current work involves modeling wall distensibility in an empirically valid manner. Predictions accommodating these will permit a true quantitative comparison with experiment.
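    The power-law (Ostwald-de Waele) blood model of case (b) above can be sketched in a few lines; the consistency index k and exponent n below are illustrative shear-thinning values for blood, not parameters reported in the study:

```python
def apparent_viscosity(shear_rate, k=0.017, n=0.708):
    """Power-law (Ostwald-de Waele) model: mu_app = k * gamma_dot**(n - 1).

    With n < 1 the fluid is shear-thinning; k and n here are illustrative
    values for blood, not fitted parameters from the study."""
    return k * shear_rate ** (n - 1)

# Shear-thinning behavior: apparent viscosity falls as shear rate rises.
low_shear = apparent_viscosity(10.0)
high_shear = apparent_viscosity(300.0)
assert high_shear < low_shear
```

    In a finite-volume solver this relation simply replaces the constant Newtonian viscosity cell by cell, using the locally computed shear rate.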

  19. A method for testing whether model predictions fall within a prescribed factor of true values, with an application to pesticide leaching

    USGS Publications Warehouse

    Parrish, Rudolph S.; Smith, Charles N.

    1990-01-01

    A quantitative method is described for testing whether model predictions fall within a specified factor of true values. The technique is based on classical theory for confidence regions on unknown population parameters and can be related to hypothesis testing in both univariate and multivariate situations. A capability index is defined that can be used as a measure of predictive capability of a model, and its properties are discussed. The testing approach and the capability index should facilitate model validation efforts and permit comparisons among competing models. An example is given for a pesticide leaching model that predicts chemical concentrations in the soil profile.
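    The paper's test is built on confidence regions, which are not reproduced here. As a minimal sketch of the underlying criterion only: a prediction lies within a factor f of the true value exactly when |log(pred/true)| <= log(f):

```python
import math

def within_factor(predicted, observed, factor=2.0):
    """Illustrative check (not the paper's confidence-region test):
    every prediction must satisfy 1/factor <= pred/obs <= factor,
    i.e. |log(pred/obs)| <= log(factor)."""
    return all(abs(math.log(p / o)) <= math.log(factor)
               for p, o in zip(predicted, observed))

assert within_factor([1.2, 0.6], [1.0, 1.0], factor=2.0)
assert not within_factor([3.0], [1.0], factor=2.0)
```

    The log transform makes over- and under-prediction by the same factor symmetric, which is why factor-of-f criteria are usually stated on a log scale.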

  20. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike those of its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear, the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. 
However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead dose-response modelling would be used in the experimental range to calculate an LED10* (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10* to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10*-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
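    The 'LED10*-safety factor' arithmetic described above is a simple division of the lower-bound dose by a product of safety factors. A hedged sketch, with illustrative factor values that are assumptions here, not USEPA defaults endorsed by the article:

```python
def acceptable_exposure(led10, safety_factors=(10, 10)):
    """LED10*-safety-factor style calculation: divide the statistical lower
    bound on the 10%-effect dose by a product of safety factors (e.g. one
    for interspecies and one for intraspecies variability).  The factor
    values are illustrative assumptions only."""
    level = led10
    for f in safety_factors:
        level /= f
    return level

# An LED10* of 100 dose units with two tenfold factors gives 1.0.
assert acceptable_exposure(100.0, (10, 10)) == 1.0
```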

  1. Quantitative, spectrally-resolved intraoperative fluorescence imaging

    PubMed Central

    Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.

    2012-01-01

    Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935

  2. Method for quantitative determination and separation of trace amounts of chemical elements in the presence of large quantities of other elements having the same atomic mass

    DOEpatents

    Miller, C.M.; Nogar, N.S.

    1982-09-02

    Photoionization via autoionizing atomic levels combined with conventional mass spectroscopy provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single photon process permits greater photon utilization efficiency because of its greater absorption cross section than bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.

  3. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    NASA Astrophysics Data System (ADS)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgy quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out combined correction of plasma temperature and spectral intensity by using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for major elements Cu and Zn and minor element Pb in the copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.

  4. Propeller theory of Professor Joukowski and his pupils

    NASA Technical Reports Server (NTRS)

    Margoulis, W

    1922-01-01

    This report gives a summary of the work done in Russia from 1911 to 1914, by Professor Joukowski and his pupils. This summary will show that these men were the true originators of the theory, which combines the theory of the wing element and of the slipstream.

  5. Flexible Retrieval: When True Inferences Produce False Memories

    ERIC Educational Resources Information Center

    Carpenter, Alexis C.; Schacter, Daniel L.

    2017-01-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave…

  6. 14 CFR 1214.701 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Definitions. 1214.701 Section 1214.701 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT The Authority of the Space Shuttle Commander § 1214.701 Definitions. (a) Space Shuttle Elements consists of the Orbiter, an External...

  7. Parental Support for Teenage Pregnancy Prevention Programmes in South Carolina Public Middle Schools

    ERIC Educational Resources Information Center

    Rose, India; Prince, Mary; Flynn, Shannon; Kershner, Sarah; Taylor, Doug

    2014-01-01

    Teenage pregnancy is a major public health issue in the USA; this is especially true in the state of South Carolina (SC). Research shows that well developed, good-quality teenage pregnancy prevention (TPP) programmes can be effective in modifying young people's sexual behaviour. While several quantitative studies have examined parents' perceptions…

  8. Perceptions of Quantitative Methods in Higher Education: Mapping Student Profiles

    ERIC Educational Resources Information Center

    Ramos, Madalena; Carvalho, Helena

    2011-01-01

    A number of studies have concluded that when students have greater confidence about their math skills and are aware of its usefulness, they have a more positive perception of the subject. This article aims to examine whether this pseudo linear trend in the relationship between affective and instrumental dimensions is also true of the university…

  9. Identification of true EST alignments for recognising transcribed regions.

    PubMed

    Ma, Chuang; Wang, Jia; Li, Lun; Duan, Mo-Jie; Zhou, Yan-Hong

    2011-01-01

    Transcribed regions can be determined by aligning Expressed Sequence Tags (ESTs) with genome sequences. The kernel of this strategy is to effectively distinguish true EST alignments from spurious ones. In this study, three measures including Direction Check, Identity Check and Terminal Check were introduced to more effectively eliminate spurious EST alignments. On the basis of these introduced measures and other widely used measures, a computational tool, named ESTCleanser, has been developed to identify true EST alignments for obtaining reliable transcribed regions. The performance of ESTCleanser has been evaluated on the well-annotated human ENCyclopedia of DNA Elements (ENCODE) regions using human ESTs in the dbEST database. The evaluation results show that the accuracy of ESTCleanser at exon and intron levels is markedly higher than that of UCSC-spliced EST alignments. This work would be helpful to EST-based research on finding new genes, complementing genome annotation, recognising alternative splicing events and Single Nucleotide Polymorphisms (SNPs), etc.

  10. ICP-MS as a novel detection system for quantitative element-tagged immunoassay of hidden peanut allergens in foods.

    PubMed

    Careri, Maria; Elviri, Lisa; Mangia, Alessandro; Mucchino, Claudio

    2007-03-01

    A novel ICP-MS-based ELISA immunoassay via element-tagged determination was devised for quantitative analysis of hidden allergens in food. The method was able to detect low amounts of peanuts (down to approximately 2 mg peanuts kg(-1) cereal-based matrix) by using a europium-tagged antibody. Selectivity was proved by the lack of detectable cross-reaction with a number of protein-rich raw materials.

  11. Artificial neural networks applied to quantitative elemental analysis of organic material using PIXE

    NASA Astrophysics Data System (ADS)

    Correa, R.; Chesta, M. A.; Morales, J. R.; Dinator, M. I.; Requena, I.; Vila, I.

    2006-08-01

    An artificial neural network (ANN) has been trained with real-sample PIXE (particle-induced X-ray emission) spectra of organic substances. Following the training stage, the ANN was applied to a subset of similar samples, thus obtaining the elemental concentrations in muscle, liver and gills of Cyprinus carpio. Concentrations obtained with the ANN method are in full agreement with results from one standard analytical procedure, showing the high potential of ANNs in quantitative PIXE analysis.

  12. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    PubMed

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

    This paper presents a method for embedding realistic defect geometries of a fiber reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling of this interaction is key to improving our understanding and way of interpreting the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with such defect structures as pores, cracks or delamination, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image based reconstruction process which converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries uses a multi-scale and multi-physics simulation approach which results in quantitative A-scan ultrasonic signals which can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion and simultaneous modeling of the attached circuit is applied. Comparison between simulated and experimental signals provides very good agreement in electrical voltage amplitude and signal arrival time and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the disturbance of the waves takes place and finally allows more realistic modeling of A-scans. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Investigation of the "true" extraction recovery of analytes from multiple types of tissues and its impact on tissue bioanalysis using two model compounds.

    PubMed

    Yuan, Long; Ma, Li; Dillon, Lisa; Fancher, R Marcus; Sun, Huadong; Zhu, Mingshe; Lehman-McKeeman, Lois; Aubry, Anne-Françoise; Ji, Qin C

    2016-11-16

    LC-MS/MS has been widely applied to the quantitative analysis of tissue samples. However, one key remaining issue is that the extraction recovery of analyte from spiked tissue calibration standard and quality control samples (QCs) may not accurately represent the "true" recovery of analyte from incurred tissue samples. This may affect the accuracy of LC-MS/MS tissue bioanalysis. Here, we investigated whether the recovery determined using tissue QCs by LC-MS/MS can accurately represent the "true" recovery from incurred tissue samples using two model compounds: BMS-986104, an S1P1 receptor modulator drug candidate, and its phosphate metabolite, BMS-986104-P. We first developed a novel acid- and surfactant-assisted protein precipitation method for the extraction of BMS-986104 and BMS-986104-P from rat tissues, and determined their recoveries using tissue QCs by LC-MS/MS. We then used radioactive incurred samples from rats dosed with 3H-labeled BMS-986104 to determine the absolute total radioactivity recovery in six different tissues. The recoveries determined using tissue QCs and incurred samples matched each other very well. The results demonstrated that, in this assay, tissue QCs accurately represented the incurred tissue samples to determine the "true" recovery, and the LC-MS/MS assay was accurate for tissue bioanalysis. Another aspect we investigated is how the tissue QCs should be prepared to better represent the incurred tissue samples. We compared two different QC preparation methods (analyte spiked in tissue homogenates or in intact tissues) and demonstrated that the two methods had no significant difference when a good sample preparation was in place. The developed assay showed excellent accuracy and precision, and was successfully applied to the quantitative determination of BMS-986104 and BMS-986104-P in tissues in a rat toxicology study. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. 32 CFR 644.138 - Family housing leasing program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Family housing leasing program. 644.138 Section... PROPERTY REAL ESTATE HANDBOOK Acquisition Acquisition by Leasing § 644.138 Family housing leasing program... for the leasing of family housing units to the Division or District Engineer. Each military element...

  15. 18 CFR 707.8 - Typical classes of action requiring similar treatment under NEPA.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Typical classes of... Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water... submittal of regional water resources management plans (comprehensive, coordinated, joint plans or elements...

  16. Quantitative determination of low-Z elements in single atmospheric particles on boron substrates by automated scanning electron microscopy-energy-dispersive X-ray spectrometry.

    PubMed

    Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René

    2005-09-01

    Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. 
We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that, compared with conventional methods, highly accurate results can be obtained using the automatic data processing described here.

  17. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal to noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on the SNR was not overcome. So the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for a Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a-posteriori (MAP) estimator and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming. It did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. 
The comparisons are performed both by numerical simulation and by in vivo measurements of the anterior and posterior eye segments, as well as in skin imaging. The new estimator shows superior performance and clearer image contrast.

  18. A new method of linkage analysis using LOD scores for quantitative traits supports linkage of monoamine oxidase activity to D17S250 in the Collaborative Study on the Genetics of Alcoholism pedigrees.

    PubMed

    Curtis, David; Knight, Jo; Sham, Pak C

    2005-09-01

    Although LOD score methods have been applied to diseases with complex modes of inheritance, linkage analysis of quantitative traits has tended to rely on non-parametric methods based on regression or variance components analysis. Here, we describe a new method for LOD score analysis of quantitative traits which does not require specification of a mode of inheritance. The technique is derived from the MFLINK method for dichotomous traits. A range of plausible transmission models is constructed, constrained to yield the correct population mean and variance for the trait but differing with respect to the contribution to the variance due to the locus under consideration. Maximized LOD scores under homogeneity and admixture are calculated, as is a model-free LOD score which compares the maximized likelihoods under admixture assuming linkage and no linkage. These LOD scores have known asymptotic distributions and hence can be used to provide a statistical test for linkage. The method has been implemented in a program called QMFLINK. It was applied to data sets simulated using a variety of transmission models and to a measure of monoamine oxidase activity in 105 pedigrees from the Collaborative Study on the Genetics of Alcoholism. With the simulated data, the results showed that the new method could detect linkage well if the true allele frequency for the trait was close to that specified. However, it performed poorly on models in which the true allele frequency was much rarer. For the Collaborative Study on the Genetics of Alcoholism data set only a modest overlap was observed between the results obtained from the new method and those obtained when the same data were analysed previously using regression and variance components analysis. Of interest is that D17S250 produced a maximized LOD score under homogeneity and admixture of 2.6 but did not indicate linkage using the previous methods. 
However, this region did produce evidence for linkage in a separate data set, suggesting that QMFLINK may have been able to detect a true linkage which was not picked up by the other methods. The application of model-free LOD score analysis to quantitative traits is novel and deserves further evaluation of its merits and disadvantages relative to other methods.

  19. Simulation of Hypervelocity Impact on Aluminum-Nextel-Kevlar Orbital Debris Shields

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    2000-01-01

    An improved hybrid particle-finite element method has been developed for hypervelocity impact simulation. The method combines the general contact-impact capabilities of particle codes with the true Lagrangian kinematics of large strain finite element formulations. Unlike some alternative schemes which couple Lagrangian finite element models with smooth particle hydrodynamics, the present formulation makes no use of slidelines or penalty forces. The method has been implemented in a parallel, three dimensional computer code. Simulations of three dimensional orbital debris impact problems using this parallel hybrid particle-finite element code show good agreement with experiment and good speedup in parallel computation. The simulations included single and multi-plate shields as well as aluminum and composite shielding materials, at an impact velocity of eleven kilometers per second.

  20. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions are reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  1. Focused Group Interviews as an Innovative Quanti-Qualitative Methodology (QQM): Integrating Quantitative Elements into a Qualitative Methodology

    ERIC Educational Resources Information Center

    Grim, Brian J.; Harmon, Alison H.; Gromis, Judy C.

    2006-01-01

    There is a sharp divide between quantitative and qualitative methodologies in the social sciences. We investigate an innovative way to bridge this gap that incorporates quantitative techniques into a qualitative method, the "quanti-qualitative method" (QQM). Specifically, our research utilized small survey questionnaires and experiment-like…

  2. Renal cell carcinoma containing abundant non-calcified fat.

    PubMed

    Wasser, Elliot J; Shyn, Paul B; Riveros-Angel, Marcela; Sadow, Cheryl A; Steele, Graeme S; Silverman, Stuart G

    2013-06-01

    Renal masses found to contain macroscopic fatty elements on CT or MR imaging can generally be classified as benign angiomyolipomas. Rarely, renal cell carcinomas may also contain evidence of macroscopic fat. When true adipocytic elements are present, this is generally due to a process of osseous metaplasia in which both fat cells and calcification are co-localized within the mass. We present a patient with a large papillary renal cell carcinoma containing abundant fat with sparse, punctate calcification remote from the fatty elements on imaging. This report highlights the need for radiologists to maintain caution when diagnosing renal angiomyolipomas on the basis of macroscopic fat and reviews the current literature on fat-containing renal masses.

  3. Recursive analytical solution describing artificial satellite motion perturbed by an arbitrary number of zonal terms

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical first-order solution has been developed which describes the motion of an artificial satellite perturbed by an arbitrary number of zonal harmonics of the geopotential. A set of recursive relations for the solution, which was deduced from recursive relations of the geopotential, was derived. The method of solution is based on von Zeipel's technique applied to a canonical set of two-body elements in the extended phase space which incorporates the true anomaly as a canonical element. The elements are of Poincare type, that is, they are regular for vanishing eccentricities and inclinations. Numerical results show that this solution is accurate to within a few meters after 500 revolutions.

  4. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions where establishing ground truth becomes really questionable, and smaller deviations for larger lesions where ground truth set up becomes more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to just CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.
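
As a rough illustration of quantifying burden without delineation, one can sum a learned per-voxel severity over the recognized (not delineated) region. The logistic "disease map" and its parameters below are assumptions for illustration, not the authors' trained map:

```python
import numpy as np

def soft_severity(suv, suv_bg=1.0, scale=0.5):
    """Map voxel SUV to a severity weight in [0, 1].

    Hypothetical logistic form: voxels well above twice the background
    uptake count fully, voxels near background count hardly at all.
    """
    return 1.0 / (1.0 + np.exp(-(suv - 2.0 * suv_bg) / scale))

def disease_burden(roi_suv, severity_map=soft_severity):
    """Sum per-voxel severity over a recognized region of interest.

    No lesion contour is needed: the map itself down-weights
    background voxels inside the (loose) recognized region.
    """
    return float(np.sum(severity_map(np.asarray(roi_suv, dtype=float))))
```

The point of the sketch is the decoupling: only a coarse region from object recognition is required, and all partial-volume behavior is absorbed into the trained map.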

  5. Self-regulating proportionally controlled heating apparatus and technique

    NASA Technical Reports Server (NTRS)

    Strange, M. G. (Inventor)

    1975-01-01

    A self-regulating proportionally controlled heating apparatus and technique is provided wherein a single electrical resistance heating element having a temperature coefficient of resistance serves simultaneously as a heater and temperature sensor. The heating element is current-driven and the voltage drop across the heating element is monitored and a component extracted which is attributable to a change in actual temperature of the heating element from a desired reference temperature, so as to produce a resulting error signal. The error signal is utilized to control the level of the heater drive current and the actual heater temperature in a direction to reduce the noted temperature difference. The continuous nature of the process for deriving the error signal feedback information results in true proportional control of the heating element without the necessity for current-switching which may interfere with nearby sensitive circuits, and with no cyclical variation in the controlled temperature.
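
The control scheme just described can be sketched in a few lines: the element's temperature coefficient of resistance lets the monitored voltage drop double as the temperature measurement. The element characteristics (R0, T0, ALPHA), reference temperature, and gain below are hypothetical values, not taken from the patent:

```python
# Self-sensing heater sketch: one resistive element is both heater and
# thermometer, via R(T) = R0 * (1 + ALPHA * (T - T0)).
R0, T0, ALPHA = 100.0, 25.0, 0.004   # ohms, deg C, 1/deg C (assumed values)

def temperature_from_vi(v_drop, i_drive):
    """Infer element temperature from its monitored voltage drop."""
    r_meas = v_drop / i_drive                  # Ohm's law
    return T0 + (r_meas / R0 - 1.0) / ALPHA    # invert R(T)

def error_signal(v_drop, i_drive, t_ref):
    """Positive when the element is cooler than t_ref (raise drive current)."""
    return t_ref - temperature_from_vi(v_drop, i_drive)

def control_step(i_drive, v_drop, t_ref, gain=0.001):
    """One proportional update of the drive current (gain is illustrative)."""
    return i_drive + gain * error_signal(v_drop, i_drive, t_ref)
```

Because the drive is a continuously adjusted current rather than a switched one, there is no on/off cycling to radiate interference into nearby circuits, which is the property the abstract emphasizes.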

  6. Less-studied TCE: are their environmental concentrations increasing due to their use in new technologies?

    PubMed

    Filella, M; Rodríguez-Murillo, J C

    2017-09-01

    The possible environmental impact of the recent increase in use of a group of technology-critical elements (Nb, Ta, Ga, In, Ge and Te) is analysed by reviewing published concentration profiles in environmental archives (ice cores, ombrotrophic peat bogs, freshwater sediments and moss surveys) and evaluating temporal trends in surface waters. No increase has so far been recorded. The low potential direct emissions of these elements, resulting from their absolute low production levels, make it unlikely that the increasing use of these elements in modern technology has any noticeable effect on their environmental concentrations on a global scale. This holds particularly true for those of these elements that are probably emitted in relatively high amounts from other human activities (i.e., coal combustion and non-ferrous smelting), such as In, the most studied element of the group. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Reducing Speckle In One-Look SAR Images

    NASA Technical Reports Server (NTRS)

    Nathan, K. S.; Curlander, J. C.

    1990-01-01

    Local-adaptive-filter algorithm incorporated into digital processing of synthetic-aperture-radar (SAR) echo data to reduce speckle in resulting imagery. Involves use of image statistics in vicinity of each picture element, in conjunction with original intensity of element, to estimate brightness more nearly proportional to true radar reflectance of corresponding target. Increases ratio of signal to speckle noise without substantial degradation of resolution common to multilook SAR images. Adapts to local variations of statistics within scene, preserving subtle details. Computationally simple. Lends itself to parallel processing of different segments of image, making possible increased throughput.
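
The abstract does not give the filter equations; a Lee-style local adaptive filter is a standard example of the described idea (statistics in a window around each pixel, blended with the original intensity, adapting to local scene variation). Window size and the number of looks below are illustrative:

```python
import numpy as np

def lee_filter(img, win=5, looks=1):
    """Lee-style adaptive speckle filter (sketch, assumed parameters).

    Each output pixel blends the local mean with the original intensity;
    the blend weight grows where local variance exceeds the variance
    expected from speckle alone, preserving edges and subtle detail.
    """
    pad = win // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    cu2 = 1.0 / looks  # squared speckle coefficient of variation (1-look: 1)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + win, j:j + win]
            m = w.mean()
            v = w.var()
            ci2 = v / (m * m) if m else 0.0          # observed variation
            k = max(0.0, 1.0 - cu2 / ci2) if ci2 > 0 else 0.0
            out[i, j] = m + k * (img[i, j] - m)      # adaptive blend
    return out
```

In homogeneous regions the weight k collapses to zero and the filter averages speckle away; near strong scatterers k approaches one and the original intensity is kept.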

  8. Study of Far-Field Directivity Pattern for Linear Arrays

    NASA Astrophysics Data System (ADS)

    Ana-Maria, Chiselev; Luminita, Moraru; Laura, Onose

    2011-10-01

    A model to calculate the far-field directivity pattern is developed in this paper. Based on this model, the three-dimensional beam pattern is introduced and analyzed in order to investigate the geometric parameters of linear arrays and their influence on the directivity pattern. Simulations in the azimuthal plane are made to highlight the influence of transducer parameters, including the number of elements and the inter-element spacing. These parameters prove to be important factors that influence the directivity pattern and the appearance of side lobes for linear arrays.

  9. User Instructions for the EPIC-2 Code.

    DTIC Science & Technology

    1986-09-01

    MATERIAL CARDS FOR SOLIDS INPUT DATA: Since failure of the elements must be achieved by the eroding interface algorithm, it is important that EFAIL (a material property) be much greater than ERODE. If left blank (DFRAC = 0), the factor will be set to DFRAC = 1.0. EFAIL = equivalent plastic strain (true) which, if exceeded, will totally fail the element.

  10. Surface hardening of cutting elements of agricultural machinery by vibro-arc plasma

    NASA Astrophysics Data System (ADS)

    Sharifullin, S. N.; Adigamov, N. R.; Adigamov, N. N.; Solovev, R. Y.; Arakcheeva, K. S.

    2016-01-01

    At present, state technical policy is aimed at modernizing worn equipment, including in agriculture, on the basis of high-performance technologies such as nanotechnology. Modernization of worn equipment here means restoring it to, or beyond, its original rated specifications. Existing traditional technologies are not suitable for this kind of repair-based modernization; this is especially true of imported equipment. The only way out is the use of high-performance technologies. In this paper, we consider the use of vibro-arc plasma for surface hardening of the cutting elements of agricultural machinery.

  11. Quantitative measures of gingival recession and the influence of gender, race, and attrition.

    PubMed

    Handelman, Chester S; Eltink, Anthony P; BeGole, Ellen

    2018-01-29

    Gingival recession in dentitions with otherwise healthy periodontium is a common occurrence in adults. Recession is clinically measured with a periodontal probe to the nearest millimeter. The aim of this study is to establish quantitative measures of recession, the clinical crown height, and a new measure, the gingival margin-papillae measurement. The latter is the shortest apico-coronal distance measured from the depth of the gingival margin to a line connecting the tips of the two adjacent papillae. Measurements on all teeth up to and including the first molar were performed on pretreatment study models of 120 adult Caucasian and African-American subjects divided into four groups of 30 by gender and race. Both the clinical crown height and the gingival margin-papillae measurements gave a true positive result for changes associated with gingival recession. Tooth wear shortens the clinical crown, and therefore the measure of clinical crown height can give a false negative result when gingival recession is present. However, the gingival margin-papillae measurement was not affected by tooth wear and gave a true positive result for gingival recession. Tooth wear (attrition) was not associated with an increase in gingival recession. These measures are also useful in detecting recession prior to cemental exposure. Measures for recession and tooth wear differed among the four demographic groups studied. These measures can be used as quantitative standards in clinical dentistry, research, and epidemiological studies.

  12. Qualitative and quantitative analysis of an additive element in metal oxide nanometer film using laser induced breakdown spectroscopy.

    PubMed

    Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili

    2018-01-20

    The photoelectric performance of metal-ion-doped TiO2 film improves as the compositions and concentrations of the additive elements change. In this work, TiO2 films doped with different Sn concentrations were prepared by the hydrothermal method. Qualitative and quantitative analysis of the Sn element in the TiO2 films was achieved with laser-induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of the TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifts the optical absorption and improves the photoelectric properties of the TiO2 films. When the concentration of Sn doping in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was largest, indicating the best photoelectric performance. This shows that LIBS is a feasible method for qualitative and quantitative analysis of additive elements in metal oxide nanometer films.
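
The calibration-curve step amounts to an ordinary linear fit and its inversion. The sketch below uses made-up Sn concentrations and line intensities, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards: known Sn concentrations (mmol/L)
# versus background-corrected LIBS line intensities (arbitrary units).
conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
intensity = np.array([0.2, 2.7, 5.2, 7.7, 10.2])

# Linear calibration curve: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)

def concentration(sample_intensity):
    """Invert the calibration curve for an unknown sample."""
    return (sample_intensity - intercept) / slope

print(round(concentration(6.0), 2))  # -> 11.6
```

In practice one would also report the fit's R² and the detection limit derived from the background scatter before trusting an inverted concentration.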

  13. TH-AB-209-09: Quantitative Imaging of Electrical Conductivity by VHF-Induced Thermoacoustics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patch, S; Hull, D; See, W

    Purpose: To demonstrate that very high frequency (VHF) induced thermoacoustics has the potential to provide quantitative images of electrical conductivity in Siemens/meter, much as shear wave elastography provides tissue stiffness in kPa. Quantitatively imaging a large organ requires exciting thermoacoustic pulses throughout the volume and broadband detection of those pulses because tomographic image reconstruction preserves frequency content. Applying the half-wavelength limit to a 200-micron inclusion inside a 7.5 cm diameter organ requires measurement sensitivity to frequencies ranging from 4 MHz down to 10 kHz, respectively. VHF irradiation provides superior depth penetration over the near infrared used in photoacoustics. Additionally, VHF signal production is proportional to electrical conductivity, and prostate cancer is known to suppress electrical conductivity of prostatic fluid. Methods: A dual-transducer system utilizing a P4-1 array connected to a Verasonics V1 system augmented by a lower frequency focused single element transducer was developed. Simultaneous acquisition of VHF-induced thermoacoustic pulses by both transducers enabled comparison of transducer performance. Data from the clinical array generated a stack of 96 images with separation of 0.3 mm, whereas the single element transducer imaged only in a single plane. In-plane resolution and quantitative accuracy were measured at isocenter. Results: The array provided volumetric imaging capability with superior resolution whereas the single element transducer provided superior quantitative accuracy. Combining axial images from both transducers preserved resolution of the P4-1 array and improved image contrast. Neither transducer was sensitive to frequencies below 50 kHz, resulting in a DC offset and low-frequency shading over fields of view exceeding 15 mm. Fresh human prostates were imaged ex vivo and volumetric reconstructions reveal structures rarely seen in diagnostic images. Conclusion: Quantitative whole-organ thermoacoustic tomography will be feasible by sparsely interspersing transducer elements sensitive to the low end of the ultrasonic range.

  14. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. 
SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.

  15. Fighting fire with fire: estimating the efficacy of wildfire mitigation programs using propensity scores

    Treesearch

    David T. Butry

    2009-01-01

    This paper examines the effect wildfire mitigation has on broad-scale wildfire behavior. Each year, hundreds of millions of dollars are spent on fire suppression and fuels management applications, yet little is known, quantitatively, of the returns to these programs in terms of their impact on wildfire extent and intensity. This is especially true when considering that...

  16. Physical and Nonphysical Bullying Victimization of Academically Oriented Students: The Role of Gender and School Type

    ERIC Educational Resources Information Center

    Lehman, Brett

    2015-01-01

    Although there are many factors associated with being the victim of bullying in school, quantitative studies have not treated academic attitudes, effort, and achievement (or lack thereof) as risk factors. This is true despite many ethnographic accounts of good students being stigmatized and directly bullied on account of their status as good…

  17. Rocks: A Concrete Activity That Introduces Normal Distribution, Sampling Error, Central Limit Theorem and True Score Theory

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2011-01-01

    This report introduces a short, hands-on activity that addresses a key challenge in teaching quantitative methods to students who lack confidence or experience with statistical analysis. Used near the beginning of the course, this activity helps students develop an intuitive insight regarding a number of abstract concepts which are key to…

  18. The Effect of High School Junior Reserve Officers' Training Corps (JROTC) on Civic Knowledge, Skills, and Attitudes of Hispanic Cadets

    ERIC Educational Resources Information Center

    Loving, Kirk Anthony

    2017-01-01

    As students continue to experience low test scores on national civics assessments, it is important to identify curriculum which can increase their civic capabilities. This is especially true for the quickly growing Hispanic population, which suffers a civic achievement gap. The purpose of this quantitative quasi-experimental nonequivalent…

  19. Impact of different meander sizes on the RF transmit performance and coupling of microstrip line elements at 7 T.

    PubMed

    Rietsch, Stefan H G; Quick, Harald H; Orzada, Stephan

    2015-08-01

    In this work, the transmit performance and interelement coupling characteristics of radio frequency (RF) antenna microstrip line elements are examined in simulations and measurements. The initial point of the simulations is a microstrip line element loaded with a phantom. Meander structures are then introduced at the end of the element. The size of the meanders is increased in fixed steps and the magnetic field is optimized. In continuative simulations, the coupling between identical elements is evaluated for different element spacing and loading conditions. Verification of the simulation results is accomplished in measurements of the coupling between two identical elements for four different meander sizes. Image acquisition on a 7 T magnetic resonance imaging (MRI) system provides qualitative and quantitative comparisons to confirm the simulation results. Simulations point out an optimum range of meander sizes concerning coupling in all chosen geometric setups. Coupling measurement results are in good agreement with the simulations. Qualitative and quantitative comparisons of the acquired MRI images substantiate the coupling results. The coupling between coil elements in RF antenna arrays consisting of the investigated element types can be optimized under consideration of the central magnetic field strength or efficiency depending on the desired application.

  20. Elemental analysis of scorpion venoms.

    PubMed

    Al-Asmari, AbdulRahman K; Kunnathodi, Faisal; Al Saadon, Khalid; Idris, Mohammed M

    2016-01-01

    Scorpion venom is a rich source of biomolecules, which can perturb the physiological activity of the host on envenomation and may also have therapeutic potential. Scorpion venoms, produced by the columnar cells of the venom gland, are complex mixtures of mucopolysaccharides, neurotoxic peptides and other components. This study was aimed at cataloguing the elemental composition of venoms obtained from medically important scorpions found in the Arabian peninsula. The global elemental composition of the crude venom obtained from Androctonus bicolor, Androctonus crassicauda and Leiurus quinquestriatus scorpions was estimated using an ICP-MS analyzer. The study catalogued the chemical elements present in the scorpion venoms using ICP-MS total-quant analysis and quantified nine elements using appropriate standards. Fifteen chemical elements, including sodium, potassium and calcium, were found abundantly in the scorpion venoms at ppm concentrations. Thirty-six chemical elements of different mass ranges were detected in the venoms at ppb levels. Quantitative analysis revealed copper to be the most abundant element in Androctonus sp. venom but present at a lower level in Leiurus quinquestriatus venom, whereas zinc and manganese were found at higher levels in Leiurus sp. venom but at lower levels in Androctonus sp. venom. These data and the concentrations of the other elements present in the various venoms are likely to increase our understanding of the mechanisms of venom activity and their pharmacological potential.

  1. What Are the 50 Cent Euro Coins Made of?

    ERIC Educational Resources Information Center

    Peralta, Luis; Farinha, Ana Catarina; Rego, Florbela

    2008-01-01

    X-ray fluorescence is a non-destructive technique that allows elemental composition analysis. In this paper we describe a prescription to obtain the elemental composition of homogeneous coins, like 50 cent Euro coins, and how to get the quantitative proportions of each element with the help of Monte Carlo simulation. Undergraduate students can…

  2. Uncovering the end uses of the rare earth elements.

    PubMed

    Du, Xiaoyue; Graedel, T E

    2013-09-01

    The rare earth elements (REE) are a group of fifteen elements with unique properties that make them indispensable for a wide variety of emerging and conventional established technologies. However, quantitative knowledge of REE remains sparse, despite the current heightened interest in future availability of the resources. Mining is heavily concentrated in China, whose monopoly position and potential restriction of exports render primary supply vulnerable to short term disruption. We have drawn upon the published literature and unpublished materials in different languages to derive the first quantitative annual domestic production by end use of individual rare earth elements from 1995 to 2007. The information is illustrated in Sankey diagrams for the years 1995 and 2007. Other years are available in the supporting information. Comparing 1995 and 2007, the production of the rare earth elements in China, Japan, and the US changed dramatically in quantities and structure. The information can provide a solid foundation for industries, academic institutions and governments to make decisions and develop strategies. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Is container type the biggest predictor of trace element and BPA leaching from drinking water bottles?

    PubMed

    Rowell, Candace; Kuiper, Nora; Preud'Homme, Hugues

    2016-07-01

    The knowledge base on bottled-water leachate is highly contradictory owing to varying methodologies and limited multi-elemental and/or molecular analyses; understanding the range of contaminants and their pathways is required. This study determined the leaching potential and leaching kinetics of trace elements, using consistent comprehensive quantitative and semi-quantitative (79 elements total) analyses, and of BPA, using isotopic dilution and MEPS pre-concentration with UHPLC-ESI-QTOF. Statistical methods were used to determine confounders and predictors of leaching and human health risk throughout 12 days of UV exposure and after exposure to elevated temperature. Various types of water were used to assess the impact of water quality. Results suggest Sb leaching is primarily dependent upon water quality, not container type. Bottle type is a predictor of elemental leaching for Pb, Ba, Cr, Cu, Mn and Sr; BPA was detected in samples from polycarbonate containers. Health risks from the consumption of bottled water increase after UV exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Spectral Interferences Manganese (Mn) - Europium (Eu) Lines in X-Ray Fluorescence Spectrometry Spectrum

    NASA Astrophysics Data System (ADS)

    Tanc, Beril; Kaya, Mustafa; Gumus, Lokman; Kumral, Mustafa

    2016-04-01

    X-ray fluorescence (XRF) spectrometry is widely used for quantitative and semi-quantitative analysis of many major, minor and trace elements in geological samples. Advantages of the XRF method include non-destructive sample preparation, applicability to powder, solid, paste and liquid samples, and simple spectra that are independent of chemical state. On the other hand, the XRF method has some disadvantages, such as poor sensitivity for low-atomic-number elements, matrix effects (physical matrix effects, such as fine- versus coarse-grained materials, may impact XRF performance) and interference effects (the spectral lines of elements may overlap, distorting results for one or more elements). Spectral interferences in particular are a significant factor for accurate results. In this study, manganese(II) oxide (MnO, 99.99%) was examined by semi-quantitative analysis. Samples were pelleted and analyzed with an XRF spectrometer (Bruker S8 Tiger). Unexpected peaks were obtained beside the major Mn peaks. Although the sample does not contain Eu, the results reported 0.3% Eu2O3. This artifact can arise from the high concentration of MnO and the proximity of the Mn and Eu lines. It can be eliminated by using a correction equation, or the Mn concentration can be confirmed with other methods (such as atomic absorption spectroscopy). Keywords: Spectral Interferences; Manganese (Mn); Europium (Eu); X-Ray Fluorescence Spectrometry Spectrum.
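
The correction-equation approach for removing such a line-overlap artifact can be sketched as a simple empirical subtraction; the overlap factor below is hypothetical, standing in for a value that would be measured on pure MnO standards:

```python
# Empirical spectral-overlap correction (sketch): the apparent Eu2O3
# reading is reduced in proportion to the Mn content whose lines bleed
# into the Eu region. K_MN_ON_EU is an assumed overlap factor.
K_MN_ON_EU = 0.003  # apparent %Eu2O3 per %MnO (hypothetical)

def corrected_eu2o3(apparent_eu2o3_pct, mno_pct):
    """Subtract the Mn line-overlap contribution from the Eu2O3 reading."""
    return max(0.0, apparent_eu2o3_pct - K_MN_ON_EU * mno_pct)
```

With a well-calibrated factor, the spurious 0.3% reading on a pure MnO pellet would collapse toward zero, while a genuine Eu signal in a mixed sample would survive the subtraction.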

  5. A simple and efficient shear-flexible plate bending element

    NASA Technical Reports Server (NTRS)

    Chaudhuri, Reaz A.

    1987-01-01

    A shear-flexible triangular element formulation, which utilizes an assumed quadratic displacement potential energy approach and is numerically integrated using Gauss quadrature, is presented. The Reissner/Mindlin hypothesis of constant cross-sectional warping is directly applied to the three-dimensional elasticity theory to obtain a moderately thick-plate theory or constant shear-angle theory (CST), wherein the middle surface is no longer considered to be the reference surface and the two rotations are replaced by the two in-plane displacements as nodal variables. The resulting finite-element possesses 18 degrees of freedom (DOF). Numerical results are obtained for two different numerical integration schemes and a wide range of meshes and span-to-thickness ratios. These, when compared with available exact, series or finite-element solutions, demonstrate accuracy and rapid convergence characteristics of the present element. This is especially true in the case of thin to very thin plates, when the present element, used in conjunction with the reduced integration scheme, outperforms its counterpart, based on discrete Kirchhoff constraint theory (DKT).

  6. 32 CFR 643.122 - Reserve facilities-Air Force and Navy use.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Reserve facilities-Air Force and Navy use. 643... Force and Navy use. MACOM may approve local agreements with other Army, DOD, and Reserve elements... Force or Navy Reserve, or which involve a transfer of funds between services for other than minor...

  7. Using Human Interest Stories To Demonstrate Relevant Concepts in the Public Speaking Classroom.

    ERIC Educational Resources Information Center

    Stowell, Jessica

    Students can learn the concepts of descriptive language, "group think," and how to overcome communication apprehension painlessly by using human interest stories with humorous elements. A public relations teacher uses two audio tapes and a true story about a former student in her classroom. Garrison Keillor's 12-minute story "Tomato…

  8. Leadership Mentoring: Maintaining School Improvement in Turbulent Times

    ERIC Educational Resources Information Center

    Gross, Steven Jay

    2006-01-01

    Today, school systems face the challenge of developing the next generation of school leaders. This means more than simply hiring promising new leaders--it requires developing an effective mentoring program. True leadership mentoring must be carefully crafted with highly educated mentors and prepared proteges. But what are the elements of a quality…

  9. 41 CFR 101-27.207-1 - Agency controls.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 41 Public Contracts and Property Management 2 2014-07-01 2012-07-01 true Agency controls. 101-27...-Management of Shelf-Life Materials § 101-27.207-1 Agency controls. Agencies shall establish the necessary controls to identify shelf-life items on their stock records (and in other appropriate elements of their...

  10. 41 CFR 101-27.207-1 - Agency controls.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 2 2011-07-01 2007-07-01 true Agency controls. 101-27...-Management of Shelf-Life Materials § 101-27.207-1 Agency controls. Agencies shall establish the necessary controls to identify shelf-life items on their stock records (and in other appropriate elements of their...

  11. 41 CFR 101-27.207-1 - Agency controls.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Agency controls. 101-27...-Management of Shelf-Life Materials § 101-27.207-1 Agency controls. Agencies shall establish the necessary controls to identify shelf-life items on their stock records (and in other appropriate elements of their...

  12. 34 CFR Appendix C to Part 300 - National Instructional Materials Accessibility Standard (NIMAS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Using educationLevel obviates the need for a separate field for gradeRange since dc elements can repeat... 34 Education 2 2011-07-01 2010-07-01 true National Instructional Materials Accessibility Standard (NIMAS) C Appendix C to Part 300 Education Regulations of the Offices of the Department of Education...

  13. Recreational Terror: Postmodern Elements of the Contemporary Horror Film.

    ERIC Educational Resources Information Center

    Pinedo, Isabel

    1996-01-01

    States that the boundaries of any genre are slippery, but this is particularly true of the postmodern horror film, since the definition of postmodern is itself blurry. Argues that postmodern horror films include films from 1968 onward. Defines postmodernism and the characteristics of postmodern horror, including violence, violation of boundaries,…

  14. Stability evaluation and correction of a pulsed neutron generator prompt gamma activation analysis system

    USDA-ARS?s Scientific Manuscript database

    Source output stability is important for accurate measurement in prompt gamma neutron activation. This is especially true when measuring low-concentration elements such as in vivo nitrogen (~2.5% of body weight). We evaluated the stability of the compact DT neutron generator within an in vivo nitrog...

  15. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results to both the SEM analysis and elemental mapping.

  16. Heterogeneous nucleation of aspartame from aqueous solutions

    NASA Astrophysics Data System (ADS)

    Kubota, Noriaki; Kinno, Hiroaki; Shimizu, Kenji

    1990-03-01

    Waiting times, the time from the instant of quenching needed for a first nucleus to appear, were measured at constant supercoolings for primary nucleation of aspartame (α-L-aspartyl-L-phenylalanine methyl ester) from aqueous solutions, which were sealed into glass ampoules (solution volume = 3.16 cm³). Since the waiting time became shorter when the solution was filtered prior to quenching, the nucleation was concluded to be heterogeneously induced. The measured waiting time consisted of two parts: the time needed for the nucleus to grow to a detectable size (growth time) and the stochastic time needed for nucleation (true waiting time). The distribution of the true waiting time is well explained by a stochastic model in which nucleation is regarded as occurring heterogeneously, in a stochastic manner, on two kinds of active sites. The active sites are estimated to be located on foreign particles containing such elements as Si, Al and Mg. The amount of each element is very small, on the order of ppb (mass basis) of the whole solution. The growth time was correlated with the degree of supercooling.
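    The two-part waiting-time decomposition lends itself to a simple simulation sketch: a deterministic growth time plus a stochastic "true" waiting time generated by two competing kinds of active sites, each treated here as an exponential clock. All rates and times below are invented for illustration, not fitted to the paper's data.

```python
import random

def simulate_waiting_time(rate_site_a, rate_site_b, growth_time, rng):
    """Observed waiting time = deterministic growth time plus the stochastic
    'true' waiting time, modeled as the first nucleation event on either of
    two kinds of active sites (competing exponential clocks)."""
    true_wait = rng.expovariate(rate_site_a + rate_site_b)
    return growth_time + true_wait

rng = random.Random(42)
# hypothetical rates (1/h) and growth time (h)
samples = [simulate_waiting_time(0.5, 0.2, 1.0, rng) for _ in range(20000)]
mean_wait = sum(samples) / len(samples)
# analytic mean: growth_time + 1/(rate_a + rate_b) = 1.0 + 1/0.7 ≈ 2.43 h
print(round(mean_wait, 2))
```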

  17. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating developing COAs using a small sample. We review the benefits such methods are purported to offer, from both practical and statistical standpoints, and detail several problematic areas, covering both practical concerns and issues of statistical theory, in the use of quantitative methods (including Rasch-consistent methods) with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
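    The statistical concern about small samples can be illustrated with a back-of-envelope calculation: on the logit scale, the information a single response contributes to an item-difficulty estimate is roughly p(1-p), so the standard error shrinks only as 1/sqrt(N). This is a generic sketch of sampling precision, not the estimator of any particular Rasch package.

```python
import math

def item_difficulty_se(n_persons, p=0.5):
    """Rough standard error of an item-difficulty estimate on the logit
    scale: information per response ~ p*(1-p), so SE ~ 1/sqrt(N*p*(1-p)).
    A back-of-envelope illustration of small-sample imprecision."""
    return 1.0 / math.sqrt(n_persons * p * (1 - p))

# precision improves slowly with sample size
for n in (30, 100, 500):
    print(n, round(item_difficulty_se(n), 2))
```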

  18. Quantitative Analysis of Trace Element Impurity Levels in Some Gem-Quality Diamonds

    NASA Astrophysics Data System (ADS)

    McNeill, J. C.; Klein-Bendavid, O.; Pearson, D. G.; Nowell, G. M.; Ottley, C. J.; Chinn, I.; Malarkey, J.

    2009-05-01

    Perhaps the most important information required to understand the origin of diamonds is the nature of the fluid from which they crystallise. Constraining the identity of the diamond-forming fluid for high-purity gem diamonds is hampered by analytical challenges because of the very low analyte levels involved. Here we use a new ultra-low-blank 'off-line' laser ablation method coupled to sector-field ICPMS for the quantitative analysis of fluid-poor gem diamonds. Ten diamonds comprising both E- and P-type parageneses, from the Premier Mine, South Africa, were analysed for trace element abundances. We assume that the elemental signatures arise from low densities of sub-microscopic fluid inclusions that are analogous to the much higher densities of fluid inclusions commonly found within fluid-rich diamonds exhibiting fibrous growth. Repeatability of multiple (>20) blanks yielded consistently low values, so that with the current procedure our limits of quantitation (10σ of the blank) are <1 pg for most trace elements, except for Sr, Zr and Ba (2-9 pg) and Pb (~30 pg). Trace element patterns of the Premier diamond suite show enrichment of LREE over HREE. Abundances broadly decrease with increasing elemental compatibility. As a suite, the chondrite-normalised diamond patterns show negative Sr, Zr, Ti and Y anomalies and positive U and Pb anomalies. All sample abundances are very depleted relative to chondrites (0.1 to 0.001× chondritic). HREE range from 0.1 to 1 ppb, as do Y, Nb and Cs. Other lighter elements vary from 2-30 ppb. Pb reaches several ppb and Ti ranges from ppb values up to 2 ppm. No significant differences were observed between the trace element systematics of the eclogitic and peridotitic diamonds. Overall, these initial data have inter-element fractionation patterns similar to those evident from fluid-rich fibrous diamonds and can be used to infer that both types of diamond-forming fluids share a common origin.
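    The blank-based limit of quantitation quoted above can be sketched as follows; the blank values are invented, and the 10σ criterion follows the abstract's stated definition.

```python
import statistics

def limit_of_quantitation(blank_values, k=10):
    """LOQ defined as k times the standard deviation of repeated blank
    measurements (the abstract's 10-sigma-blank criterion)."""
    return k * statistics.stdev(blank_values)

# hypothetical blank measurements (pg) for one trace element, >20 repeats
blanks = [0.05, 0.07, 0.04, 0.06, 0.05, 0.08, 0.05, 0.06,
          0.04, 0.07, 0.05, 0.06, 0.05, 0.04, 0.07, 0.06,
          0.05, 0.06, 0.04, 0.05, 0.07]
loq = limit_of_quantitation(blanks)
print(loq < 1.0)  # consistent with the reported <1 pg LOQ for most elements
```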

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
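    As a rough illustration of the NSR figure of merit, the sketch below fits a line to measured versus true values and divides the residual noise by the slope. Note that the point of the NGS technique is to estimate these quantities *without* ground truth; known true values are used here purely to show what the metric measures, and all numbers are invented.

```python
import statistics

def noise_to_slope_ratio(true_vals_proxy, measured_vals):
    """NSR figure of merit: residual noise std divided by the regression
    slope. In the actual NGS technique the slope and noise are estimated
    without ground truth; known values are used here only for illustration."""
    n = len(measured_vals)
    mx = sum(true_vals_proxy) / n
    my = sum(measured_vals) / n
    sxx = sum((x - mx) ** 2 for x in true_vals_proxy)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(true_vals_proxy, measured_vals))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x)
             for x, y in zip(true_vals_proxy, measured_vals)]
    return statistics.pstdev(resid) / slope

# two hypothetical methods measuring the same activity concentrations
truth    = [1.0, 2.0, 3.0, 4.0, 5.0]
method_a = [1.1, 2.0, 3.2, 3.9, 5.1]   # precise
method_b = [0.6, 2.5, 2.6, 4.6, 4.9]   # noisier
# the more precise method gets the smaller NSR
print(noise_to_slope_ratio(truth, method_a) <
      noise_to_slope_ratio(truth, method_b))
```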

  20. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.

  1. Quantitation of absorbed or deposited materials on a substrate that measures energy deposition

    DOEpatents

    Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham

    2005-01-18

    This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the mass of an applied, localized material. Such a system and method remains compatible with other methods of analysis, such as quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.

  2. Quantitative multiplex detection of pathogen biomarkers

    DOEpatents

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multiple sensor elements per channel.

  3. Quantitative multiplex detection of pathogen biomarkers

    DOEpatents

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multiple sensor elements per channel.

  4. Comparison of Pre-Service Physics Teachers' Conceptual Understanding of Dynamics in Model-Based Scientific Inquiry and Scientific Inquiry Environments

    ERIC Educational Resources Information Center

    Arslan Buyruk, Arzu; Ogan Bekiroglu, Feral

    2018-01-01

    The focus of this study was to evaluate the impact of model-based inquiry on pre-service physics teachers' conceptual understanding of dynamics. The theoretical framework of this research was based on models-of-data theory. A true-experimental design using quantitative and qualitative research methods was carried out for this research. Participants of…

  5. Quantifying Confidence in Model Predictions for Hypersonic Aircraft Structures

    DTIC Science & Technology

    2015-03-01

    of isolating calibrations of models in the network, segmented and simultaneous calibration are compared using the Kullback-Leibler ...value of θ. While not all test statistics are as simple as measuring goodness or badness of fit, their directional interpretations tend to remain... data quite well, qualitatively. Quantitative goodness-of-fit tests are problematic because they assume a true empirical CDF is being tested or

  6. Performance Characteristics of a New LSO PET/CT Scanner With Extended Axial Field-of-View and PSF Reconstruction

    NASA Astrophysics Data System (ADS)

    Jakoby, Bjoern W.; Bercier, Yanic; Watson, Charles C.; Bendriem, Bernard; Townsend, David W.

    2009-06-01

    A new combined lutetium oxyorthosilicate (LSO) PET/CT scanner with an extended axial field-of-view (FOV) of 21.8 cm has been developed (Biograph TruePoint PET/CT with TrueV; Siemens Molecular Imaging) and introduced into clinical practice. The scanner includes the recently announced point spread function (PSF) reconstruction algorithm. The PET components incorporate four rings of 48 detector blocks, 5.4 cm × 5.4 cm in cross-section. Each block comprises a 13 × 13 matrix of 4 × 4 × 20 mm³ elements. Data are acquired with a 4.5 ns coincidence time window and an energy window of 425-650 keV. The physical performance of the new scanner has been evaluated according to the recently revised National Electrical Manufacturers Association (NEMA) NU 2-2007 standard and the results have been compared with a previous PET/CT design that incorporates three rings of block detectors with an axial coverage of 16.2 cm (Biograph TruePoint PET/CT; Siemens Molecular Imaging). In addition to the phantom measurements, patient Noise Equivalent Count Rates (NECRs) have been estimated for a range of patients with different body weights (42-154 kg). The average spatial resolution is the same for both scanners: 4.4 mm (FWHM) and 5.0 mm (FWHM) at 1 cm and 10 cm, respectively, from the center of the transverse FOV. The scatter fractions of the Biograph TruePoint and Biograph TruePoint TrueV are comparable at 32%. Compared to the three-ring design, the system sensitivity and peak NECR with smoothed randoms correction (1R) increase by 82% and 73%, respectively. The increase in sensitivity from the extended axial coverage of the Biograph TruePoint PET/CT with TrueV should allow a decrease in either scan time or injected dose without compromising diagnostic image quality. The contrast improvement with the PSF reconstruction potentially offers enhanced detectability for small lesions.

  7. Trace elements as quantitative probes of differentiation processes in planetary interiors

    NASA Technical Reports Server (NTRS)

    Drake, M. J.

    1980-01-01

    The characteristic trace element signature that each mineral in the source region imparts on the magma constitutes the conceptual basis for trace element modeling. It is shown that abundances of trace elements in extrusive igneous rocks may be used as petrological and geochemical probes of the source regions of the rocks if differentiation processes, partition coefficients, phase equilibria, and initial concentrations in the source region are known. Although compatible and incompatible trace elements are useful in modeling, the present review focuses primarily on examples involving the rare-earth elements.
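    As one concrete example of the kind of model the review surveys, the standard batch (equilibrium) partial-melting equation relates melt and source concentrations through the bulk partition coefficient D and melt fraction F. This is a textbook relation offered as illustration, not a formula quoted from the paper.

```python
def batch_melting_conc_ratio(D, F):
    """Equilibrium (batch) partial-melting model commonly used in trace
    element modeling: C_liq / C_source = 1 / (D + F * (1 - D)), where D is
    the bulk solid/melt partition coefficient and F the melt fraction."""
    return 1.0 / (D + F * (1.0 - D))

# an incompatible element (D << 1) is strongly enriched at small melt fractions
print(round(batch_melting_conc_ratio(D=0.01, F=0.1), 1))  # → 9.2
```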

  8. {sup 90}Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlier, Thomas, E-mail: thomas.carlier@chu-nantes.fr; Willowson, Kathy P.; Fourkal, Eugene

    Purpose: {sup 90}Y-positron emission tomography (PET) imaging is becoming a recognized modality for postinfusion quantitative assessment following radioembolization therapy. However, the extremely low counts and high random fraction associated with {sup 90}Y-PET may significantly impair both qualitative and quantitative results. The aim of this work was to study image quality and noise level in relation to the quantification and bias performance of two types of Siemens PET scanners when imaging {sup 90}Y, and to compare experimental results with clinical data from two types of commercially available {sup 90}Y microspheres. Methods: Data were acquired on both Siemens Biograph TruePoint [non-time-of-flight (TOF)] and Biograph microcomputed tomography (mCT) (TOF) PET/CT scanners. The study was conducted in three phases. The first aimed to assess quantification and bias for different reconstruction methods according to random fraction and number of true counts in the scan. The NEMA 1994 PET phantom was filled with water with one cylindrical insert left empty (air) and the other filled with a solution of {sup 90}Y. The phantom was scanned for 60 min in the PET/CT scanner every one or two days. The second phase used the NEMA 2001 PET phantom to derive noise and image quality metrics. The spheres and the background were filled with a {sup 90}Y solution in an 8:1 contrast ratio and four 30 min acquisitions were performed over a one-week period. Finally, 32 patient datasets (8 treated with TheraSphere{sup ®} and 24 with SIR-Spheres{sup ®}) were retrospectively reconstructed and activity in the whole field of view and the liver was compared to theoretical injected activity. Results: The contribution of both bremsstrahlung and LSO trues was found to be negligible, allowing data to be decay corrected to obtain correct quantification.
In general, the recovered activity for all reconstruction methods was stable over the range studied, with a small bias appearing at extremely high random fraction and low counts for iterative algorithms. Point spread function (PSF) correction and TOF reconstruction in general reduce background variability and noise and increase recovered concentration. Results for patient data indicated a good correlation between the expected and PET-reconstructed activities. A linear relationship between the expected and the measured activities in the organ of interest was observed for all reconstruction methods used: a linearity coefficient of 0.89 ± 0.05 for the Biograph mCT and 0.81 ± 0.05 for the Biograph TruePoint. Conclusions: Due to the low counts and high random fraction, accurate image quantification of {sup 90}Y during selective internal radionuclide therapy is affected by random coincidence estimation, scatter correction, and any positivity constraint of the algorithm. Nevertheless, phantom and patient studies showed that the impact of the number of true and random coincidences on quantitative results was limited as long as ordinary Poisson ordered-subsets expectation-maximization reconstruction algorithms with random smoothing are used. Adding PSF correction and TOF information to the reconstruction greatly improves the image quality in terms of bias, variability, noise reduction, and detectability. In the patient studies, the total activity in the field of view is in general accurately measured by the Biograph mCT and slightly overestimated by the Biograph TruePoint.
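The decay correction mentioned in the results is straightforward; a minimal sketch, assuming the standard literature half-life of ~64.1 h for {sup 90}Y:

```python
import math

def decay_correct(counts, elapsed_h, half_life_h=64.1):
    """Decay-correct 90Y counts back to a reference time (t = 0),
    using the ~64.1 h 90Y half-life from the literature."""
    lam = math.log(2) / half_life_h
    return counts * math.exp(lam * elapsed_h)

# after exactly one half-life, measured counts are corrected back by 2x
print(round(decay_correct(1000.0, 64.1), 1))  # → 2000.0
```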

  9. Polarization sensitive optical coherence tomography in equine bone

    NASA Astrophysics Data System (ADS)

    Jacobs, J. W.; Matcher, S. J.

    2009-02-01

    Optical coherence tomography (OCT) has been used to image equine bone samples. OCT and polarization-sensitive OCT (PS-OCT) images of equine bone samples, before and after demineralization, are presented. Using a novel approach, taking a series of images at different angles of illumination, the polar angle and true birefringence of collagen within the tissue are determined at one site in the sample. The images were taken before and after the bones were passed through a demineralization process. The images show an improvement in depth penetration after demineralization, allowing better visualization of the internal structure of the bone and the optical orientation of the collagen. A quantitative measurement of true birefringence has been made of the bone; true birefringence was shown to be 1.9×10⁻³ before demineralization, increasing to 2.7×10⁻³ after demineralization. However, the determined collagen fiber orientation remains the same before and after demineralization. The study of bone is extensive within the field of tissue engineering, where an understanding of the internal structures is essential. OCT in bone, with improved depth penetration through demineralization, offers a useful approach to bone analysis.
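    A minimal sketch of why imaging at several illumination angles separates true birefringence from fiber orientation: for a weakly birefringent uniaxial medium, the apparent birefringence seen by a beam at polar angle θ to the fiber axis scales approximately as sin²θ. This small-birefringence approximation is a common simplification, not necessarily the paper's exact model; the 2.7×10⁻³ value is the demineralized-bone figure from the abstract and the fiber angles are hypothetical.

```python
import math

def apparent_birefringence(true_dn, polar_angle_deg):
    """Small-birefringence approximation for a uniaxial medium: the
    birefringence seen by a beam at angle theta to the fiber axis is
    roughly true_dn * sin^2(theta). Measuring at several illumination
    angles therefore constrains both true_dn and the polar angle."""
    theta = math.radians(polar_angle_deg)
    return true_dn * math.sin(theta) ** 2

# demineralized-bone value from the abstract; hypothetical fiber angles
for angle in (90, 60, 30):
    print(angle, f"{apparent_birefringence(2.7e-3, angle):.2e}")
```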

  10. Modular arrangement of regulatory RNA elements.

    PubMed

    Roßmanith, Johanna; Narberhaus, Franz

    2017-03-04

    Due to their simple architecture and control mechanism, regulatory RNA modules are attractive building blocks in synthetic biology. This is especially true for riboswitches, which are natural ligand-binding regulators of gene expression. The discovery of various tandem riboswitches inspired the design of combined RNA modules with activities not yet found in nature. Riboswitches were placed in tandem, or in combination with a ribozyme or a temperature-responsive RNA thermometer, resulting in new functionalities. Here, we compare natural examples of tandem riboswitches with recently designed artificial RNA regulators, suggesting substantial modularity of regulatory RNA elements. Challenges associated with modular RNA design are discussed.

  11. Calibration-free quantitative analysis of elemental ratios in intermetallic nanoalloys and nanocomposites using Laser Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Davari, Seyyed Ali; Hu, Sheng; Mukherjee, Dibyendu

    2017-03-01

    Intermetallic nanoalloys (NAs) and nanocomposites (NCs) have increasingly gained prominence as efficient catalytic materials in electrochemical energy conversion and storage systems. But their morphology and chemical composition play a critical role in tuning their catalytic activities and precious-metal content. While advanced microscopy techniques facilitate morphological characterizations, traditional chemical characterizations are either qualitative or extremely involved. In this study, we apply Laser Induced Breakdown Spectroscopy (LIBS) for quantitative compositional analysis of NAs and NCs synthesized with varied elemental ratios by our in-house-built pulsed laser ablation technique. Specifically, elemental ratios of binary PtNi and PdCo (NAs) and PtCo (NCs) of different compositions are determined from LIBS measurements employing an internal calibration scheme with the bulk matrix species as internal standards. Morphology and qualitative elemental compositions of the aforesaid NAs and NCs are confirmed by Transmission Electron Microscopy (TEM) images and Energy Dispersive X-ray Spectroscopy (EDX) measurements. LIBS experiments are carried out in ambient conditions with the NA and NC samples drop-cast on silicon wafers after centrifugation to increase their concentrations. The technique does not call for cumbersome sample preparation, including acid digestion and external calibration standards, commonly required in Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES) techniques. Yet the quantitative LIBS results are in good agreement with the results from ICP-OES measurements. Our results indicate the feasibility of using LIBS in the future for rapid and in-situ quantitative chemical characterization of wide classes of synthesized NAs and NCs. Copyright © 2016 Elsevier B.V. All rights reserved.
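    The internal-calibration idea can be sketched in a few lines: with the bulk matrix species as internal standard, instrument and plasma factors cancel in an intensity ratio, leaving the elemental ratio up to relative sensitivity factors. The intensities and sensitivities below are invented for illustration and the helper is hypothetical, not the paper's exact procedure.

```python
def elemental_ratio(intensity_a, intensity_b, sensitivity_a, sensitivity_b):
    """Molar ratio n_A/n_B from background-corrected LIBS line intensities,
    using one matrix element as internal standard: I_X ∝ S_X * n_X, so the
    common instrument/plasma factors cancel in the ratio."""
    return (intensity_a / sensitivity_a) / (intensity_b / sensitivity_b)

# hypothetical line intensities and relative sensitivities for a binary nanoalloy
ratio = elemental_ratio(intensity_a=1200.0, intensity_b=800.0,
                        sensitivity_a=2.0, sensitivity_b=1.0)
print(ratio)  # → 0.75, i.e. an A:B ratio of about 3:4
```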

  12. The 2011 collapse of Puu Oo pit crater, Hawaii: insights from digital image correlation and Discrete Element Method models

    NASA Astrophysics Data System (ADS)

    Holohan, E. P.; Walter, T. R.; Schöpfer, M. P. J.; Walsh, J. J.; Orr, T.; Poland, M.

    2012-04-01

    In March 2011, a spectacular fissure eruption on Kilauea was associated with a major collapse event in the highly active Puu Oo crater. Time-lapse cameras maintained by the Hawaiian Volcano Observatory captured views of the crater in the moments before, during, and after the collapse. The 2011 event hence represents a unique opportunity to characterize the surface deformation related to the onset of a pit crater collapse and to understand what factors influence it. To do so, we used two approaches. First, we analyzed the available series of camera images by means of digital image correlation techniques. This enabled us to gain a semi-quantitative (pixel-unit) description of the surface displacements and the structural development of the collapsing crater floor. Secondly, we ran a series of 'true-scale' numerical pit-crater collapse simulations based on the two-dimensional Distinct Element Method (2D-DEM). This enabled us to gain insights into what geometric and mechanical factors could have controlled the observed surface displacement pattern and structural development. Our analysis of the time-lapse images reveals that the crater floor initially gently sagged, and then rapidly collapsed in association with the appearance of a large ring-like fault scarp. The observed structural development and surface displacement patterns of the March 2011 Puu Oo collapse are best reproduced in DEM models with a relatively shallow magma reservoir that is vertically elongated, and with a crater floor rock mass that is reasonably strong. By combining digital image correlation with DEM modeling, our study highlights the future potential of these relatively new techniques for understanding physical processes at active volcanoes.

  13. On Biological Network Visualization: Understanding Challenges, Measuring the Status Quo, and Estimating Saliency of Visual Attributes

    ERIC Educational Resources Information Center

    Gopal, Nikhil

    2017-01-01

    Biomedical research increasingly relies on the analysis and visualization of a wide range of collected data. However, for certain research questions, such as those investigating the interconnectedness of biological elements, the sheer quantity and variety of data can result in rather uninterpretable visualizations--this is especially true for network visualization,…

  14. The 8:00 a.m. Kazoo Experience

    ERIC Educational Resources Information Center

    Snodgrass, Jennifer

    2007-01-01

    In this article, the author relates how the use of kazoos has helped her students sing melodies by sight. She was initially troubled by her students' lack of confidence in making pitches. After several weeks of using kazoos, her students were able to develop a true sense of matching pitch and became more successful in all elements of dictation.

  15. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

    Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in recent decades. This development is reflected in better quantitative assessment of protein levels as well as in an improved understanding of post-translational modifications and of protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamics, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, which thereby require the use of specially designed standardization approaches to provide absolute quantification. The most common such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are obtained not through any kind of labeling but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to provide direct absolute quantification of peptides/proteins that contain an ICP-detectable element.
A critical assessment, from the analytical chemistry perspective, of the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
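Approach (i) above reduces to a simple ratio calculation once the labeled isotopologue is spiked at a known amount; a minimal sketch with invented peak areas:

```python
def absolute_amount(area_analyte, area_labeled_std, spiked_amount_fmol):
    """Stable isotope-labeled peptide standard (approach i): the labeled
    isotopologue co-elutes and ionizes like the analyte, so the analyte
    amount follows from the MS peak-area ratio times the spiked amount."""
    return (area_analyte / area_labeled_std) * spiked_amount_fmol

# hypothetical peak areas from a targeted run, 50 fmol heavy standard spiked in
amount = absolute_amount(area_analyte=3.2e6, area_labeled_std=4.0e6,
                         spiked_amount_fmol=50.0)
print(round(amount, 3))  # → 40.0 fmol
```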

  16. Faults and foibles of quantitative scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS)

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2012-06-01

    Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers ≥ 4 (Be) present as major constituents (where the concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 ≤ C ≤ 0.1), and trace constituents (C < 0.01, with a minimum detectable limit of ~±0.0005-0.001 under routine measurement conditions, a level which is analyte- and matrix-dependent). SEM/EDS can select specimen volumes with linear dimensions from ~500 nm to 5 μm depending on composition (masses ranging from ~10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in x-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA-II and Lispix are open-source analytical software, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations of SEM/EDS.
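    The concentration bands quoted above map directly onto a small classifier; a trivial sketch using the text's thresholds:

```python
def constituent_class(mass_fraction):
    """Classify a constituent by the concentration bands quoted in the text:
    major C > 0.1, minor 0.01 <= C <= 0.1, trace C < 0.01 (mass fraction)."""
    if mass_fraction > 0.1:
        return "major"
    if mass_fraction >= 0.01:
        return "minor"
    return "trace"

print(constituent_class(0.25),    # → major
      constituent_class(0.05),    # → minor
      constituent_class(0.0008))  # → trace (near the routine detection limit)
```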

  17. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.

    PubMed

    Chahrour, Osama; Malone, John

    2017-01-01

    Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g., phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS-detectable elements) that form a covalent chemical bond with the protein, thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
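    The natural-tag approach can be sketched as a unit conversion: once the sulfur content of a purified protein is measured by ICP-MS, the molar protein amount follows from the known number of S atoms (Cys + Met) per molecule. The numbers below are invented for illustration.

```python
def protein_amount_nmol(sulfur_ng, n_sulfur_atoms):
    """Absolute protein amount from ICP-MS sulfur quantification: S (from
    Cys and Met residues) acts as a natural elemental tag, so moles of
    protein = moles of measured S divided by S atoms per molecule."""
    S_ATOMIC_MASS = 32.06  # g/mol
    sulfur_nmol = sulfur_ng / S_ATOMIC_MASS  # ng / (g/mol) -> nmol
    return sulfur_nmol / n_sulfur_atoms

# hypothetical: 64.12 ng S measured for a protein carrying 4 S atoms
print(round(protein_amount_nmol(64.12, 4), 3))  # → 0.5 nmol
```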

  18. EDXRF quantitative analysis of chromophore chemical elements in corundum samples.

    PubMed

    Bonizzoni, L; Galli, A; Spinolo, G; Palanza, V

    2009-12-01

    Corundum is a crystalline form of aluminum oxide (Al(2)O(3)) and is one of the rock-forming minerals. When aluminum oxide is pure, the mineral is colorless, but the presence of trace amounts of other elements such as iron, titanium, and chromium in the crystal lattice gives the typical colors (including blue, red, violet, pink, green, yellow, orange, gray, white, colorless, and black) of gemstone varieties. The starting point for our work is the quantitative evaluation of the concentration of chromophore chemical elements, with a precision as good as possible, to match the data obtained by different techniques such as optical absorption and photoluminescence. The aim is to give an interpretation of the absorption bands present in the NIR and visible ranges which do not involve intervalence charge transfer transitions (Fe(2+) --> Fe(3+) and Fe(2+) --> Ti(4+)), commonly considered responsible for the important features of the blue sapphire absorption spectra. We therefore developed a method to evaluate as accurately as possible the autoabsorption effects and the secondary excitation effects, which are frequent sources of substantial error in quantitative EDXRF analysis.
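
    One standard way to model the autoabsorption ("self-absorption") effect the authors correct for is the finite-thickness XRF absorption factor, in which attenuation along the incident and take-off paths is combined into a single coefficient chi. The sketch below uses illustrative attenuation coefficients and geometry, not values from the paper.

```python
import math

# Self-absorption factor for a sample of mass thickness rho*t:
# the measured fluorescence is attenuated by A = (1 - exp(-x)) / x,
# where x = chi * rho * t and chi combines the mass attenuation
# coefficients along the incident and take-off beam paths.

def absorption_factor(mu_in, mu_out, psi_in_deg, psi_out_deg, rho_t):
    chi = (mu_in / math.sin(math.radians(psi_in_deg))
           + mu_out / math.sin(math.radians(psi_out_deg)))
    x = chi * rho_t
    return (1.0 - math.exp(-x)) / x

A = absorption_factor(mu_in=50.0, mu_out=120.0,   # cm^2/g, illustrative
                      psi_in_deg=45.0, psi_out_deg=45.0,
                      rho_t=0.01)                 # g/cm^2, illustrative
```

    A approaches 1 for a vanishingly thin sample and falls toward 0 as the sample thickens, which is exactly the error source the paper's correction targets.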

  19. Quantitative analysis of trace metal accumulation in teeth using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Samek, O.; Beddows, D. C. S.; Telle, H. H.; Morris, G. W.; Liska, M.; Kaiser, J.

    The technique of laser ablation is receiving increasing attention for applications in dentistry, specifically for the treatment of teeth (e.g. drilling of micro-holes and plaque removal). In the process of ablation a luminous micro-plasma is normally generated which may be exploited for elemental analysis. Here we report on quantitative Laser-Induced Breakdown Spectroscopy (LIBS) analysis to study the presence of trace minerals in teeth. A selection of teeth from different age groups has been investigated, ranging from the first teeth of infants, through the second teeth of children, to those of adults, to trace the influence of environmental factors on the accumulation of a number of elements in teeth. We found a close link between the elements detected in tooth fillings and toothpastes and those present in teeth.

  20. Fluorescent proteins for quantitative microscopy: important properties and practical evaluation.

    PubMed

    Shaner, Nathan Christopher

    2014-01-01

    More than 20 years after their discovery, fluorescent proteins (FPs) continue to be the subject of massive engineering efforts yielding continued improvements. Among these efforts are many aspects that should be of great interest to quantitative imaging users. With new variants frequently introduced into the research community, "tried and true" FPs that have been relied on for many years may now be due for upgrades to more modern variants. However, the dizzying array of FPs now available can make the initial act of narrowing down the potential choices an intimidating prospect. This chapter describes the FP properties that most strongly impact their performance in quantitative imaging experiments, along with their physical origins as they are currently understood. A workflow for evaluating a given FP in the researcher's chosen experimental system (e.g., a specific cell line) is described.

  1. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  2. Single cell elemental analysis using nuclear microscopy

    NASA Astrophysics Data System (ADS)

    Ren, M. Q.; Thong, P. S. P.; Kara, U.; Watt, F.

    1999-04-01

    The use of Particle Induced X-ray Emission (PIXE), Rutherford Backscattering Spectrometry (RBS) and Scanning Transmission Ion Microscopy (STIM) to provide quantitative elemental analysis of single cells is an area which has high potential, particularly when trace elements such as Ca, Fe, Zn and Cu can be monitored. We describe the methodology of sample preparation for two cell types, the procedures of cell imaging using STIM, and the quantitative elemental analysis of single cells using RBS and PIXE. Recent work on single cells at the Nuclear Microscopy Research Centre, National University of Singapore has centred around two research areas: (a) apoptosis (programmed cell death), which has recently been implicated in a wide range of pathological conditions such as cancer and Parkinson's disease, and (b) malaria (infection of red blood cells by the malaria parasite). Firstly we present results on the elemental analysis of human Chang liver cells (ATCC CCL-13), where vanadium ions were used to trigger apoptosis, and demonstrate that nuclear microscopy has the capability of monitoring vanadium loading within individual cells. Secondly we present the results of elemental changes taking place in individual mouse red blood cells which have been infected with the malaria parasite and treated with the anti-malaria drug Qinghaosu (QHS).

  3. Response of hot element flush wall gauges in oscillating laminar flow

    NASA Technical Reports Server (NTRS)

    Giddings, T. A.; Cook, W. J.

    1986-01-01

    The time dependent response characteristics of flush-mounted hot element gauges used as instruments to measure wall shear stress in unsteady periodic air flows were investigated. The study was initiated because anomalous results were obtained from the gauges in oscillating turbulent flows for the phase relation of the wall shear stress variation, indicating possible gauge response problems. Experiments were performed on flat plate laminar oscillating flows characterized by a mean free stream velocity with a superposed sinusoidal variation. Laminar rather than turbulent flows were studied because a numerical solution for the phase angle between the free stream velocity and the wall shear stress variation that is known to be correct can be obtained. The focus is on comparing the phase angle indicated by the hot element gauges with the corresponding numerical prediction for the phase angle, since agreement would indicate that the hot element gauges faithfully follow the true wall shear stress variation.

  4. Does peripheral quantitative computed tomography ignore tissue density of cancellous bone?

    PubMed

    Banse, X; Devogelaer, J P

    2002-01-01

    The purpose of this work was to determine the capacity of peripheral quantitative computed tomography (pQCT) to accurately measure the true physical properties of vertebral cancellous bone samples and to predict their stiffness. pQCT bone mineral density (BMD) was first measured in ideal conditions. Ten cubic specimens of vertebral cancellous bone (10 x 10 x 10 mm) were washed with a water jet, defatted, and scanned in saline after elimination of air bubbles; thirteen slices were obtained. Seventy-one unprepared cylindrical samples were scanned in more realistic conditions, which allow further biomechanical testing. After extraction from the vertebral body, the samples were pushed into a plastic tube (no effort was made to remove the marrow or air bubbles), and only four slices were obtained to reduce the duration of the scan. For the 81 samples, the true bone volume fraction (BV/TV, %), true apparent density (rho(app), g/cm(3)), and tissue density (rho(tiss), g/cm(3)) (an indicator of the degree of mineralization of the matrix) were then measured using Archimedes' principle. rho(app) was closely correlated to BV/TV (r(2) = 0.97). rho(tiss) (1.58 +/- 0.08 g/cm(3)) was almost constant but had some influence on rho(app) (r(2) = 0.03, p < 0.001). The pQCT BMD accurately predicted rho(app) (r(2) = 0.96) and BV/TV (r(2) = 0.93) for the cylinders. For the cubes, in ideal conditions, the same correlations were even better (r(2) > 0.99 for both). Analysis of covariance indicated no difference (p > 0.05) in the regressions due to preparation of the samples. The stiffness was better predicted by the true rho(app) (r(2) = 0.87) than by BV/TV (r(2) = 0.83), indicating that stiffness was influenced by small differences in tissue density. Consequently, the correlation between pQCT BMD and stiffness was excellent (r(2) = 0.84). The fact that pQCT did not ignore this tissue density information compensated for the inaccuracies linked to realistic scanning conditions of the cylinders.
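
    The three Archimedes-derived quantities in this study are linked by a one-line identity: apparent density is bone mass over total specimen volume, tissue density is bone mass over bone volume, so BV/TV = rho(app)/rho(tiss). A worked example using the reported mean tissue density (the bone mass is illustrative):

```python
# Relation between apparent density, tissue density and bone volume
# fraction for a cancellous bone cube. Bone mass is illustrative;
# tissue density uses the study's reported mean of 1.58 g/cm^3.

bone_mass_g = 0.20          # illustrative
total_volume_cm3 = 1.0      # a 10 x 10 x 10 mm cube
tissue_density = 1.58       # g/cm^3, mean value reported above

rho_app = bone_mass_g / total_volume_cm3          # apparent density, g/cm^3
bv_tv_percent = 100.0 * rho_app / tissue_density  # bone volume fraction, %
```

    This identity is why rho(app) tracks BV/TV so closely (r(2) = 0.97) while the nearly constant tissue density contributes only a small residual effect.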

  5. In Search of the True Universe

    NASA Astrophysics Data System (ADS)

    Harwit, Martin

    2014-01-01

    1. The nineteenth century's last five years; Part I. The Import of Theoretical Tools: 2. An overview; 3. Conclusions based on principles; 4. Conclusions based on a premise; 5. Conclusions based on calculations; 6. Asking the right questions, accepting limited answers; Part II. A National Plan Shaping the Universe We Perceive: 7. A new order and the new universe it produced; 8. Where did the chemical elements arise?; 9. Landscapes; 10. The evolution of astrophysical theory after 1960; 11. Turmoils of leadership; 12. Cascades and shocks that shape astrophysics; 13. Astrophysical discourse and persuasion; Part III. The Cost of Discerning the True Universe: 14. Organization and functioning of the astronomical community; 15. Language and astrophysical stability; 16. An economically viable astronomical program; Epilogue.

  7. Comparing the Effect of Blogging as well as Pen-and-Paper on the Essay Writing Performance of Iranian Graduate Students

    ERIC Educational Resources Information Center

    Kashani, Hajar; Mahmud, Rosnaini Binti; Kalajahi, Seyed Ali Rezvani

    2013-01-01

    In today's world, there are lots of methods in language teaching in general and teaching writing in particular. Using two different tools in writing essays and conducting a study to compare the effectiveness of these two tools namely blog and pen-and-paper was the basis of this study. This study used a quantitative true experimental design aimed…

  8. Understanding Crystal Populations; Looking Towards 3D Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Jerram, D. A.; Morgan, D. J.

    2010-12-01

    In order to understand volcanic systems, the potential record held within crystal populations needs to be revealed. It is becoming increasingly clear, however, that the crystal populations that arrive at the surface in volcanic eruptions are commonly mixtures of crystals, which may be representative of simple crystallization, recycling of crystals and incorporation of alien crystals. If we can quantify the true 3D population within a sample then we will be able to separate crystals with different histories and begin to interrogate the true and complex plumbing within the volcanic system. Modeling crystal populations is one area where we can investigate the best methodologies to use when dealing with sections through 3D populations. By producing known 3D shapes and sizes with virtual textures and looking at the statistics of shape and size when such populations are sectioned, we are able to gain confidence about what our 2D information is telling us about the population. We can also use this approach to test the size of population we need to analyze. 3D imaging, through serial sectioning or x-ray CT, provides a complete 3D quantification of a rock's texture. Individual phases can be identified and, in principle, the true 3D statistics of the population can be interrogated. In practice we need to develop strategies (as with 2D-3D transformations) that enable a true characterization of the 3D data, and an understanding of the errors and pitfalls that exist. Ultimately, the reproduction of true 3D textures, and the wealth of information they hold, is now within our reach.
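
    The simplest version of the 2D-3D sectioning problem discussed above can be simulated directly: a random planar section through a sphere of radius R at distance d from the centre exposes a circle of radius sqrt(R^2 - d^2), so apparent 2D sizes systematically understate the true 3D size. A minimal Monte Carlo sketch, assuming monodisperse spheres and a uniformly distributed cut distance:

```python
import random

# Monte Carlo sectioning of monodisperse spheres: each random plane at
# distance d (uniform on [0, R]) from the centre shows a circle of
# radius sqrt(R^2 - d^2). The mean 2D radius is pi*R/4 (about 0.785 R),
# i.e. sections under-sample the true 3D size.

def section_radii(true_radius, n, seed=0):
    rng = random.Random(seed)
    radii = []
    for _ in range(n):
        d = rng.uniform(0.0, true_radius)
        radii.append((true_radius ** 2 - d ** 2) ** 0.5)
    return radii

radii = section_radii(true_radius=1.0, n=100_000)
mean_2d = sum(radii) / len(radii)
```

    Stereological corrections of this kind, extended to realistic shapes and polydisperse populations, are what allow 2D thin-section data to constrain the true 3D crystal size distribution.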

  9. Functional dissection of a napin gene promoter: identification of promoter elements required for embryo and endosperm-specific transcription.

    PubMed

    Ellerström, M; Stålberg, K; Ezcurra, I; Rask, L

    1996-12-01

    The promoter region (-309 to +44) of the Brassica napus storage protein gene napA was studied in transgenic tobacco by successive 5' as well as internal deletions fused to the reporter gene GUS (beta-glucuronidase). The expression in the two main tissues of the seed, the endosperm and the embryo, was shown to be differentially regulated. This tissue-specific regulation within the seed was found to affect the developmental expression during seed development. The region between -309 and -152, which has a large effect on quantitative expression, was shown to harbour four elements regulating embryo expression and one regulating endosperm expression. This region also displayed enhancer activity. Deletion of eight bp from position -152 to position -144 totally abolished the activity of the napA promoter. This deletion disrupted a cis element with similarity to an ABA-responsive element (ABRE) overlapping with an E-box, demonstrating its crucial importance for quantitative expression. An internal deletion of the region -133 to -120 resulted in increased activity in both leaves and endosperm and a decreased activity in the embryo. Within this region, a cis element similar to the (CA)n element, found in other storage protein promoters, was identified. This suggests that the (CA)n element is important for conferring seed specificity by serving both as an activator and a repressor element.

  10. A multi-element screening method to identify metal targets for blood biomonitoring in green sea turtles (Chelonia mydas).

    PubMed

    Villa, C A; Finlayson, S; Limpus, C; Gaus, C

    2015-04-15

    Biomonitoring of blood is commonly used to identify and quantify occupational or environmental exposure to chemical contaminants. Increasingly, this technique has been applied to wildlife contaminant monitoring, including for green turtles, allowing for the non-lethal evaluation of chemical exposure in their nearshore environment. The sources, composition, bioavailability and toxicity of metals in the marine environment are, however, often unknown and influenced by numerous biotic and abiotic factors. These factors can vary considerably across time and space making the selection of the most informative elements for biomonitoring challenging. This study aimed to validate an ICP-MS multi-element screening method for green turtle blood in order to identify and facilitate prioritisation of target metals for subsequent fully quantitative analysis. Multi-element screening provided semiquantitative results for 70 elements, 28 of which were also determined through fully quantitative analysis. Of the 28 comparable elements, 23 of the semiquantitative results had an accuracy between 67% and 112% relative to the fully quantified values. In lieu of any available turtle certified reference materials (CRMs), we evaluated the use of human blood CRMs as a matrix surrogate for quality control, and compared two commonly used sample preparation methods for matrix related effects. The results demonstrate that human blood provides an appropriate matrix for use as a quality control material in the fully quantitative analysis of metals in turtle blood. An example for the application of this screening method is provided by comparing screening results from blood of green turtles foraging in an urban and rural region in Queensland, Australia. Potential targets for future metal biomonitoring in these regions were identified by this approach.

  11. Retrieval analysis of different orthodontic brackets: the applicability of electron microprobe techniques for determining material heterogeneities and corrosive potential

    PubMed Central

    HOLST, Alexandra Ioana; HOLST, Stefan; HIRSCHFELDER, Ursula; von SECKENDORFF, Volker

    2012-01-01

    Objective The objective of this study was to investigate the applicability of micro-analytical methods with high spatial resolution to the characterization of the composition and corrosion behavior of two bracket systems. Material and methods The surfaces of six nickel-free brackets and six nickel-containing brackets were examined for signs of corrosion and subjected to qualitative surface analysis using an electron probe microanalyzer (EPMA), prior to bonding to patients' tooth surfaces and four months after clinical use. The surfaces were characterized qualitatively by secondary electron (SE) images and back scattered electron (BSE) images in both compositional and topographical mode. Qualitative and quantitative wavelength-dispersive analyses were performed for different elements, and by utilizing qualitative analysis the relative concentration of selected elements was mapped two-dimensionally. The absolute concentration of the elements was determined in specially prepared brackets by quantitative analysis using pure element standards for calibration and calculating correction factors (ZAF). Results Clear differences were observed between the different bracket types. The nickel-containing stainless steel brackets consist of two separate pieces joined by a brazing alloy. Compositional analysis revealed two different alloy compositions, and reaction zones on both sides of the brazing alloy. The nickel-free bracket was a single piece with only slight variation in element concentration, but had a significantly rougher surface. After clinical use, no corrosive phenomena were detectable with the methods applied. Traces of intraoral wear at the contact areas between the bracket slot and the arch wire were verified. Conclusion Electron probe microanalysis is a valuable tool for the characterization of element distribution and quantitative analysis for corrosion studies. PMID:23032212

  12. Plasticity first: molecular signatures of a complex morphological trait in filamentous cyanobacteria.

    PubMed

    Koch, Robin; Kupczok, Anne; Stucken, Karina; Ilhan, Judith; Hammerschmidt, Katrin; Dagan, Tal

    2017-08-31

    Filamentous cyanobacteria that differentiate multiple cell types are considered the peak of prokaryotic complexity and their evolution has been studied in the context of the origins of multicellularity. Species that form true-branching filaments exemplify the most complex cyanobacteria. However, the mechanisms underlying the true-branching morphology remain poorly understood despite several investigations that focused on the identification of novel genes or pathways. An alternative route for the evolution of novel traits is based on existing phenotypic plasticity. According to that scenario, termed genetic assimilation, the fixation of a novel phenotype precedes the fixation of the genotype. Here we show that the evolution of transcriptional regulatory elements constitutes a major mechanism for the evolution of new traits. We found that supplementation with sucrose reconstitutes the ancestral branchless phenotype of two true-branching Fischerella species and compared the transcription start sites (TSSs) between the two phenotypic states. Our analysis uncovers several orthologous TSSs whose transcription level is correlated with the true-branching phenotype. These TSSs are found in genes that encode components of the septosome and elongasome (e.g., fraC and mreB). The concept of genetic assimilation supplies a tenable explanation for the evolution of novel traits, but testing its feasibility is hindered by the inability to recreate and study the evolution of present-day traits. We present a novel approach to examine transcription data for the plasticity-first route and provide evidence for its occurrence during the evolution of complex colony morphology in true-branching cyanobacteria. Our results reveal a route for the evolution of the true-branching phenotype in cyanobacteria via modification of the transcription level of pre-existing genes. Our study supplies evidence for the 'plasticity-first' hypothesis and highlights the importance of transcriptional regulation in the evolution of novel traits.

  13. Simplex and duplex event-specific analytical methods for functional biotech maize.

    PubMed

    Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young

    2009-08-26

    Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems or living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis for biotech maize event 3272 and LY 038 on the basis of the 3' flanking regions, respectively. The qualitative primers confirmed the specificity by a single PCR product and sensitivity to 0.05% as a limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and two event-specific 3' flanking DNA sequences of event 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. Limits of quantitation (LOQs) of the quantitative methods were all 0.1% for simplex real-time PCRs of event 3272 and LY 038 and 0.5% for duplex real-time PCR of LY 038. This study reports that event-specific analytical methods were applicable for qualitative and quantitative analysis for biotech maize event 3272 and LY 038.
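
    The TaqMan quantification step described above can be sketched as a standard-curve inversion: Ct is linear in log10(copy number), so event-specific and taxon-specific copies are read off their respective curves and the GM content is their ratio. The curve parameters and Ct values below are illustrative placeholders, not data from the paper.

```python
# Standard-curve quantification for event-specific real-time PCR.
# Curve parameters (slope, intercept) and Ct values are ILLUSTRATIVE.

def copies_from_ct(ct, slope, intercept):
    # Invert the standard curve  Ct = slope * log10(copies) + intercept.
    return 10 ** ((ct - intercept) / slope)

# A slope near -3.32 corresponds to ~100% PCR efficiency.
event_copies = copies_from_ct(ct=31.2, slope=-3.32, intercept=40.0)
taxon_copies = copies_from_ct(ct=24.5, slope=-3.35, intercept=40.2)

# GM content as the event/taxon copy-number ratio.
gm_percent = 100.0 * event_copies / taxon_copies
```

    A validation like the one reported above would then compare such estimates against gravimetric mixing levels and require the bias to stay within +/-30% of the true value.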

  14. Effects of Scan Resolutions and Element Sizes on Bovine Vertebral Mechanical Parameters from Quantitative Computed Tomography-Based Finite Element Analysis

    PubMed Central

    Zhang, Meng; Gao, Jiazi; Huang, Xu; Zhang, Min; Liu, Bei

    2017-01-01

    Quantitative computed tomography-based finite element analysis (QCT/FEA) has been developed to predict vertebral strength. However, QCT/FEA models may differ with scan resolution and element size. The aim of this study was to explore the effects of scan resolutions and element sizes on QCT/FEA outcomes. Nine bovine vertebral bodies were scanned using a clinical CT scanner and reconstructed from datasets at two slice thicknesses, 0.6 mm (PA resolution) and 1 mm (PB resolution). There were significant linear correlations between the predicted and measured principal strains (R2 > 0.7, P < 0.0001), and the predicted vertebral strength and stiffness were modestly correlated with the experimental values (R2 > 0.6, P < 0.05). Two different resolutions and six different element sizes were combined in pairs, and finite element (FE) models of bovine vertebral cancellous bones in the 12 cases were obtained. It showed that the mechanical parameters of FE models with the PB resolution were similar to those with the PA resolution. The computational accuracy of FE models with the element sizes of 0.41 × 0.41 × 0.6 mm3 and 0.41 × 0.41 × 1 mm3 was higher, as judged by comparing the apparent elastic modulus and yield strength. Therefore, scan resolution and element size should be chosen optimally to improve the accuracy of QCT/FEA. PMID:29065624

  15. On the representative volume element of asphalt concrete at low temperature

    NASA Astrophysics Data System (ADS)

    Marasteanu, Mihai; Cannone Falchetto, Augusto; Velasquez, Raul; Le, Jia-Liang

    2016-08-01

    The feasibility of characterizing asphalt mixtures' rheological and failure properties at low temperatures by means of the Bending Beam Rheometer (BBR) is investigated in this paper. The main issue is the use of thin beams of asphalt mixture in experimental procedures that may not capture the true behavior of the material used to construct an asphalt pavement.

  16. Potential effects of acid precipitation on soils in the humid temperate zone

    Treesearch

    C. R. Frink; G. K. Voigt

    1976-01-01

    Acid precipitation is not a new phenomenon. As long as water has fallen on the surface of the earth it has probably contained varying amounts of oxides of carbon, nitrogen and sulfur that increase hydrogen ion activity. This was certainly true when volcanism prevailed. With the appearance of life spasmodic geologic expulsions of elements into the atmosphere were...

  17. Fearless Vampire Kissers: Bloodsuckers We Love in "Twilight," "True Blood" and Others

    ERIC Educational Resources Information Center

    Beck, Bernard

    2011-01-01

    The figure of the vampire has been an important element of popular culture for more than a century. The movies have been a home for vampire stories, and they have presented them as unusually frightening images. A recent explosion of vampire screen works reveals a new emphasis on addressing female issues as opposed to male issues and focusing on…

  18. Moving along the number line: operational momentum in nonsymbolic arithmetic.

    PubMed

    McCrink, Koleen; Dehaene, Stanislas; Dehaene-Lambertz, Ghislaine

    2007-11-01

    Can human adults perform arithmetic operations with large approximate numbers, and what effect, if any, does an internal spatial-numerical representation of numerical magnitude have on their responses? We conducted a psychophysical study in which subjects viewed several hundred short videos of sets of objects being added or subtracted from one another and judged whether the final numerosity was correct or incorrect. Over a wide range of possible outcomes, the subjects' responses peaked at the approximate location of the true numerical outcome and gradually tapered off as a function of the ratio of the true and proposed outcomes (Weber's law). Furthermore, an operational momentum effect was observed, whereby addition problems were overestimated and subtraction problems were underestimated. The results show that approximate arithmetic operates according to precise quantitative rules, perhaps analogous to those characterizing movement on an internal continuum.

  19. Spectrochemical analysis of powdered biological samples using transversely excited atmospheric carbon dioxide laser plasma excitation

    NASA Astrophysics Data System (ADS)

    Zivkovic, Sanja; Momcilovic, Milos; Staicu, Angela; Mutic, Jelena; Trtica, Milan; Savovic, Jelena

    2017-02-01

    The aim of this study was to develop a simple laser induced breakdown spectroscopy (LIBS) method for quantitative elemental analysis of powdered biological materials based on laboratory prepared calibration samples. The analysis was done using ungated single pulse LIBS in ambient air at atmospheric pressure. A transversely excited atmospheric pressure (TEA) CO2 laser was used as the energy source for plasma generation on the samples. The material used for the analysis was the blue-green alga Spirulina, widely used in the food and pharmaceutical industries and also in a few biotechnological applications. To demonstrate the analytical potential of this particular LIBS system, the obtained spectra were compared to spectra obtained using a commercial LIBS system based on a pulsed Nd:YAG laser. A single sample of known concentration was used to estimate detection limits for Ba, Ca, Fe, Mg, Mn, Si and Sr and to compare the detection power of these two LIBS systems. The TEA CO2 laser based LIBS was also applied to quantitative analysis of the elements in powdered Spirulina samples. Analytical curves for Ba, Fe, Mg, Mn and Sr were constructed using laboratory produced matrix-matched calibration samples. Inductively coupled plasma optical emission spectroscopy (ICP-OES) was used as the reference technique for elemental quantification, and reasonably good agreement between ICP and LIBS data was obtained. The results confirm that, with respect to its sensitivity and precision, TEA CO2 laser based LIBS can be successfully applied to quantitative analysis of macro- and micro-elements in algal samples. The fact that nearly all classes of materials can be prepared as powders implies that the proposed method could be easily extended to quantitative analysis of different kinds of materials, organic, biological or inorganic.
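
    The calibration-curve step described above amounts to an ordinary least-squares fit of line intensity against the ICP-OES reference concentrations, followed by inverting the fitted line for unknown samples. A minimal sketch with illustrative data points (not the paper's measurements):

```python
# Least-squares calibration line for LIBS: intensity vs. reference
# concentration from matrix-matched standards. Data are ILLUSTRATIVE.

conc = [10.0, 25.0, 50.0, 100.0, 200.0]            # ppm (ICP-OES reference)
intensity = [120.0, 290.0, 570.0, 1150.0, 2260.0]  # arbitrary units

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(intensity) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, intensity))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

def predict_ppm(signal):
    # Invert the calibration line to quantify an unknown sample.
    return (signal - intercept) / slope

unknown_ppm = predict_ppm(850.0)
```

    In a full treatment the residual scatter of the standards also yields the detection limit (commonly taken as three times the background standard deviation divided by the slope), which is how the two LIBS systems above could be compared.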

  20. 78 FR 76658 - Report on the Selection of Eligible Countries for Fiscal Year 2014

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... particular indicators. Where appropriate, the Board took into account additional quantitative and qualitative... previous years, a number of countries that performed well on the quantitative elements of the selection... caused by historical data revisions or methodological changes from the indicator institutions. In neither...

  1. Using ICPSR Resources to Teach Sociology

    ERIC Educational Resources Information Center

    Hoelter, Lynette F.; LeClere, Felicia B.; Pienta, Amy M.; Barlow, Rachael E.; McNally, James W.

    2008-01-01

    The focus on quantitative literacy has been increasingly outside the realm of mathematics. The social sciences are well suited to including quantitative elements throughout the curriculum but doing so can mean challenges in preparation and presentation of material for instructors and increased anxiety for students. This paper describes tools and…

  2. Cytotype Control of Drosophila Melanogaster P Element Transposition: Genomic Position Determines Maternal Repression

    PubMed Central

    Misra, S.; Buratowski, R. M.; Ohkawa, T.; Rio, D. C.

    1993-01-01

    P element transposition in Drosophila is controlled by the cytotype regulatory state: in P cytotype, transposition is repressed, whereas in M cytotype, transposition can occur. P cytotype is determined by a combination of maternally inherited factors and chromosomal P elements in the zygote. Transformant strains containing single elements that encoded the 66-kD P element protein zygotically repressed transposition, but did not display the maternal repression characteristic of P cytotype. Upon mobilization to new genomic positions, some of these repressor elements showed significant maternal repression of transposition in genetic assays, involving a true maternal effect. Thus, the genomic position of repressor elements can determine the maternal vs. zygotic inheritance of P cytotype. Immunoblotting experiments indicate that this genomic position effect does not operate solely by controlling the expression level of the 66-kD repressor protein during oogenesis. Likewise, P element derivatives containing the hsp26 maternal regulator sequence expressed high levels of the 66-kD protein during oogenesis, but showed no detectable maternal repression. These data suggest that the location of a repressor element in the genome may determine maternal inheritance of P cytotype by a mechanism involving more than the overall level of expression of the 66-kD protein in the ovary. PMID:8293979

  3. Implementing online quantitative support modules in an intermediate-level course

    NASA Astrophysics Data System (ADS)

    Daly, J.

    2011-12-01

    While instructors typically anticipate that students in introductory geology courses enter a class with a wide range of quantitative ability, we often overlook the fact that this may also be true in upper-level courses. Some students are drawn to the subject and experience success in early courses with an emphasis on descriptive geology, then experience frustration and disappointment in mid- and upper-level courses that are more quantitative. To bolster student confidence in quantitative skills and enhance their performance in an upper-level course, I implemented several modules from The Math You Need (TMYN) online resource with a 200-level geomorphology class. Student facility with basic quantitative skills (rearranging equations, manipulating units, and graphing) was assessed with an online pre- and post-test. During the semester, modules were assigned to complement existing course activities (for example, the module on manipulating units was assigned prior to a lab on measurement of channel area and water velocity, then calculation of discharge). The implementation was designed to be a concise review of relevant skills for students with higher confidence in their quantitative abilities, and to provide a self-paced opportunity for students with less quantitative facility to build skills. This course already includes a strong emphasis on quantitative data collection, analysis, and presentation; in the past, student performance in the course has been strongly influenced by their individual quantitative ability. I anticipate that giving students the opportunity to improve mastery of fundamental quantitative skills will improve their performance on higher-stakes assignments and exams, and will enhance their sense of accomplishment in the course.

  4. Application of Laser Induced Breakdown Spectroscopy to the identification of emeralds from different synthetic processes

    NASA Astrophysics Data System (ADS)

    Agrosì, G.; Tempesta, G.; Scandale, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Palleschi, V.; Mangone, A.; Lezzerini, M.

    2014-12-01

Laser Induced Breakdown Spectroscopy can make a useful contribution in mineralogy, where quantitative chemical analyses (including the evaluation of light elements) play a key role in studies of the origin of emeralds. In particular, chemical analyses can determine those trace elements, known as fingerprints, that are useful for establishing provenance. Because it requires no sample preparation, the technique is particularly suitable for gemstones, which obviously must be studied non-destructively. In this paper, the LIBS technique was applied to distinguish synthetic emeralds grown by the Biron hydrothermal method from those grown by the Chatham flux method. The analyses, performed by collinear double-pulse LIBS, give a signal enhancement useful for quantitative chemical analysis while guaranteeing minimal sample damage. In this way a considerable improvement was obtained in the detection limit for the trace elements, whose determination is essential for establishing the origin of an emerald gemstone. The trace elements V, Cr, and Fe and their relative amounts allowed correct attribution of the manufacturer. Two different methods of quantitative analysis were used for this study: the standard Calibration-Free LIBS (CF-LIBS) method and its recent evolution, One Point Calibration LIBS (OPC-LIBS). This is the first approach to the evaluation of emerald origin by means of the LIBS technique.

  5. True-3D Accentuating of Grids and Streets in Urban Topographic Maps Enhances Human Object Location Memory

    PubMed Central

    Edler, Dennis; Bestgen, Anne-Kathrin; Kuchinke, Lars; Dickmann, Frank

    2015-01-01

Cognitive representations of learned map information are subject to systematic distortion errors. Map elements that divide a map surface into regions, such as content-related linear symbols (e.g. streets, rivers, railway systems) or additional artificial layers (coordinate grids), provide an orientation pattern that can help users to reduce distortions in their mental representations. In recent years, the television industry has started to establish True-3D (autostereoscopic) displays as mass media. These modern displays make it possible to watch dynamic and static images including depth illusions without additional devices, such as 3D glasses. In these images, visual details can be distributed over different positions along the depth axis. Some empirical studies in vision research have provided initial evidence that 3D stereoscopic content attracts more attention and is processed faster. So far, the impact of True-3D accentuating has not been explored for spatial memory tasks in cartography. This paper reports the results of two empirical studies investigating whether True-3D accentuating of artificial, regular overlaying line features (i.e. grids) and content-related, irregular line features (i.e. highways and main streets) in official urban topographic maps (scale 1:10,000) further improves human object location memory performance. Memory performance is measured as both the percentage of correctly recalled object locations (hit rate) and the mean distances of correctly recalled objects (spatial accuracy). It is shown that True-3D accentuating of grids (depth offset: 5 cm) significantly enhances the spatial accuracy of recalled map object locations, whereas True-3D emphasis of streets significantly improves the hit rate of recalled map object locations. These results show the potential of True-3D displays for improving the cognitive representation of learned cartographic information. PMID:25679208

  6. The Combined Quantification and Interpretation of Multiple Quantitative Magnetic Resonance Imaging Metrics Enlightens Longitudinal Changes Compatible with Brain Repair in Relapsing-Remitting Multiple Sclerosis Patients.

    PubMed

    Bonnier, Guillaume; Maréchal, Benedicte; Fartaria, Mário João; Falkowskiy, Pavel; Marques, José P; Simioni, Samanta; Schluep, Myriam; Du Pasquier, Renaud; Thiran, Jean-Philippe; Krueger, Gunnar; Granziera, Cristina

    2017-01-01

Quantitative and semi-quantitative MRI (qMRI) metrics provide complementary specificity and differential sensitivity to pathological brain changes compatible with brain inflammation, degeneration, and repair. Moreover, advanced magnetic resonance imaging (MRI) metrics with overlapping elements amplify the true tissue-related information and limit measurement noise. In this work, we combined multiple advanced MRI parameters to assess focal and diffuse brain changes over 2 years in a group of early-stage relapsing-remitting MS patients. Thirty relapsing-remitting MS patients with less than 5 years disease duration and nine healthy subjects underwent 3T MRI at baseline and after 2 years, including T1, T2, and T2* relaxometry and magnetization transfer imaging. To assess longitudinal changes in normal-appearing (NA) tissue and lesions, we used analyses of variance with Bonferroni correction for multiple comparisons. Multivariate linear regression was used to assess the correlation between clinical outcome and multiparametric MRI changes in lesions and NA tissue. In patients, we measured a significant longitudinal decrease of mean T2 relaxation times in NA white matter (p = 0.005) and a decrease of T1 relaxation times in the pallidum (p < 0.05), which are compatible with edema reabsorption and/or iron deposition. No longitudinal changes in qMRI metrics were observed in controls. In MS lesions, we measured a decrease in T1 relaxation time (p < 2.2e-16) and a significant increase in MTR (p < 1e-6), suggesting repair mechanisms such as remyelination, increased axonal density, and/or gliosis. Last, the evolution of advanced MRI metrics, and not changes in lesions or brain volume, was correlated with the evolution of motor and cognitive test scores (adjusted R² > 0.4, p < 0.05). In summary, the combination of multiple advanced MRI metrics provided evidence of changes compatible with focal and diffuse brain repair at early MS stages, as suggested by histopathological studies.

  7. Quantitative Relationships Between Net Volume Change and Fabric Properties During Soil Evolution

    NASA Technical Reports Server (NTRS)

    Chadwick, O. A.; Nettleton, W. D.

    1993-01-01

The state of soil evolution can be charted by net long-term volume and elemental mass changes for individual horizons compared with parent material. Volume collapse or dilation depends on relative elemental mass fluxes associated with losses from or additions to soil horizons.

  8. Optimization of a shorter variable-acquisition time for legs to achieve true whole-body PET/CT images.

    PubMed

    Umeda, Takuro; Miwa, Kenta; Murata, Taisuke; Miyaji, Noriaki; Wagatsuma, Kei; Motegi, Kazuki; Terauchi, Takashi; Koizumi, Mitsuru

    2017-12-01

The present study aimed to qualitatively and quantitatively evaluate PET images as a function of acquisition time for various leg sizes, and to optimize a shorter variable-acquisition time protocol for legs to achieve better qualitative and quantitative accuracy of true whole-body PET/CT images. The diameters of legs to be modeled as phantoms were defined based on data derived from 53 patients. This study analyzed PET images of a NEMA phantom and three plastic bottle phantoms (diameter, 5.68, 8.54 and 10.7 cm) that simulated the human body and legs, respectively. The phantoms comprised two spheres (diameters, 10 and 17 mm) containing fluorine-18 fluorodeoxyglucose solution with sphere-to-background ratios of 4 at a background radioactivity level of 2.65 kBq/mL. All PET data were reconstructed with acquisition times ranging from 10 to 180 s, and 1200 s. We visually evaluated image quality, determined the coefficient of variance (CV) of the background and the contrast and quantitative %error of the hot spheres, and then determined two shorter variable-acquisition protocols for legs. Lesion detectability and quantitative accuracy, determined based on maximum standardized uptake values (SUVmax) in PET images of a patient using the proposed protocols, were also evaluated. A larger phantom and a shorter acquisition time resulted in increased background noise on images and decreased contrast in the hot spheres. A visual score of ≥ 1.5 was obtained when the acquisition time was ≥ 30 s for the three leg phantoms, and ≥ 120 s for the NEMA phantom. The quantitative %errors of the 10- and 17-mm spheres in the leg phantoms were within ± 15 and ± 10%, respectively, in PET images with a high CV (scan < 30 s). The mean SUVmax of three lesions using the current fixed-acquisition and the two proposed variable-acquisition time protocols in the clinical study were 3.1, 3.1 and 3.2, respectively, which did not significantly differ. A leg acquisition time per bed position of even 30-90 s allows axial equalization, uniform image noise and a maximum ± 15% quantitative accuracy for the smallest lesion. The overall acquisition time was reduced by 23-42% using the proposed shorter variable-acquisition time protocols rather than the current fixed-acquisition time for imaging legs, indicating that this is a useful and practical protocol for routine qualitative and quantitative PET/CT assessment in the clinical setting.
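The two image-quality metrics named in this abstract, the background CV and the quantitative %error of a hot sphere, reduce to short formulas. A minimal sketch; the function names and the sample numbers are illustrative, not taken from the study:

```python
import statistics

def background_cv(roi_values):
    """Coefficient of variance (CV) of background ROI values, in percent."""
    mean = statistics.mean(roi_values)
    return 100.0 * statistics.stdev(roi_values) / mean

def quantitative_error(measured_activity, true_activity):
    """Signed quantitative %error of a hot sphere versus its known activity."""
    return 100.0 * (measured_activity - true_activity) / true_activity

# Illustrative numbers: a background near 2.65 kBq/mL with noise, and a
# hot sphere whose recovered activity slightly exceeds the filled value.
bg = [2.51, 2.70, 2.62, 2.80, 2.58]
print(round(background_cv(bg), 1))          # CV of the background, percent
print(round(quantitative_error(11.2, 10.6), 1))
```

A shorter scan raises the background CV; the study's protocols are chosen so that the %error stays within the stated ± 15% bound for the smallest sphere.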

  9. Mapping Metal Elements of Shuangbai Dinosaur Fossil by Synchrotron X-ray Fluorescence Microprobe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y.; Qun, Y; Ablett, J

The mapping of metal elements in a Shuangbai dinosaur fossil was obtained by synchrotron x-ray fluorescence (SXRF). Eight elements, Ca, Mn, Fe, Cu, Zn, As, Y and Sr, were determined; As and Y were detected in a dinosaur fossil for the first time. The data indicated that the metal elements are distributed asymmetrically across the fossil section, which differs from common minerals. The maps showed that very little As is present, so the dinosaur most likely died a natural death, unlike the Zigong dinosaurs, which were found to have died of poisoning. The method also showed that Fe and Mn are accreted together, and the same is true for Sr and Y. This indicates that colloidal granules of Fe and Mn, as well as of Sr and Y, carried opposite electric charges during the lithification of the fossils; by this analysis, compound forms can be ascertained. Synchrotron x-ray fluorescence is a complementary method that maps metal elements in a dinosaur fossil rapidly, accurately and intuitively. This study shows that mineral imaging of dinosaur fossils has potential for reconstructing the paleoenvironment and ancient geology.

  10. Evaluation of functional degeneration of the amazon-ant Polyergus rufescens Latr. under an influence of socially parasitic way of life.

    PubMed

    Dobrzańska, J

    1978-01-01

In certain, infrequently occurring, favorable circumstances the ants P. rufescens can display patterns of behavior that seem to be disappearing as a result of their parasitic way of life: the ability to feed themselves (independently, though ineffectively), elements of offspring-protection behavior, transport of nestmates, and the escape reaction. Such events reinforce these infrequently used, latent reflexes and prevent their complete extinction. It is supposed that the disappearance of certain elements of behavior, characteristic of conventional parasitism, is inhibited by a social way of life. The same may be true of other, non-insect communities.

  11. Using Generalized Annotated Programs to Solve Social Network Diffusion Optimization Problems

    DTIC Science & Technology

    2013-01-01

as follows: —Let kall be the k value for the SNDOP-ALL query and for each SNDOP query i, let ki be the k for that query. For each query i, set ki... kall − 1. —Number each element of vi ∈ V such that gI(vi) and V C(vi) are true. For the ith SNDOP query, let vi be the corresponding element of V —Let...vertices of S. PROOF. We set up |V| SNDOP-queries as follows: —Let kall be the k value for the SNDOP-ALL query and for each SNDOP-query i, let ki be

  12. ALGORITHM TO REDUCE APPROXIMATION ERROR FROM THE COMPLEX-VARIABLE BOUNDARY-ELEMENT METHOD APPLIED TO SOIL FREEZING.

    USGS Publications Warehouse

    Hromadka, T.V.; Guymon, G.L.

    1985-01-01

    An algorithm is presented for the numerical solution of the Laplace equation boundary-value problem, which is assumed to apply to soil freezing or thawing. The Laplace equation is numerically approximated by the complex-variable boundary-element method. The algorithm aids in reducing integrated relative error by providing a true measure of modeling error along the solution domain boundary. This measure of error can be used to select locations for adding, removing, or relocating nodal points on the boundary or to provide bounds for the integrated relative error of unknown nodal variable values along the boundary.
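The node-selection idea described in this abstract, using a pointwise measure of boundary error to decide where nodal points should be added, removed, or relocated, can be sketched generically. This is a hedged illustration of the selection step only, not of the complex-variable boundary-element method itself; the function names and boundary values are hypothetical:

```python
def relative_boundary_error(approx, exact):
    """Pointwise relative error of the approximated boundary values."""
    return [abs(a - e) / abs(e) for a, e in zip(approx, exact)]

def nodes_to_refine(approx, exact, tol):
    """Indices of boundary nodes whose local relative error exceeds tol,
    i.e. candidate locations for adding or relocating nodal points."""
    errs = relative_boundary_error(approx, exact)
    return [i for i, err in enumerate(errs) if err > tol]

# Hypothetical boundary values: the model overshoots near node 2.
exact  = [1.00, 1.20, 1.50, 1.80, 2.00]
approx = [1.01, 1.19, 1.65, 1.82, 1.99]
print(nodes_to_refine(approx, exact, tol=0.05))  # → [2]: node 2 exceeds 5% error
```

In the paper's setting the error measure also provides a bound on the integrated relative error of the unknown nodal values along the boundary.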

  13. The many meanings of gross photosynthesis and their implication for photosynthesis research from leaf to globe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohlfahrt, Georg; Gu, Lianhong

    2015-06-25

The term "gross photosynthesis" has been given different meanings by different communities. We review the history of this term and associated concepts to clarify the terminology and make recommendations about a consistent use of terms in accordance with photosynthetic theory. We show that a widely used eddy covariance CO2 flux partitioning approach yields estimates which are quantitatively closer to the definition of true photosynthesis, despite aiming at estimating apparent photosynthesis.

  14. The One Another Project: A Quantitative Study of North American Christian College and Seminary Students' Perception of Social Media's Effect on Their Biblical Interpersonal Relationships

    ERIC Educational Resources Information Center

    Pardue, Micheal S.

    2012-01-01

    Statistics show that social media is being used by a large majority of college students. This technological advent brings with it questions about how social media is affecting the relationships of those who use it. This is especially true for the Church and Christian institutions that put a high value on relationships. This study asked 3,645…

  15. Profiling Environmental Chemicals in the Antioxidant Response Element Pathway using Quantitative High Throughput Screening (qHTS)

    EPA Science Inventory

    The antioxidant response element (ARE) signaling pathway plays an important role in the amelioration of oxidative stress, which can contribute to a number of diseases, including cancer. We screened 1408 NTP-provided substances in 1536-well qHTS format at concentrations ranging fr...

  16. Effects of the approximations of light propagation on quantitative photoacoustic tomography using two-dimensional photon diffusion equation and linearization

    NASA Astrophysics Data System (ADS)

    Okawa, Shinpei; Hirasawa, Takeshi; Kushibiki, Toshihiro; Ishihara, Miya

    2017-12-01

Quantitative photoacoustic tomography (QPAT) employing a light propagation model will play an important role in medical diagnoses by quantifying the concentration of hemoglobin or a contrast agent. However, QPAT with a light propagation model based on the three-dimensional (3D) radiative transfer equation (RTE) requires a huge computational load for the iterative forward calculations involved in the updating process that reconstructs the absorption coefficient. Approximations of the light propagation improve the efficiency of the image reconstruction for QPAT. In this study, we compared the 3D and two-dimensional (2D) photon diffusion equations (PDE), which approximate the 3D RTE, with a Monte Carlo simulation based on the 3D RTE. The errors in a 2D PDE-based linearized image reconstruction caused by these approximations were then quantitatively demonstrated and discussed in numerical simulations. It was clearly observed that the approximations affected the reconstructed absorption coefficient. The 2D PDE-based linearized algorithm succeeded in reconstructing the region with a large absorption coefficient in the 3D phantom. The value reconstructed in the phantom experiment agreed with that in the numerical simulation, validating that the numerical simulation of the image reconstruction predicts the relationship between the true absorption coefficient of the target in the 3D medium and the value reconstructed with the 2D PDE-based linearized algorithm. Moreover, the true absorption coefficient in the 3D medium was estimated from the 2D reconstructed image on the basis of the prediction by the numerical simulation. The estimation was successful in the phantom experiment, although some limitations were revealed.

  17. Diagnosis of the "large medial meniscus" of the knee on MR imaging.

    PubMed

    Samoto, Nobuhiko; Kozuma, Masakazu; Tokuhisa, Toshio; Kobayashi, Kunio

    2006-11-01

Although several quantitative magnetic resonance (MR) diagnostic criteria for discoid lateral meniscus (DLM) have been described, there are no criteria by which to estimate the size of the medial meniscus. We define a medial meniscus that exceeds the normal size as a "large medial meniscus" (LMM), and the purpose of this study is to establish quantitative MR diagnostic criteria for LMM. The MR imaging findings of 96 knees with arthroscopically confirmed intact semilunar lateral meniscus (SLM), 18 knees with intact DLM, 105 knees with intact semilunar medial meniscus (SMM) and 4 knees with torn LMM were analyzed. The following three quantitative parameters were measured: (a) meniscal width (MW): the minimum MW on the coronal slice; (b) ratio of the meniscus to the tibia (RMT): the ratio of minimum MW to maximum tibial width on the coronal slice; (c) continuity of the anterior and posterior horns (CAPH): the number of consecutive 5-mm-thick sagittal slices showing continuity between the anterior horn and the posterior horn of the meniscus. Using logistic discriminant analysis between the intact SLM and DLM groups, and using descriptive statistics of the intact SLM and SMM groups, the cutoff values used to discriminate LMM from SMM were calculated for MW and RMT. Moreover, the efficacy of these cutoff values, and of three slices as the cutoff value for CAPH, was estimated in the medial meniscus group. "MW ≥ 11 mm" and "RMT ≥ 15%" were determined to be effective diagnostic criteria for LMM: three of four cases in the torn LMM group were true positives and specificity was 99% for both criteria. When "CAPH ≥ 3 slices" was used as a criterion, three of four torn LMM cases were true positives and specificity was 93%.
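The cutoff values reported in this abstract can be applied mechanically. A minimal sketch that evaluates each criterion separately, as the study does; the function name and the example measurements are illustrative:

```python
def lmm_criteria(mw_mm, rmt_percent, caph_slices):
    """Evaluate each reported MR cutoff for a 'large medial meniscus' (LMM):
    minimum meniscal width, meniscus-to-tibia ratio, and the number of
    consecutive sagittal slices showing anterior-posterior horn continuity.
    Each criterion is returned separately; the paper assesses them one by one."""
    return {
        "MW >= 11 mm": mw_mm >= 11,
        "RMT >= 15%": rmt_percent >= 15,
        "CAPH >= 3 slices": caph_slices >= 3,
    }

# Hypothetical case: wide meniscus, high ratio, but only 2 continuous slices.
print(lmm_criteria(mw_mm=12.3, rmt_percent=16.0, caph_slices=2))
```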

  18. Attomole quantitation of protein separations with accelerator mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogel, J S; Grant, P G; Buccholz, B A

    2000-12-15

Quantification of specific proteins depends on separation by chromatography or electrophoresis followed by chemical detection schemes such as staining and fluorophore adhesion. Chemical exchange of short-lived isotopes, particularly sulfur, is also prevalent despite the inconveniences of counting radioactivity. Physical methods based on isotopic and elemental analyses offer highly sensitive protein quantitation that has linear response over wide dynamic ranges and is independent of protein conformation. Accelerator mass spectrometry quantifies long-lived isotopes such as 14C to sub-attomole sensitivity. We quantified protein interactions with small molecules such as toxins, vitamins, and natural biochemicals at precisions of 1-5%. Micro-proton-induced X-ray emission quantifies elemental abundances in separated metalloprotein samples to nanogram amounts and is capable of quantifying phosphorylated loci in gels. Accelerator-based quantitation is a possible tool for quantifying the translation of the genome into the proteome.

  19. A True Metasurface Antenna.

    PubMed

    El Badawe, Mohamed; Almoneef, Thamer S; Ramahi, Omar M

    2016-01-13

We present a true metasurface antenna based on electrically-small resonators. The resonators are placed on a flat surface and connected to one feed point using a corporate feed. Unlike conventional array antennas, where the distance between adjacent antennas is half a wavelength to reduce mutual coupling, here the distance between the radiating elements is electrically very small to effect good impedance matching of each resonator to its feed. A metasurface antenna measuring 1.2λ × 1.2λ and designed to operate at 3 GHz achieved a gain of 12 dBi. A prototype was fabricated and tested, showing good agreement between numerical simulations and experimental results. Through numerical simulation, we show that the metasurface antenna has the ability to provide beam steering by phasing all the resonators appropriately.

  20. Quantitative chemical imaging of the intracellular spatial distribution of fundamental elements and light metals in single cells.

    PubMed

    Malucelli, Emil; Iotti, Stefano; Gianoncelli, Alessandra; Fratini, Michela; Merolle, Lucia; Notargiacomo, Andrea; Marraccini, Chiara; Sargenti, Azzurra; Cappadone, Concettina; Farruggia, Giovanna; Bukreeva, Inna; Lombardo, Marco; Trombini, Claudio; Maier, Jeanette A; Lagomarsino, Stefano

    2014-05-20

We report a method that allows a complete quantitative characterization of whole single cells, assessing the total amount of carbon, nitrogen, oxygen, sodium, and magnesium and providing submicrometer maps of element molar concentration, cell density, mass, and volume. This approach allows quantifying elements down to 10⁶ atoms/μm³. This result was obtained by applying a multimodal fusion approach that combines synchrotron radiation microscopy techniques with off-line atomic force microscopy. The method proposed permits us to find the element concentration in addition to the mass fraction and provides a deeper and more complete knowledge of cell composition. We performed measurements on LoVo human colon cancer cells sensitive (LoVo-S) and resistant (LoVo-R) to doxorubicin. The comparison of LoVo-S and LoVo-R revealed different patterns in the maps of Mg concentration with higher values within the nucleus in LoVo-R and in the perinuclear region in LoVo-S cells. This feature was not so evident for the other elements, suggesting that Mg compartmentalization could be a significant trait of the drug-resistant cells.
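The stated sensitivity of about 10⁶ atoms per cubic micrometer can be sanity-checked as a molar concentration using Avogadro's number and the fact that 1 μm³ = 10⁻¹⁵ L. The helper below is illustrative, not part of the paper's method:

```python
AVOGADRO = 6.02214076e23  # atoms per mole
UM3_IN_LITERS = 1e-15     # one cubic micrometer expressed in liters

def atoms_per_um3_to_molar(n_atoms):
    """Convert an atom count per cubic micrometer to molarity (mol/L)."""
    return n_atoms / AVOGADRO / UM3_IN_LITERS

# 1e6 atoms/um^3 corresponds to roughly 1.66e-3 mol/L, i.e. about 1.7 mM
print(atoms_per_um3_to_molar(1e6))
```

So the quoted detection limit sits in the low-millimolar range, which is consistent with intracellular concentrations of elements such as Mg.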

  1. Microstructure stability during creep deformation of hard-oriented polysynthetically twinned crystal of TiAl alloy

    NASA Astrophysics Data System (ADS)

    Kim, Hee Y.; Maruyama, K.

    2003-10-01

The hard-oriented polysynthetically twinned (PST) crystal with the lamellar plates oriented parallel to the compression axis was deformed at 1150 K under applied stresses of 158 to 316 MPa. Microstructural changes in the PST crystal during creep deformation were examined quantitatively. In the as-grown PST crystal of the present study, the proportions of α2/γ, true twin, pseudotwin, and 120 deg rotational fault interfaces were 12, 59, 12, and 17 pct, respectively. After creep deformation, lamellar coarsening by dissolution of α2 lamellae and migration of γ/γ interfaces was observed. The acceleration of the creep rate after the minimum strain rate in the creep curve was attributed to the lamellar coarsening and destruction of the lamellar structure during creep deformation. Thirty-two percent of α2/γ interfaces, 51 pct of true twin interfaces, 74 pct of pseudotwin interfaces, and 80 pct of 120 deg rotational faults disappeared after 4 pct creep strain at 1150 K. The α2/γ interface was more stable than the γ/γ interfaces during creep deformation. Among the γ/γ interfaces, the pseudotwin interface and the 120 deg rotational fault were less thermally stable than the true twin interface.

  2. A Comparison of Learning Cultures in Different Sizes and Types

    ERIC Educational Resources Information Center

    Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia

    2012-01-01

    This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…

  3. Genes and quantitative trait loci (QTL) controlling trace element concentrations in perennial grasses grown on phytotoxic soil contaminated with heavy metals

    USDA-ARS?s Scientific Manuscript database

    Perennial grasses cover diverse soils throughout the world, including sites contaminated with heavy metals, producing forages that must be safe for livestock and wildlife. Chromosome regions known as quantitative trait loci (QTLs) controlling forage mineral concentrations were mapped in a populatio...

  4. Chromatographic-ICPMS methods for trace element and isotope analysis of water and biogenic calcite

    NASA Astrophysics Data System (ADS)

    Klinkhammer, G. P.; Haley, B. A.; McManus, J.; Palmer, M. R.

    2003-04-01

ICP-MS is a powerful technique because of its sensitivity and speed of analysis. This is especially true for refractory elements that are notoriously difficult using TIMS and less energetic techniques. However, as ICP-MS instruments become more sensitive to elements of interest, they also become more sensitive to interference. This becomes a pressing issue when analyzing samples with high total dissolved solids. This paper describes two trace element methods that overcome these problems by using chromatographic techniques to precondition samples prior to analysis by ICP-MS: separation of rare earth elements (REEs) from seawater using HPLC-ICPMS, and flow-through dissolution of foraminiferal calcite. Using HPLC in combination with ICP-MS, it is possible to isolate the REEs from the matrix, other transition elements, and each other. This method has been developed for small-volume samples (5 ml), making it possible to analyze sediment pore waters. As another example, subjecting foram shells to flow-through reagent addition followed by time-resolved analysis in the ICP-MS allows systematic cleaning and dissolution of the shells. This method provides information about the relationship between dissolution tendency and elemental composition. Flow-through is also amenable to automation, thus yielding the high sample throughput required for paleoceanography, and produces a highly resolved elemental matrix that can be statistically analyzed.

  5. Le Châtelier reciprocal relations and the mechanical analog

    NASA Astrophysics Data System (ADS)

    Gilmore, Robert

    1983-08-01

    Le Châtelier's principle is discussed carefully in terms of two sets of simple thermodynamic examples. The principle is then formulated quantitatively for general thermodynamic systems. The formulation is in terms of a perturbation-response matrix, the Le Châtelier matrix [L]. Le Châtelier's principle is contained in the diagonal elements of this matrix, all of which exceed one. These matrix elements describe the response of a system to a perturbation of either its extensive or intensive variables. These response ratios are inverses of each other. The Le Châtelier matrix is symmetric, so that a new set of thermodynamic reciprocal relations is derived. This quantitative formulation is illustrated by a single simple example which includes the original examples and shows the reciprocities among them. The assumptions underlying this new quantitative formulation of Le Châtelier's principle are general and applicable to a wide variety of nonthermodynamic systems. Le Châtelier's principle is formulated quantitatively for mechanical systems in static equilibrium, and mechanical examples of this formulation are given.
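The two structural claims made here for the Le Châtelier matrix, symmetry (the reciprocal relations) and diagonal elements exceeding one (the perturbation-to-response ratios), are easy to check numerically. A minimal sketch with a hypothetical 2×2 perturbation-response matrix; the numbers are illustrative, not derived from any thermodynamic system:

```python
def is_le_chatelier_like(L, tol=1e-9):
    """Check the two properties the abstract states for a Le Chatelier
    matrix [L]: symmetry (reciprocal relations) and diagonal elements
    greater than one (response ratios)."""
    n = len(L)
    symmetric = all(abs(L[i][j] - L[j][i]) < tol
                    for i in range(n) for j in range(n))
    diag_ok = all(L[i][i] > 1 for i in range(n))
    return symmetric and diag_ok

# Hypothetical 2x2 perturbation-response matrix: symmetric, diagonals > 1.
L = [[1.4, 0.3],
     [0.3, 1.1]]
print(is_le_chatelier_like(L))  # True
```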

  6. Vacuum decay in an interacting multiverse

    NASA Astrophysics Data System (ADS)

    Robles-Pérez, S.; Alonso-Serrano, A.; Bastos, C.; Bertolami, O.

    2016-08-01

We examine a new multiverse scenario in which the component universes interact. We focus our attention on the process of "true" vacuum nucleation in the false vacuum within one single element of the multiverse. It is shown that the interactions lead to a collective behavior that might lead, under specific conditions, to a pre-inflationary phase and ensuing distinguishable imprints in the cosmic microwave background radiation.

  7. Inspired by a True Story: Good Night, and Good Luck and Why We Need It

    ERIC Educational Resources Information Center

    Beck, Bernard

    2006-01-01

    Why do people refurbish some historical tales from time to time, adding new versions to be displayed publicly? Powerful cultural elements can be invoked for many purposes, and the morals derived from the fables can be quite different. When individuals are blessed with a rich cultural heritage, they may delve into it to find wisdom for themselves…

  8. Anterior implant-supported overdentures.

    PubMed

    Ben-Ur, Z; Gorfil, C; Shifman, A

    1996-09-01

    Retention of complete mandibular dentures can be successfully achieved by means of an implant-retained or natural tooth-retained bar and clip system in the anterior segment of the mandible. The same design principles hold true for both methods of anchoring the retentive bar. These retentive elements must be constructed to allow some freedom of movement around a fulcrum line designed to be perpendicular to the sagittal plane.

  9. Exposure Potential and Health Impacts of Indium and Gallium, Metals Critical to Emerging Electronics and Energy Technologies.

    PubMed

    White, Sarah Jane O; Shine, James P

    2016-12-01

    The rapid growth of new electronics and energy technologies requires the use of rare elements of the periodic table. For many of these elements, little is known about their environmental behavior or human health impacts. This is true for indium and gallium, two technology critical elements. Increased environmental concentrations of both indium and gallium create the potential for increased environmental exposure, though little is known about the extent of this exposure. Evidence is mounting that indium and gallium can have substantial toxicity, including in occupational settings where indium lung disease has been recognized as a potentially fatal disease caused by the inhalation of indium particles. This paper aims to review the basic chemistry, changing environmental concentrations, potential for human exposure, and known health effects of indium and gallium.

  10. On the Singularity in the Estimation of the Quaternion-of-Rotation

    NASA Technical Reports Server (NTRS)

    Bar-Itzhack, Itzhack Y.; Thienel, Julie K.

    2003-01-01

    It has been claimed in the archival literature that the covariance matrix of a Kalman filter, which is designed to estimate the quaternion-of-rotation, is necessarily rank deficient because the normality constraint of the quaternion produces dependence between the quaternion elements. In reality, though, this phenomenon does not occur. The covariance matrix is not singular, and the filter is well behaved. Several simple examples are presented that demonstrate the regularity of the covariance matrix. First, estimation cases are presented where a relationship exists between the estimated variables, and yet the covariance matrix is not singular. Then the particular problem of quaternion estimation is analyzed. It is shown that the discrepancy stems from the fact that a functional relationship exists between the elements of the true quaternion but not between its estimated elements.

  11. Estimating Small-Body Gravity Field from Shape Model and Navigation Data

    NASA Technical Reports Server (NTRS)

    Park, Ryan S.; Werner, Robert A.; Bhaskaran, Shyam

    2008-01-01

    This paper presents a method to model the external gravity field and to estimate the internal density variation of a small-body. We first discuss the modeling problem, where we assume the polyhedral shape and internal density distribution are given, and model the body interior using finite elements definitions, such as cubes and spheres. The gravitational attractions computed from these approaches are compared with the true uniform-density polyhedral attraction and the level of accuracies are presented. We then discuss the inverse problem where we assume the body shape, radiometric measurements, and a priori density constraints are given, and estimate the internal density variation by estimating the density of each finite element. The result shows that the accuracy of the estimated density variation can be significantly improved depending on the orbit altitude, finite-element resolution, and measurement accuracy.

  12. Synchrotron-based ambient pressure X-ray photoelectron spectroscopy of hydrogen and helium

    DOE PAGES

    Zhong, Jian-Qiang; Wang, Mengen; Hoffmann, William H.; ...

    2018-03-01

    Contrary to popular belief, it is possible to obtain X-ray photoelectron spectra for elements lighter than lithium, namely hydrogen and helium. The literature is plagued with claims of this impossibility, which holds true for lab-based X-ray sources. However, this limitation is merely technical and is related mostly to the low X-ray photoionization cross-sections of the 1s orbitals of hydrogen and helium. Here, we show that, using ambient pressure X-ray photoelectron spectroscopy (XPS), a bright-enough X-ray source allows the study of these elusive elements. This has important implications in the understanding of the limitations of one of the most useful techniques in materials science, and moreover, it potentially opens the possibility of using XPS to directly study the most abundant element in the universe.

  13. Synchrotron-based ambient pressure X-ray photoelectron spectroscopy of hydrogen and helium

    NASA Astrophysics Data System (ADS)

    Zhong, Jian-Qiang; Wang, Mengen; Hoffmann, William H.; van Spronsen, Matthijs A.; Lu, Deyu; Boscoboinik, J. Anibal

    2018-02-01

    Contrary to popular belief, it is possible to obtain X-ray photoelectron spectra for elements lighter than lithium, namely hydrogen and helium. The literature is plagued with claims of this impossibility, which holds true for lab-based X-ray sources. However, this limitation is merely technical and is related mostly to the low X-ray photoionization cross-sections of the 1s orbitals of hydrogen and helium. In this letter, we show that, using ambient pressure X-ray photoelectron spectroscopy (XPS), a bright-enough X-ray source allows the study of these elusive elements. This has important implications in the understanding of the limitations of one of the most useful techniques in materials science, and moreover, it potentially opens the possibility of using XPS to directly study the most abundant element in the universe.

  14. Synchrotron-based ambient pressure X-ray photoelectron spectroscopy of hydrogen and helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhong, Jian-Qiang; Wang, Mengen; Hoffmann, William H.

    Contrary to popular belief, it is possible to obtain X-ray photoelectron spectra for elements lighter than lithium, namely hydrogen and helium. The literature is plagued with claims of this impossibility, which holds true for lab-based X-ray sources. However, this limitation is merely technical and is related mostly to the low X-ray photoionization cross-sections of the 1s orbitals of hydrogen and helium. Here, we show that, using ambient pressure X-ray photoelectron spectroscopy (XPS), a bright-enough X-ray source allows the study of these elusive elements. This has important implications in the understanding of the limitations of one of the most useful techniques in materials science, and moreover, it potentially opens the possibility of using XPS to directly study the most abundant element in the universe.

  15. Quantitative Chromatographic Determination of Dissolved Elemental Sulfur in the Non-aqueous Electrolyte for Lithium-Sulfur Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Dong; Yang, Xiao-Qing; Zhang, Xuran

    A fast and reliable analytical method is reported for the quantitative determination of dissolved elemental sulfur in non-aqueous electrolytes for Li-S batteries. By using high performance liquid chromatography with a UV detector, the solubility of S in 12 different pure solvents and in 22 different electrolytes was determined. It was found that the solubility of elemental sulfur is dependent on the Lewis basicity, the polarity of solvents and the salt concentration in the electrolytes. In addition, the S content in the electrolyte recovered from a discharged Li-S battery was successfully determined by the proposed HPLC/UV method. Thus, the feasibility of the method for online analysis of a Li-S battery is demonstrated. Interestingly, the S was found to be supersaturated in the electrolyte recovered from a discharged Li-S cell.

  16. Quantitative Chromatographic Determination of Dissolved Elemental Sulfur in the Non-aqueous Electrolyte for Lithium-Sulfur Batteries

    DOE PAGES

    Zheng, Dong; Yang, Xiao-Qing; Zhang, Xuran; ...

    2014-12-02

    A fast and reliable analytical method is reported for the quantitative determination of dissolved elemental sulfur in non-aqueous electrolytes for Li-S batteries. By using high performance liquid chromatography with a UV detector, the solubility of S in 12 different pure solvents and in 22 different electrolytes was determined. It was found that the solubility of elemental sulfur is dependent on the Lewis basicity, the polarity of solvents and the salt concentration in the electrolytes. In addition, the S content in the electrolyte recovered from a discharged Li-S battery was successfully determined by the proposed HPLC/UV method. Thus, the feasibility of the method for online analysis of a Li-S battery is demonstrated. Interestingly, the S was found to be supersaturated in the electrolyte recovered from a discharged Li-S cell.

  17. SEDIMENT-HOSTED PRECIOUS METAL DEPOSITS.

    USGS Publications Warehouse

    Bagby, W.C.; Pickthorn, W.J.; Goldfarb, R.; Hill, R.A.

    1984-01-01

    The Dee mine is a sediment-hosted, disseminated gold deposit in the Roberts Mountains allochthon of north central Nevada. Soil samples were collected from the C-horizon in undisturbed areas over the deposit in order to investigate the usefulness of soil geochemistry in identifying this type of deposit. Each sample was sieved to minus 80 mesh and analyzed quantitatively for Au, Ag, As, Sb, Hg and Tl, with semi-quantitative data obtained for an additional 31 elements. Rank sum analysis is successful for the Au, Ag, As, Sb, Hg, Tl suite, even though bedrock geology is disregarded. This method involves transforming the data into a total element signature by ranking the data in ascending order and summing the element ranks for each sample. The rank sums are then divided into percentile groups and plotted. The rank sum plot for the Dee soils unequivocally identifies three of four known ore zones.
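    The rank-sum scoring described in this record can be sketched in a few lines. This is an illustrative reconstruction only: the sample values below are synthetic, not the Dee mine soil data.

```python
# Illustrative sketch of rank-sum anomaly scoring: rank each element's
# concentrations in ascending order, then sum the per-element ranks per sample.
import numpy as np

def rank_sum_scores(data):
    """data: (n_samples, n_elements) array of concentrations.
    Returns one rank-sum score per sample (1-based ranks per element)."""
    ranks = data.argsort(axis=0).argsort(axis=0) + 1
    return ranks.sum(axis=1)

rng = np.random.default_rng(0)
conc = rng.lognormal(size=(20, 6))  # 20 soil samples x 6 elements (Au, Ag, As, Sb, Hg, Tl)
scores = rank_sum_scores(conc)
# Split the rank sums into percentile groups, as done before plotting
groups = np.digitize(scores, np.percentile(scores, [25, 50, 75, 90]))
```

    Samples whose rank sum falls in the top percentile group are the candidate anomalies, independent of bedrock geology.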

  18. Evaluation of different strategies for quantitative depth profile analysis of Cu/NiCu layers and multilayers via pulsed glow discharge - Time of flight mass spectrometry

    NASA Astrophysics Data System (ADS)

    Muñiz, Rocío; Lobo, Lara; Németh, Katalin; Péter, László; Pereiro, Rosario

    2017-09-01

    There is still a lack of approaches for quantitative depth-profiling when dealing with glow discharges (GD) coupled to mass spectrometric detection. The purpose of this work is to develop quantification procedures using pulsed GD (PGD) - time of flight mass spectrometry. In particular, research was focused on the depth profile analysis of Cu/NiCu nanolayers and multilayers electrodeposited on Si wafers. PGDs are characterized by three different regions due to the temporal application of power: prepeak, plateau and afterglow. This last region is the most sensitive and is therefore convenient for quantitative analysis of minor components; however, major elements are often saturated, even at 30 W of applied radiofrequency power for these particular samples. For such cases, we have investigated two strategies based on a multimatrix calibration procedure: (i) using the afterglow region for all the sample components except for the major element (Cu), which was analyzed in the plateau, and (ii) using the afterglow region for all the elements, measuring the ArCu signal instead of Cu. Seven homogeneous certified reference materials containing Si, Cr, Fe, Co, Ni and Cu have been used for quantification. Quantitative depth profiles obtained with these two strategies for samples containing 3 or 6 multilayers (of a few tens of nanometers each layer) were in agreement with the expected values, both in terms of thickness and composition of the layers.

  19. Development of multiplex PCR assay for authentication of Cornu Cervi Pantotrichum in traditional Chinese medicine based on cytochrome b and C oxidase subunit 1 genes.

    PubMed

    Gao, Lijun; Xia, Wei; Ai, Jinxia; Li, Mingcheng; Yuan, Guanxin; Niu, Jiamu; Fu, Guilian; Zhang, Lihua

    2016-07-01

    This study describes a method for discriminating true Cervus antlers from their counterfeits using multiplex PCR. Bioinformatic analyses were carried out to design allele-specific primers for the mitochondrial (mt) cytochrome b (Cyt b) and cytochrome C oxidase subunit 1 (Cox 1) genes. The mtDNA and genomic DNA were extracted from Cornu Cervi Pantotrichum and from its counterfeits through the modified alkaline method and the salt-extraction method, respectively. Sufficient DNA templates were extracted from all samples with both methods, and fragments of 354 bp and 543 bp that were specifically amplified from true Cervus antlers served as a standard control. The data revealed that the multiplex PCR-based assays using two primer sets can be used for forensic and quantitative identification of original Cervus deer products from counterfeit antlers in a single step.

  20. [Clinical studies in health workers employed in the manual lifting of patients: methods for the examination of spinal lesions].

    PubMed

    Ricci, M G; Menoni, O; Colombini, D; Occhipinti, E

    1999-01-01

    To enable different research groups to make a standardized collection of clinical data on alterations of the lumbar spine, protocols proposed and thoroughly validated by the authors were used for the collection and classification of data. The protocols include a clinical/functional examination of the spine, checking for a positive anamnestic threshold, for pain on pressure/palpation of the spinous apophyses and paravertebral muscles, and for painful movements, in order to classify 1st, 2nd and 3rd grade functional spondylarthropathy (for different regions of the spine). An ad hoc questionnaire was also prepared for the quantitative and qualitative study of true acute low back pain and of ingravescent low back pain controlled pharmacologically at onset. The results of this questionnaire make it possible to calculate the incidence of acute low back pain (both true and pharmacologically controlled).

  1. Space-Time Data fusion for Remote Sensing Applications

    NASA Technical Reports Server (NTRS)

    Braverman, Amy; Nguyen, H.; Cressie, N.

    2011-01-01

    NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.

  2. Calculation of Coincidence Summing Correction Factors for an HPGe detector using GEANT4.

    PubMed

    Giubrone, G; Ortiz, J; Gallardo, S; Martorell, S; Bas, M C

    2016-07-01

    The aim of this paper was to calculate the True Coincidence Summing Correction Factors (TSCFs) for an HPGe coaxial detector in order to correct the summing effect arising from the presence of (88)Y and (60)Co in a multigamma source used to obtain a calibration efficiency curve. Results were obtained for three volumetric sources using the Monte Carlo toolkit GEANT4. The first part of this paper deals with modeling the detector in order to obtain a simulated full energy peak efficiency curve. A quantitative comparison between the measured and simulated values was made across the entire energy range under study. The True Summing Correction Factors were calculated for (88)Y and (60)Co using the full peak efficiencies obtained with GEANT4. This methodology was subsequently applied to (134)Cs, which presents a complex decay scheme.
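    For orientation, the simplest summing-out case (a two-gamma cascade such as (60)Co) admits a closed-form correction: counts are lost from one full-energy peak whenever the coincident gamma deposits any energy in the detector, so the correction factor is 1/(1 - εt), with εt the total efficiency for the coincident gamma. A minimal sketch under that simplification, with placeholder efficiency and count values rather than the GEANT4-derived ones of this study:

```python
# Summing-out correction for a two-gamma cascade (e.g. 60Co, 1173 + 1332 keV).
# Values are illustrative placeholders, not simulated or measured efficiencies.
def tscf_summing_out(total_eff_coincident):
    """True-summing correction factor for a full-energy peak whose counts
    are lost whenever the coincident gamma also deposits energy."""
    return 1.0 / (1.0 - total_eff_coincident)

eps_total_1332 = 0.05                    # assumed total efficiency at 1332 keV
correction = tscf_summing_out(eps_total_1332)
corrected_counts = 100000 * correction   # applied to a measured 1173 keV peak area
```

    Real cascades such as (134)Cs involve many branches and angular-correlation terms, which is why a Monte Carlo treatment is used in the paper.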

  3. Light irradiation induces fragmentation of the plasmodium, a novel photomorphogenesis in the true slime mold Physarum polycephalum: action spectra and evidence for involvement of the phytochrome.

    PubMed

    Kakiuchi, Y; Takahashi, T; Murakami, A; Ueda, T

    2001-03-01

    A new photomorphogenesis was found in the plasmodium of the true slime mold Physarum polycephalum: the plasmodium broke temporarily into equal-sized spherical pieces, each containing about eight nuclei, about 5 h after irradiation with light. Action spectroscopic study showed that UVA, blue and far-red lights were effective, while red light inhibited the far-red-induced fragmentation. Difference absorption spectra of both the living plasmodium and the plasmodial homogenate after alternate irradiation with far-red and red light gave two extremes at 750 and 680 nm, which agreed with those for the induction and inhibition of the fragmentation, respectively. A kinetic model similar to that of phytochrome action explained quantitatively the fluence rate-response curves of the fragmentation. Our results indicate that one of the photoreceptors for the plasmodial fragmentation is a phytochrome.

  4. Nipponium as a new element (Z=75) separated by the Japanese chemist, Masataka Ogawa: a scientific and science historical re-evaluation.

    PubMed

    Yoshihara, H Kenji

    2008-01-01

    This review article deals with a new element 'nipponium' reported by Masataka Ogawa in 1908, and with its scientific and science historical background. Ogawa positioned nipponium between molybdenum and ruthenium in the periodic table. From a modern chemical viewpoint, however, nipponium is ascribable to the element with Z=75, namely rhenium, which was unknown in 1908. The reasons for this corrected assignment of nipponium are (1) its optical spectra, (2) its atomic weight when corrected, (3) its relative abundance in molybdenite, the same being true with rhenium. Recently some important evidence was found among the Ogawa's personal collection preserved by his family. Deciphering the X-ray spectra revealed that the measured spectra of the nipponium sample that Ogawa brought from University College, London clearly showed the presence of the element 75 (rhenium). Thus was resolved the mysterious story of nipponium, which had continued for almost a century. It is concluded that nipponium was identical to rhenium.

  5. Factors influencing perceived angular velocity.

    PubMed

    Kaiser, M K; Calderone, J B

    1991-11-01

    The assumption that humans are able to perceive and process angular kinematics is critical to many structure-from-motion and optical flow models. The current studies investigate this sensitivity, and examine several factors likely to influence angular velocity perception. In particular, three factors are considered: (1) the extent to which perceived angular velocity is determined by edge transitions of surface elements, (2) the extent to which angular velocity estimates are influenced by instantaneous linear velocities of surface elements, and (3) whether element-velocity effects are related to three-dimensional (3-D) tangential velocities or to two-dimensional (2-D) image velocities. Edge-transition rate biased angular velocity estimates only when edges were highly salient. Element velocities influenced perceived angular velocity; this bias was related to 2-D image velocity rather than 3-D tangential velocity. Despite these biases, however, judgments were most strongly determined by the true angular velocity. Sensitivity to this higher order motion parameter was surprisingly good, for rotations both in depth (y-axis) and parallel to the line of sight (z-axis).

  6. Nipponium as a new element (Z = 75) separated by the Japanese chemist, Masataka Ogawa: a scientific and science historical re-evaluation

    PubMed Central

    Yoshihara, H. Kenji

    2008-01-01

    This review article deals with a new element ‘nipponium’ reported by Masataka Ogawa in 1908, and with its scientific and science historical background. Ogawa positioned nipponium between molybdenum and ruthenium in the periodic table. From a modern chemical viewpoint, however, nipponium is ascribable to the element with Z = 75, namely rhenium, which was unknown in 1908. The reasons for this corrected assignment of nipponium are (1) its optical spectra, (2) its atomic weight when corrected, (3) its relative abundance in molybdenite, the same being true with rhenium. Recently some important evidence was found among the Ogawa’s personal collection preserved by his family. Deciphering the X-ray spectra revealed that the measured spectra of the nipponium sample that Ogawa brought from University College, London clearly showed the presence of the element 75 (rhenium). Thus was resolved the mysterious story of nipponium, which had continued for almost a century. It is concluded that nipponium was identical to rhenium. PMID:18941300

  7. Immunochemical Methods for Quantitation of Vitamin B6

    DTIC Science & Technology

    1981-09-30

    [Scanned-document fragment; only the list of figures and a text scrap are recoverable: Figure 1, "Synthesis of N-Carboxymethylpyridoxine"; Figure 2, "Pyridoxine and N-Substituted Derivatives"; Figure 5, "Pyridoxine Substituted in the 3 Position"; Figure 6, "Synthesis of as-Pyridoxylformic Acid and as-Pyridoxylacetic Acid"; Figure 7, "Fluorogenic Galactosides". A surviving sentence fragment notes that steric effects "hinder the formation of quaternary salts (Kirpal, 1910)" and that the authors "found this to be true".]

  8. Quantitative Analysis of Situational Awareness (QUASA): Applying Signal Detection Theory to True/False Probes and Self-Ratings

    DTIC Science & Technology

    2004-06-01

    [Scanned-document fragment:] ...obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. From the introduction: the technique draws on an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of 'calibration' in... (Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001), in which confidence ratings were subjected to SDT analysis to evaluate the...

  9. Arterial Spin Labeling - Fast Imaging with Steady-State Free Precession (ASL-FISP): A Rapid and Quantitative Perfusion Technique for High Field MRI

    PubMed Central

    Gao, Ying; Goodnough, Candida L.; Erokwu, Bernadette O.; Farr, George W.; Darrah, Rebecca; Lu, Lan; Dell, Katherine M.; Yu, Xin; Flask, Chris A.

    2014-01-01

    Arterial Spin Labeling (ASL) is a valuable non-contrast perfusion MRI technique with numerous clinical applications. Many previous ASL MRI studies have utilized either Echo-Planar Imaging (EPI) or True Fast Imaging with Steady-State Free Precession (True FISP) readouts that are prone to off-resonance artifacts on high field MRI scanners. We have developed a rapid ASL-FISP MRI acquisition for high field preclinical MRI scanners providing perfusion-weighted images with little or no artifacts in less than 2 seconds. In this initial implementation, a FAIR (Flow-Sensitive Alternating Inversion Recovery) ASL preparation was combined with a rapid, centrically-encoded FISP readout. Validation studies on healthy C57/BL6 mice provided consistent estimation of in vivo mouse brain perfusion at 7 T and 9.4 T (249±38 ml/min/100g and 241±17 ml/min/100g, respectively). The utility of this method was further demonstrated in detecting significant perfusion deficits in a C57/BL6 mouse model of ischemic stroke. Reasonable kidney perfusion estimates were also obtained for a healthy C57/BL6 mouse exhibiting differential perfusion in the renal cortex and medulla. Overall, the ASL-FISP technique provides a rapid and quantitative in vivo assessment of tissue perfusion for high field MRI scanners with minimal image artifacts. PMID:24891124

  10. Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE

    USGS Publications Warehouse

    Lee, Myung W.; Hutchinson, Deborah R.

    1992-01-01

    Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.

  11. Elemental analysis of occupational and environmental lung diseases by electron probe microanalyzer with wavelength dispersive spectrometer.

    PubMed

    Takada, Toshinori; Moriyama, Hiroshi; Suzuki, Eiichi

    2014-01-01

    Occupational and environmental lung diseases are a group of pulmonary disorders caused by inhalation of harmful particles, mists, vapors or gases. Mineralogical analysis is not generally required in the diagnosis of most cases of these diseases. Apart from minerals that are encountered rarely or only in specific occupations, small quantities of mineral dusts are present in the healthy lung. As such, when mineralogical analysis is required, quantitative or semi-quantitative methods must be employed. An electron probe microanalyzer with wavelength dispersive spectrometer (EPMA-WDS) enables analysis of human lung tissue for deposits of elements by both qualitative and semi-quantitative methods. Since 1993, we have analyzed 162 cases of suspected occupational and environmental lung diseases using an EPMA-WDS. Our institute has been accepting online requests for elemental analysis of lung tissue samples by EPMA-WDS since January 2011. Hard metal lung disease is an occupational interstitial lung disease that primarily affects workers exposed to the dust of tungsten carbide. The characteristic pathological findings of the disease are giant cell interstitial pneumonia (GIP) with centrilobular fibrosis, surrounded by mild alveolitis with giant cells within the alveolar space. EPMA-WDS analysis of biopsied lung tissue from patients with GIP has demonstrated that tungsten and/or cobalt is distributed in the giant cells and centrilobular fibrosing lesions in GIP. Pneumoconiosis caused by amorphous silica, and acute interstitial pneumonia associated with the giant tsunami, were also elementally analyzed by EPMA-WDS. The results suggest that commonly found elements, such as silicon, aluminum, and iron, may cause occupational and environmental lung diseases.

  12. Quantitative statistical analysis of cis-regulatory sequences in ABA/VP1- and CBF/DREB1-regulated genes of Arabidopsis.

    PubMed

    Suzuki, Masaharu; Ketterling, Matthew G; McCarty, Donald R

    2005-09-01

    We have developed a simple quantitative computational approach for objective analysis of cis-regulatory sequences in promoters of coregulated genes. The program, designated MotifFinder, identifies oligo sequences that are overrepresented in promoters of coregulated genes. We used this approach to analyze promoter sequences of Viviparous1 (VP1)/abscisic acid (ABA)-regulated genes and cold-regulated genes, respectively, of Arabidopsis (Arabidopsis thaliana). We detected significantly enriched sequences in up-regulated genes but not in down-regulated genes. This result suggests that gene activation but not repression is mediated by specific and common sequence elements in promoters. The enriched motifs include several known cis-regulatory sequences as well as previously unidentified motifs. With respect to known cis-elements, we dissected the flanking nucleotides of the core sequences of Sph element, ABA response elements (ABREs), and the C repeat/dehydration-responsive element. This analysis identified the motif variants that may correlate with qualitative and quantitative differences in gene expression. While both VP1 and cold responses are mediated in part by ABA signaling via ABREs, these responses correlate with unique ABRE variants distinguished by nucleotides flanking the ACGT core. ABRE and Sph motifs are tightly associated uniquely in the coregulated set of genes showing a strict dependence on VP1 and ABA signaling. Finally, analysis of distribution of the enriched sequences revealed a striking concentration of enriched motifs in a proximal 200-base region of VP1/ABA and cold-regulated promoters. Overall, each class of coregulated genes possesses a discrete set of the enriched motifs with unique distributions in their promoters that may account for the specificity of gene regulation.
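    The overrepresentation idea behind MotifFinder can be illustrated with a toy k-mer counter: tally each oligo in the coregulated promoter set and compare its frequency against a background promoter set. This is a hedged illustration only; the program's actual statistics are more elaborate, and the sequences, pseudocount, and enrichment threshold below are assumptions.

```python
# Toy k-mer overrepresentation scan: foreground vs. background promoter sets.
from collections import Counter

def kmer_counts(seqs, k):
    counts = Counter()
    for s in seqs:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    return counts

def enriched_kmers(foreground, background, k, min_ratio=2.0):
    fg = kmer_counts(foreground, k)
    bg = kmer_counts(background, k)
    fg_total = sum(fg.values()) or 1
    bg_total = sum(bg.values()) or 1
    enriched = {}
    for kmer, n in fg.items():
        # +1 pseudocount so k-mers absent from the background don't divide by zero
        ratio = (n / fg_total) / ((bg.get(kmer, 0) + 1) / bg_total)
        if ratio >= min_ratio:
            enriched[kmer] = ratio
    return enriched

promoters = ["TTACGTGGC", "AACGTGTCA"]   # toy "coregulated" promoters (ACGT core)
background = ["GGGGGGGGG", "CCCCCCCCC"]  # toy background set
hits = enriched_kmers(promoters, background, k=5)
```

    In this toy input the ACGT-core oligo "ACGTG" is flagged as enriched, mirroring how ABRE-like elements surface in the real analysis.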

  13. Examining the Elements of Online Learning Quality in a Fully Online Doctoral Program

    ERIC Educational Resources Information Center

    Templeton, Nathan R.; Ballenger, Julia N.; Thompson, J. Ray

    2015-01-01

    The purpose of this descriptive quantitative study was to examine the quality elements of online learning in a regional doctoral program. Utilizing the six quality dimensions of Hathaway's (2009) theory of online learning quality as a framework, the study investigated instructor-learner, learner-learner, learner-content, learner-interface,…

  14. Analyzing For Light Elements By X-Ray Scattering

    NASA Technical Reports Server (NTRS)

    Ross, H. Richard

    1993-01-01

    Nondestructive method of determining concentrations of low-atomic-number elements in liquids and solids involves measurements of Compton and Rayleigh scattering of x rays. Applied in quantitative analysis of low-atomic-number constituents of alloys, of contaminants and corrosion products on surfaces of alloys, and of fractions of hydrogen in plastics, oils, and solvents.

  15. Quantitative assessment of elemental carbon in the lungs of never smokers, cigarette smokers and coal miners

    EPA Science Inventory

    Inhalation exposure to particulates such as cigarette smoke and coal dust is known to contribute to the development of chronic lung disease. The purpose of this study was to estimate the amount of elemental carbon (EC) deposits from autopsied lung samples from cigarette smokers, ...

  16. Geochemical variations of rare earth elements in Marcellus shale flowback waters and multiple-source cores in the Appalachian Basin

    NASA Astrophysics Data System (ADS)

    Noack, C.; Jain, J.; Hakala, A.; Schroeder, K.; Dzombak, D. A.; Karamalidis, A.

    2013-12-01

    Rare earth elements (REE) - encompassing the naturally occurring lanthanides, yttrium, and scandium - are potential tracers for subsurface groundwater-brine flows and geochemical processes. Application of these elements as naturally occurring tracers during shale gas development is reliant on accurate quantitation of trace metals in hypersaline brines. We have modified and validated a liquid-liquid technique for extraction and pre-concentration of REE from saline produced waters from shale gas extraction wells with quantitative analysis by ICP-MS. This method was used to analyze time-series samples of Marcellus shale flowback and produced waters. Additionally, the total REE content of core samples of various strata throughout the Appalachian Basin were determined using HF/HNO3 digestion and ICP-MS analysis. A primary goal of the study is to elucidate systematic geochemical variations as a function of location or shale characteristics. Statistical testing will be performed to study temporal variability of inter-element relationships and explore associations between REE abundance and major solution chemistry. The results of these analyses and discussion of their significance will be presented.

  17. Remote quantitative analysis of minerals based on multispectral line-calibrated laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Wan, Xiong; Wang, Peng

    2014-01-01

    Laser-induced breakdown spectroscopy (LIBS) is a feasible remote sensing technique for mineral analysis in unapproachable places where in situ probing is impractical, such as analysis of radioactive elements after a nuclear leak or detection of the elemental compositions and contents of minerals on planetary and lunar surfaces. Here a compact custom optical assembly with a 15 m focal distance, combining a 6x beam expander with a telescope, has been built, with which the beam of a 1064 nm Nd:YAG laser is focused on remote minerals. The excited LIBS signals that reveal the elemental compositions of the minerals are collected by another compact, single-lens signal acquisition system. In our remote LIBS investigations, the LIBS spectra of an unknown ore were recorded, from which the metal compositions were obtained. In addition, a multi-spectral line calibration (MSLC) method is proposed for the quantitative analysis of elements. The feasibility of the MSLC and its superiority over single-wavelength determination were confirmed by comparison with traditional chemical analysis of the copper content in the ore.
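The multi-spectral line calibration idea can be sketched numerically: rather than tying the concentration estimate to a single emission line, each of several analyte lines gets its own linear calibration, and the per-line estimates are combined. A minimal sketch with hypothetical intensities and equal-weight averaging (the abstract does not give the paper's exact MSLC weighting):

```python
import numpy as np

# Hypothetical calibration data: known Cu concentrations (wt%) of standards
# and background-corrected intensities at three Cu emission lines.
conc = np.array([0.5, 1.0, 2.0, 4.0])            # known Cu content, wt%
intens = np.array([                               # rows: standards, cols: lines
    [120.0,  80.0,  40.0],
    [240.0, 165.0,  82.0],
    [485.0, 330.0, 160.0],
    [960.0, 655.0, 325.0],
])

# Per-line linear calibration through the origin: I = k * C  =>  k = <I.C>/<C.C>
k = intens.T @ conc / (conc @ conc)               # one slope per line

def predict(unknown_intens):
    """Average the per-line concentration estimates (equal weights assumed)."""
    per_line = unknown_intens / k
    return per_line.mean()

print(predict(np.array([360.0, 247.0, 122.0])))   # ~1.5 wt% Cu
```

Averaging over several lines damps the shot-to-shot and self-absorption error that a single-wavelength calibration inherits from its one line.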

  18. Elemental micro-PIXE mapping of hypersensitive lesions in Lagenaria sphaerica (Cucurbitaceae) resistant to Sphaerotheca fuliginea (powdery mildew)

    NASA Astrophysics Data System (ADS)

    Weiersbye-Witkowski, I. M.; Przybylowicz, W. J.; Straker, C. J.; Mesjasz-Przybylowicz, J.

    1997-07-01

    Genotypes of the Southern African cucurbit Lagenaria sphaerica that are resistant to powdery mildew (Sphaerotheca fuliginea) exhibit foliar hypersensitive (HS) lesions on inoculation with this fungal pathogen. Elemental distributions across radially symmetrical HS lesions, surrounding unlesioned leaf tissue and uninoculated leaf tissue were obtained using the true elemental imaging system (Dynamic Analysis) of the NAC Van de Graaff nuclear microprobe. Raster scans with 3 MeV protons were complemented by simultaneous PIXE and BS point analyses. The composition of cellulose (C6H10O5) was used as a constant matrix composition for the scans, and the sample thickness was determined from BS spectra. Si and elements heavier than Ca contributed to the matrix composition within HS lesions, and the locally elevated Ca raised the limits of detection for some trace metals of interest. In comparison to uninoculated tissue, inoculated tissue was characterised by higher overall concentrations of all measured elements except Cu. Fully developed, 6-day-old HS lesions and the surrounding tissue could be divided into five zones centred on the fungal infection site, each characterised by distinct local elemental distributions (either depletion, or accumulation to potentially phytotoxic levels).

  19. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  20. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a central question in astronomy; a variety of potentially important effects are at play, including baryonic accretion from the intergalactic medium as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations and coupled them with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. We will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z > 2) systems.

  1. Replication of linkage to quantitative trait loci: variation in location and magnitude of the lod score.

    PubMed

    Hsueh, W C; Göring, H H; Blangero, J; Mitchell, B D

    2001-01-01

    Replication of linkage signals from independent samples is considered an important step toward verifying the significance of linkage signals in studies of complex traits. The purpose of this empirical investigation was to examine the variability in the precision of localizing a quantitative trait locus (QTL) by analyzing multiple replicates of a simulated data set with variance components-based methods. Specifically, we evaluated across replicates the variation in both the magnitude and the location of the peak lod scores. We analyzed QTLs whose effects accounted for 10-37% of the phenotypic variance in the quantitative traits. Our analyses revealed that the precision of QTL localization was directly related to the magnitude of the QTL effect. For a QTL whose effect accounted for > 20% of the total phenotypic variation, > 90% of the linkage peaks fell within 10 cM of the true gene location. We found no evidence that, for a given magnitude of the lod score, the presence of interaction influenced the precision of QTL localization.

  2. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  3. Influence of Radial Stress Gradient on Strainbursts: An Experimental Study

    NASA Astrophysics Data System (ADS)

    Su, Guoshao; Zhai, Shaobin; Jiang, Jianqing; Zhang, Gangliang; Yan, Liubin

    2017-10-01

    Strainbursts, which are violent disasters accompanied by the ejection failure of rocks, usually occur in hard brittle rocks around highly stressed underground openings. The release of the radial stress at excavation boundaries is one of the major inducing factors for strainbursts in tunnels. After excavation, the radial stress usually exhibits distinct gradient variations along the radial direction near the boundary, within a certain depth, under different in situ stress conditions. In this study, the influence of the radial stress gradient on strainbursts of granite was investigated using an improved true-triaxial rockburst testing system equipped with an acoustic emission monitoring system. The stress state and boundary conditions (i.e., one face free, the other faces loaded, and increasing tangential stress) of a representative rock element in the vicinity of the excavation boundary were simulated. High-speed cameras were used to capture the ejection failure processes during strainbursts, and the kinetic energy of ejected fragments was estimated quantitatively by analyzing the recorded videos. The experimental results indicate that with an increasing radial stress gradient, the strength increases, the apparent yield platform prior to the peak stress on the stress-strain curves diminishes, the failure mode changes from strainburst characterized by tensile splitting to strainburst characterized by shear rupture, and the kinetic energy of ejected fragments during strainbursts increases significantly.

  4. Nanoscale morphological and chemical changes of high voltage lithium-manganese rich NMC composite cathodes with cycling.

    PubMed

    Yang, Feifei; Liu, Yijin; Martha, Surendra K; Wu, Ziyu; Andrews, Joy C; Ice, Gene E; Pianetta, Piero; Nanda, Jagjit

    2014-08-13

    Understanding the evolution of chemical composition and morphology of battery materials during electrochemical cycling is fundamental to extending battery cycle life and ensuring safety. This is particularly true for the much debated high energy density (high voltage) lithium-manganese rich cathode material of composition Li(1 + x)M(1 - x)O2 (M = Mn, Co, Ni). In this study we combine full-field transmission X-ray microscopy (TXM) with X-ray absorption near edge structure (XANES) to spatially resolve changes in chemical phase, oxidation state, and morphology within a high voltage cathode having nominal composition Li1.2Mn0.525Ni0.175Co0.1O2. Nanoscale microscopy with chemical/elemental sensitivity provides direct quantitative visualization of the cathode, and insights into failure. Single-pixel (∼ 30 nm) TXM XANES revealed changes in Mn chemistry with cycling, possibly to a spinel conformation and likely including some Mn(II), starting at the particle surface and proceeding inward. Morphological analysis of the particles revealed, with high resolution and statistical sampling, that the majority of particles adopted nonspherical shapes after 200 cycles. Multiple-energy tomography showed a more homogeneous association of transition metals in the pristine particle, which segregate significantly with cycling. Depletion of transition metals at the cathode surface occurs after just one cycle, likely driven by electrochemical reactions at the surface.

  5. Nanoscale Morphological and Chemical Changes of High Voltage Lithium–Manganese Rich NMC Composite Cathodes with Cycling

    PubMed Central

    2015-01-01

    Understanding the evolution of chemical composition and morphology of battery materials during electrochemical cycling is fundamental to extending battery cycle life and ensuring safety. This is particularly true for the much debated high energy density (high voltage) lithium–manganese rich cathode material of composition Li1 + xM1 – xO2 (M = Mn, Co, Ni). In this study we combine full-field transmission X-ray microscopy (TXM) with X-ray absorption near edge structure (XANES) to spatially resolve changes in chemical phase, oxidation state, and morphology within a high voltage cathode having nominal composition Li1.2Mn0.525Ni0.175Co0.1O2. Nanoscale microscopy with chemical/elemental sensitivity provides direct quantitative visualization of the cathode, and insights into failure. Single-pixel (∼30 nm) TXM XANES revealed changes in Mn chemistry with cycling, possibly to a spinel conformation and likely including some Mn(II), starting at the particle surface and proceeding inward. Morphological analysis of the particles revealed, with high resolution and statistical sampling, that the majority of particles adopted nonspherical shapes after 200 cycles. Multiple-energy tomography showed a more homogeneous association of transition metals in the pristine particle, which segregate significantly with cycling. Depletion of transition metals at the cathode surface occurs after just one cycle, likely driven by electrochemical reactions at the surface. PMID:25054780

  6. Boerhaave on Fire

    NASA Astrophysics Data System (ADS)

    Diemente, Damon

    2000-01-01

    In 1741 an English translation of Herman Boerhaave's celebrated textbook Elementa Chemiae was published under the title A New Method of Chemistry. True to its time, this book included elaborate discussions of the elements earth, water, air, and fire. This article offers to teachers for classroom use a selection of passages from Boerhaave's chapter on fire. Now, today's teacher of chemistry is apt to feel that little of significance to the modern classroom can be gleaned from a two-and-a-half-centuries-old text, and especially from a topic as old-fashioned as fire. But this view is decidedly shortsighted. Boerhaave offers demonstrations and experiments that can be instructively performed today, quantitative data that can be checked against modern equations, and much theory and hypothesis that can be assessed in light of modern chemical ideas. In the readings presented here I have found material for discussion in class, for investigation in the laboratory, and for a few homework assignments. Modern students are well able to comprehend and paraphrase Boerhaave, to check his results, appreciate his insights, and identify his shortfalls. From him they learn firsthand how painstaking and difficult it was to imagine and develop the concepts of thermochemistry. To read from his chapter on fire is to stand witness to the birth and infancy of thermodynamics as conceived in the mind of a great chemist from the age when coherent chemical theory was just beginning to emerge.

  7. Development and in-house validation of the event-specific qualitative and quantitative PCR detection methods for genetically modified cotton MON15985.

    PubMed

    Jiang, Lingxi; Yang, Litao; Rao, Jun; Guo, Jinchao; Wang, Shu; Liu, Jia; Lee, Seonghun; Zhang, Dabing

    2010-02-01

    To implement genetically modified organism (GMO) labeling regulations, event-specific analysis based on the junction sequence between the exogenous integration and the host genomic DNA has become the preferred approach for GMO identification and quantification. In this study, specific primers and TaqMan probes based on the revealed 5'-end junction sequence of GM cotton MON15985 were designed, and qualitative and quantitative polymerase chain reaction (PCR) assays were established employing the designed primers and probes. In the qualitative PCR assay, the limit of detection (LOD) was 0.5 g kg(-1) in 100 ng total cotton genomic DNA, corresponding to about 17 copies of haploid cotton genomic DNA; the LOD and limit of quantification (LOQ) for the quantitative PCR assay were 10 and 17 copies of haploid cotton genomic DNA, respectively. Furthermore, the developed quantitative PCR assays were validated in-house by five different researchers. Five practical samples with known GM contents were also quantified using the developed PCR assay in the in-house validation, and the bias between the true and quantified values ranged from 2.06% to 12.59%. This study shows that the developed qualitative and quantitative PCR methods are applicable for the identification and quantification of GM cotton MON15985 and its derivatives.
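The copy-number figure quoted above follows from the mass of one haploid genome. A rough reconstruction, assuming an average base-pair mass of 650 g/mol and a haploid cotton genome of about 2.4 Gbp (both values are assumptions for illustration; the paper's ~17 copies corresponds to a slightly larger genome estimate):

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0           # assumed average mass of one base pair, g/mol

def haploid_copies(dna_ng, genome_size_bp):
    """Number of haploid genome copies in a given mass of genomic DNA."""
    genome_mass_g = genome_size_bp * BP_MASS_G_PER_MOL / AVOGADRO
    return dna_ng * 1e-9 / genome_mass_g

# 0.5 g/kg (0.05%) GM content in 100 ng total cotton DNA,
# haploid cotton genome assumed ~2.4e9 bp
gm_dna_ng = 100 * 0.0005
print(round(haploid_copies(gm_dna_ng, 2.4e9)))   # ~19 copies with these assumptions
```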

  8. The orbit of Phi Cygni measured with long-baseline optical interferometry - Component masses and absolute magnitudes

    NASA Technical Reports Server (NTRS)

    Armstrong, J. T.; Hummel, C. A.; Quirrenbach, A.; Buscher, D. F.; Mozurkewich, D.; Vivekanand, M.; Simon, R. S.; Denison, C. S.; Johnston, K. J.; Pan, X.-P.

    1992-01-01

    The orbit of the double-lined spectroscopic binary Phi Cygni, the distance to the system, and the masses and absolute magnitudes of its components are determined from measurements with the Mark III Optical Interferometer. On the basis of a reexamination of the spectroscopic data of Rach & Herbig (1961), values and uncertainties for the period and the projected semimajor axes are adopted from the present fit to the spectroscopic data, and the values of the remaining elements from the present fit to the Mark III data. The elements of the true orbit are derived, and the masses and absolute magnitudes of the components and the distance to the system are calculated.

  9. Application of Handheld Laser-Induced Breakdown Spectroscopy (LIBS) to Geochemical Analysis.

    PubMed

    Connors, Brendan; Somers, Andrew; Day, David

    2016-05-01

    While laser-induced breakdown spectroscopy (LIBS) has been in use for decades, only within the last two years has technology progressed to the point of enabling true handheld, self-contained instruments. Several instruments are now commercially available with a range of capabilities and features. In this paper, the SciAps Z-500 handheld LIBS instrument functionality and sub-systems are reviewed. Several assayed geochemical sample sets, including igneous rocks and soils, are investigated. Calibration data are presented for multiple elements of interest along with examples of elemental mapping in heterogeneous samples. Sample preparation, the method of data collection from multiple locations, and data analysis are discussed.

  10. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sampling plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples relative to reference material (calibrants), high priority must be given to international agreements on, and standardization of, certified reference materials.
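The genome-size dependence of the LOQ can be reproduced with back-of-the-envelope arithmetic: a fixed number of target copies weighs more in a large genome, so it represents a larger mass fraction of the 200 ng input. A sketch with assumed 1C genome sizes (rice ~0.43 Gbp, wheat ~16 Gbp) and an LOQ of 50 copies; it reproduces the reported 0.02-0.7% range to within roughly a factor of two:

```python
AVOGADRO = 6.022e23
BP_MASS = 650.0   # assumed average mass of one base pair, g/mol

def loq_percent(loq_copies, genome_bp, input_ng=200.0):
    """LOQ expressed as a mass percentage of the total plant DNA input."""
    genome_mass_ng = genome_bp * BP_MASS / AVOGADRO * 1e9
    return 100.0 * loq_copies * genome_mass_ng / input_ng

# Assumed 1C genome sizes: rice ~4.3e8 bp, wheat ~1.6e10 bp
for name, size in [("rice", 4.3e8), ("wheat", 1.6e10)]:
    print(name, round(loq_percent(50, size), 3))
```

The small rice genome puts 50 copies at roughly a hundredth of a percent, while the huge wheat genome pushes the same copy count to a few tenths of a percent, matching the trend reported above.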

  11. Linking soils and streams: Response of soil solution chemistry to simulated hurricane disturbance mirrors stream chemistry following a severe hurricane

    Treesearch

    William H. McDowell; Daniel Liptzin

    2014-01-01

    Understanding the drivers of forest ecosystem response to major disturbance events is an important topic in forest ecology and ecosystem management. Because of the multiple elements included in most major disturbances such as hurricanes, fires, or landslides, it is often difficult to ascribe a specific driver to the observed response. This is particularly true for the...

  12. Identification of cis-elements conferring high levels of gene expression in non-green plastids.

    PubMed

    Zhang, Jiang; Ruf, Stephanie; Hasse, Claudia; Childs, Liam; Scharff, Lars B; Bock, Ralph

    2012-10-01

    Although our knowledge about the mechanisms of gene expression in chloroplasts has increased substantially over the past decades, next to nothing is known about the signals and factors that govern expression of the plastid genome in non-green tissues. Here we report the development of a quantitative method suitable for determining the activity of cis-acting elements for gene expression in non-green plastids. The in vivo assay is based on stable transformation of the plastid genome and the discovery that root length upon seedling growth in the presence of the plastid translational inhibitor kanamycin is directly proportional to the expression strength of the resistance gene nptII in transgenic tobacco plastids. By testing various combinations of promoters and translation initiation signals, we have used this experimental system to identify cis-elements that are highly active in non-green plastids. Surprisingly, heterologous expression elements from maize plastids were significantly more efficient in conferring high expression levels in root plastids than homologous expression elements from tobacco. Our work has established a quantitative method for characterization of gene expression in non-green plastid types, and has led to identification of cis-elements for efficient plastid transgene expression in non-green tissues, which are valuable tools for future transplastomic studies in basic and applied research.

  13. Microcomb-Based True-Time-Delay Network for Microwave Beamforming With Arbitrary Beam Pattern Control

    NASA Astrophysics Data System (ADS)

    Xue, Xiaoxiao; Xuan, Yi; Bao, Chengying; Li, Shangyuan; Zheng, Xiaoping; Zhou, Bingkun; Qi, Minghao; Weiner, Andrew M.

    2018-06-01

    Microwave phased array antennas (PAAs) are very attractive for defense applications and high-speed wireless communications because of their ability to scan the beam rapidly and to control complex beam patterns. However, traditional PAAs based on phase shifters suffer from the beam-squint problem and have limited bandwidths. True-time-delay (TTD) beamforming based on low-loss photonic delay lines can solve this problem, but it is still quite challenging to build large-scale photonic TTD beamformers due to their high hardware complexity. In this paper, we demonstrate a photonic TTD beamforming network based on a miniature microresonator frequency comb (microcomb) source and dispersive time delay. A method incorporating optical phase modulation and programmable spectral shaping is proposed for positive and negative apodization weighting to achieve arbitrary microwave beam pattern control. The experimentally demonstrated TTD beamforming network can support a PAA with 21 elements. The microwave frequency range is 8-20 GHz, and the beam scanning range is ±60.2°. Detailed measurements of the microwave amplitudes and phases are performed. The beamforming performance for Gaussian and rectangular beams and for beam-notch steering is evaluated through simulations assuming a uniform radiating antenna array. The scheme can potentially support larger PAAs with hundreds of elements by increasing the number of comb lines through broadband microcomb generation.
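The squint-free property of TTD beamforming comes down to the steering delay being set in time rather than in phase: the per-element delay d*sin(theta)/c is independent of the microwave frequency, so the same delay line steers 8 GHz and 20 GHz to the same angle. A minimal sketch for a 21-element uniform linear array, assuming half-wavelength spacing at 20 GHz (7.5 mm; the paper's actual element spacing is not given in the abstract):

```python
import math

C = 3e8                      # speed of light, m/s

def ttd_delays(n_elements, spacing_m, steer_deg):
    """Per-element true-time delays for a uniform linear array.
    The delay is frequency-independent, so the beam does not squint
    as the carrier sweeps across a wide band such as 8-20 GHz."""
    tau = spacing_m * math.sin(math.radians(steer_deg)) / C
    return [n * tau for n in range(n_elements)]

# 21 elements, 7.5 mm spacing (assumed), steered to +60.2 degrees
delays = ttd_delays(21, 0.0075, 60.2)
print(delays[1] * 1e12)   # inter-element delay in picoseconds, ~21.7 ps
```

A phase-shifter array would instead fix the inter-element phase at one design frequency, and the implied angle would drift as the carrier moves away from it; that drift is the beam squint the TTD network avoids.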

  14. Predator-driven elemental cycling: the impact of predation and risk effects on ecosystem stoichiometry.

    PubMed

    Leroux, Shawn J; Schmitz, Oswald J

    2015-11-01

    Empirical evidence is beginning to show that predators can be important drivers of elemental cycling within ecosystems by propagating indirect effects that determine the distribution of elements among trophic levels as well as determine the chemical content of organic matter that becomes decomposed by microbes. These indirect effects can be propagated by predator consumptive effects on prey, nonconsumptive (risk) effects, or a combination of both. Currently, there is insufficient theory to predict how such predator effects should propagate throughout ecosystems. We present here a theoretical framework for exploring predator effects on ecosystem elemental cycling to encourage further empirical quantification. We use a classic ecosystem trophic compartment model as a basis for our analyses but infuse principles from ecological stoichiometry into the analyses of elemental cycling. Using a combined analytical-numerical approach, we compare how predators affect cycling through consumptive effects in which they control the flux of nutrients up trophic chains; through risk effects in which they change the homeostatic elemental balance of herbivore prey which accordingly changes the element ratio herbivores select from plants; and through a combination of both effects. Our analysis reveals that predators can have quantitatively important effects on elemental cycling, relative to a model formalism that excludes predator effects. Furthermore, the feedbacks due to predator nonconsumptive effects often have the quantitatively strongest impact on whole ecosystem elemental stocks, production and efficiency rates, and recycling fluxes by changing the stoichiometric balance of all trophic levels. Our modeling framework predictably shows how bottom-up control by microbes and top-down control by predators on ecosystems become interdependent when top predator effects permeate ecosystems.

  15. P Element Transposition Contributes Substantial New Variation for a Quantitative Trait in Drosophila Melanogaster

    PubMed Central

    Torkamanzehi, A.; Moran, C.; Nicholas, F. W.

    1992-01-01

    The P-M system of transposition in Drosophila melanogaster is a powerful mutator for many visible and lethal loci. Experiments using crosses between unrelated P and M stocks to assess the importance of transposition-mediated mutations affecting quantitative loci and response to selection have yielded unrepeatable or ambiguous results. In a different approach, we have used a P stock produced by microinjection of the ry(506) M stock. Selection responses were compared between transposition lines that were initiated by crossing M strain females with males from the "co-isogenic" P strain, and ry(506) M control lines. Unlike previous attempts to quantify the effects of P element transposition, there is no possibility of P transposition in the controls. During 10 generations of selection for the quantitative trait abdominal bristle number, none of the four control lines showed any response to selection, indicative of isogenicity for those loci affecting abdominal bristle number. In contrast, three of the four transposition lines showed substantial response, with regression of cumulative response on cumulative selection differential ranging from 15% to 25%. Transposition of P elements has produced new additive genetic variance at a rate more than 30 times greater than the rate expected from spontaneous mutation. PMID:1317317

  16. Unified Theory for Decoding the Signals from X-Ray Fluorescence and X-Ray Diffraction of Mixtures.

    PubMed

    Chung, Frank H

    2017-05-01

    For research and development or for solving technical problems, we often need to know the chemical composition of an unknown mixture, which is coded and stored in the signals of its X-ray fluorescence (XRF) and X-ray diffraction (XRD). X-ray fluorescence gives chemical elements, whereas XRD gives chemical compounds. The major problem in XRF and XRD analyses is the complex matrix effect. The conventional technique to deal with the matrix effect is to construct empirical calibration lines with standards for each element or compound sought, which is tedious and time-consuming. A unified theory of quantitative XRF analysis is presented here. The idea is to cancel the matrix effect mathematically. It turns out that the decoding equation for quantitative XRF analysis is identical to that for quantitative XRD analysis although the physics of XRD and XRF are fundamentally different. The XRD work has been published and practiced worldwide. The unified theory derives a new intensity-concentration equation of XRF, which is free from the matrix effect and valid for a wide range of concentrations. The linear decoding equation establishes a constant slope for each element sought, hence eliminating the work on calibration lines. The simple linear decoding equation has been verified by 18 experiments.
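The abstract does not reproduce the decoding equation itself; the sketch below follows Chung's earlier, published XRD matrix-flushing (adiabatic) relation, which the abstract says is identical in form for XRF: each measured intensity is divided by its component's constant slope, and the ratios are normalized so that the common matrix factor cancels.

```python
def decode_fractions(intensities, slopes):
    """Chung-style normalization: the fraction of component i is
    (I_i / k_i) / sum_j (I_j / k_j). Any unknown matrix-absorption
    factor multiplies every I_i equally, so it cancels in the ratio."""
    ratios = [i / k for i, k in zip(intensities, slopes)]
    total = sum(ratios)
    return [r / total for r in ratios]

# Hypothetical three-component mixture: measured line intensities and
# per-component constant slopes (illustrative values, not from the paper)
print(decode_fractions([300.0, 150.0, 50.0], [3.0, 1.5, 1.0]))
# fractions sum to 1 by construction
```

Because the slopes are constants of each component, no per-element calibration line against standards is needed once they are known, which is the labor saving the abstract describes.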

  17. An internal variable constitutive model for the large deformation of metals at high temperatures

    NASA Technical Reports Server (NTRS)

    Brown, Stuart; Anand, Lallit

    1988-01-01

    The advent of large-deformation finite element methodologies is beginning to permit the numerical simulation of hot working processes whose design until recently has been based on prior industrial experience. Proper application of such finite element techniques requires realistic constitutive equations that accurately model material behavior during hot working. A simple constitutive model for hot working is the single-scalar internal variable model for isotropic thermal elastoplasticity proposed by Anand. The model is recalled, and the specific scalar functions presented for the equivalent plastic strain rate and for the evolution equation of the internal variable are slight modifications of those proposed by Anand; the modified functions better represent high-temperature material behavior. Monotonic constant-true-strain-rate and strain-rate-jump compression experiments on a 2 percent silicon iron are briefly described. The model is implemented in the general-purpose finite element program ABAQUS.

  18. Natural mutagenesis of human genomes by endogenous retrotransposons.

    PubMed

    Iskow, Rebecca C; McCabe, Michael T; Mills, Ryan E; Torene, Spencer; Pittard, W Stephen; Neuwald, Andrew F; Van Meir, Erwin G; Vertino, Paula M; Devine, Scott E

    2010-06-25

    Two abundant classes of mobile elements, namely Alu and L1 elements, continue to generate new retrotransposon insertions in human genomes. Estimates suggest that these elements have generated millions of new germline insertions in individual human genomes worldwide. Unfortunately, current technologies are not capable of detecting most of these young insertions, and the true extent of germline mutagenesis by endogenous human retrotransposons has been difficult to examine. Here, we describe technologies for detecting these young retrotransposon insertions and demonstrate that such insertions indeed are abundant in human populations. We also found that new somatic L1 insertions occur at high frequencies in human lung cancer genomes. Genome-wide analysis suggests that altered DNA methylation may be responsible for the high levels of L1 mobilization observed in these tumors. Our data indicate that transposon-mediated mutagenesis is extensive in human genomes and is likely to have a major impact on human biology and diseases.

  19. The Effects of Social Capital Elements on Job Satisfaction and Motivation Levels of Teachers

    ERIC Educational Resources Information Center

    Boydak Özan, Mukadder; Yavuz Özdemir, Tuncay; Yaras, Zübeyde

    2017-01-01

    The purpose of this study is to examine the effects of social capital elements on job satisfaction and motivation levels of teachers. The mixed method was used in the study. The quantitative data were analyzed through Correlation and Multiple Regression analyses. An interview form developed by the researchers was used for analyzing the…

  20. Control of the flow over wing airfoils in transonic regimes by means of force action of surface elements on the flow

    NASA Astrophysics Data System (ADS)

    Aul'chenko, S. M.; Zamuraev, V. P.

    2012-09-01

    Mathematical modeling of the effect of force oscillations of surface elements of a wing airfoil on the shock-wave structure of the transonic flow over it is implemented. The qualitative and quantitative effect of the oscillation parameters on the airfoil wave drag is investigated.

  1. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. The substantia nigra (SN) tissue obtained from the autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  2. Trace analysis of high-purity graphite by LA-ICP-MS.

    PubMed

    Pickhardt, C; Becker, J S

    2001-07-01

    Laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been established as a very efficient and sensitive technique for the direct analysis of solids. In this work the capability of LA-ICP-MS was investigated for the determination of trace elements in high-purity graphite. Synthetic laboratory standards with a graphite matrix were prepared for the purpose of quantifying the analytical results. Doped trace elements, at a concentration of 0.5 microg g(-1) in a laboratory standard, were determined with an accuracy of ±1% to ±7% and a relative standard deviation (RSD) of 2-13%. Solution-based calibration was also used for quantitative analysis of high-purity graphite. It was found that such calibration led to analytical results for trace-element determination in graphite with accuracy similar to that obtained by use of synthetic laboratory standards. Results from quantitative determination of trace impurities in a real reactor-graphite sample, using both quantification approaches, were in good agreement. Detection limits for all elements of interest were determined in the low ng g(-1) concentration range. An improvement of detection limits by a factor of 10 was achieved for analyses of high-purity graphite with LA-ICP-MS under wet plasma conditions, because of the lower background signal and increased element sensitivity.

  3. Incremental value of live/real time three-dimensional transesophageal echocardiography over the two-dimensional technique in the assessment of primary cardiac malignant fibrous histiocytoma.

    PubMed

    Gok, Gulay; Elsayed, Mahmoud; Thind, Munveer; Uygur, Begum; Abtahi, Firoozeh; Chahwala, Jugal R; Yıldırımtürk, Özlem; Kayacıoğlu, İlyas; Pehlivanoğlu, Seçkin; Nanda, Navin C

    2015-07-01

    We describe a case of primary cardiac malignant fibrous histiocytoma where live/real time three-dimensional transesophageal echocardiography added incremental value to the two-dimensional modalities. Specifically, the three-dimensional technique allowed us to delineate the true extent and infiltration of the tumor, to identify characteristics of the tumor mass suggestive of its malignant nature, and to quantitatively assess the total tumor burden. © 2015, Wiley Periodicals, Inc.

  4. Rapid Creation and Quantitative Monitoring of High Coverage shRNA Libraries

    PubMed Central

    Bassik, Michael C.; Lebbink, Robert Jan; Churchman, L. Stirling; Ingolia, Nicholas T.; Patena, Weronika; LeProust, Emily M.; Schuldiner, Maya; Weissman, Jonathan S.; McManus, Michael T.

    2009-01-01

    Short hairpin RNA (shRNA) libraries are limited by the low efficacy of many shRNAs, giving false negatives, and off-target effects, giving false positives. Here we present a strategy for rapidly creating expanded shRNA pools (∼30 shRNAs/gene) that are analyzed by deep-sequencing (EXPAND). This approach enables identification of multiple effective target-specific shRNAs from a complex pool, allowing a rigorous statistical evaluation of whether a gene is a true hit. PMID:19448642

  5. Topology Design for Directional Range Extension Networks with Antenna Blockage

    DTIC Science & Technology

    2017-03-19

    introduced by pod-based antenna blockages. Using certain modeling approximations, the paper presents a quantitative analysis showing design trade-offs...parameters. Section IV develops quantitative relationships among key design elements and performance metrics. Section V considers some implications of the...Topology Design for Directional Range Extension Networks with Antenna Blockage Thomas Shake MIT Lincoln Laboratory shake@ll.mit.edu Abstract

  6. Accuracy improvement of quantitative analysis by spatial confinement in laser-induced breakdown spectroscopy.

    PubMed

    Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F

    2013-07-29

    To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy, the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements with low concentrations, such as vanadium (V), chromium (Cr), and manganese (Mn), in the steel samples were investigated. After the optimization of the cavity dimension and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm(2) using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the V I 440.85/Fe I 438.35 nm was increased from 0.946 (without the cavity) to 0.981 (with the cavity); and similar results for Cr I 425.43/Fe I 425.08 nm and Mn I 476.64/Fe I 492.05 nm were also obtained. Therefore, it was demonstrated that the accuracy of quantitative analysis with low concentration elements in steel samples was improved, because the plasma became uniform with spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative analysis of LIBS.
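
    The calibration quality metric cited above is the correlation coefficient of a line-intensity-ratio calibration curve (e.g. V I 440.85 nm / Fe I 438.35 nm versus concentration). As a minimal illustration, with made-up data rather than values from the study, Pearson's r can be computed as:

```python
# Pearson correlation coefficient, as used to judge calibration-curve
# quality in LIBS quantitative analysis. Data below are hypothetical.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

conc  = [0.1, 0.2, 0.3, 0.4, 0.5]       # element concentration (wt%), assumed
ratio = [0.11, 0.19, 0.33, 0.38, 0.52]  # line-intensity ratio, assumed
print(pearson_r(conc, ratio))
```

    A value closer to 1 indicates a more linear calibration curve, which is the improvement the spatial confinement provided.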

  7. Micro-PIXE studies of Lupinus angustifolius L. after treatment of seeds with molybdenum

    NASA Astrophysics Data System (ADS)

    Przybylowicz, W. J.; Mesjasz-Przybylowicz, J.; Wouters, K.; Vlassak, K.; Combrink, N. J. J.

    1997-02-01

    An example of a nuclear microprobe application in agriculture is presented. The NAC nuclear microprobe was used to determine the quantitative elemental distribution of major, minor and trace elements in Lupinus angustifolius L. (Leguminosae) after treatment of seeds with molybdenum. Experiments were performed in order to establish safe concentration levels and sources of Mo in seed treatments. Elemental distributions in Mo-treated plants and in non-treated control plants were studied in order to explain how Mo causes toxicity. Specific regions enriched in Mo and in other major and trace elements were identified.

  8. Generalization techniques to reduce the number of volume elements for terrain effect calculations in fully analytical gravitational modelling

    NASA Astrophysics Data System (ADS)

    Benedek, Judit; Papp, Gábor; Kalmár, János

    2018-04-01

    Beyond the rectangular prism, the polyhedron can also be used as a discrete volume element to model the density distribution inside 3D geological structures. The evaluation of the closed formulae given for the gravitational potential and its higher-order derivatives, however, needs about twice the runtime of the rectangular prism computations. Although the "the more detailed the better" principle is generally accepted, it is strictly true only for errorless data. As soon as errors are present, any forward gravitational calculation from the model is only one possible realization of the true force field, at the significance level determined by the errors. So if one really considers the reliability of the input data used in the calculations, then sometimes "less" can be equivalent to "more" in a statistical sense. As a consequence, the processing time of the related complex formulae can be significantly reduced by optimizing the number of volume elements based on accuracy estimates of the input data. New algorithms are proposed to minimize the number of model elements defined both in local and in global coordinate systems. Common gravity field modelling programs generate optimized models for every computation point (dynamic approach), whereas the static approach provides only one optimized model for all. Based on the static approach, two different algorithms were developed. The grid-based algorithm starts with the maximum-resolution polyhedral model defined by 3-3 points of each grid cell and generates a new polyhedral surface defined by points selected from the grid. The other algorithm is more general; it also works for irregularly distributed data (scattered points) connected by triangulation. Beyond the description of the optimization schemes, some applications of these algorithms in regional and local gravity field modelling are presented. The efficiency of the static approaches may provide more than 90% reduction in computation time in favourable situations, without loss of reliability of the calculated gravity field parameters.

  9. Using Neutron Spectroscopy to Obtain Quantitative Composition Data of Ganymede's Surface from the Jupiter Ganymede Orbiter

    NASA Astrophysics Data System (ADS)

    Lawrence, D. J.; Maurice, S.; Patterson, G. W.; Hibbitts, C. A.

    2010-05-01

    Understanding the global composition of Ganymede's surface is a key goal of the Europa Jupiter System Mission (EJSM) that is being jointly planned by NASA and ESA. Current plans for obtaining surface information with the Jupiter Ganymede Orbiter (JGO) use spectral imaging measurements. While spectral imaging can provide good mineralogy-related information, quantitative data about elemental abundances can often be hindered by non-compositional variations due to surface effects (e.g., space weathering, grain effects, temperature, etc.). Orbital neutron and gamma-ray spectroscopy can provide quantitative composition information that is complementary to spectral imaging measurements, as has been demonstrated with similar instrument combinations at the Moon, Mars, and Mercury. Neutron and gamma-ray measurements have successfully returned abundance information in a hydrogen-rich environment on Mars, and with regard to neutrons and gamma-rays there are many similarities between the Mars and Ganymede hydrogen-rich environments. In this study, we present results of neutron transport models, which show that quantitative composition information from Ganymede's surface can be obtained in a realistic mission scenario. Thermal and epithermal neutrons are jointly sensitive to the abundances of hydrogen and of neutron-absorbing elements, such as iron and titanium. These neutron measurements can discriminate between regions that are rich or depleted in neutron-absorbing elements, even in the presence of large amounts of hydrogen. Details will be presented about how the neutron composition parameters can be used to meet high-level JGO science objectives, as well as an overview of a neutron spectrometer that can meet various mission and stringent environmental requirements.

  10. How can we establish more successful knowledge networks in developing countries? Lessons learnt from knowledge networks in Iran.

    PubMed

    Yazdizadeh, Bahareh; Majdzadeh, Reza; Alami, Ali; Amrolalaei, Sima

    2014-10-29

    Formal knowledge networks are considered among the solutions for strengthening knowledge translation and one of the elements of innovative systems in developing and developed countries. In the year 2000, knowledge networks were established in Iran's health system to organize, lead, empower, and coordinate efforts made by health-related research centers in the country. Since the assessment of a knowledge network is one of the main requirements for its success, the current study was designed in two sections, qualitative and quantitative, to identify the strengths and weaknesses of the established knowledge networks and to assess their efficiency. In the qualitative section, semi-structured, in-depth interviews were held with network directors and secretaries. The interviews were analyzed through the framework approach. To analyze effectiveness, the social network analysis approach was used. That is, by considering the networks' research council members as 'nodes', and the numbers of their joint articles--before and after the network establishments--as 'relations or ties', indices of density, clique, and centrality were calculated for each network. In the qualitative section, non-transparency of management, lack of goals, and administrative problems were among the most prevalent issues observed. Currently, the most important challenges are the policies related to the networks and their management. In the quantitative section, we observed that density and clique indices had risen for some networks; however, the centrality index for the same networks was not as high. Consequently, the rise in density and clique indices could not be attributed to the networks themselves. Therefore, consolidating and revising policies relevant to the networks and preparing a guide for establishing and managing networks could prove helpful. To develop knowledge and technology in a country, networks need to solve the problems they face in management and governance; this is the first step towards the realization of true knowledge networks in the health system.
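
    The density and degree-centrality indices named above have simple closed forms. A minimal sketch, using a hypothetical five-member network rather than the study's data, for an undirected co-authorship graph:

```python
# Illustrative social network analysis indices: nodes are research council
# members, edges are joint-article ties. Data are hypothetical.

def density(n_nodes, edges):
    """Fraction of possible undirected ties that are present: 2E / (N(N-1))."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len(edges) / possible if possible else 0.0

def degree_centrality(n_nodes, edges):
    """Normalized degree: each node's tie count divided by N-1 possible ties."""
    deg = {i: 0 for i in range(n_nodes)}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return {i: d / (n_nodes - 1) for i, d in deg.items()}

ties = [(0, 1), (0, 2), (1, 2), (3, 4)]   # 5 members, 4 joint-article ties
print(density(5, ties))                   # 4 / 10 = 0.4
print(degree_centrality(5, ties))
```

    A rise in density after network establishment means a larger share of member pairs co-author; centrality identifies which members broker those ties.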

  11. A sensitive and quantitative element-tagged immunoassay with ICPMS detection.

    PubMed

    Baranov, Vladimir I; Quinn, Zoë; Bandura, Dmitry R; Tanner, Scott D

    2002-04-01

    We report a set of novel immunoassays in which proteins of interest can be detected using specific element-tagged antibodies. These immunoassays are directly coupled with an inductively coupled plasma mass spectrometer (ICPMS) to quantify the elemental (in this work, metal) component of the reacted tagged antibodies. It is demonstrated that these methods can detect levels of target proteins as low as 0.1-0.5 ng/mL and yield a linear response to protein concentration over 3 orders of magnitude.

  12. Thermal evaluation for exposed stone house with quantitative and qualitative approach in mountainous area, Wonosobo, Indonesia

    NASA Astrophysics Data System (ADS)

    Hermawan, Hermawan; Prianto, Eddy

    2017-12-01

    A building can be considered as having a good thermal performance if it makes its occupants comfortable. Thermal comfort can be seen from the occupants' response to the architectural elements and the environment, such as lighting, room crowding, air temperature, humidity, oxygen level, and occupants' behaviours. The objective of this research is to analyse the thermal performance of four houses with different orientations in a mountainous area. The research was conducted on four exposed stone houses with four different orientations on the slope of Sindoro Mountain, which has a relatively cool temperature of about 26°C. The measurement of the elements above was done quantitatively and qualitatively for 24 hours. The results are as follows. First, the most comfortable house is the west-oriented house. Second, between the quantitative and qualitative observations there is no significant difference (±5%). Third, the occupants' behaviours (caring and genen) also become factors influencing occupants' comfort.

  13. Reference-free error estimation for multiple measurement methods.

    PubMed

    Madan, Hennadii; Pernuš, Franjo; Špiclin, Žiga

    2018-01-01

    We present a computational framework to select the most accurate and precise method of measurement of a certain quantity when there is no access to the true value of the measurand. A typical use case is when several image analysis methods are applied to measure the value of a particular quantitative imaging biomarker from the same images. The accuracy of each measurement method is characterized by its systematic error (bias), which is modeled as a polynomial in the true values of the measurand, and its precision by a random error modeled with a Gaussian random variable. In contrast to previous works, the random errors are modeled jointly across all methods, thereby enabling the framework to analyze measurement methods based on similar principles, which may have correlated random errors. Furthermore, the posterior distribution of the error model parameters is estimated from samples obtained by Markov chain Monte-Carlo and analyzed to estimate the parameter values and the unknown true values of the measurand. The framework was validated on six synthetic datasets and one clinical dataset containing measurements of total lesion load, a biomarker of neurodegenerative diseases, obtained with four automatic methods by analyzing brain magnetic resonance images. The estimates of bias and random error were in good agreement with the corresponding least squares regression estimates against a reference.
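
    The error model described above, bias as a polynomial in the true value plus Gaussian random error, can be illustrated with a toy one-method, linear-bias case fitted by ordinary least squares against a known reference (the paper's validation baseline). This is a sketch under those assumptions; the framework's joint, reference-free MCMC estimation is not reproduced here.

```python
import random

# Simulate one measurement method: y = b0 + b1*t + eps, where t is the
# true measurand value, (b0, b1) is a linear bias polynomial, and eps is
# Gaussian random error. Then recover (b0, b1) by ordinary least squares.
random.seed(0)
b0_true, b1_true, sigma = 2.0, 1.1, 0.5           # assumed toy parameters
t = [float(i) for i in range(50)]                 # reference (true) values
y = [b0_true + b1_true * ti + random.gauss(0, sigma) for ti in t]

n = len(t)
mt, my = sum(t) / n, sum(y) / n
b1 = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) \
     / sum((ti - mt) ** 2 for ti in t)
b0 = my - b1 * mt
print(b0, b1)   # close to the simulated bias (2.0, 1.1)
```

    With correlated errors across several methods, a joint fit like the one in the paper is needed; the independent fit above would misstate each method's precision.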

  14. A Rasch scaling validation of a 'core' near-death experience.

    PubMed

    Lange, Rense; Greyson, Bruce; Houran, James

    2004-05-01

    For those with true near-death experiences (NDEs), Greyson's (1983, 1990) NDE Scale satisfactorily fits the Rasch rating scale model, thus yielding a unidimensional measure with interval-level scaling properties. With increasing intensity, NDEs reflect peace, joy and harmony, followed by insight and mystical or religious experiences, while the most intense NDEs involve an awareness of things occurring in a different place or time. The semantics of this variable are invariant across True-NDErs' gender, current age, age at time of NDE, and latency and intensity of the NDE, thus identifying NDEs as 'core' experiences whose meaning is unaffected by external variables, regardless of variations in NDEs' intensity. Significant qualitative and quantitative differences were observed between True-NDErs and other respondent groups, mostly revolving around the differential emphasis on paranormal/mystical/religious experiences vs. standard reactions to threat. The findings further suggest that False-Positive respondents reinterpret other profound psychological states as NDEs. Accordingly, the Rasch validation of the typology proposed by Greyson (1983) also provides new insights into previous research, including the possibility of embellishment over time (as indicated by the finding of positive, as well as negative, latency effects) and the potential roles of religious affiliation and religiosity (as indicated by the qualitative differences surrounding paranormal/mystical/religious issues).

  15. True-time-delay photonic beamformer for an L-band phased array radar

    NASA Astrophysics Data System (ADS)

    Zmuda, Henry; Toughlian, Edward N.; Payson, Paul M.; Malowicki, John E.

    1995-10-01

    The problem of obtaining a true-time-delay photonic beamformer has recently been a topic of great interest, and many novel approaches to it have been studied. This paper examines the design, construction, and testing of a dynamic optical processor for the control of a 20-element phased array antenna operating at L-band (1.2-1.4 GHz). The approach taken here has several distinct advantages. The actual optical control is accomplished with a class of spatial light modulator known as a segmented mirror device (SMD). This allows for the possibility of controlling an extremely large number (tens of thousands) of antenna elements using integrated circuit technology. The SMD technology is driven by the HDTV and laser printer markets, so cost reductions as well as technological improvements are expected. Optical splitting is efficiently accomplished using a diffractive optical element. This again has the potential for use in antenna array systems with a large number of radiating elements. The actual time delay is achieved using a single acousto-optic device for all the array elements; acousto-optic device technologies offer sufficient delay for a time-steered array. The topological configuration is an optical heterodyne system, hence high center frequencies, potentially into the millimeter-wave range, are possible by mixing two lasers of slightly differing frequencies. Finally, the entire system is spatially integrated into a 3D glass substrate. The integrated system provides the ruggedness needed in most applications and essentially eliminates the drift problems associated with free-space optical systems. Though the system is presently configured as a beamformer, it can also operate as a general photonic signal processing element in an adaptive (reconfigurable) transversal frequency filter configuration. 
Such systems are widely applicable in jammer/noise canceling systems, broadband ISDN, and for spread spectrum secure communications. This paper also serves as an update of work-in-progress at the Rome Laboratory Photonics Center Optical Beamforming Lab. The multi-faceted aspects of the design and construction of this state-of-the-art beamforming project will be discussed. Experimental results which demonstrate the performance of the system to-date with regard to both maximum delay and resolution over a broad bandwidth are presented.

  16. Deception in Program Evaluation Design

    DTIC Science & Technology

    2014-10-31

    CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Scott Cheney-Peters 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT...stakeholders interested in assessments as a true reflection of a program's state have a variety of methods at hand to mitigate their impacts. Even in...26. Cites attempts to manipulate the reception and understanding of findings on climate research and intelligence reports. 3 whether determining

  17. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  18. A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsap, L V; Zhang, Y; Goldgof, D B

    2004-04-02

    A modeling approach is presented for quantitative burn scar assessment. Emphases are given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that knowledge of the scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potential for image-based quantitative burn scar assessment.

  19. CNV-ROC: A cost effective, computer-aided analytical performance evaluator of chromosomal microarrays

    PubMed Central

    Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.

    2016-01-01

    Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high-throughput, low-cost analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs, which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567

  20. Families of short interspersed elements in the genome of the oomycete plant pathogen, Phytophthora infestans.

    PubMed

    Whisson, Stephen C; Avrova, Anna O; Lavrova, Olga; Pritchard, Leighton

    2005-04-01

    The first known families of tRNA-related short interspersed elements (SINEs) in the oomycetes were identified by exploiting the genomic DNA sequence resources for the potato late blight pathogen, Phytophthora infestans. Fifteen families of tRNA-related SINEs, as well as predicted tRNAs, and other possible RNA polymerase III-transcribed sequences were identified. The size of individual elements ranges from 101 to 392 bp, representing sequences present from low (1) to highly abundant (over 2000) copy number in the P. infestans genome, based on quantitative PCR analysis. Putative short direct repeat sequences (6-14 bp) flanking the elements were also identified for eight of the SINEs. Predicted SINEs were named in a series prefixed infSINE (for infestans-SINE). Two SINEs were apparently present as multimers of tRNA-related units; four copies of a related unit for infSINEr, and two unrelated units for infSINEz. Two SINEs, infSINEh and infSINEi, were typically located within 400 bp of each other. These were also the only two elements identified as being actively transcribed in the mycelial stage of P. infestans by RT-PCR. It is possible that infSINEh and infSINEi represent active retrotransposons in P. infestans. Based on the quantitative PCR estimates of copy number for all of the elements identified, tRNA-related SINEs were estimated to comprise 0.3% of the 250 Mb P. infestans genome. InfSINE-related sequences were found to occur in species throughout the genus Phytophthora. However, seven elements were shown to be exclusive to P. infestans.

  1. Structural Equation Modelling of Multiple Facet Data: Extending Models for Multitrait-Multimethod Data

    ERIC Educational Resources Information Center

    Bechger, Timo M.; Maris, Gunter

    2004-01-01

    This paper is about the structural equation modelling of quantitative measures that are obtained from a multiple facet design. A facet is simply a set consisting of a finite number of elements. It is assumed that measures are obtained by combining each element of each facet. Methods and traits are two such facets, and a multitrait-multimethod…

  2. From Nominal to Quantitative Codification of Content-Neutral Variables in Graphics Research: The Beginnings of a Manifest Content Model.

    ERIC Educational Resources Information Center

    Crow, Wendell C.

    This paper suggests ways in which manifest, physical attributes of graphic elements can be described and measured. It also proposes a preliminary conceptual model that accounts for the readily apparent, measurable variables in a visual message. The graphic elements that are described include format, typeface, and photographs/artwork. The…

  3. Reducing the wave drag of wing airfoils in transonic flow regimes by the force action of airfoil surface elements on the flow

    NASA Astrophysics Data System (ADS)

    Aul'chenko, S. M.; Zamuraev, V. P.

    2012-11-01

    Mathematical modeling of the influence of forced oscillations of surface elements of a wing airfoil on the shock-wave structure of transonic flow past it has been carried out. The qualitative and quantitative influence of the oscillation parameters on the wave drag of the airfoil has been investigated.

  4. Prediction of local proximal tibial subchondral bone structural stiffness using subject-specific finite element modeling: Effect of selected density-modulus relationship.

    PubMed

    Nazemi, S Majid; Amini, Morteza; Kontulainen, Saija A; Milner, Jaques S; Holdsworth, David W; Masri, Bassam A; Wilson, David R; Johnston, James D

    2015-08-01

    Quantitative computed tomography-based subject-specific finite element modeling has potential to clarify the role of subchondral bone alterations in knee osteoarthritis initiation, progression, and pain initiation. Calculation of bone elastic moduli from image data is a basic step when constructing finite element models. However, different relationships between elastic moduli and imaged density (known as density-modulus relationships) have been reported in the literature. The objective of this study was to apply seven different trabecular-specific and two cortical-specific density-modulus relationships from the literature to finite element models of proximal tibia subchondral bone, and identify the relationship(s) that best predicted experimentally measured local subchondral structural stiffness with highest explained variance and least error. Thirteen proximal tibial compartments were imaged via quantitative computed tomography. Imaged bone mineral density was converted to elastic moduli using published density-modulus relationships and mapped to corresponding finite element models. Proximal tibial structural stiffness values were compared to experimentally measured stiffness values from in-situ macro-indentation testing directly on the subchondral bone surface (47 indentation points). Regression lines between experimentally measured and finite element calculated stiffness had R² values ranging from 0.56 to 0.77. Normalized root mean squared error varied from 16.6% to 337.6%. Of the 21 evaluated density-modulus relationships in this study, Goulet combined with Snyder and Schneider or Rho appeared most appropriate for finite element modeling of local subchondral bone structural stiffness. However, further studies are needed to optimize density-modulus relationships and improve finite element estimates of local subchondral bone structural stiffness. Copyright © 2015 Elsevier Ltd. All rights reserved.
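
Published density-modulus relationships generally take a power-law form, E = a·ρ^b, and the study ranks candidate relationships by explained variance and normalized error against measured stiffness. The sketch below illustrates that scoring pipeline; the coefficients, units, and density values are placeholders for illustration, not the relationships or data from the paper.

```python
import numpy as np

def density_to_modulus(rho_mg_cm3, a=6850.0, b=1.49):
    """Generic power-law density-modulus relationship, E in MPa.

    The coefficients a and b are illustrative placeholders; each published
    relationship supplies its own values and density definition.
    """
    return a * (rho_mg_cm3 / 1000.0) ** b  # density converted to g/cm^3

def r2_and_nrmse(measured, predicted):
    """Explained variance and range-normalized RMSE, the two scores used
    to rank candidate density-modulus relationships."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    nrmse = np.sqrt(np.mean((measured - predicted) ** 2)) / np.ptp(measured)
    return r2, nrmse

rho = np.array([250.0, 400.0, 600.0])   # apparent density samples, mg/cm^3
print(density_to_modulus(rho))          # element-wise elastic moduli, MPa
```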

  5. Monitoring of toxic elements present in sludge of industrial waste using CF-LIBS.

    PubMed

    Kumar, Rohit; Rai, Awadhesh K; Alamelu, Devanathan; Aggarwal, Suresh K

    2013-01-01

    Industrial waste is one of the main causes of environmental pollution. Laser-induced breakdown spectroscopy (LIBS) was applied to detect the toxic metals in the sludge of industrial waste water. Sludge on filter paper was obtained after filtering the collected waste water samples from different sections of a water treatment plant situated in an industrial area of Kanpur City. The LIBS spectra of the sludge samples were recorded in the spectral range of 200 to 500 nm by focusing the laser light on the sludge. The calibration-free laser-induced breakdown spectroscopy (CF-LIBS) technique was used for the quantitative measurement of toxic elements such as Cr and Pb present in the sample. We also used the traditional calibration curve approach to quantify these elements. The results obtained from CF-LIBS are in good agreement with the results from the calibration curve approach. Thus, our results demonstrate that CF-LIBS is an appropriate technique for quantitative analysis where reference/standard samples are not available to make the calibration curve. The results of the present experiment are alarming for people living near areas of industrial activity, as the concentrations of toxic elements are quite high compared with the admissible limits of these substances.
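
A core step of calibration-free LIBS is the Boltzmann plot: for emission lines of one species, ln(Iλ/(gA)) is linear in the upper-level energy E_k with slope −1/(k_B·T), so the plasma temperature (and, from it, relative elemental concentrations) can be extracted without reference samples. A minimal sketch, using synthetic line data generated for an assumed 10,000 K plasma; the wavelengths, energies, and gA values are invented for illustration.

```python
import numpy as np

K_B_EV = 8.617333262e-5        # Boltzmann constant, eV/K
T_TRUE = 10_000.0              # assumed plasma temperature, K

E_upper = np.array([3.0, 3.8, 4.5, 5.2, 6.1])        # eV, synthetic lines
gA = np.array([2.2e8, 5.0e7, 1.1e8, 3.3e7, 8.0e7])   # g_k * A_ki, 1/s
lam = np.array([510.0, 480.0, 460.0, 430.0, 400.0])  # wavelengths, nm

# Synthetic line intensities following the Boltzmann distribution.
intensity = (gA / lam) * np.exp(-E_upper / (K_B_EV * T_TRUE))

y = np.log(intensity * lam / gA)        # Boltzmann-plot ordinate
slope, _ = np.polyfit(E_upper, y, 1)    # linear fit: slope = -1/(k_B * T)
T_fit = -1.0 / (K_B_EV * slope)
print(f"fitted plasma temperature: {T_fit:.0f} K")
```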

  6. Correlative fractography: combining scanning electron microscopy and light microscopes for qualitative and quantitative analysis of fracture surfaces.

    PubMed

    Hein, Luis Rogerio de Oliveira; de Oliveira, José Alberto; de Campos, Kamila Amato

    2013-04-01

    Correlative fractography is a new expression proposed here to describe a new method for the association between scanning electron microscopy (SEM) and light microscopy (LM) for the qualitative and quantitative analysis of fracture surfaces. This article presents a new method involving the fusion of one elevation map obtained by extended depth from focus reconstruction from LM with exactly the same area by SEM and associated techniques, such as X-ray mapping. The true topographic information is perfectly associated with local fracture mechanisms by this new technique, presented here as an alternative to stereo-pair reconstruction for the investigation of fractured components. The great advantage of this technique resides in the possibility of combining any imaging methods associated with LM and SEM for the same observed field of the fracture surface.

  7. Benchmark model correction of monitoring system based on Dynamic Load Test of Bridge

    NASA Astrophysics Data System (ADS)

    Shi, Jing-xian; Fan, Jiang

    2018-03-01

    Structural health monitoring (SHM) is an active field of research aimed at assessing bridge safety and reliability, an assessment that must rest on an accurate finite element simulation of the structure. A bridge finite element model simplifies the structural section form, support conditions, material properties, and boundary conditions on the basis of the design and construction drawings, and the calculation model and its results are obtained from these simplifications. However, a finite element model established solely from design documents and specification requirements cannot fully reflect the true state of the bridge, so the model must be updated to obtain a more accurate one. Taking the Da-guan river crossing of the Ma-Zhao highway in Yunnan province as the background, a dynamic load test was carried out. The impact coefficient of the theoretical model of the bridge was found to differ considerably from the coefficient obtained in the actual test, and the discrepancy varied from case to case. The calculation model was therefore adjusted according to the measured data to reproduce the correct natural frequencies of the bridge. The revised impact coefficient showed that the modified finite element model is closer to the real state of the structure, providing a basis for finite element model updating.

  8. Quantitation and detection of vanadium in biologic and pollution materials

    NASA Technical Reports Server (NTRS)

    Gordon, W. A.

    1974-01-01

    A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.

  9. Nanoscale Magnetism in Next Generation Magnetic Nanoparticles

    DTIC Science & Technology

    2018-03-17

    as dextran-coated SPIONs were studied. From the measured T1 and T2 relaxation times, a new method called Quantitative Ultra-Short Time-to-Echo...angiograms with high clarity and definition, and enabled quantitative MRI in biological samples. At UCL, the work included (i) fabricating multi-element...distribution unlimited. I. Introduction Compared to flat biosensor devices, 3D engineered biosensors achieve more intimate and conformal interfaces with cells

  10. [Mathematical anatomy: muscles according to Stensen].

    PubMed

    Andrault, Raphaële

    2010-01-01

    In his Elementorum Myologiae Specimen, Steno geometrizes "the new fabric of muscles" and their movement of contraction, so as to refute the main contemporary hypothesis about the functioning of the muscles. This physiological refutation relies on an abstract representation of the muscular fibre as a parallelepiped of flesh transversally linked to the tendons. Those two features have been comprehensively studied. But the method used by Steno, as well as the way he has chosen to present his physiological results, has so far been neglected. Yet, Steno's work follows a true synthetic order, which he conceives as a tool to separate uncertain anatomical "elements" from the certain ones. We will show that a true understanding of this "more geometrico" order is the only way to avoid frequent misconceptions of the scientific aim pursued by Steno, which is neither to give a mathematical explanation of the functioning of the muscles, nor to reduce the muscles to some mathematical shapes.

  11. True time-delay photonic beamforming with fine steerability and frequency-agility for spaceborne phased-arrays: a proof-of-concept demonstration

    NASA Astrophysics Data System (ADS)

    Paul, Dilip K.; Razdan, Rajender; Goldman, Alfred M.

    1996-10-01

    Feasibility of photonics in beam forming and steering of large phased-array antennas onboard communications satellite/avionics systems is addressed in this paper. Specifically, a proof-of-concept demonstration of a phased-array antenna feed network using fiber optic true time-delay (TTD) elements is reported for SATCOM phased-array antennas operating at C-band. Results of the photonic hardware design and performance analysis, including the measured radiation patterns of the antenna array fed by the photonic BFN, are presented. An excellent agreement between the analysis and measured data has been observed. In addition to being lightweight and compact, several unique characteristics such as RF carrier frequency agility and continuous steerability of the radiated beam achieved by the fiber optic TTD architecture are clear evidence of its superiority over other competing photonic architectures.

  12. Demonstration of a linear optical true-time delay device by use of a microelectromechanical mirror array.

    PubMed

    Rader, Amber; Anderson, Betty Lise

    2003-03-10

    We present the design and proof-of-concept demonstration of an optical device capable of producing true time delays (TTDs) for phased array antennas. This TTD device uses a free-space approach consisting of a single microelectromechanical systems (MEMS) mirror array in a multiple-reflection spherical mirror configuration based on the White cell. Divergence is avoided by periodic refocusing by the mirrors. By using the MEMS mirror to switch between paths of different lengths, time delays are generated. Six different delays in 1-ns increments were demonstrated by using the Texas Instruments Digital Micromirror Device as the switching element. Losses of 1.6 to 5.2 dB per bounce and crosstalk of -27 dB were also measured, both resulting primarily from diffraction from holes in each pixel and the inter-pixel gaps of the MEMS.
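
The delay arithmetic behind such a free-space device is simple: each delay increment corresponds to an extra optical path length ΔL = c·Δt, so a 1-ns step needs roughly 0.3 m of additional path. A minimal sketch (the increment and count mirror the demonstration; the geometry is otherwise not modeled):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def extra_path_m(delay_s):
    """Free-space path length required to realize a given time delay."""
    return C * delay_s

def delay_paths(n_delays, increment_s=1e-9):
    """Extra path lengths for n delays in fixed increments (e.g., 1 ns)."""
    return [extra_path_m(k * increment_s) for k in range(1, n_delays + 1)]

# Six delays in 1-ns increments, as in the demonstration:
for k, L in enumerate(delay_paths(6), start=1):
    print(f"{k} ns -> {L:.3f} m of extra free-space path")
```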

  13. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
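
The Monte Carlo approach draws each elemental measurement from its random and systematic error distributions and recomputes the derived quantity many times; the spread of the results is the propagated uncertainty. A minimal sketch for a free-stream Mach number derived from total and static pressure via the standard isentropic relation; the pressure values and uncertainty magnitudes are illustrative, not the facility's.

```python
import numpy as np

rng = np.random.default_rng(0)
GAMMA = 1.4  # ratio of specific heats for air

def mach_from_pressures(p_total, p_static):
    """Isentropic relation: p0/p = (1 + (g-1)/2 * M^2)^(g/(g-1))."""
    pr = p_total / p_static
    return np.sqrt(2.0 / (GAMMA - 1.0) * (pr ** ((GAMMA - 1.0) / GAMMA) - 1.0))

N = 100_000
p0_nom, ps_nom = 101_325.0, 53_528.0        # nominal pressures, Pa (~Mach 1)
p0 = p0_nom + rng.normal(0.0, 50.0, N)      # random (per-sample) error
ps = ps_nom + rng.normal(0.0, 50.0, N)
bias = rng.normal(0.0, 100.0, N)            # systematic (offset) error draws
mach = mach_from_pressures(p0 + bias, ps + bias)

print(f"Mach = {mach.mean():.4f} +/- {2 * mach.std():.4f} (approx. 95%)")
```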

  14. Progress report on Bertelsen research and development of an air cushion crawler all-terrain vehicle

    NASA Astrophysics Data System (ADS)

    Bertelsen, W. R.

    1987-06-01

    The ACV is an exceptional amphibian but it is not, nor is any other existing craft, an all-terrain vehicle (ATV). Using the best elements of the ACV in an air-cushion crawler tractor, a true ATV can be attained. A conventional crawler drive train will propel two tracks as pressurized, propulsive pontoons. The key to a successful ATV is in perfecting efficient, durable, sliding seals to allow the belt to move in its orbit around the track unit and maintain its internal pressure. After developing an adequate seal, a 12 inch wide x 86 inch long endless rubber belt was fitted with bilateral seals and slide plates, with internal guide wheels fore and aft on a 21 inch wheelbase. From this approximately one-quarter scale model, full-scale air track crawlers, true ATVs, of any size and capacity can be produced.

  15. A mathematical model for adaptive transport network in path finding by true slime mold.

    PubMed

    Tero, Atsushi; Kobayashi, Ryo; Nakagaki, Toshiyuki

    2007-02-21

    We describe here a mathematical model of the adaptive dynamics of a transport network of the true slime mold Physarum polycephalum, an amoeboid organism that exhibits path-finding behavior in a maze. This organism possesses a network of tubular elements, by means of which nutrients and signals circulate through the plasmodium. When the organism is put in a maze, the network changes its shape to connect two exits by the shortest path. This process of path-finding is attributed to an underlying physiological mechanism: a tube thickens as the flux through it increases. The experimental evidence for this is, however, only qualitative. We constructed a mathematical model of the general form of the tube dynamics. Our model contains a key parameter corresponding to the extent of the feedback regulation between the thickness of a tube and the flux through it. We demonstrate the dependence of the behavior of the model on this parameter.
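
The feedback law at the heart of this class of models can be demonstrated numerically: flux through each tube is proportional to its conductivity D over its length L, and conductivity follows dD/dt = |Q| − D, so the shorter of two competing tubes captures all the flux. A minimal sketch in the spirit of the model described above; the parameter values are illustrative, not the paper's.

```python
def simulate(L1=1.0, L2=2.0, total_flux=1.0, dt=0.01, steps=5000):
    """Two tubes of lengths L1, L2 compete for a fixed source-to-sink flux."""
    D1 = D2 = 0.5                       # equal initial conductivities
    for _ in range(steps):
        g1, g2 = D1 / L1, D2 / L2       # Poiseuille-like conductances
        Q1 = total_flux * g1 / (g1 + g2)
        Q2 = total_flux * g2 / (g1 + g2)
        D1 += dt * (abs(Q1) - D1)       # thickening feedback: flux grows tube
        D2 += dt * (abs(Q2) - D2)       # unused tube decays away
    return D1, D2

D1, D2 = simulate()
print(f"short path D={D1:.3f}, long path D={D2:.3f}")  # short path dominates
```

With equal lengths the symmetric state persists; any length difference tips the feedback so that the shorter path wins, which is the model's account of shortest-path finding in the maze.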

  16. Evaluation of an ensemble of genetic models for prediction of a quantitative trait.

    PubMed

    Milton, Jacqueline N; Steinberg, Martin H; Sebastiani, Paola

    2014-01-01

    Many genetic markers have been shown to be associated with common quantitative traits in genome-wide association studies. Typically these associated genetic markers have small to modest effect sizes and individually they explain only a small amount of the variability of the phenotype. In order to build a genetic prediction model without fitting a multiple linear regression model with possibly hundreds of genetic markers as predictors, researchers often summarize the joint effect of risk alleles into a genetic score that is used as a covariate in the genetic prediction model. However, the prediction accuracy can be highly variable and selecting the optimal number of markers to be included in the genetic score is challenging. In this manuscript we present a strategy to build an ensemble of genetic prediction models from data and we show that the ensemble-based method makes the challenge of choosing the number of genetic markers more amenable. Using simulated data with varying heritability and number of genetic markers, we compare the predictive accuracy and inclusion of true positive and false positive markers of a single genetic prediction model and our proposed ensemble method. The results show that the ensemble of genetic models tends to include a larger number of genetic variants than a single genetic model and it is more likely to include all of the true genetic markers. This increased sensitivity is obtained at the price of a lower specificity that appears to minimally affect the predictive accuracy of the ensemble.
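
The genetic-score idea summarized above (summing risk alleles over selected markers) and the ensemble of models that differ in how many markers they include can be sketched as follows. All data are simulated and the marker counts are arbitrary; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 500, 50
genotypes = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 allele counts
true_beta = np.zeros(m)
true_beta[:5] = 0.5                           # 5 true markers, modest effects
phenotype = genotypes @ true_beta + rng.normal(0.0, 1.0, n)

# Rank markers by marginal correlation with the phenotype.
corr = [abs(np.corrcoef(genotypes[:, j], phenotype)[0, 1]) for j in range(m)]
order = np.argsort(corr)[::-1]

def genetic_score(k):
    """Unweighted genetic score over the k top-ranked markers."""
    return genotypes[:, order[:k]].sum(axis=1)

# Ensemble: average predictions of score-based models with k = 3, 5, 10
# markers, each rescaled against the phenotype by least squares.
preds = []
for k in (3, 5, 10):
    A = np.column_stack([genetic_score(k), np.ones(n)])
    coef, *_ = np.linalg.lstsq(A, phenotype, rcond=None)
    preds.append(A @ coef)
ensemble = np.mean(preds, axis=0)
r2 = 1 - np.var(phenotype - ensemble) / np.var(phenotype)
print(f"ensemble R^2: {r2:.3f}")
```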

  17. Near-Infrared Spectroscopy Assay of Key Quality-Indicative Ingredients of Tongkang Tablets.

    PubMed

    Pan, Wenjie; Ma, Jinfang; Xiao, Xue; Huang, Zhengwei; Zhou, Huanbin; Ge, Fahuan; Pan, Xin

    2017-04-01

    The objective of this paper is to develop an easy and fast near-infrared spectroscopy (NIRS) assay for the four key quality-indicative active ingredients of Tongkang tablets by comparing the true content of the active ingredients measured by high performance liquid chromatography (HPLC) and the NIRS data. The HPLC values for the active ingredient contents of Cimicifuga glycoside, calycosin glucoside, 5-O-methylvisamminol and hesperidin in Tongkang tablets were set as reference values. The NIRS raw spectra of Tongkang tablets were processed using a first-order convolution method. The iterative optimization method was chosen to optimize the band for Cimicifuga glycoside and 5-O-methylvisamminol, and the correlation coefficient method was used to determine the optimal band for calycosin glucoside and hesperidin. A near-infrared quantitative calibration model was established for each quality-indicative ingredient by the partial least-squares method on the basis of the contents detected by HPLC and the obtained NIRS spectra. The correlation coefficient (R²) values of the four models of Cimicifuga glycoside, calycosin glucoside, 5-O-methylvisamminol and hesperidin were 0.9025, 0.8582, 0.9250, and 0.9325, respectively. It was demonstrated that the accuracy of the validation values was approximately 90% by comparison of the predicted results from NIRS models and the HPLC true values, which suggested that the NIRS assay was successfully established and validated. It was expected that the quantitative analysis models of the four indicative ingredients could be used to rapidly perform quality control in industrial production of Tongkang tablets.
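
The calibration step described above (regressing spectra on HPLC reference values by partial least squares) can be sketched with a minimal one-latent-variable PLS1 fit. The spectra below are synthetic, and a real model would use several latent variables plus the band selection and preprocessing the paper describes; this only illustrates the technique.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 40, 200
concentration = rng.uniform(0.5, 2.0, n_samples)          # "HPLC" reference
pure_spectrum = np.exp(-((np.arange(n_wavelengths) - 80) / 25.0) ** 2)
X = np.outer(concentration, pure_spectrum)                 # synthetic spectra
X += rng.normal(0.0, 0.01, X.shape)                        # instrument noise

def pls1_fit(X, y):
    """One latent variable, NIPALS-style: weights w, scores t, y-loading q."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)
    t = Xc @ w
    q = (yc @ t) / (t @ t)
    return X.mean(axis=0), y.mean(), w, q

def pls1_predict(model, Xnew):
    x_mean, y_mean, w, q = model
    return y_mean + ((Xnew - x_mean) @ w) * q

model = pls1_fit(X, concentration)
pred = pls1_predict(model, X)
r2 = 1 - np.sum((concentration - pred) ** 2) / np.sum(
    (concentration - concentration.mean()) ** 2)
print(f"calibration R^2 = {r2:.4f}")
```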

  18. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases.

    PubMed

    Hollingsworth, T Déirdre; Adams, Emily R; Anderson, Roy M; Atkins, Katherine; Bartsch, Sarah; Basáñez, María-Gloria; Behrend, Matthew; Blok, David J; Chapman, Lloyd A C; Coffeng, Luc; Courtenay, Orin; Crump, Ron E; de Vlas, Sake J; Dobson, Andy; Dyson, Louise; Farkas, Hajnal; Galvani, Alison P; Gambhir, Manoj; Gurarie, David; Irvine, Michael A; Jervis, Sarah; Keeling, Matt J; Kelly-Hope, Louise; King, Charles; Lee, Bruce Y; Le Rutte, Epke A; Lietman, Thomas M; Ndeffo-Mbah, Martial; Medley, Graham F; Michael, Edwin; Pandey, Abhishek; Peterson, Jennifer K; Pinsent, Amy; Porco, Travis C; Richardus, Jan Hendrik; Reimer, Lisa; Rock, Kat S; Singh, Brajendra K; Stolk, Wilma; Swaminathan, Subramanian; Torr, Steve J; Townsend, Jeffrey; Truscott, James; Walker, Martin; Zoueva, Alexandra

    2015-12-09

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview of a collection of novel model-based analyses which aim to address key questions on the dynamics of transmission and control of nine NTDs: Chagas disease, visceral leishmaniasis, human African trypanosomiasis, leprosy, soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis and trachoma. Several common themes resonate throughout these analyses, including: the importance of epidemiological setting on the success of interventions; targeting groups who are at highest risk of infection or re-infection; and reaching populations who are not accessing interventions and may act as a reservoir for infection. The results also highlight the challenge of maintaining elimination 'as a public health problem' when true elimination is not reached. The models elucidate the factors that may be contributing most to persistence of disease and discuss the requirements for eventually achieving true elimination, if that is possible. Overall this collection presents new analyses to inform current control initiatives. These papers form a base from which further development of the models and more rigorous validation against a variety of datasets can help to give more detailed advice. At the moment, the models' predictions are being considered as the world prepares for a final push towards control or elimination of neglected tropical diseases by 2020.

  19. Mechanism of transcription termination by RNA polymerase III utilizes a nontemplate-strand sequence-specific signal element

    PubMed Central

    Arimbasseri, Aneeshkumar G.; Maraia, Richard J.

    2015-01-01

    Understanding the mechanism of transcription termination by a eukaryotic RNA polymerase (RNAP) has been limited by lack of a characterizable intermediate that reflects transition from an elongation complex to a true termination event. While other multisubunit RNAPs require multipartite cis-signals and/or ancillary factors to mediate pausing and release of the nascent transcript from the clutches of these enzymes, RNAP III does so with precision and efficiency on a simple oligo(dT) tract, independent of other cis-elements or trans-factors. We report a RNAP III pre-termination complex that reveals termination mechanisms controlled by sequence-specific elements in the non-template strand. Furthermore, the TFIIF-like RNAP III subunit C37 is required for this function of the non-template strand signal. The results reveal the RNAP III terminator as an information-rich control element. While the template strand promotes destabilization via a weak oligo(rU:dA) hybrid, the non-template strand provides distinct sequence-specific destabilizing information through interactions with the C37 subunit. PMID:25959395

  20. Techniques of orbital decay and long-term ephemeris prediction for satellites in earth orbit

    NASA Technical Reports Server (NTRS)

    Barry, B. F.; Pimm, R. S.; Rowe, C. K.

    1971-01-01

    In the special perturbation method, Cowell and variation-of-parameters formulations of the motion equations are implemented and numerically integrated. Variations in the orbital elements due to drag are computed using the 1970 Jacchia atmospheric density model, which includes the effects of semiannual variations, diurnal bulge, solar activity, and geomagnetic activity. In the general perturbation method, two-variable asymptotic series and automated manipulation capabilities are used to obtain analytical solutions to the variation-of-parameters equations. Solutions are obtained considering the effect of oblateness only and the combined effects of oblateness and drag. These solutions are then numerically evaluated by means of a FORTRAN program in which an updating scheme is used to maintain accurate epoch values of the elements. The atmospheric density function is approximated by a Fourier series in true anomaly, and the 1970 Jacchia model is used to periodically update the Fourier coefficients. The accuracy of both methods is demonstrated by comparing computed orbital elements to actual elements over time spans of up to 8 days for the special perturbation method and up to 356 days for the general perturbation method.
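
The density-approximation step described above, representing atmospheric density as a truncated Fourier series in the true anomaly f with periodically refit coefficients, can be sketched with a simple least-squares fit. The stand-in density profile below is invented; the actual method evaluates the 1970 Jacchia model to update the coefficients.

```python
import numpy as np

def fit_fourier(f, rho, n_harmonics=3):
    """Least-squares Fourier coefficients of rho sampled at true anomalies f:
    rho(f) ~ a0 + sum_k (a_k cos(k f) + b_k sin(k f))."""
    cols = [np.ones_like(f)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * f), np.sin(k * f)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, rho, rcond=None)
    return A, coeffs

f = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
rho = 1.0 + 0.6 * np.cos(f) + 0.2 * np.sin(2 * f)   # stand-in density profile
A, coeffs = fit_fourier(f, rho)
print("max fit error:", np.abs(A @ coeffs - rho).max())
```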

  1. TORO II simulations of induction heating in ferromagnetic materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adkins, D.R.; Gartling, D.K.; Kelley, J.B.

    TORO II is a finite element computer program that is used in the simulation of electric and magnetic fields. This code, which was developed at Sandia National Laboratories, has been coupled with a finite element thermal code, COYOTE II, to predict temperature profiles in inductively heated parts. The development of an effective technique to account for the nonlinear behavior of the magnetic permeability in ferromagnetic parts is one of the more difficult aspects of solving induction heating problems. In the TORO II code, nonlinear, spatially varying magnetic permeability is approximated by an effective permeability on an element-by-element basis that effectively provides the same energy deposition that is produced when the true permeability is used. This approximation has been found to give an accurate estimate of the volumetric heating distribution in the part, and predicted temperature distributions have been experimentally verified using a medium carbon steel and a 10 kW industrial induction heating unit. Work on the model was funded through a Cooperative Research and Development Agreement (CRADA) between the Department of Energy and General Motors' Delphi Saginaw Steering Systems.

  2. Discrimination of microbiological samples using femtosecond laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Baudelet, Matthieu; Yu, Jin; Bossu, Myriam; Jovelet, Julien; Wolf, Jean-Pierre; Amodeo, Tanguy; Fréjafon, Emeric; Laloi, Patrick

    2006-10-01

    Using femtosecond laser-induced breakdown spectroscopy, the authors have analyzed five different species of bacteria. Line emissions from six trace mineral elements, Na, Mg, P, K, Ca, and Fe, have been clearly detected. Their intensities correspond to relative concentrations of these elements contained in the analyzed samples. The authors demonstrate that the concentration profile of trace elements allows unambiguous discrimination of different bacteria. Quantitative differentiation has been made by representing bacteria in a six-dimensional hyperspace, with each of its axes representing a detected trace element. In this hyperspace, representative points of different bacterial species are gathered in different and distinct volumes.
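
The discrimination idea above reduces each spectrum to a six-dimensional vector of trace-element line intensities, with species forming distinct clusters in that space; a new sample can then be assigned to the nearest cluster centroid. A minimal sketch with invented intensity values (the paper does not specify this particular classifier):

```python
import numpy as np

ELEMENTS = ["Na", "Mg", "P", "K", "Ca", "Fe"]

# Hypothetical mean intensity profiles (arbitrary units) for three species.
centroids = {
    "species A": np.array([8.0, 2.0, 5.0, 7.0, 1.0, 0.5]),
    "species B": np.array([3.0, 6.0, 2.0, 1.0, 8.0, 1.5]),
    "species C": np.array([1.0, 1.0, 9.0, 2.0, 2.0, 6.0]),
}

def classify(profile):
    """Assign a 6-element intensity vector to the nearest species centroid."""
    return min(centroids, key=lambda s: np.linalg.norm(profile - centroids[s]))

unknown = np.array([7.5, 2.2, 4.6, 6.8, 1.3, 0.6])
print(classify(unknown))   # closest to species A's profile
```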

  3. Concentrations of platinum group elements in 122 U.S. coal samples

    USGS Publications Warehouse

    Oman, C.L.; Finkelman, R.B.; Tewalt, S.J.

    1997-01-01

    Analysis of more than 13,000 coal samples by semi-quantitative optical emission spectroscopy (OES) indicates that concentrations of the platinum group elements (iridium, palladium, platinum, osmium, rhodium, and ruthenium) are less than 1 ppm in the ash, the limit of detection for this method of analysis. In order to accurately determine the concentration of the platinum group elements (PGE) in coal, additional data were obtained by inductively coupled plasma mass spectroscopy, an analytical method having part-per-billion (ppb) detection limits for these elements. These data indicate that the PGE in coal occur in concentrations on the order of 1 ppb or less.

  4. Assessment of real-time PCR cycle threshold values in Microsporum canis culture-positive and culture-negative cats in an animal shelter: a field study.

    PubMed

    Jacobson, Linda S; McIntyre, Lauren; Mykusz, Jenny

    2018-02-01

    Objectives: Real-time PCR provides quantitative information, recorded as the cycle threshold (Ct) value, about the number of organisms detected in a diagnostic sample. The Ct value correlates with the number of copies of the target organism in an inversely proportional and exponential relationship. The aim of the study was to determine whether Ct values could be used to distinguish between culture-positive and culture-negative samples. Methods: This was a retrospective analysis of Ct values from dermatophyte PCR results in cats with suspicious skin lesions or suspected exposure to dermatophytosis. Results: One hundred and thirty-two samples were included. Using culture as the gold standard, 28 were true positives, 12 were false positives and 92 were true negatives. The area under the curve for the pretreatment time point was 96.8% (95% confidence interval [CI] 94.2-99.5) compared with 74.3% (95% CI 52.6-96.0) for pooled data during treatment. Before treatment, a Ct cut-off of <35.7 (approximate DNA count 300) provided a sensitivity of 92.3% and specificity of 95.2%. There was no reliable cut-off Ct value between culture-positive and culture-negative samples during treatment. Ct values prior to treatment differed significantly between the true-positive and false-positive groups (P = 0.0056). There was a significant difference between the pretreatment and first and second negative culture time points (P = 0.0002 and P < 0.0001, respectively). However, there was substantial overlap between Ct values for true positives and true negatives, and for pre- and intra-treatment time points. Conclusions and relevance: Ct values had limited usefulness for distinguishing between culture-positive and culture-negative cases when field study samples were analyzed. In addition, Ct values were less reliable than fungal culture for determining mycological cure.
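
The inverse exponential Ct-to-copy-number relationship can be made concrete: with perfect doubling per cycle, each unit decrease in Ct doubles the approximate target count. The sketch below anchors the curve, purely for illustration, to the study's pretreatment cut-off (Ct 35.7 ≈ DNA count 300); real assays only approximate 100% efficiency.

```python
ANCHOR_CT, ANCHOR_COPIES = 35.7, 300.0

def approx_copies(ct, efficiency=1.0):
    """Approximate target count for a Ct value (lower Ct = more DNA).

    Assumes (1 + efficiency)-fold amplification per cycle; efficiency=1.0
    means perfect doubling.
    """
    return ANCHOR_COPIES * (1.0 + efficiency) ** (ANCHOR_CT - ct)

def call_positive(ct, cutoff=35.7):
    """Pretreatment rule from the study: call positive if Ct < cutoff."""
    return ct < cutoff

for ct in (30.0, 35.0, 38.0):
    print(f"Ct {ct}: ~{approx_copies(ct):,.0f} copies, "
          f"positive={call_positive(ct)}")
```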

  5. Fast analysis of wood preservers using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Uhl, A.; Loebe, K.; Kreuchwig, L.

    2001-06-01

    Laser-induced breakdown spectroscopy (LIBS) is used for the investigation of wood preservers in timber and in furniture. Both laboratory experiments and practical applications in recycling facilities and on a building site demonstrate the new possibilities for the fast detection of harmful agents in wood. A commercial system was developed for mobile laser-plasma analysis as well as for industrial use in sorting plants. The universal measuring principle in combination with Echelle optics permits truly simultaneous multi-element analysis in the range of 200-780 nm with a resolution of a few picometers. It enables the user to detect main and trace elements in wood within a few seconds, nearly independent of the matrix, given that different kinds of wood show a similar elemental composition. Sample preparation is not required. The quantitative analysis of inorganic wood preservers (containing, e.g. Cu, Cr, B, As, Pb, Hg) has been performed accurately using carbon as the reference element. It can be shown that the detection limits for heavy metals in wood are in the ppm range. Additional information is given concerning the quantitative analysis. Statistical data, e.g. the standard deviation (S.D.), were determined and calibration curves were used for each particular element. A comparison between ICP-AES and LIBS is given using depth profile correction factors regarding the different penetration depths with respect to the different volumes in wood analyzed by both analytical methods.
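
Internal-standard calibration of the kind described above ratios each analyte line to the carbon matrix line and maps that ratio to concentration via a calibration curve. A minimal sketch; the standards, intensity ratios, and the choice of Cu are invented for illustration.

```python
import numpy as np

# Calibration standards: known Cu concentration (ppm) vs I_Cu / I_C ratio.
conc_ppm = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
ratio = np.array([0.021, 0.040, 0.079, 0.162, 0.319])

slope, intercept = np.polyfit(ratio, conc_ppm, 1)   # linear calibration curve

def cu_concentration(i_cu, i_carbon):
    """Predict Cu concentration (ppm) from raw line intensities, using the
    carbon line as internal standard."""
    return slope * (i_cu / i_carbon) + intercept

print(f"unknown sample: ~{cu_concentration(8.1, 100.0):.0f} ppm Cu")
```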

  6. Influence of mom and dad: quantitative genetic models for maternal effects and genomic imprinting.

    PubMed

    Santure, Anna W; Spencer, Hamish G

    2006-08-01

    The expression of an imprinted gene is dependent on the sex of the parent it was inherited from, and as a result reciprocal heterozygotes may display different phenotypes. In contrast, maternal genetic terms arise when the phenotype of an offspring is influenced by the phenotype of its mother beyond the direct inheritance of alleles. Both maternal effects and imprinting may contribute to resemblance between offspring of the same mother. We demonstrate that two standard quantitative genetic models for deriving breeding values, population variances and covariances between relatives, are not equivalent when maternal genetic effects and imprinting are acting. Maternal and imprinting effects introduce both sex-dependent and generation-dependent effects that result in differences in the way additive and dominance effects are defined for the two approaches. We use a simple example to demonstrate that both imprinting and maternal genetic effects add extra terms to covariances between relatives and that model misspecification may over- or underestimate true covariances or lead to extremely variable parameter estimation. Thus, an understanding of various forms of parental effects is essential in correctly estimating quantitative genetic variance components.

  7. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    PubMed

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014.
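
For the first class of designs, where a phantom or digital reference image supplies a known true value, an algorithm's performance reduces to a few summary statistics: bias (mean error), a repeatability coefficient from the within-condition spread, and RMSE against truth. A minimal sketch with simulated measurements; the true value, bias, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
true_value = 50.0                                        # known phantom value
measured = true_value + 1.5 + rng.normal(0.0, 2.0, 40)   # biased, noisy reads

bias = measured.mean() - true_value                 # systematic offset
within_sd = measured.std(ddof=1)                    # within-condition spread
repeatability_coefficient = 2.77 * within_sd        # ~1.96 * sqrt(2) * SD
rmse = np.sqrt(np.mean((measured - true_value) ** 2))
print(f"bias={bias:.2f}, RC={repeatability_coefficient:.2f}, RMSE={rmse:.2f}")
```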

  8. Nanoscale morphological analysis of soft matter aggregates with fractal dimension ranging from 1 to 3.

    PubMed

    Valle, Francesco; Brucale, Marco; Chiodini, Stefano; Bystrenova, Eva; Albonetti, Cristiano

    2017-09-01

    While the widespread emergence of nanoscience and nanotechnology can be dated back to the early eighties, the last decade has witnessed a true coming of age of this research field, with novel nanomaterials constantly finding their way into marketed products. Because the performance of nanomaterials is dominated by their nanoscale morphology, their quantitative characterization with respect to a number of properties is often crucial. In this context, imaging techniques able to resolve nanometer-scale details are clearly key players. In particular, atomic force microscopy can yield a fully quantitative three-dimensional (3D) topography at the nanoscale. Herein, we review a set of morphological analyses based on the scaling approach, which give access to important quantitative parameters for describing nanomaterial samples. To generalize the use of such morphological analyses across all dimensions (1D, 2D and 3D), the review focuses on specific soft matter aggregates with fractal dimension ranging from just above 1 to just below 3. Copyright © 2017 Elsevier Ltd. All rights reserved.
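The scaling analyses reviewed here revolve around estimating a fractal dimension from image data. As a hedged illustration (the classic box-counting estimator, not necessarily the authors' specific procedure), the dimension of a 2D binary mask, such as a thresholded AFM topograph, can be sketched as:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2D binary mask by box counting:
    count occupied boxes N(s) at several box sizes s, then fit the slope
    of log N(s) against log(1/s)."""
    counts = []
    n = mask.shape[0]
    for s in sizes:
        c = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                # a box counts if it contains at least one occupied pixel
                if mask[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# sanity check: a straight line across a 128x128 grid has dimension 1
line = np.zeros((128, 128), dtype=bool)
line[64, :] = True
d_line = box_counting_dimension(line)
```

A space-filling aggregate drives the estimate toward 2 in a 2D image, matching the "just above 1 to just below 3" range discussed for 1D-3D aggregates.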

  9. Allelic loss studies do not provide evidence for the "endometriosis-as-tumor" theory.

    PubMed

    Prowse, Amanda H; Fakis, Giannoulis; Manek, Sanjiv; Churchman, Michael; Edwards, Sarah; Rowan, Andrew; Koninckx, Philippe; Kennedy, Stephen; Tomlinson, Ian P M

    2005-04-01

    The aim was to identify consistent genetic changes in endometriosis samples and thereby determine whether endometriosis lesions are true neoplasms. We analyzed ovarian endometriosis lesions for loss of heterozygosity (LOH) at 12 loci of potential importance (D9S1870, D9S265, D9S270, D9S161, D11S29, D1S199, D8S261, APOA2, PTCH, TP53, D10S541, and D10S1765), including some at which genetic changes were previously reported in endometriosis. The study was conducted in a molecular biology laboratory in a university hospital department and included 17 women with ovarian endometriosis. Laser capture microdissection was used to separate the endometriotic epithelium, the adjacent endometriotic stroma, and surrounding normal ovarian stromal tissue, followed by DNA extraction and polymerase chain reaction amplification of polymorphic microsatellite markers; LOH was assessed by fluorescence-based quantitation. We identified LOH in only one lesion at one locus (D8S261). Our data do not support the hypothesis that ovarian endometriosis is a true neoplasm.

  10. The power metric: a new statistically robust enrichment-type metric for virtual screening applications with early recovery capability.

    PubMed

    Lopes, Julio Cesar Dias; Dos Santos, Fábio Mendes; Martins-José, Andrelly; Augustyns, Koen; De Winter, Hans

    2017-01-01

    A new metric for the evaluation of model performance in the field of virtual screening and quantitative structure-activity relationship applications is described. This metric has been termed the power metric and is defined, for a given cutoff threshold, as the true positive rate divided by the sum of the true positive and false positive rates. Its performance is compared with alternative metrics such as the enrichment factor, the relative enrichment factor, the receiver operating characteristic (ROC) enrichment factor, the correct classification rate, Matthews correlation coefficient and Cohen's kappa coefficient. The new metric proves quite robust with respect to variations in the applied cutoff threshold and in the ratio of active compounds to the total number of compounds, while remaining sensitive to variations in model quality. It possesses the correct characteristics for application to early-recognition virtual screening problems.
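Under the definition quoted above, the power metric at a cutoff is PM = TPR / (TPR + FPR). A minimal sketch alongside the classic enrichment factor for comparison (the counts and screen sizes are illustrative, not from the paper):

```python
def power_metric(n_tp, n_fp, n_actives, n_inactives):
    """Power metric at a given cutoff: TPR / (TPR + FPR)."""
    tpr = n_tp / n_actives
    fpr = n_fp / n_inactives
    return tpr / (tpr + fpr)

def enrichment_factor(n_tp, n_selected, n_actives, n_total):
    """Classic enrichment factor at the same cutoff, for comparison."""
    return (n_tp / n_selected) / (n_actives / n_total)

# hypothetical screen: 1000 compounds, 50 actives; top 100 contain 30 actives
pm = power_metric(n_tp=30, n_fp=70, n_actives=50, n_inactives=950)
ef = enrichment_factor(n_tp=30, n_selected=100, n_actives=50, n_total=1000)
```

Unlike the enrichment factor, whose ceiling depends on the active/total ratio, the power metric is bounded between 0 and 1, which is what makes it comparatively insensitive to the cutoff and to dataset composition.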

  11. Blood flow estimation in gastroscopic true-color images

    NASA Astrophysics Data System (ADS)

    Jacoby, Raffael S.; Herpers, Rainer; Zwiebel, Franz M.; Englmeier, Karl-Hans

    1995-05-01

    The assessment of blood flow in the gastrointestinal mucosa might be an important factor for the diagnosis and treatment of several diseases such as ulcers, gastritis, colitis, or early cancer. The quantity of blood flow is roughly estimated by computing the spatial hemoglobin distribution in the mucosa. The presented method enables a practical realization by approximating the hemoglobin concentration from a spectrophotometric analysis of endoscopic true-color images, which are recorded during routine examinations. A system model based on the reflectance-spectroscopic law of Kubelka-Munk is derived, which enables an estimation of the hemoglobin concentration from the color values of the images. Additionally, a transformation of the color values is developed to improve luminance independence. Applying this transformation and estimating the hemoglobin concentration for each pixel of interest, the hemoglobin distribution can be computed. The obtained results are largely independent of luminance. An initial validation of the presented method is performed by quantitatively estimating its reproducibility.
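The reflectance-spectroscopic starting point mentioned here is the Kubelka-Munk relation, whose function F(R) = (1 - R)²/(2R) is proportional to the absorption/scattering ratio K/S and hence, to first order, to absorber concentration. A hedged per-pixel sketch (the paper's actual color-value transformation is more elaborate; using the green channel as the reflectance estimate is an illustrative assumption):

```python
import numpy as np

def kubelka_munk(reflectance):
    """Kubelka-Munk function F(R) = (1 - R)^2 / (2R), proportional to
    the absorption/scattering ratio K/S of the mucosa."""
    r = np.asarray(reflectance, dtype=float)
    return (1.0 - r) ** 2 / (2.0 * r)

def hemoglobin_index(green_channel):
    """Relative hemoglobin index per pixel, taking the green channel as
    the reflectance estimate (hemoglobin absorbs strongly in green);
    an illustrative assumption, not the published transformation."""
    return kubelka_munk(np.clip(green_channel, 1e-6, 1.0))
```

Darker (less reflective) pixels yield a larger index, consistent with a higher local hemoglobin concentration.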

  12. Environment-dependent morphology in plasmodium of true slime mold Physarum polycephalum and a network growth model.

    PubMed

    Takamatsu, Atsuko; Takaba, Eri; Takizawa, Ginjiro

    2009-01-07

    Branching network growth patterns in plasmodium of the true slime mold Physarum polycephalum were investigated under varying environmental conditions. Surprisingly, the patterns resemble those in bacterial colonies even though the biological mechanisms differ greatly. Bacterial colonies are collectives of microorganisms in which individual organisms have motility and interact through nutrient and chemical fields. In contrast, the plasmodium is a giant amoeba-like multinucleated unicellular organism that forms a network of tubular structures through which protoplasm streams. The cell motility of the plasmodium is generated by oscillation phenomena observed in partial bodies, which interact through the tubular structures. First, we analyze characteristics of the morphology quantitatively; then we abstract local rules governing the growing process to construct a simple network growth model. The model is independent of the specific system and applies only two rules. Finally, we discuss the mechanism of commonly observed biological pattern formations through comparison with the system of bacterial colonies.

  13. Saturated laser fluorescence in turbulent sooting flames at high pressure

    NASA Technical Reports Server (NTRS)

    King, G. B.; Carter, C. D.; Laurendeau, N. M.

    1984-01-01

    The primary objective was to develop a quantitative, single-pulse, laser-saturated fluorescence (LSF) technique for measurement of radical species concentrations in practical flames. The species of immediate interest was the hydroxyl radical. Measurements were made in both turbulent premixed and diffusion flames at pressures between 1 and 20 atm. Interferences from Mie scattering were assessed by doping with particles or by controlling soot loading through variation of equivalence ratio and fuel type. The efficacy of the LSF method at high pressure was addressed by comparing fluorescence and absorption measurements in a premixed, laminar flat flame at 1-20 atm. Signal averaging over many laser shots is sufficient to determine the local concentration of radical species in laminar flames. For turbulent flames, however, single-pulse measurements are more appropriate, since a statistically significant number of laser pulses is needed to determine the probability density function (PDF). PDFs can be analyzed to give true average properties and true local kinetics in turbulent, chemically reactive flows.

  14. Quantitative determination of selenium and mercury, and an ICP-MS semi-quantitative scan of other elements in samples of eagle tissues collected from the Pacific Northwest--Summer 2011

    USGS Publications Warehouse

    May, Thomas; Walther, Mike; Brumbaugh, William

    2013-01-01

    Eagle tissues from dead eagle carcasses were collected by U.S. Fish and Wildlife Service personnel at various locations in the Pacific Northwest as part of a study to document the occurrence of metal and metalloid contaminants. A group of 182 eagle tissue samples, consisting of liver, kidney, brain, talon, feather, femur, humerus, and stomach contents, were quantitatively analyzed for concentrations of selenium and mercury by atomic absorption techniques, and for other elements by semi-quantitative scan with an inductively coupled plasma-mass spectrometer. For the various tissue matrices analyzed by an ICP-MS semiquantitative scan, some elemental concentrations (micrograms per gram dry weight) were quite variable within a particular matrix; notable observations were as follows: lead concentrations ranged from 0.2 to 31 in femurs, 0.1 to 29 in humeri, 0.1 to 54 in talons, less than (<) 0.05 to 120 in livers, <0.05 to 34 in kidneys, and 0.05 to 8 in brains; copper concentrations ranged from 5 to 9 in feathers, 8 to 47 in livers, 7 to 43 in kidneys, and 7 to 28 in brains; cadmium concentrations ranged from 0.1 to 10 in kidneys. In stomach contents, concentrations of vanadium ranged from 0.08 to 5, chromium 2 to 34, manganese 1 to 57, copper 2 to 69, arsenic <0.05 to 6, rubidium 1 to 13, and barium <0.5 to 18. Selenium concentrations from highest to lowest based on the matrix mean were as follows: kidney, liver, feather, brain, stomach content, talon, femur, and humerus. For mercury, the highest to lowest concentrations were feather, liver, talon, brain, stomach content, femur, and humerus.

  15. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    NASA Astrophysics Data System (ADS)

    Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.

    2016-03-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples (USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite) with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, the sensitivity factor of an element is observed to depend on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standardless measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.

  16. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Hariharan, Prasanna; Myers, Matthew R; Badano, Aldo

    2011-10-01

    To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. The border shape of the lesion can be controlled through the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion at four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and the homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. Example DCE-MRI data of the dynamic phantom were acquired using a clinical protocol. The optimal configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions.

  17. Translations on Eastern Europe, Political, Sociological, and Military Affairs, Number 1320

    DTIC Science & Technology

    1976-11-17

    behavior we should condemn. True enough, in the public pressure affecting our standard-of-living policies we can discover elements which are influenced...so that the supply of high-grade, healthful foods becomes still more polymorphic, abundant and appetizing. Together with the All-Union Institute...death the Warsaw Pact nations will exert political and military pressure on Yugoslavia—and if need be will invade it—in order to force it to return to

  18. Electrostatic discharge test apparatus

    NASA Technical Reports Server (NTRS)

    Smith, William Conrad (Inventor)

    1988-01-01

    Electrostatic discharge properties of materials are quantitatively measured and ranked. Samples are rotated on a turntable beneath selectable, co-available electrostatic chargers, one being a corona charging element and the other a sample-engaging triboelectric charging element. Samples then pass under a voltage meter to measure the amount of residual charge on the samples. After charging is discontinued, measurements are continued to record the charge decay history over time.

  19. Elemental accumulation in lichen transplants in the neighborhood of thermal power stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freitas, M.C.; Reis, M.A.; Alves, L.C.

    1996-12-31

    Lichens are known to be good monitors of air pollution because they easily absorb the chemical elements from air particles. Therefore, the exposure of clean lichens to a polluted region will result in an accumulation in the lichens of elements emitted by the pollution sources. In this work, samples of the lichen Parmelia sulcata were collected from olive tree stems in a very clean area and transplanted to the neighborhood of thermal power stations to gauge pollution. The goal is to obtain a quantitative relation between results obtained via lichens and via airborne particles.

  20. Chemical Sensing Using Fiber Cavity Ring-Down Spectroscopy

    PubMed Central

    Waechter, Helen; Litman, Jessica; Cheung, Adrienne H.; Barnes, Jack A.; Loock, Hans-Peter

    2010-01-01

    Waveguide-based cavity ring-down spectroscopy (CRD) can be used for quantitative measurements of chemical concentrations in small amounts of liquid, in gases or in films. The change in ring-down time can be correlated to analyte concentration when using fiber optic sensing elements whose attenuation depends on either sample absorption or refractive index. Two types of fiber cavities are distinguished: fiber loops and fiber strands containing reflective elements. Both types of cavities were coupled to a variety of chemical sensor elements, which are discussed and compared. PMID:22294895

  1. Unveiling the Third Secret of Fátima: μ-XRF quantitative characterization and 2D elemental mapping

    NASA Astrophysics Data System (ADS)

    Manso, M.; Pessanha, S.; Guerra, M.; Figueirinhas, J. L.; Santos, J. P.; Carvalho, M. L.

    2017-04-01

    A set of five manuscripts written by Sister Lúcia between 1941 and 1944 was studied. Among them is the one that contains the description of the third part of the Secret of Fátima, also known as the Third Secret of Fátima. In this work, a characterization of the paper and the ink used in these documents was achieved using micro-X-ray fluorescence spectrometry. Quantitative results were obtained for P, K, Ca, Fe, Cu and Zn, revealing different paper compositions and Zn in the inks. 2D elemental maps confirmed that Zn was present in the ink of all five documents and that the manuscript revealing the Third Secret of Fátima contained no erasures or attempts to alter the original text.

  2. Combined use of X-ray fluorescence microscopy and phase contrast imaging for high resolution quantitative iron mapping in inflamed cells

    NASA Astrophysics Data System (ADS)

    Gramaccioni, C.; Procopio, A.; Farruggia, G.; Malucelli, E.; Iotti, S.; Notargiacomo, A.; Fratini, M.; Yang, Y.; Pacureanu, A.; Cloetens, P.; Bohic, S.; Massimi, L.; Cutone, A.; Valenti, P.; Rosa, L.; Berlutti, F.; Lagomarsino, S.

    2017-06-01

    X-ray fluorescence microscopy (XRFM) is a powerful technique to detect and localize elements in cells. To derive information useful for biology and medicine, it is essential not only to localize, but also to map quantitatively, the element concentration. Here we applied quantitative XRFM to iron in phagocytic cells. Iron, a primary component of living cells, can become toxic when present in excess. In human fluids, free iron is maintained at a concentration of 10⁻¹⁸ M thanks to iron-binding proteins such as lactoferrin (Lf). Iron homeostasis, involving the physiological ratio of iron between tissues/secretions and blood, is strictly regulated by ferroportin, the sole protein able to export iron from cells to blood. Inflammatory processes induced by lipopolysaccharide (LPS) or bacterial pathogens inhibit ferroportin synthesis in epithelial and phagocytic cells, thus hindering iron export and increasing intracellular iron and bacterial multiplication. In this respect, Lf is emerging as an important regulator of both iron and inflammatory homeostasis. Here we studied phagocytic cells inflamed by bacterial LPS and either untreated or treated with milk-derived bovine Lf. Quantitative mapping of iron concentration and mass fraction at high spatial resolution is obtained by combining X-ray fluorescence microscopy, atomic force microscopy and synchrotron phase contrast imaging.

  3. Hard x-ray phase contrast microscopy - techniques and applications

    NASA Astrophysics Data System (ADS)

    Holzner, Christian

    In 1918, Einstein provided the first description of the nature of the refractive index for X-rays, showing that phase contrast effects are significant. A century later, most x-ray microscopy and nearly all medical imaging remains based on absorption contrast, even though phase contrast offers orders of magnitude improvements in contrast and reduced radiation exposure at multi-keV x-ray energies. The work presented is concerned with developing practical and quantitative methods of phase contrast for x-ray microscopy. A theoretical framework for imaging in phase contrast is put forward; this is used to obtain quantitative images in a scanning microscope using a segmented detector, and to correct for artifacts in a commercial phase contrast x-ray nano-tomography system. The principle of reciprocity between scanning and full-field microscopes is then used to arrive at a novel solution: Zernike contrast in a scanning microscope. These approaches are compared on a theoretical and experimental basis in direct connection with applications using multi-keV x-ray microscopes at the Advanced Photon Source at Argonne National Laboratory. Phase contrast provides the best means to image mass and ultrastructure of light elements that mainly constitute biological matter, while stimulated x-ray fluorescence provides high sensitivity for studies of the distribution of heavier trace elements, such as metals. These approaches are combined in a complementary way to yield quantitative maps of elemental concentration from 2D images, with elements placed in their ultrastructural context. The combination of x-ray fluorescence and phase contrast poses an ideal match for routine, high resolution tomographic imaging of biological samples in the future. The presented techniques and demonstration experiments will help pave the way for this development.

  4. Quantitative morphology in canine cutaneous soft tissue sarcomas.

    PubMed

    Simeonov, R; Ananiev, J; Gulubova, M

    2015-12-01

    Stained cytological specimens from 24 dogs with spontaneous soft tissue sarcomas [fibrosarcoma (n = 8), liposarcoma (n = 8) and haemangiopericytoma (n = 8)], and 24 dogs with reactive connective tissue lesions [granulation tissue (n = 12) and dermal fibrosis (n = 12)] were analysed by computer-assisted nuclear morphometry. The studied morphometric parameters were: mean nuclear area (MNA; µm²), mean nuclear perimeter (MNP; µm), mean nuclear diameter (MND; µm), minimum nuclear diameter (Dmin; µm) and maximum nuclear diameter (Dmax; µm). The study aimed to evaluate (1) the possibility of quantitatively differentiating soft tissue sarcomas from reactive connective tissue lesions and (2) the ability of cytomorphometry to differentiate the various histopathological soft tissue sarcoma subtypes in dogs. The mean values of all nuclear cytomorphometric parameters (except Dmax) were statistically significantly higher in reactive connective tissue processes than in soft tissue sarcomas. At the same time, however, there were no appreciable differences among the different sarcoma subtypes. The results demonstrated that the quantitative differentiation of reactive connective tissue processes from soft tissue sarcomas in dogs is possible, but the same is not true for the different canine soft tissue sarcoma subtypes. Further investigations on this topic are necessary to fully clarify the role of quantitative morphology in the diagnosis of mesenchymal neoplasms and tumour-like fibrous lesions in dogs. © 2014 John Wiley & Sons Ltd.

  5. Influence of cardiac and respiratory motion on tomographic reconstructions of the heart: implications for quantitative nuclear cardiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ter-Pogossian, M.M.; Bergmann, S.R.; Sobel, B.E.

    1982-12-01

    The potential influence of physiological, periodic motions of the heart due to the cardiac cycle, the respiratory cycle, or both on quantitative image reconstruction by positron emission tomography (PET) has been largely neglected. To define their quantitative impact, cardiac PET was performed in 6 dogs after injection of ¹¹C-palmitate under disparate conditions including normal cardiac and respiratory cycles and cardiac arrest with and without respiration. Although in vitro assay of myocardial samples demonstrated that palmitate uptake was homogeneous (coefficient of variation: 10.1%), analysis of the reconstructed images demonstrated significant heterogeneity of the apparent cardiac distribution of radioactivity due to both intrinsic cardiac and respiratory motion. Image degradation due to respiratory motion was demonstrated in a healthy human volunteer as well, in whom cardiac tomography was performed with Super PETT I during breath-holding and during normal breathing. The results indicate that quantitatively significant degradation of reconstructions of true tracer distribution occurs in cardiac PET due to both intrinsic cardiac and respiratory-induced motion of the heart. They suggest that these influences can be avoided or minimized by gating with respect to both the cardiac cycle and respiration, or by employing brief scan times during breath-holding.

  6. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
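The ADC parametric maps at issue are, in the simplest common model, mono-exponential fits S(b) = S0·exp(-b·ADC). A minimal two-point sketch (b-values and data are illustrative; a conforming DICOM Parametric Map object would additionally have to carry the units, scale, model terms, and source-image references the study found missing):

```python
import numpy as np

def adc_map(s_low, s_high, b_low=0.0, b_high=1000.0):
    """Voxelwise mono-exponential ADC (mm^2/s) from two diffusion-weighted
    acquisitions: ADC = ln(S_low / S_high) / (b_high - b_low)."""
    s_low = np.asarray(s_low, dtype=float)
    s_high = np.asarray(s_high, dtype=float)
    return np.log(s_low / s_high) / (b_high - b_low)

# synthetic voxel with true ADC = 1.0e-3 mm^2/s
s0 = 100.0
sb = s0 * np.exp(-1000.0 * 1.0e-3)   # signal at b = 1000 s/mm^2
adc = adc_map(s0, sb)
```

Because sites commonly store such maps in scaled integer pixels, the scale factor and units metadata are exactly what determines whether two sites' ADC values are numerically comparable.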

  7. Mineral inversion for element capture spectroscopy logging based on optimization theory

    NASA Astrophysics Data System (ADS)

    Zhao, Jianpeng; Chen, Hui; Yin, Lu; Li, Ning

    2017-12-01

    Understanding the mineralogical composition of a formation is an essential step in the petrophysical evaluation of petroleum reservoirs. Geochemical logging tools can provide quantitative measurements of a wide range of elements. In this paper, element capture spectroscopy (ECS) was taken as an example and an optimization method was adopted to solve the mineral inversion problem for ECS. This method used the conversion relationships between elements and minerals as response equations, took into account the statistical uncertainty of the element measurements, and established an optimization function for ECS. The objective function value and reconstructed elemental logs were used to check the robustness and reliability of the inversion method. Finally, the inverted mineral results showed good agreement with x-ray diffraction laboratory data. The accurate conversion of elemental dry weights to mineral dry weights forms the foundation for subsequent applications based on ECS.
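The inversion described here solves response equations of the form e = A·m, where A converts mineral dry weights to elemental dry weights. A simplified sketch with an entirely hypothetical response matrix, using plain least squares with clipping and renormalization in place of the paper's full weighted, constrained optimization:

```python
import numpy as np

# Hypothetical response matrix: rows = elements (Si, Ca, Al, Fe),
# columns = minerals (quartz, calcite, clay); entries are the elemental
# dry-weight fractions of each pure mineral (illustrative numbers only).
A = np.array([
    [0.467, 0.000, 0.210],   # Si
    [0.000, 0.400, 0.010],   # Ca
    [0.000, 0.000, 0.200],   # Al
    [0.000, 0.000, 0.050],   # Fe
])

def invert_minerals(elem_dry_weights):
    """Least-squares inversion of elemental dry weights to mineral dry
    weights, with non-negativity enforced by clipping and the result
    renormalized to sum to 1 (a crude stand-in for the constrained
    optimization described in the abstract)."""
    m, *_ = np.linalg.lstsq(A, elem_dry_weights, rcond=None)
    m = np.clip(m, 0.0, None)
    return m / m.sum()

# synthetic formation: 50% quartz, 30% calcite, 20% clay
true_m = np.array([0.5, 0.3, 0.2])
elements = A @ true_m          # forward model: elemental dry weights
recovered = invert_minerals(elements)
```

Reconstructing the elemental logs from the recovered mineral weights (A @ recovered) and comparing them against the measured logs is the robustness check the abstract describes.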

  8. Optimizing finite element predictions of local subchondral bone structural stiffness using neural network-derived density-modulus relationships for proximal tibial subchondral cortical and trabecular bone.

    PubMed

    Nazemi, S Majid; Amini, Morteza; Kontulainen, Saija A; Milner, Jaques S; Holdsworth, David W; Masri, Bassam A; Wilson, David R; Johnston, James D

    2017-01-01

    Quantitative computed tomography-based subject-specific finite element modeling has potential to clarify the role of subchondral bone alterations in knee osteoarthritis initiation, progression, and pain. However, it is unclear what density-modulus equation(s) should be applied to subchondral cortical and subchondral trabecular bone when constructing finite element models of the tibia. Using a novel approach applying neural networks, optimization, and back-calculation against in situ experimental testing results, the objective of this study was to identify subchondral-specific equations that optimized finite element predictions of local structural stiffness at the proximal tibial subchondral surface. Thirteen proximal tibial compartments were imaged via quantitative computed tomography. Imaged bone mineral density was converted to elastic moduli using multiple density-modulus equations (93 total variations) and then mapped to corresponding finite element models. For each variation, root mean squared error was calculated between the finite element prediction and in situ measured stiffness at 47 indentation sites. The resulting errors were used to train an artificial neural network, which provided an unlimited number of model variations, with corresponding error, for predicting stiffness at the subchondral bone surface. Nelder-Mead optimization was used to identify optimum density-modulus equations for predicting stiffness. Finite element modeling predicted 81% of experimental stiffness variance (with 10.5% error) using optimized equations for subchondral cortical and trabecular bone differentiated at a density of 0.5 g/cm³. In comparison with published density-modulus relationships, the optimized equations offered improved predictions of local subchondral structural stiffness. Further research including anisotropy, a smaller voxel size and de-blurring algorithms is needed to improve predictions. Copyright © 2016 Elsevier Ltd. All rights reserved.
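Density-modulus equations for bone typically take a power-law form E = a·ρ^b, and the back-calculation amounts to choosing (a, b) to minimize the error against measured stiffness. A toy sketch with synthetic data and a coarse grid search standing in for the study's neural network plus Nelder-Mead pipeline (all coefficients and data are illustrative):

```python
import numpy as np

def density_to_modulus(rho, a, b):
    """Power-law density-modulus relationship, E = a * rho**b."""
    return a * np.power(rho, b)

rng = np.random.default_rng(0)
rho = rng.uniform(0.2, 1.2, size=47)       # apparent density (g/cm^3) at 47 sites
measured = 6850.0 * rho ** 1.49            # synthetic "measured" stiffness proxy

def rmse(params):
    """Root mean squared error between predicted and measured stiffness."""
    a, b = params
    return float(np.sqrt(np.mean((density_to_modulus(rho, a, b) - measured) ** 2)))

# coarse grid search for the best-fitting (a, b)
grid = [(a, b) for a in np.linspace(5000, 8000, 31)
               for b in np.linspace(1.0, 2.0, 21)]
best_a, best_b = min(grid, key=rmse)
```

In the actual study separate equations are fitted for subchondral cortical and trabecular bone, split at the 0.5 g/cm³ density threshold.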

  9. Optical sensing elements for nitrogen dioxide (NO₂) gas detection, a sol-gel method for making the sensing elements and fiber optic sensors incorporating nitrogen dioxide gas optical sensing elements

    DOEpatents

    Mechery, Shelly John [Mississippi State, MS; Singh, Jagdish P [Starkville, MS

    2007-07-03

    A sensing element, a method of making a sensing element, and a fiber optic sensor incorporating the sensing element are described. The sensor can be used for the quantitative detection of NO₂ in a mixture of gases. The sensing element can be made by incorporating a diazotizing reagent which reacts with nitrous ions to produce a diazo compound and a coupling reagent which couples with the diazo compound to produce an azo dye into a sol and allowing the sol to form an optically transparent gel. The sensing element changes color in the presence of NO₂ gas. The temporal response of the absorption spectrum at various NO₂ concentrations has also been recorded and analyzed. Sensors having different design configurations are described. The sensing element can detect NO₂ gas at levels of parts per billion.

  10. Quantitative real-time monitoring of multi-elements in airborne particulates by direct introduction into an inductively coupled plasma mass spectrometer

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshinari; Sato, Hikaru; Hiyoshi, Katsuhiro; Furuta, Naoki

    2012-10-01

    A new calibration system for real-time determination of trace elements in airborne particulates was developed. Airborne particulates were directly introduced into an inductively coupled plasma mass spectrometer, and the concentrations of 15 trace elements were determined by means of an external calibration method. External standard solutions were nebulized by an ultrasonic nebulizer (USN) coupled with a desolvation system, and the resulting aerosol was introduced into the plasma. The efficiency of sample introduction via the USN was calculated by two methods: (1) the introduction of a Cr standard solution via the USN was compared with introduction of a Cr(CO)6 standard gas via a standard gas generator and (2) the aerosol generated by the USN was trapped on filters and then analyzed. The Cr introduction efficiencies obtained by the two methods were the same, and the introduction efficiencies of the other elements were equal to the introduction efficiency of Cr. Our results indicated that our calibration method for introduction efficiency worked well for the 15 elements (Ti, V, Cr, Mn, Co, Ni, Cu, Zn, As, Mo, Sn, Sb, Ba, Tl and Pb). The real-time data and the filter-collection data agreed well for elements with low-melting oxides (V, Co, As, Mo, Sb, Tl, and Pb). In contrast, the real-time data were smaller than the filter-collection data for elements with high-melting oxides (Ti, Cr, Mn, Ni, Cu, Zn, Sn, and Ba). This result implies that the oxides of these 8 elements were not completely fused, vaporized, atomized, and ionized in the initial radiation zone of the inductively coupled plasma. However, quantitative real-time monitoring can be realized after correction for the element recoveries which can be calculated from the ratio of real-time data/filter-collection data.
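    The recovery correction in the last sentence is simple arithmetic: the element recovery is estimated as the ratio of real-time to filter-collection data, and real-time readings are then divided by it. A sketch with hypothetical concentrations (the values below are illustrative, not measured):

```python
def recovery(real_time, filter_collected):
    """Element recovery: ratio of real-time to filter-collection result."""
    return real_time / filter_collected

def corrected_concentration(real_time, rec):
    """Divide the real-time reading by the recovery to estimate the true value."""
    return real_time / rec

# Hypothetical Ti concentrations in ng/m^3 (a high-melting-oxide element):
rec_Ti = recovery(real_time=3.2, filter_collected=8.0)
c_Ti = corrected_concentration(3.2, rec_Ti)  # recovers the filter-based value
```

For low-melting-oxide elements the recovery is close to 1 and the correction is negligible.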

  11. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
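    PLS1 regresses a single element's abundance on the full spectrum, while PLS2 fits several elements at once. A minimal NIPALS-style PLS1 sketch on synthetic data (not ChemCam spectra); with all components retained on full-rank data it reduces to the ordinary least-squares fit:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit PLS1 (single response) via the NIPALS algorithm."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                      # weight: X-y covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                         # scores
        tt = t @ t
        p = Xc.T @ t / tt                  # X loadings
        qk = yc @ t / tt                   # y loading
        Xc = Xc - np.outer(t, p)           # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)  # regression vector in X space
    return coef, x_mean, y_mean

def pls1_predict(model, X):
    coef, x_mean, y_mean = model
    return (X - x_mean) @ coef + y_mean

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))               # stand-in "spectra"
y = X @ np.array([1.0, 2.0, 0.0])          # stand-in element abundance
model = pls1_fit(X, y, n_components=3)
y_hat = pls1_predict(model, X)
```

Real LIBS spectra have thousands of channels and far fewer samples than channels, which is exactly the regime where truncating to a few latent components (rather than using the full rank) is what makes PLS useful.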

  12. Apparatus and method for identification of matrix materials in which transuranic elements are embedded using thermal neutron capture gamma-ray emission

    DOEpatents

    Close, D.A.; Franks, L.A.; Kocimski, S.M.

    1984-08-16

    An invention is described that enables the quantitative simultaneous identification of the matrix materials in which fertile and fissile nuclides are embedded to be made along with the quantitative assay of the fertile and fissile materials. The invention also enables corrections for any absorption of neutrons by the matrix materials and by the measurement apparatus by the measurement of the prompt and delayed neutron flux emerging from a sample after the sample is interrogated by simultaneously applied neutrons and gamma radiation. High energy electrons are directed at a first target to produce gamma radiation. A second target receives the resulting pulsed gamma radiation and produces neutrons from the interaction with the gamma radiation. These neutrons are slowed by a moderator surrounding the sample and bathe the sample uniformly, generating second gamma radiation in the interaction. The gamma radiation is then resolved and quantitatively detected, providing a spectroscopic signature of the constituent elements contained in the matrix and in the materials within the vicinity of the sample.

  13. Finite Element Analysis of Quantitative Percussion Diagnostics for Evaluating the Strength of Bonds Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott; Malcolm, Doug; Earthman, James

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was adopted based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Results indicate that this technology is capable of detecting weak ('kiss') bonds between flat composite laminates. Specifically, the local value of the probe force determined from quantitative percussion testing was predicted to be significantly lower for a laminate that contained a 'kiss' bond compared to that for a well-bonded sample, which is in agreement with experimental findings. Experimental results were compared to a finite element analysis (FEA) using MSC PATRAN/NASTRAN to understand the viscoelastic behavior of the laminates during percussion testing. The dynamic FEA models were used to directly predict changes in the probe force, as well as effective stress distributions across the bonded panels as a function of time.

  14. Quantitative evaluation of potential irradiation geometries for carbon-ion beam grid therapy.

    PubMed

    Tsubouchi, Toshiro; Henry, Thomas; Ureba, Ana; Valdman, Alexander; Bassler, Niels; Siegbahn, Albert

    2018-03-01

    Radiotherapy using grids containing cm-wide beam elements has been carried out sporadically for more than a century. During the past two decades, preclinical research on radiotherapy with grids containing small beam elements, 25 μm-0.7 mm wide, has been performed. Grid therapy with larger beam elements is technically easier to implement, but normal tissue tolerance to the treatment decreases. In this work, a new approach in grid therapy, based on irradiations with grids containing narrow carbon-ion beam elements, was evaluated dosimetrically. The aim formulated for the suggested treatment was to obtain a uniform target dose combined with well-defined grids in the irradiated normal tissue. The gain, obtained by crossfiring the carbon-ion beam grids over a simulated target volume, was quantitatively evaluated. The dose distributions produced by narrow rectangular carbon-ion beams in a water phantom were simulated with the PHITS Monte Carlo code. The beam-element height was set to 2.0 cm in the simulations, while the widths varied from 0.5 to 10.0 mm. A spread-out Bragg peak (SOBP) was then created for each beam element in the grid, to cover the target volume with dose in the depth direction. The dose distributions produced by the beam-grid irradiations were thereafter constructed by adding the dose profiles simulated for single beam elements. The variation of the valley-to-peak dose ratio (VPDR) with depth in water was thereafter evaluated. The separation of the beam elements inside the grids was determined for different irradiation geometries with a selection criterion. The simulated carbon-ion beams remained narrow down to the depths of the Bragg peaks.
With the formulated selection criterion, a beam-element separation which was close to the beam-element width was found optimal for grids containing 3.0-mm-wide beam elements, while a separation which was considerably larger than the beam-element width was found advantageous for grids containing 0.5-mm-wide beam elements. With the single-grid irradiation setup, the VPDRs were close to 1.0 already at a distance of several cm from the target. The valley doses given to the normal tissue at 0.5 cm distance from the target volume could be limited to less than 10% of the mean target dose if a crossfiring setup with four interlaced grids was used. The dose distributions produced by grids containing 0.5- and 3.0-mm wide beam elements had characteristics which could be useful for grid therapy. Grids containing mm-wide carbon-ion beam elements could be advantageous due to the technical ease with which these beams can be produced and delivered, despite the reduced threshold doses observed for early and late responding normal tissue for beams of millimeter width, compared to submillimetric beams. The treatment simulations showed that nearly homogeneous dose distributions could be created inside the target volumes, combined with low valley doses in the normal tissue located close to the target volume, if the carbon-ion beam grids were crossfired in an interlaced manner with optimally selected beam-element separations. The formulated selection criterion was found useful for the quantitative evaluation of the dose distributions produced by the different irradiation setups. © 2018 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
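    The VPDR evaluated above can be illustrated by summing identical single-element lateral profiles at a chosen centre-to-centre separation. In this sketch the flat-top elements with Gaussian (erf-edge) penumbras, the penumbra width, and the element count are all assumptions, not PHITS output:

```python
import numpy as np
from math import erf

x = np.linspace(-30.0, 30.0, 6001)  # lateral position across the grid, mm

def element_profile(xs, width_mm, sigma_mm):
    """Flat-top beam element blurred by a Gaussian penumbra (erf edges)."""
    half = width_mm / 2.0
    s = 2 ** 0.5 * sigma_mm
    return np.array([0.5 * (erf((half - xi) / s) + erf((half + xi) / s))
                     for xi in xs])

def vpdr(width_mm, separation_mm, sigma_mm=0.5, n_elements=7):
    """Valley-to-peak dose ratio of a grid built from identical elements."""
    centres = (np.arange(n_elements) - n_elements // 2) * separation_mm
    dose = sum(element_profile(x - c, width_mm, sigma_mm) for c in centres)
    peak = dose[len(x) // 2]                                 # central beam axis
    valley = dose[np.argmin(np.abs(x - separation_mm / 2))]  # midway to neighbour
    return valley / peak

wide_gap = vpdr(3.0, separation_mm=6.0)
narrow_gap = vpdr(3.0, separation_mm=4.0)  # smaller separation -> higher VPDR
```

This reproduces the qualitative trade-off in the abstract: pushing elements closer raises the valley dose, while wider gaps preserve the grid structure in normal tissue.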

  15. A 3,000-year quantitative drought record derived from XRF element data from a south Texas playa

    NASA Astrophysics Data System (ADS)

    Livsey, D. N.; Simms, A.; Hangsterfer, A.; Nisbet, R.; DeWitt, R.

    2013-12-01

    Recent droughts throughout the central United States highlight the need for a better understanding of the past frequency and severity of drought occurrence. Current records of past drought for the south Texas coast are derived from tree-ring data that span approximately the last 900 years before present (BP). In this study we utilize a supervised learning routine to create a transfer function relating X-ray fluorescence (XRF) derived elemental data from Laguna Salada, Texas core LS10-02 to a locally derived tree-ring drought record. From this transfer function the 900 BP tree-ring drought record was extended to 3,000 BP. The supervised learning routine was trained on the first 100 years of XRF element data and tree-ring drought data to create the transfer function and training data set output. The model was then applied to the XRF elemental data for the next 800 years to create a deployed data set output and to test the transfer function parameters. The coefficients of determination between the model output and observed values are 0.77 and 0.70 for the 100-year training data set and 900-year deployed data set, respectively. Given the relatively high coefficients of determination for both data sets, we conclude that the model parameters are fairly robust and that a high-resolution drought record can be derived from the XRF element data. These results indicate that XRF element data can be used as a quantitative tool to reconstruct past drought records.

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  17. Memory for radio advertisements: the effect of program and typicality.

    PubMed

    Martín-Luengo, Beatriz; Luna, Karlos; Migueles, Malen

    2013-01-01

    We examined the influence of the type of radio program on the memory for radio advertisements. We also investigated the role in memory of the typicality (high or low) of the elements of the products advertised. Participants listened to three types of programs (interesting, boring, enjoyable) with two advertisements embedded in each. After completing a filler task, the participants performed a true/false recognition test. Hits and false alarm rates were higher for the interesting and enjoyable programs than for the boring one. There were also more hits and false alarms for the high-typicality elements. The response criterion for the advertisements embedded in the boring program was stricter than for the advertisements in other types of programs. We conclude that the type of program in which an advertisement is inserted and the nature of the elements of the advertisement affect both the number of hits and false alarms and the response criterion, but not the accuracy of the memory.

  18. Subcellular trace element distribution in Geosiphon pyriforme

    NASA Astrophysics Data System (ADS)

    Maetz, Mischa; Schüßler, Arthur; Wallianos, Alexandros; Traxel, Kurt

    1999-04-01

    Geosiphon pyriforme is a unique endosymbiotic consortium consisting of a soil-dwelling fungus and the cyanobacterium Nostoc punctiforme. This symbiosis is currently of particular interest because of its phylogenetic relationship to the arbuscular mycorrhizal (AM) fungi. Geosiphon pyriforme could be an important model system for these obligate symbiotic fungi, which supply 80-90% of all land plant species with nutrients, in particular phosphorus and trace elements. Combined PIXE and STIM analyses of the various compartments of Geosiphon provide insight into the exchange of matter between the symbiotic partners and their environment and into the modes of nutrient storage and acquisition, in particular those related to nitrogen fixation and metabolism. To determine the quality of our PIXE results, we analysed several geological and biological standards over a period of three years. This led to an overall precision of about 6% and an accuracy of 5-10% for nearly all detectable elements. In combination with a correction model for the mass loss occurring during the analyses, this holds true even for biological targets.

  19. Photospheric Magnetic Flux Transport - Supergranules Rule

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.; Rightmire-Upton, Lisa

    2012-01-01

    Observations of the transport of magnetic flux in the Sun's photosphere show that active region magnetic flux is carried far from its origin by a combination of flows. These flows have previously been identified and modeled as separate axisymmetric processes: differential rotation, meridional flow, and supergranule diffusion. Experiments with a surface convective flow model reveal that the true nature of this transport is advection by the non-axisymmetric cellular flows themselves - supergranules. Magnetic elements are transported to the boundaries of the cells and then follow the evolving boundaries. The convective flows in supergranules have peak velocities near 500 m/s. These flows completely overpower the superimposed 20 m/s meridional flow and 100 m/s differential rotation. The magnetic elements remain pinned at the supergranule boundaries. Experiments with and without the superimposed axisymmetric photospheric flows show that the axisymmetric transport of magnetic flux is controlled by the advection of the cellular pattern by underlying flows representative of deeper layers. The magnetic elements follow the differential rotation and meridional flow associated with the convection cells themselves -- supergranules rule!

  20. Factors influencing perceived angular velocity

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Calderone, Jack B.

    1991-01-01

    Angular velocity perception is examined for rotations both in depth and in the image plane, and the influence of several object properties on this motion parameter is explored. Two major object properties are considered, namely, texture density, which determines the rate of edge transitions for rotations in depth, i.e., the number of texture elements that pass an object's boundary per unit of time, and object size, which determines the tangential linear velocities and 2D image velocities of texture elements for a given angular velocity. Results of experiments show that edge-transition rate biased angular velocity estimates only when edges were highly salient. Element velocities had an impact on perceived angular velocity; this bias was associated with 2D image velocity rather than 3D tangential velocity. Despite these biases, judgements were most strongly determined by the true angular velocity. Sensitivity to this higher order motion parameter appeared to be good for rotations both in depth (y-axis) and parallel to the line of sight (z-axis).

  1. Numerical Evaluation of P-Multigrid Method for the Solution of Discontinuous Galerkin Discretizations of Diffusive Equations

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.; Helenbrook, B. T.

    2005-01-01

    This paper describes numerical experiments with P-multigrid to corroborate analysis, validate the present implementation, and to examine issues that arise in the implementations of the various combinations of relaxation schemes, discretizations and P-multigrid methods. The two approaches to implement P-multigrid presented here are equivalent for most high-order discretization methods such as spectral element, SUPG, and discontinuous Galerkin applied to advection; however, it is discovered that the approach that mimics the common geometric multigrid implementation is less robust, and frequently unstable, when applied to discontinuous Galerkin discretizations of diffusion. Gauss-Seidel relaxation converges 40% faster than block Jacobi, as predicted by analysis; however, the implementation of Gauss-Seidel is considerably more expensive than one would expect because gradients in most neighboring elements must be updated. A compromise quasi Gauss-Seidel relaxation method that evaluates the gradient in each element twice per iteration converges at rates similar to those predicted for true Gauss-Seidel.
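    The Jacobi vs Gauss-Seidel comparison can be reproduced on any diffusion-like system. This generic sketch uses a 1-D Laplacian, not the DG P-multigrid solver from the paper; it only illustrates that, sweep for sweep, Gauss-Seidel (which uses freshly updated values in-place) reduces the residual faster than Jacobi:

```python
import numpy as np

n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1-D Laplacian (diffusion)
b = np.ones(n)

def relax(method, sweeps=200):
    """Run point relaxation sweeps and return the final residual norm."""
    xv = np.zeros(n)
    D = np.diag(A)
    for _ in range(sweeps):
        if method == "jacobi":
            xv = xv + (b - A @ xv) / D          # all updates from old iterate
        else:                                   # gauss-seidel
            for i in range(n):
                xv[i] += (b[i] - A[i] @ xv) / D[i]  # uses updated neighbours
    return np.linalg.norm(b - A @ xv)

res_jacobi = relax("jacobi")
res_gs = relax("gauss-seidel")  # smaller residual after the same sweep count
```

For this model problem the Gauss-Seidel iteration matrix has the square of the Jacobi spectral radius, which is the classical basis for the "faster convergence" claim above.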

  2. Simple control laws for low-thrust orbit transfers

    NASA Technical Reports Server (NTRS)

    Petropoulos, Anastassios E.

    2003-01-01

    Two methods are presented by which to determine both a thrust direction and when to apply thrust to effect specified changes in any of the orbit elements except for true anomaly, which is assumed free. The central body is assumed to be a point mass, and the initial and final orbits are assumed closed. Thrust, when on, is of a constant value, and specific impulse is constant. The thrust profiles derived from the two methods are not propellant-optimal, but are based firstly on the optimal thrust directions and location on the osculating orbit for changing each of the orbit elements and secondly on the desired changes in the orbit elements. Two examples of transfers are presented, one in semimajor axis and inclination, and one in semimajor axis and eccentricity. The latter compares favourably with a propellant-optimized transfer between the same orbits. The control laws have few input parameters, but can still capture the complexity of a wide variety of orbit transfers.

  3. Zooming in on neutrino oscillations with DUNE

    NASA Astrophysics Data System (ADS)

    Srivastava, Rahul; Ternes, Christoph A.; Tórtola, Mariam; Valle, José W. F.

    2018-05-01

    We examine the capabilities of the DUNE experiment as a probe of the neutrino mixing paradigm. Taking the current status of neutrino oscillations and the design specifications of DUNE, we determine the experiment's potential to probe the structure of neutrino mixing and CP violation. We focus on the poorly determined parameters θ23 and δCP and consider runs of both two and seven years. We take various benchmarks as our true values, such as the current preferred values of θ23 and δCP, as well as several theory-motivated choices. We determine quantitatively DUNE's potential to perform a precision measurement of θ23, as well as to test the CP violation hypothesis in a model-independent way. We find that, after running for seven years, DUNE will make a substantial step in the precise determination of these parameters, bringing to quantitative test the predictions of various theories of neutrino mixing.

  4. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed Central

    Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player’s current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated. PMID:27487194

  5. Objective breast tissue image classification using Quantitative Transmission ultrasound tomography

    NASA Astrophysics Data System (ADS)

    Malik, Bilal; Klock, John; Wiskin, James; Lenox, Mark

    2016-12-01

    Quantitative Transmission Ultrasound (QT) is a powerful and emerging imaging paradigm which has the potential to perform true three-dimensional image reconstruction of biological tissue. Breast imaging is an important application of QT and allows non-invasive, non-ionizing imaging of whole breasts in vivo. Here, we report the first demonstration of breast tissue image classification in QT imaging. We systematically assess the ability of QT image features to differentiate between normal breast tissue types. Three QT features were used in support vector machine (SVM) classifiers, and classification of breast tissue as either skin, fat, glands, ducts or connective tissue was demonstrated with an overall accuracy of greater than 90%. Finally, the classifier was validated on whole breast image volumes to provide a color-coded breast tissue volume. This study serves as a first step towards a computer-aided detection/diagnosis platform for QT.
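    The study trains SVM classifiers on three QT features per voxel; as a dependency-free stand-in for the SVM, this sketch uses a nearest-centroid rule on the same kind of 3-feature vectors. Both the feature names (speed of sound, attenuation, reflection) and the per-tissue values are assumptions for illustration:

```python
import numpy as np

classes = ["skin", "fat", "gland"]
# Hypothetical per-tissue mean feature vectors (speed, attenuation, reflection)
centroids = np.array([[1.58, 0.60, 0.30],
                      [1.44, 0.40, 0.10],
                      [1.52, 0.80, 0.20]])

def classify(feature_vec):
    """Assign the tissue class whose centroid is nearest in feature space."""
    d = np.linalg.norm(centroids - feature_vec, axis=1)
    return classes[int(np.argmin(d))]

label = classify(np.array([1.45, 0.42, 0.12]))  # nearest to the "fat" centroid
```

An SVM replaces the centroid distance with learned maximum-margin boundaries, which is what allows the >90% accuracy reported above on overlapping tissue distributions.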

  6. The principles and technical aspects of diuresis renography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conway, J.J.

    1989-12-01

    It is intuitive that dilation of the urinary tract is most likely caused by obstruction. However, the opposite is more often true. That is, dilation is not associated with obstruction, especially in children. The most common causes for hydronephrosis and hydroureter include infection, vesicoureteral reflux, congenital megacalyces and megaureter, previous obstruction, and bladder noncompliance. Theoretically, one can consider obstruction on the basis of its significance, which is that there may be a loss of renal function with time. Techniques such as intravenous pyelography and ultrasonography, which anatomically document the degree of dilation of the urinary tract, cannot quantitatively determine the presence of obstruction or its significance. Radionuclide renography more readily quantifies abnormal renal function. Serial renographic studies with furosemide can document renal function loss and, thus, determine the significance of the obstruction. Diuresis renography with furosemide provides an objective quantitative means for determining the renal function changes over time.

  7. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed

    Ito, Hiromu; Katsumata, Yuki; Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player's current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated.

  8. P value and the theory of hypothesis testing: an explanation for new researchers.

    PubMed

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
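    The Type I error definition above can be checked by simulation: when the null of no effect is true and we reject whenever the test statistic exceeds the critical value, the rejection rate should sit near the chosen α. A small stdlib-only sketch (the sample size, trial count, and z critical value are illustrative choices):

```python
import random

random.seed(0)
alpha, n, trials = 0.05, 30, 2000
rejections = 0
for _ in range(trials):
    # Data generated under a true null: mean 0, so any "effect" is noise
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    z = mean / (var / n) ** 0.5
    if abs(z) > 1.96:          # two-sided rejection region at alpha = 0.05
        rejections += 1
type_I_rate = rejections / trials  # should land near alpha under the null
```

The rate is slightly above 0.05 here because a z critical value is applied to a t-like statistic at n = 30; the exact t cutoff would tighten it.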

  9. [Evaluation of the quality of three-dimensional data acquired by using two kinds of structured light intra-oral scanners to scan the crown preparation model].

    PubMed

    Zhang, X Y; Li, H; Zhao, Y J; Wang, Y; Sun, Y C

    2016-07-01

    To quantitatively evaluate the quality and accuracy of three-dimensional (3D) data acquired by using two kinds of structured light intra-oral scanners to scan typical tooth crown preparations. Eight typical tooth crown preparation models were each scanned 3 times with the two structured light intra-oral scanners (A, B), forming the test groups. A high-precision model scanner was used to scan the models as the true value group. The data above the cervical margin were extracted. The quality indexes, including non-manifold edges, self-intersections, highly-creased edges, spikes, small components, small tunnels, small holes and the number of triangles, were measured with the mesh doctor tool in Geomagic Studio 2012. The scanned data of the test groups were aligned to the data of the true value group. 3D deviations of the test groups from the true value group were measured for each scanned point, each preparation and each group. The independent-samples Mann-Whitney U test was applied to analyze the 3D deviations for each scanned point in groups A and B. Correlation analysis was applied to the index values and the 3D deviation values. The total number of spikes in group A was 96, whereas those in group B and the true value group were 5 and 0, respectively. Trueness was 8.0 (8.3) μm for group A and 9.5 (11.5) μm for group B (P>0.05). The correlation coefficient between the number of spikes and data precision in group A was r=0.46. In this study, the quality of scanner B was better than that of scanner A, and the difference in accuracy was not statistically significant. There was a correlation between the quality and the precision of the data scanned with scanner A.

  10. CNV-ROC: A cost effective, computer-aided analytical performance evaluator of chromosomal microarrays.

    PubMed

    Goodman, Corey W; Major, Heather J; Walls, William D; Sheffield, Val C; Casavant, Thomas L; Darbro, Benjamin W

    2015-04-01

    Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
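    The threshold calibration described above can be sketched directly: treat the higher-resolution array's calls as per-probe truth, sweep log2 ratio thresholds on the lower-resolution array, and pick the one maximizing Youden's J = TPR − FPR on the ROC curve. The data below are synthetic placeholders, not CNV-ROC output:

```python
import numpy as np

rng = np.random.default_rng(7)
truth = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = true CNV probe
log2r = np.concatenate([rng.normal(0.8, 0.3, 200),     # CNV probes shifted up
                        rng.normal(0.0, 0.3, 200)])    # normal-copy probes

def best_threshold(scores, labels):
    """Sweep candidate thresholds; return the one maximizing Youden's J."""
    best_t, best_j = None, -1.0
    pos, neg = (labels == 1).sum(), (labels == 0).sum()
    for t in np.sort(np.unique(scores)):
        pred = scores >= t
        tpr = (pred & (labels == 1)).sum() / pos
        fpr = (pred & (labels == 0)).sum() / neg
        j = tpr - fpr
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

thr, j = best_threshold(log2r, truth)  # calibrated log2 ratio cutoff
```

Because the comparison is per probe, the same sweep yields genome-wide true and false negative counts at every candidate cutoff, which is the core of the ROC analysis described.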

  11. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    PubMed

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results were subjected to an analytic hierarchy process to obtain weights for the indicators and the proxy variables. These weights may prove useful when integrating indicator-level results of quantitative assessment, as they avoid the need to fall back on qualitative judgment in the absence of established weights between indicators. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present a future direction for their quantitative assessment.
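As a sketch of how an analytic hierarchy process turns expert pairwise comparisons into weights, the row geometric-mean method below approximates the principal-eigenvector priorities; the comparison matrix is hypothetical, not data from this study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix using the row geometric-mean method."""
    A = np.asarray(pairwise, dtype=float)
    g = A.prod(axis=1) ** (1.0 / A.shape[1])  # row geometric means
    return g / g.sum()                        # normalize to sum to 1

# Hypothetical 2x2 comparison: indicator 1 judged 3x as important
# as indicator 2; weights come out near [0.75, 0.25].
w = ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]])
```

For larger matrices the same function applies unchanged; a full AHP study would also compute a consistency ratio before accepting the weights.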

  12. Numerical Analysis of the Elastic Properties of 3D Needled Carbon/Carbon Composites

    NASA Astrophysics Data System (ADS)

    Tan, Y.; Yan, Y.; Li, X.; Guo, F.

    2017-09-01

    Based on the observation of microstructures of 3D needled carbon/carbon (C/C) composites, a model of their representative volume element (RVE) considering the true distribution of fibers is established. Using the theories of mesoscopic mechanics and introducing periodic boundary conditions for displacements, their elastic properties, with account of porosity, are determined by finite-element methods. Quasi-static tensile tests were carried out, and the numerical predictions were found to be in good agreement with test results. This means that the RVE model of 3D needled C/C composites can predict their elastic properties efficiently. The effects of needling density, radius of needled fibers, and thickness ratio of a short-cut fiber web and a weftless ply on the elastic constants of the composites are analyzed.

  13. A Critical Analysis of the Common Elements of a High School Social Justice Curriculum: Quantitative Research vs. Qualitative Research

    ERIC Educational Resources Information Center

    Hartlep, Nicholas Daniel

    2010-01-01

    The topic of this article is high school social justice curriculum [SJC]. Three socially-just focused studies were critically analyzed. Sample sizes in these studies varied from n = 12 to n = 55. It is the author's belief, based on the research of others (Kerssen-Griep & Eifler, 2008) that an effective SJC should consist of the following elements:…

  14. [Study of biological performance of Chinese materia medica with either a cold or hot property based on the three-element mathematical analysis model].

    PubMed

    Jin, Rui; Zhang, Bing; Liu, Xiao-Qing; Liu, Sen-Mao; Liu, Xin; Li, Lian-Zhen; Zhang, Qian; Xue, Chun-Miao

    2011-07-01

    The properties of Chinese materia medica are believed to be the summarization of the effects of biological performance on the various body states. Systemic discussion of chemical-factor elements, body-condition elements, biological-performance elements and their interrelationships is needed for research into the properties of Chinese materia medica. Following the practical characteristics of Chinese medicine, the three-element mathematical model was formed by introducing some mathematical concepts and methods and was used to study the cold or hot property of Chinese medicine, and to investigate the difference in biological performances of the two properties. By using the concept of different functionality of Chinese medicine on abnormal states and the idea of interaction in mathematics, the effects of chemical-factor elements and body-condition elements were normalized to the amount of biological performance which was represented by some important indicators. The three-element mathematical model was formed with scatter plots through four steps, including effect separation, intensity calculation, frequency statistics and relevance analysis. A comparison pharmacology experiment of administration of hot property medicines, Fuzi (Radix Aconiti Lateralis Preparata) and Rougui (Cortex Cinnamomi), and cold property medicines, Huangbai (Cortex Phellodendri) and Zhizi (Fructus Gardeniae) on normal and glucocorticoid-induced yang-deficiency and yin-deficiency states was designed. The results were analyzed by the mathematical model. The scatter plots were the main output of model analysis. The expression of cold property and hot property was able to be quantified by frequency distribution of biological indexes of administrations on yang-deficiency and yin-deficiency states in the "efficacy zone" and "toxicity zone" of the plots and by the relevance analysis. 
The ratios of biological indicator frequency in the "efficacy zone" of administrations on the yang-deficiency state and yin-deficiency state were 7:3 for Fuzi, 3:3 for Rougui, 4:4 for Huangbai and 1:5 for Zhizi. The sums of the biological indicator frequency in the "toxicity zone" of administration on the two states were 4 for Fuzi, 0 for Rougui, 2 for Huangbai and 4 for Zhizi. The relevance analysis showed that the order from Fuzi, Rougui, Huangbai to Zhizi was proportional to the change from "be true of yang-deficiency state" to "be true of yin-deficiency state". The extent of the hot property decreased while that of the cold property increased in the order of Fuzi, Rougui, Huangbai and Zhizi. The stronger the efficacy of the above medicines, the more obvious their toxicity. The three-element mathematical model employed in this study can effectively explain the different biological expressions of hot-property and cold-property medicines. This suggests that it may provide a mathematical tool and theoretical basis for the modern interpretation of the cold and hot properties of Chinese medicine, and provide new ideas for further study of the essence of Chinese medicine property theory.

  15. ``Phantom'' Modes in Ab Initio Tunneling Calculations: Implications for Theoretical Materials Optimization, Tunneling, and Transport

    NASA Astrophysics Data System (ADS)

    Barabash, Sergey V.; Pramanik, Dipankar

    2015-03-01

Development of low-leakage dielectrics for the semiconductor industry, together with many other areas of academic and industrial research, increasingly relies upon ab initio tunneling and transport calculations. Complex band structure (CBS) is a powerful formalism to establish the nature of tunneling modes, providing both a deeper understanding and a guided optimization of materials, with practical applications ranging from screening candidate dielectrics for lowest ``ultimate leakage'' to identifying charge-neutrality levels and Fermi level pinning. We demonstrate that CBS is prone to a particular type of spurious ``phantom'' solution, previously deemed true but irrelevant because of a very fast decay. We demonstrate that (i) in complex materials, phantom modes may exhibit very slow decay (appearing as leading tunneling terms, implying qualitative and huge quantitative errors), (ii) the phantom modes are spurious, (iii) unlike the pseudopotential ``ghost'' states, phantoms are an apparently unavoidable artifact of large numerical basis sets, (iv) a presumed increase in computational accuracy increases the number of phantoms, effectively corrupting the CBS results despite the higher accuracy achieved in resolving the true CBS modes and the real band structure, and (v) the phantom modes cannot be easily separated from the true CBS modes. We discuss implications for direct transport calculations. The strategy for dealing with the phantom states is discussed in the context of optimizing high-quality high-κ dielectric materials for decreased tunneling leakage.

  16. Hot deformation characteristics of AZ80 magnesium alloy: Work hardening effect and processing parameter sensitivities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Y.; Wan, L.; Guo, Z. H.

Isothermal compression experiments on AZ80 magnesium alloy were conducted with a Gleeble thermo-mechanical simulator in order to quantitatively investigate the work hardening (WH), strain rate sensitivity (SRS) and temperature sensitivity (TS) during hot processing of magnesium alloys. The WH, SRS and TS were described by the Zener-Hollomon parameter (Z) coupling the deformation parameters. The relationships between WH rate and true strain as well as true stress were derived from the Kocks-Mecking dislocation model and validated by our measurement data. The slope defined through the linear relationship of WH rate and true stress was related only to the annihilation coefficient Ω. Obvious WH behavior could be exhibited at a higher Z condition. Furthermore, we have identified the correlation between the microstructural evolution, including β-Mg17Al12 precipitation, and the SRS and TS variations. Intensive dynamic recrystallization and homogeneous distribution of β-Mg17Al12 precipitates resulted in a greater SRS coefficient at higher temperature. The deformation heat effect and β-Mg17Al12 precipitate content can be regarded as the major factors determining the TS behavior. At low Z conditions, the SRS becomes stronger, in contrast to the variation of TS. The optimum hot processing window was validated based on the established SRS and TS value distribution maps for AZ80 magnesium alloy.
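The linear relationship between WH rate and true stress noted above (Kocks-Mecking stage-III behavior, theta = d(sigma)/d(epsilon) ~ theta0 - (Omega/2)*sigma) suggests a simple way to extract the annihilation coefficient Ω from a flow curve. The sketch below is a generic illustration under that assumed model, not the authors' procedure, and the Voce-type test curve is synthetic.

```python
import numpy as np

def estimate_omega(true_strain, true_stress):
    """Fit the work-hardening rate theta = d(sigma)/d(epsilon) against
    true stress; under theta = theta0 - (Omega/2)*sigma the fitted
    slope equals -Omega/2."""
    strain = np.asarray(true_strain, dtype=float)
    stress = np.asarray(true_stress, dtype=float)
    theta = np.gradient(stress, strain)        # numerical WH rate
    slope, _ = np.polyfit(stress, theta, 1)    # theta ~ intercept + slope*sigma
    return -2.0 * slope

# Synthetic Voce-type flow curve with Omega = 10 (saturation stress 200).
eps = np.linspace(0.0, 0.5, 400)
sigma = 200.0 - 100.0 * np.exp(-5.0 * eps)
omega = estimate_omega(eps, sigma)
```

On measured data the fit would be restricted to the stage-III (linear) portion of the theta-sigma plot rather than the whole curve.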

  17. Mapping of native inorganic elements and injected nanoparticles in a biological organ with laser-induced plasma

    NASA Astrophysics Data System (ADS)

    Motto-Ros, V.; Sancey, L.; Ma, Q. L.; Lux, F.; Bai, X. S.; Wang, X. C.; Yu, Jin; Panczer, G.; Tillement, O.

    2012-11-01

Emission spectroscopy of laser-induced plasma from a thin section of mouse kidney successfully detected inorganic elements, Na, Ca, Cu, and Gd, naturally contained in the organ or artificially injected in the form of Gd-based nanoparticles. A two-dimensional scan of the sample allowed the laser beam to explore its surface with a resolution of 100 μm, resulting in a quantitative elemental mapping of the organ with sub-mM sensitivity. The compatibility of the setup with standard optical microscopy emphasizes the potential to provide multiple images of the same biological tissue with different types of response, which can be elemental, molecular, or cellular.

  18. Relationship of magnetic field strength and brightness of fine-structure elements in the solar temperature minimum region

    NASA Technical Reports Server (NTRS)

    Cook, J. W.; Ewing, J. A.

    1990-01-01

A quantitative relationship was determined between magnetic field strength (or magnetic flux) from photospheric magnetograph observations and the brightness temperature of solar fine-structure elements observed at 1600 A, where the predominant flux source is continuum emission from the solar temperature minimum region. A Kitt Peak magnetogram and spectroheliograph observations at 1600 A taken during a sounding rocket flight of the High Resolution Telescope and Spectrograph on December 11, 1987 were used. The statistical distributions of brightness temperature in the quiet sun at 1600 A and of the absolute value of magnetic field strength in the same area were determined from these observations. Using a technique which obtains the best-fit relationship of a given functional form between these two histogram distributions, a quantitative relationship was determined between the absolute value of magnetic field strength B and brightness temperature which is essentially linear from 10 to 150 G. An interpretation is suggested in which a basal heating occurs generally, while brighter elements are produced in magnetic regions with temperature enhancements proportional to B.

  19. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

The Goutallier Classification is a semi-quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental MR spectroscopic quantitative fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). After dichotomizing the scale, the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration.
Spectroscopic MR measurement may increase the accuracy of the Goutallier classification and thus improve the prediction of clinical results after rotator cuff repair. However, these techniques are currently only available in an experimental setting.
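The Spearman rank correlation used above to compare the ordinal Goutallier grades with the spectroscopic fat/water ratios can be computed as below. The grading data are hypothetical, and ties are handled with average ranks, as is standard.

```python
import numpy as np

def average_ranks(x):
    """Rank data from 1..n, assigning tied values their average rank."""
    x = np.asarray(x, dtype=float)
    ranks = np.empty(len(x))
    ranks[np.argsort(x)] = np.arange(1, len(x) + 1)
    for v in np.unique(x):           # average out ties
        ranks[x == v] = ranks[x == v].mean()
    return ranks

def spearman(a, b):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    ra, rb = average_ranks(a), average_ranks(b)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra**2).sum() * (rb**2).sum()))

# Hypothetical Goutallier grades (0-4) vs. measured fat/water ratios.
rho = spearman([0, 1, 1, 2, 3, 4], [0.05, 0.10, 0.12, 0.22, 0.35, 0.50])
```

With perfectly monotone inputs and no ties, rho is exactly 1; the tie in the grades above pulls it slightly below 1.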

  20. Dynamic Modelling for Planar Extensible Continuum Robot Manipulators

    DTIC Science & Technology

    2006-01-01

... octopus arm [18]. The OCTARM, shown in Figure 1, is a three-section robot with nine degrees of freedom. Aside from two-axis bending with constant... octopus arm. However, while allowing extensibility, the model is based on an approximation (by a finite number of linear models) to the true continuum

  1. HCE Research Coordination Directorate (ReCoorD Database)

    DTIC Science & Technology

    2016-04-27

portfolio management is often hidden within broader mission scopes and visibility into those portfolios is often limited at best. Current specialty-specific tracking databases do not exist. Current broad-sweeping portfolio management tools do not exist (not true--define terms?). The HCE receives requests from a variety of oversight bodies for reports on the current state of project-through-portfolio efforts. Tools such as NIH's Reporter, while still in development, do not yet appear to meet HCE element requirements.

  2. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    NASA Astrophysics Data System (ADS)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients  >  0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r  =  0.84) and anterior (r  =  0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F  <  0.01, p  =  0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. 
This work has the potential to improve analyses of cam-type lesions of the FHN junction for large-scale morphometric and clinical MR investigations of the human hip region.
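As a minimal geometric sketch (not the authors' pipeline), an alpha angle at one radial position can be computed from three landmarks: the fitted femoral head center, a point on the neck axis, and the point where the head contour first departs from the fitted sphere. All coordinates here are hypothetical.

```python
import numpy as np

def alpha_angle_deg(head_center, neck_axis_point, asphericity_point):
    """Angle at the head center between the neck axis and the first
    point of departure from head sphericity, in degrees."""
    c = np.asarray(head_center, dtype=float)
    u = np.asarray(neck_axis_point, dtype=float) - c
    v = np.asarray(asphericity_point, dtype=float) - c
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Hypothetical 2D radial-slice coordinates (mm); the departure point
# lies at ~60 degrees from the neck axis.
angle = alpha_angle_deg([0.0, 0.0], [30.0, 0.0], [10.0, 17.32])
```

In the automated method above, this computation would be repeated at each radial position around the femoral head-neck junction using the constructed local coordinate system.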

  3. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

PET reconstruction is key to the quantification of PET data. To our knowledge, no comparative study of reconstruction methods has been performed to date. In this study, we compared reconstruction methods with various filters in terms of their spatial resolution, non-uniformities (NU), recovery coefficients (RCs), and spillover ratios (SORs). In addition, the linearity between the measured and true radioactivity concentrations of the reconstructed images was assessed. A Siemens Inveon PET scanner was used in this study. Spatial resolution was measured according to the NEMA standard using a 1 mm3 18F point source. Image quality was assessed in terms of NU, RC and SOR. To measure the effect of reconstruction algorithms and filters, data were reconstructed using FBP, a 3D reprojection algorithm (3DRP), ordered subset expectation maximization 2D (OSEM 2D), and maximum a posteriori (MAP) with various filters or smoothing factors (β). To assess the linearity of reconstructed radioactivity, an image quality phantom filled with 18F was used with FBP, OSEM and MAP (β = 1.5 and 5 × 10-5). The highest achievable volumetric resolution was 2.31 mm3, and the highest RCs were obtained when OSEM 2D was used. SOR was 4.87% for air and 3.97% for water when OSEM 2D reconstruction was used. The measured radioactivity of the reconstructed image was proportional to the injected radioactivity below 16 MBq/ml when the FBP or OSEM 2D reconstruction methods were used. By contrast, when the MAP reconstruction method was used, the activity of the reconstructed image increased proportionally, regardless of the amount of injected radioactivity. When OSEM 2D or FBP were used, the measured radioactivity concentration was reduced by 53% compared with the true injected radioactivity for radioactivity <16 MBq/ml. The OSEM 2D reconstruction method provides the highest achievable volumetric resolution and the highest RCs among all the tested methods, and yields a linear relation between the measured and true concentrations for radioactivity below 16 MBq/ml. Our data collectively show that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.
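The linearity assessment described above, comparing measured activity in the reconstructed image against the injected (true) activity, reduces to a simple slope and R^2 fit. The sketch below uses hypothetical data, not the Inveon measurements.

```python
import numpy as np

def linearity_fit(injected, measured):
    """Least-squares slope and intercept of measured vs. injected
    activity, plus R^2 as a linearity figure of merit."""
    x = np.asarray(injected, dtype=float)
    y = np.asarray(measured, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    r2 = 1.0 - (resid**2).sum() / ((y - y.mean())**2).sum()
    return slope, intercept, r2

# Hypothetical calibration: measurement recovers 95% of injected activity.
slope, intercept, r2 = linearity_fit([1, 2, 4, 8, 16],
                                     [0.95, 1.9, 3.8, 7.6, 15.2])
```

A slope near 1 with negligible intercept and R^2 near 1 over the clinically relevant activity range would indicate quantitatively linear reconstruction.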

  4. WE-FG-202-08: Assessment of Treatment Response Via Longitudinal Diffusion MRI On A MRI-Guided System: Initial Experience of Quantitative Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, X; Yang, Y; Yang, L

Purpose: To report our initial experience of systematically monitoring treatment response using longitudinal diffusion MR images on a Co-60 MRI-guided radiotherapy system. Methods: Four patients, including 2 head-and-neck cases, 1 sarcoma and 1 GBM, treated on a 0.35 Tesla MRI-guided treatment system were analyzed. For each patient, 3D TrueFISP MRIs were acquired during CT simulation and before each treatment, for treatment planning and patient setup purposes respectively. Additionally, 2D diffusion-weighted MR images (DWI) were acquired weekly throughout the treatment course. The gross target volume (GTV) and brainstem (as a reference structure) were delineated on the weekly 3D TrueFISP MRIs to monitor anatomy changes; the contours were then transferred onto the corresponding DWI images after fusion with the weekly TrueFISP images. The patient-specific temporal and spatial variations during the entire treatment course, such as anatomic changes and the target apparent diffusion coefficient (ADC) distribution, were evaluated in a longitudinal pattern. Results: Routine MRI revealed progressive soft-tissue GTV volume changes (up to 53%) for the H&N cases during the treatment course of 5–7 weeks. Within the GTV, the mean ADC values varied from −44% (ADC decrease) to +26% (ADC increase) in a week. The gradual increase of ADC value was inversely associated with target volume variation for one H&N case. The maximal change of mean ADC values within the brainstem was 5.3% for the H&N cases. For the large sarcoma and GBM tumors, spatial heterogeneity and temporal variations were observed through longitudinal ADC analysis. Conclusion: In addition to superior soft-tissue visualization, the 0.35T MR system on ViewRay showed the potential to quantitatively measure ADC values for both tumor and normal tissues. For normal tissue that is minimally affected by radiation, ADC values are reproducible. Tumor ADC values show temporal and spatial fluctuations that can be exploited for personalized adaptive therapy.

  5. Validation of new superheavy elements and IUPAC-IUPAP joint working group

    NASA Astrophysics Data System (ADS)

    Jarlskog, Cecilia

    2016-12-01

The great chemist Glenn Seaborg wrote a delightful little book, "Man-made Transuranium Elements", published in 1963, in which he points out: "The former basic criterion for the discovery of a new element - namely, chemical identification and separation from all previously-known elements - had to be changed in the case of lawrencium (element 103). This also may be true for elements beyond lawrencium." Indeed, this is what has happened. The elements with Z ≥ 103 are produced in nuclear reactions and are detected by counters. The detectors have undergone substantial refinement. For example, one uses multiwire proportional chambers [for which Georges Charpak received the 1992 Nobel Prize in Physics] as well as solid-state micro-strip detectors. In spite of this remarkable shift from chemistry to physics, the managerial staff of the International Union of Pure and Applied Chemistry (IUPAC) does not seem to be aware of what has been going on. The validation of superheavy elements should be done by physicists, as the chemists lack the relevant competence, as I will discuss below. This article is about a collaboration between IUPAC and its sister organization, the International Union of Pure and Applied Physics (IUPAP), to deal with the discovery of superheavy elements beyond Z = 112. I spent a great deal of time on this issue. In my opinion, the collaboration turned out to be a failure. For the sake of science, which should be our most important concern (and not politics), the rules for future collaborations, if any, should be accurately defined and respected. The validation of new elements should be done by people who have the relevant competence - the physicists.

  6. Top-quality security optical elements: from holography towards 500.000 dpi

    NASA Astrophysics Data System (ADS)

    Kotačka, Libor; Těthal, Tomas; Kolařík, Vladimir

    2005-09-01

Invented in the late 1940s, holography has played a very important role in many technical applications. While the 1960s and 1970s belonged to, say, the classical period of holography and diffractive optics (optical elements, lenses, beam splitters), the last two decades have shown an enormous expansion of various, mainly synthetically designed and created, holographic elements. Ever since their invention, holograms have also attracted our attention because of their true three-dimensional perception of a depicted object and related optical features. These properties have made holograms very well and easily recognized by the public, but still very difficult to falsify. Holography-based optically variable microstructures and related advanced anti-counterfeit measures are thus among the leading features in security elements used for the protection against falsification of valuables and documents (banknotes, visas, passports, ID cards, tax stamps, etc.), serving the protection of interests, and many others. Our talk surveys the technologies currently exploited to produce several protective optical elements. Special attention will be paid to special optical elements developed synthetically by means of a unique technology, electron beam lithography, which is one of the world's most advanced technologies used for protection against falsification. The computer-synthesized security elements are recorded with an incredible resolution of up to 500.000 dpi and are specially developed for the security of the most important state valuables and documents. Finally, we shall discuss some technological possibilities for future development.

  7. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective way to evaluate these systems is to compare their performance on the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., if we have a gold standard. However, this gold standard is very rarely known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  8. Methyl cation affinities of neutral and anionic maingroup-element hydrides: trends across the periodic table and correlation with proton affinities.

    PubMed

    Mulder, R Joshua; Guerra, Célia Fonseca; Bickelhaupt, F Matthias

    2010-07-22

    We have computed the methyl cation affinities in the gas phase of archetypal anionic and neutral bases across the periodic table using ZORA-relativistic density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. The main purpose of this work is to provide the methyl cation affinities (and corresponding entropies) at 298 K of all anionic (XH(n-1)(-)) and neutral bases (XH(n)) constituted by maingroup-element hydrides of groups 14-17 and the noble gases (i.e., group 18) along the periods 2-6. The cation affinity of the bases decreases from H(+) to CH(3)(+). To understand this trend, we have carried out quantitative bond energy decomposition analyses (EDA). Quantitative correlations are established between the MCA and PA values.

  9. Is scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) quantitative?

    PubMed

    Newbury, Dale E; Ritchie, Nicholas W M

    2013-01-01

Scanning electron microscopy/energy dispersive X-ray spectrometry (SEM/EDS) is a widely applied elemental microanalysis method capable of identifying and quantifying all elements in the periodic table except H, He, and Li. By following the "k-ratio" (unknown/standard) measurement protocol developed for electron-excited wavelength dispersive spectrometry (WDS), SEM/EDS can achieve accuracy and precision equivalent to WDS at substantially lower electron dose, even when severe X-ray peak overlaps occur, provided sufficient counts are recorded. Achieving this level of performance is now much more practical with the advent of the high-throughput silicon drift detector energy dispersive X-ray spectrometer (SDD-EDS). However, three measurement issues continue to diminish the impact of SEM/EDS: (1) In the qualitative analysis (i.e., element identification) that must precede quantitative analysis, at least some current and many legacy software systems are vulnerable to occasional misidentification of major constituent peaks, with the frequency of misidentifications rising significantly for minor and trace constituents. (2) The use of standardless analysis, which is subject to much broader systematic errors, leads to quantitative results that, while useful, do not have sufficient accuracy to solve critical problems, e.g., determining the formula of a compound. (3) EDS spectrometers have such a large volume of acceptance that apparently credible spectra can be obtained from specimens with complex topography, which introduces uncontrolled geometric factors that modify X-ray generation and propagation, resulting in very large systematic errors, often a factor of ten or more. © Wiley Periodicals, Inc.
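The k-ratio protocol mentioned above starts from Castaing's first approximation: the concentration of an element in the unknown is roughly the measured intensity ratio against a standard of known composition, which matrix (ZAF) corrections then refine iteratively. A minimal sketch, with hypothetical intensities:

```python
def k_ratio_first_estimate(i_unknown, i_standard, c_standard):
    """Castaing's first approximation: C_unknown ~ k * C_standard,
    where k = I_unknown / I_standard is the measured k-ratio.
    Matrix (ZAF) corrections refine this starting estimate."""
    k = i_unknown / i_standard
    return k * c_standard

# Hypothetical Fe K-alpha counts vs. a pure-Fe standard (C_standard = 1.0):
# 4200 counts on the unknown, 10000 on the standard.
c_est = k_ratio_first_estimate(4200.0, 10000.0, 1.0)
```

Both spectra would be acquired under identical beam energy, dose, and detector geometry so that the intensity ratio isolates the compositional difference.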

  10. Closing in on chemical bonds by opening up relativity theory.

    PubMed

    Whitney, Cynthia K

    2008-03-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein's special relativity theory.

  11. The confusion in complying with good manufacturing practice requirements in Malaysia

    NASA Astrophysics Data System (ADS)

    Jali, Mohd Bakri; Ghani, Maaruf Abdul; Nor, Norazmir Md

    2016-11-01

    Food manufacturing operations need to fulfil regulatory requirements related to hygiene and good manufacturing practices (GMP) to successfully market their products as safe and quality products. GMP is based on ten elements that serve as guidelines to ensure control over biological, chemical and physical hazards. This study aims to investigate confusion over the design and facilities elements among food industries. Both qualitative and quantitative techniques are used as systematic tools. Design and facilities elements lay a firm foundation for good manufacturing practice to ensure food hygiene and should be used in conjunction with each specific code of hygiene practice and guidelines.

  12. Not all are free-living: high-throughput DNA metabarcoding reveals a diverse community of protists parasitizing soil metazoa.

    PubMed

    Geisen, S; Laros, I; Vizcaíno, A; Bonkowski, M; de Groot, G A

    2015-09-01

    Protists, the most diverse eukaryotes, are largely considered to be free-living bacterivores, but vast numbers of taxa are known to parasitize plants or animals. High-throughput sequencing (HTS) approaches now commonly replace cultivation-based approaches in studying soil protists, but insights into common biases associated with this method are limited to aquatic taxa and samples. We created a mock community of common free-living soil protists (amoebae, flagellates, ciliates), extracted DNA and amplified it in the presence of metazoan DNA using 454 HTS. We aimed at evaluating whether HTS quantitatively reveals true relative abundances of soil protists and at investigating whether the expected protist community structure is altered by the co-amplification of metazoan-associated protist taxa. Indeed, HTS revealed fundamentally different protist communities from those expected. Ciliate sequences were highly over-represented, while those of most amoebae and flagellates were under-represented or totally absent. These results underpin the biases introduced by HTS that prevent reliable quantitative estimations of free-living protist communities. Furthermore, we detected a wide range of nonadded protist taxa probably introduced along with metazoan DNA, which altered the protist community structure. Among those, 20 taxa most closely resembled parasitic, often pathogenic taxa. Therewith, we provide the first HTS data in support of classical observational studies that showed that potential protist parasites are hosted by soil metazoa. Taken together, profound differences in amplification success between protist taxa and an inevitable co-extraction of protist taxa parasitizing soil metazoa obscure the true diversity of free-living soil protist communities. © 2015 John Wiley & Sons Ltd.

  13. A quantitative metric to identify critical elements within seafood supply networks.

    PubMed

    Plagányi, Éva E; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
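
    The abstract does not state the SCI formula, so the scoring rule below (node throughput weighted by connectivity, summed over the chain) is an invented toy stand-in that only illustrates the idea of ranking supply-chain elements and totalling their scores.

```python
# Toy supply chain: per-node throughput (arbitrary units) and edges.
# The score (throughput * degree) and the chain total are illustrative
# assumptions, not the published SCI definition.
throughput = {"fishers": 100, "processors": 95, "airports": 90, "consumers": 85}
edges = [("fishers", "processors"), ("processors", "airports"),
         ("airports", "consumers"), ("processors", "consumers")]

degree = {n: 0 for n in throughput}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

scores = {n: throughput[n] * degree[n] for n in throughput}
sci = sum(scores.values())               # chain-level metric
critical = max(scores, key=scores.get)   # most critical element
print(critical, sci)  # processors 735
```

    In the paper's terms, a chain whose total is spread evenly across nodes would be more stable than one, like this toy example, dominated by a single highly connected processor.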

  14. A Quantitative Metric to Identify Critical Elements within Seafood Supply Networks

    PubMed Central

    Plagányi, Éva E.; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J.; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H.; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical. PMID:24633147

  15. ICP-MS: Analytical Method for Identification and Detection of Elemental Impurities.

    PubMed

    Mittal, Mohini; Kumar, Kapil; Anghore, Durgadas; Rawal, Ravindra K

    2017-01-01

    The aim of this article is to review and discuss the currently used quantitative analytical method ICP-MS, which is used for quality control of pharmaceutical products. The ICP-MS technique has several applications such as determination of single elements, multi-element analysis in synthetic drugs, heavy metals in environmental water, and trace element content of selected fertilizers and dairy manures. ICP-MS is also used for determination of toxic and essential elements in different varieties of food samples and metal pollutants present in the environment. Pharmaceuticals may generate impurities at various stages of development, transportation and storage, which makes them risky to administer. Thus, it is essential that these impurities be detected and quantified. ICP-MS plays an important role in the identification and detection of elemental impurities. Copyright © Bentham Science Publishers.

  16. Multielement extraction system for the determination of 18 trace elements in geochemical samples

    USGS Publications Warehouse

    Clark, J.R.; Viets, J.G.

    1981-01-01

    A Methyl isobutyl ketone-Amine synerGistic Iodide Complex (MAGIC) extraction system has been developed for use in geochemical exploration which separates a maximum number of trace elements from interfering matrices. Extraction curves for 18 of these trace elements are presented: Pd, Pt, Cu, Ag, Au, Zn, Cd, Hg, Ga, In, Tl, Sn, Pb, As, Sb, Bi, Se, and Te. The acid normality of the aqueous phase controls the extraction into the organic phase, and each of these 18 elements has a broad range of HCl normality over which it is quantitatively extracted, making it possible to determine all 18 trace elements from a single sample digestion or leach solution. The extract can be analyzed directly by flame atomic absorption or inductively coupled plasma emission spectroscopy. Most of these 18 elements can be determined by flameless atomic absorption after special treatment of the organic extract.

  17. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, using a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  18. Electrostatic discharge test apparatus

    NASA Technical Reports Server (NTRS)

    Smith, William C. (Inventor)

    1989-01-01

    Electrostatic discharge properties of materials are quantitatively measured and ranked. Samples (20) are rotated on a turntable (15) beneath selectable, co-available electrostatic chargers (30/40), one being a corona charging element (30) and the other a sample-engaging triboelectric charging element (40). They then pass under a voltage meter (25) to measure the amount of residual charge on the samples (20). After charging is discontinued, measurements are continued to record the charge decay history over time.

  19. Quantitative characterization of nanoscale polycrystalline magnets with electron magnetic circular dichroism.

    PubMed

    Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M

    2014-01-01

    Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably enhances the outreach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.

  20. Simultaneous X-ray fluorescence and scanning X-ray diffraction microscopy at the Australian Synchrotron XFM beamline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Michael W. M.; Phillips, Nicholas W.; van Riessen, Grant A.

    2016-08-11

    Owing to its extreme sensitivity, quantitative mapping of elemental distributions via X-ray fluorescence microscopy (XFM) has become a key microanalytical technique. The recent realisation of scanning X-ray diffraction microscopy (SXDM) meanwhile provides an avenue for quantitative super-resolved ultra-structural visualization. The similarity of their experimental geometries indicates excellent prospects for simultaneous acquisition. Here, robust, simultaneous XFM-SXDM is demonstrated in both step- and fly-scanning modes.

  1. Quantitative Imaging of Young's Modulus of Soft Tissues from Ultrasound Water Jet Indentation: A Finite Element Study

    PubMed Central

    Lu, Min-Hua; Mao, Rui; Lu, Yin; Liu, Zheng; Wang, Tian-Fu; Chen, Si-Ping

    2012-01-01

    Indentation testing is a widely used approach to evaluate mechanical characteristics of soft tissues quantitatively. Young's modulus of soft tissue can be calculated from the force-deformation data with known tissue thickness and Poisson's ratio using Hayes' equation. Our group previously developed a noncontact indentation system using a water jet as a soft indenter as well as the coupling medium for the propagation of high-frequency ultrasound. The novel system has shown its ability to detect the early degeneration of articular cartilage. However, there is still a lack of a quantitative method to extract the intrinsic mechanical properties of soft tissue from water jet indentation. The purpose of this study is to investigate the relationship between the loading-unloading curves and the mechanical properties of soft tissues to provide an imaging technique of tissue mechanical properties. A 3D finite element model of water jet indentation was developed with consideration of the finite deformation effect. An improved Hayes' equation has been derived by introducing a new scaling factor which is dependent on Poisson's ratio v, aspect ratio a/h (the radius of the indenter/the thickness of the test tissue), and deformation ratio d/h. With this model, the Young's modulus of soft tissue can be quantitatively evaluated and imaged with an error of no more than 2%. PMID:22927890
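
    Hayes' solution for a rigid, flat-ended cylindrical indenter on a bonded elastic layer has the form P = 2aEκw/(1 - v^2), where κ is the dimensionless scaling factor that the study above extends with a dependence on the deformation ratio d/h. The sketch below simply inverts this form for E; the load, geometry, Poisson's ratio and κ value are placeholders, not values from the study.

```python
def youngs_modulus_hayes(force, radius, deflection, poisson, kappa):
    """Invert Hayes' equation  P = 2*a*E*kappa*w / (1 - v**2)  for E.
    kappa is the dimensionless scaling factor, tabulated as a function of
    a/h and v (and, in the improved form described above, of d/h too)."""
    return force * (1.0 - poisson**2) / (2.0 * radius * kappa * deflection)

# Placeholder values: 10 mN load, 1 mm indenter radius, 0.1 mm deflection,
# incompressible tissue (v = 0.5), assumed kappa = 1.5
E = youngs_modulus_hayes(force=0.01, radius=1e-3, deflection=1e-4,
                         poisson=0.5, kappa=1.5)
print(E)  # ~2.5e4 Pa, i.e. 25 kPa
```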

  2. Nuclear physics: Close encounters of the alpha kind

    DOE PAGES

    Quaglioni, Sofia

    2015-12-02

    Here, breakthrough calculations of collisions between two helium nuclei pave the way to a quantitative understanding of how the elements carbon and oxygen were made in stars — and to improved models of stellar evolution.

  3. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results These weights may prove useful in avoiding having to resort to qualitative means in absence of weights between indicators when integrating the results of quantitative assessment by indicator. Conclusions This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
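
    The analytic hierarchy process step can be sketched concretely: the weights are the normalised principal eigenvector of a reciprocal pairwise-comparison matrix, with Saaty's consistency index as a sanity check. The 3x3 matrix over three hypothetical indicators below is invented for illustration.

```python
import numpy as np

# Reciprocal pairwise-comparison matrix: A[i, j] says how much more
# important indicator i is than indicator j (hypothetical judgments).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                       # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # Saaty's consistency index
print(w.round(3), round(ci, 4))
```

    For a perfectly consistent matrix lambda_max equals n and the consistency index is zero; judgments producing a large index would normally be revised before the weights are used.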

  4. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) Develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation. 2) Perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator. 3) Perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport. 4) Test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production. 5) Develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) A true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) Documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) Documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) Database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  5. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  6. A modular finite-element model (MODFE) for areal and axisymmetric ground-water-flow problems, Part 2: Derivation of finite-element equations and comparisons with analytical solutions

    USGS Publications Warehouse

    Cooley, Richard L.

    1992-01-01

    MODFE, a modular finite-element model for simulating steady- or unsteady-state, areal or axisymmetric flow of ground water in a heterogeneous anisotropic aquifer is documented in a three-part series of reports. In this report, part 2, the finite-element equations are derived by minimizing a functional of the difference between the true and approximate hydraulic head, which produces equations that are equivalent to those obtained by either classical variational or Galerkin techniques. Spatial finite elements are triangular with linear basis functions, and temporal finite elements are one dimensional with linear basis functions. Physical processes that can be represented by the model include (1) confined flow, unconfined flow (using the Dupuit approximation), or a combination of both; (2) leakage through either rigid or elastic confining units; (3) specified recharge or discharge at points, along lines, or areally; (4) flow across specified-flow, specified-head, or head-dependent boundaries; (5) decrease of aquifer thickness to zero under extreme water-table decline and increase of aquifer thickness from zero as the water table rises; and (6) head-dependent fluxes from springs, drainage wells, leakage across riverbeds or confining units combined with aquifer dewatering, and evapotranspiration. The matrix equations produced by the finite-element method are solved by the direct symmetric-Doolittle method or the iterative modified incomplete-Cholesky conjugate-gradient method. The direct method can be efficient for small- to medium-sized problems (less than about 500 nodes), and the iterative method is generally more efficient for larger-sized problems. Comparison of finite-element solutions with analytical solutions for five example problems demonstrates that the finite-element model can yield accurate solutions to ground-water flow problems.

  7. Energy dispersive X-ray fluorescence spectrometry for the direct multi-element analysis of dried blood spots

    NASA Astrophysics Data System (ADS)

    Marguí, E.; Queralt, I.; García-Ruiz, E.; García-González, E.; Rello, L.; Resano, M.

    2018-01-01

    Home-based collection protocols for clinical specimens are actively pursued as a means of improving life quality of patients. In this sense, dried blood spots (DBS) are proposed as a non-invasive and even self-administered alternative to sampling whole venous blood. This contribution explores the potential of energy dispersive X-ray fluorescence spectrometry for the simultaneous and direct determination of some major (S, Cl, K, Na), minor (P, Fe) and trace (Ca, Cu, Zn) elements in blood, after its deposition onto clinical filter papers, thus giving rise to DBS. For quantification purposes the best strategy was to use matrix-matched blood samples of known analyte concentrations. The accuracy and precision of the method were evaluated by analysis of a blood reference material (Seronorm™ trace elements whole blood L3). Quantitative results were obtained for the determination of P, S, Cl, K and Fe, and limits of detection for these elements were adequate, taking into account their typical concentrations in real blood samples. Determination of Na, Ca, Cu and Zn was hampered by high blank levels arising from the sample support (Na, Ca) and from the instrument (Cu, Zn). Therefore, the quantitative determination of these elements at the levels expected in blood samples was not feasible. The methodology developed was applied to the analysis of several blood samples and the results obtained were compared with those reported by standard techniques. Overall, the performance of the method developed is promising and it could be used to determine the aforementioned elements in blood samples in a simple, fast and economic way. Furthermore, its non-destructive nature enables further analyses by means of complementary techniques to be carried out.

  8. Reduction of collisional-radiative models for transient, atomic plasmas

    NASA Astrophysics Data System (ADS)

    Abrantes, Richard June; Karagozian, Ann; Bilyeu, David; Le, Hai

    2017-10-01

    Interactions between plasmas and any radiation field, whether from lasers or plasma emissions, introduce many computational challenges. One of these challenges involves resolving the atomic physics, which can influence other physical phenomena in the radiated system. In this work, a collisional-radiative (CR) model with reduction capabilities is developed to capture the atomic physics at a reduced computational cost. Although the model is designed to handle any element, it is currently supplemented by LANL's argon database, which includes the relevant collisional and radiative processes for all of the ionic stages. Using the detailed data set as the true solution, reduction mechanisms in the form of Boltzmann grouping, uniform grouping, and quasi-steady-state (QSS) are implemented and compared against that solution. Effects on the transient plasma stemming from the grouping methods are compared. Distribution A: Approved for public release; unlimited distribution, PA (Public Affairs) Clearance Number 17449. This work was supported by the Air Force Office of Scientific Research (AFOSR), Grant Number 17RQCOR463 (Dr. Jason Marshall).

  9. Development of the Austrian Nursing Minimum Data Set (NMDS-AT): the third Delphi Round, a quantitative online survey.

    PubMed

    Ranegger, Renate; Hackl, Werner O; Ammenwerth, Elske

    2015-01-01

    A Nursing Minimum Data Set (NMDS) aims at systematically describing nursing care in terms of patient problems, nursing activities, and patient outcomes. In an earlier Delphi study, 56 data elements were proposed for inclusion in an Austrian Nursing Minimum Data Set (NMDS-AT). The aims of this study were to identify the most important data elements on this list and to identify appropriate coding systems. An online Delphi-based survey was conducted with 88 experts. 43 data elements were rated as relevant for an NMDS-AT (strong agreement of more than half of the experts): nine data elements concerning the institution, patient demographics, and medical condition; 18 data elements concerning patient problems by using nursing diagnoses; seven data elements concerning nursing outcomes; and nine data elements concerning nursing interventions. As classification systems, national classification systems were proposed besides ICNP, NNN, and nursing-sensitive indicators. The resulting proposal for an NMDS-AT will now be tested with routine data.

  10. Molecular manipulations for enhancing luminescent bioreporters performance in the detection of toxic chemicals.

    PubMed

    Yagur-Kroll, Sharon; Belkin, Shimshon

    2014-01-01

    Microbial whole-cell bioreporters are genetically modified microorganisms that produce a quantifiable output in response to the presence of toxic chemicals or other stress factors. These bioreporters harbor a genetic fusion between a sensing element (usually a gene regulatory element responsive to the target) and a reporter element, the product of which may be quantitatively monitored either by its presence or by its activity. In this chapter we review genetic manipulations undertaken in order to improve bioluminescent bioreporter performance by increasing luminescent output, lowering the limit of detection, and shortening the response time. We describe molecular manipulations applied to all aspects of whole-cell bioreporters: the host strain, the expression system, the sensing element, and the reporter element. The molecular construction of whole-cell luminescent bioreporters, harboring fusions of gene promoter elements to reporter genes, has been around for over three decades; in most cases, these two genetic elements are combined "as is." This chapter outlines diverse molecular manipulations for enhancing the performance of such sensors.

  11. Direct formation of element chlorides from the corresponding element oxides through microwave-assisted carbohydrochlorination reactions.

    PubMed

    Nordschild, Simon; Auner, Norbert

    2008-01-01

    A series of technically and economically important element chlorides, such as SiCl4, BCl3, AlCl3, FeCl2, PCl3 and TiCl4, was synthesized through reactions between hydrogen chloride and the corresponding element oxides in the presence of different carbon sources with microwave assistance. This process route was optimized, for demonstration purposes, for tetrachlorosilane formation and successfully demonstrates the broad applicability of various silicon oxide-containing minerals and materials for carbohydrochlorination. The chlorination reaction occurs at lower temperatures than with conventional heating in a tubular oven, with substantially shorter reaction times and in better yields: quantitatively in the case of tetrachlorosilane, based on the silicon content of the starting material. The experimental procedure is very simple and provides basic information about the suitability of element compounds, especially element oxides, for carbohydrochlorination. According to the general reaction sequence element oxide → element → element chloride used in today's technology, this one-step carbohydrochlorination with hydrogen chloride is considerably more efficient, particularly in terms of energy input and reaction times, avoiding the isolation of the pure elements required for chlorination to give the element chlorides with use of the more corrosive and toxic chlorine gas.

  12. Soil Components in Heterogeneous Impact Glass in Martian Meteorite EETA79001

    NASA Technical Reports Server (NTRS)

    Schrader, C. M.; Cohen, B. A.; Donovan, J. J.; Vicenzi, E. P.

    2010-01-01

    Martian soil composition can illuminate past and ongoing near-surface processes such as impact gardening [2] and hydrothermal and volcanic activity [3,4]. Though the Mars Exploration Rovers (MER) have analyzed the major-element composition of Martian soils, no soil samples have been returned to Earth for detailed chemical analysis. Rao et al. [1] suggested that Martian meteorite EETA79001 contains melted Martian soil in its impact glass (Lithology C) based on sulfur enrichment of Lithology C relative to the meteorite's basaltic lithologies (A and B) [1,2]. If true, it may be possible to extract detailed soil chemical analyses using this meteoritic sample. We conducted high-resolution (0.3 μm/pixel) element mapping of Lithology C in thin section EETA79001,18 by energy dispersive spectrometry (EDS). We use these data for principal component analysis (PCA).

  13. OSMOTIC PROPERTIES OF HUMAN RED CELLS.

    PubMed

    SAVITZ, D; SIDEL, V W; SOLOMON, A K

    1964-09-01

    The hematocrit method as a technique for determining red cell volume under anisotonic conditions has been reexamined and has been shown, with appropriate corrections for trapped plasma, to provide a true measure of cell volume. Cell volume changes in response to equilibration in anisotonic media were found to be much less than those predicted for an ideal osmometer; this anomalous behavior cannot be explained by solute leakage or by the changing osmotic coefficient of hemoglobin, but is quantitatively accounted for by the hypothesis that 20 per cent of intracellular water is bound to hemoglobin and is unavailable for participation in osmotic shifts.
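
    The bound-water hypothesis corresponds to a Boyle-van't Hoff relation with an osmotically inactive volume fraction b, so that only (1 - b) of the isotonic cell volume participates in osmotic shifts. A minimal sketch follows; the 0.4 inactive fraction is a placeholder standing in for hemoglobin solids plus the roughly 20 per cent of cell water described above as bound, not a figure from the paper.

```python
def cell_volume(rel_tonicity, v0=1.0, inactive_fraction=0.4):
    """Relative cell volume under the Boyle-van't Hoff relation with an
    osmotically inactive fraction b:  V = v0 * (b + (1 - b) / (pi/pi0)).
    rel_tonicity is pi/pi0, the medium osmolality relative to isotonic."""
    b = inactive_fraction
    return v0 * (b + (1.0 - b) / rel_tonicity)

print(cell_volume(1.0))  # isotonic: volume unchanged
print(cell_volume(0.5))  # half-tonic: swells less than the ideal-osmometer value of 2.0
```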

  14. [Sensitivity to change].

    PubMed

    Igl, W; Zwingmann, C; Faller, H

    2005-04-01

    In rehabilitation research, patient questionnaires are widely used for evaluative purposes, i.e., to measure improvements or deteriorations over time. This is only possible if the questionnaires applied appropriately reflect "true" change over time, i.e., they have to be sensitive to change. The aim of this paper is to point out the importance of the "sensitivity to change" concept for evaluative assessment tools and evaluative studies, respectively, considering quality of life research as an example. Various qualitative aspects, e.g., scaling of response options of assessment tools, are covered as well as quantitative methods, i.e., study designs and indices. Furthermore, recommendations for interpretation are given.

  15. The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.

    Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.

  16. The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation

    DOE PAGES

    Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.

    2017-11-27

    Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.
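
    The idea of a gradient-alignment correction can be sketched as follows. This is an illustrative reading of the approach, an average |cos(angle)| between an assumed true motion direction and the local image gradients over a subset, not the authors' exact formula:

```python
import numpy as np

# Illustrative reading of the correction idea (not the authors' exact formula):
# average |cos(angle)| between an assumed true motion vector and the local image
# gradients over a subset; a baseline uncertainty estimate would be scaled by it.
rng = np.random.default_rng(1)
image = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)  # smooth synthetic image
gy, gx = np.gradient(image)
u = np.array([1.0, 0.0])                      # assumed true motion (x-direction)

g = np.stack([gx.ravel(), gy.ravel()], axis=1)
gnorm = np.linalg.norm(g, axis=1)
mask = gnorm > 1e-12                          # ignore zero-gradient pixels
cos_theta = np.abs(g[mask] @ u) / (gnorm[mask] * np.linalg.norm(u))
correction = cos_theta.mean()                 # in (0, 1]; poor alignment -> small value
```

    Motion perpendicular to the dominant gradient direction yields a small correction factor, i.e., a larger uncertainty than image noise alone would suggest.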

  17. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.

  18. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature, and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
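
    One simple normalization of the kind reviewed, total-signal (sum) normalization, can be sketched on a hypothetical intensity matrix:

```python
import numpy as np

# Sketch of total-signal (sum) normalization, one of the methods such reviews
# discuss; hypothetical intensities, rows = samples, columns = metabolites.
intensities = np.array([
    [100.0, 50.0, 25.0],
    [200.0, 100.0, 50.0],   # same composition, twice the total sample amount
])
totals = intensities.sum(axis=1, keepdims=True)
normalized = intensities / totals             # each row now sums to 1
```

    After normalization the two samples are identical, so any remaining difference between samples reflects composition rather than total sample amount.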

  19. Confidence estimation for quantitative photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gröhl, Janek; Kirchner, Thomas; Maier-Hein, Lena

    2018-02-01

    Quantification of photoacoustic (PA) images is one of the major challenges currently being addressed in PA research. Tissue properties can be quantified by correcting the recorded PA signal with an estimation of the corresponding fluence. Fluence estimation itself, however, is an ill-posed inverse problem which usually needs simplifying assumptions to be solved with state-of-the-art methods. These simplifications, as well as noise and artifacts in PA images reduce the accuracy of quantitative PA imaging (PAI). This reduction in accuracy is often localized to image regions where the assumptions do not hold true. This impedes the reconstruction of functional parameters when averaging over entire regions of interest (ROI). Averaging over a subset of voxels with a high accuracy would lead to an improved estimation of such parameters. To achieve this, we propose a novel approach to the local estimation of confidence in quantitative reconstructions of PA images. It makes use of conditional probability densities to estimate confidence intervals alongside the actual quantification. It encapsulates an estimation of the errors introduced by fluence estimation as well as signal noise. We validate the approach using Monte Carlo generated data in combination with a recently introduced machine learning-based approach to quantitative PAI. Our experiments show at least a two-fold improvement in quantification accuracy when evaluating on voxels with high confidence instead of thresholding signal intensity.

  20. Characterizing microstructural features of biomedical samples by statistical analysis of Mueller matrix images

    NASA Astrophysics Data System (ADS)

    He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui

    2017-03-01

    As one of the salient features of light, polarization carries abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies, such as the detection of cancerous tissues. Previous works have found that the structural information encoded in the 2D Mueller matrix images can be presented by transformed parameters with a more explicit relationship to certain microstructural features. In this paper, we present a statistical analysis method to transform the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results for porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures, the depolarization power, and the diattenuation and absorption abilities. It is shown that the statistical analysis of 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
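
    The FDH-and-central-moments transformation can be sketched for a single, hypothetical Mueller matrix element image:

```python
import numpy as np

# Sketch: frequency distribution histogram (FDH) and central moments of one
# hypothetical Mueller matrix element image (values confined to [-1, 1]).
rng = np.random.default_rng(2)
m22 = np.clip(rng.normal(loc=0.6, scale=0.1, size=(128, 128)), -1.0, 1.0)

hist, edges = np.histogram(m22, bins=50, range=(-1.0, 1.0), density=True)
mean = m22.mean()
central = {k: np.mean((m22 - mean) ** k) for k in (2, 3, 4)}  # 2nd-4th central moments
```

    The histogram summarizes the per-pixel distribution of the element, while the low-order central moments (variance, skewness- and kurtosis-related quantities) serve as the compact quantitative descriptors.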

  1. Quantitative energy-filtered TEM imaging of interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bentley, J.; Kenik, E.A.; Siangchaew, K.

    Quantitative elemental mapping by inner shell core-loss energy-filtered transmission electron microscopy (TEM) with a Gatan Imaging Filter (GIF) interfaced to a Philips CM30 TEM operated with a LaB{sub 6} filament at 300 kV has been applied to interfaces in a range of materials. In sensitized type 304L stainless steel aged 15 h at 600{degrees}C, grain-boundary Cr depletion occurs between Cr-rich intergranular M{sub 23}C{sub 6} particles. Images of net Cr L{sub 23} intensity show segregation profiles that agree quantitatively with focused-probe spectrum-line measurements recorded with a Gatan PEELS on a Philips EM400T/FEG (0.8 nA in 2-nm-diam probe) of the same regions. Rare-earth oxide additives that are used for the liquid-phase sintering of Si{sub 3}N{sub 4} generate second phases of complex composition at grain boundaries and edges. These grain-boundary phases often control corrosion, crack growth, and creep damage behavior. High-resolution imaging has been widely applied, but analysis with focused probes can be compromised by beam damage; elemental mapping by EFTEM, in contrast, appears not to cause appreciable beam damage.

  2. Partial Least Squares and Neural Networks for Quantitative Calibration of Laser-induced Breakdown Spectroscopy (LIBs) of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, Richard V.; Clegg, S. M.; Humphries, S. D.; Wiens, R. C.; Bell, J. F., III; Mertzman, S. A.

    2010-01-01

    The ChemCam instrument [1] on the Mars Science Laboratory (MSL) rover will be used to obtain the chemical composition of surface targets within 7 m of the rover using Laser Induced Breakdown Spectroscopy (LIBS). ChemCam analyzes atomic emission spectra (240-800 nm) from a plasma created by a pulsed Nd:KGW 1067 nm laser. The LIBS spectra can be used in a semiquantitative way to rapidly classify targets (e.g., basalt, andesite, carbonate, sulfate, etc.) and in a quantitative way to estimate their major and minor element chemical compositions. Quantitative chemical analysis from LIBS spectra is complicated by a number of factors, including chemical matrix effects [2]. Recent work has shown promising results using multivariate techniques such as partial least squares (PLS) regression and artificial neural networks (ANN) to predict elemental abundances in samples [e.g. 2-6]. To develop, refine, and evaluate analysis schemes for LIBS spectra of geologic materials, we collected spectra of a diverse set of well-characterized natural geologic samples and are comparing the predictive abilities of PLS, cascade correlation ANN (CC-ANN) and multilayer perceptron ANN (MLP-ANN) analysis procedures.
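
    A minimal PLS sketch, a single-component PLS1 in plain NumPy on synthetic "spectra" (a real LIBS calibration would use several components, cross-validation, and a library implementation), illustrates the multivariate approach:

```python
import numpy as np

# Toy single-component PLS1 on synthetic spectra (illustrative only; not the
# ChemCam pipeline). X = spectra (samples x channels), y = element abundance.
rng = np.random.default_rng(3)
true_w = rng.normal(size=40)                  # hypothetical spectral signature
X = rng.normal(size=(30, 40))
y = X @ true_w + 0.01 * rng.normal(size=30)

Xc, yc = X - X.mean(axis=0), y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)                        # weights: direction of max covariance with y
t = Xc @ w                                    # scores (latent variable)
b = (t @ yc) / (t @ t)                        # regress y on the scores
y_hat = y.mean() + t * b                      # predictions for the training spectra
```

    Because PLS chooses latent directions that covary with the abundance rather than fitting every channel independently, it tolerates the highly collinear channels typical of emission spectra.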

  3. Verification of continuum drift kinetic equation solvers in NIMROD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Held, E. D.; Ji, J.-Y.; Kruger, S. E.

    Verification of continuum solutions to the electron and ion drift kinetic equations (DKEs) in NIMROD [C. R. Sovinec et al., J. Comp. Phys. 195, 355 (2004)] is demonstrated through comparison with several neoclassical transport codes, most notably NEO [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)]. The DKE solutions use NIMROD's spatial representation: 2D finite elements in the poloidal plane and a 1D Fourier expansion in toroidal angle. For 2D velocity space, a novel 1D expansion in finite elements is applied for the pitch-angle dependence and a collocation grid is used for the normalized speed coordinate. The full, linearized Coulomb collision operator is kept and shown to be important for obtaining quantitative results. Bootstrap currents, parallel ion flows, and radial particle and heat fluxes show quantitative agreement between NIMROD and NEO for a variety of tokamak equilibria. In addition, velocity space distribution function contours for ions and electrons show nearly identical detailed structure and agree quantitatively. A Θ-centered, implicit time discretization and a block-preconditioned, iterative linear algebra solver provide efficient electron and ion DKE solutions that ultimately will be used to obtain closures for NIMROD's evolving fluid model.

  4. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool, and recent methodological and technological developments have also made possible the extraction of quantitative data on protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable effort to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases, and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Geological impacts on nutrition

    USDA-ARS?s Scientific Manuscript database

    This chapter reviews the nutritional roles of mineral elements, as part of a volume on health implications of geology. The chapter addresses the absorption and post-absorptive utilization of the nutritionally essential minerals, including their physiological functions and quantitative requirements....

  6. Finite element analysis of a composite crash box subjected to low velocity impact

    NASA Astrophysics Data System (ADS)

    Shaik Dawood, M. S. I.; Ghazilan, A. L. Ahmad; Shah, Q. H.

    2017-03-01

    In this work, finite element analyses using LS-DYNA were carried out to investigate the energy absorption capability of a composite crash box. The analysed design incorporates grooves in the cross-sectional shape and E-Glass/Epoxy as the design material. The effects of groove depth, ridge lines, plane width, material properties, wall thickness and fibre orientation were quantitatively analysed and found to significantly enhance the energy absorption capability of the crash box.

  7. Multiscale Modeling for the Analysis of Grain-Scale Fracture Within Aluminum Microstructures

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Phillips, Dawn R.; Yamakov, Vesselin; Saether, Erik

    2005-01-01

    Multiscale modeling methods for the analysis of metallic microstructures are discussed. Both molecular dynamics and the finite element method are used to analyze crack propagation and stress distribution in a nanoscale aluminum bicrystal model subjected to hydrostatic loading. Quantitative similarity is observed between the results from the two very different analysis methods. A bilinear traction-displacement relationship that may be embedded into cohesive zone finite elements is extracted from the nanoscale molecular dynamics results.
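
    A bilinear traction-displacement relationship of the kind extracted here can be sketched with illustrative parameter values (not those of the paper):

```python
# Sketch of a bilinear traction-separation (cohesive zone) law of the kind
# extracted from the MD results; parameter values are illustrative, not the paper's.
def traction(delta, delta_peak=1.0, delta_fail=4.0, t_max=2.5):
    """Traction vs opening displacement: linear rise to t_max, then linear softening."""
    if delta <= 0.0:
        return 0.0                                    # no opening, no traction
    if delta <= delta_peak:
        return t_max * delta / delta_peak             # loading branch
    if delta <= delta_fail:
        return t_max * (delta_fail - delta) / (delta_fail - delta_peak)  # softening
    return 0.0                                        # fully separated
```

    A cohesive zone finite element evaluates such a law at each integration point along the crack path, with the area under the curve giving the fracture energy.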

  8. Methodological approaches to conducting pilot and proof tests on reverse-osmosis systems: Results of comparative studies

    NASA Astrophysics Data System (ADS)

    Panteleev, A. A.; Bobinkin, V. V.; Larionov, S. Yu.; Ryabchikov, B. E.; Smirnov, V. B.; Shapovalov, D. A.

    2017-10-01

    When designing large-scale water-treatment plants based on reverse-osmosis systems, it is proposed to conduct experimental-industrial or pilot tests for validated simulation of the operation of the equipment. It is shown that such tests allow establishing efficient operating conditions and characteristics of the plant under design. It is proposed to conduct pilot tests of the reverse-osmosis systems on pilot membrane plants (PMPs) and test membrane plants (TMPs). The results of a comparative experimental study of pilot and test membrane plants are exemplified by simulating the operating parameters of the membrane elements of an industrial plant. It is concluded that the reliability of the data obtained on the TMP may not be sufficient to design industrial water-treatment plants, while the PMPs are capable of providing reliable data that can be used for full-scale simulation of the operation of industrial reverse-osmosis systems. The test membrane plants allow simulation of the operating conditions of individual industrial plant systems; therefore, potential areas of their application are shown. A method for the numerical calculation and experimental determination of the true selectivity and the salt passage is proposed. An expression has been derived that describes the functional dependence between the observed and true salt passage. The results of experiments conducted on a test membrane plant to determine the true value of the salt passage of a reverse-osmosis membrane are exemplified by a magnesium sulfate solution at different initial operating parameters. It is shown that the initial content of a particular solution component has a significant effect on the change in the true salt passage of the membrane.
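
    The link between observed and true salt passage can be sketched with the standard film-model expression for concentration polarization; the paper derives its own expression, which may differ:

```python
import math

# Sketch of the standard film-model relation between true and observed salt
# passage (SP = 1 - rejection); not necessarily the expression derived in the
# paper. Concentration polarization raises the membrane-wall concentration by
# exp(Jv/k), where Jv is permeate flux and k the mass-transfer coefficient.
def observed_salt_passage(sp_true, flux, mass_transfer_coeff):
    beta = math.exp(flux / mass_transfer_coeff)   # polarization factor
    ratio = sp_true / (1.0 - sp_true) * beta
    return ratio / (1.0 + ratio)
```

    At zero flux the observed and true salt passage coincide; as flux grows relative to the mass-transfer coefficient, the observed salt passage exceeds the true value.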

  9. TRUE MASSES OF RADIAL-VELOCITY EXOPLANETS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Robert A., E-mail: rbrown@stsci.edu

    We study the task of estimating the true masses of known radial-velocity (RV) exoplanets by means of direct astrometry on coronagraphic images to measure the apparent separation between exoplanet and host star. Initially, we assume perfect knowledge of the RV orbital parameters and that all errors are due to photon statistics. We construct design reference missions (DRMs) for four missions currently under study at NASA: EXO-S and WFIRST-S, with external star shades for starlight suppression, and EXO-C and WFIRST-C, with internal coronagraphs. These DRMs reveal extreme scheduling constraints due to the combination of solar and anti-solar pointing restrictions, photometric and obscurational completeness, image blurring due to orbital motion, and the "nodal effect," which is the independence of apparent separation and inclination when the planet crosses the plane of the sky through the host star. Next, we address the issue of nonzero uncertainties in RV orbital parameters by investigating their impact on the observations of 21 single-planet systems. Except for two—GJ 676 A b and 16 Cyg B b, which are observable only by the star-shade missions—we find that current uncertainties in orbital parameters generally prevent accurate, unbiased estimation of true planetary mass. For the coronagraphs, WFIRST-C and EXO-C, the most likely number of good estimators of true mass is currently zero. For the star shades, EXO-S and WFIRST-S, the most likely numbers of good estimators are three and four, respectively, including GJ 676 A b and 16 Cyg B b. We expect that uncertain orbital elements currently undermine all potential programs of direct imaging and spectroscopy of RV exoplanets.

  10. Analytical applications of condensed phosphoric acid-IV Iodometric determination of sulphur in sulphate and sulphide ores and minerals and other compounds after reduction with sodium hypophosphite and tin metal in condensed phosphoric acid.

    PubMed

    Mizoguchi, T; Ishii, H

    1980-06-01

    Sulphate in sulphate ores, e.g., alunite, anglesite, barytes, chalcanthite, gypsum, manganese sulphate ore, is reduced to hydrogen sulphide by the hypophosphite-tin metal-CPA method, if a slight modification is made. Sulphide ores, e.g., galena and sphalerite, are quantitatively decomposed with CPA alone to give hydrogen sulphide. Suitable reducing agents must be used for the quantitative recovery of hydrogen sulphide from pyrite, nickel sulphide, cobalt sulphide and cadmium sulphide; otherwise elemental sulphur is liberated. Iodide must be used in the decomposition of chalcopyrite; the copper sulphide is too stable to be decomposed by CPA alone. Molybdenite is not decomposed in CPA even if reducing agents are added. The pretreatment methods for the determination of sulphur in sulphur oxyacids and elemental sulphur have also been investigated.

  11. Scattering matrix elements of biological particles measured in a flow through system: theory and practice.

    PubMed

    Sloot, P M; Hoekstra, A G; van der Liet, H; Figdor, C G

    1989-05-15

    Light scattering techniques (including depolarization experiments) applied to biological cells provide a fast, nondestructive probe that is very sensitive to small morphological differences. Until now, quantitative measurements of these scatter phenomena have only been described for particles in suspension. In this paper we discuss the symmetry conditions applicable to the scattering matrices of monodisperse biological cells in a flow cytometer and provide evidence that quantitative measurement of the elements of these scattering matrices is possible in flow-through systems. Two fundamental extensions to the theoretical description of conventional scattering experiments are introduced: large-cone integration of scattering signals and simultaneous implementation of the localization principle to account for scattering by a sharply focused laser beam. In addition, a specific calibration technique is proposed to account for depolarization effects of the highly specialized optics applied in flow-through equipment.

  12. The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions

    NASA Astrophysics Data System (ADS)

    Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.

    2018-04-01

    The gas cluster ion beam technique was used for smoothing the surface of a silicon carbide crystal. The effects of processing with two inert cluster ion species, argon and xenon, were quantitatively compared. While argon is a standard element for GCIB, results for xenon clusters had not previously been reported. Scanning probe microscopy and high-resolution transmission electron microscopy techniques were used for the analysis of the surface roughness and the quality of the surface crystal layer. The gas cluster ion beam processing results in surface relief smoothing down to an average roughness of about 1 nm for both elements. It was shown that xenon as the working gas is more effective: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High-resolution transmission electron microscopy analysis of the surface defect layer gives values of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.

  13. X-ray fluorescence analysis of K, Al and trace elements in chloroaluminate melts

    NASA Astrophysics Data System (ADS)

    Shibitko, A. O.; Abramov, A. V.; Denisov, E. I.; Lisienko, D. G.; Rebrin, O. I.; Bunkov, G. M.; Rychkov, V. N.

    2017-09-01

    Energy dispersive x-ray fluorescence spectrometry was applied to the quantitative determination of K, Al, Cr, Fe and Ni in chloroaluminate melts. To implement the external standard calibration method, an unconventional way of sample preparation was suggested. A mixture of metal chlorides was melted in a quartz cell at 350-450 °C under a slightly excessive pressure of purified argon (99.999%). The composition of the calibration samples (CSs) prepared was controlled by means of inductively coupled plasma atomic emission spectrometry (ICP-AES). The optimal conditions for analytical line excitation were determined, and the analyte calibration curves were obtained. Matrix effects in the synthesized samples had some influence on the analytical signals of certain elements. The CSs are to be stored in an inert gas atmosphere. The precision, accuracy, and reproducibility of the quantitative chemical analysis were computed.

  14. Environmental assessment of a wood-waste-fired industrial firetube boiler. Volume 2. Data supplement. Final report, January 1981-March 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeRosier, R.; Waterland, L.R.

    1987-03-01

    The report gives emission results from field tests of a wood-waste-fired industrial firetube boiler. Emission measurements included: continuous monitoring of flue-gas emissions; source assessment sampling system (SASS) sampling of the flue gas, with subsequent laboratory analysis of samples to give total flue-gas organics in two boiling-point ranges, compound category information within these ranges, specific quantitation of the semivolatile organic priority pollutants, and flue-gas concentrations of 65 trace elements; Method 5 sampling for particulates; controlled condensation system (CCS) sampling for SO/sub 2/ and SO/sub 3/; and grab sampling of boiler bottom ash for trace-element-content determinations. Emission levels of five polycyclic organic matter species and phenol were quantitated: except for naphthalene, all were emitted at less than 0.4 microgram/dscm.

  15. Piezoelectric sensors based on molecular imprinted polymers for detection of low molecular mass analytes.

    PubMed

    Uludağ, Yildiz; Piletsky, Sergey A; Turner, Anthony P F; Cooper, Matthew A

    2007-11-01

    Biomimetic recognition elements employed for the detection of analytes are commonly based on proteinaceous affibodies, immunoglobulins, single-chain and single-domain antibody fragments, or aptamers. The alternative supra-molecular approach using a molecularly imprinted polymer now has proven utility in numerous applications ranging from liquid chromatography to bioassays. Despite inherent advantages compared with biochemical/biological recognition (which include robustness, storage endurance and lower costs), there are few contributions that describe quantitative analytical applications of molecularly imprinted polymers for relevant small-molecular-mass compounds in real-world samples. There is, however, significant literature describing the use of low-power, portable piezoelectric transducers to detect analytes in environmental monitoring and other application areas. Here we review the combination of molecularly imprinted polymers as recognition elements with piezoelectric biosensors for the quantitative detection of small molecules. Analytes are classified by type and sample matrix presentation, and various molecularly imprinted polymer synthetic fabrication strategies are also reviewed.

  16. Interpretation of biological and mechanical variations between the Lowry versus Bradford method for protein quantification.

    PubMed

    Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li

    2010-07-01

    The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expressions are a result of true biological or methodical variations. MATERIAL & METHODS: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that the methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.

  17. Perfusion in Rat Brain at 7 T with Arterial Spin Labeling Using FAIR-TrueFISP and QUIPSS

    PubMed Central

    Esparza-Coss, Emilio; Wosik, Jarek; Narayana, Ponnada A.

    2010-01-01

    Measurement of perfusion in longitudinal studies allows for the assessment of tissue integrity and the detection of subtle pathologies. In this work, the feasibility of measuring brain perfusion in rats with high spatial resolution using arterial spin labeling (ASL) is reported. A flow sensitive alternating recovery (FAIR) sequence, coupled with a balanced gradient fast imaging with steady state precession (TrueFISP) readout section was used to minimize ghosting and geometric distortions, while achieving high SNR. The quantitative imaging of perfusion using a single subtraction (QUIPSS) method was implemented to address the effects of variable transit delays between the labeling of spins and their arrival at the imaging slice. Studies in six rats at 7 T showed good perfusion contrast with minimal geometric distortion. The measured blood flow values of 152.5 ± 6.3 ml/100g/min in gray matter and 72.3 ± 14.0 ml/100g/min in white matter are in good agreement with previously reported values based on autoradiography, considered to be the gold standard. PMID:20299174

  18. Apparent directional selection by biased pleiotropic mutation.

    PubMed

    Tanaka, Yoshinari

    2010-07-01

    Pleiotropic effects of deleterious mutations are considered to be among the factors responsible for genetic constraints on evolution by long-term directional selection acting on a quantitative trait. If pleiotropic phenotypic effects are biased in a particular direction, mutations generate apparent directional selection, which refers to the covariance between fitness and the trait owing to a linear association between the number of mutations possessed by individuals and the genotypic values of the trait. The present analysis has shown how the equilibrium mean value of the trait is determined by a balance between directional selection and biased pleiotropic mutations. Assuming that genes act additively both on the trait and on fitness, the total variance-standardized directional selection gradient was decomposed into apparent and true components. Experimental data on mutation bias from the bristle traits of Drosophila and life history traits of Daphnia suggest that apparent selection explains a small but significant fraction of directional selection pressure that is observed in nature; the data suggest that changes induced in a trait by biased pleiotropic mutation (i.e., by apparent directional selection) are easily compensated for by (true) directional selection.

  19. Identification of residual leukemic cells by flow cytometry in childhood B-cell precursor acute lymphoblastic leukemia: verification of leukemic state by flow-sorting and molecular/cytogenetic methods.

    PubMed

    Øbro, Nina F; Ryder, Lars P; Madsen, Hans O; Andersen, Mette K; Lausen, Birgitte; Hasle, Henrik; Schmiegelow, Kjeld; Marquart, Hanne V

    2012-01-01

    Reduction in minimal residual disease, measured by real-time quantitative PCR or flow cytometry, predicts prognosis in childhood B-cell precursor acute lymphoblastic leukemia. We explored whether cells reported as minimal residual disease by flow cytometry represent the malignant clone harboring clone-specific genomic markers (53 follow-up bone marrow samples from 28 children with B-cell precursor acute lymphoblastic leukemia). Cell populations (presumed leukemic and non-leukemic) were flow-sorted during standard flow cytometry-based minimal residual disease monitoring and explored by PCR and/or fluorescence in situ hybridization. We found good concordance between flow cytometry and genomic analyses in the individual flow-sorted leukemic (93% true positive) and normal (93% true negative) cell populations. Four cases with discrepant results had plausible explanations (e.g. partly informative immunophenotype and antigen modulation) that highlight important methodological pitfalls. These findings demonstrate that with sufficient experience, flow cytometry is reliable for minimal residual disease monitoring in B-cell precursor acute lymphoblastic leukemia, although rare cases require supplementary PCR-based monitoring.

  20. Modal Analysis of an Aircraft Fuselage Panel Using Experimental and Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A.; Buehrle, Ralph D.; Storaasli, Olaf L.

    1998-01-01

    The application of Electro-Optic Holography (EOH) for measuring the center-bay vibration modes of an aircraft fuselage panel under forced excitation is presented. The requirement of free-free panel boundary conditions made the acquisition of quantitative EOH data challenging, since large-scale rigid-body motions corrupted measurements of the high-frequency vibrations of interest. Image processing routines designed to minimize the effects of large-scale motions were applied to successfully recover quantitative EOH vibrational amplitude measurements.

  1. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    NASA Astrophysics Data System (ADS)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray Fluorescence (TXRF) spectrometry-based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids from old building materials and their degradation products, sampled from an early-20th-century building of high historic and cultural value in Getxo (Basque Country, northern Spain). This quantification strategy is faster than traditional Energy- or Wavelength-Dispersive X-ray fluorescence (ED-XRF and WD-XRF) methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. To avoid the acid-extraction step, direct TXRF measurement of powdered solids suspended in water was also studied. To this end, parameters such as the deposition volume and the measuring time were optimized for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF methodologies were around 0.01-1.2 mg/L for liquid extracts and 2-200 mg/L for solids. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to more classic quantification techniques (i.e. ICP-MS), accurate enough to characterize the acid-soluble fraction of the materials and their degradation products. For the solid samples measured as suspensions, it was difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on quantitative ED-XRF results were calculated to improve the accuracy of the TXRF results.
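TXRF quantification of liquid samples conventionally relies on an added internal standard: the concentration of each analyte follows from its count ratio to the standard, scaled by the relative elemental sensitivities. A minimal sketch of that standard relation (the function name and arguments are illustrative, not taken from this paper):

```python
def txrf_concentration(net_counts, counts_is, conc_is, sens, sens_is):
    """Internal-standard TXRF quantification (illustrative).

    C_i = (N_i / N_IS) * (S_IS / S_i) * C_IS, where N are net peak
    counts, S relative sensitivities, and IS the internal standard.
    """
    return net_counts / counts_is * (sens_is / sens) * conc_is
```

For example, an analyte with twice the counts of the internal standard, equal sensitivity, and a 5 mg/L standard spike comes out at 10 mg/L.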

  2. Development and validation of an ICP-OES method for quantitation of elemental impurities in tablets according to coming US pharmacopeia chapters.

    PubMed

    Støving, Celina; Jensen, Henrik; Gammelgaard, Bente; Stürup, Stefan

    2013-10-01

    On May 1, 2014, the United States Pharmacopeia (USP) will implement two new chapters stating limit concentrations of elemental impurities in pharmaceuticals, applying inductively coupled plasma methods. In the present work, an inductively coupled plasma optical emission spectrometry (ICP-OES) method for quantitation of As, Cd, Cu, Cr, Fe, Hg, Ir, Mn, Mo, Ni, Os, Pb, Pd, Pt, Rh, Ru, V and Zn in tablets according to the new USP chapters was developed. Sample preparation was performed by microwave-assisted acid digestion using a mixture of 65% HNO3 and 37% HCl (3:1, v/v). Limits of detection and quantitation were at least a factor of ten below the USP limit concentrations, showing that the ICP-OES technique is well suited for quantitation of elemental impurities. Excluding Os, spike recoveries in the range of 85.3-103.8% were obtained, with relative standard deviations (%RSD) ranging from 1.3 to 3.2%. Due to memory effects, the spike recovery and %RSD of Os were 161.5% and 13.7%, respectively; the method therefore needs further development to eliminate the Os memory effect. The method was proven to be specific, but with potential spectral interference for Ir, Os, Pb, Pt and Rh, necessitating visual examination of the spectra. The Hg memory effect was handled by using lower spike levels combined with rinsing with 0.1 M HCl. The tablets had Fe and Pt contents of 182.8 ± 18.1 and 2.8 ± 0.2 μg/g, respectively, and therefore did not exceed the limit concentrations defined by the USP. The developed method should be applicable to pharmaceutical products with a composition and maximum daily intake (g drug product/day) similar to the tablets used in this work. Copyright © 2013 Elsevier B.V. All rights reserved.
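The spike-recovery and %RSD figures quoted above follow from the standard definitions used in method validation. A small sketch of that arithmetic (the helper names are hypothetical):

```python
import statistics

def spike_recovery_pct(measured_spiked, measured_unspiked, spike_added):
    """Percent recovery of a known spike from spiked/unspiked results."""
    return (measured_spiked - measured_unspiked) / spike_added * 100.0

def rsd_pct(replicates):
    """Relative standard deviation of replicate measurements, in %."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0
```

A recovery near 100% with a low %RSD, as reported for most elements here, indicates the digestion and measurement neither lose nor accumulate the analyte; the 161.5% Os value signals carry-over rather than a valid result.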

  3. QUANTITATIVE DETERMINATION OF THE URANIUM CONTENT OF URANIUM ORES TECHNOLOGICAL PRODUCTS BY ION EXCHANGE-COMPLEXON SEPARATION (in Hungarian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fodor, M.

    An ion exchange-complexon separation method was developed for the removal of interfering elements in the determination of the uranium content of recovery solutions. By adding ethylenediaminetetraacetic acid to the solution, most of the interfering elements can be brought into an anionic complex. Adjusting the solution to pH 7 and passing it through an Amberlite IRC-50 type cation exchanger in the hydrogen form, the uranium remains on the column whereas the interfering elements pass into the effluent. The method was successfully applied in analyzing the recovery solutions of uranium ores. (auth)

  4. Closing in on Chemical Bonds by Opening up Relativity Theory

    PubMed Central

    Whitney, Cynthia Kolb

    2008-01-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein’s special relativity theory. PMID:19325749

  5. Characterizing the Constitutive Properties of AA7075 for Hot Forming

    NASA Astrophysics Data System (ADS)

    Omer, K.; Kim, S.; Butcher, C.; Worswick, M.

    2017-09-01

    The work presented herein investigates the constitutive properties of AA7075 as it undergoes a hot stamping/die quenching process. Tensile specimens were solutionized inside a heated furnace set to 470°C. Once solutionized, the samples were quenched to an intermediate temperature using a vortex air chiller at a minimum rate of 52°C/s. Tensile tests were conducted at steady-state temperatures of 470, 400, 300, 200, 115 and 25°C. This solutionizing and subsequent quenching process replicated the temperature cycle and quench rates representative of a die quenching operation. The results of the tensile tests were analyzed with digital image correlation using an area-reduction approach, which approximated the cross-sectional area of the tensile specimen as it necked and thereby allowed the true stress-strain response to be calculated well past the initial necking point. The resulting true stress-strain curves showed that the AA7075 samples experienced almost no hardening at 470°C. As the steady-state temperature decreased, the rate of hardening as well as the overall material strength increased. The true stress-strain curves were fit to a modified version of the extended Voce constitutive model. The resulting fits can be used in a finite element model to predict the behaviour of an AA7075 blank during a die quenching operation.
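The area-reduction approach computes true stress directly from the current cross-section and true strain from the area ratio, using the usual volume-constancy assumption for plastic deformation. A minimal sketch of that conversion (names and units are illustrative, not the authors' implementation):

```python
import math

def true_stress_strain(force_n, area_mm2, area0_mm2):
    """True stress (MPa) and true strain from the current cross-sectional
    area of a necking tensile specimen (illustrative sketch).

    Assumes plastic incompressibility, so eps_true = ln(A0 / A).
    """
    stress = force_n / area_mm2               # N/mm^2 == MPa
    strain = math.log(area0_mm2 / area_mm2)   # volume constancy assumed
    return stress, strain
```

Because the area is measured rather than inferred from elongation, the pair remains valid after diffuse necking, which is why the curves here extend well past the uniform-elongation point.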

  6. Quantitative landslide risk assessment and mapping on the basis of recent occurrences

    NASA Astrophysics Data System (ADS)

    Remondo, Juan; Bonachea, Jaime; Cendrero, Antonio

    A quantitative procedure for mapping landslide risk is developed from considerations of hazard, vulnerability and valuation of exposed elements. The approach, based on former work by the authors, is applied in the Bajo Deba area (northern Spain), where a detailed study of landslide occurrence and damage in the recent past (last 50 years) was carried out. Analyses and mapping are implemented in a Geographic Information System (GIS). The method is based on a susceptibility model developed previously from statistical relationships between past landslides and terrain parameters related to instability. Extrapolations based on past landslide behaviour were used to calculate failure frequency for the next 50 years. A detailed inventory of direct damage due to landslides during the study period was carried out, and the main elements at risk in the area were identified and mapped. Past direct (monetary) losses per type of element were estimated and expressed as an average 'specific loss' for events of a given magnitude (corresponding to a specified scenario). Vulnerability was assessed by comparing losses with the actual value of the elements affected and expressed as a fraction of that value (0-1). From hazard, vulnerability and monetary value, risk was computed for each element considered. Direct risk maps (€/pixel/year) were obtained, and indirect losses from the disruption of economic activities due to landslides were assessed. The final result is a risk map and table combining all losses per pixel for a 50-year period. The total monetary value at risk for the Bajo Deba area in the next 50 years is about 2.4 million Euros.
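The per-pixel risk computation described above is the classic product of hazard (expected event frequency), vulnerability (0-1 loss fraction) and monetary value of the exposed element, aggregated over the 50-year horizon. A minimal sketch under those assumptions (function names and units are illustrative):

```python
def pixel_risk(hazard_per_year, vulnerability, value_eur):
    """Specific annual risk (EUR/pixel/year) = H * V * E (illustrative)."""
    return hazard_per_year * vulnerability * value_eur

def total_risk(pixels, years=50):
    """Aggregate risk over a time horizon for (H, V, E) tuples per pixel."""
    return years * sum(pixel_risk(*p) for p in pixels)
```

In a GIS setting each tuple would come from the hazard, vulnerability, and value raster layers; summing the resulting raster reproduces the kind of area-total figure reported for Bajo Deba.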

  7. Dual embryonic origin of the hyobranchial apparatus in the Mexican axolotl (Ambystoma mexicanum).

    PubMed

    Davidian, Asya; Malashichev, Yegor

    2013-01-01

    Traditionally, the cartilaginous viscerocranium of vertebrates is considered to be neural crest (NC)-derived. Morphological work carried out on amphibian embryos in the first half of the 20th century suggested a potentially mesodermal origin for some hyobranchial elements. Since then, the embryonic sources of the hyobranchial apparatus in amphibians have not been investigated, due to the lack of an appropriate long-term labelling system. We performed homotopic transplantations of neural folds along with the majority of cells of the presumptive NC, and/or fragments of the head lateral plate mesoderm (LPM), from transgenic GFP+ into white embryos. In these experiments, the NC-derived GFP+ cells contributed to all hyobranchial elements except for basibranchial 2, whereas the grafting of GFP+ head mesoderm led to the reverse labelling result. The grafting of only the most ventral part of the head LPM resulted in marking of basibranchial 2 and the heart myocardium, implying their origin from a common mesodermal region. This is the first evidence of a contribution of the head LPM to cranial elements in any vertebrate. Compared to fish, birds, and mammals, in which all branchial skeletal elements are NC-derived, the axolotl (and probably all amphibians) demonstrates an evolutionary deviation in which the head LPM replaces NC cells in a hyobranchial element. This implies that cells of different embryonic origin may have the same developmental program, leading to the formation of identical (homologous) elements of the skeleton.

  8. A New Material Mapping Procedure for Quantitative Computed Tomography-Based, Continuum Finite Element Analyses of the Vertebra

    PubMed Central

    Unnikrishnan, Ginu U.; Morgan, Elise F.

    2011-01-01

    Inaccuracies in the estimation of material properties and errors in the assignment of these properties into finite element models limit the reliability, accuracy, and precision of quantitative computed tomography (QCT)-based finite element analyses of the vertebra. In this work, a new mesh-independent, material mapping procedure was developed to improve the quality of predictions of vertebral mechanical behavior from QCT-based finite element models. In this procedure, an intermediate step, called the material block model, was introduced to determine the distribution of material properties based on bone mineral density, and these properties were then mapped onto the finite element mesh. A sensitivity study was first conducted on a calibration phantom to understand the influence of the size of the material blocks on the computed bone mineral density. It was observed that varying the material block size produced only marginal changes in the predictions of mineral density. Finite element (FE) analyses were then conducted on a square column-shaped region of the vertebra and also on the entire vertebra in order to study the effect of material block size on the FE-derived outcomes. The predicted values of stiffness for the column and the vertebra decreased with decreasing block size. When these results were compared to those of a mesh convergence analysis, it was found that the influence of element size on vertebral stiffness was less than that of the material block size. This mapping procedure allows the material properties in a finite element study to be determined based on the block size required for an accurate representation of the material field, while the size of the finite elements can be selected independently and based on the required numerical accuracy of the finite element solution. 
The mesh-independent, material mapping procedure developed in this study could be particularly helpful in improving the accuracy of finite element analyses of vertebroplasty and spine metastases, as these analyses typically require mesh refinement at the interfaces between distinct materials. Moreover, the mapping procedure is not specific to the vertebra and could thus be applied to many other anatomic sites. PMID:21823740
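Material-mapping pipelines of this kind typically convert block-averaged bone mineral density to an elastic modulus through an empirical power law before assigning the result to finite elements. The sketch below shows that two-step idea; the power-law coefficients are generic literature-style placeholders, not the calibration used in this study.

```python
def modulus_from_density(rho_g_cm3, a=8920.0, b=1.83):
    """Power-law density-to-modulus conversion E = a * rho^b (MPa).

    Coefficients a, b are illustrative assumptions, not the paper's values.
    """
    return a * rho_g_cm3 ** b

def element_modulus(block_densities):
    """Map the material-block densities overlapping an element onto that
    element by averaging, then convert to a modulus (illustrative)."""
    rho = sum(block_densities) / len(block_densities)
    return modulus_from_density(rho)
```

The separation mirrors the paper's point: the block size controls how faithfully the density field is represented, while the element size can then be chosen purely for numerical accuracy.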

  9. Schnellverfahren zur flammenlosen AAS-Bestimmung von Spurenelementen in geologischen Proben

    NASA Astrophysics Data System (ADS)

    Schrön, W.; Bombach, G.; Beuge, P.

    This paper reports experience with direct quantitative trace element determinations in powdered geological samples by flameless atomic absorption spectroscopy. Two methods were explored. The first is based on the production of a sample aerosol by laser radiation in a specifically designed sample chamber and the subsequent transport of the aerosol into a graphite tube preheated to a stable temperature. This technique is suited to a large range of concentrations and is relatively free from matrix interferences. It was tested for the elements Ag, As, Bi, Cd, Co, Mn, Ni, Pb, Sb, Se, Sr and Tl. The described sample chamber can also be used in combination with other spectroscopic techniques. The second method permits the quantitative determination of trace elements at very low concentrations. Essentially, an accurately weighed amount of sample is placed on a graphite rod and introduced into a graphite furnace by inserting the rod through the sample injection port. Atomization also takes place under stable temperature conditions. Using this technique, detection limits were found to be 10⁻¹¹ g for Ag, 2 × 10⁻¹¹ g for Cd and 10⁻¹⁰ g for Sb in silicate materials.
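Detection limits of this kind are conventionally derived from the scatter of blank measurements, most often the IUPAC-style criterion of three blank standard deviations divided by the calibration slope. A small sketch of that calculation, offered as an assumption about the criterion rather than the authors' stated procedure:

```python
import statistics

def detection_limit(blank_signals, sensitivity):
    """IUPAC-style limit of detection: LOD = 3 * s_blank / slope.

    blank_signals : replicate blank readings
    sensitivity   : calibration slope (signal per unit mass)
    """
    return 3.0 * statistics.stdev(blank_signals) / sensitivity
```

A steeper calibration slope or a quieter blank pushes the limit down, which is why the stabilized-temperature furnace conditions described above help reach the 10⁻¹¹ g range.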

  10. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). PIXE results are usually normalized to the sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples, based on dry mass measurement performed by means of STIM. STIM cell mass measurements were validated by comparison with AFM sample thickness measurements. The results indicate the reliability of STIM mass measurement performed on biological samples and suggest that STIM should be performed for PIXE normalization. Further information can be obtained from the direct comparison of AFM and STIM analyses, such as in situ measurement of specific gravity within cell compartments (nucleolus and cytoplasm).
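The normalization proposed here amounts to dividing the PIXE-derived elemental areal density by the STIM-measured dry-mass thickness to obtain a mass fraction. A minimal sketch with illustrative units (the function name and unit choices are assumptions for the example):

```python
def pixe_concentration_ppm(elemental_areal_ng_cm2, stim_mass_ug_cm2):
    """Mass fraction (ppm, dry weight) from a PIXE elemental areal density
    normalized by the STIM dry-mass thickness (illustrative sketch)."""
    total_ng_cm2 = stim_mass_ug_cm2 * 1000.0  # convert ug/cm^2 to ng/cm^2
    return elemental_areal_ng_cm2 / total_ng_cm2 * 1e6
```

Using STIM rather than RBS for the denominator avoids the beam-induced mass loss described above, since STIM can be acquired at much lower dose.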

  11. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectrometry (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron, together with their isotopes. Its high sensitivity detects impurities at parts-per-billion (ppb) levels. X-ray photoelectron spectroscopy (XPS) can determine oxidation states of elements, as well as identify polymers and measure film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger microscopy (SAM) combines surface sensitivity, lateral spatial resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near- and below-surface regions down to the chemical state of an atom.

  12. Elemental Analysis of Bone, Teeth, Horn and Antler in Different Animal Species Using Non-Invasive Handheld X-Ray Fluorescence.

    PubMed

    Buddhachat, Kittisak; Klinhom, Sarisa; Siengdee, Puntita; Brown, Janine L; Nomsiri, Raksiri; Kaewmong, Patcharaporn; Thitaram, Chatchote; Mahakkanukrauh, Pasuk; Nganvongpanit, Korakot

    2016-01-01

    Mineralized tissues accumulate elements that play crucial roles in animal health. Although elemental content of bone, blood and teeth of human and some animal species have been characterized, data for many others are lacking, as well as species comparisons. Here we describe the distribution of elements in horn (Bovidae), antler (Cervidae), teeth and bone (humerus) across a number of species determined by handheld X-ray fluorescence (XRF) to better understand differences and potential biological relevance. A difference in elemental profiles between horns and antlers was observed, possibly due to the outer layer of horns being comprised of keratin, whereas antlers are true bone. Species differences in tissue elemental content may be intrinsic, but also related to feeding habits that contribute to mineral accumulation, particularly for toxic heavy metals. One significant finding was a higher level of iron (Fe) in the humerus bone of elephants compared to other species. This may be an adaptation of the hematopoietic system by distributing Fe throughout the bone rather than the marrow, as elephant humerus lacks a marrow cavity. We also conducted discriminant analysis and found XRF was capable of distinguishing samples from different species, with humerus bone being the best source for species discrimination. For example, we found a 79.2% correct prediction and success rate of 80% for classification between human and non-human humerus bone. These findings show that handheld XRF can serve as an effective tool for the biological study of elemental composition in mineralized tissue samples and may have a forensic application.
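The species-classification step above used discriminant analysis on the measured elemental profiles. A nearest-centroid classifier is a simplified stand-in for that idea, shown here only to illustrate how elemental vectors separate classes; it is not the authors' exact statistical model, and the feature values are invented.

```python
def centroid(rows):
    """Mean vector of a list of equal-length feature rows."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def classify(sample, class_data):
    """Assign a sample to the class with the nearest centroid.

    class_data maps a label (e.g. species) to its training rows of
    elemental features; a stand-in for full discriminant analysis.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {label: centroid(rows) for label, rows in class_data.items()}
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))
```

A proper linear discriminant analysis additionally weights features by within-class covariance, which matters when elements differ greatly in variance, as heavy-metal levels do across feeding habits.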

  14. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  15. Riddles about the origin and thermal history of the moon

    NASA Technical Reports Server (NTRS)

    Levin, B. Y.; Mayeva, S. V.

    1977-01-01

    Magmatic differentiation of the moon's interior, confirmed through calculations of thermal history, was studied. It appears that differentiation was a result of the moon's initial temperature, whose origin remains unknown. In addressing this problem, convective models of the moon were considered, as well as a two-layered differentiated model of the moon operative over the past 3.5 billion years. The high content of long-lived radioactive elements present was investigated to explain the moon's current thermal properties. The controversy concerning the true nature of magmatic differentiation remains unresolved.

  16. Habitability in Advanced Space Mission Design. Part 2; Evaluation of Habitation Elements

    NASA Technical Reports Server (NTRS)

    Adams, Constance M.; McCurdy, Matthew R.

    2000-01-01

    Habitability is a fundamental component of any long-duration human habitat. Due to the pressures on the crew and the criticality of their performance, this is particularly true of habitats or vehicles proposed for use in any human space mission of duration over 30 days. This paper, the second of three on this subject, will focus on evaluating all the vehicles currently under consideration for the Mars Design Reference Mission through application of metrics for habitability (proposed in a previous paper, see references Adams/McCurdy 1999).

  17. The Voice of the Turtle is Heard Programs to Develop Military Writers in the Field of Strategy

    DTIC Science & Technology

    1966-04-08

    "The Voice of the Turtle is Heard": Programs to Develop Military Writers in the Field of Strategy, by Lt Col ... USAWC Research Element (Research Paper), 8 April 1966. ... extensively their own "original sources" of information. Such information as published is often nebulous, however, and as often fanciful as it is true

  18. Creative Strategic Intelligence Analysis and Decision Making Within the Elements of National Power. Proteus Futures Workshop, held Carlise, PA on 14-16 Aug 2007

    DTIC Science & Technology

    2007-08-01

    Professor Ayers suggests the facts have been present since 1979 and the initiation of Khomeini’s objectives: Islamic Rule over Iran, expanding Islamic Rule ...Dajjal (Anti-Christ), establish Islam as the global religion, and rule the world for seven to nine years. True believers are to seek out martyrdom for...without the need for new rules and constructs. He uses the same rules and analysis that he applies to physical combat and pleads that we should stop

  19. Eliminating Major Gaps in DoD Data on the Fully-Burdened and Life-Cycle Cost of Military Personnel: Cost Elements Should be Mandated by Policy

    DTIC Science & Technology

    2013-01-07

    Budgetary Assessment as well as private sector companies . Reserve Forces Policy Board Eliminating Major Gaps in DoD Data on the Fully-Burdened and Life...and utilities costs associated with the housing, childcare and recreation facilities found on major bases. This is true whether the reservist is...Notably, the current Under Secretary of Defense Comptroller, the Honorable Robert Hale has said, “the cost of pay and benefits has risen more than 87

  20. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was insured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.
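Restricting the identification to physical parameters can be illustrated in the simplest one-parameter case: element stiffness scales linearly with a material modulus, so modal frequencies scale with its square root, and a measured/predicted frequency ratio yields an updated modulus estimate. This is a toy sketch of that idea under those assumptions, not the paper's full formulation with nonproportional damping:

```python
def identify_modulus(measured_freqs_hz, predicted_freqs_hz, e_initial):
    """One-parameter stiffness identification (illustrative sketch).

    If global stiffness scales linearly with modulus E, each natural
    frequency scales with sqrt(E), so E_new = E_old * mean((f_m/f_p)^2).
    """
    ratios = [(fm / fp) ** 2
              for fm, fp in zip(measured_freqs_hz, predicted_freqs_hz)]
    return e_initial * sum(ratios) / len(ratios)
```

Averaging over several modes is the least-squares flavour of the update; the full method in the paper generalizes this to many designated stiffness and damping parameters simultaneously.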

  1. Three-dimensional finite element modeling of a maxillary premolar tooth based on the micro-CT scanning: a detailed description.

    PubMed

    Huang, Zheng; Chen, Zhi

    2013-10-01

    This study describes the details of how to construct a three-dimensional (3D) finite element model of a maxillary first premolar tooth based on a micro-CT data acquisition technique, MIMICS software and ANSYS software. The tooth was scanned by micro-CT, yielding 1295 slices, of which 648 were selected for modeling. The 3D surface mesh models of enamel and dentin were created in MIMICS (STL file). The solid mesh model was constructed in ANSYS. After the material properties and boundary conditions were set, a loading analysis was performed to demonstrate the applicability of the resulting model. The first and third principal stresses were then evaluated. The results showed that the finite element model contained 56 618 nodes and 311 801 elements. The geometric form of the model was highly consistent with that of the true tooth, with a deviation of -0.28% between them. The loading analysis revealed the typical stress patterns in the contour map: the maximum compressive stress occurred at the contact points and the maximum tensile stress in the deep fissure between the two cusps. It is concluded that, by using micro-CT and highly integrated software, construction of a high-quality 3D finite element model will not be difficult for clinical researchers.

  2. Reappraising Accretion to Vesta and the Angrite Parent Body Through Mineral-Scale Platinum Group Element and Os-Isotope Analyses

    NASA Astrophysics Data System (ADS)

    Riches, A. J. V.; Burton, K. W.; Nowell, G. M.; Dale, C. W.; Ottley, C. J.

    2016-08-01

    New methods presented here enable quantitative determination of mineral-scale PGE abundances and Os-isotope compositions in meteorite materials, thereby providing valuable new insight into planetary evolution.

  3. Czochralski crystal growth: Modeling study

    NASA Technical Reports Server (NTRS)

    Dudukovic, M. P.; Ramachandran, P. A.; Srivastava, R. K.; Dorsey, D.

    1986-01-01

    The modeling study of Czochralski (Cz) crystal growth is reported. The approach was to relate, in a quantitative manner and using models based on first principles, crystal quality to operating conditions and geometric variables. The finite element method is used for all calculations.

  4. Inland surface waters

    EPA Science Inventory

    A critical load is a “quantitative estimate of the exposure to one or more pollutants below which significant harmful effects on specified sensitive elements of the environment do not occur according to present knowledge”. Critical loads can be either modeled, or calculated empi...

  5. Homology of the Fifth Epibranchial and Accessory Elements of the Ceratobranchials among Gnathostomes: Insights from the Development of Ostariophysans

    PubMed Central

    Carvalho, Murilo; Bockmann, Flávio Alicino; de Carvalho, Marcelo Rodrigues

    2013-01-01

    Epibranchials are among the main dorsal elements of the gill basket in jawed vertebrates (Gnathostomata). Among extant fishes, chondrichthyans most resemble the putative ancestral condition as all branchial arches possess every serially homologous piece. In osteichthyans, a primitive rod-like epibranchial 5, articulated to ceratobranchial 5, is absent. Instead, epibranchial 5 of many actinopterygians is here identified as an accessory element attached to ceratobranchial 4. Differences in shape and attachment of epibranchial 5 in chondrichthyans and actinopterygians raised suspicions about their homology, prompting us to conduct a detailed study of the morphology and development of the branchial basket of three ostariophysans (Prochilodus argenteus, Characiformes; Lophiosilurus alexandri and Pseudoplatystoma corruscans, Siluriformes). Results were interpreted within a phylogenetic context of major gnathostome lineages. Developmental series strongly suggest that the so-called epibranchial 5 of actinopterygians does not belong to the epal series because it shares the same chondroblastic layer with ceratobranchial 4 and its ontogenetic emergence is considerably late. This neomorphic structure is called accessory element of ceratobranchial 4. Its distribution among gnathostomes indicates it is a teleost synapomorphy, occurring homoplastically in Polypteriformes, whereas the loss of the true epibranchial 5 is an osteichthyan synapomorphy. The origin of the accessory element of ceratobranchial 4 appears to have occurred twice in osteichthyans, but it may have a single origin; in this case, the accessory element of ceratobranchial 4 would represent a remnant of a series of elements distally attached to ceratobranchials 1–4, a condition totally or partially retained in basal actinopterygians. 
Situations wherein a structure is lost while a similar neomorphic element is present may lead to erroneous homology assessments; these can be avoided by detailed morphological and ontogenetic investigations interpreted in the light of well-supported phylogenetic hypotheses. PMID:23638061

  6. The Evolution of Mobile DNAs: When Will Transposons Create Phylogenies That Look As If There Is a Master Gene?

    PubMed Central

    Brookfield, John F. Y.; Johnson, Louise J.

    2006-01-01

    Some families of mammalian interspersed repetitive DNA, such as the Alu SINE sequence, appear to have evolved by the serial replacement of one active sequence with another, consistent with there being a single source of transposition: the “master gene.” Alternative models, in which multiple source sequences are simultaneously active, have been called “transposon models.” Transposon models differ in the proportion of elements that are active and in whether inactivation occurs at the moment of transposition or later. Here we examine the predictions of various types of transposon model regarding the patterns of sequence variation expected at an equilibrium between transposition, inactivation, and deletion. Under the master gene model, all bifurcations in the true tree of elements occur in a single lineage. We show that this property will also hold approximately for transposon models in which most elements are inactive and where at least some of the inactivation events occur after transposition. Such tree shapes are therefore not conclusive evidence for a single source of transposition. PMID:16790583

  7. Disrupted narrative and narrative symbol.

    PubMed

    Vuletić, Georgije

    2018-02-01

    In this article a specific type of narrative, which often appears in analytic sessions, is discussed. It is characterized by a seemingly ordinary, everyday topic and by a peculiar disruption of the narrative flow. The threefold structure of this type of narrative is described, along with its main characteristics. One element of this type of narrative is very similar to symbolic content or complex symbolic structures, e.g. dreams, the sort of material that can be used for the purpose of interpretation. The similarities as well as the differences are elaborated in the article. Thanks to the observed general structure and 'symbolic' nature of some parts of the narrative, it is easy to notice some of the unconscious elements, which are not familiar to the patient's ego, and to make an interpretation. Because these elements are close to the threshold of consciousness, the patient willingly accepts an interpretation based on them. This is especially true for patients whose dominant function is thinking. A temporary, working name for this type of narrative is proposed in the article: 'disrupted narrative', and for its disruptive part, 'narrative symbol'. © 2018, The Society of Analytical Psychology.

  8. Extra-Mediterranean refugia: The rule and not the exception?

    PubMed Central

    2012-01-01

    Some decades ago, biogeographers distinguished three major faunal types of high importance for Europe: (i) Mediterranean elements with exclusive glacial survival in the Mediterranean refugia, (ii) Siberian elements with glacial refugia in the eastern Palearctic and only postglacial expansion to Europe and (iii) arctic and/or alpine elements with large zonal distributions in the periglacial areas and postglacial retreat to the North and/or into the high mountain systems. Genetic analyses have unravelled numerous additional refugia both of continental and Mediterranean species, thus strongly modifying the biogeographical view of Europe. This modified notion is particularly true for the so-called Siberian species, which in many cases have not immigrated into Europe during the postglacial period, but most likely have survived the last, or even several glacial phases, in extra-Mediterranean refugia in some climatically favourable but geographically limited areas of southern Central and Eastern Europe. Recently, genetic analyses revealed that typical Mediterranean species have also survived the Last Glacial Maximum in cryptic northern refugia (e.g. in the Carpathians or even north of the Alps) in addition to their Mediterranean refuge areas. PMID:22953783

  9. Whole-body concentrations of elements in three fish species from offshore oil platforms and natural areas in the Southern California Bight, USA

    USGS Publications Warehouse

    Love, Milton S.; Saiki, Michael K.; May, Thomas W.; Yee, Julie L.

    2013-01-01

    elements. Forty-two elements were excluded from statistical comparisons as they (1) consisted of major cations that were unlikely to accumulate to potentially toxic concentrations; (2) were not detected by the analytical procedures; or (3) were detected at concentrations too low to yield reliable quantitative measurements. The remaining 21 elements consisted of aluminum, arsenic, barium, cadmium, chromium, cobalt, copper, gallium, iron, lead, lithium, manganese, mercury, nickel, rubidium, selenium, strontium, tin, titanium, vanadium, and zinc. Statistical comparisons of these elements indicated that none consistently exhibited higher concentrations at oil platforms than at natural areas. However, the concentrations of copper, selenium, titanium, and vanadium in Pacific sanddab were unusual because small individuals exhibited either no differences between oil platforms and natural areas or significantly lower concentrations at oil platforms than at natural areas, whereas large individuals exhibited significantly higher concentrations at oil platforms than at natural areas.

  10. Uncoupling of sgRNAs from their associated barcodes during PCR amplification of combinatorial CRISPR screens

    PubMed Central

    2018-01-01

    Many implementations of pooled screens in mammalian cells rely on linking an element of interest to a barcode, with the latter subsequently quantitated by next generation sequencing. However, substantial uncoupling between these paired elements during lentiviral production has been reported, especially as the distance between elements increases. We detail that PCR amplification is another major source of uncoupling, and becomes more pronounced with increased amounts of DNA template molecules and PCR cycles. To lessen uncoupling in systems that use paired elements for detection, we recommend minimizing the distance between elements, using low and equal template DNA inputs for plasmid and genomic DNA during PCR, and minimizing the number of PCR cycles. We also present a vector design for conducting combinatorial CRISPR screens that enables accurate barcode-based detection with a single short sequencing read and minimal uncoupling. PMID:29799876
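    A minimal way to see why fewer PCR cycles reduce uncoupling is to assume that each cycle recombines a given sgRNA-barcode pair with some fixed per-molecule probability (e.g., via template switching). The per-cycle swap probability and the simple geometric model below are illustrative assumptions, not rates measured in the paper:

```python
# Hedged toy model: if each PCR cycle swaps a given element-barcode pairing
# with probability p_swap, the expected fraction of reads that remain
# correctly coupled after n cycles is (1 - p_swap) ** n.
def coupled_fraction(p_swap: float, n_cycles: int) -> float:
    return (1.0 - p_swap) ** n_cycles

# Halving the cycle count noticeably reduces uncoupling in this model,
# consistent with the recommendation to minimize PCR cycles.
low_cycles = coupled_fraction(0.01, 15)    # ~0.86 still coupled
high_cycles = coupled_fraction(0.01, 30)   # ~0.74 still coupled
```

The same model also suggests why a shorter distance between elements helps: a shorter span offers fewer opportunities for an incomplete extension product to switch templates.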

  11. Vibration transmission through rolling element bearings in geared rotor system, part 1. Ph.D. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Singh, Rajendra; Lim, Teik Chin

    1989-01-01

    A mathematical model is proposed to examine vibration transmission through rolling element bearings in geared rotor systems. Current bearing models, based on either ideal boundary conditions for the shaft or a purely translational stiffness description, cannot explain how vibratory motion may be transmitted from the rotating shaft to the casing. This study clarifies the issue qualitatively and quantitatively by developing, from basic principles, a comprehensive bearing stiffness matrix of dimension 6 for precision rolling element bearings. The proposed bearing formulation is extended to analyze the overall geared rotor system dynamics, including casing and mounts. The bearing stiffness matrix is included in discrete system models using lumped parameter and/or dynamic finite element techniques. Eigensolutions and forced harmonic responses due to rotating mass unbalance or kinematic transmission error excitation are computed for a number of examples.

  12. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI

    PubMed Central

    Freed, Melanie; de Zwart, Jacco A.; Hariharan, Prasanna; R. Myers, Matthew; Badano, Aldo

    2011-01-01

    Purpose: To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. Methods: The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. The border shape of the lesion can be controlled through the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion at four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and the homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for example benign and malignant curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. Results: The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between them. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. Conclusions: The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. 
This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions. PMID:21992378

  13. Potential Applications of PET/MR Imaging in Cardiology.

    PubMed

    Ratib, Osman; Nkoulou, René

    2014-06-01

    Recent advances in hybrid PET/MR imaging have opened new perspectives for cardiovascular applications. Although cardiac MR imaging has gained wider adoption for routine clinical applications, PET images remain the reference in many applications for which objective analysis of metabolic and physiologic parameters is needed. In particular, in cardiovascular diseases, and more specifically in coronary artery disease, the use of quantitative and reproducibly measurable parameters is essential for the management of therapeutic decisions and patient follow-up. Functional MR images and dynamic assessment of myocardial perfusion from the transit of intravascular contrast medium can provide useful criteria for identifying areas of decreased myocardial perfusion or for assessing tissue viability from late contrast enhancement of scar tissue. PET images, however, will provide more quantitative data on true tissue perfusion and metabolism. Quantitative myocardial flow can also lead to accurate assessment of coronary flow reserve. The combination of both modalities will therefore provide complementary data that can be expected to improve the accuracy and reproducibility of diagnostic procedures. But the true potential of hybrid PET/MR imaging may reside in applications beyond the domain of coronary artery disease. The combination of both modalities in the assessment of other cardiac diseases, such as inflammation, and of other systemic diseases can also be envisioned. It is also predicted that the 2 modalities combined could help characterize atherosclerotic plaques and differentiate plaques with a high risk of rupture from stable plaques. In the future, the development of new tracers will also open new perspectives in evaluating myocardial remodeling and in assessing the kinetics of stem cell therapy in myocardial infarction. 
New tracers will also provide new means for evaluating alterations in cardiac innervation, angiogenesis, and even the assessment of reporter gene technologies. The fusion of 2 potentially competing modalities can certainly offer the best of each modality in a single procedure. The impact of such advanced technology in routine clinical practice will still need to be demonstrated. Beyond the expected improvement in patient management and the potential impact on patient outcome, PET/MR imaging will also need to establish its medicoeconomic justification in an era of health-care economic restrictions. © 2014 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  14. Fractional flow reserve and coronary bifurcation anatomy: a novel quantitative model to assess and report the stenosis severity of bifurcation lesions.

    PubMed

    Tu, Shengxian; Echavarria-Pinto, Mauro; von Birgelen, Clemens; Holm, Niels R; Pyxaras, Stylianos A; Kumsars, Indulis; Lam, Ming Kai; Valkenburg, Ilona; Toth, Gabor G; Li, Yingguang; Escaned, Javier; Wijns, William; Reiber, Johan H C

    2015-04-20

    The aim of this study was to develop a new model for assessment of stenosis severity in a bifurcation lesion including its core. The diagnostic performance of this model, powered by 3-dimensional quantitative coronary angiography to predict the functional significance of obstructive bifurcation stenoses, was evaluated using fractional flow reserve (FFR) as the reference standard. Development of advanced quantitative models might help to establish a relationship between bifurcation anatomy and FFR. Patients who had undergone coronary angiography and interventions in 5 European cardiology centers were randomly selected and analyzed. Different bifurcation fractal laws, including Murray, Finet, and HK laws, were implemented in the bifurcation model, resulting in different degrees of stenosis severity. A total of 78 bifurcation lesions in 73 patients were analyzed. In 51 (65%) bifurcations, FFR was measured in the main vessel. A total of 34 (43.6%) interrogated vessels had an FFR≤0.80. Correlation between FFR and diameter stenosis was poor by conventional straight analysis (ρ=-0.23, p<0.001) but significantly improved by bifurcation analyses: the highest by the HK law (ρ=-0.50, p<0.001), followed by the Finet law (ρ=-0.49, p<0.001), and the Murray law (ρ=-0.41, p<0.001). The area under the receiver-operating characteristics curve for predicting FFR≤0.80 was significantly higher by bifurcation analysis compared with straight analysis: 0.72 (95% confidence interval: 0.61 to 0.82) versus 0.60 (95% confidence interval: 0.49 to 0.71; p=0.001). Applying a threshold of ≥50% diameter stenosis, as assessed by the bifurcation model, to predict FFR≤0.80 resulted in 23 true positives, 27 true negatives, 17 false positives, and 11 false negatives. The new bifurcation model provides a comprehensive assessment of bifurcation anatomy. Compared with straight analysis, identification of lesions with preserved FFR values in obstructive bifurcation stenoses was improved. 
Nevertheless, accuracy was limited by using solely anatomical parameters. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
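    The three fractal laws named above relate the mother-vessel diameter to the daughter diameters; the exponents and coefficient below follow their commonly published forms and should be treated as illustrative rather than as the study's exact implementation. The sensitivity and specificity implied by the reported confusion matrix (23 true positives, 27 true negatives, 17 false positives, 11 false negatives) can also be recomputed directly:

```python
# Sketch of the bifurcation fractal laws (commonly published forms) and the
# diagnostic accuracy implied by the abstract's confusion matrix.

def mother_diameter(d1: float, d2: float, law: str) -> float:
    """Expected mother-vessel diameter from the two daughter diameters."""
    if law == "murray":   # Murray:  D_m^3 = D_1^3 + D_2^3
        return (d1**3 + d2**3) ** (1 / 3)
    if law == "finet":    # Finet:   D_m = 0.678 * (D_1 + D_2)
        return 0.678 * (d1 + d2)
    if law == "hk":       # HK:      D_m^(7/3) = D_1^(7/3) + D_2^(7/3)
        return (d1 ** (7 / 3) + d2 ** (7 / 3)) ** (3 / 7)
    raise ValueError(f"unknown law: {law}")

# Diagnostic performance of the >=50% diameter-stenosis threshold vs FFR<=0.80,
# from the reported 23 TP, 27 TN, 17 FP, 11 FN.
tp, tn, fp, fn = 23, 27, 17, 11
sensitivity = tp / (tp + fn)   # ~0.68
specificity = tn / (tn + fp)   # ~0.61
```

For identical daughter diameters the three laws give modestly different mother diameters, which is why the choice of fractal law shifts the computed reference size and hence the percent diameter stenosis.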

  15. Needs assessment for simulation training in neuroendoscopy: a Canadian national survey.

    PubMed

    Haji, Faizal A; Dubrowski, Adam; Drake, James; de Ribaupierre, Sandrine

    2013-02-01

    In recent years, dramatic changes in surgical education have increased interest in simulation-based training for complex surgical skills. This is particularly true for endoscopic third ventriculostomy (ETV), given the potential for serious intraoperative errors arising from surgical inexperience. However, prior to simulator development, a thorough assessment of training needs is essential to ensure development of educationally relevant platforms. The purpose of this study was to conduct a national needs assessment addressing specific goals of instruction, to guide development of simulation platforms, training curricula, and assessment metrics for ETV. Canadian neurosurgeons performing ETV were invited to participate in a structured online questionnaire regarding the procedural steps for ETV, the frequency and significance of intraoperative errors committed while learning the technique, and simulation training modules of greatest potential educational benefit. Descriptive data analysis was completed for both quantitative and qualitative responses. Thirty-two (55.2%) of 58 surgeons completed the survey. All believed that virtual reality simulation training for ETV would be a valuable addition to clinical training. Selection of ventriculostomy site, navigation within the ventricles, and performance of the ventriculostomy ranked as the most important steps to simulate. Technically inadequate ventriculostomy and inappropriate fenestration site selection were ranked as the most frequent/significant errors. A standard ETV module was thought to be most beneficial for resident training. To inform the development of a simulation-based training program for ETV, the authors have conducted a national needs assessment. The results provide valuable insight to inform key design elements necessary to construct an educationally relevant device and educational program.

  16. Numerical simulation of the deterministic vector separation of particles flowing over slanted open cavities

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric S. G.; Bernate, Jorge A.; Yang, Mengfei

    2016-12-01

    Within the past decade, the separation of particles via continuous flow through microfluidic devices has been developed largely through an Edisonian approach whereby devices have been developed based on observation and intuition. This is particularly true in the development of vector chromatography at vanishingly small Reynolds number for non-Brownian particles. Note that this latter phenomenon has its origins in the irreversible forces that are at work in the device, since Stokes flow reversibility typically prohibits their function otherwise. We present a numerical simulation of the vector separation of non-Brownian particles of different sizes and deformabilities in the Stokes flow through channels whose lower surface is composed of slanted cavities. The simulations are designed to understand the physical principles behind the separation as well as to provide design criteria for devices for separating particles in a given size and flexibility range. The numerical simulations are Stokes flow boundary element simulations using techniques defined elsewhere in the literature, but including a close-range repulsive force between the particles and the slanted cavities. We demonstrate that over a range of repulsive force that is comparable to the roughness in the experimental devices, the separation data (particularly in particle size) are predicted quantitatively and are a very weak function of the range of the force. We then vary the geometric parameters of the simulated devices to demonstrate the sensitivity of the separation efficiency to these parameters, thus making design predictions as to which devices are appropriate for separating particles in different size, shape, and deformability ranges.

  17. Interstellar abundances - Gas and dust

    NASA Technical Reports Server (NTRS)

    Field, G. B.

    1974-01-01

    Data on abundances of interstellar atoms, ions and molecules in front of zeta Oph are assembled and analyzed. The gas-phase abundances of at least 11 heavy elements are significantly lower, relative to hydrogen, than in the solar system. The abundance deficiencies of certain elements correlate with the temperatures derived theoretically for particle condensation in stellar atmospheres or nebulae, suggesting that these elements have condensed into dust grains near stars. There is evidence that other elements have accreted onto such grains after their arrival in interstellar space. The extinction spectrum of zeta Oph can be explained qualitatively and, to a degree, quantitatively by dust grains composed of silicates, graphite, silicon carbide, and iron, with mantles composed of complex molecules of H, C, N, and O. This composition is consistent with the observed gas-phase deficiencies.

  18. Numerical and Analytical Study of Nonlinear Effects of Transonic Flow Past a Wing Airfoil in Oscillation of its Surface Element

    NASA Astrophysics Data System (ADS)

    Aul'chenko, S. M.; Zamuraev, V. P.; Kalinina, A. P.

    2014-05-01

    The present work is devoted to a criterial analysis and mathematical modeling of the influence of forced oscillations of surface elements of a wing airfoil on the shock-wave structure of transonic flow past it. Parameters that govern the regimes of interaction of the oscillatory motion of airfoil sections with the breakdown compression shock have been established. The qualitative and quantitative influence of these parameters on the wave resistance of the airfoil has been investigated.

  19. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using maximum allowable concentrations and levels, MAC and MAL) does not permit a complete and objective assessment of working conditions in the construction industry, owing to multiple confounding elements. A solution to this complicated problem, which requires analyzing the various correlated elements of the system "human--work conditions--environment", may be aided by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to view the situation as a whole and to reveal points of risk.

  20. Performance comparison of two resolution modeling PET reconstruction algorithms in terms of physical figures of merit used in quantitative imaging.

    PubMed

    Matheoud, R; Ferrando, O; Valzano, S; Lizio, D; Sacchetti, G; Ciarmiello, A; Foppiano, F; Brambilla, M

    2015-07-01

    Resolution modeling (RM) of PET systems has been introduced in iterative reconstruction algorithms for oncologic PET. RM recovers the loss of resolution and reduces the associated partial volume effect. While these methods have improved observer performance, particularly in the detection of small and faint lesions, their impact on quantification accuracy still requires thorough investigation. The aim of this study was to characterize the performance of RM algorithms under controlled conditions simulating a typical (18)F-FDG oncologic study, using an anthropomorphic phantom and selected physical figures of merit used for image quantification. Measurements were performed on Biograph HiREZ (B_HiREZ) and Discovery 710 (D_710) PET/CT scanners, and reconstructions were performed using the standard iterative reconstructions and the RM algorithm associated with each scanner: TrueX and SharpIR, respectively. RM produced a significant improvement in contrast recovery for small targets (≤17 mm diameter) only on the D_710 scanner. The maximum standardized uptake value (SUVmax) increased when RM was applied on both scanners. The SUVmax of small targets was on average lower with the B_HiREZ than with the D_710. SharpIR improved the accuracy of SUVmax determination, whilst TrueX showed an overestimation of SUVmax for sphere dimensions greater than 22 mm. The goodness of fit of adaptive threshold algorithms worsened significantly when RM algorithms were employed, for both scanners. Differences in general quantitative performance were observed for the PET scanners analyzed. Segmentation of PET images using adaptive threshold algorithms should not be undertaken in conjunction with RM reconstructions. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
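    Two figures of merit commonly used in such phantom studies can be sketched as follows. These are generic NEMA-style contrast recovery and the standard SUV normalization, not necessarily the exact formulas used in this paper; the sample values are hypothetical:

```python
# Hedged sketch of generic PET quantification figures of merit.

def contrast_recovery(c_sphere: float, c_bkg: float, true_ratio: float) -> float:
    """Hot-sphere contrast recovery: ((C_hot / C_bkg) - 1) / (R_true - 1).

    A value of 1.0 means the measured sphere-to-background contrast fully
    matches the true activity ratio; small spheres typically fall below 1.0
    due to partial volume effects.
    """
    return (c_sphere / c_bkg - 1.0) / (true_ratio - 1.0)

def suv(activity_conc_bq_per_ml: float, injected_dose_bq: float,
        body_weight_g: float) -> float:
    """SUV = tissue activity concentration / (injected dose / body weight)."""
    return activity_conc_bq_per_ml / (injected_dose_bq / body_weight_g)

# Hypothetical measurement: a sphere at 4:1 true contrast, fully recovered.
crc = contrast_recovery(c_sphere=32_000.0, c_bkg=8_000.0, true_ratio=4.0)
```

In a study like the one above, an RM reconstruction that "overestimates SUVmax" would show contrast recovery values exceeding 1.0 for the affected sphere sizes.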
