Sample records for present quantitative results

  1. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results These weights may prove useful in avoiding having to resort to qualitative means in the absence of weights between indicators when integrating the results of quantitative assessment by indicator. Conclusions This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
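
The analytic hierarchy process step described in this record can be sketched as follows; the pairwise comparison matrix below is a hypothetical illustration on Saaty's 1-9 scale, not the study's survey data.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (Saaty scale) for three
# illustrative indicators -- NOT the study's actual expert-survey data.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights: principal right eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = ((lambda_max - n)/(n - 1)) / RI; RI = 0.58 for n = 3.
n = A.shape[0]
lam = eigvals.real[k]
CR = ((lam - n) / (n - 1)) / 0.58

print(np.round(w, 3), round(CR, 3))
```

A CR below 0.1 is conventionally taken to mean the expert judgments are acceptably consistent.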

  2. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    PubMed

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights may prove useful in avoiding having to resort to qualitative means in the absence of weights between indicators when integrating the results of quantitative assessment by indicator. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies.

  3. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
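
Spectral counting, one of the label-free approaches this review covers, is often summarized with the normalized spectral abundance factor (NSAF), which length-corrects each protein's counts before normalizing; the sketch below uses invented counts and protein lengths.

```python
# NSAF sketch: counts and residue lengths below are invented examples,
# not data from any study in the review.
spectral_counts = {"protA": 120, "protB": 45, "protC": 10}
lengths = {"protA": 400, "protB": 150, "protC": 200}  # residues

# Length-normalized spectral abundance factor per protein: SpC / L.
saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}

# Normalize across all proteins so the NSAF values sum to 1.
total = sum(saf.values())
nsaf = {p: saf[p] / total for p in saf}

print({p: round(v, 3) for p, v in nsaf.items()})
```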

  4. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  5. 77 FR 42495 - Release of Draft Documents Related to the Review of the National Ambient Air Quality Standards...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-19

    ... Review Draft. These two draft assessment documents describe the quantitative analyses the EPA is... NAAQS,\\3\\ the Agency is conducting quantitative assessments characterizing the: (1) Health risks... present the initial key results, observations, and related uncertainties associated with the quantitative...

  6. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science course with a process oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…

  7. The Undergraduate Spanish Major Curriculum: Faculty, Alumni, and Student Perceptions

    ERIC Educational Resources Information Center

    Hertel, Tammy Jandrey; Dings, Abby

    2017-01-01

    This article presents the quantitative and qualitative results of a nationwide survey of the perceptions of faculty, alumni, and students regarding the contribution to the undergraduate Spanish major curriculum of various types of courses and experiences. Quantitative results indicated that all participants valued the importance of study abroad as…

  8. Characterization and quantitation of polyolefin microplastics in personal-care products using high-temperature gel-permeation chromatography.

    PubMed

    Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W

    2015-02-01

    In recent years, concern over the adverse effects of microplastic particles in the marine environment has made the development of reliable methods for their quantitation in different samples a priority. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and subsequent quantitation by high-temperature gel-permeation chromatography. In contrast with previous procedures, it avoids both the errors caused by weighing after insufficient separation of plastics from matrix and time-consuming visual sorting. In addition to reliable quantitative results, this investigation provides a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution, and stabilization. Results for seven different personal-care products are presented. Recoveries of the method were in the range of 92-96%.

  9. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  10. Evaluation of Aution Max AX-4030 and 9UB Uriflet, 10PA Aution Sticks urine dipsticks in the automated urine test strip analysis.

    PubMed

    Rota, Cristina; Biondi, Marco; Trenti, Tommaso

    2011-09-26

    Aution Max AX-4030, a test strip analyzer recently introduced to the market, is an upgrade of the Aution Max AX-4280 widely employed for urinalysis. The new instrument model can accommodate two different test strips at the same time. In the present study the two instruments were compared using Uriflet 9UB strips and the recently introduced Aution Sticks 10PA urine strips, the latter carrying an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose, and creatinine by comparing the semi-quantitative results with those obtained by quantitative methods. The well-known interference of high ascorbic acid levels with urine glucose test strip determination was evaluated, and the influence of ascorbic acid on protein and creatinine determination was also assessed. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips, evaluated for chemical-physical parameters, were always good. Accuracy was also very good: semi-quantitative protein and glucose measurements were highly correlated with those obtained by quantitative methods, and semi-quantitative creatinine measurements with the Aution Sticks 10PA urine strips were highly comparable with quantitative results. The 10PA urine strips are therefore suitable for urine creatinine determination, making it possible to correct urinalysis results for urinary creatinine concentration whenever necessary and to calculate the protein/creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of semi-quantitative creatinine analysis.

  11. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. The assessment helps determine what useful quantitative and qualitative structural data may be provided, from raw materials through vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or that are strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information and are presented along with a description of the structures or standards from which the information was obtained. Examples of NDT technique capabilities and limitations are provided in tabular form. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic methods. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments, which increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  12. Improved cancer risk stratification and diagnosis via quantitative phase microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Uttam, Shikhar; Pham, Hoa V.; Hartman, Douglas J.

    2017-02-01

    Pathology remains the gold standard for cancer diagnosis and in some cases prognosis, in which trained pathologists examine abnormality in tissue architecture and cell morphology characteristic of cancer cells with a bright-field microscope. The limited resolution of the conventional microscope can result in intra-observer variation, missed early-stage cancers, and indeterminate cases that often lead to unnecessary invasive procedures in the absence of cancer. Assessment of nanoscale structural characteristics via quantitative phase represents a promising strategy for identifying pre-cancerous or cancerous cells, owing to its nanoscale sensitivity to optical path length, simple (label-free) sample preparation, and low cost. I will present the development of quantitative phase microscopy systems in transmission and reflection configurations to detect structural changes in nuclear architecture that are not easily identifiable by conventional pathology. Specifically, we will present the use of transmission-mode quantitative phase imaging to improve the diagnostic accuracy of urine cytology, where nuclear dry mass correlates progressively with negative, atypical, suspicious, and positive cytological diagnoses. In a second application, we will present the use of reflection-mode quantitative phase microscopy for depth-resolved nanoscale nuclear architecture mapping (nanoNAM) of clinically prepared formalin-fixed, paraffin-embedded tissue sections. We demonstrated that the quantitative phase microscopy system detects a gradual increase in the density alteration of nuclear architecture during malignant transformation in animal models of colon carcinogenesis and in human patients with ulcerative colitis, even in tissue that appears histologically normal according to pathologists. We also evaluated the ability of nanoNAM to predict "future" cancer progression in patients with ulcerative colitis.

  13. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.

  14. Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations, including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We hereby present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria-infected red blood cell.

  15. Modeling noisy resonant system response

    NASA Astrophysics Data System (ADS)

    Weber, Patrick Thomas; Walrath, David Edwin

    2017-02-01

    In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified and a noise amplitude parameter, which models frequency and amplitude-based noise, is created, defined, and presented. This theory-driven model isolates each phenomenon and allows for parameters to be independently studied. Using seven independent degrees of freedom, this model will accurately reproduce qualitative and quantitative properties measured from laboratory data. Results are presented and demonstrate success in replicating qualitative and quantitative properties of experimental data.

  16. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  17. Quantitative surface topography determination by Nomarski reflection microscopy. 2: Microscope modification, calibration, and planar sample experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1980-09-01

    The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously, plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, a physical insight, and a cookbook procedure for implementation, to allow these results to be of value both to those interested in the microscope theory and to those interested in its practical usage in the metallography laboratory.

  18. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  19. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    In order to improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantitation. Pt-Co, Cu-Au, and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors until the quantitative results of the two techniques agreed closely. The accuracy of AES quantitation with the revised sensitivity factors was then verified on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error of quantitative AES analysis to less than 10%. Peak definition is difficult in the integral-spectrum form of AES analysis, since choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To make analysis easier, we also processed the data in differential-spectrum form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on other samples with different composition ratios. In this case the analytical error of quantitative AES was reduced to less than 9%. These results show that the accuracy of AES quantitation can be greatly improved by associating XPS with AES to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. The good consistency obtained demonstrates the feasibility of this method.
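
The relative-sensitivity-factor quantitation being corrected in this study follows the standard first-approximation AES formula, X_A = (I_A/S_A) / Σ_i(I_i/S_i); the intensities and sensitivity factors below are illustrative placeholders, not the paper's measurements.

```python
# First-approximation AES/XPS quantitation with relative sensitivity
# factors. Intensities (peak areas or peak-to-peak heights) and factors
# are invented for illustration.
def atomic_fractions(intensities, sensitivities):
    """Return atomic fractions X_el = (I_el/S_el) / sum_i (I_i/S_i)."""
    corrected = {el: intensities[el] / sensitivities[el] for el in intensities}
    total = sum(corrected.values())
    return {el: corrected[el] / total for el in corrected}

frac = atomic_fractions({"Pt": 1500.0, "Co": 900.0},
                        {"Pt": 0.5, "Co": 0.3})
print(frac)
```

Correcting the sensitivity factors S_el against an independent technique (here, XPS) is what compensates for matrix effects in this scheme.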

  20. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi

    PubMed Central

    Hodgkinson, A.

    1971-01-01

    A better understanding of the physico-chemical principles underlying the formation of calculus has led to a need for more precise information on the chemical composition of stones. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi which is suitable for routine use is presented. The procedure involves five simple qualitative tests followed by the quantitative determination of calcium, magnesium, inorganic phosphate, and oxalate. These data are used to calculate the composition of the stone in terms of calcium oxalate, apatite, and magnesium ammonium phosphate. Analytical results and derived values for five representative types of calculi are presented. PMID:5551382

  1. Conceptual Diversity, Moderators, and Theoretical Issues in Quantitative Studies of Cultural Capital Theory

    ERIC Educational Resources Information Center

    Tan, Cheng Yong

    2017-01-01

    The present study reviewed quantitative empirical studies examining the relationship between cultural capital and student achievement. Results showed that researchers had conceptualized and measured cultural capital in different ways. It is argued that the more holistic understanding of the construct beyond highbrow cultural consumption must be…

  2. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  3. Solving Quantitative Problems: Guidelines for Teaching Derived from Research.

    ERIC Educational Resources Information Center

    Kramers-Pals, H.; Pilot, A.

    1988-01-01

    Presents four guidelines for teaching quantitative problem-solving based on research results: analyze difficulties of students, develop a system of heuristics, select and map key relations, and design instruction with proper orientation, exercise, and feedback. Discusses the four guidelines and uses flow charts and diagrams to show how the…

  4. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled-porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine, and glycerophosphorylserine, is also described.

  5. User's Guide to Handlens - A Computer Program that Calculates the Chemistry of Minerals in Mixtures

    USGS Publications Warehouse

    Eberl, D.D.

    2008-01-01

    HandLens is a computer program, written in Excel macro language, that calculates the chemistry of minerals in mineral mixtures (for example, in rocks, soils, and sediments) for related samples from inputs of quantitative mineralogy and chemistry. For best results, the related samples should contain minerals having the same chemical compositions; that is, the samples should differ only in the proportions of minerals present. This manual describes how to use the program, discusses the theory behind its operation, and presents test results of the program's accuracy. Required input for HandLens includes quantitative mineralogical data, obtained, for example, by RockJock analysis of X-ray diffraction (XRD) patterns, and quantitative chemical data, obtained, for example, by X-ray fluorescence (XRF) analysis of the same samples. Other quantitative data, such as sample depth, temperature, and surface area, can also be entered. The minerals present in the samples are selected from a list, and the program is started. The results of the calculation include: (1) a table of linear coefficients of determination (r² values) which relate pairs of input data (for example, Si versus quartz weight percents); (2) a utility for plotting all input data, either as pairs of variables, or as sums of up to eight variables; (3) a table that presents the calculated chemical formulae for minerals in the samples; (4) a table that lists the calculated concentrations of major, minor, and trace elements in the various minerals; and (5) a table that presents chemical formulae for the minerals that have been corrected for possible systematic errors in the mineralogical and/or chemical analyses. In addition, the program contains a method for testing the assumption of constant chemistry of the minerals within a sample set.
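
The linear-mixing assumption behind this kind of calculation (bulk element content = mineral weight fractions times per-mineral element concentrations) can be sketched as a least-squares problem; the two-mineral example below uses invented fractions and idealized compositions and is not HandLens or RockJock output.

```python
import numpy as np

# Hypothetical related samples differing only in mineral proportions.
# Rows: samples; columns: weight fractions of [quartz, calcite].
M = np.array([
    [0.80, 0.20],
    [0.60, 0.40],
    [0.30, 0.70],
])

# Bulk Si wt% per sample, assuming pure quartz (~46.7% Si) and calcite (0%).
si_wt = M @ np.array([46.7, 0.0])

# Recover the Si concentration in each mineral from the sample set by
# least squares: solve M @ x ~= si_wt for x.
x, *_ = np.linalg.lstsq(M, si_wt, rcond=None)
print(np.round(x, 3))
```

With real XRD/XRF data the fit is inexact, and the residuals are what expose systematic errors in the mineralogical or chemical analyses.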

  6. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. PMID:25864377

  7. Multi-frequency local wavenumber analysis and ply correlation of delamination damage.

    PubMed

    Juarez, Peter D; Leckey, Cara A C

    2015-09-01

    Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.

  8. [Study on ethnic medicine quantitative reference herb,Tibetan medicine fruits of Capsicum frutescens as a case].

    PubMed

    Zan, Ke; Cui, Gan; Guo, Li-Nong; Ma, Shuang-Cheng; Zheng, Jian

    2018-05-01

    The high price and limited availability of reference substances have become obstacles to the HPLC assay of ethnic medicines. A new method based on a quantitative reference herb (QRH) is proposed. Specific chromatograms of fruits of Capsicum frutescens were employed to determine peak positions, and an HPLC quantitative reference herb was prepared from fruits of C. frutescens. The content of capsaicin and dihydrocapsaicin in the quantitative reference herb was determined by HPLC. Eleven batches of fruits of C. frutescens were analyzed with the quantitative reference herb and with reference substances, respectively; the results showed no difference. The present method is feasible for quality control of ethnic medicines, and a quantitative reference herb is suitable to replace reference substances in assays. Copyright© by the Chinese Pharmaceutical Association.

  9. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
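
The Currie-based framework this record builds on defines decision, detection, and quantitation levels from the blank noise and the calibration slope; the sketch below gives the textbook homoscedastic-Gaussian form, not the specific estimator developed in the paper.

```python
# Currie-style limits for a linear, univariate calibration with
# homoscedastic Gaussian blank noise. sigma_blank and slope below are
# invented example values.
def currie_limits(sigma_blank, slope):
    """Return (decision, detection, quantitation) levels in
    concentration units for a known blank standard deviation."""
    L_C = 1.645 * sigma_blank / slope   # decision level (5% false positives)
    L_D = 3.29 * sigma_blank / slope    # detection limit (adds 5% false negatives)
    L_Q = 10.0 * sigma_blank / slope    # quantitation limit (~10% relative error)
    return L_C, L_D, L_Q

lc, ld, lq = currie_limits(sigma_blank=0.02, slope=0.5)
print(lc, ld, lq)
```

The paper's contribution is to tie the quantitation level to significant-figure and relative-measurement-error criteria rather than the fixed 10-sigma convention shown here.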

  10. [Influence of sample surface roughness on mathematical model of NIR quantitative analysis of wood density].

    PubMed

    Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun

    2007-09-01

    Near infrared spectroscopy is widely used as a quantitative method, with regression methods as the main multivariate techniques for building prediction models; however, the accuracy of the results is affected by many factors. In the present paper, the influence of sample surface roughness on the mathematical model of NIR quantitative analysis of wood density was studied. The experiments showed that when the roughness of the predicted samples was consistent with that of the calibration samples, the results were good; otherwise the error was much higher. A roughness-mixed model was more flexible and adaptable to different sample roughnesses, and its prediction ability was much better than that of a single-roughness model.

  11. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  12. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  13. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products; ANALYSE QUANTITATIVE DU THORIUM DANS LES MINERAIS ET LES PRODUITS THORIFERES PAR UNE METHODE BASEE SUR LA RADIOACTIVITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

A brief resume of the classical methods of quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the utilization of the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements are discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or smaller. (J.S.R.)

  14. Iterative optimization method for design of quantitative magnetization transfer imaging experiments.

    PubMed

    Levesque, Ives R; Sled, John G; Pike, G Bruce

    2011-09-01

    Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.

  15. Quantitative Autism Traits in First Degree Relatives: Evidence for the Broader Autism Phenotype in Fathers, but Not in Mothers and Siblings

    ERIC Educational Resources Information Center

    De la Marche, Wouter; Noens, Ilse; Luts, Jan; Scholte, Evert; Van Huffel, Sabine; Steyaert, Jean

    2012-01-01

    Autism spectrum disorder (ASD) symptoms are present in unaffected relatives and individuals from the general population. Results are inconclusive, however, on whether unaffected relatives have higher levels of quantitative autism traits (QAT) or not. This might be due to differences in research populations, because behavioral data and molecular…

  16. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
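A fluorescence-to-concentration calibration of the kind such a graph represents can be sketched as a linear fit through standards. The paper's actual equation and data are not reproduced here; the standard values below are invented for illustration.

```python
# Sketch: linear fluorescence-to-concentration calibration (hypothetical
# standards, not the paper's equation or data).

def fit_slope_through_origin(fluorescence, concentration):
    """Least-squares slope for conc = k * fluorescence, forced through origin."""
    num = sum(f * c for f, c in zip(fluorescence, concentration))
    den = sum(f * f for f in fluorescence)
    return num / den

def quantitate(fluorescence_units, slope):
    """Concentration estimate for a sample's measured fluorescence."""
    return slope * fluorescence_units

# Hypothetical standards: fluorescence units vs. known PHE concentration (ug/mL)
f_std = [10.0, 20.0, 40.0, 80.0]
c_std = [0.5, 1.0, 2.0, 4.0]

k = fit_slope_through_origin(f_std, c_std)
print(quantitate(30.0, k))  # concentration of an unknown reading 30 units
```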

  17. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are in fact rife with data, making introductory geoscience topics an excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. 
The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.

  18. Towards quantitative magnetic particle imaging: A comparison with magnetic particle spectroscopy

    NASA Astrophysics Data System (ADS)

    Paysen, Hendrik; Wells, James; Kosch, Olaf; Steinhoff, Uwe; Trahms, Lutz; Schaeffter, Tobias; Wiekhorst, Frank

    2018-05-01

    Magnetic Particle Imaging (MPI) is a quantitative imaging modality with promising features for several biomedical applications. Here, we study quantitatively the raw data obtained during MPI measurements. We present a method for the calibration of the MPI scanner output using measurements from a magnetic particle spectrometer (MPS) to yield data in units of magnetic moments. The calibration technique is validated in a simplified MPI mode with a 1D excitation field. Using the calibrated results from MPS and MPI, we determine and compare the detection limits for each system. The detection limits were found to be 5.10-12 Am2 for MPS and 3.6.10-10 Am2 for MPI. Finally, the quantitative information contained in a standard MPI measurement with a 3D excitation is analyzed and compared to the previous results, showing a decrease in signal amplitudes of the odd harmonics related to the case of 1D excitation. We propose physical explanations for all acquired results; and discuss the possible benefits for the improvement of MPI technology.
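Converting raw scanner output into units of magnetic moment via a reference measurement can be sketched as below, in the spirit of the MPS-based calibration described above. All numeric values are invented for illustration, not taken from the paper.

```python
# Sketch: reference-based calibration of a raw signal into magnetic moment,
# plus a simple k-sigma detection limit (all values hypothetical).

def calibration_factor(reference_moment_Am2, reference_signal):
    """Am^2 per raw signal unit, from a sample of known magnetic moment."""
    return reference_moment_Am2 / reference_signal

def to_moment(raw_signal, cal):
    """Convert a raw scanner signal into a magnetic moment (Am^2)."""
    return raw_signal * cal

def detection_limit(noise_std_signal, cal, k=3.0):
    """Smallest moment distinguishable from noise under a k-sigma criterion."""
    return k * noise_std_signal * cal

# Hypothetical reference sample: 2e-9 Am^2 produces 500 raw signal units.
cal = calibration_factor(2.0e-9, 500.0)
print(to_moment(125.0, cal))       # about 5e-10 Am^2
print(detection_limit(10.0, cal))  # about 1.2e-10 Am^2
```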

  19. The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.

    1972-01-01

A specific and sensitive method for the identification and simultaneous quantitation by mass fragmentography of ten of the amino acids present in soil was developed. The technique uses a computer-driven quadrupole mass spectrometer, with a commercial preparation of deuterated amino acids serving as internal standards for quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system up to 25 pre-selected ions may be monitored sequentially, allowing a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of quantitating nanogram quantities of amino acids.
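The internal-standard arithmetic behind this kind of isotope-dilution quantitation can be sketched as follows; the ion intensities, response factor, and standard amount below are hypothetical, not values from the study.

```python
# Sketch: isotope-dilution quantitation against a deuterated internal standard.
# The analyte amount follows from the ratio of its monitored ion intensity to
# that of the deuterated standard of known added amount (values hypothetical).

def quantitate_by_istd(analyte_intensity, istd_intensity, istd_amount_ng,
                       response_factor=1.0):
    """Amount (ng) = intensity ratio x internal-standard amount / RF."""
    return (analyte_intensity / istd_intensity) * istd_amount_ng / response_factor

# e.g., an amino acid ion vs. its deuterated analogue, 50 ng of standard added
print(quantitate_by_istd(analyte_intensity=8400, istd_intensity=12000,
                         istd_amount_ng=50.0))  # about 35 ng
```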

  20. Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.

    PubMed

    Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan

    2017-01-01

Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, in contrast with qualitative MR imaging, which relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. The article presents a review of the basic physical concepts used in MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging. Subsequently, the article reviews the hierarchical organization of the major applicable pulse sequences, which are organized into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. While this new concept offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences used for quantitative MR imaging, with a focus on the hierarchical organization of these sequences. © RSNA, 2017.

  1. Cost-Effective Experiments on the Diffraction and Interference of Light.

    ERIC Educational Resources Information Center

    Sprigham, S. V.

    2000-01-01

Presents an alternative experimental arrangement that results in considerable cost savings by reducing the number of sensors and other apparatus required, while giving excellent quantitative results for comparison with theory. (Author/CCM)

  2. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; it does not require manual adjustment of the algorithm parameters; it can quantitatively capture differences between surfaces; and it yields more realistic structure boundaries compared to other methods.
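A minimal sketch of fuzzy classification by local slope, loosely in the spirit of the FIE described above. The actual engine classifies points as top, bottom, uphill, or downhill with its own rule base; the membership functions, thresholds, and class names below are invented for illustration.

```python
# Sketch: triangular fuzzy memberships classifying an AFM data point by its
# local slope (invented thresholds; not the paper's rule base).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_slope(slope):
    """Pick the class with the largest membership for this slope value."""
    memberships = {
        "downhill": tri(slope, -2.0, -1.0, 0.0),
        "flat": tri(slope, -1.0, 0.0, 1.0),
        "uphill": tri(slope, 0.0, 1.0, 2.0),
    }
    return max(memberships, key=memberships.get)

print(classify_slope(0.8))   # mostly uphill
print(classify_slope(-0.8))  # mostly downhill
```

A full implementation would combine height and slope memberships through inference rules before defuzzifying, but the membership-then-argmax pattern is the same.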

  3. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    PubMed Central

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  4. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    PubMed

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  5. Modeling Environmental Impacts on Cognitive Performance for Artificially Intelligent Entities

    DTIC Science & Technology

    2017-06-01

…of the agent behavior model is presented in a military-relevant virtual game environment. We then outline a quantitative approach to test the agent behavior model within the virtual environment. Results show… (Figure: game view of the hot environment condition displaying the total "f" cost for each searched waypoint node.)

  6. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
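The per-increment odds ratios quoted above relate to a logistic-regression coefficient by simple algebra, which can be sketched numerically. The rescaling below is standard arithmetic, not a reanalysis of the ROMICAT II data.

```python
import math

# Sketch: mapping a logistic-regression coefficient to per-increment odds
# ratios (e.g., the reported OR of 1.25 per 0.1 change in remodeling index).

def odds_ratio(beta, increment):
    """Odds ratio for a given change in the predictor: exp(beta * delta)."""
    return math.exp(beta * increment)

# Recover the coefficient implied by "OR 1.25 per 0.1", then rescale it.
beta_remodeling = math.log(1.25) / 0.1
print(round(odds_ratio(beta_remodeling, 0.1), 4))  # 1.25 by construction
print(round(odds_ratio(beta_remodeling, 0.2), 4))  # 1.25**2 = 1.5625
```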

  7. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    NASA Astrophysics Data System (ADS)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly from the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  8. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples

    DOEpatents

    Caldwell, John T.; Kunz, Walter E.; Cates, Michael R.; Franks, Larry A.

    1985-01-01

Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fissions are counted using a single detection system and allow the resolution of the contributions from each interrogating flux, leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.
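The resolution of the two contributions can be sketched as a 2x2 linear solve: each interrogation yields a count total that is a different linear combination of the fissile and fertile contents. The response coefficients (counts per mg) and count totals below are invented for illustration, not instrument values.

```python
# Sketch: resolving fissile and fertile contents from two interrogations as a
# 2x2 linear system (all coefficients and counts hypothetical).

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Cramer's rule for [[a11, a12], [a21, a22]] @ [x, y] = [b1, b2]."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# counts_photon  = 4.0*fissile + 1.0*fertile = 90
# counts_neutron = 2.0*fissile + 3.0*fertile = 130
fissile_mg, fertile_mg = solve_2x2(4.0, 1.0, 2.0, 3.0, 90.0, 130.0)
print(fissile_mg, fertile_mg)  # 14.0 34.0
```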

  9. Apparatus and method for quantitatively evaluating total fissile and total fertile nuclide content in samples. [Patent application

    DOEpatents

    Caldwell, J.T.; Kunz, W.E.; Cates, M.R.; Franks, L.A.

    1982-07-07

Simultaneous photon and neutron interrogation of samples for the quantitative determination of total fissile nuclide and total fertile nuclide material present is made possible by the use of an electron accelerator. Prompt and delayed neutrons produced from resulting induced fission are counted using a single detection system and allow the resolution of the contributions from each interrogating flux, leading in turn to the quantitative determination sought. Detection limits for ²³⁹Pu are estimated to be about 3 mg using prompt fission neutrons and about 6 mg using delayed neutrons.

  10. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    PubMed

    Haiyang, Yu; Tian, Luo

    2016-06-01

Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space that ensures the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement toward minimal tooth preparation. This article presents TRS quantity-related measurement, analysis, transfer, and the internal relevance of the three TRS classifications. Results reveal the close link between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  11. A Study Assessing the Potential of Negative Effects in Interdisciplinary Math–Biology Instruction

    PubMed Central

    Madlung, Andreas; Bremer, Martina; Himelblau, Edward; Tullis, Alexa

    2011-01-01

    There is increasing enthusiasm for teaching approaches that combine mathematics and biology. The call for integrating more quantitative work in biology education has led to new teaching tools that improve quantitative skills. Little is known, however, about whether increasing interdisciplinary work can lead to adverse effects, such as the development of broader but shallower skills or the possibility that math anxiety causes some students to disengage in the classroom, or, paradoxically, to focus so much on the mathematics that they lose sight of its application for the biological concepts in the center of the unit at hand. We have developed and assessed an integrative learning module and found disciplinary learning gains to be equally strong in first-year students who actively engaged in embedded quantitative calculations as in those students who were merely presented with quantitative data in the context of interpreting biological and biostatistical results. When presented to advanced biology students, our quantitative learning tool increased test performance significantly. We conclude from our study that the addition of mathematical calculations to the first year and advanced biology curricula did not hinder overall student learning, and may increase disciplinary learning and data interpretation skills in advanced students. PMID:21364099

  12. Nondestructive Evaluation for Aerospace Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara; Cramer, Elliott; Perey, Daniel

    2015-01-01

    Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure safety, reliability and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.

  13. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases: uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  14. Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations

    NASA Technical Reports Server (NTRS)

    Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.

    1993-01-01

    We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
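The percolation-based void definition can be illustrated in miniature: threshold a density field and treat connected under-threshold regions as voids, recording their sizes. The grid and threshold below are toy values, not the paper's simulation data or its percolation criterion.

```python
# Sketch: void identification as connected components of cells below a
# threshold density on a 2D grid (toy data).
from collections import deque

def void_sizes(density, threshold):
    """Sizes of 4-connected regions with density < threshold, largest first."""
    rows, cols = len(density), len(density[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if seen[i][j] or density[i][j] >= threshold:
                continue
            size, queue = 0, deque([(i, j)])
            seen[i][j] = True
            while queue:  # breadth-first flood fill of one void
                r, c = queue.popleft()
                size += 1
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not seen[nr][nc]
                            and density[nr][nc] < threshold):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            sizes.append(size)
    return sorted(sizes, reverse=True)

grid = [[0.2, 0.2, 1.5, 0.1],
        [1.4, 0.3, 1.6, 0.1],
        [1.2, 1.1, 1.3, 0.2]]
print(void_sizes(grid, threshold=1.0))  # two voids of 3 cells each
```

Scanning void sizes as the threshold varies toward the percolation density gives the natural scaling behavior the abstract alludes to.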

  15. Implications of Polishing Techniques in Quantitative X-Ray Microanalysis

    PubMed Central

    Rémond, Guy; Nockolds, Clive; Phillips, Matthew; Roques-Carmes, Claude

    2002-01-01

    Specimen preparation using abrasives results in surface and subsurface mechanical (stresses, strains), geometrical (roughness), chemical (contaminants, reaction products) and physical modifications (structure, texture, lattice defects). The mechanisms involved in polishing with abrasives are presented to illustrate the effects of surface topography, surface and subsurface composition and induced lattice defects on the accuracy of quantitative x-ray microanalysis of mineral materials with the electron probe microanalyzer (EPMA). PMID:27446758

  16. A new method to assess the sustainability performance of events: Application to the 2014 World Orienteering Championship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scrucca, Flavio; Severi, Claudio; Galvan, Nicola

Nowadays public and private agencies pay increasing attention to the sustainability performance of events, since it is recognized as a key issue in the context of sustainable development. Assessing the sustainability performance of events involves environmental, social and economic aspects; their impacts are complex and a quantitative assessment is often difficult. This paper presents a new quali-quantitative method developed to measure the sustainability of events, taking into account all their potential impacts. The 2014 World Orienteering Championship, held in Italy, was selected to test the proposed evaluation methodology. The total carbon footprint of the event was 165.34 tCO₂eq and the avoided emissions were estimated at 46 tCO₂eq. The adopted quali-quantitative method proved efficient in assessing the sustainability impacts and can be applied to the evaluation of similar events. Highlights: a quali-quantitative method to assess events' sustainability is presented; all the methodological issues related to the method are explained; the method is used to evaluate the sustainability of an international sports event; the method proved valid for assessing the event's sustainability level; and the carbon footprint of the event has been calculated.

  17. A novel approach to teach the generation of bioelectrical potentials from a descriptive and quantitative perspective.

    PubMed

    Rodriguez-Falces, Javier

    2013-12-01

In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials resulting from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches to describing extracellular potentials consist of complex mathematical machinery that leaves no room for physical interpretation. The present study introduces a new method to teach the formation of extracellular potentials around a muscle fiber from both a descriptive and a quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source translate into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.

  18. A custom-built PET phantom design for quantitative imaging of printed distributions.

    PubMed

    Markiewicz, P J; Angelis, G I; Kotasidis, F; Green, M; Lionheart, W R; Reader, A J; Matthews, J C

    2011-11-07

    This note presents a practical approach to the custom-made design of PET phantoms enabling the use of digital radioactive distributions with high quantitative accuracy and spatial resolution. The phantom design allows planar sources of any radioactivity distribution to be imaged in transaxial and axial (sagittal or coronal) planes. Although the design presented here is specially adapted to the high-resolution research tomograph (HRRT), the methods can be adapted to almost any PET scanner. The phantom design has many advantages, but a number of practical issues had to be overcome, such as positioning of the printed source, calibration, and the uniformity and reproducibility of printing. A well counter (WC) was used in the calibration procedure to find the nonlinear relationship between digital voxel intensities and the actual measured radioactive concentrations. Repeated printing together with WC measurements and computed radiography (CR) using phosphor imaging plates (IP) were used to evaluate the reproducibility and uniformity of such printing. Results show satisfactory printing uniformity and reproducibility; however, calibration is dependent on the printing mode and the physical state of the cartridge. As a demonstration of the utility of printed phantoms, the image resolution and quantitative accuracy of reconstructed HRRT images are assessed. There is very good quantitative agreement in the calibration procedure between HRRT, CR and WC measurements. However, the high resolution of CR and its quantitative accuracy, supported by WC measurements, made it possible to show the degraded resolution of HRRT brain images caused by the partial-volume effect and the limits of iterative image reconstruction.
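
    The nonlinear calibration step described above can be sketched as a simple least-squares fit of measured activity against printed voxel intensity. The power-law form and every number below are illustrative assumptions, not values from the note:

```python
import math

def fit_power_law(intensities, activities):
    """Least-squares fit of A = a * I**b on log-log axes."""
    xs = [math.log(i) for i in intensities]
    ys = [math.log(a) for a in activities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# hypothetical calibration points: digital intensity -> well-counter kBq/ml
intens = [10, 50, 100, 200]
activ = [0.8, 5.2, 12.0, 27.5]
a, b = fit_power_law(intens, activ)

def predict(i):
    """Convert a printed voxel intensity to an activity concentration."""
    return a * i ** b
```

    In practice the fitted curve would be re-derived per printing mode and cartridge state, since the abstract notes that the calibration depends on both.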

  19. African Primary Care Research: Quantitative analysis and presentation of results

    PubMed Central

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435
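
    As a minimal illustration of the descriptive-statistics workflow the article covers, the sketch below summarizes one continuous and one categorical variable using only the Python standard library; the variable names and values are invented for the example:

```python
import statistics as st
from collections import Counter

# hypothetical study data
sbp = [118, 124, 131, 115, 142, 128, 119, 135]                # continuous (mmHg)
smoker = ["yes", "no", "no", "yes", "no", "no", "no", "yes"]  # categorical

# continuous variable: measures of location and spread
mean, median, sd = st.mean(sbp), st.median(sbp), st.stdev(sbp)

# categorical variable: frequencies and percentages for a results table
freq = Counter(smoker)
pct = {k: 100 * v / len(smoker) for k, v in freq.items()}
```

    The continuous summary would typically be reported as mean (SD) or median in text or a table, and the categorical one as n (%), matching the presentation advice in the article.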

  20. ASPECTS: an automation-assisted SPE method development system.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu

    2013-07-01

    A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting them to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
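
    The fourth step, comparing quantitative results across all condition combinations, reduces to ranking recoveries and picking the best combination. A minimal sketch, with hypothetical sorbents, eluents, and peak areas:

```python
# hypothetical peak areas for the analyte under each sorbent/eluent
# combination, relative to a 100%-recovery reference area
reference_area = 2.0e5
areas = {
    ("C18", "MeOH"): 1.52e5,
    ("C18", "ACN"): 1.71e5,
    ("HLB", "MeOH"): 1.88e5,
    ("HLB", "ACN"): 1.64e5,
}

# percent recovery per condition, then the best condition for qualification
recovery = {cond: 100 * a / reference_area for cond, a in areas.items()}
best = max(recovery, key=recovery.get)
```

    An automation layer would generate the `areas` table from instrument output rather than by hand, which is the part of the workflow the abstract says the system automates.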

  1. Heterophile antibody interference in qualitative urine/serum hCG devices: Case report.

    PubMed

    Patel, Khushbu K; Gronowski, Ann M

    2016-06-01

    This case report investigates the origin of a false positive result on a serum qualitative human chorionic gonadotropin (hCG) device. A 46-year-old woman diagnosed with chronic myeloid leukemia presented with nausea and vomiting. A qualitative serum hCG test was interpreted as positive; however, a quantitative serum hCG test was negative (<5 IU/L). To further investigate this discrepancy, the sample was pretreated with heterophilic blocking reagent (HBR). Additionally, the sample was tested on other qualitative hCG devices composed of antibodies from different animal sources. Blocking reagent from an automated quantitative immunoassay was also tested for its ability to inhibit the heterophile antibody interference. The qualitative test result was negative after pretreatment with HBR. Other devices composed of antibodies from different animal sources also demonstrated mixed results with the patient's sample. Blocking reagent obtained from the automated quantitative assay inhibited the heterophile antibody interference in the patient's sample. This case demonstrates that positive serum point-of-care hCG results should be interpreted with caution and confirmed with a quantitative serum hCG immunoassay when clinical suspicion is raised. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. Thermal Nondestructive Characterization of Corrosion in Boiler Tubes by Application of a Moving Line Heat Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow, which has resulted in a "spot check" approach to inspections, with thickness measurements made over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper presents a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique is presented, demonstrating its quantitative nature. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions are presented.

  3. Visual investigation on the heat dissipation process of a heat sink by using digital holographic interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun

    2013-11-21

    We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution and variation trend of the air field surrounding the heat sink during the heat dissipation process are numerically reconstructed based on double-exposure holographic interferometry. Using the phase unwrapping algorithm and the derived relationship between temperature and the phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with reasonably high measurement accuracy. The impact of the heat sink's channel width on heat dissipation performance under natural convection is then analyzed. In addition, a comparison between simulation and experiment results is given to verify the reliability of this method. The experiment results confirm the feasibility and validity of the presented method for full-field, dynamic, and quantitative measurement of the air field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.
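
    The phase-to-temperature conversion rests on the phase change being proportional to the refractive-index change along the beam, with the index of air tied to density and hence, at constant pressure, inversely to absolute temperature. The sketch below assumes a Gladstone-Dale/ideal-gas relation and illustrative values for wavelength, path length, and ambient conditions; none of these numbers come from the paper:

```python
import math

LAM = 632.8e-9   # assumed laser wavelength [m]
L = 0.05         # assumed geometric path through the air field [m]
T0 = 293.15      # assumed ambient temperature [K]
N0 = 1.000271    # assumed refractive index of air at T0

def temperature_from_phase(dphi):
    """Invert n(T) - 1 = (N0 - 1) * T0 / T, with the index change
    obtained from the measured phase change of the detection beam."""
    dn = dphi * LAM / (2 * math.pi * L)  # phase change -> index change
    return T0 / (1 + dn / (N0 - 1))

# heated air is less dense, so both the index and the phase drop (dphi < 0)
t_hot = temperature_from_phase(-10.0)
```

    Applying this pixel by pixel to an unwrapped double-exposure phase map yields a full-field temperature distribution of the kind the paper describes.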

  4. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra at a time, and they are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to being able to analyze large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, so it should run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
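
    The batch-integration core of such a tool is small: apply the same saved integration regions to every spectrum and emit one CSV row per spectrum. A sketch (in Python rather than the software's Java, purely for brevity) with invented spectra as lists of (ppm, intensity) points:

```python
import csv
import io

def integrate(spectrum, lo, hi):
    """Sum intensities of all points falling inside a ppm window."""
    return sum(y for x, y in spectrum if lo <= x <= hi)

def batch_integrate(spectra, regions):
    """One row per spectrum: name followed by one integral per region."""
    return [[name] + [integrate(spec, lo, hi) for lo, hi in regions]
            for name, spec in spectra.items()]

# hypothetical spectra and pre-saved integration regions
spectra = {"s1": [(1.0, 2.0), (1.1, 3.0), (3.0, 5.0)],
           "s2": [(1.05, 4.0), (3.1, 1.0)]}
regions = [(0.9, 1.2), (2.9, 3.2)]
rows = batch_integrate(spectra, regions)

# write CSV for spreadsheet or Matlab import
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["spectrum", "region1", "region2"])
writer.writerows(rows)
```

    The CSV output is what makes downstream analysis in a spreadsheet or Matlab straightforward, which is the workflow the abstract emphasizes.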

  5. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
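
    One way to picture such a pseudometric: compare two maps only through the distribution of their pixel values, e.g. the L1 difference of normalized histograms. Distinct maps with identical density distributions then sit at distance zero, which is exactly what makes it a pseudometric rather than a metric. This is an illustrative stand-in, not the output functions defined in the paper:

```python
def density_pseudometric(map_a, map_b, nbins=8):
    """L1 distance between normalized histograms of pixel values.
    Insensitive to pixel arrangement: a pseudometric, not a metric."""
    lo = min(min(map_a), min(map_b))
    hi = max(max(map_a), max(map_b))
    width = (hi - lo) / nbins or 1.0  # guard against constant maps

    def hist(m):
        h = [0.0] * nbins
        for v in m:
            h[min(int((v - lo) / width), nbins - 1)] += 1.0 / len(m)
        return h

    ha, hb = hist(map_a), hist(map_b)
    return 0.5 * sum(abs(p - q) for p, q in zip(ha, hb))
```

    Computing such a distance for every pair of cloud maps yields the pairwise comparisons needed to order the sample by a chosen characteristic.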

  6. Inverse transport problems in quantitative PAT for molecular imaging

    NASA Astrophysics Data System (ADS)

    Ren, Kui; Zhang, Rongting; Zhong, Yimin

    2015-12-01

    Fluorescence photoacoustic tomography (fPAT) is a molecular imaging modality that combines photoacoustic tomography with fluorescence imaging to obtain high-resolution imaging of fluorescence distributions inside heterogeneous media. The objective of this work is to study inverse problems in the quantitative step of fPAT where we intend to reconstruct physical coefficients in a coupled system of radiative transport equations using internal data recovered from ultrasound measurements. We derive uniqueness and stability results on the inverse problems and develop some efficient algorithms for image reconstructions. Numerical simulations based on synthetic data are presented to validate the theoretical analysis. The results we present here complement those in Ren K and Zhao H (2013 SIAM J. Imaging Sci. 6 2024-49) on the same problem but in the diffusive regime.

  7. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for monitoring disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the levels of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified to use the LI-COR® detection system (fluorescence based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily with the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints on sample sizes and/or budget. PMID:25022797

  8. Photothermal quantitative phase imaging of living cells with nanoparticles utilizing a cost-efficient setup

    NASA Astrophysics Data System (ADS)

    Turko, Nir A.; Isbach, Michael; Ketelhut, Steffi; Greve, Burkhard; Schnekenburger, Jürgen; Shaked, Natan T.; Kemper, Björn

    2017-02-01

    We explored photothermal quantitative phase imaging (PTQPI) of living cells with functionalized nanoparticles (NPs) utilizing a cost-efficient setup based on a cell culture microscope. The excitation light was modulated by a mechanical chopper wheel with low frequencies. Quantitative phase imaging (QPI) was performed with Michelson interferometer-based off-axis digital holographic microscopy and a standard industrial camera. We present results from PTQPI observations on breast cancer cells that were incubated with functionalized gold NPs binding to the epidermal growth factor receptor. Moreover, QPI was used to quantify the impact of the NPs and the low frequency light excitation on cell morphology and viability.

  9. Diffraction enhanced x-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials, or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods of this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  10. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-poly-ethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor-to-non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristic (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to the ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, visual analysis combined with semi-quantitative analysis yielded higher sensitivity, specificity and accuracy in diagnosing primary breast cancer: 87.5%, 82.9%, and 85.4%, respectively, with an area under the curve of 0.891. The results of the present study suggest that the semi-quantitative and visual analyses gave statistically similar results, while the semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. Our results also suggest that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
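
    The sensitivity and specificity quoted for a given T/N cut-off follow from a simple confusion-matrix count. The ratios and labels below are invented to illustrate the computation; they are not the study's data:

```python
def sens_spec(ratios, labels, cutoff):
    """Sensitivity and specificity of calling 'malignant' when
    T/N ratio >= cutoff (labels: 1 = malignant, 0 = benign)."""
    tp = sum(1 for r, y in zip(ratios, labels) if r >= cutoff and y == 1)
    fn = sum(1 for r, y in zip(ratios, labels) if r < cutoff and y == 1)
    tn = sum(1 for r, y in zip(ratios, labels) if r < cutoff and y == 0)
    fp = sum(1 for r, y in zip(ratios, labels) if r >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical T/N ratios with pathology labels
ratios = [2.5, 1.8, 2.2, 1.5, 2.8, 1.9]
labels = [1, 0, 1, 0, 1, 1]
sens, spec = sens_spec(ratios, labels, 2.01)
```

    Sweeping the cutoff over all observed ratios and plotting (1 - specificity, sensitivity) traces the ROC curve used to compare the two readouts.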

  11. Colorimetric Determination of pH.

    ERIC Educational Resources Information Center

    Tucker, Sheryl; And Others

    1989-01-01

    Presented is an activity in which the pH of a solution can be quantitatively measured using a spectrophotometer. The theory, experimental details, sample preparation and selection, instrumentation, and results are discussed. (CW)
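
    The calculation behind such a colorimetric measurement can be sketched with Beer's law and the Henderson-Hasselbalch equation: the absorbance at a wavelength where each indicator form absorbs gives that form's concentration, and their ratio gives the pH. The function name, molar absorptivities, and pKa below are assumptions for illustration, not values from the activity:

```python
import math

def ph_from_absorbance(a_basic, a_acidic, eps_basic, eps_acidic, pka,
                       path_cm=1.0):
    """pH from two absorbance readings of an acid-base indicator.
    Beer's law (A = eps * l * c) yields each form's concentration;
    Henderson-Hasselbalch then gives pH = pKa + log10([In-]/[HIn])."""
    c_in = a_basic / (eps_basic * path_cm)     # basic form, [In-]
    c_hin = a_acidic / (eps_acidic * path_cm)  # acidic form, [HIn]
    return pka + math.log10(c_in / c_hin)
```

    When the two forms are present in equal concentrations, the log term vanishes and the measured pH equals the indicator's pKa, a useful sanity check for students.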

  12. Quantitative analysis of peel-off degree for printed electronics

    NASA Astrophysics Data System (ADS)

    Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo

    2018-02-01

    We suggest a facile methodology for evaluating the degree of peel-off in printed electronics by image processing. The peeled and printed areas were quantified using open-source programs. To verify the accuracy of the method, we manually removed areas from the printed circuit and measured them, obtaining 96.3% accuracy. The sintered patterns showed a decreasing tendency as the energy density of the infrared lamp increased, and the peel-off degree increased; a comparison between the two results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
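
    The area quantification step can be sketched as simple thresholding and pixel counting. The assumption that bright pixels mark removed ink, and the threshold value itself, are illustrative choices, not details from the paper:

```python
def peel_off_degree(image, threshold=128):
    """Percentage of pattern pixels judged 'peeled' by thresholding.
    image: 2D list of grayscale values; pixels at or above the
    threshold are assumed to be bare substrate (ink removed)."""
    peeled = total = 0
    for row in image:
        for px in row:
            total += 1
            if px >= threshold:
                peeled += 1
    return 100.0 * peeled / total

# toy 2x2 "image": two peeled pixels out of four
degree = peel_off_degree([[0, 200], [255, 10]])
```

    Comparing this automated percentage against a manually measured removed area is the kind of accuracy check the abstract reports.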

  13. Determination of trace element mineral/liquid partition coefficients in melilite and diopside by ion and electron microprobe techniques

    NASA Technical Reports Server (NTRS)

    Kuehner, S. M.; Laughlin, J. R.; Grossman, L.; Johnson, M. L.; Burnett, D. S.

    1989-01-01

    The applicability of the ion microprobe (IMP) for quantitative analysis of minor elements (Sr, Y, Zr, La, Sm, and Yb) in the major phases present in natural Ca-, Al-rich inclusions (CAIs) was investigated by comparing IMP results with those of an electron microprobe (EMP). Results on three trace-element-doped glasses indicated that it is not possible to obtain precise quantitative analyses using the IMP if there are large differences in SiO2 content between the standards used to derive the ion yields and the unknowns.

  14. Diagnosis of breast cancer biopsies using quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Kandel, Mikhail E.; Han, Kevin; Luo, Zelun; Macias, Virgilia; Tangella, Krishnarao; Balla, Andre; Popescu, Gabriel

    2015-03-01

    The standard practice in the histopathology of breast cancers is to examine a hematoxylin and eosin (H&E) stained tissue biopsy under a microscope. The pathologist looks at certain morphological features, visible under the stain, to diagnose whether a tumor is benign or malignant. This determination is made based on qualitative inspection, making it subject to investigator bias. Furthermore, since this method requires microscopic examination by the pathologist, it suffers from low throughput. A quantitative, label-free and high-throughput method for detecting these morphological features in images of tissue biopsies is, hence, highly desirable, as it would assist the pathologist in making a quicker and more accurate diagnosis of cancers. We present here preliminary results showing the potential of using quantitative phase imaging for breast cancer screening and to help with differential diagnosis. We generated optical path length maps of unstained breast tissue biopsies using Spatial Light Interference Microscopy (SLIM). As a first step towards diagnosis based on quantitative phase imaging, we carried out a qualitative evaluation of the imaging resolution and contrast of our label-free phase images. These images were shown to two pathologists who marked the tumors present in tissue as either benign or malignant. This diagnosis was then compared against the diagnosis of the two pathologists on H&E stained tissue images and the number of agreements was counted. In our experiment, the agreement between SLIM and H&E based diagnosis was measured to be 88%. Our preliminary results demonstrate the potential and promise of SLIM for a push in the future towards quantitative, label-free and high-throughput diagnosis.

  15. Bayesian aggregation versus majority vote in the characterization of non-specific arm pain based on quantitative needle electromyography

    PubMed Central

    2010-01-01

    Background Methods for the calculation and application of quantitative electromyographic (EMG) statistics for the characterization of EMG data detected from forearm muscles of individuals with and without pain associated with repetitive strain injury are presented. Methods A classification procedure using a multi-stage application of Bayesian inference is presented that characterizes a set of motor unit potentials acquired using needle electromyography. The utility of this technique in characterizing EMG data obtained from both normal individuals and those presenting with symptoms of "non-specific arm pain" is explored and validated. The efficacy of the Bayesian technique is compared with simple voting methods. Results The aggregate Bayesian classifier presented is found to perform with accuracy equivalent to that of majority voting on the test data, with an overall accuracy greater than 0.85. Theoretical foundations of the technique are discussed, and are related to the observations found. Conclusions Aggregation of motor unit potential conditional probability distributions estimated using quantitative electromyographic analysis, may be successfully used to perform electrodiagnostic characterization of "non-specific arm pain." It is expected that these techniques will also be able to be applied to other types of electrodiagnostic data. PMID:20156353
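
    The contrast between the two aggregation rules can be sketched on toy per-motor-unit-potential posteriors. The class names and probabilities are invented; the Bayesian rule shown here is a naive log-probability sum under assumed independence and equal priors, a simplified stand-in for the multi-stage procedure in the paper:

```python
import math

def majority_vote(posteriors):
    """Each motor unit potential votes for its most probable class."""
    votes = [max(p, key=p.get) for p in posteriors]
    return max(set(votes), key=votes.count)

def bayes_aggregate(posteriors, classes=("normal", "nsap")):
    """Combine per-MUP posteriors by summing log-probabilities
    (independence and equal priors assumed)."""
    score = {c: sum(math.log(p[c]) for p in posteriors) for c in classes}
    return max(score, key=score.get)

# two weakly 'normal' MUPs vs one highly confident 'nsap' MUP
posteriors = [{"normal": 0.6, "nsap": 0.4},
              {"normal": 0.6, "nsap": 0.4},
              {"normal": 0.01, "nsap": 0.99}]
```

    A confident minority can flip the aggregate decision under the Bayesian rule while majority voting ignores confidence, which is why the two approaches can disagree on individual characterizations even when their overall accuracies are similar.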

  16. 75 FR 373 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...

  17. A Teacher's Exploratory Inquiry of Language Awareness: Language Learner Perceptions from Oral Presentations

    ERIC Educational Resources Information Center

    Leichsenring, Andrew

    2015-01-01

    This paper presents a teacher-led inquiry into learner language awareness and learner perceptions of: oral presentations using first language (L1) support when using a second language (L2); and L2 learner and user identity. The quantitative-based results of this preliminary inquiry represent a source of understanding for the researcher, who later,…

  18. Comparison of high-performance liquid chromatography and supercritical fluid chromatography using evaporative light scattering detection for the determination of plasticizers in medical devices.

    PubMed

    Lecoeur, Marie; Decaudin, Bertrand; Guillotin, Yoann; Sautou, Valérie; Vaccher, Claude

    2015-10-23

    Recently, interest in supercritical fluid chromatography (SFC) has increased due to its high throughput and the development of new systems with improved chromatographic performance. However, most papers have dealt with fundamental studies and chiral applications, and only a few works have described the validation process for SFC methods. Likewise, evaporative light scattering detection (ELSD) has been widely employed in liquid chromatography, but only a few recent works have examined its quantitative performance when hyphenated with an SFC apparatus. The present paper discusses the quantitative performance of SFC-ELSD compared to HPLC-ELSD for the determination of plasticizers (ATBC, DEHA, DEHT and TOTM) in PVC tubing used in medical devices. After the development of the HPLC-ELSD method, both methods were evaluated based on the total error approach using accuracy profiles. The results show that HPLC-ELSD was more precise than SFC-ELSD, but lower limits of quantitation were obtained by SFC. Hence, HPLC was validated within the ±10% acceptance limits, whereas SFC lacked the accuracy to quantify the plasticizers. Finally, both methods were used to determine the composition of plasticized-PVC medical devices. The results demonstrated that SFC and HPLC, both hyphenated with ELSD, provided similar results. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Recovery and Determination of Adsorbed Technetium on Savannah River Site Charcoal Stack Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lahoda, Kristy G.; Engelmann, Mark D.; Farmer, Orville T.

    2008-03-01

    Experimental results are provided for the analyses of technetium (Tc) in charcoal samples placed in-line with a Savannah River Site (SRS) processing stack effluent stream as part of an environmental surveillance program. The method for Tc removal from charcoal was based on one originally developed with high-purity charcoal. Presented is the process that allowed for the quantitative analysis of 99Tc in SRS charcoal stack samples with and without 97Tc as a tracer. The results obtained with the method using the 97Tc tracer quantitatively confirm the results obtained with no tracer added. All samples contain 99Tc at the pg g-1 level.

  20. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few spectra at a time, and they are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to being able to analyze large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, so it should run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Descriptive statistics.

    PubMed

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.

  2. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  3. Breast cancer diagnosis using spatial light interference microscopy

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Kandel, Mikhail E.; Han, Kevin; Luo, Zelun; Macias, Virgilia; Tangella, Krishnarao; Balla, Andre; Popescu, Gabriel

    2015-11-01

    The standard practice in histopathology of breast cancers is to examine a hematoxylin and eosin (H&E) stained tissue biopsy under a microscope to diagnose whether a lesion is benign or malignant. This determination is made by manual, qualitative inspection, making it subject to investigator bias and resulting in low throughput. Hence, a quantitative, label-free, and high-throughput diagnosis method is highly desirable. We present here preliminary results showing the potential of quantitative phase imaging for breast cancer screening and differential diagnosis. We generated phase maps of unstained breast tissue biopsies using spatial light interference microscopy (SLIM). As a first step toward quantitative diagnosis based on SLIM, we carried out a qualitative evaluation of our label-free images. These images were shown to two pathologists, who classified each case as either benign or malignant. This diagnosis was then compared against the diagnosis of the same two pathologists on corresponding H&E stained tissue images, and the number of agreements was counted. The agreement between SLIM-based and H&E-based diagnosis was 88% for the first pathologist and 87% for the second. Our results demonstrate the potential and promise of SLIM for quantitative, label-free, and high-throughput diagnosis.

  4. Automatic Gleason grading of prostate cancer using quantitative phase imaging and machine learning

    NASA Astrophysics Data System (ADS)

    Nguyen, Tan H.; Sridharan, Shamira; Macias, Virgilia; Kajdacsy-Balla, Andre; Melamed, Jonathan; Do, Minh N.; Popescu, Gabriel

    2017-03-01

    We present an approach for automatic diagnosis of tissue biopsies. Our methodology consists of a quantitative phase imaging tissue scanner and machine learning algorithms to process these data. We illustrate the performance by automatic Gleason grading of prostate specimens. The imaging system operates on the principle of interferometry and, as a result, reports on the nanoscale architecture of the unlabeled specimen. We use these data to train a random forest classifier to learn textural behaviors of prostate samples and classify each pixel in the image into different classes. Automatic diagnosis results were computed from the segmented regions. By combining morphological features with quantitative information from the glands and stroma, logistic regression was used to discriminate regions with Gleason grade 3 versus grade 4 cancer in prostatectomy tissue. The overall accuracy of this classification derived from a receiver operating characteristic (ROC) curve was 82%, which is in the range of human error when interobserver variability is considered. We anticipate that our approach will provide a clinically objective and quantitative metric for Gleason grading, allowing us to corroborate results across instruments and laboratories and feed the computer algorithms for improved accuracy.
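The 82% figure above is read off a receiver operating characteristic curve. As a hedged sketch of that style of evaluation (the scores and labels below are invented; the paper's classifier and features are not reproduced), the area under the ROC curve can be computed directly from per-region scores via the rank-sum identity:

```python
def roc_auc(scores, labels):
    """AUC = P(score of a random positive exceeds score of a random
    negative), counting ties as 1/2 (the Mann-Whitney identity)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Perfectly separated scores give AUC = 1.0; one swapped pair lowers it.
auc_perfect = roc_auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
auc_mixed = roc_auc([0.9, 0.6, 0.4, 0.2], [1, 0, 1, 0])
```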

  5. Interpretation of Negative Molecular Test Results in Patients With Suspected or Confirmed Ebola Virus Disease: Report of Two Cases.

    PubMed

    Edwards, Jeffrey K; Kleine, Christian; Munster, Vincent; Giuliani, Ruggero; Massaquoi, Moses; Sprecher, Armand; Chertow, Daniel S

    2015-12-01

    Quantitative reverse-transcription polymerase chain reaction (qRT-PCR) is the most sensitive quantitative diagnostic assay for detection of Ebola virus in multiple body fluids. Despite the strengths of this assay, we present 2 cases of Ebola virus disease (EVD) and highlight the potential for false-negative results during the early and late stages of EVD. The first case emphasizes the low negative-predictive value of qRT-PCR during incubation and the early febrile stage of EVD, and the second case emphasizes the potential for false-negative results during recovery and late neurologic complications of EVD. Careful interpretation of test results is needed to guide difficult admission and discharge decisions in suspected or confirmed EVD.

  6. Simulation of UV atomic radiation for application in exhaust plume spectrometry

    NASA Astrophysics Data System (ADS)

    Wallace, T. L.; Powers, W. T.; Cooper, A. E.

    1993-06-01

    Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.

  7. Thermographic Imaging of Material Loss in Boiler Water-Wall Tubing by Application of Scanning Line Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Localized wall thinning due to corrosion in utility boiler water-wall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proven to be very manpower and time intensive. This has resulted in a spot check approach to inspections, documenting thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed for large structures such as boiler water-walls. A theoretical basis for the technique will be presented which explains the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of applying this technology to actual water-wall tubing samples and in situ inspections will be presented.

  8. Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); Application to Schizophrenia.

    PubMed

    Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N

    2017-06-01

    Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding facial emotion processing impairments that may lead to poor social outcome in schizophrenia. We henceforth call it a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control facial expressions of an avatar, with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing in-depth understanding of the emotion recognition mechanism of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine the processing of emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. The preliminary results underscore the utility of such a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.

  9. Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)

    PubMed Central

    Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; M. Belitsky, Jason; Umbanhowar, Charles; Overvoorde, Paul J.

    2017-01-01

    Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative skills of undergraduate students within a biological context. The instrument was developed by an interdisciplinary team of educators and aligns with skills included in national reports such as BIO2010, Scientific Foundations for Future Physicians, and Vision and Change. Undergraduate biology educators also confirmed the importance of items included in the instrument. The current version of the BioSQuaRE was developed through an iterative process using data from students at 12 postsecondary institutions. A psychometric analysis of these data provides multiple lines of evidence for the validity of inferences made using the instrument. Our results suggest that the BioSQuaRE will prove useful to faculty and departments interested in helping students acquire the quantitative competencies they need to successfully pursue biology, and useful to biology students by communicating the importance of quantitative skills. We invite educators to use the BioSQuaRE at their own institutions. PMID:29196427

  10. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method specified by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicated no significant difference between the two methods. qNMR is a simple and accurate quantitative tool that requires no compound-specific reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
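The internal-standard quantitation underlying qNMR follows a standard relation between peak areas, proton counts, molar masses, and weighed masses. A sketch with purely hypothetical numbers (the molar masses are nominal literature values; none of the figures come from the paper):

```python
def qnmr_purity(i_a, i_s, n_a, n_s, mw_a, mw_s, w_a, w_s, p_s):
    """Internal-standard qNMR purity:
    P_a = (I_a/I_s) * (N_s/N_a) * (MW_a/MW_s) * (w_s/w_a) * P_s
    I: integrated peak area, N: protons in the quantified signal,
    MW: molar mass, w: weighed mass, P: purity."""
    return (i_a / i_s) * (n_s / n_a) * (mw_a / mw_s) * (w_s / w_a) * p_s

# Hypothetical assay: octreotide (~1019.24 g/mol) quantified against
# gemcitabine hydrochloride (~299.66 g/mol) as internal standard.
purity = qnmr_purity(i_a=1.02, i_s=1.00, n_a=2, n_s=2,
                     mw_a=1019.24, mw_s=299.66,
                     w_a=35.0, w_s=10.0, p_s=0.999)
```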

  11. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  12. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  13. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  14. Ultrasound introscopic image quantitative characteristics for medical diagnosis

    NASA Astrophysics Data System (ADS)

    Novoselets, Mikhail K.; Sarkisov, Sergey S.; Gridko, Alexander N.; Tcheban, Anatoliy K.

    1993-09-01

    Results on the computer-aided extraction of quantitative characteristics (QCs) of ultrasound introscopic images for medical diagnosis are presented. Thyroid gland (TG) images of Chernobyl accident victims are considered. It is shown that TG diseases can be associated with particular values of selected QCs of the random echo distribution in the image. The possibility of using these calculated QC values for recognizing TG diseases is analyzed. The role of speckle noise elimination in the solution of the TG diagnosis problem is also considered.

  15. Quantitative Articles: Developing Studies for Publication in Counseling Journals

    ERIC Educational Resources Information Center

    Trusty, Jerry

    2011-01-01

    This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…

  16. Monitoring of toxic elements present in sludge of industrial waste using CF-LIBS.

    PubMed

    Kumar, Rohit; Rai, Awadhesh K; Alamelu, Devanathan; Aggarwal, Suresh K

    2013-01-01

    Industrial waste is one of the main causes of environmental pollution. Laser-induced breakdown spectroscopy (LIBS) was applied to detect toxic metals in the sludge of industrial waste water. Sludge on filter paper was obtained after filtering waste water samples collected from different sections of a water treatment plant situated in an industrial area of Kanpur City. The LIBS spectra of the sludge samples were recorded in the spectral range of 200 to 500 nm by focusing the laser light on the sludge. The calibration-free laser-induced breakdown spectroscopy (CF-LIBS) technique was used for the quantitative measurement of toxic elements such as Cr and Pb present in the sample. We also used the traditional calibration curve approach to quantify these elements. The results obtained from CF-LIBS are in good agreement with those from the calibration curve approach. Thus, our results demonstrate that CF-LIBS is an appropriate technique for quantitative analysis where reference/standard samples are not available for constructing a calibration curve. The results of the present experiment are alarming for people living in areas near these industrial activities, as the concentrations of toxic elements are well above the admissible limits of these substances.
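The "traditional calibration curve approach" mentioned above amounts to fitting line intensity against standards of known concentration and inverting the fit for the unknown; a minimal least-squares sketch (synthetic numbers, not the paper's data):

```python
def fit_calibration(conc, intensity):
    """Ordinary least-squares fit of intensity = a * conc + b."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(intensity) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(conc, intensity))
         / sum((x - mx) ** 2 for x in conc))
    b = my - a * mx
    return a, b

def predict_conc(a, b, intensity):
    """Invert the calibration line to estimate an unknown concentration."""
    return (intensity - b) / a

# Synthetic standards: intensity rises linearly with concentration.
a, b = fit_calibration([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
unknown = predict_conc(a, b, 5.0)
```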

  17. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modeled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
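The contrast recovery and noise figures of merit referenced above admit several definitions in the PET literature; one common form (an assumption on our part, since the abstract does not spell out its formulas) is:

```python
def contrast_recovery(mean_roi, mean_bkg, true_ratio):
    """Contrast recovery coefficient: measured contrast over true
    contrast, one common (though not unique) definition."""
    return (mean_roi / mean_bkg - 1.0) / (true_ratio - 1.0)

def image_roughness(bkg_voxels):
    """Image roughness: background standard deviation normalized by
    the background mean (a common noise definition)."""
    n = len(bkg_voxels)
    mean = sum(bkg_voxels) / n
    var = sum((v - mean) ** 2 for v in bkg_voxels) / (n - 1)
    return var ** 0.5 / mean

# Illustrative values: a tumour measured at 3.4x background when the
# true uptake ratio is 4:1 recovers 80% of the true contrast.
crc = contrast_recovery(3.4, 1.0, 4.0)
ir = image_roughness([0.9, 1.0, 1.1])
```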

  18. Rapid Trace Detection and Isomer Quantitation of Pesticide Residues via Matrix-Assisted Laser Desorption/Ionization Fourier Transform Ion Cyclotron Resonance Mass Spectrometry.

    PubMed

    Wu, Xinzhou; Li, Weifeng; Guo, Pengran; Zhang, Zhixiang; Xu, Hanhong

    2018-04-18

    Matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry (MALDI-FTICR-MS) has been applied for rapid, sensitive, unambiguous, and quantitative detection of pesticide residues on fresh leaves with little sample pretreatment. Various pesticides (insecticides, bactericides, herbicides, and acaricides) are detected directly in the complex matrix with excellent limits of detection down to 4 μg/L. FTICR-MS could unambiguously identify pesticides with tiny mass differences (∼0.01775 Da), thereby avoiding false-positive results. Remarkably, pesticide isomers can be fully discriminated by use of diagnostic fragments, and quantitative analysis of pesticide isomers is demonstrated. The present results expand the horizons of the MALDI-FTICR-MS platform in the reliable determination of pesticides, with the integrated advantages of ultrahigh mass resolution and accuracy. This method can provide further evidence of the detrimental effects of pesticide residues, expediting the identification and evaluation of innovative pesticides.

  19. Pansharpening on the Narrow Vnir and SWIR Spectral Bands of SENTINEL-2

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.; Karantzalos, K.

    2016-06-01

    In this paper, results from the evaluation of several state-of-the-art pansharpening techniques are presented for the VNIR and SWIR bands of Sentinel-2. A pansharpening procedure is also proposed that respects the closest spectral similarities between the higher and lower resolution bands. The evaluation included 21 different fusion algorithms and three evaluation frameworks, based on both standard quantitative image similarity indexes and qualitative evaluation by remote sensing experts. The overall analysis of the evaluation results indicated that the remote sensing experts disagreed with the outcomes and method rankings of the quantitative assessment. The employed image quality similarity indexes and the quantitative evaluation frameworks from the literature, based on both full and reduced resolution data, failed to adequately evaluate the spatial information that was injected into the lower resolution images. Regarding the SWIR bands, none of the methods delivered significantly better results than a standard bicubic interpolation of the original low resolution bands.

  20. Fourier phase in Fourier-domain optical coherence tomography.

    PubMed

    Uttam, Shikhar; Liu, Yang

    2015-12-01

    Phase of an electromagnetic wave propagating through a sample-of-interest is well understood in the context of quantitative phase imaging in transmission-mode microscopy. In the past decade, Fourier-domain optical coherence tomography has been used to extend quantitative phase imaging to the reflection-mode. Unlike transmission-mode electromagnetic phase, however, the origin and characteristics of reflection-mode Fourier phase are poorly understood, especially in samples with a slowly varying refractive index. In this paper, the general theory of Fourier phase from first principles is presented, and it is shown that Fourier phase is a joint estimate of subresolution offset and mean spatial frequency of the coherence-gated sample refractive index. It is also shown that both spectral-domain phase microscopy and depth-resolved spatial-domain low-coherence quantitative phase microscopy are special cases of this general theory. Analytical expressions are provided for both, and simulations are presented to explain and support the theoretical results. These results are further used to show how Fourier phase allows the estimation of an axial mean spatial frequency profile of the sample, along with depth-resolved characterization of localized optical density change and sample heterogeneity. Finally, a Fourier phase-based explanation of Doppler optical coherence tomography is also provided.

  1. Fuzzy Performance between Surface Fitting and Energy Distribution in Turbulence Runner

    PubMed Central

    Liang, Zhongwei; Liu, Xiaochu; Ye, Bangyan; Brauwer, Richard Kars

    2012-01-01

    Because the application of surface fitting algorithms exerts a considerable fuzzy influence on the mathematical features of kinetic energy distribution, their relation mechanism under different external conditional parameters must be quantitatively analyzed. After determining the kinetic energy value at each selected representative coordinate point, several typical complicated-surface-fitting algorithms are applied to construct micro kinetic energy distribution surface models of the objective turbulence runner from the obtained kinetic energy values. On the basis of newly proposed mathematical features, we construct a fuzzy evaluation data sequence and present a new three-dimensional fuzzy quantitative evaluation method. The value change tendencies of the kinetic energy distribution surface features can then be clearly quantified, and the fuzzy performance mechanism relating the results of the surface fitting algorithms, the spatial features of the turbulence kinetic energy distribution surface, and their respective environmental parameter conditions can be analyzed quantitatively in detail. This yields conclusions concerning the inherent turbulence kinetic energy distribution mechanism and its mathematical relations, enabling further quantitative study of turbulence energy. PMID:23213287

  2. On sweat analysis for quantitative estimation of dehydration during physical exercise.

    PubMed

    Ring, Matthias; Lohmueller, Clemens; Rauh, Manfred; Eskofier, Bjoern M

    2015-08-01

    Quantitative estimation of water loss during physical exercise is of importance because dehydration can impair both muscular strength and aerobic endurance. A physiological indicator for deficit of total body water (TBW) might be the concentration of electrolytes in sweat. It has been shown that concentrations differ after physical exercise depending on whether water loss was replaced by fluid intake or not. However, to the best of our knowledge, this fact has not been examined for its potential to quantitatively estimate TBW loss. Therefore, we conducted a study in which sweat samples were collected continuously during two hours of physical exercise without fluid intake. A statistical analysis of these sweat samples revealed significant correlations between chloride concentration in sweat and TBW loss (r = 0.41, p < 0.01), and between sweat osmolality and TBW loss (r = 0.43, p < 0.01). A quantitative estimation of TBW loss resulted in a mean absolute error of 0.49 L per estimation. Although the precision has to be improved for practical applications, the present results suggest that TBW loss estimation could be realizable using sweat samples.
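The reported r values are Pearson correlation coefficients; for reference, the statistic can be computed as follows (the sample data are invented for illustration, not the study's measurements):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Perfectly linear data correlate at +1 (or -1 when decreasing).
r_up = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
r_down = pearson_r([1.0, 2.0, 3.0], [6.0, 4.0, 2.0])
```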

  3. A Quantitative Comparison of Leading-edge Vortices in Incompressible and Supersonic Flows

    NASA Technical Reports Server (NTRS)

    Wang, F. Y.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2002-01-01

    When quantitative data on delta-wing vortices are required for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of data, owing to difficulties that plague measurement techniques in high-speed flows. In the present paper an attempt is made to examine this practice by comparing quantitative data on the near-wake properties of such vortices in incompressible and supersonic flows. The incompressible flow data are obtained in experiments conducted in a low-speed wind tunnel. Detailed flow-field properties, including vorticity and turbulence characteristics, obtained by hot-wire and pressure probe surveys are documented. These data are compared, wherever possible, with available data from a past work at Mach 2.49 for the same wing geometry and angles of attack. The results indicate that quantitative similarities exist in the distributions of total pressure and swirl velocity. However, the streamwise velocity of the core exhibits different trends. The axial flow characteristics of the vortices in the two regimes are examined, and a candidate theory is discussed.

  4. Models of volcanic eruption hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohletz, K.H.

    1992-01-01

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  5. Models of volcanic eruption hazards

    NASA Astrophysics Data System (ADS)

    Wohletz, K. H.

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  6. Quantitative electron density characterization of soft tissue substitute plastic materials using grating-based x-ray phase-contrast imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarapata, A.; Chabior, M.; Zanette, I.

    2014-10-15

    Many scientific research areas rely on accurate electron density characterization of various materials. For instance, in X-ray optics and radiation therapy there is a need for a fast and reliable technique to quantitatively characterize samples for electron density. We present how a precise measurement of electron density can be performed using an X-ray phase-contrast grating interferometer in a radiographic mode on a homogeneous sample in a controlled geometry. A batch of various plastic materials was characterized quantitatively and compared with calculated results. We found that the measured electron densities closely match theoretical values. The technique yields comparable results between a monochromatic and a polychromatic X-ray source. The measured electron densities can further be used to design dedicated X-ray phase contrast phantoms, and the additional information on small angle scattering should be taken into account in order to exclude unsuitable materials.
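The phase signal measured by such an interferometer relates to electron density through the standard refractive index decrement relation delta = r_e * lambda^2 * rho_e / (2 pi), with r_e the classical electron radius. A sketch of the conversion (the wavelength and the water value below are illustrative, not the paper's measurements):

```python
import math

R_E = 2.8179403262e-15  # classical electron radius, m

def electron_density(delta, wavelength):
    """Electron density rho_e (electrons / m^3) from the refractive
    index decrement delta, via delta = r_e * lambda^2 * rho_e / (2 pi)."""
    return 2.0 * math.pi * delta / (R_E * wavelength ** 2)

def decrement(rho_e, wavelength):
    """Inverse relation: delta from electron density."""
    return R_E * wavelength ** 2 * rho_e / (2.0 * math.pi)

# Illustrative: water (rho_e ~ 3.34e29 e/m^3) at lambda = 0.7 angstrom
# gives a decrement of order 1e-6, typical for hard X-rays.
delta_water = decrement(3.34e29, 0.7e-10)
```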

  7. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and complicate quantitative evaluation. To promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument during tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. MMM testing was found to detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated, showing that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on improved stress-strength interference theory. The reliability degree R gradually decreases with decreasing residual life ratio T, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
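    The classical stress-strength interference calculation that this kind of model builds on can be sketched as follows. For independent Gaussian stress and strength, the reliability is R = P(strength > stress) = Phi((mu_t - mu_s) / sqrt(sd_t^2 + sd_s^2)). The function name and numbers below are illustrative; the paper's "improved" model presumably refines this Gaussian baseline:

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Classical stress-strength interference: R = P(strength > stress)
    for independent Gaussian strength and stress distributions."""
    z = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return normal_cdf(z)

# Illustrative (hypothetical) numbers: mean strength well above mean stress
R = interference_reliability(500.0, 40.0, 350.0, 30.0)
print(round(R, 4))  # -> 0.9987
```

As the residual life ratio falls, the strength distribution drifts toward the stress distribution and the computed R decreases, matching the qualitative trend reported in the abstract.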

  8. Spontaneous Focusing on Quantitative Relations: Towards a Characterization

    ERIC Educational Resources Information Center

    Degrande, Tine; Verschaffel, Lieven; Van Dooren, Wim

    2017-01-01

    In contrast to previous studies on Spontaneous Focusing on Quantitative Relations (SFOR), the present study investigated not only the "extent" to which children focus on (multiplicative) quantitative relations, but also the "nature" of children's quantitative focus (i.e., the types of quantitative relations that children focus…

  9. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed at 12-25 kDa mass range with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  10. Hydrological and Climatic Significance of Martian Deltas

    NASA Astrophysics Data System (ADS)

    Di Achille, G.; Vaz, D. A.

    2017-10-01

    We (a) review the geomorphology, sedimentology, and mineralogy of the martian delta record and (b) present the results of a quantitative study of the hydrology and sedimentology of martian deltas using a modified version of the terrestrial model Sedflux.

  11. Portable low-coherence interferometry for quantitatively imaging fast dynamics with extended field of view

    NASA Astrophysics Data System (ADS)

    Shaked, Natan T.; Girshovitz, Pinhas; Frenklach, Irena

    2014-06-01

    We present our recent advances in the development of compact, highly portable and inexpensive wide-field interferometric modules. By a smart design of the interferometric system, including the usage of low-coherence illumination sources and common-path off-axis geometry of the interferometers, spatial and temporal noise levels of the resulting quantitative thickness profile can be sub-nanometric, while processing the phase profile in real time. In addition, due to novel experimentally-implemented multiplexing methods, we can capture low-coherence off-axis interferograms with a significantly extended field of view and at faster acquisition rates. Using these techniques, we quantitatively imaged rapid dynamics of live biological cells including sperm cells and unicellular microorganisms. Then, we demonstrated dynamic profiling during lithography processes of microscopic elements, with thicknesses that may vary from several nanometers to hundreds of microns. Finally, we present new algorithms for fast reconstruction (including digital phase unwrapping) of off-axis interferograms, which allow real-time processing at above video rate on regular single-core computers.

  12. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a prevalent question in astronomy; there are a variety of potentially important effects, including baryonic accretion from the intergalactic medium, as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations, and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. We present the main results of our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
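    Of the statistics named above, the Gini coefficient is the simplest to compute: it measures how unequally a galaxy's flux is distributed over its pixels. A minimal sketch, using the normalization common in the galaxy-morphology literature (a single bright pixel gives G = 1; the input lists are toy data, not simulation output):

```python
def gini(fluxes):
    """Gini coefficient of a pixel-flux distribution:
    0 = perfectly uniform light, 1 = all flux in one pixel."""
    x = sorted(abs(f) for f in fluxes)
    n = len(x)
    total = sum(x)
    if n < 2 or total == 0:
        return 0.0
    # Sorted-rank form with the (n - 1) normalization used for morphology
    return sum((2 * i - n - 1) * xi for i, xi in enumerate(x, start=1)) / (n - 1) / total

print(gini([1, 1, 1, 1]))    # perfectly even light -> 0.0
print(gini([0, 0, 0, 100]))  # one bright pixel    -> 1.0
```

Merger-driven clumpy light distributions push G upward, which is why G is paired with M20 as a merger diagnostic.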

  13. 3D quantitative analysis of early decomposition changes of the human face.

    PubMed

    Caplova, Zuzana; Gibelli, Daniele Maria; Poppa, Pasquale; Cummaudo, Marco; Obertova, Zuzana; Sforza, Chiarella; Cattaneo, Cristina

    2018-03-01

    Decomposition of the human body and human face is influenced, among other things, by environmental conditions. The early decomposition changes that modify the appearance of the face may hamper the recognition and identification of the deceased. Quantitative assessment of those changes may provide important information for forensic identification. This report presents a pilot 3D quantitative approach of tracking early decomposition changes of a single cadaver in controlled environmental conditions by summarizing the change with weekly morphological descriptions. The root mean square (RMS) value was used to evaluate the changes of the face after death. The results showed a high correlation (r = 0.863) between the measured RMS and the time since death. RMS values of each scan are presented, as well as the average weekly RMS values. The quantification of decomposition changes could improve the accuracy of antemortem facial approximation and potentially could allow the direct comparisons of antemortem and postmortem 3D scans.
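    The two quantities the abstract reports, an RMS deviation between aligned 3D scans and its Pearson correlation with time since death, are straightforward to sketch. The point lists and weekly values below are hypothetical, for illustration only:

```python
import math

def rms_deviation(scan, reference):
    """Root-mean-square deviation between corresponding points of two
    aligned 3D scans (lists of (x, y, z) tuples)."""
    sq = [sum((a - b) ** 2 for a, b in zip(p, q)) for p, q in zip(scan, reference)]
    return math.sqrt(sum(sq) / len(sq))

def pearson_r(xs, ys):
    """Pearson correlation, e.g. between weekly RMS values and time since death."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly RMS values drifting upward with post-mortem interval
weeks = [1, 2, 3, 4, 5]
rms = [0.8, 1.1, 1.9, 2.4, 3.1]
print(round(pearson_r(weeks, rms), 3))  # -> 0.993
```

A correlation of this size is what makes the RMS value usable as a proxy for elapsed post-mortem time.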

  14. Effect of Turbulence in Wind-Tunnel Measurements

    NASA Technical Reports Server (NTRS)

    Dryden, H L; Kuethe, A M

    1931-01-01

    This paper gives some quantitative measurements of wind tunnel turbulence and its effect on the air resistance of spheres and airship models, measurements made possible by the hot wire anemometer and associated apparatus. The apparatus in its original form was described in Technical Report No. 320, and some modifications are presented in an appendix to the present paper. One important result of the investigation is a curve by means of which measurements of the air resistance of spheres can be interpreted to give the turbulence quantitatively. Another is the definite proof that the discrepancies in the results on the N.P.L. standard airship models are due mainly to differences in the turbulence of the wind tunnels in which the tests were made. An attempt is made to interpret the observed results in terms of the boundary layer theory, and for this purpose a brief account is given of the physical bases of this theory and of conceptions that have been obtained by analogy with the laws of flow in pipes.

  15. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  16. Quantitative, spectrally-resolved intraoperative fluorescence imaging

    PubMed Central

    Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.

    2012-01-01

    Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935

  17. Cloning of DOG1, a quantitative trait locus controlling seed dormancy in Arabidopsis.

    PubMed

    Bentsink, Leónie; Jowett, Jemma; Hanhart, Corrie J; Koornneef, Maarten

    2006-11-07

    Genetic variation for seed dormancy in nature is a typical quantitative trait controlled by multiple loci on which environmental factors have a strong effect. Finding the genes underlying dormancy quantitative trait loci is a major scientific challenge, which also has relevance for agriculture and ecology. In this study we describe the identification of the DELAY OF GERMINATION 1 (DOG1) gene previously identified as a quantitative trait locus involved in the control of seed dormancy. This gene was isolated by a combination of positional cloning and mutant analysis and is absolutely required for the induction of seed dormancy. DOG1 is a member of a small gene family of unknown molecular function, with five members in Arabidopsis. The functional natural allelic variation present in Arabidopsis is caused by polymorphisms in the cis-regulatory region of the DOG1 gene and results in considerable expression differences between the DOG1 alleles of the accessions analyzed.

  18. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
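    A standard baseline for pooling a performance metric across studies is the DerSimonian-Laird random-effects estimator, which the alternative methods mentioned above are typically compared against. A self-contained sketch (study estimates and variances below are hypothetical):

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate of a performance metric
    (e.g. a repeatability coefficient) across studies.
    Returns (pooled estimate, standard error)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    k = len(estimates)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight with between-study variance added to each study's variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical within-subject CV estimates from three small test-retest studies
pooled, se = dersimonian_laird([0.10, 0.14, 0.12], [0.0004, 0.0009, 0.0004])
print(round(pooled, 3), round(se, 3))
```

With few, small studies the tau^2 estimate is unstable, which is precisely the failure mode the review's alternative approaches target.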

  19. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. PMID:24872353

  20. DMD-based quantitative phase microscopy and optical diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie

    2018-02-01

    Digital micromirror devices (DMDs), which offer high speed and high degree of freedoms in steering light illuminations, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and break existing imaging limitations. In this paper, we will first present our progress in using DMDs for demonstrating laser-illumination Fourier ptychographic microscopy (FPM) with shotnoise limited detection. After that, we will present a novel common-path quantitative phase microscopy (QPM) system based on using a DMD. Building on those early developments, a DMD-based high speed optical diffraction tomography (ODT) system has been recently demonstrated, and the results will also be presented. This ODT system is able to achieve video-rate 3D refractive-index imaging, which can potentially enable observations of high-speed 3D sample structural changes.

  1. Eye-Tracking Verification of the Strategy Used to Analyse Algorithms Expressed in a Flowchart and Pseudocode

    ERIC Educational Resources Information Center

    Andrzejewska, Magdalena; Stolinska, Anna; Blasiak, Wladyslaw; Peczkowski, Pawel; Rosiek, Roman; Rozek, Bozena; Sajka, Miroslawa; Wcislo, Dariusz

    2016-01-01

    The results of qualitative and quantitative investigations conducted with individuals who learned algorithms in school are presented in this article. In these investigations, eye-tracking technology was used to follow the process of solving algorithmic problems. The algorithmic problems were presented in two comparable variants: in a pseudocode…

  2. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  3. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    PubMed Central

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2008-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
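    The two statistics named above have simple definitions in Sandelowski and Barroso's metasummary method: a finding's frequency effect size is the share of reports that contain it, and a report's intensity effect size is the share of all distinct findings it contains. A sketch (report ids and finding labels are invented for illustration):

```python
def metasummary_effect_sizes(report_findings):
    """Frequency effect size per finding and intensity effect size per
    report, as in qualitative metasummary.

    report_findings: dict mapping report id -> set of finding labels."""
    all_findings = set().union(*report_findings.values())
    n_reports = len(report_findings)
    frequency = {
        f: sum(f in fs for fs in report_findings.values()) / n_reports
        for f in sorted(all_findings)
    }
    intensity = {r: len(fs) / len(all_findings) for r, fs in report_findings.items()}
    return frequency, intensity

freq, inten = metasummary_effect_sizes({
    "report1": {"stigma", "cost"},
    "report2": {"stigma"},
    "report3": {"stigma", "cost", "access"},
})
print(freq["stigma"], inten["report3"])
```

Because both sizes are plain proportions, they can be pooled with quantitative descriptive findings, which is what makes the mixed synthesis workable.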

  4. Overview of EPA Research on Drinking Water Distribution System Nitrification

    EPA Science Inventory

    Results from USEPA research investigating drinking water distribution system nitrification will be presented. The two research areas include: (1) monochloramine disinfection kinetics of Nitrosomonas europaea using Propidium Monoazide Quantitative Real-time PCR (PMA-qPCR) and (2...

  5. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made to develop computerized dental X-ray image analysis systems for clinical use. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/).

  6. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to be able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities calculated, LQPM can employ this classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standards.
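    The qualitative/quantitative split described above can be illustrated with one plausible formulation (this is an assumption for illustration, not necessarily the authors' exact LQLS/LQTS definitions): a pattern-only cosine similarity between fingerprint vectors, and a content-aware score that also scales by the ratio of total peak response:

```python
import math

def qualitative_similarity(sample, reference):
    """Cosine (congruence) similarity of two chromatographic fingerprints:
    sensitive to peak pattern, insensitive to overall concentration."""
    dot = sum(s * r for s, r in zip(sample, reference))
    return dot / (math.sqrt(sum(s * s for s in sample)) *
                  math.sqrt(sum(r * r for r in reference)))

def quantitative_similarity(sample, reference):
    """Pattern similarity scaled by the total-content ratio, so both
    composition and overall content must match for a high score."""
    content_ratio = sum(sample) / sum(reference)
    return qualitative_similarity(sample, reference) * content_ratio

ref = [10.0, 40.0, 25.0, 5.0]
half_strength = [5.0, 20.0, 12.5, 2.5]   # same pattern, half the content
print(round(qualitative_similarity(half_strength, ref), 3))   # -> 1.0
print(round(quantitative_similarity(half_strength, ref), 3))  # -> 0.5
```

The example shows why a qualitative measure alone cannot discriminate samples that differ only in concentration, which is the limitation the abstract reports for LQLS.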

  7. Does Training in Table Creation Enhance Table Interpretation? A Quasi-Experimental Study with Follow-Up

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.; Wong, Kendal

    2016-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents results of a quasi-experimental study with longitudinal follow-up that tested the…

  8. Computational Algorithmization: Limitations in Problem Solving Skills in Computational Sciences Majors at University of Oriente

    ERIC Educational Resources Information Center

    Castillo, Antonio S.; Berenguer, Isabel A.; Sánchez, Alexander G.; Álvarez, Tomás R. R.

    2017-01-01

    This paper analyzes the results of a diagnostic study carried out with second year students of the computational sciences majors at University of Oriente, Cuba, to determine the limitations that they present in computational algorithmization. An exploratory research was developed using quantitative and qualitative methods. The results allowed…

  9. Quantitative phase and amplitude imaging using Differential-Interference Contrast (DIC) microscopy

    NASA Astrophysics Data System (ADS)

    Preza, Chrysanthe; O'Sullivan, Joseph A.

    2009-02-01

    We present an extension of the development of an alternating minimization (AM) method for the computation of a specimen's complex transmittance function (magnitude and phase) from DIC images. The ability to extract both quantitative phase and amplitude information from two rotationally-diverse DIC images (i.e., acquired by rotating the sample) extends previous efforts in computational DIC microscopy that have focused on quantitative phase imaging only. Simulation results show that the inverse problem at hand is sensitive to noise as well as to the choice of the AM algorithm parameters. The AM framework allows constraints and penalties on the magnitude and phase estimates to be incorporated in a principled manner. Towards this end, Green and De Pierro's "log-cosh" regularization penalty is applied to the magnitude of differences of neighboring values of the complex-valued function of the specimen during the AM iterations. The penalty is shown to be convex in the complex space. A procedure to approximate the penalty within the iterations is presented. In addition, a methodology to pre-compute AM parameters that are optimal with respect to the convergence rate of the AM algorithm is also presented. Both extensions of the AM method are investigated with simulations.

  10. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.

  11. Quantitative Investigation of Protein-Nucleic Acid Interactions by Biosensor Surface Plasmon Resonance.

    PubMed

    Wang, Shuo; Poon, Gregory M K; Wilson, W David

    2015-01-01

    Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.

  12. Family involvement in decision making for people with dementia in residential aged care: a systematic review of quantitative literature.

    PubMed

    Petriwskyj, Andrea; Gibson, Alexandra; Parker, Deborah; Banks, Susan; Andrews, Sharon; Robinson, Andrew

    2014-06-01

    Ensuring older adults' involvement in their care is accepted as good practice and is vital, particularly for people with dementia, whose care and treatment needs change considerably over the course of the illness. However, involving family members in decision making on people's behalf is still practically difficult for staff and family. The aim of this review was to identify and appraise the existing quantitative evidence about family involvement in decision making for people with dementia living in residential aged care. The present Joanna Briggs Institute (JBI) metasynthesis assessed studies that investigated involvement of family members in decision making for people with dementia in residential aged care settings. While quantitative and qualitative studies were included in the review, this paper presents the quantitative findings. A comprehensive search of 15 electronic databases was performed. The search was limited to papers published in English, from 1990 to 2013. Twenty-six studies were identified as being relevant; 10 were quantitative, with 1 mixed method study. Two independent reviewers assessed the studies for methodological validity and extracted the data using the JBI Meta Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). The findings were synthesized and presented in narrative form. The findings related to decisions encountered and made by family surrogates, variables associated with decisions, surrogates' perceptions of, and preferences for, their roles, as well as outcomes for people with dementia and their families. The results identified patterns within, and variables associated with, surrogate decision making, all of which highlight the complexity and variation regarding family involvement. Attention needs to be paid to supporting family members in decision making in collaboration with staff.

  13. Generating One Biometric Feature from Another: Faces from Fingerprints

    PubMed Central

    Ozkaya, Necla; Sagiroglu, Seref

    2010-01-01

    This study presents a new approach based on artificial neural networks for generating one biometric feature (faces) from another (only fingerprints). An automatic and intelligent system was designed and developed to analyze and model the relationships between fingerprints and faces. The new proposed system is the first study that generates all parts of the face including eyebrows, eyes, nose, mouth, ears and face border from only fingerprints. It is also unique and different from similar studies recently presented in the literature, with some superior features. The parameter settings of the system were achieved with the help of the Taguchi experimental design technique. The performance and accuracy of the system have been evaluated with the 10-fold cross validation technique using qualitative evaluation metrics in addition to the expanded quantitative evaluation metrics. Consequently, the results were presented on the basis of the combination of these objective and subjective metrics for illustrating the qualitative properties of the proposed methods as well as a quantitative evaluation of their performances. Experimental results have shown that one biometric feature can be determined from another. These results have once more indicated that there is a strong relationship between fingerprints and faces. PMID:22399877

  14. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.

  15. Quantitative laser speckle flowmetry of the in vivo microcirculation using sidestream dark field microscopy

    PubMed Central

    Nadort, Annemarie; Woolthuis, Rutger G.; van Leeuwen, Ton G.; Faber, Dirk J.

    2013-01-01

    We present integrated Laser Speckle Contrast Imaging (LSCI) and Sidestream Dark Field (SDF) flowmetry to provide real-time, non-invasive, and quantitative measurements of speckle decorrelation times related to microcirculatory flow. Using a multi-exposure acquisition scheme, precise speckle decorrelation times were obtained. Applying SDF-LSCI in vitro and in vivo allows direct comparison between speckle contrast decorrelation and flow velocities while imaging the phantom and microcirculation architecture. This resulted in a novel analysis approach that distinguishes decorrelation due to flow from other additive decorrelation sources. PMID:24298399

  16. Guidelines for Initiating a Research Agenda: Research Design and Dissemination of Results.

    PubMed

    Delost, Maria E; Nadder, Teresa S

    2014-01-01

    Successful research outcomes require selection and implementation of the appropriate research design. A realistic sampling plan appropriate for the design is essential. Qualitative or quantitative methodology may be utilized, depending on the research question and goals. Quantitative research may be experimental where there is an intervention, or nonexperimental, if no intervention is included in the design. Causation can only be established with experimental research. Popular types of nonexperimental research include descriptive and survey research. Research findings may be disseminated via presentations, posters, and publications, such as abstracts and manuscripts.

  17. Review of Department of Defense Education Activity (DoDEA) Schools. Volume II: Quantitative Analysis of Educational Quality. IDA Paper.

    ERIC Educational Resources Information Center

    Anderson, Lowell Bruce; Bracken, Jerome; Bracken, Marilyn C.

    This volume compiles, and presents in integrated form, the Institute for Defense Analyses' (IDA) quantitative analysis of educational quality provided by the Department of Defense's dependent schools. It covers the quantitative aspects of volume 1 in greater detail and presents some analyses deemed too technical for that volume. The first task in…

  18. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William B.

    1997-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation.
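    The sequence of steps above follows standard petrophysical practice. As a minimal sketch (not the patented procedure itself; the function names and the default values of the empirical constants a, m, and n are illustrative assumptions), Archie's equation yields the water saturation, and the pore volume not occupied by water bounds the combined oil-and-gas content:

```python
def archie_water_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: S_w = ((a * R_w) / (phi**m * R_t)) ** (1/n).

    rt  -- measured formation resistivity (ohm-m)
    rw  -- formation water resistivity (ohm-m)
    phi -- porosity (fraction); a, m, n are empirical constants.
    """
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)


def combined_oil_and_gas_fraction(rt, rw, phi, **archie_params):
    """Bulk-volume fraction occupied by oil plus gas combined:
    the pore volume not filled with formation water."""
    sw = archie_water_saturation(rt, rw, phi, **archie_params)
    return phi * (1.0 - sw)
```

    The subsequent split between oil and gas would then follow from the relative-amount determination the patent describes, which is not reproduced here.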

  19. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, W.B. III

    1997-05-27

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. 7 figs.

  20. Computation of the three-dimensional medial surface dynamics of the vocal folds.

    PubMed

    Döllinger, Michael; Berry, David A

    2006-01-01

    To increase our understanding of pathological and healthy voice production, quantitative measurement of the medial surface dynamics of the vocal folds is significant, albeit rarely performed because of the inaccessibility of the vocal folds. Using an excised hemilarynx methodology, a new calibration technique, herein referred to as the linear approximate (LA) method, was introduced to compute the three-dimensional coordinates of fleshpoints along the entire medial surface of the vocal fold. The results were compared with results from the direct linear transform. An associated error estimation was presented, demonstrating the improved accuracy of the new method. A test on real data was reported including computation of quantitative measurements of vocal fold dynamics.

  1. Nontargeted quantitation of lipid classes using hydrophilic interaction liquid chromatography-electrospray ionization mass spectrometry with single internal standard and response factor approach.

    PubMed

    Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat

    2012-11-20

    The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes in contrast to the conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained by peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility, and robustness of the method have been checked by various means: (1) comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) (31)P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) a method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma).
Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
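    The single-internal-standard scheme described above can be sketched as a simple calculation: each class concentration is its integrated peak area, normalized to the internal standard's area and concentration and scaled by the class-specific response factor. The function name, the multiplicative RF convention, and the example numbers below are illustrative assumptions, not values from the paper.

```python
def quantify_lipid_classes(peak_areas, istd_area, istd_conc, response_factors):
    """Nontargeted quantitation against a single internal standard:
    c_class = (A_class / A_ISTD) * c_ISTD * RF_class."""
    return {
        lipid_class: (area / istd_area) * istd_conc * response_factors[lipid_class]
        for lipid_class, area in peak_areas.items()
    }
```

    For example, a class whose peak area is twice the internal standard's, with RF 1.5 and 10 units of internal standard, would be reported at 30 units.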

  2. Relations of Dispositions toward Ridicule and Histrionic Self-Presentation with Quantitative and Qualitative Humor Creation Abilities.

    PubMed

    Renner, Karl-Heinz; Manthey, Leonie

    2018-01-01

    Previous research has shown that humor and self-presentation are linked in several ways. With regard to individual differences, it turned out that gelotophilia (the joy of being laughed at) and katagelasticism (the joy of laughing at others) are substantially associated with the histrionic self-presentation style that is characterized by performing explicit As-If-behaviors (e.g., irony, parodying others) in everyday interactions. By contrast, gelotophobia (the fear of being laughed at) shows a negative correlation with histrionic self-presentation. In order to further contribute to the nomological network, we have explored whether the three dispositions toward ridicule and laughter as well as histrionic self-presentation are related to humor creation abilities. In doing so, we have assessed the four constructs in a study with 337 participants that also completed the Cartoon Punch line Production Test (CPPT, Köhler and Ruch, 1993, unpublished). In the CPPT, subjects were asked to generate as many funny punch lines as possible for six caption-removed cartoons. The created punch lines were then analyzed with regard to quantitative (e.g., number of punch lines) and qualitative (e.g., wittiness of the punch lines and overall wittiness of the person as evaluated by three independent raters) humor creation abilities. Results show that both gelotophilia and histrionic self-presentation were positively correlated with quantitative and qualitative humor creation abilities. By contrast, gelotophobia showed slightly negative and katagelasticism no associations with the assessed humor creation abilities. These findings especially apply to the subgroup of participants that created punch lines for each of the six cartoons and partly replicate and extend the results of a previous study by Ruch et al. (2009). Altogether, the results of our study show that individual differences in humor-related traits are associated with the quantity and quality of humorous punch lines. 
It is argued that behavior-related or performative humor creation tasks should be considered in addition to the CPPT in order to open up new avenues that can cross-fertilize research on individual differences in humor and self-presentation.

  3. Relations of Dispositions toward Ridicule and Histrionic Self-Presentation with Quantitative and Qualitative Humor Creation Abilities

    PubMed Central

    Renner, Karl-Heinz; Manthey, Leonie

    2018-01-01

    Previous research has shown that humor and self-presentation are linked in several ways. With regard to individual differences, it turned out that gelotophilia (the joy of being laughed at) and katagelasticism (the joy of laughing at others) are substantially associated with the histrionic self-presentation style that is characterized by performing explicit As-If-behaviors (e.g., irony, parodying others) in everyday interactions. By contrast, gelotophobia (the fear of being laughed at) shows a negative correlation with histrionic self-presentation. In order to further contribute to the nomological network, we have explored whether the three dispositions toward ridicule and laughter as well as histrionic self-presentation are related to humor creation abilities. In doing so, we have assessed the four constructs in a study with 337 participants that also completed the Cartoon Punch line Production Test (CPPT, Köhler and Ruch, 1993, unpublished). In the CPPT, subjects were asked to generate as many funny punch lines as possible for six caption-removed cartoons. The created punch lines were then analyzed with regard to quantitative (e.g., number of punch lines) and qualitative (e.g., wittiness of the punch lines and overall wittiness of the person as evaluated by three independent raters) humor creation abilities. Results show that both gelotophilia and histrionic self-presentation were positively correlated with quantitative and qualitative humor creation abilities. By contrast, gelotophobia showed slightly negative and katagelasticism no associations with the assessed humor creation abilities. These findings especially apply to the subgroup of participants that created punch lines for each of the six cartoons and partly replicate and extend the results of a previous study by Ruch et al. (2009). Altogether, the results of our study show that individual differences in humor-related traits are associated with the quantity and quality of humorous punch lines. 
It is argued that behavior-related or performative humor creation tasks should be considered in addition to the CPPT in order to open up new avenues that can cross-fertilize research on individual differences in humor and self-presentation. PMID:29487549

  4. Comparison of Holographic Photopolymer Materials by Use of Analytic Nonlocal Diffusion Models: Errata

    NASA Astrophysics Data System (ADS)

    O'Neill, Feidhlim T.; Lawrence, Justin R.; Sheridan, John T.

    2003-06-01

    Two typographic errors have been identified by the authors in Equation (5) of Ref. 1. These errors do not affect either the physical interpretation of the situation or the quantitative results presented in the paper.

  5. Motion magnification using the Hermite transform

    NASA Astrophysics Data System (ADS)

    Brieva, Jorge; Moya-Albor, Ernesto; Gomez-Coronel, Sandra L.; Escalante-Ramírez, Boris; Ponce, Hiram; Mora Esquivel, Juan I.

    2015-12-01

    We present an Eulerian motion magnification technique with a spatial decomposition based on the Hermite Transform (HT). We compare our results to the approach presented in Ref. 1. We test our method on one sequence of the breathing of a newborn baby and on an MRI left-ventricle sequence. Methods are compared using quantitative and qualitative metrics after the application of the motion magnification algorithm.

  6. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock. 1.; Analysis Techniques and Methodology

    NASA Technical Reports Server (NTRS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-01-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, −j · E (the negative of the dot product of the current density and the measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
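    The dissipation measure used above reduces to a signed dot product of two measured vectors; a trivial sketch (component ordering and SI units are assumed):

```python
def ohmic_dissipation_rate(j, e):
    """Return -(j . E) in W/m^3 for 3-vectors j [A/m^2] and E [V/m]."""
    return -sum(jc * ec for jc, ec in zip(j, e))
```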

  7. Laser microprobe characterization of C species in Interplanetary Dust Particles (IDP)

    NASA Technical Reports Server (NTRS)

    Dibrozolo, F. R.; Bunch, T. E.; Chang, S.; Brownlee, D. E.

    1986-01-01

    Preliminary results of a study whose aim is the characterization of carbon (C) species in microvolumes of materials by means of laser ionization mass spectrometry (LIMS) are presented. The LIMS instrument employs a pulsed UV laser to produce nearly instantaneous vaporization and ionization of materials, followed by acceleration and time-of-flight analysis of the ions produced. LIMS provides a survey technique with nearly simultaneous acquisition of mass spectra covering the entire elemental range. The main limitation of the LIMS technique at present is its limited ability to perform quantitative analysis, due in part to insufficient knowledge of the mechanism of laser-solid interaction. However, considerable effort is now being directed at making LIMS a more quantitative technique. A variety of C samples, both natural and man-made, were analyzed to establish the ability of LIMS to differentiate among the various C phases. The results of preliminary analyses performed on meteoritic and interplanetary dust samples are also presented. The C standards selected for the LIMS characterization range from essentially amorphous soot to diamond, which exhibits the highest degree of ordering.

  8. Fourier phase in Fourier-domain optical coherence tomography

    PubMed Central

    Uttam, Shikhar; Liu, Yang

    2015-01-01

    Phase of an electromagnetic wave propagating through a sample-of-interest is well understood in the context of quantitative phase imaging in transmission-mode microscopy. In the past decade, Fourier-domain optical coherence tomography has been used to extend quantitative phase imaging to the reflection-mode. Unlike transmission-mode electromagnetic phase, however, the origin and characteristics of reflection-mode Fourier phase are poorly understood, especially in samples with a slowly varying refractive index. In this paper, the general theory of Fourier phase from first principles is presented, and it is shown that Fourier phase is a joint estimate of subresolution offset and mean spatial frequency of the coherence-gated sample refractive index. It is also shown that both spectral-domain phase microscopy and depth-resolved spatial-domain low-coherence quantitative phase microscopy are special cases of this general theory. Analytical expressions are provided for both, and simulations are presented to explain and support the theoretical results. These results are further used to show how Fourier phase allows the estimation of an axial mean spatial frequency profile of the sample, along with depth-resolved characterization of localized optical density change and sample heterogeneity. Finally, a Fourier phase-based explanation of Doppler optical coherence tomography is also provided. PMID:26831383

  9. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.

  10. Quantitative Rainbow Schlieren Deflectometry as a Temperature Diagnostic for Spherical Flames

    NASA Technical Reports Server (NTRS)

    Feikema, Douglas A.

    2004-01-01

    Numerical analysis and experimental results are presented to define a method for quantitatively measuring the temperature distribution of a spherical diffusion flame using Rainbow Schlieren Deflectometry in microgravity. First, a numerical analysis is completed to show the method can suitably determine temperature in the presence of spatially varying species composition. Also, a numerical forward-backward inversion calculation is presented to illustrate the types of calculations and deflections to be encountered. Lastly, a normal gravity demonstration of temperature measurement in an axisymmetric laminar, diffusion flame using Rainbow Schlieren deflectometry is presented. The method employed in this paper illustrates the necessary steps for the preliminary design of a Schlieren system. The largest deflections for the normal gravity flame considered in this paper are 7.4 x 10(-4) radians which can be accurately measured with 2 meter focal length collimating and decollimating optics. The experimental uncertainty of deflection is less than 5 x 10(-5) radians.
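    In the small-angle regime used in schlieren systems, a ray deflected by angle ε is displaced by d = f · ε at the filter plane of the decollimating lens, so the 7.4 × 10⁻⁴ rad quoted above corresponds to roughly 1.5 mm of displacement with 2 m optics. A minimal sketch (function and parameter names are assumptions):

```python
def deflection_angle(displacement_m, focal_length_m=2.0):
    """Small-angle ray deflection inferred from the measured displacement
    at the filter plane of a decollimating lens of the given focal length."""
    return displacement_m / focal_length_m
```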

  11. Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations

    NASA Astrophysics Data System (ADS)

    Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara

    2017-11-01

    Although quantitative measurements in radioactivity teaching and research are often believed to be possible only with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration that is very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the half-life of 220Rn emanating from lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.
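    Once a discharge rate proportional to activity is read off the projected leaf at several times, the half-life follows from an exponential fit. A minimal sketch using a log-linear least-squares fit of A(t) = A₀·exp(−λt) (the function name and input format are assumptions; the published 220Rn half-life is about 55.6 s):

```python
import math

def half_life_from_activity(times, activities):
    """Fit log A(t) = log A0 - lambda*t by least squares and
    return the half-life ln(2)/lambda."""
    n = len(times)
    logs = [math.log(a) for a in activities]
    t_mean = sum(times) / n
    y_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs))
             / sum((t - t_mean) ** 2 for t in times))
    decay_constant = -slope          # lambda > 0 for decaying activity
    return math.log(2) / decay_constant
```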

  12. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
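    The core subtraction step can be sketched as frame differencing, assuming the phase images have already been converted to dry-mass density in pg/µm² (the function name, units, and NumPy representation are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def dynamic_phase_difference(antecedent, subsequent, pixel_area_um2=1.0):
    """Subtract the antecedent from the subsequent dry-mass-density frame
    (pg/um^2) of a time-lapse series.  Returns the difference map, for
    color-coded visualization, plus the total mass gained and lost in pg."""
    dpd = subsequent.astype(float) - antecedent.astype(float)
    gained_pg = float(dpd[dpd > 0].sum()) * pixel_area_um2
    lost_pg = float(-dpd[dpd < 0].sum()) * pixel_area_um2
    return dpd, gained_pg, lost_pg
```

    Applied over a whole time-lapse series, the per-frame gained/lost totals give the time dependence of mass changes the abstract describes.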

  13. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  14. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  15. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  16. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  17. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  18. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  19. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  20. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  1. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  2. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  3. Career, Family, and Institutional Variables in the Work Lives of Academic Women in the Chemical Sciences

    NASA Astrophysics Data System (ADS)

    Fassinger, Ruth E.; Scantlebury, Kathryn; Richmond, Geraldine

    This article presents quantitative results of a study of 139 academic women in the chemical sciences who participated in a professional development program sponsored by the Committee on the Advancement of Women Chemists. The study investigated variables frequently examined in the vocational psychology of women: approaches to achievement, coping strategies, career advancement, the home-work interface, workplace climate, and mentoring. The article presents and discusses results in the context of unique issues faced by women in scientific careers.

  4. Development and First Results of the Width-Tapered Beam Method for Adhesion Testing of Photovoltaic Material Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Tracy, Jared; Dauskardt, Reinhold

    2016-11-21

    A fracture-mechanics-based approach for quantifying adhesion at every interface within the PV module laminate is presented. The common requirements of monitoring crack length and specimen compliance are circumvented through development of a width-tapered cantilever beam method. This technique may be applied at both the module and coupon level to yield similar quantitative measurements. Details of module and sample preparation are described, and first results on field-exposed modules deployed for over 27 years are presented.

  5. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system was demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately detect RRS standards and simulated samples, both qualitatively and quantitatively. The quantitative results for RRS obtained with this new method were confirmed by comparison with standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye. The results showed good coherence between the two methods. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.

  6. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins.

    PubMed

    Schalk, Kathrin; Koehler, Peter; Scherf, Katharina Anne

    2018-01-01

    Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields. This enabled the conversion of peptide into protein type concentrations. Gluten contents were expressed as sum of all determined protein type concentrations. This new method was applied to quantitate gluten in wheat starches and compared to R5 ELISA and gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD), which resulted in a strong correlation between LC-MS/MS and the other two methods.

  7. Dual function microscope for quantitative DIC and birefringence imaging

    NASA Astrophysics Data System (ADS)

    Li, Chengshuai; Zhu, Yizheng

    2016-03-01

    A spectral multiplexing interferometry (SXI) method is presented for integrated birefringence and phase gradient measurement on label-free biological specimens. With SXI, the retardation and orientation of sample birefringence are simultaneously encoded onto two separate spectral carrier waves, generated by a crystal retarder oriented at a specific angle. Thus sufficient information for birefringence determination can be obtained from a single interference spectrum, eliminating the need for multiple acquisitions with mechanical rotation or electrical modulation. In addition, with the insertion of a Nomarski prism, the setup can then acquire quantitative differential interference contrast images. Red blood cells infected by malaria parasites are imaged for birefringence retardation as well as phase gradient. The results demonstrate that the SXI approach can achieve both quantitative phase imaging and birefringence imaging with a single, high-sensitivity system.

  8. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
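
    The report's 'hierarchical weighted average' is not specified in the abstract; as a minimal single-level sketch of weighted-average scoring (the criterion names, weights, and scores below are invented for illustration):

    ```python
    def weighted_average(scores, weights):
        """Combine per-criterion scores using normalized weights."""
        total = sum(weights.values())
        return sum(scores[c] * w / total for c, w in weights.items())

    # Hypothetical criteria and two design alternatives
    weights = {"cost": 0.5, "performance": 0.3, "risk": 0.2}
    alt_a = {"cost": 7.0, "performance": 9.0, "risk": 6.0}
    alt_b = {"cost": 9.0, "performance": 5.0, "risk": 8.0}

    score_a = weighted_average(alt_a, weights)  # 7.4
    score_b = weighted_average(alt_b, weights)  # 7.6
    ```

    In the hierarchical version the method's name implies, criteria are grouped, each group's score is a weighted average of its members, and the group scores are themselves weighted-averaged into a single figure of merit.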

  9. Quantitative three-dimensional photoacoustic tomography of the finger joints: an in vivo study

    NASA Astrophysics Data System (ADS)

    Sun, Yao; Sobel, Eric; Jiang, Huabei

    2009-11-01

    We present for the first time in vivo full three-dimensional (3-D) photoacoustic tomography (PAT) of the distal interphalangeal joint in a human subject. Both absorbed energy density and absorption coefficient images of the joint are quantitatively obtained using our finite-element-based photoacoustic image reconstruction algorithm coupled with the photon diffusion equation. The results show that major anatomical features in the joint along with the side arteries can be imaged with a 1-MHz transducer in a spherical scanning geometry. In addition, the cartilages associated with the joint can be quantitatively differentiated from the phalanx. This in vivo study suggests that the 3-D PAT method described has the potential to be used for early diagnosis of joint diseases such as osteoarthritis and rheumatoid arthritis.

  10. Results of Instrument Observations and Adaptive Prediction of Thermoabrasion of Banks of the Vilyui Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velikin, S. A.; Sobol', I. S.; Sobol', S. V.

    2013-11-15

    Quantitative data derived from observations of the reformation of the thermoabrasive banks of the Vilyui Reservoir in Yakutia during the service period from 1972 through 2011 are presented, together with results of analytical prediction of bank formation over the next 20 years, for purposes of monitoring the ecological safety of this water body.

  11. Addressing Negative Math Attitudes with Service-Learning

    ERIC Educational Resources Information Center

    Henrich, Allison; Sloughter, J. McLean; Anderson, Jeffrey; Bahuaud, Eric

    2016-01-01

    In this paper, we share the results of our recent study of a quantitative literacy course with a service-learning component. Our study aims to answer the question: How did student attitudes shift as a result of participating in this course? We present and analyze statistics from pre- and post-surveys in five classes (N = 78) taught by two…

  12. Value Types in Higher Education--Students' Perspective

    ERIC Educational Resources Information Center

    Dziewanowska, Katarzyna

    2017-01-01

    The purpose of the paper is to propose the service-dominant logic in marketing as a framework for analysing the value co-creation process in the higher education sector and present the results of a quantitative study (a survey) conducted among business students from four Polish public universities. The results of the study led to identification of…

  13. Field evaluation of the myrtle creek advanced curve warning system : final report.

    DOT National Transportation Integrated Search

    2006-06-01

    As part of a larger study focusing on determining optimum countermeasures for speed related crashes, this report presents the results of a quantitative and qualitative before and after evaluation of a dynamic curve warning system deployed at one site...

  14. Radiofrequency recombination lines as diagnostics of the cool interstellar medium.

    NASA Technical Reports Server (NTRS)

    Dupree, A. K.

    1971-01-01

    Quantitative details are given of a new diagnostic technique for the carbon and hydrogen (H I) recombination lines. Theoretical results are presented for conditions expected in H I clouds, and are compared with available observations for Orion A and NGC 2024.

  15. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration by using LIF (laser-induced fluorescence) in flame. The detailed physical models of spectral absorption lineshape broadening, collisional transition and quenching at elevated pressure are built. The fine energy-level structure of the OH molecule is illustrated to explain both laser-induced fluorescence emission and the non-radiative processes, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). Based on these models, numerical simulations were performed to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in the OH-LIF technique and for finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).

  16. Quality of life, coping strategies, social support and self-efficacy in women after acute myocardial infarction: a mixed methods approach.

    PubMed

    Fuochi, G; Foà, C

    2018-03-01

    Quality of life, coping strategies, social support and self-efficacy are important psychosocial variables strongly affecting the experience of acute myocardial infarction (AMI) in women. To gain a more in-depth understanding of how coping strategies, self-efficacy, quality of life and social support shape women's adjustment to AMI. Mixed methods study. Quantitative data were collected through a standardised questionnaire on coping strategies, self-efficacy, quality of life and social support. Qualitative data stemmed from 57 semistructured interviews conducted with post-AMI female patients on related topics. Quantitative data were analysed with unpaired two-sample t-tests on the means, comparing women who experienced AMI (N = 77) with a control group of women who did not have AMI (N = 173), and pairwise correlations on the AMI sample. Qualitative data were grouped into coding families and analysed through thematic content analysis. Qualitative and quantitative results were then integrated, for different age groups. Quantitative results indicated statistically significant differences between women who experienced AMI and the control group: the former showed lower self-perceived health, perceived social support and social support coping, but greater self-efficacy, use of acceptance, avoidance and religious coping. Pairwise correlations showed that avoidance coping strategy was negatively correlated with quality of life, while the opposite was true for problem-oriented coping, perceived social support and self-efficacy. Qualitative results extended and confirmed quantitative results, except for coping strategies: avoidance coping seemed more present than reported in the standardised measures. Mixed methods provide understanding of the importance of social support, self-efficacy and less avoidant coping strategies to women's adjustment to AMI. Women need support from health professionals with knowledge of these topics, to facilitate their adaptation to AMI. 
© 2017 Nordic College of Caring Science.

  17. The role of emotions in time to presentation for symptoms suggestive of cancer: a systematic literature review of quantitative studies.

    PubMed

    Balasooriya-Smeekens, Chantal; Walter, Fiona M; Scott, Suzanne

    2015-12-01

    Emotions may be important in patients' decisions to seek medical help for symptoms suggestive of cancer. The aim of this systematic literature review was to examine quantitative literature on the influence of emotion on patients' help-seeking for symptoms suggestive of cancer. The objectives were to identify the following: (a) which types of emotions influence help-seeking behaviour, (b) whether these form a barrier or trigger for seeking medical help and (c) how the role of emotions varies between different cancers and populations. We searched four electronic databases and conducted a narrative synthesis. Inclusion criteria were studies that reported primary, quantitative research that examined any emotion specific to symptom appraisal or help-seeking for symptoms suggestive of cancer. Thirty-three papers were included. The studies were heterogeneous in their methods and quality, and very few had emotion as the main focus of the research. Studies reported a limited range of emotions, mainly related to fear and worry. The impact of emotions appears mixed, sometimes acting as a barrier to consultation whilst at other times being a trigger or being unrelated to time to presentation. It is plausible that different emotions play different roles at different times prior to presentation. This systematic review provides some quantitative evidence for the role of emotions in help-seeking behaviour. However, it also highlighted widespread methodological, definition and design issues among the existing literature. The conflicting results around the role of emotions on time to presentation may be due to the lack of definition of each specific emotion. Copyright © 2015 John Wiley & Sons, Ltd.

  18. A quantitative systems physiology model of renal function and blood pressure regulation: Model description.

    PubMed

    Hallow, K M; Gebremichael, Y

    2017-06-01

    Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  19. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  20. A preamplification approach to GMO detection in processed foods.

    PubMed

    Del Gaudio, S; Cirillo, A; Di Bernardo, G; Galderisi, U; Cipollaro, M

    2010-03-01

    DNA is widely used as a target for GMO analysis because of its stability and high detectability. Real-time PCR is the method routinely used in most analytical laboratories due to its quantitative performance and great sensitivity. Accurate DNA detection and quantification depend on the specificity and sensitivity of the amplification protocol as well as on the quality and quantity of the DNA used in the PCR reaction. In order to enhance the sensitivity of real-time PCR and consequently expand the number of analyzable target genes, we applied a preamplification technique to processed foods, where DNA can be present in low amounts and/or in degraded forms, thereby affecting the reliability of qualitative and quantitative results. The preamplification procedure utilizes a pool of primers targeting genes of interest and is followed by real-time PCR reactions specific for each gene. An improvement of Ct values was found when comparing preamplified vs. non-preamplified DNA. The strategy reported in the present study will also be applicable to other fields requiring quantitative DNA testing by real-time PCR.
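
    The reported "improvement of Ct values" can be translated into an effective gain in detectable template: assuming ideal (~100%) amplification efficiency, each PCR cycle doubles the product, so an earlier Ct by delta cycles corresponds to a 2^delta fold change. A small illustrative calculation (the Ct values below are invented, not from the study):

    ```python
    def fold_change(ct_without, ct_with):
        # Under ideal 2x-per-cycle efficiency, a Ct earlier by
        # `delta` cycles implies 2**delta more effective template.
        return 2.0 ** (ct_without - ct_with)

    # Hypothetical Ct values for one target gene
    ct_plain = 34.0   # real-time PCR on non-preamplified DNA
    ct_preamp = 28.0  # same target after the preamplification step
    gain = fold_change(ct_plain, ct_preamp)  # 64.0
    ```

    Real assays rarely reach exactly 2x per cycle, so in practice the efficiency is estimated from a standard curve before converting Ct shifts to fold changes.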

  1. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    PubMed

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used, but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is most reliable with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  2. Image segmentation evaluation for very-large datasets

    NASA Astrophysics Data System (ADS)

    Reeves, Anthony P.; Liu, Shuang; Xie, Yiting

    2016-03-01

    With the advent of modern machine learning methods and fully automated image analysis, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automatically facilitate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.

  3. An Informed Approach to Improving Quantitative Literacy and Mitigating Math Anxiety in Undergraduates Through Introductory Science Courses

    NASA Astrophysics Data System (ADS)

    Follette, K.; McCarthy, D.

    2012-08-01

    Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.

  4. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess future non-scientists' attitudes towards science, as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes towards science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  5. A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples

    NASA Astrophysics Data System (ADS)

    Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.

    Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under a microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier-plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm, and the need for vibration isolation. The present work proposes a single-element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It will be shown that phase-shifting properties may also be introduced by suitable use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.

  6. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
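
    The Bland-Altman computation described in this abstract reduces to the mean and standard deviation of the paired differences; a minimal sketch (the paired assay values below are invented, not from the study):

    ```python
    import numpy as np

    def bland_altman(x, y):
        """Return bias and 95% limits of agreement for paired measurements."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        diff = y - x
        bias = diff.mean()             # constant error between the assays
        sd = diff.std(ddof=1)          # sample standard deviation of differences
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Hypothetical paired quantitative outputs from two assays
    reference = [0.12, 0.25, 0.40, 0.51, 0.08, 0.33]
    ngs_assay = [0.14, 0.24, 0.43, 0.50, 0.10, 0.36]
    bias, lower, upper = bland_altman(reference, ngs_assay)
    ```

    A nonzero bias indicates constant error; a trend in the differences plotted against the pair means indicates proportional error, which is what Deming regression quantifies via its slope.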

  7. A study for development of aerothermodynamic test model materials and fabrication technique

    NASA Technical Reports Server (NTRS)

    Dean, W. G.; Connor, L. E.

    1972-01-01

    A literature survey, materials reformulation and tailoring, fabrication problems, and materials selection and evaluation for fabricating models to be used with the phase-change technique for obtaining quantitative aerodynamic heat transfer data are presented. The study resulted in the selection of the two best materials: Stycast 2762 FT and an alumina ceramic. Characteristics of these materials and detailed fabrication methods are presented.

  8. Differential diagnosis of breast cancer using quantitative, label-free and molecular vibrational imaging

    PubMed Central

    Yang, Yaliang; Li, Fuhai; Gao, Liang; Wang, Zhiyong; Thrall, Michael J.; Shen, Steven S.; Wong, Kelvin K.; Wong, Stephen T. C.

    2011-01-01

    We present a label-free, chemically-selective, quantitative imaging strategy to identify breast cancer and differentiate its subtypes using coherent anti-Stokes Raman scattering (CARS) microscopy. Human normal breast tissue, benign proliferative, as well as in situ and invasive carcinomas, were imaged ex vivo. Simply by visualizing cellular and tissue features appearing on CARS images, cancerous lesions can be readily separated from normal tissue and benign proliferative lesion. To further distinguish cancer subtypes, quantitative disease-related features, describing the geometry and distribution of cancer cell nuclei, were extracted and applied to a computerized classification system. The results show that in situ carcinoma was successfully distinguished from invasive carcinoma, while invasive ductal carcinoma (IDC) and invasive lobular carcinoma were also distinguished from each other. Furthermore, 80% of intermediate-grade IDC and 85% of high-grade IDC were correctly distinguished from each other. The proposed quantitative CARS imaging method has the potential to enable rapid diagnosis of breast cancer. PMID:21833355

  9. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
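
    The conversion described above can be sketched numerically: integrate the measured refractive-index gradient along the optical path to recover n(x), then apply the Gladstone-Dale relation n - 1 = K * rho. The gradient profile below is synthetic, and K is the approximate Gladstone-Dale constant for air in visible light:

    ```python
    import numpy as np

    K_AIR = 2.26e-4  # Gladstone-Dale constant for air, m^3/kg (approximate)

    def density_from_gradient(dndx, dx, n0):
        """Integrate dn/dx along x, then convert refractive index to density."""
        # Cumulative trapezoidal integration, starting from index n0 at x = 0
        steps = 0.5 * (dndx[1:] + dndx[:-1]) * dx
        n = n0 + np.concatenate(([0.0], np.cumsum(steps)))
        return (n - 1.0) / K_AIR  # Gladstone-Dale: rho = (n - 1) / K

    # Sanity check: a zero gradient should recover a uniform density field
    n0 = 1.0 + K_AIR * 1.2  # refractive index of air at about 1.2 kg/m^3
    rho = density_from_gradient(np.zeros(50), dx=1e-3, n0=n0)
    ```

    Temperature then follows from an equation of state (e.g., the ideal gas law at known pressure); the boundary-layer results in the abstract rest on exactly this chain of conversions.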

  10. Biological monitoring of Upper Three Runs Creek, Savannah River Plant, Aiken County, South Carolina. Final report on macroinvertebrate stream assessments for F/H area ETF effluent discharge, July 1987--February 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Specht, W.L.

    1991-10-01

    In anticipation of the fall 1988 start up of effluent discharges into Upper Three Runs Creek by the F/H Area Effluent Treatment Facility of the Savannah River Site, Aiken, SC, a two-and-one-half-year biological study was initiated in June 1987. Upper Three Runs Creek is an intensively studied fourth-order stream known for its high species richness. Designed to assess the potential impact of F/H Area effluent on the creek, the study includes qualitative and quantitative macroinvertebrate stream surveys at five sites, chronic toxicity testing of the effluent, water chemistry, and bioaccumulation analysis. This final report presents the results of both pre-operational and post-operational qualitative and quantitative (artificial substrate) macroinvertebrate studies. Six quantitative and three qualitative studies were conducted prior to the initial release of the F/H ETF effluent, and five quantitative and two qualitative studies were conducted post-operationally.

  11. Low rank magnetic resonance fingerprinting.

    PubMed

    Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C

    2016-08-01

    Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to a conventional implementation of compressed sensing for MRF at a 15% sampling ratio.
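    The gradient-step-plus-projection iteration is generic enough to sketch on a toy matrix-completion problem. The measurement model, matrix sizes, and 30% sampling pattern below are illustrative stand-ins for the actual k-space acquisition, not the paper's operator:

```python
import numpy as np

def low_rank_project(X, rank):
    """Truncated SVD: nearest matrix (Frobenius norm) of at most `rank`."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[rank:] = 0.0
    return (U * s) @ Vt

rng = np.random.default_rng(0)
n_vox, n_frames, r = 64, 40, 3
X_true = rng.standard_normal((n_vox, r)) @ rng.standard_normal((r, n_frames))
mask = rng.random((n_vox, n_frames)) < 0.30   # toy 30% sampling pattern
Y = mask * X_true                             # observed entries only

X = np.zeros_like(X_true)
for _ in range(200):
    X = X - mask * (X - Y)          # gradient step on the data-fit term
    X = low_rank_project(X, r)      # low-rank projection via SVD

rel_err = np.linalg.norm(X - X_true) / np.linalg.norm(X_true)
```

    In the actual method the data-fit gradient involves the Fourier sampling operator and dictionary-domain sparsity is enforced as well; the loop above isolates only the low-rank mechanism.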

  12. Probing myocardium biomechanics using quantitative optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.

    2015-03-01

    We present a quantitative optical coherence elastographic method for noncontact assessment of myocardium elasticity. The method is based on shear wave imaging optical coherence tomography (SWI-OCT), where a focused air-puff system is used to induce localized tissue deformation through a low-pressure, short-duration air stream, and a phase-sensitive OCT system is utilized to monitor the propagation of the induced tissue displacement with nanoscale sensitivity. The 1-D scanning of M-mode OCT imaging and the application of optical phase retrieval and mapping techniques enable the reconstruction and visualization of 2-D depth-resolved shear wave propagation in tissue at an ultra-high frame rate. The feasibility of this method for quantitative elasticity measurement is demonstrated on tissue-mimicking phantoms, with the estimated Young's modulus compared against uniaxial compression tests. We also performed pilot experiments on ex vivo mouse cardiac muscle tissues with normal and genetically altered cardiomyocytes. Our results indicate this noncontact quantitative optical coherence elastographic method can be a useful tool for cardiac muscle research.
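    Once the shear wave speed has been measured from arrival times, elasticity follows from the standard soft-tissue approximation (nearly incompressible medium, Poisson ratio close to 0.5). The numbers below are illustrative, not the paper's measurements:

```python
def youngs_modulus_from_shear_wave(speed_m_s, density_kg_m3=1000.0):
    """Estimate Young's modulus (Pa): shear modulus mu = rho * c**2 and,
    for a nearly incompressible tissue, E = 3 * mu."""
    return 3.0 * density_kg_m3 * speed_m_s ** 2

# Hypothetical arrival-time tracking between two M-mode scan positions:
distance_m = 2.0e-3        # lateral scan separation (illustrative)
transit_time_s = 1.0e-3    # wavefront arrival-time difference (illustrative)
c = distance_m / transit_time_s         # shear wave speed: 2 m/s
E = youngs_modulus_from_shear_wave(c)   # 12 kPa
```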

  13. Hydrophobic ionic liquids for quantitative bacterial cell lysis with subsequent DNA quantification.

    PubMed

    Fuchs-Telka, Sabine; Fister, Susanne; Mester, Patrick-Julian; Wagner, Martin; Rossmanith, Peter

    2017-02-01

    DNA is one of the most frequently analyzed molecules in the life sciences. In this article we describe a simple and fast protocol for quantitative DNA isolation from bacteria based on hydrophobic ionic liquid supported cell lysis at elevated temperatures (120-150 °C) for subsequent PCR-based analysis. From a set of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide was identified as the most suitable for quantitative cell lysis and DNA extraction because of limited quantitative PCR inhibition by the aqueous eluate as well as no detectable DNA uptake. The newly developed method was able to efficiently lyse Gram-negative bacterial cells, whereas Gram-positive cells were protected by their thick cell wall. The performance of the final protocol resulted in quantitative DNA extraction efficiencies for Gram-negative bacteria similar to those obtained with a commercial kit, whereas the number of handling steps, and especially the time required, was dramatically reduced. Graphical abstract: After careful evaluation of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide ([BMPyr+][Ntf2-]) was identified as the most suitable ionic liquid for quantitative cell lysis and DNA extraction. When used for Gram-negative bacteria, the protocol presented is simple and very fast and achieves DNA extraction efficiencies similar to those obtained with a commercial kit. ddH2O, double-distilled water; qPCR, quantitative PCR.

  14. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating, owing to the predominance of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; the linear quantitative similarity (LQTS), however, clearly distinguished the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
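    The distinction the abstract draws, pattern-only similarity versus a similarity that also penalizes incorrect overall content, can be illustrated generically. The formulas and peak areas below are ours, not the paper's exact LQLS/LQTS definitions:

```python
import numpy as np

def qualitative_similarity(sample, reference):
    """Cosine (congruence) similarity: sensitive only to the peak pattern."""
    return float(np.dot(sample, reference)
                 / (np.linalg.norm(sample) * np.linalg.norm(reference)))

def quantitative_similarity(sample, reference):
    """Pattern similarity scaled by the total-content ratio, so a sample
    with the right pattern at the wrong concentration scores lower.
    A generic illustration, not the published LQTS definition."""
    ratio = sample.sum() / reference.sum()
    return qualitative_similarity(sample, reference) * min(ratio, 1.0 / ratio)

ref = np.array([10.0, 40.0, 25.0, 5.0])  # illustrative fingerprint peak areas
samp = 0.5 * ref                          # same pattern, half the content
```

    Here the qualitative similarity is 1.0 even though the sample is at half strength, while the quantitative similarity drops to 0.5, mirroring why a content-sensitive measure discriminates samples that a pattern-only measure cannot.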

  15. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  16. Quantitative Susceptibility Mapping after Sports-Related Concussion.

    PubMed

    Koch, K M; Meier, T B; Karr, R; Nencka, A S; Muftuler, L T; McCrea, M

    2018-06-07

    Quantitative susceptibility mapping using MR imaging can assess changes in brain tissue structure and composition. This report presents preliminary results demonstrating changes in tissue magnetic susceptibility after sports-related concussion. Longitudinal quantitative susceptibility mapping metrics were produced from imaging data acquired from cohorts of concussed and control football athletes. One hundred thirty-six quantitative susceptibility mapping datasets were analyzed across 3 separate visits (24 hours after injury, 8 days postinjury, and 6 months postinjury). Longitudinal quantitative susceptibility mapping group analyses were performed on stability-thresholded brain tissue compartments and selected subregions. Clinical concussion metrics were also measured longitudinally in both cohorts and compared with the measured quantitative susceptibility mapping. Statistically significant increases in white matter susceptibility were identified in the concussed athlete group during the acute (24 hour) and subacute (day 8) period. These effects were most prominent at the 8-day visit but recovered and showed no significant difference from controls at the 6-month visit. The subcortical gray matter showed no statistically significant group differences. Observed susceptibility changes after concussion appeared to outlast self-reported clinical recovery metrics at a group level. At an individual subject level, susceptibility increases within the white matter showed statistically significant correlations with return-to-play durations. The results of this preliminary investigation suggest that sports-related concussion can induce physiologic changes to brain tissue that can be detected using MR imaging-based magnetic susceptibility estimates. In group analyses, the observed tissue changes appear to persist beyond those detected on clinical outcome assessments and were associated with return-to-play duration after sports-related concussion. 
© 2018 by American Journal of Neuroradiology.

  17. Rapid Quantitative Determination of Squalene in Shark Liver Oils by Raman and IR Spectroscopy.

    PubMed

    Hall, David W; Marshall, Susan N; Gordon, Keith C; Killeen, Daniel P

    2016-01-01

    Squalene is sourced predominantly from shark liver oils and to a lesser extent from plants such as olives. It is used for the production of surfactants, dyes, sunscreen, and cosmetics. The economic value of shark liver oil is directly related to the squalene content, which in turn is highly variable and species-dependent. Presented here is a validated gas chromatography-mass spectrometry analysis method for the quantitation of squalene in shark liver oils, with an accuracy of 99.0%, a precision of 0.23% (standard deviation), and linearity of >0.999. The method has been used to measure the squalene concentration of 16 commercial shark liver oils. These reference squalene concentrations were related to infrared (IR) and Raman spectra of the same oils using partial least squares regression. The resultant models were suitable for the rapid quantitation of squalene in shark liver oils, with cross-validation r² values of >0.98 and root mean square errors of validation of ≤4.3% w/w. Independent test-set validation of these models found mean absolute deviations of 4.9 and 1.0% w/w for the IR and Raman models, respectively. Both techniques were more accurate than results obtained with an industrial refractive-index analysis method, which is used for rapid, cheap quantitation of squalene in shark liver oils. In particular, the Raman partial least squares regression was suited to quantitative squalene analysis. The intense and highly characteristic Raman bands of squalene made quantitative analysis possible irrespective of the lipid matrix.

  18. Do Mouthwashes Really Kill Bacteria?

    ERIC Educational Resources Information Center

    Corner, Thomas R.

    1984-01-01

    Errors in determining the effectiveness of mouthwashes, disinfectants, and other household products as antibacterial agents may result from using broth cultures and/or irregularly shaped bits of filter paper. Presents procedures for a better technique and, for advanced students, two additional procedures for introducing quantitative analysis into…

  19. The Concept Maps as a Didactic Resource Tool of Meaningful Learning in Astronomy Themes

    NASA Astrophysics Data System (ADS)

    Silveira, Felipa Pacífico Ribeiro de Assis; Mendonça, Conceição Aparecida Soares

    2015-07-01

    This article presents the results of an investigation that sought to understand the performance of the concept map (MC) as a teaching resource that facilitates meaningful learning of scientific concepts on astronomical themes, developed with elementary school students. The methodology employed to obtain and process the data combined quantitative and qualitative approaches. On the quantitative level, we designed a quasi-experimental study with a control group that did not use the MC and an experimental group that did, both evaluated at the beginning and end of the process. The performance of both groups is presented in a descriptive and analytical study. In the qualitative approach, the MCs were interpreted through their structure and the meanings the students assigned and shared during their presentations. The results, demonstrated through improved scores, show that the MC made a difference in conceptual learning and in certain skills revealed by learning indicators.

  20. The Adaptation of the Immigrant Second Generation in America: Theoretical Overview and Recent Evidence

    PubMed Central

    Portes, Alejandro; Fernández-Kelly, Patricia; Haller, William

    2013-01-01

    This paper summarises a research program on the new immigrant second generation initiated in the early 1990s and completed in 2006. The four field waves of the Children of Immigrants Longitudinal Study (CILS) are described and the main theoretical models emerging from it are presented and graphically summarised. After considering critical views of this theory, we present the most recent results from this longitudinal research program in the form of quantitative models predicting downward assimilation in early adulthood and qualitative interviews identifying ways to escape it by disadvantaged children of immigrants. Quantitative results strongly support the predicted effects of exogenous variables identified by segmented assimilation theory and identify the intervening factors during adolescence that mediate their influence on adult outcomes. Qualitative evidence gathered during the last stage of the study points to three factors that can lead to exceptional educational achievement among disadvantaged youths. All three indicate the positive influence of selective acculturation. Implications of these findings for theory and policy are discussed. PMID:23626483

  1. Sequential Extraction Results and Mineralogy of Mine Waste and Stream Sediments Associated With Metal Mines in Vermont, Maine, and New Zealand

    USGS Publications Warehouse

    Piatak, N.M.; Seal, R.R.; Sanzolone, R.F.; Lamothe, P.J.; Brown, Z.A.; Adams, M.

    2007-01-01

    We report results from sequential extraction experiments and the quantitative mineralogy of samples of stream sediments and mine wastes collected from metal mines. Samples were from the Elizabeth, Ely Copper, and Pike Hill Copper mines in Vermont, the Callahan Mine in Maine, and the Martha Mine in New Zealand. The extraction technique targeted the following operationally defined fractions and solid-phase forms: (1) soluble, adsorbed, and exchangeable fractions; (2) carbonates; (3) organic material; (4) amorphous iron- and aluminum-hydroxides and crystalline manganese-oxides; (5) crystalline iron-oxides; (6) sulfides and selenides; and (7) residual material. For most elements, the sum of an element across all extraction steps correlated well with the original unleached concentration. Also, comparing the quantitative mineralogy of the original material with that of the residues from two extraction steps gave insight into the effectiveness of the reagents at dissolving targeted phases. The data are presented here with minimal interpretation or discussion; further analyses and interpretation will be presented elsewhere.

  2. Extraction of Trivalent Actinides and Lanthanides from Californium Campaign Rework Solution Using TODGA-based Solvent Extraction System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benker, Dennis; Delmau, Laetitia Helene; Dryman, Joshua Cory

    This report presents the studies carried out to demonstrate the possibility of quantitatively extracting trivalent actinides and lanthanides from highly acidic solutions using a neutral ligand-based solvent extraction system. These studies stemmed from the perceived advantage of such systems over cation-exchange-based solvent extraction systems, which require an extensive feed adjustment to make a low-acid feed. The targeted feed solutions are highly acidic aqueous phases obtained after the dissolution of curium targets during a californium (Cf) campaign. Results obtained with actual Cf campaign solutions, highly diluted to be manageable in a glove box, are presented, followed by results of tests run in the hot cells with Cf campaign rework solutions. It was demonstrated that a solvent extraction system based on the tetraoctyl diglycolamide molecule is capable of quantitatively extracting trivalent actinides from highly acidic solutions. This system was validated using actual feeds from a Cf campaign.

  3. Quantitative analysis of 18F-NaF dynamic PET/CT cannot differentiate malignant from benign lesions in multiple myeloma

    PubMed Central

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-01-01

    A renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study is to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative, and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2; respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3, and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM. PMID:28913153

  4. Shear-induced aggregation dynamics in a polymer microrod suspension

    NASA Astrophysics Data System (ADS)

    Kumar, Pramukta S.

    A non-Brownian suspension of micron-scale rods is found to exhibit reversible shear-driven formation of disordered aggregates resulting in dramatic viscosity enhancement at low shear rates. Aggregate formation is imaged at low magnification using a combined rheometer and fluorescence microscope system. The size and structure of these aggregates are found to depend on shear rate and concentration, with larger aggregates present at lower shear rates and higher concentrations. Quantitative measurements of the early-stage aggregation process are modeled by collision-driven growth of porous structures, which shows that the aggregate density increases with shear rate. A Krieger-Dougherty-type constitutive relation and steady-state viscosity measurements are used to estimate the intrinsic viscosity of complex structures developed under shear. Higher-magnification images are collected and used to validate the aggregate size versus density relationship, as well as to obtain particle flow fields via PIV. The flow fields provide a tantalizing view of fluctuations involved in the aggregation process. Interaction strength is estimated via contact force measurements and JKR theory and found to be extremely strong in comparison to the shear forces present in the system, estimated using hydrodynamic arguments. All of the results are then combined to produce a consistent conceptual model of aggregation in the system that features testable consequences. These results represent a direct, quantitative, experimental study of aggregation and viscosity enhancement in a rod suspension, and demonstrate a strategy for inferring inaccessible microscopic geometric properties of a dynamic system through the combination of quantitative imaging and rheology.
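    A Krieger-Dougherty-type relation of the kind mentioned above can be sketched as follows. The hard-sphere default parameters and the 30%-solid aggregate are illustrative, not the paper's fitted values:

```python
def krieger_dougherty(phi, phi_max=0.64, intrinsic_visc=2.5):
    """Relative viscosity eta_r = (1 - phi/phi_max) ** (-[eta] * phi_max).
    Defaults are classic hard-sphere values; rods and porous aggregates
    have a higher intrinsic viscosity and a lower maximum packing."""
    return (1.0 - phi / phi_max) ** (-intrinsic_visc * phi_max)

# Porous aggregates immobilize suspending fluid, raising the effective
# volume fraction: a hypothetical aggregate that is only 30% solid
# triples the effective fraction of a 5% suspension.
phi = 0.05
eta_dispersed = krieger_dougherty(phi)
eta_aggregated = krieger_dougherty(phi / 0.30)
```

    This illustrates one route to viscosity enhancement at low shear rates: looser (less dense) aggregates at low shear mean a larger effective volume fraction and hence a higher relative viscosity.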

  5. Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)

    ERIC Educational Resources Information Center

    Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; Belitsky, Jason M.; Umbanhowar, Charles, Jr.; Overvoorde, Paul J.

    2017-01-01

    Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative…

  6. Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture

    Treesearch

    Thomas J. Dean

    1999-01-01

    Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average-live crown ratio and relative stand density...

  7. The implementation of quantitative electromyographic neuromuscular monitoring in an academic anesthesia department.

    PubMed

    Todd, Michael M; Hindman, Bradley J; King, Brian J

    2014-08-01

    Although experts agree on the importance of quantitative neuromuscular blockade monitoring, particularly for managing reversal, such monitoring is not in widespread use. We describe the processes and results of our departmental experience with the introduction of such quantitative monitoring. In mid-2010, the senior authors became concerned about the management of nondepolarizing neuromuscular blockers (NMB) by providers within the department, based on personal observations and on a review of a departmental quality assurance/adverse event database. This review indicated the occurrence of 2 to 4 reintubations/year in the postanesthesia care unit (PACU) that were deemed to be probably or possibly related to inadequate reversal. In response, quantitative blockade equipment (Datex-Ohmeda ElectroSensor™ EMG system) was installed in all our main operating rooms in January 2011. This introduction was accompanied by an extensive educational effort. Adoption of the system was slow; by mid-2011, the quantitative system was being used in <50% of cases involving nondepolarizing relaxants and adverse NMB-related events continued to occur. Therefore, starting in August 2011 and extending over the next 2 years, we performed a series of 5 separate sampling surveys in the PACU in which train-of-four (TOF) ratios were recorded in 409 tracheally extubated adult patients who had received nondepolarizing NMB (almost exclusively rocuronium) as well as in 73 patients who had not received any nondepolarizing NMB. After each survey, the results were presented to the entire department, along with discussions of individual cases, reviews of the recent literature regarding quantitative monitoring and further education regarding the use of the quantitative system. In the initial (August 2011) PACU survey of 96 patients receiving nondepolarizing NMBs, 31% had a TOF ratio ≤0.9, 17% had a ratio ≤0.8, and 4 patients (4%) had ratios ≤0.5. 
A record review showed that the quantitative monitoring system had been used to monitor reversal in only 51% of these patients, and 23% of patients had no evidence of any monitoring, including qualitative TOF assessment. By December of 2012 (after 2 interim PACU monitoring surveys), a fourth survey showed 15% of 101 monitored patients had a TOF ratio ≤0.9, and only 5% had ratios ≤0.8. (P < 0.05 vs August 2011). Clear documentation of reversal using the quantitative system was present in 83% of cases (P < 0.05 vs August 2011). A final survey in July 2013 showed nearly identical values to those from December 2012. The lowest TOF ratio observed in any patient not receiving a nondepolarizing NMB was 0.92. There were no changes in the patterns of either rocuronium or neostigmine use over the duration of the project (through December 2012), and there have been no cases of NMB-related reintubations in the PACU during the last 2 years. Implementation of universal electromyographic-based quantitative neuromuscular blockade monitoring required a sustained process of education along with repeated PACU surveys and feedback to providers. Nevertheless, this effort resulted in a significant reduction in the incidence of incompletely reversed patients in the PACU.
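    The surveys above hinge on the train-of-four ratio, which a quantitative monitor computes from measured twitch amplitudes as the fourth twitch over the first. A minimal sketch, with function names of our choosing and 0.9 as the recovery threshold the surveys report against:

```python
def tof_ratio(twitch_amplitudes):
    """Train-of-four ratio: fourth twitch amplitude over the first."""
    t1, _t2, _t3, t4 = twitch_amplitudes
    return t4 / t1

def adequate_recovery(ratio, threshold=0.9):
    """TOF ratio >= 0.9 is the commonly used criterion for adequate
    recovery from nondepolarizing blockade."""
    return ratio >= threshold

# Illustrative fade pattern from a quantitative (e.g., EMG) monitor:
ratio = tof_ratio([10.0, 9.0, 8.0, 7.0])   # 0.7: residual blockade
```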

  8. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a main structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorous quantitative evaluation of the interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that variations in the SAW velocity and reflectance are predicted to be sensitive to the adhesion condition. Experimental results by the v(z) technique and SAW velocity reconstruction verify the prediction.

  9. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    PubMed Central

    Lin, Yunfeng

    2015-01-01

    Bacteria such as Salmonella and E. coli present a great challenge in public health care in today's society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Using Salmonella DT104 as an example bacterial strain, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOP) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay can measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay's photothermal efficiency, with the aim of reducing the amount of GNPOPs in the assay to ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. Results suggest that our assay may serve as a promising candidate for qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  10. General quantitative genetic methods for comparative biology: phylogenies, taxonomies and multi-trait models for continuous and categorical characters.

    PubMed

    Hadfield, J D; Nakagawa, S

    2010-03-01

    Although many of the statistical techniques used in comparative biology were originally developed in quantitative genetics, subsequent development of comparative techniques has progressed in relative isolation. Consequently, many of the new and planned developments in comparative analysis already have well-tested solutions in quantitative genetics. In this paper, we take three recent publications that develop phylogenetic meta-analysis, either implicitly or explicitly, and show how they can be considered as quantitative genetic models. We highlight some of the difficulties with the proposed solutions, and demonstrate that standard quantitative genetic theory and software offer solutions. We also show how results from Bayesian quantitative genetics can be used to create efficient Markov chain Monte Carlo algorithms for phylogenetic mixed models, thereby extending their generality to non-Gaussian data. Of particular utility is the development of multinomial models for analysing the evolution of discrete traits, and the development of multi-trait models in which traits can follow different distributions. Meta-analyses often include a nonrandom collection of species for which the full phylogenetic tree has only been partly resolved. Using missing data theory, we show how the presented models can be used to correct for nonrandom sampling and show how taxonomies and phylogenies can be combined to give a flexible framework with which to model dependence.

  11. MO-D-213-06: Quantitative Image Quality Metrics Are for Physicists, Not Radiologists: How to Communicate to Your Radiologists Using Their Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szczykutowicz, T; Rubert, N; Ranallo, F

    Purpose: A framework for explaining differences in image quality to non-technical audiences in medial imaging is needed. Currently, this task is something that is learned “on the job.” The lack of a formal methodology for communicating optimal acquisition parameters into the clinic effectively mitigates many technological advances. As a community, medical physicists need to be held responsible for not only advancing image science, but also for ensuring its proper use in the clinic. This work outlines a framework that bridges the gap between the results from quantitative image quality metrics like detectability, MTF, and NPS and their effect on specificmore » anatomical structures present in diagnostic imaging tasks. Methods: Specific structures of clinical importance were identified for a body, an extremity, a chest, and a temporal bone protocol. Using these structures, quantitative metrics were used to identify the parameter space that should yield optimal image quality constrained within the confines of clinical logistics and dose considerations. The reading room workflow for presenting the proposed changes for imaging each of these structures is presented. The workflow consists of displaying images for physician review consisting of different combinations of acquisition parameters guided by quantitative metrics. Examples of using detectability index, MTF, NPS, noise and noise non-uniformity are provided. During review, the physician was forced to judge the image quality solely on those features they need for diagnosis, not on the overall “look” of the image. Results: We found that in many cases, use of this framework settled mis-agreements between physicians. Once forced to judge images on the ability to detect specific structures inter reader agreement was obtained. Conclusion: This framework will provide consulting, research/industrial, or in-house physicists with clinically relevant imaging tasks to guide reading room image review. 
    This framework avoids use of the overall “look” or “feel” to dictate acquisition parameter selection. Equipment grant: GE Healthcare.
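    The NPS mentioned above can be illustrated with a short, generic FFT routine: averaging the squared Fourier magnitude of repeated uniform-region patches yields a 2-D noise power spectrum. This is a minimal sketch under assumed conventions (patch-stack layout, pixel-area normalisation), not the implementation behind the abstract:

```python
import numpy as np

def noise_power_spectrum(rois, pixel_size_mm=1.0):
    """Estimate a 2-D noise power spectrum from a stack of uniform ROIs.

    rois -- array of shape (n, N, N): repeated patches from a uniform region.
    Returns the NPS in units of (pixel value)^2 * mm^2.
    """
    n, N, _ = rois.shape
    nps = np.zeros((N, N))
    for roi in rois:
        detrended = roi - roi.mean()                  # remove the DC component
        f = np.fft.fftshift(np.fft.fft2(detrended))   # centre zero frequency
        nps += np.abs(f) ** 2
    # normalise: pixel area / (pixels per ROI * number of ROIs)
    return nps * pixel_size_mm**2 / (N * N * n)
```

    By Parseval's theorem, integrating the estimated NPS over spatial frequency recovers the pixel variance, which serves as a convenient sanity check.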

  12. Spatial reconstruction of semi-quantitative precipitation fields over Africa during the nineteenth century from documentary evidence and gauge data

    NASA Astrophysics Data System (ADS)

    Nicholson, Sharon E.; Klotter, Douglas; Dezfuli, Amin K.

    2012-07-01

    The article presents a newly created precipitation data set for the African continent and describes the methodology used in its creation. It is based on a combination of proxy data and rain gauge records. The data set is semi-quantitative, with a "wetness" index of −3 to +3 to describe the quality of the rainy season. It covers the period AD 1801 to 1900 and includes data for 90 geographical regions of the continent. The results underscore a multi-decadal period of aridity early in the nineteenth century.

  13. Transient segregation behavior in Cd1-xZnxTe with low Zn content-A qualitative and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Neubert, M.; Jurisch, M.

    2015-06-01

    The paper analyzes experimental compositional profiles in Vertical Bridgman (VB, VGF) grown (Cd,Zn)Te crystals reported in the literature. The origin of the observed axial ZnTe-distribution profiles is attributed to dendritic growth after initial nucleation from supercooled melts. The analysis was done by utilizing a boundary layer model providing a very good approximation of the experimental data. In addition to the qualitative discussion, a quantitative analysis of the fitted model parameters is presented, as far as the utilized model permits.

  14. Quantitative analysis of pyroglutamic acid in peptides.

    PubMed

    Suzuki, Y; Motoi, H; Sato, K

    1999-08-01

    A simplified and rapid procedure for the determination of pyroglutamic acid in peptides was developed. The method involves the enzymatic cleavage of an N-terminal pyroglutamate residue using a thermostable pyroglutamate aminopeptidase and isocratic HPLC separation of the resulting enzymatic hydrolysate using a column switching technique. Pyroglutamate aminopeptidase from the thermophilic archaebacterium Pyrococcus furiosus cleaves the N-terminal pyroglutamic acid residue independent of the molecular weight of the substrate. It cleaves more than 85% of pyroglutamate from peptides whose molecular weights range from 362.4 to 4599.4 Da. Thus, a new method is presented that quantitatively estimates the N-terminal pyroglutamic acid residue in peptides.

  15. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed techniques for determining quantitative distributions of materials in sewage-sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  16. Analyzing the impacts of global trade and investment on non-communicable diseases and risk factors: a critical review of methodological approaches used in quantitative analyses.

    PubMed

    Cowling, Krycia; Thow, Anne Marie; Pollack Porter, Keshia

    2018-05-24

    A key mechanism through which globalization has impacted health is the liberalization of trade and investment, yet relatively few studies to date have used quantitative methods to investigate the impacts of global trade and investment policies on non-communicable diseases and risk factors. Recent reviews of this literature have found heterogeneity in results and a range of quality across studies, which may be in part attributable to a lack of conceptual clarity and methodological inconsistencies. This study is a critical review of methodological approaches used in the quantitative literature on global trade and investment and diet, tobacco, alcohol, and related health outcomes, with the objective of developing recommendations and providing resources to guide future robust, policy-relevant research. A review of reviews, expert review, and reference tracing were employed to identify relevant studies, which were evaluated using a novel quality assessment tool designed for this research. Eight review articles and 34 quantitative studies were identified for inclusion. Important ways to improve this literature were identified and discussed: clearly defining exposures of interest and not conflating trade and investment; exploring mechanisms of broader relationships; increasing the use of individual-level data; ensuring consensus and consistency in key confounding variables; utilizing more sector-specific versus economy-wide trade and investment indicators; testing and adequately adjusting for autocorrelation and endogeneity when using longitudinal data; and presenting results from alternative statistical models and sensitivity analyses. To guide the development of future analyses, recommendations for international data sources for selected trade and investment indicators, as well as key gaps in the literature, are presented.
More methodologically rigorous and consistent approaches in future quantitative studies on the impacts of global trade and investment policies on non-communicable diseases and risk factors can help to resolve inconsistencies of existing research and generate useful information to guide policy decisions.

  17. Decision-Making in Multiple Sclerosis Patients: A Systematic Review

    PubMed Central

    2018-01-01

    Background Multiple sclerosis (MS) is frequently associated with cognitive and behavioural deficits. A growing number of studies suggest an impact of MS on decision-making abilities. The aim of this systematic review was to assess if (1) performance of MS patients in decision-making tasks was consistently different from controls and (2) whether this modification was associated with cognitive dysfunction and emotional alterations. Methods The search was conducted on the Pubmed/Medline database. 12 studies evaluating the difference between MS patients and healthy controls using validated decision-making tasks were included. Outcomes considered were quantitative (net scores) and qualitative measurements (deliberation time and learning from feedback). Results Quantitative and qualitative decision-making impairment in MS was present in 64.7% of measurements. Patients were equally impaired in tasks for decision-making under risk and ambiguity. A correlation to other cognitive functions was present in 50% of cases, with the highest associations in the domains of processing speed and attentional capacity. Conclusions In MS patients, qualitative and quantitative modifications may be present in any kind of decision-making task and can appear independently of other cognitive measures. Since decision-making abilities have a significant impact on everyday life, this cognitive aspect is of considerable importance in various MS-related treatment settings. PMID:29721338

  18. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml−1/μMol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  19. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  20. Computational Study of the Richtmyer-Meshkov Instability with a Complex Initial Condition

    NASA Astrophysics Data System (ADS)

    McFarland, Jacob; Reilly, David; Greenough, Jeffrey; Ranjan, Devesh

    2014-11-01

    Results are presented for a computational study of the Richtmyer-Meshkov instability with a complex initial condition. This study covers experiments which will be conducted at the newly-built inclined shock tube facility at the Georgia Institute of Technology. The complex initial condition employed consists of an underlying inclined interface perturbation with a broadband spectrum of modes superimposed. A three-dimensional staggered mesh arbitrary Lagrange Eulerian (ALE) hydrodynamics code developed at Lawrence Livermore National Laboratory called ARES was used to obtain both qualitative and quantitative results. Qualitative results are discussed using time series of density plots from which mixing width may be extracted. Quantitative results are also discussed using vorticity fields, circulation components, and energy spectra. The inclined interface case is compared to the complex interface case in order to study the effect of initial conditions on shocked, variable-density flows.

  1. Quantitative targeting maps based on experimental investigations for a branched tube model in magnetic drug targeting

    NASA Astrophysics Data System (ADS)

    Gitter, K.; Odenbach, S.

    2011-12-01

    Magnetic drug targeting (MDT), because of its high targeting efficiency, is a promising approach for tumour treatment. Unwanted side effects are considerably reduced, since the nanoparticles are concentrated within the target region due to the influence of a magnetic field. Nevertheless, understanding the transport phenomena of nanoparticles in an artery system is still challenging. This work presents experimental results for a branched tube model. Quantitative results describe, for example, the net amount of nanoparticles that are targeted towards the chosen region due to the influence of a magnetic field. As a result of measurements, novel drug targeting maps are presented, combining, e.g. the magnetic volume force, the position of the magnet and the net amount of targeted nanoparticles. The targeting maps are valuable for evaluation and comparison of setups and are also helpful for the design and the optimisation of a magnet system with an appropriate strength and distribution of the field gradient. The maps indicate the danger of accretion within the tube and also show the promising result that up to 97% of the nanoparticles were successfully targeted.

  2. Airport Surface Traffic Control Concept Formulation Study : Volume 3. Operations Analysis of O'Hare Airport - Part 2.

    DOT National Transportation Integrated Search

    1975-07-01

    The volume presents the results of the quantitative analyses of the O'Hare ASTC System operations. The operations environments for the periods selected for detailed analysis of the ASDE films and controller communications recording are described. Fol...

  3. Miramar College Program Evaluation: Aviation Maintenance.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in aviation maintenance. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent…

  4. Industrial Chemistry, Science (Experimental): 5316.07.

    ERIC Educational Resources Information Center

    Scholz, Robert

    This unit of instruction presents some important and interesting processes carried on daily in industry and which result in products with which the student is familiar. The student will be responsible for learning some reactions involving these chemical processes and the quantitative calculations of these reactions. Fractional distillation,…

  5. Determination of Benzalkonium Chloride in Commercial Disinfectant Formulations by Quantitative NMR Spectroscopy

    DTIC Science & Technology

    2012-11-01

    disinfectant solutions containing benzalkonium chloride (BAC), a molluscicide and antifouling chemical. In order to determine the efficacy of this...formulations. The methods and results presented herein will be used in a separate study to assess the efficacy of BACs as antifouling agents under

  6. COMPARATIVE ANALYSIS OF HEALTH RISK ASSESSMENTS FOR MUNICIPAL WASTE COMBUSTORS

    EPA Science Inventory

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. his article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most co...

  7. Comparative study of the dynamics of lipid membrane phase decomposition in experiment and simulation.

    PubMed

    Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas

    2013-06-25

    Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained from numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
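    The structure-factor comparison can be sketched in a generic way: take the 2-D power spectrum of the mean-subtracted domain pattern, average it radially to obtain S(q), and extract a characteristic domain length from the first moment of S(q). A minimal illustration with assumed field layout and binning, not the authors' code:

```python
import numpy as np

def radial_structure_factor(field, nbins=32):
    """Radially averaged structure factor S(q) of a 2-D concentration field."""
    N = field.shape[0]
    f = np.fft.fftshift(np.fft.fft2(field - field.mean()))
    power = np.abs(f) ** 2 / field.size
    freqs = np.fft.fftshift(np.fft.fftfreq(N))        # cycles per pixel
    qx, qy = np.meshgrid(freqs, freqs)
    q = np.hypot(qx, qy)                              # radial wavenumber
    bins = np.linspace(0.0, q.max(), nbins + 1)
    idx = np.digitize(q.ravel(), bins) - 1
    s_q = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                    for i in range(nbins)])
    q_mid = 0.5 * (bins[:-1] + bins[1:])
    return q_mid, s_q

def mean_domain_size(q, s_q):
    """Characteristic length from the first moment of S(q)."""
    q1 = (q * s_q).sum() / s_q.sum()
    return 1.0 / q1
```

    Tracking this length over time for successive snapshots gives the growth-rate regimes the abstract refers to.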

  8. Quantitative performance of a polarization diffraction grating polarimeter encoded onto two liquid-crystal-on-silicon displays

    NASA Astrophysics Data System (ADS)

    Cofré, Aarón; Vargas, Asticio; Torres-Ruiz, Fabián A.; Campos, Juan; Lizana, Angel; del Mar Sánchez-López, María; Moreno, Ignacio

    2017-11-01

    We present a quantitative analysis of the performance of a complete snapshot polarimeter based on a polarization diffraction grating (PDGr). The PDGr is generated in a common path polarization interferometer with a Z optical architecture that uses two liquid-crystal on silicon (LCoS) displays to imprint two different phase-only diffraction gratings onto two orthogonal linear states of polarization. As a result, we obtain a programmable PDGr capable of acting as a simultaneous polarization state generator (PSG), yielding diffraction orders with different states of polarization. The same system is also shown to operate as a polarization state analyzer (PSA), therefore useful for the realization of a snapshot polarimeter. We analyze its performance using quantitative metrics such as the condition number, and verify its reliability for the detection of states of polarization.

  9. On the Agreement between Manual and Automated Methods for Single-Trial Detection and Estimation of Features from Event-Related Potentials

    PubMed Central

    Biurrun Manresa, José A.; Arguissain, Federico G.; Medina Redondo, David E.; Mørch, Carsten D.; Andersen, Ole K.

    2015-01-01

    The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not and the level of variation in the estimated values of its relevant features are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. Presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single-trial were evaluated independently by two human observers and two automated algorithms taken from existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen’s κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significant differences among methods in the detection and estimation of quantitative features. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and expected number of trials with/without response can play a significant role in the outcome of a study. PMID:26258532
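    The agreement statistics named above are standard and can be sketched briefly: Cohen's kappa for the categorical (peak present/absent) outcome, and Bland-Altman bias with 95% limits of agreement for the quantitative outcome. A generic illustration of the textbook definitions, not the study's analysis code:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (1 = ERP present, 0 = absent)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                      # observed agreement
    p_yes = a.mean() * b.mean()               # chance agreement on "present"
    p_no = (1 - a.mean()) * (1 - b.mean())    # chance agreement on "absent"
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

def bland_altman_limits(x, y):
    """Bias and 95% limits of agreement between two quantitative methods."""
    d = np.asarray(x, float) - np.asarray(y, float)
    bias = d.mean()
    spread = 1.96 * d.std(ddof=1)
    return bias, bias - spread, bias + spread
```

    Kappa of 0 means agreement no better than chance and 1 means perfect agreement; the Bland-Altman limits bound the method-to-method difference expected for 95% of trials.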

  10. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    PubMed

    Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen

    2013-01-01

    ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  11. Quantitative Proteomic Analysis of Human Lung Tumor Xenografts Treated with the Ectopic ATP Synthase Inhibitor Citreoviridin

    PubMed Central

    Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen

    2013-01-01

    ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy. PMID:23990911

  12. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

    Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure–Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them) and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied for a wide variety of other endpoints. The work is accompanied with an open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716

  13. Optical techniques for determination of normal shock position in supersonic flows for aerospace applications

    NASA Technical Reports Server (NTRS)

    Adamovsky, Grigory; Eustace, John G.

    1990-01-01

    Techniques for the quantitative determination of shock position in supersonic flows using direct and indirect methods are presented. A description of an experimental setup is also presented, different configurations of shock position sensing systems are explained, and some experimental results are given. All of the methods discussed are analyzed to determine the ease of technology transfer from the laboratory to in-flight operation.

  14. Evaluation of spacecraft toxic gas removal agents

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A study of the decomposition of various compounds adsorbed on charcoal was made, with a view toward providing a critical appraisal of previous data from charcoal adsorption studies. It was found that thermal decomposition occurs at temperatures lower than previously suspected during the charcoal stripping process. A discussion is presented dealing with the various types of reactions found. A rough, quantitative scheme for correcting previous analytical results is developed and presented.

  15. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse shape and size of sand sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for the case of lithified samples are limited to cases of high definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with the size of grain, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments from the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
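    Two of the shape parameters listed (circularity and a Fourier-descriptor measure) have standard definitions that can be sketched for a traced grain outline given as polygon vertices. This is a generic illustration with an assumed outline format, not the Mathematica package described in the abstract:

```python
import numpy as np

def polygon_metrics(xs, ys):
    """Area and perimeter of a closed grain outline given as vertex arrays."""
    x2, y2 = np.roll(xs, -1), np.roll(ys, -1)
    area = 0.5 * abs(np.sum(xs * y2 - x2 * ys))        # shoelace formula
    perimeter = np.sum(np.hypot(x2 - xs, y2 - ys))
    return area, perimeter

def circularity(area, perimeter):
    """4*pi*A / P**2: equals 1 for a circle, smaller for irregular grains."""
    return 4.0 * np.pi * area / perimeter**2

def fourier_descriptors(xs, ys, n=10):
    """First n Fourier descriptors of the outline, normalised to the first
    harmonic so the measure is scale-invariant."""
    z = xs + 1j * ys
    mags = np.abs(np.fft.fft(z)[1:n + 1])
    return mags / mags[0]
```

    For a unit square the circularity is exactly pi/4, and a near-circular outline has descriptors dominated by the first harmonic, which is why such measures separate rounded aeolian grains from angular ones.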

  16. Ongoing advances in quantitative PpIX fluorescence guided intracranial tumor resection (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Olson, Jonathan D.; Kanick, Stephen C.; Bravo, Jaime J.; Roberts, David W.; Paulsen, Keith D.

    2016-03-01

    Aminolevulinic acid-induced protoporphyrin IX (ALA-PpIX) is being investigated as a biomarker to guide neurosurgical resection of brain tumors. ALA-PpIX fluorescence can be observed visually in the surgical field; however, raw fluorescence emissions can be distorted by factors other than the fluorophore concentration. Specifically, fluorescence emissions are mixed with autofluorescence and attenuated by background absorption and scattering properties of the tissue. Recent work at Dartmouth has developed advanced fluorescence detection approaches that return quantitative assessments of PpIX concentration, which are independent of background optical properties. The quantitative fluorescence imaging (qFI) approach has increased sensitivity to residual disease within the resection cavity at the end of surgery that was not visible to the naked eye through the operating microscope. This presentation outlines clinical observations made during an ongoing investigation of ALA-PpIX based guidance of tumor resection. PpIX fluorescence measurements made in a wide-field hyperspectral imaging approach are co-registered with point-assessment using a fiber optic probe. Data show variations in the measured PpIX accumulation among different clinical tumor grades (i.e. high grade glioma, low grade glioma), types (i.e. primary tumors, metastases) and normal structures of interest (e.g. normal cortex, hippocampus). These results highlight the contrast enhancement and underscore the potential clinical benefit offered from quantitative measurements of PpIX concentration during resection of intracranial tumors.

  17. Identification, quantitation, and method validation for flavan-3-ols in fermented ready-to-drink teas from the Italian market using HPLC-UV/DAD and LC-MS/MS.

    PubMed

    Cordero, Chiara; Canale, Francesca; Del Rio, Daniele; Bicchi, Carlo

    2009-11-01

    The present study is focused on flavan-3-ols characterizing the antioxidant properties of fermented tea (Camellia sinensis). These bioactive compounds, object of nutritional claims in commercial products, should be quantified with rigorous analytical procedures whose accuracy and precision have been established with a stated level of confidence. An HPLC-UV/DAD method, able to detect and quantify flavan-3-ols in infusions and ready-to-drink teas, has been developed for routine analysis and validated by characterizing several performance parameters. The accuracy assessment has been run through a series of LC-MS/MS analyses. Epigallocatechin, (+)-catechin, (-)-epigallocatechingallate, (-)-epicatechin, (-)-gallocatechingallate, (-)-epicatechingallate, and (-)-catechingallate were chosen as markers of the polyphenolic fraction. Quantitative results showed that samples obtained from tea leaves infusion were richer in polyphenolic antioxidants than those obtained through other industrial processes. The influence of shelf-life and packaging material on the flavan-3-ols content was also considered; markers decreased, with an exponential trend, as a function of time within the shelf life, while packaging materials were shown to influence the flavan-3-ol fraction composition differently over time. The method presented here provides quantitative results with a certain level of confidence and is suitable for routine quality control of iced teas whose antioxidant properties are the object of nutritional claims.

  18. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths, with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases, which together afflict a large part of humankind. PMID:22369037
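    The pipeline sketched above (phenotypes represented as time-series, a pairwise similarity measure, then clustering) can be illustrated in a few lines of Python. The motility curves, the Euclidean metric, and average-linkage clustering are illustrative assumptions, not necessarily the authors' exact choices.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Hypothetical motility time-series for six parasites, sampled hourly:
# values near 1 = normal movement, values near 0 = immobilized by the drug.
rng = np.random.default_rng(0)
responder = 1.0 / (1.0 + np.exp(0.5 * (np.arange(24) - 8)))  # sigmoidal decline
unaffected = np.full(24, 0.9)
series = np.vstack([responder + rng.normal(0, 0.02, 24) for _ in range(3)] +
                   [unaffected + rng.normal(0, 0.02, 24) for _ in range(3)])

# Pairwise distance between whole time-series, then agglomerative clustering.
tree = linkage(pdist(series, metric="euclidean"), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)  # responders and unaffected worms fall into separate clusters
```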

  19. Image-Based Quantification of Plant Immunity and Disease.

    PubMed

    Laflamme, Bradley; Middleton, Maggie; Lo, Timothy; Desveaux, Darrell; Guttman, David S

    2016-12-01

    Measuring the extent and severity of disease is a critical component of plant pathology research and crop breeding. Unfortunately, existing visual scoring systems are qualitative and subjective, their results are difficult to transfer between research groups, and existing quantitative methods can be quite laborious. Here, we present plant immunity and disease image-based quantification (PIDIQ), a quantitative, semi-automated system to rapidly and objectively measure disease symptoms in a biologically relevant context. PIDIQ applies an ImageJ-based macro to plant photos in order to distinguish healthy tissue from tissue that has yellowed due to disease. It can process a directory of images in an automated manner and report the relative ratios of healthy to diseased leaf area, thereby providing a quantitative measure of plant health that can be statistically compared with appropriate controls. We used the Arabidopsis thaliana-Pseudomonas syringae model system to show that PIDIQ is able to identify both enhanced plant health associated with effector-triggered immunity and elevated disease symptoms associated with effector-triggered susceptibility. Finally, we show that the quantitative results provided by PIDIQ correspond to those obtained via traditional in planta pathogen growth assays. PIDIQ provides a simple and effective means to nondestructively quantify disease from whole plants, and we believe it will be equally effective for monitoring disease on excised leaves and stems.
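    The core PIDIQ measurement (classify plant pixels as healthy green versus yellowed, then report the ratio) can be sketched with NumPy. The color rule and thresholds below are hypothetical; the actual tool is an ImageJ macro with its own calibrated segmentation.

```python
import numpy as np

def health_ratio(rgb):
    """Ratio of 'healthy' (green) to 'diseased' (yellowed) plant pixels.

    Crude illustrative rule: plant pixels have green above blue; yellowed
    tissue has red close to green, healthy tissue has green well above red.
    """
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    plant = g > b                      # assume a bluish/dark background
    healthy = plant & (g > 1.2 * r)
    diseased = plant & ~healthy
    return healthy.sum() / max(diseased.sum(), 1)

# Synthetic 2x2 "photo": two green pixels, one yellow pixel, one background.
img = np.array([[[30, 200, 40], [40, 180, 30]],
                [[200, 190, 30], [10, 20, 200]]], dtype=np.uint8)
print(health_ratio(img))  # 2 healthy / 1 diseased -> 2.0
```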

  20. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. To improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data must be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, together with a stability analysis of the estimator. Theoretical analysis and simulation results confirm that derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated on beer and marzipan spectra, whose derivative spectra are used to build calibration models with partial least squares (PLS) regression. The results show that PLS based on the new estimator achieves better performance than the Savitzky-Golay algorithm and can serve as an alternative for quantitative analytical applications.
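    The Savitzky-Golay baseline that the SPSE is compared against is available in SciPy; the sketch below shows how a smoothed first derivative resolves two overlapping synthetic bands. The SPSE itself is not publicly available, so only the baseline is shown, on made-up data.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic "spectrum": two overlapping Gaussian bands plus noise.
x = np.linspace(0, 10, 500)
bands = np.exp(-((x - 4.0) ** 2) / 0.5) + 0.6 * np.exp(-((x - 5.5) ** 2) / 0.5)
noisy = bands + np.random.default_rng(1).normal(0, 0.01, x.size)

# Savitzky-Golay smoothing and first derivative in one step.
d1 = savgol_filter(noisy, window_length=31, polyorder=3, deriv=1,
                   delta=x[1] - x[0])

# Positive-to-negative zero crossings of the derivative locate band maxima.
crossings = x[:-1][(d1[:-1] > 0) & (d1[1:] <= 0)]
print(crossings)  # includes crossings near the two band centers, 4.0 and 5.5
```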

  1. Quantitative Measurement of Trans-Fats by Infrared Spectroscopy

    ERIC Educational Resources Information Center

    Walker, Edward B.; Davies, Don R.; Campbell, Mike

    2007-01-01

    Trans-fat is a general term mainly used to describe the various trans geometric isomers present in unsaturated fatty acids. Various techniques are now used for quantitative measurement of the amount of trans-fats present in foods and cooking oils.

  2. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of fundus examination. The study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was constructed. Two kinds of quantitative descriptors, based on geometric and morphological features, were then proposed, and a retinal abnormality grading decision-making method was put forward and applied to the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail on four retinal OCT images with different degrees of abnormality; the grading results verified that the method can distinguish abnormality severity and lesion regions. On 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. It obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, enable abnormality judgment for a target image and provide a reference for disease diagnosis.
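    The two figures of merit reported above are straightforward to compute from a confusion table; a small sketch with invented labels (not the paper's 150-image test set).

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).

    Labels: 1 = abnormal retina, 0 = normal retina.
    """
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical gradings for ten OCT images.
truth      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
prediction = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(truth, prediction)
print(sens, spec)  # 0.75 0.8333...
```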

  3. Targeted liquid chromatography tandem mass spectrometry to quantitate wheat gluten using well-defined reference proteins

    PubMed Central

    Schalk, Kathrin; Koehler, Peter

    2018-01-01

    Celiac disease (CD) is an inflammatory disorder of the upper small intestine caused by the ingestion of storage proteins (prolamins and glutelins) from wheat, barley, rye, and, in rare cases, oats. CD patients need to follow a gluten-free diet by consuming gluten-free products with gluten contents of less than 20 mg/kg. Currently, the recommended method for the quantitative determination of gluten is an enzyme-linked immunosorbent assay (ELISA) based on the R5 monoclonal antibody. Because the R5 ELISA mostly detects the prolamin fraction of gluten, a new independent method is required to detect prolamins as well as glutelins. This paper presents the development of a method to quantitate 16 wheat marker peptides derived from all wheat gluten protein types by liquid chromatography tandem mass spectrometry (LC-MS/MS) in the multiple reaction monitoring mode. The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference wheat protein type resulted in peptide-specific yields, which enabled the conversion of peptide concentrations into protein type concentrations. Gluten contents were expressed as the sum of all determined protein type concentrations. The new method was applied to quantitate gluten in wheat starches and compared to the R5 ELISA and to gel-permeation high-performance liquid chromatography with fluorescence detection (GP-HPLC-FLD); the LC-MS/MS results correlated strongly with both methods. PMID:29425234
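    The conversion step described above (peptide concentration divided by its peptide-specific yield gives a protein type concentration; gluten is the sum over protein types) can be sketched as follows. All peptide names, yields, and concentrations are invented, and averaging peptides that report on the same protein type is an assumption about how replicate markers are combined.

```python
# yield = (amount of marker peptide recovered) / (amount of reference
# protein type digested), determined once per peptide on reference material.
peptide_yield = {"pep_gliadin_1": 0.042, "pep_gliadin_2": 0.031,
                 "pep_glutenin_1": 0.055}
protein_of = {"pep_gliadin_1": "alpha-gliadin", "pep_gliadin_2": "alpha-gliadin",
              "pep_glutenin_1": "HMW-glutenin"}

def gluten_content(peptide_conc):
    """Convert peptide concentrations (mg/kg) to protein type concentrations
    via peptide-specific yields, average markers of the same type, then sum."""
    by_type = {}
    for pep, conc in peptide_conc.items():
        by_type.setdefault(protein_of[pep], []).append(conc / peptide_yield[pep])
    return sum(sum(v) / len(v) for v in by_type.values())

measured = {"pep_gliadin_1": 0.21, "pep_gliadin_2": 0.155, "pep_glutenin_1": 0.11}
print(round(gluten_content(measured), 1))  # 5.0 + 2.0 -> 7.0 mg/kg gluten
```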

  4. An optically coupled system for quantitative monitoring of MRI gradient currents induced into endocardial leads.

    PubMed

    Mattei, E; Calcagnini, G; Triventi, M; Delogu, A; Del Guercio, M; Angeloni, A; Bartolini, P

    2013-01-01

    The time-varying gradient fields generated during Magnetic Resonance Imaging (MRI) procedures have the potential to induce electrical currents on implanted endocardial leads. Whether these currents can result in undesired cardiac stimulation is unknown. This paper presents an optically coupled system able to quantitatively measure the currents induced by gradient fields in endocardial leads during MRI procedures. The system is based on a microcontroller that works as an analog-to-digital (A/D) converter and sends the current signal acquired from the lead to a high-speed light-emitting-diode optical transmitter. A plastic optical fiber guides the light outside the MRI chamber to a photodiode receiver and then to an acquisition board connected to a PC. A preliminary characterization of the performance of the system is also presented.

  5. Corneal topography with high-speed swept source OCT in clinical examination

    PubMed Central

    Karnowski, Karol; Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Gora, Michalina; Wojtkowski, Maciej

    2011-01-01

    We present the applicability of high-speed swept source (SS) optical coherence tomography (OCT) for quantitative evaluation of corneal topography. A high-speed OCT device operating at 108,000 lines/s permits dense 3D imaging of the anterior segment in less than a quarter of a second, minimizing the influence of motion artifacts on the final images and topographic analysis. The swept laser performance was specially adapted to meet the imaging depth requirements. For the first time to our knowledge, the results of a quantitative corneal analysis based on SS OCT are presented for clinical pathologies such as keratoconus, a cornea with a superficial postinfectious scar, and a cornea 5 months after penetrating keratoplasty. Additionally, a comparison with widely used commercial systems, a Placido-based topographer and a Scheimpflug imaging-based topographer, is demonstrated. PMID:21991558

  6. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed that support rapid advances in biological research. In this work, we discuss progress in stable isotope labeling methods for both relative and absolute quantitative proteomics, and then give our opinions on the outlook for proteome quantification methods.

  7. Dissociative conceptual and quantitative problem solving outcomes across interactive engagement and traditional format introductory physics

    NASA Astrophysics Data System (ADS)

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-12-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes in IE-based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment comprising three representative conservation of energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on the Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than for IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics in gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than do the kinds of problems emphasized in the courses and homework and their overlap with those on the assessment.
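    Concept-inventory gains of the kind discussed above are conventionally reported as Hake's normalized gain; the abstract does not state the exact metric used, so the sketch below shows only the standard formula.

```python
def normalized_gain(pre, post, max_score=30):
    """Hake's normalized gain <g> = (post - pre) / (max_score - pre).

    max_score=30 matches the Force Concept Inventory's 30 items; whether
    this exact metric was used in the study is an assumption.
    """
    return (post - pre) / (max_score - pre)

# A student moving from 12/30 to 21/30 realizes half of the available gain.
print(normalized_gain(12, 21))  # -> 0.5
```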

  8. Retroperitoneal tumour radiotherapy: clinical improvements using kilovoltage cone beam computed tomography.

    PubMed

    Juan-Senabre, Xavier J; Ferrer-Albiach, Carlos; Rodríguez-Cordón, Marta; Santos-Serra, Agustín; López-Tarjuelo, Juan; Calzada-Feliu, Salvador

    2009-04-01

    We present the clinical case of a patient diagnosed with a retroperitoneal sarcoma who received preoperative treatment with daily verification via kilovoltage cone beam computed tomography. We compare the benefit of this treatment with conventional treatment without image guidance, reporting quantitative results.

  9. Research Application Paper

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to compare and contrast two types of scholarly article designs, quantitative and qualitative, as to how two research designs can be similar and different, and how the authors conduct their research and present their results. When researching and analyzing two scholarly articles of different design types, it is…

  10. Implications for Application of Qualitative Methods to Library and Information Science Research.

    ERIC Educational Resources Information Center

    Grover, Robert; Glazier, Jack

    1985-01-01

    Presents conceptual framework for library and information science research and analyzes research methodology that has application for information science, using as example results of study conducted by authors. Rationale for use of qualitative research methods in theory building is discussed and qualitative and quantitative research methods are…

  11. Outcome Evaluation: Student Development Program, Special Studies Division, Cleveland State University.

    ERIC Educational Resources Information Center

    Pasch, Marvin

    Techniques and procedures used to evaluate the outcomes of the student development program, and to use the evaluation results, are presented. Specific evaluation questions are posed that address overall outcomes, not individual student outcomes, and quantitative measures are suggested to accompany the questions. The measures include statistical…

  12. Miramar College Program Evaluation: Fire Science.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in fire sciences. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent to…

  13. Examining Secondary Special Education Teachers' Literacy Instructional Practices

    ERIC Educational Resources Information Center

    Leko, Melinda M.; Handy, Tamara; Roberts, Carly A.

    2017-01-01

    This study presents findings from a survey of secondary special education teachers who teach reading. Respondents were 577 special education teachers from a large Midwestern state who completed an online or mail survey. Results based on quantitative and qualitative analyses indicate predominant foci of secondary special education teachers' reading…

  14. Quantitative Assessment of Interutterance Stability: Application to Dysarthria

    ERIC Educational Resources Information Center

    Cummins, Fred; Lowit, Anja; van Brenk, Frits

    2014-01-01

    Purpose: Following recent attempts to quantify articulatory impairment in speech, the present study evaluates the usefulness of a novel measure of motor stability to characterize dysarthria. Method: The study included 8 speakers with ataxic dysarthria (AD), 16 speakers with hypokinetic dysarthria (HD) as a result of Parkinson's disease, and…

  15. Thermal noise in space-charge-limited hole current in silicon

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Golder, J.; Nicolet, M.

    1972-01-01

    Present theories on noise in single-carrier space-charge-limited currents in solids have not been quantitatively substantiated by experimental evidence. To obtain such experimental verification, the noise in specially fabricated silicon structures is being measured and analyzed. The first results of this verification effort are reported.

  16. The Undergraduate Spanish Major Curriculum: Realities and Faculty Perceptions

    ERIC Educational Resources Information Center

    Hertel, Tammy Jandrey; Dings, Abby

    2014-01-01

    This article presents the quantitative and qualitative results of a nationwide survey of Spanish department faculty on the components of their undergraduate Spanish major curriculum and their perceptions of these components, as well as their perceptions of recent Modern Language Association (MLA) reports (2007, 2009) and the reports'…

  17. Exploring the SoTL Landscape at the University of Saskatchewan

    ERIC Educational Resources Information Center

    Wuetherick, Brad; Yu, Stan; Greer, Jim

    2016-01-01

    This paper presents the results of a quantitative study that comprehensively assessed the level and extent to which the Scholarship of Teaching and Learning (SoTL) was being conducted amongst faculty and staff at the University of Saskatchewan, and identifies the barriers and challenges faced by SoTL practitioners.

  18. An Experimental Ecological Study of a Garden Compost Heap.

    ERIC Educational Resources Information Center

    Curds, Tracy

    1985-01-01

    A quantitative study of the fauna of a garden compost heap shows it to be similar to that of organisms found in soil and leaf litter. Materials, methods, and results are discussed and extensive tables of fauna lists, wet/dry masses, and statistical analyses are presented. (Author/DH)

  19. AN ENZYME LINKED IMMUNOSORBENT ASSAY (ELISA) METHOD FOR MONITORING 2,4 DICHLOROPHENOXYACETIC ACID (2,4-D) EXPOSURES

    EPA Science Inventory

    Abstract describes a streamlined ELISA method developed to quantitatively measure 2,4-D in human urine samples. Method development steps and comparison with gas chromatography/mass spectrometry are presented. Results indicated that the ELISA method could be used as a high throu...

  20. Methods for assessing geodiversity

    NASA Astrophysics Data System (ADS)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2017-04-01

    The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field confrontation. Looking to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.

  1. A quantitative study of nanoparticle skin penetration with interactive segmentation.

    PubMed

    Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook

    2016-10-01

    In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown, and the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to a human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing, followed by statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was achieved after 2 days in the negative charge group but only after 4 days in the positive charge group, indicating a difference in time course. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result agrees with the qualitative assessment, it is meaningful in that it was established by statistical analysis of quantities derived through image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs and provide important guidance for the design of NPs for biomedical applications.
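    The five image parameters named above can be computed directly from a grayscale fluorescence image. A sketch using ImageJ-style definitions; the threshold that defines the area fraction is an assumed parameter.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def fluorescence_stats(img, threshold):
    """Mean, integrated density, skewness, kurtosis, and area fraction
    (share of pixels above `threshold`) of a grayscale image."""
    flat = img.astype(float).ravel()
    return {
        "mean": flat.mean(),
        "integrated_density": flat.sum(),
        "skewness": skew(flat),
        "kurtosis": kurtosis(flat),
        "area_fraction": (flat > threshold).mean(),
    }

# Tiny synthetic image: dark field with a bright fluorescent patch.
img = np.array([[0, 0, 0, 0],
                [0, 200, 220, 0],
                [0, 0, 0, 0],
                [0, 0, 0, 0]], dtype=np.uint8)
stats = fluorescence_stats(img, threshold=50)
print(stats["area_fraction"])  # 2 of 16 pixels above threshold -> 0.125
```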

  2. Assigning Quantitative Function to Post-Translational Modifications Reveals Multiple Sites of Phosphorylation That Tune Yeast Pheromone Signaling Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pincus, David; Ryan, Christopher J.; Smith, Richard D.

    2013-03-12

    Cell signaling systems transmit information by post-translationally modifying signaling proteins, often via phosphorylation. While thousands of sites of phosphorylation have been identified in proteomic studies, the vast majority of sites have no known function. Assigning functional roles to the catalog of uncharacterized phosphorylation sites is a key research challenge. Here we present a general approach to address this challenge and apply it to a prototypical signaling pathway, the pheromone response pathway in Saccharomyces cerevisiae. The pheromone pathway includes a mitogen activated protein kinase (MAPK) cascade activated by a G-protein coupled receptor (GPCR). We used mass spectrometry-based proteomics to identify sites whose phosphorylation changed when the system was active, and evolutionary conservation to assign priority to a list of candidate MAPK regulatory sites. We made targeted alterations in those sites, and measured the effects of the mutations on pheromone pathway output in single cells. Our work identified six new sites that quantitatively tuned system output. We developed simple computational models to find system architectures that recapitulated the quantitative phenotypes of the mutants. Our results identify a number of regulated phosphorylation events that contribute to adjust the input-output relationship of this model eukaryotic signaling system. We believe this combined approach constitutes a general means not only to reveal modification sites required to turn a pathway on and off, but also those required for more subtle quantitative effects that tune pathway output. Our results further suggest that relatively small quantitative influences from individual regulatory phosphorylation events endow signaling systems with plasticity that evolution may exploit to quantitatively tailor signaling outcomes.

  3. Evidence Regarding the Impact of Conflicts of Interest on Environmental and Occupational Health Research.

    PubMed

    Wells, Ellen M

    2017-06-01

    This review describes published literature providing evidence of financial conflicts of interest in environmental and occupational health research. Secondary goals were to describe evidence from studies that (a) used quantitative methods to evaluate the association of conflicts with study outcomes and (b) assessed undisclosed as well as disclosed conflicts of interest. Forty-three studies were identified that described the impact of financial conflicts of interest on research results; 11 of these conducted quantitative analyses to demonstrate these relationships. All 11 articles that quantified associations identified significant associations between the presence of financial conflicts of interest and study findings. In the studies that measured undisclosed conflicts, these comprised a substantial proportion of all conflicts. Suggestions for improving the understanding and interpretation of research results are presented.

  4. Quantitative analysis of the correlations in the Boltzmann-Grad limit for hard spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulvirenti, M.

    2014-12-09

    In this contribution I consider the problem of the validity of the Boltzmann equation for a system of hard spheres in the Boltzmann-Grad limit. I briefly review the results available nowadays, with a particular emphasis on the celebrated validity theorem of Lanford. Finally I present some recent results, obtained in collaboration with S. Simonella, concerning a quantitative analysis of the propagation of chaos. More precisely, we introduce a quantity (the correlation error) measuring how far a j-particle rescaled correlation function at time t (sufficiently small) is from full statistical independence. Roughly speaking, a correlation error of order k measures (in the context of the BBGKY hierarchy) the event in which k tagged particles form a recolliding group.

  5. Application of scenario analysis and multiagent technique in land-use planning: a case study on Sanjiang wetlands.

    PubMed

    Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values that raise two key questions: first, how to view different planning simulation results instantly and feed those results back to interactively assist planning work; and second, how to ensure that the planning simulation results are scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to close the gap between the ability to construct computer simulation models that aid integrated land-use plan making and the demand for them by planning professionals. The study presents a theoretical framework for land-use planning based on the integration of scenario analysis (SA) and multiagent system (MAS) simulation, with the freshwater wetlands of the Sanjiang Plain of China as a case study area. Results showed that the MAS simulation technique, which emphasizes the quantitative process, effectively complemented the SA method, which emphasizes the qualitative process; this achieved an organic combination of qualitative and quantitative land-use planning work and provides a new approach for land-use planning and the sustainable management of land resources.

  6. Application of Scenario Analysis and Multiagent Technique in Land-Use Planning: A Case Study on Sanjiang Wetlands

    PubMed Central

    Ni, Shi-Jun; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values that raise two key questions: first, how to view different planning simulation results instantly and feed those results back to interactively assist planning work; and second, how to ensure that the planning simulation results are scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches to land-use planning, and to find a way to close the gap between the ability to construct computer simulation models that aid integrated land-use plan making and the demand for them by planning professionals. The study presents a theoretical framework for land-use planning based on the integration of scenario analysis (SA) and multiagent system (MAS) simulation, with the freshwater wetlands of the Sanjiang Plain of China as a case study area. Results showed that the MAS simulation technique, which emphasizes the quantitative process, effectively complemented the SA method, which emphasizes the qualitative process; this achieved an organic combination of qualitative and quantitative land-use planning work and provides a new approach for land-use planning and the sustainable management of land resources. PMID:23818816

  7. Validation of HPLC and UV spectrophotometric methods for the determination of meropenem in pharmaceutical dosage form.

    PubMed

    Mendez, Andreas S L; Steppe, Martin; Schapoval, Elfrides E S

    2003-12-04

    A high-performance liquid chromatographic method and a UV spectrophotometric method for the quantitative determination of meropenem, a highly active carbapenem antibiotic, in powder for injection were developed in the present work. The parameters linearity, precision, accuracy, specificity, robustness, limit of detection, and limit of quantitation were studied according to International Conference on Harmonization guidelines. Chromatography was carried out by a reversed-phase technique on an RP-18 column with a mobile phase composed of 30 mM monobasic phosphate buffer and acetonitrile (90:10; v/v), adjusted to pH 3.0 with orthophosphoric acid. The UV spectrophotometric method was performed at 298 nm. The samples were prepared in water, and the stability of meropenem in aqueous solution at 4 and 25 degrees C was studied. The results were satisfactory, with good stability after 24 h at 4 degrees C. Statistical analysis by Student's t-test showed no significant difference between the results obtained by the two methods. The proposed methods are highly sensitive, precise, and accurate and can be used for the reliable quantitation of meropenem in pharmaceutical dosage form.
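
    The two-method comparison described above rests on a pooled-variance Student's t statistic; a minimal pure-Python sketch follows. The recovery values are hypothetical placeholders, since the paper's raw data are not reproduced here.

```python
from statistics import mean, variance

def students_t(a, b):
    """Pooled-variance two-sample Student's t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# hypothetical assay recoveries (%) from the two methods
hplc = [99.8, 100.2, 99.5, 100.1, 99.9]
uv = [99.6, 100.4, 99.7, 100.0, 100.1]
t = students_t(hplc, uv)
# compare |t| with the two-sided critical value (2.306 for 8 d.f., alpha = 0.05)
```

    A |t| below the critical value, as for these numbers, is the kind of result that supports the paper's conclusion that the two methods agree.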

  8. Quantitatively differentiating microstructural variations of skeletal muscle tissues by multispectral Mueller matrix imaging

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2016-10-01

    Polarized light is sensitive to the microstructures of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire backscattering polarization Mueller matrix images of bovine skeletal muscle tissues over a 24-hour experimental period and analyze their multispectral behavior using quantitative Mueller matrix parameters. During the rigor mortis and proteolysis of the muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements reveal rich qualitative structural information. In addition, we analyze the temporal variations of the samples using multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis in bovine skeletal muscle samples can be distinguished by these MMT parameters. The results presented in this work show that, combined with the multispectral technique, the FDHs and MMT parameters can characterize the microstructural variation of skeletal muscle tissues. These techniques have the potential to be used as tools for the quantitative assessment of meat quality in the food industry.

  9. Evaluation of Deblur Methods for Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, William M.

    2014-03-31

    Radiography is used as a primary diagnostic for dynamic experiments, providing time-resolved radiographic measurements of areal mass density along a line of sight through the experiment. It is well known that the finite spot extent of the radiographic source, as well as scattering, are sources of blurring of the radiographic images. This blurring interferes with quantitative measurement of the areal mass density. In order to improve the quantitative utility of this diagnostic, it is necessary to deblur or “restore” the radiographs to recover the “true” areal mass density from a radiographic transmission measurement. Towards this end, I am evaluating three separate methods currently in use for deblurring radiographs. I begin by briefly describing the problems associated with image restoration, and outlining the three methods. Next, I illustrate how blurring affects the quantitative measurements using radiographs. I then present the results of the various deblur methods, evaluating each according to several criteria. After I have summarized the results of the evaluation, I give a detailed account of how the restoration process is actually implemented.

  10. MR Imaging in Spinocerebellar Ataxias: A Systematic Review.

    PubMed

    Klaes, A; Reckziegel, E; Franca, M C; Rezende, T J R; Vedolin, L M; Jardim, L B; Saute, J A

    2016-08-01

    Polyglutamine expansion spinocerebellar ataxias are autosomal dominant slowly progressive neurodegenerative diseases with no current treatment. MR imaging is the best-studied surrogate biomarker candidate for polyglutamine expansion spinocerebellar ataxias, though with conflicting results. We aimed to review quantitative central nervous system MR imaging technique findings in patients with polyglutamine expansion spinocerebellar ataxias and correlations with well-established clinical and molecular disease markers. We searched MEDLINE, LILACS, and Cochrane databases of clinical trials between January 1995 and January 2016 for quantitative MR imaging volumetric approaches, MR spectroscopy, diffusion tensor imaging, or other quantitative techniques, comparing patients with polyglutamine expansion spinocerebellar ataxias (SCAs) with controls. Pertinent details for each study regarding participants, imaging methods, and results were extracted. After reviewing the 706 results, 18 studies were suitable for inclusion: 2 studies in SCA1, 1 in SCA2, 15 in SCA3, 1 in SCA7, 1 in SCA1 and SCA6 presymptomatic carriers, and none in SCA17 and dentatorubropallidoluysian atrophy. Cerebellar hemispheres and vermis, whole brain stem, midbrain, pons, medulla oblongata, cervical spine, striatum, and thalamus presented significant atrophy in SCA3. The caudate, putamen, and whole brain stem presented similar sensitivity to change compared with ataxia scales after 2 years of follow-up in a single prospective study in SCA3. MR spectroscopy and DTI showed abnormalities only in cross-sectional studies in SCA3. Results from single studies in other polyglutamine expansion spinocerebellar ataxias should be replicated in different cohorts. Additional cross-sectional and prospective volumetric analysis, MR spectroscopy, and DTI studies are necessary in polyglutamine expansion spinocerebellar ataxias. The properties of MR imaging as a preclinical (presymptomatic) disease biomarker should be targeted in future studies. © 2016 by American Journal of Neuroradiology.

  11. Novel Sessile Drop Software for Quantitative Estimation of Slag Foaming in Carbon/Slag Interactions

    NASA Astrophysics Data System (ADS)

    Khanna, Rita; Rahman, Mahfuzur; Leow, Richard; Sahajwalla, Veena

    2007-08-01

    Novel video-processing software has been developed for the sessile drop technique for rapid and quantitative estimation of slag foaming. The data processing was carried out in two stages: the first stage involved the initial transformation of digital video/audio signals into a format compatible with the computing software, and the second stage involved the computation of the slag droplet volume and area of contact in a chosen video frame. Experimental results are presented on slag foaming from a synthetic graphite/slag system at 1550 °C. This technique can be used for determining the extent and stability of the foam as a function of time.
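
    The second-stage volume computation can be approximated as a solid of revolution over the droplet's digitized profile. The sketch below assumes one radius per image row; all radii and the row height are hypothetical values, not taken from the paper.

```python
import math

def drop_volume(radii, dz):
    """Approximate the volume of an axisymmetric droplet by stacking
    discs: one disc of radius r and height dz per digitized image row."""
    return math.pi * dz * sum(r * r for r in radii)

def contact_area(base_radius):
    """Area of the droplet/substrate contact circle."""
    return math.pi * base_radius ** 2

# hypothetical per-row profile radii (mm) extracted from one video frame
radii = [0.0, 0.6, 0.9, 1.0, 1.0, 0.9, 0.8]
volume = drop_volume(radii, dz=0.1)   # mm^3
area = contact_area(radii[-1])        # mm^2; the last row rests on the substrate
```

    Tracking `volume` and `area` frame by frame gives the extent and stability of foaming as a function of time.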

  12. MBTH: A novel approach to rapid, spectrophotometric quantitation of total algal carbohydrates

    DOE PAGES

    Van Wychen, Stefanie; Long, William; Black, Stuart K.; ...

    2016-11-24

    A high-throughput and robust application of the 3-methyl-2-benzothiazolinone hydrazone (MBTH) method was developed for carbohydrate determination in microalgae. The traditional phenol-sulfuric acid method to quantify carbohydrates is strongly affected by algal biochemical components and exhibits a highly variable response to microalgal monosaccharides. We present a novel use of the MBTH method to accurately quantify carbohydrates in hydrolyzate after acid hydrolysis of algal biomass, without a need for neutralization. As a result, the MBTH method demonstrated consistent and sensitive quantitation of algae-specific monosaccharides down to 5 µg mL⁻¹ without interference from other algae acidic hydrolyzate components.
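
    Spectrophotometric quantitation of this kind rests on a linear calibration curve; the sketch below fits one by ordinary least squares and back-calculates an unknown. The standard concentrations and absorbances are hypothetical and assume an ideal linear response.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# hypothetical glucose standards (ug/mL) and their absorbances
conc = [5.0, 10.0, 25.0, 50.0, 100.0]
absorbance = [0.031, 0.062, 0.155, 0.310, 0.620]
slope, intercept = fit_line(conc, absorbance)

# back-calculate an unknown sample from its measured absorbance
unknown = (0.217 - intercept) / slope  # ≈ 35.0 ug/mL for these numbers
```
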

  13. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
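
    The risk-adjusted net present value idea can be illustrated with a minimal sketch: the yearly "cash flow" of a fire safety investment is the reduction in expected annual fire loss (probability times consequence) that it buys. All monetary figures below are hypothetical placeholders, not values from the case study.

```python
def npv(cashflows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical: a sprinkler system costs 100 up front and cuts the
# expected annual fire loss from 15 to 3 over a 15-year horizon
years, rate = 15, 0.05
risk_reduction = 15.0 - 3.0
cashflows = [-100.0] + [risk_reduction] * years
risk_adjusted_npv = npv(cashflows, rate)  # positive => the investment pays off
```
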

  14. Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area

    NASA Astrophysics Data System (ADS)

    Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua

    2018-04-01

    The GF-2 satellite has the highest spatial resolution in the history of China's remote sensing satellite development. In this study, three traditional fusion methods, Brovey, Gram-Schmidt, and Color Normalized (CN), were compared with the newer NNDiffuse fusion method using qualitative assessment and quantitative fusion quality indices, including information entropy, variance, mean gradient, deviation index, and spectral correlation coefficient. The analysis shows that the NNDiffuse method performed best in both the qualitative and quantitative analyses, making it more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.

  15. Quantitative values of blood flow through the human forearm, hand, and finger as functions of temperature

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.

    1974-01-01

    A literature search was made to obtain values of human forearm, hand and finger blood flow as functions of environmental temperature. The sources used include both government and laboratory reports and the research presented in the open literature. An attempt was made to review many of the more quantitative noninvasive determinations and to collate the results in such a way as to yield blood flow values for each body segment as continuous functions of temperature. A brief review of the various ways used to measure blood flow is included along with an abstract of each work from which data was taken.

  16. MBTH: A novel approach to rapid, spectrophotometric quantitation of total algal carbohydrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Long, William; Black, Stuart K.

    A high-throughput and robust application of the 3-methyl-2-benzothiazolinone hydrazone (MBTH) method was developed for carbohydrate determination in microalgae. The traditional phenol-sulfuric acid method to quantify carbohydrates is strongly affected by algal biochemical components and exhibits a highly variable response to microalgal monosaccharides. We present a novel use of the MBTH method to accurately quantify carbohydrates in hydrolyzate after acid hydrolysis of algal biomass, without a need for neutralization. As a result, the MBTH method demonstrated consistent and sensitive quantitation of algae-specific monosaccharides down to 5 µg mL⁻¹ without interference from other algae acidic hydrolyzate components.

  17. Survey of quantitative data on the solar energy and its spectra distribution

    NASA Technical Reports Server (NTRS)

    Thekaekara, M. P.

    1976-01-01

    This paper presents a survey of available quantitative data on the total and spectral solar irradiance at ground level and outside the atmosphere. Measurements from research aircraft have resulted in the currently accepted NASA/ASTM standards of the solar constant and zero air mass solar spectral irradiance. The intrinsic variability of solar energy output and programs currently under way for more precise measurements from spacecraft are discussed. Instrumentation for solar measurements and their reference radiation scales are examined. Insolation data available from the records of weather stations are reviewed for their applicability to solar energy conversion. Two alternate methods of solarimetry are briefly discussed.

  18. Do Junior High School Students Perceive Their Learning Environment as Constructivist?

    NASA Astrophysics Data System (ADS)

    Moustafa, Asely; Ben-Zvi-Assaraf, Orit; Eshach, Haim

    2013-08-01

    The purpose of this study is to examine the manner in which the features of a constructivist learning environment, and the mechanisms at its base, are expressed in junior high school students' conceptions. Our research is based on an integration of quantitative and qualitative approaches, designed to provide a wider ranging and deeper understanding. Eight hundred and forty eighth- and ninth-grade students from over 15 schools participated in the study. Of the 840 students who completed the questionnaire, the explanations in 200 well-written questionnaires were further analyzed qualitatively. The findings of the study are presented in terms of the four scales employed in the CLES, namely the autonomy scale, the prior knowledge scale, the negotiation scale, and the student-centeredness scale. The quantitative results achieved here concur with parallel studies conducted around the world. The findings indicate that a considerable portion of the students perceive their learning environment as a constructivist one and report positive attitudes toward the way they are being taught. In terms of the qualitative results, however, it appears that in some cases, the students' explanations reveal that in fact, and contrary to the bare quantitative results, some students do not perceive their learning environment as being constructivist. This raises the question of whether the fact that students recognize the factors associated with constructivist teaching is indeed an indication that such teaching exists in practice. This finding emphasizes the importance of combining qualitative and quantitative methods for arriving at a balanced view of classroom occurrences.

  19. Variance in total levels of phospholipase C zeta (PLC-ζ) in human sperm may limit the applicability of quantitative immunofluorescent analysis as a diagnostic indicator of oocyte activation capability.

    PubMed

    Kashir, Junaid; Jones, Celine; Mounce, Ginny; Ramadan, Walaa M; Lemmon, Bernadette; Heindryckx, Bjorn; de Sutter, Petra; Parrington, John; Turner, Karen; Child, Tim; McVeigh, Enda; Coward, Kevin

    2013-01-01

    To examine whether similar levels of phospholipase C zeta (PLC-ζ) protein are present in sperm from men whose ejaculates resulted in normal oocyte activation, and to examine whether a predominant pattern of PLC-ζ localization is linked to normal oocyte activation ability. Laboratory study. University laboratory. Control subjects (men with proven oocyte activation capacity; n = 16) and men whose sperm resulted in recurrent intracytoplasmic sperm injection failure (oocyte activation deficient [OAD]; n = 5). Quantitative immunofluorescent analysis of PLC-ζ protein in human sperm. Total levels of PLC-ζ fluorescence, proportions of sperm exhibiting PLC-ζ immunoreactivity, and proportions of PLC-ζ localization patterns in sperm from control and OAD men. Sperm from control subjects presented a significantly higher proportion of sperm exhibiting PLC-ζ immunofluorescence compared with infertile men diagnosed with OAD (82.6% and 27.4%, respectively). Total levels of PLC-ζ in sperm from individual control and OAD patients exhibited significant variance, with sperm from 10 out of 16 (62.5%) exhibiting levels similar to OAD samples. Predominant PLC-ζ localization patterns varied between control and OAD samples with no predictable or consistent pattern. The results indicate that sperm from control men exhibited significant variance in total levels of PLC-ζ protein, as well as significant variance in the predominant localization pattern. Such variance may hinder the diagnostic application of quantitative PLC-ζ immunofluorescent analysis. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  20. Evaluation of reference gene suitability for quantitative expression analysis by quantitative polymerase chain reaction in the mandibular condyle of sheep.

    PubMed

    Jiang, Xin; Xue, Yang; Zhou, Hongzhi; Li, Shouhong; Zhang, Zongmin; Hou, Rui; Ding, Yuxiang; Hu, Kaijin

    2015-10-01

    Reference genes are commonly used as a reliable approach to normalize the results of quantitative polymerase chain reaction (qPCR), and to reduce errors in the relative quantification of gene expression. Suitable reference genes belonging to numerous functional classes have been identified for various types of species and tissue. However, little is currently known regarding the most suitable reference genes for bone, specifically for the sheep mandibular condyle. Sheep are important for the study of human bone diseases, particularly for temporomandibular diseases. The present study aimed to identify a set of reference genes suitable for the normalization of qPCR data from the mandibular condyle of sheep. A total of 12 reference genes belonging to various functional classes were selected, and the expression stability of the reference genes was determined in both the normal and fractured area of the sheep mandibular condyle. RefFinder, which integrates the following currently available computational algorithms: geNorm, NormFinder, BestKeeper, and the comparative ΔCt method, was used to compare and rank the candidate reference genes. The results obtained from the four methods demonstrated a similar trend: RPL19, ACTB, and PGK1 were the most stably expressed reference genes in the sheep mandibular condyle. As determined by RefFinder comprehensive analysis, the results of the present study suggested that RPL19 is the most suitable reference gene for studies associated with the sheep mandibular condyle. In addition, ACTB and PGK1 may be considered suitable alternatives.
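
    The comparative ΔCt component of RefFinder can be sketched as follows: a candidate's stability value is the mean standard deviation of its per-sample Ct difference against every other candidate, with lower values meaning more stable expression. RPL19 and ACTB are named in the abstract; "GENE_X" and all Ct values below are hypothetical.

```python
from statistics import stdev

def delta_ct_stability(ct, gene):
    """Mean stdev of the per-sample Ct difference between `gene`
    and each other candidate; lower = more stably expressed."""
    sds = [stdev(a - b for a, b in zip(ct[gene], ct[other]))
           for other in ct if other != gene]
    return sum(sds) / len(sds)

# hypothetical Ct values for three candidates across four condyle samples
ct = {
    "RPL19": [18.1, 18.3, 18.0, 18.2],
    "ACTB": [17.2, 17.4, 17.1, 17.3],
    "GENE_X": [20.0, 22.5, 19.0, 24.0],
}
ranked = sorted(ct, key=lambda g: delta_ct_stability(ct, g))  # most stable first
```
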

  1. An Investigation of Proposed Techniques for Quantifying Confidence in Assurance Arguments

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2016-01-01

    The use of safety cases in certification raises the question of assurance argument sufficiency and the issue of confidence (or uncertainty) in the argument's claims. Some researchers propose to model confidence quantitatively and to calculate confidence in argument conclusions. We know of little evidence to suggest that any proposed technique would deliver trustworthy results when implemented by system safety practitioners. Proponents do not usually assess the efficacy of their techniques through controlled experiment or historical study. Instead, they present an illustrative example where the calculation delivers a plausible result. In this paper, we review current proposals, claims made about them, and evidence advanced in favor of them. We then show that proposed techniques can deliver implausible results in some cases. We conclude that quantitative confidence techniques require further validation before they should be recommended as part of the basis for deciding whether an assurance argument justifies fielding a critical system.

  2. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

    According to the experimental result of signal transmission and neuronal energetic demands being tightly coupled to information coding in the cerebral cortex, the authors present a brand new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Due to the energy coding model's ability to reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.

  3. Energy coding in biological neural networks

    PubMed Central

    Zhang, Zhikang

    2007-01-01

    According to the experimental result of signal transmission and neuronal energetic demands being tightly coupled to information coding in the cerebral cortex, we present a brand new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Due to the energy coding model's ability to reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513

  4. Human low vision image warping - Channel matching considerations

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.; Smith, Alan T.; Loshin, David S.

    1992-01-01

    We are investigating the possibility that a video image may productively be warped prior to presentation to a low vision patient. This could form part of a prosthesis for certain field defects. We have done preliminary quantitative studies on some notions that may be valid in calculating the image warpings. We hope the results will help make best use of time to be spent with human subjects, by guiding the selection of parameters and their range to be investigated. We liken a warping optimization to opening the largest number of spatial channels between the pixels of an input imager and resolution cells in the visual system. Some important effects are not quantified that will require human evaluation, such as local 'squashing' of the image, taken as the ratio of eigenvalues of the Jacobian of the transformation. The results indicate that the method shows quantitative promise. These results have identified some geometric transformations to evaluate further with human subjects.

  5. Image restoration using aberration taken by a Hartmann wavefront sensor on extended object, towards real-time deconvolution

    NASA Astrophysics Data System (ADS)

    Darudi, Ahmad; Bakhshi, Hadi; Asgari, Reza

    2015-05-01

    In this paper we present the results of image restoration using data taken by a Hartmann sensor. The aberration is measured by a Hartmann sensor in which the object itself is used as the reference. The Point Spread Function (PSF) is then simulated and used for image reconstruction using the Lucy-Richardson technique. A technique is presented for quantitative evaluation of the Lucy-Richardson deconvolution technique.

  6. Characterizing microstructural features of biomedical samples by statistical analysis of Mueller matrix images

    NASA Astrophysics Data System (ADS)

    He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui

    2017-03-01

    As one of the salient features of light, polarization contains abundant structural and optical information of media. Recently, as a comprehensive description of polarization property, the Mueller matrix polarimetry has been applied to various biomedical studies such as cancerous tissues detections. In previous works, it has been found that the structural information encoded in the 2D Mueller matrix images can be presented by other transformed parameters with more explicit relationship to certain microstructural features. In this paper, we present a statistical analyzing method to transform the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results of porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures, the depolarization power, diattenuation and absorption abilities. It is shown in this paper that the statistical analysis of 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
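
    The FDH-plus-central-moments analysis can be sketched with NumPy; the synthetic Gaussian image below stands in for a real 2D Mueller matrix element image, and the bin count is an arbitrary choice.

```python
import numpy as np

def fdh_and_moments(image, bins=64):
    """Frequency distribution histogram of a Mueller matrix element
    image plus low-order central moments of its pixel values."""
    hist, edges = np.histogram(image, bins=bins, density=True)
    mu = image.mean()
    moments = {f"m{k}": ((image - mu) ** k).mean() for k in (2, 3, 4)}
    return hist, edges, mu, moments

# synthetic stand-in for one element image (e.g. m22) of a fibrous sample
rng = np.random.default_rng(0)
m22 = rng.normal(loc=0.8, scale=0.05, size=(128, 128))
hist, edges, mu, moments = fdh_and_moments(m22)
# m2 is the variance; m3 and m4 capture the FDH's asymmetry and peakedness
```
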

  7. Quantitative phase imaging of biological cells and tissues using singleshot white light interference microscopy and phase subtraction method for extended range of measurement

    NASA Astrophysics Data System (ADS)

    Mehta, Dalip Singh; Sharma, Anuradha; Dubey, Vishesh; Singh, Veena; Ahmad, Azeem

    2016-03-01

    We present single-shot white light interference microscopy for the quantitative phase imaging (QPI) of biological cells and tissues. A common-path white light interference microscope is developed, and a colorful white light interferogram is recorded by a three-chip color CCD camera. The recorded white light interferogram is decomposed into its red, green, and blue color wavelength component interferograms, which are processed to obtain the refractive index (RI) for the different color wavelengths. The decomposed interferograms are analyzed using the local model fitting (LMF) algorithm developed for reconstructing the phase map from a single interferogram. LMF is a slightly off-axis interferometric QPI method that requires only a single image, so it is fast and accurate. The present method is very useful for dynamic processes where the path length changes at the millisecond level. From the single interferogram, wavelength-dependent quantitative phase images of human red blood cells (RBCs) are reconstructed and the refractive index is determined. The LMF algorithm is simple to implement and computationally efficient. The results are compared with conventional phase-shifting interferometry and Hilbert transform techniques.

  8. A two-factor error model for quantitative steganalysis

    NASA Astrophysics Data System (ADS)

    Böhme, Rainer; Ker, Andrew D.

    2006-02-01

    Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
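
    The two-factor idea, a between-image component (each cover image's systematic bias) plus a within-image component (scatter over repeated embeddings in the same image), can be sketched as a simple variance decomposition. The error values below are synthetic, not from the paper's experiments.

```python
from statistics import mean, pvariance

def two_factor_decomposition(errors_by_image):
    """Split message-length estimation errors into between-image variance
    (variance of per-image biases) and mean within-image variance."""
    biases = [mean(errs) for errs in errors_by_image]
    between = pvariance(biases)
    within = mean(pvariance(errs) for errs in errors_by_image)
    return between, within

# synthetic errors: 3 cover images x 4 repeated embeddings each
errors = [
    [0.010, 0.012, 0.009, 0.011],      # image with a positive bias
    [-0.030, -0.028, -0.031, -0.029],  # image with a negative bias
    [0.002, 0.001, 0.003, 0.000],
]
between, within = two_factor_decomposition(errors)
# here the between-image factor dominates, i.e. a relative-bias effect
```
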

  9. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    PubMed Central

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-01-01

    Local surface charge density of lipid membranes influences membrane–protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values. PMID:27561322

  10. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  11. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for the general case that satisfy these requirements can be found in the literature. A drawback of those approaches is that they handle specific cases poorly, as in some medical procedures, where both quantitative and qualitative information are available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can handle qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
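
    In the spirit of the paper's Modified Naive Bayes (though not its exact formulation), the sketch below scores classes by combining one quantitative variable (Gaussian likelihood) with one qualitative variable (a categorical probability). Every parameter, threshold, and variable name here is hypothetical.

```python
import math

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def classify(force, angle_ok, classes):
    """Naive Bayes over needle force (quantitative, Gaussian) and
    insertion-angle correctness (qualitative, Bernoulli)."""
    scores = {
        name: (c["prior"] * gauss_pdf(force, c["mu"], c["sd"])
               * (c["p_angle_ok"] if angle_ok else 1.0 - c["p_angle_ok"]))
        for name, c in classes.items()
    }
    return max(scores, key=scores.get)

# hypothetical expert-derived parameters for the bone marrow harvest task
classes = {
    "good": {"prior": 0.5, "mu": 10.0, "sd": 1.0, "p_angle_ok": 0.9},
    "poor": {"prior": 0.5, "mu": 16.0, "sd": 3.0, "p_angle_ok": 0.4},
}
grade = classify(force=10.5, angle_ok=True, classes=classes)  # → "good"
```
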

  12. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even while the component is still being designed, so as to virtually verify its inspectability.

  13. Qualitative and Quantitative Analyses of Glycogen in Human Milk.

    PubMed

    Matsui-Yatsuhashi, Hiroko; Furuyashiki, Takashi; Takata, Hiroki; Ishida, Miyuki; Takumi, Hiroko; Kakutani, Ryo; Kamasaka, Hiroshi; Nagao, Saeko; Hirose, Junko; Kuriki, Takashi

    2017-02-22

    Identification and detailed analysis of glycogen in human milk have not been reported previously. The present study confirmed by qualitative and quantitative analyses that glycogen is present in human milk. High-performance anion-exchange chromatography (HPAEC) and high-performance size-exclusion chromatography with a multiangle laser light scattering detector (HPSEC-MALLS) were used for qualitative analysis of glycogen in human milk. Quantitative analysis was carried out on individual milk samples. The results revealed that the concentration of glycogen in human milk varied depending on the mother's condition, such as the period postpartum and inflammation. The amounts of glycogen in human milk collected at 0 and 1-2 months postpartum were higher than in milk collected at 3-14 months postpartum. In milk from mothers with severe mastitis, the concentration of glycogen was about 40 times higher than that in normal milk.

  14. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    NASA Astrophysics Data System (ADS)

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-08-01

    Local surface charge density of lipid membranes influences membrane-protein interactions, leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite this significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids, with results accurately matching theoretical values.

  15. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding.

    PubMed

    Shahi, Payam; Kim, Samuel C; Haliburton, John R; Gartner, Zev J; Abate, Adam R

    2017-03-14

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and to multiplex essentially without limit.

  16. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding

    NASA Astrophysics Data System (ADS)

    Shahi, Payam; Kim, Samuel C.; Haliburton, John R.; Gartner, Zev J.; Abate, Adam R.

    2017-03-01

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and to multiplex essentially without limit.

  17. Abseq: Ultrahigh-throughput single cell protein profiling with droplet microfluidic barcoding

    PubMed Central

    Shahi, Payam; Kim, Samuel C.; Haliburton, John R.; Gartner, Zev J.; Abate, Adam R.

    2017-01-01

    Proteins are the primary effectors of cellular function, including cellular metabolism, structural dynamics, and information processing. However, quantitative characterization of proteins at the single-cell level is challenging due to the tiny amount of protein available. Here, we present Abseq, a method to detect and quantitate proteins in single cells at ultrahigh throughput. Like flow and mass cytometry, Abseq uses specific antibodies to detect epitopes of interest; however, unlike these methods, antibodies are labeled with sequence tags that can be read out with microfluidic barcoding and DNA sequencing. We demonstrate this novel approach by characterizing surface proteins of different cell types at the single-cell level and distinguishing between the cells by their protein expression profiles. DNA-tagged antibodies provide multiple advantages for profiling proteins in single cells, including the ability to amplify low-abundance tags to make them detectable with sequencing, to use molecular indices for quantitative results, and to multiplex essentially without limit. PMID:28290550

  18. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

    Accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is difficult in practice, owing to self-absorption of the major elements' lines, the scarcity of observable spectral lines of trace elements, and the need for relative efficiency correction of the experimental system. To overcome these difficulties, a standard reference line (SRL) combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and the Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with the OPC method, and the intercept with the OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
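
    The plasma-temperature step mentioned above, the Boltzmann plot, reduces to a linear regression: ln(Iλ/gA) plotted against the upper-level energy E has slope -1/(k_B T). The sketch below synthesizes line intensities for an assumed 10 000 K plasma and recovers that temperature from the fitted slope; the transition constants are invented for illustration, not real Fe line data.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(lines):
    """lines: (intensity, wavelength, g, A, E_upper_eV) tuples; returns T in K."""
    xs = [e for _, _, _, _, e in lines]
    ys = [math.log(i * w / (g * a)) for i, w, g, a, e in lines]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / (K_B_EV * slope)   # slope of the Boltzmann plot = -1/(k_B T)

# Synthesize intensities for an assumed 10 000 K plasma (invented constants)
TRUE_T = 10_000.0
lines = []
for e_upper in [3.0, 3.5, 4.2, 4.8, 5.5]:
    g, a, wavelength = 1.0, 1.0e8, 500.0
    intensity = (g * a / wavelength) * math.exp(-e_upper / (K_B_EV * TRUE_T))
    lines.append((intensity, wavelength, g, a, e_upper))

t_est = boltzmann_temperature(lines)  # recovers approximately 10 000 K
```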

  19. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy.

    PubMed

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-08-26

    Local surface charge density of lipid membranes influences membrane-protein interactions, leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite this significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids, with results accurately matching theoretical values.

  20. Identification of downy mildew resistance gene candidates by positional cloning in maize (Zea mays subsp. mays; Poaceae)1

    PubMed Central

    Kim, Jae Yoon; Moon, Jun-Cheol; Kim, Hyo Chul; Shin, Seungho; Song, Kitae; Kim, Kyung-Hee; Lee, Byung-Moo

    2017-01-01

    Premise of the study: Positional cloning in combination with phenotyping is a general approach to identify disease-resistance gene candidates in plants; however, it requires several time-consuming steps including population or fine mapping. Therefore, in the present study, we suggest a new combined strategy to improve the identification of disease-resistance gene candidates. Methods and Results: Downy mildew (DM)–resistant maize was selected from five cultivars using a spreader row technique. Positional cloning and bioinformatics tools were used to identify the DM-resistance quantitative trait locus marker (bnlg1702) and 47 protein-coding gene annotations. Eventually, five DM-resistance gene candidates, including bZIP34, Bak1, and Ppr, were identified by quantitative reverse-transcription PCR (RT-PCR) without fine mapping of the bnlg1702 locus. Conclusions: The combined protocol with the spreader row technique, quantitative trait locus positional cloning, and quantitative RT-PCR was effective for identifying DM-resistance candidate genes. This cloning approach may be applied to other whole-genome-sequenced crops or resistance to other diseases. PMID:28224059

  1. Modified HS-SPME for determination of quantitative relations between low-molecular oxygen compounds in various matrices.

    PubMed

    Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P

    2016-09-07

    Quantitative relations between individual constituents of a liquid sample similar to those established by its direct injection can be obtained by applying a polydimethylsiloxane (PDMS) fiber in a headspace solid-phase microextraction (HS-SPME) system containing the examined sample suspended in methyl silicone oil. This paper shows that the analogous system, composed of a sample suspension/emulsion in polyethylene glycol (PEG) and a Carbowax fiber, makes it possible to obtain quantitative relations between components of the mixture similar to those established by direct analysis, but only for polar constituents. This is demonstrated for essential oil (EO) components of savory, sage, mint and thyme, and for an artificial liquid mixture of polar constituents. The observed differences in quantitative relations between polar constituents estimated by the two procedures are insignificant (Fexp < Fcrit). The presented results indicate that the wider applicability of a system composed of a sample suspended in an oil of the same physicochemical character as the SPME fiber coating strongly depends on the character of the analyte-suspending liquid and analyte-fiber coating interactions. Copyright © 2016 Elsevier B.V. All rights reserved.
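
    The (Fexp < Fcrit) criterion above is a variance-ratio (F) test for the equivalence of the two procedures. A minimal sketch, with hypothetical peak areas and a critical value taken from a standard F table (0.05 significance level, 4 and 4 degrees of freedom):

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def f_statistic(a, b):
    va, vb = variance(a), variance(b)
    return max(va, vb) / min(va, vb)  # larger variance goes in the numerator

direct = [12.1, 11.8, 12.3, 12.0, 11.9]       # direct-injection peak areas (hypothetical)
spme   = [12.4, 11.6, 12.2, 12.1, 11.7]       # HS-SPME peak areas (hypothetical)

f_exp = f_statistic(direct, spme)
F_CRIT = 6.39  # tabulated F(0.05; 4, 4)

equivalent = f_exp < F_CRIT  # True: no significant difference between procedures
```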

  2. Quantitative refractive index distribution of single cell by combining phase-shifting interferometry and AFM imaging.

    PubMed

    Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu

    2017-05-31

    Cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive system for quantitative measurement of the refractive index of single cells, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology with nanoscale resolution is obtained with AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change of the refractive index distribution during daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, and the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, and this will facilitate its application in revealing cell structure and pathological state from a new perspective.
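
    The combination described above rests on the optical path-length relation Δφ = 2πh(n_cell - n_medium)/λ: the PSI phase map supplies Δφ and the AFM height map supplies h, so the refractive index follows per pixel. A minimal sketch of that inversion, with hypothetical phase, height, and wavelength values (the paper's calibration steps are omitted):

```python
import math

def cell_refractive_index(delta_phi, height_nm, wavelength_nm, n_medium=1.334):
    """Invert delta_phi = 2*pi*h*(n_cell - n_medium)/lambda for n_cell."""
    return n_medium + delta_phi * wavelength_nm / (2 * math.pi * height_nm)

# Hypothetical pixel: 2.1 rad phase shift over a 2000 nm cell in buffer, HeNe laser
n = cell_refractive_index(delta_phi=2.1, height_nm=2000.0, wavelength_nm=632.8)
# n falls around 1.44, a plausible order of magnitude for dense cell regions
```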

  3. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
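
    The cleavage-entropy idea above can be sketched directly: for each substrate position flanking the cleavage site, compute the Shannon entropy of the observed amino-acid distribution, then sum over positions. The aligned substrate sequences below are hypothetical, not drawn from MEROPS.

```python
import math

def position_entropy(residues):
    """Shannon entropy (bits) of the residue distribution at one subpocket."""
    n = len(residues)
    counts = {}
    for r in residues:
        counts[r] = counts.get(r, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def cleavage_entropy(substrates):
    """Sum of subpocket entropies over aligned, equal-length substrate windows."""
    length = len(substrates[0])
    return sum(position_entropy([s[i] for s in substrates]) for i in range(length))

# A specific protease reuses nearly the same residues; an unspecific one does not.
specific   = ["IEGR", "IEGR", "IDGR", "IEGR"]   # thrombin-like recognition, hypothetical
unspecific = ["AVLK", "GFSE", "TPRD", "MYQH"]   # digestive-enzyme-like, hypothetical

specific_h = cleavage_entropy(specific)      # low total entropy
unspecific_h = cleavage_entropy(unspecific)  # high total entropy
```

    Ranking proteases by this total, as the abstract describes, separates unspecific digestive enzymes (high entropy) from specific signaling proteases (low entropy).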

  4. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are widely used in clinical trials for tracking and evaluation of medical interventions. Previously, we presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features, such as stroke lesion characteristics, from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location, and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  5. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William Banning

    2000-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of the formation water present, computing the combined amounts of oil and gas present using Archie's equations, determining the relative amounts of oil and gas present from measurements within the cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. Resistivity measurements are obtained from within the cased well by conducting A.C. current from within the cased well to a remote electrode at a frequency within the range of 0.1 Hz to 20 Hz.
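
    The Archie's-equation step named above can be sketched in a few lines: from formation resistivity Rt, porosity φ, and formation-water resistivity Rw, estimate the water saturation Sw; the combined oil-and-gas fraction of the pore space is then 1 - Sw. The constants a, m, n are conventional textbook defaults and the formation values are hypothetical, not from the patent.

```python
def water_saturation(rt, phi, rw, a=1.0, m=2.0, n=2.0):
    """Archie's equation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Hypothetical formation: 25% porosity, Rw = 0.05 ohm-m, measured Rt = 20 ohm-m
sw = water_saturation(rt=20.0, phi=0.25, rw=0.05)
hydrocarbon_fraction = 1.0 - sw  # combined oil + gas fraction of pore space
```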

  6. Comparing DNS and Experiments of Subcritical Flow Past an Isolated Surface Roughness Element

    NASA Astrophysics Data System (ADS)

    Doolittle, Charles; Goldstein, David

    2009-11-01

    Results are presented from computational and experimental studies of subcritical roughness within a Blasius boundary layer. This work stems from discrepancies presented by Stephani and Goldstein (AIAA Paper 2009-585) where DNS results did not agree with hot-wire measurements. The near wake regions of cylindrical surface roughness elements corresponding to roughness-based Reynolds numbers Rek of about 202 are of specific concern. Laser-Doppler anemometry and flow visualization in water, as well as the same spectral DNS code used by Stephani and Goldstein are used to obtain both quantitative and qualitative comparisons with previous results. Conclusions regarding previous studies will be presented alongside discussion of current work including grid resolution studies and an examination of vorticity dynamics.

  7. Abstracts of papers presented at the LVIII Cold Spring Harbor Symposium on quantitative Biology: DNA and chromosomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This volume contains the abstracts of oral and poster presentations made at the LVIII Cold Spring Harbor Symposium on Quantitative Biology, entitled DNA & Chromosomes. The meeting was held June 2-9, 1993, at Cold Spring Harbor, New York.

  8. A microdosimetric study of ¹⁰B(n,α)⁷Li and ¹⁵⁷Gd(n,γ) reactions for neutron capture therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C.K.C.; Sutton, M.; Evans, T.M.

    1999-01-01

    This paper presents the microdosimetric analysis for the most interesting cell survival experiment recently performed at Brookhaven National Laboratory (BNL). In this experiment, the cells were first treated with a gadolinium (Gd)-labeled tumor-seeking boronated porphyrin (Gd-BOPP) or with BOPP alone, and then irradiated with thermal neutrons. The resulting cell-survival curves indicate that the ¹⁵⁷Gd(n,γ) reactions are very effective in cell killing. The death of a cell treated with Gd-BOPP was attributed to either the ¹⁰B(n,α)⁷Li reactions or the ¹⁵⁷Gd(n,γ) reactions (or both). However, the quantitative relationship between the two types of reaction and the cell-survival fraction was not clear. This paper presents the microdosimetric analysis for the BNL experiment based on the measured experimental parameters, and the results clearly suggest a quantitative relationship between the two types of reaction and the cell-survival fraction. The results also suggest new research in gadolinium neutron capture therapy (GdNCT), which may lead to a more practical modality than boron neutron capture therapy (BNCT) for treating cancers.

  9. Quantitative magnetic resonance imaging in traumatic brain injury.

    PubMed

    Bigler, E D

    2001-04-01

    Quantitative neuroimaging has now become a well-established method for analyzing magnetic resonance imaging in traumatic brain injury (TBI). A general review of studies that have examined quantitative changes following TBI is presented. The consensus of quantitative neuroimaging studies is that most brain structures demonstrate changes in volume or surface area after injury. The patterns of atrophy are consistent with the generalized nature of brain injury and diffuse axonal injury. Various clinical caveats are provided including how quantitative neuroimaging findings can be used clinically and in predicting rehabilitation outcome. The future of quantitative neuroimaging also is discussed.

  10. A proposed configuration for a stepped specimen to be used in the systematic evaluation of factors influencing warpage in metallic alloys being used for cryogenic wind tunnel models

    NASA Technical Reports Server (NTRS)

    Wigley, D. A.

    1982-01-01

    A proposed configuration for a stepped specimen to be used in the systematic evaluation of mechanisms that can introduce warpage or dimensional changes in metallic alloys used for cryogenic wind tunnel models is described. Considerations for selecting a standard specimen are presented, along with results obtained from an investigation carried out for VASCOMAX 200 maraging steel. Details of the machining and measurement techniques utilized in the investigation are presented. Initial results from the sample of VASCOMAX 200 show that the configuration and measuring techniques are capable of giving quantitative results.

  11. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    PubMed

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) was in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability for ASD can accumulate in a given family in successive generations.
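
    The intraclass correlations (ICCs) reported above quantify trait resemblance within family pairs. A minimal sketch of a one-way ICC for paired ratings (ICC(1,1) with k = 2 raters per family), applied to hypothetical mother/father SRS-2 totals; the actual SRS-2 data and ICC variant used in the study may differ.

```python
def icc_oneway(pairs):
    """One-way ICC(1,1) for paired scores: (MSB - MSW) / (MSB + (k-1)*MSW), k = 2."""
    n, k = len(pairs), 2
    grand = sum(a + b for a, b in pairs) / (n * k)
    # Between-pair mean square: variance of the pair means, scaled by k
    msb = k * sum(((a + b) / k - grand) ** 2 for a, b in pairs) / (n - 1)
    # Within-pair mean square: spread of each score around its pair mean
    msw = sum((a - (a + b) / k) ** 2 + (b - (a + b) / k) ** 2 for a, b in pairs) / n
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical mother/father SRS-2 totals for five families
spouse_scores = [(48, 52), (60, 57), (35, 40), (72, 66), (55, 58)]
icc = icc_oneway(spouse_scores)  # high ICC: spouses' scores track each other closely
```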

  12. Quantitative comparison of clustered microcalcifications in for-presentation and for-processing mammograms in full-field digital mammography.

    PubMed

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-07-01

    Mammograms acquired with full-field digital mammography (FFDM) systems are provided in both "for-processing'' and "for-presentation'' image formats. For-presentation images are traditionally intended for visual assessment by the radiologists. In this study, we investigate the feasibility of using for-presentation images in computerized analysis and diagnosis of microcalcification (MC) lesions. We make use of a set of 188 matched mammogram image pairs of MC lesions from 95 cases (biopsy proven), in which both for-presentation and for-processing images are provided for each lesion. We then analyze and characterize the MC lesions from for-presentation images and compare them with their counterparts in for-processing images. Specifically, we consider three important aspects in computer-aided diagnosis (CAD) of MC lesions. First, we quantify each MC lesion with a set of 10 image features of clustered MCs and 12 textural features of the lesion area. Second, we assess the detectability of individual MCs in each lesion from the for-presentation images by a commonly used difference-of-Gaussians (DoG) detector. Finally, we study the diagnostic accuracy in discriminating between benign and malignant MC lesions from the for-presentation images by a pretrained support vector machine (SVM) classifier. To accommodate the underlying background suppression and image enhancement in for-presentation images, a normalization procedure is applied. The quantitative image features of MC lesions from for-presentation images are highly consistent with those from for-processing images. The values of Pearson's correlation coefficient between features from the two formats range from 0.824 to 0.961 for the 10 MC image features, and from 0.871 to 0.963 for the 12 textural features. In detection of individual MCs, the FROC curve from for-presentation is similar to that from for-processing. In particular, at a sensitivity level of 80%, the average number of false-positives (FPs) per image region is 9.55 for both for-presentation and for-processing images. Finally, for classifying MC lesions as malignant or benign, the area under the ROC curve is 0.769 in for-presentation, compared to 0.761 in for-processing (P = 0.436). The quantitative results demonstrate that MC lesions in for-presentation images are highly consistent with those in for-processing images in terms of image features, detectability of individual MCs, and classification accuracy between malignant and benign lesions. These results indicate that for-presentation images can be compatible with for-processing images for use in CAD algorithms for MC lesions. © 2017 American Association of Physicists in Medicine.
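
    The difference-of-Gaussians (DoG) detector named above responds to blob-like bright spots by subtracting a coarse Gaussian smoothing from a fine one. A minimal 1-D sketch (the mammographic detector operates in 2-D) on a synthetic signal with a single MC-like impulse:

```python
import math

def gaussian_kernel(sigma, radius):
    k = [math.exp(-x * x / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]  # normalize to unit sum

def convolve(signal, kernel):
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - r
            if 0 <= idx < len(signal):  # truncate at the borders
                acc += w * signal[idx]
        out.append(acc)
    return out

def dog_response(signal, sigma_fine=1.0, sigma_coarse=2.0):
    radius = int(3 * sigma_coarse)
    fine = convolve(signal, gaussian_kernel(sigma_fine, radius))
    coarse = convolve(signal, gaussian_kernel(sigma_coarse, radius))
    return [f - c for f, c in zip(fine, coarse)]

signal = [0.0] * 50
signal[25] = 1.0                      # a bright, MC-like impulse (synthetic)
response = dog_response(signal)
peak = max(range(len(response)), key=response.__getitem__)  # location of strongest blob
```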

  13. Low-frequency quantitative ultrasound imaging of cell death in vivo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadeghi-Naini, Ali; Falou, Omar; Czarnota, Gregory J.

    Purpose: Currently, no clinical imaging modality is used routinely to assess tumor response to cancer therapies within hours to days of the delivery of treatment. Here, the authors demonstrate the efficacy of ultrasound at a clinically relevant frequency to quantitatively detect changes in tumors in response to cancer therapies using preclinical mouse models. Methods: Conventional low-frequency and corresponding high-frequency ultrasound (ranging from 4 to 28 MHz) were used along with quantitative spectroscopic and signal envelope statistical analyses on data obtained from xenograft tumors treated with chemotherapy, x-ray radiation, as well as a novel vascular targeting microbubble therapy. Results: Ultrasound-based spectroscopic biomarkers indicated significant changes in cell-death-associated parameters in responsive tumors. Specifically, changes in the midband fit, spectral slope, and 0-MHz intercept biomarkers were investigated for different types of treatment and demonstrated cell-death-related changes. The midband fit and 0-MHz intercept biomarkers derived from low-frequency data demonstrated increases ranging approximately from 0 to 6 dBr and 0 to 8 dBr, respectively, depending on the treatments administered. These data paralleled results observed for high-frequency ultrasound data. Statistical analysis of the ultrasound signal envelope was performed as an alternative method to obtain histogram-based biomarkers and provided confirmatory results. Histological analysis of tumor specimens indicated up to 61% cell death present in the tumors depending on treatments administered, consistent with quantitative ultrasound findings indicating cell death. Ultrasound-based spectroscopic biomarkers demonstrated a good correlation with histological morphological findings indicative of cell death (r² = 0.71, 0.82; p < 0.001). Conclusions: In summary, the results provide preclinical evidence, for the first time, that quantitative ultrasound used at a clinically relevant frequency, in addition to high-frequency ultrasound, can detect tissue changes associated with cell death in vivo in response to cancer treatments.
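
    The three spectral biomarkers named above (midband fit, spectral slope, 0-MHz intercept) come from a straight-line fit to the calibrated power spectrum over the usable bandwidth: the slope and intercept are the line's parameters, and the midband fit is the line evaluated at the center frequency. A sketch with a synthetic spectrum (the dBr values below are illustrative, not measured data):

```python
def linear_fit(freqs, powers_db):
    """Least-squares line through (frequency, power) points."""
    n = len(freqs)
    mf, mp = sum(freqs) / n, sum(powers_db) / n
    slope = (sum((f - mf) * (p - mp) for f, p in zip(freqs, powers_db))
             / sum((f - mf) ** 2 for f in freqs))
    intercept = mp - slope * mf       # extrapolated 0-MHz intercept (dBr)
    return slope, intercept

freqs = [4.0, 5.0, 6.0, 7.0, 8.0]             # MHz, usable bandwidth
powers = [-10.2, -11.1, -11.9, -13.1, -13.9]  # dBr, synthetic calibrated spectrum

slope, intercept = linear_fit(freqs, powers)  # spectral slope and 0-MHz intercept
midband_fit = slope * 6.0 + intercept         # fitted line at the 6 MHz center frequency
```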

  14. Backup of Renewable Energy for an Electrical Island: Case Study of Israeli Electricity System—Current Status

    PubMed Central

    Fakhouri, A.; Kuperman, A.

    2014-01-01

    The paper focuses on the quantitative analysis of the Israeli Government's targets of 10% renewable energy penetration by 2020 and on determining the desired methodology (models) for assessing the effects on the electricity market, addressing the fact that Israel is an electricity island. The main objective is to determine the influence of achieving the Government's goals for renewable energy penetration on the need for backup in the Israeli electricity system. This work presents the current situation of the Israeli electricity market and the study to be undertaken to assess the undesirable effects resulting from the intermittency of electricity generated by wind and solar power stations, and presents some solutions for mitigating these phenomena. Future work will focus on a quantitative analysis of model runs and determine the amounts of backup required relative to the amount of installed capacity from renewable resources. PMID:24624044

  15. Another look at retroactive and proactive interference: a quantitative analysis of conversion processes.

    PubMed

    Blank, Hartmut

    2005-02-01

    Traditionally, the causes of interference phenomena were sought in "real" or "hard" memory processes such as unlearning, response competition, or inhibition, which serve to reduce the accessibility of target items. I propose an alternative approach which does not deny the influence of such processes but highlights a second, equally important, source of interference-the conversion (Tulving, 1983) of accessible memory information into memory performance. Conversion is conceived as a problem-solving-like activity in which the rememberer tries to find solutions to a memory task. Conversion-based interference effects are traced to different conversion processes in the experimental and control conditions of interference designs. I present a simple theoretical model that quantitatively predicts the resulting amount of interference. In two paired-associate learning experiments using two different types of memory tests, these predictions were corroborated. Relations of the present approach to traditional accounts of interference phenomena and implications for eyewitness testimony are discussed.

  16. Backup of renewable energy for an electrical island: case study of Israeli electricity system--current status.

    PubMed

    Fakhouri, A; Kuperman, A

    2014-01-01

    The paper focuses on the quantitative analysis of the Israeli Government's targets of 10% renewable energy penetration by 2020 and on determining the desired methodology (models) for assessing the effects on the electricity market, addressing the fact that Israel is an electricity island. The main objective is to determine the influence of achieving the Government's goals for renewable energy penetration on the need for backup in the Israeli electricity system. This work presents the current situation of the Israeli electricity market and the study to be undertaken in order to assess the undesirable effects resulting from the intermittency of electricity generated by wind and solar power stations, and presents some solutions for mitigating these phenomena. Future work will focus on a quantitative analysis of model runs and will determine the amounts of backup required relative to the amount of installed capacity from renewable resources.

  17. Advances in imaging and quantification of electrical properties at the nanoscale using Scanning Microwave Impedance Microscopy (sMIM)

    NASA Astrophysics Data System (ADS)

    Friedman, Stuart; Yang, Yongliang; Amster, Oskar

    2015-03-01

    Scanning Microwave Impedance Microscopy (sMIM) is a mode for Atomic Force Microscopy (AFM) enabling imaging of unique contrast mechanisms and measurement of local permittivity and conductivity at the tens-of-nm length scale. Recent results will be presented illustrating high-resolution electrical features such as sub-15 nm Moiré patterns in graphene, carbon nanotubes of various electrical states, and ferroelectrics. In addition to imaging, the technique is suited to a variety of metrology applications where specific physical properties are determined quantitatively. We will present research activities on quantitative measurements using multiple techniques to determine dielectric constant (permittivity) and conductivity (e.g., dopant concentration) for a range of materials. Examples include bulk dielectrics, low-k dielectric thin films, capacitance standards, and doped semiconductors. Funded in part by DOE SBIR DE-SC0009586.

  18. Separating DDTs in edible animal fats using matrix solid-phase dispersion extraction with activated carbon filter, Toyobo-KF.

    PubMed

    Furusawa, Naoto

    2006-09-01

    A technique is presented for the economical, routine, and quantitative analysis of contamination by dichloro-diphenyl-trichloroethanes (DDTs) [pp'-DDT, pp'-dichlorodiphenyl dichloroethylene, and pp'-dichlorodiphenyl dichloroethane] in beef tallow and chicken fat samples, based on their separation using matrix solid-phase dispersion (MSPD) extraction with Toyobo-KF, an activated carbon fiber. Toyobo-KF is a newly applied MSPD sorbent; the extraction is followed by reversed-phase high-performance liquid chromatography (HPLC) with a photodiode array detector. The resulting analytical performance parameters [recoveries of spiked DDTs (0.1, 0.2, and 0.4 μg/g) ≥ 81%, with relative standard deviations ≤ 8% (n = 5), and quantitation limits ≤ 0.03 μg/g], with minimal handling and cost-efficiency, indicate that the present MSPD-HPLC method may be a useful tool for routine monitoring of DDT contamination in meat.

  19. Holographic 3D imaging through diffuse media by compressive sampling of the mutual intensity

    NASA Astrophysics Data System (ADS)

    Falldorf, Claas; Klein, Thorsten; Agour, Mostafa; Bergmann, Ralf B.

    2017-05-01

    We present a method for holographic imaging through a volume scattering material, which is based on self-reference and light with good spatial but limited temporal coherence. In contrast to existing techniques, we do not require a separate reference wave; thus our approach provides great advantages in terms of the flexibility of the measurement system. The main applications are remote sensing and investigation of moving objects through gaseous streams, bubbles, or foggy water, for example. Furthermore, due to the common-path nature, the system is also insensitive to mechanical disturbances. The measurement result is a complex amplitude which is comparable to a phase-shifted digital hologram and therefore allows 3D imaging, numerical refocusing, and quantitative phase contrast imaging. As an example of application, we present measurements of the quantitative phase contrast of the epidermis of an onion through a volume scattering material.

  20. Laboratory measurements of the millimeter-wave spectra of calcium isocyanide

    NASA Astrophysics Data System (ADS)

    Steimle, Timothy C.; Saito, Shuji; Takano, Shuro

    1993-06-01

    The ground state of CaNC is presently characterized by mm-wave spectroscopy, using a standard Hamiltonian linear molecule model to analyze the spectrum. The resulting spectroscopic parameters were used to predict the transition frequencies and Einstein A-coefficients, which should make possible a quantitative astrophysical search for CaNC.

  1. Graphical Interaction Analysis Impact on Groups Collaborating through Blogs

    ERIC Educational Resources Information Center

    Fessakis, Georgios; Dimitracopoulou, Angelique; Palaiodimos, Aggelos

    2013-01-01

    This paper presents empirical research results regarding the impact of Interaction Analysis (IA) graphs on groups of students collaborating through online blogging according to a "learning by design" scenario. The IA graphs used are of two categories; the first category summarizes quantitatively the activity of the users for each blog,…

  2. Rotational Stability--An Amusing Physical Paradox

    ERIC Educational Resources Information Center

    Sendra, Carlos M.; Picca, Fabricio Della; Gil, Salvador

    2007-01-01

    Here we present a simple and amusing device that demonstrates some surprising results of the dynamics of the rotation of a symmetrical rigid body. This system allows for a qualitative demonstration or a quantitative study of the rotation stability of a symmetric top. A simple and inexpensive technique is proposed to carry out quantitative…

  3. Exploring the Relationship between Self-Efficacy and Retention in Introductory Physics

    ERIC Educational Resources Information Center

    Sawtelle, Vashti; Brewe, Eric; Kramer, Laird H.

    2012-01-01

    The quantitative results of Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P) are presented as a logistic regression predicting the passing of students in introductory Physics with Calculus I, overall as well as disaggregated by gender. Self-efficacy as a theory to explain human behavior change [Bandura [1977] "Psychological…

  4. School Climate of Educational Institutions: Design and Validation of a Diagnostic Scale

    ERIC Educational Resources Information Center

    Becerra, Sandra

    2016-01-01

    School climate is recognized as a relevant factor for the improvement of educative processes, favoring the administrative processes and optimum school performance. The present article is the result of a quantitative research model which had the objective of psychometrically designing and validating a scale to diagnose the organizational climate of…

  5. Does the Social Working Environment Predict Beginning Teachers' Self-Efficacy and Feelings of Depression?

    ERIC Educational Resources Information Center

    Devos, Christelle; Dupriez, Vincent; Paquay, Leopold

    2012-01-01

    We investigate how the social working environment predicts beginning teachers' self-efficacy and feelings of depression. Two quantitative studies are presented. The results show that the goal structure of the school culture (mastery or performance orientation) predicts both outcomes. Frequent collaborative interactions with colleagues are related…

  6. The development of a host potential index and its postharvest application to the spotted wing drosophila, Drosophila suzukii (Diptera: Drosophilidae)

    USDA-ARS?s Scientific Manuscript database

    Novel methodology is presented for indexing the relative potential of hosts to function as resources. Results from studies examining host selection, utilization, and physiological development of the organism resourcing the host were combined and quantitatively related via a Host Potential Index (HPI...

  7. Characterization of the maize pollen transcriptome: Validation of microarray results using quantitative real-time PCR

    EPA Science Inventory

    Pollen is the primary means of gene flow between plants and plant populations and plays a critical role in seed production. Pollen fitness can be defined as the ability of a particular pollen grain to outcompete other pollen present on the stigma and complete fertilization, thus ...

  8. The Social and Emotional Development of Gifted Students

    ERIC Educational Resources Information Center

    Callahan, Carolyn M.; Sowa, Claudia J.; May, Kathleen M.; Tomchin, Ellen Menaker; Plucker, Jonathan A.; Cunningham, Caroline M.; Taylor, Wesley

    2004-01-01

    This research monograph on the social and emotional development of gifted students' is divided into four parts. Part 1 of the report focuses on analysis of the literature. Parts 2-4 present results of seven qualitative and quantitative studies of adolescent development. In Part 2, Studies 1 and 2 expand Lazarus and Folkman's cognitive appraisal…

  9. Proposal for Teaching Evolutionary Biology: A Bridge between Research and Educational Practice

    ERIC Educational Resources Information Center

    Alvarez Pérez, Eréndira; Ruiz Gutiérrez, Rosaura

    2016-01-01

    We present quantitative results for the doctoral thesis of the first-named author of this article. The objective was to recommend and test a teaching proposal for core knowledge of evolutionary biology in secondary education. The focus of the study is "Problem cores in teaching". The "Weaving evolutionary thinking" teaching…

  10. Investing in Leadership: The District's Role in Managing Principal Turnover

    ERIC Educational Resources Information Center

    Mascall, Blair; Leithwood, Kenneth

    2010-01-01

    This article presents the results of research into the impact of principal turnover on schools, and the ability of schools to mitigate the negative effects of frequent turnover by distributing leadership in the schools. The findings from this qualitative and quantitative analysis show that rapid principal turnover does indeed have a negative…

  11. Teacher Perceptions of Professional Role and Innovative Teaching at Elementary Schools in Taiwan

    ERIC Educational Resources Information Center

    Hung, Chih-Lun; Li, Feng-Chin

    2017-01-01

    The purpose of the study is to explore the association between primary school teachers' perceptions of professional role and their innovative teaching in Central Taiwan. Quantitative research methods were employed, and data were collected from 554 Central Taiwanese teachers. The results of the present study indicated that elementary school…

  12. Online Master's Students' Perceptions of Institutional Supports and Resources: Initial Survey Results

    ERIC Educational Resources Information Center

    Milman, Natalie B.; Posey, Laurie; Pintz, Christine; Wright, Kayla; Zhou, Pearl

    2015-01-01

    This article presents the quantitative findings of an exploratory mixed methods study that investigated first- and second-year online graduate master's students': 1) perceptions of the importance of, and satisfaction with, administrative, academic, technical, and online community supports; 2) personal factors and grit level; and 3) differences, if…

  13. Innovative Teaching: An Empirical Study of Computer-Aided Instruction in Quantitative Business Courses

    ERIC Educational Resources Information Center

    Gonul, Fusun F.; Solano, Roger A.

    2013-01-01

    We investigate business undergraduate mathematics-based courses in a blended environment of online assignments and exams and offline lectures, and report the impact on academic performance of factors such as classroom attendance, web-based course supplements, and homework. We present results from both ordinary least squares and fixed effects,…

  14. Hip Hop Therapy: An Exploratory Study of a Rap Music Intervention with At-Risk and Delinquent Youth.

    ERIC Educational Resources Information Center

    Tyson, Edgar H.

    2002-01-01

    Presents an exploratory study of the therapeutic potential of "Hip-Hop" therapy, an "innovative synergy of rap music, bibliotherapy, and music therapy." Finds that the quantitative and qualitative results partially supported the hypothesis that under a specific set of conditions rap music would improve the therapeutic…

  15. An Outcome Evaluation of the Spirituality for Kids Program. Technical Report

    ERIC Educational Resources Information Center

    Maestas, Nicole; Gaillot, Sarah

    2008-01-01

    This report presents results from a multisite, quantitative evaluation of the international Spirituality for Kids (SFK) after-school program. Despite its name, SFK is a nonreligious program that seeks to build resilience in children by teaching them to access inner resources and build positive connections with others. The SFK program is unlike…

  16. Stages of Psychometric Measure Development: The Example of the Generalized Expertise Measure (GEM)

    ERIC Educational Resources Information Center

    Germain, Marie-Line

    2006-01-01

    This paper chronicles the steps, methods, and presents hypothetical results of quantitative and qualitative studies being conducted to develop a Generalized Expertise Measure (GEM). Per Hinkin (1995), the stages of scale development are domain and item generation, content expert validation, and pilot test. Content/face validity and internal…

  17. Fluctuations and Noise in Stochastic Spread of Respiratory Infection Epidemics in Social Networks

    NASA Astrophysics Data System (ADS)

    Yulmetyev, Renat; Emelyanova, Natalya; Demin, Sergey; Gafarov, Fail; Hänggi, Peter; Yulmetyeva, Dinara

    2003-05-01

    For the analysis of the complexity of epidemic and disease dynamics, it is necessary to understand the basic principles and notions of spreading in media with long-time memory. Here we consider the problem from a theoretical and practical viewpoint, presenting quantitative evidence confirming the existence of stochastic long-range memory and robust chaos in a real time series of infections of the human upper respiratory tract. In this work we present a new statistical method for analyzing the epidemic spread of grippe and acute upper respiratory tract infections by means of the theory of discrete non-Markov stochastic processes. We use the results of our recent theory (Phys. Rev. E 65, 046107 (2002)) to study the statistical effects of memory in real data series describing the epidemic dynamics of human acute respiratory tract infections and grippe. The results obtained demonstrate the possibility of a strict quantitative description of the regular and stochastic components in the epidemic dynamics of social networks, taking into account time discreteness and the effects of statistical memory.

  18. Evaluating a linearized Euler equations model for strong turbulence effects on sound propagation.

    PubMed

    Ehrhardt, Loïc; Cheinet, Sylvain; Juvé, Daniel; Blanc-Benon, Philippe

    2013-04-01

    Sound propagation outdoors is strongly affected by atmospheric turbulence. Under strongly perturbed conditions or long propagation paths, the sound fluctuations reach their asymptotic behavior, e.g., the intensity variance progressively saturates. The present study evaluates the ability of a numerical propagation model, based on finite-difference time-domain solving of the linearized Euler equations, to quantitatively reproduce the wave statistics under strong and saturated intensity fluctuations. It is the continuation of a previous study in which weak intensity fluctuations were considered. The numerical propagation model is presented and tested with two-dimensional harmonic sound propagation over long paths and strong atmospheric perturbations. The results are compared to quantitative theoretical or numerical predictions available on the wave statistics, including the log-amplitude variance and the probability density functions of the complex acoustic pressure. The match is excellent for the evaluated source frequencies and all sound fluctuation strengths. Hence, this model captures many aspects of strong atmospheric turbulence effects on sound propagation. Finally, the model results for the intensity probability density function are compared with a standard fit by a generalized gamma function.

  19. Characterization of European sword blades through neutron imaging techniques

    NASA Astrophysics Data System (ADS)

    Salvemini, F.; Grazzi, F.; Peetermans, S.; Gener, M.; Lehmann, E. H.; Zoppi, M.

    2014-09-01

    In the present work, we have studied two European rapier blades dating back to the period ranging from the Late Renaissance to the Early Modern Age (about 17th to 18th century). In order to determine variation in quality and differences in technology, a study was undertaken with the purpose of observing variations in the blade microstructure (and consequently in the construction processes). The samples, which in the present case were expendable, were investigated first through standard metallography and then by means of white-beam and energy-selective neutron imaging. The comparison of the results from the two techniques turned out to be satisfactory, with substantial quantitative agreement, and shows the complementarity of the two methods. Metallography has until now been considered the method of choice for metal material characterization. The correspondence between the two methods, together with the non-invasive character of the neutron-based techniques and their capability for 3D reconstruction, establishes neutron imaging as an important and quantitatively reliable technique for metal characterization.

  20. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    PubMed Central

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between those two pieces is a key point to prevent bacterial proliferation, tissue inflammation and bone loss. The leak has been previously estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions Therefore, the presented new gas flow method appears to be a simple and robust method to compare different implant systems. It allows successive measures without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  1. Exploitation of the complexation reaction of ortho-dihydroxylated anthocyanins with aluminum(III) for their quantitative spectrophotometric determination in edible sources.

    PubMed

    Bernal, Freddy A; Orduz-Diaz, Luisa L; Coy-Barrera, Ericsson

    2015-10-15

    Anthocyanins are natural pigments known for their color and antioxidant activity. These properties allow their use in various fields, including food and pharmaceutical ones. Quantitative determination of anthocyanins had been performed by non-specific methods that limit the accuracy and reliability of the results. Therefore, a novel, simple spectrophotometric method for anthocyanin quantification is presented, based on the formation of blue-colored complexes through the known reaction between catechol- and pyrogallol-containing anthocyanins and aluminum(III). The method was demonstrated to be reproducible and repeatable (RSD < 1.5%) and highly sensitive to ortho-dihydroxylated anthocyanins (LOD = 0.186 μg/mL). Compliance with Beer's law was also evident over a range of concentrations (2-16 μg/mL for cyanidin 3-O-glucoside). Good recoveries (98.8-103.3%) were obtained using anthocyanin-rich plant samples. The described method showed a direct correlation with pH-differential method results for several common anthocyanin-containing fruits, indicating its great analytical potential. The presented method was successfully validated. Copyright © 2015 Elsevier Ltd. All rights reserved.
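Quantitation against a Beer's-law calibration line of the kind validated above can be sketched as follows; the absorbance values and calibration coefficients here are invented for illustration and are not taken from the study:

```python
import numpy as np

# Hypothetical calibration standards across the linear range reported
# above (2-16 µg/mL for cyanidin 3-O-glucoside); responses are invented.
conc = np.array([2.0, 4.0, 8.0, 12.0, 16.0])       # µg/mL
absorbance = 0.052 * conc + 0.003                  # ideal Beer's-law response

slope, intercept = np.polyfit(conc, absorbance, 1)  # calibration line

def quantify(a_sample):
    """Convert a sample absorbance to concentration via the calibration line."""
    return (a_sample - intercept) / slope

c = quantify(0.523)  # µg/mL for a hypothetical sample absorbance
```

In practice the absorbance of the blue anthocyanin-aluminum(III) complex would be read at its absorption maximum, and only samples falling inside the validated linear range would be reported directly.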

  2. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  3. Spontaneous Focusing on Quantitative Relations in the Development of Children's Fraction Knowledge

    ERIC Educational Resources Information Center

    McMullen, Jake; Hannula-Sormunen, Minna M.; Lehtinen, Erno

    2014-01-01

    While preschool-aged children display some skills with quantitative relations, later learning of related fraction concepts is difficult for many students. We present two studies that investigate young children's tendency of Spontaneous Focusing On quantitative Relations (SFOR), which may help explain individual differences in the development of…

  4. Gisting Technique Development.

    DTIC Science & Technology

    1981-12-01

    …furnished tapes (the "Stonehenge" database) which were used for previous contracts. Recognition results for English male and female speakers are presented on … independent "Stonehenge" test data. A variety of options in generating word arrays were tried; the results below describe the most successful of these. The … time to carry out any quantitative tests, even the obvious one of retraining the "Stonehenge" English vocabulary on-line, we…

  5. Newly Built Undergraduate Schools Should Place Great Emphasis on Connotation Construction and Quality Promotion: An Analysis Based on the Qualification Evaluation Results for 41 Undergraduate Schools

    ERIC Educational Resources Information Center

    Binglin, Zhong

    2016-01-01

    The article presents a quantitative analysis of the evaluation results for 41 newly built undergraduate schools that submitted to the qualification evaluation of undergraduate work by Ministry of Education in 2013. It shows that newly built undergraduate schools should place great emphasis on connotation construction and quality promotion and on…

  6. A Method For The Verification Of Wire Crimp Compression Using Ultrasonic Inspection

    NASA Technical Reports Server (NTRS)

    Cramer, K. E.; Perey, Daniel F.; Yost, William T.

    2010-01-01

    The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The amplitude change of a compressional ultrasonic wave propagating at right angles to the wire axis and through the junction of a crimp termination is shown to correlate with the results of a destructive pull test, which is a standard for assessing crimp wire junction quality. To demonstrate the technique, the case of incomplete compression of crimped connections is ultrasonically tested, and the results are correlated with pull tests. Results show that the nondestructive ultrasonic measurement technique consistently predicts good crimps when the ultrasonic transmission is above a certain threshold amplitude level. A quantitative measure of the quality of the crimped connection based on the ultrasonic energy transmitted is shown to respond accurately to crimp quality. A wave propagation model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. This model agrees with the ultrasonic measurements to within 6%. A prototype instrument for applying this technique while wire crimps are installed is also presented. The instrument is based on a two-jaw type crimp tool suitable for butt-splice type connections. A comparison of the results of two different instruments is presented and shows reproducibility between instruments within a 95% confidence bound.

  7. Construction of measurement uncertainty profiles for quantitative analysis of genetically modified organisms based on interlaboratory validation data.

    PubMed

    Macarthur, Roy; Feinberg, Max; Bertheau, Yves

    2010-01-01

    A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be < 0.45% to demonstrate compliance, and > 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%.
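The compliance limits quoted above follow directly from the 50-200% expected measurement range around the 0.9% labeling threshold; a minimal sketch of that arithmetic:

```python
def decision_limits(label_threshold, lower_factor=0.5, upper_factor=2.0):
    """Decision limits implied by an expected measurement range of
    50-200% of the true value (the range given by the uncertainty
    profiles described above).

    To demonstrate noncompliance, even the lowest plausible true value
    behind a result must exceed the threshold; to demonstrate compliance,
    even the highest plausible true value must not."""
    compliant_below = label_threshold * lower_factor
    noncompliant_above = label_threshold * upper_factor
    return compliant_below, noncompliant_above

# EU GMO labeling threshold of 0.9%
lo, hi = decision_limits(0.9)
```

With these inputs the sketch reproduces the abstract's figures: results below 0.45% demonstrate compliance, results above 1.8% demonstrate noncompliance, and anything in between is inconclusive given the measurement uncertainty.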

  8. iMet-Q: A User-Friendly Tool for Label-Free Metabolomics Quantitation Using Dynamic Peak-Width Determination

    PubMed Central

    Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi

    2016-01-01

    Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691
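The isotope-ratio filtering step described above (removing 49% of candidates) can be sketched roughly as below; the candidate names, ratios, and tolerance are hypothetical and do not reflect iMet-Q's actual implementation:

```python
def filter_by_isotope_ratio(candidates, observed_ratio, tolerance=0.1):
    """Keep only candidates whose theoretical M+1/M isotope ratio lies
    within `tolerance` of the ratio measured from the detected peak.
    (Candidate list and tolerance are illustrative assumptions.)"""
    return [name for name, theoretical in candidates
            if abs(theoretical - observed_ratio) <= tolerance]

# Invented candidate list: (name, theoretical M+1/M isotope ratio)
cands = [("candidate_A", 0.11), ("candidate_B", 0.35), ("candidate_C", 0.09)]
kept = filter_by_isotope_ratio(cands, observed_ratio=0.10)
```

The idea is simply that database hits whose elemental composition implies an isotope pattern inconsistent with the measured one can be discarded before any further identification effort.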

  9. Quantitative analysis of 18F-NaF dynamic PET/CT cannot differentiate malignant from benign lesions in multiple myeloma.

    PubMed

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-01-01

    A renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study is to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2. Respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3 and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM.
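For an irreversible two-tissue compartment model of the kind applied above, the net influx constant is conventionally Ki = K1·k3/(k2 + k3). A sketch with illustrative rate constants (k2 is not reported in the abstract, so all values here are invented):

```python
def influx_rate(K1, k2, k3):
    """Net influx constant Ki (1/min) for an irreversible two-tissue
    compartment model: Ki = K1 * k3 / (k2 + k3)."""
    return K1 * k3 / (k2 + k3)

# Illustrative rate constants, all in 1/min (not the study's fitted values)
Ki = influx_rate(K1=0.2, k2=0.4, k3=0.4)
```

Ki summarizes delivery (K1) and irreversible binding (k3) in a single parameter, which is why the abstract reports it alongside the individual rate constants.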

  10. Population structure and strong divergent selection shape phenotypic diversification in maize landraces.

    PubMed

    Pressoir, G; Berthaud, J

    2004-02-01

    To conserve the long-term selection potential of maize, it is necessary to investigate past and present evolutionary processes that have shaped quantitative trait variation. Understanding the dynamics of quantitative trait evolution is crucial to future crop breeding. We characterized population differentiation of maize landraces from the State of Oaxaca, Mexico for quantitative traits and molecular markers. Qst values were much higher than Fst values obtained for molecular markers. While low values of Fst (0.011 within-village and 0.003 among-villages) suggest that considerable gene flow occurred among the studied populations, high levels of population differentiation for quantitative traits were observed (i.e., an among-village Qst value of 0.535 for kernel weight). Our results suggest that although quantitative traits appear to be under strong divergent selection, a considerable amount of gene flow occurs among populations. Furthermore, we characterized nonproportional changes in the G matrix structure both within and among villages that are consequences of farmer selection. As a consequence of these differences in the G matrix structure, the response to multivariate selection will be different from one population to another. Large changes in the G matrix structure could indicate that farmers select for genes of major and pleiotropic effect. Farmers' decisions and selection strategies have a great impact on phenotypic diversification in maize landraces.
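For context, Qst is computed from between- and within-population variance components; a minimal sketch (the components below are hypothetical, chosen to reproduce the kernel-weight value quoted above):

```python
def qst(v_between, v_within):
    """Qst = Vb / (Vb + 2*Vw) for an outcrossing species, with Vb the
    between-population and Vw the additive within-population variance."""
    return v_between / (v_between + 2.0 * v_within)

# Hypothetical variance components for kernel weight:
print(round(qst(4.6, 2.0), 3))  # → 0.535
```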

  11. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  12. Isolation and quantification of Quillaja saponaria Molina saponins and lipids in iscom-matrix and iscoms.

    PubMed

    Behboudi, S; Morein, B; Rönnberg, B

    1995-12-01

    In the iscom, multiple copies of antigen are attached by hydrophobic interaction to a matrix which is built up by Quillaja triterpenoid saponins and lipids. Thus, the iscom presents antigen in multimeric form in a small particle with a built-in adjuvant, resulting in a highly immunogenic antigen formulation. We have designed a chloroform-methanol-water extraction procedure to isolate the triterpenoid saponins and lipids incorporated into iscom-matrix and iscoms. The triterpenoids in the triterpenoid phase were quantitated by HPLC and with the orcinol-sulfuric acid assay, which detects their carbohydrate chains. The cholesterol and phosphatidylcholine in the lipid phase were quantitated by HPLC and, for cholesterol, by a commercial colorimetric method. The quantitative methods showed an almost total separation and recovery of triterpenoids and lipids in their respective phases, while protein was detected in all phases after extraction. The protein content was determined by the method of Lowry and by amino acid analysis. Amino acid analysis was shown to be the more reliable of the two methods for quantitating proteins in iscoms. In conclusion, simple, reproducible and efficient procedures have been designed to isolate and quantitate the triterpenoids and lipids added for preparation of iscom-matrix and iscoms. The procedures described should also be useful to adequately define constituents in prospective vaccines.

  13. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents the cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer; and (iii) risk assessment for breast cancer.
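As an illustrative aside, the two-level design (feature extraction, then a multivariate model) can be caricatured in a few lines. This is a toy sketch, not CaPTk code; the images, the tiny feature panel, and the nearest-centroid rule are all hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def histogram_features(image):
    """Level 1: a tiny feature panel (mean, spread, a histogram dispersion term)."""
    p, _ = np.histogram(image, bins=16, density=True)
    p = p[p > 0]
    return np.array([image.mean(), image.std(), -(p * np.log(p)).sum()])

# Hypothetical toy "images" for two outcome groups (20 per group).
images = [rng.normal(loc=cls, scale=1.0, size=(32, 32))
          for cls in (0, 1) for _ in range(20)]
y = np.array([0] * 20 + [1] * 20)
X = np.array([histogram_features(im) for im in images])

# Level 2: a deliberately simple nearest-centroid rule standing in for the
# multivariate machine-learning step.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1).argmin(axis=1)
accuracy = (pred == y).mean()
print(accuracy)
```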

  14. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.
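The calibration idea behind such measurements can be sketched as a linear fit of intensity ratio against molar ratio over the 0.1 to 10 range mentioned above (all data values below are hypothetical):

```python
import numpy as np

# Known molar ratios of two proteins and the measured intensity ratios of
# their MALDI signals; all values hypothetical.
molar_ratio = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
intensity_ratio = np.array([0.12, 0.55, 0.98, 2.1, 4.9, 10.3])

# A slope near 1 and r near 1 indicate intensity ratio tracks molar ratio.
slope, intercept = np.polyfit(molar_ratio, intensity_ratio, 1)
r = np.corrcoef(molar_ratio, intensity_ratio)[0, 1]
print(round(slope, 2), round(r, 3))  # → 1.02 1.0
```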

  15. Non-invasive assessment of cerebral microcirculation with diffuse optics and coherent hemodynamics spectroscopy

    NASA Astrophysics Data System (ADS)

    Fantini, Sergio; Sassaroli, Angelo; Kainerstorfer, Jana M.; Tgavalekos, Kristen T.; Zang, Xuan

    2016-03-01

    We describe the general principles and initial results of coherent hemodynamics spectroscopy (CHS), which is a new technique for the quantitative assessment of cerebral hemodynamics on the basis of dynamic near-infrared spectroscopy (NIRS) measurements. The two components of CHS are (1) dynamic measurements of coherent cerebral hemodynamics in the form of oscillations at multiple frequencies (frequency domain) or temporal transients (time domain), and (2) their quantitative analysis with a dynamic mathematical model that relates the concentration and oxygen saturation of hemoglobin in tissue to cerebral blood volume (CBV), cerebral blood flow (CBF), and cerebral metabolic rate of oxygen (CMRO2). In particular, CHS can provide absolute measurements and dynamic monitoring of CBF, and quantitative measures of cerebral autoregulation. We report initial results of CBF measurements in hemodialysis patients, where we found a lower CBF (54 +/- 16 ml/(100 g-min)) compared to a group of healthy controls (95 +/- 11 ml/(100 g-min)). We also report CHS measurements of cerebral autoregulation, where a quantitative index of autoregulation (its cutoff frequency) was found to be significantly greater in healthy subjects during hyperventilation (0.034 +/- 0.005 Hz) than during normal breathing (0.017 +/- 0.002 Hz). We also present our approach to depth resolved CHS, based on multi-distance, frequency-domain NIRS data and a two-layer diffusion model, to enhance sensitivity to cerebral tissue. CHS offers a potentially powerful approach to the quantitative assessment and continuous monitoring of local brain perfusion at the microcirculation level, with prospective brain mapping capabilities of research and clinical significance.
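As an aside, cerebral autoregulation is commonly modeled as a high-pass filter from blood pressure to blood flow, which is why a cutoff frequency can serve as an autoregulation index. A first-order sketch (illustrative only, not the CHS model itself; the 0.02 Hz oscillation frequency is hypothetical):

```python
import math

def highpass_gain(f, fc):
    """|H(f)| of a first-order high-pass filter with cutoff frequency fc (Hz)."""
    x = f / fc
    return x / math.sqrt(1.0 + x * x)

# A higher cutoff (0.034 Hz, hyperventilation) attenuates a slow pressure
# oscillation more strongly than a lower cutoff (0.017 Hz, normal breathing):
f = 0.02  # Hz, hypothetical oscillation frequency
print(highpass_gain(f, 0.017) > highpass_gain(f, 0.034))  # → True
```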

  16. The reliability analysis of a separated, dual fail operational redundant strapdown IMU. [inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.
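A Markov reliability evaluation of this kind can be sketched in miniature with a generic dual-redundant chain (this is not the paper's RSDIMU model, and the per-step failure probability is hypothetical):

```python
import numpy as np

p = 1e-4  # hypothetical per-step failure probability of one unit
# States: 0 = both units healthy, 1 = one failed (fail-operational), 2 = system failed.
P = np.array([[(1 - p) ** 2, 2 * p * (1 - p), p * p],
              [0.0,          1 - p,           p],
              [0.0,          0.0,             1.0]])  # failed state is absorbing

state = np.array([1.0, 0.0, 0.0])  # start with both units healthy
for _ in range(1000):              # propagate 1000 mission steps
    state = state @ P

reliability = state[0] + state[1]  # probability the system still operates
print(round(reliability, 4))
```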

  17. The Evaluation of Effectiveness of Reciprocal Teaching Strategies on Comprehension of Expository Texts

    ERIC Educational Resources Information Center

    Pilten, Gulhiz

    2016-01-01

    The purpose of the present research is to investigate the effects of reciprocal teaching on the comprehension of expository texts. The research was designed with a mixed method. The quantitative dimension of the present research was designed in accordance with a pre-test-post-test control group experiment model. The quantitative dimension of the present…

  18. Identification of internal control genes for quantitative expression analysis by real-time PCR in bovine peripheral lymphocytes.

    PubMed

    Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice

    2011-09-01

    Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following flow cytometric characterization of the cell populations obtained from blood through a density gradient procedure, three popular software packages were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
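A simple stability screen of this kind can be sketched as ranking candidates by the spread of their log2 expression across animals (this is not any of the dedicated tools the study used, and the expression values below, including the GAPDH candidate, are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical log2 expression values across 15 cows; YWHAZ and PPIA appear in
# the abstract, GAPDH is added here as a deliberately unstable candidate.
candidates = {
    "YWHAZ": rng.normal(10.0, 0.15, 15),
    "PPIA":  rng.normal(12.0, 0.20, 15),
    "GAPDH": rng.normal(14.0, 0.80, 15),
}
# Rank by spread of log2 expression: smaller spread = more stable reference.
ranking = sorted(candidates, key=lambda g: candidates[g].std())
print(ranking)
```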

  19. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    PubMed

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database searching based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than them in terms of both precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Quantitative analysis of active compounds in pharmaceutical preparations by use of attenuated total-reflection Fourier transform mid-infrared spectrophotometry and the internal standard method.

    PubMed

    Sastre Toraño, J; van Hattum, S H

    2001-10-01

    A new method is presented for the quantitative analysis of compounds in pharmaceutical preparations by Fourier transform (FT) mid-infrared (MIR) spectroscopy with an attenuated total reflection (ATR) module. Reducing the number of overlapping absorption bands, by interaction of the compound of interest with an appropriate solvent, and employing an internal standard (IS) make MIR suitable for quantitative analysis. Vigabatrin, as the active compound in vigabatrin 100-mg capsules, was used as a model compound for the development of the method. Vigabatrin was extracted from the capsule content with water after addition of a sodium thiosulfate IS solution. The extract was concentrated by volume reduction and applied to the FTMIR-ATR module. Concentrations of unknown samples were calculated from the ratio of the vigabatrin band area (1321-1610 cm(-1)) to the IS band area (883-1215 cm(-1)) using a calibration standard. The ratio of the area of the vigabatrin peak to that of the IS was linear with concentration in the range of interest (90-110 mg, in twofold; n=2). The accuracy of the method in this range was 99.7-100.5% (n=5) with a variability of 0.4-1.3% (n=5). Comparison of the presented method with an HPLC assay showed similar results; the analysis of five vigabatrin 100-mg capsules resulted in a mean content of 102 mg with a variation of 2% for both methods.
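The internal-standard arithmetic is simple enough to sketch: the unknown amount follows from the analyte/IS band-area ratio, scaled by one calibration standard (the band areas below are hypothetical):

```python
def amount_from_ratio(area_analyte, area_is, cal_ratio, cal_amount_mg):
    """Unknown amount from the analyte/IS band-area ratio and one calibration
    standard, where cal_ratio is the standard's analyte/IS band-area ratio."""
    return (area_analyte / area_is) / cal_ratio * cal_amount_mg

# Hypothetical band areas (vigabatrin: 1321-1610 cm^-1; thiosulfate IS: 883-1215 cm^-1).
mg = amount_from_ratio(area_analyte=1.53, area_is=1.00, cal_ratio=1.50, cal_amount_mg=100.0)
print(round(mg, 1))  # → 102.0
```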

  1. A comparative evaluation of software for the analysis of liquid chromatography-tandem mass spectrometry data from isotope coded affinity tag experiments.

    PubMed

    Moulder, Robert; Filén, Jan-Jonas; Salmi, Jussi; Katajamaa, Mikko; Nevalainen, Olli S; Oresic, Matej; Aittokallio, Tero; Lahesmaa, Riitta; Nyman, Tuula A

    2005-07-01

    The options available for processing quantitative data from isotope coded affinity tag (ICAT) experiments have mostly been confined to software specific to the instrument of acquisition. However, recent developments in data format conversion have increased such processing opportunities. In the present study, data sets from ICAT experiments, analysed by liquid chromatography/tandem mass spectrometry (MS/MS) using an Applied Biosystems QSTAR Pulsar quadrupole-TOF mass spectrometer, were processed in triplicate using separate mass spectrometry software packages. The programs Pro ICAT, Spectrum Mill and SEQUEST with XPRESS were employed. Attention was paid to the extent of common identification and the agreement of quantitative results, with additional interest in the flexibility and productivity of these programs. The comparisons were made with data from the analysis of a specifically prepared test mixture, nine proteins at a range of relative concentration ratios from 0.1 to 10 (light to heavy labelled forms), as a known control, and data selected from an ICAT study involving the measurement of cytokine induced protein expression in human lymphoblasts, as an applied example. Dissimilarities were detected in peptide identification that reflected how the associated scoring parameters favoured information from the MS/MS data sets. Accordingly, there were differences in the numbers of peptide and protein identifications, although from these it was apparent that both confirmatory and complementary information was present. In the quantitative results from the three programs, no statistically significant differences were observed.

  2. A CZT-based blood counter for quantitative molecular imaging.

    PubMed

    Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe

    2017-12-01

    Robust quantitative analysis in positron emission tomography (PET) and in single-photon emission computed tomography (SPECT) typically requires the time-activity curve as an input function for the pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for the determination of blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm³ cadmium zinc telluride (CZT) detector, read by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as the results related to the detector performance in terms of stability, sensitivity and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for 18F. The gamma counter also demonstrated excellent stability, with a deviation in count rates below 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact gamma blood counter developed has the sensitivity and the stability required to conduct quantitative molecular imaging studies in PET and SPECT.
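For context, minimum detectable activity is often estimated with a Currie-style detection limit; a sketch with the sensitivity from the text but hypothetical background and counting-time values (not necessarily the authors' exact method):

```python
import math

def mda_kbq_per_ml(background_counts, t_s, sensitivity_cps_per_kbq_ml):
    """Currie-style MDA: detection limit L_D = 2.71 + 4.65*sqrt(B) counts,
    converted to activity via counting time and detector sensitivity."""
    detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit_counts / (t_s * sensitivity_cps_per_kbq_ml)

# Hypothetical background and counting time; sensitivity of 7.1 cps/(kBq/mL)
# taken from the text.
mda = mda_kbq_per_ml(background_counts=900, t_s=10.0, sensitivity_cps_per_kbq_ml=7.1)
print(round(mda, 2))  # → 2.0
```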

  3. In situ flash x-ray high-speed computed tomography for the quantitative analysis of highly dynamic processes

    NASA Astrophysics Data System (ADS)

    Moser, Stefan; Nau, Siegfried; Salk, Manfred; Thoma, Klaus

    2014-02-01

    The in situ investigation of dynamic events, ranging from car crashes to ballistics, is often key to the understanding of dynamic material behavior. In many cases the important processes and interactions happen on the scale of milli- to microseconds at speeds of 1000 m/s or more. Often, 3D information is necessary to fully capture and analyze all relevant effects. High-speed 3D-visualization techniques are thus required for the in situ analysis. 3D-capable optical high-speed methods often are impaired by luminous effects and dust, while flash x-ray based methods usually deliver only 2D data. In this paper, a novel 3D-capable flash x-ray based method, in situ flash x-ray high-speed computed tomography (HSCT), is presented. The method is capable of producing 3D reconstructions of high-speed processes based on an undersampled dataset consisting of only a few (typically 3 to 6) x-ray projections. The major challenges are identified and discussed, and the chosen solution is outlined. The method is illustrated with an exemplary application: a 1000 m/s high-speed impact event on the scale of microseconds. A quantitative analysis of the in situ measurement of the material fragments with a 3D reconstruction with 1 mm voxel size is presented and the results are discussed. The results show that the HSCT method allows one to gain valuable visual and quantitative mechanical information for the understanding and interpretation of high-speed events.
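The reconstruction-from-few-projections idea can be caricatured with a Kaczmarz/ART iteration on a toy linear system (illustrative only; the actual HSCT reconstruction pipeline is far more elaborate, and the 4x4 "object" and random ray matrix below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.random(16)          # toy 4x4 "object", flattened
A = rng.random((6, 16))          # 6 projection rays: heavily undersampled
b = A @ x_true                   # measured projections

# Kaczmarz/ART sweeps: project the estimate onto each ray equation in turn.
x = np.zeros(16)
for _ in range(2000):
    for ai, bi in zip(A, b):
        x += (bi - ai @ x) / (ai @ ai) * ai

# The estimate reproduces all measured projections; with only 6 rays, many
# objects would, which is exactly the undersampling challenge.
print(np.linalg.norm(A @ x - b) < 1e-3)  # → True
```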

  4. The benefits of improved technologies in agricultural aviation

    NASA Technical Reports Server (NTRS)

    Lietzke, K.; Abram, P.; Braen, C.; Givens, S.; Hazelrigg, G. A., Jr.; Fish, R.; Clyne, F.; Sand, F.

    1977-01-01

    The results are presented for a study of the economic benefits attributed to a variety of potential technological improvements in agricultural aviation. Part 1 gives a general description of the ag-air industry and discusses the information used in the data base to estimate the potential benefits from technological improvements. Part 2 presents the benefit estimates and provides a quantitative basis for the estimates in each area studied. Part 3 is a bibliography of references relating to this study.

  5. Sigmund Freud's practice: visits and consultation, psychoanalyses, remuneration.

    PubMed

    Tögel, Christfried

    2009-10-01

    This paper provides an overview of the quantitative side of the systematic records kept by Freud on his practice. He left precise records of the duration, frequency, and fees of psychoanalytic treatments. These statistics are compared with the treatment duration and frequency customary in present-day psychoanalytic practice in German-speaking countries. The results suggest that, regarding frequency and duration and their relationship, there is little difference between Freud's psychoanalytic practice and that of the present day.

  6. Basic sciences agonize in Turkey!

    NASA Astrophysics Data System (ADS)

    Akdemir, Fatma; Araz, Asli; Akman, Ferdi; Durak, Rıdvan

    2016-04-01

    In this study, changes from past to present in the departments of physics, chemistry, biology and mathematics, which are considered the basic sciences in Turkey, are examined. The importance of basic science for the country is emphasized, and its current status is discussed from a critical perspective. The numbers of academic staff, enrolled students, and admission quotas over the years were compiled for these four departments at universities, and the resulting changes were analyzed. The trends were similar across all four departments, with a particularly pronounced change in physics. The lack of jobs for young people who have graduated in the basic sciences is also an issue that must be discussed; psychological problems caused by unemployment have become widespread among them. Although the study also yielded qualitative findings, it focuses on the quantitative results. We have tried to explain the causes of the observed trends and to propose solutions.

  7. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind.
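The time-series workflow described above (represent, compare, cluster) can be sketched with synthetic data, a deliberately simple Euclidean metric, and greedy single-link grouping; the paper evaluates more sophisticated similarity measures, and the two response types below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
# Two hypothetical response types: motility collapsing under drug vs. unaffected.
series = [np.exp(-5 * t) + rng.normal(0, 0.03, 50) for _ in range(5)] + \
         [np.ones(50) + rng.normal(0, 0.03, 50) for _ in range(5)]

def dist(a, b):
    """Per-timepoint RMS distance between two phenotype time-series."""
    return np.linalg.norm(a - b) / np.sqrt(len(a))

# Greedy single-link grouping at a fixed distance threshold.
clusters = []
for s in series:
    for c in clusters:
        if any(dist(s, m) < 0.2 for m in c):
            c.append(s)
            break
    else:
        clusters.append([s])

print(len(clusters))  # → 2
```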

  8. Development of a Quantitative Food Frequency Questionnaire for Use among the Yup'ik People of Western Alaska

    PubMed Central

    Kolahdooz, Fariba; Simeon, Desiree; Ferguson, Gary; Sharma, Sangita

    2014-01-01

    Alaska Native populations are experiencing a nutrition transition and a resulting decrease in diet quality. The present study aimed to develop a quantitative food frequency questionnaire to assess the diet of the Yup'ik people of Western Alaska. A cross-sectional survey was conducted using 24-hour recalls and the information collected served as a basis for developing a quantitative food frequency questionnaire. A total of 177 males and females, aged 13-88, in six western Alaska communities, completed up to three 24-hour recalls as part of the Alaska Native Dietary and Subsistence Food Assessment Project. The frequency of the foods reported in the 24-hour recalls was tabulated and used to create a draft quantitative food frequency questionnaire, which was pilot tested and finalized with input from community members. Store-bought foods high in fat and sugar were reported more frequently than traditional foods. Seven of the top 26 foods most frequently reported were traditional foods. A 150-item quantitative food frequency questionnaire was developed that included 14 breads and crackers; 3 cereals; 11 dairy products; 69 meats, poultry and fish; 13 fruit; 22 vegetables; 9 desserts and snacks; and 9 beverages. The quantitative food frequency questionnaire contains 39 traditional food items. This quantitative food frequency questionnaire can be used to assess the unique diet of the Alaska Native people of Western Alaska. This tool will allow for monitoring of dietary changes over time as well as the identification of foods and nutrients that could be promoted in a nutrition intervention program intended to reduce chronic disease. PMID:24963718
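The tabulation step can be sketched as a frequency count over recalls to shortlist items for the draft questionnaire (the recalls below are hypothetical entries, not the study's data):

```python
from collections import Counter

# Hypothetical 24-hour recalls (not the study's data):
recalls = [
    ["salmon", "pilot bread", "soda", "moose soup"],
    ["salmon", "soda", "chips"],
    ["herring eggs", "pilot bread", "soda"],
]
# Tabulate how often each food was reported across all recalls.
freq = Counter(food for recall in recalls for food in recall)
print(freq.most_common(2))
```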

  9. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remain a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13 yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the boundary between repeating and not repeating a scan. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedures across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875
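The quantitative check reduces to a ratio and a threshold; a minimal sketch (the measurement values are hypothetical, and only the movement-to-limb-size ratio and the 25% cutoff come from the abstract):

```python
def percent_move(movement, limb_size):
    """Movement artifact as a percentage of limb size (same units for both)."""
    return 100.0 * movement / limb_size

def needs_repeat(movement, limb_size, cutoff=25.0):
    """Flag a diaphyseal scan for repeat above the 25% threshold."""
    return percent_move(movement, limb_size) > cutoff

# Hypothetical measurements:
print(needs_repeat(6.0, 30.0), needs_repeat(9.0, 30.0))  # → False True
```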

  10. Development of a quantitative food frequency questionnaire for use among the Yup'ik people of Western Alaska.

    PubMed

    Kolahdooz, Fariba; Simeon, Desiree; Ferguson, Gary; Sharma, Sangita

    2014-01-01

    Alaska Native populations are experiencing a nutrition transition and a resulting decrease in diet quality. The present study aimed to develop a quantitative food frequency questionnaire to assess the diet of the Yup'ik people of Western Alaska. A cross-sectional survey was conducted using 24-hour recalls and the information collected served as a basis for developing a quantitative food frequency questionnaire. A total of 177 males and females, aged 13-88, in six western Alaska communities, completed up to three 24-hour recalls as part of the Alaska Native Dietary and Subsistence Food Assessment Project. The frequency of the foods reported in the 24-hour recalls was tabulated and used to create a draft quantitative food frequency questionnaire, which was pilot tested and finalized with input from community members. Store-bought foods high in fat and sugar were reported more frequently than traditional foods. Seven of the top 26 foods most frequently reported were traditional foods. A 150-item quantitative food frequency questionnaire was developed that included 14 breads and crackers; 3 cereals; 11 dairy products; 69 meats, poultry and fish; 13 fruit; 22 vegetables; 9 desserts and snacks; and 9 beverages. The quantitative food frequency questionnaire contains 39 traditional food items. This quantitative food frequency questionnaire can be used to assess the unique diet of the Alaska Native people of Western Alaska. This tool will allow for monitoring of dietary changes over time as well as the identification of foods and nutrients that could be promoted in a nutrition intervention program intended to reduce chronic disease.

  11. Quantitative radiomics studies for tissue characterization: a review of technology and methodological procedures.

    PubMed

    Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter

    2017-02-01

    Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
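    The feature-selection concern raised above (more features than patients) can be illustrated with a minimal univariate pre-filter. The feature names, values, and variance threshold below are synthetic assumptions, not taken from any radiomics package.

```python
# Minimal sketch: with p features >> n patients, an unfiltered model can fit
# noise, so a common first mitigation is dropping uninformative (e.g.
# near-constant) features before model building. All numbers are synthetic.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Synthetic "radiomics" matrix: 3 patients, 5 features (p > n).
features = {
    "tumour_volume":    [10.0, 30.0, 55.0],
    "mean_intensity":   [1.1, 1.1, 1.1],   # near-constant -> uninformative
    "texture_entropy":  [4.2, 5.0, 3.1],
    "border_sharpness": [0.9, 0.9, 0.9],   # constant -> drop
    "sphericity":       [0.5, 0.7, 0.4],
}

kept = [name for name, vals in features.items() if variance(vals) > 1e-6]
print(kept)  # constant features are filtered out before model building
```

    Real radiomics pipelines go further (correlation filters, cross-validated selection), but the principle is the same: shrink the feature set before fitting a prediction model.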

  12. Mantle rheology and satellite signatures from present-day glacial forcings

    NASA Technical Reports Server (NTRS)

    Sabadini, Roberto; Yuen, David A.; Gasperini, Paolo

    1988-01-01

    Changes in the long-wavelength region of the earth's gravity field resulting from both present-day glacial discharges and the possible growth of the Antarctic ice sheet are considered. Significant differences in the responses between the Maxwell and Burgers body rheologies are found for time spans of less than 100 years. The quantitative model for predicting the secular variations of the gravitational potential, and means for incorporating glacial forcings, are described. Results are given for the excitation of the degree two harmonics. It is suggested that detailed satellite monitoring of present-day ice movements in conjunction with geodetic satellite missions may provide a reasonable alternative for the estimation of deep mantle viscosity.

  13. Effects of low-dose prenatal irradiation on the central nervous system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-04-01

    Scientists are in general agreement about the effects of prenatal irradiation, including those affecting the central nervous system (CNS). Differing concepts and research approaches have resulted in uncertainties about some quantitative relationships, underlying interpretations, and conclusions. Examples of uncertainties include the existence of a threshold, the quantitative relationships between prenatal radiation doses and resulting physical and functional lesions, and the processes by which lesions originate and develop. A workshop was convened in which scientists with varying backgrounds and viewpoints discussed these relationships and explored ways in which various disciplines could coordinate concepts and methodologies to suggest research directions for resolving uncertainties. This Workshop Report summarizes, in an extended fashion, salient features of the presentations on the current status of our knowledge about the radiobiology and neuroscience of prenatal irradiation and the relationships between them.

  14. Precision and Accuracy Parameters in Structured Light 3-D Scanning

    NASA Astrophysics Data System (ADS)

    Eiríksson, E. R.; Wilm, J.; Pedersen, D. B.; Aanæs, H.

    2016-04-01

    Structured light systems are popular in part because they can be constructed from off-the-shelf low cost components. In this paper we quantitatively show how common design parameters affect precision and accuracy in such systems, supplying a much needed guide for practitioners. Our quantitative measure is the established VDI/VDE 2634 (Part 2) guideline using precision made calibration artifacts. Experiments are performed on our own structured light setup, consisting of two cameras and a projector. We place our focus on the influence of calibration design parameters, the calibration procedure and encoding strategy and present our findings. Finally, we compare our setup to a state of the art metrology grade commercial scanner. Our results show that comparable, and in some cases better, results can be obtained using the parameter settings determined in this study.

  16. Nonperturbative quark, gluon, and meson correlators of unquenched QCD

    NASA Astrophysics Data System (ADS)

    Cyrol, Anton K.; Mitter, Mario; Pawlowski, Jan M.; Strodthoff, Nils

    2018-03-01

    We present nonperturbative first-principle results for quark, gluon, and meson 1PI correlation functions of two-flavor Landau-gauge QCD in the vacuum. These correlation functions carry the full information about the theory. They are obtained by solving their functional renormalization group equations in a systematic vertex expansion, aiming at apparent convergence. This work represents a crucial prerequisite for quantitative first-principle studies of the QCD phase diagram and the hadron spectrum within this framework. In particular, we have computed the gluon, ghost, quark, and scalar-pseudoscalar meson propagators, as well as gluon, ghost-gluon, quark-gluon, quark, quark-meson, and meson interactions. Our results stress the crucial importance of the quantitatively correct running of different vertices in the semiperturbative regime for describing the phenomena and scales of confinement and spontaneous chiral symmetry breaking without phenomenological input.

  17. Quadrant photodetector sensitivity.

    PubMed

    Manojlović, Lazo M

    2011-07-10

    A quantitative theoretical analysis of the quadrant photodetector (QPD) sensitivity in position measurement is presented. A Gaussian light spot irradiance distribution on the QPD surface was assumed, as this meets most real-life applications of this sensor. As a result of the mathematical treatment of the problem, we obtained, in closed form, the sensitivity as a function of the ratio of the light spot 1/e radius to the QPD radius; the result is valid over the full range of this ratio. To check the influence of the finite light spot radius on the interaxis cross talk and linearity, we also performed a mathematical analysis to quantitatively measure these types of errors. An optimal range of the ratio of light spot radius to QPD radius has been found that simultaneously achieves low interaxis cross talk and high linearity of the sensor. © 2011 Optical Society of America
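    The basic QPD position signal for a Gaussian spot can be sketched numerically. This is a hedged illustration that ignores the finite detector radius the paper's analysis accounts for, and it assumes w denotes the 1/e² irradiance radius rather than the paper's 1/e radius.

```python
# Hedged sketch of the normalized quadrant-photodetector x signal:
# (P_right - P_left) / P_total for an irradiance proportional to
# exp(-2 x^2 / w^2), with quadrant boundaries along the axes and an
# infinitely large detector (an assumption; the paper treats finite radius).
from math import erf, sqrt

def qpd_x_signal(dx: float, w: float) -> float:
    """Normalized left/right difference signal for a spot offset dx."""
    p_right = 0.5 * (1.0 + erf(sqrt(2.0) * dx / w))
    return 2.0 * p_right - 1.0

# Near the center the response is linear, and the sensitivity (slope)
# scales inversely with the spot radius w:
print(qpd_x_signal(0.01, 1.0))  # small offset, small signal
print(qpd_x_signal(0.01, 0.5))  # halving w roughly doubles the signal
```

    The paper's contribution is precisely what this sketch omits: how a finite QPD radius modifies this curve and trades off sensitivity, cross talk, and linearity.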

  18. Clinical evaluation of tuberculosis viability microscopy for assessing treatment response.

    PubMed

    Datta, Sumona; Sherman, Jonathan M; Bravard, Marjory A; Valencia, Teresa; Gilman, Robert H; Evans, Carlton A

    2015-04-15

    It is difficult to determine whether early tuberculosis treatment is effective in reducing the infectiousness of patients' sputum, because culture takes weeks and conventional acid-fast sputum microscopy and molecular tests cannot differentiate live from dead tuberculosis. To assess treatment response, sputum samples (n=124) from unselected patients (n=35) with sputum microscopy-positive tuberculosis were tested pretreatment and after 3, 6, and 9 days of empiric first-line therapy. Tuberculosis quantitative viability microscopy with fluorescein diacetate, quantitative culture, and acid-fast auramine microscopy were all performed in triplicate. Tuberculosis quantitative viability microscopy predicted quantitative culture results such that 76% of results agreed within ±1 logarithm (rS=0.85; P<.0001). In 31 patients with non-multidrug-resistant (MDR) tuberculosis, viability and quantitative culture results approximately halved (both 0.27 log reduction, P<.001) daily. For patients with non-MDR tuberculosis and available data, by treatment day 9 there was a >10-fold reduction in viability in 100% (24/24) of cases and quantitative culture in 95% (19/20) of cases. Four other patients subsequently found to have MDR tuberculosis had no significant changes in viability (P=.4) or quantitative culture (P=.6) results during early treatment. The change in viability and quantitative culture results during early treatment differed significantly between patients with non-MDR tuberculosis and those with MDR tuberculosis (both P<.001). Acid-fast microscopy results changed little during early treatment, and this change was similar for non-MDR tuberculosis vs MDR tuberculosis (P=.6). Tuberculosis quantitative viability microscopy is a simple test that within 1 hour predicted quantitative culture results that became available weeks later, rapidly indicating whether patients were responding to tuberculosis therapy. © The Author 2014. 
Published by Oxford University Press on behalf of the Infectious Diseases Society of America.
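    The "approximately halved daily" claim above follows directly from the reported 0.27 log reduction per day; a back-of-envelope check:

```python
# Arithmetic check: a 0.27 log10 reduction per day multiplies the viable
# count by 10**-0.27 ≈ 0.54 each day (roughly halving), and at that rate a
# >10-fold (>1 log) reduction is reached within 4 days.
import math

daily_log_reduction = 0.27
daily_factor = 10 ** -daily_log_reduction
print(round(daily_factor, 2))  # ~0.54, close to one half

days_to_tenfold = math.ceil(1.0 / daily_log_reduction)
print(days_to_tenfold)  # 4 days at 0.27 log/day exceeds a 1-log reduction
```

    This is consistent with the abstract's observation that all non-MDR cases showed a >10-fold viability reduction by treatment day 9.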

  19. Modeling aeolian dune and dune field evolution

    NASA Astrophysics Data System (ADS)

    Diniega, Serina

    Aeolian sand dune morphologies and sizes are strongly connected to the environmental context and physical processes active since dune formation. As such, the patterns and measurable features found within dunes and dune fields can be interpreted as records of environmental conditions. Using mathematical models of dune and dune field evolution, it should be possible to quantitatively predict dune field dynamics from current conditions or to determine past field conditions based on present-day observations. In this dissertation, we focus on the construction and quantitative analysis of a continuum dune evolution model. We then apply this model towards interpretation of the formative history of terrestrial and martian dunes and dune fields. Our first aim is to identify the controls for the characteristic lengthscales seen in patterned dune fields. Variations in sand flux, binary dune interactions, and topography are evaluated with respect to evolution of individual dunes. Through the use of both quantitative and qualitative multiscale models, these results are then extended to determine the role such processes may play in (de)stabilization of the dune field. We find that sand flux variations and topography generally destabilize dune fields, while dune collisions can yield more similarly-sized dunes. We construct and apply a phenomenological macroscale dune evolution model to then quantitatively demonstrate how dune collisions cause a dune field to evolve into a set of uniformly-sized dunes. Our second goal is to investigate the influence of reversing winds and polar processes in relation to dune slope and morphology. Using numerical experiments, we investigate possible causes of distinctive morphologies seen in Antarctic and martian polar dunes. Finally, we discuss possible model extensions and needed observations that will enable the inclusion of more realistic physical environments in the dune and dune field evolution models. 
By elucidating the qualitative and quantitative connections between environmental conditions, physical processes, and resultant dune and dune field morphologies, this research furthers our ability to interpret spacecraft images of dune fields, and to use present-day observations to improve our understanding of past terrestrial and martian environments.

  20. Quantitative analysis of N-glycans from human alfa-acid-glycoprotein using stable isotope labeling and zwitterionic hydrophilic interaction capillary liquid chromatography electrospray mass spectrometry as tool for pancreatic disease diagnosis.

    PubMed

    Giménez, Estela; Balmaña, Meritxell; Figueras, Joan; Fort, Esther; de Bolós, Carme; Sanz-Nebot, Victòria; Peracaula, Rosa; Rizzi, Andreas

    2015-03-25

    In this work we demonstrate the potential of glycan reductive isotope labeling (GRIL) using [(12)C]- and [(13)C]-coded aniline and zwitterionic hydrophilic interaction capillary liquid chromatography electrospray mass spectrometry (μZIC-HILIC-ESI-MS) for relative quantitation of glycosylation variants in selected glycoproteins present in samples from cancer patients. Human α1-acid-glycoprotein (hAGP) is an acute phase serum glycoprotein whose glycosylation has been described to be altered in cancer and chronic inflammation. However, it is not yet clear whether particular glycans in hAGP can be used as biomarkers for differentiating between these two pathologies. In this work, hAGP was isolated by immunoaffinity chromatography (IAC) from serum samples of healthy individuals and from those suffering from chronic pancreatitis and different stages of pancreatic cancer. After de-N-glycosylation, relative quantitation of the hAGP glycans was carried out using stable isotope labeling and μZIC-HILIC-ESI-MS analysis. First, protein denaturing conditions prior to PNGase F digestion were optimized to achieve quantitative digestion yields, and the reproducibility of the established methodology was evaluated with standard hAGP. Then, the proposed method was applied to the analysis of the clinical samples (control vs. pathological). Pancreatic cancer samples clearly showed an increase in the abundance of fucosylated glycans as the stage of the disease increased, unlike samples from chronic pancreatitis. The results obtained here indicate these fucosylated hAGP glycans as candidate structures worth corroborating in an extended study including more clinical cases, especially those with chronic pancreatitis and initial stages of pancreatic cancer.
Importantly, the results demonstrate that the presented methodology, combining enrichment of a target protein by IAC with isotope-coded relative quantitation of N-glycans, can be successfully used for targeted glycomics studies. The methodology should also be suitable for other such studies aimed at finding novel cancer-associated glycoprotein biomarkers. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Quantitative and Qualitative Analysis of Phenolic and Flavonoid Content in Moringa oleifera Lam and Ocimum tenuiflorum L.

    PubMed Central

    Sankhalkar, Sangeeta; Vernekar, Vrunda

    2016-01-01

    Background: A number of secondary compounds are produced by plants as natural antioxidants. Moringa oleifera Lam. and Ocimum tenuiflorum L. are known for their wide applications in the food and pharmaceutical industries. Objective: To compare phenolic and flavonoid content in M. oleifera Lam. and O. tenuiflorum L. by quantitative and qualitative analysis. Materials and Methods: Phenolic and flavonoid content were studied spectrophotometrically and by paper chromatography in M. oleifera Lam. and O. tenuiflorum L. Results: Higher phenolic and flavonoid content were observed in Moringa leaf and flower. Ocimum flower showed higher phenolic content and lower flavonoid content in comparison to Moringa. Flavonoids such as biflavonyl, flavones, glycosylflavones, and kaempferol were identified by paper chromatography. Phytochemical analyses for flavonoids, tannins, saponins, alkaloids, reducing sugars, and anthraquinones tested positive for Moringa and Ocimum leaf as well as flower. Conclusions: In the present study, the higher phenolic and flavonoid content indicated the natural antioxidant nature of Moringa and Ocimum, signifying their medicinal importance. SUMMARY Moringa oleifera Lam. and Ocimum tenuiflorum L. are widely grown in India and are known for their medicinal properties. A number of secondary metabolites, such as phenolics and flavonoids, are known to be present in both plants. The present study was conducted with the objective of qualitatively and quantitatively comparing the phenolics and flavonoids in these two medicinally important plants. Quantitation of total phenolics and flavonoids was done spectrophotometrically, while qualitative analysis was performed by paper chromatography and by phytochemical tests. Our results showed higher phenolic and flavonoid content in Moringa leaf and flower, whereas Ocimum flower showed lower flavonoid content than that of Moringa.
Phytochemical analysis of various metabolites such as flavonoids, tannins, saponins, alkaloids, and anthraquinones revealed that both plant extracts are rich sources of secondary metabolites, testing positive in all of the above assays. Various flavonoids and phenolics were identified by paper chromatography based on their Rf values and characteristic colors. From the above study we conclude that Moringa and Ocimum are rich in natural antioxidants and hence are a potent resource for the pharmaceutical industry. PMID:26941531

  2. Quantitative paleotopography and paleogeography around the Gibraltar Arc (South Spain) during the Messinian Salinity Crisis

    NASA Astrophysics Data System (ADS)

    Elez, Javier; Silva, Pablo G.; Huerta, Pedro; Perucha, M. Ángeles; Civis, Jorge; Roquero, Elvira; Rodríguez-Pascua, Miguel A.; Bardají, Teresa; Giner-Robles, Jorge L.; Martínez-Graña, Antonio

    2016-12-01

    The Malaga basin contains an important geological record documenting the complex paleogeographic evolution of the Gibraltar Arc before, during and after the closure and desiccation of the Mediterranean Sea triggered by the "Messinian Salinity Crisis" (MSC). Proxy paleo-elevation data, estimated from the stratigraphic and geomorphological records, allow the building of quantitative paleogeoid, paleotopographic and paleogeographic models for the three main paleogeographic stages: pre-MSC (Tortonian-early Messinian), syn-MSC (late Messinian) and post-MSC (early Pliocene). The methodological workflow combines classical contouring procedures used in geology and isobase map models from geomorphometric analyses and proxy data overprinted on present Digital Terrain Models. The resulting quantitative terrain models have been arranged, managed and computed in a GIS environment. The computed terrain models enable the exploration of past landscapes usually beyond the reach of classical geomorphological analyses and strongly improve the paleogeographic and paleotopographic knowledge of the study area. The resulting models suggest the occurrence of a set of uplifted littoral erosive and paleokarstic landforms that evolved during pre-MSC times. These uplifted landform assemblages can explain the origin of key elements of the present landscape, such as the Torcal de Antequera and the large number of mogote-like relict hills present in the zone, in terms of ancient uplifted tropical islands. The most prominent landform is the extensive erosional platform dominating the Betic frontal zone that represents the relict Atlantic wave-cut platform elaborated during late-Tortonian to early Messinian times. The amount of uplift derived from paleogeoid models suggests that the area rose by about 340 m during the MSC.
This points to isostatic uplift triggered by differential erosional unloading (towards the Mediterranean) as the main factor controlling landscape evolution in the area during and after the MSC. Former littoral landscapes in the old emergent axis of the Gibraltar Arc were uplifted to form the main water-divide of the present Betic Cordillera in the zone.

  3. Using the Blended Learning Approach in a Quantitative Literacy Course

    ERIC Educational Resources Information Center

    Botts, Ryan T.; Carter, Lori; Crockett, Catherine

    2018-01-01

    The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…

  4. Past, Present, and Future of Critical Quantitative Research in Higher Education

    ERIC Educational Resources Information Center

    Wells, Ryan S.; Stage, Frances K.

    2014-01-01

    This chapter discusses the evolution of the critical quantitative paradigm with an emphasis on extending this approach to new populations and new methods. Along with this extension of critical quantitative work, however, come continued challenges and tensions for researchers. This chapter recaps and responds to each chapter in the volume, and…

  5. Young People's Attitudes to Religious Diversity: Quantitative Approaches from Social Psychology and Empirical Theology

    ERIC Educational Resources Information Center

    Francis, Leslie J.; Croft, Jennifer S.; Pyke, Alice; Robbins, Mandy

    2012-01-01

    This essay discusses the design of the quantitative component of the "Young People's Attitudes to Religious Diversity" project, conceived by Professor Robert Jackson within the Warwick Religions and Education Research Unit, and presents some preliminary findings from the data. The quantitative component followed and built on the…

  6. Boiler Tube Corrosion Characterization with a Scanning Thermal Line

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas

    2001-01-01

    Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very labor-intensive and slow; therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall, and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as the electromagnetic acoustic transducer (EMAT), have recently been evaluated; however, they provide only a qualitative evaluation, identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish its quantitative nature. Further, a dynamic calibration system for the technique will be presented that allows the extraction of thickness information from the temperature data.
Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.

  7. Quantity discrimination in canids: Dogs (Canis familiaris) and wolves (Canis lupus) compared.

    PubMed

    Miletto Petrazzini, Maria Elena; Wynne, Clive D L

    2017-11-01

    Accumulating evidence indicates that animals are able to discriminate between quantities. Recent studies have shown that dogs' and coyotes' ability to discriminate between quantities of food items decreases with increasing numerical ratio. Conversely, wolves' performance is not affected by numerical ratio. Cross-species comparisons are difficult because of differences in the methodologies employed, and hence it is still unclear whether domestication altered quantitative abilities in canids. Here we used the same procedure to compare pet dogs and wolves in a spontaneous food choice task. Subjects were presented with two quantities of food items and allowed to choose only one option. Four numerical contrasts of increasing difficulty (range 1-4) were used to assess the influence of numerical ratio on the performance of the two species. Dogs' accuracy was affected by numerical ratio, while no ratio effect was observed in wolves. These results align with previous findings and reinforce the idea of different quantitative competences in dogs and wolves. Although we cannot exclude that other variables might have played a role in shaping quantitative abilities in these two species, our results might suggest that the interspecific differences here reported may have arisen as a result of domestication. Copyright © 2017 Elsevier B.V. All rights reserved.
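    The numerical-ratio difficulty measure implied above is simply the smaller quantity divided by the larger; discrimination gets harder as this ratio approaches 1. The contrast pairs below are example pairs within the study's 1-4 range, not necessarily the exact set reported.

```python
# Illustrative computation of numerical ratio for food-quantity contrasts:
# ratio = smaller / larger, so more similar quantities give a higher
# (harder) ratio. Example pairs only.

def numerical_ratio(a: int, b: int) -> float:
    return min(a, b) / max(a, b)

contrasts = [(1, 4), (1, 3), (1, 2), (3, 4)]
for small, large in contrasts:
    print(small, "vs", large, "-> ratio", round(numerical_ratio(small, large), 2))
# Dogs' accuracy declined as the ratio rose; wolves' accuracy did not.
```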

  8. Quantitative Vectorial Magnetic Imaging of Multi Domain Rock Forming Minerals using Nitrogen-Vacancy Centers in Diamond

    NASA Astrophysics Data System (ADS)

    Shaar, R.; Farchi, E.; Farfurnik, D.; Ebert, Y.; Haim, G.; Bar-Gill, N.

    2017-12-01

    Magnetization in rock samples is crucial for paleomagnetometry research, as it harbors valuable geological information on long term processes, such as tectonic movements and the formation of oceans and continents. Nevertheless, current techniques are limited in their ability to measure high spatial resolution and high-sensitivity quantitative vectorial magnetic signatures from individual minerals and micrometer scale samples. As a result, our understanding of bulk rock magnetization is limited, specifically for the case of multi-domain minerals. In this work we use a newly developed nitrogen-vacancy magnetic microscope, capable of quantitative vectorial magnetic imaging with optical resolution. We demonstrate direct imaging of the vectorial magnetic field of a single, multi-domain dendritic magnetite, as well as the measurement and calculation of the weak magnetic moments of an individual grain on the micron scale. Our results were measured at a standoff distance of 3-10 μm, with 350 nm spatial resolution, magnetic sensitivity of 6 μT/√(Hz) and a field of view of 35 μm. The results presented here show the capabilities and the future potential of NV microscopy in measuring the magnetic signals of individual micrometer scale grains. These outcomes pave the way for future applications in paleomagnetometry, and for the fundamental understanding of magnetization in multi-domain samples.

  9. Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters

    NASA Astrophysics Data System (ADS)

    Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit

    2018-06-01

    Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and the tool pin profile is crucial to developing the process for joining HMPM materials. Here we present a quantitative study of H13 steel tool pin wear during FSW of a CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that measured wear depth is small near the pin root and increases significantly towards the tip. Near the pin tip, wear depth increases with increasing tool rotational speed; however, the change in wear depth near the pin root is minimal. Wear depth also increases with decreasing tool traverse speed. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds and longer traverse distances. The wear-induced changes in pin profile result in a root defect at long traverse distances. This quantitative understanding would be helpful for estimating tool wear and optimizing process parameters and tool pin shape during FSW of HMPM materials.

  10. The effect of vortex formation on left ventricular filling and mitral valve efficiency.

    PubMed

    Pierrakos, Olga; Vlachos, Pavlos P

    2006-08-01

    A new mechanism for quantifying the filling energetics in the left ventricle (LV) and past mechanical heart valves (MHVs) is identified and presented. This mechanism is attributed to vortex formation dynamics past MHV leaflets. Recent studies support the conjecture that the natural healthy LV performs in an optimum, energy-preserving manner by redirecting the flow with high efficiency, yet to date no quantitative proof has been presented. In support of this hypothesis, the present work provides quantitative results and validation of a theory based on the dynamics of vortex ring formation, which is governed by a critical formation number (FN), the dimensionless time at which the vortex ring has reached its maximum circulation content. Herein, several parameters (vortex ring circulation, vortex ring energy, critical FN, hydrodynamic efficiencies, vortex ring propagation speed) have been quantified and presented as a means of characterizing the physics of vortex formation in the LV. In fact, the diastolic hydrodynamic efficiencies were found to be 60, 41, and 29%, respectively, for the porcine, anti-anatomical, and anatomical valve configurations. This assessment provides quantitative proof that vortex formation, which depends on valve design and orientation, is an important flow characteristic associated with LV energetics. Time-resolved digital particle image velocimetry with a kilohertz sampling rate was used to study the ejection of fluid into the LV and resolve the spatiotemporal evolution of the flow. The clinical significance of this study is in quantifying vortex formation and the critical FN, which can potentially serve as parameters to quantify the LV filling process and the performance of heart valves.
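    The formation number referenced above has a simple definition for a piston-driven ring: the time-integrated ejection velocity (stroke length) divided by the orifice diameter. The sketch below assumes constant ejection velocity, and the numeric values are synthetic; the critical value near 4 is the classic result of Gharib and colleagues for vortex ring pinch-off.

```python
# Hedged sketch of the dimensionless formation time for a vortex ring:
# FN = (stroke length) / (orifice diameter) = U * t / D for constant-
# velocity ejection. Beyond a critical FN (~4 in the classic pinch-off
# experiments), the ring stops absorbing circulation. Values are synthetic.

def formation_number(mean_velocity: float, ejection_time: float,
                     diameter: float) -> float:
    """FN = U * t / D for constant-velocity ejection."""
    return mean_velocity * ejection_time / diameter

fn = formation_number(mean_velocity=0.5, ejection_time=0.2, diameter=0.025)
print(fn)  # near the critical value, the ring is close to pinch-off
```

    In the LV context, the diameter would correspond to the mitral or valve orifice and the velocity to the transvalvular filling jet, which is what ties FN to diastolic filling performance.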

  11. Shape effects in the turbulent tumbling of large particles

    NASA Astrophysics Data System (ADS)

    Variano, Evan; Oehmke, Theresa; Pujara, Nimish

    2017-11-01

We present laboratory results on the rotation of finite-sized, neutrally buoyant, anisotropic particles in isotropic turbulence. The isotropic turbulent flow is generated using a randomly actuated synthetic jet array that minimizes tank-scale circulation, and measurements are made with stereoscopic particle image velocimetry. By using particles of different shapes, we explore the effects that symmetries have on particle rotation. We add to previous data collected for spheres, cylinders, and ellipsoids by performing new measurements on cubes, cuboids, and cones. The measurement technique and results on mean-square particle rotation will be presented. Preliminary results, at the time of writing this abstract, indicate that symmetry breaking increases the rate of particle rotation. More complete quantitative results will be presented. This work was partially supported by NSF award ENG-1604026 and by the Army Research Office Biomathematics Program.

  12. An iterative method for near-field Fresnel region polychromatic phase contrast imaging

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2017-07-01

    We present an iterative method for polychromatic phase contrast imaging that is suitable for broadband illumination and which allows for the quantitative determination of the thickness of an object given the refractive index of the sample material. Experimental and simulation results suggest the iterative method provides comparable image quality and quantitative object thickness determination when compared to the analytical polychromatic transport of intensity and contrast transfer function methods. The ability of the iterative method to work over a wider range of experimental conditions means the iterative method is a suitable candidate for use with polychromatic illumination and may deliver more utility for laboratory-based x-ray sources, which typically have a broad spectrum.

  13. Optical spectroscopy of ancient paper and textiles

    NASA Astrophysics Data System (ADS)

    Missori, M.

    2016-03-01

Ancient paper and textiles are a striking example of optically inhomogeneous materials whose optical responses are strongly governed by scattering effects. To recover the absorption coefficient from non-invasive, non-destructive reflectance measurements, a specific approach based on Kubelka-Munk two-flux theory must be applied. In this way quantitative chemical information, such as chromophore concentration, can be obtained, as well as quantitative spectra of additional substances such as pigments or dyes. Results will be presented on a folio of the Codex on the Flight of Birds by Leonardo da Vinci and on a linen cloth dating back to 1653, called the Shroud of Arquata, a copy of the Shroud of Turin.
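
The Kubelka-Munk two-flux relation the abstract refers to links the diffuse reflectance of an optically thick layer to the ratio of its absorption and scattering coefficients. A minimal sketch of that standard formula (the generic textbook relation, not the author's specific analysis pipeline):

```python
def kubelka_munk(r_inf):
    """K/S = (1 - R_inf)^2 / (2 * R_inf) for an optically thick sample.

    R_inf is the measured diffuse reflectance, K the absorption and S the
    scattering coefficient; given an independently estimated S, the
    absorption spectrum K(lambda) follows from K/S at each wavelength.
    """
    if not 0.0 < r_inf <= 1.0:
        raise ValueError("reflectance must lie in (0, 1]")
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

print(kubelka_munk(0.5))  # 0.25
print(kubelka_munk(1.0))  # 0.0: a perfect reflector absorbs nothing
```

Applied per wavelength, this converts a reflectance spectrum into an absorption-to-scattering spectrum from which chromophore concentrations can be estimated.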

  14. Clinical application of quantitative computed tomography in osteogenesis imperfecta-suspected cat.

    PubMed

    Won, Sungjun; Chung, Woo-Jo; Yoon, Junghee

    2017-09-30

A one-year-old male Persian cat presented with multiple fractures and no known history of trauma. Marked decrease of bone radiopacity and thin cortices of all long bones were identified on radiography. The tentative diagnosis was osteogenesis imperfecta, a congenital disorder characterized by fragile bones. To determine bone mineral density (BMD), quantitative computed tomography (QCT) was performed. The QCT results revealed a mean trabecular BMD of the vertebral bodies of 149.9 ± 86.5 mg/cm³. After bisphosphonate therapy, BMD at the same site increased significantly (218.5 ± 117.1 mg/cm³, p < 0.05). QCT was a useful diagnostic tool for diagnosing osteopenia and quantifying the response to medical treatment.

  15. Quantitative Voronovskaya and Grüss-Voronovskaya type theorems for Jain-Durrmeyer operators of blending type

    NASA Astrophysics Data System (ADS)

    Kajla, Arun; Deshwal, Sheetal; Agrawal, P. N.

    2018-05-01

In the present paper we introduce a Durrmeyer variant of the Jain operators based on a function ρ(x), where ρ is continuously differentiable on [0, ∞), ρ(0) = 0, and inf_{x ∈ [0, ∞)} ρ′(x) ≥ a for some a > 0. For these new operators, some indispensable auxiliary results are established first. Then the degree of approximation, with the aid of the Ditzian-Totik modulus of smoothness, and the rate of convergence for functions whose derivatives are of bounded variation are obtained. Further, we focus on a Voronovskaja-type asymptotic theorem and on quantitative Voronovskaya and Grüss-Voronovskaya type theorems.

  16. Normal and abnormal human vestibular ocular function

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Black, F. O.

    1986-01-01

The major motivation of this research is to understand the role the vestibular system plays in the sensorimotor interactions that result in spatial disorientation and motion sickness. A second goal is to explore the range of abnormality as reflected in quantitative measures of vestibular reflex responses. The results of a study of vestibular reflex measurements in normal subjects, and preliminary results in abnormal subjects, are presented in this report. Statistical methods were used to define the range of normal responses and to determine age-related changes in function.

  17. Quantitative x-ray phase-contrast imaging using a single grating of comparable pitch to sample feature size.

    PubMed

    Morgan, Kaye S; Paganin, David M; Siu, Karen K W

    2011-01-01

The ability to quantitatively retrieve transverse phase maps during imaging with coherent x-rays often requires a precise grating- or analyzer-crystal-based setup. Imaging of live animals presents further challenges when these methods require multiple exposures for image reconstruction. We present a simple method of single-exposure, single-grating quantitative phase contrast for a regime in which the grating period is much greater than the effective pixel size. A grating is used to create a high-visibility reference pattern incident on the sample, which is distorted according to the complex refractive index and thickness of the sample. The resolution along a line parallel to the grating is not restricted by the grating spacing, and the detector resolution becomes the primary determinant of the spatial resolution. We present a method of analysis that maps the displacement of interrogation windows in order to retrieve a quantitative phase map. Application of this analysis to the imaging of known phantoms shows excellent correspondence.
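
The final retrieval step, turning measured pattern displacements into a phase map, can be sketched in one dimension under the standard geometric-flow approximation (an assumption for illustration, not the authors' exact algorithm; all numbers below are made up): a transverse shift d(x) of the reference pattern after propagating a distance z corresponds to a phase gradient 2πd(x)/(λz), which is then integrated.

```python
import numpy as np

def phase_from_displacement(disp, dx, wavelength, z):
    """Recover a 1-D phase profile phi(x) from pattern displacements d(x).

    Geometric-flow approximation: dphi/dx = 2*pi*d(x) / (wavelength * z).
    The gradient is integrated by a running sum over the pixel pitch dx.
    """
    dphi_dx = 2.0 * np.pi * disp / (wavelength * z)
    return np.cumsum(dphi_dx) * dx

# a linear phase ramp displaces the reference pattern uniformly
wavelength, z, dx = 1e-10, 1.0, 1e-6   # hypothetical: 0.1 nm x-rays, 1 m, 1 um pixels
disp = np.full(100, 2e-7)              # uniform 0.2 um measured shift
phi = phase_from_displacement(disp, dx, wavelength, z)
slope = (phi[-1] - phi[0]) / (99 * dx) # recovered phase gradient, rad/m
```

In the actual analysis the displacements would come from cross-correlating interrogation windows of the distorted pattern against the reference, as in particle image velocimetry.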

  18. Assessing the properties of internal standards for quantitative matrix-assisted laser desorption/ionization mass spectrometry of small molecules.

    PubMed

    Sleno, Lekha; Volmer, Dietrich A

    2006-01-01

Growing interest in quantitative assays for small molecules by matrix-assisted laser desorption/ionization (MALDI) has been the driving force for several recent studies. The present work investigates internal standards for these analyses using a high-repetition-rate MALDI triple quadrupole instrument. Certain physicochemical properties are assessed for predicting suitable internal standards for different small molecules. The importance of an internal standard's molecular weight being similar to that of its analyte is seen in experiments with a series of acylcarnitines having a fixed charge site and growing alkyl chain length. Both acetyl- and hexanoyl-carnitine were systematically assessed with several other acylcarnitine compounds as internal standards. The results clearly demonstrate that closely matched molecular weights between analyte and internal standard are essential for acceptable quantitation. With alpha-cyano-4-hydroxycinnamic acid as the organic matrix, the similarity between analyte and internal standard remains the most important parameter, not necessarily their even distribution within the solid sample spot. Several 4-quinolone antibiotics, as well as a diverse group of pharmaceutical drugs, were tested as internal standards for the 4-quinolone ciprofloxacin. Quantitative results were interpreted using the solution-phase properties log D and pKa of these molecules. The distribution coefficient, log D, is shown to be a fundamental parameter governing similar crystallization patterns of analyte and internal standard. Finally, it was also possible to quantify ciprofloxacin using a drug from a different compound class, namely quinidine, which has a log D value similar to that of the analyte. Copyright 2006 John Wiley & Sons, Ltd.

  19. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

Identification of new biomarkers specific for various pathological conditions is an important field in the medical sciences. Body fluids have emerging potential in biomarker studies, especially those that are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva, and sweat may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies and to present advances in quantitative proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aimed at biomarker discovery: starting with the identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. Body fluids collected by non-invasive means have high relevance in medicine; they are good sources of biomarkers for establishing a diagnosis, following up disease progression, and predicting high-risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus, and urine for local and systemic diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  1. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  2. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  3. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  4. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  5. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  6. Defense AT and L. Volume 42, Number 2, March-April 2013

    DTIC Science & Technology

    2013-03-01

    results, quantitative and qualitative. After reviewing the data, and with discussions across the ELT functional areas, several initiatives were...information in a more comprehensible format. When data are collected and presented as a "filled-in" model, the result is called a view. The Department of...rights acquisition, (2) the DoD increases software data-rights training for all appropriate personnel so they understand the use and relevance of

  7. Transforming Boolean models to continuous models: methodology and application to T-cell receptor signaling

    PubMed Central

    Wittmann, Dominik M; Krumsiek, Jan; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Klamt, Steffen; Theis, Fabian J

    2009-01-01

Background The understanding of regulatory and signaling networks has long been a core objective in Systems Biology. Knowledge about these networks is mainly of a qualitative nature, which allows the construction of Boolean models, where the state of a component is either 'off' or 'on'. While often able to capture the essential behavior of a network, these models can never reproduce detailed time courses of concentration levels. Nowadays, however, experiments yield more and more quantitative data. An obvious question therefore is how qualitative models can be used to explain and predict the outcome of these experiments. Results In this contribution we present a canonical way of transforming Boolean into continuous models, in which multivariate polynomial interpolation allows the transformation of logic operations into a system of ordinary differential equations (ODEs). The method is standardized and can readily be applied to large networks. Other, more limited approaches to this task are briefly reviewed and compared. Moreover, we discuss and generalize existing theoretical results on the relation between Boolean and continuous models. As a test case, a logical model is transformed into an extensive continuous ODE model describing the activation of T-cells. We discuss how parameters for this model can be determined such that quantitative experimental results are explained and predicted, including time courses for multiple ligand concentrations and binding affinities of different ligands. This shows that from the continuous model we may obtain biological insights not evident from the discrete one. Conclusion The presented approach will facilitate the interaction between modeling and experiments. Moreover, it provides a straightforward way to apply quantitative analysis methods to qualitatively described systems. PMID:19785753
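
The core idea, interpolating Boolean update rules by multivariate polynomials and relaxing each node toward its interpolated target, can be sketched on a toy two-node circuit (a generic illustration of the approach, with a made-up network, not the paper's T-cell model or its exact software):

```python
import numpy as np

# Polynomial interpolation of the basic gates on {0,1}^2, extended to [0,1]^2:
def AND(a, b): return a * b
def OR(a, b):  return a + b - a * b
def NOT(a):    return 1.0 - a

def simulate(x0, update_fns, tau=1.0, dt=0.01, steps=2000):
    """Euler-integrate dx_i/dt = (B_i(x) - x_i)/tau, where B_i is the
    continuous (polynomial) version of node i's Boolean update rule."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        b = np.array([f(x) for f in update_fns])
        x += dt * (b - x) / tau
    return x

# hypothetical 2-node circuit:
#   A <- OR(A, input) with a constant 'on' input
#   B <- AND(A, NOT(B))  (activated by A, self-limiting)
fns = [lambda x: OR(x[0], 1.0),
       lambda x: AND(x[0], NOT(x[1]))]
state = simulate([0.0, 0.0], fns)
# state converges to A = 1 and B = 0.5, the fixed point of A*(1-B) = B
```

The continuous model inherits the Boolean fixed points on {0, 1} where they exist, but, as here for B, can also settle at intermediate activation levels that the discrete model cannot express.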

  8. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and the host tissue have to be investigated. For evaluating novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority: it allows interactions to be studied repeatedly in individual animals, with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging (MRI), including a presentation of their image qualities as well as a detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for treatment of a cleft-like maxillary bone defect was evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant bias for the bone volume (bias(Histo − MRI) = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias(Histo − MRI) = −6.73%, p ≪ 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative results: the bias of the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional, independent animal study. PMID:28666026
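
The bias reported in such a graphical concordance (Bland-Altman style) analysis is the mean of the paired differences, bracketed by limits of agreement. A minimal sketch of those core quantities (a generic implementation with invented numbers, not the authors' data or exact workflow):

```python
import numpy as np

def bland_altman(histo, mri):
    """Mean bias and 95% limits of agreement between paired measurements.

    bias = mean(histo - mri); the limits of agreement are bias +/- 1.96
    standard deviations of the differences.  A bias significantly
    different from zero indicates a systematic offset between methods.
    """
    diff = np.asarray(histo, dtype=float) - np.asarray(mri, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired bone-volume measurements (%) from the two modalities
histo = np.array([52.1, 48.9, 55.3, 50.2, 47.8])
mri   = np.array([49.5, 46.8, 52.6, 48.0, 45.9])
bias, (lo, hi) = bland_altman(histo, mri)
print(bias)  # 2.3: histology reads systematically higher here
```

Plotting each difference against the pair mean, with horizontal lines at the bias and the two limits, gives the familiar concordance graph.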

  9. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments of the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. The Classical model was used to weight the experts' assessments in order to construct a single distribution per variable; in this model, each expert's weight is typically based on performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, yielded valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at eliciting quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.

  10. Influence of neighbourhood information on 'Local Climate Zone' mapping in heterogeneous cities

    NASA Astrophysics Data System (ADS)

    Verdonck, Marie-Leen; Okujeni, Akpona; van der Linden, Sebastian; Demuzere, Matthias; De Wulf, Robert; Van Coillie, Frieke

    2017-10-01

Local climate zone (LCZ) mapping is an emerging field in urban climate research. LCZs potentially provide an objective framework to assess urban form and function worldwide. The scheme is currently being used to map LCZs globally as part of the World Urban Database and Access Portal Tools (WUDAPT) initiative. So far, most LCZ maps lack proper quantitative assessment, challenging the generic character of the WUDAPT workflow. Using the standard method introduced by the WUDAPT community, difficulties arose concerning the built zones due to high levels of heterogeneity. To overcome this problem, a contextual classifier is adopted in the mapping process. This paper quantitatively assesses the influence of neighbourhood information on the LCZ mapping results of three cities in Belgium: Antwerp, Brussels, and Ghent. Overall accuracies for the standard maps were 85.7 ± 0.5, 79.6 ± 0.9, and 90.2 ± 0.4%, respectively. The approach presented here yields overall accuracies of 93.6 ± 0.2, 92.6 ± 0.3, and 95.6 ± 0.3% for Antwerp, Brussels, and Ghent. The results thus indicate a positive influence of neighbourhood information for all study areas, with increases in overall accuracy of 7.9, 13.0, and 5.4 percentage points. This paper reaches two main conclusions. Firstly, it introduces evidence on the relevance of a quantitative accuracy assessment in LCZ mapping, showing that the accuracies reported in previous papers are not easily achieved. Secondly, the method presented in this paper proves highly effective in Belgian cities and, given its open character, shows promise for application in other heterogeneous cities worldwide.
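
Overall accuracy, the metric reported above, is simply the trace of the confusion matrix divided by its total count; a minimal sketch with hypothetical two-class counts (not the paper's data):

```python
import numpy as np

def overall_accuracy(conf):
    """Fraction of correctly classified samples: diagonal sum / total."""
    conf = np.asarray(conf)
    return conf.trace() / conf.sum()

# hypothetical confusion matrices (rows: reference, cols: predicted)
standard   = overall_accuracy([[70, 30], [10, 90]])  # 0.800
contextual = overall_accuracy([[88, 12], [5, 95]])   # 0.915
gain_pp = 100 * (contextual - standard)              # gain in percentage points
```

Reporting the gain in percentage points (here 11.5) rather than as a relative change avoids ambiguity when comparing classifier variants.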

  11. Instructional Technology and the Post-Test Results of College Learners

    ERIC Educational Resources Information Center

    Pagan-Melendez, Juan

    2012-01-01

The problem in the present quasi-experimental research design was the poor English communication skills of college students enrolled in first-year English as a second language (ESL) courses in Puerto Rico. The purpose of this quantitative study was to compare learning outcomes between a first-year ESL class taught with the…

  12. Closing the Loop: Using Assessment Results to Modify the Curriculum so That Student Quantitative Reasoning Skills Are Enhanced

    ERIC Educational Resources Information Center

    Johnson, Lynn

    2012-01-01

    Assurance of student learning through effective assessment has become increasingly important over the past decade as accrediting agencies now require documented efforts to measure and improve student performance. This paper presents the methodology used by the College of Business Administration at California State University, Stanislaus to assess…

  13. Rapid Forgetting of Social Transmission of Food Preferences in Aged Rats: Relationship to Hippocampal CREB Activation

    ERIC Educational Resources Information Center

    Countryman, Renee A.; Gold, Paul E.

    2007-01-01

    A major characteristic of age-related changes in memory in rodents is an increase in the rate of forgetting of new information, even when tests given soon after training reveal intact memory. Interference with CREB functions similarly results in rapid decay of memory. Using quantitative immunocytochemistry, the present experiment examined the…

  14. Group specific quantitative real-time polymerase chain reaction (qRT-PCR) analysis of methanogenic archaea in stored swine manure

    USDA-ARS?s Scientific Manuscript database

    Consolidated storage of swine manure is associated with the production of a variety of odors and emissions which result from anaerobic digestion of materials present in the manure. Methanogenic archaea are a diverse group of anaerobic microorganisms responsible for the production of methane. In th...

  15. Determination of methadone hydrochloride in a maintenance dosage formulation.

    PubMed

    Hoffmann, T J; Thompson, R D

    1975-07-01

A colorimetric method for the direct quantitative assay of methadone hydrochloride in liquid oral dosage forms is presented. The procedure involves the formation of a dye complex with bromothymol blue buffer solution. The resulting complex is extracted with benzene and measured spectrophotometrically. Duplicate tests on the formulation recovered 99.2% of the labeled amount of methadone.

  16. The Quantitative Resolution of a Mixture of Group II Metal Ions by Thermometric Titration with EDTA. An Analytical Chemistry Experiment.

    ERIC Educational Resources Information Center

    Smith, Robert L.; Popham, Ronald E.

    1983-01-01

    Presents an experiment in thermometric titration used in an analytic chemistry-chemical instrumentation course, consisting of two titrations, one a mixture of calcium and magnesium, the other of calcium, magnesium, and barium ions. Provides equipment and solutions list/specifications, graphs, and discussion of results. (JM)

  17. An Assessment of Need for Developing and Implementing Technical and Skilled Worker Training for the Solar Energy Industry.

    ERIC Educational Resources Information Center

    Orsak, Charles G.; And Others

    A Navarro College, Texas, study determined the quantitative and qualitative needs for developing skilled manpower for the solar industry and secondarily identified the (present) solar industry manpower populations and tasks performed by solar technical and skilled workers. Results from three initial working groups addressing equipment, market…

  18. East Asian International Students and Psychological Well-Being: A Systematic Review

    ERIC Educational Resources Information Center

    Li, Jiaqi; Wang, Yanlin; Xiao, Feiya

    2014-01-01

    The present article reports a systematic review of the studies related to psychological well-being among East Asian international students. A total of 18 quantitative studies published in peer-reviewed journals from 2000 to 2011 were reviewed. Our review revealed three major results: (1) a majority of researchers (n = 13, 72.2%) tend to choose…

  19. 77 FR 31964 - Energy Conservation Program: Energy Conservation Standards for Residential Dishwashers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    ... year used for discounting the NPV of total consumer costs and savings, for the time-series of costs and... does not imply that the time-series of cost and benefits from which the annualized values were... each TSL, DOE has included tables that present a summary of the results of DOE's quantitative analysis...

  20. An Outcome Evaluation of the Success for Kids Program. Technical Report

    ERIC Educational Resources Information Center

    Maestas, Nicole; Gaillot, Sarah

    2010-01-01

    This report presents results from a multisite, quantitative evaluation of the international Success for Kids (SFK) after-school program. The program seeks to build resilience in children by teaching them to access inner resources and build positive connections with others. The SFK program is unlike most after-school programs both in its focus on…

  1. Assisted migration of forest populations for adapting trees to climate change

    Treesearch

    Cuauhtémoc Sáenz-Romero; Roberto A. Lindig-Cisneros; Dennis G. Joyce; Jean Beaulieu; J. Bradley St. Clair; Barry C. Jaquish

    2016-01-01

    We present evidence that climatic change is an ongoing process and that forest tree populations are genetically differentiated for quantitative traits because of adaptation to specific habitats. We discuss in detail indications that the shift of suitable climatic habitat for forest tree species and populations, as a result of rapid climatic change, is likely to cause...

  2. Creating a Communicative Language Teaching Environment for Improving Students' Communicative Competence at EFL/EAP University Level

    ERIC Educational Resources Information Center

    Farooq, Muhammad U.

    2015-01-01

    The present research focuses on teachers' perceptions and practices regarding Communicative Language Teaching (CLT) and its impact on communicative competency of the students. A questionnaire was used to collect the quantitative data from teachers. The results show that the EFL teachers are aware of the CLT characteristics, its implementation and…

  3. Elementary Analysis of the Special Relativistic Combination of Velocities, Wigner Rotation and Thomas Precession

    ERIC Educational Resources Information Center

    O'Donnell, Kane; Visser, Matt

    2011-01-01

    The purpose of this paper is to provide an elementary introduction to the qualitative and quantitative results of velocity combination in special relativity, including the Wigner rotation and Thomas precession. We utilize only the most familiar tools of special relativity, in arguments presented at three differing levels: (1) utterly elementary,…

  4. The Emergence of the Controversy around the Theory of Evolution and Creationism in UK Newspaper Reports

    ERIC Educational Resources Information Center

    Allgaier, Joachim; Holliman, Richard

    2006-01-01

    The question of whether religious explanations about the origin of life should be taught alongside scientific accounts in compulsory science education has sparked controversy in several countries for decades. An important site for these controversies is media reporting. This article presents the results of a quantitative and qualitative analysis…

  5. Teaching Digital Natives: 3-D Virtual Science Lab in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Franklin, Teresa J.

    2008-01-01

    This paper presents the development of a 3-D virtual environment in Second Life for the delivery of standards-based science content for middle school students in the rural Appalachian region of Southeast Ohio. A mixed method approach in which quantitative results of improved student learning and qualitative observations of implementation within…

  6. Roles of additives and surface control in slurry atomization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, S.C.

    1990-03-01

This quarterly report describes a quantitative correlation between the flow behavior index of a micronized coal slurry and the interparticle van der Waals attraction force as measured by the Hamaker constant. Preliminary results on the effects of interparticle electrostatic repulsion and of liquid viscosity on both the flow behavior and the relative viscosity are also presented.

  7. A Study of Early Environmental Education Experiences: Can We Legislate Concern and Understanding of the Natural World?

    ERIC Educational Resources Information Center

    Ramey, Linda K.

    2008-01-01

    When we think of environmental education, we often remember summer camps and scouting. This ongoing study examines childhood experiences and the potential impact of those experiences on fostering a caring concern for the environment. Results, obtained using mixed methodology (quantitative and qualitative techniques), indicate trends present in…

  8. The Impact of Various Quizzing Patterns on the Test Performance of High School Economics Students

    ERIC Educational Resources Information Center

    Robertson, William L.

    2010-01-01

    Presenting college students, in a wide variety of content areas, with frequent announced and unannounced quizzes appears to correlate positively with enhanced test performance. The purpose of this quantitative study was to examine if similar results can be achieved with high school students in a standard economics class. Based on a theoretical…

  9. Textbook Publishers' Website Objective Question Banks: Does Their Use Improve Students' Examination Performance?

    ERIC Educational Resources Information Center

    Johnston, Scott Paul; Huczynski, Andrzej

    2006-01-01

    This article presents the findings of a survey of students' usage of the objective question bank section of an academic publisher's textbook website. The findings are based on a survey of 239 business and management undergraduates conducted using a quantitative research methodology. The results suggest that increased use of the objective question…

  10. Reliability and safety, and the risk of construction damage in mining areas

    NASA Astrophysics Data System (ADS)

    Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz

    2018-04-01

    This article concerns the reliability and safety of building structures in mining areas, with a particular emphasis on the quantitative risk analysis of buildings. The issues of threat assessment and risk estimation, in the design of facilities in mining exploitation areas, are presented here, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. This article presents the concept of quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated through an example of a construction located within an area affected by mining exploitation.

  11. Three pedagogical approaches to introductory physics labs and their effects on student learning outcomes

    NASA Astrophysics Data System (ADS)

    Chambers, Timothy

    This dissertation presents the results of an experiment that measured the learning outcomes associated with three different pedagogical approaches to introductory physics labs. These three pedagogical approaches presented students with the same apparatus and covered the same physics content, but used different lab manuals to guide students through distinct cognitive processes in conducting their laboratory investigations. We administered post-tests containing multiple-choice conceptual questions and free-response quantitative problems one week after students completed these laboratory investigations. In addition, we collected data from the laboratory practical exam taken by students at the end of the semester. Using these data sets, we compared the learning outcomes for the three curricula in three dimensions of ability: conceptual understanding, quantitative problem-solving skill, and laboratory skills. Our three pedagogical approaches are as follows. Guided labs lead students through their investigations via a combination of Socratic-style questioning and direct instruction, while students record their data and answers to written questions in the manual during the experiment. Traditional labs provide detailed written instructions, which students follow to complete the lab objectives. Open labs provide students with a set of apparatus and a question to be answered, and leave students to devise and execute an experiment to answer the question. In general, we find that students performing Guided labs perform better on some conceptual assessment items, and that students performing Open labs perform significantly better on experimental tasks. Combining a classical test theory analysis of post-test results with in-lab classroom observations allows us to identify individual components of the laboratory manuals and investigations that are likely to have influenced the observed differences in learning outcomes associated with the different pedagogical approaches. 
Due to the novel nature of this research and the large number of item-level results we produced, we recommend additional research to determine the reproducibility of our results. Analyzing the data with item response theory yields additional information about the performance of our students on both conceptual questions and quantitative problems. We find that performing lab activities on a topic does lead to better-than-expected performance on some conceptual questions regardless of pedagogical approach, but that this acquired conceptual understanding is strongly context-dependent. The results also suggest that a single "Newtonian reasoning ability" is inadequate to explain student response patterns to items from the Force Concept Inventory. We develop a framework for applying polytomous item response theory to the analysis of quantitative free-response problems and for analyzing how features of student solutions are influenced by problem-solving ability. Patterns in how students at different abilities approach our post-test problems are revealed, and we find hints as to how features of a free-response problem influence its item parameters. The item-response theory framework we develop provides a foundation for future development of quantitative free-response research instruments. Chapter 1 of the dissertation presents a brief history of physics education research and motivates the present study. Chapter 2 describes our experimental methodology and discusses the treatments applied to students and the instruments used to measure their learning. Chapter 3 provides an introduction to the statistical and analytical methods used in our data analysis. Chapter 4 presents the full data set, analyzed using both classical test theory and item response theory. Chapter 5 contains a discussion of the implications of our results and a data-driven analysis of our experimental methods. 
Chapter 6 describes the importance of this work to the field and discusses the relevance of our research to curriculum development and to future work in physics education research.
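
    The item response theory analyses described above build on item response functions. A minimal sketch of the two-parameter-logistic (2PL) model is shown below; the item parameters are illustrative placeholders, not values from the dissertation:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter-logistic item response function.

    Returns P(correct | theta) for an item with discrimination a and
    difficulty b, where theta is the examinee's latent ability.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the model predicts a 50% chance of a correct response.
p_at_difficulty = irt_2pl(theta=0.5, a=1.2, b=0.5)
```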

  12. The DUV Stability of Superlattice-Doped CMOS Detector Arrays

    NASA Technical Reports Server (NTRS)

    Hoenk, M. E.; Carver, A.; Jones, T.; Dickie, M.; Cheng, P.; Greer, H. F.; Nikzad, S.; Sgro, J.

    2013-01-01

    In this paper, we present experimental results and band structure calculations that illuminate the unique properties of superlattice-doped detectors. Numerical band structure calculations are presented to analyze the dependencies of surface passivation on dopant profiles and interface trap densities (Figure 3). Experiments and calculations show that quantum-engineered surfaces, grown at JPL by low temperature molecular beam epitaxy, achieve a qualitative as well as quantitative uniqueness in their near-immunity to high densities of surface and interface traps.

  13. Studies of Altered Response to Infection Induced by Severe Injury.

    DTIC Science & Technology

    1994-11-15

    Quantitation of the original specific RNA was determined by comparison of target and MIMIC band intensities and was standardized by G3PDH quantities. RESULTS... housekeeping gene G3PDH (Fig. 1). We compared the amount of mRNA present after 3 hours of stimulation to the amount of bioactivity present after 16 hours... [Figure 1 residue: competitive PCR with G3PDH primers; MIMIC dilution series of 100, 10, 1, and 0.1 attomole; gel band densitometry, reading, and graphing]

  14. A color video display technique for flow field surveys

    NASA Technical Reports Server (NTRS)

    Winkelmann, A. E.; Tsao, C. P.

    1982-01-01

    A computer-driven color video display technique has been developed for the presentation of wind tunnel flow field survey data. The results of both qualitative and quantitative flow field surveys can be presented in high-spatial-resolution, color-coded displays. The technique has been used for data obtained with a hot-wire probe, a split-film probe, a Conrad (pitch) probe, and a 5-tube pressure probe in surveys above and behind a wing with partially stalled and fully stalled flow.

  15. Luminous Phenomena in the Atmosphere. A New Frontier of New Physics?

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    1999-03-01

    A list of the main geographic areas of the world in which anomalous atmospheric light phenomena recur is presented. In particular, the Norwegian light phenomenon occurring in Hessdalen, a prototypical event of this class, is described in great detail. Results obtained in 1984 by the Norwegian scientific organization 'Project Hessdalen' are discussed, and the present status and future projects of this organization are presented. It is also shown how the research philosophy of Project Hessdalen can be adapted to the quantitative investigation of similar light phenomena in other parts of the world. Subsequently, the numerical analysis carried out by the author on the Project Hessdalen 1984 data is shown in detail. After illustrating the several physical theories that have been proposed so far to explain the light phenomenon, strong emphasis is placed on the quantitative definition of instrumental prerequisites and measurable physical parameters. A strategy aimed at defining the investigation methodology and instrumented monitoring in Italian areas where the light phenomenon recurs is presented. An introduction is also given to the documented effects of interaction between the electromagnetic field produced by the light phenomenon and the brain electrical activity of people, suggesting possible biophysical causes.

  16. Comparing Online with Brick and Mortar Course Learning Outcomes: An Analysis of Quantitative Methods Curriculum in Public Administration

    ERIC Educational Resources Information Center

    Harris, Ronald A.; Nikitenko, Gleb O.

    2014-01-01

    Teaching graduate students in an intensive adult-learning format presents a special challenge for quantitative analytical competencies. Students often lack necessary background, skills and motivation to deal with quantitative-skill-based course work. This study compares learning outcomes for graduate students enrolled in three course sections…

  17. Multi-modality imaging of tumor phenotype and response to therapy

    NASA Astrophysics Data System (ADS)

    Nyflot, Matthew J.

    2011-12-01

    Imaging and radiation oncology have historically been closely linked. However, the vast majority of techniques used in the clinic involve anatomical imaging. Biological imaging offers the potential for innovation in the areas of cancer diagnosis and staging, radiotherapy target definition, and treatment response assessment. Some relevant imaging techniques are FDG PET (for imaging cellular metabolism), FLT PET (proliferation), CuATSM PET (hypoxia), and contrast-enhanced CT (vasculature and perfusion). Here, a technique for quantitative spatial correlation of tumor phenotype is presented for FDG PET, FLT PET, and CuATSM PET images. Additionally, multimodality imaging of treatment response with FLT PET, CuATSM, and dynamic contrast-enhanced CT is presented, in a trial of patients receiving an antiangiogenic agent (Avastin) combined with cisplatin and radiotherapy. Results are also presented for translational applications in animal models, including quantitative assessment of proliferative response to cetuximab with FLT PET and quantification of vascular volume with a blood-pool contrast agent (Fenestra). These techniques have clear applications to radiobiological research and optimized treatment strategies, and may eventually be used for personalized therapy for patients.

  18. Rationalising the 'irrational': a think aloud study of discrete choice experiment responses.

    PubMed

    Ryan, Mandy; Watson, Verity; Entwistle, Vikki

    2009-03-01

    Stated preference methods assume respondents' preferences are consistent with utility theory, but many empirical studies report evidence of preferences that violate utility theory. This evidence is often derived from quantitative tests that occur naturally within, or are added to, stated preference tasks. In this study, we use qualitative methods to explore three axioms of utility theory: completeness, monotonicity, and continuity. We take a novel approach, adopting a 'think aloud' technique to identify violations of the axioms of utility theory and to consider how well the quantitative tests incorporated within a discrete choice experiment are able to detect these. Results indicate that quantitative tests classify respondents as being 'irrational' when qualitative statements would indicate they are 'rational'. In particular, 'non-monotonic' responses can often be explained by respondents inferring additional information beyond what is presented in the task, and individuals who appear to adopt non-compensatory decision-making strategies do so because they rate particular attributes very highly (they are not attempting to simplify the task). The results also provide evidence of 'cost-based responses': respondents assumed tests with higher costs would be of higher quality. The value of including in-depth qualitative validation techniques in the development of stated preference tasks is shown.

  19. Identification of Lactobacillus delbrueckii and Streptococcus thermophilus Strains Present in Artisanal Raw Cow Milk Cheese Using Real-time PCR and Classic Plate Count Methods.

    PubMed

    Stachelska, Milena A

    2017-12-04

    The aim of this paper was to detect Lactobacillus delbrueckii and Streptococcus thermophilus using a real-time quantitative PCR assay in 7-day ripening cheese produced from unpasteurised milk. Real-time quantitative PCR assays were designed to identify and enumerate the chosen species of lactic acid bacteria (LAB) in ripened cheese. The results of molecular quantification and classic bacterial enumeration showed a high level of similarity, proving that DNA extraction was carried out properly and that genomic DNA solutions were free of PCR inhibitors. These methods revealed the presence of L. delbrueckii and S. thermophilus. The real-time PCR enabled quantification with a detection limit of 10^1-10^3 CFU/g of product. qPCR standard curves were linear over seven log units down to 10^1 copies per reaction; efficiencies ranged from 77.9% to 93.6%. Cheese samples were analysed with the plate count method and qPCR in parallel. Compared with the classic plate count method, the newly developed qPCR method provided faster and species-specific identification of the two dairy LAB and yielded comparable quantitative results.
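
    The reported amplification efficiencies (77.9% to 93.6%) are conventionally derived from the slope of the standard curve via E = 10^(-1/slope) - 1. A sketch with a hypothetical dilution series (not the study's own data):

```python
def qpcr_efficiency(log10_copies, cq_values):
    """Amplification efficiency from a qPCR standard curve.

    Fits Cq = slope * log10(copies) + intercept by least squares,
    then applies E = 10**(-1/slope) - 1 (E = 1.0 means 100% efficiency).
    """
    n = len(log10_copies)
    mean_x = sum(log10_copies) / n
    mean_y = sum(cq_values) / n
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(log10_copies, cq_values))
    sxx = sum((x - mean_x) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical dilution series: 10^1..10^7 copies, Cq dropping 3.5 cycles/decade
dilutions = [1, 2, 3, 4, 5, 6, 7]
cqs = [35.0, 31.5, 28.0, 24.5, 21.0, 17.5, 14.0]
eff = qpcr_efficiency(dilutions, cqs)
```

    A slope of -3.5 cycles per decade corresponds to roughly 93% efficiency, near the upper end of the reported range.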

  20. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    PubMed

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.

  1. A TEM quantitative evaluation of strengthening in an Mg-RE alloy reinforced with SiC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabibbo, Marcello, E-mail: m.cabibbo@univpm.it; Spigarelli, Stefano

    2011-10-15

    Magnesium alloys containing rare earth elements are known to have high specific strength and good creep and corrosion resistance up to 523 K. The addition of SiC ceramic particles strengthens the metal matrix composite, resulting in better wear and creep resistance while maintaining good machinability. The role of the reinforcement particles in enhancing strength can be quantitatively evaluated using transmission electron microscopy (TEM). This paper presents a quantitative evaluation of the different strengthening contributions, determined through TEM inspections, in an SiC Mg-RE composite alloy containing yttrium, neodymium, gadolinium and dysprosium. Compression tests at temperatures ranging between 290 and 573 K were carried out. The microstructure strengthening mechanism was studied for all the compression conditions. Strengthening was compared to the mechanical results, and the way the different contributions were combined is also discussed and justified. Research Highlights: TEM evaluation of yield strengthening terms in an Mg-RE SiC alloy; the evaluation was extended to different compression temperature conditions; linear and quadratic summation were proposed and validated; Hall-Petch was found to be the most prominent strengthening contribution.
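
    The linear and quadratic summation of strengthening contributions mentioned in the highlights can be sketched as follows; the Hall-Petch constants and the individual contributions below are illustrative placeholders, not values from the paper:

```python
import math

def hall_petch(sigma0_mpa, k_mpa_um, d_um):
    """Hall-Petch grain-boundary strengthening: sigma = sigma0 + k / sqrt(d).

    Works in MPa with grain size d in micrometres and k in MPa*um^0.5.
    """
    return sigma0_mpa + k_mpa_um / math.sqrt(d_um)

def combine_linear(contributions):
    """Linear superposition of strengthening terms (MPa), an upper bound."""
    return sum(contributions)

def combine_quadratic(contributions):
    """Root-sum-square (quadratic) superposition (MPa), a lower estimate."""
    return math.sqrt(sum(c * c for c in contributions))

# Illustrative constants for a Mg alloy: sigma0 = 30 MPa, k = 280 MPa*um^0.5,
# grain size 10 um, plus invented Orowan/dislocation/solid-solution terms.
hp = hall_petch(30.0, 280.0, 10.0)
terms = [hp, 45.0, 30.0, 25.0]
linear = combine_linear(terms)
quadratic = combine_quadratic(terms)
```

    The two combination rules bracket the expected yield strength; the quadratic sum dominates when one term (here Hall-Petch) is much larger than the rest.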

  2. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    PubMed

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within the countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence is clearly highly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is, a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods make it possible to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  3. [Mixed methods research in public health: issues and illustration].

    PubMed

    Guével, Marie-Renée; Pommier, Jeanine

    2012-01-01

    For many years, researchers in a range of fields have combined quantitative and qualitative methods. However, the combined use of quantitative and qualitative methods has only recently been conceptualized and defined as mixed methods research. Some authors have described the emerging field as a third methodological tradition (in addition to the qualitative and quantitative traditions). Mixed methods research combines different perspectives and facilitates the study of complex interventions or programs, particularly in public health, an area where interdisciplinarity is critical. However, the existing literature is primarily in English. By contrast, the literature in French remains limited. The purpose of this paper is to present the emergence of mixed methods research for francophone public health specialists. A literature review was conducted to identify the main characteristics of mixed methods research. The results provide an overall picture of the mixed methods approach through its history, definitions, and applications, and highlight the tools developed to clarify the approach (typologies) and to implement it (integration of results and quality standards). The tools highlighted in the literature review are illustrated by a study conducted in France. Mixed methods research opens new possibilities for examining complex research questions and provides relevant and promising opportunities for addressing current public health issues in France.

  4. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 % with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
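
    The reported sensitivity, specificity, predictive values, and accuracy all follow from a standard 2x2 confusion matrix. The counts below are hypothetical, chosen only to approximate the reported pattern (the study does not publish its raw table here):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard 2x2 diagnostic-test metrics, returned as percentages:
    (sensitivity, specificity, PPV, NPV, accuracy)."""
    sensitivity = 100.0 * tp / (tp + fn)
    specificity = 100.0 * tn / (tn + fp)
    ppv = 100.0 * tp / (tp + fp)
    npv = 100.0 * tn / (tn + fn)
    accuracy = 100.0 * (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical counts for 305 subjects mirroring the reported figures
m = diagnostic_metrics(tp=70, fp=8, tn=227, fn=0)
```

    With zero false negatives, sensitivity and NPV are exactly 100 %, matching the abstract.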

  5. Quantification of Microbial Phenotypes

    PubMed Central

    Martínez, Verónica S.; Krömer, Jens O.

    2016-01-01

    Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and the current challenges to generate fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
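
    Estimating Gibbs energy under physiological conditions, as the review describes, typically applies dG = dG0 + R*T*ln(Q). A generic sketch with illustrative concentrations (the reaction and all numbers are invented, not taken from the review):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def gibbs_energy(delta_g0, product_concs, reactant_concs, temp_k=310.15):
    """Reaction Gibbs energy under given metabolite concentrations.

    Applies dG = dG0 + R*T*ln(Q), with Q the reaction quotient built
    from product and reactant concentrations in mol/L (unit
    stoichiometry assumed for simplicity).
    """
    q = 1.0
    for c in product_concs:
        q *= c
    for c in reactant_concs:
        q /= c
    return delta_g0 + R * temp_k * math.log(q)

# Hypothetical reaction A -> B with dG0 = +5 kJ/mol, [B] = 0.1 mM, [A] = 10 mM
dg = gibbs_energy(5000.0, [1e-4], [1e-2])  # J/mol
```

    The example shows the point the review makes about directionality: a reaction with positive standard Gibbs energy can still run forward when concentrations make the quotient small enough.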

  6. Barriers and Delays in Tuberculosis Diagnosis and Treatment Services: Does Gender Matter?

    PubMed Central

    Yang, Wei-Teng; Gounder, Celine R.; Akande, Tokunbo; De Neve, Jan-Walter; McIntire, Katherine N.; Chandrasekhar, Aditya; de Lima Pereira, Alan; Gummadi, Naveen; Samanta, Santanu; Gupta, Amita

    2014-01-01

    Background. Tuberculosis (TB) remains a global public health problem with known gender-related disparities. We reviewed the quantitative evidence for gender-related differences in accessing TB services from symptom onset to treatment initiation. Methods. Following a systematic review process, we: searched 12 electronic databases; included quantitative studies assessing gender differences in accessing TB diagnostic and treatment services; abstracted data; and assessed study validity. We defined barriers and delays at the individual and provider/system levels using a conceptual framework of the TB care continuum and examined gender-related differences. Results. Among 13,448 articles, 137 were included: many assessed individual-level barriers (52%) and delays (42%), 76% surveyed persons presenting for care with diagnosed or suspected TB, 24% surveyed community members, and two-thirds were from African and Asian regions. Many studies reported no gender differences. Among studies reporting disparities, women faced greater barriers (financial: 64% versus 36%; physical: 100% versus 0%; stigma: 85% versus 15%; health literacy: 67% versus 33%; and provider-/system-level: 100% versus 0%) and longer delays (presentation to diagnosis: 45% versus 0%) than men. Conclusions. Many studies found no quantitative gender-related differences in barriers and delays limiting access to TB services. When differences were identified, women experienced greater barriers and longer delays than men. PMID:24876956

  7. Barriers and delays in tuberculosis diagnosis and treatment services: does gender matter?

    PubMed

    Yang, Wei-Teng; Gounder, Celine R; Akande, Tokunbo; De Neve, Jan-Walter; McIntire, Katherine N; Chandrasekhar, Aditya; de Lima Pereira, Alan; Gummadi, Naveen; Samanta, Santanu; Gupta, Amita

    2014-01-01

    Background. Tuberculosis (TB) remains a global public health problem with known gender-related disparities. We reviewed the quantitative evidence for gender-related differences in accessing TB services from symptom onset to treatment initiation. Methods. Following a systematic review process, we: searched 12 electronic databases; included quantitative studies assessing gender differences in accessing TB diagnostic and treatment services; abstracted data; and assessed study validity. We defined barriers and delays at the individual and provider/system levels using a conceptual framework of the TB care continuum and examined gender-related differences. Results. Among 13,448 articles, 137 were included: many assessed individual-level barriers (52%) and delays (42%), 76% surveyed persons presenting for care with diagnosed or suspected TB, 24% surveyed community members, and two-thirds were from African and Asian regions. Many studies reported no gender differences. Among studies reporting disparities, women faced greater barriers (financial: 64% versus 36%; physical: 100% versus 0%; stigma: 85% versus 15%; health literacy: 67% versus 33%; and provider-/system-level: 100% versus 0%) and longer delays (presentation to diagnosis: 45% versus 0%) than men. Conclusions. Many studies found no quantitative gender-related differences in barriers and delays limiting access to TB services. When differences were identified, women experienced greater barriers and longer delays than men.

  8. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which makes traditional analysis methods difficult to apply. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) is the second, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analysis, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.
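
    The EHS-FSA model builds on ordinary finite state automata over service lifecycle states. A toy FSA over the lifecycle stages named in the abstract is sketched below; the transition table is invented for illustration and is not the paper's EHS-FSA:

```python
class ServiceFSA:
    """Minimal finite state automaton over a service's lifecycle.

    State and event names echo the stages listed in the abstract
    (producing, publishing, calling, maintenance); the transitions
    themselves are hypothetical.
    """

    TRANSITIONS = {
        ("produced", "publish"): "published",
        ("published", "call"): "in_use",
        ("in_use", "call"): "in_use",
        ("in_use", "maintain"): "maintenance",
        ("maintenance", "publish"): "published",
    }

    def __init__(self):
        self.state = "produced"

    def fire(self, event):
        # Reject events that are inconsistent with the current state.
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"no transition for {key}")
        self.state = self.TRANSITIONS[key]
        return self.state

fsa = ServiceFSA()
for event in ["publish", "call", "maintain", "publish"]:
    fsa.fire(event)
```

    Rejecting undefined (state, event) pairs is what lets an automaton of this kind flag inconsistent evolution behaviors.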

  9. Quantitative study of Xanthosoma violaceum leaf surfaces using RIMAPS and variogram techniques.

    PubMed

    Favret, Eduardo A; Fuentes, Néstor O; Molina, Ana M

    2006-08-01

    Two new imaging techniques, rotated image with maximum averaged power spectrum (RIMAPS) and the variogram, are presented for the study and description of leaf surfaces. Xanthosoma violaceum was analyzed to illustrate the characteristics of both techniques. Both produce a quantitative description of leaf surface topography. RIMAPS combines rotation of digitized images with the Fourier transform, and it is used to detect pattern orientation and characteristics of surface topography. The variogram relates the mathematical variance of a surface to the area of the sample window observed, giving the typical scale lengths of the surface patterns. RIMAPS detects the morphological variations of the surface topography pattern between fresh and dried (herbarium) samples of the leaf. The variogram method finds the characteristic dimensions of the leaf microstructure, i.e., cell length, papillae diameter, etc., showing that there are no significant differences between dry and fresh samples. The results obtained show the robustness of RIMAPS and variogram analyses to detect, distinguish, and characterize leaf surfaces, as well as to give scale lengths. Both techniques are tools for the biologist to study variations of the leaf surface when different patterns are present. The use of RIMAPS and the variogram opens a wide spectrum of possibilities by providing a systematic, quantitative description of leaf surface topography.
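
    The variogram idea of relating variance to the size of the observation window can be illustrated in one dimension. This is a simplified analogue of the technique, with synthetic profile data:

```python
def windowed_variance(values, window):
    """Mean variance over non-overlapping windows of a 1-D profile.

    A 1-D analogue of the variogram idea in the abstract: variance is
    computed inside windows of growing size, and characteristic scale
    lengths appear as changes in how variance grows with window size.
    """
    chunks = [values[i:i + window]
              for i in range(0, len(values) - window + 1, window)]
    out = []
    for c in chunks:
        m = sum(c) / len(c)
        out.append(sum((v - m) ** 2 for v in c) / len(c))
    return sum(out) / len(out)

# Synthetic periodic "surface profile" with period 4
profile = [0.0, 1.0, 0.0, -1.0] * 8
small = windowed_variance(profile, 2)
full = windowed_variance(profile, 4)
```

    Once the window spans a full period of the pattern, the variance saturates; that saturation scale is the kind of characteristic dimension (cell length, papillae diameter) the variogram extracts.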

  10. Refining Intervention Targets in Family-Based Research: Lessons From Quantitative Behavioral Genetics

    PubMed Central

    Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald

    2010-01-01

    The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273

  11. Combination of Liquid Chromatography with Multivariate Curve Resolution-Alternating Least-Squares (MCR-ALS) in the Quantitation of Polycyclic Aromatic Hydrocarbons Present in Paprika Samples.

    PubMed

    Monago-Maraña, Olga; Pérez, Rocío L; Escandar, Graciela M; Muñoz de la Peña, Arsenio; Galeano-Díaz, Teresa

    2016-11-02

    This work presents a strategy for quantitating polycyclic aromatic hydrocarbons (PAHs) in smoked paprika samples. For this, a liquid chromatographic method with fluorimetric detection (HPLC-FLD) was optimized. To resolve interferences co-eluting with the target analytes, the second-order multivariate curve resolution-alternating least-squares (MCR-ALS) algorithm was employed in combination with this liquid chromatographic method. Among the eight PAHs quantified (fluorene, phenanthrene, anthracene, pyrene, chrysene, benzo[a]anthracene, benzo[b]fluoranthene, and benzo[a]pyrene) by HPLC-FLD, only in the case of fluorene, pyrene, and benzo[b]fluoranthene was it necessary to apply the second-order algorithm for their resolution. Limits of detection and quantitation were between 0.015 and 0.45 mg/kg and between 0.15 and 1.5 mg/kg, respectively. Good recovery results (>80%) for paprika were obtained via the complete extraction procedure, consisting of an extraction from the matrix and cleanup of the extract by means of silica cartridges. Higher concentrations of chrysene, benzo[a]anthracene, benzo[b]fluoranthene, and benzo[a]pyrene were found in the paprika samples relative to the maximum amounts allowed for other spices under European Regulation (EU) N° 2015/1933.

  12. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which makes traditional analysis methods difficult to apply. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, noncomposition of atomic services (13.12%) is the second, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analysis, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward a higher level of consistency in SODSA. PMID:24772033

  13. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision-making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of the developed approach. The main advantage of the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest to decision makers as it provides rational and solid information.

  14. Identification and uncertainty estimation of vertical reflectivity profiles using a Lagrangian approach to support quantitative precipitation measurements by weather radar

    NASA Astrophysics Data System (ADS)

    Hazenberg, P.; Torfs, P. J. J. F.; Leijnse, H.; Delrieu, G.; Uijlenhoet, R.

    2013-09-01

    This paper presents a novel approach to estimate the vertical profile of reflectivity (VPR) from volumetric weather radar data using both a traditional Eulerian and a newly proposed Lagrangian implementation. For the latter, the recently developed Rotational Carpenter Square Cluster Algorithm (RoCaSCA) is used to delineate precipitation regions at different reflectivity levels. A piecewise linear VPR is estimated for either stratiform or neither stratiform/convective precipitation. As a second aspect of this paper, a novel approach is presented which is able to account for the impact of VPR uncertainty on the estimated radar rainfall variability. Results show that implementation of the VPR identification and correction procedure has a positive impact on quantitative precipitation estimates from radar. Unfortunately, visibility problems severely limit the impact of the Lagrangian implementation beyond distances of 100 km. However, by combining this procedure with the global Eulerian VPR estimation procedure for a given rainfall type (stratiform and neither stratiform/convective), the quality of the quantitative precipitation estimates increases up to a distance of 150 km. Analyses of the impact of VPR uncertainty show that this aspect accounts for a large fraction of the differences between weather radar rainfall estimates and rain gauge measurements.

  15. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of deep brain stimulation (DBS) includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by performing in situ processing on a smartphone instead of a computer. The system has proven reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician's decision-making when setting stimulation parameters.

  16. Beta Decay and the Origins of Biological Chirality

    NASA Astrophysics Data System (ADS)

    van House, James Christopher

    1984-06-01

    The amino acids and sugars on which terrestrial life is based show maximal optical activity; that is, with rare exceptions, they are composed of D-sugars in RNA and DNA and L-amino acids in proteins. Recent quantitative theoretical calculations suggest that the origin of this asymmetry can be causally explained by asymmetric radiolysis of initially racemic mixtures of D and L molecules by the longitudinally polarized electrons emitted in parity-violating nuclear beta decay. These same theories predict an asymmetry in the rate of orthopositronium formation, A_Ps, when low-energy positron beams with a net helicity form positronium in optically active molecules, and quantitatively connect A_Ps to asymmetric radiolysis. This thesis presents the results of a measurement of A_Ps in several D, L, and DL amino acids using a polarized low-energy positron beam. Limits of A_Ps < 3 x 10^-4 were set on the amino acids leucine, selenocystine, and thyroxine, sufficient to exclude part of the predicted range of A_Ps in the last two molecules. These experimental limits improve previous limits on asymmetric radiolysis by a factor of 10^6. A quantitative discussion of the connection between the above limits and the origin of optical activity in living organisms is presented. Details of the development of the high-intensity, highly polarized slow positron beam used in these measurements, and of the first use of remoderation to form a slow positron beam and provide a timing signal for the beam, are presented in the Appendices.

  17. Quantitative imaging technique using the layer-stripping algorithm

    NASA Astrophysics Data System (ADS)

    Beilina, L.

    2017-07-01

    We present the layer-stripping algorithm for the solution of the hyperbolic coefficient inverse problem (CIP). Our numerical examples show quantitative reconstruction of small tumor-like inclusions in two dimensions.

  18. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. An overview of technical considerations when using quantitative real-time PCR analysis of gene expression in human exercise research

    PubMed Central

    Yan, Xu; Bishop, David J.

    2018-01-01

    Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses between laboratories, and a lack of consistent quality control steps throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also tested some common mistakes made in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and isolated total RNA was preserved for up to one week at room temperature. As demonstrated by our data, use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
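
    The reference-gene normalisation discussed above is most often carried out with the comparative 2^-ΔΔCt (Livak) method. A minimal sketch with hypothetical Ct values (not data from the study), assuming ~100% amplification efficiency:

    ```python
    def delta_delta_ct(ct_target_treated, ct_ref_treated,
                       ct_target_control, ct_ref_control):
        """Relative expression by the 2^-ddCt (Livak) method, assuming
        ~100% PCR efficiency for both target and reference genes."""
        d_treated = ct_target_treated - ct_ref_treated   # dCt, treated sample
        d_control = ct_target_control - ct_ref_control   # dCt, control sample
        return 2 ** -(d_treated - d_control)             # fold change vs. control

    # hypothetical Ct values: target drops 2 cycles after exercise, reference stable
    fold = delta_delta_ct(22.0, 18.0, 24.0, 18.0)  # → 4.0 (4-fold up-regulation)
    ```

    An unstable reference gene shifts the reference Ct between conditions and directly distorts the computed fold change, which is the failure mode the authors demonstrate: with the same target Cts but a reference that drifts by one cycle, the same call returns 2.0 instead of 4.0.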

  20. Light scattering application for quantitative estimation of apoptosis

    NASA Astrophysics Data System (ADS)

    Bilyy, Rostyslav O.; Stoika, Rostyslav S.; Getman, Vasyl B.; Bilyi, Olexander I.

    2004-05-01

    Estimation of cell proliferation and apoptosis is a focus of instrumental methods used in modern biomedical sciences. The present study concerns monitoring of the functional state of cells, specifically the development of their programmed death, or apoptosis. The available methods for this purpose are either very expensive or require time-consuming operations, and their specificity and sensitivity are frequently not sufficient for drawing conclusions that could be used in diagnostics or treatment monitoring. We propose a novel method for apoptosis measurement based on quantitative determination of the cellular functional state that takes the cells' physical characteristics into account. This method uses a patented device, the laser microparticle analyser PRM-6, to analyze light scattering by microparticles, including cells. The method provides quick, quantitative, simple (without complicated preliminary cell processing), and relatively cheap measurement of apoptosis in a cellular population. The elaborated method was used for studying apoptosis in murine leukemia cells of the L1210 line and human lymphoblastic leukemia cells of the K562 line. The results obtained by the proposed method permitted measuring the cell number in a tested sample and detecting and quantitatively characterizing the functional state of cells, particularly measuring the ratio of apoptotic cells in suspension.

  1. A further component analysis for illicit drugs mixtures with THz-TDS

    NASA Astrophysics Data System (ADS)

    Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui

    2009-07-01

    A new method for quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires fingerprints of all the pure chemical components, but in practice only the target components in a mixture and their absorption features are known, so a more practical detection and identification technique is needed. Our new method for quantitative inspection of illicit-drug mixtures is based on the derivative spectrum. With it, the ratio of target components in a mixture can be obtained on the assumption that all target components and their absorption features are known, without any knowledge of the unknown components. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental results verified the effectiveness of the method, suggesting that it could be an effective approach for quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world quantitative analysis of illicit drugs, and could be an effective method in the field of security and pharmaceutical inspection.
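
    One simple reading of the derivative-spectrum approach is a least-squares fit of the target component's derivative spectrum to the mixture's: differentiation suppresses the smooth baseline contributed by unknown adulterants, so only the known fingerprint needs to be supplied. The Gaussian fingerprint line and mixing fraction below are illustrative assumptions, not the paper's data:

    ```python
    import numpy as np

    def component_ratio(mixture, reference):
        """Estimate the fraction of a known component in a mixture spectrum
        by a least-squares fit of the component's derivative spectrum
        (differentiation suppresses slowly varying baselines)."""
        dm = np.gradient(mixture)
        dr = np.gradient(reference)
        return dm @ dr / (dr @ dr)   # least-squares scale factor

    # synthetic spectra: sharp absorption line plus a smooth adulterant baseline
    freq = np.linspace(0.2, 2.6, 500)             # THz axis (illustrative)
    drug = np.exp(-((freq - 1.2) / 0.05) ** 2)    # sharp fingerprint feature
    baseline = 0.3 + 0.1 * freq                   # featureless adulterant term
    mixture = 0.4 * drug + baseline

    ratio = component_ratio(mixture, drug)        # ≈ 0.4
    ```

    The linear baseline contributes only a constant to the derivative, which is nearly orthogonal to the symmetric fingerprint derivative, so the recovered fraction matches the true 0.4 despite the unknown adulterant.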

  2. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
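
    The nonlinear behavior described above can be modeled with a hyperbolic calibration: if a fraction k of the analyte signal contributes to the internal-standard channel, the observed ratio follows y = a·x/(1 + k·x), which is linear in reciprocal form. A sketch under assumed values (5% cross-talk, noise-free simulated data; not the authors' exact formulation):

    ```python
    import numpy as np

    # Model: observed analyte/IS ratio y = a*x / (1 + k*x), where k is the
    # fraction of analyte signal leaking into the internal-standard channel.
    # Reciprocal form: 1/y = (1/a)*(1/x) + k/a, i.e. linear in 1/x.

    x = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # true concentration ratios
    a_true, k_true = 1.0, 0.05                            # assumed 5% cross-talk
    y = a_true * x / (1.0 + k_true * x)                   # simulated observed ratios

    slope, intercept = np.polyfit(1.0 / x, 1.0 / y, 1)    # fit the reciprocal form
    a_fit = 1.0 / slope                                   # calibration parameter
    k_fit = intercept * a_fit                             # experimentally determined constant

    x_back = y / (a_fit - k_fit * y)   # invert the calibration to recover x
    ```

    On real, noisy data a weighted or direct nonlinear fit would be preferable to the reciprocal transform, but the sketch shows how one experimentally determined constant per analyte/internal-standard pair linearizes the otherwise curved calibration.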

  3. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method can currently provide elastography over the full skin depth. This paper presents a feasibility study of an optical coherence elastography method that combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis, and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated vibration while a line actuator excited SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that by combining the vibration and SAW methods, the full-skin bulk mechanical properties can be quantitatively measured and the elastography obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.

  4. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  5. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method, and the performance of the PMV machine predicted analytically is quantitatively compared with the finite element results. The analytical results agree well with the finite element method results. Finally, experimental results are given to further show the validity of the analysis.

  6. Prospects and challenges of quantitative phase imaging in tumor cell biology

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Götte, Martin; Greve, Burkhard; Ketelhut, Steffi

    2016-03-01

    Quantitative phase imaging (QPI) techniques provide high resolution label-free quantitative live cell imaging. Here, prospects and challenges of QPI in tumor cell biology are presented, using the example of digital holographic microscopy (DHM). It is shown that the evaluation of quantitative DHM phase images allows the retrieval of different parameter sets for quantification of cellular motion changes in migration and motility assays that are caused by genetic modifications. Furthermore, we demonstrate simultaneously label-free imaging of cell growth and morphology properties.

  7. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
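
    The two complementary analyses described above can be sketched directly. The paired measurements below are simulated with a deliberate proportional error (slope 1.1) and constant offset, purely for illustration:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean bias and 95% limits of agreement between two paired methods."""
        diff = a - b
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    def deming(x, y, lam=1.0):
        """Deming regression (error in both variables); lam is the ratio
        of the two methods' measurement-error variances."""
        mx, my = x.mean(), y.mean()
        sxx = ((x - mx) ** 2).sum()
        syy = ((y - my) ** 2).sum()
        sxy = ((x - mx) * (y - my)).sum()
        slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
                                           + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
        return slope, my - slope * mx  # (slope, intercept)

    # simulated comparison: new assay = 1.1 * reference + 0.5
    ref = np.linspace(1.0, 50.0, 25)
    new = 1.1 * ref + 0.5

    slope, intercept = deming(ref, new)      # slope != 1 flags proportional error
    bias, lo, hi = bland_altman(new, ref)    # nonzero bias flags systematic disagreement
    ```

    A Deming slope different from 1 together with a nonzero intercept quantifies exactly the proportional and constant errors that R² alone cannot reveal; the Bland-Altman bias and limits of agreement summarize the same disagreement in the measurement units of the assay.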

  8. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol

    PubMed Central

    Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    Objectives: To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. Methods: A phantom and 12 asymptomatic volunteers were imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). The imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged and signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in the closed position. Two readers independently assessed the visibility and delineation of anatomical structures of the TMJ and the overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. Results: The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle, as well as better overall image quality, at 3.0 T than at 1.5 T. Conclusions: The presented results indicate that the expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ. PMID:26371077

  9. Quantitative measurement of a candidate serum biomarker peptide derived from α2-HS-glycoprotein, and a preliminary trial of multidimensional peptide analysis in females with pregnancy-induced hypertension.

    PubMed

    Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko

    2018-03-01

    Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein(341-367)), owing to the difficulty of producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for its specific quantitation. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1(439-456)) and PDA044 (kininogen-1(438-456)) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not differ significantly between patients and controls, 3D plotting of the PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combination analysis using the PDA071 and PDA039/044 concentrations together allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.

  10. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

    Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found that 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity, and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in average cost savings of $1,303 and 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.

  11. SPECHT - single-stage phosphopeptide enrichment and stable-isotope chemical tagging: quantitative phosphoproteomics of insulin action in muscle.

    PubMed

    Kettenbach, Arminja N; Sano, Hiroyuki; Keller, Susanna R; Lienhard, Gustav E; Gerber, Scott A

    2015-01-30

    The study of cellular signaling remains a significant challenge for translational and clinical research. In particular, robust and accurate methods for quantitative phosphoproteomics in tissues and tumors represent significant hurdles for such efforts. In the present work, we design, implement and validate a method for single-stage phosphopeptide enrichment and stable isotope chemical tagging, or SPECHT, that enables the use of iTRAQ, TMT and/or reductive dimethyl-labeling strategies to be applied to phosphoproteomics experiments performed on primary tissue. We develop and validate our approach using reductive dimethyl-labeling and HeLa cells in culture, and find these results indistinguishable from data generated from more traditional SILAC-labeled HeLa cells mixed at the cell level. We apply the SPECHT approach to the quantitative analysis of insulin signaling in a murine myotube cell line and muscle tissue, identify known as well as new phosphorylation events, and validate these phosphorylation sites using phospho-specific antibodies. Taken together, our work validates chemical tagging post-single-stage phosphoenrichment as a general strategy for studying cellular signaling in primary tissues. Through the use of a quantitatively reproducible, proteome-wide phosphopeptide enrichment strategy, we demonstrated the feasibility of post-phosphopeptide purification chemical labeling and tagging as an enabling approach for quantitative phosphoproteomics of primary tissues. Using reductive dimethyl labeling as a generalized chemical tagging strategy, we compared the performance of post-phosphopeptide purification chemical tagging to the well established community standard, SILAC, in insulin-stimulated tissue culture cells. We then extended our method to the analysis of low-dose insulin signaling in murine muscle tissue, and report on the analytical and biological significance of our results. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Mass spectrometric real-time monitoring of an enzymatic phosphorylation assay using internal standards and data-handling freeware.

    PubMed

    Krappmann, Michael; de Boer, Arjen R; Kool, Daniël R W; Irth, Hubertus; Letzel, Thomas

    2016-04-30

Continuous-flow reaction detection systems, which monitor enzymatic reactions with mass spectrometry (MS), have so far lacked quantitative calibration. Therefore, two independent internal standards (IS) were implemented such that online system stability can be monitored, quantitative conversion values for substrate and product can be obtained, and the standards can double as mass calibration standards for high MS accuracy. An application previously developed for the MS detection of peptide phosphorylation by cAMP-dependent protein kinase A (PKA) (De Boer et al., Anal. Bioanal. Chem. 2005, 381, 647-655) was transferred to a continuous-flow reaction detection system. This enzymatic reaction, involving enzyme activation as well as the transfer of a phosphate group from ATP to a peptide substrate, was used to prove the compatibility of a quantitative enzymatic assay with a continuous-flow real-time system connected to MS. Moreover, using the internal standards, the critical parameter of reaction temperature (including temperature-dependent variations in solution density) was studied in the continuous-flow mixing system. Furthermore, two substrates (malantide and kemptide), two enzyme types (the catalytic subunit of PKA and complete PKA) and one inhibitor were tested to determine system robustness and long-term availability. Even spraying solutions that contained significant amounts of MS contaminants (e.g. the polluted catalytic subunit) resulted in quantifiable MS signal intensities. Subsequent recalculations using the internal standards led to results that demonstrate the power of this application. The presented methodology and data evaluation with the available Achroma freeware enable the direct coupling of biochemical assays with quantitative MS detection. Changes in temperature, reaction time, inhibition, or compound concentration can be monitored quantitatively, and thus enzymatic activity can be calculated. Copyright © 2016 John Wiley & Sons, Ltd.
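The role of the internal standards can be illustrated with a toy normalization step: analyte signals are divided by an IS signal measured in the same spectrum, which cancels spray and ionization drift before the conversion is computed. All intensity values below are invented for illustration.

```python
# Internal-standard normalization, as used in the continuous-flow assay.
# Intensities are hypothetical; equal ionization response is assumed here,
# whereas a real assay would apply per-compound response factors.
def normalized(signal, is_signal):
    """Normalize an analyte signal to the internal-standard signal."""
    return signal / is_signal

# hypothetical MS intensities at one time point
substrate, product, is1 = 4.0e5, 1.0e5, 2.0e5
s_norm = normalized(substrate, is1)
p_norm = normalized(product, is1)
conversion = p_norm / (s_norm + p_norm)    # fraction phosphorylated
print(f"conversion = {conversion:.0%}")
```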

  13. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects, and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
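The scoring and ensemble ideas described above can be sketched as follows. Random binary side-effect profiles and random weights stand in for real data, and three noisy per-feature predictions (substructures, targets, indications) stand in for the feature-specific models; the average-scoring ensemble simply averages them.

```python
import numpy as np

# Sketch of the weighted-score idea under assumed data: a drug's quantitative
# score is a weighted sum of its binary side-effect profile.
rng = np.random.default_rng(0)
n_drugs, n_effects = 5, 8
profiles = rng.integers(0, 2, size=(n_drugs, n_effects))  # 1 = effect present
weights = rng.random(n_effects)                            # empirical weights

scores = profiles @ weights        # quantitative score per drug

# Hypothetical per-feature predictions; in the paper each would come from a
# model trained on chemical substructures, targets, or indications.
pred_substruct = scores + rng.normal(0, 0.1, n_drugs)
pred_targets   = scores + rng.normal(0, 0.1, n_drugs)
pred_indic     = scores + rng.normal(0, 0.1, n_drugs)

# Average-scoring ensemble: the mean of the three feature-specific predictions.
ensemble = np.mean([pred_substruct, pred_targets, pred_indic], axis=0)
```

Averaging reduces the variance of the individual predictors, which is the intuition behind the ensemble outperforming any single feature combination.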

  14. A new systematic and quantitative approach to characterization of surface nanostructures using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Al-Mousa, Amjed A.

    Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. 
A comparison with other techniques, such as thresholding, watershed, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique using synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
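A minimal sketch of fuzzy point classification in the spirit of the Fuzzy Inference Engine: each surface point receives membership grades in "structure" and "background" from its height, and the larger grade decides the label. The triangular membership functions and their breakpoints are invented for illustration, not taken from the dissertation.

```python
# Toy fuzzy classification of AFM surface points by height (arbitrary units).
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(height):
    # Hypothetical membership functions for the two classes.
    background = tri(height, -1.0, 0.0, 1.5)
    structure = tri(height, 0.5, 3.0, 6.0)
    return "structure" if structure > background else "background"

print(classify(0.2), classify(2.8))
```

A full FIE would combine several inputs (height, local slope, neighborhood statistics) through fuzzy rules before defuzzification; this sketch keeps only the membership-and-compare core.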

  15. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of scarce research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?
We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
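One of the simplest quantitative bias analyses corrects a 2x2 table for nondifferential exposure misclassification: given assumed sensitivity and specificity of exposure classification, the observed exposed counts are back-calculated to the "true" counts before the odds ratio is recomputed. All counts and bias parameters below are illustrative, not drawn from the paper.

```python
# Simple-bias-analysis sketch for exposure misclassification.
def corrected_count(exposed_obs, total, se, sp):
    """Back-calculate true exposed from observed, given sensitivity se
    and specificity sp of exposure classification."""
    return (exposed_obs - (1 - sp) * total) / (se + sp - 1)

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical observed table: 50/100 cases exposed, 30/100 controls exposed.
se, sp = 0.90, 0.95                       # assumed bias parameters
a = corrected_count(50, 100, se, sp)      # true exposed cases
c = corrected_count(30, 100, se, sp)      # true exposed controls
or_obs  = odds_ratio(50, 50, 30, 70)
or_corr = odds_ratio(a, 100 - a, c, 100 - c)
print(round(or_obs, 2), round(or_corr, 2))
```

As expected for nondifferential misclassification, the observed odds ratio is biased toward the null, so the corrected estimate is further from 1. A probabilistic bias analysis would repeat this correction over distributions of se and sp rather than fixed values.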

  16. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) an initial conceptual stage for mapping the structural criteria and parameters of the simulation training framework and scenario development (n = 16); (2) training facility design using Distributed Simulation; (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback, including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) the transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through the application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from the conceptual through the development and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  17. Communication patterns in a psychotherapy following traumatic brain injury: A quantitative case study based on symbolic dynamics

    PubMed Central

    2011-01-01

    Background The role of psychotherapy in the treatment of traumatic brain injury is receiving increased attention. The evaluation of psychotherapy with these patients has been conducted largely in the absence of quantitative data concerning the therapy itself. Quantitative methods for characterizing the sequence-sensitive structure of patient-therapist communication are now being developed with the objective of improving the effectiveness of psychotherapy following traumatic brain injury. Methods The content of three therapy session transcripts (sessions were separated by four months) obtained from a patient with a history of several motor vehicle accidents who was receiving dialectical behavior therapy was scored and analyzed using methods derived from the mathematical theory of symbolic dynamics. Results The analysis of symbol frequencies was largely uninformative. When repeated triples were examined a marked pattern of change in content was observed over the three sessions. The context free grammar complexity and the Lempel-Ziv complexity were calculated for each therapy session. For both measures, the rate of complexity generation, expressed as bits per minute, increased longitudinally during the course of therapy. The between-session increases in complexity generation rates are consistent with calculations of mutual information. Taken together these results indicate that there was a quantifiable increase in the variability of patient-therapist verbal behavior during the course of therapy. Comparison of complexity values against values obtained from equiprobable random surrogates established the presence of a nonrandom structure in patient-therapist dialog (P = .002). Conclusions While recognizing that only limited conclusions can be based on a case history, it can be noted that these quantitative observations are consistent with qualitative clinical observations of increases in the flexibility of discourse during therapy. 
These procedures can be of particular value in the examination of therapies following traumatic brain injury because, in some presentations, these therapies are complicated by deficits that result in subtle distortions of language that produce significant post-injury social impairment. Independently of the mathematical analysis applied to the investigation of therapy-generated symbol sequences, our experience suggests that the procedures presented here are of value in training therapists. PMID:21794113
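The Lempel-Ziv complexity reported in the study counts the number of distinct phrases encountered while scanning a symbol sequence; the rate in bits per minute follows from dividing by session duration. A sketch of the classic Kaspar-Schuster phrase-counting algorithm:

```python
# Lempel-Ziv (LZ76) complexity of a symbol sequence, Kaspar-Schuster style.
def lempel_ziv_complexity(s):
    """Count the number of distinct phrases in sequence s (a string)."""
    n = len(s)
    i, k, l = 0, 1, 1      # i: match start, k: match length, l: phrase start
    c, k_max = 1, 1        # c: phrase count so far
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:          # current phrase runs past the end
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:             # no earlier match: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

print(lempel_ziv_complexity("0001101001000101"))
```

A constant sequence yields the minimum complexity of 2, while aperiodic sequences yield higher counts; per-session rates would divide such counts (converted to bits) by the session length in minutes.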

  18. Development of one novel multiple-target plasmid for duplex quantitative PCR analysis of roundup ready soybean.

    PubMed

    Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing

    2008-07-23

To enforce the labeling regulations of genetically modified organisms (GMOs), the application of reference molecules as calibrators is becoming essential for the practical quantification of GMOs. However, the reported reference molecules with multiple marker targets in tandem have proved unsuitable for duplex PCR analysis. In this study, we developed a single plasmid molecule based on a pMD-18T vector carrying three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, the CaMV35S, NOS, and RRS event fragments, plus one fragment of the soybean endogenous Lectin gene. This Lectin gene fragment was separated from the three exogenous target DNA fragments of RRS by inserting a 2.6 kb DNA fragment unrelated to the RRS detection targets into the resultant plasmid. We then showed that this design allows the quantification of RRS using the three duplex real-time PCR assays targeting the CaMV35S, NOS, and RRS events, employing this reference molecule as the calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For the quantitative analysis of practical RRS samples, the accuracy and precision were similar to those of simplex PCR assays; for instance, at the 1% level the mean bias of the simplex and duplex PCR assays was 4.0% and 4.6%, respectively, and statistical analysis (t-test) showed that the quantitative data from duplex and simplex PCR had no significant discrepancy for any soybean sample. Duplex PCR analysis thus has the advantages of reducing PCR reagent costs and the experimental error of simplex PCR testing. The strategy reported in the present study will be helpful for the development of new reference molecules suitable for duplex quantitative PCR assays of GMOs.
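The quantification step in such assays typically reads copy numbers off a standard curve built from the plasmid calibrator (Ct versus log10 copies) and reports GMO content as the event-to-endogene copy ratio. The slope, intercept, and Ct readouts below are invented to illustrate the arithmetic, not taken from the paper.

```python
# Standard-curve quantification sketch for a duplex real-time PCR assay.
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a hypothetical calibration line Ct = slope*log10(copies) + b.
    A slope of -3.32 corresponds to 100% PCR efficiency."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical duplex-PCR readouts: RRS event target and Lectin endogene.
ct_event, ct_lectin = 30.7, 24.1
gmo_percent = 100 * copies_from_ct(ct_event) / copies_from_ct(ct_lectin)
print(f"RRS content = {gmo_percent:.2f}%")
```

With these invented readouts the sample comes out near the 1% level discussed in the abstract, which is close to the EU labeling threshold that motivates such assays.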

  19. Stoichiometric and kinetic analysis of extreme halophilic Archaea on various substrates in a corrosion resistant bioreactor.

    PubMed

    Lorantfy, Bettina; Seyer, Bernhard; Herwig, Christoph

    2014-01-25

Extreme halophilic Archaea are extremophile species that can thrive in hypersaline environments of up to 3-5 M sodium chloride concentration. Although their ecology and physiology are well characterized at the microbiological level, little emphasis has been placed on quantitative bioprocess development with extreme halophiles. The goal of this study was to establish, on the one hand, a methodological basis for the quantitative bioprocess analysis of extreme halophilic Archaea, using an extreme halophilic strain as an example. Firstly, a corrosion-resistant bioreactor setup for extreme halophiles was implemented as a novel application. Then, paying special attention to total bioprocess quantification approaches, an indirect method for biomass quantification using on-line process signals was introduced. Subsequently, robust quantitative data evaluation methods for halophiles could be developed; the defined and controlled cultivation conditions in the bioreactor yielded on-line as well as off-line datasets of suitable quality. On the other hand, new physiological results for extreme halophiles in the bioreactor were also obtained using these quantitative methodological tools. For the first time, quantitative data on stoichiometry and kinetics were collected and evaluated on different carbon sources. The results on various substrates were interpreted, with proposed metabolic mechanisms, by linking them to the reported primary carbon metabolism of extreme halophilic Archaea. Moreover, results of chemostat cultures demonstrated that extreme halophilic organisms follow Monod kinetics on different sole carbon sources. A diauxic growth pattern was described on a mixture of substrates in batch cultivations. In addition, the methodologies presented here make it possible to characterize the utilized strain Haloferax mediterranei (HFX) as a potential new host organism.
Thus, this study offers a strong methodological basis as well as a fundamental physiological assessment for bioreactor quantification of extreme halophiles that can serve as primary knowledge for applications of extreme halophiles in biotechnology. Copyright © 2013 Elsevier B.V. All rights reserved.
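The Monod kinetics reported for the chemostat cultures take the form mu = mu_max * S / (Ks + S), and at chemostat steady state the dilution rate D equals mu, fixing the residual substrate concentration. A sketch with illustrative parameter values (the paper's fitted values are not given in the abstract):

```python
# Monod growth kinetics and the chemostat steady-state substrate level.
def monod(S, mu_max, Ks):
    """Specific growth rate mu = mu_max * S / (Ks + S)."""
    return mu_max * S / (Ks + S)

def steady_state_S(D, mu_max, Ks):
    """Residual substrate in a chemostat at steady state, where mu = D.
    Valid only for D < mu_max (otherwise the culture washes out)."""
    return Ks * D / (mu_max - D)

mu_max, Ks = 0.12, 0.5     # 1/h and g/L, hypothetical values
print(monod(0.5, mu_max, Ks))          # at S = Ks, mu = mu_max / 2
print(steady_state_S(0.06, mu_max, Ks))
```

The saturation shape (mu approaching mu_max at high S) is what distinguishes Monod behavior from first-order kinetics, and fitting mu against S across chemostat runs at several dilution rates is the standard way to estimate mu_max and Ks.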

  20. [Occupational exposure to airborne chemical substances in paintings conservators].

    PubMed

    Jezewska, Anna; Szewczyńska, Małgorzata; Woźnica, Agnieszka

    2014-01-01

    This paper presents the results of the quantitative study of the airborne chemical substances detected in the conservator's work environment. The quantitative tests were carried out in 6 museum easel paintings conservation studios. The air test samples were taken at various stages of restoration works, such as cleaning, doubling, impregnation, varnishing, retouching, just to name a few. The chemical substances in the sampled air were measured by the GC-FID (gas chromatography with flame ionization detector) test method. The study results demonstrated that concentrations of airborne substances, e.g., toluene, 1,4-dioxane, turpentine and white spirit in the work environment of paintings conservators exceeded the values allowed by hygiene standards. It was found that exposure levels to the same chemical agents, released during similar activities, varied for different paintings conservation studios. It is likely that this discrepancy resulted from the indoor air exchange system for a given studio (e.g. type of ventilation and its efficiency), the size of the object under maintenance, and also from the methodology and protection used by individual employees. The levels of organic solvent vapors, present in the workplace air in the course of painting conservation, were found to be well above the occupational exposure limits, thus posing a threat to the worker's health.

  1. An in vivo study of low back pain and shock absorption in the human locomotor system.

    PubMed

    Voloshin, A; Wosk, J

    1982-01-01

In this second of three papers, the principles of a non-invasive in vivo method to quantitatively evaluate the shock-absorbing capacity of the human musculoskeletal system, and the correlation of this shock-absorbing capacity with low back pain (LBP) symptoms, are presented. The experiments involved patients suffering from low back pain (as well as other degenerative joint diseases) and healthy subjects. The results reveal that low back pain correlates with a reduced capacity of the human musculoskeletal system between the femoral condyle and the forehead to attenuate incoming shock waves. Examination of the absolute values of the amplitude of the propagated waves leads to the conclusion that a human locomotor system with reduced attenuation capacity attempts to protect the head from insufficiently attenuated shock waves. The results of the present investigation support the idea that the repetitive loading resulting from gait generates intermittent waves that propagate through the entire human musculoskeletal system from the heel up to the head. These waves are gradually attenuated along this course by the natural shock absorbers (bone and soft tissues). Contemporary methods for examination of the human musculoskeletal system may be improved by using the proposed non-invasive in vivo technique for quantitative characterization of the locomotor system's shock-absorbing capacity.

  2. A smartphone-readable barcode assay for the detection and quantitation of pesticide residues.

    PubMed

    Guo, Juan; Wong, Jessica X H; Cui, Caie; Li, Xiaochun; Yu, Hua-Zhong

    2015-08-21

In this paper, we present a smartphone-readable barcode assay for the qualitative detection of methyl parathion residues, a toxic organophosphorus pesticide widely used in agriculture worldwide. The detection principle is based on the irreversible inhibition of the enzymatic activity of acetylcholinesterase (AchE) by methyl parathion; AchE catalytically hydrolyzes acetylthiocholine iodide to thiocholine, which in turn cleaves dithiobis-nitrobenzoate to produce a yellow product (deprotonated thio-nitrobenzoate). The yellow intensity of the product was confirmed to be inversely dependent on the concentration of the pesticide. We designed a barcode-formatted assay chip by using a PDMS (polydimethylsiloxane) channel plate (as the reaction reservoir), situated under a printed partial barcode, to complete the whole barcode such that it can be directly read by a barcode scanning app installed on a smartphone. The app is able to qualitatively present the result of the pesticide test; the absence or a low concentration of methyl parathion results in the barcode reading as "-", identifying the test as negative for pesticides. Upon obtaining a positive result (the app reads a "+" character), the captured image can be further analyzed to quantitate the methyl parathion concentration in the sample. Besides its portability and simplicity, this mobile-app-based colorimetric barcode assay compares favorably with the standard spectrophotometric method.
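The quantitation step amounts to inverting a calibration curve: yellow intensity falls as methyl parathion concentration rises, so a measured intensity is mapped back to a concentration. The calibration points below are made up for illustration; a real assay would fit its own curve from standards.

```python
# Piecewise-linear inversion of a (decreasing) colorimetric calibration curve.
# Points are (concentration in ppm, normalized yellow intensity), invented.
calib = [(0.0, 1.00), (0.5, 0.80), (1.0, 0.62), (2.0, 0.40), (4.0, 0.22)]

def concentration_from_intensity(y):
    """Map a measured intensity back to concentration by interpolating
    between the bracketing calibration points."""
    for (c0, y0), (c1, y1) in zip(calib, calib[1:]):
        if y1 <= y <= y0:
            return c0 + (c1 - c0) * (y0 - y) / (y0 - y1)
    raise ValueError("intensity outside calibrated range")

print(concentration_from_intensity(0.71))   # falls between 0.5 and 1.0 ppm
```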

  3. Comparison study of two procedures for the determination of emamectin benzoate in medicated fish feed.

    PubMed

    Farer, Leslie J; Hayes, John M

    2005-01-01

    A new method has been developed for the determination of emamectin benzoate in fish feed. The method uses a wet extraction, cleanup by solid-phase extraction, and quantitation and separation by liquid chromatography (LC). In this paper, we compare the performance of this method with that of a previously reported LC assay for the determination of emamectin benzoate in fish feed. Although similar to the previous method, the new procedure uses a different sample pretreatment, wet extraction, and quantitation method. The performance of the new method was compared with that of the previously reported method by analyses of 22 medicated feed samples from various commercial sources. A comparison of the results presented here reveals slightly lower assay values obtained with the new method. Although a paired sample t-test indicates the difference in results is significant, this difference is within the method precision of either procedure.
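The method comparison hinges on a paired-sample t-test over the 22 feed samples analyzed by both procedures. A sketch with invented assay values (fewer samples, for brevity) shows the computation: the test statistic is the mean of the per-sample differences divided by its standard error.

```python
import math

# Paired t-test sketch; assay values are hypothetical, with the "new"
# procedure reading slightly lower, as in the abstract.
old = [10.2, 9.8, 11.1, 10.5, 9.9, 10.8, 10.4, 10.1]
new = [10.0, 9.7, 10.9, 10.2, 9.8, 10.6, 10.3, 9.9]

diffs = [o - n for o, n in zip(old, new)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t_stat = mean_d / (sd_d / math.sqrt(n))      # compare to t with n-1 df
print(f"mean difference = {mean_d:.3f}, t = {t_stat:.2f} on {n - 1} df")
```

A large t on n-1 degrees of freedom flags a statistically significant difference even when, as the abstract notes, the difference is small relative to either method's precision.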

  4. Hierarchical structure of stock price fluctuations in financial markets

    NASA Astrophysics Data System (ADS)

    Gao, Ya-Chun; Cai, Shi-Min; Wang, Bing-Hong

    2012-12-01

The financial market and turbulence have been broadly compared because they share the same quantitative methods and several common stylized facts. In this paper, the She-Lévêque (SL) hierarchy, proposed to explain the anomalous scaling exponents that deviate from Kolmogorov's monofractal scaling of velocity fluctuations in fluid turbulence, is applied to study and quantify the hierarchical structure of stock price fluctuations in financial markets. We observed several interesting results: (i) the hierarchical structure related to multifractal scaling is generally present in all the stock price fluctuations we investigated; (ii) the statistical parameters that describe the SL hierarchy differ distinctly between developed and emerging financial markets; (iii) for high-frequency stock price fluctuations, the hierarchical structure varies across different time periods. These results provide a novel analogy between turbulence and financial market dynamics and an insight into multifractality in financial markets.

  5. Quantitative modeling of coupled piezo-elastodynamic behavior of piezoelectric actuators bonded to an elastic medium for structural health monitoring: a review.

    PubMed

    Huang, Guoliang; Song, Fei; Wang, Xiaodong

    2010-01-01

    Elastic waves, especially guided waves, generated by a piezoelectric actuator/sensor network, have shown great potential for on-line health monitoring of advanced aerospace, nuclear, and automotive structures in recent decades. Piezoelectric materials can function as both actuators and sensors in these applications due to wide bandwidth, quick response and low costs. One of the most fundamental issues surrounding the effective use of piezoelectric actuators is the quantitative evaluation of the resulting elastic wave propagation by considering the coupled piezo-elastodynamic behavior between the actuator and the host medium. Accurate characterization of the local interfacial stress distribution between the actuator and the host medium is the key issue for the problem. This paper presents a review of the development of analytical, numerical and hybrid approaches for modeling of the coupled piezo-elastodynamic behavior. The resulting elastic wave propagation for structural health monitoring is also summarized.

  6. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both resulted in improved understanding of proteomics samples. Both strategies have well-documented advantages and disadvantages: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to the interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  7. Remote sensing and spectral analysis of plumes from ocean dumping in the New York Bight Apex

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1980-01-01

    The application of the remote sensing techniques of aerial photography and multispectral scanning in the qualitative and quantitative analysis of plumes from ocean dumping of waste materials is investigated in the New York Bight Apex. Plumes resulting from the dumping of acid waste and sewage sludge were observed by Ocean Color Scanner at an altitude of 19.7 km and by Modular Multispectral Scanner and mapping camera at an altitude of 3.0 km. Results of the qualitative analysis of multispectral and photographic data for the mapping, location, and identification of pollution features without concurrent sea truth measurements are presented which demonstrate the usefulness of in-scene calibration. Quantitative distributions of the suspended solids in sewage sludge released in spot and line dumps are also determined by a multiple regression analysis of multispectral and sea truth data.
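The quantitative step described above is a multiple linear regression of sea-truth suspended-solids concentrations on multispectral band radiances. A least-squares sketch with synthetic data (band values, coefficients, and noise are all invented) shows the fit:

```python
import numpy as np

# Multiple-regression sketch: suspended solids ~ radiances in 3 bands.
rng = np.random.default_rng(1)
n = 40
bands = rng.random((n, 3))                 # synthetic band radiances
true_coef = np.array([12.0, -4.0, 7.0])    # invented "true" sensitivities
solids = bands @ true_coef + 2.0 + rng.normal(0, 0.1, n)   # mg/L, synthetic

X = np.column_stack([np.ones(n), bands])   # add intercept column
coef, *_ = np.linalg.lstsq(X, solids, rcond=None)
print(coef)                                # ~ [2, 12, -4, 7]
```

In the study, the fitted coefficients would then map scanner radiances over the whole scene to suspended-solids concentration, turning the imagery into a quantitative distribution map.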

  8. The effect of the water on the curcumin tautomerism: A quantitative approach

    NASA Astrophysics Data System (ADS)

    Manolova, Yana; Deneva, Vera; Antonov, Liudmil; Drakalska, Elena; Momekova, Denitsa; Lambov, Nikolay

    2014-11-01

    The tautomerism of curcumin has been investigated in ethanol/water binary mixtures by using UV-Vis spectroscopy and advanced quantum-chemical calculations. The spectral changes were processed by an advanced chemometric procedure based on a resolution-of-overlapping-bands technique. As a result, molar fractions of the tautomers and their individual spectra have been estimated. It has been shown that in ethanol only the enol-keto tautomer is present. The addition of water leads to the appearance of a new spectral band, which was assigned to the diketo tautomeric form. The results show that in 90% water/10% ethanol the diketo form dominates. The observed shift in the equilibrium is explained by the quantum-chemical calculations, which show that water molecules stabilize the diketo tautomer through the formation of stable complexes. To the best of our knowledge, this is the first report of quantitative data on the tautomerism of curcumin and the effect of water.
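Once individual tautomer spectra are known, molar fractions can be estimated by linear unmixing of the mixture spectrum; this is a simplified sketch (the paper's resolution-of-overlapping-bands procedure is more involved, and all spectra below are made up):

```python
import numpy as np

def molar_fractions(mixture, components):
    """Least-squares decomposition of a mixture spectrum into known
    component spectra; fractions are normalized to sum to 1."""
    coeffs, *_ = np.linalg.lstsq(np.column_stack(components), mixture, rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)   # forbid negative contributions
    return coeffs / coeffs.sum()

# Toy absorbance spectra (arbitrary units) for the two tautomers
enol_keto = np.array([0.10, 0.80, 0.30, 0.05])
diketo    = np.array([0.60, 0.20, 0.10, 0.40])
mix = 0.25 * enol_keto + 0.75 * diketo

fractions = molar_fractions(mix, [enol_keto, diketo])
```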

  9. Phase calibration target for quantitative phase imaging with ptychography.

    PubMed

    Godden, T M; Muñiz-Piniella, A; Claverley, J D; Yacoot, A; Humphry, M J

    2016-04-04

    Quantitative phase imaging (QPI) utilizes refractive index and thickness variations that lead to optical phase shifts, giving contrast to images of transparent objects. In quantitative biology, phase images are used to accurately segment cells and calculate properties such as dry mass, volume, and proliferation rate. The fidelity of the measured phase shifts is of critical importance in this field. However, to date, there has been no standardized method for characterizing the performance of phase imaging systems. Consequently, there is an increasing need for protocols that test the performance of phase imaging systems using well-defined phase calibration and resolution targets. In this work, we present a candidate for a standardized phase resolution target, together with a measurement protocol for determining the spatial-frequency transfer and sensitivity of a phase imaging system. The target has been carefully designed to contain well-defined depth variations over a broadband range of spatial frequencies. To demonstrate the utility of the target, we measure quantitative phase images on a ptychographic microscope and compare the measured optical phase shifts with atomic force microscopy (AFM) topography maps and surface profile measurements from coherence scanning interferometry. The results show that ptychography has fully quantitative nanometer sensitivity in optical path differences over a broadband range of spatial frequencies, for feature sizes ranging from micrometers to hundreds of micrometers.
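The phase shifts QPI measures follow directly from the optical path difference; a minimal sketch of the relationship (the refractive-index step and feature depth used here are illustrative, not values from the paper):

```python
import math

def optical_phase_shift(delta_n, thickness_nm, wavelength_nm):
    """Phase shift (radians) introduced by a refractive-index step
    delta_n over a given thickness, at a given illumination wavelength."""
    return 2.0 * math.pi * delta_n * thickness_nm / wavelength_nm

# Example: a 100 nm deep etched feature, assuming n_glass - n_air ~ 0.5,
# imaged at 550 nm illumination
phi = optical_phase_shift(0.5, 100.0, 550.0)
```

Inverting this relation is how a phase image plus a known refractive index yields the depth maps compared against AFM topography.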

  10. CRAFT (complete reduction to amplitude frequency table)--robust and time-efficient Bayesian approach for quantitative mixture analysis by NMR.

    PubMed

    Krishnamurthy, Krish

    2013-12-01

    The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion, thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled into several sub-FIDs; second, these sub-FIDs are modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics), process study (reaction monitoring), or quality assurance/control. The basic principles behind this approach, as well as results evaluating its effectiveness in mixture analysis, are presented. Copyright © 2013 John Wiley & Sons, Ltd.
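The model form behind CRAFT's second step, sub-FIDs as sums of decaying sinusoids, can be illustrated with a toy FID whose component frequencies are then read off an FFT. The frequencies, amplitudes, and decay constants below are invented, and this simple peak-picking stands in for CRAFT's far more sophisticated Bayesian parameter estimation:

```python
import numpy as np

def decaying_sinusoid(t, amp, freq, decay):
    """Time-domain FID component: amp * exp(-t/decay) * cos(2*pi*freq*t)."""
    return amp * np.exp(-t / decay) * np.cos(2.0 * np.pi * freq * t)

# Synthesize a two-component "mixture" FID: 1 s acquisition at 1024 Hz
fs = 1024.0
t = np.arange(0, 1.0, 1.0 / fs)
fid = decaying_sinusoid(t, 1.0, 50.0, 0.5) + decaying_sinusoid(t, 0.5, 120.0, 0.5)

# Recover the two component frequencies as the two largest spectral peaks
spectrum = np.abs(np.fft.rfft(fid))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peaks = np.sort(freqs[np.argsort(spectrum)[-2:]])
```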

  11. Using Neutron Spectroscopy to Obtain Quantitative Composition Data of Ganymede's Surface from the Jupiter Ganymede Orbiter

    NASA Astrophysics Data System (ADS)

    Lawrence, D. J.; Maurice, S.; Patterson, G. W.; Hibbitts, C. A.

    2010-05-01

    Understanding the global composition of Ganymede's surface is a key goal of the Europa Jupiter System Mission (EJSM) that is being jointly planned by NASA and ESA. Current plans for obtaining surface information with the Jupiter Ganymede Orbiter (JGO) use spectral imaging measurements. While spectral imaging can provide good mineralogy-related information, quantitative data about elemental abundances can often be hindered by non-compositional variations due to surface effects (e.g., space weathering, grain effects, temperature, etc.). Orbital neutron and gamma-ray spectroscopy can provide quantitative composition information that is complementary to spectral imaging measurements, as has been demonstrated with similar instrumental combinations at the Moon, Mars, and Mercury. Neutron and gamma-ray measurements have successfully returned abundance information in a hydrogen-rich environment on Mars. With regard to neutrons and gamma-rays, there are many similarities between the Mars and Ganymede hydrogen-rich environments. In this study, we present results of neutron transport models, which show that quantitative composition information from Ganymede's surface can be obtained in a realistic mission scenario. Thermal and epithermal neutrons are jointly sensitive to the abundances of hydrogen and neutron-absorbing elements, such as iron and titanium. These neutron measurements can discriminate between regions that are rich or depleted in neutron-absorbing elements, even in the presence of large amounts of hydrogen. Details will be presented about how the neutron composition parameters can be used to meet high-level JGO science objectives, as well as an overview of a neutron spectrometer that can meet various mission and stringent environmental requirements.

  12. A comparison of cosegregation analysis methods for the clinical setting.

    PubMed

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

    Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform counting meioses, which is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important, as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and counting meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user-supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
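The counting-meioses method mentioned above reduces, under full cosegregation, to a simple power of two: each informative meiosis cosegregates with the phenotype by chance with probability 1/2. A minimal sketch (this simplified form ignores penetrance and phenocopies, which the FLB and CSLR methods model explicitly):

```python
def meiosis_likelihood_ratio(informative_meioses):
    """Counting-meioses heuristic: full cosegregation across m informative
    meioses has probability (1/2)**m under the null, so the likelihood
    ratio in favor of pathogenicity is 2**m."""
    return 2 ** informative_meioses

# e.g. a pedigree with 5 informative meioses, all cosegregating
lr = meiosis_likelihood_ratio(5)
```

This also shows why the method cannot produce evidence for benign variants: the ratio is bounded below by 1 and only grows as meioses accumulate.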

  13. A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
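Rolling peptide-level information up to the protein level can be sketched naively as a per-sample median of log intensities; this toy version simply skips missing peaks via nanmedian, whereas the paper's model-based approach explicitly accounts for informative missingness:

```python
import numpy as np

def rollup_protein(peptide_intensities):
    """Naive protein-level rollup (a simple stand-in for model-based
    estimation): per-sample median of log2 peptide intensities,
    ignoring missing values."""
    return np.nanmedian(np.log2(peptide_intensities), axis=0)

# Rows = peptides of one protein, columns = samples; NaN marks a missing peak
peps = np.array([[1024.0, 2048.0],
                 [512.0,  np.nan],
                 [2048.0, 4096.0]])
protein = rollup_protein(peps)   # per-sample log2 abundance estimates
```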

  14. Cloning of quantitative trait genes from rice reveals conservation and divergence of photoperiod flowering pathways in Arabidopsis and rice

    PubMed Central

    Matsubara, Kazuki; Hori, Kiyosumi; Ogiso-Tanaka, Eri; Yano, Masahiro

    2014-01-01

    Flowering time in rice (Oryza sativa L.) is determined primarily by daylength (photoperiod), and natural variation in flowering time is due to quantitative trait loci involved in photoperiodic flowering. To date, genetic analysis of natural variants in rice flowering time has resulted in the positional cloning of at least 12 quantitative trait genes (QTGs), including our recently cloned QTGs Hd17 and Hd16. The QTGs have been assigned to specific photoperiodic flowering pathways. Among them, 9 have homologs in the Arabidopsis genome; nevertheless, there are clear differences in the pathways between the two species: for example, the rice Ghd7–Ehd1–Hd3a/RFT1 pathway modulated by Hd16 is not present in Arabidopsis. In this review, we describe QTGs underlying natural variation in rice flowering time. Additionally, we discuss the implications of this variation for adaptive divergence and its importance in rice breeding. PMID:24860584

  15. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads out colorimetric paper assays via light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it suitable for point-of-care health care or on-site detection settings. Based on the light reflectance principle, the signal can be obtained directly and stably in a user-friendly manner. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in the local tap water were successfully analyzed; compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device can provide quantitative information very conveniently and shows great potential in resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
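Quantification from a reflectance readout typically goes through a calibration curve built from standards; a minimal sketch with made-up standards (the linear reflectance-concentration relationship and the µM values are assumptions for illustration only):

```python
import numpy as np

def calibrate_and_quantify(reflectance_std, conc_std, reflectance_sample):
    """Fit a linear calibration curve (concentration vs. reflectance
    signal) from standards, then evaluate it for an unknown sample."""
    slope, intercept = np.polyfit(reflectance_std, conc_std, 1)
    return slope * reflectance_sample + intercept

# Hypothetical standards: reflectance falls as nitrite concentration rises
refl = np.array([0.90, 0.70, 0.50, 0.30])
conc = np.array([0.0, 10.0, 20.0, 30.0])   # µM (made-up values)

c_unknown = calibrate_and_quantify(refl, conc, 0.60)
```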

  16. Exploring relationship between face-to-face interaction and team performance using wearable sensor badges.

    PubMed

    Watanabe, Jun-ichiro; Ishibashi, Nozomu; Yano, Kazuo

    2014-01-01

    Quantitative analyses of human-generated data collected in various fields have uncovered many patterns of complex human behaviors. However, thus far the quantitative evaluation of the relationship between the physical behaviors of employees and their performance has been inadequate. Here, we present findings demonstrating the significant relationship between the physical behaviors of employees and their performance via experiments we conducted in inbound call centers while the employees wore sensor badges. There were two main findings. First, we found that face-to-face interaction among telecommunicators and the frequency of their bodily movements caused by the face-to-face interaction had a significant correlation with the entire call center performance, which we measured as "Calls per Hour." Second, our trial to activate face-to-face interaction on the basis of data collected by the wearable sensor badges the employees wore significantly increased their performance. These results demonstrate quantitatively that human-human interaction in the physical world plays an important role in team performance.

  17. Exploring Relationship between Face-to-Face Interaction and Team Performance Using Wearable Sensor Badges

    PubMed Central

    Watanabe, Jun-ichiro; Ishibashi, Nozomu; Yano, Kazuo

    2014-01-01

    Quantitative analyses of human-generated data collected in various fields have uncovered many patterns of complex human behaviors. However, thus far the quantitative evaluation of the relationship between the physical behaviors of employees and their performance has been inadequate. Here, we present findings demonstrating the significant relationship between the physical behaviors of employees and their performance via experiments we conducted in inbound call centers while the employees wore sensor badges. There were two main findings. First, we found that face-to-face interaction among telecommunicators and the frequency of their bodily movements caused by the face-to-face interaction had a significant correlation with the entire call center performance, which we measured as “Calls per Hour.” Second, our trial to activate face-to-face interaction on the basis of data collected by the wearable sensor badges the employees wore significantly increased their performance. These results demonstrate quantitatively that human-human interaction in the physical world plays an important role in team performance. PMID:25501748

  18. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty reflecting neighborhood correlation within the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  19. Normalized Temperature Contrast Processing in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2016-01-01

    The paper presents further development of the normalized contrast processing used in the flash infrared thermography method. Methods of computing normalized image (pixel intensity) contrast and normalized temperature contrast are provided, along with methods of converting image contrast to temperature contrast and vice versa. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data, including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves a flash thermography data acquisition set-up with a high-reflectivity foil and high-emissivity tape such that the foil, tape, and test object are imaged simultaneously. Methods of assessing other quantitative parameters, such as the emissivity of the object, afterglow heat flux, reflection temperature change, and surface temperature during flash thermography, are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing the effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat flux and reflection temperature evolution in flash thermography simulation are also discussed.
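One plausible form of a normalized temperature contrast divides a pixel's excess temperature over a reference region by that region's rise above the pre-flash temperature. This is an assumed definition for illustration; the paper's exact formulation, which uses the foil and tape references, may differ:

```python
import numpy as np

def normalized_temperature_contrast(T_pixel, T_ref, T_preflash):
    """Assumed normalization (not necessarily the paper's exact one):
    contrast of a pixel's temperature excess over a defect-free reference
    region, scaled by the reference's rise above pre-flash temperature."""
    T_pixel = np.asarray(T_pixel)
    T_ref = np.asarray(T_ref)
    return (T_pixel - T_ref) / (T_ref - T_preflash)

# Toy cooling curves: a pixel over a flaw retains heat longer than the
# reference region (temperatures in degrees C, pre-flash ambient 20 C)
T_ref = np.array([30.0, 25.0, 22.0])
T_pix = np.array([30.0, 27.0, 24.0])
contrast = normalized_temperature_contrast(T_pix, T_ref, 20.0)
```

Normalizing by the reference rise makes the contrast dimensionless, which is what allows direct comparison between experimental data and simulation.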

  20. Experimental simulation of the effects of sudden increases in geomagnetic activity upon quantitative measures of human brain activity: validation of correlational studies.

    PubMed

    Mulligan, Bryce P; Persinger, Michael A

    2012-05-10

    Previous correlations between geomagnetic activity and quantitative changes in electroencephalographic power revealed particular associations with the right parietal lobe for theta activity and the right frontal region for gamma activity. In the present experiment, subjects were exposed either to no field (sham conditions) or to 20 nT or 70 nT, 7 Hz, amplitude-modulated (mHz range) magnetic fields for 30 min. Quantitative electroencephalographic (QEEG) measurements were completed before, during, and after the field exposures. After about 10 min of exposure, theta power over the right parietal region was enhanced for the 20 nT exposure but suppressed for the 70 nT exposure relative to sham field exposures. The effect dissipated by the end of the exposure. These results support the contention that magnetic field fluctuations were primarily responsible for the significant geomagnetic-QEEG correlations reported in several studies. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
