ERIC Educational Resources Information Center
Luyt, Russell
2012-01-01
A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work and thus facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…
Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.
Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K
2017-05-01
Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.
Development and Measurement of Preschoolers' Quantitative Knowledge
ERIC Educational Resources Information Center
Geary, David C.
2015-01-01
The collection of studies in this special issue make an important contribution to our understanding and measurement of the core cognitive and noncognitive factors that influence children's emerging quantitative competencies. The studies also illustrate how the field has matured, from a time when the quantitative competencies of infants and young…
Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans
2012-01-01
SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.
Quantitative Articles: Developing Studies for Publication in Counseling Journals
ERIC Educational Resources Information Center
Trusty, Jerry
2011-01-01
This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…
ERIC Educational Resources Information Center
Scrutton, Roger; Beames, Simon
2015-01-01
Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…
Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses
Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...
A general way for quantitative magnetic measurement by transmitted electrons
NASA Astrophysics Data System (ADS)
Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing
2016-01-01
EMCD (electron magnetic circular dichroism) opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from Fe atoms located on nonequivalent crystallographic planes in NiFe2O4; however, it places strict demands on the crystallographic structure of the sample under test. Here, we have further improved and tested the method so that quantitative site-specific magnetic measurement can be applied to more complex crystallographic structures by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in designing the experimental geometry and the quantitative measurement, etc.), and have taken yttrium iron garnet (Y3Fe5O12, YIG), with its more complex crystallographic structure, as an example to demonstrate the method's applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurement with high spatial resolution by transmitted electrons.
Quantitative dispersion microscopy
Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael
2010-01-01
Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
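As a rough illustration of how two-wavelength phase images yield an index-increment ratio, the standard quantitative-phase relations can be written as below; this is a generic sketch for reference, not necessarily the exact processing chain used in this work.

    \[
      \phi(\lambda) \;=\; \frac{2\pi}{\lambda}\,\bigl[n_{\mathrm{cell}}(\lambda) - n_{\mathrm{medium}}(\lambda)\bigr]\,h ,
      \qquad
      \frac{\Delta n(\lambda_1)}{\Delta n(\lambda_2)} \;=\; \frac{\lambda_1\,\phi(\lambda_1)}{\lambda_2\,\phi(\lambda_2)} ,
    \]

where h is the local cell thickness, which cancels in the ratio, so the dispersion measure depends only on the two measured phase maps.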
Measurement Invariance: A Foundational Principle for Quantitative Theory Building
ERIC Educational Resources Information Center
Nimon, Kim; Reio, Thomas G., Jr.
2011-01-01
This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…
Measuring Aircraft Capability for Military and Political Analysis
1976-03-01
challenged in 1932 when a panel of distinguished British scientists discussed the feasibility of quantitatively estimating sensory events... Quantitative Analysis of Social Problems , E.R. Tufte (ed.), p. 407, Addison-Wesley, 1970. 17 "artificial" boundaries are imposed on the data. Less...of arms transfers in various parts of the world as well. Quantitative research (and hence measurement) contributes to theoretical development by
Vessel wall characterization using quantitative MRI: what's in a number?
Coolen, Bram F; Calcagno, Claudia; van Ooij, Pim; Fayad, Zahi A; Strijkers, Gustav J; Nederveen, Aart J
2018-02-01
The past decade has witnessed the rapid development of new MRI technology for vessel wall imaging. Today, with advances in MRI hardware and pulse sequences, quantitative MRI of the vessel wall represents a real alternative to conventional qualitative imaging, which is hindered by significant intra- and inter-observer variability. Quantitative MRI can measure several important morphological and functional characteristics of the vessel wall. This review provides a detailed introduction to novel quantitative MRI methods for measuring vessel wall dimensions, plaque composition and permeability, endothelial shear stress and wall stiffness. Together, these methods show the versatility of non-invasive quantitative MRI for probing vascular disease at several stages. These quantitative MRI biomarkers can play an important role in the context of both treatment response monitoring and risk prediction. Given the rapid developments in scan acceleration techniques and novel image reconstruction, we foresee the possibility of integrating the acquisition of multiple quantitative vessel wall parameters within a single scan session.
Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)
ERIC Educational Resources Information Center
Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; Belitsky, Jason M.; Umbanhowar, Charles, Jr.; Overvoorde, Paul J.
2017-01-01
Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative…
NASA Astrophysics Data System (ADS)
Goddard, Braden
The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.
ERIC Educational Resources Information Center
Attali, Yigal
2014-01-01
Previous research on calculator use in standardized assessments of quantitative ability focused on the effect of calculator availability on item difficulty and on whether test developers can predict these effects. With the introduction of an on-screen calculator on the Quantitative Reasoning measure of the "GRE"® revised General Test, it…
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarkers Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
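As an illustration of the kind of technical-performance metrics such a framework standardizes, the sketch below computes a repeatability coefficient and a bias estimate from test-retest data in Python. The 2.77 × within-subject SD convention and the hypothetical tumor-volume numbers are assumptions for illustration, not values or code from the cited document.

    import numpy as np

    def repeatability_coefficient(test, retest):
        """Repeatability coefficient RC = 2.77 * within-subject SD, estimated from paired test-retest scans."""
        d = np.asarray(test, float) - np.asarray(retest, float)
        wsd = np.sqrt(np.mean(d ** 2) / 2.0)    # within-subject SD from paired differences
        return 2.77 * wsd

    def bias(measured, reference):
        """Mean difference between measured values and known reference values."""
        return float(np.mean(np.asarray(measured, float) - np.asarray(reference, float)))

    # Hypothetical test-retest tumor-volume measurements (mL) and phantom reference values
    scan1 = [10.2, 25.1, 8.7, 40.3]
    scan2 = [10.8, 24.2, 9.1, 41.0]
    print(repeatability_coefficient(scan1, scan2), bias(scan1, [10.0, 25.0, 9.0, 40.0]))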
Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety
ERIC Educational Resources Information Center
Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.
2013-01-01
Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…
Herbort, Carl P; Tugal-Tutkun, Ilknur
2017-06-01
Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.
Measuring the Beginning: A Quantitative Study of the Transition to Higher Education
ERIC Educational Resources Information Center
Brooman, Simon; Darwent, Sue
2014-01-01
This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…
Silicon solar cell process development, fabrication and analysis
NASA Technical Reports Server (NTRS)
Leung, D. C.; Iles, P. A.
1983-01-01
Measurements of minority carrier diffusion lengths were made on small mesa diodes from HEM Si and SILSO Si. The results were consistent with previous Voc and Isc measurements. Only the medium-grain SILSO showed a distinct advantage for the non-grain-boundary diodes. Substantial variations were observed for HEM ingot 4141C. A quantitatively scaled light spot scan was also being developed for localized diffusion length measurements in polycrystalline silicon solar cells. A change to a more monochromatic input for the light spot scan yields greater sensitivity, and in principle quantitative measurement of local material quality is now possible.
ERIC Educational Resources Information Center
Mergler, S.; Lobker, B.; Evenhuis, H. M.; Penning, C.
2010-01-01
Low bone mineral density (BMD) and fractures are common in people with intellectual disabilities (ID). Reduced mobility in case of motor impairment and the use of anti-epileptic drugs contribute to the development of low BMD. Quantitative ultrasound (QUS) measurement of the heel bone is a non-invasive and radiation-free method for measuring bone…
NASA Astrophysics Data System (ADS)
Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.
2005-05-01
If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.
A specialized plug-in software module for computer-aided quantitative measurement of medical images.
Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H
2003-12-01
This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement, and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability, and ease of use.
A Historical Analysis of Internal Review
1981-03-01
the background material presented. In such a study as this, the absence of quantitative data forces narrative descriptions and arguments vice... definitive graphic displays. The chapter sought to convey a sense of history and development of auditing in general and internal review in particular. In...measure represents the closest feasible way of measuring the accomplishment of an objective that cannot itself be expressed quantitatively. Such a measure
A rapid colorimetric assay for mold spore germination using XTT tetrazolium salt
Carol A. Clausen; Vina W. Yang
2011-01-01
Current laboratory test methods to measure the efficacy of new mold inhibitors are time consuming, some require specialized test equipment, and ratings are subjective. Rapid, simple quantitative assays to measure the efficacy of mold inhibitors are needed. A quantitative, colorimetric microassay was developed using XTT tetrazolium salt to metabolically assess mold spore...
Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea
2016-10-01
Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
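To illustrate the internal-consistency side of such an analysis, the sketch below computes Cronbach's alpha for one hypothetical group of Likert-scale statements; the item grouping and responses are invented for illustration and are not data from the study.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical 1-5 Likert responses from five respondents to four "transparency" statements
    responses = np.array([[4, 5, 4, 4],
                          [2, 2, 3, 2],
                          [5, 4, 5, 5],
                          [3, 3, 2, 3],
                          [4, 4, 4, 5]])
    print(round(cronbach_alpha(responses), 2))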
Studying learning in the healthcare setting: the potential of quantitative diary methods.
Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke
2015-08-01
Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.
Multi-color phase imaging and sickle cell anemia (Conference Presentation)
NASA Astrophysics Data System (ADS)
Hosseini, Poorya; Zhou, Renjie; Yaqoob, Zahid; So, Peter T. C.
2016-03-01
Quantitative phase measurements at multiple wavelengths have created an opportunity for exploring new avenues in phase microscopy such as enhancing imaging depth (1), measuring hemoglobin concentrations in erythrocytes (2), and, more recently, tomographic mapping of the refractive index of live cells (3). To this end, quantitative phase imaging has been demonstrated both at a few selected spectral points and with high spectral resolution (4,5). However, most of these techniques compromise imaging speed, field of view, or spectral resolution to perform interferometric measurements at multiple colors. In the specific application of quantitative phase to the study of blood diseases and red blood cells, current techniques lack the sensitivity required to quantify biological properties of interest at the individual cell level. Recently, we have set out to develop a stable quantitative interferometric microscope allowing measurement of such properties for red cells without compromising field of view or measurement speed. The feasibility of the approach will first be demonstrated by measuring dispersion curves of known solutions, followed by measuring biological properties of red cells in sickle cell anemia. References: 1. Mann CJ, Bingham PR, Paquit VC, Tobin KW. Quantitative phase imaging by three-wavelength digital holography. Opt Express. 2008;16(13):9753-64. 2. Park Y, Yamauchi T, Choi W, Dasari R, Feld MS. Spectroscopic phase microscopy for quantifying hemoglobin concentrations in intact red blood cells. Opt Lett. 2009;34(23):3668-70. 3. Hosseini P, Sung Y, Choi Y, Lue N, Yaqoob Z, So P. Scanning color optical tomography (SCOT). Opt Express. 2015;23(15):19752-62. 4. Jung J-H, Jang J, Park Y. Spectro-refractometry of individual microscopic objects using swept-source quantitative phase imaging. Anal Chem. 2013;85(21):10519-25. 5. Rinehart M, Zhu Y, Wax A. Quantitative phase spectroscopy. Biomed Opt Express. 2012;3(5):958-65.
Pateman, B; Jinks, A M
1999-01-01
The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.
Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao
2015-01-01
Cell migration is a cellular response involved in various biological processes such as cancer metastasis, which is the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies to control invasive cancer cells. The conventional method of determining cell migration rate, based on comparison of successive images, may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes was developed to quantitatively monitor cell migration activity using an impedimetric measurement technique. A damage-free wound was created by a microfluidic phenomenon, and cell migration activity was investigated under stimulation by a cytokine (interleukin-6) and an anti-cancer drug (doxorubicin). Impedance was measured concurrently during the cell migration process. The impedance change was directly correlated with cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between the impedance measurement and conventional imaging analysis, while the impedimetric technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under cytokine stimulation at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
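As a rough sketch of how a migration rate could be derived from such impedance time courses, the Python fragment below assumes, purely for illustration, that normalized impedance recovery maps linearly onto recovered wound width; the mapping, gap width, and numbers are hypothetical rather than the chip's actual calibration.

    import numpy as np

    # Hypothetical time course: hours after wounding and normalized impedance
    # (0 = fresh cell-free wound, 1 = electrode fully re-covered by cells)
    t_hours = np.array([0, 2, 4, 6, 8, 10])
    z_norm = np.array([0.00, 0.12, 0.25, 0.38, 0.49, 0.61])

    wound_width_um = 300.0                    # assumed initial cell-free gap
    recovered_um = z_norm * wound_width_um    # assumed linear impedance-to-width mapping

    # Migration rate = slope of recovered width versus time (um/h)
    rate_um_per_h = np.polyfit(t_hours, recovered_um, 1)[0]
    print(f"migration rate ~ {rate_um_per_h:.1f} um/h")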
Stachybotrys chartarum is an indoor mold that has been associated with pulmonary hemorrhage (PH) cases in the Cleveland, Ohio area. This study applied two new quantitative measurements to air samples from a home where an infant developed PH. Quantitative polymerase chain reacti...
SYN-JEM: A Quantitative Job-Exposure Matrix for Five Lung Carcinogens.
Peters, Susan; Vermeulen, Roel; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans
2016-08-01
The use of measurement data in occupational exposure assessment allows more quantitative analyses of possible exposure-response relations. We describe a quantitative exposure assessment approach for five lung carcinogens (i.e. asbestos, chromium-VI, nickel, polycyclic aromatic hydrocarbons (represented by their proxy benzo(a)pyrene (BaP)), and respirable crystalline silica (RCS)). A quantitative job-exposure matrix (JEM) was developed based on statistical modeling of large quantities of personal measurements. Empirical linear models were developed using personal occupational exposure measurements (n = 102306) from Europe and Canada, as well as auxiliary information like job (industry), year of sampling, region, an a priori exposure rating of each job (none, low, and high exposed), sampling and analytical methods, and sampling duration. The model outcomes were used to create a JEM with a quantitative estimate of the level of exposure by job, year, and region. Decreasing time trends were observed for all agents between the 1970s and 2009, ranging from -1.2% per year for personal BaP and nickel exposures to -10.7% for asbestos (in the time period before an asbestos ban was implemented). Regional differences in exposure concentrations (adjusted for measured jobs, years of measurement, and sampling method and duration) varied by agent, ranging from a factor of 3.3 for chromium-VI up to a factor of 10.5 for asbestos. We estimated time-, job-, and region-specific exposure levels for four (asbestos, chromium-VI, nickel, and RCS) out of the five considered lung carcinogens. Through statistical modeling of large amounts of personal occupational exposure measurement data we were able to derive a quantitative JEM to be used in community-based studies. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
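A minimal sketch of the kind of log-linear exposure modeling described above is given below; the variable names, toy data, and plain OLS fit are assumptions for illustration, whereas the published JEM was built from far larger datasets and richer (mixed-effects) models.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical personal measurements: concentration, sampling year, region, a priori job rating
    df = pd.DataFrame({
        "conc":   [0.9, 1.5, 0.4, 2.8, 0.2, 1.1, 0.7, 3.5],
        "year":   [1981, 1975, 1995, 1978, 2005, 1988, 1999, 1976],
        "region": ["north", "south", "north", "south", "north", "south", "north", "south"],
        "rating": ["low", "high", "low", "high", "high", "low", "low", "high"],
    })

    model = smf.ols("np.log(conc) ~ year + C(region) + C(rating)", data=df).fit()
    trend_per_year = (np.exp(model.params["year"]) - 1) * 100   # % change in exposure per year
    print(f"estimated time trend: {trend_per_year:.1f}% per year")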
A Quantitative Approach to Assessing System Evolvability
NASA Technical Reports Server (NTRS)
Christian, John A., III
2004-01-01
When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.
Targeted quantitation of proteins by mass spectrometry.
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
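A minimal sketch of the stable-isotope-dilution arithmetic underlying such targeted assays is shown below; the peak areas and spike amount are hypothetical, and a real assay would add calibration curves and quality-control checks.

    def sid_amount(light_area, heavy_area, heavy_spike_fmol):
        """Endogenous peptide amount from the light/heavy peak-area ratio in a stable-isotope-dilution MRM measurement."""
        return (light_area / heavy_area) * heavy_spike_fmol

    # Hypothetical transition peak areas and a 50 fmol heavy-peptide spike
    endogenous_fmol = sid_amount(light_area=1.2e6, heavy_area=3.0e6, heavy_spike_fmol=50.0)
    print(f"endogenous peptide ~ {endogenous_fmol:.1f} fmol on column")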
Universality and predictability in molecular quantitative genetics.
Nourmohammad, Armita; Held, Torsten; Lässig, Michael
2013-12-01
Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extends over a range of cellular scales and opens new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.
Proposal for a quantitative index of flood disasters.
Feng, Lihua; Luo, Gaoyuan
2010-07-01
Drawing on the calculation of wind scales and earthquake magnitudes, this paper develops a new quantitative method for measuring flood magnitude and disaster intensity. Flood magnitude is the quantitative index that describes the scale of a flood; flood disaster intensity is the quantitative index describing the losses caused. Both indices have numerous theoretical and practical advantages, with definable concepts and simple applications that give them key practical significance.
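Purely as an illustration of what a logarithmic magnitude index can look like (an assumed form in the spirit of wind and earthquake scales, not the index actually proposed in the paper):

    \[
      M_F \;=\; \log_{10}\!\left(\frac{Q_{\max}}{Q_{0}}\right),
    \]

where Q_max is the peak discharge of the flood and Q_0 a fixed reference discharge, so that each unit of M_F corresponds to a tenfold increase in flow.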
Stages of Psychometric Measure Development: The Example of the Generalized Expertise Measure (GEM)
ERIC Educational Resources Information Center
Germain, Marie-Line
2006-01-01
This paper chronicles the steps and methods of, and presents hypothetical results from, quantitative and qualitative studies being conducted to develop a Generalized Expertise Measure (GEM). Per Hinkin (1995), the stages of scale development are domain and item generation, content expert validation, and pilot testing. Content/face validity and internal…
ERIC Educational Resources Information Center
Eleje, Lydia I.; Esomonu, Nkechi P. M.
2018-01-01
A test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up of 20 multiple-choice items constructed on the basis of quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…
NASA Technical Reports Server (NTRS)
Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.
1983-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848 - 13C. Important correlations were obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using a quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.
Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike
2018-01-01
The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated with p < 0.001. The strongest correlation with the motor function measure and its D1-subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values significantly correlated with all clinical assessments with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
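For reference, a generic 2-point Dixon formulation computes water and fat images from the in-phase and opposed-phase acquisitions and the fat fraction as below; the study's actual sequence parameters and fitting may differ.

    \[
      W = \tfrac{1}{2}\,(S_{\mathrm{IP}} + S_{\mathrm{OP}}), \qquad
      F = \tfrac{1}{2}\,(S_{\mathrm{IP}} - S_{\mathrm{OP}}), \qquad
      \mathrm{FF} = \frac{F}{F + W},
    \]

where S_IP and S_OP are the in-phase and opposed-phase signal magnitudes and FF is the fat fraction.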
The Acquisition Age of Quantitative Concepts of Children From Three to Six Years Old
ERIC Educational Resources Information Center
Kraner, Robert E.
1978-01-01
A criterion-referenced measuring instrument was developed from a national survey of quantitative skills/concepts required by entering first grade students and individually administered to 273 children three to six and one-half years of age. By comparing mastery age of each of the 153 skills/concepts, sequential patterns of development were made…
Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy
NASA Astrophysics Data System (ADS)
Sugiyama, Naruhisa; Shirakawa, Tomohiro
2017-07-01
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.
Using BCG as a framework for setting goals and communicating progress toward those goals
This 5-minute Lightning Talk will discuss the benefits of stakeholder-supported quantitative targets in measuring progress and will describe the Biological Condition Gradient (BCG) as one way to develop these quantitative targets.
The Evolution of Pearson's Correlation Coefficient
ERIC Educational Resources Information Center
Kader, Gary D.; Franklin, Christine A.
2008-01-01
This article describes an activity for developing the notion of association between two quantitative variables. By exploring a collection of scatter plots, the authors propose a nonstandard "intuitive" measure of association; and by examining properties of this measure, they develop the more standard measure, Pearson's Correlation Coefficient. The…
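The standard measure the activity builds toward is Pearson's correlation coefficient:

    \[
      r \;=\; \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
                   {\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^{2}}\;\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^{2}}},
    \]

which scales the covariance of the two quantitative variables by their standard deviations.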
Brown, J Quincy; Vishwanath, Karthik; Palmer, Gregory M; Ramanujam, Nirmala
2009-02-01
Methods of optical spectroscopy that provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the past three years, and includes new and emerging studies that correlate optically measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies.
Time-resolved measurements of supersonic fuel sprays using synchrotron X-rays.
Powell, C F; Yue, Y; Poola, R; Wang, J
2000-11-01
A time-resolved radiographic technique has been developed for probing the fuel distribution close to the nozzle of a high-pressure single-hole diesel injector. The measurement was made using X-ray absorption of monochromatic synchrotron-generated radiation, allowing quantitative determination of the fuel distribution in this optically impenetrable region with a time resolution of better than 1 µs. These quantitative measurements constitute the most detailed near-nozzle study of a fuel spray to date.
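The quantitative step in such monochromatic absorption measurements typically rests on the Beer-Lambert law; the relation below is the generic form, not the paper's full calibration procedure.

    \[
      I \;=\; I_{0}\,e^{-\mu_{m} M}, \qquad
      M \;=\; -\frac{1}{\mu_{m}}\,\ln\!\frac{I}{I_{0}},
    \]

where I_0 and I are the incident and transmitted intensities, \mu_m is the mass absorption coefficient of the fuel at the X-ray energy used, and M is the projected fuel mass per unit area along the beam path.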
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
1992-05-21
complete dependence on nerves. Organ culture of sciatic nerves, combined with an assay for axolotl transferrin developed earlier, allows quantitative study...axonal release of various unknown proteins. Combining this approach with the ELISA for quantitative measurement of axolotl transferrin developed with...light microscope autoradiographic analysis following binding of radiolabelled Tf. Studies of Tf synthesis will employ cDNA probes for axolotl Tf mRNA
The concept of "buffering" in systems and control theory: from metaphor to math.
Schmitt, Bernhard M
2004-10-04
The paradigm of "buffering" is used increasingly for the description of diverse "systemic" phenomena encountered in evolutionary genetics, ecology, integrative physiology, and other areas. However, in this new context, the paradigm has not yet matured into a truly quantitative concept inasmuch as it lacks a corresponding quantitative measure of "systems-level buffering strength". Here, I develop such measures on the basis of a formal and general approach to the quantitation of buffering action. "Systems-level buffering" is shown to be synonymous with "disturbance rejection" in feedback-control systems, and can be quantitated by means of dimensionless proportions between partial flows in two-partitioned systems. The units allow either the time-independent, "static" buffering properties or the time-dependent, "dynamic" ones to be measured. Analogous to this "resistance to change", one can define and measure the "conductance to change"; this quantity corresponds to "set-point tracking" in feedback-control systems. Together, these units provide a systematic framework for the quantitation of buffering action in systems biology, and reveal the common principle behind systems-level buffering, classical acid-base buffering, and multiple other manifestations of buffering.
Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.
2015-01-01
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible, and quantitative measurements of proteins and peptides in complex biological matrices such as plasma. PMID:25693799
Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A
2016-07-01
Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
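A rough sketch of this style of automated leaf scoring, written in Python with scikit-image, is given below; it is illustrative only, since the study used an ImageJ batch macro, and the file name, thresholds, and size cut-offs here are invented.

    from skimage import io, color, filters, measure

    img = io.imread("leaf_scan.png")            # hypothetical flatbed scan of a single leaf
    gray = color.rgb2gray(img)

    leaf = gray < 0.95                          # assume a near-white scanner background
    lesions = leaf & (gray < filters.threshold_otsu(gray[leaf]))   # darker, necrotic tissue
    pct_lesion_area = 100.0 * lesions.sum() / leaf.sum()

    pycnidia = leaf & (gray < 0.15)             # assumed threshold for near-black pycnidia
    n_pycnidia = sum(1 for r in measure.regionprops(measure.label(pycnidia))
                     if 3 <= r.area <= 100)     # crude size filter (pixels)

    print(f"{pct_lesion_area:.1f}% lesion area, {n_pycnidia} pycnidia on the leaf")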
Refining the quantitative pathway of the Pathways to Mathematics model.
Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda
2015-03-01
In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.
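A minimal sketch of collapsing the three measures into a single quantitative-pathway score with principal components analysis is shown below; the child-level scores are hypothetical, and the original analysis involved additional controls and modeling.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical child-level scores: subitizing, counting, symbolic magnitude comparison
    scores = np.array([[0.80, 22, 0.91],
                       [0.65, 18, 0.72],
                       [0.90, 30, 0.95],
                       [0.55, 15, 0.60],
                       [0.75, 25, 0.85]])

    z = StandardScaler().fit_transform(scores)               # put the three tasks on a common scale
    quantitative_pathway = PCA(n_components=1).fit_transform(z).ravel()
    print(quantitative_pathway)                              # one composite score per child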
ERIC Educational Resources Information Center
Huerta, Margarita; Tong, Fuhui; Irby, Beverly J.; Lara-Alecio, Rafael
2016-01-01
The authors of this quantitative study measured and compared the academic language development and conceptual understanding of fifth-grade economically disadvantaged English language learners (ELL), former ELLs, and native English-speaking (ES) students as reflected in their science notebook scores. Using an instrument they developed, the authors…
ERIC Educational Resources Information Center
Smith, Mike U.; Snyder, Scott W.; Devereaux, Randolph S.
2016-01-01
The present study reports the development of a brief, quantitative, web-based, psychometrically sound measure--the Generalized Acceptance of EvolutioN Evaluation (GAENE, pronounced "gene") in a format that is useful in large and small groups, in research, and in classroom settings. The measure was designed to measure only evolution…
Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.
Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro
2011-02-01
The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution, an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differentially expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis in biological fluids of interest as well. Copyright © 2011 John Wiley & Sons, Ltd.
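A minimal sketch of the kind of calibration implied by the "good relationship between molarity and intensity ratios": fit a line to intensity-ratio data and invert it for an unknown. All values are invented for illustration and are not from the study.

```python
import numpy as np

# Hypothetical calibration data: molar ratio of analyte to reference protein
# versus the corresponding MALDI intensity ratio from SBD depositions.
molar_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
intensity_ratio = np.array([0.27, 0.48, 1.05, 1.96, 4.10])

# Least-squares line through the calibration points.
slope, intercept = np.polyfit(molar_ratio, intensity_ratio, 1)
r = np.corrcoef(molar_ratio, intensity_ratio)[0, 1]
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.3f}")

# Estimate an unknown sample's molar ratio from its measured intensity ratio.
unknown_intensity_ratio = 1.50
print("estimated molar ratio:",
      round((unknown_intensity_ratio - intercept) / slope, 2))
```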
Sevenster, M; Buurman, J; Liu, P; Peters, J F; Chang, P J
2015-01-01
Accumulating quantitative outcome parameters may contribute to constructing a healthcare organization in which outcomes of clinical procedures are reproducible and predictable. In imaging studies, measurements are the principal category of quantitative parameters. The purpose of this work is to develop and evaluate two natural language processing engines that extract finding and organ measurements from narrative radiology reports and to categorize extracted measurements by their "temporality". The measurement extraction engine is developed as a set of regular expressions. The engine was evaluated against a manually created ground truth. Automated categorization of measurement temporality is defined as a machine learning problem. A ground truth was manually developed based on a corpus of radiology reports. A maximum entropy model was created using features that characterize the measurement itself and its narrative context. The model was evaluated in a ten-fold cross validation protocol. The measurement extraction engine has precision 0.994 and recall 0.991. Accuracy of the measurement classification engine is 0.960. The work contributes to machine understanding of radiology reports and may find application in software applications that process medical data.
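The measurement extraction engine is described as a set of regular expressions; a toy version for two common radiology measurement formats (e.g. "3.2 x 1.4 cm", "6 mm") might look like the following. The pattern and the example report are illustrative only, not the published engine.

```python
import re

# Hypothetical pattern for radiology-style measurements such as
# "3.2 x 1.4 cm" or "8 mm"; a production engine would use many such patterns.
MEASUREMENT = re.compile(
    r"(?P<dim1>\d+(?:\.\d+)?)"                      # first dimension
    r"(?:\s*[xX]\s*(?P<dim2>\d+(?:\.\d+)?))?"       # optional second dimension
    r"\s*(?P<unit>mm|cm)\b"                         # unit
)

report = ("There is a 3.2 x 1.4 cm hypodense lesion in the liver, "
          "previously measuring 2.8 x 1.1 cm. A 6 mm lung nodule is unchanged.")

for m in MEASUREMENT.finditer(report):
    print(m.group("dim1"), m.group("dim2"), m.group("unit"))
```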
Development and implementation of an automated quantitative film digitizer quality control program
NASA Astrophysics Data System (ADS)
Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.
1999-05-01
A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.
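One of the quantities tracked by the QC program is signal-to-noise ratio. A generic way to estimate it from a uniform-density patch of the digitized test film is sketched below; the pixel data are synthetic and this is not the program's actual algorithm.

```python
import numpy as np

# Synthetic stand-in for a uniform-density patch cropped from the digitized
# test film (mean digital value 2000, noise standard deviation 15).
rng = np.random.default_rng(1)
patch = rng.normal(loc=2000.0, scale=15.0, size=(200, 200))

# Signal-to-noise ratio estimated as mean signal over pixel standard deviation.
snr = patch.mean() / patch.std(ddof=1)
print(f"estimated SNR ~ {snr:.0f}")
```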
Sample and data processing considerations for the NIST quantitative infrared database
NASA Astrophysics Data System (ADS)
Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William
1999-02-01
Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAP) identified in EPA's Clean Air Act of 1990. To support infrared based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.
Yoshikawa, Hirokazu; Weisner, Thomas S; Kalil, Ariel; Way, Niobe
2008-03-01
Multiple methods are vital to understanding development as a dynamic, transactional process. This article focuses on the ways in which quantitative and qualitative methodologies can be combined to enrich developmental science and the study of human development, focusing on the practical questions of "when" and "how." Research situations that may be especially suited to mixing qualitative and quantitative approaches are described. The authors also discuss potential choices for using mixed quantitative-qualitative approaches in study design, sampling, construction of measures or interview protocols, collaborations, and data analysis relevant to developmental science. Finally, they discuss some common pitfalls that occur in mixing these methods and include suggestions for surmounting them.
Quantitative Analysis of Radar Returns from Insects
NASA Technical Reports Server (NTRS)
Riley, J. R.
1979-01-01
When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are presented.
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
Exploring a taxonomy for aggression against women: can it aid conceptual clarity?
Cook, Sarah; Parrott, Dominic
2009-01-01
The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
ERIC Educational Resources Information Center
Kjellström, Sofia; Golino, Hudson; Hamer, Rebecca; Van Rossum, Erik Jan; Almers, Ellen
2016-01-01
Qualitative research supports a developmental dimension in views on teaching and learning, but there are currently no quantitative tools to measure the full range of this development. To address this, we developed the Epistemological Development in Teaching and Learning Questionnaire (EDTLQ). In the current study the psychometric properties of the…
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and metrics used to assess a QIB for clinical use. It is therefore difficult, or even impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
A Review of Recent Developments in X-Ray Diagnostics for Turbulent and Optically Dense Rocket Sprays
NASA Technical Reports Server (NTRS)
Radke, Christopher; Halls, Benjamin; Kastengren, Alan; Meyer, Terrence
2017-01-01
Highly efficient mixing and atomization of fuel and oxidizers are important factors in many propulsion and power-generating applications. To better quantify breakup and mixing in atomizing sprays, several diagnostic techniques have been developed to collect droplet information and spray statistics. Several optical techniques, such as Ballistic Imaging and SLIPI, have previously demonstrated qualitative measurements in optically dense sprays; however, these techniques have produced limited quantitative information in the near-injector region. To complement these advances, a recent wave of developments utilizing synchrotron-based x-rays has been successfully implemented, facilitating the collection of quantitative measurements in optically dense sprays.
Zhu, Huayang; Ricote, Sandrine; Coors, W Grover; Kee, Robert J
2015-01-01
A model-based interpretation of measured equilibrium conductivity and conductivity relaxation is developed to establish thermodynamic, transport, and kinetics parameters for multiple charged defect conducting (MCDC) ceramic materials. The present study focuses on 10% yttrium-doped barium zirconate (BZY10). In principle, using the Nernst-Einstein relationship, equilibrium conductivity measurements are sufficient to establish thermodynamic and transport properties. However, in practice it is difficult to establish unique sets of properties using equilibrium conductivity alone. Combining equilibrium and conductivity-relaxation measurements serves to significantly improve the quantitative fidelity of the derived material properties. The models are developed using a Nernst-Planck-Poisson (NPP) formulation, which enables the quantitative representation of conductivity relaxations caused by very large changes in oxygen partial pressure.
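The abstract notes that, via the Nernst-Einstein relationship, equilibrium conductivity can in principle be tied to defect transport properties. A minimal numerical form of that relationship is sketched below; the defect concentration, diffusivity, and temperature are illustrative values, not fitted BZY10 parameters.

```python
# Nernst-Einstein relationship relating a defect's diffusion coefficient to
# its partial conductivity: sigma = z^2 F^2 c D / (R T).
F = 96485.33        # Faraday constant, C/mol
R = 8.314           # gas constant, J/(mol K)

def partial_conductivity(D, c, z, T):
    """c in mol/m^3, D in m^2/s, T in K; returns sigma in S/m."""
    return (z ** 2) * F ** 2 * c * D / (R * T)

# Example: a singly charged protonic defect (z = +1) at 600 degrees C.
sigma = partial_conductivity(D=1e-11, c=5e3, z=1, T=873.15)
print(f"partial conductivity ~ {sigma:.3g} S/m")
```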
Validation of Greyscale-Based Quantitative Ultrasound in Manual Wheelchair Users
Collinger, Jennifer L.; Fullerton, Bradley; Impink, Bradley G.; Koontz, Alicia M.; Boninger, Michael L.
2010-01-01
Objective The primary aim of this study is to establish the validity of greyscale-based quantitative ultrasound (QUS) measures of the biceps and supraspinatus tendons. Design Nine QUS measures of the biceps and supraspinatus tendons were computed from ultrasound images collected from sixty-seven manual wheelchair users. Shoulder pathology was measured using questionnaires, physical examination maneuvers, and a clinical ultrasound grading scale. Results Increased age, duration of wheelchair use, and body mass correlated with a darker, more homogeneous tendon appearance. Subjects with pain during physical examination tests for biceps tenderness and acromioclavicular joint tenderness exhibited significantly different supraspinatus QUS values. Even when controlling for tendon depth, QUS measures of the biceps tendon differed significantly between subjects with healthy tendons, mild tendinosis, and severe tendinosis. Clinical grading of supraspinatus tendon health was correlated with QUS measures of the supraspinatus tendon. Conclusions Quantitative ultrasound is a valid method to quantify tendinopathy and may allow for early detection of tendinosis. Manual wheelchair users are at a high risk for developing shoulder tendon pathology and may benefit from quantitative ultrasound-based research that focuses on identifying interventions designed to reduce this risk. PMID:20407304
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
...-depth understanding of individuals' attitudes, beliefs, motivations, and feelings than do quantitative... and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's attitudes and...
Quantitative adhesion characterization of antireflective coatings in multijunction photovoltaics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brock, Ryan; Rewari, Raunaq; Novoa, Fernando D.
We discuss the development of a new composite dual cantilever beam (cDCB) thin-film adhesion testing method, which enables the quantitative measurement of adhesion on the thin and fragile substrates used in multijunction photovoltaics. In particular, we address the adhesion of several 2- and 3-layer antireflective coating systems on multijunction cells. By varying interface chemistry and morphology through processing, we demonstrate the marked effects on adhesion and help to develop an understanding of how high adhesion can be achieved, as adhesion values ranging from 0.5 J/m2 to 10 J/m2 were measured. Damp heat (85 degrees C/85% RH) was used to invoke degradation of interfacial adhesion. We demonstrate that even with germanium substrates that fracture relatively easily, quantitative measurements of adhesion can be made at high test yield. The cDCB test is discussed as an important new methodology, which can be broadly applied to any system that makes use of thin, brittle, or otherwise fragile substrates.
Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength
NASA Astrophysics Data System (ADS)
Loho, T.; Dickinson, M.
2018-04-01
The mechanism for the way that ice adheres to surfaces is still not well understood. Currently there is no standard method to quantitatively measure how ice adheres to surfaces, which makes ice surface studies difficult to compare. A novel quantitative lateral force adhesion measurement at the micro-nano scale for ice was created, which shears micro- to nano-sized ice droplets (less than 3 μm in diameter and 100 nm in height) using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised, which increased data repeatability compared to bulk testing. The technique provided post-testing surface scans to confirm that the ice had been removed and that measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by the nano-scale surface roughness of the material, with rougher surfaces having higher ice adhesion strength.
A strategy to apply quantitative epistasis analysis on developmental traits.
Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei
2017-05-15
Genetic interactions are keys to understand complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurement. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those methods in single cell growth studies. Comparing with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
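A simplified sketch of a multiplicative-model epistasis score for a quantitative phenotype such as body length is shown below; the replicate values are hypothetical and the statistical treatment in the paper is more elaborate (it handles measurement error and significance testing).

```python
import numpy as np

def epistasis_score(wt, single_a, single_b, double):
    """
    Multiplicative-model epistasis for a quantitative phenotype:
    normalize each perturbation to wild type, then compare the observed
    double perturbation with the product of the singles.
    (A simplified illustration, not the paper's statistical pipeline.)
    """
    w_a = np.mean(single_a) / np.mean(wt)
    w_b = np.mean(single_b) / np.mean(wt)
    w_ab = np.mean(double) / np.mean(wt)
    return w_ab - w_a * w_b

# Hypothetical body-length measurements (mm) from replicate animals.
wt      = [1.10, 1.08, 1.12]
gene_a  = [0.95, 0.97, 0.93]   # single perturbation A
gene_b  = [1.00, 1.02, 0.99]   # single perturbation B
gene_ab = [0.70, 0.73, 0.71]   # double perturbation

print("epistasis score:", round(epistasis_score(wt, gene_a, gene_b, gene_ab), 3))
```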
NASA Astrophysics Data System (ADS)
Ju, Yang; Inoue, Kojiro; Saka, Masumi; Abe, Hiroyuki
2002-11-01
We present a method for quantitative measurement of electrical conductivity of semiconductor wafers in a contactless fashion by using millimeter waves. A focusing sensor was developed to focus a 110 GHz millimeter wave beam on the surface of a silicon wafer. The amplitude and the phase of the reflection coefficient of the millimeter wave signal were measured by which electrical conductivity of the wafer was determined quantitatively, independent of the permittivity and thickness of the wafers. The conductivity obtained by this method agrees well with that measured by the conventional four-point-probe method.
NASA Technical Reports Server (NTRS)
Partridge, William P.; Laurendeau, Normand M.
1997-01-01
We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally-based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally-based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically-based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally-based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally-based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically-based PLIF enhancement procedures.
Quantitative reactive modeling and verification.
Henzinger, Thomas A
Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun
2011-01-01
Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative or they are quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at further stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.
Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa
2002-08-01
Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.
Label-free hyperspectral dark-field microscopy for quantitative scatter imaging
NASA Astrophysics Data System (ADS)
Cheney, Philip; McClatchy, David; Kanick, Stephen; Lemaillet, Paul; Allen, David; Samarov, Daniel; Pogue, Brian; Hwang, Jeeseong
2017-03-01
A hyperspectral dark-field microscope has been developed for imaging spatially distributed diffuse reflectance spectra from light-scattering samples. In this report, quantitative scatter spectroscopy is demonstrated with a uniform scattering phantom, namely a solution of polystyrene microspheres. A Monte Carlo-based inverse model was used to calculate the reduced scattering coefficients of samples of different microsphere concentrations from wavelength-dependent backscattered signal measured by the dark-field microscope. The results are compared to the measurement results from a NIST double-integrating sphere system for validation. Ongoing efforts involve quantitative mapping of scattering and absorption coefficients in samples with spatially heterogeneous optical properties.
Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)
Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; M. Belitsky, Jason; Umbanhowar, Charles; Overvoorde, Paul J.
2017-01-01
Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative skills of undergraduate students within a biological context. The instrument was developed by an interdisciplinary team of educators and aligns with skills included in national reports such as BIO2010, Scientific Foundations for Future Physicians, and Vision and Change. Undergraduate biology educators also confirmed the importance of items included in the instrument. The current version of the BioSQuaRE was developed through an iterative process using data from students at 12 postsecondary institutions. A psychometric analysis of these data provides multiple lines of evidence for the validity of inferences made using the instrument. Our results suggest that the BioSQuaRE will prove useful to faculty and departments interested in helping students acquire the quantitative competencies they need to successfully pursue biology, and useful to biology students by communicating the importance of quantitative skills. We invite educators to use the BioSQuaRE at their own institutions. PMID:29196427
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saripalli, Prasad; Brown, Christopher F.; Lindberg, Michael J.
We report on a new Cellular Absorptive Tracers (CATs) method, for a simple, non-destructive characterization of bacterial mass in flow systems. Results show that adsorption of a CAT molecule into the cellular mass results in its retardation during flow, which is a good, quantitative measure of the biomass quantity and distribution. No such methods are currently available for a quantitative characterization of cell mass.
ERIC Educational Resources Information Center
Pasch, Marvin
Techniques and procedures used to evaluate the outcomes of the student development program, and to use the evaluation results, are presented. Specific evaluation questions are posed that address overall outcomes, not individual student outcomes, and quantitative measures are suggested to accompany the questions. The measures include statistical…
Quantitative Measurement of Critical Thinking Skills in Novice and Experienced Physical Therapists
ERIC Educational Resources Information Center
Mulhall, Michele L.
2011-01-01
Critical thinking skills (CTS) have been emphasized in educational curricula and professional development of physical therapists. Studies assessing the measurement and development of CTS in healthcare professionals have primarily focused on students enrolled in professional phases of allied health educational programs. Despite the breadth of…
Shi, Wen; Li, Xiaohua; Ma, Huimin
2012-06-25
The whole picture: Carbon nanodots labeled with two fluorescent dyes have been developed as a tunable ratiometric pH sensor to measure intracellular pH. The nanosensor shows good biocompatibility and cellular dispersibility. Quantitative determinations on intact HeLa cells and pH fluctuations associated with oxidative stress were performed. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian
2016-02-01
Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft with a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required for obtaining statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combined quantifiable peptide data from both quantitative schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes. This quantifiable coverage improvement, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of quantitative estimates by 181%, so that some proteins expressed in a BC subtype-specific manner were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique for obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
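The Ratio-of-Ratios idea, where a heavy AACT/SILAC spike common to both samples cancels out, can be illustrated with a few lines of arithmetic; the peak areas below are invented for illustration and this is not the QuantFusion mixed-model pipeline itself.

```python
import numpy as np

# Hypothetical MS1 peak areas for one peptide. The heavy (AACT/SILAC-labeled)
# reference was spiked into both xenografts at the same predefined ratio, so
# the light/heavy ratio in each sample can be compared as a Ratio-of-Ratios.
light_basal,   heavy_basal   = 4.2e6, 2.0e6
light_luminal, heavy_luminal = 1.1e6, 2.1e6

ratio_basal   = light_basal / heavy_basal
ratio_luminal = light_luminal / heavy_luminal

# Ratio-of-Ratios: relative abundance of the peptide in basal vs. luminal,
# with the common heavy spike cancelling run-to-run variation.
log2_ror = np.log2(ratio_basal / ratio_luminal)
print(f"log2 Ratio-of-Ratios (basal vs luminal) = {log2_ror:.2f}")
```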
NASA Technical Reports Server (NTRS)
Greenberg, Paul S.; Wernet, Mark P.
1999-01-01
Systems have been developed and demonstrated for performing quantitative velocity measurements in reduced gravity combustion science and fluid physics investigations. The unique constraints and operational environments inherent to reduced-gravity experimental facilities pose special challenges to the development of hardware and software systems. Both point and planar velocimetric capabilities are described, with particular attention being given to the development of systems to support the International Space Station laboratory. Emphasis has been placed on optical methods, primarily arising from the sensitivity of the phenomena of interest to intrusive probes. Limitations on available power, volume, data storage, and attendant expertise have motivated the use of solid-state sources and detectors, as well as efficient analysis capabilities emphasizing interactive data display and parameter control.
ERIC Educational Resources Information Center
Graham, Karen
2012-01-01
This study attempted development and validation of a measure of "intention to stay in academia" for physician assistant (PA) faculty in order to determine if the construct could be measured in way that had both quantitative and qualitative meaning. Adopting both the methodologic framework of the Rasch model and the theoretical framework…
In this study, a quantitative liquid chromatography-mass spectrometry (LC-MS) technique capable of measuring the concentrations of heterocyclic nitrogen compounds in ambient fine aerosols (PM2.5) has been developed. Quadrupole time-of-flight (Q-TOF) MS technology is used to provi...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's...
1982-12-01
This paper examines the various measures discussed in the literature and used in selected corporations which develop software. It presents several methods for measuring software development productivity; the contents include selected industry methods for measuring productivity (e.g., IBM and Amdahl).
Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A
2014-12-01
Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which unlike PCR, work at a single temperature. These 'isothermal' methods, reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors and could also be used for quantitative molecular analysis. However there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.
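One plausible reading of the isothermal doubling time (IDT) metric, by analogy with PCR-efficiency estimation from a standard curve, is the change in time-to-threshold per two-fold change in input; the sketch below fits that slope from hypothetical LAMP standard-curve data and should be taken as an assumption about the metric rather than its published definition.

```python
import numpy as np

# Hypothetical standard-curve data for an isothermal (e.g. LAMP) assay:
# input copies per reaction and the time (minutes) at which the fluorescence
# signal crosses a fixed threshold.
copies            = np.array([1e5, 1e4, 1e3, 1e2])
time_to_threshold = np.array([12.0, 15.4, 18.7, 22.1])

# Fit threshold time against log2(copies); the magnitude of the slope is then
# an estimate of the doubling time, since each two-fold dilution should delay
# threshold crossing by roughly one doubling.
slope, intercept = np.polyfit(np.log2(copies), time_to_threshold, 1)
print(f"estimated doubling time ~ {abs(slope):.2f} min")
```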
Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure
Yee, Jaeyong; Kwon, Min-Seok; Park, Taesung; Park, Mira
2015-01-01
A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method of conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because any functional form, such as Gaussian, is not assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait. PMID:26339620
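A generic m-spacing (Vasicek-style) estimator of differential entropy for a quantitative trait is sketched below to illustrate the ingredient the method builds on; it is not the exact estimator or the conditional-entropy machinery defined in the paper, and the trait values are simulated.

```python
import numpy as np

def m_spacing_entropy(x, m):
    """
    Vasicek-style m-spacing estimator of differential entropy (in nats)
    for a continuous trait; boundary order statistics are clipped.
    A generic sketch of the idea, not the paper's exact estimator.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    upper = x[np.minimum(np.arange(n) + m, n - 1)]   # x_(i+m), clipped at x_(n)
    lower = x[np.maximum(np.arange(n) - m, 0)]       # x_(i-m), clipped at x_(1)
    spacings = np.maximum(upper - lower, 1e-12)      # guard against ties
    return np.mean(np.log(n * spacings / (2 * m)))

rng = np.random.default_rng(0)
trait = rng.normal(size=500)
print("estimated entropy:", round(m_spacing_entropy(trait, m=5), 3))
# For a standard normal the true differential entropy is about 1.419 nats.
```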
Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas
2016-01-01
In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not been systematically examined yet and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.
Nolan, John P.; Mandy, Francis
2008-01-01
While the term flow cytometry refers to the measurement of cells, making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle-encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays are emerging to study genes, protein function, and molecular assembly. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537
Prototype ultrasonic instrument for quantitative testing
NASA Technical Reports Server (NTRS)
Lynnworth, L. C.; Dubois, J. L.; Kranz, P. R.
1972-01-01
A prototype ultrasonic instrument has been designed and developed for quantitative testing. The complete delivered instrument consists of a pulser/receiver which plugs into a standard oscilloscope, an rf power amplifier, a standard decade oscillator, and a set of broadband transducers for typical use at 1, 2, 5 and 10 MHz. The system provides for its own calibration, and on the oscilloscope, presents a quantitative (digital) indication of time base and sensitivity scale factors and some measurement data.
Dong, Daming; Jiao, Leizi; Du, Xiaofan; Zhao, Chunjiang
2017-04-20
In this study, we developed a substrate to enhance the sensitivity of LIBS by 5 orders of magnitude. Using a combination of field enhancement due to the metal nanoparticles in the substrate, the aggregate effect of super-hydrophobic interfaces and magnetic confinement, we performed a quantitative measurement of copper in solution with concentrations on the ppt level. We also demonstrated that the substrate improves quantitative measurements by providing an opportunity for internal standardization.
Quantitative analysis of a scar's pliability, perfusion and metrology
NASA Astrophysics Data System (ADS)
Gonzalez, Mariacarla; Sevilla, Nicole; Chue-Sang, Joseph; Ramella-Roman, Jessica C.
2017-02-01
The primary effect of scarring is the loss of function in the affected area. Scarring also leads to physical and psychological problems that can be devastating to the patient's life. Currently, scar assessment is highly subjective and physician dependent. The examination relies on the expertise of the physician to determine the characteristics of the scar by touch and visual examination using the Vancouver scar scale (VSS), which categorizes scars by pigmentation, pliability, height and vascularity. In order to establish diagnostic guidelines for scar formation, a quantitative, accurate assessment method needs to be developed. An instrument capable of measuring all categories was developed; three of the aforementioned parameters are explored here. Pliability is assessed with a durometer, which measures the resistance a surface exerts against permanent indentation and was chosen for its simplicity and quantitative output. Height is measured with a profilometry system that records the three-dimensional location of the scar, and vascularity with laser speckle imaging (LSI), which captures dynamic changes in perfusion. Gelatin phantoms were utilized to measure pliability. Finally, dynamic changes in skin perfusion of volunteers' forearms undergoing pressure-cuff occlusion were measured, along with incisional scars.
Quantitation of Met tyrosine phosphorylation using MRM-MS.
Meng, Zhaojing; Srivastava, Apurva K; Zhou, Ming; Veenstra, Timothy
2013-01-01
Phosphorylation has long been accepted as a key cellular regulator of cell signaling pathways. The recent development of multiple-reaction monitoring mass spectrometry (MRM-MS) provides a useful tool for measuring the absolute quantity of phosphorylation occupancy at pivotal sites within signaling proteins, even when the phosphorylation sites are in close proximity. Here, we describe a targeted quantitation approach to measure the absolute phosphorylation occupancy at Y1234 and Y1235 of Met. The approach is utilized to obtain absolute occupancy of the two phosphorylation sites in the full-length recombinant Met. It is further applied to quantitate the phosphorylation state of these two sites in SNU-5 cells treated with a Met inhibitor.
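Once absolute amounts of the phosphorylated and non-phosphorylated forms of a site-containing peptide are obtained against heavy-labeled standards, occupancy reduces to a simple ratio; the numbers below are hypothetical.

```python
# Hypothetical absolute amounts (fmol) of the phosphorylated and
# non-phosphorylated forms of the same Met tryptic peptide, each obtained by
# comparing its MRM response to a heavy-labeled internal standard.
phospho_fmol    = 3.4
nonphospho_fmol = 6.6

# Occupancy: fraction of the peptide pool carrying the modification.
occupancy = phospho_fmol / (phospho_fmol + nonphospho_fmol)
print(f"phosphorylation occupancy at the site: {100 * occupancy:.0f}%")
```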
Liu, Chao; Cai, Hong-Xin; Zhang, Jian-Feng; Ma, Jian-Jun; Lu, Yin-Jiang; Fan, Shun-Wu
2014-03-01
The high-intensity zone (HIZ) on magnetic resonance imaging (MRI) has been studied for more than 20 years, but its diagnostic value in low back pain (LBP) is limited by the high incidence in asymptomatic subjects. Little effort has been made to improve the objective assessment of HIZ. To develop quantitative measurements for HIZ and estimate intra- and interobserver reliability and to clarify different signal intensity of HIZ in patients with or without LBP. A measurement reliability and prospective comparative study. A consecutive series of patients with LBP between June 2010 and May 2011 (group A) and a successive series of asymptomatic controls during the same period (group B). Incidence of HIZ; quantitative measures, including area of disc, area and signal intensity of HIZ, and magnetic resonance imaging index; and intraclass correlation coefficients (ICCs) for intra- and interobserver reliability. On the basis of HIZ criteria, a series of quantitative dimension and signal intensity measures was developed for assessing HIZ. Two experienced spine surgeons traced the region of interest twice within 4 weeks for assessment of the intra- and interobserver reliability. The quantitative variables were compared between groups A and B. There were 72 patients with LBP and 79 asymptomatic controls enrolling in this study. The prevalence of HIZ in group A and group B was 45.8% and 20.2%, respectively. The intraobserver agreement was excellent for the quantitative measures (ICC=0.838-0.977) as well as interobserver reliability (ICC=0.809-0.935). The mean signal of HIZ in group A was significantly brighter than in group B (57.55±14.04% vs. 45.61±7.22%, p=.000). There was no statistical difference of area of disc and HIZ between the two groups. The magnetic resonance imaging index was found to be higher in group A when compared with group B (3.94±1.71 vs. 3.06±1.50), but with a p value of .050. A series of quantitative measurements for HIZ was established and demonstrated excellent intra- and interobserver reliability. The signal intensity of HIZ was different in patients with or without LBP, and significant brighter signal was observed in symptomatic subjects. Copyright © 2014 Elsevier Inc. All rights reserved.
2016-09-05
…was performed on an LTQ-Orbitrap Elite MS and the final quantitation was derived by comparing the relative response of the 200 fmol AQUA… shown in Figure 3B, the final quantitation is derived by comparing the relative response of the 200 fmol AQUA standards (SEE and IRSEE: Set 1) to… measure of eVLP quality, the western blot and LC-HRMS quantitation results were compared to survival data in mice for each of these eVLP vaccine…
Hammond, Emily; Sloan, Chelsea; Newell, John D; Sieren, Jered P; Saylor, Melissa; Vidal, Craig; Hogue, Shayna; De Stefano, Frank; Sieren, Alexa; Hoffman, Eric A; Sieren, Jessica C
2017-09-01
Quantitative computed tomography (CT) measures are increasingly being developed and used to characterize lung disease. With recent advances in CT technologies, we sought to evaluate the quantitative accuracy of lung imaging at low- and ultralow-radiation doses with the use of iterative reconstruction (IR), tube current modulation (TCM), and spectral shaping. We investigated the effect of five independent CT protocols reconstructed with IR on quantitative airway measures and global lung measures using an in vivo large animal model as a human subject surrogate. A control protocol (NIH-SPIROMICS + TCM) was chosen, along with five target protocols investigating TCM, low- and ultralow-radiation dose, and spectral shaping. For all scans, quantitative global parenchymal measurements (mean, median and standard deviation of the parenchymal HU, along with measures of emphysema) and global airway measurements (number of segmented airways and pi10) were generated. In addition, selected individual airway measurements (minor and major inner diameter, wall thickness, inner and outer area, inner and outer perimeter, wall area fraction, and inner equivalent circle diameter) were evaluated. Comparisons were made between control and target protocols using difference and repeatability measures. Estimated CT volume dose index (CTDIvol) across all protocols ranged from 7.32 mGy to 0.32 mGy. Low- and ultralow-dose protocols required more manual editing and resolved fewer airway branches; yet, comparable pi10 whole lung measures were observed across all protocols. Similar trends in acquired parenchymal and airway measurements were observed across all protocols, with increased measurement differences using the ultralow-dose protocols. However, for small airways (1.9 ± 0.2 mm) and medium airways (5.7 ± 0.4 mm), the measurement differences across all protocols were comparable to the control protocol repeatability across breath holds. Diameters, wall thickness, wall area fraction, and equivalent diameter had smaller measurement differences than area and perimeter measurements. In conclusion, the use of IR with low- and ultralow-dose CT protocols with CT volume dose indices down to 0.32 mGy maintains selected quantitative parenchymal and airway measurements relevant to pulmonary disease characterization. © 2017 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng
2018-04-01
The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on a ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and its stress condition. The proposed model was applicable in various SrAl2O4:Eu2+, Dy3+-based ML measurement in elastic deformation, and could provide a useful reference for quantitative stress measurement using the ML sensor in general.
Roberts, P
1999-07-01
The political climate of health care provision and education for health care in the latter years of the 20th century is evolving from the uncertainty of newly created markets to a more clearly focused culture of collaboration, dissemination of good practice, with an increased emphasis on quality provision and its measurement. The need for provider units to prove and improve efficiency and effectiveness through evidence-based quality strategies in order to stay firmly in the market place has never been more necessary. The measurement of customer expectations and perceptions of delivered service quality is widely utilized as a basis for customer retention and business growth in both commercial and non-profit organizations. This paper describes the methodological development of NEdSERV--quantitative instrumentation designed to measure and respond to ongoing stakeholder expectations and perceptions of delivered service quality within nurse education.
NASA Astrophysics Data System (ADS)
Zhao, H.; Zhang, S.
2008-01-01
One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.
Quantitative operando visualization of the energy band depth profile in solar cells.
Chen, Qi; Mao, Lin; Li, Yaowen; Kong, Tao; Wu, Na; Ma, Changqi; Bai, Sai; Jin, Yizheng; Wu, Dan; Lu, Wei; Wang, Bing; Chen, Liwei
2015-07-13
The energy band alignment in solar cell devices is critically important because it largely governs elementary photovoltaic processes, such as the generation, separation, transport, recombination and collection of charge carriers. Despite the expenditure of considerable effort, the measurement of energy band depth profiles across multiple layers has been extremely challenging, especially for operando devices. Here we present direct visualization of the surface potential depth profile over the cross-sections of operando organic photovoltaic devices using scanning Kelvin probe microscopy. The convolution effect due to finite tip size and cantilever beam crosstalk has previously prohibited quantitative interpretation of scanning Kelvin probe microscopy-measured surface potential depth profiles. We develop a bias voltage-compensation method to address this critical problem and obtain quantitatively accurate measurements of the open-circuit voltage, built-in potential and electrode potential difference.
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
English, Devin; Bowleg, Lisa; del Río-González, Ana Maria; Tschann, Jeanne M.; Agans, Robert; Malebranche, David J
2017-01-01
Objectives Although social science research has examined police and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men’s perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) scale. Methods In Study 1, we employed thematic analysis on transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used 2 focus groups comprised of 5 Black men each (n=10), intensive cognitive interviewing with a separate sample of Black men (n=15), and piloting with another sample of Black men (n=13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Results Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents’ experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Conclusions Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men’s experiences of discrimination with police/law enforcement. PMID:28080104
English, Devin; Bowleg, Lisa; Del Río-González, Ana Maria; Tschann, Jeanne M; Agans, Robert P; Malebranche, David J
2017-04-01
Although social science research has examined police and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men's perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) Scale. In Study 1, we used thematic analysis on transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used 2 focus groups comprised of 5 Black men each (n = 10), intensive cognitive interviewing with a separate sample of Black men (n = 15), and piloting with another sample of Black men (n = 13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents' experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men's experiences of discrimination with police/law enforcement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha
2009-02-01
Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.
ERIC Educational Resources Information Center
Saito, Hirotaka; Ando, Akinobu; Itagaki, Shota; Kawada, Taku; Davis, Darold; Nagai, Nobuyuki
2017-01-01
Until now, when facial expression recognition skills were practiced in the nonverbal communication areas of SST, the judgment of facial expression was not quantitative because the subjects of SST were judged by teachers. We therefore examined whether SST could be performed using facial expression detection devices that can quantitatively measure facial…
ERIC Educational Resources Information Center
Guess, Doug; And Others
Ten replication studies based on quantitative procedures developed to measure motor and sensory/motor skill acquisition among handicapped and nonhandicapped infants and children are presented. Each study follows the original assessment procedures, and emphasizes the stability of interobserver reliability across time, consistency in the response…
Spector, P E; Jex, S M
1998-10-01
Despite the widespread use of self-report measures of both job-related stressors and strains, relatively few carefully developed scales for which validity data exist are available. In this article, we discuss 3 job stressor scales (Interpersonal Conflict at Work Scale, Organizational Constraints Scale, and Quantitative Workload Inventory) and 1 job strain scale (Physical Symptoms Inventory). Using meta-analysis, we combined the results of 18 studies to provide estimates of relations between our scales and other variables. Data showed moderate convergent validity for the 3 job stressor scales, suggesting some objectivity to these self-reports. Norms for each scale are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykstra, Andrew B; St. Brice, Lois; Rodriguez, Jr., Miguel
2014-01-01
Clostridium thermocellum has emerged as a leading bioenergy-relevant microbe due to its ability to solubilize cellulose into carbohydrates, mediated by multi-component membrane-attached complexes termed cellulosomes. To probe microbial cellulose utilization rates, it is desirable to be able to measure the concentrations of saccharolytic enzymes and estimate the total amount of cellulosome present on a mass basis. Current cellulase determination methodologies involve labor-intensive purification procedures and only allow for indirect determination of abundance. We have developed a method using multiple reaction monitoring (MRM-MS) to simultaneously quantitate both enzymatic and structural components of the cellulosome protein complex in samples ranging in complexity from purified cellulosomes to whole cell lysates, as an alternative to a previously-developed enzyme-linked immunosorbent assay (ELISA) method of cellulosome quantitation. The precision of the cellulosome mass concentration in technical replicates is better than 5% relative standard deviation for all samples, indicating high precision for determination of the mass concentration of cellulosome components.
Quantitation of Protein Carbonylation by Dot Blot
Wehr, Nancy B.; Levine, Rodney L.
2012-01-01
Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is frequently measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent, 2,4-dinitrophenylhydrazine. We developed an immunochemical dot blot method for quantitation of protein carbonylation in homogenates or purified proteins. Dimethyl sulfoxide was employed as the solvent because it very efficiently extracts proteins from tissues and keeps them soluble. It also readily dissolves 2,4-dinitrophenylhydrazine and wets PVDF membranes. The detection limit is 0.19 ± 0.04 pmol carbonyl. Sixty ng protein is sufficient to measure protein carbonyl content. This level of sensitivity allowed measurement of protein carbonylation in individual Drosophila. PMID:22326366
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
Measuring the performance of visual to auditory information conversion.
Tan, Shern Shiou; Maul, Tomás Henrique Bode; Mennie, Neil Russell
2013-01-01
Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, image sonification systems generate auditory cues that are easier to learn and adapt to than those of other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure their performance. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank them accordingly. Performance is measured by both the interpretability and the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter-image distance (IID) and inter-sound distance (ISD), whereas information preservation is computed by applying information theory to measure the entropy of both the visual and the corresponding auditory signals. These measurements provide a basis and some insight into how the systems work. With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost-effectively regain enough visual function to lead secure and productive lives.
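A minimal sketch of the two performance measures described above, under assumptions: pairwise Euclidean distances stand in for the inter-image and inter-sound distances, entropy is estimated from a value histogram, and `sonify` is a hypothetical placeholder for an actual image-to-sound conversion.

```python
# Sketch of the IID-ISD correlation and entropy measures; the distance metrics,
# histogram binning, and `sonify` mapping are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr
from scipy.spatial.distance import pdist

def entropy_bits(signal, bins=64):
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def sonify(image):
    # Hypothetical image-to-sound mapping: column-wise mean intensity as a "waveform".
    return image.mean(axis=0)

rng = np.random.default_rng(0)
images = [rng.random((32, 32)) for _ in range(10)]
sounds = [sonify(im) for im in images]

iid = pdist(np.array([im.ravel() for im in images]))   # inter-image distances
isd = pdist(np.array(sounds))                          # inter-sound distances

interpretability, _ = pearsonr(iid, isd)
info_loss = entropy_bits(images[0].ravel()) - entropy_bits(sounds[0])
print(f"IID-ISD correlation = {interpretability:.3f}, entropy drop = {info_loss:.2f} bits")
```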
Quantitative comparison of in situ soil CO2 flux measurement methods
Jennifer D. Knoepp; James M. Vose
2002-01-01
Development of reliable regional or global carbon budgets requires accurate measurement of soil CO2 flux. We conducted laboratory and field studies to determine the accuracy and comparability of methods commonly used to measure in situ soil CO2 fluxes. Methods compared included CO2...
Light scattering application for quantitative estimation of apoptosis
NASA Astrophysics Data System (ADS)
Bilyy, Rostyslav O.; Stoika, Rostyslav S.; Getman, Vasyl B.; Bilyi, Olexander I.
2004-05-01
Estimation of cell proliferation and apoptosis is a focus of instrumental methods used in modern biomedical sciences. The present study concerns monitoring of the functional state of cells, specifically the development of their programmed death, or apoptosis. The available methods for this purpose are either very expensive or require time-consuming operations, and their specificity and sensitivity are frequently insufficient for conclusions that could be used in diagnostics or treatment monitoring. We propose a novel method for apoptosis measurement based on quantitative determination of the cellular functional state, taking into account the cells' physical characteristics. This method uses a patented device -- the laser microparticle analyser PRM-6 -- for analyzing light scattering by microparticles, including cells. The method allows quick, quantitative, simple (without complicated preliminary cell processing), and relatively cheap measurement of apoptosis in a cell population. The method was used to study apoptosis expression in murine leukemia cells of the L1210 line and human lymphoblastic leukemia cells of the K562 line. The results obtained by the proposed method permitted measuring the cell number in a tested sample and detecting and quantitatively characterizing the functional state of cells, particularly the ratio of apoptotic cells in suspension.
Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M
2017-01-01
At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and strength measures with brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract has not been explored in multiple sclerosis. The objectives of this study were explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale) to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures, and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean(SD) age 48.7 (11.5) years; symptom duration 11.9(8.7); 17 females; median[range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age and gender-matched healthy controls (age 50.8(11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test ) as well as 3 T imaging including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked slower (p = 0.0013) compared to controls. Quantitative measures of walking and strength were significantly related to corticospinal tract fractional anisotropy (r > 0.26; p < 0.04) and magnetization transfer ratio (r > 0.29; p < 0.03) measures. Although the Expanded Disability Status Scale was highly correlated with walking measures, it was not significantly related to either corticospinal tract fractional anisotropy or magnetization transfer ratio (p > 0.05). Walk velocity was a significant contributor to magnetization transfer ratio (p = 0.006) and fractional anisotropy (p = 0.011) in regression modeling that included both quantitative measures of function and basic clinical information. Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.
Xu, Tianmin
2012-06-01
Like other subjects in medicine, orthodontics uses some vague concepts to describe what is difficult to measure quantitatively. Anchorage control is one of them. With the development of evidence-based medicine, orthodontists pay more and more attention to the accuracy of clinical evidence. The empirical description of anchorage control is showing its inadequacy in modern orthodontics. This essay, based on the author's recent series of studies on anchorage control, points out the inaccuracy of the maximum anchorage concept, commonly neglected points in the quantitative measurement of anchorage loss, and the solutions. It also discusses the limitations of maximum anchorage control.
In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques providing quantitative measurements of surface water between different orbits were developed and were accepted as the standard flood mapping techniques using ERTS.
Quantitative structure parameters from the NMR spectroscopy of quadrupolar nuclei
Perras, Frederic A.
2015-12-15
Here, nuclear magnetic resonance (NMR) spectroscopy is one of the most important characterization tools in chemistry; however, three-quarters of the NMR-active nuclei are underutilized due to their quadrupolar nature. This short review centers on the development of methods that use solid-state NMR of quadrupolar nuclei for obtaining quantitative structural information. Namely, techniques using dipolar recoupling as well as the resolution afforded by double rotation are presented for the measurement of spin–spin coupling between quadrupoles, enabling the measurement of internuclear distances and connectivities.
A Validity and Reliability Study of the Attitudes toward Sustainable Development Scale
ERIC Educational Resources Information Center
Biasutti, Michele; Frate, Sara
2017-01-01
This article describes the development and validation of the Attitudes toward Sustainable Development scale, a quantitative 20-item scale that measures Italian university students' attitudes toward sustainable development. A total of 484 undergraduate students completed the questionnaire. The validity and reliability of the scale was statistically…
Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K
2017-07-01
Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.
Quantitative Species Measurements In Microgravity Combustion Flames
NASA Technical Reports Server (NTRS)
Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.
2003-01-01
The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.
ERIC Educational Resources Information Center
Storfer-Isser, Amy; Musher-Eizenman, Dara
2013-01-01
Objective: To examine the psychometric properties of 9 quantitative items that assess time scarcity and fatigue as parent barriers to planning and preparing meals for their children. Methods: A convenience sample of 342 parents of children aged 2-6 years completed a 20-minute online survey. Exploratory factor analysis was used to examine the…
The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...
ERIC Educational Resources Information Center
Arai, Heii; Takano, Maki; Miyakawa, Koichi; Ota, Tsuneyoshi; Takahashi, Tadashi; Asaka, Hirokazu; Kawaguchi, Tsuneaki
2006-01-01
A newly developed quantitative near-infrared spectroscopy (NIRS) system was used to measure changes in cortical hemoglobin oxygenation during the Verbal Fluency Task in 32 healthy controls, 15 subjects with mild cognitive impairment (MCI), and 15 patients with Alzheimer's disease (AD). The amplitude of changes in the waveform, which was…
DOT National Transportation Integrated Search
2017-04-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms of ...
DOT National Transportation Integrated Search
1995-10-01
The primary objective of this study is to provide information relative to the development of a set of performance measures for intermodal freight transportation. To accomplish this objective, data was collected, processed, and analyzed on the basis o...
Insights into Spray Development from Metered-Dose Inhalers Through Quantitative X-ray Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mason-Smith, Nicholas; Duke, Daniel J.; Kastengren, Alan L.
Typical methods to study pMDI sprays employ particle sizing or visible light diagnostics, which suffer in regions of high spray density. X-ray techniques can be applied to pharmaceutical sprays to obtain information unattainable by conventional particle sizing and light-based techniques. We present a technique for obtaining quantitative measurements of spray density in pMDI sprays. A monochromatic focused X-ray beam was used to perform quantitative radiography measurements in the near-nozzle region and plume of HFA-propelled sprays. Measurements were obtained with a temporal resolution of 0.184 ms and a spatial resolution of 5 μm. Steady flow conditions were reached after around 30 ms for the formulations examined with the spray device used. Spray evolution was affected by the inclusion of ethanol in the formulation and unaffected by the inclusion of 0.1% drug by weight. Estimation of the nozzle exit density showed that vapour is likely to dominate the flow leaving the inhaler nozzle during steady flow. Quantitative measurements in pMDI sprays allow the determination of nozzle exit conditions that are difficult to obtain experimentally by other means. Measurements of these nozzle exit conditions can improve understanding of the atomization mechanisms responsible for pMDI spray droplet and particle formation.
Method for measuring the size distribution of airborne rhinovirus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell, M.L.; Goth-Goldstein, R.; Apte, M.G.
About 50% of viral-induced respiratory illnesses are caused by the human rhinovirus (HRV). Measurements of the concentrations and sizes of bioaerosols are critical for research on building characteristics, aerosol transport, and mitigation measures. We developed a quantitative reverse transcription-coupled polymerase chain reaction (RT-PCR) assay for HRV and verified that this assay detects HRV in nasal lavage samples. A quantitation standard was used to determine a detection limit of 5 fg of HRV RNA with a linear range over 1000-fold. To measure the size distribution of HRV aerosols, volunteers with a head cold spent two hours in a ventilated research chamber. Airborne particles from the chamber were collected using an Andersen Six-Stage Cascade Impactor. Each stage of the impactor was analyzed by quantitative RT-PCR for HRV. For the first two volunteers with confirmed HRV infection, but with mild symptoms, we were unable to detect HRV on any stage of the impactor.
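As a rough illustration of how a quantitation standard yields concentration estimates in an assay of this kind, the sketch below fits a standard curve of Ct against log10 template amount and inverts it for an unknown; all numbers are hypothetical and not taken from the study.

```python
# Hedged sketch of standard-curve quantitation as used in quantitative RT-PCR:
# fit Ct against log10(template amount) for a dilution series, then invert the
# fit to estimate the amount in an unknown sample. All values are hypothetical.
import numpy as np

standard_fg = np.array([5.0, 50.0, 500.0, 5000.0])     # standard amounts (fg RNA)
standard_ct = np.array([34.8, 31.4, 28.1, 24.7])       # measured Ct values (hypothetical)

slope, intercept = np.polyfit(np.log10(standard_fg), standard_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                 # amplification efficiency estimate

def quantify(ct):
    return 10 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency ≈ {efficiency:.2%}")
print(f"unknown at Ct 30.0 ≈ {quantify(30.0):.1f} fg RNA")
```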
Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.
Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike
2007-11-01
Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as the excitation source. Quantitative partial least squares models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single partial least squares (PLS) component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
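A hedged sketch of a PLS calibration of the kind described, using scikit-learn on synthetic "spectra"; the real spectra, preprocessing, and reference values are not reproduced, and a single latent variable is used only to mirror the reported transmission-mode model.

```python
# Sketch of a PLS calibration and RMSEP evaluation on synthetic data; the
# spectra, concentrations, and noise levels are invented for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 500
concentration = rng.uniform(80, 120, n_samples)          # % of label claim (synthetic)
pure_component = rng.random(n_wavelengths)
spectra = np.outer(concentration, pure_component) + rng.normal(0, 0.5, (n_samples, n_wavelengths))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=1).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

rmsep = np.sqrt(np.mean((y_test - y_pred) ** 2))
print(f"RMSEP = {rmsep:.2f} (relative: {100 * rmsep / y_test.mean():.1f}%)")
```

The relative RMSEP printed at the end corresponds to the kind of percentage prediction error quoted in the abstract.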
Identification of common coexpression modules based on quantitative network comparison.
Jo, Yousang; Kim, Sanghyeon; Lee, Doheon
2018-06-13
Finding common molecular interactions across different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and the methods we examined for other network types cannot be applied to coexpression networks. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use them to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for quantitative comparison of coexpression networks and performed experiments using known coexpression networks. We showed the validity of the two measures and evaluated threshold values for similar coexpression network pairs from the experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and identified similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell signalling related to cell death and the immune/inflammation response may be common molecular mechanisms in the pathophysiology of HD and normal brain aging in the frontal cortex.
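The abstract does not spell out the two similarity measures, so the sketch below is purely illustrative: it compares two coexpression networks built on the same genes with a Jaccard index over thresholded-correlation edges, one simple way such a quantitative comparison could be set up.

```python
# Illustrative network comparison (not the paper's published measures):
# build coexpression edges by thresholding gene-gene correlations, then
# score the overlap of the two edge sets with a Jaccard index.
import numpy as np

def coexpression_edges(expr, threshold=0.7):
    """expr: genes x samples matrix; returns the set of gene-pair edges."""
    corr = np.corrcoef(expr)
    genes = corr.shape[0]
    return {(i, j) for i in range(genes) for j in range(i + 1, genes)
            if abs(corr[i, j]) >= threshold}

def jaccard(edges_a, edges_b):
    union = edges_a | edges_b
    return len(edges_a & edges_b) / len(union) if union else 0.0

rng = np.random.default_rng(2)
expr_condition_a = rng.normal(size=(50, 30))                 # 50 genes x 30 samples (synthetic)
expr_condition_b = expr_condition_a + rng.normal(0, 0.5, size=(50, 30))

similarity = jaccard(coexpression_edges(expr_condition_a), coexpression_edges(expr_condition_b))
print(f"edge-set Jaccard similarity = {similarity:.3f}")
```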
Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik
2017-06-01
Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma, but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and its major metabolites during treatment are needed to allow development of individualized dosing strategies to reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of the dabrafenib metabolites. Compared to earlier methods, the presented method represents a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract Combined multiple reaction monitoring transitions of dabrafenib and metabolites in a typical case sample.
Ross, D W
1986-05-01
The phenomenon of leukemic cell maturation requires a measurement of myeloid maturation to understand the process and to exploit it as a means of therapy for leukemia. The HL-60 leukemic cell line was used as a model of induced leukemic cell maturation in order to develop a method of quantitating granulocytic and monocytic maturation in response to drug therapy. An automated flow cytochemistry system (Hemalog-D) was employed to measure mean cell volume, myeloperoxidase (MPO), and nonspecific esterase (NSE). For granulocytic maturation induced by vitamin A or DMSO, MPO and cell volume decreased by 50%, maintaining a constant mean cellular MPO concentration throughout maturation from promyelocyte to neutrophil-like forms. For monocytic maturation induced by low-dose ARA-c, the mean NSE increased substantially, while cell volume remained constant. Unlike MPO concentration, NSE was truly inducible and thus a useful quantitative measure of maturation caused by low-dose ARA-c. Flow cytochemistry and cytofluorometry may be developed to allow for quantitative monitoring of therapeutic trials of induced maturation in human leukemias. However, this will require adapting these techniques to the complexity of human leukemias in vivo, and the necessity of handling heterogeneous populations encountered in bone marrow samples.
Liu, Yongliang; Thibodeaux, Devron; Gamble, Gary; Bauer, Philip; VanDerveer, Don
2012-08-01
Despite considerable efforts in developing curve-fitting protocols to evaluate the crystallinity index (CI) from X-ray diffraction (XRD) measurements, in its present state XRD can only provide a qualitative or semi-quantitative assessment of the amounts of crystalline or amorphous fraction in a sample. The greatest barrier to establishing quantitative XRD is the lack of appropriate cellulose standards, which are needed to calibrate the XRD measurements. In practice, samples with known CI are very difficult to prepare or determine. In a previous study, we reported the development of a simple algorithm for determining fiber crystallinity information from Fourier transform infrared (FT-IR) spectroscopy. Hence, in this study we not only compared the fiber crystallinity information between FT-IR and XRD measurements, by developing a simple XRD algorithm in place of a time-consuming and subjective curve-fitting process, but we also suggested a direct way of determining cotton cellulose CI by calibrating XRD with the use of CI(IR) as references.
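The paper's own simple XRD algorithm is not given in the abstract; as an example of a curve-fitting-free crystallinity estimate for cellulose, the sketch below applies the classical Segal peak-height method to a synthetic diffraction pattern. This is a standard literature method, not necessarily the algorithm developed in the study above.

```python
# Segal peak-height crystallinity index, CI = (I_002 - I_am) / I_002, evaluated
# on a synthetic cellulose-like pattern (crystalline 002 peak plus amorphous halo).
import numpy as np

two_theta = np.linspace(5, 40, 700)                          # degrees 2-theta (synthetic)
pattern = (np.exp(-((two_theta - 22.7) / 1.2) ** 2) * 1000   # 002 crystalline peak
           + np.exp(-((two_theta - 18.0) / 4.0) ** 2) * 300  # broad amorphous halo
           + 50)                                             # background

i_002 = pattern[np.argmin(np.abs(two_theta - 22.7))]   # height near 22.7 deg 2-theta
i_am = pattern[np.argmin(np.abs(two_theta - 18.0))]    # amorphous intensity near 18 deg

ci = (i_002 - i_am) / i_002
print(f"Segal crystallinity index = {ci:.2f}")
```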
Automatic vertebral bodies detection of x-ray images using invariant multiscale template matching
NASA Astrophysics Data System (ADS)
Sharifi Sarabi, Mona; Villaroman, Diane; Beckett, Joel; Attiah, Mark; Marcus, Logan; Ahn, Christine; Babayan, Diana; Gaonkar, Bilwaj; Macyszyn, Luke; Raghavendra, Cauligi
2017-03-01
Lower back pain and related pathologies are among the most common reasons for referral to a neurosurgical clinic in both the developed and the developing world. Quantitative evaluation of these pathologies is a challenge. Image-based measurements of angles, vertebral heights, and disks could provide a potential quantitative biomarker for tracking and measuring these pathologies. Detection of vertebral bodies is a key element and is the focus of the current work. Among the variety of medical imaging techniques, MRI and CT scans have typically been used for developing image segmentation methods. However, CT scans deliver a large dose of x-rays, increasing cancer risk [8]. MRI can be substituted for CT when the risk is high [8], but it is difficult to obtain in smaller facilities due to cost and lack of expertise in the field [2]. Plain x-rays provide another option, with the ability to control the x-ray dosage, especially for young people, and their accessibility for smaller facilities. Hence, the ability to create quantitative biomarkers from x-ray data is especially valuable. Here, we develop a multiscale template matching approach, inspired by [9], to detect the centers of vertebral bodies in x-ray data. The immediate application of such detection lies in developing quantitative biomarkers and in querying similar images in a database. Previously, shape similarity classification methods have been used to address this problem, but these are challenging to use in the presence of variation due to gross pathology and even subtle effects [1].
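A hedged sketch of multiscale template matching with OpenCV, in the spirit of the approach described (not the authors' implementation): the template is slid over the image at several scales and the best normalized cross-correlation response is kept. Synthetic images stand in for the radiograph and the vertebral-body template.

```python
# Multiscale template matching: scan template scales, keep the strongest
# normalized cross-correlation peak. Synthetic stand-ins replace real data.
import cv2
import numpy as np

# Fake "radiograph" with one bright, blurred rectangular blob, and a template
# cut from a scaled copy of that blob.
xray = np.zeros((400, 300), dtype=np.uint8)
cv2.rectangle(xray, (120, 180), (180, 220), color=200, thickness=-1)
xray = cv2.GaussianBlur(xray, (15, 15), 0)
template = cv2.resize(xray[170:230, 110:190], None, fx=0.8, fy=0.8)

best = None
for scale in np.linspace(0.5, 1.5, 11):                      # scan template scales
    resized = cv2.resize(template, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)
    if resized.shape[0] >= xray.shape[0] or resized.shape[1] >= xray.shape[1]:
        continue
    response = cv2.matchTemplate(xray, resized, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)
    if best is None or max_val > best[0]:
        h, w = resized.shape
        best = (max_val, (max_loc[0] + w // 2, max_loc[1] + h // 2), scale)

score, center, scale = best
print(f"best match score {score:.2f} at pixel {center}, template scale {scale:.2f}")
```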
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helms, J.
2017-02-10
The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
Quantitative characterization of turbidity by radiative transfer based reflectance imaging
Tian, Peng; Chen, Cheng; Jin, Jiahong; Hong, Heng; Lu, Jun Q.; Hu, Xin-Hua
2018-01-01
A new and noncontact approach of multispectral reflectance imaging has been developed to inversely determine the absorption coefficient μa, the scattering coefficient μs, and the anisotropy factor g of a turbid target from one measured reflectance image. The incident beam was profiled with a diffuse reflectance standard for deriving both measured and calculated reflectance images. A GPU-implemented Monte Carlo code was developed to determine the parameters with a conjugate gradient descent algorithm, and the existence of unique solutions was shown. We noninvasively determined embedded region thickness in heterogeneous targets and estimated in vivo optical parameters of nevi from 4 patients between 500 and 950 nm for melanoma diagnosis to demonstrate the potential of quantitative reflectance imaging. PMID:29760971
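The study's forward model is a GPU Monte Carlo code, which cannot be reproduced here; the sketch below only illustrates the structure of the inverse step, fitting (μa, μs) by conjugate-gradient minimization of the misfit between a measured profile and a stand-in analytic forward model.

```python
# Structural sketch of inverse parameter fitting; `forward_model` is a placeholder
# analytic function, NOT the paper's Monte Carlo radiative transfer code.
import numpy as np
from scipy.optimize import minimize

def forward_model(params, radius):
    """Placeholder reflectance-vs-radius model used only to make the sketch run."""
    mua, mus = params
    return np.exp(-(mua + 0.1 * mus) * radius) / (1.0 + radius ** 2)

radius = np.linspace(0.5, 5.0, 40)          # radial distance (arbitrary units)
true_params = (0.05, 10.0)                  # "true" mua, mus used to fake a measurement
measured = forward_model(true_params, radius) * (1 + np.random.default_rng(3).normal(0, 0.01, radius.size))

def misfit(params):
    return np.sum((forward_model(params, radius) - measured) ** 2)

result = minimize(misfit, x0=np.array([0.02, 5.0]), method="CG")
print(f"recovered mua = {result.x[0]:.3f}, mus = {result.x[1]:.2f}")
```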
A video imaging system and the associated quantification methods have been developed for measurement of the transfers of a fluorescent tracer from surfaces to hands. The highly fluorescent compound riboflavin (Vitamin B2), which is also water soluble and non-toxic, was chosen as...
Goujon, Nicolas; Devine, Alexandra; Baker, Sally M; Sprunt, Beth; Edmonds, Tanya J; Booth, Jennifer K; Keeffe, Jill E
2014-01-01
A review of existing measurement instruments was conducted to examine their suitability to measure disability prevalence and assess quality of life, protection of disability rights and community participation by people with disabilities, specifically within the context of development programs in low and middle-income countries. From a search of PubMed and the grey literature, potentially relevant measurement instruments were identified and examined for their content and psychometric properties, where possible. Criteria for inclusion were: based on the WHO's International Classification of Functioning Disability and Health (ICF), used quantitative methods, suitable for population-based studies of disability inclusive development in English and published after 1990. Characteristics of existing instruments were analysed according to components of the ICF and quality of life domains. Ten instruments were identified and reviewed according to the criteria listed above. Each version of instruments was analysed separately. Only three instruments included a component on quality of life. Domains from the ICF that were addressed by some but not all instruments included the environment, technology and communication. The measurement instruments reviewed covered the range of elements required to measure disability-inclusion within development contexts. However no single measurement instrument has the capacity to measure both disability prevalence and changes in quality of life according to contemporary disability paradigms. The review of measurement instruments supports the need for developing an instrument specifically intended to measure disability inclusive practice within development programs. Implications for Rehabilitation Surveys and tools are needed to plan disability inclusive development. Existing measurement tools to determine prevalence of disability, wellbeing, rights and access to the community were reviewed. No single validated tool exists for population-based studies, uses quantitative methods and the components of the ICF to measure prevalence of disability, well-being of people with disability and their access to their communities. A measurement tool that reflects the UNCRPD and addresses all components of the ICF is needed to assist in disability inclusive development, especially in low and mid resource countries.
Moon, Chan Hong; Kim, Jung-Hwan; Zhao, Tiejun; Bae, Kyongtae Ty
2013-11-01
To develop quantitative dual-tuned (DT) ¹H/²³Na MRI of human knee cartilage in vivo at 7 Tesla (T). A sensitive ²³Na transceiver array RF coil was developed at 7T. B1 fields generated by the transceiver array coil were characterized and corrected in the ²³Na images. The point spread function (PSF) of the ²³Na images was measured, and the signal decrease due to the partial-volume effect was compensated in [²³Na] quantification of knee cartilage. SNR and [²³Na] in anterior femoral cartilage were measured from seven healthy subjects. SNR of the ²³Na image with the transceiver array coil was higher than that of the birdcage coil. SNR in the cartilage at 2-mm isotropic resolution was 26.80 ± 3.69 (n = 7). B1 transmission and reception fields produced by the DT coil at 7T were similar to each other. The effective full-width-half-maximum of the ²³Na image was ∼5 mm at 2-mm resolution. Mean [²³Na] was 288.13 ± 29.50 mM (n = 7) in the anterior femoral cartilage of normal subjects. We developed a new high-sensitivity ²³Na RF coil for knee MRI at 7T. Our ¹H/²³Na MRI allowed quantitative measurement of [²³Na] in knee cartilage by measuring the PSF and cartilage thickness from the ²³Na and ¹H images, respectively. Copyright © 2013 Wiley Periodicals, Inc.
Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M
2010-12-01
The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem and measurements were made for pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. NIR spectra and pre rigor predicted values obtained from quantitative models were shown to be poorly correlated against glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to its pH(u) were able to correctly categorize 42% of high pH(u) samples. Optimum qualitative and quantitative models derived from NIR spectra found low correlation between predicted values and reference measurements. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
UNiquant, a Program for Quantitative Proteomics Analysis Using Stable Isotope Labeling
Huang, Xin; Tolmachev, Aleksey V.; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A.; Smith, Richard D.; Chan, Wing C.; Hinrichs, Steven H.; Fu, Kai; Ding, Shi-Jian
2011-01-01
Stable isotope labeling (SIL) methods coupled with nanoscale liquid chromatography and high resolution tandem mass spectrometry are increasingly useful for elucidation of the proteome-wide differences between multiple biological samples. Development of more effective programs for the sensitive identification of peptide pairs and accurate measurement of the relative peptide/protein abundance are essential for quantitative proteomic analysis. We developed and evaluated the performance of a new program, termed UNiquant, for analyzing quantitative proteomics data using stable isotope labeling. UNiquant was compared with two other programs, MaxQuant and Mascot Distiller, using SILAC-labeled complex proteome mixtures having either known or unknown heavy/light ratios. For the SILAC-labeled Jeko-1 cell proteome digests with known heavy/light ratios (H/L = 1:1, 1:5, and 1:10), UNiquant quantified a similar number of peptide pairs as MaxQuant for the H/L = 1:1 and 1:5 mixtures. In addition, UNiquant quantified significantly more peptides than MaxQuant and Mascot Distiller in the H/L = 1:10 mixtures. UNiquant accurately measured relative peptide/protein abundance without the need for post-measurement normalization of peptide ratios, which is required by the other programs. PMID:21158445
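As a sketch of the core quantitation step shared by tools of this kind (not UNiquant's actual implementation), the snippet below computes heavy/light intensity ratios for identified peptide pairs and summarizes each protein as the median log2 ratio; the peptide data are hypothetical.

```python
# SILAC-style relative quantitation sketch: per-peptide heavy/light ratios,
# summarized per protein as the median log2 ratio. Intensities are invented.
import numpy as np
from collections import defaultdict

# (protein, light_intensity, heavy_intensity) for identified peptide pairs
peptide_pairs = [
    ("P1", 2.0e6, 2.1e6),
    ("P1", 1.5e6, 1.4e6),
    ("P2", 8.0e5, 4.1e6),
    ("P2", 6.5e5, 3.3e6),
]

log2_by_protein = defaultdict(list)
for protein, light, heavy in peptide_pairs:
    log2_by_protein[protein].append(np.log2(heavy / light))

for protein, ratios in log2_by_protein.items():
    print(f"{protein}: median log2(H/L) = {np.median(ratios):+.2f} ({len(ratios)} peptides)")
```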
Soni, Jalpa; Purwar, Harsh; Lakhotia, Harshit; Chandel, Shubham; Banerjee, Chitram; Kumar, Uday; Ghosh, Nirmalya
2013-07-01
A novel spectroscopic Mueller matrix system has been developed and explored for both fluorescence and elastic scattering polarimetric measurements from biological tissues. The 4 × 4 Mueller matrix measurement strategy is based on sixteen spectrally resolved (λ = 400 - 800 nm) measurements performed by sequentially generating and analyzing four elliptical polarization states. Eigenvalue calibration of the system ensured high accuracy of Mueller matrix measurement over a broad wavelength range, either for forward or backscattering geometry. The system was explored for quantitative fluorescence and elastic scattering spectroscopic polarimetric studies on normal and precancerous tissue sections from human uterine cervix. The fluorescence spectroscopic Mueller matrices yielded an interesting diattenuation parameter, exhibiting differences between normal and precancerous tissues.
NASA Astrophysics Data System (ADS)
Berthias, F.; Feketeová, L.; Della Negra, R.; Dupasquier, T.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.; Märk, T. D.
2018-01-01
The combination of the Dispositif d'Irradiation d'Agrégats Moléculaire with the correlated ion and neutral time of flight-velocity map imaging technique provides a new way to explore processes occurring subsequent to the excitation of charged nano-systems. The present contribution describes in detail the methods developed for the quantitative measurement of branching ratios and cross sections for collision-induced dissociation processes of water cluster nano-systems. These methods are based on measurements of the detection efficiency of neutral fragments produced in these dissociation reactions. Moreover, measured detection efficiencies are used here to extract the number of neutral fragments produced for a given charged fragment.
Pan, Xiaohong; Julian, Thomas; Augsburger, Larry
2006-02-10
Differential scanning calorimetry (DSC) and X-ray powder diffractometry (XRPD) methods were developed for the quantitative analysis of the crystallinity of indomethacin (IMC) in IMC and silica gel (SG) binary system. The DSC calibration curve exhibited better linearity than that of XRPD. No phase transformation occurred in the IMC-SG mixtures during DSC measurement. The major sources of error in DSC measurements were inhomogeneous mixing and sampling. Analyzing the amount of IMC in the mixtures using high-performance liquid chromatography (HPLC) could reduce the sampling error. DSC demonstrated greater sensitivity and had less variation in measurement than XRPD in quantifying crystalline IMC in the IMC-SG binary system.
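A minimal sketch of how a DSC calibration curve of this kind is typically used, assuming fusion enthalpy scales linearly with crystalline IMC content; the enthalpy values are hypothetical and not taken from the study.

```python
# DSC calibration-curve sketch: fit enthalpy vs. known crystalline content in
# physical mixtures, then invert the line for an unknown sample (invented data).
import numpy as np

known_crystalline_pct = np.array([10, 25, 50, 75, 100])
fusion_enthalpy_j_per_g = np.array([11.2, 27.5, 55.1, 82.0, 109.8])   # hypothetical

slope, intercept = np.polyfit(known_crystalline_pct, fusion_enthalpy_j_per_g, 1)
residuals = fusion_enthalpy_j_per_g - (slope * known_crystalline_pct + intercept)
r2 = 1 - residuals.var() / fusion_enthalpy_j_per_g.var()

unknown_enthalpy = 40.3
estimated_pct = (unknown_enthalpy - intercept) / slope
print(f"calibration R^2 = {r2:.4f}; unknown sample ≈ {estimated_pct:.1f}% crystalline IMC")
```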
Quantitation of protein carbonylation by dot blot.
Wehr, Nancy B; Levine, Rodney L
2012-04-15
Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is frequently measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent, 2,4-dinitrophenylhydrazine. We developed an immunochemical dot blot method for quantitation of protein carbonylation in homogenates or purified proteins. Dimethyl sulfoxide was employed as the solvent because it very efficiently extracts proteins from tissues and keeps them soluble. It also readily dissolves 2,4-dinitrophenylhydrazine and wets polyvinylidene difluoride (PVDF) membranes. The detection limit is 0.19 ± 0.04 pmol of carbonyl, and 60 ng of protein is sufficient to measure protein carbonyl content. This level of sensitivity allowed measurement of protein carbonylation in individual Drosophila. Copyright © 2012 Elsevier Inc. All rights reserved.
Pressure Distribution in Nonuniform Two-Dimensional Flow
NASA Technical Reports Server (NTRS)
Schwabe, M.
1943-01-01
In an attempt to follow the time rate of change of the processes in turbulent flows by quantitative measurements, the measurement of the pressure is often beset with insuperable difficulties for the reason that the speeds, and hence the pressures to be measured, are often very small. On the other hand, the measurement of very small pressures requires, at least, considerable time, so that the follow-up of periodically varying processes is as good as impossible. In order to obviate these difficulties a method, suggested by Prof. Prandtl, has been developed by which the pressure distribution is simply determined from the photographic flow picture. This method is described and proved on a worked-out example. It was found that quantitatively very satisfactory results can be achieved.
Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc
2017-05-01
Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.
NASA Astrophysics Data System (ADS)
Yamauchi, Toyohiko; Iwai, Hidenao; Yamashita, Yutaka
2011-11-01
We demonstrate tomographic imaging of intracellular activity of living cells by a low-coherent quantitative phase microscope. The intracellular organelles, such as the nucleus, nucleolus, and mitochondria, are moving around inside living cells, driven by the cellular physiological activity. In order to visualize the intracellular motility in a label-free manner we have developed a reflection-type quantitative phase microscope which employs the phase shifting interferometric technique with a low-coherent light source. The phase shifting interferometry enables us to quantitatively measure the intensity and phase of the optical field, and the low-coherence interferometry makes it possible to selectively probe a specific sectioning plane in the cell volume. The results quantitatively revealed the depth-resolved fluctuations of intracellular surfaces so that the plasma membrane and the membranes of intracellular organelles were independently measured. The transversal and the vertical spatial resolutions were 0.56 μm and 0.93 μm, respectively, and the mechanical sensitivity of the phase measurement was 1.2 nanometers. The mean-squared displacement was applied as a statistical tool to analyze the temporal fluctuation of the intracellular organelles. To the best of our knowledge, our system visualized depth-resolved intracellular organelles motion for the first time in sub-micrometer resolution without contrast agents.
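A short sketch of the mean-squared displacement statistic mentioned above, applied to a synthetic membrane-height time series; the real input would be the depth-resolved phase (height) fluctuations measured by the microscope.

```python
# Mean-squared displacement (MSD) of a fluctuating surface-height trace h(t);
# the trace, frame interval, and lag range are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
dt = 0.1                                            # s between frames (assumed)
height_nm = np.cumsum(rng.normal(0, 1.2, 600))      # synthetic fluctuation trace (nm)

def msd(trace, max_lag):
    lags = np.arange(1, max_lag + 1)
    return lags, np.array([np.mean((trace[lag:] - trace[:-lag]) ** 2) for lag in lags])

lags, msd_values = msd(height_nm, max_lag=50)
# For free diffusion the MSD grows linearly with lag time; a plateau indicates confinement.
slope = np.polyfit(lags * dt, msd_values, 1)[0]
print(f"MSD slope ≈ {slope:.1f} nm^2/s over the first {lags[-1] * dt:.1f} s of lag")
```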
Performance and Maqasid al-Shari'ah's Pentagon-Shaped Ethical Measurement.
Bedoui, Houssem Eddine; Mansour, Walid
2015-06-01
Business performance is traditionally viewed from a one-dimensional financial angle. This paper develops a new approach that links performance to the ethical vision of Islam based on maqasid al-shari'ah (i.e., the objectives of Islamic law). The approach involves a pentagon-shaped performance scheme structured around five pillars, namely wealth, posterity, intellect, faith, and human self. Such a scheme ensures that any firm or organization can ethically contribute to the promotion of human welfare, prevent corruption, and enhance social and economic stability, rather than merely maximize its own performance in terms of financial return. A quantitative measure of ethical performance is developed. It shows, perhaps surprisingly, that a firm or organization pursuing only the financial aspect at the expense of the others performs poorly. The paper further discusses practical instances of quantitative measurement of the ethical aspects of the system taken at an aggregate level.
Development and psychometric evaluation of a quantitative measure of "fat talk".
MacDonald Clarke, Paige; Murnen, Sarah K; Smolak, Linda
2010-01-01
Based on her anthropological research, Nichter (2000) concluded that it is normative for many American girls to engage in body self-disparagement in the form of "fat talk." The purpose of the present two studies was to develop a quantitative measure of fat talk. A series of 17 scenarios were created in which "Naomi" is talking with a female friend(s) and there is an expression of fat talk. College women respondents rated the frequency with which they would behave in a similar way as the women in each scenario. A nine-item one-factor scale was determined through principal components analysis and its scores yielded evidence of internal consistency reliability, test-retest reliability over a five-week time period, construct validity, discriminant validity, and incremental validity in that it predicted unique variance in body shame and eating disorder symptoms above and beyond other measures of self-objectification. Copyright 2009 Elsevier Ltd. All rights reserved.
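For a nine-item, one-factor scale of this kind, internal consistency is commonly summarized with Cronbach's alpha. The sketch below shows the standard computation; the simulated responses are purely illustrative and are not the study's data.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (n_respondents, n_items) matrix of Likert ratings."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Example with simulated (uncorrelated) responses from 200 participants on 9 items;
# real scale data with correlated items would yield a much higher alpha.
rng = np.random.default_rng(1)
scores = rng.integers(1, 6, size=(200, 9))
print(round(cronbach_alpha(scores), 2))
```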
The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs ... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative ... methodology report. The report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT ...
Developing Non-Targeted Measurement Methods to Characterize the Human Exposome
The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...
Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.
Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia
2016-01-01
A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is used to capture the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow the best match of a sample's color to a European Pharmacopoeia reference color solution to be calculated. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in three-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay were found suitable for assessing the color of drug substances and drug products and are comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to judge the best match of the sample's color to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. In this study, we established a statistical method for assessing precision in three-dimensional space and demonstrated that the quantitative spectral method is comparable with respect to precision and accuracy to the current European Pharmacopoeia visual assessment method. © PDA, Inc. 2016.
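The best-match step described above amounts to a nearest-neighbor search in L*a*b* space using a color-difference metric such as CIE76 delta E*ab. A minimal sketch follows; the reference values below are hypothetical placeholders, not actual European Pharmacopoeia values.

```python
import numpy as np

# Hypothetical L*a*b* values for a few named reference solutions (illustration only).
reference_solutions = {
    "B1": (40.0, 25.0, 40.0),
    "B2": (55.0, 18.0, 35.0),
    "Y5": (92.0, -2.0, 18.0),
}

def closest_reference(sample_lab, references):
    """Return the reference name and CIE76 delta E*ab of the closest color match."""
    sample = np.array(sample_lab, dtype=float)
    name, lab = min(references.items(),
                    key=lambda kv: np.linalg.norm(sample - np.array(kv[1])))
    return name, float(np.linalg.norm(sample - np.array(lab)))

print(closest_reference((56.0, 17.0, 33.0), reference_solutions))
```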
ERIC Educational Resources Information Center
Davis, Gregory A.; Lucente, Joe
2012-01-01
Many Extension leadership development programs have been evaluated for effectiveness. Little literature exists focusing on the evaluation of leadership development programs involving elected and appointed local officials. This article describes an annual program involving elected and appointed local officials and shares quantitative and…
ERIC Educational Resources Information Center
Khourey-Bowers, Claudia; Fenk, Christopher
2009-01-01
The purpose of this study was to explore the relationship between teachers' (N = 69) participation in constructivist chemistry professional development (PD) and enhancement of content (CK) and pedagogical content knowledge (PCK) (representational thinking and conceptual change strategies) and self-efficacy (PSTE). Quantitative measures assessed…
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profile of patients. A pixel coregistered simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements especially with objects with unknown emissivity. Even the dualband measurement can provide inaccurate results due to the fact that emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote object.
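The rationale for a dual-band (and ultimately multi-band) camera is that a band ratio cancels an unknown but wavelength-independent emissivity. Below is a hedged sketch of ratio thermometry using Planck's law; the band wavelengths and temperature bracket are illustrative assumptions, not the actual QWIP response bands.

```python
import numpy as np
from scipy.optimize import brentq

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck constant, speed of light, Boltzmann constant

def planck(lam, T):
    """Blackbody spectral radiance (W sr^-1 m^-3) at wavelength lam (m) and temperature T (K)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

def two_band_temperature(r_measured, lam1=4.8e-6, lam2=8.4e-6):
    """Solve for T such that the MWIR/LWIR radiance ratio matches the measurement.

    Assumes a grey body, so the unknown emissivity cancels in the ratio.
    """
    f = lambda T: planck(lam1, T) / planck(lam2, T) - r_measured
    return brentq(f, 200.0, 2000.0)

r = planck(4.8e-6, 310.0) / planck(8.4e-6, 310.0)
print(round(two_band_temperature(r), 1))  # recovers ~310 K
```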
Creating an automated tool for measuring software cohesion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tutton, J.M.; Zucconi, L.
1994-05-06
Program modules with high complexity tend to be more error prone and more difficult to understand. These factors increase maintenance and enhancement costs. Hence, a tool that can help programmers determine a key factor in module complexity should be very useful. Our goal is to create a software tool that will automatically give a quantitative measure of the cohesiveness of a given module, and hence give us an estimate of the "maintainability" of that module. The Tool will use a metric developed by Professors Linda M. Ott and James M. Bieman. The Ott/Bieman metric gives quantitative measures that indicate the degree of functional cohesion using abstract data slices.
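As a rough illustration of the idea behind slice-based cohesion metrics (a simplification, not the actual Ott/Bieman definitions or the Tool's implementation), one can measure the fraction of a module's statements that lie on the data slice of every output variable:

```python
def strong_functional_cohesion(slices, n_statements):
    """Fraction of statements common to every output variable's data slice.

    slices       : list of sets of statement indices, one data slice per output
    n_statements : total number of statements in the module
    A crude stand-in for functional-cohesion measures based on data slices.
    """
    if not slices or n_statements == 0:
        return 0.0
    common = set.intersection(*slices)
    return len(common) / n_statements

# Toy module with 10 statements and two output slices sharing 4 statements.
print(strong_functional_cohesion([{0, 1, 2, 3, 5, 7}, {0, 1, 2, 3, 8, 9}], 10))  # 0.4
```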
NASA Technical Reports Server (NTRS)
Fletcher, D. G.; Mcdaniel, J. C.
1987-01-01
A preliminary quantitative study of the compressible flowfield in a steady, nonreacting model SCRAMJET combustor using laser-induced iodine fluorescence (LIIF) is reported. Measurements of density, temperature, and velocity were conducted with the calibrated, nonintrusive, optical technique for two different combustor operating conditions. First, measurements were made in the supersonic flow over a rearward-facing step without transverse injection for comparison with calculated pressure profiles. The second configuration was staged injection behind the rearward-facing step at an injection dynamic pressure ratio of 1.06. These experimental results will be used to validate computational fluid dynamic (CFD) codes being developed to model supersonic combustor flowfields.
Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.
Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K
2016-11-01
Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.
Macroscopic crack formation and extension in pristine and artificially aged PBX 9501
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Cheng; Thompson, Darla G
2010-01-01
A technique has been developed to quantitatively describe macroscopic cracks, both their location and extent, in heterogeneous high explosive and mock materials. By combining this technique with deformation field measurement using digital image correlation (DIC), we observe and measure the initiation, extension, and coalescence of internal cracks during compression of Brazilian disks made of pristine and artificially aged PBX 9501 high explosives. Our results show quantitatively that aged PBX 9501 is not only weaker but also much more brittle than the pristine material, and thus more susceptible to macroscopic cracking.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2013-0064] Cooperative Research and Development... Research of Purpose Bred Explosive Detection Canines AGENCY: Science and Technology Directorate... canines; understanding, collection and analysis of quantitative behavior trait measurement; application of...
Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun
2018-06-01
To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups for regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive values (PPV) for bifurcations, and 85% sensitivity and PPV for stenosis. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery, as well as vascular group in each of the MRA images, could be extracted to comprehensively reflect characteristics, distribution, and connectivity of arteries. Length for the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D-MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions. Magn Reson Med 79:3229-3238, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose: Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods: One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results: The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion: The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
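The AUC metric described is, in essence, numerical integration of a log-scaled, peak-normalized luminance profile. A hypothetical sketch of such a computation is shown below; the exact normalization and integration limits used in the study are not reproduced here.

```python
import numpy as np

def halo_auc(radius_deg, luminance, r_max=None):
    """Area under the log of the peak-normalized halo luminance profile.

    radius_deg : radial distance from the simulated headlight center (degrees)
    luminance  : photometer luminance at each radius (cd/m^2)
    """
    r = np.asarray(radius_deg, dtype=float)
    lum = np.asarray(luminance, dtype=float)
    profile = np.log10(lum / lum.max())          # normalize to peak, take log
    if r_max is not None:
        keep = r <= r_max
        r, profile = r[keep], profile[keep]
    return float(np.trapz(profile, r))
```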
DOT National Transportation Integrated Search
2016-11-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms of ...
DOT National Transportation Integrated Search
2018-04-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms of ...
DOT National Transportation Integrated Search
2017-04-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms of ...
DOT National Transportation Integrated Search
2015-01-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms...
DOT National Transportation Integrated Search
2017-04-01
The Federal Motor Carrier Safety Administration (FMCSA), in cooperation with the John A. Volpe National Transportation Systems Center (Volpe), has developed a quantitative model to measure the effectiveness of motor carrier interventions in terms of ...
Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization
NASA Astrophysics Data System (ADS)
Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija
2017-07-01
Recently, spectral imaging techniques such as multispectral imaging (MSI) and hyperspectral imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of a technique that can provide a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results suggest that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed selection of the best pre-treatment method to be applied to the data.
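Regression models of the kind described (for example, predicting cellulose DP from NIR spectra) are typically built with partial least squares and assessed by cross-validation. The sketch below uses scikit-learn with synthetic stand-in data; the data shapes and component count are assumptions, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: (n_samples, n_wavelengths) NIR spectra; y: reference DP values (both synthetic here).
rng = np.random.default_rng(0)
X = rng.normal(size=(105, 240))
y = rng.uniform(300, 2500, size=105)

pls = PLSRegression(n_components=8)            # component count is an assumption
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
rmsecv = float(np.sqrt(np.mean((y - y_cv) ** 2)))
print(f"RMSECV = {rmsecv:.0f} DP units")
```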
A Flow Cytometry-Based Assay for Quantifying Non-Plaque Forming Strains of Yellow Fever Virus
Hammarlund, Erika; Amanna, Ian J.; Dubois, Melissa E.; Barron, Alex; Engelmann, Flora; Messaoudi, Ilhem; Slifka, Mark K.
2012-01-01
Primary clinical isolates of yellow fever virus can be difficult to quantitate by standard in vitro methods because they may not form discernible plaques or induce a measurable cytopathic effect (CPE) on cell monolayers. In our hands, the Dakar strain of yellow fever virus (YFV-Dakar) could not be measured by plaque assay (PA), focus-forming assay (FFA), or by measurement of CPE. For these reasons, we developed a YFV-specific monoclonal antibody (3A8.B6) and used it to optimize a highly sensitive flow cytometry-based tissue culture limiting dilution assay (TC-LDA) to measure levels of infectious virus. The TC-LDA was performed by incubating serial dilutions of virus in replicate wells of C6/36 cells and staining the cells intracellularly for virus with MAb 3A8.B6. Using this approach, we could reproducibly quantitate YFV-Dakar in tissue culture supernatants as well as in the serum of viremic rhesus macaques experimentally infected with YFV-Dakar. Moreover, the TC-LDA approach was >10-fold more sensitive than the standard plaque assay for quantitating typical plaque-forming strains of YFV, including YFV-17D and YFV-FNV (French neurotropic vaccine). Together, these results indicate that the TC-LDA technique is effective for quantitating both plaque-forming and non-plaque-forming strains of yellow fever virus, and this methodology may be readily adapted for the study and quantitation of other non-plaque-forming viruses. PMID:23028428
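Limiting-dilution titers of the kind a TC-LDA reports are conventionally estimated under single-hit Poisson statistics. The following sketch illustrates that generic calculation; it is not the authors' exact statistical procedure, and the dilution series and well counts are hypothetical.

```python
import numpy as np

def tclda_titer(dilutions, n_wells, n_negative, volume_ml):
    """Estimate infectious units per mL from a limiting-dilution assay.

    Under single-hit Poisson statistics, the fraction of virus-negative wells at
    relative dilution d is exp(-c * v * d), where c is the stock concentration
    and v the inoculum volume. A log-linear fit through the origin recovers c.
    """
    d = np.asarray(dilutions, dtype=float)               # e.g. 1e-3, 1e-4, ...
    frac_neg = np.asarray(n_negative) / np.asarray(n_wells)
    frac_neg = np.clip(frac_neg, 1e-6, 1 - 1e-6)          # avoid log(0)
    slope = np.sum(d * -np.log(frac_neg)) / np.sum(d * d)  # slope = c * v
    return slope / volume_ml

print(tclda_titer([1e-3, 1e-4, 1e-5], [24, 24, 24], [4, 18, 23], 0.1))
```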
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation
ERIC Educational Resources Information Center
Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine
2006-01-01
This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…
Recent developments in dimensional nanometrology using AFMs
NASA Astrophysics Data System (ADS)
Yacoot, Andrew; Koenders, Ludger
2011-12-01
Scanning probe microscopes, in particular the atomic force microscope (AFM), have developed into sophisticated instruments that, throughout the world, are no longer used just for imaging, but for quantitative measurements. A role of the national measurement institutes has been to provide traceable metrology for these instruments. This paper presents a brief overview as to how this has been achieved, highlights the future requirements for metrology to support developments in AFM technology and describes work in progress to meet this need.
Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.
Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W
2017-01-01
Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.
QDMR: a quantitative method for identification of differentially methylated regions by entropy
Zhang, Yan; Liu, Hongbo; Lv, Jie; Xiao, Xue; Zhu, Jiang; Liu, Xiaojuan; Su, Jianzhong; Li, Xia; Wu, Qiong; Wang, Fang; Cui, Ying
2011-01-01
DNA methylation plays critical roles in transcriptional regulation and chromatin remodeling. Differentially methylated regions (DMRs) have important implications for development, aging, and disease. Therefore, genome-wide mapping of DMRs across various temporal and spatial methylomes is important in revealing the impact of epigenetic modifications on heritable phenotypic variation. We present a quantitative approach, quantitative differentially methylated regions (QDMR), to quantify methylation difference and identify DMRs from genome-wide methylation profiles by adapting Shannon entropy. QDMR was applied to synthetic methylation patterns and to methylation profiles detected by methylated DNA immunoprecipitation microarray (MeDIP-chip) in human tissues/cells. This approach gives a reasonable quantitative measure of methylation difference across multiple samples. A DMR threshold was then determined from a methylation probability model. Using this threshold, QDMR identified 10,651 tissue DMRs related to genes enriched for cell differentiation, including 4740 DMRs not identified by the method developed by Rakyan et al. QDMR can also measure the sample specificity of each DMR. Finally, application to methylation profiles detected by reduced representation bisulphite sequencing (RRBS) in mouse demonstrated the platform-free and species-free nature of QDMR. This approach provides an effective tool for the high-throughput identification of potential functional regions involved in epigenetic regulation. PMID:21306990
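The core idea of adapting Shannon entropy to methylation profiles can be illustrated in a few lines; this is a simplified sketch of the concept, not the published QDMR formulation or its threshold model.

```python
import numpy as np

def methylation_entropy(levels, eps=1e-12):
    """Shannon entropy of one region's methylation levels across samples.

    Low entropy means methylation concentrated in few samples (sample-specific DMR);
    maximal entropy (log2 of the sample count) means uniform methylation.
    """
    m = np.asarray(levels, dtype=float) + eps
    p = m / m.sum()
    return float(-(p * np.log2(p)).sum())

print(methylation_entropy([0.9, 0.05, 0.05, 0.05]))  # low entropy, sample-specific
print(methylation_entropy([0.5, 0.5, 0.5, 0.5]))     # maximal entropy (= 2 bits)
```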
Hoffmann, Uwe; Pfeifer, Frank; Hsuing, Chang; Siesler, Heinz W
2016-05-01
The aim of this contribution is to demonstrate the transfer of spectra that have been measured on two different laboratory Fourier transform near-infrared (FT-NIR) spectrometers to the format of a handheld instrument by measuring only a few samples with both spectrometer types. Thus, despite the extreme differences in spectral range and resolution, spectral data sets collected over a long period on a laboratory instrument, and the quantitative and qualitative calibrations developed from them, can be conveniently transferred to the handheld system. The necessity to prepare completely new calibration samples, and the effort required to develop calibration models when changing hardware platforms, is thereby minimized. The enabling procedure is based on piecewise direct standardization (PDS) and is described for the data sets of a quantitative and a qualitative application case study. For this purpose the spectra measured on the FT-NIR laboratory spectrometers were used as "master" data and transferred to the "target" format of the handheld instrument. The quantitative test study refers to transmission spectra of three-component liquid solvent mixtures, whereas the qualitative application example encompasses diffuse reflection spectra of six different current polymers. To prove the performance of the transfer procedure for quantitative applications, partial least squares (PLS-1) calibrations were developed for the individual components of the solvent mixtures with spectra transferred from the master to the target instrument, and the cross-validation parameters were compared with the corresponding parameters obtained for spectra measured on the master and target instruments, respectively. To test the retention of the discrimination ability of the transferred polymer spectra sets, principal component analyses (PCAs) were applied exemplarily to three of the six investigated polymers, and identification of all polymers was demonstrated by Mahalanobis distance plots. © The Author(s) 2016.
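Piecewise direct standardization builds, for each wavelength of the destination format, a local regression from a small window of source wavelengths using standardization samples measured on both instruments. The simplified sketch below assumes both spectral axes have already been interpolated to a common length, which glosses over the real range and resolution mismatch handled in the study.

```python
import numpy as np

def pds_transform(master_std, target_std, window=5, ridge=1e-6):
    """Piecewise direct standardization: map master-format spectra to target format.

    master_std, target_std : standardization samples measured on both instruments,
                             each of shape (n_samples, n_wavelengths) on a common axis.
    Returns a function converting new master spectra to the target format.
    """
    n = master_std.shape[1]
    F = np.zeros((n, n))
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        Xw = master_std[:, lo:hi]
        # Ridge-regularized least squares for the local transfer coefficients.
        b = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(hi - lo),
                            Xw.T @ target_std[:, i])
        F[lo:hi, i] = b
    return lambda spectra: spectra @ F
```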
Self-guided training for deep brain stimulation planning using objective assessment.
Holden, Matthew S; Zhao, Yulong; Haegelen, Claire; Essert, Caroline; Fernandez-Vidal, Sara; Bardinet, Eric; Ungi, Tamas; Fichtinger, Gabor; Jannin, Pierre
2018-04-04
Deep brain stimulation (DBS) is an increasingly common treatment for neurodegenerative diseases. Neurosurgeons must have thorough procedural, anatomical, and functional knowledge to plan electrode trajectories and thus ensure treatment efficacy and patient safety. Developing this knowledge requires extensive training. We propose a training approach with objective assessment of neurosurgeon proficiency in DBS planning. To assess proficiency, we propose analyzing both the viability of the planned trajectory and the manner in which the operator arrived at the trajectory. To improve understanding, we suggest a self-guided training course for DBS planning using real-time feedback. To validate the proposed measures of proficiency and training course, two experts and six novices followed the training course, and we monitored their proficiency measures throughout. At baseline, experts planned higher quality trajectories and did so more efficiently. As novices progressed through the training course, their proficiency measures increased significantly, trending toward expert measures. We developed and validated measures which reliably discriminate proficiency levels. These measures are integrated into a training course, which quantitatively improves trainee performance. The proposed training course can be used to improve trainees' proficiency, and the quantitative measures allow trainees' progress to be monitored.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.
Measurement of ice number concentration in clouds is important but still challenging. Stratiform mixed-phase clouds (SMCs) provide a simple scenario for retrieving ice number concentration from remote sensing measurements. The simple ice generation and growth pattern in SMCs offers opportunities to use cloud radar reflectivity (Ze) measurements and other cloud properties to infer ice number concentration quantitatively. To understand the strong temperature dependency of ice habit and growth rate quantitatively, we develop a 1-D ice growth model to calculate the ice diffusional growth along its falling trajectory in SMCs. The radar reflectivity and fall velocity profiles of ice crystals calculated from the 1-D ice growth model are evaluated with the Atmospheric Radiation Measurements (ARM) Climate Research Facility (ACRF) ground-based high vertical resolution radar measurements. Combining Ze measurements and 1-D ice growth model simulations, we develop a method to retrieve the ice number concentrations in SMCs at given cloud top temperature (CTT) and liquid water path (LWP). The retrieved ice concentrations in SMCs are evaluated with in situ measurements and with a three-dimensional cloud-resolving model simulation with a bin microphysical scheme. These comparisons show that the retrieved ice number concentrations are within an uncertainty of a factor of 2, statistically.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
NREL's new imaging tool could provide manufacturers with insight on their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime with a lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This technique contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements. Such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL images of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24. NREL developed a new tool to quantitatively map minority-carrier lifetime of multicrystalline silicon bricks by using photoluminescence imaging in conjunction with resonance-coupled photoconductive decay measurements. Researchers are not hindered by surface recombination and can look deeper into the material to map bulk lifetimes. The tool is being applied to silicon bricks in a project collaborating with a U.S. photovoltaic industry partner. Photovoltaic manufacturers can use the NREL tool to obtain valuable feedback on their silicon ingot casting processes.
Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko
2018-03-01
Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein 341-367), owing to the difficulty of producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for its specific quantitation. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1 439-456) and PDA044 (kininogen-1 438-456) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not differ significantly between patients and controls, 3D plotting of PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combination analysis using both PDA071 and PDA039/044 concentrations allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.
Progress in quantitative GPR development at CNDE
NASA Astrophysics Data System (ADS)
Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott
2014-02-01
Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Changes in the Levels of Cryspovirus During In Vitro Development of Cryptosporidium parvum
USDA-ARS?s Scientific Manuscript database
The purpose of this study was to develop and utilize semi-quantitative RT-PCR and PCR assays for measuring the level of cryspovirus, the viral symbiont of Cryptosporidium parvum, during in vitro development of the protozoan. Cultures of human carcinoma cells (HCT-8) were inoculated with excysting C...
ERIC Educational Resources Information Center
Kelly, Martina; Bennett, Deirdre; Muijtjens, Arno; O'Flynn, Siun; Dornan, Tim
2015-01-01
Clinical clerks learn more than they are taught and not all they learn can be measured. As a result, curriculum leaders evaluate clinical educational environments. The quantitative Dundee Ready Environment Measure (DREEM) is a "de facto" standard for that purpose. Its 50 items and 5 subscales were developed by consensus. Reasoning that…
Zheng, Zhi; Luo, Yuling; McMaster, Gary K
2006-07-01
Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.
Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel
2005-05-01
Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures to models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.
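One of the computer-based texture measures mentioned, fractal dimension, is commonly estimated by box counting on a segmented density mask. A generic sketch follows; it is not the study's implementation, and the box sizes are arbitrary choices.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary mask by box counting.

    mask : 2-D boolean array (True where dense tissue was segmented).
    Returns the slope of log N(s) versus log(1/s), i.e. the box-counting dimension.
    """
    counts = []
    for s in box_sizes:
        h = (mask.shape[0] // s) * s
        w = (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return float(slope)
```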
SERS quantitative urine creatinine measurement of human subject
NASA Astrophysics Data System (ADS)
Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da
2005-03-01
SERS methods for biomolecular analysis have several potential advantages over traditional biochemical approaches, including less specimen contact, non-destructiveness to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative measurement of human urine creatinine. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine between 1400 cm-1 and 1500 cm-1 was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant concentration range (55.9 mg/dl to 208 mg/dl). The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for creatinine detection in human urine and establishes the SERS platform technique for measurement of bodily fluids.
Li, Qian; Magers, Tobias; King, Brad; Engel, Brian J; Bakhtiar, Ray; Green, Charisse; Shoup, Ronald
2018-06-15
Sensitive LC-MS/MS methods were developed to measure lidocaine and its metabolite 2,6-dimethylaniline (2,6-DMA) with application to transdermal studies. The methods for lidocaine in minipig plasma, tissue biopsies, and dermal tapes utilized mixed mode/SCX solid phase extraction, with lower quantitation limits of 25 pg/mL in plasma, 15 ng/g tissue, and 5 ng/tape. 2,6-DMA was measured in plasma and skin tissue homogenates by ultrafiltration and (for tissue) by further derivatization with 4-methoxybenzoyl chloride to form the corresponding benzamide derivative, which extended the lower limit of quantitation to 200 pg/mL. The methods allowed local measurement of lidocaine in stratum corneum, punch biopsies, and plasma and of 2,6-DMA in plasma and biopsies obtained from minipigs dosed with experimental transdermal formulations. Quantitation limits were approximately 7-fold lower than previously reported for lidocaine and 3-fold lower for 2,6-DMA. Copyright © 2018 Elsevier B.V. All rights reserved.
Resolving High Amplitude Surface Motion with Diffusing Light
NASA Technical Reports Server (NTRS)
Wright, W.; Budakian, R.; Putterman, Seth J.
1996-01-01
A new technique has been developed for imaging high-amplitude surface motion. With this method one can quantitatively measure the transition to ripple wave turbulence. In addition, one can measure the phase of the turbulent state. These experiments reveal strong coherent structures in the turbulent range of motion.
An Investigation of the Eighteenth-Century Achromatic Telescope
ERIC Educational Resources Information Center
Jaecks, Duane H.
2010-01-01
The optical quality and properties of over 200 telescopes residing in museums and private collections have been measured and tested with the goal of obtaining new information about the early development of the achromatic lens (1757-1770). Quantitative measurements of the chromatic and spherical aberration of telescope objective lenses were made…
ERIC Educational Resources Information Center
Lane, Erin S.; Harris, Sara E.
2015-01-01
The authors developed a classroom observation protocol for quantitatively measuring student engagement in large university classes. The Behavioral Engagement Related to instruction (BERI) protocol can be used to provide timely feedback to instructors as to how they can improve student engagement in their classrooms.
Job Satisfaction of Employees at a Christian University
ERIC Educational Resources Information Center
Schroder, Ralph
2008-01-01
As part of this quantitative study, a survey questionnaire was mailed out to 835 university employees to measure levels of overall, intrinsic, and extrinsic job satisfaction. The survey included items of the Professional Satisfaction Scale, an instrument developed according to Herzberg's two-factor theory. Responses were measured on a 5-point…
Personal Responsibility: An Integrative Review of Conceptual and Measurement Issues of the Construct
ERIC Educational Resources Information Center
Mergler, Amanda
2017-01-01
This integrative literature review examines educational research that focuses on quantitative measurement of the construct of personal responsibility, in studies which included participants from middle childhood to young adulthood. National policies in education are increasingly concerned with values and skills that students develop through their…
ERIC Educational Resources Information Center
Hulsey, John D.
2010-01-01
This study uses a quantitative approach to evaluate the trustworthiness of e-businesses as measured by the E-business Trustworthy Index, EBTI, developed as part of this research. The problem is that despite the importance of e-business trustworthiness and the findings from many studies, there are few if any objective measures that evaluate the…
Psychometric Support for the Ownership in Exercise and Empowerment in Exercise Scales
ERIC Educational Resources Information Center
Moore, E. Whitney G.; Fry, Mary D.
2014-01-01
This study's purpose was to examine the psychometric properties of two new scales developed to quantitatively measure participants' ownership in exercise classes and empowerment with respect to exercise. These two outcome measures will complement Achievement Goal Perspective Theory (AGPT)-grounded research to better understand participants'…
Li, Ning; Zhao, Wei-Guo; Pu, Chun-Hua; Yang, Wen-Lei
2018-01-01
This prospective study quantitatively measured the cerebellar retraction factors, including retraction distance, depth and duration, and evaluated their potential relationship to the development of hearing loss after microvascular decompression (MVD) for hemifacial spasm (HFS). One hundred ten patients with primary HFS who underwent MVD in our department were included in this study. The cerebellar retraction factors were quantitatively measured on preoperative MR and timed during MVD. Associations of cerebellar retraction and other factors with postoperative hearing loss were analyzed. Eleven (10%) patients developed hearing loss after MVD. Compared with the group without hearing loss, the cerebellar retraction distance, depth and duration of the group with hearing loss were significantly greater (p < 0.05). Multivariate regression analysis showed that greater cerebellar retraction depth and longer retraction duration were significantly associated with a higher incidence of postoperative hearing impairment (p < 0.05). This study strongly suggested a correlation between the cerebellar retraction factors, especially retraction depth and duration, and the likelihood of hearing loss following MVD for HFS.
Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations
NASA Astrophysics Data System (ADS)
Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert
2017-01-01
The process of galaxy assembly is a central question in astronomy; a variety of potentially important effects contribute, including baryonic accretion from the intergalactic medium as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy's light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to "observe" the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
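As an illustration of one of these quantitative morphology measures, the Gini coefficient of a galaxy's pixel flux distribution can be computed as follows; this is a standard formulation offered as a sketch, not the presenters' analysis pipeline.

```python
import numpy as np

def gini(flux):
    """Gini coefficient of pixel fluxes (Lotz-style definition for galaxy images).

    G = 0 for a perfectly uniform light distribution; G -> 1 when the light is
    concentrated in a single pixel.
    """
    x = np.sort(np.abs(np.ravel(flux)))
    n = x.size
    i = np.arange(1, n + 1)
    return float(np.sum((2 * i - n - 1) * x) / (np.mean(x) * n * (n - 1)))

print(round(gini(np.ones(100)), 3))                 # ~0.0 (uniform)
print(round(gini(np.r_[np.zeros(99), 1.0]), 3))     # ~1.0 (concentrated)
```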
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurement are systematically collected, quantitative arguments provide far more benefit than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
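To make the probabilistic reasoning concrete, the fragment below combines two items of evidence about a single claim using Bayes' rule with made-up conditional probabilities. It is a toy illustration of quantified confidence, not the GSN/BN construction proposed in the paper.

```python
# Toy confidence calculation: prior belief in a safety claim updated by two
# independent evidence items (all probabilities are illustrative assumptions).
p_claim = 0.5                                      # prior that the claim holds
p_e_given_claim = {"tests": 0.95, "review": 0.90}  # P(evidence passes | claim true)
p_e_given_not = {"tests": 0.30, "review": 0.40}    # P(evidence passes | claim false)

likelihood_true = p_claim
likelihood_false = 1.0 - p_claim
for item in ("tests", "review"):                   # both evidence items observed as "passed"
    likelihood_true *= p_e_given_claim[item]
    likelihood_false *= p_e_given_not[item]

posterior = likelihood_true / (likelihood_true + likelihood_false)
print(f"Posterior confidence in the claim: {posterior:.2f}")
```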
Wu, Ed X.; Tang, Haiying; Tong, Christopher; Heymsfield, Steve B.; Vasselli, Joseph R.
2015-01-01
This study aimed to develop a quantitative, in vivo magnetic resonance imaging (MRI) approach to investigate the muscle growth effects of anabolic steroids. A protocol of MRI acquisition on a standard clinical 1.5 Tesla scanner and quantitative image analysis was established and employed to measure individual muscle and organ volumes in intact and castrated guinea pigs undergoing a 16-week treatment protocol with two well-documented anabolic steroids, testosterone and nandrolone, delivered via implanted silastic capsules. High correlations between the in vivo MRI and postmortem dissection measurements were observed for the shoulder muscle complex (R = 0.86), masseter (R = 0.79), temporalis (R = 0.95), neck muscle complex (R = 0.58), prostate gland and seminal vesicles (R = 0.98), and testis (R = 0.96). Furthermore, the longitudinal MRI measurements yielded adequate sensitivity to detect the restoration of growth to or towards normal in castrated guinea pigs when circulating steroid levels were restored to physiological or slightly higher levels, as expected. These results demonstrated that quantitative MRI using a standard clinical scanner provides accurate and sensitive measurement of individual muscles and organs, and this in vivo MRI protocol in conjunction with the castrated guinea pig model constitutes an effective platform to investigate the longitudinal and cross-sectional growth effects of other potential anabolic steroids. The quantitative MRI protocol developed can also be readily adapted for human studies on most clinical MRI scanners to investigate anabolic steroid growth effects, or to monitor changes in individual muscle and organ volume and geometry following injury, strength training, neuromuscular disorders, and pharmacological or surgical interventions. PMID:18241900
Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.
Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan
2017-01-01
Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, compared with qualitative MR imaging, which relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. The article presents a review of the basic physical concepts used in MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging. Subsequently, the article reviews the hierarchical organization of major applicable pulse sequences used in this article, with the sequences organized into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. While this new concept offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences used for quantitative MR imaging, with a focus on the hierarchical organization of these sequences. © RSNA, 2017.
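As a simple example of the parameter mapping that quantitative MR imaging performs, a voxel's T2 can be estimated from a multi-echo series by fitting a mono-exponential decay. The sketch below uses a log-linear least-squares fit and assumes noise-free, positive signals; it is an illustration of the principle, not a clinical fitting routine.

```python
import numpy as np

def fit_t2(te_ms, signal):
    """Voxel-wise T2 estimate from a multi-echo series, S(TE) = S0 * exp(-TE / T2).

    te_ms  : echo times in milliseconds
    signal : measured (positive) signal intensities
    Returns (T2 in ms, S0).
    """
    te = np.asarray(te_ms, dtype=float)
    y = np.log(np.asarray(signal, dtype=float))
    slope, intercept = np.polyfit(te, y, 1)
    return -1.0 / slope, float(np.exp(intercept))

te = [20, 40, 60, 80, 100]                    # echo times in ms
s = [1000 * np.exp(-t / 80.0) for t in te]    # synthetic signal with T2 = 80 ms
print(fit_t2(te, s))                          # ~ (80.0, 1000.0)
```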
Quantitation of Fine Displacement in Echography
NASA Astrophysics Data System (ADS)
Masuda, Kohji; Ishihara, Ken; Yoshii, Ken; Furukawa, Toshiyuki; Kumagai, Sadatoshi; Maeda, Hajime; Kodama, Shinzo
1993-05-01
A high-speed digital subtraction echography system was developed to visualize the fine displacement of human internal organs. This method indicates differences in position through time-series images of high-frame-rate echography. Fine displacement smaller than the ultrasonic wavelength can be observed. This method, however, lacks the ability to quantitatively measure displacement length. The subtraction between two successive images was affected by the displacement direction even when the displacement length was the same. To solve this problem, convolution of the echogram with a Gaussian distribution was used. To express displacement length quantitatively as brightness, normalization using a brightness gradient was applied. The quantitation algorithm was applied to successive B-mode images. Compared to simply subtracted images, the quantitated images express the motion of organs more precisely. Expansion of the carotid artery and fine motion of ventricular walls can be visualized more easily. Displacement length can be quantitated on the scale of the wavelength. Under more static conditions, this system quantitates displacement lengths that are much smaller than the wavelength.
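As a rough illustration of the normalization step described in the abstract above (smoothing successive frames, then dividing the frame-to-frame brightness difference by the local brightness gradient so that brightness expresses displacement length), the following Python sketch may help; the function name, parameter values, and gradient operator are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def displacement_map(frame_prev, frame_next, sigma=2.0, eps=1e-6):
    """Rough per-pixel displacement estimate between two successive B-mode
    frames: temporal brightness difference normalized by the local spatial
    brightness gradient (sketch of the subtraction/normalization idea)."""
    # Gaussian smoothing makes brightness vary gradually with displacement
    # instead of jumping abruptly at speckle boundaries.
    a = gaussian_filter(frame_prev.astype(float), sigma)
    b = gaussian_filter(frame_next.astype(float), sigma)
    diff = b - a                                          # temporal brightness change
    grad = np.hypot(sobel(a, axis=0), sobel(a, axis=1))   # spatial gradient magnitude
    return diff / (grad + eps)                            # ~ displacement along the gradient
```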
Franzen, Lutz; Anderski, Juliane; Windbergs, Maike
2015-09-01
For rational development and evaluation of dermal drug delivery, knowledge of the rate and extent of substance penetration into the human skin is essential. However, current analytical procedures are destructive, labor-intensive, and lack a defined spatial resolution. In this context, confocal Raman microscopy bears the potential to overcome current limitations in drug depth profiling. Confocal Raman microscopy already proved its suitability for the acquisition of qualitative penetration profiles, but a comprehensive investigation regarding its suitability for quantitative measurements inside the human skin is still missing. In this work, we present a systematic validation study to deploy confocal Raman microscopy for quantitative drug depth profiling in human skin. After we validated our Raman microscopic setup, we successfully established an experimental procedure that allows correlating the Raman signal of a model drug with its controlled concentration in human skin. To overcome current drawbacks in drug depth profiling, we evaluated different modes of peak correlation for quantitative Raman measurements and offer a suitable operating procedure for quantitative drug depth profiling in human skin. In conclusion, we successfully demonstrate the potential of confocal Raman microscopy for quantitative drug depth profiling in human skin as a valuable alternative to destructive state-of-the-art techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
Quantitative tomographic measurements of opaque multiphase flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN
2000-03-01
An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
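The abstract notes that the formula relating conductivity data to phase volume fractions was slightly modified, but the formula itself is not given. A common default for a non-conducting dispersed phase is Maxwell's mixture relation, sketched below purely to illustrate that conversion step (not necessarily the relation used in the study).

```python
def maxwell_volume_fraction(sigma_mixture, sigma_liquid):
    """Dispersed-phase volume fraction from the measured mixture conductivity
    and the continuous-phase (liquid) conductivity, assuming a non-conducting
    dispersed phase (Maxwell relation):
        phi = 2*(sigma_c - sigma_m) / (2*sigma_c + sigma_m)."""
    return 2.0 * (sigma_liquid - sigma_mixture) / (2.0 * sigma_liquid + sigma_mixture)

# Example: a mixture conductivity 10% below the liquid value implies phi ~ 0.07.
print(maxwell_volume_fraction(0.90, 1.00))
```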
The application of absolute quantitative (1)H NMR spectroscopy in drug discovery and development.
Singh, Suruchi; Roy, Raja
2016-07-01
The identification of a drug candidate and its structural determination is the most important step in the process of drug discovery, and for this, nuclear magnetic resonance (NMR) is one of the most selective analytical techniques. The present review illustrates the various perspectives of absolute quantitative (1)H NMR spectroscopy in drug discovery and development. It deals with the fundamentals of quantitative NMR (qNMR), the physicochemical properties affecting qNMR, and the latest referencing techniques used for quantification. The precise application of qNMR during various stages of drug discovery and development, namely natural product research, drug quantitation in dosage forms, drug metabolism studies, impurity profiling, and solubility measurements, is elaborated. To achieve this, the authors explore the literature of NMR in drug discovery and development between 1963 and 2015. It also takes into account several other reviews on the subject. qNMR experiments are used in drug discovery and development processes because qNMR is a non-destructive, versatile, and robust technique with low intra- and interpersonal variability. However, there are also several limitations. qNMR of complex biological samples suffers from peak overlap and a low limit of quantification, which can be overcome by using hyphenated chromatographic techniques in addition to NMR.
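For readers unfamiliar with absolute qNMR, the standard internal-standard relation underlying such quantification is sketched below; the function and variable names are illustrative, not taken from the review.

```python
def qnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std=1.0):
    """Analyte purity by 1H qNMR with an internal standard:
        P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (m_std/m_a) * P_std
    I: integrated peak areas, N: number of protons contributing to each peak,
    M: molar masses (g/mol), m: weighed masses (mg), P_std: standard purity."""
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std
```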
Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J
2016-01-01
To assess the relationship of quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system, and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent an MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For the quantitative measures of disc height and signal intensity, a "raw" score and 2 adjusted ratios were calculated, and the relationship with Pfirrmann scores was assessed. The inter-rater reliability of the quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height, only Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however, disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.
Pargett, Michael; Umulis, David M
2013-07-15
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
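As one concrete instance of the "optimal scaling" idea mentioned above: when a measurement is proportional to the modeled quantity with an unknown gain, that gain has a closed-form least-squares value. The sketch below is a generic illustration under that assumption, not the authors' implementation.

```python
import numpy as np

def scaled_sse(model, data):
    """Optimally scale model predictions to unit-free data and return the
    scale factor plus the resulting sum of squared errors.
    beta minimizes ||data - beta * model||^2, i.e. beta = (m . d) / (m . m)."""
    model = np.asarray(model, dtype=float)
    data = np.asarray(data, dtype=float)
    beta = np.dot(model, data) / np.dot(model, model)
    return beta, float(np.sum((data - beta * model) ** 2))
```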
[Quantitative study of the prothallial morphogenesis in Asplenium species].
Henriet, M; Auquière, J P; Moens, P
1976-01-01
A previous paper presented a qualitative analysis of gametophytic development in nine Asplenium species. In the present quantitative study, we specify the parental relationships among these species. The surface area of the gametophyte and the number of marginal hairs increase differently in each species. The density of the marginal hairs depends on the species considered. The relation among the morphological gametophytic parameters is constant within a given group of species. Principal component analysis was performed on all the parameters measured during prothallial development. It confirms, from a morphological point of view, the parental relationships among the diploid and tetraploid species.
NASA Astrophysics Data System (ADS)
Li, Xuesong; Northrop, William F.
2016-04-01
This paper describes a quantitative approach to approximating multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov chain approximation approach to solve for the multiple-scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple-scattering research and the development of deterministic reconstruction algorithms based on AD measurements.
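A toy sketch of the general Markov-chain idea (discretize the scattering angle into bins, then propagate the angular distribution through repeated applications of a single-scattering transition matrix) is given below; the published operator additionally treats the slab geometry and absorption, which are not reproduced here.

```python
import numpy as np

def angular_distribution(transition, p0, n_events):
    """Angular distribution after n scattering events.
    transition[i, j]: probability of scattering from angle bin i to angle bin j
    (each row sums to 1); p0: incident angular distribution over the bins."""
    P = np.linalg.matrix_power(np.asarray(transition, dtype=float), n_events)
    return np.asarray(p0, dtype=float) @ P
```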
Multispectral analysis of ocean dumped materials
NASA Technical Reports Server (NTRS)
Johnson, R. W.
1977-01-01
Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.
Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A
2007-03-27
A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.
Amaechi, Bennett T; Podoleanu, Adrian Gh; Komarov, Gleb; Higham, Susan M; Jackson, David A
2004-01-01
The use of transverse microradiography (TMR) to quantify the amount of mineral lost during demineralization of tooth tissue has long been established. In the present study, the use of an en-face Optical Coherence Tomography (OCT) technology to detect and quantitatively monitor the mineral changes in root caries was investigated and correlated with TMR. We used an OCT system, developed initially for retina imaging, and which can collect A-scans, B-scans (longitudinal images) and C-scans (en-face images) to quantitatively assess the development of root caries. The power to the sample was 250 microW, wavelength lambda = 850 nm and the optical source linewidth was 16 microm. Both the transversal and longitudinal images showed the caries lesion as volumes of reduced reflectivity. Quantitative analysis using the A-scan (reflectivity versus depth curve) showed that the tissue reflectivity decreased with increasing demineralization time. A linear correlation (r = 0.957) was observed between the mineral loss measured by TMR and the percentage reflectivity loss in demineralized tissue measured by OCT. We concluded that OCT could be used to detect incipient root caries, and that the reflectivity loss in root tissue during demineralization, measured by OCT, could be related to the amount of mineral lost during the demineralization.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Luchins, Daniel
2012-01-01
The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.
Learning Communities: An Untapped Sustainable Competitive Advantage for Higher Education
ERIC Educational Resources Information Center
Dawson, Shane; Burnett, Bruce; O' Donohue, Mark
2006-01-01
Purpose: This paper demonstrates the need for the higher education sector to develop and implement scaleable, quantitative measures that evaluate community and establish organisational benchmarks in order to guide the development of future practices designed to enhance the student learning experience. Design/methodology/approach: Literature…
Probing sub-alveolar length scales with hyperpolarized-gas diffusion NMR
NASA Astrophysics Data System (ADS)
Miller, Wilson; Carl, Michael; Mooney, Karen; Mugler, John; Cates, Gordon
2009-05-01
Diffusion MRI of the lung is a promising technique for detecting alterations of normal lung microstructure in diseases such as emphysema. The length scale being probed using this technique is related to the time scale over which the helium-3 or xenon-129 diffusion is observed. We have developed new MR pulse sequence methods for making diffusivity measurements at sub-millisecond diffusion times, allowing one to probe smaller length scales than previously possible in-vivo, and opening the possibility of making quantitative measurements of the ratio of surface area to volume (S/V) in the lung airspaces. The quantitative accuracy of simulated and experimental measurements in microstructure phantoms will be discussed, and preliminary in-vivo results will be presented.
Application of NIR spectroscopy in the assessment of diabetic foot disorders
NASA Astrophysics Data System (ADS)
Schleicher, Eckhard; Hampel, Uwe; Freyer, Richard
2001-10-01
Diabetic foot syndrome (DFS) is a common sequel of long-term diabetes mellitus. There is an urgent need for noninvasive, objective, and quantitative diagnostic tools to assess tissue viability and perfusion for a successful therapy. NIR spectroscopy appears well suited to measure local capillary hemoglobin saturation of the outer extremities in patients with progressive diabetic disorders. We investigate how NIR spectroscopy can be applied to the assessment of diabetic foot problems such as neuropathy and angiopathy, using spatially resolved spectroscopy in conjunction with a specially developed continuous-wave laser spectrometer. Comparison of intra- and interindividual measurements is expected to yield quantitative measures of local tissue viability, which is a prerequisite for a successful therapy.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial- and high-digital-resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of the opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
Development of a two-wavelength IR laser absorption diagnostic for propene and ethylene
NASA Astrophysics Data System (ADS)
Parise, T. C.; Davidson, D. F.; Hanson, R. K.
2018-05-01
A two-wavelength infrared laser absorption diagnostic for non-intrusive, simultaneous, quantitative measurement of propene and ethylene was developed. To this end, measurements of the absorption cross sections of propene and potentially interfering species at 10.958 µm were acquired at high temperatures. When used in conjunction with existing absorption cross-section measurements of ethylene and other species at 10.532 µm, a two-wavelength diagnostic was developed to simultaneously measure propene and ethylene, the two small alkenes found to generally dominate the final decomposition products of many hydrocarbon fuel pyrolysis systems. Measurement of these two species is demonstrated using this two-wavelength diagnostic scheme for propene decomposition between 1360 and 1710 K.
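At its core, such a two-wavelength, two-species diagnostic amounts to a Beer-Lambert system solved at each time step; a minimal sketch follows. The cross-section layout, units, and variable names are assumptions for illustration, and real data reduction also has to account for additional interfering absorbers.

```python
import numpy as np

def two_species_concentrations(A_10958, A_10532, sigma, path_length_cm):
    """Solve Beer-Lambert absorbances at two wavelengths for two species.
    sigma: 2x2 absorption cross sections (cm^2/molecule),
      [[sigma_propene(10.958 um), sigma_ethylene(10.958 um)],
       [sigma_propene(10.532 um), sigma_ethylene(10.532 um)]]
    A_*: measured absorbances, -ln(I/I0).
    Returns [n_propene, n_ethylene] number densities in molecules/cm^3."""
    A = np.array([A_10958, A_10532], dtype=float)
    return np.linalg.solve(np.asarray(sigma, dtype=float) * path_length_cm, A)
```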
Dunbar, Richard L.; Goel, Harsh; Tuteja, Sony; Song, Wen-Liang; Nathanson, Grace; Babar, Zeeshan; Lalic, Dusanka; Gelfand, Joel M.; Rader, Daniel J.; Grove, Gary L.
2017-01-01
Though cardioprotective, niacin monotherapy is limited by unpleasant cutaneous symptoms mimicking dermatitis: niacin-associated skin toxicity (NASTy). Niacin is prototypical of several emerging drugs suffering off-target rubefacient properties whereby agonizing the GPR109A receptor on cutaneous immune cells provokes vasodilation, prompting skin plethora and rubor, as well as dolor, tumor, and calor, and systemically, heat loss, frigor, chills, and rigors. Typically, NASTy effects are described by subjective patient-reported perception, at best semi-quantitative and bias-prone. Conversely, objective, quantitative, and unbiased methods measuring NASTy stigmata would facilitate research to abolish them, motivating development of several objective methods. In early drug development, such methods might better predict clinical tolerability in larger clinical trials. Measuring cutaneous stigmata may also aid investigations of vasospastic, ischemic, and inflammatory skin conditions. We present methods to measure NASTy physical stigmata to facilitate research into novel niacin mimetics/analogs, detailing characteristics of each technique following niacin, and how NASTy stigmata relate to symptom perception. We gave niacin orally and measured rubor by colorimetry and white-light spectroscopy, plethora by laser Doppler flowmetry, and calor/frigor by thermometry. Surprisingly, each stigma’s abruptness predicted symptom perception, whereas peak intensity did not. These methods are adaptable to study other rubefacient drugs or dermatologic and vascular disorders. PMID:28119443
Instrumented toys for studying power and precision grasp forces in infants.
Serio, S M; Cecchi, F; Boldrini, E; Laschi, C; Sgandurra, G; Cioni, G; Dario, P
2011-01-01
Currently, the study of infants' grasping development is purely clinical, based on functional scales or on observation of the infant while playing; no quantitative variables are measured or known for the diagnosis of potentially disturbed development. The aim of this work is to present the results of a longitudinal study carried out using a "baby gym" composed of a set of instrumented toys, as a tool to measure and stimulate grasping actions in infants from 4 to 9 months of life. The study was carried out with 7 healthy infants, and an increase in precision grasp and a reduction in power grasp with age were observed during development. Moreover, the forces applied in performing both precision and power grasps increased with age. The proposed devices represent a valid tool for continuous, quantitative measurement of infants' manual function and motor development without distressing the infant; consequently, the system could be suitable for early intervention training during the first year of life. The same system could, in fact, be used with infants at high risk for developmental motor disorders in order to evaluate any potential differences from healthy control infants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Young-Min; Pennycook, Stephen J.; Borisevich, Albina Y.
2017-04-29
Octahedral tilt behavior is increasingly recognized as an important contributing factor to the physical behavior of perovskite oxide materials and especially their interfaces, necessitating the development of high-resolution methods of tilt mapping. There are currently two major approaches for quantitative imaging of tilts in scanning transmission electron microscopy (STEM), bright field (BF) and annular bright field (ABF). In this study, we show that BF STEM can be reliably used for measurements of oxygen octahedral tilts. While optimal conditions for BF imaging are more restricted with respect to sample thickness and defocus, we find that BF imaging with an aberration-corrected microscope with the accelerating voltage of 300 kV gives us the most accurate quantitative measurement of the oxygen column positions. Using the tilted perovskite structure of BiFeO3 (BFO) as our test sample, we simulate BF and ABF images in a wide range of conditions, identifying the optimal imaging conditions for each mode. Finally, we show that unlike ABF imaging, BF imaging remains directly quantitatively interpretable for a wide range of the specimen mistilt, suggesting that it should be preferable to the ABF STEM imaging for quantitative structure determination.
Advances in Quantitative Proteomics of Microbes and Microbial Communities
NASA Astrophysics Data System (ADS)
Waldbauer, J.; Zhang, L.; Rizzo, A. I.
2015-12-01
Quantitative measurements of gene expression are key to developing a mechanistic, predictive understanding of how microbial metabolism drives many biogeochemical fluxes and responds to environmental change. High-throughput RNA-sequencing can afford a wealth of information about transcript-level expression patterns, but it is becoming clear that expression dynamics are often very different at the protein level where biochemistry actually occurs. These divergent dynamics between levels of biological organization necessitate quantitative proteomic measurements to address many biogeochemical questions. The protein-level expression changes that underlie shifts in the magnitude, or even the direction, of metabolic and biogeochemical fluxes can be quite subtle and test the limits of current quantitative proteomics techniques. Here we describe methodologies for high-precision, whole-proteome quantification that are applicable to both model organisms of biogeochemical interest that may not be genetically tractable, and to complex community samples from natural environments. Employing chemical derivatization of peptides with multiple isotopically-coded tags, this strategy is rapid and inexpensive, can be implemented on a wide range of mass spectrometric instrumentation, and is relatively insensitive to chromatographic variability. We demonstrate the utility of this quantitative proteomics approach in application to both isolates and natural communities of sulfur-metabolizing and photosynthetic microbes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1990-09-01
This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P. I. and M. Cooper, Co-P.I. during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.
Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun
2015-10-02
Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. To show a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
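One simplified way to read the FADR idea is sketched below: count "discoveries" exceeding a fold-change cutoff in the case-vs-control comparison and in the experimentally measured null (replicate-vs-replicate) comparison, and take their ratio. This is an assumed, simplified estimator for illustration only, not the authors' exact procedure.

```python
import numpy as np

def fadr(case_vs_control_ratios, null_ratios, fold_cutoff=1.5):
    """Simplified false altered-protein discovery rate: discoveries expected
    from the experimental null alone, relative to discoveries made in the
    case-vs-control comparison at the same fold-change cutoff."""
    t = np.log2(fold_cutoff)
    discoveries = np.sum(np.abs(np.log2(case_vs_control_ratios)) >= t)
    null_hits = np.sum(np.abs(np.log2(null_ratios)) >= t)
    return null_hits / max(discoveries, 1)
```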
NASA Technical Reports Server (NTRS)
Mulhall, B. D. L.
1980-01-01
The development of quantitative criteria used to evaluate conceptual systems for automating the functions of the FBI Identification Division is described. Specific alternative systems for automation were compared by using these criteria, defined as Measures of Effectiveness (MOE), to gauge each system's performance in attempting to achieve certain goals. The MOE, essentially measurement tools developed through the combination of suitable parameters, pertain to each conceivable area of system operation. The methods and approaches used, both in selecting the parameters and in applying the resulting MOE, are described.
Morris, S; Li, Y; Smith, J A M; Dube', S; Burbridge, C; Symonds, T
2017-05-16
Fibromyalgia (FM), a disorder characterized by chronic widespread pain and tenderness, affects greater than five million individuals in the United States alone. Patients experience multiple symptoms in addition to pain, and among them, fatigue is one of the most bothersome and disabling. There is a growing body of literature suggesting that fatigue is a multidimensional concept. Currently, to our knowledge, no multidimensional Patient Reported Outcome (PRO) measure of FM-related fatigue meets Food and Drug Administration (FDA) requirements to support a product label claim. Therefore, the objective of this research was to evaluate qualitative and quantitative data previously gathered to inform the development of a comprehensive, multidimensional, PRO measure to assess FM-related fatigue in FM clinical trials. Existing qualitative and quantitative data from three previously conducted studies in patients with FM were reviewed to inform the initial development of a multidimensional PRO measure of FM-related fatigue: 1) a concept elicitation study involving in-depth, open-ended interviews with patients with FM in the United States (US) (N = 20), Germany (N = 10), and France (N = 10); 2) a cognitive debriefing and pilot study of a preliminary pool of 23 items (N = 20 US patients with FM); and 3) a methodology study that explored initial psychometrics of the item pool (N = 145 US patients with FM). Five domains were identified that intend to capture the broad experience of FM-related fatigue reported in the qualitative research: the Global Fatigue Experience, Cognitive Fatigue, Physical Fatigue, Motivation, and Impact on Function. Seventeen of the original pool of 23 items were selected to best capture these five dimensions. These 17 items formed the basis of a newly developed multidimensional PRO measure to assess FM-related fatigue in clinical trials: the Multidimensional Daily Diary of Fatigue-Fibromyalgia-17 (MDF-Fibro-17). Qualitative analysis, and preliminary quantitative item level data, confirmed that FM-related fatigue is multidimensional and provided strong support for the content validity of the MDF-Fibro-17. The next stage was to quantitatively evaluate the measure to confirm the factor structure, psychometric properties, sensitivity to change, and meaningful change. This has been conducted and is being reported separately.
Coagulation monitoring based on blood elastic measurement using optical coherence tomography
NASA Astrophysics Data System (ADS)
Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping
2017-02-01
Blood coagulation monitoring is important for diagnosing hematological and cardiovascular diseases and for predicting the risk of bleeding and excessive clotting. In this study, we developed a system to dynamically monitor blood coagulation and quantitatively determine coagulation function through measurement of blood elasticity. As the blood forms a clot from a liquid, an ultrasonic force induces a shear wave, which is detected by optical coherence tomography (OCT). The coagulation of porcine whole blood recalcified by calcium chloride is assessed using the metrics of reaction time, clot formation kinetics, and maximum shear modulus. The optical coherence elastography (OCE) system can noninvasively monitor blood coagulation and quantitatively determine coagulation function.
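For context, the maximum shear modulus in such shear-wave elastography follows directly from the measured shear wave speed; a minimal sketch, assuming a typical whole-blood density, is given below.

```python
def shear_modulus(shear_wave_speed_m_s, density_kg_m3=1060.0):
    """Shear modulus (Pa) from the measured shear wave speed, G = rho * c^2.
    The blood density of ~1060 kg/m^3 is an assumed typical value."""
    return density_kg_m3 * shear_wave_speed_m_s ** 2

# Example: a 2 m/s shear wave corresponds to about 4.2 kPa.
print(shear_modulus(2.0))
```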
Jesse, Stephen; Kalinin, Sergei V; Nikiforov, Maxim P
2013-07-09
An approach for the thermomechanical characterization of phase transitions in polymeric materials (polyethyleneterephthalate) by band excitation acoustic force microscopy is developed. This methodology allows the independent measurement of resonance frequency, Q factor, and oscillation amplitude of a tip-surface contact area as a function of tip temperature, from which the thermal evolution of tip-surface spring constant and mechanical dissipation can be extracted. A heating protocol maintained a constant tip-surface contact area and constant contact force, thereby allowing for reproducible measurements and quantitative extraction of material properties including temperature dependence of indentation-based elastic and loss moduli.
Microsatellite primers for the Pacific Northwest conifer Callitropsis nootkatensis (Cupressaceae)
Tar N. Jennings; Brian J. Knaus; Katherine Alderman; Paul E. Hennon; David V. D’Amore; Richard. Cronn
2013-01-01
Microsatellite primers were developed for Nootka cypress (Callitropsis nootkatensis) to provide quantitative measures for gene conservation that can assist in guiding management decisions for a species experiencing climate-induced decline.
Electron-Beam Diagnostic Methods for Hypersonic Flow Diagnostics
NASA Technical Reports Server (NTRS)
1994-01-01
The purpose of this work was the evaluation of the use of electron-beam fluorescence for flow measurements during hypersonic flight. Both analytical and numerical models were developed in this investigation to quantitatively evaluate flow field imaging concepts based upon the electron beam fluorescence technique for use in flight research and wind tunnel applications. Specific models were developed for: (1) fluorescence excitation/emission for nitrogen, (2) rotational fluorescence spectrum for nitrogen, (3) single and multiple scattering of electrons in a variable density medium, (4) spatial and spectral distribution of fluorescence, (5) measurement of rotational temperature and density, (6) optical filter design for fluorescence imaging, and (7) temperature accuracy and signal acquisition time requirements. Application of these models to a typical hypersonic wind tunnel flow is presented. In particular, the capability of simulating the fluorescence resulting from electron impact ionization in a variable density nitrogen or air flow provides the capability to evaluate the design of imaging instruments for flow field mapping. The result of this analysis is a recommendation that quantitative measurements of hypersonic flow fields using electron-beam fluorescence are tractable with electron beam energies of 100 keV. With lower electron energies, electron scattering increases and the resulting beam divergence makes quantitative imaging difficult. The potential application of the analytical and numerical models developed in this work is in the design of a flow field imaging instrument for use in hypersonic wind tunnels or onboard a flight research vehicle.
García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel
2018-05-01
Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads and the whole volume of sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis, respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. This new ED-XRF quantitative method also demonstrated the importance of matrix effects on the accuracy of the results, making it necessary to use Certified Reference Materials as standards. The results obtained with the ED-XRF quantitative method were compared with the ones obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work prove that μ-ED-XRF spectrometers can be used not only for acquiring high-resolution elemental map distributions, but also to perform accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as destructive pretreatment in inductively coupled plasma mass spectrometry-based procedures.
A Backscatter-Lidar Forward-Operator
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland
2015-04-01
We have developed a forward operator capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations on the basis of the same measured quantity: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component for assimilating backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters, such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements, obtained by applying the backscatter-lidar forward operator to model output, are shown in this study.
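The heart of such a backscatter-lidar forward operator is the single-scattering lidar equation, which maps model profiles of backscatter and extinction to an attenuated-backscatter profile; a minimal sketch follows (the operational version also includes instrument effects and molecular scattering, which are omitted here).

```python
import numpy as np

def attenuated_backscatter(beta, alpha, dz):
    """Attenuated backscatter profile from model profiles of the backscatter
    coefficient beta (m^-1 sr^-1) and the extinction coefficient alpha (m^-1)
    on a uniform vertical grid with spacing dz (m):
        beta_att(z) = beta(z) * exp(-2 * integral_0^z alpha dz')."""
    two_way_optical_depth = 2.0 * np.cumsum(np.asarray(alpha, dtype=float)) * dz
    return np.asarray(beta, dtype=float) * np.exp(-two_way_optical_depth)
```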
The Development of a Test to Assess Drug Using Behavior.
ERIC Educational Resources Information Center
Althoff, Michael E.
The objective of the study was to develop a test which could measure both the qualitative and quantitative aspects of drug-using behavior, including such factors as attitudes toward drugs, experience with drugs, and knowledge about drugs. The Drug Use Scale was developed containing 134 items and dealing with five classes of drugs: marijuana,…
ERIC Educational Resources Information Center
Cooley, Sam J.; Burns, Victoria E.; Cumming, Jennifer
2016-01-01
This study investigates the initial development of groupwork skills through outdoor adventure education (OAE) and the factors that predict the extent of this development, using the first two levels of Kirkpatrick's model of training evaluation. University students (N = 238) completed questionnaires measuring their initial reactions to OAE (Level 1…
Comparison of estimated and measured sediment yield in the Gualala River
Matthew O’Connor; Jack Lewis; Robert Pennington
2012-01-01
This study compares quantitative erosion rate estimates developed at different spatial and temporal scales. It is motivated by the need to assess potential water quality impacts of a proposed vineyard development project in the Gualala River watershed. Previous erosion rate estimates were developed using sediment source assessment techniques by the North Coast Regional...
A Professional Development School Staff's Perceptions of Actual and Preferred Learning Environments.
ERIC Educational Resources Information Center
Kiley, Therese J.; Jensen, Rita A.
A study assessed the teaching/learning environment of one professional development school in a variety of ways that included a combination of quantitative and qualitative measures. Results were analyzed using the eight scales of the "School Level Environment Questionnaire" (SLEQ) as categories: Student Support, Affiliation, Professional…
Development of the Student Affairs Officers Work Environment Perception Scale
ERIC Educational Resources Information Center
Haynes, Derrick E.
2010-01-01
The qualitative and quantitative study developed and validated a questionnaire to measure Student Affairs Officers' (SAO) perceptions of the work environment. A review of the literature identified five major categories and 25 elements having an impact on SAOs' perceptions of the work environment. The test instrument (questionnaire) was developed…
USDA-ARS's Scientific Manuscript database
A rapid, quantitative research method using microwave-assisted probe ultrasonication was developed to facilitate the determination of total insoluble, and soluble starch in various sugar factory and refinery products. Several variables that affect starch solubilization were evaluated: 1) conductiv...
USDA-ARS's Scientific Manuscript database
A rapid, quantitative research method using microwave-assisted probe ultrasonication was developed to facilitate the determination of total insoluble, and soluble starch in various sugar crop products. Several variables that affect starch solubilization were evaluated, 1) conductive boiling time, 2...
Abstract describes a streamlined ELISA method developed to quantitatively measure 2,4-D in human urine samples. Method development steps and comparison with gas chromatography/mass spectrometry are presented. Results indicated that the ELISA method could be used as a high throu...
A Novel ImageJ Macro for Automated Cell Death Quantitation in the Retina
Maidana, Daniel E.; Tsoka, Pavlina; Tian, Bo; Dib, Bernard; Matsumoto, Hidetaka; Kataoka, Keiko; Lin, Haijiang; Miller, Joan W.; Vavvas, Demetrios G.
2015-01-01
Purpose TUNEL assay is widely used to evaluate cell death. Quantification of TUNEL-positive (TUNEL+) cells in tissue sections is usually performed manually, ideally by two masked observers. This process is time consuming, prone to measurement errors, and not entirely reproducible. In this paper, we describe an automated quantification approach to address these difficulties. Methods We developed an ImageJ macro to quantitate cell death by TUNEL assay in retinal cross-section images. The script was coded using IJ1 programming language. To validate this tool, we selected a dataset of TUNEL assay digital images, calculated layer area and cell count manually (done by two observers), and compared measurements between observers and macro results. Results The automated macro segmented outer nuclear layer (ONL) and inner nuclear layer (INL) successfully. Automated TUNEL+ cell counts were in-between counts of inexperienced and experienced observers. The intraobserver coefficient of variation (COV) ranged from 13.09% to 25.20%. The COV between both observers was 51.11 ± 25.83% for the ONL and 56.07 ± 24.03% for the INL. Comparing observers' results with macro results, COV was 23.37 ± 15.97% for the ONL and 23.44 ± 18.56% for the INL. Conclusions We developed and validated an ImageJ macro that can be used as an accurate and precise quantitative tool for retina researchers to achieve repeatable, unbiased, fast, and accurate cell death quantitation. We believe that this standardized measurement tool could be advantageous to compare results across different research groups, as it is freely available as open source. PMID:26469755
Yurkovich, James T.; Yang, Laurence; Palsson, Bernhard O.; ...
2017-03-06
Deep-coverage metabolomic profiling has revealed a well-defined development of metabolic decay in human red blood cells (RBCs) under cold storage conditions. A set of extracellular biomarkers has been recently identified that reliably defines the qualitative state of the metabolic network throughout this metabolic decay process. Here, we extend the utility of these biomarkers by using them to quantitatively predict the concentrations of other metabolites in the red blood cell. We are able to accurately predict the concentration profile of 84 of the 91 (92%) measured metabolites (p < 0.05) in RBC metabolism using only measurements of these five biomarkers. The median of prediction errors (symmetric mean absolute percent error) across all metabolites was 13%. Furthermore, the ability to predict numerous metabolite concentrations from a simple set of biomarkers offers the potential for the development of a powerful workflow that could be used to evaluate the metabolic state of a biological system using a minimal set of measurements.
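The reported 13% median prediction error is a symmetric mean absolute percent error (SMAPE); a small sketch of that metric is given below (the exact SMAPE variant used by the authors may differ slightly).

```python
import numpy as np

def smape(measured, predicted):
    """Symmetric mean absolute percent error (%) between measured and
    predicted concentration profiles."""
    m = np.asarray(measured, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(p - m) / (np.abs(m) + np.abs(p)))
```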
Photogrammetry Applied to Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.
2000-01-01
In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The Direct Linear Transformation (DLT) for camera calibration, as used in pressure-sensitive paint measurements, is summarized. An optimization method for camera calibration is developed that can determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows rapid and comprehensive in-situ camera calibration and is therefore particularly useful for quantitative flow visualization and other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
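A minimal sketch of the DLT camera calibration summarized above: the 11 DLT parameters are obtained by linear least squares from known 3D reference points and their image coordinates. Lens-distortion terms, which the paper's optimization method addresses, are omitted here, and the variable names are illustrative.

```python
import numpy as np

def dlt_calibrate(object_points, image_points):
    """Solve the 11 DLT parameters from >= 6 known 3D points (X, Y, Z) and
    their measured image coordinates (u, v) via linear least squares:
        u = (L1*X + L2*Y + L3*Z + L4) / (L9*X + L10*Y + L11*Z + 1),
    and similarly for v."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(object_points, image_points):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    L, *_ = np.linalg.lstsq(np.asarray(A, dtype=float),
                            np.asarray(b, dtype=float), rcond=None)
    return L  # parameters L1..L11
```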
Does your SEM really tell the truth?--How would you know? Part 1.
Postek, Michael T; Vladár, András E
2013-01-01
The scanning electron microscope (SEM) has gone through a tremendous evolution to become a critical tool for many and diverse scientific and industrial applications. The high resolution of the SEM is especially suited for both qualitative and quantitative applications especially for nanotechnology and nanomanufacturing. Quantitatively, measurement, or metrology is one of the main uses. It is likely that one of the first questions asked before even the first scanning electron micrograph was ever recorded was: "… how big is that?" The quality of that answer has improved a great deal over the past few years especially since today these instruments are being used as a primary measurement tool on semiconductor processing lines to monitor the manufacturing processes. The well-articulated needs of semiconductor production prompted a rapid evolution of the instrument and its capabilities. Over the past 20 years or so, instrument manufacturers, through substantial semiconductor industry investment of research and development (R&D) money, have vastly improved the performance of these instruments. All users have benefited from this investment, especially where quantitative measurements with an SEM are concerned. But, how good are these data? This article discusses some of the most important aspects and larger issues associated with imaging and measurements with the SEM that every user should know, and understand before any critical quantitative work is attempted. © Wiley Periodicals, Inc.
Quantitative fluorescence angiography for neurosurgical interventions.
Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute
2013-06-01
Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--as an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data provides a helpful intraoperative tool without requiring complex additional measurement technology.
Improving the Performance of the Listening Competency Scale: Revision and Validation
ERIC Educational Resources Information Center
Mickelson, William T.; Welch, S. A.
2013-01-01
Measuring latent traits is central to quantitative listening research and has been the focus of many studies. One such prominent measurement instrument, based on the Wolvin and Coakley (1993) listening taxonomy, was developed by Ford, Wolvin, and Chung (2000). Subsequent validation research (Mickelson & Welch, 2012) called for revisiting and…
ERIC Educational Resources Information Center
Harvey-Buschel, Phyllis
2009-01-01
The problem explored in this study was whether access to technology impacted technology integration in mathematics instruction in urban public secondary schools. Access to technology was measured by availability of computers in the classroom, teacher experience, and teacher professional development. Technology integration was measured by…
ERIC Educational Resources Information Center
Deng, Feng; Chai, Ching Sing; So, Hyo-Jeong; Qian, Yangyi; Chen, Lingling
2017-01-01
While various quantitative measures for assessing teachers' technological pedagogical content knowledge (TPACK) have been developed rapidly, few studies to date have comprehensively validated the structure of TPACK against multiple criteria of validity, especially for content-specific areas. In this paper, we examined how the TPACK survey measure is…
Phenomenology and Measurement of Circumscribed Interests in Autism Spectrum Disorders
ERIC Educational Resources Information Center
Turner-Brown, Lauren M.; Lam, Kristen S. L.; Holtzclaw, Tia N.; Dichter, Gabriel S.; Bodfish, James W.
2011-01-01
Circumscribed interests (CI) are important and understudied symptoms that affect individuals with autism spectrum disorders (ASD). The present study sought to develop quantitative measures of the content, intensity and functional impairment of CI in 50 children with high-functioning ASD compared to an age-, IQ-, and gender-matched sample of 50…
ERIC Educational Resources Information Center
Grundhoefer, Raymie
2013-01-01
The purpose of this research is twofold: (a) develop a validated measure for learning initiatives based on knowledge-creation theory and (b) conduct a quantitative study to investigate the relationships between electronic learning systems, learning-organization culture, efficacious knowledge creation (EKC), and innovativeness. Although Cheng-Chang…
ERIC Educational Resources Information Center
Borleffs, Elisabeth; Maassen, Ben A. M.; Lyytinen, Heikki; Zwarts, Frans
2017-01-01
This narrative review discusses quantitative indices measuring differences between alphabetic languages that are related to the process of word recognition. The specific orthography that a child is acquiring has been identified as a central element influencing reading acquisition and dyslexia. However, the development of reliable metrics to…
ERIC Educational Resources Information Center
Paredes, Josie Abaroa
2013-01-01
The purpose of this study was to investigate, through quantitative research, effective middle school characteristics that predict student achievement, as measured by the school-wide California API score. Characteristics were determined using an instrument developed by the Office of Superintendent of Public Instruction (OSPI), which asked middle…
Measuring the Professional Identity of Hong Kong In-Service Teachers
ERIC Educational Resources Information Center
Cheung, Hoi Yan
2008-01-01
A teacher professional identity scale was developed for Hong Kong in-service teachers to measure the professional identity of teachers. Most studies of professional identity have been qualitative. The present study tried to examine this important concept using a quantitative method. Based on various studies, one of the ways of understanding the…
Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E
2014-09-23
Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
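The abstract does not state the functional form of the concentration-dependent model. As a rough, purely illustrative sketch of that kind of analysis (invented descriptor values, not BSAI data), a polynomial fit of an adsorption descriptor against log-concentration might look like this:

```python
import numpy as np

# Invented adsorption-descriptor values at several probe concentrations
# (illustrative only; not data from the BSAI study).
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])              # probe concentration (a.u.)
descriptor = np.array([0.82, 0.74, 0.66, 0.55, 0.41, 0.33])   # adsorption descriptor

# Concentration-dependent model: quadratic in log10(concentration).
coeffs = np.polyfit(np.log10(conc), descriptor, deg=2)
model = np.poly1d(coeffs)

print("fitted coefficients:", coeffs)
print("descriptor at c = 3.0:", model(np.log10(3.0)))
print("extrapolation toward infinite dilution (c = 0.01):", model(np.log10(0.01)))
```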
NASA Astrophysics Data System (ADS)
Betsofen, S. Ya.; Kolobov, Yu. R.; Volkova, E. F.; Bozhko, S. A.; Voskresenskaya, I. I.
2015-04-01
Quantitative methods have been developed to estimate the anisotropy of the strength properties and to determine the phase composition of Mg-Al alloys. The efficiency of the methods is confirmed for MA5 alloy subjected to severe plastic deformation. It is shown that the Taylor factors calculated for basal slip, averaged over all orientations of a polycrystalline aggregate with allowance for texture, can be used for a quantitative estimation of the contribution of the texture of semifinished magnesium alloy products to the anisotropy of their strength properties. A technique for determining the composition of the solid solution and the content of the intermetallic phase Al12Mg17 is developed, using measurement of the lattice parameters of the solid solution and the known dependence of these lattice parameters on composition.
Quantitative inactivation-mechanisms of P. digitatum and A. niger spores based on atomic oxygen dose
NASA Astrophysics Data System (ADS)
Ito, Masafumi; Hashizume, Hiroshi; Ohta, Takayuki; Hori, Masaru
2014-10-01
We have quantitatively investigated the inactivation mechanisms of Penicillium digitatum and Aspergillus niger spores using an atmospheric-pressure radical source. The radical source was specially developed to supply only neutral radicals, without charged species or UV-light emission. Reactive oxygen radical densities, such as those of ground-state oxygen atoms, excited-state oxygen molecules, and ozone, were measured using VUV and UV absorption spectroscopy. The measurements and the treatments of spores were carried out in an Ar-purged chamber to eliminate the influence of OH, NOx, and other species. The results revealed that the inactivation of spores can be explained by the atomic-oxygen dose under conditions employing neutral ROS irradiation. On the basis of the dose, we have observed the changes of intracellular organelles and membrane functions using TEM, SEM, and confocal laser fluorescence microscopy. From these results, we discuss the detailed inactivation mechanisms quantitatively on the basis of atomic-oxygen dose.
Quantitative photoacoustic elasticity and viscosity imaging for cirrhosis detection
NASA Astrophysics Data System (ADS)
Wang, Qian; Shi, Yujiao; Yang, Fen; Yang, Sihua
2018-05-01
Elasticity and viscosity assessments are essential for understanding and characterizing the physiological and pathological states of tissue. In this work, by establishing a photoacoustic (PA) shear wave model, an approach for quantitative PA elasticity imaging based on measurement of the rise time of the thermoelastic displacement was developed. Combined with an existing PA viscoelasticity imaging method that features a phase delay measurement, quantitative PA elasticity and viscosity images can thus be obtained simultaneously. The method was tested and validated by imaging viscoelastic agar phantoms prepared at different agar concentrations, and the imaging data were in good agreement with rheometry results. Ex vivo experiments on liver pathological models demonstrated the capability for cirrhosis detection, and the results were consistent with the corresponding histological results. This method expands the scope of conventional PA imaging and has the potential to become an important alternative imaging modality.
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of the detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken by placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration quality is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of the natural radioactivity levels present ((40)K, (238)U and (232)Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Portable smartphone based quantitative phase microscope
NASA Astrophysics Data System (ADS)
Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu
2018-01-01
To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; a red blood cell smear, a Pap smear, broad bean epidermis sections, and monocot root were then also measured to show its performance. Owing to its accuracy, high contrast, low cost, and portability, the portable smartphone-based quantitative phase microscope is a promising tool that may in the future be adopted in remote healthcare and medical diagnosis.
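The abstract names the transport of intensity equation (TIE) but gives no implementation details. Under common simplifying assumptions (near-uniform in-focus intensity, periodic boundaries), the TIE reduces to a Poisson equation for the phase that can be solved with FFTs; the sketch below illustrates that textbook approach and is not the authors' Android application. Wavelength, pixel size, and defocus values are placeholders.

```python
import numpy as np

def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size):
    """Recover phase from two defocused images via the transport of intensity
    equation, assuming near-uniform in-focus intensity and periodic boundaries."""
    k = 2 * np.pi / wavelength
    i0 = 0.5 * (i_minus + i_plus)                 # approximate in-focus intensity
    didz = (i_plus - i_minus) / (2 * dz)          # axial intensity derivative

    ny, nx = didz.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    fxx, fyy = np.meshgrid(fx, fy)
    freq_sq = (2 * np.pi) ** 2 * (fxx ** 2 + fyy ** 2)
    freq_sq[0, 0] = 1.0                           # avoid division by zero at DC

    # Solve laplacian(phi) = -(k / I0) * dI/dz in Fourier space.
    rhs = -k * didz / i0
    phi_hat = np.fft.fft2(rhs) / (-freq_sq)
    phi_hat[0, 0] = 0.0                           # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))

# Tiny synthetic example (all numbers illustrative).
img = np.ones((128, 128))
phase = tie_phase(0.99 * img, 1.01 * img, dz=2e-6, wavelength=530e-9, pixel_size=1.1e-6)
print(phase.shape)
```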
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned using quantitative information extracted from the mathematical model and from online computation performed with it.
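Dempster's rule of combination is a standard calculation; the following minimal sketch (with invented fault hypotheses and mass values, not the authors' diagnostic models) shows how evidence from two sources over a common frame of discernment could be fused.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozensets of
    hypotheses to masses that sum to 1) using Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                   # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Illustrative example: two diagnostic models reporting on faults F1, F2, F3.
m_model1 = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.3,
            frozenset({"F1", "F2", "F3"}): 0.1}
m_model2 = {frozenset({"F2"}): 0.5, frozenset({"F1", "F2"}): 0.4,
            frozenset({"F1", "F2", "F3"}): 0.1}
print(dempster_combine(m_model1, m_model2))
```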
Kim, Paul; Lee, Ju Kang; Lim, Oh Kyung; Park, Heung Kyu; Park, Ki Deok
2017-12-01
To predict the probability of lymphedema development in breast cancer patients in the early postoperative stage, we investigated the ability of quantitative lymphoscintigraphic assessment. This retrospective study included 201 patients without lymphedema after unilateral breast cancer surgery. Lymphoscintigraphy was performed between 4 and 8 weeks after surgery to evaluate the lymphatic system in the early postoperative stage. Quantitative lymphoscintigraphy was performed using four methods: the ratio of the radiopharmaceutical clearance rate of the affected to the normal hand; the ratio of the radioactivity of the affected to the normal hand; the ratio of the radiopharmaceutical uptake rate of the affected to the normal axilla (RUA); and the ratio of the radioactivity of the affected to the normal axilla (RRA). During a 1-year follow-up, patients with a circumferential interlimb difference of 2 cm at any measurement location and a 200-mL interlimb volume difference were diagnosed with lymphedema. We investigated the difference in quantitative lymphoscintigraphic assessment between the non-lymphedema and lymphedema groups. Quantitative lymphoscintigraphic assessment revealed that RUA and RRA were significantly lower in the lymphedema group than in the non-lymphedema group. After adjusting the model for all significant variables (body mass index, N-stage, T-stage, type of surgery, and type of lymph node surgery), RRA was associated with lymphedema (odds ratio=0.14; 95% confidence interval, 0.04-0.46; p=0.001). In patients in the early postoperative stage after unilateral breast cancer surgery, quantitative lymphoscintigraphic assessment can be used to predict the probability of developing lymphedema.
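As a rough illustration of how an adjusted odds ratio such as the one reported for RRA can be obtained, the sketch below fits a logistic regression with scikit-learn and exponentiates the coefficient of interest; all data and variable names are synthetic placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic data: RRA plus two adjustment covariates for 200 hypothetical patients.
n = 200
rra = rng.uniform(0.5, 2.0, n)
bmi = rng.normal(25, 4, n)
n_stage = rng.integers(0, 3, n)
# Synthetic outcome: lower RRA raises lymphedema risk (illustrative only).
logit = -1.0 - 2.0 * (rra - 1.0) + 0.05 * (bmi - 25)
lymphedema = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([rra, bmi, n_stage])
model = LogisticRegression(C=1e6, max_iter=1000).fit(X, lymphedema)  # ~unpenalized fit

# Adjusted odds ratio for RRA = exp(regression coefficient).
print("adjusted OR per unit RRA:", np.exp(model.coef_[0][0]))
```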
Quantitative measurement of solvation shells using frequency modulated atomic force microscopy
NASA Astrophysics Data System (ADS)
Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.
2005-03-01
The nanoscale specificity of interaction measurements and the additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition, and durability of the tip during the measurements, and by problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both of these issues by combining high-aspect-ratio carbon nanotube probes with quantitative calibration of the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Owing to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.
Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena
2012-01-01
This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.
Aerodynamic measurement techniques. [laser based diagnostic techniques
NASA Technical Reports Server (NTRS)
Hunter, W. W., Jr.
1976-01-01
The laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were used to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.
Effect of 2,450 MHz microwave radiation on the development of the rat brain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inouye, M.; Galvin, M.J.; McRee, D.I.
1983-12-01
Male Sprague-Dawley rats were exposed to 2,450 MHz microwave radiation at an incident power density of 10 mW/cm2 daily for 3 hours from day 4 of pregnancy (in utero exposure) through day 40 postpartum, except for 2 days at the perinatal period. The animals were killed, and the brains removed, weighed, measured, and histologically examined at 15, 20, 30, and 40 days of age. The histologic parameters examined included the cortical architecture of the cerebral cortex, the decline of the germinal layer along the lateral ventricles, the myelination of the corpus callosum, and the decline of the external germinal layer of the cerebellar cortex. In 40-day-old rats, quantitative measurements of neurons were also made. The spine density of the pyramidal cells in layer III of the somatosensory cortex, and the density of basal dendritic trees of the pyramidal cells in layer V were measured in Golgi-Cox impregnated specimens. In addition, the density of Purkinje cells and the extent of the Purkinje cell layer in each lobule were measured in midsagittal sections of the cerebellum stained with thionin. There were no remarkable differences between microwave-exposed and control (sham-irradiated) groups for any of the histologic or quantitative parameters examined; however, the findings provide important information on quantitative measurements of the brain. The data from this study failed to demonstrate that there is a significant effect on rat brain development due to microwave exposure (10 mW/cm2) during the embryonic, fetal, and postnatal periods.
NASA Astrophysics Data System (ADS)
Yuan, Zhen; Li, Xiaoqi; Xi, Lei
2014-06-01
Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability to quantitatively image optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has focused on recovering the absorption coefficient only, by assuming the scattering coefficient to be constant. An effective method for photoacoustically recovering the optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method developed combines conventional PAT with the photon diffusion equation in a novel way to realize the recovery of the scattering coefficient. We demonstrate the method using various objects having scattering contrast only, or both absorption and scattering contrasts, embedded in turbid media. The listening-to-light-scattering method described will be able to provide high-resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.
Cleavage Entropy as Quantitative Measure of Protease Specificity
Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.
2013-01-01
A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and find that they reproduce these profiles quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure of overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of the relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
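The calculation described, a Shannon entropy per subpocket summed into a total cleavage entropy, can be sketched compactly. The example below uses a toy alignment of cleavage-site sequences rather than MEROPS data and omits any normalization the published score may apply.

```python
import math
from collections import Counter

def subpocket_entropy(column):
    """Shannon entropy (bits) of the amino-acid distribution in one subpocket."""
    counts = Counter(column)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def cleavage_entropy(substrates):
    """Sum of subpocket entropies over aligned cleavage-site sequences (e.g. P4-P4')."""
    return sum(subpocket_entropy(col) for col in zip(*substrates))

# Toy alignments of cleavage-site octapeptides (illustrative, not MEROPS-derived).
specific_protease = ["IEGRSLAV", "IEGRSMAV", "IDGRSLAV", "IEGRTLAV"]
digestive_enzyme  = ["KLAFGSTV", "RWAYGMLI", "FLQHGSPV", "YMANGKTV"]

print("specific protease:", round(cleavage_entropy(specific_protease), 2))  # low total entropy
print("digestive enzyme: ", round(cleavage_entropy(digestive_enzyme), 2))   # high total entropy
```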
Quantitative measures for redox signaling.
Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M
2016-07-01
Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes, including the antioxidant response, phosphokinase signal transduction, and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.
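As a generic illustration of the time- and concentration-dependent analyses the review refers to (not a model taken from the paper), the sketch below integrates a toy two-reaction scheme in which hydrogen peroxide oxidizes a signaling protein that is then re-reduced; the species and rate constants are assumptions chosen only to show the dose dependence.

```python
import numpy as np
from scipy.integrate import odeint

# Toy scheme: H2O2 + Prot_red -> Prot_ox  (k_ox);  Prot_ox -> Prot_red  (k_red)
k_ox, k_red = 1e5, 0.05        # illustrative rate constants (M^-1 s^-1, s^-1)

def rhs(y, t, h2o2):
    prot_red, prot_ox = y
    v_ox = k_ox * h2o2 * prot_red
    v_red = k_red * prot_ox
    return [v_red - v_ox, v_ox - v_red]

t = np.linspace(0, 200, 500)
for h2o2 in (1e-7, 1e-6, 1e-5):                        # peroxide held at fixed doses
    sol = odeint(rhs, [1e-6, 0.0], t, args=(h2o2,))
    frac_ox = sol[-1, 1] / 1e-6
    print(f"[H2O2] = {h2o2:.0e} M -> oxidized fraction ~ {frac_ox:.2f}")
```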
NASA Technical Reports Server (NTRS)
Piltch, Nancy D.; Pettegrew, Richard D.; Ferkul, Paul; Sacksteder, K. (Technical Monitor)
2001-01-01
Surface radiometry is an established technique for noncontact temperature measurement of solids. We adapt this technique to the study of solid surface combustion, where the solid fuel undergoes physical and chemical changes as pyrolysis proceeds and may additionally produce soot. The physical and chemical changes alter the fuel surface emissivity, and soot contributes to the infrared signature in the same spectral band as the signal of interest. We have developed a measurement that isolates the fuel's surface emissions in the presence of soot and determines the surface emissivity as a function of temperature. A commercially available infrared camera images the two-dimensional surface of ashless filter paper burning in concurrent flow. The camera is sensitive in the 2 to 5 μm band, but is spectrally filtered to reduce the interference from hot gas-phase combustion products. Results show a strong functional dependence of emissivity on temperature, attributed to the combined effects of thermal and oxidative processes. Using the measured emissivity, radiance measurements from several burning samples were corrected for the presence of soot and for changes in emissivity, to yield quantitative surface temperature measurements. Ultimately the results will be used to develop a full-field, non-contact temperature measurement that will be used in space-based combustion investigations.
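For readers unfamiliar with the emissivity-correction step, a heavily simplified gray-body sketch is given below: it inverts the total-radiance Stefan-Boltzmann law, whereas the actual measurement works in a filtered 2 to 5 μm band and would require a band-limited Planck integral. The numbers are illustrative.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def radiance_to_temperature(measured_exitance, emissivity):
    """Gray-body, total-radiance approximation: invert M = emissivity * sigma * T^4.
    The real camera measures a filtered 2-5 um band, so this is only a conceptual sketch."""
    return (measured_exitance / (emissivity * SIGMA)) ** 0.25

# Example: measured exitance equal to that of a 600 K blackbody; with a surface
# emissivity of 0.85 the emissivity-corrected temperature is ~625 K.
m = SIGMA * 600.0 ** 4
print(radiance_to_temperature(m, emissivity=0.85))
```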
NASA Astrophysics Data System (ADS)
Yi, Steven; Yang, Arthur; Yin, Gongjie; Wen, James
2011-03-01
In this paper, we report a novel three-dimensional (3D) wound imaging system (hardware and software) under development at Technest Inc. The system is designed to perform accurate 3D measurement and modeling of a wound and to track its healing status over time. Accurate measurement and tracking of wound healing enables physicians to assess, document, improve, and individualize the treatment plan given to each wound patient. In current wound care practice, physicians often visually inspect or roughly measure the wound to evaluate the healing status. This is not an optimal practice, since human vision lacks precision and consistency. In addition, quantifying slow or subtle changes through perception is very difficult. As a result, an instrument that quantifies both skin color and geometric shape variations would be particularly useful in helping clinicians to assess healing status and judge the effects of hyperemia, hematoma, local inflammation, secondary infection, and tissue necrosis. Once fully developed, our 3D imaging system will have several unique advantages over traditional methods for monitoring wound care: (a) non-contact measurement; (b) fast and easy to use; (c) up to 50 micron measurement accuracy; (d) 2D/3D quantitative measurements; (e) a handheld device; and (f) reasonable cost (< $1,000).
Quantitative prediction of oral cancer risk in patients with oral leukoplakia.
Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng
2017-07-11
Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We have developed an oral cancer risk index using the DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia, and cancer subjects (a training set and a validation set). Peaks were defined on the basis of first derivatives with positive values, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model, with high sensitivity (100%) and specificity (99.2%). Using the Peaks-Random Forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 less than 0.5 developed cancer (32 ± 31 months). OCRI2 is better than other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.
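The abstract describes training a random forest on cytology-derived features and using its output as a quantitative risk index with a 0.5 cut-off. A minimal scikit-learn sketch of that general workflow on placeholder features is shown below; it is not the authors' Peaks-Random Forest pipeline, and the data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Placeholder cytology features (e.g. DNA-index peak descriptors) and labels.
n_samples, n_features = 300, 8
X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n_samples) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Use the predicted probability of the high-risk class as a quantitative risk index.
risk_index = forest.predict_proba(X_test)[:, 1]
high_risk = risk_index > 0.5                       # cut-off analogous to OCRI2 > 0.5
print("fraction flagged high-risk:", high_risk.mean())
```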
Senesi, Giorgio S; Senesi, Nicola
2016-09-28
Soil organic carbon (OC) measurement is a crucial factor for quantifying soil C pools and inventories and for monitoring the inherent temporal and spatial heterogeneity and changes of soil OC content. These are relevant issues in addressing the sustainable management of terrestrial OC aimed at enhancing C sequestration in soil, thus mitigating the impact of increasing CO2 concentration in the atmosphere and related effects on global climate change. Nowadays, dry combustion by an elemental analyzer or wet combustion by dichromate oxidation of the soil sample are the most recommended and commonly used methods for quantitative soil OC determination. However, the unanimously recognized uncertainties and limitations of these classical, laborious methods have prompted research efforts focusing on the development and application of more advanced and appealing techniques and methods for the measurement of soil OC in the laboratory and possibly in situ in the field. Among these, laser-induced breakdown spectroscopy (LIBS) has raised the highest interest for its unique advantages. After an introduction and a highlight of the LIBS basic principles, instrumentation, methodologies, and supporting chemometric methods, the main body of this review provides a historical and critical overview of the developments and results obtained up to now by the application of LIBS to the quantitative measurement of soil C and especially OC content. A brief critical summary of LIBS advantages and limitations/drawbacks, including some final remarks and future perspectives, concludes this review. Copyright © 2016 Elsevier B.V. All rights reserved.
Measurement of strains by means of electro-optics holography
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.; Albertazzi, Armando, Jr.
1991-03-01
The use of a TV camera as a recording medium and the observation of whole-field displacements in real time make holographic TV a very interesting and powerful tool in a variety of areas, from NDE to research and development. The paper presents new developments in the field that add to the versatility of the technique by introducing portability and methods to obtain accurate quantitative results. Examples of applications are given for the measurement of strains both at room and at high temperatures and for strain measurements at the microscopic level.
Systematic review of empowerment measures in health promotion.
Cyril, Sheila; Smith, Ben J; Renzaho, Andre M N
2016-12-01
Empowerment, a multi-level construct comprising individual, community and organizational domains, is a fundamental value and goal in health promotion. While a range of scales have been developed for the measurement of empowerment, the qualities of these have not been rigorously assessed. The aim of this study was to evaluate the measurement properties of quantitative empowerment scales and their applicability in health promotion programs. A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was done to evaluate empowerment scales across three dimensions: item development, reliability and validity. This was followed by assessment of measurement properties using a ratings scale with criteria addressing an a priori explicit theoretical framework, assessment of content validity, internal consistency and factor analysis to test structural validity. Of the 20 studies included in this review, only 8 (40%) used literature reviews, expert panels and empirical studies to develop scale items and 9 (45%) of studies fulfilled ≥5 criteria on the ratings scale. Two studies (10%) measured community empowerment and one study measured organizational empowerment, the rest (85%) measured individual empowerment. This review highlights important gaps in the measurement of community and organizational domains of empowerment using quantitative scales. A priority for future empowerment research is to investigate and explore approaches such as mixed methods to enable adequate measurement of empowerment across all three domains. This would help health promotion practitioners to effectively measure empowerment as a driver of change and an outcome in health promotion programs. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. This article explores the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, namely sampling, measurement, and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.
Magda, Balázs; Dobi, Zoltán; Mészáros, Katalin; Szabó, Éva; Márta, Zoltán; Imre, Tímea; Szabó, Pál T
2017-06-05
The aim of this study was to develop a sensitive, reliable, and high-throughput liquid chromatography - electrospray ionization - tandem mass spectrometry (LC-ESI-MS/MS) method for the simultaneous quantitation of cortisol and cortisone in human saliva. Derivatization with 2-hydrazino-1-methylpyridine (HMP) was one of the most challenging aspects of the method development. The reagent reacted with cortisol and cortisone at 60°C within 1 h, giving mono- and bis-hydrazone derivatives. The derivatization reaction and sample preparation were investigated and discussed in detail. Improvement of method sensitivity was achieved with charged derivatization and the use of on-line solid phase extraction (on-line SPE). The lower limit of quantitation (LLOQ) was 5 and 10 pg/mL for cortisol and cortisone, respectively. The developed method was subsequently applied to clinical laboratory measurement of cortisol and cortisone in human saliva. Copyright © 2017 Elsevier B.V. All rights reserved.
Development of an optoelectronic holographic platform for otolaryngology applications
NASA Astrophysics Data System (ADS)
Harrington, Ellery; Dobrev, Ivo; Bapat, Nikhil; Flores, Jorge Mauricio; Furlong, Cosme; Rosowski, John; Cheng, Jeffery Tao; Scarpino, Chris; Ravicz, Michael
2010-08-01
In this paper, we present advances on our development of an optoelectronic holographic computing platform with the ability to quantitatively measure full-field-of-view nanometer-scale movements of the tympanic membrane (TM). These measurements can facilitate otologists' ability to study and diagnose hearing disorders in humans. The holographic platform consists of a laser delivery system and an otoscope. The control software, called LaserView, is written in Visual C++ and handles communication and synchronization between hardware components. It provides a user-friendly interface to allow viewing of holographic images with several tools to automate holography-related tasks and facilitate hardware communication. The software uses a series of concurrent threads to acquire images, control the hardware, and display quantitative holographic data at video rates and in two modes of operation: optoelectronic holography and lensless digital holography. The holographic platform has been used to perform experiments on several live and post-mortem specimens, and is to be deployed in a medical research environment with future developments leading to its eventual clinical use.
Surface temperature/heat transfer measurement using a quantitative phosphor thermography system
NASA Technical Reports Server (NTRS)
Buck, G. M.
1991-01-01
A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
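The core of the data reduction is the calibrated mapping from relative emission intensity to phosphor temperature. A minimal sketch of applying such a calibration to an image with NumPy interpolation is shown below; the calibration table is invented, and this is not the Langley processing chain.

```python
import numpy as np

# Invented calibration table: relative-intensity ratio vs. phosphor temperature (K).
cal_ratio  = np.array([0.10, 0.25, 0.45, 0.65, 0.85, 1.00])
cal_temp_k = np.array([300.0, 330.0, 360.0, 390.0, 420.0, 450.0])

def image_to_temperature(ratio_image):
    """Map an array of relative-intensity ratios to surface temperature (K)."""
    return np.interp(ratio_image, cal_ratio, cal_temp_k)

ratio_img = np.array([[0.20, 0.50], [0.70, 0.90]])
print(image_to_temperature(ratio_img))
```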
Zhang, Yingying; Li, Changkai; Liu, Dongyan; Zhang, Ying; Liu, Yan
2015-04-01
To develop an in situ NaI(Tl) detector for radioactivity measurement in the marine environment, the Monte Carlo N-Particle (MCNP) transport code was utilized to simulate the response of a NaI(Tl) detector immersed in seawater, taking into account the material and geometry of the detector and the interactions of the photons with the atoms of the seawater and the detector. The simulation results for the marine detection efficiency and detection distance were deduced and analyzed. In order to test their reliability, a field measurement was made in the open sea, and the experimental value of the marine detection efficiency was deduced; it is in good agreement with the simulated one. Finally, the minimum detectable activity for (137)Cs in seawater of the developed NaI(Tl) detector was determined mathematically. The simulation method and results in this paper can be used for the better design and quantitative calculation of in situ NaI(Tl) detectors for radioactivity measurement in the marine environment, and also for applications such as installation on marine monitoring platforms and the quantitative analysis of radionuclides. Copyright © 2015 Elsevier Ltd. All rights reserved.
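The abstract states that the minimum detectable activity (MDA) for (137)Cs was determined mathematically but does not give the expression. One commonly used formulation is Currie's detection limit, sketched below with placeholder numbers; the paper may use a different convention, and the calibration factor here stands in for the MCNP-derived marine detection efficiency.

```python
import math

def currie_mda(background_counts, live_time_s, cal_factor_cps_per_bq_per_l):
    """MDA (Bq/L) from Currie's detection limit L_D = 2.71 + 4.65*sqrt(B),
    where B is the background count in the peak region over the live time."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (cal_factor_cps_per_bq_per_l * live_time_s)

# Placeholder values for an in situ NaI(Tl) measurement of 137Cs at 662 keV.
print(currie_mda(background_counts=1200, live_time_s=3600,
                 cal_factor_cps_per_bq_per_l=0.02), "Bq/L")
```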
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use a large illumination aperture (i.e. partially coherent illumination) have been developed for making transparent (i.e. phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful, and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
[The quantitative testing of the V617F mutation in the JAK2 gene using a pyrosequencing technique].
Dunaeva, E A; Mironov, K O; Dribnokhodova, T E; Subbotina, E E; Bashmakova; Ol'hovskiĭ, I A; Shipulin, G A
2014-11-01
The somatic mutation V617F in the JAK2 gene is a frequent cause of chronic myeloproliferative diseases not associated with the BCR/ABL mutation. Quantitative testing of the relative percentage of the mutant allele can be used to establish disease severity and prognosis and to guide prescription of agents inhibiting JAK2 activity. The pyrosequencing technique was applied to quantify the mutation. The developed technique permits detection and quantitative measurement of the mutant allele fraction from 7% upward. The "gray zone" is represented by samples with a mutant allele percentage of 4% to 7%. The dependence of the expected percentage of the mutant fraction in the analyzed sample on the observed signal value is described by a linear equation with regression coefficients of -0.97 and -1.32, with a measurement uncertainty of ±0.7. The developed technique was validated on clinical material from 192 patients with the main forms of myeloproliferative diseases not associated with the BCR/ABL mutation. Sixty-four samples with mutant allele fractions ranging from 13% to 91% were detected. The developed technique permits monitoring of therapy in myeloproliferative diseases and helps optimize treatment strategy.
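The reported signal-to-fraction relationship is linear; a generic sketch of building and applying such a calibration (with invented calibration points, not the paper's data or coefficients) could look like this.

```python
import numpy as np

# Invented calibration points: known mutant allele fractions (%) vs. observed pyrosequencing signal (%).
known_fraction  = np.array([0.0, 10.0, 25.0, 50.0, 75.0, 100.0])
observed_signal = np.array([2.0, 11.0, 27.0, 48.0, 72.0, 96.0])

slope, intercept = np.polyfit(observed_signal, known_fraction, 1)

def mutant_fraction(signal_percent):
    """Convert an observed signal (%) to an estimated mutant allele fraction (%)."""
    return slope * signal_percent + intercept

print(round(mutant_fraction(35.0), 1))
```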
Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations
NASA Astrophysics Data System (ADS)
Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara
2017-11-01
Although quantitative measurements in radioactivity teaching and research are often believed to be possible only with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess, and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. The electroscope leaf is filmed and projected, permitting the collection of quantitative data for measurement of the half-life of 220Rn emanating from lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.
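As an illustration of how the 220Rn half-life could be extracted from the filmed leaf readings, the sketch below fits an exponential decay to a series of ionization-rate values; the readings are invented (chosen to decay with a half-life near the accepted ~56 s), and real data would come from the projected leaf positions.

```python
import numpy as np

# Invented ionization (discharge) rate of the electroscope vs. time after sealing
# the thoron-loaded chamber (arbitrary units).
t = np.array([0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])    # seconds
rate = np.array([100.0, 69.0, 48.0, 33.0, 22.0, 15.0, 11.0])

# Linear fit of ln(rate) vs. t gives the decay constant lambda = -slope.
slope, _ = np.polyfit(t, np.log(rate), 1)
half_life = np.log(2) / -slope
print(f"estimated half-life: {half_life:.1f} s")
```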
An IBM PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
Metrics and the effective computational scientist: process, quality and communication.
Baldwin, Eric T
2012-09-01
Recent treatments of computational knowledge worker productivity have focused upon the value the discipline brings to drug discovery using positive anecdotes. While this big picture approach provides important validation of the contributions of these knowledge workers, the impact accounts do not provide the granular detail that can help individuals and teams perform better. I suggest balancing the impact-focus with quantitative measures that can inform the development of scientists. Measuring the quality of work, analyzing and improving processes, and the critical evaluation of communication can provide immediate performance feedback. The introduction of quantitative measures can complement the longer term reporting of impacts on drug discovery. These metric data can document effectiveness trends and can provide a stronger foundation for the impact dialogue. Copyright © 2012 Elsevier Ltd. All rights reserved.
Microscopic optical path length difference and polarization measurement system for cell analysis
NASA Astrophysics Data System (ADS)
Satake, H.; Ikeda, K.; Kowa, H.; Hoshiba, T.; Watanabe, E.
2018-03-01
In recent years, noninvasive, nonstaining, and nondestructive quantitative cell measurement techniques have become increasingly important in the medical field. These cell measurement techniques enable the quantitative analysis of living cells, and are therefore applied to various cell identification processes, such as those determining the passage number limit during cell culturing in regenerative medicine. To enable cell measurement, we developed a quantitative microscopic phase imaging system based on a Mach-Zehnder interferometer that measures the optical path length difference distribution without phase unwrapping, using optical phase locking. The applicability of our phase imaging system was demonstrated by successful identification of breast cancer cells among normal cells. However, the cell identification method using this phase imaging system exhibited a false identification rate of approximately 7%. In this study, we implemented a polarimetric imaging system by introducing a polarimetric module into one arm of the Mach-Zehnder interferometer of our conventional phase imaging system. This module comprised a quarter wave plate and a rotational polarizer on the illumination side of the sample, and a linear polarizer on the optical detector side. In addition, we developed correction methods for the measurement errors of the optical path length and birefringence phase differences that arose through the influence of elements other than cells, such as the Petri dish. As the Petri dish holding the fluid specimens was transparent, it did not affect the amplitude information; however, the optical path length and birefringence phase differences were affected. Therefore, we proposed correction of the optical path length and birefringence phase for the influence of elements other than cells, as a prerequisite for obtaining highly precise phase and polarimetric images.
Quantitative 3-D Corneal Imaging In Vivo Using a Modified HRT- RCM Confocal Microscope
Petroll, W. Matthew; Weaver, Matthew; Vaidya, Saurabh; McCulley, James P.; Cavanagh, H. Dwight
2012-01-01
Purpose: The purpose of this study was to develop and test hardware and software modifications to allow quantitative full-thickness corneal imaging using the HRT Rostock Corneal Module. Methods: A PC-controlled motor drive with positional feedback was integrated into the system to allow automated focusing through the entire cornea. The left eyes of ten New Zealand White rabbits were scanned from endothelium to epithelium. Image sequences were read into a custom-developed program for depth calculation and measurement of sub-layer thicknesses. 3-D visualizations were also generated using Imaris. In six rabbits, stack images were registered, and depth-dependent counts of keratocyte nuclei were made using Metamorph. Results: The mean epithelial and corneal thicknesses measured in the rabbit were 47 ± 5 μm and 373 ± 25 μm, respectively (N = 10 corneas); coefficients of variation for repeated scans were 8.2% and 2.1%. Corneal thickness measured using ultrasonic pachymetry was 374 ± 17 μm. The mean overall keratocyte density measured in the rabbit was 43,246 ± 5,603 cells/mm3 in vivo (N = 6 corneas). There was a gradual decrease in keratocyte density from the anterior to posterior cornea (R = 0.99), consistent with previous data generated in vitro. Conclusions: This modified system allows high resolution 3-D image stacks to be collected from the full-thickness rabbit cornea in vivo. These datasets can be used for interactive visualization of corneal cell layers, measurement of sub-layer thickness, and depth-dependent keratocyte density measurements. Overall, the modifications significantly expand the potential quantitative research applications of the HRT-RCM microscope. PMID:23051907
Indicators of Family Care for Development for Use in Multicountry Surveys
Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima
2012-01-01
Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators, from a set of items measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed-method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item, as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions using bivariate techniques. The selected items measured two family care practices (support for learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for a learning/stimulating environment, were included in the core module of UNICEF's Multiple Indicator Cluster Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development. The obtained information will reinforce attention to efforts to improve support for children's development. PMID:23304914
Photogrammetric Measurements of an EH-60L Brownout Cloud
NASA Technical Reports Server (NTRS)
Wong, Oliver D.; Tanner, Philip E.
2010-01-01
There is a critical lack of quantitative data regarding the mechanism of brownout cloud formation. Recognizing this, tests were conducted during the Air Force Research Lab 3D-LZ Brownout Test at the US Army Yuma Proving Ground. Photogrammetry was utilized during two rounds of flight tests with an instrumented EH-60L Black Hawk to determine if this technique could quantitatively measure the formation and evolution of a brownout cloud. Specific areas of interest include the location, size, and average convective velocity of the cloud, along with the characteristics of any defined structures within it. Following the first flight test, photogrammetric data were validated through comparison with onboard vehicle data. Lessons learned from this test were applied to the development of an improved photogrammetry system. A second flight test, utilizing the improved system, demonstrated that obtaining quantitative measurements of the brownout cloud is possible. Results from these measurements are presented in the paper. Flow visualization with chalk dust seeding was also tested. It was observed that the pickup forces of the brownout cloud appear to be very low. Overall, these tests demonstrate the viability of photogrammetry as a means of quantifying brownout cloud formation and evolution.
The Relationship between Quantitative and Qualitative Measures of Writing Skills.
ERIC Educational Resources Information Center
Howerton, Mary Lou P.; And Others
The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…
USDA-ARS?s Scientific Manuscript database
In recent years, there has been increased concern over carry-over activity of mostly high temperature (HT) and very high temperature (VHT) stable amylases in white, refined sugars from refineries to various food manufacturing industries and other end-users. HT and VHT stable amylases were developed...
DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.
ERIC Educational Resources Information Center
KUSEWITT, J.B.
THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…
Meaning-Making among Medical Students: Development of a Quantitative Measure of Self-Authorship
ERIC Educational Resources Information Center
Fallar, Robert
2014-01-01
Preparation for and application to medical school, as well as the subsequent medical training of matriculating students, can have an important impact on psychosocial development. The premedical baccalaureate is the traditional preparation for medical school, although many medical schools also offer a separate entry path through early assurance…
Saliva has an important advantage over serum as a medium for antibody detection due to non-invasive sampling, which is critical for community-based epidemiological surveys. The development of a Luminex multiplex immunoassay for measurement of salivary IgG and IgA responses to pot...
Fungal concentrations were measured in the dust of six homes in Cleveland, OH, where a child developed pulmonary hemorrhage (pulmonary hemorrhage homes, i.e. PHH), and 26 reference homes (RH) with no known fungal contamination. QPCR assays for 82 species (or assay groups) were u...
The Functional and Developmental Organization of Cognitive Developmental Sequences
ERIC Educational Resources Information Center
Demetriou, Andreas; Kyriakides, Leonidas
2006-01-01
This study examines the organization and development of 5 domains of reasoning (categorical, quantitative, spatial, causal, and propositional) and the construct validity of a test designed to measure development from early adolescence to early adulthood. The theory underlying the test is first summarized and the conceptual design of the test is…
ERIC Educational Resources Information Center
Honeyman, Catherine A.
2010-01-01
This article extends understanding of the connections between education, social capital, and development through a mixed-methods case study of the Sistema de Aprendizaje Tutorial, or SAT, an innovative secondary-level education system. The quantitative dimension of the research used survey measures of social responsibility to compare 93 SAT…
A GRE Test for the STEM Disciplines: Developing an Assessment "of" and "for" Learning
ERIC Educational Resources Information Center
Payne, David G.; Briel, Jacqueline B.; Hawthorn, John; Riedeburg, Karen
2006-01-01
Plans are described for creating a Graduate Record Examination (GRE) test for the STEM (science, technology, engineering, and mathematics) disciplines. Previous work showed that a quantitative measure for the STEM disciplines exacerbated group differences beyond those reflected in the current GRE General Test. A test development approach is…
Scale Development: Heterosexist Attitudes in Women's Collegiate Athletics
ERIC Educational Resources Information Center
Mullin, Elizabeth M.
2013-01-01
Homophobia and heterosexism in women's athletics have been studied extensively using a qualitative approach. Limited research from a quantitative approach has been conducted in the area and none with a sport-specific instrument. The purpose of the current study was to develop a valid and reliable questionnaire to measure heterosexist attitudes in…
A Model for Measuring Effectiveness of an Online Course
ERIC Educational Resources Information Center
Mashaw, Bijan
2012-01-01
As a result of this research, a quantitative model and a procedure have been developed to create an online mentoring effectiveness index (EI). To develop the model, mentoring and teaching effectiveness are defined, and then the constructs and factors of effectiveness are identified. The model's construction is based on the theory that…
Thin-Film Material Science and Processing | Materials Science | NREL
A prime example of this research is thin-film photovoltaics (PV). A quantitative high-throughput technique has been developed that can measure many barriers in parallel with...
Developing a performance measurement approach to benefit/cost freight project prioritization.
DOT National Transportation Integrated Search
2014-10-01
Future reauthorizations of the federal transportation bill will require a comprehensive and quantitative analysis of the freight benefits of proposed freight system projects. To prioritize public investments in freight systems and to ensure conside...
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
3D methodology for evaluating rail crossing roughness.
DOT National Transportation Integrated Search
2015-03-02
Description of Research Project: The overall objective of this project is to investigate and develop a quantitative method or measure for determining the need to rehabilitate rail crossings. The scope of the project includes investigation of sensor capabi...
Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization-Coastal Plain
DOT National Transportation Integrated Search
2012-06-01
The North Carolina Department of Transportation (NCDOT) is progressing toward developing quantitative and systematic criteria that address the implementation of undercutting as a subgrade stabilization measure. As part of this effort, a laborator...
Harik-Khan, R; Moats, W A
1995-01-01
A procedure for identifying and quantitating violative beta-lactams in milk is described. This procedure integrates beta-lactam residue detection kits with the multiresidue automated liquid chromatographic (LC) cleanup method developed in our laboratory. Spiked milk was deproteinized, extracted, and subjected to reversed-phase LC using a gradient program that concentrated the beta-lactams. Amoxicillin, ampicillin, cephapirin, ceftiofur, cloxacillin, and penicillin G were thus separated into 5 fractions that were subsequently tested for activity by using 4 kits. Beta-lactams in the positive fractions were quantitated by analytical LC methods developed in our laboratory. The LC cleanup method separated beta-lactam antibiotics from each other and from interferences in the matrix and also concentrated the antibiotics, thus increasing the sensitivity of the kits to the beta-lactam antibiotics. The procedure facilitated the task of identifying and measuring the beta-lactam antibiotics that may be present in milk samples.
Quantitative Global Heat Transfer in a Mach-6 Quiet Tunnel
NASA Technical Reports Server (NTRS)
Sullivan, John P.; Schneider, Steven P.; Liu, Tianshu; Rubal, Justin; Ward, Chris; Dussling, Joseph; Rice, Cody; Foley, Ryan; Cai, Zeimin; Wang, Bo;
2012-01-01
This project developed quantitative methods for obtaining heat transfer from temperature sensitive paint (TSP) measurements in the Mach-6 quiet tunnel at Purdue, which is a Ludwieg tube with a downstream valve, moderately-short flow duration and low levels of heat transfer. Previous difficulties with inferring heat transfer from TSP in the Mach-6 quiet tunnel were traced to (1) the large transient heat transfer that occurs during the unusually long tunnel startup and shutdown, (2) the non-uniform thickness of the insulating coating, (3) inconsistencies and imperfections in the painting process and (4) the low levels of heat transfer observed on slender models at typical stagnation temperatures near 430K. Repeated measurements were conducted on 7 degree-half-angle sharp circular cones at zero angle of attack in order to evaluate the techniques, isolate the problems and identify solutions. An attempt at developing a two-color TSP method is also summarized.
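The abstract above does not give the data-reduction step; purely as a hedged illustration of how heat flux can be inferred from a measured surface-temperature history T_s(t) under the usual one-dimensional semi-infinite assumption with constant thermal properties \rho, c, and k, a common discretization (often attributed to Cook and Felderman) is

    q_s(t_n) \approx \frac{2\sqrt{\rho c k}}{\sqrt{\pi}} \sum_{i=1}^{n} \frac{T_s(t_i) - T_s(t_{i-1})}{\sqrt{t_n - t_i} + \sqrt{t_n - t_{i-1}}}.

Whether the Purdue analysis used this particular scheme is an assumption here; the formula is shown only to make "inferring heat transfer from TSP temperature data" concrete.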
Quantitative assessment of motor fatigue: normative values and comparison with prior-polio patients.
Meldrum, Dara; Cahalane, Eibhlis; Conroy, Ronan; Guthrie, Richard; Hardiman, Orla
2007-06-01
Motor fatigue is a common complaint of polio survivors and has a negative impact on activities of daily living. The aim of this study was to establish a normative database for hand grip strength and fatigue and to investigate differences between prior-polio subjects and normal controls. Static and dynamic hand grip fatigue and maximum voluntary isometric contraction (MVIC) of hand grip were measured in subjects with a prior history of polio (n = 44) and healthy controls (n = 494). A normative database of fatigue was developed using four indices of analysis. Compared with healthy controls, subjects with prior polio had significantly reduced hand grip strength but developed greater hand grip fatigue in only one fatigue index. Quantitative measurement of fatigue in the prior-polio population may be useful in order to detect change over time and to evaluate treatment strategies.
Computerized image analysis for quantitative neuronal phenotyping in zebrafish.
Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C
2006-06-15
An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.
Guo, Sujuan; Pridham, Kevin J; Sheng, Zhi
2016-01-01
Autophagy is a catabolic process whereby cellular components are degraded to fuel cells for longer survival during stress. Hence, autophagy plays a vital role in determining cell fate and is central to the homeostasis and pathogenesis of many human diseases including chronic myeloid leukemia (CML). It has been well established that autophagy is important for leukemogenesis as well as drug resistance in CML. Thus, autophagy is an intriguing therapeutic target. However, current approaches that detect autophagy lack reliability and often fail to provide quantitative measurements. To overcome this hurdle and facilitate the development of autophagy-related therapies, we have recently developed an autophagy assay termed the Cyto-ID fluorescence spectrophotometric assay. This method uses a cationic fluorescence dye, Cyto-ID, which specifically labels autophagic compartments and is detected by a spectrophotometer to permit a large-scale and quantitative analysis. As such, it allows rapid, reliable, and quantitative detection of autophagy and estimation of autophagy flux. In this chapter, we further provide technical details of this method and step-by-step protocols for measuring autophagy or autophagy flux in CML cell lines as well as primary hematopoietic cells.
UNiquant, a program for quantitative proteomics analysis using stable isotope labeling.
Huang, Xin; Tolmachev, Aleksey V; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A; Smith, Richard D; Chan, Wing C; Hinrichs, Steven H; Fu, Kai; Ding, Shi-Jian
2011-03-04
Stable isotope labeling (SIL) methods coupled with nanoscale liquid chromatography and high resolution tandem mass spectrometry are increasingly useful for elucidation of the proteome-wide differences between multiple biological samples. Development of more effective programs for the sensitive identification of peptide pairs and accurate measurement of the relative peptide/protein abundance are essential for quantitative proteomic analysis. We developed and evaluated the performance of a new program, termed UNiquant, for analyzing quantitative proteomics data using stable isotope labeling. UNiquant was compared with two other programs, MaxQuant and Mascot Distiller, using SILAC-labeled complex proteome mixtures having either known or unknown heavy/light ratios. For the SILAC-labeled Jeko-1 cell proteome digests with known heavy/light ratios (H/L = 1:1, 1:5, and 1:10), UNiquant quantified a similar number of peptide pairs as MaxQuant for the H/L = 1:1 and 1:5 mixtures. In addition, UNiquant quantified significantly more peptides than MaxQuant and Mascot Distiller in the H/L = 1:10 mixtures. UNiquant accurately measured relative peptide/protein abundance without the need for postmeasurement normalization of peptide ratios, which is required by the other programs.
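A minimal sketch of the generic arithmetic behind SILAC-style quantitation, assuming hypothetical peptide-level heavy/light intensities; this is not UNiquant's actual algorithm, only an illustration of how peptide H/L ratios are rolled up to a protein-level estimate:

import math
from statistics import median

# (peptide sequence, light intensity, heavy intensity) -- hypothetical values
peptides = [
    ("LVNELTEFAK", 1.2e6, 1.1e6),
    ("YLYEIAR",    8.0e5, 7.6e5),
    ("QTALVELVK",  2.4e6, 2.6e6),
]

# log2 heavy/light ratio for each identified peptide pair
log_ratios = [math.log2(h / l) for _, l, h in peptides]

# protein-level ratio: the median of peptide ratios is a common robust roll-up
protein_log2_hl = median(log_ratios)
print(f"protein H/L = {2 ** protein_log2_hl:.2f}")

Dedicated tools such as UNiquant additionally handle peptide-pair detection, noise filtering, and (for other programs) post-measurement normalization, which this sketch omits.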
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source for quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of ontological representation for a subdomain serves as an impediment to normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to the domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing an ontology for ion channel events using a rule-based approach. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers potential to facilitate the integration of text mining into an ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
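For reference (precision and recall values are not given in the abstract), the F-measure quoted above is presumably the usual harmonic mean of precision P and recall R:

    F = \frac{2PR}{P + R}.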
Impact of specific language impairment and type of school on different language subsystems.
Puglisi, Marina Leite; Befi-Lopes, Debora Maria
2016-01-01
This study aimed to explore quantitative and qualitative effects of type of school and specific language impairment (SLI) on different language abilities. A total of 204 Brazilian children aged from 4 to 6 years participated in the study. Children were selected to form three groups: 1) 63 typically developing children studying in private schools (TDPri); 2) 102 typically developing children studying in state schools (TDSta); and 3) 39 children with SLI studying in state schools (SLISta). All individuals were assessed regarding expressive vocabulary, number morphology and morphosyntactic comprehension. All language subsystems were vulnerable to both environmental (type of school) and biological (SLI) effects. The relationship between the three language measures was exactly the same for all groups: vocabulary growth correlated with age and with the development of morphological abilities and morphosyntactic comprehension. Children with SLI showed atypical errors in the comprehension test at the age of 4, but presented a pattern of errors that gradually resembled typical development. The effect of type of school was marked by quantitative differences, while the effect of SLI was characterised by both quantitative and qualitative differences.
NASA Astrophysics Data System (ADS)
Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.
2005-03-01
Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
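As a hedged aside, not quoted from the paper: in the small-amplitude limit the conservative tip-sample interaction is commonly related to the measured frequency shift by

    \frac{\Delta f}{f_0} \approx -\frac{1}{2k}\,\frac{\partial F}{\partial z},

where f_0 is the cantilever resonance frequency, k its spring constant, and F(z) the tip-sample force; the paper's contribution is the more general treatment, valid for arbitrary amplitudes and including dissipative interactions.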
Estimating aboveground live understory vegetation carbon in the United States
Kristofer D Johnson; Grant M Domke; Matthew B Russell; Brian Walters; John Hom; Alicia Peduzzi; Richard Birdsey; Katelyn Dolan; Wenli Huang
2017-01-01
Despite the key role that understory vegetation plays in ecosystems and the terrestrial carbon cycle, it is often overlooked and has few quantitative measurements, especially at national scales. To understand the contribution of understory carbon to the United States (US) carbon budget, we developed an approach that relies on field measurements of understory vegetation...
NASA Astrophysics Data System (ADS)
Knight, Silvin P.; Browne, Jacinta E.; Meaney, James F.; Smith, David S.; Fagan, Andrew J.
2016-10-01
A novel anthropomorphic flow phantom device has been developed, which can be used for quantitatively assessing the ability of magnetic resonance imaging (MRI) scanners to accurately measure signal/concentration time-intensity curves (CTCs) associated with dynamic contrast-enhanced (DCE) MRI. Modelling of the complex pharmacokinetics of contrast agents as they perfuse through the tumour capillary network has shown great promise for cancer diagnosis and therapy monitoring. However, clinical adoption has been hindered by methodological problems, resulting in a lack of consensus regarding the most appropriate acquisition and modelling methodology to use and a consequent wide discrepancy in published data. A heretofore overlooked source of such discrepancy may arise from measurement errors of tumour CTCs deriving from the imaging pulse sequence itself, while the effects on the fidelity of CTC measurement of using rapidly-accelerated sequences such as parallel imaging and compressed sensing remain unknown. The present work aimed to investigate these features by developing a test device in which 'ground truth' CTCs were generated and presented to the MRI scanner for measurement, thereby allowing for an assessment of the DCE-MRI protocol to accurately measure this curve shape. The device comprised a four-pump flow system wherein CTCs derived from prior patient prostate data were produced in measurement chambers placed within the imaged volume. The ground truth was determined as the mean of repeat measurements using an MRI-independent, custom-built optical imaging system. In DCE-MRI experiments, significant discrepancies between the ground truth and measured CTCs were found for both tumorous and healthy tissue-mimicking curve shapes. Pharmacokinetic modelling revealed errors in measured Ktrans, ve, and kep values of up to 42%, 31%, and 50% respectively, following a simple variation of the parallel imaging factor and number of signal averages in the acquisition protocol. The device allows for the quantitative assessment and standardisation of DCE-MRI protocols (both existing and emerging).
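For context, and hedged because the abstract does not state which pharmacokinetic model was fitted: in the standard Tofts formulation the three quoted parameters are related through

    C_t(t) = K^{trans} \int_0^t C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau, \qquad k_{ep} = K^{trans}/v_e,

where C_t(t) is the tissue concentration-time curve and C_p(t) the arterial input function, so any distortion of the measured CTC propagates directly into Ktrans, ve, and kep.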
Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.
Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P
2013-12-16
Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI-namely the MIAPE Quant guidelines, which have developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Winfree, William P.
2000-01-01
Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow. This has resulted in a "spot check" approach to inspections, making thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in the inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique will be presented, thus demonstrating the quantitative nature of the technique. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions will be presented.
Current methods and advances in bone densitometry
NASA Technical Reports Server (NTRS)
Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.
1995-01-01
Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis.
NMR diffusion and relaxation studies of 2-nitroimidazole and albumin interactions
NASA Astrophysics Data System (ADS)
Wijesekera, Dj; Willis, Scott A.; Gupta, Abhishek; Torres, Allan M.; Zheng, Gang; Price, William S.
2018-03-01
Nitroimidazole derivatives are of current interest in the development of hypoxia targeting agents and show potential in the establishment of quantitative measures of tumor hypoxia. In this study, the binding of 2-nitroimidazole to albumin was probed using NMR diffusion and relaxation measurements. Binding studies were conducted at three different protein concentrations (0.23, 0.30 and 0.38 mM) with drug concentrations ranging from 0.005 to 0.16 M at 298 K. Quantitative assessments of the binding model were made by evaluating the number of binding sites, n, and association constant, K. These were determined to be 21 ± 3 and 53 ± 4 M-1, respectively.
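As a hedged illustration of how n and K enter such an analysis (the fitting equations are not reproduced in the abstract), binding to n equivalent independent sites and fast-exchange NMR diffusion are commonly described by

    r = \frac{n K [L]}{1 + K [L]}, \qquad D_{obs} = f_b D_{bound} + (1 - f_b) D_{free},

where r is the average number of ligands bound per protein, [L] the free ligand concentration, and f_b the bound fraction; fitting the observed diffusion coefficient across the titration yields n and K.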
NASA Technical Reports Server (NTRS)
Poultney, S. K.; Brumfield, M. L.; Siviter, J. S.
1975-01-01
Typical pollutant gas concentrations at the stack exits of stationary sources can be estimated to be about 500 ppm under the present emission standards. Raman lidar has a number of advantages which makes it a valuable tool for remote measurements of these stack emissions. Tests of the Langley Research Center Raman lidar at a calibration tank indicate that night measurements of SO2 concentrations and stack opacity are possible. Accuracies of 10 percent are shown to be achievable from a distance of 300 m within 30 min integration times for 500 ppm SO2 at the stack exits. All possible interferences were examined quantitatively (except for the fluorescence of aerosols in actual stack emissions) and found to have negligible effect on the measurements. An early test at an instrumented stack is strongly recommended.
Piezoelectric tuning fork biosensors for the quantitative measurement of biomolecular interactions
NASA Astrophysics Data System (ADS)
Gonzalez, Laura; Rodrigues, Mafalda; Benito, Angel Maria; Pérez-García, Lluïsa; Puig-Vidal, Manel; Otero, Jorge
2015-12-01
The quantitative measurement of biomolecular interactions is of great interest in molecular biology. Atomic force microscopy (AFM) has proved its capacity to act as a biosensor and determine the affinity between biomolecules of interest. Nevertheless, the detection scheme presents certain limitations when it comes to developing a compact biosensor. Recently, piezoelectric quartz tuning forks (QTFs) have been used as laser-free detection sensors for AFM. However, only a few studies along these lines have considered soft biological samples, and even fewer constitute quantified molecular recognition experiments. Here, we demonstrate the capacity of QTF probes to perform specific interaction measurements between biotin-streptavidin complexes in buffer solution. We propose in this paper a variant of dynamic force spectroscopy based on representing adhesion energies E (aJ) against pulling rates v (nm s-1). Our results are compared with conventional AFM measurements and show the great potential of these sensors in molecular interaction studies.
Automated Quantitative Nuclear Cardiology Methods
Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.
2016-01-01
Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779
NASA Technical Reports Server (NTRS)
Baily, N. A.
1974-01-01
Research data obtained by the low dose electronic radiography system are reported. Data cover: (1) localization and tracking of Ta screws implanted in the inner wall of the right ventricle of the heart, (2) use of cross hairs to outline inner or outer heart wall contours, (3) quantitative measure of anatomical components which are stationary in size or change size dynamically, and (4) study of dynamic quantitative data from roentgenologic or fluoroscopic procedures.
2001-10-25
analyses of the electroencephalogram at half-closed eye and fully closed eye. This study aimed at quantitatively estimating the rest rhythm of horses by the...analyses of eyeball movement. A mask fitted with a miniature CCD camera was newly developed. Continuous images of the horse eye for about 24...eyeball area were calculated. As for the results, the fluctuating status of the eyeball area was analyzed quantitatively, and the rest rhythm of horses was
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carla J. Miller
This report provides a summary of the literature review that was performed, based on previous work at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique for performing nuclear fuel accountancy measurements.
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Bahn, G. S.
1977-01-01
Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and the remotely sensed data that were obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients indicated significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses that use only three preselected multispectral scanner bands of data indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.
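A minimal sketch of the kind of band-to-parameter regression described above, assuming hypothetical scanner-band radiances and in situ suspended-sediment values (the actual 1974 James River data, band selection, and calibrated coefficients are not reproduced here):

import numpy as np

# hypothetical training data: rows = water samples,
# columns = radiance in three preselected scanner bands
bands = np.array([
    [12.1, 30.5, 8.2],
    [14.8, 35.0, 9.1],
    [10.3, 27.9, 7.5],
    [16.0, 38.2, 10.4],
    [11.7, 29.4, 8.0],
])
sediment_mg_per_l = np.array([21.0, 33.0, 15.0, 41.0, 19.0])  # in situ truth

# least-squares fit of sediment = b0 + b1*B1 + b2*B2 + b3*B3
X = np.column_stack([np.ones(len(bands)), bands])
coef, residuals, rank, _ = np.linalg.lstsq(X, sediment_mg_per_l, rcond=None)

# apply the calibrated equation to new pixels to map the parameter spatially
new_pixels = np.array([[13.0, 32.0, 8.6]])
predicted = np.column_stack([np.ones(len(new_pixels)), new_pixels]) @ coef
print(coef, predicted)

The calibrated coefficients are then applied pixel by pixel to produce distribution maps such as those described in the abstract.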
Cardiovascular and pulmonary dynamics by quantitative imaging
NASA Technical Reports Server (NTRS)
Wood, E. H.
1976-01-01
The accuracy and range of studies on cardiovascular and pulmonary functions can be greatly facilitated if the motions of the underlying organ systems throughout individual cycles can be directly visualized and readily measured with minimum or preferably no effect on these motions. Achievement of this objective requires development of techniques for quantitative noninvasive or minimally invasive dynamic and stop-action imaging of the organ systems. A review of advances in dynamic quantitative imaging of moving organs reveals that the revolutionary value of cross-sectional and three-dimensional images produced by various types of radiant energy such as X-rays and gamma rays, positrons, electrons, protons, light, and ultrasound for clinical diagnostic and biomedical research applications is just beginning to be realized. The fabrication of a clinically useful cross-section reconstruction device with sensing capabilities for both anatomical structural composition and chemical composition may be possible and awaits future development.
Absolute quantitation of isoforms of post-translationally modified proteins in transgenic organism.
Li, Yaojun; Shu, Yiwei; Peng, Changchao; Zhu, Lin; Guo, Guangyu; Li, Ning
2012-08-01
Post-translational modification isoforms of a protein are known to play versatile biological functions in diverse cellular processes. To measure the molar amount of each post-translational modification isoform (P(isf)) of a target protein present in the total protein extract using mass spectrometry, a quantitative proteomic protocol, absolute quantitation of isoforms of post-translationally modified proteins (AQUIP), was developed. A recombinant ERF110 gene overexpression transgenic Arabidopsis plant was used as the model organism for demonstration of the proof of concept. Both Ser-62-independent (14)N-coded synthetic peptide standards and (15)N-coded ERF110 protein standard isolated from the heavy nitrogen-labeled transgenic plants were employed simultaneously to determine the concentration of all isoforms (T(isf)) of ERF110 in the whole plant cell lysate, whereas a pair of Ser-62-dependent synthetic peptide standards were used to quantitate the Ser-62 phosphosite occupancy (R(aqu)). The P(isf) was finally determined by integrating the two empirically measured variables using the following equation: P(isf) = T(isf) · R(aqu). The absolute amount of Ser-62-phosphorylated isoform of ERF110 determined using AQUIP was substantiated with a stable isotope labeling in Arabidopsis-based relative and accurate quantitative proteomic approach. The biological role of the Ser-62-phosphorylated isoform was demonstrated in transgenic plants.
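To make the central relation concrete with hypothetical numbers (not the study's ERF110 values): if the total concentration of all isoforms of the target protein in the lysate is measured as T(isf) = 10 fmol per microgram of total protein, and the Ser-62 phosphosite occupancy is measured as R(aqu) = 0.30, then the phosphorylated-isoform amount is P(isf) = T(isf) · R(aqu) = 10 × 0.30 = 3 fmol per microgram.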
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2005-01-01
Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high-pressure environments to validate computational combustion codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
E Nazaretski; J Thibodaux; I Vekhter
2011-12-31
We report the local measurements of the magnetic penetration depth in a superconducting Nb film using magnetic force microscopy (MFM). We developed a method for quantitative extraction of the penetration depth from single-parameter simultaneous fits to the lateral and height profiles of the MFM signal, and demonstrate that the obtained value is in excellent agreement with that obtained from the bulk magnetization measurements.
Kazerooni, Ella A.; Lynch, David A.; Liu, Lyrica X.; Murray, Susan; Curtis, Jeffrey L.; Criner, Gerard J.; Kim, Victor; Bowler, Russell P.; Hanania, Nicola A.; Anzueto, Antonio R.; Make, Barry J.; Hokanson, John E.; Crapo, James D.; Silverman, Edwin K.; Martinez, Fernando J.; Washko, George R.
2011-01-01
Purpose: To test the hypothesis—given the increasing emphasis on quantitative computed tomographic (CT) phenotypes of chronic obstructive pulmonary disease (COPD)—that a relationship exists between COPD exacerbation frequency and quantitative CT measures of emphysema and airway disease. Materials and Methods: This research protocol was approved by the institutional review board of each participating institution, and all participants provided written informed consent. One thousand two subjects who were enrolled in the COPDGene Study and met the GOLD (Global Initiative for Chronic Obstructive Lung Disease) criteria for COPD with quantitative CT analysis were included. Total lung emphysema percentage was measured by using the attenuation mask technique with a −950-HU threshold. An automated program measured the mean wall thickness and mean wall area percentage in six segmental bronchi. The frequency of COPD exacerbation in the prior year was determined by using a questionnaire. Statistical analysis was performed to examine the relationship of exacerbation frequency with lung function and quantitative CT measurements. Results: In a multivariate analysis adjusted for lung function, bronchial wall thickness and total lung emphysema percentage were associated with COPD exacerbation frequency. Each 1-mm increase in bronchial wall thickness was associated with a 1.84-fold increase in annual exacerbation rate (P = .004). For patients with 35% or greater total emphysema, each 5% increase in emphysema was associated with a 1.18-fold increase in this rate (P = .047). Conclusion: Greater lung emphysema and airway wall thickness were associated with COPD exacerbations, independent of the severity of airflow obstruction. Quantitative CT can help identify subgroups of patients with COPD who experience exacerbations for targeted research and therapy development for individual phenotypes. © RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110173/-/DC1 PMID:21788524
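Read purely illustratively, and assuming the reported rate ratios compound multiplicatively as the model implies (hypothetical magnitudes, not additional study results): a 2-mm increase in bronchial wall thickness would correspond to roughly 1.84^2 ≈ 3.4 times the annual exacerbation rate, and, above the 35% emphysema threshold, a 10% increase in emphysema to roughly 1.18^2 ≈ 1.4 times the rate, all else being equal.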
NASA Astrophysics Data System (ADS)
Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko
2002-05-01
The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures and to optimize security distribution to every portion of the medical practice. Quantitative expression must be introduced to our study, if possible, to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that system elements subdivided into groups by details result in a much more accurate analysis. Such subdivided composition factors greatly depend on behavior of staff, interactive terminal devices, kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security and proposed security measures for each medical information system, and the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, a number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
Mazzoleni, Stefano; Toth, Andras; Munih, Marko; Van Vaerenbergh, Jo; Cavallo, Giuseppe; Micera, Silvestro; Dario, Paolo; Guglielmelli, Eugenio
2009-10-30
One of the main scientific and technological challenges of rehabilitation bioengineering is the development of innovative methodologies, based on the use of appropriate technological devices, for an objective assessment of patients undergoing a rehabilitation treatment. Such tools should be as fast and cheap to use as clinical scales, which are currently the instruments most widely used in routine daily clinical practice. A human-centered approach was used in the design and development of a mechanical structure equipped with eight force/torque sensors that record quantitative data during the initiation of a predefined set of Activities of Daily Living (ADL) tasks, in isometric conditions. Preliminary results validated the appropriateness, acceptability and functionality of the proposed platform, which has now become a tool used for clinical research in three clinical centres. This paper presents the design and development of an innovative platform for whole-body force and torque measurements on human subjects. The platform has been designed to perform accurate quantitative measurements in isometric conditions with the specific aim of addressing the needs for functional assessment tests of patients undergoing a rehabilitation treatment as a consequence of a stroke. The versatility of the system also highlights several other interesting possible areas of application for therapy in neurorehabilitation, for research in basic neuroscience, and more.
Quantitative cell biology: the essential role of theory.
Howard, Jonathon
2014-11-05
Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or to new discoveries, when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
NASA Astrophysics Data System (ADS)
Lamarche, G.; Le Gonidec, Y.; Lucieer, V.; Lurton, X.; Greinert, J.; Dupré, S.; Nau, A.; Heffron, E.; Roche, M.; Ladroit, Y.; Urban, P.
2017-12-01
Detecting liquid, solid or gaseous features in the ocean is generating considerable interest in the geoscience community, because of their potentially high economic values (oil & gas, mining), their significance for environmental management (oil/gas leakage, biodiversity mapping, greenhouse gas monitoring) as well as their potential cultural and traditional values (food, freshwater). Enhancing people's capability to quantify and manage the natural capital present in the ocean water goes hand in hand with the development of marine acoustic technology, as marine echosounders provide the most reliable and technologically advanced means to develop quantitative studies of water column backscatter data. This is not developed to its full capability because of (i) the complexity of the physics involved in relation to the constantly changing marine environment, and (ii) the rapid technological evolution of high resolution multibeam echosounder (MBES) water-column imaging systems. The Water Column Imaging Working Group is working on a series of multibeam echosounder (MBES) water column datasets acquired in a variety of environments, using a range of frequencies, and imaging a number of water-column features such as gas seeps, oil leaks, suspended particulate matter, vegetation and freshwater springs. Access to data from different acoustic frequencies and ocean dynamics enables us to discuss and test multifrequency approaches, which are the most promising means to develop a quantitative analysis of the physical properties of acoustic scatterers, providing rigorous cross calibration of the acoustic devices. In addition, high redundancy of multibeam data, such as is available for some datasets, will allow us to develop data processing techniques, leading to quantitative estimates of water column gas seeps. Each of the datasets has supporting ground-truthing data (underwater videos and photos, physical oceanography measurements) which provide information on the origin and chemistry of the seep content. This is of primary importance when assessing the physical properties of water column scatterers from backscatter acoustic measurement.
The NIST Quantitative Infrared Database
Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.
1999-01-01
With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S.EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S.EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST prepared primary gas standards. Currently, absorption coefficient data has been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1 the average relative expanded uncertainty is 2.2%. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
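The Beer's-law relationship referred to above, written out for clarity (the notation here is generic, not copied from the NIST documentation): for each wavenumber \nu, the decadic absorbance derived from a transmittance spectrum is regressed against the product of concentration and path length over the standard spectra,

    A(\nu) = -\log_{10} T(\nu) = \alpha(\nu)\, c\, L,

so the slope of the regression gives the absorption coefficient \alpha(\nu) in (μmol/mol)−1 m−1 when c is a mole fraction in μmol/mol and L is in metres.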
Bohren, Meghan A; Vogel, Joshua P; Hunter, Erin C; Lutsiv, Olha; Makh, Suprita K; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T; Khosla, Rajat; Hindin, Michelle J; Gülmezoglu, A Metin
2015-06-01
Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions.
Ziv, Omer; Zaritsky, Assaf; Yaffe, Yakey; Mutukula, Naresh; Edri, Reuven; Elkabetz, Yechiel
2015-10-01
Neural stem cells (NSCs) are progenitor cells for brain development, where cellular spatial composition (cytoarchitecture) and dynamics are hypothesized to be linked to critical NSC capabilities. However, understanding cytoarchitectural dynamics of this process has been limited by the difficulty to quantitatively image brain development in vivo. Here, we study NSC dynamics within Neural Rosettes--highly organized multicellular structures derived from human pluripotent stem cells. Neural rosettes contain NSCs with strong epithelial polarity and are expected to perform apical-basal interkinetic nuclear migration (INM)--a hallmark of cortical radial glial cell development. We developed a quantitative live imaging framework to characterize INM dynamics within rosettes. We first show that the tendency of cells to follow the INM orientation--a phenomenon we referred to as radial organization, is associated with rosette size, presumably via mechanical constraints of the confining structure. Second, early forming rosettes, which are abundant with founder NSCs and correspond to the early proliferative developing cortex, show fast motions and enhanced radial organization. In contrast, later derived rosettes, which are characterized by reduced NSC capacity and elevated numbers of differentiated neurons, and thus correspond to neurogenesis mode in the developing cortex, exhibit slower motions and decreased radial organization. Third, later derived rosettes are characterized by temporal instability in INM measures, in agreement with progressive loss in rosette integrity at later developmental stages. Finally, molecular perturbations of INM by inhibition of actin or non-muscle myosin-II (NMII) reduced INM measures. Our framework enables quantification of cytoarchitecture NSC dynamics and may have implications in functional molecular studies, drug screening, and iPS cell-based platforms for disease modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L
Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and for the analyses related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.
Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro
2016-01-01
Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used Gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured into a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In the pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as nucleus size, roundness, contour length, and intra-nucleus texture data (GLCM is one of the methods). In our cell-level analog of GLCM, each nucleus in the tissue image takes the role of one pixel. In this approach the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. Because one "pixel" corresponds to one nucleus feature, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM is shown to be a useful quantitative method for measuring pleomorphism and heterogeneity in histopathological image analysis.
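A minimal sketch of the co-occurrence idea described above, with hypothetical nucleus features and a simple nearest-neighbour definition of "neighbourhood" (the paper defines three neighbourhood types and uses the full Haralick feature set; only one illustrative variant and one contrast-like statistic are shown here):

import numpy as np

# hypothetical per-nucleus data: (x, y) centroid and nucleus area
centroids = np.array([[10, 12], [14, 15], [40, 42], [43, 40], [80, 10]], float)
areas = np.array([52.0, 60.0, 110.0, 95.0, 58.0])

# quantize the chosen nucleus feature (area) into a small number of levels
n_levels = 4
levels = np.digitize(areas, np.quantile(areas, [0.25, 0.5, 0.75]))  # values 0..3

# neighbourhood: each nucleus paired with its single nearest neighbour
cooc = np.zeros((n_levels, n_levels))
for i, ci in enumerate(centroids):
    d = np.linalg.norm(centroids - ci, axis=1)
    d[i] = np.inf
    j = int(np.argmin(d))
    cooc[levels[i], levels[j]] += 1

# normalize and compute a Haralick-style contrast as a pleomorphism proxy
p = cooc / cooc.sum()
idx = np.arange(n_levels)
contrast = float(((idx[:, None] - idx[None, :]) ** 2 * p).sum())
print(contrast)

Here the nucleus area plays the role of the gray level and the nearest neighbour plays the role of the pixel offset; the paper's actual neighbourhood definitions and feature computations differ in detail.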
Quantitative imaging of protein targets in the human brain with PET
NASA Astrophysics Data System (ADS)
Gunn, Roger N.; Slifstein, Mark; Searle, Graham E.; Price, Julie C.
2015-11-01
PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins; G-protein coupled receptors, membrane transporters, ligand gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data, associated blood and anatomical MRI data as the inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross sectional studies, classification studies, intervention studies and longitudinal studies. Finally we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data including motion artifacts, partial volume effects, age effects, image registration and normalization, input functions and metabolites, parametric imaging, receptor internalization and genetic factors.
Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong
2016-02-01
The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances. Copyright © 2015 Elsevier Inc. All rights reserved.
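A minimal sketch of the standard-curve arithmetic behind a Bradford quantitation, with hypothetical absorbance values (this illustrates generic Bradford practice, not the specific sucrose-assisted protocol of the study):

import numpy as np

# hypothetical BSA standards (mg/mL) and their A595 readings
std_conc = np.array([0.0, 0.125, 0.25, 0.5, 1.0])
std_a595 = np.array([0.00, 0.09, 0.17, 0.33, 0.62])

# linear standard curve: A595 = slope * conc + intercept
slope, intercept = np.polyfit(std_conc, std_a595, 1)

# unknown sample measured after acetone precipitation and resuspension
sample_a595 = 0.28
sample_conc = (sample_a595 - intercept) / slope

# correct for an assumed ~70% recovery if no additive is used
recovery = 0.70
print(f"apparent {sample_conc:.2f} mg/mL, recovery-corrected {sample_conc / recovery:.2f} mg/mL")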
Quantitative interpretation of Great Lakes remote sensing data
NASA Technical Reports Server (NTRS)
Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.
1980-01-01
The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.
Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall
2016-07-01
Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, few comparative data have been published. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen among all assays on clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R² ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
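The pairwise quantitative comparison reported here amounts to regressing log10 viral loads from one assay against another and reporting R², often alongside a bias estimate. A minimal sketch of that computation on made-up paired results:

```python
import numpy as np
from scipy import stats

# Hypothetical paired EBV loads (log10 copies/mL) from two assays on the same specimens
rng = np.random.default_rng(2)
assay_a = rng.uniform(2.5, 6.5, 50)
assay_b = 0.95 * assay_a + 0.2 + rng.normal(0, 0.25, 50)

res = stats.linregress(assay_a, assay_b)
print(f"slope={res.slope:.2f}, intercept={res.intercept:.2f}, R^2={res.rvalue**2:.2f}")

# Bland-Altman style bias between the two assays
diff = assay_b - assay_a
print(f"mean bias={diff.mean():.2f} log10 copies/mL, SD={diff.std(ddof=1):.2f}")
```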
Shackney, Stanley; Emlet, David R; Pollice, Agnese; Smith, Charles; Brown, Kathryn; Kociban, Deborah
2006-01-01
Laser scanning Cytometry (LSC) is a versatile technology that makes it possible to perform multiple measurements on individual cells and correlate them cell by cell with other cellular features. It would be highly desirable to be able to perform reproducible, quantitative, correlated cell-based immunofluorescence studies on individual cells from human solid tumors. However, such studies can be challenging because of the presence of large numbers of cell aggregates and other confounding factors. Techniques have been developed to deal with cell aggregates in data sets collected by LSC. Experience has also been gained in addressing other key technical and methodological issues that can affect the reproducibility of such cell-based immunofluorescence measurements. We describe practical aspects of cell sample collection, cell fixation and staining, protocols for performing multiparameter immunofluorescence measurements by LSC, use of controls and reference samples, and approaches to data analysis that we have found useful in improving the accuracy and reproducibility of LSC data obtained in human tumor samples. We provide examples of the potential advantages of LSC in examining quantitative aspects of cell-based analysis. Improvements in the quality of cell-based multiparameter immunofluorescence measurements make it possible to extract useful information from relatively small numbers of cells. This, in turn, permits the performance of multiple multicolor panels on each tumor sample. With links among the different panels that are provided by overlapping measurements, it is possible to develop increasingly more extensive profiles of intracellular expression of multiple proteins in clinical samples of human solid tumors. Examples of such linked panels of measurements are provided. Advances in methodology can improve cell-based multiparameter immunofluorescence measurements on cell suspensions from human solid tumors by LSC for use in prognostic and predictive clinical applications. Copyright (c) 2005 Wiley-Liss, Inc.
Emmert, Martin; Meszmer, Nina; Schlesinger, Mark
2018-02-05
Little is known about the usefulness of online ratings when searching for a hospital. We therefore assess the association between quantitative and qualitative online ratings for US hospitals and clinical quality of care measures. First, we collected a stratified random sample of 1000 quantitative and qualitative online ratings for hospitals from the website RateMDs. We used an integrated iterative approach to develop a categorization scheme to capture both the topics and the sentiment in the narrative comments. Next, we matched the online ratings with hospital-level quality measures published by the Centers for Medicare and Medicaid Services. For nominally scaled measures, we checked for differences in the distribution among the online rating categories; for metrically scaled measures, we applied the Spearman rank correlation coefficient. Thirteen of the twenty-nine quality of care measures were significantly associated with the quantitative online ratings (Spearman ρ = ±0.143; p < 0.05 for all); of these, eight associations indicated better clinical outcomes for better online ratings. Seven of the twenty-nine clinical measures were significantly associated with the sentiment of patient narratives (ρ = ±0.114; p < 0.05 for all), of which four associations indicated worse clinical outcomes for more favorable narrative comments. There thus seems to be some association between quantitative online ratings and clinical performance measures. However, the relatively weak and inconsistent direction of the associations, as well as the lack of association with several other clinical measures, does not support strong conclusions. Narrative comments, in their current form, also seem to have limited potential to reflect the clinical quality of care. Thus, online ratings are of limited usefulness in guiding patients towards high-performing hospitals from a clinical point of view. Nevertheless, patients might prefer different aspects of care when choosing a hospital.
Schoenherr, Jordan Richard; Hamstra, Stanley J
2016-08-01
Psychometrics has recently undergone extensive criticism within the medical education literature. The use of quantitative measurement using psychometric instruments such as response scales is thought to emphasize a narrow range of relevant learner skills and competencies. Recent reviews and commentaries suggest that a paradigm shift might be presently underway. We argue for caution, in that the psychometrics approach and the quantitative account of competencies that it reflects is based on a rich discussion regarding measurement and scaling that led to the establishment of this paradigm. Rather than reflecting a homogeneous discipline focused on core competencies devoid of consideration of context, the psychometric community has a history of discourse and debate within the field, with an acknowledgement that the techniques and instruments developed within psychometrics are heuristics that must be used pragmatically.
Harden, J.W.
1982-01-01
A soil development index has been developed in order to quantitatively measure the degree of soil profile development. This index, which combines eight soil field properties with soil thickness, is designed from field descriptions of the Merced River chronosequence in central California. These eight properties are: clay films, texture plus wet consistence, rubification (color hue and chroma), structure, dry consistence, moist consistence, color value, and pH. Other properties described in the field can be added when more soils are studied. Most of the properties change systematically within the 3 m.y. age span of the Merced River chronosequence. The absence of properties on occasion does not significantly affect the index. Individual quantified field properties, as well as the integrated index, are examined and compared as functions of soil depth and age. © 1982.
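The abstract describes the index only qualitatively. One plausible reading, sketched below with invented numbers, is that each field property is scored and normalized, each horizon's mean normalized score is weighted by its thickness, and the weighted scores are summed over the profile; this generic recipe is for illustration only and is not the published index itself.

```python
import numpy as np

# Invented horizon data: thickness (cm) and normalized (0-1) scores for a few field properties
horizons = [
    {"thickness_cm": 15, "scores": {"clay_films": 0.1, "rubification": 0.2, "structure": 0.3}},
    {"thickness_cm": 30, "scores": {"clay_films": 0.4, "rubification": 0.5, "structure": 0.4}},
    {"thickness_cm": 40, "scores": {"clay_films": 0.2, "rubification": 0.3, "structure": 0.2}},
]

def profile_development_index(horizons):
    """Thickness-weighted sum of each horizon's mean normalized property score."""
    total = 0.0
    for h in horizons:
        mean_score = np.mean(list(h["scores"].values()))
        total += mean_score * h["thickness_cm"]
    return total

print(f"profile development index = {profile_development_index(horizons):.1f}")
```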
Advancing effects analysis for integrated, large-scale wildfire risk assessment
Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager
2011-01-01
In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...
ERIC Educational Resources Information Center
Hsu, Pi-Shan; Chang, Te-Jeng; Wu, Ming-Hsiung
2009-01-01
The level of learners' expertise has been used as a metric and diagnostic mechanism of instruction. This metric influences mental effort directly according to the applications of cognitive load theory. Cognitive efficiency, an optimal measurement technique of expertise, was developed by Kalyuga and Sweller to replace instructional efficiency in…
Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng
2016-04-01
The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole-breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm that combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented FGT volume (mm³) / segmented whole-breast volume (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT corresponded to calculated mean quantitative FGT (%) of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed from breast MRI data and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable clinical tool by providing computer-generated standardized measurements with limited intra- or inter-observer variability.
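Once the segmentation masks exist, the quantitative endpoint is a simple volume ratio. A minimal sketch of that final computation, assuming boolean 3-D masks and a known voxel size (the proprietary segmentation step itself is not reproduced):

```python
import numpy as np

def fgt_percent(fgt_mask, breast_mask, voxel_size_mm=(1.0, 1.0, 1.0)):
    """FGT (%) = segmented FGT volume / segmented whole-breast volume * 100."""
    voxel_mm3 = float(np.prod(voxel_size_mm))
    fgt_volume = fgt_mask.sum() * voxel_mm3
    breast_volume = breast_mask.sum() * voxel_mm3
    return 100.0 * fgt_volume / breast_volume

# Toy example: a spherical "breast" containing a smaller fibroglandular region
z, y, x = np.ogrid[:64, :64, :64]
breast = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 30 ** 2
fgt = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 15 ** 2
print(f"FGT = {fgt_percent(fgt, breast):.1f}%")
```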
Quantitative color measurement of pH indicator paper using trichromatic LEDs and TCS230 color sensor
NASA Astrophysics Data System (ADS)
Ghorude, T. N.; Chaudhari, A. L.; Shaligram, A. D.
2008-11-01
Quantitative analysis of pH indicator paper color is needed in various fields. An indigenously developed tristimulus colorimeter is used in this work for pH indicator paper color measurement. The colorimeter uses trichromatic RGB LEDs and a programmable color light-to-frequency converter (TCS230), which combines configurable silicon photodiodes and a current-to-frequency converter on a single monolithic CMOS integrated circuit. The output is a square wave (50% duty cycle) with a frequency directly proportional to light intensity, and the digital input and output interface directly with a microcontroller. The light-to-frequency converter reads an 8×8 array of photodiodes: 16 photodiodes have red filters, 16 have green filters, 16 have blue filters, and 16 are clear with no filters. All 16 photodiodes of the same color are connected in parallel, and the type of photodiode used during operation is pin-selectable. Solutions of different standard pH were prepared, and the indicator paper changed color when dipped in each solution. Using the developed RGB colorimeter, chromaticity coordinates were measured and compared with those measured using an Ocean Optics HR-4000 high-resolution spectrophotometer.
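Reducing the TCS230 output frequencies to chromaticity coordinates involves dark/white normalization of the R, G, and B channels followed by a linear map to CIE XYZ. A rough sketch is below; the RGB-to-XYZ matrix used here is the generic sRGB/D65 matrix purely as a placeholder, whereas a real instrument would use a matrix derived from its own calibration against the reference spectrophotometer.

```python
import numpy as np

# Placeholder linear-RGB -> CIE XYZ matrix (sRGB/D65); a real instrument would use a
# matrix derived from its own calibration against a reference spectrophotometer.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def chromaticity(freq_rgb, dark_rgb, white_rgb):
    """Reduce TCS230 output frequencies (proportional to intensity) to CIE x, y."""
    f = np.asarray(freq_rgb, float)
    rgb = (f - dark_rgb) / (np.asarray(white_rgb, float) - dark_rgb)  # normalize to white reference
    X, Y, Z = RGB_TO_XYZ @ rgb
    s = X + Y + Z
    return X / s, Y / s

# Hypothetical readings for one pH strip: (R, G, B) output frequencies in kHz
dark = np.array([1.2, 1.2, 1.1])
white = np.array([48.0, 52.0, 45.0])
strip = np.array([30.0, 41.0, 20.0])
x, y = chromaticity(strip, dark, white)
print(f"x = {x:.3f}, y = {y:.3f}")
```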
Fällman, Erik; Schedin, Staffan; Jass, Jana; Andersson, Magnus; Uhlin, Bernt Eric; Axner, Ove
2004-06-15
An optical force measurement system for quantitating forces in the pN range between micrometer-sized objects has been developed. The system was based upon optical tweezers in combination with a sensitive position detection system and constructed around an inverted microscope. A trapped particle in the focus of the high numerical aperture microscope-objective behaves like an omnidirectional mechanical spring in response to an external force. The particle's displacement from the equilibrium position is therefore a direct measure of the exerted force. A weak probe laser beam, focused directly below the trapping focus, was used for position detection of the trapped particle (a polystyrene bead). The bead and the condenser focus the light to a distinct spot in the far field, monitored by a position sensitive detector. Various calibration procedures were implemented in order to provide absolute force measurements. The system has been used to measure the binding forces between Escherichia coli bacterial adhesins and galabiose-functionalized beads.
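Because the trapped bead behaves as a Hookean spring, converting the detector signal to force reduces to two calibration constants: the detector's volt-to-nanometer conversion and the trap stiffness. A minimal sketch with invented calibration values:

```python
import numpy as np

# Assumed calibration constants, for illustration only
DETECTOR_NM_PER_VOLT = 250.0      # position-sensitive detector conversion, nm/V
TRAP_STIFFNESS_PN_PER_NM = 0.05   # trap stiffness k, pN/nm (from a separate calibration)

def force_pN(detector_volts, baseline_volts=0.0):
    """Hooke's law: F = k * x, with x the bead displacement from the trap centre."""
    displacement_nm = (detector_volts - baseline_volts) * DETECTOR_NM_PER_VOLT
    return TRAP_STIFFNESS_PN_PER_NM * displacement_nm

readings = np.array([0.02, 0.10, 0.35, 0.60])   # detector voltages during a pulling event
print(force_pN(readings))                        # forces in pN
```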
NASA Astrophysics Data System (ADS)
Delhez, Robert; Van der Gaast, S. J.; Wielders, Arno; de Boer, J. L.; Helmholdt, R. B.; van Mechelen, J.; Reiss, C.; Woning, L.; Schenk, H.
2003-02-01
The mineralogy of the surface material of Mars is the key to disclosing its present and past life and climates; clay mineral species, carbonates, and ices (water and CO2) are, or contain, the witnesses of both. X-ray powder diffraction (XRPD) is the most powerful analytical method for identifying and quantitatively characterizing minerals in complex mixtures. This paper discusses the development of a working model of an instrument consisting of a reflection-mode diffractometer and a transmission-mode CCD-XRPD instrument, combined with an XRF module. The CCD-XRD/XRF instrument is analogous to the instrument for Mars missions developed by Sarrazin et al. (1998). This part of the tandem instrument enables "quick and dirty" analysis of powdered (!) matter to monitor semi-quantitatively the presence of clay minerals as a group, carbonates, and ices, and yields semi-quantitative chemical information from X-ray fluorescence (XRF). The reflection-mode instrument (i) enables in-situ measurements of rocks and soils and provides quantitative information on the compounds identified, (ii) has high resolution and reveals large spacings for accurate identification, in particular of clay mineral species, and (iii) yields line-profile shapes that reveal the kind and approximate amounts of lattice imperfections present. It is shown that the information obtained with the reflection-mode diffractometer is crucial for finding signs of life and changes in the climate on Mars. This instrument can, of course, also be used for other extraterrestrial research.
Kennedy, Gordon J; Afeworki, Mobae; Calabro, David C; Chase, Clarence E; Smiley, Randolph J
2004-06-01
Distinct hydrogen species are present in important inorganic solids such as zeolites, silicoaluminophosphates (SAPOs), mesoporous materials, amorphous silicas, and aluminas. These H species include hydrogens associated with acidic sites such as Al(OH)Si, non-framework aluminum sites, silanols, and surface functionalities. Direct and quantitative methodology to identify, measure, and monitor these hydrogen species are key to monitoring catalyst activity, optimizing synthesis conditions, tracking post-synthesis structural modifications, and in the preparation of novel catalytic materials. Many workers have developed several techniques to address these issues, including 1H MAS NMR (magic-angle spinning nuclear magnetic resonance). 1H MAS NMR offers many potential advantages over other techniques, but care is needed in recognizing experimental limitations and developing sample handling and NMR methodology to obtain quantitatively reliable data. A simplified approach is described that permits vacuum dehydration of multiple samples simultaneously and directly in the MAS rotor without the need for epoxy, flame sealing, or extensive glovebox use. We have found that careful optimization of important NMR conditions, such as magnetic field homogeneity and magic angle setting are necessary to acquire quantitative, high-resolution spectra that accurately measure the concentrations of the different hydrogen species present. Details of this 1H MAS NMR methodology with representative applications to zeolites, SAPOs, M41S, and silicas as a function of synthesis conditions and post-synthesis treatments (i.e., steaming, thermal dehydroxylation, and functionalization) are presented.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approach depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on the relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.
2004-04-01
There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb) using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
Note: Measuring instrument of singlet oxygen quantum yield in photodynamic effects
NASA Astrophysics Data System (ADS)
Li, Zhongwei; Zhang, Pengwei; Zang, Lixin; Qin, Feng; Zhang, Zhiguo; Zhang, Hongli
2017-06-01
Using diphenylisobenzofuran (C20H14O) as a singlet oxygen (1O2) reporter, a comparison method, which can be used to measure the singlet oxygen quantum yield (ΦΔ) of the photosensitizer quantitatively, is presented in this paper. Based on this method, an automatic measuring instrument of singlet oxygen quantum yield is developed. The singlet oxygen quantum yield of the photosensitizer hermimether and aloe-emodin is measured. It is found that the measuring results are identical to the existing ones, which verifies the validity of the measuring instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oiko, V. T. A., E-mail: oiko@ifi.unicamp.br; Rodrigues, V.; Ugarte, D.
2014-03-15
Understanding the mechanical properties of nanoscale systems requires new experimental and theoretical tools. In particular, force sensors compatible with nanomechanical testing experiments and with sensitivity in the nN range are required. Here, we report the development and testing of a tuning-fork-based force sensor for in situ nanomanipulation experiments inside a scanning electron microscope. The sensor uses a very simple design for the electronics and it allows the direct and quantitative force measurement in the 1–100 nN force range. The sensor response is initially calibrated against a nN range force standard, as, for example, a calibrated Atomic Force Microscopy cantilever; subsequently, applied force values can be directly derived using only the electric signals generated by the tuning fork. Using a homemade nanomanipulator, the quantitative force sensor has been used to analyze the mechanical deformation of multi-walled carbon nanotube bundles, where we analyzed forces in the 5–40 nN range, measured with an error bar of a few nN.
A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.
Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
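The image-analysis step amounts to separating stained biomass from background in each photograph and reporting coverage and intensity metrics. A minimal sketch with scikit-image is shown below; the channel choice and Otsu thresholding are generic choices for illustration, not necessarily the published algorithm.

```python
import numpy as np
from skimage import io, filters, morphology

def fouling_metrics(image_path, channel=1):
    """Quantify surface fouling in a photograph of a stained coupon."""
    img = io.imread(image_path)
    stain = img[..., channel].astype(float)     # e.g. green channel for a green stain
    thresh = filters.threshold_otsu(stain)      # global threshold: biomass vs. background
    mask = morphology.remove_small_objects(stain > thresh, min_size=25)
    coverage = mask.mean()                      # fraction of the surface covered
    intensity = stain[mask].sum()               # summed stain signal ("growth intensity")
    return coverage, intensity

# coverage, intensity = fouling_metrics("coupon_photo.png")  # hypothetical file name
```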
Researching Participation in Adult Education: The Potential of the Qualitative Perspective.
ERIC Educational Resources Information Center
Rockhill, Kathleen
1982-01-01
Critiques research on participation in terms of problems of measurement, definition, and the use of value-laden constructs. Compares qualitative and quantitative methods, further developing the qualitative approach and the influence of hermeneutics and phenomenology. (Author/SK)
We report the development of a quantifiable exposure indicator for measuring the presence of environmental estrogens in aquatic systems. Synthetic oligonucleotides, designed specifically for the vitellogenin gene (Vg) transcription product, were used in a Reverse Transcription Po...
Metabolic gradients: a new system for old questions.
Blackstone, Neil W
2008-04-22
Metabolic gradients are likely to be crucial to normal and abnormal development of cells and tissues. As shown by a new study, a Xenopus egg model system has great promise to illuminate quantitative measures of metabolic gradients in living cytoplasm.
Zaharchuk, Greg; Busse, Reed F; Rosenthal, Guy; Manley, Geoffery T; Glenn, Orit A; Dillon, William P
2006-08-01
The oxygen partial pressure (pO2) of human body fluids reflects the oxygenation status of surrounding tissues. All existing fluid pO2 measurements are invasive, requiring either microelectrode/optode placement or fluid removal. The purpose of this study is to develop a noninvasive magnetic resonance imaging method to measure the pO2 of human body fluids. We developed an imaging paradigm that exploits the paramagnetism of molecular oxygen to create quantitative images of fluid oxygenation. A single-shot fast spin echo pulse sequence was modified to minimize artifacts from motion, fluid flow, and partial volume. Longitudinal relaxation rate (R1 = 1/T1) was measured with a time-efficient nonequilibrium saturation recovery method and correlated with pO2 measured in phantoms. pO2 images of human and fetal cerebrospinal fluid, bladder urine, and vitreous humor are presented and quantitative oxygenation levels are compared with prior literature estimates, where available. Significant pO2 increases are shown in cerebrospinal fluid and vitreous following 100% oxygen inhalation. Potential errors due to temperature, fluid flow, and partial volume are discussed. Noninvasive measurements of human body fluid pO2 in vivo are presented, which yield reasonable values based on prior literature estimates. This rapid imaging-based measurement of fluid oxygenation may provide insight into normal physiology as well as changes due to disease or during treatment.
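The method rests on the approximately linear dependence of the fluid's longitudinal relaxation rate R1 on dissolved oxygen, so a phantom calibration line converts measured R1 values into pO2. A minimal sketch of that calibration-and-inversion step with invented phantom numbers (in practice the relaxivity also depends on temperature and field strength):

```python
import numpy as np

# Hypothetical phantom calibration: measured R1 (1/s) at known pO2 (mmHg)
pO2_phantom = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
R1_phantom = np.array([0.33, 0.345, 0.36, 0.39, 0.45])

# Linear model R1 = R1_0 + r1_O2 * pO2
r1_O2, R1_0 = np.polyfit(pO2_phantom, R1_phantom, 1)

def pO2_from_R1(R1_measured):
    """Invert the calibration line to map an in vivo R1 map to a pO2 map."""
    return (np.asarray(R1_measured) - R1_0) / r1_O2

R1_voxels = np.array([0.35, 0.37, 0.41])   # example voxel-wise R1 values from saturation recovery
print(pO2_from_R1(R1_voxels))              # estimated pO2 in mmHg
```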
Code of Federal Regulations, 2014 CFR
2014-04-01
... regarding a variety of quantitative measurements of their covered trading activities, which vary depending... entity's covered trading activities. c. The quantitative measurements that must be furnished pursuant to... prior to September 30, 2015. e. In addition to the quantitative measurements required in this appendix...
NASA Astrophysics Data System (ADS)
Olson, Jonathan D.; Kanick, Stephen C.; Bravo, Jaime J.; Roberts, David W.; Paulsen, Keith D.
2016-03-01
Aminolevulinic-acid-induced protoporphyrin IX (ALA-PpIX) is being investigated as a biomarker to guide neurosurgical resection of brain tumors. ALA-PpIX fluorescence can be observed visually in the surgical field; however, raw fluorescence emissions can be distorted by factors other than the fluorophore concentration. Specifically, fluorescence emissions are mixed with autofluorescence and attenuated by the background absorption and scattering properties of the tissue. Recent work at Dartmouth has developed advanced fluorescence detection approaches that return quantitative assessments of PpIX concentration, independent of background optical properties. The quantitative fluorescence imaging (qFI) approach has increased sensitivity to residual disease within the resection cavity at the end of surgery that is not visible to the naked eye through the operating microscope. This presentation outlines clinical observations made during an ongoing investigation of ALA-PpIX-based guidance of tumor resection. PpIX fluorescence measurements made in a wide-field hyperspectral imaging approach are co-registered with point assessments using a fiber optic probe. Data show variations in the measured PpIX accumulation among different clinical tumor grades (i.e., high-grade glioma, low-grade glioma), types (i.e., primary tumors, metastases), and normal structures of interest (e.g., normal cortex, hippocampus). These results highlight the contrast enhancement and underscore the potential clinical benefit offered by quantitative measurements of PpIX concentration during resection of intracranial tumors.
A satellite technique for quantitatively mapping rainfall rates over the oceans
NASA Technical Reports Server (NTRS)
Wilheit, T. T.; Roa, M. S. V.; Chang, T. C.; Rodgers, E. B.; Theon, J. S.
1975-01-01
A theoretical model for calculating microwave radiative transfer in raining atmospheres is developed. These calculations are compared with microwave brightness temperatures at a wavelength of 1.55 cm measured on the Nimbus-5 satellite and rain rates derived from WSR-57 meteorological radar measurements. A specially designed ground based verification experiment was also performed wherein upward viewing microwave brightness temperature measurements at wavelengths of 1.55 cm and 0.81 cm were compared with directly measured rain rates.
Cordella, Claire; Dickerson, Bradford C.; Quimby, Megan; Yunusova, Yana; Green, Jordan R.
2016-01-01
Background Primary progressive aphasia (PPA) is a neurodegenerative aphasic syndrome with three distinct clinical variants: non-fluent (nfvPPA), logopenic (lvPPA), and semantic (svPPA). Speech (non-) fluency is a key diagnostic marker used to aid identification of the clinical variants, and researchers have been actively developing diagnostic tools to assess speech fluency. Current approaches reveal coarse differences in fluency between subgroups, but often fail to clearly differentiate nfvPPA from the variably fluent lvPPA. More robust subtype differentiation may be possible with finer-grained measures of fluency. Aims We sought to identify the quantitative measures of speech rate—including articulation rate and pausing measures—that best differentiated PPA subtypes, specifically the non-fluent group (nfvPPA) from the more fluent groups (lvPPA, svPPA). The diagnostic accuracy of the quantitative speech rate variables was compared to that of a speech fluency impairment rating made by clinicians. Methods and Procedures Automatic estimates of pause and speech segment durations and rate measures were derived from connected speech samples of participants with PPA (N=38; 11 nfvPPA, 14 lvPPA, 13 svPPA) and healthy age-matched controls (N=8). Clinician ratings of fluency impairment were made using a previously validated clinician rating scale developed specifically for use in PPA. Receiver operating characteristic (ROC) analyses enabled a quantification of diagnostic accuracy. Outcomes and Results Among the quantitative measures, articulation rate was the most effective for differentiating between nfvPPA and the more fluent lvPPA and svPPA groups. The diagnostic accuracy of both speech and articulation rate measures was markedly better than that of the clinician rating scale, and articulation rate was the best classifier overall. Area under the curve (AUC) values for articulation rate were good to excellent for identifying nfvPPA from both svPPA (AUC=.96) and lvPPA (AUC=.86). Cross-validation of accuracy results for articulation rate showed good generalizability outside the training dataset. Conclusions Results provide empirical support for (1) the efficacy of quantitative assessments of speech fluency and (2) a distinct non-fluent PPA subtype characterized, at least in part, by an underlying disturbance in speech motor control. The trend toward improved classifier performance for quantitative rate measures demonstrates the potential for a more accurate and reliable approach to subtyping in the fluency domain, and suggests that articulation rate may be a useful input variable as part of a multi-dimensional clinical subtyping approach. PMID:28757671
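The diagnostic-accuracy analysis described here is a standard ROC analysis of a continuous predictor (articulation rate) against a binary subgroup label. A minimal sketch with scikit-learn, using fabricated rates purely to show the computation:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Fabricated articulation rates (syllables/s): nfvPPA tends to be slower than lvPPA
rate_nfv = np.array([2.1, 2.4, 2.8, 3.0, 2.6, 2.2, 2.9, 3.1, 2.5, 2.7, 2.3])
rate_lv = np.array([3.6, 4.1, 3.9, 4.4, 3.5, 4.0, 4.2, 3.8, 4.5, 3.7, 4.3, 3.4, 4.6, 3.9])

y_true = np.concatenate([np.ones_like(rate_nfv), np.zeros_like(rate_lv)])  # 1 = nfvPPA
scores = -np.concatenate([rate_nfv, rate_lv])   # lower rate -> higher "non-fluent" score

auc = roc_auc_score(y_true, scores)
fpr, tpr, thresholds = roc_curve(y_true, scores)
print(f"AUC = {auc:.2f}")
```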
NASA Astrophysics Data System (ADS)
Medjoubi, K.; Dawiec, A.
2017-12-01
A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of hybrid photon counting (HPC) pixel detectors. This approach is based on the photon transfer curve (PTC), i.e., the measurement of the standard deviation of the signal in flat-field images. Fixed-pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity, and the remnant errors of the flat-fielding technique. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The quantitative evaluation of the FPN, described by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is used to evaluate the best settings for obtaining the best image quality from a commercial or R&D detector.
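A photon transfer curve plots measured noise against the mean signal of flat-field images; shot noise scales as the square root of the signal while fixed-pattern noise scales linearly, so the PRNU can be read off a fit to the total variance. A minimal sketch with simulated flat fields (the gain and PRNU values are invented and the detector model is idealized):

```python
import numpy as np

rng = np.random.default_rng(3)
prnu_true = 0.01                                             # 1% pixel response non-uniformity
pixel_gain = 1.0 + prnu_true * rng.standard_normal((256, 256))  # fixed per-pixel response

means, variances = [], []
for flux in np.logspace(1, 4, 12):                           # mean photons/pixel per flat field
    flat = rng.poisson(flux, (256, 256)) * pixel_gain        # shot noise, then fixed-pattern gain
    means.append(flat.mean())
    variances.append(flat.var())

means, variances = np.array(means), np.array(variances)
# Total variance model: var = mean (shot noise, unity gain) + (PRNU * mean)^2 (fixed pattern)
slope, _ = np.polyfit(means**2, variances - means, 1)
prnu_est = np.sqrt(max(slope, 0.0))
print(f"estimated PRNU = {prnu_est*100:.2f} % (true {prnu_true*100:.1f} %)")
```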
A preliminary study of DTI Fingerprinting on stroke analysis.
Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo
2014-01-01
DTI (diffusion tensor imaging) is a well-known MRI (magnetic resonance imaging) technique which provides useful structural information about the human brain. However, quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automated quantitative method for DTI analysis would enhance the application of DTI in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied to stroke classification. The TBSS (tract-based spatial statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (regions of interest) were manually drawn on the DWI images as a reference, and the results from DTI Fingerprinting were compared with those obtained from the reference ROIs. The comparison indicates that DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology for clinical use.
Composition-explicit distillation curves of aviation fuel JP-8 and a coal-based jet fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beverly L. Smith; Thomas J. Bruno
2007-09-15
We have recently introduced several important improvements in the measurement of distillation curves for complex fluids. The modifications to the classical measurement provide for (1) a composition explicit data channel for each distillate fraction (for both qualitative and quantitative analysis); (2) temperature measurements that are true thermodynamic state points; (3) temperature, volume, and pressure measurements of low uncertainty suitable for an equation of state development; (4) consistency with a century of historical data; (5) an assessment of the energy content of each distillate fraction; (6) a trace chemical analysis of each distillate fraction; and (7) a corrosivity assessment of each distillate fraction. The most significant modification is achieved with a new sampling approach that allows precise qualitative as well as quantitative analyses of each fraction, on the fly. We have applied the new method to the measurement of rocket propellant, gasoline, and jet fuels. In this paper, we present the application of the technique to representative batches of the military aviation fuel JP-8, and also to a coal-derived fuel developed as a potential substitute. We present not only the distillation curves but also a chemical characterization of each fraction and discuss the contrasts between the two fluids. 26 refs., 5 figs., 6 tabs.
Analysis of airborne MAIS imaging spectrometric data for mineral exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Jinnian; Zheng Lanfen; Tong Qingxi
1996-11-01
The high spectral resolution of imaging spectrometric systems makes quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for deriving surface parameters from imaging spectrometer data. This paper describes the methods and the stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer imagery, where laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data and atmospheric correction is applied to obtain ground reflectance using the empirical line method and radiative transfer modeling; (2) determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals; and (3) spectral comparison between spectra from a spectral library and spectra derived from the imagery. A wavelet-analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for the analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area of China. 8 refs., 8 figs.
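Stage (1), the empirical line method, fits a per-band linear mapping from at-sensor radiance to ground reflectance using targets whose reflectance was measured in the field. A minimal single-band sketch with invented numbers:

```python
import numpy as np

# Field-measured reflectance of two calibration targets and their at-sensor radiances (one band)
target_reflectance = np.array([0.05, 0.45])     # dark and bright target
target_radiance = np.array([12.0, 68.0])        # at-sensor radiance (arbitrary units)

# Empirical line: reflectance = gain * radiance + offset, fitted per band
gain, offset = np.polyfit(target_radiance, target_reflectance, 1)

image_band = np.array([[15.0, 30.0], [52.0, 70.0]])   # toy 2x2 radiance image for this band
reflectance = gain * image_band + offset
print(reflectance)
```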
Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian
The role that mechanical forces play in biological processes such as cell movement and death is of growing interest as we seek to further our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity, which maps a sample's birefringence, i.e., its direction-dependent refractive indices, using polarized light. However, this method only provides qualitative data, and for stress information to be useful, quantitative data are required. Ptychography is a method for quantitatively determining the phase of a sample's complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University, Melbourne, on achieving quantitative birefringence mapping using polarized-light ptychography, with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.
Quantitation without Calibration: Response Profile as an Indicator of Target Amount.
Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V
2018-06-21
Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to the interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration that correlates a measured signal to a target amount; this step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of the response profile, rather than on an absolute signal value, for assessment of a target's amount. To enable this capability we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable to a variety of biomarkers such as nucleic acids, proteins, peptides, and others.
Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.
2015-01-01
Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246
Quantitative Interferometry in the Severe Acoustic Environment of Resonant Supersonic Jets
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Raman, Ganesh
1999-01-01
Understanding fundamental fluid dynamic and acoustic processes in high-speed jets requires quantitative velocity, density, and temperature measurements. In this paper we demonstrate a new, robust liquid crystal point diffraction interferometer (LCPDI) that includes phase stepping and can provide accurate data even in the presence of intense acoustic fields. This novel common-path interferometer was developed to overcome difficulties with the Mach-Zehnder interferometer in vibratory environments and is applied here to the case of a supersonic, shock-containing jet. The environmentally insensitive LCPDI, which is easy to align and capable of measuring optical wavefronts with high accuracy, is briefly described; then integrated line-of-sight density data from the LCPDI for two underexpanded jets are presented.
NASA Technical Reports Server (NTRS)
Weldon, J. W.
1973-01-01
An investigation was conducted to develop a procedure for obtaining quantitative values for chlorophyll and turbidity in coastal waters by observing the changes in spectral radiance of the backscattered spectrum. The technique under consideration consists of examining Exotech Model 20-D spectral radiometer data and determining which radiance ratios best correlate with chlorophyll and turbidity measurements obtained from analyses of water samples and Secchi visibility readings. Preliminary results indicate that there is a correlation between backscattered light and chlorophyll concentration and Secchi visibility. The tests were conducted with the spectrometer mounted in a light aircraft over the Mississippi Sound at altitudes of 2.5K, 2.8K, and 10K feet.
Quantifying Disease Progression in Amyotrophic Lateral Sclerosis
Simon, Neil G; Turner, Martin R; Vucic, Steve; Al-Chalabi, Ammar; Shefner, Jeremy; Lomen-Hoerth, Catherine; Kiernan, Matthew C
2014-01-01
Amyotrophic lateral sclerosis (ALS) exhibits characteristic variability of onset and rate of disease progression, with inherent clinical heterogeneity making disease quantitation difficult. Recent advances in understanding pathogenic mechanisms linked to the development of ALS impose an increasing need to develop strategies to predict and more objectively measure disease progression. This review explores phenotypic and genetic determinants of disease progression in ALS, and examines established and evolving biomarkers that may contribute to robust measurement in longitudinal clinical studies. With targeted neuroprotective strategies on the horizon, developing efficiencies in clinical trial design may facilitate timely entry of novel treatments into the clinic. PMID:25223628
Measuring faculty retention and success in academic medicine.
Ries, Andrew; Wingard, Deborah; Gamst, Anthony; Larsen, Catherine; Farrell, Elizabeth; Reznik, Vivian
2012-08-01
To develop and demonstrate the usefulness of quantitative methods for assessing retention and academic success of junior faculty in academic medicine. The authors created matched sets of participants and nonparticipants in a junior faculty development program based on hire date and academic series for newly hired assistant professors at the University of California, San Diego (UCSD), School of Medicine between 1988 and 2005. They used Kaplan-Meier and Cox proportional hazards survival analyses to characterize the influence of covariates, including gender, ethnicity, and program participation, on retention. They also developed a new method for quantifying academic success based on several measures including (1) leadership and professional activities, (2) honors and awards, (3) research grants, (4) teaching and mentoring/advising activities, and (5) publications. The authors then used these measures to compare matched pairs of participating and nonparticipating faculty who were subsequently promoted and remained at UCSD. Compared with matched nonparticipants, the retention of junior faculty who participated in the faculty development program was significantly higher. Among those who were promoted and remained at UCSD, the academic success of faculty development participants was consistently greater than that of matched nonparticipants. This difference reached statistical significance for leadership and professional activities. Using better quantitative methods for evaluating retention and academic success will improve understanding and research in these areas. In this study, use of such methods indicated that organized junior faculty development programs have positive effects on faculty retention and may facilitate success in academic medicine.
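The retention comparison described here is a standard survival analysis of time-to-departure data. A minimal sketch with the lifelines package, on fabricated follow-up times, fitting Kaplan-Meier curves per group and a Cox model with program participation as a covariate:

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(4)
n = 60
df = pd.DataFrame({
    "years_at_institution": np.concatenate([rng.exponential(9, n), rng.exponential(6, n)]),
    "left": rng.integers(0, 2, 2 * n),        # 1 = departed, 0 = censored (still on faculty)
    "participant": np.concatenate([np.ones(n), np.zeros(n)]),
})

kmf = KaplanMeierFitter()
for group, label in [(1, "participants"), (0, "non-participants")]:
    sub = df[df["participant"] == group]
    kmf.fit(sub["years_at_institution"], event_observed=sub["left"], label=label)
    print(label, "median retention:", kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="years_at_institution", event_col="left")
print(cph.summary[["coef", "exp(coef)", "p"]])
```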
NASA Astrophysics Data System (ADS)
Singletary, Joanna Lynn Bush
This study evaluated the relationship of environmental service-learning to environmental literacy in undergraduates. The subjects were 36 undergraduates at a small liberal arts university enrolled in an environmental biology course. To determine the role of environmental service-learning in college students' environmental knowledge, attitudes, behaviors, and environmental literacy, this study utilized a concurrent mixed-methods approach for qualitative and quantitative analysis. A quasi-experimental repeated-measures design formed the quantitative component of the study. Data were collected on the attitude, behavior, and content knowledge aspects of environmental literacy as measured by the Environmental Literacy Survey (Kibert, 2000). Hypotheses were tested by independent-samples t-tests and repeated-measures ANOVA. Repeated-measures ANOVA conducted on participants' three subscale scores for the Environmental Literacy Survey (attitude, behavior, and knowledge) indicated that students who participated in environmental service-learning scored statistically significantly higher than those who did not initially participate in service-learning. Qualitative data collected in the form of journal reflections and portfolios were evaluated for themes of environmental attitudes or affective statements, environmentally positive behaviors and skills, and ecological content. Quantitative and qualitative data support the positive role of environmental service-learning in the development of environmental literacy in undergraduate students.
NASA Astrophysics Data System (ADS)
Shaar, R.; Farchi, E.; Farfurnik, D.; Ebert, Y.; Haim, G.; Bar-Gill, N.
2017-12-01
Magnetization in rock samples is crucial for paleomagnetometry research, as it harbors valuable geological information on long-term processes, such as tectonic movements and the formation of oceans and continents. Nevertheless, current techniques are limited in their ability to measure high-spatial-resolution, high-sensitivity, quantitative vectorial magnetic signatures from individual minerals and micrometer-scale samples. As a result, our understanding of bulk rock magnetization is limited, specifically for the case of multi-domain minerals. In this work we use a newly developed nitrogen-vacancy magnetic microscope capable of quantitative vectorial magnetic imaging with optical resolution. We demonstrate direct imaging of the vectorial magnetic field of a single, multi-domain dendritic magnetite, as well as the measurement and calculation of the weak magnetic moments of an individual grain on the micron scale. Our results were measured at a standoff distance of 3-10 μm, with 350 nm spatial resolution, magnetic sensitivity of 6 μT/√Hz, and a field of view of 35 μm. The results presented here show the capabilities and the future potential of NV microscopy in measuring the magnetic signals of individual micrometer-scale grains. These outcomes pave the way for future applications in paleomagnetometry and for the fundamental understanding of magnetization in multi-domain samples.
NASA Astrophysics Data System (ADS)
Koeppe, Robert Allen
Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations were compared to those predicted from the expired air and venous blood samples. The glucose analog (18)F-3-deoxy-3-fluoro-D-glucose (3-FDG) was used for quantitating the membrane transport rate of glucose. The measured data indicated that the phosphorylation rate of 3-FDG was low enough to allow accurate estimation of the transport rate using a two compartment model.
2011-01-01
used in efforts to develop QSAR models. Measurement of Repellent Efficacy Screening for Repellency of Compounds with Unknown Toxicology In screening...CPT) were used to develop Quantitative Structure-Activity Relationship (QSAR) models to predict repellency. Successful prediction of novel...acylpiperidine QSAR models employed 4 descriptors to describe the relationship between structure and repellent duration. The ANN model of the carboxamides did not
Epigenetics as a First Exit Problem
NASA Astrophysics Data System (ADS)
Aurell, E.; Sneppen, K.
2002-01-01
We develop a framework to discuss the stability of epigenetic states as first exit problems in dynamical systems with noise. We consider in particular the stability of the lysogenic state of the λ prophage. The formalism defines a quantitative measure of robustness of inherited states.
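The generic quantitative ingredient in such a treatment is the mean first-exit time of a noisy system from a metastable state. For orientation only (this is the standard overdamped Kramers form, not the paper's specific result), for dynamics \dot{x} = -U'(x) + \sqrt{2D}\,\xi(t):

```latex
\tau_{\mathrm{exit}} \;\simeq\;
  \frac{2\pi}{\sqrt{U''(x_{\min})\,\lvert U''(x_{\max})\rvert}}\,
  \exp\!\left(\frac{U(x_{\max}) - U(x_{\min})}{D}\right)
```

so the robustness of an inherited state grows exponentially with the barrier height relative to the noise strength.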
40 CFR Appendix E to Part 300 - Oil Spill Response
Code of Federal Regulations, 2013 CFR
2013-07-01
... and Health Administration RSPA—Research and Special Programs Administration USCG—United States Coast... major discharge regardless of the following quantitative measures: (a) Minor discharge means a discharge... protection of response teams and necessary research, development, demonstration, and evaluation to improve...
40 CFR Appendix E to Part 300 - Oil Spill Response
Code of Federal Regulations, 2011 CFR
2011-07-01
... and Health Administration RSPA—Research and Special Programs Administration USCG—United States Coast... major discharge regardless of the following quantitative measures: (a) Minor discharge means a discharge... protection of response teams and necessary research, development, demonstration, and evaluation to improve...
40 CFR Appendix E to Part 300 - Oil Spill Response
Code of Federal Regulations, 2012 CFR
2012-07-01
... and Health Administration RSPA—Research and Special Programs Administration USCG—United States Coast... major discharge regardless of the following quantitative measures: (a) Minor discharge means a discharge... protection of response teams and necessary research, development, demonstration, and evaluation to improve...
Methods of Writing Instruction Evaluation.
ERIC Educational Resources Information Center
Lamb, Bill H.
The Writing Program Director at Johnson County Community College (Kansas) developed quantitative measures for writing instruction evaluation which can support that institution's growing interest in and support for peer collaboration as a means to improving instructional quality. The first process (Interaction Analysis) has an observer measure…
Hahn, Cassidy M.; Iwanowicz, Luke R.; Cornman, Robert S.; Mazik, Patricia M.; Blazer, Vicki S.
2016-01-01
Environmental studies increasingly identify the presence of both contaminants of emerging concern (CECs) and legacy contaminants in aquatic environments; however, the biological effects of these compounds on resident fishes remain largely unknown. High-throughput methodologies were employed to establish partial transcriptomes for three wild-caught, non-model fish species: smallmouth bass (Micropterus dolomieu), white sucker (Catostomus commersonii), and brown bullhead (Ameiurus nebulosus). Sequences from these transcriptome databases were utilized in the development of a custom nCounter CodeSet that allowed for direct multiplexed measurement of 50 transcript abundance endpoints in liver tissue. Sequence information was also utilized in the development of quantitative real-time PCR (qPCR) primers. Cross-species hybridization allowed the smallmouth bass nCounter CodeSet to be used for quantitative transcript abundance analysis of an additional non-model species, largemouth bass (Micropterus salmoides). We validated the nCounter analysis data system with qPCR for a subset of genes and confirmed concordant results. Changes in transcript abundance biomarkers between sexes and seasons were evaluated to provide baseline data on transcript modulation for each species of interest.
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia
2015-08-26
Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.
NASA Technical Reports Server (NTRS)
Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)
2000-01-01
Lean direct-injection (LDI) spray flames offer the possibility of reducing NOx emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially-premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially-resolved LIF measurements of NO concentration (ppm) are reported for preheated, LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q2(26.5) transition of the γ(0,0) band. Detection is performed in a two-nanometer region centered on the γ(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames. Finally, PLIF is expanded to high pressure in an effort to quantify the detected fluorescence image for LDI flames. Success is achieved by correcting the PLIF calibration via a single-point LIF measurement. This procedure removes the influence of any preferential background that occurs in the PLIF detection window. In general, both the LIF and PLIF measurements verify that the LDI strategy could be used to reduce NOx emissions in future gas turbine combustors.
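A minimal sketch of the in-situ single-point calibration idea described above: doping a known NO level into the reactants provides one (signal, concentration) pair under matched flame conditions, and an unknown linear-LIF signal is then scaled by proportionality. The function and numbers are illustrative, and the real analysis includes corrections (laser energy, quenching environment, Boltzmann fraction) that are omitted here.

```python
def no_ppm_from_lif(signal, signal_cal, ppm_cal):
    """Single-point linear-LIF calibration: scale an unknown fluorescence
    signal by the signal measured for a known, doped NO concentration
    under matched conditions."""
    return ppm_cal * signal / signal_cal

# Illustrative: a calibration point of 50 ppm doped NO giving 1000 counts.
print(no_ppm_from_lif(signal=2.4e3, signal_cal=1.0e3, ppm_cal=50.0))  # 120 ppm
```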
A Quantitative and Qualitative Exploration of Photoaversion in Achromatopsia
Aboshiha, Jonathan; Kumaran, Neruban; Kalitzeos, Angelos; Hogg, Chris; Rubin, Gary; Michaelides, Michel
2017-01-01
Purpose: Photoaversion (PA) is a disabling and ubiquitous feature of achromatopsia (ACHM). We aimed to help define the characteristics of this important symptom, and present the first published assessment of its impact on patients' lives, as well as quantitative and qualitative PA assessments. Methods: Molecularly confirmed ACHM subjects were assessed for PA using four tasks: structured survey of patient experience, novel quantitative subjective measurement of PA, visual acuities in differing ambient lighting, and objective palpebral aperture-related PA testing. Results: Photoaversion in ACHM was found to be the most significant symptom for a substantial proportion (38%) of patients. A novel subjective PA measurement technique was developed and demonstrated fidelity with more invasive paradigms without exposing often very photosensitive patients to the brighter light intensities used elsewhere. An objective PA measurement was also refined for use in trials, indicating that higher light intensities than previously published are likely to be needed. Monocular testing, as required for trials, was also validated for the first time. Conclusions: This study offers new insights into PA in ACHM. It provides the first structured evidence of the great significance of this symptom to patients, suggesting that PA should be considered as an additional outcome measure in therapeutic trials. It also offers new insights into the characteristics of PA in ACHM, and describes both subjective and objective measures of PA that could be employed in clinical trials. PMID:28715587
NASA Astrophysics Data System (ADS)
Vappou, Jonathan; Bour, Pierre; Marquet, Fabrice; Ozenne, Valery; Quesson, Bruno
2018-05-01
Monitoring thermal therapies through medical imaging is essential in order to ensure that they are safe, efficient and reliable. In this paper, we propose a new approach, halfway between MR acoustic radiation force imaging (MR-ARFI) and MR elastography (MRE), allowing for the quantitative measurement of the elastic modulus of tissue in a highly localized manner. It relies on the simulation of the MR-ARFI profile, which depends on tissue biomechanical properties, and on the identification of tissue elasticity through the fitting of experimental displacement images measured using rapid MR-ARFI. This method was specifically developed to monitor MR-guided high intensity focused ultrasound (MRgHIFU) therapy. Elasticity changes were followed during HIFU ablations (N = 6) performed ex vivo in porcine muscle samples, and were compared to temperature changes measured by MR-thermometry. Shear modulus was found to increase consistently and steadily a few seconds after the heating started, and such changes were found to be irreversible. The shear modulus was found to increase from 1.49 ± 0.48 kPa (before ablation) to 3.69 ± 0.93 kPa (after ablation and cooling). Thanks to its ability to perform quantitative elasticity measurements in a highly localized manner around the focal spot, this method proved to be particularly attractive for monitoring HIFU ablations.
Harkisoen, S; Arends, J E; van den Hoek, J A R; Whelan, J; van Erpecum, K J; Boland, G J; Hoepelman, A I M
2014-12-01
Some studies done in Asian patients have shown that serum levels of hepatitis B virus (HBV) DNA predict the development of cirrhosis. However, it is unclear whether this also applies to non-Asian patients. This study investigated historic and current HBV DNA and quantitative hepatitis B surface antigen (HBsAg) levels as predictors of cirrhosis in non-Asian women with chronic HBV. A retrospective cohort study of non-Asian women with chronic HBV was performed. Among other variables, HBV DNA and quantitative HBsAg levels were measured in stored historic serum samples obtained during pregnancy (period 1990-2004) and current serum samples (period 2011-2012) to determine any association with liver cirrhosis by liver stiffness measurement (LSM). One hundred and nineteen asymptomatic, treatment-naïve non-Asian women were included; the median number of years between the historic sample and the current sample was 17 (interquartile range (IQR) 13-20). The median historic log HBV DNA and quantitative log HBsAg levels were 2.5 (IQR 1.9-3.4) IU/ml and 4.2 (IQR 3.6-4.5) IU/ml, respectively. LSM diagnosed 14 patients (12%) with F3-F4 fibrosis, i.e., stiffness >8.1 kPa. No association of cirrhosis was found with historic HBV DNA (relative risk (RR) 0.34, 95% confidence interval (CI) 0.05-2.44) or with the quantitative HBsAg level (HBsAg level >1000 IU/ml, RR 0.35, 95% CI 0.11-1.11). Multivariable analysis identified alcohol consumption (odds ratio (OR) 6.4, 95% CI 1.3-30.1), aspartate aminotransferase >0.5 times the upper limit of normal (OR 15.4, 95% CI 1.9-122.6), and prothrombin time (OR 12.0, 95% CI 1.2-120.4), but not HBV DNA or quantitative HBsAg level, to be independent predictors of the presence of cirrhosis. Neither historic nor current HBV DNA nor the quantitative HBsAg level is associated with the development of HBV-related cirrhosis in non-Asian women. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C
2017-11-21
Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L⁻¹), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated the advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
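For orientation, the standard DGT relation converts the mass accumulated on the binding gel into a time-weighted average solution concentration, C = M·Δg/(D·A·t). A small sketch with illustrative numbers (not taken from the paper):

```python
def dgt_twa_concentration(mass_ng, delta_g_cm, diff_coeff_cm2_s, area_cm2, time_s):
    """Standard DGT equation C = M * dg / (D * A * t): time-weighted average
    concentration from accumulated mass M, diffusion layer thickness dg,
    diffusion coefficient D, exposure area A and deployment time t.
    With ng, cm, cm^2/s, cm^2 and s, the result is in ng/cm^3 = ug/L."""
    return mass_ng * delta_g_cm / (diff_coeff_cm2_s * area_cm2 * time_s)

# Illustrative only: 50 ng accumulated over a 7-day deployment.
c = dgt_twa_concentration(mass_ng=50.0, delta_g_cm=0.094,
                          diff_coeff_cm2_s=5e-6, area_cm2=3.14,
                          time_s=7 * 24 * 3600)
print(f"TWA concentration: {c:.2f} ug/L")
```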
Evidence for success in health promotion: suggestions for improvement.
Macdonald, G; Veen, C; Tones, K
1996-09-01
This paper argues that health promotion needs to develop an approach to evaluation and effectiveness that values qualitative methodologies. It posits the idea that qualitative research could learn from the experience of quantitative researchers and promote more useful ways of measuring effectiveness by the use of intermediate and indirect indicators. It refers to a European-wide project designed to gather information on the effectiveness of health promotion interventions. This project discovered that there was a need for an instrument that allowed qualitative intervention methodologies to be assessed in the same way as quantitative methods.
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2013-01-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
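A minimal sketch of how a measured lifetime maps to an oxygen level through the Stern-Volmer relation τ0/τ = 1 + K_SV·[O2]; the lifetime, τ0, and K_SV values below are placeholders and not the updated slope reported here.

```python
def o2_from_lifetime(tau, tau0, ksv):
    """Invert the Stern-Volmer relation tau0/tau = 1 + K_SV*[O2]; the units of
    the result (partial pressure or concentration) follow from how K_SV
    was calibrated."""
    return (tau0 / tau - 1.0) / ksv

# Placeholder values: unquenched lifetime 40 ns, measured 10 ns, K_SV = 15 /bar.
print(o2_from_lifetime(tau=10.0, tau0=40.0, ksv=15.0))  # -> 0.2 bar O2
```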
NASA Astrophysics Data System (ADS)
Hardee, John R.; Long, John; Otts, Julie
2002-05-01
A senior-level undergraduate laboratory experiment that demonstrates the use of solid-phase microextraction (SPME) and capillary gas chromatography-mass spectrometry (GC-MS) was developed for the quantitative determination of bromoform in swimming pool water. Bromoform was extracted by SPME from the headspace of vials containing sodium chloride-saturated swimming pool water. Bromoform concentrations were determined from comparisons of peak areas on a student-generated calibration curve. Students compared results to OSHA water and air exposure limits for bromoform.
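A small sketch of the calibration-curve step the students perform: fit a line to the peak areas of bromoform standards, then invert it for the pool-water sample. All numbers are invented for illustration.

```python
import numpy as np

# Hypothetical standards: bromoform concentration (ug/L) vs. GC-MS peak area.
conc_std = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
area_std = np.array([1.1e4, 2.3e4, 5.6e4, 1.1e5, 2.2e5])

slope, intercept = np.polyfit(conc_std, area_std, 1)  # linear calibration curve

def concentration_from_area(area):
    """Invert the calibration line to report an unknown's concentration."""
    return (area - intercept) / slope

print(f"pool sample: {concentration_from_area(7.4e4):.1f} ug/L bromoform")
```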
Nishiura, T; Abe, K
1999-01-01
The rat submandibular gland is not fully developed at birth and definitive differentiation takes place postnatally. The steady-state mRNA expression for the four proteinase inhibitor molecules, tissue inhibitors of metalloproteinase (TIMP)-1 and -2, and cystatins S and C, and for a housekeeping gene, glyceraldehyde-3-phosphate dehydrogenase (G3PDH), in rat submandibular glands was measured by quantitative competitive reverse transcription-polymerase chain reaction (RT-PCR) at different stages of postnatal development. The gene-expression patterns of TIMP-1 and -2 relative to G3PDH were similar to each other. The TIMP-2 and cystatin C genes were more highly expressed than those of TIMP-1 and cystatin S at all stages. Moreover, the gene expressions of TIMP-1 and -2, and of cystatins S and C, were predominant between 1 and 7, and 7 and 12 weeks of age, respectively, and coincided developmentally with the regression of terminal tubule cells and the differentiation of granular convoluted tubule cells, respectively. Quantitative competitive RT-PCR allowed accurate measurement of small changes in the steady-state concentrations of these proteinase-inhibitor mRNA molecules.
NASA Astrophysics Data System (ADS)
Combs, Christopher; Clemens, Noel
2014-11-01
Ablation is a multi-physics process involving heat and mass transfer and codes aiming to predict ablation are in need of experimental data pertaining to the turbulent transport of ablation products for validation. Low-temperature sublimating ablators such as naphthalene can be used to create a limited physics problem and simulate ablation at relatively low temperature conditions. At The University of Texas at Austin, a technique is being developed that uses planar laser-induced fluorescence (PLIF) of naphthalene to visualize the transport of ablation products in a supersonic flow. In the current work, naphthalene PLIF will be used to make quantitative measurements of the concentration of ablation products in a Mach 5 turbulent boundary layer. For this technique to be used for quantitative research in supersonic wind tunnel facilities, the fluorescence properties of naphthalene must first be investigated over a wide range of state conditions and excitation wavelengths. The resulting calibration of naphthalene fluorescence will be applied to the PLIF images of ablation from a boundary layer plug, yielding 2-D fields of naphthalene mole fraction. These images may help provide data necessary to validate computational models of ablative thermal protection systems for reentry vehicles. Work supported by NASA Space Technology Research Fellowship Program under grant NNX11AN55H.
Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald
2010-01-01
The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273
A feeling for the numbers in biology
Phillips, Rob; Milo, Ron
2009-01-01
Although the quantitative description of biological systems has been going on for centuries, recent advances in the measurement of phenomena ranging from metabolism to gene expression to signal transduction have resulted in a new emphasis on biological numeracy. This article describes the confluence of two different approaches to biological numbers. First, an impressive array of quantitative measurements make it possible to develop intuition about biological numbers ranging from how many gigatons of atmospheric carbon are fixed every year in the process of photosynthesis to the number of membrane transporters needed to provide sugars to rapidly dividing Escherichia coli cells. As a result of the vast array of such quantitative data, the BioNumbers web site has recently been developed as a repository for biology by the numbers. Second, a complementary and powerful tradition of numerical estimates familiar from the physical sciences and canonized in the so-called “Fermi problems” calls for efforts to estimate key biological quantities on the basis of a few foundational facts and simple ideas from physics and chemistry. In this article, we describe these two approaches and illustrate their synergism in several particularly appealing case studies. These case studies reveal the impact that an emphasis on numbers can have on important biological questions. PMID:20018695
A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study
NASA Astrophysics Data System (ADS)
Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.
2015-03-01
The purpose of this study is to quantitatively assess myocardial perfusion by first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, they have notable limitations, and new techniques are still needed. Experiments were performed on five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the color microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well (y = 0.07245 + 0.09963x, r2 = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an area under the curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for the quantitative assessment of myocardial perfusion.
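For readers who want to reproduce the style of analysis, the sketch below computes a Pearson r2 against the FFR reference and a ROC AUC for detecting FFR-defined ischemia, using the common FFR <= 0.80 convention; the paired values are hypothetical, not the study data.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import roc_auc_score

# Hypothetical paired observations: CT-derived ratio and the FFR reference.
hudr = np.array([0.95, 0.88, 0.72, 0.60, 0.55, 0.91, 0.48, 0.83])
ffr = np.array([0.92, 0.86, 0.74, 0.62, 0.58, 0.90, 0.50, 0.81])

r, _ = pearsonr(hudr, ffr)
print(f"r^2 = {r**2:.3f}")                 # agreement with the reference standard

ischemia = (ffr <= 0.80).astype(int)       # FFR-defined significant ischemia
auc = roc_auc_score(ischemia, -hudr)       # lower ratio -> more likely ischemic
print(f"ROC AUC = {auc:.3f}")
```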
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
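A toy illustration of the descriptor-to-response modeling step, with ordinary least squares and cross-validation standing in for the Genetic Function Approximation used here; the descriptor matrix and response values are synthetic, not sensor data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic training set: rows are analyte exposures, columns are molecular and
# film-analyte interaction descriptors; y is the relative resistance change.
X = rng.normal(size=(40, 6))
y = 0.8 * X[:, 0] - 0.3 * X[:, 2] + 0.05 * rng.normal(size=40)

model = LinearRegression().fit(X, y)
print("cross-validated R^2:", cross_val_score(model, X, y, cv=5).mean())

# Predicting the response for an analyte held out of training.
x_new = rng.normal(size=(1, 6))
print("predicted sensor response:", model.predict(x_new)[0])
```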
Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas
2014-01-01
Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In the analysis of the 80 plasma samples, the microarray platform performed well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay that analyses several prostate cancer biomarkers simultaneously. PMID:22921878
ERIC Educational Resources Information Center
Zhou, Wenxia; Sun, Jianmin; Guan, Yanjun; Li, Yuhui; Pan, Jingzhou
2013-01-01
The current research aimed to develop a multidimensional measure on the criteria of career success in a Chinese context. Items on the criteria of career success were obtained using a qualitative approach among 30 Chinese employees; exploratory factor analysis was conducted to select items and determine the factor structure among a new sample of…
ERIC Educational Resources Information Center
Rock, Heidi Marie
2017-01-01
The purpose of this quantitative retrospective causal-comparative study was to determine to what extent the form of professional development (face-to-face or online) or the level of instruction (elementary or high school) has on classroom teaching practices as measured by student learning outcomes. The first research question sought to determine…
Developing Model-Making and Model-Breaking Skills Using Direct Measurement Video-Based Activities
ERIC Educational Resources Information Center
Vonk, Matthew; Bohacek, Peter; Militello, Cheryl; Iverson, Ellen
2017-01-01
This study focuses on student development of two important laboratory skills in the context of introductory college-level physics. The first skill, which we call model making, is the ability to analyze a phenomenon in a way that produces a quantitative multimodal model. The second skill, which we call model breaking, is the ability to critically…
NASA Technical Reports Server (NTRS)
Dragan, O.; Galan, N.; Sirbu, A.; Ghita, C.
1974-01-01
The design and construction of inductive transducers for measuring the vibrations in metal bars at ultrasonic frequencies are discussed. Illustrations of the inductive transducers are provided. The quantitative relations that are useful in designing the transducers are analyzed. Mathematical models are developed to substantiate the theoretical considerations. Results obtained with laboratory equipment in testing specified metal samples are included.
ERIC Educational Resources Information Center
Seamster, Christina Lambert
2016-01-01
According to Molnar (2014), full time virtual school education lacks a measurement tool that accurately measures effective virtual teacher practice. Using both qualitative and quantitative methods, the current study sought to understand the common practices among full time K-8 virtual school teachers, the extent to which teachers believed such…
NASA Astrophysics Data System (ADS)
Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.
1995-05-01
Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state-of-the-art developments in lasers, fiber optics, and solid-state detector technology, this technique has become applicable in medical research and diagnostics. With initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of ESPI-based instruments for improved diagnostic evaluation in otolaryngological applications. These compact fiber-optic instruments are capable of making real-time interferometric measurements of the target tissue. Ongoing development of image post-processing software is currently capable of extracting the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification will be performed in hardware in near real time. Subsurface details of both tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis and laryngeal tumors, and aid in the evaluation of surgical procedures.
Harrison, C S; Grant, P M; Conway, B A
2010-01-01
The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained.
An optimized method for measuring fatty acids and cholesterol in stable isotope-labeled cells
Argus, Joseph P.; Yu, Amy K.; Wang, Eric S.; Williams, Kevin J.; Bensinger, Steven J.
2017-01-01
Stable isotope labeling has become an important methodology for determining lipid metabolic parameters of normal and neoplastic cells. Conventional methods for fatty acid and cholesterol analysis have one or more issues that limit their utility for in vitro stable isotope-labeling studies. To address this, we developed a method optimized for measuring both fatty acids and cholesterol from small numbers of stable isotope-labeled cultured cells. We demonstrate quantitative derivatization and extraction of fatty acids from a wide range of lipid classes using this approach. Importantly, cholesterol is also recovered, albeit at a modestly lower yield, affording the opportunity to quantitate both cholesterol and fatty acids from the same sample. Although we find that background contamination can interfere with quantitation of certain fatty acids in low amounts of starting material, our data indicate that this optimized method can be used to accurately measure mass isotopomer distributions for cholesterol and many fatty acids isolated from small numbers of cultured cells. Application of this method will facilitate acquisition of lipid parameters required for quantifying flux and provide a better understanding of how lipid metabolism influences cellular function. PMID:27974366
Allesø, Morten; Holm, Per; Carstensen, Jens Michael; Holm, René
2016-05-25
Surface topography, in the context of surface smoothness/roughness, was investigated by the use of an image analysis technique, MultiRay™, related to photometric stereo, on different tablet batches manufactured either by direct compression or roller compaction. In the present study, oblique illumination of the tablet (darkfield) was considered and the area of cracks and pores in the surface was used as a measure of tablet surface topography; the higher a value, the rougher the surface. The investigations demonstrated a high precision of the proposed technique, which was able to rapidly (within milliseconds) and quantitatively measure the obtained surface topography of the produced tablets. Compaction history, in the form of applied roll force and tablet punch pressure, was also reflected in the measured smoothness of the tablet surfaces. Generally it was found that a higher degree of plastic deformation of the microcrystalline cellulose resulted in a smoother tablet surface. This altogether demonstrated that the technique provides the pharmaceutical developer with a reliable, quantitative response parameter for visual appearance of solid dosage forms, which may be used for process and ultimately product optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sugano, Koji; Ikegami, Kohei; Isono, Yoshitada
2017-06-01
In this paper, a characterization method of Raman enhancement for highly sensitive and quantitative surface-enhanced Raman spectroscopy (SERS) is reported. A particle dimer shows a marked electromagnetic enhancement when the particle connection direction is matched to the polarization direction of the incident light. In this study, dimers were arrayed by nanotrench-guided self-assembly to obtain a marked total Raman enhancement. By measuring acetonedicarboxylic acid, the fabricated structures were characterized for SERS as a function of the polarization angle relative to the particle connection direction. This indicates that the fabricated structures produce an effective SERS enhancement that is dominated by the electromagnetic enhancement. We then measured 4,4′-bipyridine, a pesticide material, for quantitative analysis, having first evaluated the enhancement of the particle structure by the Raman measurement of acetonedicarboxylic acid. Finally, we compared the Raman intensities of acetonedicarboxylic acid and 4,4′-bipyridine, which showed good correlation. This demonstrates the advantage of evaluating the enhancement of the substrate in advance. The developed SERS characterization method is expected to be applied to various quantitative trace analyses of molecules with high sensitivity.
A quantitative approach to painting styles
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Sbrissa, David; da Fontoura Costa, Luciano; Travieso, Gonzalo
2015-01-01
This research extends a method previously applied to music and philosophy (Vilson Vieira et al., 2012), representing the evolution of art as a time series in which relations such as dialectics are measured quantitatively. For this purpose, a corpus of paintings by 12 well-known artists from baroque and modern art is analyzed. A set of 99 features is extracted, and the features that most contributed to the classification of painters are selected. The projection space obtained provides the basis for the analysis of the measurements. These quantitative measures underlie revealing observations about the evolution of painting styles, especially when compared with other fields of the humanities already analyzed: while music evolved along a master-apprentice tradition (high dialectics) and philosophy by opposition, painting presents another pattern, namely constantly increasing skewness, low opposition between members of the same movement, and opposition peaks in the transitions between movements. Differences between the baroque and modern movements are also observed in the projected "painting space": while the baroque paintings appear as an overlapping cluster, the modern paintings overlap less and are dispersed more widely in the projection than their baroque counterparts. This finding suggests that baroque painters shared aesthetics, while modern painters tend to "break rules" and develop their own styles.
Bono Jr., Michael S.; Garcia, Ravi D.; Sri-Jayantha, Dylan V.; Ahner, Beth A.; Kirby, Brian J.
2015-01-01
In this study, we cultured Chlorella vulgaris cells with a range of lipid contents, induced via nitrogen starvation, and characterized them via flow cytometry, with BODIPY 505/515 as a fluorescent lipid label, and liquid-state 1H NMR spectroscopy. In doing so, we demonstrate the utility of calibrating flow cytometric measurements of algal lipid content using triacylglyceride (TAG, also known as triacylglycerol or triglyceride) content per cell as measured via quantitative 1H NMR. Ensemble-averaged fluorescence of BODIPY-labeled cells was highly correlated with average TAG content per cell measured by bulk NMR, with a linear regression yielding a linear fit with r2 = 0.9974. This correlation compares favorably to previous calibrations of flow cytometry protocols to lipid content measured via extraction, and calibration by NMR avoids the time and complexity that is generally required for lipid quantitation via extraction. Flow cytometry calibrated to a direct measurement of TAG content can be used to investigate the distribution of lipid contents for cells within a culture. Our flow cytometry measurements showed that Chlorella vulgaris cells subjected to nitrogen limitation exhibited higher mean lipid content but a wider distribution of lipid content that overlapped the relatively narrow distribution of lipid content for replete cells, suggesting that nitrogen limitation induces lipid accumulation in only a subset of cells. Calibration of flow cytometry protocols using direct in situ measurement of TAG content via NMR will facilitate rapid development of more precise flow cytometry protocols, enabling investigation of algal lipid accumulation for development of more productive algal biofuel feedstocks and cultivation protocols. PMID:26267664
Assessment of metabolic bone diseases by quantitative computed tomography
NASA Technical Reports Server (NTRS)
Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.
1985-01-01
Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.
Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turteltaub, K. W.; Bench, G.; Buchholz, B. A.
2016-04-08
The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high-sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC AMS to increase our capabilities for metabolic measurements; we will develop methods to understand cellular processes; and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: (1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates, specifically by developing and validating methods for macromolecule labeling, characterization, and quantitation; (2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies, specifically by demonstrating robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; (3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory instruments; and (4) provide high-throughput 14C BioAMS analysis for collaborative and service clients.
Motion compensation using origin ensembles in awake small animal positron emission tomography
NASA Astrophysics Data System (ADS)
Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.
2017-02-01
In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density, allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake-animal neurological and behavioral studies, for which head motion, unique to each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in the pose measurements used for compensation can degrade the data and hence the quantitative outcomes. In this investigation, motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum-likelihood expectation maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest, providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturbs the measured data.
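The uncertainty-allocation idea is simple to illustrate: once the origin ensembles sampler has produced posterior samples of the counts assigned to a region of interest, their mean and a credible interval summarize the regional estimate. The sketch below assumes such samples are already available and does not implement the sampler or the motion compensation itself.

```python
import numpy as np

def region_estimate(posterior_counts, ci=0.95):
    """Posterior mean and central credible interval for the counts assigned to
    one region, given one value per saved origin-ensemble state."""
    lo, hi = np.percentile(posterior_counts,
                           [(1 - ci) / 2 * 100, (1 + ci) / 2 * 100])
    return posterior_counts.mean(), (lo, hi)

# Hypothetical posterior samples for one region, in detected counts.
samples = np.random.default_rng(1).normal(1.2e4, 4e2, size=2000)
print(region_estimate(samples))
```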
Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.
Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth
2016-05-15
Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.
Quantitative measurement of pass-by noise radiated by vehicles running at high speeds
NASA Astrophysics Data System (ADS)
Yang, Diange; Wang, Ziteng; Li, Bing; Luo, Yugong; Lian, Xiaomin
2011-03-01
Accurately locating and quantifying the pass-by noise sources radiated by running vehicles has long been a challenge. In the current work, a system composed of a microphone array was developed for this purpose. An acoustic-holography method for moving sound sources is designed to handle the Doppler effect effectively in the time domain. The effective sound pressure distribution is reconstructed on the surface of a running vehicle. The method achieves high calculation efficiency and is able to quantitatively measure the sound pressure at the sound source and identify the location of the main sound source. The method is also validated by simulation experiments and by measurement tests with known moving speakers. Finally, the engine noise, tire noise, exhaust noise, and wind noise of a vehicle running at different speeds are successfully identified by this method.
Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S
2016-09-01
The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths and address the weaknesses of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire, using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales included migrant experiences in Australia: appreciation towards the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provided the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and support future research on dentist migration. Copyright © 2016 Dennis Barber Ltd
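The scale-development step described here (exploratory factor analysis of Likert items followed by reliability analysis) can be sketched as below with scikit-learn and NumPy. The item matrix, the number of factors, and the item-to-scale grouping are placeholders, not the study's data, and the authors' exact software and rotation choices are not stated in the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Placeholder responses: 1022 respondents x 82 five-point Likert items.
items = rng.integers(1, 6, size=(1022, 82)).astype(float)

# Exploratory factor analysis (8 factors here, mirroring the initial solution).
fa = FactorAnalysis(n_components=8, rotation="varimax", random_state=0)
fa.fit(items)
loadings = fa.components_.T            # item-by-factor loading matrix

# Items are typically retained on the factor where they load most strongly.
assigned_factor = np.abs(loadings).argmax(axis=1)

def cronbach_alpha(scale_items):
    """Internal-consistency reliability of one scale (items as columns)."""
    k = scale_items.shape[1]
    item_var = scale_items.var(axis=0, ddof=1).sum()
    total_var = scale_items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

for f in range(8):
    cols = np.where(assigned_factor == f)[0]
    if cols.size >= 2:
        print(f"factor {f}: {cols.size} items, alpha = {cronbach_alpha(items[:, cols]):.2f}")
```

In practice, items with weak or cross-loading patterns would be dropped and scales re-estimated, which is the kind of iterative refinement that reduced the initial eight factors to the five reported LSE scales.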
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loo, Jr., Billy W.
2000-06-01
The study of the exocrine pancreatic acinar cell has been central to the development of models of many cellular processes, especially of protein transport and secretion. Traditional methods used to examine this system have provided a wealth of qualitative information from which mechanistic models have been inferred. However they have lacked the ability to make quantitative measurements, particularly of the distribution of protein in the cell, information critical for grounding of models in terms of magnitude and relative significance. This dissertation describes the development and application of new tools that were used to measure the protein content of the major intracellular compartments in the acinar cell, particularly the zymogen granule. Soft x-ray microscopy permits image formation with high resolution and contrast determined by the underlying protein content of tissue rather than staining avidity. A sample preparation method compatible with x-ray microscopy was developed and its properties evaluated. Automatic computerized methods were developed to acquire, calibrate, and analyze large volumes of x-ray microscopic images of exocrine pancreatic tissue sections. Statistics were compiled on the protein density of several organelles, and on the protein density, size, and spatial distribution of tens of thousands of zymogen granules. The results of these measurements, and how they compare to predictions of different models of protein transport, are discussed.
Simple Perfusion Apparatus (SPA) for Manipulation, Tracking and Study of Oocytes and Embryos
Angione, Stephanie L.; Oulhen, Nathalie; Brayboy, Lynae M.; Tripathi, Anubhav; Wessel, Gary M.
2016-01-01
Objective: To develop and implement a device and protocol for oocyte analysis at a single cell level. The device must be capable of high resolution imaging, temperature control, and perfusion of media, drugs, sperm, and immunolabeling reagents, all at defined flow-rates. Each oocyte and resultant embryo must remain spatially separated and defined. Design: Experimental laboratory study. Setting: University and academic center for reproductive medicine. Patients/Animals: Women with eggs retrieved for ICSI cycles, adult female FVBN and B6C3F1 mouse strains, sea stars. Intervention: Real-time, longitudinal imaging of oocytes following fluorescent labeling, insemination, and viability tests. Main outcome measure(s): Cell and embryo viability, immunolabeling efficiency, live cell endocytosis quantitation, precise metrics of fertilization and embryonic development. Results: Single oocytes were longitudinally imaged following significant changes in media, markers, endocytosis quantitation, and development, all with supreme control by microfluidics. Cells remained viable, enclosed, and separate for precision measurements, repeatability, and imaging. Conclusions: We engineered a simple device to load, visualize, experiment on, and effectively record individual oocytes and embryos, without loss of cells. Prolonged incubation capabilities provide longitudinal studies without need for transfer and potential loss of cells. This simple perfusion apparatus (SPA) provides for careful, precise, and flexible handling of precious samples, facilitating clinical in vitro fertilization approaches. PMID:25450296
Proof of the quantitative potential of immunofluorescence by mass spectrometry.
Toki, Maria I; Cecchi, Fabiola; Hembrough, Todd; Syrigos, Konstantinos N; Rimm, David L
2017-03-01
Protein expression in formalin-fixed, paraffin-embedded patient tissue is routinely measured by immunohistochemistry (IHC). However, IHC has been shown to be subject to variability in sensitivity, specificity and reproducibility, and is generally, at best, considered semi-quantitative. Mass spectrometry (MS) is considered by many to be the criterion standard for protein measurement, offering high sensitivity, specificity, and objective molecular quantification. Here, we seek to show that quantitative immunofluorescence (QIF) with standardization can achieve quantitative results comparable to MS. Epidermal growth factor receptor (EGFR) was measured by quantitative immunofluorescence in 15 cell lines with a wide range of EGFR expression, using different primary antibody concentrations, including the optimal signal-to-noise concentration after quantitative titration. QIF target measurement was then compared to the absolute EGFR concentration measured by Liquid Tissue-selected reaction monitoring mass spectrometry. The best agreement between the two assays was found when the EGFR primary antibody was used at the optimal signal-to-noise concentration, revealing a strong linear regression (R² = 0.88). This demonstrates that quantitative optimization of titration by calculation of signal-to-noise ratio allows QIF to be standardized to MS and can therefore be used to assess absolute protein concentration in a linear and reproducible manner.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even with an additional signal from another analyte, correlating signals to concentration and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.).
Advancing the Fork detector for quantitative spent nuclear fuel verification
Vaccaro, S.; Gauld, I. C.; Hu, J.; ...
2018-01-31
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
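A simple way to picture the go/no-go logic described here is to compare each assembly's measured neutron count rate with the rate predicted from the operator declaration, flagging ratios that fall outside an acceptance band set by the combined measurement and modeling uncertainty. The sketch below uses invented predicted/measured values and an assumed tolerance; it does not call ORIGEN or reproduce the actual analysis module.

```python
import numpy as np

# Hypothetical per-assembly values: predicted rates would come from declared
# burnup, enrichment, and cooling time (via ORIGEN in the real system), and
# measured rates from the Fork detector. All numbers are illustrative only.
predicted = np.array([1520.0, 980.0, 2210.0, 1330.0, 1750.0])   # counts/s
measured  = np.array([1490.0, 1010.0, 2180.0, 1610.0, 1722.0])  # counts/s
rel_sigma = 0.05   # assumed combined relative uncertainty (measurement + model)
k = 3.0            # coverage factor for the acceptance band

ratio = measured / predicted
lower, upper = 1.0 - k * rel_sigma, 1.0 + k * rel_sigma

for idx, r in enumerate(ratio):
    verdict = "go" if lower <= r <= upper else "NO-GO: investigate declaration"
    print(f"assembly {idx}: measured/predicted = {r:.3f} -> {verdict}")
```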
Advancing the Fork detector for quantitative spent nuclear fuel verification
NASA Astrophysics Data System (ADS)
Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.
2018-04-01
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. The results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
Quantitative analysis of fracture surface by roughness and fractal method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.W.; Tian, J.F.; Kang, Y.
1995-09-01
In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. have proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg with each other, which can be expressed as R_S = \overline{R_L \cdot \Psi}, where Ψ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the detailed application of this method in quantitative fractography, the authors studied roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
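To make the roughness parameters concrete, the sketch below (NumPy, synthetic profile data only) computes the profile roughness parameter R_L as the true profile length divided by its projected length for several vertical-section profiles, and combines it with a profile structure factor Ψ to estimate the surface roughness via the relation quoted above. The Ψ value used here is a fixed placeholder; in the actual method it is determined from the measured profiles.

```python
import numpy as np

rng = np.random.default_rng(7)

def profile_roughness(x, y):
    """R_L: true (arc) length of the fracture profile divided by its projected length."""
    arc = np.sum(np.hypot(np.diff(x), np.diff(y)))
    return arc / (x[-1] - x[0])

# Three synthetic vertical-section profiles, standing in for digitized traces
# taken at 120-degree intervals around the fracture surface.
profiles = []
for _ in range(3):
    x = np.linspace(0.0, 1000.0, 2001)             # microns along the section
    y = np.cumsum(rng.normal(0.0, 0.3, x.size))    # toy height trace, microns
    profiles.append((x, y))

r_l_values = np.array([profile_roughness(x, y) for x, y in profiles])
psi = 1.27   # placeholder structure factor; obtained from the profiles in practice

r_s_estimate = np.mean(r_l_values * psi)           # R_S = average of R_L * Psi
print("R_L per section:", np.round(r_l_values, 3))
print("estimated surface roughness R_S:", round(r_s_estimate, 3))
```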
Identifying Gifted Students: A Practical Guide
ERIC Educational Resources Information Center
Johnsen, S., Ed.
2004-01-01
This user-friendly guide offers advice and insight on developing defensible identification procedures and services for gifted and talented students. Special attention is given to the use of multiple methods including qualitative and quantitative assessments such as standardized measures (e.g. intelligence, aptitude, and achievement tests),…
Direct injection analysis of fatty and resin acids in papermaking process waters by HPLC/MS.
Valto, Piia; Knuutinen, Juha; Alén, Raimo
2011-04-01
A novel HPLC-atmospheric pressure chemical ionization/MS (HPLC-APCI/MS) method was developed for the rapid analysis of selected fatty and resin acids typically present in papermaking process waters. A mixture of palmitic, stearic, oleic, linolenic, and dehydroabietic acids was separated by a commercial HPLC column (a modified stationary C(18) phase) using gradient elution with methanol/0.15% formic acid (pH 2.5) as a mobile phase. An internal standard method (myristic acid) was used to calculate the correlation coefficients and to quantify the results. For a thorough assessment of quality parameters, a mixture of these model acids was quantitatively determined in aqueous media as well as in six different paper machine process waters. The measured quality parameters, such as selectivity, linearity, precision, and accuracy, clearly indicated that, compared with traditional gas chromatographic techniques, the simple method developed provided a faster chromatographic analysis with almost real-time monitoring of these acids. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NIST Efforts to Quality-Assure Gunpowder Measurements
NASA Technical Reports Server (NTRS)
MacCrehan, William A.; Reardon, Michelle R.
2000-01-01
In the past few years, the National Institute of Standards and Technology (NIST) has been promoting the idea of quantitatively determining the additives in smokeless gunpowder using micellar capillary electrophoresis as a means of investigating the criminal use of handguns and pipe bombs. As a part of this effort, we have evaluated both supercritical fluid and ultrasonic solvent extractions for the quantitative recovery of nitroglycerin (NG), diphenylamine (DPA), N-nitrosodiphenylamine (NnDPA), and ethyl centralite (EC) from gunpowder. Recoveries were evaluated by repeat extraction and matrix spiking experiments. The final extraction protocol provides greater than 95 percent recoveries. To help other researchers validate their own analytical methods for additive determinations, NIST is exploring the development of a standard reference material, Additives in Smokeless Gunpowder. The evaluated method is being applied to two double-base (NG-containing) powders, one stabilized with diphenylamine and the other with ethyl centralite. As part of this reference material development effort, we are conducting an interlaboratory comparison exercise among the forensic and military gunpowder measurement community.
NASA Technical Reports Server (NTRS)
Driscoll, R. S.; Francis, R. E.
1970-01-01
A description of space and supporting aircraft photography for the interpretation and analyses of non-forest (shrubby and herbaceous) native vegetation is presented. The research includes the development of a multiple sampling technique to assign quantitative area values of specific plant community types included within an assigned space photograph map unit. Also, investigations of aerial film type, scale, and season of photography for identification and quantity measures of shrubby and herbaceous vegetation were conducted. Some work was done to develop automated interpretation techniques with film image density measurement devices.
Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.
Kendall, Katherine A
2017-10-01
Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
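The quantitative measure discussed here is essentially a timing difference extracted from frame-by-frame MBS annotations: airway closure is considered delayed when it is completed only after the bolus reaches the upper esophageal sphincter. A minimal sketch of that computation (plain Python, with invented frame numbers and an assumed frame rate; the study's exact measurement conventions are not reproduced) is shown below.

```python
FRAME_RATE = 30.0   # frames per second (assumed)

# Hypothetical annotated events for several swallows, in frame numbers:
# (frame of complete laryngeal vestibule closure, frame of bolus arrival at the UES)
swallows = [
    {"bolus_ml": 1,  "closure_frame": 42, "ues_arrival_frame": 45},
    {"bolus_ml": 3,  "closure_frame": 51, "ues_arrival_frame": 49},
    {"bolus_ml": 20, "closure_frame": 66, "ues_arrival_frame": 60},
]

for s in swallows:
    # Positive delay means closure was completed only after the bolus reached the UES.
    delay_s = (s["closure_frame"] - s["ues_arrival_frame"]) / FRAME_RATE
    flag = "delayed closure" if delay_s > 0 else "closure before bolus arrival"
    print(f'{s["bolus_ml"]} mL bolus: closure delay = {delay_s:+.3f} s -> {flag}')
```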
Development of quantitative screen for 1550 chemicals with GC-MS.
Bergmann, Alan J; Points, Gary L; Scott, Richard P; Wilson, Glenn; Anderson, Kim A
2018-05-01
With hundreds of thousands of chemicals in the environment, effective monitoring requires high-throughput analytical techniques. This paper presents a quantitative screening method for 1550 chemicals based on statistical modeling of responses, with identification and integration performed using deconvolution reporting software. The method was evaluated with representative environmental samples. We tested biological extracts, low-density polyethylene, and silicone passive sampling devices spiked with known concentrations of 196 representative chemicals. A multiple linear regression (R² = 0.80) was developed with molecular weight, logP, polar surface area, and fractional ion abundance to predict chemical responses within a factor of 2.5. Linearity beyond the calibration had R² > 0.97 for three orders of magnitude. Median limits of quantitation were estimated to be 201 pg/μL (1.9× standard deviation). The number of detected chemicals and the accuracy of quantitation were similar for environmental samples and standard solutions. To our knowledge, this is the most precise method for the largest number of semi-volatile organic chemicals lacking authentic standards. Accessible instrumentation and software make this method cost effective in quantifying a large, customizable list of chemicals. When paired with silicone wristband passive samplers, this quantitative screen will be very useful for epidemiology where binning of concentrations is common. Graphical abstract: A multiple linear regression of chemical responses measured with GC-MS allowed quantitation of 1550 chemicals in samples such as silicone wristbands.
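The response model described here can be sketched as a multiple linear regression of (log) instrument response on molecular weight, logP, polar surface area, and fractional ion abundance, which is then inverted to estimate concentrations for compounds without authentic standards. The sketch below uses scikit-learn on invented calibration data; the actual descriptors, transformations, and coefficients of the published model are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Invented calibration set: descriptors for 196 standards and their measured
# log10(response per unit concentration). Columns: MW, logP, polar surface area,
# fractional abundance of the quantitation ion.
X = np.column_stack([
    rng.uniform(100, 500, 196),     # molecular weight
    rng.uniform(1, 8, 196),         # logP
    rng.uniform(0, 120, 196),       # polar surface area
    rng.uniform(0.05, 0.6, 196),    # fractional ion abundance
])
true_coef = np.array([-0.002, 0.15, -0.01, 2.0])
log10_response_factor = X @ true_coef + 3.0 + rng.normal(0, 0.2, 196)

model = LinearRegression().fit(X, log10_response_factor)
print("R^2 on calibration data:", round(model.score(X, log10_response_factor), 2))

# Quantitation of a chemical lacking a standard: predict its response factor
# from descriptors, then convert a measured peak area into a concentration.
descriptors = np.array([[310.0, 4.2, 35.0, 0.30]])   # hypothetical compound
predicted_rf = 10 ** model.predict(descriptors)[0]   # response per (pg/uL)
peak_area = 5.0e5                                     # hypothetical measured area
print("estimated concentration (pg/uL):", round(peak_area / predicted_rf, 1))
```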
On measures of association among genetic variables
Gianola, Daniel; Manfredi, Eduardo; Simianer, Henner
2012-01-01
Summary Systems involving many variables are important in population and quantitative genetics, for example, in multi-trait prediction of breeding values and in exploration of multi-locus associations. We studied departures of the joint distribution of sets of genetic variables from independence. New measures of association based on notions of statistical distance between distributions are presented. These are more general than correlations, which are pairwise measures, and lack a clear interpretation beyond the bivariate normal distribution. Our measures are based on logarithmic (Kullback-Leibler) and on relative ‘distances’ between distributions. Indexes of association are developed and illustrated for quantitative genetics settings in which the joint distribution of the variables is either multivariate normal or multivariate-t, and we show how the indexes can be used to study linkage disequilibrium in a two-locus system with multiple alleles and present applications to systems of correlated beta distributions. Two multivariate beta and multivariate beta-binomial processes are examined, and new distributions are introduced: the GMS-Sarmanov multivariate beta and its beta-binomial counterpart. PMID:22742500
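For the multivariate normal case mentioned here, one natural distance-based association index is the Kullback-Leibler divergence between the joint distribution and the distribution obtained under independence (the product of the marginals); for a zero-mean Gaussian with correlation matrix R this reduces to -0.5*log det(R), which generalizes pairwise correlation to a whole set of variables. The short sketch below computes this index with NumPy on an illustrative covariance matrix; it is an illustration of the general idea rather than the specific indexes defined in the paper.

```python
import numpy as np

def gaussian_association_index(cov):
    """KL divergence between a zero-mean multivariate normal and the product of
    its marginals: -0.5 * log(det(correlation matrix)). Zero iff all variables
    are uncorrelated; grows as the joint distribution departs from independence."""
    sd = np.sqrt(np.diag(cov))
    corr = cov / np.outer(sd, sd)
    _, logdet = np.linalg.slogdet(corr)
    return -0.5 * logdet

# Example: three genetic variables with moderate pairwise correlations (illustrative).
cov = np.array([
    [1.0, 0.5, 0.2],
    [0.5, 2.0, 0.4],
    [0.2, 0.4, 1.5],
])
print("association index (nats):", round(gaussian_association_index(cov), 4))

# Sanity check: independence gives an index of zero.
print("independent case:", gaussian_association_index(np.diag([1.0, 2.0, 1.5])))
```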
Aknoun, Sherazade; Savatier, Julien; Bon, Pierre; Galland, Frédéric; Abdeladim, Lamiae; Wattellier, Benoit; Monneret, Serge
2015-01-01
Single-cell dry mass measurement is used in biology to follow cell cycle, to address effects of drugs, or to investigate cell metabolism. Quantitative phase imaging technique with quadriwave lateral shearing interferometry (QWLSI) allows measuring cell dry mass. The technique is very simple to set up, as it is integrated in a camera-like instrument. It simply plugs onto a standard microscope and uses a white light illumination source. Its working principle is first explained, from image acquisition to automated segmentation algorithm and dry mass quantification. Metrology of the whole process, including its sensitivity, repeatability, reliability, sources of error, over different kinds of samples and under different experimental conditions, is developed. We show that there is no influence of magnification or spatial light coherence on dry mass measurement; effect of defocus is more critical but can be calibrated. As a consequence, QWLSI is a well-suited technique for fast, simple, and reliable cell dry mass study, especially for live cells.
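The dry mass quantification that QWLSI enables has a simple form: the optical path difference (OPD) image over a segmented cell is integrated and divided by the specific refraction increment of cellular dry matter. The sketch below implements that conversion with NumPy on a synthetic OPD map, assuming a refraction increment of about 0.2 µm³/pg (a commonly quoted textbook value, not taken from this paper) and an invented pixel size.

```python
import numpy as np

rng = np.random.default_rng(5)

ALPHA = 0.2            # specific refraction increment, um^3/pg (assumed, ~0.18-0.21)
PIXEL_AREA = 0.25**2   # um^2 per pixel (assumed 0.25 um sampling)

# Synthetic OPD image (micrometres): a cell-like bump on a flat background.
yy, xx = np.mgrid[0:200, 0:200]
opd = 0.12 * np.exp(-(((xx - 100) ** 2 + (yy - 100) ** 2) / (2 * 30.0 ** 2)))
opd += rng.normal(0.0, 0.002, opd.shape)      # measurement noise

# Simple segmentation: pixels whose OPD rises clearly above the background noise.
cell_mask = opd > 0.01

# Dry mass = (1/alpha) * integral of OPD over the segmented cell area.
dry_mass_pg = opd[cell_mask].sum() * PIXEL_AREA / ALPHA
print(f"segmented area: {cell_mask.sum() * PIXEL_AREA:.0f} um^2")
print(f"estimated dry mass: {dry_mass_pg:.1f} pg")
```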
Rapid video-referenced ratings of reciprocal social behavior in toddlers: A twin study
Marrus, Natasha; Glowinski, Anne L.; Jacob, Theodore; Klin, Ami; Jones, Warren; Drain, Caroline E.; Holzhauer, Kieran E.; Hariprasad, Vaishnavi; Fitzgerald, Rob T.; Mortenson, Erika L.; Sant, Sayli M.; Cole, Lyndsey; Siegel, Satchel A.; Zhang, Yi; Agrawal, Arpana; Heath, Andrew; Constantino, John N.
2015-01-01
Background Reciprocal social behavior (RSB) is a developmental prerequisite for social competency, and deficits in RSB constitute a core feature of autism spectrum disorder (ASD). Although clinical screeners categorically ascertain risk of ASD in early childhood, rapid methods for quantitative measurement of RSB in toddlers are not yet established. Such measurements are critical for tracking developmental trajectories and incremental responses to intervention. Methods We developed and validated a 20-minute video-referenced rating scale, the video-referenced rating of reciprocal social behavior (vrRSB), for untrained caregivers to provide standardized ratings of quantitative variation in RSB. Parents of 252 toddler twins [Monozygotic (MZ)=31 pairs, Dizygotic (DZ)=95 pairs] ascertained through birth records, rated their twins’ RSB at two time points, on average 6 months apart, and completed two developmental measures, the Modified Checklist for Autism in Toddlers (M-CHAT) and the MacArthur Communicative Development Inventory Short Form (MCDI-s). Results Scores on the vrRSB were fully continuously distributed, with excellent 6-month test-retest reliability ([intraclass correlation coefficient] ICC=0.704, p<0.000). MZ twins displayed markedly greater trait concordance than DZ twins, (MZ ICC=0.863, p<0.000, DZ ICC=0.231, p<0.012). VrRSB score distributions were highly distinct for children passing versus failing the M-CHAT (t=−8.588, df=31, p<.000), incrementally improved from 18-24 months, and were inversely correlated with receptive and expressive vocabulary on the MCDI-s. Conclusions Like quantitative autistic trait ratings in school-aged children and adults, toddler scores on the vrRSB are continuously distributed and appear highly heritable. These ratings exhibited minimal measurement error, high inter-individual stability, and developmental progression in RSB as children matured from 18-24 months, supporting their potential utility for serially quantifying the severity of early autistic syndromes over time and in response to intervention. In addition, these findings inform the genetic-environmental structure of RSB in early typical development. PMID:25677414
Design, Validation, and Testing of a Hot-Film Anemometer for Hypersonic Flow
NASA Astrophysics Data System (ADS)
Sheplak, Mark
The application of constant-temperature hot-film anemometry to hypersonic flow has been reviewed and extended in this thesis. The objective of this investigation was to develop a measurement tool capable of yielding continuous, high-bandwidth, quantitative, normal mass-flux and total-temperature measurements in moderate-enthalpy environments. This research has produced a probe design that represents a significant advancement over existing designs, offering the following improvements: (1) a five-fold increase in bandwidth; (2) true stagnation-line sensor placement; (3) a two order-of-magnitude decrease in sensor volume; and (4) over a 70% increase in maximum film temperature. These improvements were achieved through substrate design, sensor placement, the use of high-temperature materials, and state-of-the-art microphotolithographic fabrication techniques. The experimental study to characterize the probe was performed in four different hypersonic wind tunnels at NASA-Langley Research Center. The initial test consisted of traversing the hot film through a Mach 6, flat-plate, turbulent boundary layer in air. The detailed static-calibration measurements that followed were performed in two different hypersonic flows: a Mach 11 helium flow and Mach 6 air flow. The final test of this thesis consisted of traversing the probe through the Mach 6 wake of a 70° blunt body. The goal of this test was to determine the state (i.e., laminar or turbulent) of the wake. These studies indicate that substrate conduction effects result in instrumentation characteristics that prevent the hot-film anemometer from being used as a quantitative tool. The extension of this technique to providing quantitative information is dependent upon the development of lower thermal-conductivity substrate materials. However, the probe durability, absence of strain gauging, and high bandwidth represent significant improvements over the hot-wire technique for making qualitative measurements. Potential uses for this probe are: frequency identification for resonant flows, transition studies, turbulence detection for quiet-tunnel development and reattaching turbulent shear flows, and qualitative turbulence studies of shock-wave/turbulent boundary layer interactions.
Pratley, Pierre
2016-11-01
Research on the association between women's empowerment and maternal and child health has rapidly expanded. However, questions concerning the measurement and aggregation of quantitative indicators of women's empowerment and their associations with measures of maternal and child health status and healthcare utilization remain unanswered. Major challenges include complexity in measuring progress in several dimensions and the situational, context dependent nature of the empowerment process as it relates to improvements in maternal and child health status and maternal care seeking behaviors. This systematic literature review summarizes recent evidence from the developing world regarding the role women's empowerment plays as a social determinant of maternal and child health outcomes. A search of quantitative evidence previously reported in the economic, socio-demographic and public health literature finds 67 eligible studies that report on direct indicators of women's empowerment and their association with indicators capturing maternal and child health outcomes. Statistically significant associations were found between women's empowerment and maternal and child health outcomes such as antenatal care, skilled attendance at birth, contraceptive use, child mortality, full vaccination, nutritional status and exposure to violence. Although associations differ in magnitude and direction, the studies reviewed generally support the hypothesis that women's empowerment is significantly and positively associated with maternal and child health outcomes. While major challenges remain regarding comparability between studies and lack of direct indicators in key dimensions of empowerment, these results suggest that policy makers and practitioners must consider women's empowerment as a viable strategy to improve maternal and child health, but also as a merit in itself. Recommendations include collection of indicators on psychological, legal and political dimensions of women's empowerment and development of a comprehensive conceptual framework that can guide research and policy making. Copyright © 2016 Elsevier Ltd. All rights reserved.
Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons
2014-01-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1988-06-01
This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
A Fan-tastic Quantitative Exploration of Ohm's Law
NASA Astrophysics Data System (ADS)
Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William
2018-02-01
Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.
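For the quantitative side of such a fan-based circuit lab, the analysis is a straight-line fit of measured voltage against measured current, whose slope estimates the effective resistance (V = IR). The sketch below fits invented measurement pairs with NumPy; the numbers are placeholders, not data from the article.

```python
import numpy as np

# Hypothetical classroom measurements across several supply settings.
current_a = np.array([0.05, 0.10, 0.15, 0.20, 0.25])   # amperes
voltage_v = np.array([0.52, 1.01, 1.55, 2.04, 2.49])   # volts

# Least-squares line V = R*I + b; an ohmic element should give b close to zero.
slope, intercept = np.polyfit(current_a, voltage_v, 1)
predicted = slope * current_a + intercept
r_squared = 1 - np.sum((voltage_v - predicted) ** 2) / np.sum((voltage_v - voltage_v.mean()) ** 2)

print(f"fitted resistance: {slope:.1f} ohms (intercept {intercept:.3f} V, R^2 = {r_squared:.4f})")
```

A near-zero intercept and a high R² over the full operating range is the quantitative signature of ohmic behavior that, as the article argues, fans exhibit more cleanly than incandescent bulbs.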
Shuttle on-orbit contamination and environmental effects
NASA Technical Reports Server (NTRS)
Leger, L. J.; Jacobs, S.; Ehlers, H. K. F.; Miller, E.
1985-01-01
Ensuring the compatibility of the space shuttle system with payloads and payload measurements is discussed. An extensive set of quantitative requirements and goals was developed and implemented by the space shuttle program management. The performance of the Shuttle system as measured by these requirements and goals was assessed partly through the use of the induced environment contamination monitor on Shuttle flights 2, 3, and 4. Contamination levels are low and generally within the requirements and goals established. Additional data from near-term payloads and already planned contamination measurements will complete the environment definition and allow for the development of contamination avoidance procedures as necessary for any payload.
In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.
NASA Astrophysics Data System (ADS)
Lu, Zheng Feng
There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 into the treatment, and the backscatter coefficient was 26 × 10^-4 cm^-1 sr^-1 in controls compared with 74 × 10^-4 cm^-1 sr^-1 (at 6 MHz) in treated animals. A simplified quantitative approach using video image signals was developed. Results derived both from the r.f. signal analysis and from the video signal analysis are sensitive to the changes in the liver in this animal model.
MR Fingerprinting for Rapid Quantitative Abdominal Imaging
Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D.; Wright, Katherine L.; Seiberlich, Nicole; Griswold, Mark A.
2016-01-01
Purpose To develop a magnetic resonance (MR) “fingerprinting” technique for quantitative abdominal imaging. Materials and Methods This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Results Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). Conclusion A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties within one 19-second breath hold, with measurements comparable to those in published literature. © RSNA, 2016 PMID:26794935
MR Fingerprinting for Rapid Quantitative Abdominal Imaging.
Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas
2016-04-01
To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties within one 19-second breath hold, with measurements comparable to those in published literature.
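The quantification step in MR fingerprinting is pattern matching: each voxel's measured signal evolution is compared against a precomputed dictionary of simulated evolutions, and the (T1, T2) of the best-matching entry is assigned to the voxel. The sketch below shows that matching via a normalized inner product with NumPy; the dictionary here uses a crude two-segment toy model rather than a Bloch-simulated FISP dictionary, and all parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(11)

t_ir = np.linspace(0.1, 3.0, 30)    # inversion-recovery sampling times (s), assumed
t_se = np.linspace(0.01, 0.3, 30)   # echo-decay sampling times (s), assumed

def toy_signal(t1, t2):
    """Placeholder signal model: an inversion-recovery segment (T1-sensitive)
    concatenated with an echo-decay segment (T2-sensitive); not a Bloch/FISP simulation."""
    return np.concatenate([1 - 2 * np.exp(-t_ir / t1), np.exp(-t_se / t2)])

# Build the dictionary over a grid of candidate (T1, T2) pairs, in seconds.
t1_grid = np.arange(0.2, 2.01, 0.05)
t2_grid = np.arange(0.02, 0.31, 0.01)
entries = [(t1, t2) for t1 in t1_grid for t2 in t2_grid if t2 < t1]
dictionary = np.array([toy_signal(t1, t2) for t1, t2 in entries])
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# Simulate one voxel (liver-like T1 = 0.745 s, T2 = 0.031 s) with noise.
measured = toy_signal(0.745, 0.031) + rng.normal(0, 0.02, t_ir.size + t_se.size)
measured /= np.linalg.norm(measured)

# Dictionary match: maximize the inner product between measurement and entries.
best = np.argmax(dictionary @ measured)
t1_est, t2_est = entries[best]
print(f"matched T1 = {t1_est*1000:.0f} ms, T2 = {t2_est*1000:.0f} ms")
```

In the actual technique the same matching is performed per voxel across the whole field of view, with B1 handled by the separate Bloch-Siegert map, which is what allows the multi-parameter maps to be produced within a single breath hold.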
Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1989-09-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.
Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.
Li, Zitong; Sillanpää, Mikko J
2015-12-01
Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja
2015-02-01
The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants were subjected to BD assessment with MRI using the following sequence with the Dixon technique (echo time/echo time, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic; 3 minutes 38 seconds). To test the reproducibility, a second MRI after patient repositioning was performed. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessment (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD. The AUQV MR BD measurements were significantly lower than the currently used qualitative and quantitative MG-based approaches, implying that the current assessment might overestimate breast density with MG.
Exposure assessment of tetrafluoroethylene and ammonium perfluorooctanoate 1951-2002.
Sleeuwenhoek, Anne; Cherrie, John W
2012-03-01
To develop a method to reconstruct exposure to tetrafluoroethylene (TFE) and ammonium perfluorooctanoate (APFO) in plants producing polytetrafluoroethylene (PTFE) in the absence of suitable objective measurements. These data were used to inform an epidemiological study being carried out to investigate possible risks in workers employed in the manufacture of PTFE and to study trends in exposure over time. For each plant, detailed descriptions of all occupational titles, including tasks and changes over time, were obtained during semi-structured interviews with key plant personnel. A semi-quantitative assessment method was used to assess inhalation exposure to TFE and inhalation plus dermal exposure to APFO. Temporal trends in exposure to TFE and APFO were investigated. In each plant the highest exposures for both TFE and APFO occurred in the polymerisation area. Due to the introduction of control measures, increasing process automation and other improvements, exposures generally decreased over time. In the polymerisation area, the annual decline in exposure to TFE varied by plant from 3.8 to 5.7% and for APFO from 2.2 to 5.5%. A simple method for assessing exposure was developed which used detailed process information and job descriptions to estimate average annual TFE and APFO exposure on an arbitrary semi-quantitative scale. These semi-quantitative estimates are sufficient to identify relative differences in exposure for the epidemiological study and should good data become available, they could be used to provide quantitative estimates for all plants across the whole period of operation. This journal is © The Royal Society of Chemistry 2012
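The temporal trends reported here (a few percent decline per year) are the kind of quantity obtained by fitting a log-linear model to exposure scores over calendar year. The sketch below does that fit with NumPy on invented annual scores; the plant, years, and values are placeholders, not the study's data.

```python
import numpy as np

# Hypothetical annual semi-quantitative TFE exposure scores for one plant area.
years = np.arange(1970, 2001)
scores = 8.0 * 0.955 ** (years - years[0]) \
         * np.exp(np.random.default_rng(2).normal(0, 0.05, years.size))

# Fit log(score) = a + b*year; the annual percentage decline is 1 - exp(b).
b, a = np.polyfit(years, np.log(scores), 1)
annual_decline_pct = (1.0 - np.exp(b)) * 100.0
print(f"estimated annual decline in exposure: {annual_decline_pct:.1f}% per year")
```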
McMullin, Brian T; Leung, Ming-Ying; Shanbhag, Arun S; McNulty, Donald; Mabrey, Jay D; Agrawal, C Mauli
2006-02-01
A total of 750 images of individual ultra-high molecular weight polyethylene (UHMWPE) particles isolated from periprosthetic failed hip, knee, and shoulder arthroplasties were extracted from archival scanning electron micrographs. Particle size and morphology was subsequently analyzed using computerized image analysis software utilizing five descriptors found in ASTM F1877-98, a standard for quantitative description of wear debris. An online survey application was developed to display particle images, and allowed ten respondents to classify particle morphologies according to commonly used terminology as fibers, flakes, or granules. Particles were categorized based on a simple majority of responses. All descriptors were evaluated using a one-way ANOVA and Tukey-Kramer test for all-pairs comparison among each class of particles. A logistic regression model using half of the particles included in the survey was then used to develop a mathematical scheme to predict whether a given particle should be classified as a fiber, flake, or granule based on its quantitative measurements. The validity of the model was then assessed using the other half of the survey particles and compared with human responses. Comparison of the quantitative measurements of isolated particles showed that the morphologies of each particle type classified by respondents were statistically different from one another (p<0.05). The average agreement between mathematical prediction and human respondents was 83.5% (standard error 0.16%). These data suggest that computerized descriptors can be feasibly correlated with subjective terminology, thus providing a basis for a common vocabulary for particle description which can be translated into quantitative dimensions.
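The classification model described here, predicting whether a particle is a fiber, flake, or granule from its quantitative shape descriptors, can be sketched as a multinomial logistic regression trained on half of the labeled particles and evaluated on the other half. The scikit-learn sketch below uses randomly generated descriptor values and labels purely as placeholders; the ASTM F1877 descriptor definitions and the survey-derived labels of the study are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Placeholder descriptors per particle: equivalent circle diameter, aspect ratio,
# elongation, roundness, form factor (loosely in the spirit of ASTM F1877).
n = 750
X = np.column_stack([
    rng.lognormal(0.0, 0.6, n),    # equivalent circle diameter, microns
    rng.uniform(1.0, 12.0, n),     # aspect ratio
    rng.uniform(1.0, 10.0, n),     # elongation
    rng.uniform(0.1, 1.0, n),      # roundness
    rng.uniform(0.1, 1.0, n),      # form factor
])
# Placeholder majority-vote labels: fibers tend to have high aspect ratio,
# flakes intermediate, granules low (a crude rule used only to make toy labels).
labels = np.where(X[:, 1] > 6, "fiber", np.where(X[:, 1] > 2.5, "flake", "granule"))

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.5, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)

agreement = (clf.predict(X_test) == y_test).mean()
print(f"agreement between model and (toy) human labels on held-out half: {agreement:.1%}")
```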
McMullin, Brian T.; Leung, Ming-Ying; Shanbhag, Arun S.; McNulty, Donald; Mabrey, Jay D.; Agrawal, C. Mauli
2014-01-01
A total of 750 images of individual ultra-high molecular weight polyethylene (UHMWPE) particles isolated from periprosthetic failed hip, knee, and shoulder arthroplasties were extracted from archival scanning electron micrographs. Particle size and morphology was subsequently analyzed using computerized image analysis software utilizing five descriptors found in ASTM F1877-98, a standard for quantitative description of wear debris. An online survey application was developed to display particle images, and allowed ten respondents to classify particle morphologies according to commonly used terminology as fibers, flakes, or granules. Particles were categorized based on a simple majority of responses. All descriptors were evaluated using a one-way ANOVA and Tukey-Kramer test for all-pairs comparison among each class of particles. A logistic regression model using half of the particles included in the survey was then used to develop a mathematical scheme to predict whether a given particle should be classified as a fiber, flake, or granule based on its quantitative measurements. The validity of the model was then assessed using the other half of the survey particles and compared with human responses. Comparison of the quantitative measurements of isolated particles showed that the morphologies of each particle type classified by respondents were statistically different from one another (p<0.05). The average agreement between mathematical prediction and human respondents was 83.5% (standard error 0.16%). These data suggest that computerized descriptors can be feasibly correlated with subjective terminology, thus providing a basis for a common vocabulary for particle description which can be translated into quantitative dimensions. PMID:16112725
Quantitative force measurements in liquid using frequency modulation atomic force microscopy
NASA Astrophysics Data System (ADS)
Uchihashi, Takayuki; Higgins, Michael J.; Yasuda, Satoshi; Jarvis, Suzanne P.; Akita, Seiji; Nakayama, Yoshikazu; Sader, John E.
2004-10-01
The measurement of short-range forces with the atomic force microscope (AFM) typically requires implementation of dynamic techniques to maintain sensitivity and stability. While frequency modulation atomic force microscopy (FM-AFM) is used widely for high-resolution imaging and quantitative force measurements in vacuum, quantitative force measurements using FM-AFM in liquids have proven elusive. Here we demonstrate that the formalism derived for operation in vacuum can also be used in liquids, provided certain modifications are implemented. To facilitate comparison with previous measurements taken using surface forces apparatus, we choose a model system (octamethylcyclotetrasiloxane) that is known to exhibit short-ranged structural ordering when confined between two surfaces. Force measurements obtained are found to be in excellent agreement with previously reported results. This study therefore establishes FM-AFM as a powerful tool for the quantitative measurement of forces in liquid.
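The abstract does not reproduce the force-inversion formalism it refers to. For context, a widely used FM-AFM expression for recovering the tip-sample force from the measured frequency shift is the Sader-Jarvis inversion; whether this exact form was the one applied to the liquid data reported here is an assumption.

```latex
% Sader-Jarvis inversion: force from the normalized frequency shift
% Omega(z) = Delta f(z) / f_0, for cantilever stiffness k and oscillation amplitude a.
F(z) = 2k \int_{z}^{\infty}
  \left[ \left( 1 + \frac{a^{1/2}}{8\sqrt{\pi\,(t - z)}} \right) \Omega(t)
       - \frac{a^{3/2}}{\sqrt{2\,(t - z)}}\,\frac{\mathrm{d}\Omega(t)}{\mathrm{d}t} \right] \mathrm{d}t
```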
NASA Astrophysics Data System (ADS)
Nuraeni, E.; Rahmat, A.
2018-05-01
Cognitive schemes of plant anatomy concepts are formed by processing the qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the plant anatomy course strategy was modified by adding a task in which students analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test scored using the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test based on Marzano and with the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students was better than that of biology students.
Quantitation of absorbed or deposited materials on a substrate that measures energy deposition
Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham
2005-01-18
This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the mass of an applied localized material. Such a system and method remain compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.
Petroll, W. Matthew; Robertson, Danielle M.
2015-01-01
The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608
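As a rough illustration of how sublayer thicknesses fall out of a CMTF scan, the sketch below locates reflectivity peaks in an intensity-versus-depth profile and takes differences between peak depths. The peak assignments, step size, and prominence threshold are assumptions for illustration, not HRT-RCM specifics.

```python
# Illustrative sketch only: corneal sublayer thicknesses from a CMTF profile by peak detection.
import numpy as np
from scipy.signal import find_peaks

def sublayer_thicknesses(intensity, z_step_um=1.0):
    """intensity: 1D numpy array of mean image intensity per frame of a through-focus scan."""
    peaks, _ = find_peaks(intensity, prominence=0.1 * intensity.max())
    depths = peaks * z_step_um
    if len(depths) < 3:
        raise ValueError("expected at least three reflectivity peaks in the CMTF profile")
    # Assumed peak assignments: surface, subbasal plexus/Bowman's layer, endothelium.
    epithelial_thickness = depths[1] - depths[0]
    corneal_thickness = depths[-1] - depths[0]
    return epithelial_thickness, corneal_thickness
```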
Spectrophotometer and ultrasound evaluation of late toxicity following breast-cancer radiotherapy
Yoshida, E. J.; Chen, H.; Torres, M. A.; Curran, W. J.; Liu, T.
2011-01-01
Purpose: Radiation-induced normal-tissue toxicities are common, complex, and distressing side effects that affect 90% of patients receiving breast-cancer radiotherapy and 40% of patients post radiotherapy. In this study, the authors investigated the use of spectrophotometry and ultrasound to quantitatively measure radiation-induced skin discoloration and subcutaneous-tissue fibrosis. The study's purpose is to determine whether skin discoloration correlates with the development of fibrosis in breast-cancer radiotherapy. Methods: Eighteen breast-cancer patients were enrolled in our initial study. All patients were previously treated with a standard course of radiation, and the median follow-up time was 22 months. The treated and untreated breasts were scanned with a spectrophotometer and an ultrasound. Two spectrophotometer parameters—melanin and erythema indices—were used to quantitatively assess skin discoloration. Two ultrasound parameters—skin thickness and Pearson coefficient of the hypodermis—were used to quantitatively assess severity of fibrosis. These measurements were correlated with clinical assessments (RTOG late morbidity scores). Results: Significant measurement differences between the treated and contralateral breasts were observed among all patients: 27.3% mean increase in skin thickness (p < 0.001), 34.1% mean decrease in Pearson coefficient (p < 0.001), 27.3% mean increase in melanin (p < 0.001), and 22.6% mean increase in erythema (p < 0.001). All parameters except skin thickness correlated with RTOG scores. A moderate correlation exists between melanin and erythema; however, spectrophotometer parameters do not correlate with ultrasound parameters. Conclusions: Spectrophotometry and quantitative ultrasound are objective tools that assess radiation-induced tissue injury. Spectrophotometer parameters did not correlate with those of quantitative ultrasound, suggesting that skin discoloration cannot be used as a marker for subcutaneous fibrosis. These tools may prove useful for the reduction of radiation morbidities and improvement of patient quality of life. PMID:21992389
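A minimal sketch of the kind of paired analysis reported above, using synthetic placeholder data: percent change between treated and contralateral sides, a paired t-test, and a rank correlation against ordinal RTOG grades. The numbers and specific test choices are illustrative assumptions, not the study's analysis plan.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
contralateral = rng.normal(2.0, 0.2, size=18)              # untreated-side skin thickness, mm (placeholder)
treated = contralateral * rng.normal(1.27, 0.05, size=18)   # ~27% thicker on average, as reported
rtog = rng.integers(0, 4, size=18)                          # RTOG late morbidity grades 0-3 (placeholder)

percent_change = 100.0 * (treated - contralateral) / contralateral
t_stat, p_value = stats.ttest_rel(treated, contralateral)   # paired comparison of the two sides
rho, p_corr = stats.spearmanr(percent_change, rtog)         # rank correlation with ordinal grades
print(f"mean increase {percent_change.mean():.1f}%, paired p={p_value:.3g}, Spearman rho={rho:.2f}")
```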
Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S
2013-11-01
Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
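The conversion from per-cell staining intensity to HER2 protein expression via a cell-pellet standard stained in parallel can be illustrated with a simple calibration fit. The pellet intensities, receptor numbers, and log-log linear model below are placeholder assumptions, not the authors' calibration.

```python
import numpy as np

# Calibration standards: mean fluorescence intensity of cell pellets with known
# HER2 receptor numbers per cell (values are illustrative placeholders).
pellet_intensity = np.array([120.0, 480.0, 2100.0, 9500.0])
pellet_receptors = np.array([2.0e4, 1.0e5, 5.0e5, 2.0e6])

# Fit a log-log linear calibration curve.
slope, intercept = np.polyfit(np.log10(pellet_intensity), np.log10(pellet_receptors), 1)

def intensity_to_receptors(per_cell_intensity):
    """Map measured per-cell staining intensity to estimated HER2 receptors per cell."""
    return 10 ** (slope * np.log10(per_cell_intensity) + intercept)

tumor_cell_intensities = np.array([300.0, 5000.0, 800.0])   # placeholder segmented tumor cells
print(intensity_to_receptors(tumor_cell_intensities))
```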
Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F
2016-08-03
Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
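Digital PCR quantification rests on Poisson statistics over the partitioned reaction: the fraction of positive partitions fixes the mean copies per partition, and hence the target concentration. The function below sketches that standard relationship; the partition volume and counts in the example are placeholders, not values from this study.

```python
import math

def dpcr_copies_per_microlitre(n_positive, n_total, partition_volume_nl, dilution_factor=1.0):
    """Estimate target concentration from digital PCR partition counts."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: sample too concentrated to quantify")
    lam = -math.log(1.0 - p)                  # mean copies per partition (Poisson)
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0 * dilution_factor   # nl -> microlitre

# Example: 8,000 of 20,000 partitions positive, 0.85 nl partitions
print(dpcr_copies_per_microlitre(8000, 20000, 0.85))
```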
Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.
Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A
2016-09-01
In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
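The 2D Gaussian fitting step used to locate atom columns with sub-pixel precision can be sketched as follows; this scipy-based snippet illustrates the general technique on a single peak and is not the tool's own implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sigma_x, sigma_y, offset):
    x, y = coords
    g = amp * np.exp(-((x - x0) ** 2 / (2 * sigma_x ** 2) + (y - y0) ** 2 / (2 * sigma_y ** 2)))
    return (g + offset).ravel()

def fit_column_position(patch):
    """Fit a single atom-column peak in a small image patch; returns (x0, y0) in pixels."""
    ny, nx = patch.shape
    y, x = np.mgrid[0:ny, 0:nx]
    p0 = [patch.max() - patch.min(), nx / 2, ny / 2, 2.0, 2.0, patch.min()]
    popt, _ = curve_fit(gaussian_2d, (x, y), patch.ravel(), p0=p0)
    return popt[1], popt[2]   # sub-pixel column position
```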
Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.
Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong
2016-08-01
The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can conduct AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built with the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK), and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as quantitative measurement of motion by visual tracking.
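To give a flavor of what landmark tracking across CMR frames involves, the sketch below follows a user-selected AVJ point by normalized cross-correlation template matching. It is a Python illustration under assumed inputs, not the C++/ITK/VTK/Qt implementation described above.

```python
import numpy as np
from skimage.feature import match_template

def track_landmark(frames, first_point, half_size=10):
    """frames: list of 2D arrays (one cardiac cycle); first_point: (row, col) of the AVJ."""
    r, c = first_point
    template = frames[0][r - half_size:r + half_size, c - half_size:c + half_size]
    points = [first_point]
    for frame in frames[1:]:
        corr = match_template(frame, template, pad_input=True)
        r, c = np.unravel_index(np.argmax(corr), corr.shape)
        points.append((r, c))
        # Refresh the template around the new position to follow appearance changes.
        template = frame[r - half_size:r + half_size, c - half_size:c + half_size]
    return points   # per-frame AVJ positions, from which displacement and velocity curves follow
```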
Providing Evidence in the Moral Domain
ERIC Educational Resources Information Center
Cooper, Diane L.; Liddell, Debora L.; Davis, Tiffany J.; Pasquesi, Kira
2012-01-01
In this era of increased accountability, it is important to consider how student affairs researches and assesses the outcomes of efforts to increase moral competence. This article examines both qualitative and quantitative inquiry methods for measuring moral development. The authors review the instrumentation and methods typically used to measure…
78 FR 69839 - Building Technologies Office Prioritization Tool
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-21
... innovative and cost-effective energy saving solutions: Supporting research and development of high impact... Description The tool was designed to inform programmatic decision-making and facilitate the setting of... quantitative analysis to assure only the highest impact measures are the focus of further effort. The approach...
Simple X-ray diffraction algorithm for direct determination of cotton crystallinity
USDA-ARS?s Scientific Manuscript database
Traditionally, XRD has been used to study the crystalline structure of cotton celluloses. Despite considerable efforts in developing the curve-fitting protocol to evaluate the crystallinity index (CI), in its present state, XRD measurement can only provide a qualitative or semi-quantitative assessme...
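For reference, the classical empirical crystallinity index for cellulose I from a powder diffractogram is the Segal height-ratio method shown below. Whether the "simple algorithm" of this manuscript coincides with or replaces it is not stated in the snippet, so this is background rather than the authors' method.

```latex
% Segal crystallinity index for cellulose I (Cu K-alpha radiation):
% I_{002} is the maximum intensity of the (002) reflection near 2theta ~ 22.7 deg,
% I_{am} the minimum (amorphous) intensity near 2theta ~ 18 deg.
CI = \frac{I_{002} - I_{am}}{I_{002}} \times 100\%
```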
Evaluating the Fine Arts Program at the Center for Excellence in Disabilities
ERIC Educational Resources Information Center
Schlosnagle, Leo; McBean, Amanda L.; Cutlip, Milisa; Panzironi, Helen; Jarmolowicz, David P.
2014-01-01
Art programs for people with disabilities may encourage creativity, promote engagement, emphasize inclusion, and extend access and opportunities for community involvement. This mixed methods study utilized quantitative and qualitative data, repeated measures, action research, and stakeholder collaboration to develop and implement an evaluation…
New methods are needed to screen thousands of environmental chemicals for toxicity, including developmental neurotoxicity. In vitro, cell-based assays that model key cellular events have been proposed for high throughput screening of chemicals for developmental neurotoxicity. Whi...
Quantitative measurement of Helicobacter pylori by the TaqMan fluorogenic probe system
Culturing of H. pylori from environmental sources continues to be an obstacle in detecting and enumerating this organism. Successful methods of isolation and growth from water samples have not yet been developed. In this study a method involving real-time PCR product detection wit...
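TaqMan-based quantification typically reads copy numbers off a standard curve of quantification cycle (Cq) versus log copy number. The sketch below shows that standard relationship with placeholder dilution data; it is not taken from this study.

```python
import numpy as np

# Standard curve: Cq values measured for serial dilutions of known copy number (placeholders).
log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)
cq_values = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

slope, intercept = np.polyfit(log10_copies, cq_values, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0        # amplification efficiency implied by the slope

def copies_from_cq(cq):
    """Invert the standard curve to estimate copies in an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

print(f"efficiency ~ {efficiency:.2%}, unknown at Cq 26.0 -> {copies_from_cq(26.0):.0f} copies")
```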
Earth resources data systems design: S192 instrument measurements and characteristics
NASA Technical Reports Server (NTRS)
Goldstein, A. S.
1972-01-01
The design, development, and characteristics of the S192 instrument for use with the earth resources data systems are discussed. Subjects presented are: (1) multispectral scanner measurements, (2) measurement characteristics, (3) calibration and alignment, (4) operating modes, and (5) time tagging and references. The S192 will obtain high spatial resolution, quantitative line scan imagery data of the radiation reflected and emitted by selected test sites in up to 13 spectral bands of the visible, near infrared, and thermal infrared regions of the electromagnetic spectrum.
Bernstein, Lynne E.; Lu, Zhong-Lin; Jiang, Jintao
2008-01-01
A fundamental question about human perception is how the speech perceiving brain combines auditory and visual phonetic stimulus information. We assumed that perceivers learn the normal relationship between acoustic and optical signals. We hypothesized that when the normal relationship is perturbed by mismatching the acoustic and optical signals, cortical areas responsible for audiovisual stimulus integration respond as a function of the magnitude of the mismatch. To test this hypothesis, in a previous study, we developed quantitative measures of acoustic-optical speech stimulus incongruity that correlate with perceptual measures. In the current study, we presented low incongruity (LI, matched), medium incongruity (MI, moderately mismatched), and high incongruity (HI, highly mismatched) audiovisual nonsense syllable stimuli during fMRI scanning. Perceptual responses differed as a function of the incongruity level, and BOLD measures were found to vary regionally and quantitatively with perceptual and quantitative incongruity levels. Each increase in level of incongruity resulted in an increase in overall levels of cortical activity and in additional activations. However, the only cortical region that demonstrated differential sensitivity to the three stimulus incongruity levels (HI > MI > LI) was a subarea of the left supramarginal gyrus (SMG). The left SMG might support a fine-grained analysis of the relationship between audiovisual phonetic input in comparison with stored knowledge, as hypothesized here. The methods here show that quantitative manipulation of stimulus incongruity is a new and powerful tool for disclosing the system that processes audiovisual speech stimuli. PMID:18495091
Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.
Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse
2013-05-01
Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in evaluation clinical trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method of multiphoton images which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows separating in 3D the epidermis and the superficial dermis. The automatic segmentation method and the associated quantitative measurements have been developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goals for epidermis-dermis separation and allows quantitative measurements inside the different skin compartments with sufficient relevance. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to build a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
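The segmentation pipeline above combines morphological watershed and graph cuts. The snippet below is a deliberately minimal, assumed-parameter sketch of the watershed portion on a 3D stack with scikit-image; the surface-shape constraints and graph-cut refinement of the published method are not reproduced.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, threshold_otsu
from skimage.segmentation import watershed

def segment_stack(stack):
    """stack: 3D array (z, y, x) of multiphoton intensities; returns a 3D label image."""
    smoothed = gaussian(stack, sigma=2)
    foreground = smoothed > threshold_otsu(smoothed)            # crude tissue/background split
    distance = ndi.distance_transform_edt(foreground)
    markers, _ = ndi.label(distance > 0.7 * distance.max())     # interior seed regions
    labels = watershed(-distance, markers, mask=foreground)     # marker-based 3D watershed
    return labels
```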
Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Daniel S.; Tandon, Lav
The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.
The on-line characterization of a radium slurry by gamma-ray spectrometry.
Philips, S; Croft, S
2005-01-01
We have developed an in-line monitor to directly measure the (226)Ra concentration in a nuclear waste stream using quantitative gamma-ray spectrometry applied to the 186 keV emission. The waste stream is in the form of a slurry composed of the solid waste material mixed with water. The concentration measurement includes a self-attenuation correction factor determined from a transmission measurement using the 122 keV gamma from (57)Co. Presented here is the model for the measurement system and results from some initial tests.
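The abstract does not give the correction model, but for a uniform slab the measured transmission T through the sample fixes a standard self-attenuation correction of the form below; in practice the attenuation coefficient inferred at 122 keV must still be translated to 186 keV, a step this sketch omits.

```latex
% Slab-geometry self-attenuation correction from a measured transmission T = e^{-\mu t}:
C_{\mathrm{self}} = \frac{\mu t}{1 - e^{-\mu t}} = \frac{-\ln T}{1 - T}
```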
Wind tunnel model surface gauge for measuring roughness
NASA Technical Reports Server (NTRS)
Vorburger, T. V.; Gilsinn, D. E.; Teague, E. C.; Giauque, C. H. W.; Scire, F. E.; Cao, L. X.
1987-01-01
Research on the optical inspection of surface roughness has proceeded along two lines: first, developing a quantitative understanding of light scattering from metal surfaces and of the appropriate models to describe the surfaces themselves; second, developing a practical instrument for measuring the rms roughness of high-performance wind tunnel models with smooth finishes. The research is summarized, with emphasis on the second line of work.
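A common quantitative link between scattered light and the rms roughness of a smooth metallic surface is the total integrated scatter (TIS) relation below; whether this particular model was the one adopted in the work summarized here is an assumption.

```latex
% Total integrated scatter for a smooth surface illuminated at incidence angle theta_i
% with wavelength lambda; the approximation holds for sigma << lambda.
\mathrm{TIS} = 1 - \exp\!\left[-\left(\frac{4\pi\sigma\cos\theta_i}{\lambda}\right)^{2}\right]
             \approx \left(\frac{4\pi\sigma\cos\theta_i}{\lambda}\right)^{2}
```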