Indium adhesion provides quantitative measure of surface cleanliness
NASA Technical Reports Server (NTRS)
Krieger, G. L.; Wilson, G. J.
1968-01-01
Indium tipped probe measures hydrophobic and hydrophilic contaminants on rough and smooth surfaces. The force needed to pull the indium tip, which adheres to a clean surface, away from the surface provides a quantitative measure of cleanliness.
The other half of the story: effect size analysis in quantitative research.
Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane
2013-01-01
Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
Quantitation of absorbed or deposited materials on a substrate that measures energy deposition
Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham
2005-01-18
This invention provides a system and method for measuring an energy differential that correlates to quantitative measurement of an amount mass of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.
Quantitative angle-insensitive flow measurement using relative standard deviation OCT.
Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping
2017-10-30
Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo .
Quantitative angle-insensitive flow measurement using relative standard deviation OCT
NASA Astrophysics Data System (ADS)
Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping
2017-10-01
Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.
NASA Astrophysics Data System (ADS)
Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng
2018-01-01
As a lensfree imaging technique, ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions avoiding aberration. However, it requires field of view (FoV) scanning often relying on mechanical translation, which not only slows down measuring speed, but also introduces mechanical errors decreasing both resolution and accuracy in retrieved information. In order to achieve high-accurate quantitative imaging with fast speed, digital micromirror device (DMD) is adopted in PIE for large FoV scanning controlled by on/off state coding by DMD. Measurements were implemented using biological samples as well as USAF resolution target, proving high resolution in quantitative imaging using the proposed system. Considering its fast and accurate imaging capability, it is believed the DMD based PIE technique provides a potential solution for medical observation and measurements.
Rastogi, L.; Dash, K.; Arunachalam, J.
2013-01-01
The quantitative analysis of glutathione (GSH) is important in different fields like medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation and calibration through an appropriate characterized reference material (CRM) for sulfur would provide a methodology for the certification of GSH quantity, that is traceable to SI (International system of units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at 95% confidence interval for ICP-OES analyses varied from 0.1% to 0.3%, while in the case of IC, they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH, than for routine measurements. PMID:29403814
Quantitative dispersion microscopy
Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael
2010-01-01
Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
23 CFR 1200.22 - State traffic safety information system improvements grants.
Code of Federal Regulations, 2013 CFR
2013-04-01
... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...
23 CFR 1200.22 - State traffic safety information system improvements grants.
Code of Federal Regulations, 2014 CFR
2014-04-01
... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...
Quantitative force measurements in liquid using frequency modulation atomic force microscopy
NASA Astrophysics Data System (ADS)
Uchihashi, Takayuki; Higgins, Michael J.; Yasuda, Satoshi; Jarvis, Suzanne P.; Akita, Seiji; Nakayama, Yoshikazu; Sader, John E.
2004-10-01
The measurement of short-range forces with the atomic force microscope (AFM) typically requires implementation of dynamic techniques to maintain sensitivity and stability. While frequency modulation atomic force microscopy (FM-AFM) is used widely for high-resolution imaging and quantitative force measurements in vacuum, quantitative force measurements using FM-AFM in liquids have proven elusive. Here we demonstrate that the formalism derived for operation in vacuum can also be used in liquids, provided certain modifications are implemented. To facilitate comparison with previous measurements taken using surface forces apparatus, we choose a model system (octamethylcyclotetrasiloxane) that is known to exhibit short-ranged structural ordering when confined between two surfaces. Force measurements obtained are found to be in excellent agreement with previously reported results. This study therefore establishes FM-AFM as a powerful tool for the quantitative measurement of forces in liquid.
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measuresmore » addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.« less
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Sangmin; Lee, Seungyub; Judi, David
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measuresmore » addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.« less
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Electric Field Quantitative Measurement System and Method
NASA Technical Reports Server (NTRS)
Generazio, Edward R. (Inventor)
2016-01-01
A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M
2017-01-01
At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and strength measures with brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract has not been explored in multiple sclerosis. The objectives of this study were explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale) to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures, and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean(SD) age 48.7 (11.5) years; symptom duration 11.9(8.7); 17 females; median[range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age and gender-matched healthy controls (age 50.8(11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test ) as well as 3 T imaging including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked slower (p = 0.0013) compared to controls. Quantitative measures of walking and strength were significantly related to corticospinal tract fractional anisotropy (r > 0.26; p < 0.04) and magnetization transfer ratio (r > 0.29; p < 0.03) measures. Although the Expanded Disability Status Scale was highly correlated with walking measures, it was not significantly related to either corticospinal tract fractional anisotropy or magnetization transfer ratio (p > 0.05). Walk velocity was a significant contributor to magnetization transfer ratio (p = 0.006) and fractional anisotropy (p = 0.011) in regression modeling that included both quantitative measures of function and basic clinical information. Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as a result of... encouraged to provide quantitative information that validates the existence of substantial transportation... quantitative and qualitative measures. Therefore, applicants for TIGER Discretionary Grants are generally...
NASA Astrophysics Data System (ADS)
Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng
2018-04-01
The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on a ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and its stress condition. The proposed model was applicable in various SrAl2O4:Eu2+, Dy3+-based ML measurement in elastic deformation, and could provide a useful reference for quantitative stress measurement using the ML sensor in general.
Metrics and the effective computational scientist: process, quality and communication.
Baldwin, Eric T
2012-09-01
Recent treatments of computational knowledge worker productivity have focused upon the value the discipline brings to drug discovery using positive anecdotes. While this big picture approach provides important validation of the contributions of these knowledge workers, the impact accounts do not provide the granular detail that can help individuals and teams perform better. I suggest balancing the impact-focus with quantitative measures that can inform the development of scientists. Measuring the quality of work, analyzing and improving processes, and the critical evaluation of communication can provide immediate performance feedback. The introduction of quantitative measures can complement the longer term reporting of impacts on drug discovery. These metric data can document effectiveness trends and can provide a stronger foundation for the impact dialogue. Copyright © 2012 Elsevier Ltd. All rights reserved.
Pfammatter, Sibylle; Bonneil, Eric; Thibault, Pierre
2016-12-02
Quantitative proteomics using isobaric reagent tandem mass tags (TMT) or isobaric tags for relative and absolute quantitation (iTRAQ) provides a convenient approach to compare changes in protein abundance across multiple samples. However, the analysis of complex protein digests by isobaric labeling can be undermined by the relative large proportion of co-selected peptide ions that lead to distorted reporter ion ratios and affect the accuracy and precision of quantitative measurements. Here, we investigated the use of high-field asymmetric waveform ion mobility spectrometry (FAIMS) in proteomic experiments to reduce sample complexity and improve protein quantification using TMT isobaric labeling. LC-FAIMS-MS/MS analyses of human and yeast protein digests led to significant reductions in interfering ions, which increased the number of quantifiable peptides by up to 68% while significantly improving the accuracy of abundance measurements compared to that with conventional LC-MS/MS. The improvement in quantitative measurements using FAIMS is further demonstrated for the temporal profiling of protein abundance of HEK293 cells following heat shock treatment.
Targeted Quantitation of Proteins by Mass Spectrometry
2013-01-01
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332
Targeted quantitation of proteins by mass spectrometry.
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
NASA Astrophysics Data System (ADS)
Festa, G.; Senesi, R.; Alessandroni, M.; Andreani, C.; Vitali, G.; Porcinai, S.; Giusti, A. M.; Materna, T.; Paradowska, A. M.
2011-03-01
Quantitative neutron studies of cultural heritage objects provide access to microscopic, mesoscopic, and macroscopic structures in a nondestructive manner. In this paper we present a neutron diffraction investigation of a Ghiberti Renaissance gilded bronze relief devoted to the measurement of cavities and inhomogeneities in the bulk of the sample, along with the bulk phase composition and residual strain distribution. The quantitative measurements allowed the determination of the re-melting parts extension, as well as improving current knowledge about the manufacturing process. The study provides significant and unique information to conservators and restorators about the history of the relief.
Quantitative flow and velocity measurements of pulsatile blood flow with 4D-DSA
NASA Astrophysics Data System (ADS)
Shaughnessy, Gabe; Hoffman, Carson; Schafer, Sebastian; Mistretta, Charles A.; Strother, Charles M.
2017-03-01
Time resolved 3D angiographic data from 4D DSA provides a unique environment to explore physical properties of blood flow. Utilizing the pulsatility of the contrast waveform, the Fourier components can be used to track the waveform motion through vessels. Areas of strong pulsatility are determined through the FFT power spectrum. Using this method, we find an accuracy from 4D-DSA flow measurements within 7.6% and 6.8% RMSE of ICA PCVIPR and phantom flow probe validation measurements, respectively. The availability of velocity and flow information with fast acquisition could provide a more quantitative approach to treatment planning and evaluation in interventional radiology.
Bound Pool Fractions Complement Diffusion Measures to Describe White Matter Micro and Macrostructure
Stikov, Nikola; Perry, Lee M.; Mezer, Aviv; Rykhlevskaia, Elena; Wandell, Brian A.; Pauly, John M.; Dougherty, Robert F.
2010-01-01
Diffusion imaging and bound pool fraction (BPF) mapping are two quantitative magnetic resonance imaging techniques that measure microstructural features of the white matter of the brain. Diffusion imaging provides a quantitative measure of the diffusivity of water in tissue. BPF mapping is a quantitative magnetization transfer (qMT) technique that estimates the proportion of exchanging protons bound to macromolecules, such as those found in myelin, and is thus a more direct measure of myelin content than diffusion. In this work, we combine BPF estimates of macromolecular content with measurements of diffusivity within human white matter tracts. Within the white matter, the correlation between BPFs and diffusivity measures such as fractional anisotropy and radial diffusivity was modest, suggesting that diffusion tensor imaging and bound pool fractions are complementary techniques. We found that several major tracts have high BPF, suggesting a higher density of myelin in these tracts. We interpret these results in the context of a quantitative tissue model. PMID:20828622
Zhang, Xin-Wei; Qiu, Quan-Fa; Jiang, Hong; Zhang, Fu-Li; Liu, Yan-Lin; Amatore, Christian; Huang, Wei-Hua
2017-10-09
Nanoelectrodes allow precise and quantitative measurements of important biological processes at the single living-cell level in real time. Cylindrical nanowire electrodes (NWEs) required for intracellular measurements create a great challenge for achieving excellent electrochemical and mechanical performances. Herein, we present a facile and robust solution to this problem based on a unique SiC-core-shell design to produce cylindrical NWEs with superior mechanical toughness provided by the SiC nano-core and an excellent electrochemical performance provided by the ultrathin carbon shell that can be used as such or platinized. The use of such NWEs for biological applications is illustrated by the first quantitative measurements of ROS/RNS in individual phagolysosomes of living macrophages. As the shell material can be varied to meet any specific detection purpose, this work opens up new opportunities to monitor quantitatively biological functions occurring inside cells and their organelles. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Applying Quantitative Genetic Methods to Primate Social Behavior
Brent, Lauren J. N.
2013-01-01
Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles for applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations, e.g., the need for pedigree information, non-Gaussian phenotypes, and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.
Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie
2018-01-01
The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine AdHoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine. © 2017 International Society for Magnetic Resonance in Medicine.
Image-Based Quantification of Plant Immunity and Disease.
Laflamme, Bradley; Middleton, Maggie; Lo, Timothy; Desveaux, Darrell; Guttman, David S
2016-12-01
Measuring the extent and severity of disease is a critical component of plant pathology research and crop breeding. Unfortunately, existing visual scoring systems are qualitative, subjective, and the results are difficult to transfer between research groups, while existing quantitative methods can be quite laborious. Here, we present plant immunity and disease image-based quantification (PIDIQ), a quantitative, semi-automated system to rapidly and objectively measure disease symptoms in a biologically relevant context. PIDIQ applies an ImageJ-based macro to plant photos in order to distinguish healthy tissue from tissue that has yellowed due to disease. It can process a directory of images in an automated manner and report the relative ratios of healthy to diseased leaf area, thereby providing a quantitative measure of plant health that can be statistically compared with appropriate controls. We used the Arabidopsis thaliana-Pseudomonas syringae model system to show that PIDIQ is able to identify both enhanced plant health associated with effector-triggered immunity as well as elevated disease symptoms associated with effector-triggered susceptibility. Finally, we show that the quantitative results provided by PIDIQ correspond to those obtained via traditional in planta pathogen growth assays. PIDIQ provides a simple and effective means to nondestructively quantify disease from whole plants and we believe it will be equally effective for monitoring disease on excised leaves and stems.
Best Practices Handbook: Traffic Engineering in Range Networks
2016-03-01
units of measurement. Measurement Methodology - A repeatable measurement technique used to derive one or more metrics of interest . Network...Performance measures - Metrics that provide quantitative or qualitative measures of the performance of systems or subsystems of interest . Performance Metric
NASA Astrophysics Data System (ADS)
Park, K. W.; Nair, H. P.; Crook, A. M.; Bank, S. R.; Yu, E. T.
2013-02-01
A proximal probe-based quantitative measurement of thermal conductivity with ˜100-150 nm lateral and vertical spatial resolution has been implemented. Measurements on an ErAs/GaAs superlattice structure grown by molecular beam epitaxy with 3% volumetric ErAs content yielded thermal conductivity at room temperature of 9 ± 2 W/m K, approximately five times lower than that for GaAs. Numerical modeling of phonon scattering by ErAs nanoparticles yielded thermal conductivities in reasonable agreement with those measured experimentally and provides insight into the potential influence of nanoparticle shape on phonon scattering. Measurements of wedge-shaped samples created by focused ion beam milling provide direct confirmation of depth resolution achieved.
Electrons, Photons, and Force: Quantitative Single-Molecule Measurements from Physics to Biology
2011-01-01
Single-molecule measurement techniques have illuminated unprecedented details of chemical behavior, including observations of the motion of a single molecule on a surface, and even the vibration of a single bond within a molecule. Such measurements are critical to our understanding of entities ranging from single atoms to the most complex protein assemblies. We provide an overview of the strikingly diverse classes of measurements that can be used to quantify single-molecule properties, including those of single macromolecules and single molecular assemblies, and discuss the quantitative insights they provide. Examples are drawn from across the single-molecule literature, ranging from ultrahigh vacuum scanning tunneling microscopy studies of adsorbate diffusion on surfaces to fluorescence studies of protein conformational changes in solution. PMID:21338175
Brown, J Quincy; Vishwanath, Karthik; Palmer, Gregory M; Ramanujam, Nirmala
2009-02-01
Methods of optical spectroscopy that provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the past three years, and includes new and emerging studies that correlate optically measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies.
NASA Technical Reports Server (NTRS)
Bush, Lance B.
1997-01-01
In the current political climate NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data of intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that it's position of providing public good precludes it from negotiating fair market value for its technology and instead has negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations made which include a new policy regarding the intellectual property rights negotiation, and two measures to supplement the intellectual property measures.
Volgushev, Maxim; Malyshev, Aleksey; Balaban, Pavel; Chistiakova, Marina; Volgushev, Stanislav; Wolf, Fred
2008-04-09
The generation of action potentials (APs) is a key process in the operation of nerve cells and the communication between neurons. Action potentials in mammalian central neurons are characterized by an exceptionally fast onset dynamics, which differs from the typically slow and gradual onset dynamics seen in identified snail neurons. Here we describe a novel method of analysis which provides a quantitative measure of the onset dynamics of action potentials. This method captures the difference between the fast, step-like onset of APs in rat neocortical neurons and the gradual, exponential-like AP onset in identified snail neurons. The quantitative measure of the AP onset dynamics, provided by the method, allows us to perform quantitative analyses of factors influencing the dynamics.
Volgushev, Maxim; Malyshev, Aleksey; Balaban, Pavel; Chistiakova, Marina; Volgushev, Stanislav; Wolf, Fred
2008-01-01
The generation of action potentials (APs) is a key process in the operation of nerve cells and the communication between neurons. Action potentials in mammalian central neurons are characterized by an exceptionally fast onset dynamics, which differs from the typically slow and gradual onset dynamics seen in identified snail neurons. Here we describe a novel method of analysis which provides a quantitative measure of the onset dynamics of action potentials. This method captures the difference between the fast, step-like onset of APs in rat neocortical neurons and the gradual, exponential-like AP onset in identified snail neurons. The quantitative measure of the AP onset dynamics, provided by the method, allows us to perform quantitative analyses of factors influencing the dynamics. PMID:18398478
Quantifying Uncertainty in Near Surface Electromagnetic Imaging Using Bayesian Methods
NASA Astrophysics Data System (ADS)
Blatter, D. B.; Ray, A.; Key, K.
2017-12-01
Geoscientists commonly use electromagnetic methods to image the Earth's near surface. Field measurements of EM fields are made (often with the aid an artificial EM source) and then used to infer near surface electrical conductivity via a process known as inversion. In geophysics, the standard inversion tool kit is robust and can provide an estimate of the Earth's near surface conductivity that is both geologically reasonable and compatible with the measured field data. However, standard inverse methods struggle to provide a sense of the uncertainty in the estimate they provide. This is because the task of finding an Earth model that explains the data to within measurement error is non-unique - that is, there are many, many such models; but the standard methods provide only one "answer." An alternative method, known as Bayesian inversion, seeks to explore the full range of Earth model parameters that can adequately explain the measured data, rather than attempting to find a single, "ideal" model. Bayesian inverse methods can therefore provide a quantitative assessment of the uncertainty inherent in trying to infer near surface conductivity from noisy, measured field data. This study applies a Bayesian inverse method (called trans-dimensional Markov chain Monte Carlo) to transient airborne EM data previously collected over Taylor Valley - one of the McMurdo Dry Valleys in Antarctica. Our results confirm the reasonableness of previous estimates (made using standard methods) of near surface conductivity beneath Taylor Valley. In addition, we demonstrate quantitatively the uncertainty associated with those estimates. We demonstrate that Bayesian inverse methods can provide quantitative uncertainty to estimates of near surface conductivity.
A Method to Measure and Estimate Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development in normalized contrast processing used in flash infrared thermography method. Method of computing normalized image or pixel intensity contrast, and normalized temperature contrast are provided. Methods of converting image contrast to temperature contrast and vice versa are provided. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves use of flash thermography data acquisition set-up with high reflectivity foil and high emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as emissivity of object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat-flux and reflection temperature evolution in flash thermography simulation are also discussed.
Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha
2009-02-01
Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.
The concept of "buffering" in systems and control theory: from metaphor to math.
Schmitt, Bernhard M
2004-10-04
The paradigm of "buffering" is used increasingly for the description of diverse "systemic" phenomena encountered in evolutionary genetics, ecology, integrative physiology, and other areas. However, in this new context, the paradigm has not yet matured into a truly quantitative concept inasmuch as it lacks a corresponding quantitative measure of "systems-level buffering strength". Here, I develop such measures on the basis of a formal and general approach to the quantitation of buffering action. "Systems-level buffering" is shown to be synonymous with "disturbance rejection" in feedback-control systems, and can be quantitated by means of dimensionless proportions between partial flows in two-partitioned systems. The units allow either the time-independent, "static" buffering properties or the time-dependent, "dynamic" ones to be measured. Analogous to this "resistance to change", one can define and measure the "conductance to change"; this quantity corresponds to "set-point tracking" in feedback-control systems. Together, these units provide a systematic framework for the quantitation of buffering action in systems biology, and reveal the common principle behind systems-level buffering, classical acid-base buffering, and multiple other manifestations of buffering.
Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.
Obuchowski, Nancy A; Bullen, Jennifer
2017-01-01
Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
Erokwu, Bernadette O; Anderson, Christian E; Flask, Chris A; Dell, Katherine M
2018-05-01
BackgroundAutosomal recessive polycystic kidney disease (ARPKD) is associated with significant mortality and morbidity, and currently, there are no disease-specific treatments available for ARPKD patients. One major limitation in establishing new therapies for ARPKD is a lack of sensitive measures of kidney disease progression. Magnetic resonance imaging (MRI) can provide multiple quantitative assessments of the disease.MethodsWe applied quantitative image analysis of high-resolution (noncontrast) T2-weighted MRI techniques to study cystic kidney disease progression and response to therapy in the PCK rat model of ARPKD.ResultsSerial imaging over a 2-month period demonstrated that renal cystic burden (RCB, %)=[total cyst volume (TCV)/total kidney volume (TKV) × 100], TCV, and, to a lesser extent, TKV detected cystic kidney disease progression, as well as the therapeutic effect of octreotide, a clinically available medication shown previously to slow both kidney and liver disease progression in this model. All three MRI measures correlated significantly with histologic measures of renal cystic area, although the correlation of RCB and TCV was stronger than that of TKV.ConclusionThese preclinical MRI results provide a basis for applying these quantitative MRI techniques in clinical studies, to stage and measure progression in human ARPKD kidney disease.
Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J
2014-05-15
Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. (15)O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with (15)O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to (15)O H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carla J. Miller
This report provides a summary of the literature review that was performed and based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques that were developed to provide quantitative results of the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique to perform nuclear fuel accountancy measurements
Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization
NASA Astrophysics Data System (ADS)
Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija
2017-07-01
Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration on the accuracy of the collected spectral data. This will provide a better understanding of the technique that can provide a measure of change in collections through imaging. For the quantitative model, special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models which were used as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm using a mirror scanning setup and halogen illumination. Data were acquired at different measurement conditions and acquisition parameters. Preliminary results showed the potential of the evaluation methodology to show that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.
Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A
2016-07-01
Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.
Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo
2015-01-01
We have proposed an assessment methods to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines by the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin by ISO 11843-7:2012 stochastically was within a 95% confidence interval of the statistically obtained RSD by repetitive measurements (n = 6). Thus, our findings show that it is applicable for estimating of the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, reliable measurement RSD was obtained stochastically, and the experimental time was remarkably reduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
NREL's new imaging tool could provide manufacturers with insight on their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime-with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique providesmore » a fast quantitative measure of the lifetime with a lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This technique contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements. Such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times-attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL images of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24. NREL developed a new tool to quantitatively map minority-carrier lifetime of multicrystalline silicon bricks by using photoluminescence imaging in conjunction with resonance-coupled photoconductive decay measurements. Researchers are not hindered by surface recombination and can look deeper into the material to map bulk lifetimes. The tool is being applied to silicon bricks in a project collaborating with a U.S. photovoltaic industry partner. Photovoltaic manufacturers can use the NREL tool to obtain valuable feedback on their silicon ingot casting processes.« less
Vessel wall characterization using quantitative MRI: what's in a number?
Coolen, Bram F; Calcagno, Claudia; van Ooij, Pim; Fayad, Zahi A; Strijkers, Gustav J; Nederveen, Aart J
2018-02-01
The past decade has witnessed the rapid development of new MRI technology for vessel wall imaging. Today, with advances in MRI hardware and pulse sequences, quantitative MRI of the vessel wall represents a real alternative to conventional qualitative imaging, which is hindered by significant intra- and inter-observer variability. Quantitative MRI can measure several important morphological and functional characteristics of the vessel wall. This review provides a detailed introduction to novel quantitative MRI methods for measuring vessel wall dimensions, plaque composition and permeability, endothelial shear stress and wall stiffness. Together, these methods show the versatility of non-invasive quantitative MRI for probing vascular disease at several stages. These quantitative MRI biomarkers can play an important role in the context of both treatment response monitoring and risk prediction. Given the rapid developments in scan acceleration techniques and novel image reconstruction, we foresee the possibility of integrating the acquisition of multiple quantitative vessel wall parameters within a single scan session.
Apparatus for rapid measurement of aerosol bulk chemical composition
Lee, Yin-Nan E.; Weber, Rodney J.
2003-01-01
An apparatus and method for continuous on-line measurement of chemical composition of aerosol particles with a fast time resolution are provided. The apparatus includes a modified particle size magnifier for producing activated aerosol particles and a collection device which collects the activated aerosol particles into a liquid stream for quantitative analysis by analytical methods. The method provided for on-line measurement of chemical composition of aerosol particles includes exposing aerosol carrying sample air to hot saturated steam thereby forming activated aerosol particles; collecting the activated aerosol particles by a collection device for delivery as a jet stream onto an impaction surface; flushing off the activated aerosol particles from the impaction surface into a liquid stream for delivery of the collected liquid stream to an analytical instrument for quantitative measurement.
Apparatus for rapid measurement of aerosol bulk chemical composition
Lee, Yin-Nan E.; Weber, Rodney J.; Orsini, Douglas
2006-04-18
An apparatus for continuous on-line measurement of chemical composition of aerosol particles with a fast time resolution is provided. The apparatus includes an enhanced particle size magnifier for producing activated aerosol particles and an enhanced collection device which collects the activated aerosol particles into a liquid stream for quantitative analysis by analytical means. Methods for on-line measurement of chemical composition of aerosol particles are also provided, the method including exposing aerosol carrying sample air to hot saturated steam thereby forming activated aerosol particles; collecting the activated aerosol particles by a collection device for delivery as a jet stream onto an impaction surface; and flushing off the activated aerosol particles from the impaction surface into a liquid stream for delivery of the collected liquid stream to an analytical instrument for quantitative measurement.
McGrane, Shawn D; Moore, David S; Goodwin, Peter M; Dattelbaum, Dana M
2014-01-01
The ratio of Stokes to anti-Stokes nonresonant spontaneous Raman can provide an in situ thermometer that is noncontact, independent of any material specific parameters or calibrations, can be multiplexed spatially with line imaging, and can be time resolved for dynamic measurements. However, spontaneous Raman cross sections are very small, and thermometric measurements are often limited by the amount of laser energy that can be applied without damaging the sample or changing its temperature appreciably. In this paper, we quantitatively detail the tradeoff space between spatial, temporal, and thermometric accuracy measurable with spontaneous Raman. Theoretical estimates are pinned to experimental measurements to form realistic expectations of the resolution tradeoffs appropriate to various experiments. We consider the effects of signal to noise, collection efficiency, laser heating, pulsed laser ablation, and blackbody emission as limiting factors, provide formulae to help choose optimal conditions and provide estimates relevant to planning experiments along with concrete examples for single-shot measurements.
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method and metrics used to assess a QIB for clinical use. It is therefore, difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA) with technical, radiological and statistical experts developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
NASA Technical Reports Server (NTRS)
Buck, Gregory M. (Inventor)
1989-01-01
A thermal imaging system provides quantitative temperature information and is particularly useful in hypersonic wind tunnel applications. An object to be measured is prepared by coating with a two-color, ultraviolet-activated, thermographic phosphor. The colors emitted by the phosphor are detected by a conventional color video camera. A phosphor emitting blue and green light with a ratio that varies depending on temperature is used so that the intensity of light in the blue and green wavelengths detected by the blue and green tubes in the video camera can be compared. Signals representing the intensity of blue and green light at points on the surface of a model in a hypersonic wind tunnel are used to calculate a ratio of blue to green light intensity which provides quantitative temperature information for the surface of the model.
Quantitative Evaluation of Musical Scale Tunings
ERIC Educational Resources Information Center
Hall, Donald E.
1974-01-01
The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Paim Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements is systematically collected, quantitative arguments provide far more benefits over qualitative arguments, in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations
Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.
Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K
2017-05-01
Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.
NASA Astrophysics Data System (ADS)
Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam
2015-03-01
By combining with the phase sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have been reported to provide elastography of skin tissue respectively. However, neither of these two methods can provide the elastography in full skin depth in current systems. This paper presents a feasibility study on an optical coherence elastography method which combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue with full depth range, including epidermis, dermis and subcutaneous fat. Experiments are carried out on layered tissue mimicking phantoms and in vivo human forearm and palm skin. A ring actuator generates vibration while a line actuator were used to excited SAWs. A PhS-OCT system is employed to provide the ultrahigh sensitive measurement of the generated waves. The experimental results demonstrate that by the combination of vibration and SAW method the full skin bulk mechanical properties can be quantitatively measured and further the elastography can be obtained with a sensing depth from ~0mm to ~4mm. This method is promising to apply in clinics where the quantitative elasticity of localized skin diseases is needed to aid the diagnosis and treatment.
Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M
2017-08-01
Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not ( P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not ( P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not ( P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT ( P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.
Exploring a taxonomy for aggression against women: can it aid conceptual clarity?
Cook, Sarah; Parrott, Dominic
2009-01-01
The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.
Behavior of stabled horses provided continuous or intermittent access to drinking water.
McDonnell, S M; Freeman, D A; Cymbaluk, N F; Schott, H C; Hinchcliff, K; Kyle, B
1999-11-01
To compare quantitative measures and clinical assessments of behavior as an indication of psychologic well-being of stabled horses provided drinking water continuously or via 1 of 3 intermittent delivery systems. 22 Quarter Horse (QH) or QH-crossbred mares and 17 Belgian or Belgian-crossbred mares (study 1) and 24 QH or QH-crossbred mares and 18 Belgian or Belgian-crossbred mares (study 2). Stabled horses were provided water continuously or via 1 of 3 intermittent water delivery systems in 2 study periods during a 2-year period. Continuous 24-hour videotaped samples were used to compare quantitative measures and clinical assessments of behavior among groups provided water by the various water delivery systems. All horses had clinically normal behavior. Significant differences in well being were not detected among groups provided water by the various delivery systems. Various continuous and intermittent water delivery systems can provide adequately for the psychologic well-being of stabled horses.
Highlights from High Energy Neutrino Experiments at CERN
NASA Astrophysics Data System (ADS)
Schlatter, W.-D.
2015-07-01
Experiments with high energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the nucleon quark structure and of the weak current, together with the precise measurement of the weak mixing angle. These results have established a new quality for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed first quantitative tests of QCD.
DOE R&D Accomplishments Database
Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.
1978-01-01
Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This facility when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner allow measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity and signal to noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.
NASA Astrophysics Data System (ADS)
Mansfield, C. D.; Rutt, H. N.
2002-02-01
The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.
Investigating the Validity of Two Widely Used Quantitative Text Tools
ERIC Educational Resources Information Center
Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne
2018-01-01
In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…
Robert E. Keane; Lisa Holsinger; Russell A. Parsons
2011-01-01
A measure of the degree of departure of a landscape from its range of historical conditions can provide a means for prioritizing and planning areas for restoration treatments. There are few statistics or indices that provide a quantitative context for measuring departure across landscapes. This study evaluated a set of five similarity indices commonly used in...
Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength
NASA Astrophysics Data System (ADS)
Loho, T.; Dickinson, M.
2018-04-01
The mechanism for the way that ice adheres to surfaces is still not well understood. Currently there is no standard method to quantitatively measure how ice adheres to surfaces which makes ice surface studies difficult to compare. A novel quantitative lateral force adhesion measurement at the micro-nano scale for ice was created which shears micro-nano sized ice droplets (less than 3 μm in diameter and 100nm in height) using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised which increased data repeatability compared to bulk testing. The technique provided post- testing surface scans to confirm that the ice had been removed and that measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by the nano-scale surface roughness of the material with rougher surfaces having higher ice adhesion strength.
Measuring iron in the brain using quantitative susceptibility mapping and X-ray fluorescence imaging
Zheng, Weili; Nichol, Helen; Liu, Saifeng; Cheng, Yu-Chung N.; Haacke, E. Mark
2013-01-01
Measuring iron content in the brain has important implications for a number of neurodegenerative diseases. Quantitative susceptibility mapping (QSM), derived from magnetic resonance images, has been used to measure total iron content in vivo and in post mortem brain. In this paper, we show how magnetic susceptibility from QSM correlates with total iron content measured by X-ray fluorescence (XRF) imaging and by inductively coupled plasma mass spectrometry (ICPMS). The relationship between susceptibility and ferritin iron was estimated at 1.10 ± 0.08 ppb susceptibility per μg iron/g wet tissue, similar to that of iron in fixed (frozen/thawed) cadaveric brain and previously published data from unfixed brains. We conclude that magnetic susceptibility can provide a direct and reliable quantitative measurement of iron content and that it can be used clinically at least in regions with high iron content. PMID:23591072
The quantitative and condition-dependent Escherichia coli proteome
Schmidt, Alexander; Kochanowski, Karl; Vedelaar, Silke; Ahrné, Erik; Volkmer, Benjamin; Callipo, Luciano; Knoops, Kèvin; Bauer, Manuel; Aebersold, Ruedi; Heinemann, Matthias
2016-01-01
Measuring precise concentrations of proteins can provide insights into biological processes. Here, we use efficient protein extraction and sample fractionation and state-of-the-art quantitative mass spectrometry techniques to generate a comprehensive, condition-dependent protein abundance map of Escherichia coli. We measure cellular protein concentrations for 55% of predicted E. coli genes (>2300 proteins) under 22 different experimental conditions and identify methylation and N-terminal protein acetylations previously not known to be prevalent in bacteria. We uncover system-wide proteome allocation, expression regulation, and post-translational adaptations. These data provide a valuable resource for the systems biology and broader E. coli research communities. PMID:26641532
Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja
2016-11-01
To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
Motion compensation using origin ensembles in awake small animal positron emission tomography
NASA Astrophysics Data System (ADS)
Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.
2017-02-01
In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies for which head motion, unique for each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum liklihood—expectation maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturb the measured data.
Herbort, Carl P; Tugal-Tutkun, Ilknur
2017-06-01
Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.
An Automated System for Chromosome Analysis
NASA Technical Reports Server (NTRS)
Castleman, K. R.; Melnyk, J. H.
1976-01-01
The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and to provide a basis for statistical analysis of quantitative chromosome measurement data are described.
Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros
2018-02-01
Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm 3 , 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
NASA Technical Reports Server (NTRS)
Partridge, William P.; Laurendeau, Normand M.
1997-01-01
We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally-based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally-based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically-based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally-based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally-based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically-based PLIF enhancement procedures.
Measurement of the traction force of biological cells by digital holography
Yu, Xiao; Cross, Michael; Liu, Changgeng; Clark, David C.; Haynie, Donald T.; Kim, Myung K.
2011-01-01
The traction force produced by biological cells has been visualized as distortions in flexible substrata. We have utilized quantitative phase microscopy by digital holography (DH-QPM) to study the wrinkling of a silicone rubber film by motile fibroblasts. Surface deformation and the cellular traction force have been measured from phase profiles in a direct and straightforward manner. DH-QPM is shown to provide highly efficient and versatile means for quantitatively analyzing cellular motility. PMID:22254175
NASA Astrophysics Data System (ADS)
Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.
2005-05-01
If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.
NASA Astrophysics Data System (ADS)
Ding, Carl-Philipp; Sjöberg, Magnus; Vuilleumier, David; Reuss, David L.; He, Xu; Böhm, Benjamin
2018-03-01
This study shows fuel film measurements in a spark-ignited direct injection engine using refractive index matching (RIM). The RIM technique is applied to measure the fuel impingement of a high research octane number gasoline fuel with 30 vol% ethanol content at two intake pressures and coolant temperatures. Measurements are conducted for an alkylate fuel at one operating case, as well. It is shown that the fuel volume on the piston surface increases for lower intake pressure and lower coolant temperature and that the alkylate fuel shows very little spray impingement. The fuel films can be linked to increased soot emissions. A detailed description of the calibration technique is provided and measurement uncertainties are discussed. The dependency of the RIM signal on refractive index changes is measured. The RIM technique provides quantitative film thickness measurements up to 0.9 µm in this engine. For thicker films, semi-quantitative results of film thickness can be utilized to study the distribution of impinged fuel.
Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk
2016-08-22
The Goutallier Classification is a semi quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computer tomography scans it is currently applied to magnet-resonance-imaging-scans. The role for its clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi quantitative MRI-based Goutallier Classification applied by 5 different raters to experimental MR spectroscopic quantitative fat measurement in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI-scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and were graduated according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi quantitative grading according to the Goutallier Classification was statistically correlated with the quantitative measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation of the Goutallier Classification system and the quantitative fat/water ratio with R = 0.35 (p < 0.05). By dichotomizing the scale the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi quantitative MRI based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of the Goutallier classification and thus improve the prediction of clinical results after rotator cuff repair. However, these techniques are currently only available in an experimental setting.
A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format
ERIC Educational Resources Information Center
Moore, Vivianne E.
2013-01-01
This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…
Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Kam, Kevin Ka-Ho; Tsui, Gary K W; Wong, Kenneth K Y; Looi, Jen-Li; Wong, Randolph H L; Wan, Song; Sun, Jing Ping; Underwood, Malcolm J; Yu, Cheuk-Man
2014-01-01
The mitral valve (MV) has complex 3-dimensional (3D) morphology and motion. Advance in real-time 3D echocardiography (RT3DE) has revolutionized clinical imaging of the MV by providing clinicians with realistic visualization of the valve. Thus far, RT3DE of the MV structure and dynamics has adopted an approach that depends largely on subjective and qualitative interpretation of the 3D images of the valve, rather than objective and reproducible measurement. RT3DE combined with image-processing computer techniques provides precise segmentation and reliable quantification of the complex 3D morphology and rapid motion of the MV. This new approach to imaging may provide additional quantitative descriptions that are useful in diagnostic and therapeutic decision-making. Quantitative analysis of the MV using RT3DE has increased our understanding of the pathologic mechanism of degenerative, ischemic, functional, and rheumatic MV disease. Most recently, 3D morphologic quantification has entered into clinical use to provide more accurate diagnosis of MV disease and for planning surgery and transcatheter interventions. Current limitations of this quantitative approach to MV imaging include labor-intensiveness during image segmentation and lack of a clear definition of the clinical significance of many of the morphologic parameters. This review summarizes the current development and applications of quantitative analysis of the MV morphology using RT3DE.
TH-AB-209-09: Quantitative Imaging of Electrical Conductivity by VHF-Induced Thermoacoustics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patch, S; Hull, D; See, W
Purpose: To demonstrate that very high frequency (VHF) induced thermoacoustics has the potential to provide quantitative images of electrical conductivity in Siemens/meter, much as shear wave elastography provides tissue stiffness in kPa. Quantitatively imaging a large organ requires exciting thermoacoustic pulses throughout the volume and broadband detection of those pulses because tomographic image reconstruction preserves frequency content. Applying the half-wavelength limit to a 200-micron inclusion inside a 7.5 cm diameter organ requires measurement sensitivity to frequencies ranging from 4 MHz down to 10 kHz, respectively. VHF irradiation provides superior depth penetration over near infrared used in photoacoustics. Additionally, VHF signalmore » production is proportional to electrical conductivity, and prostate cancer is known to suppress electrical conductivity of prostatic fluid. Methods: A dual-transducer system utilizing a P4-1 array connected to a Verasonics V1 system augmented by a lower frequency focused single element transducer was developed. Simultaneous acquisition of VHF-induced thermoacoustic pulses by both transducers enabled comparison of transducer performance. Data from the clinical array generated a stack of 96-images with separation of 0.3 mm, whereas the single element transducer imaged only in a single plane. In-plane resolution and quantitative accuracy were measured at isocenter. Results: The array provided volumetric imaging capability with superior resolution whereas the single element transducer provided superior quantitative accuracy. Combining axial images from both transducers preserved resolution of the P4-1 array and improved image contrast. Neither transducer was sensitive to frequencies below 50 kHz, resulting in a DC offset and low-frequency shading over fields of view exceeding 15 mm. Fresh human prostates were imaged ex vivo and volumetric reconstructions reveal structures rarely seen in diagnostic images. Conclusion: Quantitative whole-organ thermoacoustic tomography will be feasible by sparsely interspersing transducer elements sensitive to the low end of the ultrasonic range.« less
Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F
2016-08-03
Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
Prototype ultrasonic instrument for quantitative testing
NASA Technical Reports Server (NTRS)
Lynnworth, L. C.; Dubois, J. L.; Kranz, P. R.
1972-01-01
A prototype ultrasonic instrument has been designed and developed for quantitative testing. The complete delivered instrument consists of a pulser/receiver which plugs into a standard oscilloscope, an rf power amplifier, a standard decade oscillator, and a set of broadband transducers for typical use at 1, 2, 5 and 10 MHz. The system provides for its own calibration, and on the oscilloscope, presents a quantitative (digital) indication of time base and sensitivity scale factors and some measurement data.
Quantitative Characterization of Nanostructured Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Frank
The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structuremore » measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.« less
Vapor Corrosion Cell and Method of Using Same
NASA Technical Reports Server (NTRS)
Davis, Dennis D. (Inventor)
2001-01-01
The present invention provides a vapor corrosion cell for a real-time and quantitative measurement of corrosion of conductive materials in atmospheres containing chemically reactive gases and water vapor. Two prototypes are provided. Also provided are various applications of this apparatus in industry.
Dong, Daming; Jiao, Leizi; Du, Xiaofan; Zhao, Chunjiang
2017-04-20
In this study, we developed a substrate to enhance the sensitivity of LIBS by 5 orders of magnitude. Using a combination of field enhancement due to the metal nanoparticles in the substrate, the aggregate effect of super-hydrophobic interfaces and magnetic confinement, we performed a quantitative measurement of copper in solution with concentrations on the ppt level. We also demonstrated that the substrate improves quantitative measurements by providing an opportunity for internal standardization.
A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon
NASA Technical Reports Server (NTRS)
Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.
2017-01-01
The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurements and computational modeling neighborhoods of the Earth-Moon community to ever closer proximity. We are now however in the position to become even better neighbors: modelers can generate testable hypthotheses for geochemists; and geochemists can provide quantitive targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.
Hyper-spectrum scanning laser optical tomography
NASA Astrophysics Data System (ADS)
Chen, Lingling; Li, Guiye; Li, Yingchao; Liu, Lina; Liu, Ang; Hu, Xuejuan; Ruan, Shuangchen
2018-02-01
We describe a quantitative fluorescence projection tomography technique which measures the three-dimensional fluorescence spectrum in biomedical samples with size up to several millimeters. This is achieved by acquiring a series of hyperspectral images, by using laser scanning scheme, at different projection angles. We demonstrate that this technique provide a quantitative measure of the fluorescence signal by comparing the spectrum and intensity profile of a fluorescent bead phantom and also demonstrate its application to differentiating the extrinsic label and the autofluorescence in a mouse embryo.
Spector, P E; Jex, S M
1998-10-01
Despite the widespread use of self-report measures of both job-related stressors and strains, relatively few carefully developed scales for which validity data exist are available. In this article, we discuss 3 job stressor scales (Interpersonal Conflict at Work Scale, Organizational Constraints Scale, and Quantitative Workload Inventory) and 1 job strain scale (Physical Symptoms Inventory). Using meta-analysis, we combined the results of 18 studies to provide estimates of relations between our scales and other variables. Data showed moderate convergent validity for the 3 job stressor scales, suggesting some objectively to these self-reports. Norms for each scale are provided.
Assessing Psychodynamic Conflict.
Simmonds, Joshua; Constantinides, Prometheas; Perry, J Christopher; Drapeau, Martin; Sheptycki, Amanda R
2015-09-01
Psychodynamic psychotherapies suggest that symptomatic relief is provided, in part, with the resolution of psychic conflicts. Clinical researchers have used innovative methods to investigate such phenomenon. This article aims to review the literature on quantitative psychodynamic conflict rating scales. An electronic search of the literature was conducted to retrieve quantitative observer-rated scales used to assess conflict noting each measure's theoretical model, information source, and training and clinical experience required. Scales were also examined for levels of reliability and validity. Five quantitative observer-rated conflict scales were identified. Reliability varied from poor to excellent with each measure demonstrating good validity. However a small number of studies and limited links to current conflict theory suggest further clinical research is needed.
Quantitation of Met tyrosine phosphorylation using MRM-MS.
Meng, Zhaojing; Srivastava, Apurva K; Zhou, Ming; Veenstra, Timothy
2013-01-01
Phosphorylation has long been accepted as a key cellular regulator of cell signaling pathways. The recent development of multiple-reaction monitoring mass spectrometry (MRM-MS) provides a useful tool for measuring the absolute quantity of phosphorylation occupancy at pivotal sites within signaling proteins, even when the phosphorylation sites are in close proximity. Here, we described a targeted quantitation approach to measure the absolute phosphorylation occupancy at Y1234 and Y1235 of Met. The approach is utilized to obtain absolute occupancy of the two phosphorylation sites in the full-length recombinant Met. It is further applied to quantitate the phosphorylation state of these two sites in SNU-5 cells treated with a Met inhibitor.
Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns
NASA Astrophysics Data System (ADS)
Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.
2013-03-01
Automated lung parenchymal classification usually relies on supervised learning of expert chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (eg. emphysema, ground glass, honey combing, reticular and normal). Considering the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve the machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data and domain dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combine the clusters obtained with multiple (n=33) probability density-based similarity measures. Our study shows that hypergraph based meta-clustering and probabilistic clustering provides optimal expert-metric agreement.
Manned Mars mission radiation environment and radiobiology
NASA Technical Reports Server (NTRS)
Nachtwey, D. S.
1986-01-01
Potential radiation hazards to crew members on manned Mars missions are discussed. It deals briefly with radiation sources and environments likely to be encountered during various phases of such missions, providing quantitative estimates of these environments. Also provided are quantitative data and discussions on the implications of such radiation on the human body. Various sorts of protective measures are suggested. Recent re-evaluation of allowable dose limits by the National Council of Radiation Protection is discussed, and potential implications from such activity are assessed.
Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas
2016-01-01
In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T 2* and T 2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not been systematically examined yet and remain to be established. For this purpose, the development of an integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T 2* (T 2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T 2* and tissue oxygenation.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.
Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H
2015-09-01
This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design using qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase using grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (of 5). The TCI was associated will all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team. (c) 2015 APA, all rights reserved).
Multifractal spectrum and lacunarity as measures of complexity of osseointegration.
de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko
2016-07-01
The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposition to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All the three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity), for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
MONITORING ECOSYSTEMS FROM SPACE: THE GLOBAL FIDUCIALS PROGRAM
Images from satellites provide valuable insights to changes in land-cover and ecosystems. Long- term monitoring of ecosystem change using historical satellite imagery can provide quantitative measures of ecological processes and allows for estimation of future ecosystem condition...
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2008-01-01
An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented into the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and include measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Medical privacy protection based on granular computing.
Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng
2004-10-01
Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when he received the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and previous bin size and logical approaches. These two new criteria are implemented in a prototype system Cellsecu 2.0. Preliminary system performance evaluation is conducted and reviewed.
DOT National Transportation Integrated Search
1995-10-01
The primary objective of this study is to provide information relative to the development of a set of performance measures for intermodal freight transportation. To accomplish this objective, data was collected, processed, and analyzed on the basis o...
Measuring the performance of visual to auditory information conversion.
Tan, Shern Shiou; Maul, Tomás Henrique Bode; Mennie, Neil Russell
2013-01-01
Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, the auditory cues generated from image sonification systems are still easier to learn and adapt to compared to other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly. Performance is measured by both the interpretability and also the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter image distance (IID) and inter sound distance (ISD) whereas the information preservation is computed by applying Information Theory to measure the entropy of both visual and corresponding auditory signals. These measurements provide a basis and some insights on how the systems work. With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost effectively regain enough visual functions to allow them to lead secure and productive lives.
A preliminary study of DTI Fingerprinting on stroke analysis.
Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo
2014-01-01
DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique which provides useful structural information about human brain. However, the quantitative measurement to physiological variation of subtypes of ischemic stroke is not available. An automatically quantitative method for DTI analysis will enhance the DTI application in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied in stroke classification. The TBSS (Tract Based Spatial Statistics) method was employed to generate mask automatically. To evaluate the clustering performance of the automatic method, lesion ROI (Region of Interest) is manually drawn on the DWI images as a reference. The results from the DTI Fingerprinting were compared with those obtained from the reference ROIs. It indicates that the DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology in clinics.
Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua
2014-04-01
To improve the occupational health management levels in electroplating enterprises with quantitative classification measures and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for the occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60∼85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable attempt, which is helpful for the supervision and management by the health department and provides an effective method for the self-management of enterprises.
Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian
The role that mechanical forces play in biological processes such as cell movement and death is becoming of significant interest to further develop our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity which maps a samples birefringence, or its direction dependent refractive indices, using polarized light. However this method only provides qualitative data and for stress information to be useful quantitative data is required. Ptychography is a method for quantitatively determining the phase of a samples complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids in the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University Melbourne on achieving quantitative birefringence mapping using polarized light ptychography with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.
Code of Federal Regulations, 2012 CFR
2012-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2011 CFR
2011-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2013 CFR
2013-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2010 CFR
2010-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2014 CFR
2014-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng
2016-04-01
The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre contrast sagittal images using an in-house, proprietary segmentation algorithm which combines the region-based active contours and a level set approach. FGT (%) was calculated by: [segmented volume of FGT (mm(3))/(segmented volume of whole breast (mm(3))] ×100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT correlated with the calculated mean quantitative FGT (%) of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer generated standardized measurements with limited intra or inter-observer variability.
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.
Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E
2016-01-01
Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open source software package Skyline, for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that is supported by and validates previously published data on quantified cross-linked peptide pairs. This advance provides an easy to use resource so that any lab with access to a LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.
Normalized Temperature Contrast Processing in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2016-01-01
The paper presents further development in normalized contrast processing used in flash infrared thermography method. Method of computing normalized image or pixel intensity contrast, and normalized temperature contrast are provided. Methods of converting image contrast to temperature contrast and vice versa are provided. Normalized contrast processing in flash thermography is useful in quantitative analysis of flash thermography data including flaw characterization and comparison of experimental results with simulation. Computation of normalized temperature contrast involves use of flash thermography data acquisition set-up with high reflectivity foil and high emissivity tape such that the foil, tape and test object are imaged simultaneously. Methods of assessing other quantitative parameters such as emissivity of object, afterglow heat flux, reflection temperature change and surface temperature during flash thermography are also provided. Temperature imaging and normalized temperature contrast processing provide certain advantages over normalized image contrast processing by reducing effect of reflected energy in images and measurements, therefore providing better quantitative data. Examples of incorporating afterglow heat-flux and reflection temperature evolution in flash thermography simulation are also discussed.
In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques to provide quantitative measurements of surface water between different orbits were developed which were accepted as the standard flood mapping techniques using ERTS.
Impact of immersion oils and mounting media on the confocal imaging of dendritic spines
Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.
2015-01-01
Background Structural plasticity, such as changes in dendritic spine morphology and density, reflect changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil as being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477
Impact of immersion oils and mounting media on the confocal imaging of dendritic spines.
Peterson, Brittni M; Mermelstein, Paul G; Meisel, Robert L
2015-03-15
Structural plasticity, such as changes in dendritic spine morphology and density, reflect changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil as being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. Copyright © 2015 Elsevier B.V. All rights reserved.
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had be used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Bloecker, Katja; Wirth, W; Guermazi, A; Hitzl, W; Hunter, D J; Eckstein, F
2015-10-01
We aimed to apply 3D MRI-based measurement technology to studying 2-year change in quantitative measurements of meniscus size and position. Forty-seven knees from the Osteoarthritis Initiative with medial radiographic joint space narrowing had baseline and 2-year follow-up MRIs. Quantitative measures were obtained from manual segmentation of the menisci and tibia using coronal DESSwe images. The standardized response mean (SRM = mean/SD change) was used as measure of sensitivity to longitudinal change. Medial tibial plateau coverage decreased from 34.8% to 29.9% (SRM -0.82; p < 0.001). Change in medial meniscus extrusion in a central image (SRM 0.18) and in the central five slices (SRM 0.22) did not reach significance, but change in extrusion across the entire meniscus (SRM 0.32; p = 0.03) and in the relative area of meniscus extrusion (SRM 0.56; p < 0.001) did. There was a reduction in medial meniscus volume (10%; p < 0.001), width (7%; p < 0.001), and height (2%; p = 0.08); meniscus substance loss was strongest in the posterior (SRM -0.51; p = 0.001) and weakest in the anterior horn (SRM -0.15; p = 0.31). This pilot study reports, for the first time, longitudinal change in quantitative 3D meniscus measurements in knee osteoarthritis. It provides evidence of improved sensitivity to change of 3D measurements compared with single slice analysis. • First longitudinal MRI-based measurements of change of meniscus position and size. • Quantitative longitudinal evaluation of meniscus change in knee osteoarthritis. • Improved sensitivity to change of 3D measurements compared with single slice analysis.
Quantiprot - a Python package for quantitative analysis of protein sequences.
Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold
2017-07-17
The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties defines a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where large number of sequences generated by the model can be compared to actually observed sequences.
Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...
Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.
2016-01-01
The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
Pop, Bianca; Niculae, Alexandru-Ștefan; Pop, Tudor Lucian; Răchișan, Andreea Liana
2017-12-01
Autism spectrum disorder (ASD) represents a very large set of neurodevelopmental issues with diverse clinical outcomes. Various hypotheses have been put forth for the etiology of autism spectrum disorder, including issues pertaining to oxidative stress. In this study, we conducted measurements of serum 8-Iso-Prostaglanding F2 α (8-iso-PGF2α, which is the results of non-enzimatically mediated polyunsaturated fatty acid oxidation) in a population of individuals with autism and a control group of age and sex matched controls. A quantitative assay of Paraoxonase 1 (PON1) was conducted. Data regarding comorbidities, structural MRI scans, medication, intelligence quotient (IQ) and Childhood Autism Rating Scale scores (CARS) were also included in our study. Our results show that patients diagnosed with autism have higher levels of 8-iso-PGF2α than their neurotypical counterparts. Levels of this particular metabolite, however, do not correlate with quantitative serum levels of Paraoxonase 1, which has been shown to be altered in individuals with autism. Neither 8-iso-PGF2α nor quantitative levels of PON1 provide any meaningful correlation with clinical or neuroimaging data in this study group. Future research should focus on providing data regarding PON 1 phenotype, in addition to standard quantitative measurements, in relation to 8-iso-PGF2α as well as other clinical and structural brain findings.
Sloot, P M; Hoekstra, A G; van der Liet, H; Figdor, C G
1989-05-15
Light scattering techniques (including depolarization experiments) applied to biological cells provide a fast nondestructive probe that is very sensitive to small morphological differences. Until now quantitative measurement of these scatter phenomena were only described for particles in suspension. In this paper we discuss the symmetry conditions applicable to the scattering matrices of monodisperse biological cells in a flow cytometer and provide evidence that quantitative measurement of the elements of these scattering matrices is possible in flow through systems. Two fundamental extensions to the theoretical description of conventional scattering experiments are introduced: large cone integration of scattering signals and simultaneous implementation of the localization principle to account for scattering by a sharply focused laser beam. In addition, a specific calibration technique is proposed to account for depolarization effects of the highly specialized optics applied in flow through equipment.
Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams
NASA Technical Reports Server (NTRS)
Davis, Brian A.
2005-01-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.
NASA Astrophysics Data System (ADS)
Suman, Rakesh; O'Toole, Peter
2014-03-01
Here we report a novel label free, high contrast and quantitative method for imaging live cells. The technique reconstructs an image from overlapping diffraction patterns using a ptychographical algorithm. The algorithm utilises both amplitude and phase data from the sample to report on quantitative changes related to the refractive index (RI) and thickness of the specimen. We report the ability of this technique to generate high contrast images, to visualise neurite elongation in neuronal cells, and to provide measure of cell proliferation.
de Certaines, J D; Henriksen, O; Spisni, A; Cortsen, M; Ring, P B
1993-01-01
Quantitative magnetic resonance imaging may offer unique potential for tissue characterization in vivo. In this connection texture analysis of quantitative MR images may be of special importance. Because evaluation of texture analysis needs large data material, multicenter approaches become mandatory. Within the frame of BME Concerted Action on Tissue Characterization by MRI and MRS, a pilot multicenter study was launched in order to evaluate the technical problems including comparability of relaxation time measurements carried out in the individual sites. Human brain, skeletal muscle, and liver were used as models. A total of 218 healthy volunteers were studied. Fifteen MRI scanners with field strength ranging from 0.08 T to 1.5 T were induced. Measurement accuracy was tested on the Eurospin relaxation time test object (TO5) and the obtained calibration curve was used for correction of the in vivo data. The results established that, by following a standardized procedure, comparable quantitative measurements can be obtained in vivo from a number of MR sites. The overall variation coefficient in vivo was in the same order of magnitude as ex vivo relaxometry. Thus, it is possible to carry out international multicenter studies on quantitative imaging, provided that quality control with respect to measurement accuracy and calibration of the MR equipments are performed.
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing , such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative ...methodology report14. Report provides a VX-TPP ASTR for post -decon contact hazard and off- gassing. In the Stryker production verification test (PVT
ERIC Educational Resources Information Center
Xu, Xia; Veenstra, Timothy D.
2012-01-01
The list of physiological events in which sex steroids play a role continues to increase. To decipher the roles that sex steroids play in any condition requires high quality cohorts of samples and assays that provide highly accurate quantitative measures. Liquid and gas chromatography coupled with mass spectrometry (LC-MS and GC-MS) have…
NASA Astrophysics Data System (ADS)
Lawrence, D. J.; Maurice, S.; Patterson, G. W.; Hibbitts, C. A.
2010-05-01
Understanding the global composition of Ganymede's surface is a key goal of the Europa Jupiter System Mission (EJSM) that is being jointly planned by NASA and ESA. Current plans for obtaining surface information with the Jupiter Ganymede Orbiter (JGO) use spectral imaging measurements. While spectral imaging can provide good mineralogy-related information, quantitative data about elemental abundances can often be hindered by non-composition variations due to surface effects (e.g., space weathering, grain effects, temperature, etc.). Orbital neutron and gamma-ray spectroscopy can provide quantitative composition information that is complementary to spectral imaging measurements, as has been demonstrated with similar instrumental combinations at the Moon, Mars, and Mercury. Neutron and gamma-ray measurements have successfully returned abundance information in a hydrogen-rich environment on Mars. In regards to neutrons and gamma-rays, there are many similarities between the Mars and Ganymede hydrogen-rich environments. In this study, we present results of neutron transport models, which show that quantitative composition information from Ganymede's surface can be obtained in a realistic mission scenario. Thermal and epithermal neutrons are jointly sensitive to the abundances of hydrogen and neutron absorbing elements, such as iron and titanium. These neutron measurements can discriminate between regions that are rich or depleted in neutron absorbing elements, even in the presence of large amounts of hydrogen. Details will be presented about how the neutron composition parameters can be used to meet high-level JGO science objectives, as well as an overview of a neutron spectrometer than can meet various mission and stringent environmental requirements.
Highway-railway at-grade crossing structures : rideability measurements and assessments.
DOT National Transportation Integrated Search
2009-05-01
This report provides two analyses for obtaining a quantitative means of rating the condition of railroad-highway at-grade crossings based on their measured roughness. Phase One of this report examined 11 crossings in the Lexington area by use of a la...
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and interpretation of the results obtained by these methods shall be connected to the education background. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and the possible explanation to the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson's correlation and the tetrachoric correlation are examined. A new method, model based association is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
NASA Astrophysics Data System (ADS)
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-11-01
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr05839b
NASA Astrophysics Data System (ADS)
Berthias, F.; Feketeová, L.; Della Negra, R.; Dupasquier, T.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.; Märk, T. D.
2018-01-01
The combination of the Dispositif d'Irradiation d'Agrégats Moléculaire with the correlated ion and neutral time of flight-velocity map imaging technique provides a new way to explore processes occurring subsequent to the excitation of charged nano-systems. The present contribution describes in detail the methods developed for the quantitative measurement of branching ratios and cross sections for collision-induced dissociation processes of water cluster nano-systems. These methods are based on measurements of the detection efficiency of neutral fragments produced in these dissociation reactions. Moreover, measured detection efficiencies are used here to extract the number of neutral fragments produced for a given charged fragment.
NASA Astrophysics Data System (ADS)
Fantini, Sergio; Sassaroli, Angelo; Kainerstorfer, Jana M.; Tgavalekos, Kristen T.; Zang, Xuan
2016-03-01
We describe the general principles and initial results of coherent hemodynamics spectroscopy (CHS), which is a new technique for the quantitative assessment of cerebral hemodynamics on the basis of dynamic near-infrared spectroscopy (NIRS) measurements. The two components of CHS are (1) dynamic measurements of coherent cerebral hemodynamics in the form of oscillations at multiple frequencies (frequency domain) or temporal transients (time domain), and (2) their quantitative analysis with a dynamic mathematical model that relates the concentration and oxygen saturation of hemoglobin in tissue to cerebral blood volume (CBV), cerebral blood flow (CBF), and cerebral metabolic rate of oxygen (CMRO2). In particular, CHS can provide absolute measurements and dynamic monitoring of CBF, and quantitative measures of cerebral autoregulation. We report initial results of CBF measurements in hemodialysis patients, where we found a lower CBF (54 +/- 16 ml/(100 g-min)) compared to a group of healthy controls (95 +/- 11 ml/(100 g-min)). We also report CHS measurements of cerebral autoregulation, where a quantitative index of autoregulation (its cutoff frequency) was found to be significantly greater in healthy subjects during hyperventilation (0.034 +/- 0.005 Hz) than during normal breathing (0.017 +/- 0.002 Hz). We also present our approach to depth resolved CHS, based on multi-distance, frequency-domain NIRS data and a two-layer diffusion model, to enhance sensitivity to cerebral tissue. CHS offers a potentially powerful approach to the quantitative assessment and continuous monitoring of local brain perfusion at the microcirculation level, with prospective brain mapping capabilities of research and clinical significance.
Advancing the Fork detector for quantitative spent nuclear fuel verification
Vaccaro, S.; Gauld, I. C.; Hu, J.; ...
2018-01-31
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations.more » A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.« less
Advancing the Fork detector for quantitative spent nuclear fuel verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaccaro, S.; Gauld, I. C.; Hu, J.
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations.more » A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.« less
Advancing the Fork detector for quantitative spent nuclear fuel verification
NASA Astrophysics Data System (ADS)
Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.
2018-04-01
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. The results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
Barnard, Alison M; Willcocks, Rebecca J; Finanger, Erika L; Daniels, Michael J; Triplett, William T; Rooney, William D; Lott, Donovan J; Forbes, Sean C; Wang, Dah-Jyuu; Senesac, Claudia R; Harrington, Ann T; Finkel, Richard S; Russman, Barry S; Byrne, Barry J; Tennekoon, Gihan I; Walter, Glenn A; Sweeney, H Lee; Vandenborne, Krista
2018-01-01
To provide evidence for quantitative magnetic resonance (qMR) biomarkers in Duchenne muscular dystrophy by investigating the relationship between qMR measures of lower extremity muscle pathology and functional endpoints in a large ambulatory cohort using a multicenter study design. MR spectroscopy and quantitative imaging were implemented to measure intramuscular fat fraction and the transverse magnetization relaxation time constant (T2) in lower extremity muscles of 136 participants with Duchenne muscular dystrophy. Measures were collected at 554 visits over 48 months at one of three imaging sites. Fat fraction was measured in the soleus and vastus lateralis using MR spectroscopy, while T2 was assessed using MRI in eight lower extremity muscles. Ambulatory function was measured using the 10m walk/run, climb four stairs, supine to stand, and six minute walk tests. Significant correlations were found between all qMR and functional measures. Vastus lateralis qMR measures correlated most strongly to functional endpoints (|ρ| = 0.68-0.78), although measures in other rapidly progressing muscles including the biceps femoris (|ρ| = 0.63-0.73) and peroneals (|ρ| = 0.59-0.72) also showed strong correlations. Quantitative MR biomarkers were excellent indicators of loss of functional ability and correlated with qualitative measures of function. A VL FF of 0.40 was an approximate lower threshold of muscle pathology associated with loss of ambulation. Lower extremity qMR biomarkers have a robust relationship to clinically meaningful measures of ambulatory function in Duchenne muscular dystrophy. These results provide strong supporting evidence for qMR biomarkers and set the stage for their potential use as surrogate outcomes in clinical trials.
NASA Astrophysics Data System (ADS)
Morgenthaler, George W.; Nuñez, German R.; Botello, Aaron M.; Soto, Jose; Shrairman, Ruth; Landau, Alexander
1998-01-01
Many reaction time experiments have been conducted over the years to observe human responses. However, most of the experiments that were performed did not have quantitatively accurate instruments for measuring change in reaction time under stress. There is a great need for quantitative instruments to measure neuromuscular reaction responses under stressful conditions such as distraction, disorientation, disease, alcohol, drugs, etc. The two instruments used in the experiments reported in this paper are such devices. Their accuracy, portability, ease of use, and biometric character are what makes them very special. PACE™ is a software model used to measure reaction time. VeriFax's Impairoscope measures the deterioration of neuromuscular responses. During the 1997 Summer Semester, various reaction time experiments were conducted on University of Colorado faculty, staff, and students using the PACE™ system. The tests included both two-eye and one-eye unstressed trials and trials with various stresses such as fatigue, distractions in which subjects were asked to perform simple arithmetic during the PACE™ tests, and stress due to rotating-chair dizziness. Various VeriFax Impairoscope tests, both stressed and unstressed, were conducted to determine the Impairoscope's ability to quantitatively measure this impairment. In the 1997 Fall Semester, a Phase II effort was undertaken to increase test sample sizes in order to provide statistical precision and stability. More sophisticated statistical methods remain to be applied to better interpret the data.
Quantitative and Isolated Measurement of Far-Field Light Scattering by a Single Nanostructure
NASA Astrophysics Data System (ADS)
Kim, Donghyeong; Jeong, Kwang-Yong; Kim, Jinhyung; Ee, Ho-Seok; Kang, Ju-Hyung; Park, Hong-Gyu; Seo, Min-Kyo
2017-11-01
Light scattering by nanostructures has facilitated research on various optical phenomena and applications by interfacing the near fields and free-propagating radiation. However, direct quantitative measurement of far-field scattering by a single nanostructure on the wavelength scale or less is highly challenging. Conventional back-focal-plane imaging covers only a limited solid angle determined by the numerical aperture of the objectives and suffers from optical aberration and distortion. Here, we present a quantitative measurement of the differential far-field scattering cross section of a single nanostructure over the full hemisphere. In goniometer-based far-field scanning with a high signal-to-noise ratio of approximately 27.4 dB, weak scattering signals are efficiently isolated and detected under total-internal-reflection illumination. Systematic measurements reveal that the total and differential scattering cross sections of a Au nanorod are determined by the plasmonic Fabry-Perot resonances and the phase-matching conditions to the free-propagating radiation, respectively. We believe that our angle-resolved far-field measurement scheme provides a way to investigate and evaluate the physical properties and performance of nano-optical materials and phenomena.
Photosynthetic Control of Atmospheric Carbonyl Sulfide during the Growing Season
NASA Technical Reports Server (NTRS)
Campbell, J. Elliott; Carmichael, Gregory R.; Chai, T.; Mena-Carrasco, M.; Tang, Y.; Blake, D. R.; Blake, N. J.; Vay, Stephanie A.; Collatz, G. James; Baker, I.;
2008-01-01
Climate models incorporate photosynthesis-climate feedbacks, yet we lack robust tools for large-scale assessments of these processes. Recent work suggests that carbonyl sulfide (COS), a trace gas consumed by plants, could provide a valuable constraint on photosynthesis. Here we analyze airborne observations of COS and carbon dioxide concentrations during the growing season over North America with a three-dimensional atmospheric transport model. We successfully modeled the persistent vertical drawdown of atmospheric COS using the quantitative relation between COS and photosynthesis that has been measured in plant chamber experiments. Furthermore, this drawdown is driven by plant uptake rather than other continental and oceanic fluxes in the model. These results provide quantitative evidence that COS gradients in the continental growing season may have broad use as a measurement-based photosynthesis tracer.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R. Chad; Bonifas, Andrew P.; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A.; Huang, Yonggang; West, Dennis P.; Paller, Amy S.; Alam, Murad
2014-01-01
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here we report a skin-like electronics platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of ‘epidermal’ electronics system in a realistic, clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. PMID:24668927
SWIMMING ASSOCIATED ILLNESS AND RAPID MEASURES OF WATER QUALITY AT A GULF BEACH
Studies at Great Lakes beaches have provided evidence that faster ways of measuring the fecal indicator bacteria (FIB) Enterococcus using quantitative polymerase chain reaction (qPCR) are predictive of swimming associated illness. In 2005 we conducted an epidemiology study to eva...
Measuring potential denitrification enzyme activity rates using the membrane inlet mass spectrometer
The denitrification enzyme activity (DEA) assay, provides a quantitative assessment of the multi enzyme, biological process of reactive nitrogen removal via the reduction of N03 to N2. Measured in soil, usually under non limiting carbon and nitrate concentrations, this short ter...
Fluorescent-Antibody Measurement Of Cancer-Cell Urokinase
NASA Technical Reports Server (NTRS)
Morrison, Dennis R.
1993-01-01
Combination of laboratory techniques provides measurements of amounts of urokinase in and between normal and cancer cells. Includes use of fluorescent antibodies specific against different forms of urokinase-type plasminogen activator, (uPA), fluorescence microscopy, quantitative analysis of images of sections of tumor tissue, and flow cytometry of different uPA's and deoxyribonucleic acid (DNA) found in suspended-tumor-cell preparations. Measurements provide statistical method for indicating or predicting metastatic potentials of some invasive tumors. Assessments of metastatic potentials based on such measurements used in determining appropriate follow-up procedures after surgical removal of tumors.
Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.
Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K
2016-11-01
Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.
Shackney, Stanley; Emlet, David R; Pollice, Agnese; Smith, Charles; Brown, Kathryn; Kociban, Deborah
2006-01-01
Laser scanning Cytometry (LSC) is a versatile technology that makes it possible to perform multiple measurements on individual cells and correlate them cell by cell with other cellular features. It would be highly desirable to be able to perform reproducible, quantitative, correlated cell-based immunofluorescence studies on individual cells from human solid tumors. However, such studies can be challenging because of the presence of large numbers of cell aggregates and other confounding factors. Techniques have been developed to deal with cell aggregates in data sets collected by LSC. Experience has also been gained in addressing other key technical and methodological issues that can affect the reproducibility of such cell-based immunofluorescence measurements. We describe practical aspects of cell sample collection, cell fixation and staining, protocols for performing multiparameter immunofluorescence measurements by LSC, use of controls and reference samples, and approaches to data analysis that we have found useful in improving the accuracy and reproducibility of LSC data obtained in human tumor samples. We provide examples of the potential advantages of LSC in examining quantitative aspects of cell-based analysis. Improvements in the quality of cell-based multiparameter immunofluorescence measurements make it possible to extract useful information from relatively small numbers of cells. This, in turn, permits the performance of multiple multicolor panels on each tumor sample. With links among the different panels that are provided by overlapping measurements, it is possible to develop increasingly more extensive profiles of intracellular expression of multiple proteins in clinical samples of human solid tumors. Examples of such linked panels of measurements are provided. Advances in methodology can improve cell-based multiparameter immunofluorescence measurements on cell suspensions from human solid tumors by LSC for use in prognostic and predictive clinical applications. Copyright (c) 2005 Wiley-Liss, Inc.
Quantitative phase imaging for enhanced assessment of optomechanical cancer cell properties
NASA Astrophysics Data System (ADS)
Kastl, Lena; Kemper, Björn; Schnekenburger, Jürgen
2018-02-01
Optical cell stretching provides label-free investigations of cells by measuring their biomechanical properties based on deformability determination in a fiber optical two-beam trap. However, the stretching forces in this two-beam laser trap depend on the optical properties of the investigated specimen. Therefore, we characterized in parallel four cancer cell lines with varying degree of differentiation utilizing quantitative phase imaging (QPI) and optical cell stretching. The QPI data allowed enhanced assessment of the mechanical cell properties measured with the optical cell stretcher and demonstrates the high potential of cell phenotyping when both techniques are combined.
Measures of fish behavior as indicators of sublethal toxicosis during standard toxicity tests
Little, E.E.; DeLonay, A.J.
1996-01-01
Behavioral functions essential for growth and survival can be dramatically altered by sublethal exposure to toxicants. Measures of these behavioral responses are effective in detecting adverse effects of sublethal contaminant exposure. Behavioral responses of fishes can be qualitatively and quantitatively evaluated during routine toxicity tests. At selected intervals of exposure, qualitative evaluations are accomplished through direct observations, whereas video recordings are used for quantitative evaluations. Standardized procedures for behavioral evaluation are readily applicable to different fish species and provide rapid, sensitive, and ecologically relevant assessments of sublethal exposure. The methods are readily applied to standardized test protocols.
Ultrasonics Equipped Crimp Tool: A New Technology for Aircraft Wiring Safety
NASA Technical Reports Server (NTRS)
Yost, William T.; Perey, Daniel F.; Cramer, Elliott
2006-01-01
We report on the development of a new measurement technique to quantitatively assess the condition of wire crimp connections. This ultrasonic (UT) method transmits high frequency sound waves through the joint under inspection. The wire-crimp region filters and scatters the ultrasonic energy as it passes through the crimp and wire. The resulting output (both time and frequency domains) provides a quantitative measure of the joint quality that is independent and unaffected by current. Crimps of poor mechanical and electrical quality will result in low temporal output and will distort the spectrum into unique and predictable patterns, depending on crimp "quality". This inexpensive, real-time measurement system can provide certification of crimps as they are made and recertification of existing wire crimps currently in service. The measurements for re-certification do not require that the wire be disconnected from its circuit. No other technology exists to measure in-situ the condition of wire joints (no electrical currents through the crimp are used in this analytical technique). We discuss the signals obtained from this instrument, and correlate these signals with destructive wire pull tests.
Ulgen, Ayse; Han, Zhihua; Li, Wentian
2003-12-31
We address the question of whether statistical correlations among quantitative traits lead to correlation of linkage results of these traits. Five measured quantitative traits (total cholesterol, fasting glucose, HDL cholesterol, blood pressure, and triglycerides), and one derived quantitative trait (total cholesterol divided by the HDL cholesterol) are used for phenotype correlation studies. Four of them are used for linkage analysis. We show that although correlation among phenotypes partially reflects the correlation among linkage analysis results, the LOD-score correlations are on average low. The most significant peaks found by using different traits do not often overlap. Studying covariances at specific locations in LOD scores may provide clues for further bivariate linkage analyses.
Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
Zhang, Yuxin; Holmes, James; Rabanillo, Iñaki; Guidon, Arnaud; Wells, Shane; Hernando, Diego
2018-09-01
To evaluate the reproducibility of quantitative diffusion measurements obtained with reduced Field of View (rFOV) and Multi-shot EPI (msEPI) acquisitions, using single-shot EPI (ssEPI) as a reference. Diffusion phantom experiments, and prostate diffusion-weighted imaging in healthy volunteers and patients with known or suspected prostate cancer were performed across the three different sequences. Quantitative diffusion measurements of apparent diffusion coefficient, and diffusion kurtosis parameters (healthy volunteers), were obtained and compared across diffusion sequences (rFOV, msEPI, and ssEPI). Other possible confounding factors like b-value combinations and acquisition parameters were also investigated. Both msEPI and rFOV have shown reproducible quantitative diffusion measurements relative to ssEPI; no significant difference in ADC was observed across pulse sequences in the standard diffusion phantom (p = 0.156), healthy volunteers (p ≥ 0.12) or patients (p ≥ 0.26). The ADC values within the non-cancerous central gland and peripheral zone of patients were 1.29 ± 0.17 × 10 -3 mm 2 /s and 1.74 ± 0.23 × 10 -3 mm 2 /s respectively. However, differences in quantitative diffusion parameters were observed across different number of averages for rFOV, and across b-value groups and diffusion models for all the three sequences. Both rFOV and msEPI have the potential to provide high image quality with reproducible quantitative diffusion measurements in prostate diffusion MRI. Copyright © 2018 Elsevier Inc. All rights reserved.
Fundamentals of quantitative dynamic contrast-enhanced MR imaging.
Paldino, Michael J; Barboriak, Daniel P
2009-05-01
Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.
ERIC Educational Resources Information Center
Lane, Erin S.; Harris, Sara E.
2015-01-01
The authors developed a classroom observation protocol for quantitatively measuring student engagement in large university classes. The Behavioral Engagement Related to instruction (BERI) protocol can be used to provide timely feedback to instructors as to how they can improve student engagement in their classrooms.
Measuring landscape esthetics: the scenic beauty estimation method
Terry C. Daniel; Ron S. Boster
1976-01-01
The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...
Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing
Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak
2012-01-01
This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
Advances in Imaging Approaches to Fracture Risk Evaluation
Manhard, Mary Kate; Nyman, Jeffry S.; Does, Mark D.
2016-01-01
Fragility fractures are a growing problem worldwide, and current methods for diagnosing osteoporosis do not always identify individuals who require treatment to prevent a fracture and may misidentify those not a risk. Traditionally, fracture risk is assessed using dual-energy X-ray absorptiometry, which provides measurements of areal bone mineral density (BMD) at sites prone to fracture. Recent advances in imaging show promise in adding new information that could improve the prediction of fracture risk in the clinic. As reviewed herein, advances in quantitative computed tomography (QCT) predict hip and vertebral body strength; high resolution HR-peripheral QCT (HR-pQCT) and micro-magnetic resonance imaging (μMRI) assess the micro-architecture of trabecular bone; quantitative ultrasound (QUS) measures the modulus or tissue stiffness of cortical bone; and quantitative ultra-short echo time MRI methods quantify the concentrations of bound water and pore water in cortical bone, which reflect a variety of mechanical properties of bone. Each of these technologies provides unique characteristics of bone and may improve fracture risk diagnoses and reduce prevalence of fractures by helping to guide treatment decisions. PMID:27816505
Summary of Quantitative Interpretation of Image Far Ultraviolet Auroral Data
NASA Technical Reports Server (NTRS)
Frey, H. U.; Immel, T. J.; Mende, S. B.; Gerard, J.-C.; Hubert, B.; Habraken, S.; Span, J.; Gladstone, G. R.; Bisikalo, D. V.; Shematovich, V. I.;
2002-01-01
Direct imaging of the magnetosphere by instruments on the IMAGE spacecraft is supplemented by simultaneous observations of the global aurora in three far ultraviolet (FUV) wavelength bands. The purpose of the multi-wavelength imaging is to study the global auroral particle and energy input from thc magnetosphere into the atmosphere. This paper describes provides the method for quantitative interpretation of FUV measurements. The Wide-Band Imaging Camera (WIC) provides broad band ultraviolet images of the aurora with maximum spatial and temporal resolution by imaging the nitrogen lines and bands between 140 and 180 nm wavelength. The Spectrographic Imager (SI), a dual wavelength monochromatic instrument, images both Doppler-shifted Lyman alpha emissions produced by precipitating protons, in the SI-12 channel and OI 135.6 nm emissions in the SI-13 channel. From the SI-12 Doppler shifted Lyman alpha images it is possible to obtain the precipitating proton flux provided assumptions are made regarding the mean energy of the protons. Knowledge of the proton (flux and energy) component allows the calculation of the contribution produced by protons in the WIC and SI-13 instruments. Comparison of the corrected WIC and SI-13 signals provides a measure of the electron mean energy, which can then be used to determine the electron energy fluxun-. To accomplish this reliable modeling emission modeling and instrument calibrations are required. In-flight calibration using early-type stars was used to validate the pre-flight laboratory calibrations and determine long-term trends in sensitivity. In general, very reasonable agreement is found between in-situ measurements and remote quantitative determinations.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale, have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS strictures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto- electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William
2015-09-01
In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
[Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].
Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun
2015-07-01
There are many influence factors in the precision and accuracy of the quantitative analysis with LIBS technology. According to approximately the same characteristics trend of background spectrum and characteristic spectrum along with the change of temperature through in-depth analysis, signal-to-background ratio (S/B) measurement and regression analysis could compensate the spectral line intensity changes caused by system parameters such as laser power, spectral efficiency of receiving. Because the measurement dates were limited and nonlinear, we used support vector machine (SVM) for regression algorithm. The experimental results showed that the method could improve the stability and the accuracy of quantitative analysis of LIBS, and the relative standard deviation and average relative error of test set respectively were 4.7% and 9.5%. Data fitting method based on signal-to-background ratio(S/B) is Less susceptible to matrix elements and background spectrum etc, and provides data processing reference for real-time online LIBS quantitative analysis technology.
Clinical application of a light-pen computer system for quantitative angiography
NASA Technical Reports Server (NTRS)
Alderman, E. L.
1975-01-01
The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.
Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.
Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W
2017-01-01
Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.
A QUANTITATIVE MEASURE FOR PROGRAMMED INSTRUCTION.
ERIC Educational Resources Information Center
HOLLAND, JAMES G.
IN AN ATTEMPT TO PROVIDE AN OBJECTIVE MEANS FOR IDENTIFYING THE DEGREE TO WHICH MATERIAL CAN BE TECHNICALLY TERMED "PROGRAMMED", THE SO-CALLED "BLACKOUT" TECHNIQUE HAS BEEN DEVELOPED. ALL WORDS IN A PROGRAM WHICH ARE NOT DIRECTLY NEEDED IN ORDER TO PROVIDE THE REQUIRED ANSWERS ARE COVERED WITH BLACK CRAYON, AND THIS EDITED…
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
Diffusion Lung Imaging with Hyperpolarized Gas MRI
Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Quirk, James D
2015-01-01
Lung imaging using conventional 1H MRI presents great challenges due to low density of lung tissue, lung motion and very fast lung tissue transverse relaxation (typical T2* is about 1-2 ms). MRI with hyperpolarized gases (3He and 129Xe) provides a valuable alternative due to a very strong signal originated from inhaled gas residing in the lung airspaces and relatively slow gas T2* relaxation (typical T2* is about 20-30 ms). Though in vivo human experiments should be done very fast – usually during a single breath-hold. In this review we describe the recent developments in diffusion lung MRI with hyperpolarized gases. We show that a combination of modeling results of gas diffusion in lung airspaces and diffusion measurements with variable diffusion-sensitizing gradients allows extracting quantitative information on the lung microstructure at the alveolar level. This approach, called in vivo lung morphometry, allows from a less than 15-second MRI scan, providing quantitative values and spatial distributions of the same physiological parameters as are measured by means of the “standard” invasive stereology (mean linear intercept, surface-to-volume ratio, density of alveoli, etc.). Besides, the approach makes it possible to evaluate some advanced Weibel parameters characterizing lung microstructure - average radii of alveolar sacs and ducts, as well as the depth of their alveolar sleeves. Such measurements, providing in vivo information on the integrity of pulmonary acinar airways and their changes in different diseases, are of great importance and interest to a broad range of physiologists and clinicians. We also discuss a new type of experiments that are based on the in vivo lung morphometry technique combined with quantitative CT measurements as well as with the Gradient Echo MRI measurements of hyperpolarized gas transverse relaxation in the lung airspaces. Such experiments provide additional information on the blood vessel volume fraction, specific gas volume, the length of acinar airways, and allows evaluation of lung parenchymal and non-parenchymal tissue. PMID:26676342
Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim
2016-03-14
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.
NASA Astrophysics Data System (ADS)
Liu, C.; Mcgovern, G. P.; Horita, J.
2015-12-01
Traditional isotope ratio mass spectrometry methods to measure 2H/1H and 13C/12C ratios of organic molecules only provide average isotopic values of whole molecules. During the measurement process, valuable information of position-specific isotope fractionations (PSIF) between non-equivalent H and C positions is lost, which can provide additional very useful information about the origins and history of organic molecules. Quantitative nuclear magnetic resonance (NMR) spectrometry can measure 2H and 13C PSIF of organic molecules without destruction. The 2H and 13C signals from different positions of a given molecule show up as distinctive peaks in an NMR spectrum, and their peak areas are proportional to the 2H and 13C populations at each position. Moreover, quantitative NMR can be applied to a wide variety of organic molecules. We have been developing quantitative NMR methods to determine 2H and 13C PSIF of light hydrocarbons (propane, butane and pentane), using J-Young and custom-made high-pressure NMR cells. With careful conditioning of the NMR spectrometer (e.g. tuning, shimming) and effective 1H -13C decoupling, precision of ± <10‰ (2H) and ± <1‰ (13C) can be readily attainable after several hours of acquisition. Measurement time depends on the relaxation time of interested nucleus and the total number of scans needed for high signal-to-noise ratios. Our data for commercial, pure hydrocarbon samples showed that 2H PSIF in the hydrocarbons can be larger than 60‰ and that 13C PSIF can be as large as 15‰. Comparison with theoretical calculations indicates that the PSIF patterns of some hydrocarbon samples reflect non-equilibrium processes in their productions.
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
Quantitative real-time in vivo detection of magnetic nanoparticles by their nonlinear magnetization
NASA Astrophysics Data System (ADS)
Nikitin, M. P.; Torno, M.; Chen, H.; Rosengart, A.; Nikitin, P. I.
2008-04-01
A novel method of highly sensitive quantitative detection of magnetic nanoparticles (MP) in biological tissues and blood system has been realized and tested in real time in vivo experiments. The detection method is based on nonlinear magnetic properties of MP and the related device can record a very small relative variation of nonlinear magnetic susceptibility up to 10-8 at room temperature, providing sensitivity of several nanograms of MP in 0.1ml volume. Real-time quantitative in vivo measurements of dynamics of MP concentration in blood flow have been performed. A catheter that carried the blood flow of a rat passed through the measuring device. After an MP injection, the quantity of MP in the circulating blood was continuously recorded. The method has also been used to evaluate the MP distribution between rat's organs. Its sensitivity was compared with detection of the radioactive MP based on isotope of Fe59. The comparison of magnetic and radioactive signals in the rat's blood and organ samples demonstrated similar sensitivity for both methods. However, the proposed magnetic method is much more convenient as it is safe, less expensive, and provides real-time measurements in vivo. Moreover, the sensitivity of the method can be further improved by optimization of the device geometry.
Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia
2015-11-03
Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2, and interobserver variation was 8 % with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan
A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ 2(q) = [I(q) + const.]/(kq), whereI(q) is the scattering intensity as a function of the momentum transferq;kand const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurementmore » errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.« less
A convenient method for X-ray analysis in TEM that measures mass thickness and composition
NASA Astrophysics Data System (ADS)
Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.
2018-01-01
We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm x 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high angle angular dark field (HAADF) signal to provide a thickness reference and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2012-02-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases -- uniform absorption/desorption of small biomolecules and films, in which a continuous ``slab'' model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.
Abildgaard, Johan S.; Saksvik, Per Ø.; Nielsen, Karina
2016-01-01
Organizational interventions aiming at improving employee health and wellbeing have proven to be challenging to evaluate. To analyze intervention processes two methodological approaches have widely been used: quantitative (often questionnaire data), or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed methods evaluation methodologies. PMID:27713707
Abildgaard, Johan S; Saksvik, Per Ø; Nielsen, Karina
2016-01-01
Organizational interventions aiming at improving employee health and wellbeing have proven to be challenging to evaluate. To analyze intervention processes two methodological approaches have widely been used: quantitative (often questionnaire data), or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire ( N = 285) as well as an extensive interview study ( N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed methods evaluation methodologies.
Microsatellite primers for the Pacific Northwest conifer Callitropsis nootkatensis (Cupressaceae)
Tar N. Jennings; Brian J. Knaus; Katherine Alderman; Paul E. Hennon; David V. D’Amore; Richard. Cronn
2013-01-01
Microsatellite primers were developed for Nootka cypress ( Callitropsis nootkatensis ) to provide quantitative measures for gene conservation that can assist in guiding management decisions for a species experiencing climate-induced decline.
Jarvi, Susan I.; Farias, Margaret E.M.; Howe, Kay; Jacquier, Steven; Hollingsworth, Robert; Pitt, William
2013-01-01
The life cycle of the nematode Angiostrongylus cantonensis involves rats as the definitive host and slugs and snails as intermediate hosts. Humans can become infected upon ingestion of intermediate or paratenic (passive carrier) hosts containing stage L3 A. cantonensis larvae. Here, we report a quantitative PCR (qPCR) assay that provides a reliable, relative measure of parasite load in intermediate hosts. Quantification of the levels of infection of intermediate hosts is critical for determining A. cantonensis intensity on the Island of Hawaii. The identification of high intensity infection ‘hotspots’ will allow for more effective targeted rat and slug control measures. qPCR appears more efficient and sensitive than microscopy and provides a new tool for quantification of larvae from intermediate hosts, and potentially from other sources as well. PMID:22902292
Wallace, Jack
2010-05-01
While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
Evaluation of Deblur Methods for Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, William M.
2014-03-31
Radiography is used as a primary diagnostic for dynamic experiments, providing timeresolved radiographic measurements of areal mass density along a line of sight through the experiment. It is well known that the finite spot extent of the radiographic source, as well as scattering, are sources of blurring of the radiographic images. This blurring interferes with quantitative measurement of the areal mass density. In order to improve the quantitative utility of this diagnostic, it is necessary to deblur or “restore” the radiographs to recover the “true” areal mass density from a radiographic transmission measurement. Towards this end, I am evaluating threemore » separate methods currently in use for deblurring radiographs. I begin by briefly describing the problems associated with image restoration, and outlining the three methods. Next, I illustrate how blurring affects the quantitative measurements using radiographs. I then present the results of the various deblur methods, evaluating each according to several criteria. After I have summarized the results of the evaluation, I give a detailed account of how the restoration process is actually implemented.« less
Indirect scaling methods for testing quantitative emotion theories.
Junge, Martin; Reisenzein, Rainer
2013-01-01
Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
Schmid, W; Rosland, J H; von Hofacker, S; Hunskår, I; Bruvik, F
2018-02-20
The use of music as therapy in multidisciplinary end-of-life care dates back to the 1970s and nowadays music therapy (MT) is one of the most frequently used complementary therapy in in-patient palliative care in the US. However existing research investigated music therapy's potential impact mainly from one perspective, referring to either a quantitative or qualitative paradigm. The aim of this review is to provide an overview of the users' and providers' perspectives on music therapy in palliative care within one research article. A systematic literature search was conducted using several databases supplemented with a hand-search of journals between November 1978 and December 2016. Inclusion criteria were: Music therapy with adults in palliative care conducted by a certified music therapist. Both quantitative and qualitative studies in English, German or a Scandinavian language published in peer reviewed journals were included. We aimed to identify and discuss the perspectives of both patients and health care providers on music therapy's impact in palliative care to forward a comprehensive understanding of it's effectiveness, benefits and limitations. We investigated themes mentioned by patients within qualitative studies, as well as commonly chosen outcome measures in quantitative research. A qualitative approach utilizing inductive content analysis was carried out to analyze and categorize the data. Twelve articles, reporting on nine quantitative and three qualitative research studies were included. Seven out of the nine quantitative studies investigated pain as an outcome. All of the included quantitative studies reported positive effects of the music therapy. Patients themselves associated MT with the expression of positive as well as challenging emotions and increased well-being. An overarching theme in both types of research is a psycho-physiological change through music therapy. Both quantitative as well as qualitative research showed positive changes in psycho-physiological well-being. The integration of the users´ and providers´ perspectives within future research applicable for example in mixed-methods designs is recommended.
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao
2015-01-01
Cell migration is a cellular response and results in various biological processes such as cancer metastasis, that is, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method to determine cell migration rate based on comparison of successive images may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor the cell migration activity based on the impedimetric measurement technique. A no-damage wound was constructed by microfluidic phenomenon and cell migration activity under the stimulation of cytokine and an anti-cancer drug, i.e., interleukin-6 and doxorubicin, were, respectively, investigated. Impedance measurement was concurrently performed during the cell migration process. The impedance change was directly correlated to the cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis. But the impedimetric measurement technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under the stimulation of cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J
2017-06-01
Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics that are currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. Accuracy of 3D photography for average measurements of all parameters compared with manual measurements; inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements as assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of penile parameters, 3D photography can empower researchers to better study volume-loss deformities in PD and enable clinicians to offer improved clinical assessment, communication, and documentation. This is the first study to apply 3D photography to the assessment of PD and to accurately measure the novel parameters of EPV and percent EPVL. This proof-of-concept study is limited by the lack of data in human subjects, which could present additional challenges in obtaining reliable measurements. EPV and percent EPVL are novel metrics that can be quickly, accurately, and reliably measured using computational analysis of 3D photographs and can be useful in describing non-curvature volume-loss deformities resulting from PD. Margolin EJ, Mlynarczyk CM, Muhall JP, et al. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease. J Sex Med 2017;14:829-833. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion
NASA Technical Reports Server (NTRS)
Kojima, Jun J.; Fischer, David G.
2012-01-01
We present our strategic experiment and thermochemical analyses on combustion flow using a subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has promising ability to enhance accuracy of the quantitative scalar measurements in a point-wise single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates transferable calibration standard for the routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and it impact on the subsequent combustion process in the model combustor.
Quantitative aspects of inductively coupled plasma mass spectrometry
Wagner, Barbara
2016-01-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971
NASA Astrophysics Data System (ADS)
Burn, H. E.; Wenner, J. M.; Baer, E. M.
2011-12-01
The quantitative components of introductory geoscience courses can pose significant barriers to students. Many academic departments respond by stripping courses of their quantitative components or by attaching prerequisite mathematics courses [PMC]. PMCs cause students to incur additional costs and credits and may deter enrollment in introductory courses; yet, stripping quantitative content from geoscience courses masks the data-rich, quantitative nature of geoscience. Furthermore, the diversity of math skills required in geoscience and students' difficulty with transferring mathematical knowledge across domains suggest that PMCs may be ineffective. Instead, this study explores an alternative strategy -- to remediate students' mathematical skills using online modules that provide students with opportunities to build contextual quantitative reasoning skills. The Math You Need, When You Need It [TMYN] is a set of modular online student resources that address mathematical concepts in the context of the geosciences. TMYN modules are online resources that employ a "just-in-time" approach - giving students access to skills and then immediately providing opportunities to apply them. Each module places the mathematical concept in multiple geoscience contexts. Such an approach illustrates the immediate application of a principle and provides repeated exposure to a mathematical skill, enhancing long-term retention. At the same time, placing mathematics directly in several geoscience contexts better promotes transfer of learning by using similar discourse (words, tools, representations) and context that students will encounter when applying mathematics in the future. This study uses quantitative and qualitative data to explore the effectiveness of TMYN modules in remediating students' mathematical skills. Quantitative data derive from ten geoscience courses that used TMYN modules during the fall 2010 and spring 2011 semesters; none of the courses had a PMC. In all courses, students completed a pretest, the assigned modules, and a posttest. Success in remediation was measured using normalized gain scores, which measures the change in score divided by the maximum possible increase: (posttest-pretest)/(1-pretest). To compare across courses, normalized gain scores were standardized. Additional analysis included disaggregating normalized gain scores by quartiles based on pretest scores. The results were supplemented by qualitative data from faculty interviews and information provided by faculty on a web form upon completion of the course. Results suggest TMYN modules remediate mathematical skills effectively, and that normalized gains tend to be higher for students in the lower quartiles on the pretest. Students indicate finding the modules helpful, though sometimes difficult. Faculty interview data triangulate these findings and provide further evidence that online, modularized remediation is an effective alternative to assigning prerequisite mathematical courses to remediate mathematical skills.
ERIC Educational Resources Information Center
Heffernan, Bernadette M.
1998-01-01
Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…
Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education
ERIC Educational Resources Information Center
Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen
2011-01-01
This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…
Zheng, Zhi; Luo, Yuling; McMaster, Gary K
2006-07-01
Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.
Direct Measurements of the Convective Recycling of the Upper Troposphere
NASA Technical Reports Server (NTRS)
Bertram, Timothy H.; Perring, Anne E.; Wooldridge, Paul J.; Crounse, John D.; Kwan, Alan J.; Wennberg, Paul O.; Scheuer, Eric; Dibb, Jack; Avery, Melody; Sachse, Glen;
2007-01-01
We present a statistical representation of the aggregate effects of deep convection on the chemistry and dynamics of the Upper Troposphere (UT) based on direct aircraft observations of the chemical composition of the UT over the Eastern United States and Canada during summer. These measurements provide new and unique observational constraints on the chemistry occurring downwind of convection and the rate at which air in the UT is recycled, previously only the province of model analyses. These results provide quantitative measures that can be used to evaluate global climate and chemistry models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vazehrad, S., E-mail: vazehrad@kth.se; Elfsberg, J., E-mail: jessica.elfsberg@scania.com; Diószegi, A., E-mail: attila.dioszegi@jth.hj.se
An investigation on silicon segregation of lamellar, compacted and nodular graphite iron was carried out by applying a selective, immersion color etching and a modified electron microprobe to study the microstructure. The color etched micrographs of the investigated cast irons by revealing the austenite phase have provided data about the chronology and mechanism of microstructure formation. Moreover, electron microprobe has provided two dimensional segregation maps of silicon. A good agreement was found between the segregation profile of silicon in the color etched microstructure and the silicon maps achieved by electron microprobe analysis. However, quantitative silicon investigation was found to bemore » more accurate than color etching results to study the size of the eutectic colonies. - Highlights: • Sensitivity of a color etchant to silicon segregation is quantitatively demonstrated. • Si segregation measurement by EMPA approved the results achieved by color etching. • Color etched micrographs provided data about solidification mechanism in cast irons. • Austenite grain boundaries were identified by measuring the local Si concentration.« less
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species ofmore » interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.« less
A traits-based approach for prioritizing species for monitoring and surrogacy selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species ofmore » interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.« less
Atmospheric Science Data Center
2014-05-15
... 2004. The color-coded maps (along the bottom) provide a quantitative measurement of the sunlight reflected from these surfaces, and the ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...
Arnold, Benjamin F; van der Laan, Mark J; Hubbard, Alan E; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L; Moss, Delynn M; Nutman, Thomas B; Priest, Jeffrey W; Lammie, Patrick J
2017-05-01
Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman's rho = 0.75). In both high- and low transmission settings, mean antibody curves revealed changes in population mean antibody levels that were masked by seroprevalence measures because changes took place above or below the seropositivity cutoff. Age-dependent antibody curves and summary means provided a robust and sensitive measure of changes in transmission, with greatest sensitivity among young children. The method generalizes to pathogens that can be measured in high-throughput, multiplex serological assays, and scales to surveillance activities that require high spatiotemporal resolution. Our results suggest quantitative antibody levels will be particularly useful to measure differences in exposure for pathogens that elicit a transient antibody response or for monitoring populations with very high- or very low transmission, when seroprevalence is less informative. The approach represents a new opportunity to conduct integrated serological surveillance for neglected tropical diseases, malaria, and other infectious diseases with well-defined antigen targets.
van der Laan, Mark J.; Hubbard, Alan E.; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L.; Moss, Delynn M.; Nutman, Thomas B.; Priest, Jeffrey W.; Lammie, Patrick J.
2017-01-01
Background Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. Methods/Principal findings We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman’s rho = 0.75). In both high- and low transmission settings, mean antibody curves revealed changes in population mean antibody levels that were masked by seroprevalence measures because changes took place above or below the seropositivity cutoff. Conclusions/Significance Age-dependent antibody curves and summary means provided a robust and sensitive measure of changes in transmission, with greatest sensitivity among young children. The method generalizes to pathogens that can be measured in high-throughput, multiplex serological assays, and scales to surveillance activities that require high spatiotemporal resolution. Our results suggest quantitative antibody levels will be particularly useful to measure differences in exposure for pathogens that elicit a transient antibody response or for monitoring populations with very high- or very low transmission, when seroprevalence is less informative. The approach represents a new opportunity to conduct integrated serological surveillance for neglected tropical diseases, malaria, and other infectious diseases with well-defined antigen targets. PMID:28542223
Pargett, Michael; Umulis, David M
2013-07-15
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data, however, much of the data available or even acquirable are not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R Chad; Bonifas, Andrew P; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A; Huang, Yonggang; West, Dennis P; Paller, Amy S; Alam, Murad; Yeo, Woon-Hong; Rogers, John A
2014-10-01
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of "epidermal" electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela
2014-04-22
In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time.To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service user, non-users, and policy stakeholders. In this explanatory design comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights on how and why the investigated interventions produce certain intended and unintended effects and allows for a more in-depth evaluation approach.
NASA Astrophysics Data System (ADS)
Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim
2016-03-01
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions. Electronic supplementary information (ESI) available: DNA sequences and nomenclature (Table 1S); SDS-PAGE assay of IHF stock solution (Fig. 1S); determination of the concentration of IHF stock solution by Bradford assay (Fig. 2S); equilibrium binding isotherm fitting results of other DNA sequences (Table 2S); calculation of dissociation constants (Fig. 3S, 4S; Table 2S); geometric model for quantitation of DNA bending angle induced by specific IHF binding (Fig. 4S); customized flow cell assembly (Fig. 5S); real-time measurement of average fluorophore height change by SSFM (Fig. 6S); summary of binding parameters obtained from additive isotherm model fitting (Table 3S); average surface densities of 10 dsDNA spots and bound IHF at equilibrium (Table 4S); effects of surface densities on the binding and bending of dsDNA (Tables 5S, 6S and Fig. 7S-10S). See DOI: 10.1039/c5nr06785e
Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John
2012-04-01
The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging providing high quality serial cross-section images for 3D reconstruction. Application of Euclidean distance map and ultimate eroded points image analysis methods can provide quantitative characterization of 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective to characterize the order of pigment dispersion in dried paint films.
Sweat testing to evaluate autonomic function
Illigens, Ben M.W.; Gibbons, Christopher H.
2011-01-01
Sudomotor dysfunction is one of the earliest detectable neurophysiologic abnormalities in distal small fiber neuropathy. Traditional neurophysiologic measurements of sudomotor function include thermoregulatory sweat testing (TST), quantitative sudomotor axon reflex testing (QSART), silicone impressions, the sympathetic skin response (SSR), and the recent addition of quantitative direct and indirect axon reflex testing (QDIRT). These testing techniques, when used in combination, can detect and localized pre- and postganglionic lesions, can provide early diagnosis of sudomotor dysfunction and can monitor disease progression or disease recovery. In this article, we review the common tests available for assessment of sudomotor function, detail the testing methodology, review the limitations and provide examples of test results. PMID:18989618
Goel, Utsav O; Maddox, Michael M; Elfer, Katherine N; Dorsey, Philip J; Wang, Mei; McCaslin, Ian Ross; Brown, J Quincy; Lee, Benjamin R
2014-01-01
Reduction of warm ischemia time during partial nephrectomy (PN) is critical to minimizing ischemic damage and improving postoperative kidney function, while maintaining tumor resection efficacy. Recently, methods for localizing the effects of warm ischemia to the region of the tumor via selective clamping of higher-order segmental artery branches have been shown to have superior outcomes compared with clamping the main renal artery. However, artery identification can prolong operative time and increase the blood loss and reduce the positive effects of selective ischemia. Quantitative diffuse reflectance spectroscopy (DRS) can provide a convenient, real-time means to aid in artery identification during laparoscopic PN. The feasibility of quantitative DRS for real-time longitudinal measurement of tissue perfusion and vascular oxygenation in laparoscopic nephrectomy was investigated in vivo in six Yorkshire swine kidneys (n=three animals ). DRS allowed for rapid identification of ischemic areas after selective vessel occlusion. In addition, the rates of ischemia induction and recovery were compared for main renal artery versus tertiary segmental artery occlusion, and it was found that the tertiary segmental artery occlusion trends toward faster recovery after ischemia, which suggests a potential benefit of selective ischemia. Quantitative DRS could provide a convenient and fast tool for artery identification and evaluation of the depth, spatial extent, and duration of selective tissue ischemia in laparoscopic PN.
NASA Astrophysics Data System (ADS)
Goel, Utsav O.; Maddox, Michael M.; Elfer, Katherine N.; Dorsey, Philip J.; Wang, Mei; McCaslin, Ian Ross; Brown, J. Quincy; Lee, Benjamin R.
2014-10-01
Reduction of warm ischemia time during partial nephrectomy (PN) is critical to minimizing ischemic damage and improving postoperative kidney function, while maintaining tumor resection efficacy. Recently, methods for localizing the effects of warm ischemia to the region of the tumor via selective clamping of higher-order segmental artery branches have been shown to have superior outcomes compared with clamping the main renal artery. However, artery identification can prolong operative time and increase the blood loss and reduce the positive effects of selective ischemia. Quantitative diffuse reflectance spectroscopy (DRS) can provide a convenient, real-time means to aid in artery identification during laparoscopic PN. The feasibility of quantitative DRS for real-time longitudinal measurement of tissue perfusion and vascular oxygenation in laparoscopic nephrectomy was investigated in vivo in six Yorkshire swine kidneys (n=three animals). DRS allowed for rapid identification of ischemic areas after selective vessel occlusion. In addition, the rates of ischemia induction and recovery were compared for main renal artery versus tertiary segmental artery occlusion, and it was found that the tertiary segmental artery occlusion trends toward faster recovery after ischemia, which suggests a potential benefit of selective ischemia. Quantitative DRS could provide a convenient and fast tool for artery identification and evaluation of the depth, spatial extent, and duration of selective tissue ischemia in laparoscopic PN.
Radar QPE for hydrological design: Intensity-Duration-Frequency curves
NASA Astrophysics Data System (ADS)
Marra, Francesco; Morin, Efrat
2015-04-01
Intensity-duration-frequency (IDF) curves are widely used in flood risk management since they provide an easy link between the characteristics of a rainfall event and the probability of its occurrence. They are estimated analyzing the extreme values of rainfall records, usually basing on raingauge data. This point-based approach raises two issues: first, hydrological design applications generally need IDF information for the entire catchment rather than a point, second, the representativeness of point measurements decreases with the distance from measure location, especially in regions characterized by steep climatological gradients. Weather radar, providing high resolution distributed rainfall estimates over wide areas, has the potential to overcome these issues. Two objections usually restrain this approach: (i) the short length of data records and (ii) the reliability of quantitative precipitation estimation (QPE) of the extremes. This work explores the potential use of weather radar estimates for the identification of IDF curves by means of a long length radar archive and a combined physical- and quantitative- adjustment of radar estimates. Shacham weather radar, located in the eastern Mediterranean area (Tel Aviv, Israel), archives data since 1990 providing rainfall estimates for 23 years over a region characterized by strong climatological gradients. Radar QPE is obtained correcting the effects of pointing errors, ground echoes, beam blockage, attenuation and vertical variations of reflectivity. Quantitative accuracy is then ensured with a range-dependent bias adjustment technique and reliability of radar QPE is assessed by comparison with gauge measurements. IDF curves are derived from the radar data using the annual extremes method and compared with gauge-based curves. Results from 14 study cases will be presented focusing on the effects of record length and QPE accuracy, exploring the potential application of radar IDF curves for ungauged locations and providing insights on the use of radar QPE for hydrological design studies.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
Khodamoradi, Abdolvahed; Ghaffari, Mohammad Payam; Daryabeygi-Khotbehsara, Reza; Sajadi, Haniye Sadat; Majdzadeh, Reza
2018-01-01
Informal patients' payments (IPPs) is a sensitive subject. The aim of current study was to assess the trends in informal payment studies and explore methods of IPPs measurement, prevalence, and features (payment type, volume, and receiver) in various contexts. A search strategy was developed to identify peer-reviewed articles addressing informal payments on PubMed, Science Direct, Web of Science, Scopus, and CINAHL. A total of 1252 studies were identified initially. After screening process, 38 studies were included in the systematic review. The selected studies were appraised, and findings were synthesized. Among selected studies, quantitative approaches were mostly used for measuring IPPs from general public and patients' perspective, and qualitative methods mainly targeted health care providers. Reported IPP prevalence in selected articles ranges between 2% and 80%, more prevalent in the inpatient sector than in outpatient. There are a number of strategies for the measurement of IPPs with different strengths and weaknesses. Most applied strategies for general public were quantitative surveys recruiting more than 1000 participants using a face-to-face structured interview, and then qualitative studies on less than 150 health care providers, with focus group discussion. This review provides a comprehensive picture of current informal patients' payments measurement tools, which helps researchers in future investigations. Copyright © 2017 John Wiley & Sons, Ltd.
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
ABSTRACT Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used tomore » illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu« less
Analytical scanning evanescent microwave microscope and control stage
Xiang, Xiao-Dong; Gao, Chen; Duewer, Fred; Yang, Hai Tao; Lu, Yalin
2013-01-22
A scanning evanescent microwave microscope (SEMM) that uses near-field evanescent electromagnetic waves to probe sample properties is disclosed. The SEMM is capable of high resolution imaging and quantitative measurements of the electrical properties of the sample. The SEMM has the ability to map dielectric constant, loss tangent, conductivity, electrical impedance, and other electrical parameters of materials. Such properties are then used to provide distance control over a wide range, from to microns to nanometers, over dielectric and conductive samples for a scanned evanescent microwave probe, which enable quantitative non-contact and submicron spatial resolution topographic and electrical impedance profiling of dielectric, nonlinear dielectric and conductive materials. The invention also allows quantitative estimation of microwave impedance using signals obtained by the scanned evanescent microwave probe and quasistatic approximation modeling. The SEMM can be used to measure electrical properties of both dielectric and electrically conducting materials.
Analytical scanning evanescent microwave microscope and control stage
Xiang, Xiao-Dong; Gao, Chen; Duewer, Fred; Yang, Hai Tao; Lu, Yalin
2009-06-23
A scanning evanescent microwave microscope (SEMM) that uses near-field evanescent electromagnetic waves to probe sample properties is disclosed. The SEMM is capable of high resolution imaging and quantitative measurements of the electrical properties of the sample. The SEMM has the ability to map dielectric constant, loss tangent, conductivity, electrical impedance, and other electrical parameters of materials. Such properties are then used to provide distance control over a wide range, from to microns to nanometers, over dielectric and conductive samples for a scanned evanescent microwave probe, which enable quantitative non-contact and submicron spatial resolution topographic and electrical impedance profiling of dielectric, nonlinear dielectric and conductive materials. The invention also allows quantitative estimation of microwave impedance using signals obtained by the scanned evanescent microwave probe and quasistatic approximation modeling. The SEMM can be used to measure electrical properties of both dielectric and electrically conducting materials.
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration goodness is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of natural radioactivity levels present ((40)K, (238)U and (232)Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Microstencils to generate defined, multi-species patterns of bacteria
Timm, Collin M.; Hansen, Ryan R.; Doktycz, Mitchel J.; ...
2015-11-12
Microbial communities are complex heterogeneous systems that are influenced by physical and chemical interactions with their environment, host, and community members. Techniques that facilitate the quantitative evaluation of how microscale organization influences the morphogenesis of multispecies communities could provide valuable insights into the dynamic behavior and organization of natural communities, the design of synthetic environments for multispecies culture, and the engineering of artificial consortia. In this work, we demonstrate a method for patterning microbes into simple arrangements that allow the quantitative measurement of growth dynamics as a function of their proximity to one another. The method combines parylene-based liftoff techniquesmore » with microfluidic delivery to simultaneously pattern multiple bacterial species with high viability using low-cost, customizable methods. Furthermore, quantitative measurements of bacterial growth for two competing isolates demonstrate that spatial coordination can play a critical role in multispecies growth and structure.« less
von Gunten, Lucien; D'Andrea, William J.; Bradley, Raymond S.; Huang, Yongsong
2012-01-01
High-resolution paleoclimate reconstructions are often restricted by the difficulties of sampling geologic archives in great detail and the analytical costs of processing large numbers of samples. Using sediments from Lake Braya Sø, Greenland, we introduce a new method that provides a quantitative high-resolution paleoclimate record by combining measurements of the alkenone unsaturation index () with non-destructive scanning reflectance spectroscopic measurements in the visible range (VIS-RS). The proxy-to-proxy (PTP) method exploits two distinct calibrations: the in situ calibration of to lake water temperature and the calibration of scanning VIS-RS data to down core data. Using this approach, we produced a quantitative temperature record that is longer and has 5 times higher sampling resolution than the original time series, thereby allowing detection of temperature variability in frequency bands characteristic of the AMO over the past 7,000 years. PMID:22934132
Microstencils to generate defined, multi-species patterns of bacteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timm, Collin M.; Hansen, Ryan R.; Doktycz, Mitchel J.
Microbial communities are complex heterogeneous systems that are influenced by physical and chemical interactions with their environment, host, and community members. Techniques that facilitate the quantitative evaluation of how microscale organization influences the morphogenesis of multispecies communities could provide valuable insights into the dynamic behavior and organization of natural communities, the design of synthetic environments for multispecies culture, and the engineering of artificial consortia. In this work, we demonstrate a method for patterning microbes into simple arrangements that allow the quantitative measurement of growth dynamics as a function of their proximity to one another. The method combines parylene-based liftoff techniquesmore » with microfluidic delivery to simultaneously pattern multiple bacterial species with high viability using low-cost, customizable methods. Furthermore, quantitative measurements of bacterial growth for two competing isolates demonstrate that spatial coordination can play a critical role in multispecies growth and structure.« less
Lichten, Catherine A; White, Rachel; Clark, Ivan B N; Swain, Peter S
2014-02-03
To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour.
2014-01-01
Background To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Results Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Conclusions Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour. PMID:24495318
Quantitative Metrics for Provenance in the Global Change Information System
NASA Astrophysics Data System (ADS)
Sherman, R. A.; Tipton, K.; Elamparuthy, A.
2017-12-01
The Global Change Information System (GCIS) is an open-source web-based resource to provide traceable provenance for government climate information, particularly the National Climate Assessment and other climate science reports from the U.S. Global Change Research Program. Since 2014, GCIS has been adding and updating information and linking records to make the system as complete as possible for the key reports. Our total count of records has grown to well over 20,000, but until recently there hasn't been an easy way to measure how well all those records were serving the mission of providing provenance. The GCIS team has recently established quantitative measures of whether each record has sufficient metadata and linkages to be useful for users of our featured climate reports. We will describe our metrics and show how they can be used to guide future development of GCIS and aid users of government climate data.
Quantitative fluorescence angiography for neurosurgical interventions.
Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute
2013-06-01
Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--as an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparing reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2005-01-01
Researchers from NASA Glenn Research Center s Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high pressure environments to validate computational combustion codes.
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-12-14
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
Turbulence Characterization and Control
1975-07-01
backscatter experiment, measures the returns from density fluctuations over the lower 1000 ft. of the atmosphere. Microthermal sensors measure local... microthermal sensors mentioned above provide useful data of this type. In additiona an optical experiment capable of making quantitative...analysis of these data has been completed. During some of these experimental runs, microthermal and acoustic sounder data was also col- lected by
Quantitative 3D analysis of shape dynamics of the left ventricle
NASA Astrophysics Data System (ADS)
Scowen, Barry C.; Smith, Stephen L.; Vannan, Mani A.; Arsenault, Marie
1998-07-01
There is an established link between Left Ventricular (LV) geometry and its performance. As a consequence of ischemic heart disease and the attempt to relieve myocardial tissue stress, ventricle shape begins to distort from a conical to spherical geometry with a reduction in pumping efficiency of the chamber. If untreated, premature heart failure will result. To increase the changes of successful treatment it is obviously important for the benefit of the patient to detect these abnormalities as soon as possible. It is the development of a technique to characterize and quantify the shape of the left ventricle that is described here. The system described in this paper uses a novel helix model which combines the advantages of current two dimensional (2D) quantitative measures which provide limited information, with 3D qualitative methods which provide accurate reconstructions of the LV using computationally expensive rendering schemes. A phantom object and dog ventricle (normal/abnormal) were imaged and helical models constructed. The result are encouraging with differences between normal and abnormal ventricles in both diastole and systole able to be determined. Further work entails building a library of subjects in order to determine the relationship between ventricle geometry and quantitative measurements.
Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A
2010-08-01
Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.
Cellular network entropy as the energy potential in Waddington's differentiation landscape
Banerji, Christopher R. S.; Miranda-Saavedra, Diego; Severini, Simone; Widschwendter, Martin; Enver, Tariq; Zhou, Joseph X.; Teschendorff, Andrew E.
2013-01-01
Differentiation is a key cellular process in normal tissue development that is significantly altered in cancer. Although molecular signatures characterising pluripotency and multipotency exist, there is, as yet, no single quantitative mark of a cellular sample's position in the global differentiation hierarchy. Here we adopt a systems view and consider the sample's network entropy, a measure of signaling pathway promiscuity, computable from a sample's genome-wide expression profile. We demonstrate that network entropy provides a quantitative, in-silico, readout of the average undifferentiated state of the profiled cells, recapitulating the known hierarchy of pluripotent, multipotent and differentiated cell types. Network entropy further exhibits dynamic changes in time course differentiation data, and in line with a sample's differentiation stage. In disease, network entropy predicts a higher level of cellular plasticity in cancer stem cell populations compared to ordinary cancer cells. Importantly, network entropy also allows identification of key differentiation pathways. Our results are consistent with the view that pluripotency is a statistical property defined at the cellular population level, correlating with intra-sample heterogeneity, and driven by the degree of signaling promiscuity in cells. In summary, network entropy provides a quantitative measure of a cell's undifferentiated state, defining its elevation in Waddington's landscape. PMID:24154593
Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A
2014-05-19
Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa identified healthy from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in term of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.
Guo, Sujuan; Pridham, Kevin J; Sheng, Zhi
2016-01-01
Autophagy is a catabolic process whereby cellular components are degraded to fuel cells for longer survival during stress. Hence, autophagy plays a vital role in determining cell fate and is central for homeostasis and pathogenesis of many human diseases including chronic myeloid leukemia (CML). It has been well established that autophagy is important for the leukemogenesis as well as drug resistance in CML. Thus, autophagy is an intriguing therapeutic target. However, current approaches that detect autophagy lack reliability and often fail to provide quantitative measurements. To overcome this hurdle and facilitate the development of autophagy-related therapies, we have recently developed an autophagy assay termed as the Cyto-ID fluorescence spectrophotometric assay. This method uses a cationic fluorescence dye, Cyto-ID, which specifically labels autophagic compartments and is detected by a spectrophotometer to permit a large-scale and quantitative analysis. As such, it allows rapid, reliable, and quantitative detection of autophagy and estimation of autophagy flux. In this chapter, we further provide technical details of this method and step-by-step protocols for measuring autophagy or autophagy flux in CML cell lines as well as primary hematopoietic cells.
Parker, Katherine M.; Clark, Alexander P.; Goodman, Norman C.; Glover, David K.; Holmes, Jeffrey W.
2015-01-01
Background Quantitative analysis of wall motion from three-dimensional (3D) dobutamine stress echocardiography (DSE) could provide additional diagnostic information not available from qualitative analysis. In this study we compare the effectiveness of 3D fractional shortening (3DFS), a measure of wall motion computed from 3D echocardiography (3DE), to strain and strain rate measured with sonomicrometry for detecting critical stenoses during DSE. Methods Eleven open-chest dogs underwent DSE both with and without a critical stenosis. 3DFS was measured from 3DE images acquired at peak stress. 3DFS was normalized by subtracting average 3DFS during control peak stress (Δ3DFS). Strains in the perfusion defect (PD) were measured from sonomicrometry, and PD size and location were measured with microspheres. Results A Δ3DFS abnormality indicated the presence of a critical stenosis with high sensitivity and specificity (88% and 100%, respectively), and Δ3DFS abnormality size correlated with PD size (R2=0.54). The sensitivity and specificity for Δ3DFS was similar to that for area strain (88%, 100%) and circumferential strain and strain rate (88%, 92% and 88%, 86%, respectively), while longitudinal strain and strain rate were less specific. Δ3DFS correlated significantly with both coronary flow reserve (R2=0.71) and PD size (R2=0.97), while area strain correlated with PD size only (R2=0.67), and other measures were not significantly correlated with flow reserve or PD size. Conclusion Quantitative wall motion analysis using Δ3DFS is effective for detecting critical stenoses during DSE, performing similarly to 3D strain, and provides potentially useful information on the size and location of a perfusion defect. PMID:24815588
Breach Risk Magnitude: A Quantitative Measure of Database Security.
Yasnoff, William A
2016-01-01
A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
Physiologic basis for understanding quantitative dehydration assessment.
Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N
2013-03-01
Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.
Quantitative imaging of aggregated emulsions.
Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J
2006-02-28
Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets
NASA Technical Reports Server (NTRS)
Griffin, D. W.; Yep, T. W.; Agrawal, A. K.
2005-01-01
Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2- second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive line of site measurement technique for the whole field. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet in microgravity was up to 70 percent wider than that in Earth gravity. The global jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of temporal flow evolution as the experiment undergoes change in gravity in the drop tower.
NASA Astrophysics Data System (ADS)
Yuan, Zhen; Li, Xiaoqi; Xi, Lei
2014-06-01
Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability of quantitatively imaging optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has been focused on recovering absorption coefficient only by assuming scattering coefficient a constant. An effective method for photoacoustically recovering optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method developed combines conventional PAT with the photon diffusion equation in a novel way to realize the recovery of scattering coefficient. We demonstrate the method using various objects having scattering contrast only or both absorption and scattering contrasts embedded in turbid media. The listening-to-light-scattering method described will be able to provide high resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.
Iterative optimization method for design of quantitative magnetization transfer imaging experiments.
Levesque, Ives R; Sled, John G; Pike, G Bruce
2011-09-01
Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.
Quantitative measures for redox signaling.
Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M
2016-07-01
Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration- dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.
Wikswo, J P; Prokop, A; Baudenbacher, F; Cliffel, D; Csukas, B; Velkovsky, M
2006-08-01
Systems biology, i.e. quantitative, postgenomic, postproteomic, dynamic, multiscale physiology, addresses in an integrative, quantitative manner the shockwave of genetic and proteomic information using computer models that may eventually have 10(6) dynamic variables with non-linear interactions. Historically, single biological measurements are made over minutes, suggesting the challenge of specifying 10(6) model parameters. Except for fluorescence and micro-electrode recordings, most cellular measurements have inadequate bandwidth to discern the time course of critical intracellular biochemical events. Micro-array expression profiles of thousands of genes cannot determine quantitative dynamic cellular signalling and metabolic variables. Major gaps must be bridged between the computational vision and experimental reality. The analysis of cellular signalling dynamics and control requires, first, micro- and nano-instruments that measure simultaneously multiple extracellular and intracellular variables with sufficient bandwidth; secondly, the ability to open existing internal control and signalling loops; thirdly, external BioMEMS micro-actuators that provide high bandwidth feedback and externally addressable intracellular nano-actuators; and, fourthly, real-time, closed-loop, single-cell control algorithms. The unravelling of the nested and coupled nature of cellular control loops requires simultaneous recording of multiple single-cell signatures. Externally controlled nano-actuators, needed to effect changes in the biochemical, mechanical and electrical environment both outside and inside the cell, will provide a major impetus for nanoscience.
Methods for collecting algal samples as part of the National Water-Quality Assessment Program
Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.
1993-01-01
Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop of list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.
ERIC Educational Resources Information Center
Farr, Erik P.; Quintana, Jason C.; Reynoso, Vanessa; Ruberry, Josiah D.; Shin, Wook R.; Swartz, Kevin R.
2018-01-01
Here we present a new undergraduate laboratory that will introduce the concepts of time-resolved spectroscopy and provide insight into the natural time scales on which chemical dynamics occur through direct measurement. A quantitative treatment of the acquired data will provide a deeper understanding of the role of quantum mechanics and various…
Automated Chemical Warfare Respirator Quantitative Fit Test Instrument
1985-04-01
i requisite to assessment of the level of protection provided by the respirator. Quantitative measurement of the variability of fit of the face- plec ...ACQUISITION SYSTEM CORN OIL FILlE RESERVOIR -~ HEATER CONCENTRATIOA ZAONVU~C RB EAHG1FECO PHOTOMETER- STA RT CIL RESERVOIR BOTTGM TEMPERATURE SWITCH...30 .........-.-.-... % L4* . 3.3.4 HP 3497A Control and Data Acquisition Unit lI, "%a.infraoc" i.; a box with five slots for plug-in modules pll u; a
Three-dimensional quantitative flow diagnostics
NASA Technical Reports Server (NTRS)
Miles, Richard B.; Nosenchuck, Daniel M.
1989-01-01
The principles, capabilities, and practical implementation of advanced measurement techniques for the quantitative characterization of three-dimensional flows are reviewed. Consideration is given to particle, Rayleigh, and Raman scattering; fluorescence; flow marking by H2 bubbles, photochromism, photodissociation, and vibrationally excited molecules; light-sheet volume imaging; and stereo imaging. Also discussed are stereo schlieren methods, holographic particle imaging, optical tomography, acoustic and magnetic-resonance imaging, and the display of space-filling data. Extensive diagrams, graphs, photographs, sample images, and tables of numerical data are provided.
Using ultrasound to quantify tongue shape and movement characteristics.
Zharkova, Natalia
2013-01-01
Objective : Previous experimental studies have demonstrated abnormal lingual articulatory patterns characterizing cleft palate speech. Most articulatory information to date has been collected using electropalatography, which records the location and size of tongue-palate contact but not the tongue shape. The latter type of data can be provided by ultrasound. The present paper aims to describe ultrasound tongue imaging as a potential tool for quantitative analysis of tongue function in speakers with cleft palate. A description of the ultrasound technique as applied to analyzing tongue movements is given, followed by the requirements for quantitative analysis. Several measures are described, and example calculations are provided. Measures : Two measures aim to quantify overuse of tongue dorsum in cleft palate articulations. Crucially for potential clinical applications, these measures do not require head-to-transducer stabilization because both are based on a single tongue curve. The other three measures compare sets of tongue curves, with the aim to quantify the dynamics of tongue displacement, token-to-token variability in tongue position, and the extent of separation between tongue curves for different speech sounds. Conclusions : All measures can be used to compare tongue function in speakers with cleft palate before and after therapy, as well as to assess their performance against that in typical speakers and to help in selecting more effective treatments.
Optical Fourier filtering for whole lens assessment of progressive power lenses.
Spiers, T; Hull, C C
2000-07-01
Four binary filter designs for use in an optical Fourier filtering set-up were evaluated when taking quantitative measurements and when qualitatively mapping the power variation of progressive power lenses (PPLs). The binary filters tested were concentric ring, linear grating, grid and "chevron" designs. The chevron filter was considered best for quantitative measurements since it permitted a vernier acuity task to be used for measuring the fringe spacing, significantly reducing errors, and it also gave information on the polarity of the lens power. The linear grating filter was considered best for qualitatively evaluating the power variation. Optical Fourier filtering and a Nidek automatic focimeter were then used to measure the powers in the distance and near portions of five PPLs of differing design. Mean measurement error was 0.04 D with a maximum value of 0.13 D. Good qualitative agreement was found between the iso-cylinder plots provided by the manufacturer and the Fourier filter fringe patterns for the PPLs indicating that optical Fourier filtering provides the ability to map the power distribution across the entire lens aperture without the need for multiple point measurements. Arguments are presented that demonstrate that it should be possible to derive both iso-sphere and iso-cylinder plots from the binary filter patterns.
THE USE AND LIMITATIONS OF DETECTION AND QUANTITATION LIMITS IN ENVIRONMENTAL ANALYSIS
Site assessment, remediation and compliance monitoring require the routine determination of the concentration of regulated substances in environmental samples. Each measurement methodology providing the concentration determinations, is required to specify key data quality elemen...
Comparison of numerical model simulations and SFO wake vortex windline measurements
DOT National Transportation Integrated Search
2003-06-23
To provide quantitative support for the Simultaneous Offset Instrument Approach (SOIA) procedure, an extensive data collection effort was undertaken at San Francisco International Airport by the Federal Aviation Administration (FAA, U.S. Dept. of Tra...
Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations
NASA Astrophysics Data System (ADS)
Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara
2017-11-01
Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration that is very similar to that used by Marie and Pierre Currie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the 220Rn half-life, collected from the emanation of the lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activity for nuclear physics.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Abstract. Recently, a class of no-gold-standard (NGS) techniques have been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
NASA Astrophysics Data System (ADS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-05-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
NASA Technical Reports Server (NTRS)
Davis, Brian; Turner, Travis L.; Seelecke, Stefan
2005-01-01
Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.
Current methods and advances in bone densitometry
NASA Technical Reports Server (NTRS)
Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.
1995-01-01
Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L
Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured accelerationmore » factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and the analysis related to the application of those. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.« less
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data, have lead to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
Quantitative MRI in refractory temporal lobe epilepsy: relationship with surgical outcomes
Bonilha, Leonardo
2015-01-01
Medically intractable temporal lobe epilepsy (TLE) remains a serious health problem. Across treatment centers, up to 40% of patients with TLE will continue to experience persistent postoperative seizures at 2-year follow-up. It is unknown why such a large number of patients continue to experience seizures despite being suitable candidates for resective surgery. Preoperative quantitative MRI techniques may provide useful information on why some patients continue to experience disabling seizures, and may have the potential to develop prognostic markers of surgical outcome. In this article, we provide an overview of how quantitative MRI morphometric and diffusion tensor imaging (DTI) data have improved the understanding of brain structural alterations in patients with refractory TLE. We subsequently review the studies that have applied quantitative structural imaging techniques to identify the neuroanatomical factors that are most strongly related to a poor postoperative prognosis. In summary, quantitative imaging studies strongly suggest that TLE is a disorder affecting a network of neurobiological systems, characterized by multiple and inter-related limbic and extra-limbic network abnormalities. The relationship between brain alterations and postoperative outcome are less consistent, but there is emerging evidence suggesting that seizures are less likely to remit with surgery when presurgical abnormalities are observed in the connectivity supporting brain regions serving as network nodes located outside the resected temporal lobe. Future work, possibly harnessing the potential from multimodal imaging approaches, may further elucidate the etiology of persistent postoperative seizures in patients with refractory TLE. Furthermore, quantitative imaging techniques may be explored to provide individualized measures of postoperative seizure freedom outcome. PMID:25853080
Zhao, Dan; Liu, Wei; Cai, Ailu; Li, Jingyu; Chen, Lizhu; Wang, Bing
2013-02-01
The purpose of this study was to investigate the effectiveness for quantitative evaluation of cerebellar vermis using three-dimensional (3D) ultrasound and to establish a nomogram for Chinese fetal vermis measurements during gestation. Sonographic examinations were performed in normal fetuses and in cases suspected of the diagnosis of vermian rotation. 3D median planes were obtained with both OMNIVIEW and tomographic ultrasound imaging. Measurements of the cerebellar vermis were highly correlated between two-dimensional and 3D median planes. The diameter of the cerebellar vermis follows growth approximately predicted by the quadratic regression equation. The normal vermis was almost parallel to the brain stem, with the average angle degree to be <2° in normal fetuses. The average angle degree of the 9 cases of vermian rotation was >5°. Three-dimensional median planes are obtained more easily than two-dimensional ones, and allow accurate measurements of the cerebellar vermis. The 3D approach may enable rapid assessment of fetal cerebral anatomy in standard examination. Measurements of cerebellar vermis may provide a quantitative index for prenatal diagnosis of posterior fossa malformations. © 2012 John Wiley & Sons, Ltd.
Quantitative polarized Raman spectroscopy in highly turbid bone tissue
NASA Astrophysics Data System (ADS)
Raghavan, Mekhala; Sahar, Nadder D.; Wilson, Robert H.; Mycek, Mary-Ann; Pleshko, Nancy; Kohn, David H.; Morris, Michael D.
2010-05-01
Polarized Raman spectroscopy allows measurement of molecular orientation and composition and is widely used in the study of polymer systems. Here, we extend the technique to the extraction of quantitative orientation information from bone tissue, which is optically thick and highly turbid. We discuss multiple scattering effects in tissue and show that repeated measurements using a series of objectives of differing numerical apertures can be employed to assess the contributions of sample turbidity and depth of field on polarized Raman measurements. A high numerical aperture objective minimizes the systematic errors introduced by multiple scattering. We test and validate the use of polarized Raman spectroscopy using wild-type and genetically modified (oim/oim model of osteogenesis imperfecta) murine bones. Mineral orientation distribution functions show that mineral crystallites are not as well aligned (p<0.05) in oim/oim bones (28+/-3 deg) compared to wild-type bones (22+/-3 deg), in agreement with small-angle X-ray scattering results. In wild-type mice, backbone carbonyl orientation is 76+/-2 deg and in oim/oim mice, it is 72+/-4 deg (p>0.05). We provide evidence that simultaneous quantitative measurements of mineral and collagen orientations on intact bone specimens are possible using polarized Raman spectroscopy.
Quantitative polarized Raman spectroscopy in highly turbid bone tissue.
Raghavan, Mekhala; Sahar, Nadder D; Wilson, Robert H; Mycek, Mary-Ann; Pleshko, Nancy; Kohn, David H; Morris, Michael D
2010-01-01
Polarized Raman spectroscopy allows measurement of molecular orientation and composition and is widely used in the study of polymer systems. Here, we extend the technique to the extraction of quantitative orientation information from bone tissue, which is optically thick and highly turbid. We discuss multiple scattering effects in tissue and show that repeated measurements using a series of objectives of differing numerical apertures can be employed to assess the contributions of sample turbidity and depth of field on polarized Raman measurements. A high numerical aperture objective minimizes the systematic errors introduced by multiple scattering. We test and validate the use of polarized Raman spectroscopy using wild-type and genetically modified (oim/oim model of osteogenesis imperfecta) murine bones. Mineral orientation distribution functions show that mineral crystallites are not as well aligned (p<0.05) in oim/oim bones (28+/-3 deg) compared to wild-type bones (22+/-3 deg), in agreement with small-angle X-ray scattering results. In wild-type mice, backbone carbonyl orientation is 76+/-2 deg and in oim/oim mice, it is 72+/-4 deg (p>0.05). We provide evidence that simultaneous quantitative measurements of mineral and collagen orientations on intact bone specimens are possible using polarized Raman spectroscopy.
Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.
Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann
2018-06-01
The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care ( n = 24), submarine navigation ( n = 12), and high school problem-solving ( n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James
2017-01-01
Summary Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9 %. The methods, sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with fitting value R2 of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distribution of monomeric and oligomeric virions. However, the monomer fraction of cell grown virus has greater size variety. Notably, β-propialactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation analysis by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR) and the limitations of each assay were discussed. PMID:23916678
Evans, Maggie; Gregory, Alison; Feder, Gene; Howarth, Emma; Hegarty, Kelsey
2016-01-01
This article explores the challenges of providing a quantitative measure of domestic violence and abuse (DVA), illustrated by the Composite Abuse Scale, a validated multidimensional measure of frequency and severity of abuse, used worldwide for prevalence studies and intervention trials. Cognitive "think-aloud" and qualitative interviewing with a sample of women who had experienced DVA revealed a tendency toward underreporting their experience of abuse, particularly of coercive control, threatening behavior, restrictions to freedom, and sexual abuse. Underreporting was linked to inconsistency and uncertainty in item interpretation and response, fear of answering truthfully, and unwillingness to identify with certain forms of abuse. Suggestions are made for rewording or reconceptualizing items and the inclusion of a distress scale to measure the individual impact of abuse. The importance of including qualitative methods in questionnaire design and in the interpretation of quantitative findings is highlighted.
Roberts, P
1999-07-01
The political climate of health care provision and education for health care in the latter years of the 20th century is evolving from the uncertainty of newly created markets to a more clearly focused culture of collaboration, dissemination of good practice, with an increased emphasis on quality provision and its measurement. The need for provider units to prove and improve efficiency and effectiveness through evidence-based quality strategies in order to stay firmly in the market place has never been more necessary. The measurement of customer expectations and perceptions of delivered service quality is widely utilized as a basis for customer retention and business growth in both commercial and non-profit organizations. This paper describes the methodological development of NEdSERV--quantitative instrumentation designed to measure and respond to ongoing stakeholder expectations and perceptions of delivered service quality within nurse education.
NASA Astrophysics Data System (ADS)
Zhao, H.; Zhang, S.
2008-01-01
One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.
Enabling Interactive Measurements from Large Coverage Microscopy
Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary
2017-01-01
Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. With the cells changing states over time and being several orders of magnitude smaller than cell products, modern microscopes are already capable of imaging large spatial areas, repeat imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, required computations, and lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600
Bohren, Meghan A; Vogel, Joshua P; Hunter, Erin C; Lutsiv, Olha; Makh, Suprita K; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T; Khosla, Rajat; Hindin, Michelle J; Gülmezoglu, A Metin
2015-06-01
Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions.
Leading for the long haul: a mixed-method evaluation of the Sustainment Leadership Scale (SLS).
Ehrhart, Mark G; Torres, Elisa M; Green, Amy E; Trott, Elise M; Willging, Cathleen E; Moullin, Joanna C; Aarons, Gregory A
2018-01-19
Despite our progress in understanding the organizational context for implementation and specifically the role of leadership in implementation, its role in sustainment has received little attention. This paper took a mixed-method approach to examine leadership during the sustainment phase of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Utilizing the Implementation Leadership Scale as a foundation, we sought to develop a short, practical measure of sustainment leadership that can be used for both applied and research purposes. Data for this study were collected as a part of a larger mixed-method study of evidence-based intervention, SafeCare®, sustainment. Quantitative data were collected from 157 providers using web-based surveys. Confirmatory factor analysis was used to examine the factor structure of the Sustainment Leadership Scale (SLS). Qualitative data were collected from 95 providers who participated in one of 15 focus groups. A framework approach guided qualitative data analysis. Mixed-method integration was also utilized to examine convergence of quantitative and qualitative findings. Confirmatory factor analysis supported the a priori higher order factor structure of the SLS with subscales indicating a single higher order sustainment leadership factor. The SLS demonstrated excellent internal consistency reliability. Qualitative analyses offered support for the dimensions of sustainment leadership captured by the quantitative measure, in addition to uncovering a fifth possible factor, available leadership. This study found qualitative and quantitative support for the pragmatic SLS measure. The SLS can be used for assessing leadership of first-level leaders to understand how staff perceive leadership during sustainment and to suggest areas where leaders could direct more attention in order to increase the likelihood that EBIs are institutionalized into the normal functioning of the organization.
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C
2016-07-21
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods
NASA Astrophysics Data System (ADS)
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.
2016-07-01
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
Quantitative tomographic imaging of intermolecular FRET in small animals
Venugopal, Vivek; Chen, Jin; Barroso, Margarida; Intes, Xavier
2012-01-01
Forster resonance energy transfer (FRET) is a nonradiative transfer of energy between two fluorescent molecules (a donor and an acceptor) in nanometer range proximity. FRET imaging methods have been applied to proteomic studies and drug discovery applications based on intermolecular FRET efficiency measurements and stoichiometric measurements of FRET interaction as quantitative parameters of interest. Importantly, FRET provides information about biomolecular interactions at a molecular level, well beyond the diffraction limits of standard microscopy techniques. The application of FRET to small animal imaging will allow biomedical researchers to investigate physiological processes occurring at nanometer range in vivo as well as in situ. In this work a new method for the quantitative reconstruction of FRET measurements in small animals, incorporating a full-field tomographic acquisition system with a Monte Carlo based hierarchical reconstruction scheme, is described and validated in murine models. Our main objective is to estimate the relative concentration of two forms of donor species, i.e., a donor molecule involved in FRETing to an acceptor close by and a nonFRETing donor molecule. PMID:23243567
Process modeling KC-135 aircraft
NASA Technical Reports Server (NTRS)
Workman, Gary L.
1991-01-01
Instrumentation will be provided for KC-135 aircraft which will provide a quantitative measure of g-level variation during parabolic flights and its effect on experiments which demonstrate differences in results obtained with differences in convective flow. The flight apparatus will provide video recording of the effects of the g-level variations on varying fluid samples. The apparatus will be constructed to be available to fly on the KC-135 during most missions.
NecroQuant: quantitative assessment of radiological necrosis
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay
2017-11-01
Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipelines. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although, the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
Geiss, S; Einax, J W
2001-07-01
Detection limit, reporting limit and limit of quantitation are analytical parameters which describe the power of analytical methods. These parameters are used for internal quality assurance and externally for competing, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to create an order for the variety of terms, this paper is aimed at providing a practical proposal for answering the main questions for the analysts concerning quality measures above. These main questions and related parameters were explained and graphically demonstrated. Estimation and verification of these parameters are the two steps to get real measures. A rule for a practical verification is given in a table, where the analyst can read out what to measure, what to estimate and which criteria have to be fulfilled. In this manner verified parameters detection limit, reporting limit and limit of quantitation now are comparable and the analyst himself is responsible to the unambiguity and reliability of these measures.
Quantitative interpretation of Great Lakes remote sensing data
NASA Technical Reports Server (NTRS)
Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.
1980-01-01
The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.
Left seat command or leadership flight, leadership training and research at North Central Airlines
NASA Technical Reports Server (NTRS)
Foster, G. C.; Garvey, M. C.
1980-01-01
The need for flight leadership training for flight deck crewmembers is addressed. A management grid is also described which provides a quantitative management language against which any number of management behaviors can be measured.
Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J
2016-01-01
To assess the relationship between quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For quantitative measures of disc height and signal intensity a "raw" score and 2 adjusted ratios were calculated and the relationship with Pfirrmann scores was assessed. The inter-tester reliability of quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height only, Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.
ERIC Educational Resources Information Center
Constantino, John N.; Frazier, Thomas W.
2013-01-01
In their analysis of the accumulated data from the clinically ascertained Simons Simplex Collection (SSC), Hus et al. (2013) provide a large-scale clinical replication of previously reported associations (see Constantino, Hudziak & Todd, 2003) between quantitative autistic traits [as measured by the Social Responsiveness Scale (SRS)] and…
Yoshimitsu, Kengo; Shinagawa, Yoshinobu; Mitsufuji, Toshimichi; Mutoh, Emi; Urakawa, Hiroshi; Sakamoto, Keiko; Fujimitsu, Ritsuko; Takano, Koichi
2017-01-10
To elucidate whether any differences are present in the stiffness map obtained with a multiscale direct inversion algorithm (MSDI) vs that with a multimodel direct inversion algorithm (MMDI), both qualitatively and quantitatively. The MR elastography (MRE) data of 37 consecutive patients who underwent liver MR elastography between September and October 2014 were retrospectively analyzed by using both MSDI and MMDI. Two radiologists qualitatively assessed the stiffness maps for the image quality in consensus, and the measured liver stiffness and measurable areas were quantitatively compared between MSDI and MMDI. MMDI provided a stiffness map of better image quality, with comparable or slightly less artifacts. Measurable areas by MMDI (43.7 ± 17.8 cm 2 ) was larger than that by MSDI (37.5 ± 14.7 cm 2 ) (P < 0.05). Liver stiffness measured by MMDI (4.51 ± 2.32 kPa) was slightly (7%), but significantly less than that by MSDI (4.86 ± 2.44 kPa) (P < 0.05). MMDI can provide stiffness map of better image quality, and slightly lower stiffness values as compared to MSDI at 3T MRE, which radiologists should be aware of.
Connecting the Kinetics and Energy Landscape of tRNA Translocation on the Ribosome
Whitford, Paul C.; Blanchard, Scott C.; Cate, Jamie H. D.; Sanbonmatsu, Karissa Y.
2013-01-01
Functional rearrangements in biomolecular assemblies result from diffusion across an underlying energy landscape. While bulk kinetic measurements rely on discrete state-like approximations to the energy landscape, single-molecule methods can project the free energy onto specific coordinates. With measures of the diffusion, one may establish a quantitative bridge between state-like kinetic measurements and the continuous energy landscape. We used an all-atom molecular dynamics simulation of the 70S ribosome (2.1 million atoms; 1.3 microseconds) to provide this bridge for specific conformational events associated with the process of tRNA translocation. Starting from a pre-translocation configuration, we identified sets of residues that collectively undergo rotary rearrangements implicated in ribosome function. Estimates of the diffusion coefficients along these collective coordinates for translocation were then used to interconvert between experimental rates and measures of the energy landscape. This analysis, in conjunction with previously reported experimental rates of translocation, provides an upper-bound estimate of the free-energy barriers associated with translocation. While this analysis was performed for a particular kinetic scheme of translocation, the quantitative framework is general and may be applied to energetic and kinetic descriptions that include any number of intermediates and transition states. PMID:23555233
Connecting the kinetics and energy landscape of tRNA translocation on the ribosome.
Whitford, Paul C; Blanchard, Scott C; Cate, Jamie H D; Sanbonmatsu, Karissa Y
2013-01-01
Functional rearrangements in biomolecular assemblies result from diffusion across an underlying energy landscape. While bulk kinetic measurements rely on discrete state-like approximations to the energy landscape, single-molecule methods can project the free energy onto specific coordinates. With measures of the diffusion, one may establish a quantitative bridge between state-like kinetic measurements and the continuous energy landscape. We used an all-atom molecular dynamics simulation of the 70S ribosome (2.1 million atoms; 1.3 microseconds) to provide this bridge for specific conformational events associated with the process of tRNA translocation. Starting from a pre-translocation configuration, we identified sets of residues that collectively undergo rotary rearrangements implicated in ribosome function. Estimates of the diffusion coefficients along these collective coordinates for translocation were then used to interconvert between experimental rates and measures of the energy landscape. This analysis, in conjunction with previously reported experimental rates of translocation, provides an upper-bound estimate of the free-energy barriers associated with translocation. While this analysis was performed for a particular kinetic scheme of translocation, the quantitative framework is general and may be applied to energetic and kinetic descriptions that include any number of intermediates and transition states.
Andreani, Carla; Romanelli, Giovanni; Senesi, Roberto
2016-06-16
This study presents the first direct and quantitative measurement of the nuclear momentum distribution anisotropy and the quantum kinetic energy tensor in stable and metastable (supercooled) water near its triple point, using deep inelastic neutron scattering (DINS). From the experimental spectra, accurate line shapes of the hydrogen momentum distributions are derived using an anisotropic Gaussian and a model-independent framework. The experimental results, benchmarked with those obtained for the solid phase, provide the state of the art directional values of the hydrogen mean kinetic energy in metastable water. The determinations of the direction kinetic energies in the supercooled phase, provide accurate and quantitative measurements of these dynamical observables in metastable and stable phases, that is, key insight in the physical mechanisms of the hydrogen quantum state in both disordered and polycrystalline systems. The remarkable findings of this study establish novel insight into further expand the capacity and accuracy of DINS investigations of the nuclear quantum effects in water and represent reference experimental values for theoretical investigations.
Slade, Jeffrey W.; Adams, Jean V.; Christie, Gavin C.; Cuddy, Douglas W.; Fodale, Michael F.; Heinrich, John W.; Quinlan, Henry R.; Weise, Jerry G.; Weisser, John W.; Young, Robert J.
2003-01-01
Before 1995, Great Lakes streams were selected for lampricide treatment based primarily on qualitative measures of the relative abundance of larval sea lampreys, Petromyzon marinus. New integrated pest management approaches required standardized quantitative measures of sea lamprey. This paper evaluates historical larval assessment techniques and data and describes how new standardized methods for estimating abundance of larval and metamorphosed sea lampreys were developed and implemented. These new methods have been used to estimate larval and metamorphosed sea lamprey abundance in about 100 Great Lakes streams annually and to rank them for lampricide treatment since 1995. Implementation of these methods has provided a quantitative means of selecting streams for treatment based on treatment cost and estimated production of metamorphosed sea lampreys, provided managers with a tool to estimate potential recruitment of sea lampreys to the Great Lakes and the ability to measure the potential consequences of not treating streams, resulting in a more justifiable allocation of resources. The empirical data produced can also be used to simulate the impacts of various control scenarios.
Towards quantitative assessment of calciphylaxis
NASA Astrophysics Data System (ADS)
Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent
2014-03-01
Calciphylaxis is a rare disease that has devastating conditions associated with high morbidity and mortality. Calciphylaxis is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which includes a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions using different equipment and photographers cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free hand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variations yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will impact a better understanding of this rare but fatal disease.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, Thomas A.; Johnson, Timothy J.; Tonkyn, Russell G.
Infrared integrating sphere measurements of solid samples are important in providing reference data for contact, standoff and remote sensing applications. At the Pacific Northwest National Laboratory (PNNL) we have developed protocols to measure both the directional-hemispherical ( and diffuse (d) reflectances of powders, liquids, and disks of powders and solid materials using a commercially available, matte gold-coated integrating sphere and Fourier transform infrared spectrometer. Detailed descriptions of the sphere alignment and its use for making these reflectance measurements are given. Diffuse reflectance values were found to be dependent on the bidirectional reflection distribution function (BRDF) of the sample and themore » solid angle intercepted by the sphere’s specular exclusion port. To determine how well the sphere and protocols produce quantitative reflectance data, measurements were made of three diffuse and two specular standards prepared by the National institute of Standards and Technology (NIST, USA), LabSphere Infragold and Spectralon standards, hand-loaded sulfur and talc powder samples, and water. The five NIST standards behaved as expected: the three diffuse standards had a high degree of “diffuseness,” d/ = D > 0.9, whereas the two specular standards had D ≤ 0.03. The average absolute differences between the NIST and PNNL measurements of the NIST standards for both directional-hemispherical and diffuse reflectances are on the order of 0.01 reflectance units. Other quantitative differences between the PNNL-measured and calibration (where available) or literature reflectance values for these standards and materials are given and the possible origins of discrepancies are discussed. Random uncertainties and estimates of systematic uncertainties are presented. Corrections necessary to provide better agreement between the PNNL reflectance values as measured for the NIST standards and the NIST reflectance values for these same standards are also discussed.« less
Lancione, Marta; Tosetti, Michela; Donatelli, Graziella; Cosottini, Mirco; Costagli, Mauro
2017-11-01
The aim of this work was to assess the impact of tissue structural orientation on quantitative susceptibility mapping (QSM) reliability, and to provide a criterion to identify voxels in which measures of magnetic susceptibility (χ) are most affected by spatial orientation effects. Four healthy volunteers underwent 7-T magnetic resonance imaging (MRI). Multi-echo, gradient-echo sequences were used to obtain quantitative maps of frequency shift (FS) and χ. Information from diffusion tensor imaging (DTI) was used to investigate the relationship between tissue orientation and FS measures and QSM. After sorting voxels on the basis of their fractional anisotropy (FA), the variations in FS and χ values over tissue orientation were measured. Using a K-means clustering algorithm, voxels were separated into two groups depending on the variability of measures within each FA interval. The consistency of FS and QSM values, observed at low FA, was disrupted for FA > 0.6. The standard deviation of χ measured at high FA (0.0103 ppm) was nearly five times that at low FA (0.0022 ppm). This result was consistent through data across different head positions and for different brain regions considered separately, which confirmed that such behavior does not depend on structures with different bulk susceptibility oriented along particular angles. The reliability of single-orientation QSM anticorrelates with local FA. QSM provides replicable values with little variability in brain regions with FA < 0.6, but QSM should be interpreted cautiously in major and coherent fiber bundles, which are strongly affected by structural anisotropy and magnetic susceptibility anisotropy. Copyright © 2017 John Wiley & Sons, Ltd.
Dynamic feature analysis of vector-based images for neuropsychological testing
NASA Astrophysics Data System (ADS)
Smith, Stephen L.; Cervantes, Basilio R.
1998-07-01
The dynamic properties of human motor activities, such as those observed in the course of drawing simple geometric shapes, are considerably more complex and often more informative than the goal to be achieved; in this case a static line drawing. This paper demonstrates how these dynamic properties may be used to provide a means of assessing a patient's visuo-spatial ability -- an important component of neuropsychological testing. The work described here provides a quantitative assessment of visuo-spatial ability, whilst preserving the conventional test environment. Results will be presented for a clinical population of long-term haemodialysis patients and test population comprises three groups of children (1) 7-8 years, (2) 9-10 years and (3) 11-12 years, all of which have no known neurological dysfunction. Ten new dynamic measurements extracted from patient responses in conjunction with one static feature deduced from earlier work describe a patient's visuo-spatial ability in a quantitative manner with sensitivity not previously attainable. The dynamic feature measurements in isolation provide a unique means of tracking a patient's approach to motor activities and could prove useful in monitoring a child' visuo-motor development.
Contextual Fraction as a Measure of Contextuality.
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-04
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programing; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
Contextual Fraction as a Measure of Contextuality
NASA Astrophysics Data System (ADS)
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-01
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programing; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determine quantitative grafting densities are assessed. Based on a brief introduction into the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and-importantly-examine the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided via carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for determination of grafting density, in particular, on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Elliott, Jonathan T.; Diop, Mamadou; Tichauer, Kenneth M.; Lee, Ting-Yim; Lawrence, Keith St.
2010-05-01
Nearly half a million children and young adults are affected by traumatic brain injury each year in the United States. Although adequate cerebral blood flow (CBF) is essential to recovery, complications that disrupt blood flow to the brain and exacerbate neurological injury often go undetected because no adequate bedside measure of CBF exists. In this study we validate a depth-resolved, near-infrared spectroscopy (NIRS) technique that provides quantitative CBF measurement despite significant signal contamination from skull and scalp tissue. The respiration rates of eight anesthetized pigs (weight: 16.2+/-0.5 kg, age: 1 to 2 months old) are modulated to achieve a range of CBF levels. Concomitant CBF measurements are performed with NIRS and CT perfusion. A significant correlation between CBF measurements from the two techniques is demonstrated (r2=0.714, slope=0.92, p<0.001), and the bias between the two techniques is -2.83 mL.min-1.100 g-1 (CI0.95: -19.63 mL.min-1.100 g-1-13.9 mL.min-1.100 g-1). This study demonstrates that accurate measurements of CBF can be achieved with depth-resolved NIRS despite significant signal contamination from scalp and skull. The ability to measure CBF at the bedside provides a means of detecting, and thereby preventing, secondary ischemia during neurointensive care.
Kazerooni, Ella A.; Lynch, David A.; Liu, Lyrica X.; Murray, Susan; Curtis, Jeffrey L.; Criner, Gerard J.; Kim, Victor; Bowler, Russell P.; Hanania, Nicola A.; Anzueto, Antonio R.; Make, Barry J.; Hokanson, John E.; Crapo, James D.; Silverman, Edwin K.; Martinez, Fernando J.; Washko, George R.
2011-01-01
Purpose: To test the hypothesis—given the increasing emphasis on quantitative computed tomographic (CT) phenotypes of chronic obstructive pulmonary disease (COPD)—that a relationship exists between COPD exacerbation frequency and quantitative CT measures of emphysema and airway disease. Materials and Methods: This research protocol was approved by the institutional review board of each participating institution, and all participants provided written informed consent. One thousand two subjects who were enrolled in the COPDGene Study and met the GOLD (Global Initiative for Chronic Obstructive Lung Disease) criteria for COPD with quantitative CT analysis were included. Total lung emphysema percentage was measured by using the attenuation mask technique with a −950-HU threshold. An automated program measured the mean wall thickness and mean wall area percentage in six segmental bronchi. The frequency of COPD exacerbation in the prior year was determined by using a questionnaire. Statistical analysis was performed to examine the relationship of exacerbation frequency with lung function and quantitative CT measurements. Results: In a multivariate analysis adjusted for lung function, bronchial wall thickness and total lung emphysema percentage were associated with COPD exacerbation frequency. Each 1-mm increase in bronchial wall thickness was associated with a 1.84-fold increase in annual exacerbation rate (P = .004). For patients with 35% or greater total emphysema, each 5% increase in emphysema was associated with a 1.18-fold increase in this rate (P = .047). Conclusion: Greater lung emphysema and airway wall thickness were associated with COPD exacerbations, independent of the severity of airflow obstruction. Quantitative CT can help identify subgroups of patients with COPD who experience exacerbations for targeted research and therapy development for individual phenotypes. © RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110173/-/DC1 PMID:21788524
Nijran, Kuldip S; Houston, Alex S; Fleming, John S; Jarritt, Peter H; Heikkinen, Jari O; Skrypniuk, John V
2014-07-01
In this second UK audit of quantitative parameters obtained from renography, phantom simulations were used in cases in which the 'true' values could be estimated, allowing the accuracy of the parameters measured to be assessed. A renal physical phantom was used to generate a set of three phantom simulations (six kidney functions) acquired on three different gamma camera systems. A total of nine phantom simulations and three real patient studies were distributed to UK hospitals participating in the audit. Centres were asked to provide results for the following parameters: relative function and time-to-peak (whole kidney and cortical region). As with previous audits, a questionnaire collated information on methodology. Errors were assessed as the root mean square deviation from the true value. Sixty-one centres responded to the audit, with some hospitals providing multiple sets of results. Twenty-one centres provided a complete set of parameter measurements. Relative function and time-to-peak showed a reasonable degree of accuracy and precision in most UK centres. The overall average root mean squared deviation of the results for (i) the time-to-peak measurement for the whole kidney and (ii) the relative function measurement from the true value was 7.7 and 4.5%, respectively. These results showed a measure of consistency in the relative function and time-to-peak that was similar to the results reported in a previous renogram audit by our group. Analysis of audit data suggests a reasonable degree of accuracy in the quantification of renography function using relative function and time-to-peak measurements. However, it is reasonable to conclude that the objectives of the audit could not be fully realized because of the limitations of the mechanical phantom in providing true values for renal parameters.
ERIC Educational Resources Information Center
Luyt, Russell
2012-01-01
A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…
Sabu, Thomas K.; Shiju, Raj T.
2010-01-01
The present study provides data to decide on the most appropriate method for sampling of ground-dwelling arthropods measured in a moist-deciduous forest in the Western Ghats in South India. The abundance of ground-dwelling arthropods was compared among large numbers of samples obtained using pitfall trapping, Berlese and Winkler extraction methods. Highest abundance and frequency of most of the represented taxa indicated pitfall trapping as the ideal method for sampling of ground-dwelling arthropods. However, with possible bias towards surface-active taxa, pitfall-trapping data is inappropriate for quantitative studies, and Berlese extraction is the better alternative. Berlese extraction is the better method for quantitative measurements than the other two methods, whereas pitfall trapping would be appropriate for qualitative measurements. A comparison of the Berlese and Winkler extraction data shows that in a quantitative multigroup approach, Winkler extraction was inferior to Berlese extraction because the total number of arthropods caught was the lowest; and many of the taxa that were caught from an identical sample via Berlese extraction method were not caught. Significantly a greater frequency and higher abundance of arthropods belonging to Orthoptera, Blattaria, and Diptera occurred in pitfall-trapped samples and Psocoptera and Acariformes in Berlese-extracted samples than that were obtained in the other two methods, indicating that both methods are useful, one complementing the other, eliminating a chance for possible under-representation of taxa in quantitative studies. PMID:20673122
Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar
2015-01-01
Background Epilepsy surgery is a potentially curative treatment option for pharmacoresistent patients. If non-invasive methods alone do not allow to delineate the epileptogenic brain areas the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used to identify the seizure onset zone for targeted resection as a standard procedure. Methods Despite of its great potential to assess the epileptogenicty of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results In patients with favorable post-surgical seizure control a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable outcome in terms of seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery. The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent with growing evidence that spatially extended networks might be more relevant for seizure generation, evolution and termination than a single highly localized brain region (i.e. a “focus”) where seizures start. PMID:26513359
Permanent field plot methodology and equipment
Thomas G. Cole
1993-01-01
Long-term research into the composition, phenology, yield, and growth rates of agroforests can be accomplished with the use of permanent field plots. The periodic remeasurement of these plots provides researchers a quantitative measure of what changes occur over time in indigenous agroforestry systems.
Test/QA Plan for Verification of Ozone Indicator Cards
This verification test will address ozone indicator cards (OICs) that provide short-term semi-quantitative measures of ozone concentration in ambient air. Testing will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Tec...
Assessing integrity of insect RNA
USDA-ARS?s Scientific Manuscript database
Assessing total RNA integrity is important for the success of downstream RNA applications. The 2100 Bioanalyzer system with the RNA Integrity Number (RIN) provides a quantitative measure of RNA degradation. Although RINs may not be ascertained for RNA from all organisms, namely those with unusual or...
Generalizability and Validity of a Mathematics Performance Assessment.
ERIC Educational Resources Information Center
Lane, Suzanne; And Others
1996-01-01
Evidence from test results of 3,604 sixth and seventh graders is provided for the generalizability and validity of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument, which is designed to measure program outcomes and growth in mathematics. (SLD)
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment
2012-01-01
Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976
What information on measurement uncertainty should be communicated to clinicians, and how?
Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea
2018-02-02
The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the right interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratories accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which express the lack of knowledge of the true value and quantify the uncertainty of a result, incorporating the factors known to influence it. Even if the MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as an information which should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of the compliance with analytical performance characteristics; for physicians (and patients) it may help in interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe the way that MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results and connecting efforts performed within laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Observer variation in the assessment of root canal curvature.
Faraj, S; Boutsioukis, C
2017-02-01
To evaluate the inter- and intra-observer agreement between training/trained endodontists regarding the ex vivo classification of root canal curvature into three categories and its measurement using three quantitative methods. Periapical radiographs of seven extracted human posterior teeth with varying degrees of curvature were exposed ex vivo. Twenty training/trained endodontists were asked to classify the root canal curvature into three categories (<10°, 10-30°, >30°), to measure the curvature using three quantitative methods (Schneider, Weine, Pruett) and to draw angles of 10° or 30°, as a control experiment. The procedure was repeated after six weeks. Inter- and intra-observer agreement was evaluated by the intraclass correlation coefficient and weighted kappa. The inter-observer agreement on the visual classification of root canal curvature was substantial (ICC = 0.65, P < 0.018), but a trend towards underestimation of the angle was evident. Participants modified their classifications both within and between the two sessions. Median angles drawn as a control experiment were not significantly different from the target values (P > 0.10), but the results of individual participants varied. When quantitative methods were used, the inter- and intra-observer agreement on the angle measurements was considerably better (ICC = 0.76-0.82, P < 0.001) than on the radius measurements (ICC = 0.16-0.19, P > 0.895). Visual estimation of root canal curvature was not reliable. The use of computer-based quantitative methods is recommended. The measurement of radius of curvature was more subjective than angle measurement. Endodontic Associations need to provide specific guidelines on how to estimate root canal curvature in case difficulty assessment forms. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro
2016-01-01
Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used Gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured into a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In the pathological tissue image, through image processing techniques, each nucleus can be measured and each nucleus has its own measureable features like nucleus size, roundness, contour length, intra-nucleus texture data (GLCM is one of the methods). In GLCM each nucleus in the tissue image corresponds to one pixel. In this approach the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image pleomorphism and heterogeneity are then determined quantitatively. For our method, one pixel corresponds to one nucleus feature, and we therefore named our method Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM is showed as a useful quantitative method for pleomorphism and heterogeneity on histopathological image analysis.
An Approach to Quantify Workload in a System of Agents
NASA Technical Reports Server (NTRS)
Stocker, Richard; Rungta, Neha; Mercer, Eric; Raimondi, Franco; Holbrook, Jon; Cardoza, Colleen; Goodrich, Michael
2015-01-01
The role of humans in aviation and other domains continues to shift from manual control to automation monitoring. Studies have found that humans are often poorly suited for monitoring roles, and workload can easily spike in off-nominal situations. Current workload measurement tools, like NASA TLX, use human operators to assess their own workload after using a prototype system. Such measures are used late in the design process and can result in ex- pensive alterations when problems are discovered. Our goal in this work is to provide a quantitative workload measure for use early in the design process. We leverage research in human cognition to de ne metrics that can measure workload on belief-desire-intentions based multi-agent systems. These measures can alert designers to potential workload issues early in design. We demonstrate the utility of our approach by characterizing quantitative differences in the workload for a single pilot operations model compared to a traditional two pilot model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reno; Fowles, H.M.
On most previous nuclear detonations, signatures and quantitative measurements of the electric-field signals associated with the detonations was obtained at distances such that normal radiation field characteristics apply. On Small Boy, measurements were made from stations located much closer in, such as to be inside, on the boundary of and just outside the limits of the ionized sphere created by the nuclear burst. The electric-field characteristics in these regions were unknown. In the hope of providing continuity from the region of the unknown into the reasonably well-understood region of the radiation field, this project was requested to make the typicalmore » radiation-field type of measurement that had been made on previous detonations. This report covers the signature characteristics and quantitative measurements of the electric-field signal from Small Boy as seen from outside the immediate region of theoretical generating mechanism.« less
Fällman, Erik; Schedin, Staffan; Jass, Jana; Andersson, Magnus; Uhlin, Bernt Eric; Axner, Ove
2004-06-15
An optical force measurement system for quantitating forces in the pN range between micrometer-sized objects has been developed. The system was based upon optical tweezers in combination with a sensitive position detection system and constructed around an inverted microscope. A trapped particle in the focus of the high numerical aperture microscope-objective behaves like an omnidirectional mechanical spring in response to an external force. The particle's displacement from the equilibrium position is therefore a direct measure of the exerted force. A weak probe laser beam, focused directly below the trapping focus, was used for position detection of the trapped particle (a polystyrene bead). The bead and the condenser focus the light to a distinct spot in the far field, monitored by a position sensitive detector. Various calibration procedures were implemented in order to provide absolute force measurements. The system has been used to measure the binding forces between Escherichia coli bacterial adhesins and galabiose-functionalized beads.
Calibration-free assays on standard real-time PCR devices
Debski, Pawel R.; Gewartowski, Kamil; Bajer, Seweryn; Garstecki, Piotr
2017-01-01
Quantitative Polymerase Chain Reaction (qPCR) is one of central techniques in molecular biology and important tool in medical diagnostics. While being a golden standard qPCR techniques depend on reference measurements and are susceptible to large errors caused by even small changes of reaction efficiency or conditions that are typically not marked by decreased precision. Digital PCR (dPCR) technologies should alleviate the need for calibration by providing absolute quantitation using binary (yes/no) signals from partitions provided that the basic assumption of amplification a single target molecule into a positive signal is met. Still, the access to digital techniques is limited because they require new instruments. We show an analog-digital method that can be executed on standard (real-time) qPCR devices. It benefits from real-time readout, providing calibration-free assessment. The method combines advantages of qPCR and dPCR and bypasses their drawbacks. The protocols provide for small simplified partitioning that can be fitted within standard well plate format. We demonstrate that with the use of synergistic assay design standard qPCR devices are capable of absolute quantitation when normal qPCR protocols fail to provide accurate estimates. We list practical recipes how to design assays for required parameters, and how to analyze signals to estimate concentration. PMID:28327545
Calibration-free assays on standard real-time PCR devices
NASA Astrophysics Data System (ADS)
Debski, Pawel R.; Gewartowski, Kamil; Bajer, Seweryn; Garstecki, Piotr
2017-03-01
Quantitative Polymerase Chain Reaction (qPCR) is one of central techniques in molecular biology and important tool in medical diagnostics. While being a golden standard qPCR techniques depend on reference measurements and are susceptible to large errors caused by even small changes of reaction efficiency or conditions that are typically not marked by decreased precision. Digital PCR (dPCR) technologies should alleviate the need for calibration by providing absolute quantitation using binary (yes/no) signals from partitions provided that the basic assumption of amplification a single target molecule into a positive signal is met. Still, the access to digital techniques is limited because they require new instruments. We show an analog-digital method that can be executed on standard (real-time) qPCR devices. It benefits from real-time readout, providing calibration-free assessment. The method combines advantages of qPCR and dPCR and bypasses their drawbacks. The protocols provide for small simplified partitioning that can be fitted within standard well plate format. We demonstrate that with the use of synergistic assay design standard qPCR devices are capable of absolute quantitation when normal qPCR protocols fail to provide accurate estimates. We list practical recipes how to design assays for required parameters, and how to analyze signals to estimate concentration.
Destounis, Stamatia; Arieno, Andrea; Morgan, Renee; Roberts, Christina; Chan, Ariane
2017-01-01
Mammographic breast density (MBD) has been proven to be an important risk factor for breast cancer and an important determinant of mammographic screening performance. The measurement of density has changed dramatically since its inception. Initial qualitative measurement methods have been found to have limited consistency between readers, and in regards to breast cancer risk. Following the introduction of full-field digital mammography, more sophisticated measurement methodology is now possible. Automated computer-based density measurements can provide consistent, reproducible, and objective results. In this review paper, we describe various methods currently available to assess MBD, and provide a discussion on the clinical utility of such methods for breast cancer screening. PMID:28561776
Investigation of quartz grain surface textures by atomic force microscopy for forensic analysis.
Konopinski, D I; Hudziak, S; Morgan, R M; Bull, P A; Kenyon, A J
2012-11-30
This paper presents a study of quartz sand grain surface textures using atomic force microscopy (AFM) to image the surface. Until now scanning electron microscopy (SEM) has provided the primary technique used in the forensic surface texture analysis of quartz sand grains as a means of establishing the provenance of the grains for forensic reconstructions. The ability to independently corroborate the grain type classifications is desirable and provides additional weight to the findings of SEM analysis of the textures of quartz grains identified in forensic soil/sediment samples. AFM offers a quantitative means of analysis that complements SEM examination, and is a non-destructive technique that requires no sample preparation prior to scanning. It therefore has great potential to be used for forensic analysis where sample preservation is highly valuable. By taking quantitative topography scans, it is possible to produce 3D representations of microscopic surface textures and diagnostic features for examination. Furthermore, various empirical measures can be obtained from analysing the topography scans, including arithmetic average roughness, root-mean-square surface roughness, skewness, kurtosis, and multiple gaussian fits to height distributions. These empirical measures, combined with qualitative examination of the surfaces can help to discriminate between grain types and provide independent analysis that can corroborate the morphological grain typing based on the surface textures assigned using SEM. Furthermore, the findings from this study also demonstrate that quartz sand grain surfaces exhibit a statistically self-similar fractal nature that remains unchanged across scales. This indicates the potential for a further quantitative measure that could be utilised in the discrimination of quartz grains based on their provenance for forensic investigations. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Tromberg, B.J.; Tsay, T.T.; Berns, M.W.; Svaasand, L.O.; Haskell, R.C.
1995-06-13
Optical measurements of turbid media, that is media characterized by multiple light scattering, is provided through an apparatus and method for exposing a sample to a modulated laser beam. The light beam is modulated at a fundamental frequency and at a plurality of integer harmonics thereof. Modulated light is returned from the sample and preferentially detected at cross frequencies at frequencies slightly higher than the fundamental frequency and at integer harmonics of the same. The received radiance at the beat or cross frequencies is compared against a reference signal to provide a measure of the phase lag of the radiance and modulation ratio relative to a reference beam. The phase and modulation amplitude are then provided as a frequency spectrum by an array processor to which a computer applies a complete curve fit in the case of highly scattering samples or a linear curve fit below a predetermined frequency in the case of highly absorptive samples. The curve fit in any case is determined by the absorption and scattering coefficients together with a concentration of the active substance in the sample. Therefore, the curve fitting to the frequency spectrum can be used both for qualitative and quantitative analysis of substances in the sample even though the sample is highly turbid. 14 figs.
Tromberg, Bruce J.; Tsay, Tsong T.; Berns, Michael W.; Svaasand, Lara O.; Haskell, Richard C.
1995-01-01
Optical measurements of turbid media, that is media characterized by multiple light scattering, is provided through an apparatus and method for exposing a sample to a modulated laser beam. The light beam is modulated at a fundamental frequency and at a plurality of integer harmonics thereof. Modulated light is returned from the sample and preferentially detected at cross frequencies at frequencies slightly higher than the fundamental frequency and at integer harmonics of the same. The received radiance at the beat or cross frequencies is compared against a reference signal to provide a measure of the phase lag of the radiance and modulation ratio relative to a reference beam. The phase and modulation amplitude are then provided as a frequency spectrum by an array processor to which a computer applies a complete curve fit in the case of highly scattering samples or a linear curve fit below a predetermined frequency in the case of highly absorptive samples. The curve fit in any case is determined by the absorption and scattering coefficients together with a concentration of the active substance in the sample. Therefore, the curve fitting to the frequency spectrum can be used both for qualitative and quantitative analysis of substances in the sample even though the sample is highly turbid.
The physical and biological basis of quantitative parameters derived from diffusion MRI
2012-01-01
Diffusion magnetic resonance imaging is a quantitative imaging technique that measures the underlying molecular diffusion of protons. Diffusion-weighted imaging (DWI) quantifies the apparent diffusion coefficient (ADC) which was first used to detect early ischemic stroke. However this does not take account of the directional dependence of diffusion seen in biological systems (anisotropy). Diffusion tensor imaging (DTI) provides a mathematical model of diffusion anisotropy and is widely used. Parameters, including fractional anisotropy (FA), mean diffusivity (MD), parallel and perpendicular diffusivity can be derived to provide sensitive, but non-specific, measures of altered tissue structure. They are typically assessed in clinical studies by voxel-based or region-of-interest based analyses. The increasing recognition of the limitations of the diffusion tensor model has led to more complex multi-compartment models such as CHARMED, AxCaliber or NODDI being developed to estimate microstructural parameters including axonal diameter, axonal density and fiber orientations. However these are not yet in routine clinical use due to lengthy acquisition times. In this review, I discuss how molecular diffusion may be measured using diffusion MRI, the biological and physical bases for the parameters derived from DWI and DTI, how these are used in clinical studies and the prospect of more complex tissue models providing helpful micro-structural information. PMID:23289085
Lerner, Eitan; Ploetz, Evelyn; Hohlbein, Johannes; Cordes, Thorben; Weiss, Shimon
2016-07-07
Single-molecule, protein-induced fluorescence enhancement (PIFE) serves as a molecular ruler at molecular distances inaccessible to other spectroscopic rulers such as Förster-type resonance energy transfer (FRET) or photoinduced electron transfer. In order to provide two simultaneous measurements of two distances on different molecular length scales for the analysis of macromolecular complexes, we and others recently combined measurements of PIFE and FRET (PIFE-FRET) on the single molecule level. PIFE relies on steric hindrance of the fluorophore Cy3, which is covalently attached to a biomolecule of interest, to rotate out of an excited-state trans isomer to the cis isomer through a 90° intermediate. In this work, we provide a theoretical framework that accounts for relevant photophysical and kinetic parameters of PIFE-FRET, show how this framework allows the extraction of the fold-decrease in isomerization mobility from experimental data, and show how these results provide information on changes in the accessible volume of Cy3. The utility of this model is then demonstrated for experimental results on PIFE-FRET measurement of different protein-DNA interactions. The proposed model and extracted parameters could serve as a benchmark to allow quantitative comparison of PIFE effects in different biological systems.
Quantitative Interferometry in the Severe Acoustic Environment of Resonant Supersonic Jets
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Raman, Ganesh
1999-01-01
Understanding fundamental fluidic dynamic and acoustic processes in high-speed jets requires quantitative velocity, density and temperature measurements. In this paper we demonstrate a new, robust Liquid Crystal Point Diffraction Interferometer (LCPDI) that includes phase stepping and can provide accurate data even in the presence of intense acoustic fields. This novel common path interferometer (LCPDI) was developed to overcome difficulties with the Mach Zehnder interferometer in vibratory environments and is applied here to the case of a supersonic shock- containing jet. The environmentally insensitive LCPDI that is easy to align and capable of measuring optical wavefronts with high accuracy is briefly described, then integrated line of sight density data from the LCPDI for two underexpanded jets are presented.
Quantitative Tester And Reconditioner For Hand And Arm
NASA Technical Reports Server (NTRS)
Engle, Gary; Bond, Malcolm; Naumann, Theodore
1993-01-01
Apparatus measures torques, forces, and motions of hand, wrist, forearm, elbow, and shoulder and aids in reconditioning muscles involved. Used to determine strengths and endurances of muscles, ranges of motion of joints, and reaction times. Provides quantitative data used to assess extent to which disuse, disease, or injury causes deterioration of muscles and of motor-coordination skills. Same apparatus serves as exercise machine to restore muscle performance by imposing electronically controlled, gradually increasing loads on muscles. Suitable for training and evaluating astronauts, field testing for workers' compensation claims, and physical therapy in hospitals. With aid of various attachments, system adapted to measure such special motions as pinching, rotation of wrist, and supination and pronation of the forearm. Attachments are gloves, wristlets, and sleeves.
Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B
2014-02-01
Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25 %. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25 % grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.
2013-01-01
Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875
Nagayama, T.; Bailey, J. E.; Loisel, G.; ...
2016-02-05
Recently, frequency-resolved iron opacity measurements at electron temperatures of 170–200 eV and electron densities of (0.7 – 4.0) × 10 22 cm –3 revealed a 30–400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulationsmore » that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the data interpretation and the dynamic-gradient reality of the experiments, and they will allow us to quantitatively assess the impact of effects neglected in the data interpretation.« less
Thornberg, Steven M; Brown, Jason
2015-02-17
A method of detecting leaks and measuring volumes as well as a device, the Power-free Pump Module (PPM), provides a self-contained leak test and volume measurement apparatus that requires no external sources of electrical power during leak testing or volume measurement. The PPM is a portable, pneumatically-controlled instrument capable of generating a vacuum, calibrating volumes, and performing quantitative leak tests on a closed test system or device, all without the use of alternating current (AC) power. Capabilities include the ability is to provide a modest vacuum (less than 10 Torr) using a venturi pump, perform a pressure rise leak test, measure the gas's absolute pressure, and perform volume measurements. All operations are performed through a simple rotary control valve which controls pneumatically-operated manifold valves.
Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald
2010-01-01
The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273
A quantitative index for classification of plantar thermal changes in the diabetic foot
NASA Astrophysics Data System (ADS)
Hernandez-Contreras, D.; Peregrina-Barreto, H.; Rangel-Magdaleno, J.; Gonzalez-Bernal, J. A.; Altamirano-Robles, L.
2017-03-01
One of the main complications caused by diabetes mellitus is the development of diabetic foot, which in turn, can lead to ulcerations. Because ulceration risks are linked to an increase in plantar temperatures, recent approaches analyze thermal changes. These approaches try to identify spatial patterns of temperature that could be characteristic of a diabetic group. However, this is a difficult task since thermal patterns have wide variations resulting on complex classification. Moreover, the measurement of contralateral plantar temperatures is important to determine whether there is an abnormal difference but, this only provides information when thermal changes are asymmetric and in absence of ulceration or amputation. Therefore, in this work is proposed a quantitative index for measuring the thermal change in the plantar region of participants diagnosed diabetes mellitus regards to a reliable reference (control) or regards to the contralateral foot (as usual). Also, a classification of the thermal changes based on a quantitative index is proposed. Such classification demonstrate the wide diversity of spatial distributions in the diabetic foot but also demonstrate that it is possible to identify common characteristics. An automatic process, based on the analysis of plantar angiosomes and image processing, is presented to quantify these thermal changes and to provide valuable information to the medical expert.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index to measure the temporal fluctuation of the retrieved optical pathlength in quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in current literature. In particular, previous theoretical studies for fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noiselimited theoretical sensitivities, Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints in system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
High-coverage quantitative proteomics using amine-specific isotopic labeling.
Melanson, Jeremy E; Avery, Steven L; Pinto, Devanand M
2006-08-01
Peptide dimethylation with isotopically coded formaldehydes was evaluated as a potential alternative to techniques such as the iTRAQ method for comparative proteomics. The isotopic labeling strategy and custom-designed protein quantitation software were tested using protein standards and then applied to measure proteins levels associated with Alzheimer's disease (AD). The method provided high accuracy (10% error), precision (14% RSD) and coverage (70%) when applied to the analysis of a standard solution of BSA by LC-MS/MS. The technique was then applied to measure protein abundance levels in brain tissue afflicted with AD relative to normal brain tissue. 2-D LC-MS analysis identified 548 unique proteins (p<0.05). Of these, 349 were quantified with two or more peptides that met the statistical criteria used in this study. Several classes of proteins exhibited significant changes in abundance. For example, elevated levels of antioxidant proteins and decreased levels of mitochondrial electron transport proteins were observed. The results demonstrate the utility of the labeling method for high-throughput quantitative analysis.
Quantitative analysis of multiple sclerosis: a feasibility study
NASA Astrophysics Data System (ADS)
Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong
2006-03-01
Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
NASA Astrophysics Data System (ADS)
Jackson, Edward F.
2016-04-01
Over the past decade, there has been an increasing focus on quantitative imaging biomarkers (QIBs), which are defined as "objectively measured characteristics derived from in vivo images as indicators of normal biological processes, pathogenic processes, or response to a therapeutic intervention"1. To evolve qualitative imaging assessments to the use of QIBs requires the development and standardization of data acquisition, data analysis, and data display techniques, as well as appropriate reporting structures. As such, successful implementation of QIB applications relies heavily on expertise from the fields of medical physics, radiology, statistics, and informatics as well as collaboration from vendors of imaging acquisition, analysis, and reporting systems. When successfully implemented, QIBs will provide image-derived metrics with known bias and variance that can be validated with anatomically and physiologically relevant measures, including treatment response (and the heterogeneity of that response) and outcome. Such non-invasive quantitative measures can then be used effectively in clinical and translational research and will contribute significantly to the goals of precision medicine. This presentation will focus on 1) outlining the opportunities for QIB applications, with examples to demonstrate applications in both research and patient care, 2) discussing key challenges in the implementation of QIB applications, and 3) providing overviews of efforts to address such challenges from federal, scientific, and professional organizations, including, but not limited to, the RSNA, NCI, FDA, and NIST. 1Sullivan, Obuchowski, Kessler, et al. Radiology, epub August 2015.
Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C
2017-11-21
Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L -1 ), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed DGT accumulated chemicals masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over up to 18 days deployment, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
NASA Technical Reports Server (NTRS)
Dragan, O.; Galan, N.; Sirbu, A.; Ghita, C.
1974-01-01
The design and construction of inductive transducers for measuring the vibrations in metal bars at ultrasonic frequencies are discussed. Illustrations of the inductive transducers are provided. The quantitative relations that are useful in designing the transducers are analyzed. Mathematical models are developed to substantiate the theoretical considerations. Results obtained with laboratory equipment in testing specified metal samples are included.
Relative-Error-Covariance Algorithms
NASA Technical Reports Server (NTRS)
Bierman, Gerald J.; Wolff, Peter J.
1991-01-01
Two algorithms compute error covariance of difference between optimal estimates, based on data acquired during overlapping or disjoint intervals, of state of discrete linear system. Provides quantitative measure of mutual consistency or inconsistency of estimates of states. Relative-error-covariance concept applied, to determine degree of correlation between trajectories calculated from two overlapping sets of measurements and construct real-time test of consistency of state estimates based upon recently acquired data.
Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.
2018-01-01
Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures including the preparation and release of barcode DNA probes from the target-nanoparticle complex, and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2’2’-bipyridyl) ruthenium (TBR)-labele barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products. PMID:18386909
Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R
2008-05-15
A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2013-01-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2012-12-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
Nadort, Annemarie; Woolthuis, Rutger G.; van Leeuwen, Ton G.; Faber, Dirk J.
2013-01-01
We present integrated Laser Speckle Contrast Imaging (LSCI) and Sidestream Dark Field (SDF) flowmetry to provide real-time, non-invasive and quantitative measurements of speckle decorrelation times related to microcirculatory flow. Using a multi exposure acquisition scheme, precise speckle decorrelation times were obtained. Applying SDF-LSCI in vitro and in vivo allows direct comparison between speckle contrast decorrelation and flow velocities, while imaging the phantom and microcirculation architecture. This resulted in a novel analysis approach that distinguishes decorrelation due to flow from other additive decorrelation sources. PMID:24298399
Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.
Doyle, Kelly; Strathmann, Frederick G
2017-02-01
This study investigates the frequency at which quantitative results provide additional clinical benefit compared to qualitative results alone. A comparison between alternative urine drug screens and conventional screens including the assessment of cost-to-payer differences, accuracy of prescription compliance or polypharmacy/substance abuse was also included. In a reference laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) conventional immunoassay screen with subsequent reflexive testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302), and identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test utilization as a measure of test order volume follows an exponential trend for alternative screen test orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable with qualitative results alone indicating a reduced need to automatically reflex to quantitation or provide quantitation for the majority of patients. This strategy highlights a successful approach using an alternative strategy for both the laboratory and physician to align clinical needs while being mindful of costs.
Reviewing effectiveness of ankle assessment techniques for use in robot-assisted therapy.
Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Shane
2014-01-01
This article provides a comprehensive review of studies that investigated ankle assessment techniques to better understand those that can be used in the real-time monitoring of rehabilitation progress for implementation in conjunction with robot-assisted therapy. Seventy-six publications published between January 1980 and August 2013 were selected based on eight databases. They were divided into two main categories (16 qualitative and 60 quantitative studies): 13 goniometer studies, 18 dynamometer studies, and 29 studies about innovative techniques. A total of 465 subjects participated in the 29 quantitative studies of innovative measurement techniques that may potentially be integrated in a real-time monitoring device, of which 19 studies included less than 10 participants. Results show that qualitative ankle assessment methods are not suitable for real-time monitoring in robot-assisted therapy, though they are reliable for certain patients, while the quantitative methods show great potential. The majority of quantitative techniques are reliable in measuring ankle kinematics and kinetics but are usually available only for use in the sagittal plane. Limited studies determine kinematics and kinetics in all three planes (sagittal, transverse, and frontal) where motions of the ankle joint and the subtalar joint actually occur.
Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea
2016-10-01
Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
Measuring Plant Water Status: A Simple Method for Investigative Laboratories.
ERIC Educational Resources Information Center
Mansfield, Donald H.; Anderson, Jay E.
1980-01-01
Describes a method suitable for quantitative studies of plant water status conducted by high school or college students and the calculation of the relative water content (RWC) of a plant. Materials, methods, procedures, and results are discussed, with sample data figures provided. (CS)
Simulation of UV atomic radiation for application in exhaust plume spectrometry
NASA Astrophysics Data System (ADS)
Wallace, T. L.; Powers, W. T.; Cooper, A. E.
1993-06-01
Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images
Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.
1986-01-01
The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation. ImagesFigure 5Figure 7Figure 8Figure 9Figure 10Figure 11Figure 12Figure 13Figure 14Figure 15Figure 16
Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.
Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael
2017-06-20
High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi
The aim of this study was to investigate whether the use of combination quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargements, and tonsillar herniations) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curve was used to evaluate the diagnostic performance of quantitative metrics and qualitative signs, and for the diagnosis of IH, optimum cutoff values of quantitative metrics were found with ROC analysis. Combined ROC curve was measured for the quantitative metrics, and qualitative signs combinations in determining diagnostic accuracy and sensitivity, specificity, and positive and negative predictive values were found, and the best model combination was formed. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH when compared with the control group (P < 0.001), mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). For qualitative signs, the highest individual distinctive power was dural enhancement with area under the ROC curve (AUC) of 0.838. For quantitative metrics, the highest individual distinctive power was MPD with AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by combination of dural enhancement, venous engorgement, and MPD with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had diagnostic accuracy of 100 % for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided with combination of quantitative metrics with qualitative signs.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
Quantitative Species Measurements In Microgravity Combustion Flames
NASA Technical Reports Server (NTRS)
Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.
2003-01-01
The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge
2014-04-01
The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of nontelecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides larger field of view for a given magnification and permits more accurate QPI measurements with less number of computational operations.
Electro-mechanical properties of hydrogel composites with micro- and nano-cellulose fillers
NASA Astrophysics Data System (ADS)
N, Mohamed Shahid U.; Deshpande, Abhijit P.; Lakshmana Rao, C.
2015-09-01
Stimuli responsive cross-linked hydrogels are of great interest for applications in diverse fields such as sensors and biomaterials. In this study, we investigate polymer composites filled with cellulose fillers. The celluloses used in making the composites were a microcrystalline cellulose of commercial grade and cellulose nano-whiskers obtained through acid hydrolysis of microcrystalline cellulose. The filler concentration was varied and corresponding physical, mechanical and electro-mechanical characterization was carried out. The electro-mechanical properties were determined using a quasi-static method. The fillers not only enhance the mechanical properties of the composite by providing better reinforcement but also provide a quantitative electric potential in the composite. The measurements reveal that the polymer composites prepared from two different cellulose fillers possess a quantitative electric potential which can be utilized in biomedical applications. It is argued that the mechanism behind the quantitative electric potential in the composites is due to streaming potentials arising due to electrical double layer formation.
Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets
NASA Technical Reports Server (NTRS)
Yep, Tze-Wing; Agrawal, Ajay K.; Griffin, DeVon; Salzman, Jack (Technical Monitor)
2001-01-01
Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive line of site measurement technique for the whole field. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet flow was significantly influenced by the gravity. The jet in microgravity was up to 70 percent wider than that in Earth gravity. The jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of temporal flow evolution as the experiment undergoes a change in gravity in the drop tower.
Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)
Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; M. Belitsky, Jason; Umbanhowar, Charles; Overvoorde, Paul J.
2017-01-01
Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative skills of undergraduate students within a biological context. The instrument was developed by an interdisciplinary team of educators and aligns with skills included in national reports such as BIO2010, Scientific Foundations for Future Physicians, and Vision and Change. Undergraduate biology educators also confirmed the importance of items included in the instrument. The current version of the BioSQuaRE was developed through an iterative process using data from students at 12 postsecondary institutions. A psychometric analysis of these data provides multiple lines of evidence for the validity of inferences made using the instrument. Our results suggest that the BioSQuaRE will prove useful to faculty and departments interested in helping students acquire the quantitative competencies they need to successfully pursue biology, and useful to biology students by communicating the importance of quantitative skills. We invite educators to use the BioSQuaRE at their own institutions. PMID:29196427
Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?
Gizak, Agnieszka; Rakus, Dariusz
2016-01-11
Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to a quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.
Quantitative multimodality imaging in cancer research and therapy.
Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad
2014-11-01
Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.
Calabrese, Edward J
2013-11-01
The most common quantitative feature of the hormetic-biphasic dose response is its modest stimulatory response which at maximum is only 30-60% greater than control values, an observation that is consistently independent of biological model, level of organization (i.e., cell, organ or individual), endpoint measured, chemical/physical agent studied, or mechanism. This quantitative feature suggests an underlying "upstream" mechanism common across biological systems, therefore basic and general. Hormetic dose response relationships represent an estimate of the peak performance of integrative biological processes that are allometrically based. Hormetic responses reflect both direct stimulatory or overcompensation responses to damage induced by relatively low doses of chemical or physical agents. The integration of the hormetic dose response within an allometric framework provides, for the first time, an explanation for both the generality and the quantitative features of the hormetic dose response. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; ...
2014-03-26
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. In this paper, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts providemore » precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of “epidermal” electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. Finally, the results have the potential to address important unmet needs in chronic wound management.« less
The Relationship between Quantitative and Qualitative Measures of Writing Skills.
ERIC Educational Resources Information Center
Howerton, Mary Lou P.; And Others
The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
Holographic 3D imaging through diffuse media by compressive sampling of the mutual intensity
NASA Astrophysics Data System (ADS)
Falldorf, Claas; Klein, Thorsten; Agour, Mostafa; Bergmann, Ralf B.
2017-05-01
We present a method for holographic imaging through a volume scattering material, which is based on selfreference and light with good spatial but limited temporal coherence. In contrast to existing techniques, we do not require a separate reference wave, thus our approach provides great advantages towards the flexibility of the measurement system. The main applications are remote sensing and investigation of moving objects through gaseous streams, bubbles or foggy water for example. Furthermore, due to the common path nature, the system is also insensitive to mechanical disturbances. The measurement result is a complex amplitude which is comparable to a phase shifted digital hologramm and therefore allows 3D imaging, numerical refocusing and quantitative phase contrast imaging. As an example of application, we present measurements of the quantitative phase contrast of the epidermis of an onion through a volume scattering material.
A Quantitative and Qualitative Exploration of Photoaversion in Achromatopsia
Aboshiha, Jonathan; Kumaran, Neruban; Kalitzeos, Angelos; Hogg, Chris; Rubin, Gary; Michaelides, Michel
2017-01-01
Purpose Photoaversion (PA) is a disabling and ubiquitous feature of achromatopsia (ACHM). We aimed to help define the characteristics of this important symptom, and present the first published assessment of its impact on patients' lives, as well as quantitative and qualitative PA assessments. Methods Molecularly confirmed ACHM subjects were assessed for PA using four tasks: structured survey of patient experience, novel quantitative subjective measurement of PA, visual acuities in differing ambient lighting, and objective palpebral aperture-related PA testing. Results Photoaversion in ACHM was found to be the most significant symptom for a substantial proportion (38%) of patients. A novel subjective PA measurement technique was developed and demonstrated fidelity with more invasive paradigms without exposing often very photosensitive patients to brighter light intensities used elsewhere. An objective PA measurement was also refined for use in trials, indicating that higher light intensities than previously published are likely to be needed. Monocular testing, as required for trials, was also validated for the first time. Conclusions This study offers new insights into PA in ACHM. It provides the first structured evidence of the great significance of this symptom to patients, suggesting that PA should be considered as an additional outcome measure in therapeutic trials. It also offers new insights into the characteristics of PA in ACHM, and describes both subjective and objective measures of PA that could be employed in clinical trials. PMID:28715587
NASA Astrophysics Data System (ADS)
Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.
2005-03-01
Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
Extraction of quantitative surface characteristics from AIRSAR data for Death Valley, California
NASA Technical Reports Server (NTRS)
Kierein-Young, K. S.; Kruse, F. A.
1992-01-01
Polarimetric Airborne Synthetic Aperture Radar (AIRSAR) data were collected for the Geologic Remote Sensing Field Experiment (GRSFE) over Death Valley, California, USA, in Sep. 1989. AIRSAR is a four-look, quad-polarization, three frequency instrument. It collects measurements at C-band (5.66 cm), L-band (23.98 cm), and P-band (68.13 cm), and has a GIFOV of 10 meters and a swath width of 12 kilometers. Because the radar measures at three wavelengths, different scales of surface roughness are measured. Also, dielectric constants can be calculated from the data. The AIRSAR data were calibrated using in-scene trihedral corner reflectors to remove cross-talk; and to calibrate the phase, amplitude, and co-channel gain imbalance. The calibration allows for the extraction of accurate values of rms surface roughness, dielectric constants, sigma(sub 0) backscatter, and polarization information. The radar data sets allow quantitative characterization of small scale surface structure of geologic units, providing information about the physical and chemical processes that control the surface morphology. Combining the quantitative information extracted from the radar data with other remotely sensed data sets allows discrimination, identification and mapping of geologic units that may be difficult to discern using conventional techniques.
An optimized method for measuring fatty acids and cholesterol in stable isotope-labeled cells
Argus, Joseph P.; Yu, Amy K.; Wang, Eric S.; Williams, Kevin J.; Bensinger, Steven J.
2017-01-01
Stable isotope labeling has become an important methodology for determining lipid metabolic parameters of normal and neoplastic cells. Conventional methods for fatty acid and cholesterol analysis have one or more issues that limit their utility for in vitro stable isotope-labeling studies. To address this, we developed a method optimized for measuring both fatty acids and cholesterol from small numbers of stable isotope-labeled cultured cells. We demonstrate quantitative derivatization and extraction of fatty acids from a wide range of lipid classes using this approach. Importantly, cholesterol is also recovered, albeit at a modestly lower yield, affording the opportunity to quantitate both cholesterol and fatty acids from the same sample. Although we find that background contamination can interfere with quantitation of certain fatty acids in low amounts of starting material, our data indicate that this optimized method can be used to accurately measure mass isotopomer distributions for cholesterol and many fatty acids isolated from small numbers of cultured cells. Application of this method will facilitate acquisition of lipid parameters required for quantifying flux and provide a better understanding of how lipid metabolism influences cellular function. PMID:27974366
Quantitative analysis on electrooculography (EOG) for neurodegenerative disease
NASA Astrophysics Data System (ADS)
Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.
2007-11-01
Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role measuring the progress of neurodegenerative diseases and state of alertness in healthy individuals. There are several techniques for measuring eye movement, Infrared detection technique (IR). Video-oculography (VOG), Scleral eye coil and EOG. Among those available recording techniques, EOG is a major source for monitoring the abnormal eye movement. In this real-time quantitative analysis study, the methods which can capture the characteristic of the eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursued (SVP), horizontal smooth pursued (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding a neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
Allesø, Morten; Holm, Per; Carstensen, Jens Michael; Holm, René
2016-05-25
Surface topography, in the context of surface smoothness/roughness, was investigated by the use of an image analysis technique, MultiRay™, related to photometric stereo, on different tablet batches manufactured either by direct compression or roller compaction. In the present study, oblique illumination of the tablet (darkfield) was considered and the area of cracks and pores in the surface was used as a measure of tablet surface topography; the higher a value, the rougher the surface. The investigations demonstrated a high precision of the proposed technique, which was able to rapidly (within milliseconds) and quantitatively measure the obtained surface topography of the produced tablets. Compaction history, in the form of applied roll force and tablet punch pressure, was also reflected in the measured smoothness of the tablet surfaces. Generally it was found that a higher degree of plastic deformation of the microcrystalline cellulose resulted in a smoother tablet surface. This altogether demonstrated that the technique provides the pharmaceutical developer with a reliable, quantitative response parameter for visual appearance of solid dosage forms, which may be used for process and ultimately product optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
A quantitative approach to painting styles
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Sbrissa, David; da Fontoura Costa, Luciano; Travieso, Gonzalo
2015-01-01
This research extends a method previously applied to music and philosophy (Vilson Vieira et al., 2012), representing the evolution of art as a time-series where relations like dialectics are measured quantitatively. For that, a corpus of paintings of 12 well-known artists from baroque and modern art is analyzed. A set of 99 features is extracted and the features which most contributed to the classification of painters are selected. The projection space obtained provides the basis to the analysis of measurements. These quantitative measures underlie revealing observations about the evolution of painting styles, specially when compared with other humanity fields already analyzed: while music evolved along a master-apprentice tradition (high dialectics) and philosophy by opposition, painting presents another pattern: constant increasing skewness, low opposition between members of the same movement and opposition peaks in the transition between movements. Differences between baroque and modern movements are also observed in the projected "painting space": while baroque paintings are presented as an overlapped cluster, the modern paintings present minor overlapping and are disposed more widely in the projection than the baroque counterparts. This finding suggests that baroque painters shared aesthetics while modern painters tend to "break rules" and develop their own style.
Providing Evidence in the Moral Domain
ERIC Educational Resources Information Center
Cooper, Diane L.; Liddell, Debora L.; Davis, Tiffany J.; Pasquesi, Kira
2012-01-01
In this era of increased accountability, it is important to consider how student affairs researches and assesses the outcomes of efforts to increase moral competence. This article examines both qualitative and quantitative inquiry methods for measuring moral development. The authors review the instrumentation and methods typically used to measure…
Simple X-ray diffraction algorithm for direct determination of cotton crystallinity
USDA-ARS?s Scientific Manuscript database
Traditionally, XRD had been used to study the crystalline structure of cotton celluloses. Despite considerable efforts in developing the curve-fitting protocol to evaluate the crystallinity index (CI), in its present state, XRD measurement can only provide a qualitative or semi-quantitative assessme...
ERIC Educational Resources Information Center
Mayhew, Jerry L.
1981-01-01
Body composition refers to the types and amounts of tissues which make up the body. The most acceptable method for assessing body composition is underwater weighing. A subcutaneous skinfold provides a quantitative measurement of fat below the skin. The skinfold technique permits a valid estimate of the body's total fat content. (JN)
Quantitative software models for the estimation of cost, size, and defects
NASA Technical Reports Server (NTRS)
Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.
2002-01-01
The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.
77 FR 4765 - Marine Mammals; File No. 15142
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... comment. Those individuals requesting a public hearing should submit a written request to the Chief... reasons why a hearing on this application would be appropriate. FOR FURTHER INFORMATION CONTACT: Amy Sloan...-01 (75 FR 58352) and will provide quantitative measurements of the amphibious hearing capabilities of...
Changes in body composition of neonatal piglets during growth
USDA-ARS?s Scientific Manuscript database
During studies of neonatal piglet growth it is important to be able to accurately assess changes in body composition. Previous studies have demonstrated that quantitative magnetic resonance (QMR) provides precise and accurate measurements of total body fat mass, lean mass and total body water in non...
Wu, C; de Jong, J R; Gratama van Andel, H A; van der Have, F; Vastenhouw, B; Laverman, P; Boerman, O C; Dierckx, R A J O; Beekman, F J
2011-09-21
Attenuation of photon flux on trajectories between the source and pinhole apertures affects the quantitative accuracy of reconstructed single-photon emission computed tomography (SPECT) images. We propose a Chang-based non-uniform attenuation correction (NUA-CT) for small-animal SPECT/CT with focusing pinhole collimation, and compare the quantitative accuracy with uniform Chang correction based on (i) body outlines extracted from x-ray CT (UA-CT) and (ii) on hand drawn body contours on the images obtained with three integrated optical cameras (UA-BC). Measurements in phantoms and rats containing known activities of isotopes were conducted for evaluation. In (125)I, (201)Tl, (99m)Tc and (111)In phantom experiments, average relative errors comparing to the gold standards measured in a dose calibrator were reduced to 5.5%, 6.8%, 4.9% and 2.8%, respectively, with NUA-CT. In animal studies, these errors were 2.1%, 3.3%, 2.0% and 2.0%, respectively. Differences in accuracy on average between results of NUA-CT, UA-CT and UA-BC were less than 2.3% in phantom studies and 3.1% in animal studies except for (125)I (3.6% and 5.1%, respectively). All methods tested provide reasonable attenuation correction and result in high quantitative accuracy. NUA-CT shows superior accuracy except for (125)I, where other factors may have more impact on the quantitative accuracy than the selected attenuation correction.
Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas
2014-01-01
The antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific-antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay the porous silicon chips was placed into a standard 96-well microtiter plate for higher throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44ng/ml, LOD: 0.14ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9ng/ml, LOD: 0.47ng/ml) and total PSA (dynamic range: 0.87-295ng/ml, LOD: 0.76ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In an analysis of 80 plasma samples the microarray platform performs well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay, which analyses several prostate cancer biomarkers simultaneously. PMID:22921878
Direct measurements of protein-stabilized gold nanoparticle interactions.
Eichmann, Shannon L; Bevan, Michael A
2010-09-21
We report integrated video and total internal reflection microscopy measurements of protein stabilized 110 nm Au nanoparticles confined in 280 nm gaps in physiological media. Measured potential energy profiles display quantitative agreement with Brownian dynamic simulations that include hydrodynamic interactions and camera exposure time and noise effects. Our results demonstrate agreement between measured nonspecific van der Waals and adsorbed protein interactions with theoretical potentials. Confined, lateral nanoparticle diffusivity measurements also display excellent agreement with predictions. These findings provide a basis to interrogate specific biomacromolecular interactions in similar experimental configurations and to design future improved measurement methods.
Castaldi, Peter J; San José Estépar, Raúl; Mendoza, Carlos S; Hersh, Craig P; Laird, Nan; Crapo, James D; Lynch, David A; Silverman, Edwin K; Washko, George R
2013-11-01
Emphysema occurs in distinct pathologic patterns, but little is known about the epidemiologic associations of these patterns. Standard quantitative measures of emphysema from computed tomography (CT) do not distinguish between distinct patterns of parenchymal destruction. To study the epidemiologic associations of distinct emphysema patterns with measures of lung-related physiology, function, and health care use in smokers. Using a local histogram-based assessment of lung density, we quantified distinct patterns of low attenuation in 9,313 smokers in the COPDGene Study. To determine if such patterns provide novel insights into chronic obstructive pulmonary disease epidemiology, we tested for their association with measures of physiology, function, and health care use. Compared with percentage of low-attenuation area less than -950 Hounsfield units (%LAA-950), local histogram-based measures of distinct CT low-attenuation patterns are more predictive of measures of lung function, dyspnea, quality of life, and health care use. These patterns are strongly associated with a wide array of measures of respiratory physiology and function, and most of these associations remain highly significant (P < 0.005) after adjusting for %LAA-950. In smokers without evidence of chronic obstructive pulmonary disease, the mild centrilobular disease pattern is associated with lower FEV1 and worse functional status (P < 0.005). Measures of distinct CT emphysema patterns provide novel information about the relationship between emphysema and key measures of physiology, physical function, and health care use. Measures of mild emphysema in smokers with preserved lung function can be extracted from CT scans and are significantly associated with functional measures.
Auroral photometry from the atmosphere Explorer satellite
NASA Technical Reports Server (NTRS)
Rees, M. H.; Abreu, V. J.
1984-01-01
Attention is given to the ability of remote sensing from space to yield quantitative auroral and ionospheric parametrers, in view of the auroral measurements made during two passes of the Explorer C satellite over the Poker Flat Optical Observatory and the Chatanika Radar Facility. The emission rate of the N2(+) 4278 A band computed from intensity measurements of energetic auroral electrons has tracked the same spetral feature that was measured remotely from the satellite over two decades of intensity, providing a stringent test for the measurement of atmospheric scattering effects. It also verifies the absolute intensity with respect to ground-based photometric measurements. In situ satellite measurments of ion densities and ground based electron density profile radar measurements provide a consistent picture of the ionospheric response to auroral input, while also predicting the observed optical emission rate.
Automatic vertebral bodies detection of x-ray images using invariant multiscale template matching
NASA Astrophysics Data System (ADS)
Sharifi Sarabi, Mona; Villaroman, Diane; Beckett, Joel; Attiah, Mark; Marcus, Logan; Ahn, Christine; Babayan, Diana; Gaonkar, Bilwaj; Macyszyn, Luke; Raghavendra, Cauligi
2017-03-01
Lower back pain and pathologies related to it are one of the most common results for a referral to a neurosurgical clinic in the developed and the developing world. Quantitative evaluation of these pathologies is a challenge. Image based measurements of angles/vertebral heights and disks could provide a potential quantitative biomarker for tracking and measuring these pathologies. Detection of vertebral bodies is a key element and is the focus of the current work. From the variety of medical imaging techniques, MRI and CT scans have been typically used for developing image segmentation methods. However, CT scans are known to give a large dose of x-rays, increasing cancer risk [8]. MRI can be substituted for CTs when the risk is high [8] but are difficult to obtain in smaller facilities due to cost and lack of expertise in the field [2]. X-rays provide another option with its ability to control the x-ray dosage, especially for young people, and its accessibility for smaller facilities. Hence, the ability to create quantitative biomarkers from x-ray data is especially valuable. Here, we develop a multiscale template matching, inspired by [9], to detect centers of vertebral bodies from x-ray data. The immediate application of such detection lies in developing quantitative biomarkers and in querying similar images in a database. Previously, shape similarity classification methods have been used to address this problem, but these are challenging to use in the presence of variation due to gross pathology and even subtle effects [1].
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use large illumination aperture (i.e. partially coherent illumination) have been developed for making transparent (i.e. phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement-coherent illumination has been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
Immunochromatographic diagnostic test analysis using Google Glass.
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2014-03-25
We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health.
Immunochromatographic Diagnostic Test Analysis Using Google Glass
2014-01-01
We demonstrate a Google Glass-based rapid diagnostic test (RDT) reader platform capable of qualitative and quantitative measurements of various lateral flow immunochromatographic assays and similar biomedical diagnostics tests. Using a custom-written Glass application and without any external hardware attachments, one or more RDTs labeled with Quick Response (QR) code identifiers are simultaneously imaged using the built-in camera of the Google Glass that is based on a hands-free and voice-controlled interface and digitally transmitted to a server for digital processing. The acquired JPEG images are automatically processed to locate all the RDTs and, for each RDT, to produce a quantitative diagnostic result, which is returned to the Google Glass (i.e., the user) and also stored on a central server along with the RDT image, QR code, and other related information (e.g., demographic data). The same server also provides a dynamic spatiotemporal map and real-time statistics for uploaded RDT results accessible through Internet browsers. We tested this Google Glass-based diagnostic platform using qualitative (i.e., yes/no) human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) tests. For the quantitative RDTs, we measured activated tests at various concentrations ranging from 0 to 200 ng/mL for free and total PSA. This wearable RDT reader platform running on Google Glass combines a hands-free sensing and image capture interface with powerful servers running our custom image processing codes, and it can be quite useful for real-time spatiotemporal tracking of various diseases and personal medical conditions, providing a valuable tool for epidemiology and mobile health. PMID:24571349
Digital PCR analysis of circulating nucleic acids.
Hudecova, Irena
2015-10-01
Detection of plasma circulating nucleic acids (CNAs) requires the use of extremely sensitive and precise methods. The commonly used quantitative real-time polymerase chain reaction (PCR) poses certain technical limitations in relation to the precise measurement of CNAs whereas the costs of massively parallel sequencing are still relatively high. Digital PCR (dPCR) now represents an affordable and powerful single molecule counting strategy to detect minute amounts of genetic material with performance surpassing many quantitative methods. Microfluidic (chip) and emulsion (droplet)-based technologies have already been integrated into platforms offering hundreds to millions of nanoliter- or even picoliter-scale reaction partitions. The compelling observations reported in the field of cancer research, prenatal testing, transplantation medicine and virology support translation of this technology into routine use. Extremely sensitive plasma detection of rare mutations originating from tumor or placental cells among a large background of homologous sequences facilitates unraveling of the early stages of cancer or the detection of fetal mutations. Digital measurement of quantitative changes in plasma CNAs associated with cancer or graft rejection provides valuable information on the monitoring of disease burden or the recipient's immune response and subsequent therapy treatment. Furthermore, careful quantitative assessment of the viral load offers great value for effective monitoring of antiviral therapy for immunosuppressed or transplant patients. The present review describes the inherent features of dPCR that make it exceptionally robust in precise and sensitive quantification of CNAs. Moreover, I provide an insight into the types of potential clinical applications that have been developed by researchers to date. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Standardized pivot shift test improves measurement accuracy.
Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker
2012-04-01
The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of the quantitative pivot shift measurements with different surgeons' preferred techniques to a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during pivot shift test was similar between using surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and using standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s(2); right 2.5 ± 0.7 mm/s(2)) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s(2); right 3.4 ± 2.3 mm/s(2); both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.
Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests
NASA Technical Reports Server (NTRS)
Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.
2010-01-01
Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10 50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.
Huang, Yongzhi; Green, Alexander L; Hyam, Jonathan; Fitzgerald, James; Aziz, Tipu Z; Wang, Shouyan
2018-01-01
Understanding the function of sensory thalamic neural activity is essential for developing and improving interventions for neuropathic pain. However, there is a lack of investigation of the relationship between sensory thalamic oscillations and pain relief in patients with neuropathic pain. This study aims to identify the oscillatory neural characteristics correlated with pain relief induced by deep brain stimulation (DBS), and develop a quantitative model to predict pain relief by integrating characteristic measures of the neural oscillations. Measures of sensory thalamic local field potentials (LFPs) in thirteen patients with neuropathic pain were screened in three dimensional feature space according to the rhythm, balancing, and coupling neural behaviours, and correlated with pain relief. An integrated approach based on principal component analysis (PCA) and multiple regression analysis is proposed to integrate the multiple measures and provide a predictive model. This study reveals distinct thalamic rhythms of theta, alpha, high beta and high gamma oscillations correlating with pain relief. The balancing and coupling measures between these neural oscillations were also significantly correlated with pain relief. The study enriches the series research on the function of thalamic neural oscillations in neuropathic pain and relief, and provides a quantitative approach for predicting pain relief by DBS using thalamic neural oscillations. Copyright © 2017 Elsevier Inc. All rights reserved.
Evaluation of TRMM Ground-Validation Radar-Rain Errors Using Rain Gauge Measurements
NASA Technical Reports Server (NTRS)
Wang, Jianxin; Wolff, David B.
2009-01-01
Ground-validation (GV) radar-rain products are often utilized for validation of the Tropical Rainfall Measuring Mission (TRMM) spaced-based rain estimates, and hence, quantitative evaluation of the GV radar-rain product error characteristics is vital. This study uses quality-controlled gauge data to compare with TRMM GV radar rain rates in an effort to provide such error characteristics. The results show that significant differences of concurrent radar-gauge rain rates exist at various time scales ranging from 5 min to 1 day, despite lower overall long-term bias. However, the differences between the radar area-averaged rain rates and gauge point rain rates cannot be explained as due to radar error only. The error variance separation method is adapted to partition the variance of radar-gauge differences into the gauge area-point error variance and radar rain estimation error variance. The results provide relatively reliable quantitative uncertainty evaluation of TRMM GV radar rain estimates at various times scales, and are helpful to better understand the differences between measured radar and gauge rain rates. It is envisaged that this study will contribute to better utilization of GV radar rain products to validate versatile spaced-based rain estimates from TRMM, as well as the proposed Global Precipitation Measurement, and other satellites.
Frequency modulation atomic force microscopy: a dynamic measurement technique for biological systems
NASA Astrophysics Data System (ADS)
Higgins, Michael J.; Riener, Christian K.; Uchihashi, Takayuki; Sader, John E.; McKendry, Rachel; Jarvis, Suzanne P.
2005-03-01
Frequency modulation atomic force microscopy (FM-AFM) has been modified to operate in a liquid environment within an atomic force microscope specifically designed for investigating biological samples. We demonstrate the applicability of FM-AFM to biological samples using the spectroscopy mode to measure the unbinding forces of a single receptor-ligand (biotin-avidin) interaction. We show that quantitative adhesion force measurements can only be obtained provided certain modifications are made to the existing theory, which is used to convert the detected frequency shifts to an interaction force. Quantitative force measurements revealed that the unbinding forces for the biotin-avidin interaction were greater than those reported in previous studies. This finding was due to the use of high average tip velocities, which were calculated to be two orders of magnitude greater than those typically used in unbinding receptor-ligand experiments. This study therefore highlights the potential use of FM-AFM to study a range of biological systems, including living cells and/or single biomolecule interactions.
NASA Astrophysics Data System (ADS)
D'Andrea, W. J.; Balascio, N. L.; Bradley, R. S.; Bakke, J.; Gjerde, M.; Kaufman, D. S.; Briner, J. P.; von Gunten, L.
2014-12-01
Generating continuous, accurate and quantitative Holocene temperature estimates from the Arctic is an ongoing challenge. In many Arctic regions, tree ring-based approaches cannot be used and lake sediments provide the most valuable repositories for extracting paleotemperature information. Advances in lacustrine alkenone paleothermometry now allow for quantitative reconstruction of lake-water temperature based on the UK37 values of sedimentary alkenones. In addition, a recent study demonstrated the efficacy of non-destructive scanning reflectance spectroscopy in the visible range (VIS-RS) for high-resolution quantitative temperature reconstruction from arctic lake sediments1. In this presentation, I will report a new UK37-based temperature reconstruction and a scanning VIS-RS record (using the RABD660;670 index as a measure of sedimentary chlorin content) from Kulusuk Lake in southeastern Greenland (65.6°N, 37.1°W). The UK37 record reveals a ~3°C increase in summer lake water temperatures between ~10ka and ~7ka followed by sustained warmth until ~4ka and a gradual (~3°C) cooling until ~400 yr BP. The strong correlation between UK37 and RABD660;670 measured in the same sediment core provides further evidence that in arctic lakes where temperature regulates primary productivity, and thereby sedimentary chlorin content, these proxies can be combined to develop high-resolution quantitative temperature records. The Holocene temperature history of Kulusuk Lake determined using this approach corresponds to changes in the size of the glaciers adjacent to the lake, as inferred from sediment minerogenic properties measured with scanning XRF. Glaciers retreated during early Holocene warming, likely disappeared during the period of mid-Holocene warmth, and advanced after 4ka. I will also discuss new UK37 and RABD660;670 reconstructions from northwestern Svalbard and the central Brooks Range of Alaska within the framework of published regional temperature reconstructions and model simulations of Holocene temperature around the Arctic. 1. von Gunten, L., D'Andrea, W.J., Bradley, R.S. and Huang, Y., 2012, Proxy-to-proxy calibration: Increasing the temporal resolution of quantitative climate reconstructions. Scientific Reports, v. 2, 609. doi: 10:1038/srep00609.
Surface pressure measurement by oxygen quenching of luminescence
NASA Technical Reports Server (NTRS)
Gouterman, Martin P. (Inventor); Kavandi, Janet L. (Inventor); Gallery, Jean (Inventor); Callis, James B. (Inventor)
1993-01-01
Methods and compositions for measuring the pressure of an oxygen-containing gas on an aerodynamic surface, by oxygen-quenching of luminescence of molecular sensors is disclosed. Objects are coated with luminescent films containing a first sensor and at least one of two additional sensors, each of the sensors having luminescences that have different dependencies on temperature and oxygen pressure. Methods and compositions are also provided for improving pressure measurements (qualitative or quantitive) on surfaces coated with a film having one or more types of sensor.
Surface pressure measurement by oxygen quenching of luminescence
NASA Technical Reports Server (NTRS)
Gouterman, Martin P. (Inventor); Kavandi, Janet L. (Inventor); Gallery, Jean (Inventor); Callis, James B. (Inventor)
1994-01-01
Methods and compositions for measuring the pressure of an oxygen-containing gas on an aerodynamic surface, by oxygen-quenching of luminescence of molecular sensors is disclosed. Objects are coated with luminescent films containing a first sensor and at least one of two additional sensors, each of the sensors having luminescences that have different dependencies on temperature and oxygen pressure. Methods and compositions are also provided for improving pressure measurements (qualitative or quantitive) on surfaces coated with a film having one or more types of sensor.
Selection and Presentation of Imaging Figures in the Medical Literature
Siontis, George C. M.; Patsopoulos, Nikolaos A.; Vlahos, Antonios P.; Ioannidis, John P. A.
2010-01-01
Background Images are important for conveying information, but there is no empirical evidence on whether imaging figures are properly selected and presented in the published medical literature. We therefore evaluated the selection and presentation of radiological imaging figures in major medical journals. Methodology/Principal Findings We analyzed articles published in 2005 in 12 major general and specialty medical journals that had radiological imaging figures. For each figure, we recorded information on selection, study population, provision of quantitative measurements, color scales and contrast use. Overall, 417 images from 212 articles were analyzed. Any comment/hint on image selection was made in 44 (11%) images (range 0–50% across the 12 journals) and another 37 (9%) (range 0–60%) showed both a normal and abnormal appearance. In 108 images (26%) (range 0–43%) it was unclear whether the image came from the presented study population. Eighty-three images (20%) (range 0–60%) had any quantitative or ordered categorical value on a measure of interest. Information on the distribution of the measure of interest in the study population was given in 59 cases. For 43 images (range 0–40%), a quantitative measurement was provided for the depicted case and the distribution of values in the study population was also available; in those 43 cases there was no over-representation of extreme than average cases (p = 0.37). Significance The selection and presentation of images in the medical literature is often insufficiently documented; quantitative data are sparse and difficult to place in context. PMID:20526360
Assessment of metabolic bone diseases by quantitative computed tomography
NASA Technical Reports Server (NTRS)
Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.
1985-01-01
Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. They then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.
NASA Technical Reports Server (NTRS)
Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)
2000-01-01
Lean direct-injection (LDI) spray flames offer the possibility of reducing NO(sub x) emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially-premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially-resolved LIF measurements of NO concentration (ppm) are reported for preheated, LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q(sub 2)(26.5) transition of the gamma(0,0) band. Detection is performed in a two nanometer region centered on the gamma(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames. Finally, PLIF is expanded to high pressure in an effort to quantify the detected fluorescence image for LDI flames. Success is achieved by correcting the PLIF calibration via a single-point LIF measurement. This procedure removes the influence of any preferential background that occurs in the PLIF detection window. In general, both the LIF and PLIF measurements verify that the LDI strategy could be used to reduce NO(sub x) emissions in future gas turbine combustors.
Annaswamy, Thiru; Mallempati, Srinivas; Allison, Stephen C; Abraham, Lawrence D
2007-05-01
To examine the usefulness of a biomechanical measure, resistance torque (RT), in quantifying spasticity by comparing its use with a clinical scale, the modified Ashworth scale (MAS), and quantitative electrophysiological measures. This is a correlational study of spasticity measurements in 34 adults with traumatic brain injury and plantarflexor spasticity. Plantarflexor spasticity was measured in the seated position before and after cryotherapy using the MAS and also by strapping each subject's foot and ankle to an apparatus that provided a ramp and hold stretch. The quantitative measures were (1) reflex threshold angle (RTA) calculated through electromyographic signals and joint angle traces, (2) Hdorsiflexion (Hdf)/Hcontrol (Hctrl) amplitude ratio obtained through reciprocal inhibition of the soleus H-reflex, (3) Hvibration (Hvib)/Hctrl ratio obtained through vibratory inhibition of the soleus H-reflex, and (4) RT calculated as the time integral of the torque graph between the starting and ending pulses of the stretch. Correlation coefficients between RT and MAS scores in both pre-ice (0.41) and post-ice trials (0.42) were fair (P = 0.001). The correlation coefficients between RT scores and RTA scores in both the pre-ice (0.66) and post-ice trials (0.75) were moderate (P
Characterizing non-photochemical quenching in leaves through fluorescence lifetime snapshots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sylak-Glassman, Emily J.; Zaks, Julia; Amarnath, Kapil
2015-03-12
A technique is described to measure the fluorescence decay profiles of intact leaves during adaptation to high light and subsequent relaxation to dark conditions. We illustrate how to ensure that photosystem II reaction centers are closed and compare data for wild type Arabidopsis thaliana with conventional pulse-amplitude modulated (PAM) fluorescence measurements. Unlike PAM measurements, the lifetime measurements are not sensitive to photobleaching or chloroplast shielding, and the form of the fluorescence decay provides additional information to test quantitative models of excitation dynamics in intact leaves.
Multiple-wavelength spectroscopic quantitation of light-absorbing species in scattering media
Nathel, Howard; Cartland, Harry E.; Colston, Jr., Billy W.; Everett, Matthew J.; Roe, Jeffery N.
2000-01-01
An oxygen concentration measurement system for blood hemoglobin comprises a multiple-wavelength low-coherence optical light source that is coupled by single mode fibers through a splitter and combiner and focused on both a target tissue sample and a reference mirror. Reflections from both the reference mirror and from the depths of the target tissue sample are carried back and mixed to produce interference fringes in the splitter and combiner. The reference mirror is set such that the distance traversed in the reference path is the same as the distance traversed into and back from the target tissue sample at some depth in the sample that will provide light attenuation information that is dependent on the oxygen in blood hemoglobin in the target tissue sample. Two wavelengths of light are used to obtain concentrations. The method can be used to measure total hemoglobin concentration [Hb.sub.deoxy +Hb.sub.oxy ] or total blood volume in tissue and in conjunction with oxygen saturation measurements from pulse oximetry can be used to absolutely quantify oxyhemoglobin [HbO.sub.2 ] in tissue. The apparatus and method provide a general means for absolute quantitation of an absorber dispersed in a highly scattering medium.
Identifying persistent and characteristic features in firearm tool marks on cartridge cases
NASA Astrophysics Data System (ADS)
Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John
2017-12-01
Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.
Lipid Informed Quantitation and Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Crowell, PNNL
2014-07-21
LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (i.e. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to showmore » the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets« less
A fluorescence anisotropy method for measuring protein concentration in complex cell culture media.
Groza, Radu Constantin; Calvet, Amandine; Ryder, Alan G
2014-04-22
The rapid, quantitative analysis of the complex cell culture media used in biopharmaceutical manufacturing is of critical importance. Requirements for cell culture media composition profiling, or changes in specific analyte concentrations (e.g. amino acids in the media or product protein in the bioprocess broth) often necessitate the use of complicated analytical methods and extensive sample handling. Rapid spectroscopic methods like multi-dimensional fluorescence (MDF) spectroscopy have been successfully applied for the routine determination of compositional changes in cell culture media and bioprocess broths. Quantifying macromolecules in cell culture media is a specific challenge as there is a need to implement measurements rapidly on the prepared media. However, the use of standard fluorescence spectroscopy is complicated by the emission overlap from many media components. Here, we demonstrate how combining anisotropy measurements with standard total synchronous fluorescence spectroscopy (TSFS) provides a rapid, accurate quantitation method for cell culture media. Anisotropy provides emission resolution between large and small fluorophores while TSFS provides a robust measurement space. Model cell culture media was prepared using yeastolate (2.5 mg mL(-1)) spiked with bovine serum albumin (0 to 5 mg mL(-1)). Using this method, protein emission is clearly discriminated from background yeastolate emission, allowing for accurate bovine serum albumin (BSA) quantification over a 0.1 to 4.0 mg mL(-1) range with a limit of detection (LOD) of 13.8 μg mL(-1). Copyright © 2014. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Gallo, Emanuela Carolina Angela
Width increased dual-pump enhanced coherent anti-Stokes Raman spectroscopy (WIDECARS) measurements were conducted in a McKenna air-ethylene premixed burner, at nominal equivalence ratio range between 0.55 and 2.50 to provide quantitative measurements of six major combustion species (C2H 4, N2, O2, H2, CO, CO2) concentration and temperature simultaneously. The purpose of this test was to investigate the uncertainties in the experimental and spectral modeling methods in preparation for an subsequent scramjet C2H4/air combustion test at the University of Virginia-Aerospace Research Laboratory. A broadband Pyrromethene (PM) PM597 and PM650 dye laser mixture and optical cavity were studied and optimized to excite the Raman shift of all the target species. Two hundred single shot recorded spectra were processed, theoretically fitted and then compared to computational models, to verify where chemical equilibrium or adiabatic condition occurred, providing experimental flame location and formation, species concentrations, temperature, and heat losses inputs to computational kinetic models. The Stark effect, temperature, and concentration errors are discussed. Subsequently, WIDECARS measurements of a premixed air-ethylene flame were successfully acquired in a direct connect small-scale dual-mode scramjet combustor, at University of Virginia Supersonic Combustion Facility (UVaSCF). A nominal Mach 5 flight condition was simulated (stagnation pressure p0 = 300 kPa, temperature T0 = 1200 K, equivalence ratio range ER = 0.3 -- 0.4). The purpose of this test was to provide quantitative measurements of the six major combustion species concentration and temperature. Point-wise measurements were taken by mapping four two-dimensional orthogonal planes (before, within, and two planes after the cavity flame holder) with respect to the combustor freestream direction. Two hundred single shot recorded spectra were processed and theoretically fitted. Mean flow and standard deviation are provided for each investigated case. Within the flame limits tested, WIDECARS data were analyzed and compared with CFD simulations and OH-PLIF measurements.
Controlling the opto-mechanics of a cantilever in an interferometer via cavity loss
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidsfeld, A. von, E-mail: avonschm@uos.de; Reichling, M., E-mail: reichling@uos.de
2015-09-21
In a non-contact atomic force microscope, based on interferometric cantilever displacement detection, the optical return loss of the system is tunable via the distance between the fiber end and the cantilever. We utilize this for tuning the interferometer from a predominant Michelson to a predominant Fabry-Pérot characteristics and introduce the Fabry-Pérot enhancement factor as a quantitative measure for multibeam interference in the cavity. This experimentally easily accessible and adjustable parameter provides a control of the opto-mechanical interaction between the cavity light field and the cantilever. The quantitative assessment of the light pressure acting on the cantilever oscillating in the cavitymore » via the frequency shift allows an in-situ measurement of the cantilever stiffness with remarkable precision.« less
Total Homocysteine Is Associated With White Matter Hyperintensity Volume
Wright, Clinton B.; Paik, Myunghee C.; Brown, Truman R.; Stabler, Sally P.; Allen, Robert H.; Sacco, Ralph L.; DeCarli, Charles
2005-01-01
Background Total homocysteine (tHcy) has been implicated as a risk factor for stroke and dementia, but the mechanism is unclear. White matter hyperintensities may be a risk factor for both, but studies of the relationship between tHcy and quantitative measures of white matter hyperintensity volume (WMHV) are lacking, especially in minority populations. Methods A community-based sample of 259 subjects with baseline tHcy levels underwent pixel-based quantitative measurement of WMHV. We examined the relationship between tHcy and WMHV adjusting for age, sociodemographics, vascular risk factors, and B12 deficiency. Results Higher levels of tHcy were associated with WMHV adjusting for sociodemographics and vascular risk factors. Conclusions These cross-sectional data provide evidence that tHcy is a risk factor for white matter damage. PMID:15879345
Shin, Mimi; Kaplan, Sam V; Raider, Kayla D; Johnson, Michael A
2015-05-07
Caged compounds have been used extensively to investigate neuronal function in a variety of preparations, including cell culture, ex vivo tissue samples, and in vivo. As a first step toward electrochemically measuring the extent of caged compound photoactivation while also measuring the release of the catecholamine neurotransmitter, dopamine, fast-scan cyclic voltammetry at carbon-fiber microelectrodes (FSCV) was used to electrochemically characterize 4-hydroxyphenylacetic acid (4HPAA) in the absence and presence of dopamine. 4HPAA is a by-product formed during the process of photoactivation of p-hydroxyphenacyl-based caged compounds, such as p-hydroxyphenylglutamate (pHP-Glu). Our data suggest that the oxidation of 4HPAA occurs through the formation of a conjugated species. Moreover, we found that a triangular waveform of -0.4 V to +1.3 V to -0.4 V at 600 V s(-1), repeated every 100 ms, provided an oxidation current of 4HPAA that was enhanced with a limit of detection of 100 nM, while also allowing the detection and quantitation of dopamine within the same scan. Along with quantifying 4HPAA in biological preparations, the results from this work will allow the electrochemical measurement of photoactivation reactions that generate 4HPAA as a by-product as well as provide a framework for measuring the photorelease of electroactive by-products from caged compounds that incorporate other chromophores.
Shin, Mimi; Kaplan, Sam V.; Raider, Kayla D.; Johnson, Michael A.
2015-01-01
Caged compounds have been used extensively to investigate neuronal function in a variety of preparations, including cell culture, ex vivo tissue samples, and in vivo. As a first step toward electrochemically measuring the extent of caged compound photoactivation while also measuring the release of the catecholamine neurotransmitter, dopamine, fast-scan cyclic voltammetry at carbon-fiber microelectrodes (FSCV) was used to electrochemically characterize 4-hydroxyphenylacetic acid (4HPAA) in the absence and presence of dopamine. 4HPAA is a by-product formed during the process of photoactivation of p-hydroxyphenylacyl-based caged compounds, such as p-hydroxyphenylglutamate (pHP-Glu). Our data suggest that the oxidation of 4HPAA occurs through the formation of a conjugated species. Moreover, we found that a triangular waveform of −0.4 V to +1.3 V to −0.4 V at 600 V/s, repeated every 100 ms, provided an oxidation current of 4HPAA that was enhanced with a limit of detection of 100 nM, while also allowing the detection and quantitation of dopamine within the same scan. Along with quantifying 4HPAA in biological preparations, the results from this work will allow the electrochemical measurement of photoactivation reactions that generate 4HPAA as a by-product as well as provide a framework for measuring the photorelease of electroactive by-products from caged compounds that incorporate other chromophores. PMID:25785694
Disrupting the Pipeline: Critical Analyses of Student Pathways through Postsecondary STEM Education
ERIC Educational Resources Information Center
Metcalf, Heather E.
2014-01-01
Critical mixed methods approaches allow us to reflect upon the ways in which we collect, measure, interpret, and analyze data, providing novel alternatives for quantitative analysis. For institutional researchers, whose work influences institutional policies, programs, and practices, the approach has the transformative ability to expose and create…
ERIC Educational Resources Information Center
Franzen, Stefan
2011-01-01
Determination of the solubility limit of a strongly colored organometallic reagent in a mixed-solvent system provides an example of quantitative solubility measurement appropriate to understand polymer, nanoparticle, and other macromolecular aggregation processes. The specific example chosen involves a solution of tris(dibenzylideneacetone)…
Treatment of Childhood Obesity: A Systematic Review
ERIC Educational Resources Information Center
Staniford, Leanne J.; Breckon, Jeff D.; Copeland, Robert J.
2012-01-01
Childhood obesity trends have increased dramatically over the past three decade's. The purpose of this quantitative systematic review is to provide an update of the evidence, illustrating the efficacy of childhood obesity treatment, considering whether treatment fidelity has been measured and/or reported and whether this related to the treatment…
42 CFR 417.564 - Apportionment and allocation of administrative and general costs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... general costs. 417.564 Section 417.564 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT... providing medical care. Enrollment, marketing, and other administrative and general costs that benefit the... benefit of which cannot be quantitatively measured (such as facility costs), the total allowable costs of...
Neoliberalism, Performance Measurement, and the Governance of American Academic Science
ERIC Educational Resources Information Center
Feller, Irwin
2008-01-01
The international thrust of neoliberal liberal policies on higher education systems has generally been to reduce governmental control over the operations of universities in de facto exchange for these institutions assuming increased responsibility for generating a larger share of their revenues and for providing quantitative evidence of…
Hydrocarbon saturation determination using acoustic velocities obtained through casing
Moos, Daniel
2010-03-09
Compressional and shear velocities of earth formations are measured through casing. The determined compressional and shear velocities are used in a two component mixing model to provides improved quantitative values for the solid, the dry frame, and the pore compressibility. These are used in determination of hydrocarbon saturation.
Air quality and composite wood products
Melissa G. D. Baumann
1999-01-01
Research at the USDA Forest Service, Forest Products Laboratory (FPL) is being conducted to identify the compounds emitted from wood products during their manufacture and subsequent use. The FPL researchers are measuring the types and quantities of VOCs that are emitted from particleboard and MDF products to provide quantitative emissions information. This information...
Improving Reading Achievement in High Poverty, Rural Schools
ERIC Educational Resources Information Center
Tufts, Janet
2017-01-01
Purpose: The purpose of this quantitative study was to determine what correlation exists between a student's poverty coefficient and the daily number of minutes provided for independent reading and third grade reading achievement level as measured by California Assessment of Student Performance and Progress (CAASPP) in small, rural school…
Assessing Higher-Order Thinking Using a Networked Portfolio System with Peer Assessment
ERIC Educational Resources Information Center
Liu, Eric Zhi-Feng; Zhuo, Yi-Chin; Yuan, Shyan-Ming
2004-01-01
In the past, the quantitative evidences of portfolio assessment have been explored under online instruction. Liu, Lin, and Yuan provide a long-term measure of peer-self, peer-instructor and self-instructor correlation coefficients under networked innovative assessment procedures. Analytical results indicated that undergraduate students could…
pH & Rate of Enzymatic Reactions.
ERIC Educational Resources Information Center
Clariana, Roy B.
1991-01-01
A quantitative and inexpensive way to measure the rate of enzymatic reaction is provided. The effects of different pH levels on the reaction rate of an enzyme from yeast are investigated and the results graphed. Background information, a list of needed materials, directions for preparing solutions, procedure, and results and discussion are…
Quantitative interpretation of the magnetic susceptibility frequency dependence
NASA Astrophysics Data System (ADS)
Ustra, Andrea; Mendonça, Carlos A.; Leite, Aruã; Jovane, Luigi; Trindade, Ricardo I. F.
2018-05-01
Low-field mass-specific magnetic susceptibility (MS) measurements using multifrequency alternating fields are commonly used to evaluate concentration of ferrimagnetic particles in the transition of superparamagnetic (SP) to stable single domain (SSD). In classical palaeomagnetic analyses, this measurement serves as a preliminary assessment of rock samples providing rapid, non-destructive, economical and easy information of magnetic properties. The SP-SSD transition is relevant in environmental studies because it has been associated with several geological and biogeochemical processes affecting magnetic mineralogy. MS is a complex function of mineral-type and grain-size distribution, as well as measuring parameters such as external field magnitude and frequency. In this work, we propose a new technique to obtain quantitative information on grain-size variations of magnetic particles in the SP-SSD transition by inverting frequency-dependent susceptibility. We introduce a descriptive parameter named as `limiting frequency effect' that provides an accurate estimation of MS loss with frequency. Numerical simulations show the methodology capability in providing data fitting and model parameters in many practical situations. Real-data applications with magnetite nanoparticles and core samples from sediments of Poggio le Guaine section of Umbria-Marche Basin (Italy) provide additional information not clearly recognized when interpreting cruder MS data. Caution is needed when interpreting frequency dependence in terms of single relaxation processes, which are not universally applicable and depend upon the nature of magnetic mineral in the material. Nevertheless, the proposed technique is a promising tool for SP-SSD content analyses.
GOSAT/TANSO-FTS Measurement of Volcanic and Geothermal CO2 Emissions
NASA Astrophysics Data System (ADS)
Schwandner, Florian M.; Carn, Simon A.; Newhall, Christopher G.
2010-05-01
Approximately one tenth of the Earth's human population lives in direct reach of volcanic hazards. Being able to provide sufficiently early and scientifically sound warning is a key to volcanic hazard mitigation. Quantitative time-series monitoring of volcanic CO2 emissions will likely play a key role in such early warning activities in the future. Impending volcanic eruptions or any potentially disastrous activity that involves movement of magma in the subsurface, is often preceded by an early increase of CO2 emissions. Conventionally, volcanic CO2 monitoring is done either in campaigns of soil emission measurements (grid of one-time measuring points) that are labor intensive and slow, or by ground-based remote FTIR measurements in emission plumes. These methods are not easily available at all sites of potential activity and prohibitively costly to employ on a large number of volcanoes. In addition, both of these ground-based approaches pose a significant risk to the workers conducting these measurements. Some aircraft-based measurements have been conducted as well in the past, however these are limited by the usually meager funding situation of individual observatories, the hazard such flights pose to equipment and crew, and by the inaccessibility of parts of the plume due to ash hazards. The core motivation for this study is therefore to develop a method for volcanic CO2 monitoring from space that will provide sufficient coverage, resolution, and data quality for an application to quantitative time series monitoring and correlation with other available datasets, from a safe distance and with potentially global reach. In summary, the purpose of the proposed research is to quantify volcanic CO2 emissions using satellite-borne observations. Quantitative estimates will be useful for warning of impending volcanic eruptions, and assessing the contribution of volcanic CO2 to global GHG. Our approach encompasses method development and testing for the detection of volcanic CO2 anomalies using GOSAT and correlation with Aura/OMI, AIRS, and ASTER determined SO2 fluxes and ground based monitoring of CO2 and other geophysical and geochemical parameters. This will provide the ground work for future higher spatial resolution satellite missions. This is a joint effort from two GOSAT-IBUKI data application projects: "Satellite-Borne Quantification of Carbon Dioxide Emissions from Volcanoes and Geothermal Areas" (PI Schwandner), and "Application of GOSAT/TANSO-FTS to the Measurement of Volcanic CO2 Emissions" (PI Carn).
An information measure for class discrimination. [in remote sensing of crop observation
NASA Technical Reports Server (NTRS)
Shen, S. S.; Badhwar, G. D.
1986-01-01
This article describes a separability measure for class discrimination. This measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure is not dependent on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class does not have simple analytic forms, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
Fluvial Volumes, Timescales, and Intermittency in Milna Crater, Mars
NASA Technical Reports Server (NTRS)
Buhler, P.; Fassett, C. I.; Head, J. W.; Lamb, M. P.
2017-01-01
Ancient lake deposits and valley networks on Mars provide strong evidence that its surface was once modified by liquid water, but the extent of that modification is still debated. Ancient lacustrine deposits in Milna Crater provide insight into the timescale and fluid volume required to construct fluvially derived sedimentary deposits near the Noachian-Hesperian boundary. Placing the lacustrine deposits their regional context in Paraná Valles provides a quantitative measurement of the intermittency of large, water-mediated sediment transport events in that region.
A new way of measuring wiggling pattern in SADP for 3D NAND technology
NASA Astrophysics Data System (ADS)
Mi, Jian; Chen, Ziqi; Tu, Li Ming; Mao, Xiaoming; Liu, Gong Cai; Kawada, Hiroki
2018-03-01
A new metrology method of quantitatively measuring wiggling patterns in a Self-Aligned Double Patterning (SADP) process for 2D NAND technology has been developed with a CD-SEM metrology program on images from a Review-SEM system. The metrology program provided accurate modeling of various wiggling patterns. The Review-SEM system provided a-few-micrometer-wide Field of View (FOV), which exceeds precision-guaranteed FOV of a conventional CD-SEM. The result has been effectively verified by visual inspection on vertically compressed images compared with Wiggling Index from this new method. A best-known method (BKM) system has been developed with connected HW and SW to automatically measure wiggling patterns.
Machine characterization based on an abstract high-level language machine
NASA Technical Reports Server (NTRS)
Saavedra-Barrera, Rafael H.; Smith, Alan Jay; Miya, Eugene
1989-01-01
Measurements are presented for a large number of machines ranging from small workstations to supercomputers. The authors combine these measurements into groups of parameters which relate to specific aspects of the machine implementation, and use these groups to provide overall machine characterizations. The authors also define the concept of pershapes, which represent the level of performance of a machine for different types of computation. A metric based on pershapes is introduced that provides a quantitative way of measuring how similar two machines are in terms of their performance distributions. The metric is related to the extent to which pairs of machines have varying relative performance levels depending on which benchmark is used.
Inferring the Growth of Massive Galaxies Using Bayesian Spectral Synthesis Modeling
NASA Astrophysics Data System (ADS)
Stillman, Coley Michael; Poremba, Megan R.; Moustakas, John
2018-01-01
The most massive galaxies in the universe are typically found at the centers of massive galaxy clusters. Studying these galaxies can provide valuable insight into the hierarchical growth of massive dark matter halos. One of the key challenges of measuring the stellar mass growth of massive galaxies is converting the measured light profiles into stellar mass. We use Prospector, a state-of-the-art Bayesian spectral synthesis modeling code, to infer the total stellar masses of a pilot sample of massive central galaxies selected from the Sloan Digital Sky Survey. We compare our stellar mass estimates to previous measurements, and present some of the quantitative diagnostics provided by Prospector.
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source for quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling as it involves huge manual effort. While automatically extracting such quantitative information from bio-medical text may offer some relief, lack of ontological representation for a subdomain serves as impedance in normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to the domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing ontology for ion channel events using a rule based approach. We have developed Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies such as, Cell Physiology Ontology (CPO), and Cardiac Electro Physiology Ontology (CPEO) and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting the quantitative data assertions system on an independently annotated blind data set. We further made an initial attempt in formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers potential to facilitate the integration of text mining into ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin Mingde; Marshall, Craig T.; Qi, Yi
Purpose: The use of preclinical rodent models of disease continues to grow because these models help elucidate pathogenic mechanisms and provide robust test beds for drug development. Among the major anatomic and physiologic indicators of disease progression and genetic or drug modification of responses are measurements of blood vessel caliber and flow. Moreover, cardiopulmonary blood flow is a critical indicator of gas exchange. Current methods of measuring cardiopulmonary blood flow suffer from some or all of the following limitations--they produce relative values, are limited to global measurements, do not provide vasculature visualization, are not able to measure acute changes, aremore » invasive, or require euthanasia. Methods: In this study, high-spatial and high-temporal resolution x-ray digital subtraction angiography (DSA) was used to obtain vasculature visualization, quantitative blood flow in absolute metrics (ml/min instead of arbitrary units or velocity), and relative blood volume dynamics from discrete regions of interest on a pixel-by-pixel basis (100x100 {mu}m{sup 2}). Results: A series of calibrations linked the DSA flow measurements to standard physiological measurement using thermodilution and Fick's method for cardiac output (CO), which in eight anesthetized Fischer-344 rats was found to be 37.0{+-}5.1 ml/min. Phantom experiments were conducted to calibrate the radiographic density to vessel thickness, allowing a link of DSA cardiac output measurements to cardiopulmonary blood flow measurements in discrete regions of interest. The scaling factor linking relative DSA cardiac output measurements to the Fick's absolute measurements was found to be 18.90xCO{sub DSA}=CO{sub Fick}. Conclusions: This calibrated DSA approach allows repeated simultaneous visualization of vasculature and measurement of blood flow dynamics on a regional level in the living rat.« less
NASA Technical Reports Server (NTRS)
Hasler, A. F.; Desjardins, M.; Shenk, W. E.
1979-01-01
Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.
Quantitative comparison of measurements of urgent care service quality.
Qin, Hong; Prybutok, Victor; Prybutok, Gayle
2016-01-01
Service quality and patient satisfaction are essential to health care organization success. Parasuraman, Zeithaml, and Berry introduced SERVQUAL, a prominent service quality measure not yet applied to urgent care. We develop an instrument to measure perceived service quality and identify the determinants of patient satisfaction/ behavioral intentions. We examine the relationships among perceived service quality, patient satisfaction and behavioral intentions, and demonstrate that urgent care service quality is not equivalent using measures of perceptions only, differences of expectations minus perceptions, ratio of perceptions to expectations, and the log of the ratio. Perceptions provide the best measure of urgent care service quality.
Three-dimensional analysis of alveolar bone resorption by image processing of 3-D dental CT images
NASA Astrophysics Data System (ADS)
Nagao, Jiro; Kitasaka, Takayuki; Mori, Kensaku; Suenaga, Yasuhito; Yamada, Shohzoh; Naitoh, Munetaka
2006-03-01
We have developed a novel system that provides total support for assessment of alveolar bone resorption, caused by periodontitis, based on three-dimensional (3-D) dental CT images. In spite of the difficulty in perceiving the complex 3-D shape of resorption, dentists assessing resorption location and severity have been relying on two-dimensional radiography and probing, which merely provides one-dimensional information (depth) about resorption shape. However, there has been little work on assisting assessment of the disease by 3-D image processing and visualization techniques. This work provides quantitative evaluation results and figures for our system that measures the three-dimensional shape and spread of resorption. It has the following functions: (1) measures the depth of resorption by virtually simulating probing in the 3-D CT images, taking advantage of image processing of not suffering obstruction by teeth on the inter-proximal sides and much smaller measurement intervals than the conventional examination; (2) visualizes the disposition of the depth by movies and graphs; (3) produces a quantitative index and intuitive visual representation of the spread of resorption in the inter-radicular region in terms of area; and (4) calculates the volume of resorption as another severity index in the inter-radicular region and the region outside it. Experimental results in two cases of 3-D dental CT images and a comparison of the results with the clinical examination results and experts' measurements of the corresponding patients confirmed that the proposed system gives satisfying results, including 0.1 to 0.6mm of resorption measurement (probing) error and fairly intuitive presentation of measurement and calculation results.
3D/4D multiscale imaging in acute lymphoblastic leukemia cells: visualizing dynamics of cell death
NASA Astrophysics Data System (ADS)
Sarangapani, Sreelatha; Mohan, Rosmin Elsa; Patil, Ajeetkumar; Lang, Matthew J.; Asundi, Anand
2017-06-01
Quantitative phase detection is a new methodology that provides quantitative information on cellular morphology to monitor the cell status, drug response and toxicity. In this paper the morphological changes in acute leukemia cells treated with chitosan were detected using d'Bioimager a robust imaging system. Quantitative phase image of the cells was obtained with numerical analysis. Results show that the average area and optical volume of the chitosan treated cells is significantly reduced when compared with the control cells, which reveals the effect of chitosan on the cancer cells. From the results it can be attributed that d'Bioimager can be used as a non-invasive imaging alternative to measure the morphological changes of the living cells in real time.
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Three-dimensional label-free imaging and quantification of lipid droplets in live hepatocytes
NASA Astrophysics Data System (ADS)
Kim, Kyoohyun; Lee, Seoeun; Yoon, Jonghee; Heo, Jihan; Choi, Chulhee; Park, Yongkeun
2016-11-01
Lipid droplets (LDs) are subcellular organelles with important roles in lipid storage and metabolism and involved in various diseases including cancer, obesity, and diabetes. Conventional methods, however, have limited ability to provide quantitative information on individual LDs and have limited capability for three-dimensional (3-D) imaging of LDs in live cells especially for fast acquisition of 3-D dynamics. Here, we present an optical method based on 3-D quantitative phase imaging to measure the 3-D structural distribution and biochemical parameters (concentration and dry mass) of individual LDs in live cells without using exogenous labelling agents. The biochemical change of LDs under oleic acid treatment was quantitatively investigated, and 4-D tracking of the fast dynamics of LDs revealed the intracellular transport of LDs in live cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J.S.; Gordon, R.L.; Lessor, D.L.
1980-09-01
The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously plus the experimental implementation of the concepts using a commercial Momarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, a physical insight, and a cookbook procedure for implementation to allow thesemore » results to be of value to both those interested in the microscope theory and its practical usage in the metallography laboratory.« less
Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A
2007-03-27
A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.
NASA Astrophysics Data System (ADS)
Wang, Binbin; Socolofsky, Scott A.; Breier, John A.; Seewald, Jeffrey S.
2016-04-01
This paper reports the results of quantitative imaging using a stereoscopic, high-speed camera system at two natural gas seep sites in the northern Gulf of Mexico during the Gulf Integrated Spill Research G07 cruise in July 2014. The cruise was conducted on the E/V Nautilus using the ROV Hercules for in situ observation of the seeps as surrogates for the behavior of hydrocarbon bubbles in subsea blowouts. The seeps originated between 890 and 1190 m depth in Mississippi Canyon block 118 and Green Canyon block 600. The imaging system provided qualitative assessment of bubble behavior (e.g., breakup and coalescence) and verified the formation of clathrate hydrate skins on all bubbles above 1.3 m altitude. Quantitative image analysis yielded the bubble size distributions, rise velocity, total gas flux, and void fraction, with most measurements conducted from the seafloor to an altitude of 200 m. Bubble size distributions fit well to lognormal distributions, with median bubble sizes between 3 and 4.5 mm. Measurements of rise velocity fluctuated between two ranges: fast-rising bubbles following helical-type trajectories and bubbles rising about 40% slower following a zig-zag pattern. Rise speed was uncorrelated with hydrate formation, and bubbles following both speeds were observed at both sites. Ship-mounted multibeam sonar provided the flare rise heights, which corresponded closely with the boundary of the hydrate stability zone for the measured gas compositions. The evolution of bubble size with height agreed well with mass transfer rates predicted by equations for dirty bubbles.
Bohren, Meghan A.; Vogel, Joshua P.; Hunter, Erin C.; Lutsiv, Olha; Makh, Suprita K.; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T.; Khosla, Rajat; Hindin, Michelle J.; Gülmezoglu, A. Metin
2015-01-01
Background Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. Methods and Findings We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. Conclusions This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions. PMID:26126110
Current issues with standards in the measurement and documentation of human skeletal anatomy.
Magee, Justin; McClelland, Brian; Winder, John
2012-09-01
Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. © 2012 The Authors. Journal of Anatomy © 2012 Anatomical Society.
Current issues with standards in the measurement and documentation of human skeletal anatomy
Magee, Justin; McClelland, Brian; Winder, John
2012-01-01
Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18–65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. PMID:22747678
Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses
Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.
2013-01-01
Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.
Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse
2013-05-01
Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in evaluation clinical trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method of multiphoton images which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows separating in 3D the epidermis and the superficial dermis. The automatic segmentation method and the associated quantitative measurements have been developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goals for epidermis-dermis separation and allows quantitative measurements inside the different skin compartments with sufficient relevance. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to build a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
Brody, Sarah; Anilkumar, Thapasimuthu; Liliensiek, Sara; Last, Julie A; Murphy, Christopher J; Pandit, Abhay
2006-02-01
A fully effective prosthetic heart valve has not yet been developed. A successful tissue-engineered valve prosthetic must contain a scaffold that fully supports valve endothelial cell function. Recently, topographic features of scaffolds have been shown to influence the behavior of a variety of cell types and should be considered in rational scaffold design and fabrication. The basement membrane of the aortic valve endothelium provides important parameters for tissue engineering scaffold design. This study presents a quantitative characterization of the topographic features of the native aortic valve endothelial basement membrane; topographical features were measured, and quantitative data were generated using scanning electron microscopy (SEM), atomic force microscopy (AFM), transmission electron microscopy (TEM), and light microscopy. Optimal conditions for basement membrane isolation were established. Histological, immunohistochemical, and TEM analyses following decellularization confirmed basement membrane integrity. SEM and AFM photomicrographs of isolated basement membrane were captured and quantitatively analyzed. The basement membrane of the aortic valve has a rich, felt-like, 3-D nanoscale topography, consisting of pores, fibers, and elevations. All features measured were in the sub-100 nm range. No statistical difference was found between the fibrosal and ventricular surfaces of the cusp. These data provide a rational starting point for the design of extracellular scaffolds with nanoscale topographic features that mimic those found in the native aortic heart valve basement membrane.
BRODY, SARAH; ANILKUMAR, THAPASIMUTHU; LILIENSIEK, SARA; LAST, JULIE A.; MURPHY, CHRISTOPHER J.; PANDIT, ABHAY
2016-01-01
A fully effective prosthetic heart valve has not yet been developed. A successful tissue-engineered valve prosthetic must contain a scaffold that fully supports valve endothelial cell function. Recently, topographic features of scaffolds have been shown to influence the behavior of a variety of cell types and should be considered in rational scaffold design and fabrication. The basement membrane of the aortic valve endothelium provides important parameters for tissue engineering scaffold design. This study presents a quantitative characterization of the topographic features of the native aortic valve endothelial basement membrane; topographical features were measured, and quantitative data were generated using scanning electron microscopy (SEM), atomic force microscopy (AFM), transmission electron microscopy (TEM), and light microscopy. Optimal conditions for basement membrane isolation were established. Histological, immunohistochemical, and TEM analyses following decellularization confirmed basement membrane integrity. SEM and AFM photomicrographs of isolated basement membrane were captured and quantitatively analyzed. The basement membrane of the aortic valve has a rich, felt-like, 3-D nanoscale topography, consisting of pores, fibers, and elevations. All features measured were in the sub-100 nm range. No statistical difference was found between the fibrosal and ventricular surfaces of the cusp. These data provide a rational starting point for the design of extracellular scaffolds with nanoscale topographic features that mimic those found in the native aortic heart valve basement membrane. PMID:16548699
Aujla, Navneet; Stone, Margaret A; Taub, Nicholas; Davies, Melanie J; Khunti, Kamlesh
2013-12-01
This paper focuses mainly on explanations and lessons from a research-based programme for identifying undiagnosed type 2 diabetes and high risk. In addition to outlining key quantitative findings, we specifically aim to explore reasons for low uptake from the perspective of primary care staff involved. The MY-WAIST study was conducted in UK primary care and included the use of oral glucose tolerance tests (OGTTs) and waist measurement. Qualitative data from interviews with healthcare providers and records of meetings were analysed thematically. The key quantitative finding was low uptake of the assessments offered (8.6% overall, 2.6% in inner-city locations with high South Asian residency). In addition to confirming patient-reported barriers including those associated with OGTTs, qualitative findings highlighted a number of primary care provider barriers, including limited staff capacity. Interviewees suggested that those who attended were typically the 'worried well' rather than those from hard-to-reach groups. Implications discussed include the impact of low uptake on the usefulness of the quantitative data obtained, and lessons relevant to research design. Relevance to current guidance regarding early identification strategies is discussed and the importance of addressing the needs of less accessible groups is highlighted. Copyright © 2013 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ivanova, Bojidarka; Spiteller, Michael
2018-04-01
The problematic that we consider in this paper treats the quantitative correlation model equations between experimental kinetic and thermodynamic parameters of coupled electrospray ionization (ESI) mass spectrometry (MS) or atmospheric pressure chemical ionization (APCI) mass spectrometry with collision induced dissociation mass spectrometry, accounting for the fact that the physical phenomena and mechanisms of ESI- and APCI-ion formation are completely different. There are described forty two fragment reactions of three analytes under independent ESI- and APCI-measurements. The developed new quantitative models allow us to study correlatively the reaction kinetics and thermodynamics using the methods of mass spectrometry, which complementary application with the methods of the quantum chemistry provide 3D structural information of the analytes. Both static and dynamic quantum chemical computations are carried out. The object of analyses are [2,3-dimethyl-4-(4-methyl-benzoyl)-2,3-di-p-tolyl-cyclobutyl]-p-tolyl-methanone (1) and the polycyclic aromatic hydrocarbons derivatives of dibenzoperylen (2) and tetrabenzo [a,c,fg,op]naphthacene (3), respectively. As far as (1) is known to be a product of [2π+2π] cycloaddition reactions of chalcone (1,3-di-p-tolyl-propenone), however producing cyclic derivatives with different stereo selectivity, so that the study provide crucial data about the capability of mass spectrometry to provide determine the stereo selectivity of the analytes. This work also first provides quantitative treatment of the relations '3D molecular/electronic structures'-'quantum chemical diffusion coefficient'-'mass spectrometric diffusion coefficient', thus extending the capability of the mass spectrometry for determination of the exact 3D structure of the analytes using independent measurements and computations of the diffusion coefficients. The determination of the experimental diffusion parameters is carried out within the 'current monitoring method' evaluating the translation diffusion of charged analytes, while the theoretical modelling of MS ions and computations of theoretical diffusion coefficients are based on the Arrhenius type behavior of the charged species under ESI- and APCI-conditions. Although the study provide certain sound considerations for the quantitative relations between the reaction kinetic-thermodynamics and 3D structure of the analytes together with correlations between 3D molecular/electronic structures-quantum chemical diffusion coefficient-mass spectrometric diffusion coefficient, which contribute significantly to the structural analytical chemistry, the results have importance to other areas such as organic synthesis and catalysis as well.
Qualitative versus quantitative methods in psychiatric research.
Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S
2012-01-01
Qualitative studies are gaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interview, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, it is considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitude and beliefs. Animal models are another area that qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can be researcher's own view, they need to be statistically confirmed, quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of interpretative characteristics of qualitative methods as well as experimental dimensions of quantitative methods.
NASA Astrophysics Data System (ADS)
Lamarche, G.; Le Gonidec, Y.; Lucieer, V.; Lurton, X.; Greinert, J.; Dupré, S.; Nau, A.; Heffron, E.; Roche, M.; Ladroit, Y.; Urban, P.
2017-12-01
Detecting liquid, solid or gaseous features in the ocean is generating considerable interest in the geoscience community, because of their potentially high economic values (oil & gas, mining), their significance for environmental management (oil/gas leakage, biodiversity mapping, greenhouse gas monitoring) as well as their potential cultural and traditional values (food, freshwater). Enhancing people's capability to quantify and manage the natural capital present in the ocean water goes hand in hand with the development of marine acoustic technology, as marine echosounders provide the most reliable and technologically advanced means to develop quantitative studies of water column backscatter data. This is not developed to its full capability because (i) of the complexity of the physics involved in relation to the constantly changing marine environment, and (ii) the rapid technological evolution of high resolution multibeam echosounder (MBES) water-column imaging systems. The Water Column Imaging Working Group is working on a series of multibeam echosounder (MBES) water column datasets acquired in a variety of environments, using a range of frequencies, and imaging a number of water-column features such as gas seeps, oil leaks, suspended particulate matter, vegetation and freshwater springs. Access to data from different acoustic frequencies and ocean dynamics enables us to discuss and test multifrequency approaches which is the most promising means to develop a quantitative analysis of the physical properties of acoustic scatterers, providing rigorous cross calibration of the acoustic devices. In addition, high redundancy of multibeam data, such as is available for some datasets, will allow us to develop data processing techniques, leading to quantitative estimates of water column gas seeps. Each of the datasets has supporting ground-truthing data (underwater videos and photos, physical oceanography measurements) which provide information on the origin and chemistry of the seep content. This is of first importance when assessing the physical properties of water column scatterers from backscatter acoustic measurement.
San José Estépar, Raúl; Mendoza, Carlos S.; Hersh, Craig P.; Laird, Nan; Crapo, James D.; Lynch, David A.; Silverman, Edwin K.; Washko, George R.
2013-01-01
Rationale: Emphysema occurs in distinct pathologic patterns, but little is known about the epidemiologic associations of these patterns. Standard quantitative measures of emphysema from computed tomography (CT) do not distinguish between distinct patterns of parenchymal destruction. Objectives: To study the epidemiologic associations of distinct emphysema patterns with measures of lung-related physiology, function, and health care use in smokers. Methods: Using a local histogram-based assessment of lung density, we quantified distinct patterns of low attenuation in 9,313 smokers in the COPDGene Study. To determine if such patterns provide novel insights into chronic obstructive pulmonary disease epidemiology, we tested for their association with measures of physiology, function, and health care use. Measurements and Main Results: Compared with percentage of low-attenuation area less than −950 Hounsfield units (%LAA-950), local histogram-based measures of distinct CT low-attenuation patterns are more predictive of measures of lung function, dyspnea, quality of life, and health care use. These patterns are strongly associated with a wide array of measures of respiratory physiology and function, and most of these associations remain highly significant (P < 0.005) after adjusting for %LAA-950. In smokers without evidence of chronic obstructive pulmonary disease, the mild centrilobular disease pattern is associated with lower FEV1 and worse functional status (P < 0.005). Conclusions: Measures of distinct CT emphysema patterns provide novel information about the relationship between emphysema and key measures of physiology, physical function, and health care use. Measures of mild emphysema in smokers with preserved lung function can be extracted from CT scans and are significantly associated with functional measures. PMID:23980521
Code of Federal Regulations, 2014 CFR
2014-04-01
... regarding a variety of quantitative measurements of their covered trading activities, which vary depending... entity's covered trading activities. c. The quantitative measurements that must be furnished pursuant to... prior to September 30, 2015. e. In addition to the quantitative measurements required in this appendix...
Measuring temperature and field profiles in heat assisted magnetic recording
NASA Astrophysics Data System (ADS)
Hohlfeld, J.; Zheng, X.; Benakli, M.
2015-08-01
We introduce a theoretical and experimental framework that enables quantitative measurements of the temperature and magnetic field profiles governing the thermo-magnetic write process in heat assisted magnetic recording. Since our approach allows the identification of the correct temperature dependence of the magneto-crystalline anisotropy field in the vicinity of the Curie point as well, it provides an unprecedented experimental foundation to assess our understanding of heat assisted magnetic recording.
NASA Astrophysics Data System (ADS)
Peterson, Hannah M.; Hoang, Bang H.; Geller, David; Yang, Rui; Gorlick, Richard; Berger, Jeremy; Tingling, Janet; Roth, Michael; Gill, Jonathon; Roblyer, Darren
2017-12-01
Diffuse optical spectroscopic imaging (DOSI) is an emerging near-infrared imaging technique that noninvasively measures quantitative functional information in thick tissue. This study aimed to assess the feasibility of using DOSI to measure optical contrast from bone sarcomas. These tumors are rare and pose technical and practical challenges for DOSI measurements due to the varied anatomic locations and tissue depths of presentation. Six subjects were enrolled in the study. One subject was unable to be measured due to tissue contact sensitivity. For the five remaining subjects, the signal-to-noise ratio, imaging depth, optical properties, and quantitative tissue concentrations of oxyhemoglobin, deoxyhemoglobin, water, and lipids from tumor and contralateral normal tissues were assessed. Statistical differences between tumor and contralateral normal tissue were found in chromophore concentrations and optical properties for four subjects. Low signal-to-noise was encountered during several subject's measurements, suggesting increased detector sensitivity will help to optimize DOSI for this patient population going forward. This study demonstrates that DOSI is capable of measuring optical properties and obtaining functional information in bone sarcomas. In the future, DOSI may provide a means to stratify treatment groups and monitor chemotherapy response for this disease.
Design and evaluation of a miniature laser speckle imaging device to assess gingival health
Regan, Caitlin; White, Sean M.; Yang, Bruce Y.; Takesh, Thair; Ho, Jessica; Wink, Cherie; Wilder-Smith, Petra; Choi, Bernard
2016-01-01
Abstract. Current methods used to assess gingivitis are qualitative and subjective. We hypothesized that gingival perfusion measurements could provide a quantitative metric of disease severity. We constructed a compact laser speckle imaging (LSI) system that could be mounted in custom-made oral molds. Rigid fixation of the LSI system in the oral cavity enabled measurement of blood flow in the gingiva. In vitro validation performed in controlled flow phantoms demonstrated that the compact LSI system had comparable accuracy and linearity compared to a conventional bench-top LSI setup. In vivo validation demonstrated that the compact LSI system was capable of measuring expected blood flow dynamics during a standard postocclusive reactive hyperemia and that the compact LSI system could be used to measure gingival blood flow repeatedly without significant variation in measured blood flow values (p<0.05). Finally, compact LSI system measurements were collected from the interdental papilla of nine subjects and compared to a clinical assessment of gingival bleeding on probing. A statistically significant correlation (ρ=0.53; p<0.005) was found between these variables, indicating that quantitative gingival perfusion measurements performed using our system may aid in the diagnosis and prognosis of periodontal disease. PMID:27787545
Design and evaluation of a miniature laser speckle imaging device to assess gingival health
NASA Astrophysics Data System (ADS)
Regan, Caitlin; White, Sean M.; Yang, Bruce Y.; Takesh, Thair; Ho, Jessica; Wink, Cherie; Wilder-Smith, Petra; Choi, Bernard
2016-10-01
Current methods used to assess gingivitis are qualitative and subjective. We hypothesized that gingival perfusion measurements could provide a quantitative metric of disease severity. We constructed a compact laser speckle imaging (LSI) system that could be mounted in custom-made oral molds. Rigid fixation of the LSI system in the oral cavity enabled measurement of blood flow in the gingiva. In vitro validation performed in controlled flow phantoms demonstrated that the compact LSI system had comparable accuracy and linearity compared to a conventional bench-top LSI setup. In vivo validation demonstrated that the compact LSI system was capable of measuring expected blood flow dynamics during a standard postocclusive reactive hyperemia and that the compact LSI system could be used to measure gingival blood flow repeatedly without significant variation in measured blood flow values (p<0.05). Finally, compact LSI system measurements were collected from the interdental papilla of nine subjects and compared to a clinical assessment of gingival bleeding on probing. A statistically significant correlation (ρ=0.53 p<0.005) was found between these variables, indicating that quantitative gingival perfusion measurements performed using our system may aid in the diagnosis and prognosis of periodontal disease.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Damao; Wang, Zhien; Heymsfield, Andrew J.
Measurement of ice number concentration in clouds is important but still challenging. Stratiform mixed-phase clouds (SMCs) provide a simple scenario for retrieving ice number concentration from remote sensing measurements. The simple ice generation and growth pattern in SMCs offers opportunities to use cloud radar reflectivity (Ze) measurements and other cloud properties to infer ice number concentration quantitatively. To understand the strong temperature dependency of ice habit and growth rate quantitatively, we develop a 1-D ice growth model to calculate the ice diffusional growth along its falling trajectory in SMCs. The radar reflectivity and fall velocity profiles of ice crystals calculatedmore » from the 1-D ice growth model are evaluated with the Atmospheric Radiation Measurements (ARM) Climate Research Facility (ACRF) ground-based high vertical resolution radar measurements. Combining Ze measurements and 1-D ice growth model simulations, we develop a method to retrieve the ice number concentrations in SMCs at given cloud top temperature (CTT) and liquid water path (LWP). The retrieved ice concentrations in SMCs are evaluated with in situ measurements and with a three-dimensional cloud-resolving model simulation with a bin microphysical scheme. These comparisons show that the retrieved ice number concentrations are within an uncertainty of a factor of 2, statistically.« less
A unified material decomposition framework for quantitative dual- and triple-energy CT imaging.
Zhao, Wei; Vernekohl, Don; Han, Fei; Han, Bin; Peng, Hao; Yang, Yong; Xing, Lei; Min, James K
2018-04-21
Many clinical applications depend critically on the accurate differentiation and classification of different types of materials in patient anatomy. This work introduces a unified framework for accurate nonlinear material decomposition and applies it, for the first time, in the concept of triple-energy CT (TECT) for enhanced material differentiation and classification as well as dual-energy CT (DECT). We express polychromatic projection into a linear combination of line integrals of material-selective images. The material decomposition is then turned into a problem of minimizing the least-squares difference between measured and estimated CT projections. The optimization problem is solved iteratively by updating the line integrals. The proposed technique is evaluated by using several numerical phantom measurements under different scanning protocols. The triple-energy data acquisition is implemented at the scales of micro-CT and clinical CT imaging with commercial "TwinBeam" dual-source DECT configuration and a fast kV switching DECT configuration. Material decomposition and quantitative comparison with a photon counting detector and with the presence of a bow-tie filter are also performed. The proposed method provides quantitative material- and energy-selective images examining realistic configurations for both DECT and TECT measurements. Compared to the polychromatic kV CT images, virtual monochromatic images show superior image quality. For the mouse phantom, quantitative measurements show that the differences between gadodiamide and iodine concentrations obtained using TECT and idealized photon counting CT (PCCT) are smaller than 8 and 1 mg/mL, respectively. TECT outperforms DECT for multicontrast CT imaging and is robust with respect to spectrum estimation. For the thorax phantom, the differences between the concentrations of the contrast map and the corresponding true reference values are smaller than 7 mg/mL for all of the realistic configurations. A unified framework for both DECT and TECT imaging has been established for the accurate extraction of material compositions using currently available commercial DECT configurations. The novel technique is promising to provide an urgently needed solution for several CT-based diagnostic and therapy applications, especially for the diagnosis of cardiovascular and abdominal diseases where multicontrast imaging is involved. © 2018 American Association of Physicists in Medicine.
QDMR: a quantitative method for identification of differentially methylated regions by entropy
Zhang, Yan; Liu, Hongbo; Lv, Jie; Xiao, Xue; Zhu, Jiang; Liu, Xiaojuan; Su, Jianzhong; Li, Xia; Wu, Qiong; Wang, Fang; Cui, Ying
2011-01-01
DNA methylation plays critical roles in transcriptional regulation and chromatin remodeling. Differentially methylated regions (DMRs) have important implications for development, aging and diseases. Therefore, genome-wide mapping of DMRs across various temporal and spatial methylomes is important in revealing the impact of epigenetic modifications on heritable phenotypic variation. We present a quantitative approach, quantitative differentially methylated regions (QDMRs), to quantify methylation difference and identify DMRs from genome-wide methylation profiles by adapting Shannon entropy. QDMR was applied to synthetic methylation patterns and methylation profiles detected by methylated DNA immunoprecipitation microarray (MeDIP-chip) in human tissues/cells. This approach can give a reasonable quantitative measure of methylation difference across multiple samples. Then DMR threshold was determined from methylation probability model. Using this threshold, QDMR identified 10 651 tissue DMRs which are related to the genes enriched for cell differentiation, including 4740 DMRs not identified by the method developed by Rakyan et al. QDMR can also measure the sample specificity of each DMR. Finally, the application to methylation profiles detected by reduced representation bisulphite sequencing (RRBS) in mouse showed the platform-free and species-free nature of QDMR. This approach provides an effective tool for the high-throughput identification of potential functional regions involved in epigenetic regulation. PMID:21306990
Probing lipid membrane electrostatics
NASA Astrophysics Data System (ADS)
Yang, Yi
The electrostatic properties of lipid bilayer membranes play a significant role in many biological processes. Atomic force microscopy (AFM) is highly sensitive to membrane surface potential in electrolyte solutions. With fully characterized probe tips, AFM can perform quantitative electrostatic analysis of lipid membranes. Electrostatic interactions between Silicon nitride probes and supported zwitterionic dioleoylphosphatidylcholine (DOPC) bilayer with a variable fraction of anionic dioleoylphosphatidylserine (DOPS) were measured by AFM. Classical Gouy-Chapman theory was used to model the membrane electrostatics. The nonlinear Poisson-Boltzmann equation was numerically solved with finite element method to provide the potential distribution around the AFM tips. Theoretical tip-sample electrostatic interactions were calculated with the surface integral of both Maxwell and osmotic stress tensors on tip surface. The measured forces were interpreted with theoretical forces and the resulting surface charge densities of the membrane surfaces were in quantitative agreement with the Gouy-Chapman-Stern model of membrane charge regulation. It was demonstrated that the AFM can quantitatively detect membrane surface potential at a separation of several screening lengths, and that the AFM probe only perturbs the membrane surface potential by <2%. One important application of this technique is to estimate the dipole density of lipid membrane. Electrostatic analysis of DOPC lipid bilayers with the AFM reveals a repulsive force between the negatively charged probe tips and the zwitterionic lipid bilayers. This unexpected interaction has been analyzed quantitatively to reveal that the repulsion is due to a weak external field created by the internai membrane dipole moment. The analysis yields a dipole moment of 1.5 Debye per lipid with a dipole potential of +275 mV for supported DOPC membranes. This new ability to quantitatively measure the membrane dipole density in a noninvasive manner will be useful in identifying the biological effects of the dipole potential. Finally, heterogeneous model membranes were studied with fluid electric force microscopy (FEFM). Electrostatic mapping was demonstrated with 50 nm resolution. The capabilities of quantitative electrostatic measurement and lateral charge density mapping make AFM a unique and powerful probe of membrane electrostatics.
Sedimentation in mountain streams: A review of methods of measurement
Hedrick, Lara B.; Anderson, James T.; Welsh, Stuart A.; Lin, Lian-Shin
2013-01-01
The goal of this review paper is to provide a list of methods and devices used to measure sediment accumulation in wadeable streams dominated by cobble and gravel substrate. Quantitative measures of stream sedimentation are useful to monitor and study anthropogenic impacts on stream biota, and stream sedimentation is measurable with multiple sampling methods. Evaluation of sedimentation can be made by measuring the concentration of suspended sediment, or turbidity, and by determining the amount of deposited sediment, or sedimentation on the streambed. Measurements of deposited sediments are more time consuming and labor intensive than measurements of suspended sediments. Traditional techniques for characterizing sediment composition in streams include core sampling, the shovel method, visual estimation along transects, and sediment traps. This paper provides a comprehensive review of methodology, devices that can be used, and techniques for processing and analyzing samples collected to aid researchers in choosing study design and equipment.
NASA Astrophysics Data System (ADS)
Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.
2015-03-01
Optical coherence elastography (OCE) is an emerging low-coherence imaging technique that provides noninvasive assessment of tissue biomechanics with high spatial resolution. Among various OCE methods, the capability of quantitative measurement of tissue elasticity is of great importance for tissue characterization and pathology detection across different samples. Here we report a quantitative OCE technique, termed quantitative shear wave imaging optical coherence tomography (Q-SWI-OCT), which enables noncontact measurement of tissue Young's modulus based on the ultra-fast imaging of the shear wave propagation inside the sample. A focused air-puff device is used to interrogate the tissue with a low-pressure short-duration air stream that stimulates a localized displacement with the scale at micron level. The propagation of this tissue deformation in the form of shear wave is captured by a phase-sensitive OCT system running with the scan of the M-mode imaging over the path of the wave propagation. The temporal characteristics of the shear wave is quantified based on the cross-correlation of the tissue deformation profiles at all the measurement locations, and linear regression is utilized to fit the data plotted in the domain of time delay versus wave propagation distance. The wave group velocity is thus calculated, which results in the quantitative measurement of the Young's modulus. As the feasibility demonstration, experiments are performed on tissuemimicking phantoms with different agar concentrations and the quantified elasticity values with Q-SWI-OCT agree well with the uniaxial compression tests. For functional characterization of myocardium with this OCE technique, we perform our pilot experiments on ex vivo mouse cardiac muscle tissues with two studies, including 1) elasticity difference of cardiac muscle under relaxation and contract conditions and 2) mechanical heterogeneity of the heart introduced by the muscle fiber orientation. Our results suggest the potential of using Q-SWI-OCT as an essential tool for nondestructive biomechanical evaluation of myocardium.
Measuring HIV-related stigma among healthcare providers: a systematic review.
Alexandra Marshall, S; Brewington, Krista M; Kathryn Allison, M; Haynes, Tiffany F; Zaller, Nickolas D
2017-11-01
In the United States, HIV-related stigma in the healthcare setting is known to affect the utilization of prevention and treatment services. Multiple HIV/AIDS stigma scales have been developed to assess the attitudes and behaviors of the general population in the U.S. towards people living with HIV/AIDS, but fewer scales have been developed to assess HIV-related stigma among healthcare providers. This systematic review aimed to identify and evaluate the measurement tools used to assess HIV stigma among healthcare providers in the U.S. The five studies selected quantitatively assessed the perceived HIV stigma among healthcare providers from the patient or provider perspective, included HIV stigma as a primary outcome, and were conducted in the U.S. These five studies used adapted forms of four HIV stigma scales. No standardized measure was identified. Assessment of HIV stigma among providers is valuable to better understand how this phenomenon may impact health outcomes and to inform interventions aiming to improve healthcare delivery and utilization.
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2013-03-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous ``slab'' model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.
He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan
2009-07-01
Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
Close, D.A.; Franks, L.A.; Kocimski, S.M.
1984-08-16
An invention is described that enables the quantitative simultaneous identification of the matrix materials in which fertile and fissile nuclides are embedded to be made along with the quantitative assay of the fertile and fissile materials. The invention also enables corrections for any absorption of neutrons by the matrix materials and by the measurement apparatus by the measurement of the prompt and delayed neutron flux emerging from a sample after the sample is interrogated by simultaneously applied neutrons and gamma radiation. High energy electrons are directed at a first target to produce gamma radiation. A second target receives the resulting pulsed gamma radiation and produces neutrons from the interaction with the gamma radiation. These neutrons are slowed by a moderator surrounding the sample and bathe the sample uniformly, generating second gamma radiation in the interaction. The gamma radiation is then resolved and quantitatively detected, providing a spectroscopic signature of the constituent elements contained in the matrix and in the materials within the vicinity of the sample. (LEW)
Heymsfield, S. B.; Peterson, C. M.; Thomas, D. M.; Heo, M.; Schuna, J. M.
2016-01-01
Summary Body mass index (BMI) is now the most widely used measure of adiposity on a global scale. Nevertheless, intense discussion centers on the appropriateness of BMI as a phenotypic marker of adiposity across populations differing in race and ethnicity. BMI-adiposity relations appear to vary significantly across race/ethnic groups, but a collective critical analysis of these effects establishing their magnitude and underlying body shape/composition basis is lacking. Accordingly, we systematically review the magnitude of these race-ethnic differences across non-Hispanic (NH) white, NH black and Mexican American adults, their anatomic body composition basis and potential biologically linked mechanisms, using both earlier publications and new analyses from the US National Health and Nutrition Examination Survey. Our collective observations provide a new framework for critically evaluating the quantitative relations between BMI and adiposity across groups differing in race and ethnicity; reveal new insights into BMI as a measure of adiposity across the adult age-span; identify knowledge gaps that can form the basis of future research and create a quantitative foundation for developing BMI-related public health recommendations. PMID:26663309
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun
2013-11-21
We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution and variation trend of the air field surrounding heat sink during the heat dissipation process are numerically reconstructed based on double-exposure holographic interferometry. According to the phase unwrapping algorithm and the derived relationship between temperature and phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with a reasonably high measurement accuracy. And then the impact of heat sink's channel width on the heat dissipationmore » performance in the case of natural convection is analyzed. In addition, a comparison between simulation and experiment results is given to verify the reliability of this method. The experiment results certify the feasibility and validity of the presented method in full-field, dynamical, and quantitative measurement of the air field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.« less
Hurley, Samuel A.; Samsonov, Alexey A.; Adluru, Nagesh; Hosseinbor, Ameer Pasha; Mossahebi, Pouria; Tromp, Do P.M.; Zakszewski, Elizabeth; Field, Aaron S.
2011-01-01
Abstract The image contrast in magnetic resonance imaging (MRI) is highly sensitive to several mechanisms that are modulated by the properties of the tissue environment. The degree and type of contrast weighting may be viewed as image filters that accentuate specific tissue properties. Maps of quantitative measures of these mechanisms, akin to microstructural/environmental-specific tissue stains, may be generated to characterize the MRI and physiological properties of biological tissues. In this article, three quantitative MRI (qMRI) methods for characterizing white matter (WM) microstructural properties are reviewed. All of these measures measure complementary aspects of how water interacts with the tissue environment. Diffusion MRI, including diffusion tensor imaging, characterizes the diffusion of water in the tissues and is sensitive to the microstructural density, spacing, and orientational organization of tissue membranes, including myelin. Magnetization transfer imaging characterizes the amount and degree of magnetization exchange between free water and macromolecules like proteins found in the myelin bilayers. Relaxometry measures the MRI relaxation constants T1 and T2, which in WM have a component associated with the water trapped in the myelin bilayers. The conduction of signals between distant brain regions occurs primarily through myelinated WM tracts; thus, these methods are potential indicators of pathology and structural connectivity in the brain. This article provides an overview of the qMRI stain mechanisms, acquisition and analysis strategies, and applications for these qMRI stains. PMID:22432902
2018-01-01
This study aimed to assess and validate the repeatability and agreement of quantitative elastography of novel shear wave methods on four individual tissue-mimicking liver fibrosis phantoms with different known Young’s modulus. We used GE Logiq E9 2D-SWE, Philips iU22 ARFI (pSWE), Samsung TS80A SWE (pSWE), Hitachi Ascendus (SWM) and Transient Elastography (TE). Two individual investigators performed all measurements non-continued and in parallel. The methods were evaluated for inter- and intraobserver variability by intraclass correlation, coefficient of variation and limits of agreement using the median elastography value. All systems used in this study provided high repeatability in quantitative measurements in a liver fibrosis phantom and excellent inter- and intraclass correlations. All four elastography platforms showed excellent intra-and interobserver agreement (interclass correlation 0.981–1.000 and intraclass correlation 0.987–1.000) and no significant difference in mean elasticity measurements for all systems, except for TE on phantom 4. All four liver fibrosis phantoms could be differentiated by quantitative elastography, by all platforms (p<0.001). In the Bland-Altman analysis the differences in measurements were larger for the phantoms with higher Young’s modulus. All platforms had a coefficient of variation in the range 0.00–0.21 for all four phantoms, equivalent to low variance and high repeatability. PMID:29293527
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
Effects of performance measure implementation on clinical manager and provider motivation.
Damschroder, Laura J; Robinson, Claire H; Francis, Joseph; Bentley, Douglas R; Krein, Sarah L; Rosland, Ann-Marie; Hofer, Timothy P; Kerr, Eve A
2014-12-01
Clinical performance measurement has been a key element of efforts to transform the Veterans Health Administration (VHA). However, there are a number of signs that current performance measurement systems used within and outside the VHA may be reaching the point of maximum benefit to care and in some settings, may be resulting in negative consequences to care, including overtreatment and diminished attention to patient needs and preferences. Our research group has been involved in a long-standing partnership with the office responsible for clinical performance measurement in the VHA to understand and develop potential strategies to mitigate the unintended consequences of measurement. Our aim was to understand how the implementation of diabetes performance measures (PMs) influences management actions and day-to-day clinical practice. This is a mixed methods study design based on quantitative administrative data to select study facilities and quantitative data from semi-structured interviews. Sixty-two network-level and facility-level executives, managers, front-line providers and staff participated in the study. Qualitative content analyses were guided by a team-based consensus approach using verbatim interview transcripts. A published interpretive motivation theory framework is used to describe potential contributions of local implementation strategies to unintended consequences of PMs. Implementation strategies used by management affect providers' response to PMs, which in turn potentially undermines provision of high-quality patient-centered care. These include: 1) feedback reports to providers that are dissociated from a realistic capability to address performance gaps; 2) evaluative criteria set by managers that are at odds with patient-centered care; and 3) pressure created by managers' narrow focus on gaps in PMs that is viewed as more punitive than motivating. Next steps include working with VHA leaders to develop and test implementation approaches to help ensure that the next generation of PMs motivate truly patient-centered care and are clinically meaningful.
Student Experiments on the Effects of Dam Removal on the Elwha River
NASA Astrophysics Data System (ADS)
Sandland, T. O.; Grack Nelson, A. L.
2006-12-01
The National Center for Earth Surface Dynamics (NCED) is an NSF funded Science and Technology Center devoted to developing a quantitative, predictive science of the ecological and physical processes that define and shape rivers and river networks. The Science Museum of Minnesota's (SMM) Earthscapes River Restoration classes provide k-12 students, teachers, and the public opportunities to explore NCED concepts and, like NCED scientists, move from a qualitative to a quantitative-based understanding of river systems. During a series of classes, students work with an experimental model of the Elwha River in Washington State to gain an understanding of the processes that define and shape river systems. Currently, two large dams on the Elwha are scheduled for removal to restore salmon habitat. Students design different dam removal scenarios to test and make qualitative observations describing and comparing how the modeled system evolves over time. In a following session, after discussing the ambiguity of the previous session's qualitative data, student research teams conduct a quantitative experiment to collect detailed measurements of the system. Finally, students interpret, critique, and compare the data the groups collected and ultimately develop and advocate a recommendation for the "ideal" dam removal scenario. SMM is currently conducting a formative evaluation of River Restoration classes to improve their educational effectiveness and guide development of an educator's manual. As of August 2006, pre- and post-surveys have been administered to 167 students to gauge student learning and engagement. The surveys have found the program successful in teaching students why scientists use river models and what processes and phenomena are at work in river systems. Most notable is the increase in student awareness of sediment in river systems. A post-visit survey was also administered to 20 teachers who used the models in their classrooms. This survey provided feedback about teachers' experience with the program and will help inform the development of a future educator's manual. All teachers found the program to be effective at providing opportunities for students to make qualitative observations and most (95%) found the program effective at providing students opportunities to make quantitative measurements. A full summary of evaluation results will be shared at the meeting.
Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.
Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F
2015-02-01
The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
A Quantitative Description of a Teacher Preparation Program. Educational Curriculum and Instruction.
ERIC Educational Resources Information Center
Denton, Jon J.; Morris, Geneva W.
A research program has been initiated at Texas A&M University College of Education to collect, analyze, and interpret data from a diagnostic prescriptive teacher preparation program. The project was undertaken to create a substantial data file containing multiple measures of learner attainment information that will provide alternatives for…
Meeting the Highly Qualified Teachers Challenge
ERIC Educational Resources Information Center
Toh, Kok-Aun; Ho, Boon-Tiong; Riley, Joseph P.; Hoh, Yin-Kiong
2006-01-01
The moulding of the future of a nation depends on the teachers and the education they provide in schools. Research evidence from the US. National Assessment of Educational Progress (NAEP) confirms this to be the case. Quantitative analysis of data indicates that measures associated with pre-service teacher preparation are by far the strongest…
Landscape pattern metrics and regional assessment
Robert V. O' Neill; Kurt H. Riitters; J.D. Wickham; Bruce K. Jones
1999-01-01
The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop interpret quantitative measures of spatial patter-the landscape indices. This article reviews what is known about...
Bakeout Chamber Within Vacuum Chamber
NASA Technical Reports Server (NTRS)
Taylor, Daniel M.; Soules, David M.; Barengoltz, Jack B.
1995-01-01
Vacuum-bakeout apparatus for decontaminating and measuring outgassing from pieces of equipment constructed by mounting bakeout chamber within conventional vacuum chamber. Upgrade cost effective: fabrication and installation of bakeout chamber simple, installation performed quickly and without major changes in older vacuum chamber, and provides quantitative data on outgassing from pieces of equipment placed in bakeout chamber.
The Fathering Indicators Framework: A Tool for Quantitative and Qualitative Analysis.
ERIC Educational Resources Information Center
Gadsden, Vivian, Ed.; Fagan, Jay, Ed.; Ray, Aisha, Ed.; Davis, James Earl, Ed.
The Fathering Indicators Framework (FIF) is an evaluation tool designed to help researchers, practitioners, and policymakers conceptualize, examine, and measure change in fathering behaviors in relation to child and family well-being. This report provides a detailed overview of the research and theory informing the development of the FIF. The FIF…
Risk Assessment: Evidence Base
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy A.
2007-01-01
Human systems PRA (Probabilistic Risk Assessment: a) Provides quantitative measures of probability, consequence, and uncertainty; and b) Communicates risk and informs decision-making. Human health risks rated highest in ISS PRA are based on 1997 assessment of clinical events in analog operational settings. Much work remains to analyze remaining human health risks identified in Bioastronautics Roadmap.
Delay Discounting: I'm a "K", You're a "K"
ERIC Educational Resources Information Center
Odum, Amy L.
2011-01-01
Delay discounting is the decline in the present value of a reward with delay to its receipt. Across a variety of species, populations, and reward types, value declines hyperbolically with delay. Value declines steeply with shorter delays, but more shallowly with longer delays. Quantitative modeling provides precise measures to characterize the…
ERIC Educational Resources Information Center
Calgary Univ. (Alberta).
This report describes a pilot energy conservation project in Grande Prairie (Alberta) School District No. 2357. Extensive data collection and analysis were undertaken to provide a sound, quantitative basis for evaluation of the program. Energy conserving measures requiring capital outlays were not considered. During the project, electric demand…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-09
... asserted that quantitative and measurable analyses similar to those for minorities and women were needed to... provided few details about the design or operation of the California State program, and that, consequently... recommended that the contractor be required to invite voluntary self- identification at both the pre- and post...
How Safe Is a School? An Exploratory Study Comparing Measures and Perceptions of Safety
ERIC Educational Resources Information Center
Hernandez, Diley; Floden, Lysbeth; Bosworth, Kris
2010-01-01
This exploratory study investigates the relation between incident reports to local law enforcement, and students' and teachers' perceptions of school safety. Using a combination of grounded theory and statistics, we compared quantitative data collected from law enforcement agencies with qualitative data provided by students and teachers during…
Mamou, Jonathan; Wa, Christianne A; Yee, Kenneth M P; Silverman, Ronald H; Ketterling, Jeffrey A; Sadun, Alfredo A; Sebag, J
2015-01-22
Clinical evaluation of floaters lacks quantitative assessment of vitreous structure. This study used quantitative ultrasound (QUS) to measure vitreous opacities. Since floaters reduce contrast sensitivity (CS) and quality of life (Visual Function Questionnaire [VFQ]), it is hypothesized that QUS will correlate with CS and VFQ in patients with floaters. Twenty-two eyes (22 subjects; age = 57 ± 19 years) with floaters were evaluated with Freiburg acuity contrast testing (FrACT; %Weber) and VFQ. Ultrasonography used a customized probe (15-MHz center frequency, 20-mm focal length, 7-mm aperture) with longitudinal and transverse scans taken in primary gaze and a horizontal longitudinal scan through premacular vitreous in temporal gaze. Each scan set had 100 frames of log-compressed envelope data. Within each frame, two regions of interest (ROIs) were analyzed (whole-central and posterior vitreous) to yield three parameters (energy, E; mean amplitude, M; and percentage of vitreous filled by echodensities, P50) averaged over the entire 100-frame dataset. Statistical analyses evaluated E, M, and P50 correlations with CS and VFQ. Contrast sensitivity ranged from 1.19%W (normal) to 5.59%W. All QUS parameters in two scan positions within the whole-central ROI correlated with CS (R > 0.67, P < 0.001). P50 in the nasal longitudinal position had R = 0.867 (P < 0.001). Correlations with VFQ ranged from R = 0.52 (P < 0.013) to R = 0.65 (P < 0.001). Quantitative ultrasound provides quantitative measures of vitreous echodensity that correlate with CS and VFQ, providing objective assessment of vitreous structure underlying the functional disturbances induced by floaters, useful to quantify vitreous disease severity and the response to therapy. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.
Mamou, Jonathan; Wa, Christianne A.; Yee, Kenneth M. P.; Silverman, Ronald H.; Ketterling, Jeffrey A.; Sadun, Alfredo A.; Sebag, J.
2015-01-01
Purpose. Clinical evaluation of floaters lacks quantitative assessment of vitreous structure. This study used quantitative ultrasound (QUS) to measure vitreous opacities. Since floaters reduce contrast sensitivity (CS) and quality of life (Visual Function Questionnaire [VFQ]), it is hypothesized that QUS will correlate with CS and VFQ in patients with floaters. Methods. Twenty-two eyes (22 subjects; age = 57 ± 19 years) with floaters were evaluated with Freiburg acuity contrast testing (FrACT; %Weber) and VFQ. Ultrasonography used a customized probe (15-MHz center frequency, 20-mm focal length, 7-mm aperture) with longitudinal and transverse scans taken in primary gaze and a horizontal longitudinal scan through premacular vitreous in temporal gaze. Each scan set had 100 frames of log-compressed envelope data. Within each frame, two regions of interest (ROIs) were analyzed (whole-central and posterior vitreous) to yield three parameters (energy, E; mean amplitude, M; and percentage of vitreous filled by echodensities, P50) averaged over the entire 100-frame dataset. Statistical analyses evaluated E, M, and P50 correlations with CS and VFQ. Results. Contrast sensitivity ranged from 1.19%W (normal) to 5.59%W. All QUS parameters in two scan positions within the whole-central ROI correlated with CS (R > 0.67, P < 0.001). P50 in the nasal longitudinal position had R = 0.867 (P < 0.001). Correlations with VFQ ranged from R = 0.52 (P < 0.013) to R = 0.65 (P < 0.001). Conclusions. Quantitative ultrasound provides quantitative measures of vitreous echodensity that correlate with CS and VFQ, providing objective assessment of vitreous structure underlying the functional disturbances induced by floaters, useful to quantify vitreous disease severity and the response to therapy. PMID:25613948
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972
Cocrystals to facilitate delivery of poorly soluble compounds beyond-rule-of-5.
Kuminek, Gislaine; Cao, Fengjuan; Bahia de Oliveira da Rocha, Alanny; Gonçalves Cardoso, Simone; Rodríguez-Hornedo, Naír
2016-06-01
Besides enhancing aqueous solubilities, cocrystals have the ability to fine-tune solubility advantage over drug, supersaturation index, and bioavailability. This review presents important facts about cocrystals that set them apart from other solid-state forms of drugs, and a quantitative set of rules for the selection of additives and solution/formulation conditions that predict cocrystal solubility, supersaturation index, and transition points. Cocrystal eutectic constants are shown to be the most important cocrystal property that can be measured once a cocrystal is discovered, and simple relationships are presented that allow for prediction of cocrystal behavior as a function of pH and drug solubilizing agents. Cocrystal eutectic constant is a stability or supersatuation index that: (a) reflects how close or far from equilibrium a cocrystal is, (b) establishes transition points, and (c) provides a quantitative scale of cocrystal true solubility changes over drug. The benefit of this strategy is that a single measurement, that requires little material and time, provides a principled basis to tailor cocrystal supersaturation index by the rational selection of cocrystal formulation, dissolution, and processing conditions. Copyright © 2016 Elsevier B.V. All rights reserved.
Through thick and thin: quantitative classification of photometric observing conditions on Paranal
NASA Astrophysics Data System (ADS)
Kerber, Florian; Querel, Richard R.; Neureiter, Bianca; Hanuschik, Reinhard
2016-07-01
A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer is used to monitor sky conditions over ESO's Paranal observatory. It provides measurements of precipitable water vapour (PWV) at 183 GHz, which are being used in Service Mode for scheduling observations that can take advantage of favourable conditions for infrared (IR) observations. The instrument also contains an IR camera measuring sky brightness temperature at 10.5 μm. It is capable of detecting cold and thin, even sub-visual, cirrus clouds. We present a diagnostic diagram that, based on a sophisticated time series analysis of these IR sky brightness data, allows for the automatic and quantitative classification of photometric observing conditions over Paranal. The method is highly sensitive to the presence of even very thin clouds but robust against other causes of sky brightness variations. The diagram has been validated across the complete range of conditions that occur over Paranal and we find that the automated process provides correct classification at the 95% level. We plan to develop our method into an operational tool for routine use in support of ESO Science Operations.
Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie
2012-06-01
Soil loss prediction models such as universal soil loss equation (USLE) and its revised universal soil loss equation (RUSLE) are the useful tools for risk assessment of soil erosion and planning of soil conservation at regional scale. To make a rational estimation of vegetation cover and management factor, the most important parameters in USLE or RUSLE, is particularly important for the accurate prediction of soil erosion. The traditional estimation based on field survey and measurement is time-consuming, laborious, and costly, and cannot rapidly extract the vegetation cover and management factor at macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of vegetation cover and management factor over broad geographic areas. This paper summarized the research findings on the quantitative estimation of vegetation cover and management factor by using remote sensing data, and analyzed the advantages and the disadvantages of various methods, aimed to provide reference for the further research and quantitative estimation of vegetation cover and management factor at large scale.
Henry, Francis P.; Wang, Yan; Rodriguez, Carissa L. R.; Randolph, Mark A.; Rust, Esther A. Z.; Winograd, Jonathan M.; de Boer, Johannes F.; Park, B. Hyle
2015-01-01
Abstract. Assessing nerve integrity and myelination after injury is necessary to provide insight for treatment strategies aimed at restoring neuromuscular function. Currently, this is largely done with electrical analysis, which lacks direct quantitative information. In vivo optical imaging with sufficient imaging depth and resolution could be used to assess the nerve microarchitecture. In this study, we examine the use of polarization sensitive-optical coherence tomography (PS-OCT) to quantitatively assess the sciatic nerve microenvironment through measurements of birefringence after applying a nerve crush injury in a rat model. Initial loss of function and subsequent recovery were demonstrated by calculating the sciatic function index (SFI). We found that the PS-OCT phase retardation slope, which is proportional to birefringence, increased monotonically with the SFI. Additionally, histomorphometric analysis of the myelin thickness and g-ratio shows that the PS-OCT slope is a good indicator of myelin health and recovery after injury. These results demonstrate that PS-OCT is capable of providing nondestructive and quantitative assessment of nerve health after injury and shows promise for continued use both clinically and experimentally in neuroscience. PMID:25858593
Henry, Francis P; Wang, Yan; Rodriguez, Carissa L R; Randolph, Mark A; Rust, Esther A Z; Winograd, Jonathan M; de Boer, Johannes F; Park, B Hyle
2015-04-01
Assessing nerve integrity and myelination after injury is necessary to provide insight for treatment strategies aimed at restoring neuromuscular function. Currently, this is largely done with electrical analysis, which lacks direct quantitative information. In vivo optical imaging with sufficient imaging depth and resolution could be used to assess the nerve microarchitecture. In this study, we examine the use of polarization sensitive-optical coherence tomography (PS-OCT) to quantitatively assess the sciatic nerve microenvironment through measurements of birefringence after applying a nerve crush injury in a rat model. Initial loss of function and subsequent recovery were demonstrated by calculating the sciatic function index (SFI). We found that the PS-OCT phase retardation slope, which is proportional to birefringence, increased monotonically with the SFI. Additionally, histomorphometric analysis of the myelin thickness and g-ratio shows that the PS-OCT slope is a good indicator of myelin health and recovery after injury. These results demonstrate that PS-OCT is capable of providing nondestructive and quantitative assessment of nerve health after injury and shows promise for continued use both clinically and experimentally in neuroscience.
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
Quantitative contrast-enhanced mammography for contrast medium kinetics studies
NASA Astrophysics Data System (ADS)
Arvanitis, C. D.; Speller, R.
2009-10-01
Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm-2 has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
Evaluation of macrozone dimensions by ultrasound and EBSD techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, Andre, E-mail: Andre.Moreau@cnrc-nrc.gc.ca; Toubal, Lotfi; Ecole de technologie superieure, 1100, rue Notre-Dame Ouest, Montreal, QC, Canada H3C 1K3
2013-01-15
Titanium alloys are known to have texture heterogeneities, i.e. regions much larger than the grain dimensions, where the local orientation distribution of the grains differs from one region to the next. The electron backscattering diffraction (EBSD) technique is the method of choice to characterize these macro regions, which are called macrozones. Qualitatively, the images obtained by EBSD show that these macrozones may be larger or smaller, elongated or equiaxed. However, often no well-defined boundaries are observed between the macrozones and it is very hard to obtain objective and quantitative estimates of the macrozone dimensions from these data. In the presentmore » work, we present a novel, non-destructive ultrasonic technique that provides objective and quantitative characteristic dimensions of the macrozones. The obtained dimensions are based on the spatial autocorrelation function of fluctuations in the sound velocity. Thus, a pragmatic definition of macrozone dimensions naturally arises from the ultrasonic measurement. This paper has three objectives: 1) to disclose the novel, non-destructive ultrasonic technique to measure macrozone dimensions, 2) to propose a quantitative and objective definition of macrozone dimensions adapted to and arising from the ultrasonic measurement, and which is also applicable to the orientation data obtained by EBSD, and 3) to compare the macrozone dimensions obtained using the two techniques on two samples of the near-alpha titanium alloy IMI834. In addition, it was observed that macrozones may present a semi-periodical arrangement. - Highlights: Black-Right-Pointing-Pointer Discloses a novel, ultrasonic NDT technique to measure macrozone dimensions Black-Right-Pointing-Pointer Proposes a quantitative and objective definition of macrozone dimensions Black-Right-Pointing-Pointer Compares macrozone dimensions obtained using EBSD and ultrasonics on 2 Ti samples Black-Right-Pointing-Pointer Observes that macrozones may have a semi-periodical arrangement.« less
Wu, Ed X.; Tang, Haiying; Tong, Christopher; Heymsfield, Steve B.; Vasselli, Joseph R.
2015-01-01
This study aimed to develop a quantitative and in vivo magnetic resonance imaging (MRI) approach to investigate the muscle growth effects of anabolic steroids. A protocol of MRI acquisition on a standard clinical 1.5 Tesla scanner and quantitative image analysis was established and employed to measure the individual muscle and organ volumes in the intact and castrated guinea pigs undergoing a 16-week treatment protocol by two well-documented anabolic steroids, testosterone and nandrolone, via implanted silastic capsules. High correlations between the in vivo MRI and postmortem dissection measurements were observed for shoulder muscle complex (R = 0.86), masseter (R=0.79), temporalis (R=0.95), neck muscle complex (R=0.58), prostate gland and seminal vesicles (R=0.98), and testis (R=0.96). Furthermore, the longitudinal MRI measurements yielded adequate sensitivity to detect the restoration of growth to or towards normal in castrated guinea pigs by replacing circulating steroid levels to physiological or slightly higher levels, as expected. These results demonstrated that quantitative MRI using a standard clinical scanner provides accurate and sensitive measurement of individual muscles and organs, and this in vivo MRI protocol in conjunction with the castrated guinea pig model constitutes an effective platform to investigate the longitudinal and cross-sectional growth effects of other potential anabolic steroids. The quantitative MRI protocol developed can also be readily adapted for human studies on most clinical MRI scanner to investigate the anabolic steroid growth effects, or monitor the changes in individual muscle and organ volume and geometry following injury, strength training, neuromuscular disorders, and pharmacological or surgical interventions. PMID:18241900
Size Dependent Mechanical Properties of Monolayer Densely Arranged Polystyrene Nanospheres.
Huang, Peng; Zhang, Lijing; Yan, Qingfeng; Guo, Dan; Xie, Guoxin
2016-12-13
In contrast to macroscopic materials, the mechanical properties of polymer nanospheres show fascinating scientific and application values. However, the experimental measurements of individual nanospheres and quantitative analysis of theoretical mechanisms remain less well performed and understood. We provide a highly efficient and accurate method with monolayer densely arranged honeycomb polystyrene (PS) nanospheres for the quantitatively mechanical characterization of individual nanospheres on the basis of atomic force microscopy (AFM) nanoindentation. The efficiency is improved by 1-2 orders, and the accuracy is also enhanced almost by half-order. The elastic modulus measured in the experiments increases with decreasing radius to the smallest nanospheres (25-35 nm in radius). A core-shell model is introduced to predict the size dependent elasticity of PS nanospheres, and the theoretical prediction agrees reasonably well with the experimental results and also shows a peak modulus value.
Niu, Ben; Zhang, Hao; Giblin, Daryl; Rempel, Don L; Gross, Michael L
2015-05-01
Fast photochemical oxidation of proteins (FPOP) employs laser photolysis of hydrogen peroxide to give OH radicals that label amino acid side-chains of proteins on the microsecond time scale. A method for quantitation of hydroxyl radicals after laser photolysis is of importance to FPOP because it establishes a means to adjust the yield of •OH, offers the opportunity of tunable modifications, and provides a basis for kinetic measurements. The initial concentration of OH radicals has yet to be measured experimentally. We report here an approach using isotope dilution gas chromatography/mass spectrometry (GC/MS) to determine quantitatively the initial •OH concentration (we found ~0.95 mM from 15 mM H2O2) from laser photolysis and to investigate the quenching efficiencies for various •OH scavengers.
NASA Astrophysics Data System (ADS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-11-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
NASA Astrophysics Data System (ADS)
Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.
2016-12-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR)- http://car.gsfc.nasa.gov/. Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
NASA Technical Reports Server (NTRS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-01-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wild fire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
Quantification of the level of crowdedness for pedestrian movements
NASA Astrophysics Data System (ADS)
Duives, Dorine C.; Daamen, Winnie; Hoogendoorn, Serge P.
2015-06-01
Within the realm of pedestrian research numerous measures have been proposed to estimate the level of crowdedness experienced by pedestrians. However, within the field of pedestrian traffic flow modelling there does not seem to be consensus on the question which of these measures performs best. This paper shows that the shape and scatter within the resulting fundamental diagrams differs a lot depending on the measure of crowdedness used. The main aim of the paper is to establish the advantages and disadvantages of the currently existing measures to quantify crowdedness in order to evaluate which measures provide both accurate and consistent results. The assessment is not only based on the theoretical differences, but also on the qualitative and quantitative differences between the resulting fundamental diagrams computed using the crowdedness measures on one and the same data set. The qualitative and quantitative functioning of the classical Grid-based measure is compared to with the X-T measure, an Exponentially Weighted Distance measure, and a Voronoi-Diagram measure. The consistency of relating these measures for crowdedness to the two macroscopic flow variables velocity and flow, the computational efficiency and the amount of scatter present within the fundamental diagrams produced by the implementation of the different measures are reviewed. It is found that the Voronoi-Diagram and X-T measure are the most efficient and consistent measures for crowdedness.
Sil, Payel; Yoo, Dae-Goon; Floyd, Madison; Gingerich, Aaron; Rada, Balazs
2016-06-18
Neutrophil granulocytes are the most abundant leukocytes in the human blood. Neutrophils are the first to arrive at the site of infection. Neutrophils developed several antimicrobial mechanisms including phagocytosis, degranulation and formation of neutrophil extracellular traps (NETs). NETs consist of a DNA scaffold decorated with histones and several granule markers including myeloperoxidase (MPO) and human neutrophil elastase (HNE). NET release is an active process involving characteristic morphological changes of neutrophils leading to expulsion of their DNA into the extracellular space. NETs are essential to fight microbes, but uncontrolled release of NETs has been associated with several disorders. To learn more about the clinical relevance and the mechanism of NET formation, there is a need to have reliable tools capable of NET quantitation. Here three methods are presented that can assess NET release from human neutrophils in vitro. The first one is a high throughput assay to measure extracellular DNA release from human neutrophils using a membrane impermeable DNA-binding dye. In addition, two other methods are described capable of quantitating NET formation by measuring levels of NET-specific MPO-DNA and HNE-DNA complexes. These microplate-based methods in combination provide great tools to efficiently study the mechanism and regulation of NET formation of human neutrophils.
Luchins, Daniel
2012-01-01
The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profile of patients. A pixel coregistered simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements especially with objects with unknown emissivity. Even the dualband measurement can provide inaccurate results due to the fact that emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote object.
Analysis of ERTS imagery using special electronic viewing/measuring equipment
NASA Technical Reports Server (NTRS)
Evans, W. E.; Serebreny, S. M.
1973-01-01
An electronic satellite image analysis console (ESIAC) is being employed to process imagery for use by USGS investigators in several different disciplines studying dynamic hydrologic conditions. The ESIAC provides facilities for storing registered image sequences in a magnetic video disc memory for subsequent recall, enhancement, and animated display in monochrome or color. Quantitative measurements of distances, areas, and brightness profiles can be extracted digitally under operator supervision. Initial results are presented for the display and measurement of snowfield extent, glacier development, sediment plumes from estuary discharge, playa inventory, phreatophyte and other vegetative changes.
Stripline fast faraday cup for measuring GHz structure of ion beams
Bogaty, John M.
1992-01-01
The Stripline Fast Faraday Cup is a device which is used to quantitatively and qualitatively measure gigahertz time structure characteristics of ion beams with energies up to at least 30 Mev per nucleon. A stripline geometry is employed in conjunction with an electrostatic screen and a Faraday cup to provide for analysis of the structural characteristics of an ion beam. The stripline geometry allows for a large reduction in the size of the instrument while the electrostatic screen permits measurements of the properties associated with low speed ion beams.