Sample records for quantitative experimental measurements

  1. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system for determining the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. The role for its clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental MR spectroscopic quantitative fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). After dichotomizing the scale, the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
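The study's key statistic is easy to reproduce in outline. A minimal pure-Python sketch of Spearman's rank correlation (Pearson's correlation computed on average ranks), applied to invented grades and fat/water ratios rather than the study's data:

```python
# Illustrative only: ordinal Goutallier grades vs. a continuous fat/water
# ratio, correlated via Spearman's rank correlation (Pearson on ranks,
# with average ranks assigned to tied values).
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1      # average rank for ties
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

grades = [0, 1, 1, 2, 2, 3, 3, 4]                     # one rater (invented)
fat_water = [0.05, 0.10, 0.12, 0.20, 0.28, 0.35, 0.40, 0.60]
rho = spearman(grades, fat_water)

# Dichotomizing the scale (grades 0-1 vs. 2-4), as the study did:
binary = [0 if g <= 1 else 1 for g in grades]
rho_binary = spearman(binary, fat_water)
```

With real inter-rater disagreement the full-scale correlation drops, which is the study's point about dichotomization.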

  2. A general way for quantitative magnetic measurement by transmitted electrons

    NASA Astrophysics Data System (ADS)

    Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing

    2016-01-01

    The EMCD (electron magnetic circular dichroism) technique opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from the Fe atoms located at nonequivalent crystallographic planes in NiFe2O4; however, it depends critically on the crystallographic structure of the sample under test. Here, we have further improved and tested the method for quantitative site-specific magnetic measurement so that it is applicable to more complex crystallographic structures, by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in designing the experimental geometry and the quantitative measurement, etc.), and have taken yttrium iron garnet (Y3Fe5O12, YIG), which has a more complex crystallographic structure, as an example to demonstrate its applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurement with high spatial resolution by transmitted electrons.

  3. Experimental study of flash boiling spray vaporization through quantitative vapor concentration and liquid temperature measurements

    NASA Astrophysics Data System (ADS)

    Zhang, Gaoming; Hung, David L. S.; Xu, Min

    2014-08-01

    Flash boiling sprays of liquid injection under superheated conditions offer novel solutions for fast vaporization and better air-fuel mixture formation in internal combustion engines. However, the physical mechanisms of flash boiling spray vaporization are more complicated than droplet surface vaporization due to the unique bubble generation and boiling process inside a superheated bulk liquid, which are not well understood. In this study, the vaporization of flash boiling sprays was investigated experimentally through quantitative measurements of vapor concentration and liquid temperature. Specifically, the laser-induced exciplex fluorescence technique was applied to distinguish the liquid and vapor distributions. Quantitative vapor concentration was obtained by correlating the intensity of the vapor-phase fluorescence with vapor concentration through systematic corrections and calibrations. The intensities at two wavelengths were captured simultaneously from the liquid-phase fluorescence spectra, and their intensity ratios were correlated with liquid temperature. The results show that both the liquid and vapor phases of multi-hole sprays collapse toward the centerline of the spray with different mass distributions under flash boiling conditions. A large amount of vapor aggregates along the centerline of the spray to form a "gas jet" structure, whereas the liquid is distributed more uniformly, with large vortexes formed in the vicinity of the spray tip. The vaporization process under the flash boiling condition is greatly enhanced due to the intense bubble generation and burst. The liquid temperature measurements show strong temperature variations inside the flash boiling sprays, with hot zones present in the "gas jet" structure and vortex region. In addition, high vapor concentration and closed vortex motion seem to have inhibited the heat and mass transfer in these regions. In summary, the vapor concentration and liquid temperature provide detailed information
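The two-wavelength intensity-ratio thermometry described above can be sketched in a few lines; the linear calibration form and its coefficients here are assumptions for illustration, not the paper's calibration:

```python
# Two-color LIF thermometry sketch: the ratio of liquid-phase fluorescence
# intensities at two wavelengths maps to liquid temperature through a
# calibration curve. The linear form and coefficients are invented.
def temperature_from_ratio(i1, i2, a=120.0, b=250.0):
    """Assumed calibration T = a * (I1 / I2) + b, in kelvin."""
    return a * (i1 / i2) + b

t_kelvin = temperature_from_ratio(1.0, 2.0)   # ratio 0.5 under this calibration
```

In practice the calibration is measured against a thermocouple over the temperature range of interest before being applied to spray images.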

  4. Experimental Assessment and Enhancement of Planar Laser-Induced Fluorescence Measurements of Nitric Oxide in an Inverse Diffusion Flame

    NASA Technical Reports Server (NTRS)

    Partridge, William P.; Laurendeau, Normand M.

    1997-01-01

    We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric-pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser-saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically based PLIF enhancement procedures.
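A hedged sketch of the single-input idea described above: rescale a PLIF image so it agrees with a point LSF measurement at a reference location. The function name, pixel values, and reference point are invented for illustration:

```python
# Single-input PLIF correction sketch: one LSF point measurement anchors
# the (relative) PLIF image to an absolute scale. Values are illustrative.
def correct_plif(plif_image, lsf_value, ref):
    """Scale a PLIF image to match a point LSF measurement at pixel ref."""
    factor = lsf_value / plif_image[ref[0]][ref[1]]
    return [[v * factor for v in row] for row in plif_image]

plif = [[0.8, 1.0], [1.2, 1.4]]       # arbitrary relative NO fluorescence
corrected = correct_plif(plif, lsf_value=2.0, ref=(0, 1))
```

A multiple-input variant would use a grid of LSF points to build a spatially varying correction field, which is what makes it useful where quench-dependent errors have gradients.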

  5. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low-gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high-sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flame species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry), was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and 0-g in a vortex ring flame. In this paper, we report additional normal-gravity and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser: an external cavity diode laser (ECDL) with the unique capability of high-frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  6. Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun J.; Fischer, David G.

    2012-01-01

    We present our experimental strategy and thermochemical analyses of combustion flows using subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has a promising ability to enhance the accuracy of quantitative scalar measurements in a point-wise, single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates a transferable calibration standard for the routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and its impact on the subsequent combustion process in the model combustor.

  7. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by an appropriate model, and in the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, and the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter than absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved through image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot-based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5% of the true values.

  8. Quantitative nuclear magnetic resonance imaging: characterisation of experimental cerebral oedema.

    PubMed Central

    Barnes, D; McDonald, W I; Johnson, G; Tofts, P S; Landon, D N

    1987-01-01

    Magnetic resonance imaging (MRI) has been used quantitatively to define the characteristics of two different models of experimental cerebral oedema in cats: vasogenic oedema produced by cortical freezing and cytotoxic oedema induced by triethyl tin. The MRI results have been correlated with the ultrastructural changes. The images accurately delineated the anatomical extent of the oedema in the two lesions, but did not otherwise discriminate between them. The patterns of measured increase in T1' and T2' were, however, characteristic for each type of oedema, and reflected the protein content. The magnetisation decay characteristics of both normal and oedematous white matter were monoexponential for T1 but biexponential for T2 decay. The relative sizes of the two component exponentials of the latter corresponded with the physical sizes of the major tissue water compartments. Quantitative MRI data can provide reliable information about the physico-chemical environment of tissue water in normal and oedematous cerebral tissue, and are useful for distinguishing between acute and chronic lesions in multiple sclerosis. PMID:3572428
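The mono- vs. biexponential decay distinction reported above can be illustrated numerically: in log space a single exponential is a straight line, so its second difference over equally spaced times vanishes, while a two-compartment T2 signal curves. Amplitudes and time constants below are invented, not the paper's values:

```python
import math

def biexp(t, a1, t2a, a2, t2b):
    # Two-compartment decay: S(t) = a1*exp(-t/t2a) + a2*exp(-t/t2b)
    return a1 * math.exp(-t / t2a) + a2 * math.exp(-t / t2b)

ts = [10.0, 50.0, 90.0]                               # equally spaced echo times (ms)
mono = [math.exp(-t / 60.0) for t in ts]              # single T2 = 60 ms
bi = [biexp(t, 0.7, 60.0, 0.3, 15.0) for t in ts]     # two water compartments

def second_difference(ys):
    l0, l1, l2 = (math.log(y) for y in ys)
    return (l2 - l1) - (l1 - l0)      # zero for a pure exponential in log space

curve_mono = second_difference(mono)  # ~0: log-linear
curve_bi = second_difference(bi)      # nonzero: betrays two T2 components
```

Fitting the relative amplitudes of the two exponentials is what lets the decay be compared with the physical sizes of the tissue water compartments.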

  9. Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan

    A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
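The error model can be applied directly to a simulated profile. A minimal sketch, with invented values for the setup-specific parameters k and const. and a toy Guinier-like intensity profile:

```python
import math

def saxs_sigma(q, intensity, k=5.0e4, c=10.0):
    """Per-point standard deviation from sigma^2(q) = (I(q) + c) / (k * q).
    k and c are setup-specific fit parameters (values invented here)."""
    return math.sqrt((intensity + c) / (k * q))

qs = [0.01, 0.1, 0.5]                                   # momentum transfer grid
profile = [100.0 * math.exp(-(q * 30.0) ** 2 / 3.0) for q in qs]   # toy I(q)
sigmas = [saxs_sigma(q, i) for q, i in zip(qs, profile)]
```

Adding Gaussian noise with these per-point standard deviations to a simulated profile yields the "realistic measurement errors" the abstract describes.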

  10. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct method development/optimization for quantitative proteomics, which nonetheless remains challenging, largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures, and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. As a proof of concept, we employed the EN method to assess quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups, using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect the quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and a rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true

  11. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. 
This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and
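A hedged sketch of the thresholded grouping idea: localizations within dThresh in space and within tThresh frames of a cluster's most recent sighting are merged into one molecule. This greedy single-pass version and its threshold values are illustrative, not the paper's algorithm:

```python
import math

def group_localizations(locs, d_thresh, t_thresh):
    """locs: list of (x, y, frame) tuples sorted by frame.
    Returns lists of localizations attributed to the same molecule."""
    clusters = []
    for x, y, t in locs:
        for c in clusters:
            cx, cy, ct = c[-1]                 # cluster's last sighting
            if t - ct <= t_thresh and math.hypot(x - cx, y - cy) <= d_thresh:
                c.append((x, y, t))
                break
        else:
            clusters.append([(x, y, t)])       # start a new molecule
    return clusters

# A fluorophore blinking near the origin plus one distant molecule:
locs = [(0.00, 0.00, 1), (0.01, 0.00, 2), (5.00, 5.00, 2), (0.02, 0.01, 3)]
mols = group_localizations(locs, d_thresh=0.1, t_thresh=2)
```

With grouping, the blinking fluorophore contributes one molecule to the density estimate instead of three, which is the overcounting correction the abstract describes.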

  12. Experimental Influences in the Accurate Measurement of Cartilage Thickness in MRI.

    PubMed

    Wang, Nian; Badar, Farid; Xia, Yang

    2018-01-01

    Objective: To study the experimental influences on the measurement of cartilage thickness by magnetic resonance imaging (MRI). Design: The complete thicknesses of healthy and trypsin-degraded cartilage were measured at high resolution under different MRI conditions, using two intensity-based imaging sequences (ultra-short echo [UTE] and multislice-multiecho [MSME]) and three quantitative relaxation imaging sequences (T1, T2, and T1ρ). Other variables included different orientations in the magnet, two soaking solutions (saline and phosphate buffered saline [PBS]), and external loading. Results: With cartilage soaked in saline, the UTE and T1 methods yielded complete and consistent measurements of cartilage thickness, while the thickness measurements by the T2, T1ρ, and MSME methods were orientation dependent. The effect of external loading on cartilage thickness was also sequence and orientation dependent. All variations in cartilage thickness in MRI could be eliminated by using a 100 mM PBS solution or by imaging with the UTE sequence. Conclusions: The appearance of articular cartilage and the measurement accuracy of cartilage thickness in MRI can be influenced by a number of experimental factors in ex vivo MRI, from the choice of pulse sequence and soaking solution to the health of the tissue. T2-based imaging sequences, whether proton-intensity or quantitative relaxation sequences, produced the largest variations. With adequate resolution, accurate measurement of whole cartilage thickness in clinical MRI could be used to detect differences between healthy and osteoarthritic cartilage after compression.

  13. Using direct numerical simulation to improve experimental measurements of inertial particle radial relative velocities

    NASA Astrophysics Data System (ADS)

    Ireland, Peter J.; Collins, Lance R.

    2012-11-01

    Turbulence-induced collision of inertial particles may contribute to the rapid onset of precipitation in warm cumulus clouds. The particle collision frequency is determined from two parameters: the radial distribution function g(r) and the mean inward radial relative velocity wr. These quantities have been measured in three dimensions computationally, using direct numerical simulation (DNS), and experimentally, using digital holographic particle image velocimetry (DHPIV). While good quantitative agreement has been attained between computational and experimental measures of g(r) (Salazar et al. 2008), measures of wr have not reached that stage (de Jong et al. 2010). We apply DNS to mimic the experimental image analysis used in the relative velocity measurement. To account for experimental errors, we add noise to the particle positions and 'measure' the velocity from these positions. Our DNS shows that the experimental errors are inherent to the DHPIV setup, and so we explore an alternate approach, in which velocities are measured along thin two-dimensional planes using standard PIV. We show that this technique better recovers the correct radial relative velocity PDFs and suggest optimal parameter ranges for the experimental measurements.
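The pair statistic itself is simple: the radial relative velocity is the component of a pair's velocity difference along its separation vector, negative for approaching pairs. A toy 2D sketch (invented positions and velocities, not DNS or DHPIV data):

```python
def radial_relative_velocity(x1, v1, x2, v2):
    """Component of the pair's velocity difference along the separation vector."""
    rx, ry = x2[0] - x1[0], x2[1] - x1[1]
    norm = (rx * rx + ry * ry) ** 0.5
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    return (dvx * rx + dvy * ry) / norm

# Two toy pairs: one approaching (negative w_r), one separating (positive).
w_approach = radial_relative_velocity((0, 0), (1, 0), (1, 0), (-1, 0))
w_separate = radial_relative_velocity((0, 0), (0, 1), (0, 2), (0, 3))
```

Averaging only the negative (approaching) values over pairs at a given separation gives the mean inward radial relative velocity that enters the collision-frequency estimate.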

  14. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage, prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100× magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = Bt × 100 / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples, and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimating the extent of bleeding in tissue samples.
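The formula above translates directly to code; the areas below are invented numbers standing in for the AutoCAD "polyline" measurements:

```python
def bleeding_percentage(sample_areas, bleeding_areas):
    """N = Bt * 100 / At, with At the summed sample areas and
    Bt the summed bleeding areas (same units, e.g. mm^2)."""
    at = sum(sample_areas)
    bt = sum(bleeding_areas)
    return bt * 100.0 / at

n = bleeding_percentage([50.0, 40.0, 10.0], [5.0, 4.0, 1.0])
print(n)  # → 10.0
```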

  15. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    NASA Astrophysics Data System (ADS)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has mostly been restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity-ratio model was proposed in this study to establish a quantitative relationship between the stress condition and the ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on an ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to depend on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provides a physical explanation for the relationship between ML intensity and stress. The proposed model is applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation and could serve as a useful reference for quantitative stress measurement using ML sensors in general.

  16. Development and Measurement of Preschoolers' Quantitative Knowledge

    ERIC Educational Resources Information Center

    Geary, David C.

    2015-01-01

    The collection of studies in this special issue make an important contribution to our understanding and measurement of the core cognitive and noncognitive factors that influence children's emerging quantitative competencies. The studies also illustrate how the field has matured, from a time when the quantitative competencies of infants and young…

  17. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The formation of cognitive schemes of plant anatomy concepts relies on processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test using the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test according to Marzano and a questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  18. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
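A minimal sketch of the measurement principle as described: each measured voltage difference divided by the known separation of its antenna pair gives the field component along that pair. The values below are illustrative:

```python
def field_components(voltage_diffs, separations):
    """E along each antenna pair: measured voltage difference (V)
    divided by the pair's known separation (m)."""
    return [dv / d for dv, d in zip(voltage_diffs, separations)]

# Three antenna pairs along one axis, 0.1 m apart, in a uniform field:
# 5 V across each pair implies 50 V/m along that axis.
e = field_components([5.0, 5.0, 5.0], [0.1, 0.1, 0.1])
```

Arraying pairs along more than one dimension, as the patent describes, yields the corresponding field components over the region.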

  19. Quantitative measurement of oxygen in microgravity combustion

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.

    1995-01-01

    This research combines two innovations in an experimental system which should result in a new capability for quantitative, nonintrusive measurement of major combustion species. Using a newly available vertical cavity surface-emitting diode laser (VCSEL) and an improved spatial scanning method, we plan to measure the temporal and spatial profiles of the concentrations and temperatures of molecular oxygen in a candle flame and in a solid fuel (cellulose sheet) system. The required sensitivity for detecting oxygen is achieved by the use of high-frequency wavelength modulation spectroscopy (WMS). Measurements will be performed in the NASA Lewis 2.2-second Drop Tower Facility. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size, and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in microgravity combustion research. We will also demonstrate diode lasers' potential usefulness for compact, intrinsically safe monitoring sensors aboard spacecraft. Such sensors could be used to monitor any of the major cabin gases as well as important pollutants.

  20. Quantitative comparisons between experimentally measured 2D carbon radiation and Monte Carlo impurity (MCI) code simulations

    NASA Astrophysics Data System (ADS)

    Evans, T. E.; Finkenthal, D. F.; Fenstermacher, M. E.; Leonard, A. W.; Porter, G. D.; West, W. P.

    Experimentally measured carbon line emissions and total radiated power distributions from the DIII-D divertor and scrape-off layer (SOL) are compared to those calculated with the Monte Carlo impurity (MCI) model. A UEDGE [T.D. Rognlien et al., J. Nucl. Mater. 196-198 (1992) 347] background plasma is used in MCI with the Roth and Garcia-Rosales (RG-R) chemical sputtering model [J. Roth, C. García-Rosales, Nucl. Fusion 36 (1992) 196] and/or one of six physical sputtering models. While results from these simulations do not reproduce all of the features seen in the experimentally measured radiation patterns, the total radiated power calculated in MCI is in relatively good agreement with that measured by the DIII-D bolometric system when the Smith78 [D.L. Smith, J. Nucl. Mater. 75 (1978) 20] physical sputtering model is coupled to RG-R chemical sputtering in an unaltered UEDGE plasma. Alternatively, MCI simulations in which the UEDGE background ion temperatures along the divertor target plates were adjusted to better match those measured in the experiment yielded three physical sputtering models that, when coupled to the RG-R model, gave a total radiated power within 10% of the measured value.

  1. Rapid experimental measurements of physicochemical properties to inform models and testing.

    PubMed

    Nicolas, Chantel I; Mansouri, Kamel; Phillips, Katherine A; Grulke, Christopher M; Richard, Ann M; Williams, Antony J; Rabinowitz, James; Isaacs, Kristin K; Yau, Alice; Wambaugh, John F

    2018-05-02

    The structures and physicochemical properties of chemicals are important for determining their potential toxicological effects, toxicokinetics, and route(s) of exposure. These data are needed to prioritize the risk for thousands of environmental chemicals, but experimental values are often lacking. In an attempt to efficiently fill data gaps in physicochemical property information, we generated new data for 200 structurally diverse compounds, which were rigorously selected from the USEPA ToxCast chemical library, and whose structures are available within the Distributed Structure-Searchable Toxicity Database (DSSTox). This pilot study evaluated rapid experimental methods to determine five physicochemical properties, including the log of the octanol:water partition coefficient (known as log(K_ow) or logP), vapor pressure, water solubility, Henry's law constant, and the acid dissociation constant (pKa). For most compounds, experiments were successful for at least one property; log(K_ow) yielded the largest return (176 values). It was determined that 77 ToxPrint structural features were enriched in chemicals with at least one measurement failure, indicating which features may have played a role in rapid method failures. To gauge consistency with traditional measurement methods, the new measurements were compared with previous measurements (where available). Since quantitative structure-activity/property relationship (QSAR/QSPR) models are used to fill gaps in physicochemical property information, 5 suites of QSPRs were evaluated for their predictive ability and chemical coverage (applicability domain) against the new experimental measurements.
The ability to have accurate measurements of these properties will facilitate better exposure predictions in two ways: 1) direct input of these experimental measurements into exposure models; and 2) construction of QSPRs with a wider applicability domain, as their predicted physicochemical values can be used to parameterize exposure models.
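A common estimation shortcut links three of the five properties above: when a measured Henry's law constant is unavailable, it is often approximated as vapor pressure divided by aqueous solubility. A minimal sketch of this relation (function name hypothetical; the study measured H experimentally rather than deriving it this way):

```python
def henrys_law_constant(vapor_pressure_pa, solubility_mol_m3):
    """Approximate Henry's law constant (Pa*m^3/mol) as the ratio of
    vapor pressure to aqueous solubility -- a standard estimation
    shortcut, not the study's direct measurement method."""
    return vapor_pressure_pa / solubility_mol_m3
```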

  2. Quantitative measures for redox signaling.

    PubMed

    Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M

    2016-07-01

    Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Quantitative and simultaneous non-invasive measurement of skin hydration and sebum levels

    PubMed Central

    Ezerskaia, Anna; Pereira, S. F.; Urbach, H. Paul; Verhagen, Rieko; Varghese, Babu

    2016-01-01

    We report a method for quantitative and simultaneous non-contact in vivo hydration and sebum measurements of the skin using an infrared optical spectroscopic set-up. The method utilizes differential detection with three wavelengths, 1720, 1750, and 1770 nm, corresponding to the lipid vibrational bands that lie “in between” the prominent water absorption bands. We used an emulsifier containing hydro- and lipophilic components to mix water and sebum in various volume fractions, which was applied to the skin to mimic different oily-dry skin conditions. We also measured the skin sebum and hydration values on the forehead under natural conditions and their variations in response to external stimuli. Good agreement was found between our experimental results and reference values measured using conventional biophysical methods such as the Corneometer and Sebumeter. PMID:27375946

  4. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    ERIC Educational Resources Information Center

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  5. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    PubMed

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
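At its core, Tau-U is a Kendall-style count of pairwise nonoverlap between phases, optionally penalized by baseline trend. A minimal sketch of the basic index (simplified relative to the full family of Tau-U variants in Parker et al.; function name hypothetical):

```python
def tau_u(baseline, intervention, correct_baseline=False):
    """Basic Tau-U sketch: pairwise nonoverlap between the baseline (A)
    and intervention (B) phases, optionally penalized by the monotonic
    trend within the baseline phase."""
    s = 0
    for a in baseline:
        for b in intervention:
            s += (b > a) - (b < a)  # +1 improvement, -1 deterioration, 0 tie
    if correct_baseline:
        # subtract the monotonic trend within the baseline phase
        for i, a1 in enumerate(baseline):
            for a2 in baseline[i + 1:]:
                s -= (a2 > a1) - (a2 < a1)
    pairs = len(baseline) * len(intervention)
    return s / pairs
```

For the basic A-vs-B comparison, every baseline/intervention pair is scored +1, 0, or -1, so the statistic ranges from -1 (complete negative overlap) to +1 (complete nonoverlap).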

  6. Measuring the Nonuniform Evaporation Dynamics of Sprayed Sessile Microdroplets with Quantitative Phase Imaging.

    PubMed

    Edwards, Chris; Arbabi, Amir; Bhaduri, Basanta; Wang, Xiaozhen; Ganti, Raman; Yunker, Peter J; Yodh, Arjun G; Popescu, Gabriel; Goddard, Lynford L

    2015-10-13

    We demonstrate real-time quantitative phase imaging as a new optical approach for measuring the evaporation dynamics of sessile microdroplets. Quantitative phase images of various droplets were captured during evaporation. The images enabled us to generate time-resolved three-dimensional topographic profiles of droplet shape with nanometer accuracy and, without any assumptions about droplet geometry, to directly measure important physical parameters that characterize surface wetting processes. Specifically, the time-dependent variation of the droplet height, volume, contact radius, contact angle distribution along the droplet's perimeter, and mass flux density for two different surface preparations are reported. The studies clearly demonstrate three phases of evaporation reported previously: pinned, depinned, and drying modes; the studies also reveal instances of partial pinning. Finally, the apparatus is employed to investigate the cooperative evaporation of sprayed droplets. We observe and explain the neighbor-induced reduction in evaporation rate relative to predictions for isolated droplets. In the future, these experimental methods should stimulate the exploration of colloidal particle dynamics at the gas-liquid-solid interface.

  7. Quantitative Experimental Study of Defects Induced by Process Parameters in the High-Pressure Die Cast Process

    NASA Astrophysics Data System (ADS)

    Sharifi, P.; Jamali, J.; Sadayappan, K.; Wood, J. T.

    2018-05-01

    A quantitative experimental study of the effects of process parameters on the formation of defects during solidification of high-pressure die cast magnesium alloy components is presented. The parameters studied are slow-stage velocity, fast-stage velocity, intensification pressure, and die temperature. The amounts of various defects are quantitatively characterized. Multiple runs of the commercial casting simulation package, ProCAST™, are used to model the mold-filling and solidification events. Several locations in the component, including knit lines, the last-to-fill region, and the last-to-solidify region, are identified as the critical regions that have a high concentration of defects. The area fractions of total porosity, shrinkage porosity, gas porosity, and externally solidified grains are separately measured. This study shows that the process parameters, fluid flow, and local solidification conditions play major roles in the formation of defects during the HPDC process.

  8. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.

  9. Quantitative performance measurements of bent crystal Laue analyzers for X-ray fluorescence spectroscopy.

    PubMed

    Karanfil, C; Bunker, G; Newville, M; Segre, C U; Chapman, D

    2012-05-01

    Third-generation synchrotron radiation sources pose difficult challenges for energy-dispersive detectors for XAFS because of their count rate limitations. One solution to this problem is the bent crystal Laue analyzer (BCLA), which removes most of the undesired scatter and fluorescence before it reaches the detector, effectively eliminating detector saturation due to background. In this paper, experimental measurements of BCLA performance in conjunction with a 13-element germanium detector are presented, along with a quantitative analysis of the signal-to-noise improvement provided by BCLAs. The performance of BCLAs is compared with that of filters and slits.

  10. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
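The reliability figures above are kappa statistics. An unweighted Cohen's kappa for two raters can be sketched as follows (illustrative only; studies of ordinal grading scales such as this one often use weighted variants):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa: chance-corrected agreement between two
    raters who graded the same items (a sketch of the statistic, not the
    study's exact analysis)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # observed agreement: fraction of items graded identically
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # expected chance agreement from each rater's marginal frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)
```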

  11. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  12. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and > or =2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; only the distance traveled by the endoscope between images is needed.
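The size-from-two-images idea can be illustrated with a pinhole-camera model: image height scales inversely with working distance, so two images separated by a known backup distance determine the object size without knowing the absolute distance. A sketch under that assumption (names and the calibration constant k are hypothetical, not the paper's calibration procedure):

```python
def object_height(y_near, y_far, backup_mm, k):
    """Estimate absolute object height from two endoscopic images.

    Pinhole model: image height y (pixels) at working distance z (mm)
    obeys y = k * H / z, with k a calibration constant (pixels).  From
    images taken before and after backing the scope up by backup_mm,
    1/y_far - 1/y_near = backup_mm / (k * H), so the unknown working
    distance z cancels out.
    """
    return backup_mm / (k * (1.0 / y_far - 1.0 / y_near))
```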

  13. Indium adhesion provides quantitative measure of surface cleanliness

    NASA Technical Reports Server (NTRS)

    Krieger, G. L.; Wilson, G. J.

    1968-01-01

    Indium tipped probe measures hydrophobic and hydrophilic contaminants on rough and smooth surfaces. The force needed to pull the indium tip, which adheres to a clean surface, away from the surface provides a quantitative measure of cleanliness.

  14. Direct Measurements of Quantum Kinetic Energy Tensor in Stable and Metastable Water near the Triple Point: An Experimental Benchmark.

    PubMed

    Andreani, Carla; Romanelli, Giovanni; Senesi, Roberto

    2016-06-16

    This study presents the first direct and quantitative measurement of the nuclear momentum distribution anisotropy and the quantum kinetic energy tensor in stable and metastable (supercooled) water near its triple point, using deep inelastic neutron scattering (DINS). From the experimental spectra, accurate line shapes of the hydrogen momentum distributions are derived using an anisotropic Gaussian and a model-independent framework. The experimental results, benchmarked against those obtained for the solid phase, provide state-of-the-art directional values of the hydrogen mean kinetic energy in metastable water. The determination of the directional kinetic energies in the supercooled phase provides accurate and quantitative measurements of these dynamical observables in the metastable and stable phases, that is, key insight into the physical mechanisms of the hydrogen quantum state in both disordered and polycrystalline systems. The findings of this study establish novel insight, further expand the capacity and accuracy of DINS investigations of nuclear quantum effects in water, and provide reference experimental values for theoretical investigations.

  15. Quantitative tomographic measurements of opaque multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
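The conversion from measured mixture conductivity to phase volume fraction is typically based on an effective-medium relation. A sketch using Maxwell's classic result for a nonconducting dispersed phase (gas or insulating solid) in a conducting liquid — the abstract mentions a modified formula, which is not reproduced here:

```python
def dispersed_fraction_maxwell(sigma_liquid, sigma_mixture):
    """Maxwell's (1881) effective-conductivity relation, solved for the
    volume fraction phi of a nonconducting dispersed phase:
        sigma_m / sigma_l = 2 * (1 - phi) / (2 + phi)
    =>  phi = 2 * (sigma_l - sigma_m) / (2 * sigma_l + sigma_m)
    (A sketch of the standard relation, not the paper's modified formula.)"""
    return 2.0 * (sigma_liquid - sigma_mixture) / (2.0 * sigma_liquid + sigma_mixture)
```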

  16. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    THIS DOCUMENT IS AN EXPERIMENTAL VERSION OF A PROGRAMED TEXT ON MEASUREMENT AND PRECISION. PART I CONTAINS 24 FRAMES DEALING WITH PRECISION AND SIGNIFICANT FIGURES ENCOUNTERED IN VARIOUS MATHEMATICAL COMPUTATIONS AND MEASUREMENTS. PART II BEGINS WITH A BRIEF SECTION ON EXPERIMENTAL DATA, COVERING SUCH POINTS AS (1) ESTABLISHING THE ZERO POINT, (2)…

  17. Sooting turbulent jet flame: characterization and quantitative soot measurements

    NASA Astrophysics Data System (ADS)

    Köhler, M.; Geigle, K. P.; Meier, W.; Crosland, B. M.; Thomson, K. A.; Smallwood, G. J.

    2011-08-01

    Computational fluid dynamics (CFD) modelers require high-quality experimental data sets for validation of their numerical tools. Preferred features for numerical simulations of a sooting, turbulent test case flame are simplicity (no pilot flame), well-defined boundary conditions, and sufficient soot production. This paper proposes a non-premixed C2H4/air turbulent jet flame to fill this role and presents an extensive database for soot model validation. The sooting turbulent jet flame has a total visible flame length of approximately 400 mm and a fuel-jet Reynolds number of 10,000. The flame has a measured lift-off height of 26 mm, which acts as a sensitive marker for CFD model validation, while the compiled experimental database of soot properties, temperature, and velocity maps is useful for the validation of kinetic soot models and numerical flame simulations. Owing to the relatively simple burner design, which produces a flame with sufficient soot concentration while meeting modelers' needs with respect to boundary conditions and flame specifications, and given the present lack of a sooting "standard flame", this flame is suggested as a new reference turbulent sooting flame. The flame characterization presented here involved a variety of optical diagnostics, including quantitative 2D laser-induced incandescence (2D-LII), shifted-vibrational coherent anti-Stokes Raman spectroscopy (SV-CARS), and particle image velocimetry (PIV). Producing an accurate and comprehensive characterization of a transient sooting flame was challenging and required optimization of these diagnostics. In this respect, we present the first simultaneous, instantaneous PIV and LII measurements in a heavily sooting flame environment. Simultaneous soot and flow field measurements can provide new insights into the interaction between a turbulent vortex and flame chemistry, especially since soot structures in turbulent flames are known to be small and are often treated in a statistical manner.

  18. Quantitative measurement of marginal disintegration of ceramic inlays.

    PubMed

    Hayashi, Mikako; Tsubakimoto, Yuko; Takeshige, Fumio; Ebisu, Shigeyuki

    2004-01-01

    The objectives of this study include establishing a method for quantitative measurement of marginal change in ceramic inlays and clarifying their marginal disintegration in vivo. An accurate CCD optical laser scanner system was used for morphological measurement of the marginal change of ceramic inlays. The accuracy of the CCD measurement was assessed by comparing it with microscopic measurement. Replicas of 15 premolars restored with Class II ceramic inlays at the time of placement and eight years after restoration were used for morphological measurement by means of the CCD laser scanner system. Occlusal surfaces of the restored teeth were scanned and cross-sections of marginal areas were computed with software. Marginal change was defined as the area enclosed by two profiles obtained by superimposing two cross-sections of the same location at two different times, expressed as the maximum depth and mean area of the enclosed region. The accuracy of this method of measurement was 4.3 ± 3.2 µm in distance and 2.0 ± 0.6% in area. Quantitative marginal changes for the eight-year period were 10 × 10 µm in depth and 50 × 10³ µm² in area at the functional cusp area, and 7 × 10 µm in depth and 28 × 10³ µm² in area at the non-functional cusp area. Marginal disintegration at the functional cusp area was significantly greater than at the non-functional cusp area (Wilcoxon signed-ranks test, p < 0.05). This study constitutes a quantitative measurement of in vivo deterioration in marginal adaptation of ceramic inlays and indicates that occlusal force may accelerate marginal disintegration.
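The two reported quantities (maximum depth and area enclosed between superimposed profiles) can be sketched as a simple profile comparison (a hypothetical illustration, not the scanner software's algorithm):

```python
def marginal_change(x_um, profile_before, profile_after):
    """Compare two superimposed cross-section profiles (surface heights in
    micrometers sampled at positions x_um) of the same margin location at
    two time points.  Returns the maximum gap depth and the trapezoid-rule
    area of material loss between the profiles (a sketch)."""
    gap = [b - a for b, a in zip(profile_before, profile_after)]
    depth_um = max(gap)
    loss = [max(g, 0.0) for g in gap]  # only count material that was lost
    area_um2 = sum(0.5 * (g0 + g1) * (x1 - x0)
                   for g0, g1, x0, x1 in zip(loss, loss[1:], x_um, x_um[1:]))
    return depth_um, area_um2
```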

  19. Comparison of microfluidic digital PCR and conventional quantitative PCR for measuring copy number variation.

    PubMed

    Whale, Alexandra S; Huggett, Jim F; Cowen, Simon; Speirs, Valerie; Shaw, Jacqui; Ellison, Stephen; Foy, Carole A; Scott, Daniel J

    2012-06-01

    One of the benefits of digital PCR (dPCR) is the potential for unparalleled precision, enabling smaller fold-change measurements. An example of an assessment that could benefit from such improved precision is the measurement of tumour-associated copy number variation (CNV) in the cell free DNA (cfDNA) fraction of patient blood plasma. To investigate the potential precision of dPCR and compare it with the established technique of quantitative PCR (qPCR), we used breast cancer cell lines to investigate HER2 gene amplification and modelled a range of different CNVs. We showed that, with equal experimental replication, dPCR could measure a smaller CNV than qPCR. As dPCR precision is directly dependent upon both the number of replicate measurements and the template concentration, we also developed a method to assist the design of dPCR experiments for measuring CNV. Using an existing model (based on Poisson and binomial distributions) to derive an expression for the variance inherent in dPCR, we produced a power calculation to define the experimental size required to reliably detect a given fold change at a given template concentration. This work will facilitate any future translation of dPCR to key diagnostic applications, such as cancer diagnostics and analysis of cfDNA.
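dPCR quantification rests on a Poisson correction: the fraction of positive partitions p gives the mean copies per partition as lambda = -ln(1 - p), and a copy-number ratio is the ratio of target to reference concentrations measured in the same run. A minimal sketch of these standard estimators (not the authors' full variance and power model):

```python
import math

def copies_per_partition(n_positive, n_total):
    """Poisson-corrected mean template copies per dPCR partition:
    lambda = -ln(1 - p), where p is the positive-partition fraction."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

def cnv_ratio(pos_target, pos_ref, n_total):
    """Copy-number ratio of a target gene to a reference gene assayed
    across the same number of partitions (a sketch of the standard
    estimator, ignoring replicate-level variance)."""
    return copies_per_partition(pos_target, n_total) / copies_per_partition(pos_ref, n_total)
```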

  20. Quantitative force measurements in liquid using frequency modulation atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, Takayuki; Higgins, Michael J.; Yasuda, Satoshi; Jarvis, Suzanne P.; Akita, Seiji; Nakayama, Yoshikazu; Sader, John E.

    2004-10-01

    The measurement of short-range forces with the atomic force microscope (AFM) typically requires implementation of dynamic techniques to maintain sensitivity and stability. While frequency modulation atomic force microscopy (FM-AFM) is used widely for high-resolution imaging and quantitative force measurements in vacuum, quantitative force measurements using FM-AFM in liquids have proven elusive. Here we demonstrate that the formalism derived for operation in vacuum can also be used in liquids, provided certain modifications are implemented. To facilitate comparison with previous measurements taken using surface forces apparatus, we choose a model system (octamethylcyclotetrasiloxane) that is known to exhibit short-ranged structural ordering when confined between two surfaces. Force measurements obtained are found to be in excellent agreement with previously reported results. This study therefore establishes FM-AFM as a powerful tool for the quantitative measurement of forces in liquid.

  1. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
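Once bioluminescence time series are in hand, circadian parameters are commonly extracted by fitting a sinusoid. A least-squares cosinor sketch (an illustrative downstream analysis step, not part of the protocol itself; function name hypothetical):

```python
import numpy as np

def cosinor_fit(times_h, values, period_h=24.0):
    """Fit y = M + A*cos(w*t) + B*sin(w*t) by linear least squares for a
    fixed candidate period, returning (mesor, amplitude, acrophase_h)."""
    w = 2.0 * np.pi / period_h
    t = np.asarray(times_h, dtype=float)
    y = np.asarray(values, dtype=float)
    # design matrix: constant, cosine, and sine regressors
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = float(np.hypot(a, b))
    acrophase_h = float((np.arctan2(b, a) / w) % period_h)  # peak time in hours
    return float(mesor), amplitude, acrophase_h
```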

  2. Quantitative experimental assessment of hot carrier-enhanced solar cells at room temperature

    NASA Astrophysics Data System (ADS)

    Nguyen, Dac-Trung; Lombez, Laurent; Gibelli, François; Boyer-Richard, Soline; Le Corre, Alain; Durand, Olivier; Guillemoles, Jean-François

    2018-03-01

    In common photovoltaic devices, the part of the incident energy above the absorption threshold quickly ends up as heat, which limits their maximum achievable efficiency to far below the thermodynamic limit for solar energy conversion. Conversely, the conversion of the excess kinetic energy of the photogenerated carriers into additional free energy would be sufficient to approach the thermodynamic limit. This is the principle of hot carrier devices. Unfortunately, such device operation under conditions relevant for practical use has never been demonstrated. Here, we show that the quantitative thermodynamic study of the hot carrier population, with luminance measurements, allows us to discuss the hot carrier contribution to the solar cell performance. We demonstrate that the voltage and current can be enhanced in a semiconductor heterostructure due to the presence of the hot carrier population in a single InGaAsP quantum well at room temperature. These experimental results substantiate the potential of increasing photovoltaic performances in the hot carrier regime.

  3. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  4. Quantitative angle-insensitive flow measurement using relative standard deviation OCT

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-01

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.

  5. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    PubMed

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.
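The RSD metric underlying the two records above is simply the temporal standard deviation of the OCT intensity at each pixel divided by its mean, computed over repeated scans of the same location. A minimal NumPy sketch (the array shapes and the omission of bulk-motion correction are assumptions, not details from the paper):

```python
import numpy as np

def rsd_map(bscans):
    """Relative standard deviation across repeated B-scans.

    bscans: array of shape (n_repeats, depth, lateral) holding OCT
    intensity at the same location; returns an RSD image in which
    flow (high temporal fluctuation) yields high values.
    """
    mean = bscans.mean(axis=0)
    std = bscans.std(axis=0)
    # Avoid division by zero in signal-free regions.
    return np.divide(std, mean, out=np.zeros_like(mean), where=mean > 0)

# Static tissue: nearly constant intensity -> RSD near 0.
static = np.full((8, 4, 4), 100.0)
print(rsd_map(static).max())  # 0.0
```

Because the standard deviation is normalized by the local mean, the value is insensitive to overall reflectivity, which is what makes it usable over a wide range of Doppler angles.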

  6. On Measuring Quantitative Interpretations of Reasonable Doubt

    ERIC Educational Resources Information Center

    Dhami, Mandeep K.

    2008-01-01

    Beyond reasonable doubt represents a probability value that acts as the criterion for conviction in criminal trials. I introduce the membership function (MF) method as a new tool for measuring quantitative interpretations of reasonable doubt. Experiment 1 demonstrated that three different methods (i.e., direct rating, decision theory based, and…

  7. Experimental measurements of rf breakdowns and deflecting gradients in mm-wave metallic accelerating structures

    DOE PAGES

    Dal Forno, Massimo; Dolgashev, Valery; Bowden, Gordon; ...

    2016-05-03

    We present an experimental study of a high-gradient metallic accelerating structure at sub-THz frequencies, where we investigated the physics of rf breakdowns. Wakefields in the structure were excited by an ultrarelativistic electron beam. We present the first quantitative measurements of gradients and metal vacuum rf breakdowns in sub-THz accelerating cavities. When the beam travels off axis, a deflecting field is induced in addition to the longitudinal field. We measured the deflecting forces by observing the displacement and changes in the shape of the electron bunch. This behavior can be exploited for subfemtosecond beam diagnostics.

  8. Prediction of Coronal Mass Ejections From Vector Magnetograms: Quantitative Measures as Predictors

    NASA Technical Reports Server (NTRS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    We derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram: 1) the net current (I(sub N)), and 2) the length of the strong-shear, strong-field main neutral line (L(sub ss)), and used these two measures in a pilot study of the CME productivity of 4 active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU, we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (I(sub N) and L(sub ss)) as well as two new ones: the total magnetic flux (PHI), a measure of an active region's size, and the normalized twist (alpha(bar) = mu I(sub N)/PHI). We found that the three quantitative measures of global nonpotentiality (I(sub N), L(sub ss), alpha(bar)) were all well correlated (greater than 99% confidence level) with an active region's CME productivity within plus or minus 2 days of the day of the magnetogram. We now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. 
This work is funded by NSF through the Space

  9. Semi-quantitative estimation of cellular SiO2 nanoparticles using flow cytometry combined with X-ray fluorescence measurements.

    PubMed

    Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun

    2014-09-01

    In this study, we have demonstrated the feasibility of a semi-quantitative approach for the estimation of cellular SiO2 nanoparticles (NPs), based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed for HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering and core diameters was proposed, which can be applied for the determination of cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.

  10. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.

  11. Experimental study of oscillating plates in viscous fluids: Qualitative and quantitative analysis of the flow physics and hydrodynamic forces

    NASA Astrophysics Data System (ADS)

    Shrestha, Bishwash; Ahsan, Syed N.; Aureli, Matteo

    2018-01-01

    In this paper, we present a comprehensive experimental study on harmonic oscillations of a submerged rigid plate in a quiescent, incompressible, Newtonian, viscous fluid. The fluid-structure interaction problem is analyzed from both qualitative and quantitative perspectives via a detailed particle image velocimetry (PIV) experimental campaign conducted over a broad range of oscillation frequency and amplitude parameters. Our primary goal is to identify the effect of the oscillation characteristics on the mechanisms of fluid-structure interaction and on the dynamics of vortex shedding and convection, and to elucidate the behavior of hydrodynamic forces on the oscillating structure. Towards this goal, we study the flow in terms of qualitative aspects of its pathlines, vortex shedding, and symmetry breaking phenomena and identify distinct hydrodynamic regimes in the vicinity of the oscillating structure. Based on these experimental observations, we produce a novel phase diagram detailing the occurrence of distinct hydrodynamic regimes as a function of relevant governing nondimensional parameters. We further study the hydrodynamic forces associated with each regime using both PIV and direct force measurement via a load cell. Our quantitative results on experimental estimation of hydrodynamic forces show good agreement with predictions from the literature, where numerical and semi-analytical models are available. The findings and observations in this work shed light on the relationship between flow physics, vortex shedding, and convection mechanisms and the hydrodynamic forces acting on a rigid oscillating plate and, as such, have relevance to various engineering applications, including energy harvesting devices, biomimetic robotic systems, and micro-mechanical sensors and actuators.

  12. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    NASA Astrophysics Data System (ADS)

    Egan, James; McMillan, Norman; Denieffe, David

    2011-08-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit, and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental limits over existing methods are discussed.
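Currie's three limits cited in this record have conventional closed forms in terms of the blank's standard deviation sigma_0: the critical level L_C = 1.645 sigma_0, the detection limit L_D = 3.29 sigma_0, and the determination limit L_Q = 10 sigma_0 (for 5% false-positive/false-negative risks and 10% relative precision). A hedged sketch of those standard factors; applying them to optical-network measurements is the record's proposal and is not reproduced here:

```python
def currie_limits(sigma_blank):
    """Currie's decision thresholds from the blank standard deviation.

    L_C: critical level (detection decision, ~5% false-positive rate)
    L_D: detection limit (~5% false-negative rate when deciding at L_C)
    L_Q: determination limit (~10% relative standard deviation)
    """
    L_C = 1.645 * sigma_blank   # z_0.95 * sigma
    L_D = 3.29 * sigma_blank    # 2 * 1.645 * sigma
    L_Q = 10.0 * sigma_blank    # point where sigma/L_Q = 10%
    return L_C, L_D, L_Q

# Example: blank noise of 0.1 (arbitrary power units).
print(tuple(round(v, 4) for v in currie_limits(0.1)))  # (0.1645, 0.329, 1.0)
```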

  13. Experimentally validated quantitative linear model for the device physics of elastomeric microfluidic valves

    NASA Astrophysics Data System (ADS)

    Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French

    2007-03-01

    A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "pushdown" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45-295 μm lateral dimensions, 16-39 μm membrane thickness, and 1-28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80-150 μm width) are shown to behave like thin springs.

  14. Quantitative color measurement for black walnut wood.

    Treesearch

    Ali A. Moslemi

    1967-01-01

    Black walnut (Juglans nigra L.) veneer specimens with wide variations in color were evaluated by a quantitative method of color measurement. The internationally adopted CIE system of colorimetry was used to analyze the data. These data were converted to also show them in the Munsell system. Color differences among the walnut veneer specimens were also numerically...

  15. Quantitative comparison of PZT and CMUT probes for photoacoustic imaging: Experimental validation.

    PubMed

    Vallet, Maëva; Varray, François; Boutet, Jérôme; Dinten, Jean-Marc; Caliano, Giosuè; Savoia, Alessandro Stuart; Vray, Didier

    2017-12-01

    Photoacoustic (PA) signals are short ultrasound (US) pulses typically characterized by a single-cycle shape, often referred to as N-shape. The spectral content of such wideband signals ranges from a few hundred kilohertz to several tens of megahertz. Typical reception frequency responses of classical piezoelectric US imaging transducers, based on PZT technology, are not sufficiently broadband to fully preserve the entire information contained in PA signals, which are then filtered, thus limiting PA imaging performance. Capacitive micromachined ultrasonic transducers (CMUT) are rapidly emerging as a valid alternative to conventional PZT transducers in several medical ultrasound imaging applications. As compared to PZT transducers, CMUTs exhibit both higher sensitivity and significantly broader frequency response in reception, making their use attractive in PA imaging applications. This paper explores the advantages of the CMUT larger bandwidth in PA imaging by carrying out an experimental comparative study using various CMUT and PZT probes from different research laboratories and manufacturers. PA acquisitions are performed on a suture wire and on several home-made bimodal phantoms with both PZT and CMUT probes. Three criteria, based on the evaluation of pure receive impulse response, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) respectively, have been used for a quantitative comparison of imaging results. The measured fractional bandwidths of the CMUT arrays are larger compared to PZT probes. Moreover, both SNR and CNR are enhanced by at least 6 dB with CMUT technology. This work highlights the potential of CMUT technology for PA imaging through qualitative and quantitative parameters.

  16. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including less specimen contact, being non-destructive to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine, from 1400 cm(-1) to 1500 cm(-1), was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant (55.9 mg/dl to 208 mg/dl) concentration range. The root-mean-square error of cross validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human subject urine creatinine detection, and establishes the SERS platform technique for bodily fluid measurement.
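The RMSECV figure quoted above (26.1 mg/dl) is the root-mean-square of prediction errors accumulated over cross-validation folds. The sketch below illustrates the idea with a leave-one-out loop around a simple univariate linear calibration; the study itself used a multivariate PLS model on full spectra, which is not reproduced here:

```python
import numpy as np

def rmsecv_loo(x, y):
    """Leave-one-out RMSECV for a univariate linear calibration.

    For each sample i, fit y = a*x + b on the remaining samples,
    predict y[i], and accumulate the error; RMSECV is the RMS of
    those held-out prediction errors.
    """
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        a, b = np.polyfit(x[mask], y[mask], 1)
        errors.append(y[i] - (a * x[i] + b))
    return float(np.sqrt(np.mean(np.square(errors))))

# Noiseless linear data -> RMSECV ≈ 0 (hypothetical concentrations).
x = np.array([55.9, 80.0, 120.0, 160.0, 208.0])
y = 2.0 * x + 1.0
print(rmsecv_loo(x, y))  # ≈ 0 for noiseless linear data
```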

  17. Quantitative Measurement of Trans-Fats by Infrared Spectroscopy

    ERIC Educational Resources Information Center

    Walker, Edward B.; Davies, Don R.; Campbell, Mike

    2007-01-01

    Trans-fat is a general term, which is mainly used to describe the various trans geometric isomers present in unsaturated fatty acids. Various techniques are now used for a quantitative measurement of the amount of trans-fats present in foods and cooking oil.

  18. A theoretical/experimental program to develop active optical pollution sensors: Quantitative remote Raman lidar measurements of pollutants from stationary sources

    NASA Technical Reports Server (NTRS)

    Poultney, S. K.; Brumfield, M. L.; Siviter, J. S.

    1975-01-01

    Typical pollutant gas concentrations at the stack exits of stationary sources can be estimated to be about 500 ppm under the present emission standards. Raman lidar has a number of advantages which makes it a valuable tool for remote measurements of these stack emissions. Tests of the Langley Research Center Raman lidar at a calibration tank indicate that night measurements of SO2 concentrations and stack opacity are possible. Accuracies of 10 percent are shown to be achievable from a distance of 300 m within 30 min integration times for 500 ppm SO2 at the stack exits. All possible interferences were examined quantitatively (except for the fluorescence of aerosols in actual stack emissions) and found to have negligible effect on the measurements. An early test at an instrumented stack is strongly recommended.

  19. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  20. Theory and preliminary experimental verification of quantitative edge illumination x-ray phase contrast tomography.

    PubMed

    Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A

    2014-04-07

    X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.

  1. Racial Differences in Quantitative Measures of Area and Volumetric Breast Density

    PubMed Central

    McCarthy, Anne Marie; Keller, Brad M.; Pantalone, Lauren M.; Hsieh, Meng-Kang; Synnestvedt, Marie; Conant, Emily F.; Armstrong, Katrina; Kontos, Despina

    2016-01-01

    Background: Increased breast density is a strong risk factor for breast cancer and also decreases the sensitivity of mammographic screening. The purpose of our study was to compare breast density for black and white women using quantitative measures. Methods: Breast density was assessed among 5282 black and 4216 white women screened using digital mammography. Breast Imaging-Reporting and Data System (BI-RADS) density was obtained from radiologists’ reports. Quantitative measures for dense area, area percent density (PD), dense volume, and volume percent density were estimated using validated, automated software. Breast density was categorized as dense or nondense based on BI-RADS categories or based on values above and below the median for quantitative measures. Logistic regression was used to estimate the odds of having dense breasts by race, adjusted for age, body mass index (BMI), age at menarche, menopause status, family history of breast or ovarian cancer, parity and age at first birth, and current hormone replacement therapy (HRT) use. All statistical tests were two-sided. Results: There was a statistically significant interaction of race and BMI on breast density. After accounting for age, BMI, and breast cancer risk factors, black women had statistically significantly greater odds of high breast density across all quantitative measures (eg, PD nonobese odds ratio [OR] = 1.18, 95% confidence interval [CI] = 1.02 to 1.37, P = .03, PD obese OR = 1.26, 95% CI = 1.04 to 1.53, P = .02). There was no statistically significant difference in BI-RADS density by race. Conclusions: After accounting for age, BMI, and other risk factors, black women had higher breast density than white women across all quantitative measures previously associated with breast cancer risk. These results may have implications for risk assessment and screening. PMID:27130893
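The odds ratios in this record come from covariate-adjusted logistic regression; as a simpler illustration of how an odds ratio and its Wald 95% confidence interval are formed, the sketch below uses an unadjusted 2x2 table with hypothetical counts:

```python
import math

def odds_ratio_2x2(a, b, c, d):
    """Unadjusted odds ratio with a 95% Wald confidence interval.

    2x2 table: a = exposed with outcome,   b = exposed without,
               c = unexposed with outcome, d = unexposed without.
    CI: exp(log(OR) ± 1.96 * sqrt(1/a + 1/b + 1/c + 1/d)).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: dense vs. nondense breasts in two groups.
print(odds_ratio_2x2(300, 200, 200, 300)[0])  # 2.25
```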

  2. Quantitative Velocity Field Measurements in Reduced-Gravity Combustion Science and Fluid Physics Experiments

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Wernet, Mark P.

    1999-01-01

    Systems have been developed and demonstrated for performing quantitative velocity measurements in reduced gravity combustion science and fluid physics investigations. The unique constraints and operational environments inherent to reduced-gravity experimental facilities pose special challenges to the development of hardware and software systems. Both point and planar velocimetric capabilities are described, with particular attention being given to the development of systems to support the International Space Station laboratory. Emphasis has been placed on optical methods, primarily arising from the sensitivity of the phenomena of interest to intrusive probes. Limitations on available power, volume, data storage, and attendant expertise have motivated the use of solid-state sources and detectors, as well as efficient analysis capabilities emphasizing interactive data display and parameter control.

  3. Quantitative measurement of the near-field enhancement of nanostructures by two-photon polymerization.

    PubMed

    Geldhauser, Tobias; Kolloch, Andreas; Murazawa, Naoki; Ueno, Kosei; Boneberg, Johannes; Leiderer, Paul; Scheer, Elke; Misawa, Hiroaki

    2012-06-19

    The quantitative determination of the strength of the near-field enhancement in and around nanostructures is essential for optimizing and using these structures for applications. We combine the Gaussian intensity distribution of a laser profile with two-photon polymerization of SU-8 into a suitable tool for the quantitative experimental measurement of the near-field enhancement of a nanostructure. Our results give feedback to the results obtained by finite-difference time-domain (FDTD) simulations. The structures under investigation are gold nanotriangles on a glass substrate with 85 nm side length and a thickness of 40 nm. We compare the threshold fluence for polymerization for areas of the Gaussian intensity profile with and without the near-field enhancement of the nanostructures. The experimentally obtained value of the near-field intensity enhancement is 600 ± 140, independent of the laser power, irradiation time, and spot size. The FDTD simulation shows a pointlike maximum of 2600 at the tip. In a more extended area, with an approximate size close to the smallest polymerized structure of 25 nm in diameter, we find a value between 800 and 600. Using our novel approach, we determine the threshold fluence for polymerization of the commercially available photopolymerizable resin SU-8 by a femtosecond laser working at a wavelength of 795 nm and a repetition rate of 82 MHz to be 0.25 J/cm(2), almost independent of the irradiation time and the laser power used. This finding is important for future applications of the method because it enables the use of varying laser systems.

  4. Quantitative Determination of Isotope Ratios from Experimental Isotopic Distributions

    PubMed Central

    Kaur, Parminder; O’Connor, Peter B.

    2008-01-01

    Isotope variability due to natural processes provides important information for studying a variety of complex natural phenomena from the origins of a particular sample to the traces of biochemical reaction mechanisms. These measurements require high-precision determination of isotope ratios of a particular element involved. Isotope Ratio Mass Spectrometers (IRMS) are widely employed tools for such a high-precision analysis, which have some limitations. This work aims at overcoming the limitations inherent to IRMS by estimating the elemental isotopic abundance from the experimental isotopic distribution. In particular, a computational method has been derived which allows the calculation of 13C/12C ratios from the whole isotopic distributions, given certain caveats, and these calculations are applied to several cases to demonstrate their utility. The limitations of the method in terms of the required number of ions and S/N ratio are discussed. For high-precision estimates of the isotope ratios, this method requires very precise measurement of the experimental isotopic distribution abundances, free from any artifacts introduced by noise, sample heterogeneity, or other experimental sources. PMID:17263354
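For a molecule whose only heavy-isotope contribution comes from carbon, the isotopic distribution is binomial in the 13C abundance p, so the M+1/M intensity ratio equals n·p/(1-p) for n carbons. The sketch below inverts that relation under this idealized assumption; real spectra require corrections for other elements (H, N, O, S), which are ignored here:

```python
def c13_ratio_from_peaks(I0, I1, n_carbons):
    """Estimate the 13C/12C ratio p/(1-p) from the monoisotopic (M)
    and M+1 peak intensities of an n-carbon molecule.

    Binomial model: I1/I0 = n * p / (1 - p), hence
    p/(1-p) = (I1/I0) / n.
    """
    return (I1 / I0) / n_carbons

# Forward check with p = 0.0107 (natural 13C abundance), n = 20 carbons:
p, n = 0.0107, 20
I0 = (1 - p) ** n                   # all-12C peak
I1 = n * p * (1 - p) ** (n - 1)     # exactly one 13C
print(round(c13_ratio_from_peaks(I0, I1, n), 6))  # 0.010816 = p/(1-p)
```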

  5. MR-ARFI-based method for the quantitative measurement of tissue elasticity: application for monitoring HIFU therapy

    NASA Astrophysics Data System (ADS)

    Vappou, Jonathan; Bour, Pierre; Marquet, Fabrice; Ozenne, Valery; Quesson, Bruno

    2018-05-01

    Monitoring thermal therapies through medical imaging is essential in order to ensure that they are safe, efficient and reliable. In this paper, we propose a new approach, halfway between MR acoustic radiation force imaging (MR-ARFI) and MR elastography (MRE), allowing for the quantitative measurement of the elastic modulus of tissue in a highly localized manner. It relies on the simulation of the MR-ARFI profile, which depends on tissue biomechanical properties, and on the identification of tissue elasticity through the fitting of experimental displacement images measured using rapid MR-ARFI. This method was specifically developed to monitor MR-guided high intensity focused ultrasound (MRgHIFU) therapy. Elasticity changes were followed during HIFU ablations (N  =  6) performed ex vivo in porcine muscle samples, and were compared to temperature changes measured by MR-thermometry. Shear modulus was found to increase consistently and steadily a few seconds after the heating started, and such changes were found to be irreversible. The shear modulus was found to increase from 1.49  ±  0.48 kPa (before ablation) to 3.69  ±  0.93 kPa (after ablation and cooling). Thanks to its ability to perform quantitative elasticity measurements in a highly localized manner around the focal spot, this method proved to be particularly attractive for monitoring HIFU ablations.

  6. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    DOE PAGES

    Nagayama, T.; Bailey, J. E.; Loisel, G.; ...

    2016-02-05

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10(exp 22) cm(exp -3) revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy-density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the

  7. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  8. Prediction of Coronal Mass Ejections from Vector Magnetograms: Quantitative Measures as Predictors

    NASA Astrophysics Data System (ADS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.

    2001-05-01

    In a pilot study of 4 active regions (Falconer, D.A. 2001, JGR, in press), we derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram: 1) the net current (IN), and 2) the length of the strong-shear, strong-field main neutral line (LSS), and used these two measures to gauge the CME productivity of the active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU (Falconer, Moore, & Gary, 2000, EOS 81, 48 F998), we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (IN and LSS) as well as two new ones: the total magnetic flux (Φ), a measure of an active region's size, and the normalized twist (α = μIN/Φ). We found that the three measures of global nonpotentiality (IN, LSS, α) were all well correlated (>99% confidence level) with an active region's CME productivity within ±2 days of the day of the magnetogram. We now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. This work is

  9. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
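    The precision and proportionality indicators described can be sketched with a simple calculation: per-dilution coefficients of variation for repeatability, and residual error around a best-fit zero-intercept line for proportionality. This is an illustrative reduction of the approach, not the authors' published analysis; the function name and toy counts are invented:

```python
import numpy as np

def precision_and_proportionality(dilution_fractions, counts):
    """counts: rows = dilution levels, cols = replicate observations.
    Returns per-level CV (precision) and a proportionality index:
    normalized RMS residual around the best zero-intercept line."""
    counts = np.asarray(counts, dtype=float)
    f = np.asarray(dilution_fractions, dtype=float)
    cv = counts.std(axis=1, ddof=1) / counts.mean(axis=1)
    means = counts.mean(axis=1)
    slope = (f @ means) / (f @ f)  # least squares with intercept fixed at 0
    resid = means - slope * f
    prop_index = np.sqrt(np.mean(resid ** 2)) / means.mean()
    return cv, prop_index

# toy dilution series: mean counts scale with the dilution fraction
cv, prop_index = precision_and_proportionality(
    [1.0, 0.5, 0.25],
    [[100, 102, 98], [50, 51, 49], [25, 25, 25]])
```

A perfectly proportional counting process drives the proportionality index toward zero; comparing two counting methods then reduces to comparing their CV profiles and proportionality indices.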

  10. Quantitative measures detect sensory and motor impairments in multiple sclerosis.

    PubMed

    Newsome, Scott D; Wang, Joseph I; Kang, Jonathan Y; Calabresi, Peter A; Zackowski, Kathleen M

    2011-06-15

    Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales that rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified by a hand-held dynamometer. We also recorded Expanded Disability Status Scale (EDSS) and Timed 25-Foot Walk (T25FW). t-tests and Wilcoxon rank-sum tests were used to compare group data. Spearman correlations were used to assess relationships between each measure. We also used a step-wise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). EDSS scores ranged from 0-7.5, mean disease duration was 10.4 ± 9.6 years, and 66% were female. In relapsing-remitting MS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas progressive groups' ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (i.e., EDSS), but they had improved sensitivity, as they detected impairments in up to 32% of MS subjects with normal sensory and pyramidal FSS. Sensory and motor deficits in MS can be quantified using clinically accessible tools and distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and for future MS clinical trials of neurorehabilitative and neuroreparative interventions. Copyright © 2011 Elsevier B.V. All rights reserved.
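    The Spearman correlations used here can be illustrated with a rank-based computation; a minimal sketch (no tie handling) using invented data, not the study's:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks.
    Minimal version; assumes no tied values."""
    def ranks(a):
        order = np.argsort(a)
        r = np.empty(len(a))
        r[order] = np.arange(1, len(a) + 1)
        return r
    rx, ry = ranks(np.asarray(x)), ranks(np.asarray(y))
    return np.corrcoef(rx, ry)[0, 1]

# hypothetical: higher vibration threshold (worse sensation) with higher EDSS
vib = [1.2, 2.5, 3.1, 4.8, 6.0]
edss = [1.0, 2.0, 3.5, 4.0, 6.5]
rho = spearman_rho(vib, edss)  # perfectly monotonic data -> 1.0
```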

  11. Quantitative measures detect sensory and motor impairments in multiple sclerosis

    PubMed Central

    Newsome, Scott D.; Wang, Joseph I.; Kang, Jonathan Y.; Calabresi, Peter A.; Zackowski, Kathleen M.

    2011-01-01

    Background Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales that rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. Objective To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. Methods We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified by a hand-held dynamometer. We also recorded Expanded Disability Status Scale (EDSS) and timed 25-foot walk (T25FW). T-tests and Wilcoxon rank-sum tests were used to compare group data. Spearman correlations were used to assess relationships between each measure. We also used a step-wise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). Results EDSS scores ranged from 0-7.5, mean disease duration was 10.4±9.6 years, and 66% were female. In RRMS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas progressive groups’ ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (EDSS), but they had improved sensitivity, as they detected impairments in up to 32% of MS subjects with normal sensory FSS. Conclusions Sensory and motor deficits can be quantified using clinically accessible tools and distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and for future MS clinical trials of neurorehabilitative and neuroreparative interventions. PMID:21458828

  12. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
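    The cleavage-entropy construction lends itself to a compact sketch: per-subpocket Shannon entropy over cleaved-substrate residues, normalized by the 20-letter amino-acid alphabet and summed over subpockets. This is an illustrative reading of the definition, with alignment handling and MEROPS parsing omitted:

```python
import math
from collections import Counter

def subpocket_entropy(residues, alphabet_size=20):
    """Shannon entropy of the residue distribution at one subpocket,
    normalized to [0, 1] by the log of the alphabet size."""
    counts = Counter(residues)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total)
                for c in counts.values()) / math.log(alphabet_size)

def total_cleavage_entropy(substrates):
    """Sum subpocket entropies over the columns of equal-length,
    pre-aligned cleaved-substrate sequences."""
    return sum(subpocket_entropy(col) for col in zip(*substrates))

# a strictly specific protease (identical substrates) scores 0;
# fully promiscuous subpockets each contribute 1
specificity_score = total_cleavage_entropy(["AGR", "AGR", "AGR"])
```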

  13. Assessment of the neutron dose field around a biomedical cyclotron: FLUKA simulation and experimental measurements.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2016-12-01

    In the planning of a new cyclotron facility, an accurate knowledge of the radiation field around the accelerator is fundamental for the design of shielding, the protection of workers, the general public and the environment. Monte Carlo simulations can be very useful in this process, and their use is constantly increasing. However, few data have been published so far as regards the proper validation of Monte Carlo simulation against experimental measurements, particularly in the energy range of biomedical cyclotrons. In this work a detailed model of an existing installation of a GE PETtrace 16.5MeV cyclotron was developed using FLUKA. An extensive measurement campaign of the neutron ambient dose equivalent H ∗ (10) in marked positions around the cyclotron was conducted using a neutron rem-counter probe and CR39 neutron detectors. Data from a previous measurement campaign performed by our group using TLDs were also re-evaluated. The FLUKA model was then validated by comparing the results of high-statistics simulations with experimental data. In 10 out of 12 measurement locations, FLUKA simulations were in agreement within uncertainties with all the three different sets of experimental data; in the remaining 2 positions, the agreement was with 2/3 of the measurements. Our work allows to quantitatively validate our FLUKA simulation setup and confirms that Monte Carlo technique can produce accurate results in the energy range of biomedical cyclotrons. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. Quantitation of absorbed or deposited materials on a substrate that measures energy deposition

    DOEpatents

    Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham

    2005-01-18

    This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the amount of mass of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.

  15. Adipose tissue MRI for quantitative measurement of central obesity.

    PubMed

    Poonawalla, Aziz H; Sjoberg, Brett P; Rehm, Jennifer L; Hernando, Diego; Hines, Catherine D; Irarrazaval, Pablo; Reeder, Scott B

    2013-03-01

    To validate adipose tissue magnetic resonance imaging (atMRI) for rapid, quantitative volumetry of visceral adipose tissue (VAT) and total adipose tissue (TAT). Data were acquired on normal adults and clinically overweight girls with Institutional Review Board (IRB) approval/parental consent using sagittal 6-echo 3D-spoiled gradient-echo (SPGR) (26-sec single-breath-hold) at 3T. Fat-fraction images were reconstructed with quantitative corrections, permitting measurement of a physiologically based fat-fraction threshold in normals to identify adipose tissue, for automated measurement of TAT, and semiautomated measurement of VAT. TAT accuracy was validated using oil phantoms and in vivo TAT/VAT measurements validated with manual segmentation. Group comparisons were performed between normals and overweight girls using TAT, VAT, VAT-TAT-ratio (VTR), body-mass-index (BMI), waist circumference, and waist-hip-ratio (WHR). Oil phantom measurements were highly accurate (<3% error). The measured adipose fat-fraction threshold was 96% ± 2%. VAT and TAT correlated strongly with manual segmentation (normals r² ≥ 0.96, overweight girls r² ≥ 0.99). VAT segmentation required 30 ± 11 minutes/subject (14 ± 5 sec/slice) using atMRI, versus 216 ± 73 minutes/subject (99 ± 31 sec/slice) manually. Group discrimination was significant using WHR (P < 0.001) and VTR (P = 0.004). The atMRI technique permits rapid, accurate measurements of TAT, VAT, and VTR. Copyright © 2012 Wiley Periodicals, Inc.
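    The automated TAT measurement described amounts to thresholding the fat-fraction image and converting the voxel count to a volume. A minimal sketch; the 0.96 default echoes the measured threshold in the abstract, and the array values are invented:

```python
import numpy as np

def adipose_volume_ml(fat_fraction, voxel_volume_ml, threshold=0.96):
    """Volume of voxels whose fat fraction meets the adipose threshold."""
    mask = np.asarray(fat_fraction) >= threshold
    return float(mask.sum()) * voxel_volume_ml

# toy 2x2 fat-fraction map: two adipose voxels of 0.001 mL each
ff = np.array([[0.99, 0.50],
               [0.97, 0.10]])
vol = adipose_volume_ml(ff, voxel_volume_ml=0.001)  # -> 0.002 mL
```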

  16. Quantitative measurements of magnetic vortices using position resolved diffraction in Lorentz STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaluzec, N. J.

    2002-03-05

    A number of electron column techniques have been developed over the last forty years to permit visualization of magnetic fields in specimens. These include: Fresnel imaging, Differential Phase Contrast, Electron Holography and Lorentz STEM. In this work we have extended the LSTEM methodology using Position Resolved Diffraction (PRD) to quantitatively measure the in-plane electromagnetic fields of thin film materials. The experimental work reported herein has been carried out using the ANL AAEM HB603Z 300 kV FEG instrument. In this instrument, the electron optical column was operated in a zero field mode, at the specimen, where the objective lens is turned off and the probe forming lens functions were reallocated to the C1, C2, and C3 lenses. Post specimen lenses (P1, P2, P3, P4) were used to magnify the transmitted electrons to a YAG screen, which was then optically transferred to a Hamamatsu ORCA ER CCD array. This CCD was interfaced to an EmiSpec Data Acquisition System and the data was subsequently transferred to an external computer system for detailed quantitative analysis. In Position Resolved Diffraction mode, we digitally step a focused electron probe across the region of interest of the specimen while at the same time recording the complete diffraction pattern at each point in the scan.

  17. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    This dissertation consists of three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. 
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The
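    The failure of common binary association measures under extreme imbalance, noted in the second part, can be seen with the phi (Pearson) coefficient on 2x2 tables; an illustrative sketch with invented counts:

```python
import math

def phi_coefficient(table):
    """Pearson correlation for a 2x2 table [[a, b], [c, d]].
    With extremely unbalanced margins its attainable range shrinks,
    attenuating the value even when the association is real."""
    (a, b), (c, d) = table
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den if den else 0.0

balanced = phi_coefficient([[40, 10], [10, 40]])  # -> 0.6
unbalanced = phi_coefficient([[1, 0], [49, 50]])  # same direction, ~0.1
```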

  18. Experimental measurement of coil-rod-coil block copolymer tracer diffusion through entangled coil homopolymers

    PubMed Central

    Wang, Muzhou; Timachova, Ksenia; Olsen, Bradley D.

    2014-01-01

    The diffusion of coil-rod-coil triblock copolymers in entangled coil homopolymers is experimentally measured and demonstrated to be significantly slower than rod or coil homopolymers of the same molecular weight. A model coil-rod-coil triblock was prepared by expressing rodlike alanine-rich α-helical polypeptides in E. coli and conjugating coillike poly(ethylene oxide) (PEO) to both ends to form coil-rod-coil triblock copolymers. Tracer diffusion through entangled PEO homopolymer melts was measured using forced Rayleigh scattering at various rod lengths, coil molecular weights, and coil homopolymer concentrations. For rod lengths, L, that are close to the entanglement length, a, the ratio between triblock diffusivity and coil homopolymer diffusivity decreases monotonically and is only a function of L/a, in quantitative agreement with previous simulation results. For large rod lengths, diffusion follows an arm retraction scaling, which is also consistent with previous theoretical predictions. These experimental results support the key predictions of theory and simulation, suggesting that the mismatch in curvature between rod and coil entanglement tubes leads to the observed diffusional slowing. PMID:25484454

  19. A thorough experimental study of CH/π interactions in water: quantitative structure-stability relationships for carbohydrate/aromatic complexes.

    PubMed

    Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis

    2015-11-13

    CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.
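    Dynamic combinatorial experiments of this kind report relative stabilities through equilibrium product ratios, which convert to free energies via ΔΔG = −RT ln(ratio). A minimal sketch of that conversion (the values are invented, not the study's data):

```python
import math

R_KJ = 8.314462618e-3  # gas constant, kJ/(mol K)

def ddg_kj_per_mol(pop_complex, pop_reference, temp_k=298.15):
    """Relative free energy from equilibrium populations:
    ddG = -R * T * ln(p_complex / p_reference)."""
    return -R_KJ * temp_k * math.log(pop_complex / pop_reference)

# a 2:1 preference for the stacked complex is about -1.7 kJ/mol at 298 K
ddg = ddg_kj_per_mol(2.0, 1.0)
```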

  20. Listening to light scattering in turbid media: quantitative optical scattering imaging using photoacoustic measurements with one-wavelength illumination

    NASA Astrophysics Data System (ADS)

    Yuan, Zhen; Li, Xiaoqi; Xi, Lei

    2014-06-01

    Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability of quantitatively imaging optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has been focused on recovering the absorption coefficient only, by assuming the scattering coefficient to be constant. An effective method for photoacoustically recovering the optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method developed combines conventional PAT with the photon diffusion equation in a novel way to realize the recovery of scattering coefficient. We demonstrate the method using various objects having scattering contrast only or both absorption and scattering contrasts embedded in turbid media. The listening-to-light-scattering method described will be able to provide high-resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.

  1. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    ERIC Educational Resources Information Center

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  2. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
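    The predictor-specification question can be made concrete for a basic AB (baseline-treatment) single-subject design: an intercept, time centered at the phase change, a treatment dummy, and their interaction, so that the dummy's coefficient estimates the level change and the interaction the slope change. This is a hypothetical coding, not the article's own matrices:

```python
import numpy as np

def ab_design_matrix(n_obs, phase_start):
    """Design matrix for one AB single-subject case:
    [intercept, centered time, treatment dummy, dummy x centered time]."""
    t = np.arange(n_obs)
    d = (t >= phase_start).astype(float)  # 0 in baseline, 1 in treatment
    t_c = t - phase_start                 # time centered at the phase change
    return np.column_stack([np.ones(n_obs), t_c, d, d * t_c])

X = ab_design_matrix(n_obs=10, phase_start=4)  # shape (10, 4)
```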

  3. Quantitative measurement of transverse injector and free stream interaction in a nonreacting SCRAMJET combustor using laser-induced iodine fluorescence

    NASA Technical Reports Server (NTRS)

    Fletcher, D. G.; Mcdaniel, J. C.

    1987-01-01

    A preliminary quantitative study of the compressible flowfield in a steady, nonreacting model SCRAMJET combustor using laser-induced iodine fluorescence (LIIF) is reported. Measurements of density, temperature, and velocity were conducted with the calibrated, nonintrusive, optical technique for two different combustor operating conditions. First, measurements were made in the supersonic flow over a rearward-facing step without transverse injection for comparison with calculated pressure profiles. The second configuration was staged injection behind the rearward-facing step at an injection dynamic pressure ratio of 1.06. These experimental results will be used to validate computational fluid dynamic (CFD) codes being developed to model supersonic combustor flowfields.

  4. Improvement of Quantitative Measurements in Multiplex Proteomics Using High-Field Asymmetric Waveform Spectrometry.

    PubMed

    Pfammatter, Sibylle; Bonneil, Eric; Thibault, Pierre

    2016-12-02

    Quantitative proteomics using isobaric reagent tandem mass tags (TMT) or isobaric tags for relative and absolute quantitation (iTRAQ) provides a convenient approach to compare changes in protein abundance across multiple samples. However, the analysis of complex protein digests by isobaric labeling can be undermined by the relatively large proportion of co-selected peptide ions that lead to distorted reporter ion ratios and affect the accuracy and precision of quantitative measurements. Here, we investigated the use of high-field asymmetric waveform ion mobility spectrometry (FAIMS) in proteomic experiments to reduce sample complexity and improve protein quantification using TMT isobaric labeling. LC-FAIMS-MS/MS analyses of human and yeast protein digests led to significant reductions in interfering ions, which increased the number of quantifiable peptides by up to 68% while significantly improving the accuracy of abundance measurements compared to that with conventional LC-MS/MS. The improvement in quantitative measurements using FAIMS is further demonstrated for the temporal profiling of protein abundance of HEK293 cells following heat shock treatment.
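    The ratio distortion from co-selected ions can be illustrated with a toy mixing model in which interfering background at 1:1 dilutes the true reporter signal; the numbers are illustrative, not the study's:

```python
def observed_ratio(true_ratio, interference_fraction):
    """Reporter-ion ratio compression: a fraction of the signal comes
    from co-isolated peptides assumed unchanged (1:1), pulling the
    observed ratio toward unity."""
    f = interference_fraction
    return ((1 - f) * true_ratio + f) / ((1 - f) + f)

# 30% co-isolation compresses a true 4-fold change to 3.1-fold
r = observed_ratio(4.0, 0.30)  # -> 3.1
```

Reducing the interference fraction, for example by an extra gas-phase separation such as FAIMS, moves observed ratios back toward the true values.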

  5. Experimental Measurements at the MASURCA Facility

    NASA Astrophysics Data System (ADS)

    Assal, W.; Bosq, J. C.; Mellier, F.

    2012-12-01

    Dedicated to neutronics studies of fast and semi-fast reactor lattices, MASURCA (meaning “mock-up facility for fast breeder reactor studies at CADARACHE”) is an airflow-cooled fast reactor operating at a maximum power of 5 kW and playing an important role in CEA research activities. Many neutron integral experimental programs have been undertaken at this facility. The purpose of this poster is to give a panorama of the facility from this experimental-measurement point of view; a hint at the forthcoming refurbishment will be included. These programs include various experimental measurements (reactivity, flux distributions, reaction rates), performed essentially with fission chambers, in accordance with different methods (noise methods, radial or axial traverses, rod drops) and involving several device systems (monitors, fission chambers, amplifiers, power supplies, data acquisition systems ...). For this purpose, electronics modules are implemented to shape the signals sent from the detectors in various modes (fluctuation, pulse, current). All the electric and electronic devices needed for these measurements and the related wiring will be fully explained through comprehensive layouts. Data acquired during counting performed at startup or during rod drops are analyzed by means of a Neutronic Measurement Treatment (TMN in French) program built on MATLAB. This toolbox provides data-file management, reactivity evaluation from neutronics measurements, and transient or divergence simulation at zero power. Particular uses of the TMN at MASURCA will be presented.

  6. Quantitative cell biology: the essential role of theory.

    PubMed

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or to new discoveries, when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. Measurements in quantitative research: how to select and report on research instruments.

    PubMed

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  8. Quantitative measurement of intervertebral disc signal using MRI.

    PubMed

    Niemeläinen, R; Videman, T; Dhillon, S S; Battié, M C

    2008-03-01

    To investigate the spinal cord as an alternative intra-body reference to cerebrospinal fluid (CSF) in evaluating thoracic disc signal intensity. T2-weighted magnetic resonance imaging (MRI) images of T6-T12 were obtained using 1.5 T machines for a population-based sample of 523 men aged 35-70 years. Quantitative data on the signal intensities were acquired using an image analysis program (SpEx). A random sample of 30 subjects and intraclass correlation coefficients (ICC) were used to examine the repeatability of the spinal cord measurements. The validity of using the spinal cord as a reference was examined by correlating cord and CSF samples. Finally, thoracic disc signal was validated by correlating it with age without adjustment and adjusting for either cord or CSF. Pearson's r was used for correlational analyses. The repeatability of the spinal cord signal measurements was extremely high (≥0.99). The correlations between the signals of spinal cord and CSF by level were all above 0.9. The spinal cord-adjusted disc signal and age correlated similarly with CSF-adjusted disc signal and age (r=-0.30 to -0.40 versus r=-0.26 to -0.36). Adjacent spinal cord is a good alternative reference to the current reference standard, CSF, for quantitative measurements of disc signal intensity. Clearly fewer levels were excluded when using spinal cord as compared to CSF due to missing reference samples.

  9. Hematocrit Measurement with R2* and Quantitative Susceptibility Mapping in Postmortem Brain.

    PubMed

    Walsh, A J; Sun, H; Emery, D J; Wilman, A H

    2018-05-24

    Noninvasive venous oxygenation quantification with MR imaging will improve the neurophysiologic investigation and the understanding of the pathophysiology in neurologic diseases. Available MR imaging methods are limited by sensitivity to flow and often require assumptions of the hematocrit level. In situ postmortem imaging enables evaluation of methods in a fully deoxygenated environment without flow artifacts, allowing direct calculation of hematocrit. This study compares 2 venous oxygenation quantification methods in in situ postmortem subjects. Transverse relaxation (R2*) mapping and quantitative susceptibility mapping were performed on a whole-body 4.7T MR imaging system. Intravenous measurements in major draining intracranial veins were compared between the 2 methods in 3 postmortem subjects. The quantitative susceptibility mapping technique was also applied in 10 healthy control subjects and compared with reference venous oxygenation values. In 2 early postmortem subjects, R2* mapping and quantitative susceptibility mapping measurements within intracranial veins had a significant and strong correlation (R2 = 0.805, P = .004 and R2 = 0.836, P = .02). Higher R2* and susceptibility values were consistently demonstrated within gravitationally dependent venous segments during the early postmortem period. Hematocrit ranged from 0.102 to 0.580 in postmortem subjects, with R2* and susceptibility as large as 291 s-1 and 1.75 ppm, respectively. Measurements of R2* and quantitative susceptibility mapping within large intracranial draining veins have a high correlation in early postmortem subjects. This study supports the use of quantitative susceptibility mapping for evaluation of in vivo venous oxygenation and postmortem hematocrit concentrations. © 2018 by American Journal of Neuroradiology.

  10. Experimental Measurement-Device-Independent Entanglement Detection

    NASA Astrophysics Data System (ADS)

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-02-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon-polarization systems. In the MDI setting, there is no need to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols.

  11. Experimental Measurement-Device-Independent Entanglement Detection

    PubMed Central

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon-polarization systems. In the MDI setting, there is no need to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  12. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE PAGES

    Shin, Sangmin; Lee, Seungyub; Judi, David; ...

    2018-02-07

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  13. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Sangmin; Lee, Seungyub; Judi, David

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  14. Experimental simulation of the effects of sudden increases in geomagnetic activity upon quantitative measures of human brain activity: validation of correlational studies.

    PubMed

    Mulligan, Bryce P; Persinger, Michael A

    2012-05-10

    Previous correlations between geomagnetic activity and quantitative changes in electroencephalographic power revealed particular associations with the right parietal lobe for theta activity and the right frontal region for gamma activity. In the present experiment, subjects were exposed to no field (sham conditions) or to 20 nT or 70 nT, 7 Hz, amplitude-modulated (mHz range) magnetic fields for 30 min. Quantitative electroencephalographic (QEEG) measurements were completed before, during, and after the field exposures. After about 10 min of exposure, theta power over the right parietal region was enhanced for the 20 nT exposure but suppressed for the 70 nT exposure relative to sham field exposures. The effect dissipated by the end of the exposure. These results support the contention that magnetic field fluctuations were primarily responsible for the significant geomagnetic-QEEG correlations reported in several studies. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  15. Quantitative phase measurement for wafer-level optics

    NASA Astrophysics Data System (ADS)

    Qu, Weijuan; Wen, Yongfu; Wang, Zhaomin; Yang, Fang; Huang, Lei; Zuo, Chao

    2015-07-01

    Wafer-level optics is now widely used in smartphone cameras, mobile video conferencing, and medical equipment that requires tiny cameras. Extracting quantitative phase information has received increased interest in order to quantify the quality of manufactured wafer-level optics, detect defective devices before packaging, and provide feedback for manufacturing process control, all at the wafer level for high-throughput microfabrication. We demonstrate two phase imaging methods, digital holographic microscopy (DHM) and the Transport-of-Intensity Equation (TIE), to measure the phase of wafer-level lenses. DHM is a laser-based interferometric method based on the interference of two wavefronts and can perform a phase measurement in a single shot. TIE-based retrieval instead requires a minimum of two measurements of the spatial intensity of the optical wave in closely spaced planes perpendicular to the direction of propagation, from which the phase is recovered directly by solving a second-order differential equation, i.e., with a non-iterative deterministic algorithm. TIE is, however, a non-interferometric method and can therefore be applied to partially coherent light. We demonstrate the capabilities and limitations of the two phase measurement methods for wafer-level optics inspection.
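    For context, the deterministic phase retrieval mentioned in this record rests on the standard form of the Transport-of-Intensity Equation, which relates the through-focus intensity derivative to the transverse phase gradient (k is the wavenumber, I the intensity, φ the phase; the equation is supplied here for reference, not reproduced from the record):

```latex
-k \,\frac{\partial I(x,y,z)}{\partial z}
  = \nabla_{\!\perp} \cdot \big[\, I(x,y,z)\, \nabla_{\!\perp} \phi(x,y,z) \,\big]
```

    Measuring I in two closely spaced planes approximates the left-hand side by a finite difference, after which the equation can be solved for φ without iteration.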

  16. Quantitative thickness measurement of polarity-inverted piezoelectric thin-film layer by scanning nonlinear dielectric microscopy

    NASA Astrophysics Data System (ADS)

    Odagawa, Hiroyuki; Terada, Koshiro; Tanaka, Yohei; Nishikawa, Hiroaki; Yanagitani, Takahiko; Cho, Yasuo

    2017-10-01

    A quantitative measurement method for a polarity-inverted layer in ferroelectric or piezoelectric thin film is proposed. It is performed nondestructively by scanning nonlinear dielectric microscopy (SNDM). In SNDM, linear and nonlinear dielectric constants are measured using a probe that converts the variation of capacitance related to these constants into the variation of electrical oscillation frequency. In this paper, we describe a principle for determining the layer thickness and some calculation results of the output signal, which are related to the radius of the probe tip and the thickness of the inverted layer. Moreover, we derive an equation that represents the relationship between the output signal and the oscillation frequency of the probe and explain how to determine the thickness from the measured frequency. Experimental results in Sc-doped AlN piezoelectric thin films that have a polarity-inverted layer with a thickness of 1.5 µm fabricated by radio frequency magnetron sputtering showed a fairly good value of 1.38 µm for the thickness of the polarity-inverted layer.

  17. The relationship between quantitative measures of disc height and disc signal intensity with Pfirrmann score of disc degeneration.

    PubMed

    Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J

    2016-01-01

    To assess the relationship between quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For quantitative measures of disc height and signal intensity a "raw" score and 2 adjusted ratios were calculated and the relationship with Pfirrmann scores was assessed. The inter-tester reliability of quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height only, Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.

  18. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a prevalent question in astronomy; there are a variety of potentially important effects, including baryonic accretion from the intergalactic medium, as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations, and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.

  19. Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure

    PubMed Central

    Yee, Jaeyong; Kwon, Min-Seok; Park, Taesung; Park, Mira

    2015-01-01

    A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method of conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because any functional form, such as Gaussian, is not assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait. PMID:26339620
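    The information-gain approach described here requires a nonparametric entropy estimate built from a sorted sample. A minimal sketch of a Vasicek-style m-spacing estimator of differential entropy (an illustration of the general idea, not the authors' exact implementation):

```python
import numpy as np

def m_spacing_entropy(x, m=5):
    """Vasicek-style m-spacing estimate of differential entropy.

    Sorts the sample and averages log((n + 1)/m * (X_(i+m) - X_(i)))
    over the m-spacings of the order statistics.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    spacings = np.maximum(x[m:] - x[:-m], 1e-12)  # guard against tied values
    return float(np.mean(np.log((n + 1) / m * spacings)))
```

    A quick sanity check: rescaling a sample by a factor c shifts the estimate by exactly log c, mirroring the behavior of true differential entropy.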

  20. Reconstruction of dynamic structures of experimental setups based on measurable experimental data only

    NASA Astrophysics Data System (ADS)

    Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang

    2018-03-01

    Nowadays, massive amounts of data have been accumulated in various and wide fields, it has become today one of the central issues in interdisciplinary fields to analyze existing data and extract as much useful information as possible from data. It is often that the output data of systems are measurable while dynamic structures producing these data are hidden, and thus studies to reveal system structures by analyzing available data, i.e., reconstructions of systems become one of the most important tasks of information extractions. In the past, most of the works in this respect were based on theoretical analyses and numerical verifications. Direct analyses of experimental data are very rare. In physical science, most of the analyses of experimental setups were based on the first principles of physics laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment of “Boer resonant instrument for forced vibration” (BRIFV) and inferred the dynamic structure of the experimental set purely from the analysis of the measurable experimental data, i.e., by applying the bottom-up strategy. Dynamics of the experimental set is strongly nonlinear and chaotic, and itʼs subjects to inevitable noises. We proposed to use high-order correlation computations to treat nonlinear dynamics; use two-time correlations to treat noise effects. By applying these approaches, we have successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed with the measured data reproduces good experimental results in a wide range of parameters.

  1. Experimental Psychological Stress on Quantitative Sensory Testing Response in Patients with Temporomandibular Disorders.

    PubMed

    Araújo Oliveira Ferreira, Dyna Mara; Costa, Yuri Martins; de Quevedo, Henrique Müller; Bonjardim, Leonardo Rigoldi; Rodrigues Conti, Paulo César

    2018-05-15

    To assess the modulatory effects of experimental psychological stress on the somatosensory evaluation of myofascial temporomandibular disorder (TMD) patients. A total of 20 women with myofascial TMD and 20 age-matched healthy women were assessed by means of a standardized battery of quantitative sensory testing. Cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical pain threshold (MPT), wind-up ratio (WUR), and pressure pain threshold (PPT) were performed on the facial skin overlying the masseter muscle. The variables were measured in three sessions: before (baseline) and immediately after the Paced Auditory Serial Addition Task (PASAT) (stress) and then after a washout period of 20 to 30 minutes (poststress). Mixed analysis of variance (ANOVA) was applied to the data, and the significance level was set at P = .050. A significant main effect of the experimental session on all thermal tests was found (ANOVA: F > 4.10, P < .017), where detection tests presented an increase in thresholds in the poststress session compared to baseline (CDT, P = .012; WDT, P = .040) and pain thresholds were reduced in the stress (CPT, P < .001; HPT, P = .001) and poststress sessions (CPT, P = .005; HPT, P = .006) compared to baseline. In addition, a significant main effect of the study group on all mechanical tests (MPT, WUR, and PPT) was found (ANOVA: F > 4.65, P < .037), where TMD patients were more sensitive than healthy volunteers. Acute mental stress conditioning can modulate thermal sensitivity of the skin overlying the masseter in myofascial TMD patients and healthy volunteers. Therefore, psychological stress should be considered in order to perform an unbiased somatosensory assessment of TMD patients.

  2. Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research

    ERIC Educational Resources Information Center

    Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah

    2013-01-01

    Statistical analysis is one component that cannot be avoided in a quantitative research. Initial observations noted that students in higher education institution faced difficulty analysing quantitative data which were attributed to the confusions of various variable measurements. This paper aims to compare the outcomes of two approaches applied in…

  3. Application of experimental poverty measures to the aged.

    PubMed

    Olsen, K A

    1999-01-01

    The U.S. Census Bureau recently released new, experimental measures of poverty based on a National Academy of Sciences (NAS) panel's recommendations. This article examines the effects of the experimental measures on poverty rates among persons aged 65 or older in order to help inform policy debate. Policymakers and analysts use poverty rates to measure the successes and failures of existing programs and to create and defend new policy initiatives. The Census Bureau computes the official rates of poverty using poverty thresholds and definitions of countable income that have changed little since the official poverty measure was adopted in 1965. Amid growing concerns about the adequacy of the official poverty measure, a NAS panel undertook a study of the concepts, methodology, and data needed to measure poverty. The panel concluded in its 1995 report that the current measure no longer provides an accurate picture of relative rates of poverty for different groups in the population or of changes in poverty over time. The panel recommended changes in establishing the poverty thresholds, defining family resources, and obtaining the required data. The Census Bureau report shows how estimated levels of poverty would differ from the official level as specific recommendations of the NAS panel are implemented individually and how estimated trends would differ when many recommendations are implemented simultaneously. It computes nonstandardized and standardized poverty rates. (The latter constrains the overall poverty rate under the experimental measures to match the official rate.) This article reports poverty rates that have not been standardized and provides considerably more detail than the Census report about the effects of the experimental measures on poverty among the aged. It examines the effects of changing the poverty thresholds and the items included or excluded from the definition of available resources. 
It also explores the effects of the experimental measures on

  4. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.

  5. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    PubMed

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

    This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design using qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase using grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning, such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team. (c) 2015 APA, all rights reserved.

  6. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis.

    PubMed

    Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M

    2017-01-01

    At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and between strength measures and brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures, and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract, has not been explored in multiple sclerosis. The objectives of this study were to explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale) to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures, and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean(SD) age 48.7 (11.5) years; symptom duration 11.9(8.7); 17 females; median[range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age- and gender-matched healthy controls (age 50.8(11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test) as well as 3 T imaging including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked slower (p = 0.0013) compared to controls. Quantitative measures of walking and strength were

  7. Longitudinal change in quantitative meniscus measurements in knee osteoarthritis--data from the Osteoarthritis Initiative.

    PubMed

    Bloecker, Katja; Wirth, W; Guermazi, A; Hitzl, W; Hunter, D J; Eckstein, F

    2015-10-01

    We aimed to apply 3D MRI-based measurement technology to studying 2-year change in quantitative measurements of meniscus size and position. Forty-seven knees from the Osteoarthritis Initiative with medial radiographic joint space narrowing had baseline and 2-year follow-up MRIs. Quantitative measures were obtained from manual segmentation of the menisci and tibia using coronal DESSwe images. The standardized response mean (SRM = mean/SD change) was used as measure of sensitivity to longitudinal change. Medial tibial plateau coverage decreased from 34.8% to 29.9% (SRM -0.82; p < 0.001). Change in medial meniscus extrusion in a central image (SRM 0.18) and in the central five slices (SRM 0.22) did not reach significance, but change in extrusion across the entire meniscus (SRM 0.32; p = 0.03) and in the relative area of meniscus extrusion (SRM 0.56; p < 0.001) did. There was a reduction in medial meniscus volume (10%; p < 0.001), width (7%; p < 0.001), and height (2%; p = 0.08); meniscus substance loss was strongest in the posterior (SRM -0.51; p = 0.001) and weakest in the anterior horn (SRM -0.15; p = 0.31). This pilot study reports, for the first time, longitudinal change in quantitative 3D meniscus measurements in knee osteoarthritis. It provides evidence of improved sensitivity to change of 3D measurements compared with single slice analysis. • First longitudinal MRI-based measurements of change of meniscus position and size. • Quantitative longitudinal evaluation of meniscus change in knee osteoarthritis. • Improved sensitivity to change of 3D measurements compared with single slice analysis.
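    The sensitivity measure used in this record, the standardized response mean, is simply the mean paired change divided by the standard deviation of the changes. A minimal sketch with made-up numbers (not the study's data):

```python
import numpy as np

def standardized_response_mean(baseline, followup):
    """SRM = mean(change) / SD(change) for paired longitudinal measurements."""
    change = np.asarray(followup, dtype=float) - np.asarray(baseline, dtype=float)
    return float(change.mean() / change.std(ddof=1))
```

    An |SRM| near 0.2 is conventionally read as low and near 0.8 as high sensitivity to change.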

  8. Quantitative colorectal cancer perfusion measurement by multidetector-row CT: does greater tumour coverage improve measurement reproducibility?

    PubMed

    Goh, V; Halligan, S; Gartner, L; Bassett, P; Bartram, C I

    2006-07-01

    The purpose of this study was to determine if greater z-axis tumour coverage improves the reproducibility of quantitative colorectal cancer perfusion measurements using CT. A 65 s perfusion study was acquired following intravenous contrast administration in 10 patients with proven colorectal cancer using a four-detector row scanner. This was repeated within 48 h using identical technical parameters to allow reproducibility assessment. Quantitative tumour blood volume, blood flow, mean transit time and permeability measurements were determined using commercially available software (Perfusion 3.0; GE Healthcare, Waukesha, WI) for data obtained from a 5 mm z-axis tumour coverage, and from a 20 mm z-axis tumour coverage. Measurement reproducibility was assessed using Bland-Altman statistics, for a 5 mm z-axis tumour coverage, and 20 mm z-axis tumour coverage, respectively. The mean difference (95% limits of agreement) for blood volume, blood flow, mean transit time and permeability were 0.04 (-2.50 to +2.43) ml/100 g tissue; +8.80 (-50.5 to +68.0) ml/100 g tissue/min; -0.99 (-8.19 to +6.20) seconds; and +1.20 (-5.42 to +7.83) ml/100 g tissue/min, respectively, for a 5 mm coverage, and -0.04 (-2.61 to +2.53) ml/100 g tissue; +7.40 (-50.3 to +65.0) ml/100 g tissue/min; -2.46 (-12.61 to +7.69) seconds; and -0.23 (-8.31 to +7.85) ml/100 g tissue/min, respectively, for a 20 mm coverage, indicating similar levels of agreement. In conclusion, increasing z-axis coverage does not improve reproducibility of quantitative colorectal cancer perfusion measurements.
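    The Bland-Altman reproducibility statistics reported here reduce to the mean of the paired differences and its 95% limits of agreement, mean ± 1.96 SD. A minimal sketch with made-up repeat measurements (not the study's data):

```python
import numpy as np

def bland_altman(measure1, measure2):
    """Return (mean difference, lower limit, upper limit of agreement)."""
    diff = np.asarray(measure1, dtype=float) - np.asarray(measure2, dtype=float)
    md, sd = diff.mean(), diff.std(ddof=1)
    return float(md), float(md - 1.96 * sd), float(md + 1.96 * sd)
```

    Narrower limits of agreement between repeat scans indicate better measurement reproducibility.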

  9. Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations

    NASA Astrophysics Data System (ADS)

    Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara

    2017-11-01

    Although quantitative measurements in radioactivity teaching and research are often believed to be possible only with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration that is very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the 220Rn half-life, collected from the emanation of the lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.
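    The half-life measurement described here amounts to fitting an exponential decay to the projected leaf's discharge-rate readings. A minimal sketch on synthetic, noise-free data generated at the accepted 220Rn half-life of roughly 55.6 s (the data are illustrative, not the article's):

```python
import numpy as np

def half_life_from_rates(t, rate):
    """Linear least-squares fit of ln(rate) vs. t; T1/2 = ln(2) / decay constant."""
    decay_constant = -np.polyfit(t, np.log(rate), 1)[0]
    return float(np.log(2) / decay_constant)

# Synthetic decay curve with T1/2 = 55.6 s
t = np.linspace(0.0, 200.0, 50)
rate = np.exp(-np.log(2) / 55.6 * t)
```

    With real, noisy readings the same linear fit applies; only the residual scatter grows.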

  10. Development of Naphthalene PLIF for Making Quantitative Measurements of Ablation Products Transport in Supersonic Flows

    NASA Astrophysics Data System (ADS)

    Combs, Christopher; Clemens, Noel

    2014-11-01

    Ablation is a multi-physics process involving heat and mass transfer and codes aiming to predict ablation are in need of experimental data pertaining to the turbulent transport of ablation products for validation. Low-temperature sublimating ablators such as naphthalene can be used to create a limited physics problem and simulate ablation at relatively low temperature conditions. At The University of Texas at Austin, a technique is being developed that uses planar laser-induced fluorescence (PLIF) of naphthalene to visualize the transport of ablation products in a supersonic flow. In the current work, naphthalene PLIF will be used to make quantitative measurements of the concentration of ablation products in a Mach 5 turbulent boundary layer. For this technique to be used for quantitative research in supersonic wind tunnel facilities, the fluorescence properties of naphthalene must first be investigated over a wide range of state conditions and excitation wavelengths. The resulting calibration of naphthalene fluorescence will be applied to the PLIF images of ablation from a boundary layer plug, yielding 2-D fields of naphthalene mole fraction. These images may help provide data necessary to validate computational models of ablative thermal protection systems for reentry vehicles. Work supported by NASA Space Technology Research Fellowship Program under grant NNX11AN55H.

  11. Quadratic elongation: A quantitative measure of distortion in coordination polyhedra

    USGS Publications Warehouse

    Robinson, Kelly F.; Gibbs, G.V.; Ribbe, P.H.

    1971-01-01

    Quadratic elongation and the variance of bond angles are linearly correlated for distorted octahedral and tetrahedral coordination complexes, both of which show variations in bond length and bond angle. The quadratic elongation is dimensionless, giving a quantitative measure of polyhedral distortion which is independent of the effective size of the polyhedron.
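
    The definition is compact enough to state in code. In this sketch, l0 (the centre-to-vertex distance of the regular polyhedron with the same volume) is assumed to be supplied by the caller rather than computed:

```python
def quadratic_elongation(bond_lengths, l0):
    """<lambda> = (1/n) * sum((l_i / l0)**2); equals 1 for an
    undistorted (regular) polyhedron and grows with distortion."""
    n = len(bond_lengths)
    return sum((l / l0) ** 2 for l in bond_lengths) / n

# A regular octahedron: six equal centre-to-vertex distances.
print(quadratic_elongation([2.0] * 6, 2.0))                  # 1.0
# A distorted octahedron with the same mean bond length.
print(round(quadratic_elongation([1.9, 2.1] * 3, 2.0), 4))   # 1.0025
```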

  12. Quantitative measures of healthy aging and biological age

    PubMed Central

    Kim, Sangkyu; Jazwinski, S. Michal

    2015-01-01

    Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669

  13. Quantitative fiber-optic Raman spectroscopy for tissue Raman measurements

    NASA Astrophysics Data System (ADS)

    Duraipandian, Shiyamala; Bergholt, Mads; Zheng, Wei; Huang, Zhiwei

    2014-03-01

    Molecular profiling of tissue using near-infrared (NIR) Raman spectroscopy has shown great promise for in vivo detection and prognostication of cancer. The Raman spectra measured from tissue generally contain fundamental information about the absolute biomolecular concentrations in tissue and their changes associated with disease transformation. However, producing analogous tissue Raman spectra presents a great technical challenge. In this preliminary study, we propose a method to ensure reproducible tissue Raman measurements and validate it with in vivo Raman spectra (n=150) of the inner lip acquired using different laser powers (i.e., 30 and 60 mW). A rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was utilized for tissue Raman measurements. The results showed that the variations between the spectra measured with different laser powers are almost negligible, facilitating the quantitative analysis of tissue Raman measurements in vivo.

  14. Experimental joint quantum measurements with minimum uncertainty.

    PubMed

    Ringbauer, Martin; Biggerstaff, Devon N; Broome, Matthew A; Fedrizzi, Alessandro; Branciard, Cyril; White, Andrew G

    2014-01-17

    Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the three-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.999 98(6) allow us to verge upon the fundamental limits of measurement uncertainty.

  15. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Abstract Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function with a thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). The TUS significantly overestimated %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because 99mTcO4 sources other than the thyroid, such as the salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean and SUVmax from SPECT/CT were associated with the functional status of the thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
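
    The two quantities compared in this record have simple definitions. A sketch with hypothetical patient numbers (SUV here is the common body-weight normalization; the record does not state which normalization was used):

```python
def percent_uptake(organ_activity_bq, injected_bq):
    """Percent of the injected activity accumulated in the organ."""
    return 100.0 * organ_activity_bq / injected_bq

def suv_bw(conc_bq_per_ml, injected_bq, body_weight_g):
    """Body-weight-normalized standardized uptake value (SUVbw)."""
    return conc_bq_per_ml / (injected_bq / body_weight_g)

# Hypothetical patient: 185 MBq injected, 70 kg body weight,
# 3.7 MBq measured in the thyroid at 20 kBq/mL.
print(round(percent_uptake(3.7e6, 185e6), 1))   # 2.0
print(round(suv_bw(20e3, 185e6, 70e3), 2))      # 7.57
```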

  16. Refractive index variance of cells and tissues measured by quantitative phase imaging.

    PubMed

    Shan, Mingguang; Kandel, Mikhail E; Popescu, Gabriel

    2017-01-23

    The refractive index distribution of cells and tissues governs their interaction with light and can report on morphological modifications associated with disease. Through intensity-based measurements, refractive index information can be extracted only via scattering models that approximate light propagation. As a result, current knowledge of refractive index distributions across various tissues and cell types remains limited. Here we use quantitative phase imaging and the statistical dispersion relation (SDR) to extract information about the refractive index variance in a variety of specimens. Due to the phase-resolved measurement in three-dimensions, our approach yields refractive index results without prior knowledge about the tissue thickness. With the recent progress in quantitative phase imaging systems, we anticipate that using SDR will become routine in assessing tissue optical properties.

  17. Quantitation of cholesterol incorporation into extruded lipid bilayers.

    PubMed

    Ibarguren, Maitane; Alonso, Alicia; Tenchov, Boris G; Goñi, Felix M

    2010-09-01

    Cholesterol incorporation into lipid bilayers, in the form of multilamellar vesicles or extruded large unilamellar vesicles, has been quantitated. To this end, the cholesterol contents of bilayers prepared from phospholipid:cholesterol mixtures containing 33-75 mol% cholesterol have been measured and compared with the original mixture before lipid hydration. There is a great diversity of cases, but under most conditions the actual cholesterol proportion present in the extruded bilayers is much lower than predicted. A quantitative analysis of the vesicles is thus required before any experimental study is undertaken. 2010 Elsevier B.V. All rights reserved.

  18. Quantitative Measurement of Oxygen in Microgravity Combustion

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured

  19. NASA Intellectual Property Negotiation Practices and their Relationship to Quantitative Measures of Technology Transfer

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.

    1997-01-01

    In the current political climate NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data on intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that its position of providing a public good precludes it from negotiating fair market value for its technology, and it has instead negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations are made, including a new policy regarding intellectual property rights negotiation and two measures to supplement the existing intellectual property measures.

  20. Quantitative diffusion and swelling kinetic measurements using large-angle interferometric refractometry.

    PubMed

    Saunders, John E; Chen, Hao; Brauer, Chris; Clayton, McGregor; Chen, Weijian; Barnes, Jack A; Loock, Hans-Peter

    2015-12-07

    The uptake and release of sorbates into films and coatings is typically accompanied by changes of the films' refractive index and thickness. We provide a comprehensive model to calculate the concentration of the sorbate from the average refractive index and the film thickness, and validate the model experimentally. The mass fraction of the analyte partitioned into a film is described quantitatively by the Lorentz-Lorenz equation and the Clausius-Mossotti equation. To validate the model, the uptake kinetics of water and other solvents into SU-8 films (d = 40-45 μm) were explored. Large-angle interferometric refractometry measurements can be used to characterize films that are between 15 μm and 150 μm thick, and Fourier analysis is used to determine independently the thickness, the average refractive index and the refractive index at the film-substrate interface at one-second time intervals. From these values the mass fraction of water in SU-8 was calculated. The kinetics were best described by two independent uptake processes having different rates. Each process followed one-dimensional Fickian diffusion kinetics, with diffusion coefficients for water into SU-8 photoresist film of 5.67 × 10(-9) cm(2) s(-1) and 61.2 × 10(-9) cm(2) s(-1).
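
    The reported kinetics correspond to the standard series solution (Crank) for one-dimensional Fickian uptake into a film with an impermeable backing. A sketch using the two diffusion coefficients quoted in the abstract; the film thickness and the 50/50 weighting of the two processes are assumptions for illustration:

```python
import math

def fickian_uptake(t, D, L, terms=50):
    """Fractional uptake M_t/M_inf for 1-D Fickian diffusion into a
    film of thickness L (cm) exposed on one face (impermeable backing)."""
    s = sum(math.exp(-D * (2 * n + 1) ** 2 * math.pi ** 2 * t / (4 * L ** 2))
            / (2 * n + 1) ** 2 for n in range(terms))
    return 1.0 - (8.0 / math.pi ** 2) * s

def two_process_uptake(t, w, D1, D2, L):
    """Two independent Fickian processes with weight w, as reported."""
    return w * fickian_uptake(t, D1, L) + (1.0 - w) * fickian_uptake(t, D2, L)

L = 42e-4                      # ~42 um film thickness, in cm (assumed)
D1, D2 = 5.67e-9, 61.2e-9      # cm^2/s, from the abstract
print(round(two_process_uptake(3600.0, 0.5, D1, D2, L), 3))
```

    Fitting w, D1 and D2 to measured mass-fraction curves would reproduce the two-rate behavior the authors describe.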

  1. Experimental evaluation of a MOSFET dosimeter for proton dose measurements.

    PubMed

    Kohno, Ryosuke; Nishio, Teiji; Miyagishi, Tomoko; Hirano, Eriko; Hotta, Kenji; Kawashima, Mitsuhiko; Ogino, Takashi

    2006-12-07

    The metal oxide semiconductor field-effect transistor (MOSFET) dosimeter has been widely studied for use as a dosimeter for patient dose verification. The major advantages of this detector are its small size, which allows it to act as a point dosimeter, and its ease of use. The commercially available TN502RD MOSFET dosimeter manufactured by Thomson and Nielsen had never been used for proton dosimetry. Therefore we used the MOSFET dosimeter for the first time in proton dose measurements. In this study, the MOSFET dosimeter was irradiated with 190 MeV therapeutic proton beams. We experimentally evaluated dose reproducibility, linearity, fading effect, beam intensity dependence and angular dependence for the proton beam. Furthermore, the Bragg curve and spread-out Bragg peak were also measured and the linear-energy transfer (LET) dependence of the MOSFET response was investigated. Many characteristics of the MOSFET response for proton beams were the same as those for photon beams reported in previous papers. However, the angular MOSFET responses at 45, 90, 135, 225, 270 and 315 degrees for proton beams were over-responses of about 15%, and moreover the MOSFET response depended strongly on the LET of the proton beam. This study showed that the angular dependence and LET dependence of the MOSFET response must be considered very carefully for quantitative proton dose evaluations.

  2. Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength

    NASA Astrophysics Data System (ADS)

    Loho, T.; Dickinson, M.

    2018-04-01

    The mechanism by which ice adheres to surfaces is still not well understood. Currently there is no standard method to quantitatively measure how ice adheres to surfaces, which makes ice surface studies difficult to compare. A novel quantitative lateral-force adhesion measurement at the micro-nano scale was developed which shears micro-nano sized ice droplets (less than 3 μm in diameter and 100 nm in height) using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised, which increased data repeatability compared to bulk testing. The technique provided post-testing surface scans to confirm that the ice had been removed and that measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by its nano-scale surface roughness, with rougher surfaces having higher ice adhesion strength.

  3. How quantitative measures unravel design principles in multi-stage phosphorylation cascades.

    PubMed

    Frey, Simone; Millat, Thomas; Hohmann, Stefan; Wolkenhauer, Olaf

    2008-09-07

    We investigate design principles of linear multi-stage phosphorylation cascades by using quantitative measures for signaling time, signal duration and signal amplitude. We compare alternative pathway structures by varying the number of phosphorylations and the length of the cascade. We show that a model for a weakly activated pathway does not reflect the biological context well, unless it is restricted to certain parameter combinations. Focusing therefore on a more general model, we compare alternative structures with respect to a multivariate optimization criterion. We test the hypothesis that the structure of a linear multi-stage phosphorylation cascade is the result of an optimization process aiming for a fast response, defined by the minimum of the product of signaling time and signal duration. It is then shown that certain pathway structures minimize this criterion. Several popular models of MAPK cascades form the basis of our study. These models represent different levels of approximation, which we compare and discuss with respect to the quantitative measures.
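
    The three quantitative measures used in this study have standard integral definitions: signaling time as the mean of the signal's time distribution, duration as its spread, and amplitude as the average height of the signal over that duration. A sketch on a hypothetical time course (the definitions follow the usual Heinrich-style formulation; the example signal is invented):

```python
import numpy as np

def _trapz(y, t):
    """Trapezoidal integral of y over t (version-independent helper)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

def signal_measures(t, x):
    """Signaling time tau, signal duration theta, and average
    amplitude S for a sampled time course x(t)."""
    total = _trapz(x, t)                                  # integral of x
    tau = _trapz(t * x, t) / total                        # mean time
    theta = np.sqrt(_trapz(t ** 2 * x, t) / total - tau ** 2)
    amplitude = total / (2.0 * theta)
    return tau, theta, amplitude

t = np.linspace(0.0, 80.0, 4001)
x = t * np.exp(-t / 5.0)          # hypothetical cascade output
tau, theta, S = signal_measures(t, x)
print(round(tau, 2), round(theta, 2))   # analytically: tau = 10, theta = sqrt(50)
```

    Comparing these three numbers across cascade lengths and phosphorylation counts is exactly the kind of structure comparison the study performs.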

  4. Quantitative elasticity measurement of urinary bladder wall using laser-induced surface acoustic waves.

    PubMed

    Li, Chunhui; Guan, Guangying; Zhang, Fan; Song, Shaozhen; Wang, Ruikang K; Huang, Zhihong; Nabi, Ghulam

    2014-12-01

    The maintenance of urinary bladder elasticity is essential to its functions, including the storage and voiding phases of the micturition cycle. The bladder stiffness can be changed by various pathophysiological conditions. Quantitative measurement of bladder elasticity is an essential step toward understanding various urinary bladder disease processes and improving patient care. As a nondestructive and noncontact method, laser-induced surface acoustic waves (SAWs) can accurately characterize the elastic properties of different layers of organs such as the urinary bladder. This initial investigation evaluates the feasibility of a noncontact, all-optical method of generating and measuring the elasticity of the urinary bladder. Quantitative elasticity measurements of ex vivo porcine urinary bladder were made using the laser-induced SAW technique. A pulsed laser was used to excite SAWs that propagated on the bladder wall surface. A dedicated phase-sensitive optical coherence tomography (PhS-OCT) system remotely recorded the SAWs, from which the elastic properties of different layers of the bladder were estimated. During the experiments, a series of measurements were performed under five precisely controlled bladder volumes using water to estimate changes in the elasticity in relation to various urinary bladder contents. The results, validated by optical coherence elastography, show that the laser-induced SAW technique combined with PhS-OCT can be a feasible method of quantitative estimation of biomechanical properties.

  5. Experimental Performance of a Frequency Measurement Circuit.

    DTIC Science & Technology

    1984-12-01

    Naval Postgraduate School, Monterey, California. Master's thesis: Experimental Performance of a Frequency Measurement Circuit, by George H. Eastwood, December 1984. Thesis Advisor: G. A. Myers. Approved for public release; distribution is unlimited.

  6. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC

  7. Impact of image quality on OCT angiography based quantitative measurements.

    PubMed

    Al-Sheikh, Mayss; Ghasemi Falavarjani, Khalil; Akil, Handan; Sadda, SriniVas R

    2017-01-01

    To study the impact of image quality on quantitative measurements and the frequency of segmentation error with optical coherence tomography angiography (OCTA). Seventeen eyes of 10 healthy individuals were included in this study. OCTA was performed using a swept-source device (Triton, Topcon). Each subject underwent three scanning sessions 1-2 min apart; the first two scans were obtained under standard conditions, and for the third session the image quality index was reduced by application of a topical ointment. En face OCTA images of the retinal vasculature were generated using the default segmentation for the superficial and deep retinal layers (SRL, DRL). The intraclass correlation coefficient (ICC) was used as a measure of repeatability. The frequency of segmentation error, motion artifact, banding artifact and projection artifact was also compared among the three sessions. The frequency of segmentation error and motion artifact was statistically similar between high and low image quality sessions (P = 0.707 and P = 1, respectively). However, the frequency of projection and banding artifacts was higher with lower image quality. The vessel density in the SRL was highly repeatable in the high image quality sessions (ICC = 0.8); however, repeatability was low when comparing the high and low image quality measurements (ICC = 0.3). In the DRL, the repeatability of the vessel density measurements was fair in the high quality sessions (ICC = 0.6 and ICC = 0.5, with and without automatic artifact removal, respectively) and poor when comparing high and low image quality sessions (ICC = 0.3 and ICC = 0.06, with and without automatic artifact removal, respectively). The frequency of artifacts is higher and the repeatability of the measurements is lower with lower image quality. The image quality index should always be considered in OCTA-based quantitative measurements.

  8. Automatic and quantitative measurement of collagen gel contraction using model-guided segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Chen; Yang, Tai-Hua; Thoreson, Andrew R.; Zhao, Chunfeng; Amadio, Peter C.; Sun, Yung-Nien; Su, Fong-Chin; An, Kai-Nan

    2013-08-01

    Quantitative measurement of collagen gel contraction plays a critical role in the field of tissue engineering because it provides spatial-temporal assessment (e.g., changes of gel area and diameter during the contraction process) reflecting the cell behavior and tissue material properties. So far the assessment of collagen gels relies on manual segmentation, which is time-consuming and suffers from serious intra- and inter-observer variability. In this study, we propose an automatic method combining various image processing techniques to resolve these problems. The proposed method first detects the maximal feasible contraction range of circular references (e.g., culture dish) and avoids the interference of irrelevant objects in the given image. Then, a three-step color conversion strategy is applied to normalize and enhance the contrast between the gel and background. We subsequently introduce a deformable circular model which utilizes regional intensity contrast and circular shape constraint to locate the gel boundary. An adaptive weighting scheme was employed to coordinate the model behavior, so that the proposed system can overcome variations of gel boundary appearances at different contraction stages. Two measurements of collagen gels (i.e., area and diameter) can readily be obtained based on the segmentation results. Experimental results, including 120 gel images for accuracy validation, showed high agreement between the proposed method and manual segmentation with an average dice similarity coefficient larger than 0.95. The results also demonstrated obvious improvement in gel contours obtained by the proposed method over two popular, generic segmentation methods.
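
    The agreement metric reported above, the Dice similarity coefficient, is straightforward to compute from binary segmentation masks; a sketch with hypothetical masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient: 2|A intersect B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Hypothetical automatic vs. manual gel segmentations on a 10x10 grid.
auto_mask = np.zeros((10, 10), dtype=bool); auto_mask[2:8, 2:8] = True
manual = np.zeros((10, 10), dtype=bool); manual[3:8, 2:8] = True
print(round(dice(auto_mask, manual), 3))   # 0.909
```

    A value above 0.95, as reported for the 120 validation images, indicates near-complete overlap between automatic and manual contours.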

  9. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.

    PubMed

    Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan

    2017-01-01

    Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, compared with qualitative MR imaging, which relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. The article presents a review of the basic physical concepts used in MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging. Subsequently, the article reviews the hierarchical organization of major applicable pulse sequences used in this article, with the sequences organized into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. While this new concept offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences used for quantitative MR imaging, with a focus on the hierarchical organization of these sequences. © RSNA, 2017.

  11. Living cell dry mass measurement using quantitative phase imaging with quadriwave lateral shearing interferometry: an accuracy and sensitivity discussion.

    PubMed

    Aknoun, Sherazade; Savatier, Julien; Bon, Pierre; Galland, Frédéric; Abdeladim, Lamiae; Wattellier, Benoit; Monneret, Serge

    2015-01-01

    Single-cell dry mass measurement is used in biology to follow the cell cycle, to address effects of drugs, or to investigate cell metabolism. Quantitative phase imaging with quadriwave lateral shearing interferometry (QWLSI) allows measuring cell dry mass. The technique is very simple to set up, as it is integrated in a camera-like instrument. It simply plugs onto a standard microscope and uses a white light illumination source. Its working principle is first explained, from image acquisition to automated segmentation algorithm and dry mass quantification. The metrology of the whole process, including its sensitivity, repeatability, reliability, and sources of error, is then developed over different kinds of samples and under different experimental conditions. We show that there is no influence of magnification or spatial light coherence on dry mass measurement; the effect of defocus is more critical but can be calibrated. As a consequence, QWLSI is a well-suited technique for fast, simple, and reliable cell dry mass study, especially for live cells.

  12. Single-case synthesis tools II: Comparing quantitative outcome measures.

    PubMed

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes were compared: overlap measures (percentage non-overlapping data, improvement rate difference [IRD], and Tau) and parametric within-case effect sizes (standardized mean difference and the log response ratio [increasing and decreasing]). The goal was to determine whether the choice of synthesis method within and across classes impacts conclusions regarding effectiveness. The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except IRD. However, subgroup analyses resulted in variable conclusions, indicating that the choice of measures for SCD meta-analyses can impact conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
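
    Of the effect sizes compared, the log response ratio has the simplest form. A sketch of the basic (non-bias-corrected) version with hypothetical session counts; the published estimators add small-sample corrections not shown here:

```python
import math

def log_response_ratio(baseline, treatment, increasing=True):
    """Basic log response ratio ln(mean(B)/mean(A)); the sign is
    flipped (LRR-d) when the target behavior is meant to decrease."""
    mean_a = sum(baseline) / len(baseline)
    mean_b = sum(treatment) / len(treatment)
    lrr = math.log(mean_b / mean_a)
    return lrr if increasing else -lrr

# Hypothetical single-case data: responses per session by phase.
print(round(log_response_ratio([4, 5, 6], [8, 9, 10]), 3))   # 0.588
```

    Because the ratio is taken on means, it is scale-free, which is one reason the authors suggest it for SCD meta-analyses.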

  13. Quantitative magnetic resonance (QMR) measurement of changes in body composition of neonatal pigs

    USDA-ARS?s Scientific Manuscript database

    The survival of low-birth-weight pigs in particular may depend on energy stores in the body. QMR (quantitative magnetic resonance) is a new approach to measuring total body fat, lean mass, and water. These measurements are based on quantifying protons associated with lipid and water molecules in the body...

  14. Issues to Consider When Measuring and Applying Socioeconomic Position Quantitatively in Immigrant Health Research

    PubMed Central

    Nielsen, Signe Smith; Hempler, Nana Folmann; Krasnik, Allan

    2013-01-01

    The relationship between migration and health is complex, yet, immigrant-related inequalities in health are largely influenced by socioeconomic position. Drawing upon previous findings, this paper discusses issues to consider when measuring and applying socioeconomic position in quantitative immigrant health research. When measuring socioeconomic position, it is important to be aware of four aspects: (1) there is a lack of clarity about how socioeconomic position should be measured; (2) different types of socioeconomic position may be relevant to immigrants compared with the native-born population; (3) choices of measures of socioeconomic position in quantitative analyses often rely on data availability; and (4) different measures of socioeconomic position have different effects in population groups. Therefore, caution should be used in the collection, presentation, analyses, and interpretation of data and researchers need to display their proposed conceptual models and data limitations as well as apply different approaches for analyses. PMID:24287857

  15. Quantitative Measurements of Nitric Oxide Concentration in High-Pressure, Swirl-Stabilized Spray Flames

    NASA Technical Reports Server (NTRS)

    Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)

    2000-01-01

    Lean direct-injection (LDI) spray flames offer the possibility of reducing NOx emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially premixed conditions. We consider the technical approaches required to apply laser-induced fluorescence methods to quantitative measurement of NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially resolved LIF measurements of NO concentration (ppm) are reported for preheated LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q2(26.5) transition of the γ(0,0) band; detection is performed in a two-nanometer region centered on the γ(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparison to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparison with LDI flames.

  16. Qualitative pattern classification of shear wave elastography for breast masses: how it correlates to quantitative measurements.

    PubMed

    Yoon, Jung Hyun; Ko, Kyung Hee; Jung, Hae Kyoung; Lee, Jong Tae

    2013-12-01

    To determine how a qualitative shear wave elastography (SWE) pattern classification correlates with quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performance. From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21-88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessments and qualitative and quantitative SWE measurements were recorded. Correlations between pattern classification and mean elasticity, maximum elasticity, elasticity ratio, and standard deviation were evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classification correlated significantly with all quantitative SWE measurements, showing the highest correlation with maximum elasticity, r = 0.721 (P < 0.001). Compared with grayscale US alone, combining US with SWE measurements significantly decreased sensitivity (from 100.0% to 69.5-89.8%) but significantly improved specificity (from 13.9% to 62.5-81.7%) (P < 0.001). The area under the ROC curve (Az) did not differ significantly between grayscale US and US combined with SWE (P > 0.05). Pattern classification shows high correlation with maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
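
Correlating an ordinal pattern grade with a continuous elasticity value, as above, is the typical use case for Spearman's rank correlation. A minimal pure-Python sketch with average ranks for ties (function names are illustrative; real analyses would use a statistics package):

```python
def _ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Any strictly increasing relationship gives rho = 1, any strictly decreasing one rho = -1, regardless of linearity.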

  17. Tophaceous gout: quantitative evaluation by direct physical measurement.

    PubMed

    Schumacher, H Ralph; Becker, Michael A; Palo, William A; Streit, Janet; MacDonald, Patricia A; Joseph-Ridge, Nancy

    2005-12-01

    The absence of accepted standardized methods for monitoring tophaceous gout limits the ability to track tophus progression or regression. This multicenter study assessed intra- and interrater reproducibility of a simple and direct physical measurement. The quantitative evaluation was the area (mm2) of each measurable tophus and was determined independently by 2 raters on 2 occasions within 10 days. Intra- and interrater reproducibilities were determined by calculating mean differences and average percentage differences (APD) in measurements of areas for the same tophus at each of 2 visits and by each rater, respectively. Fifty-two tophi were measured in 13 subjects: 22 on the hand/wrist, 16 on the elbow, and 14 on the foot/ankle. The mean (+/- SD) difference in tophus areas between visits was -0.2 +/- 835 mm2 (95% CI -162 to 162 mm2) and the mean (+/- SD) APD was 29% +/- 33%. The mean (+/- SD) APD between raters was 32% +/- 27%. The largest variations in measurements were noted for elbow tophi and variations were least for well demarcated tophi on the hands. This simple and reproducible method can be easily utilized in clinical trials and in practice as a measure of efficacy of urate-lowering treatment in tophaceous gout. Among factors contributing to variability in these measurements were the anatomic site of tophi and rater experience with the method. Restriction of measurements to well circumscribed hand or foot tophi could improve reliability, but major changes, as expected with effective therapy, can clearly be documented with this simple technique.
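
The average percentage difference used above compares two measurements of the same tophus relative to their mean. The abstract does not give the exact formula, so the sketch below assumes the common definition (absolute difference divided by the mean of the two values, as a percent):

```python
def avg_pct_diff(a1_mm2, a2_mm2):
    """Assumed APD between two area measurements (mm^2) of the same tophus:
    absolute difference divided by the mean of the two, expressed in percent."""
    mean = (a1_mm2 + a2_mm2) / 2
    return abs(a1_mm2 - a2_mm2) / mean * 100
```

For two raters measuring 100 and 150 mm^2, the APD is 40%.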

  18. Quantitative CT Measures of Bronchiectasis in Smokers.

    PubMed

    Diaz, Alejandro A; Young, Thomas P; Maselli, Diego J; Martinez, Carlos H; Gill, Ritu; Nardelli, Pietro; Wang, Wei; Kinney, Gregory L; Hokanson, John E; Washko, George R; San Jose Estepar, Raul

    2017-06-01

    Bronchiectasis is frequent in smokers with COPD; however, there are only limited data on objective assessments of this process. The objective was to assess bronchovascular morphology, calculate the ratio of the diameters of the bronchial lumen and the adjacent artery (BA ratio), and identify those measurements able to discriminate bronchiectasis. We collected quantitative CT (QCT) measures of BA ratios, peak wall attenuation, wall thickness (WT), wall area, and wall area percent (WA%) at matched fourth through sixth airway generations in 21 ever smokers with bronchiectasis (cases) and 21 never-smoking control patients (control airways). In cases, measurements were collected at both bronchiectatic and nonbronchiectatic airways. Logistic analysis and the area under the receiver operating characteristic curve (AUC) were used to assess the predictive ability of QCT measurements for bronchiectasis. The whole-lung and fourth through sixth generation BA ratio, WT, and WA% were significantly greater in bronchiectasis cases than in control patients. The AUCs for the BA ratio to predict bronchiectasis ranged from 0.90 (whole lung) to 0.79 (fourth generation). AUCs for WT and WA% ranged from 0.72 to 0.75 and from 0.71 to 0.75, respectively. Artery diameters, but not bronchial diameters, were smaller in bronchiectatic airways than in both nonbronchiectatic and control airways (P < .01 for both). Smoking-related increases in the BA ratio appear to be driven by reductions in vascular caliber. QCT measures of BA ratio, WT, and WA% may be useful to objectively identify and quantify bronchiectasis in smokers. ClinicalTrials.gov; No.: NCT00608764; URL: www.clinicaltrials.gov. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
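
The two quantities at the center of this abstract are straightforward to compute: the BA ratio is a simple quotient of diameters, and the AUC can be obtained from case and control scores via its Mann-Whitney formulation. A sketch under those assumptions (names and data are illustrative, not the study's pipeline):

```python
def ba_ratio(bronchial_lumen_mm, artery_mm):
    """Ratio of bronchial lumen diameter to adjacent artery diameter."""
    return bronchial_lumen_mm / artery_mm

def auc(scores_pos, scores_neg):
    """Mann-Whitney formulation of the area under the ROC curve:
    the probability that a random positive outranks a random negative,
    counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Perfectly separated groups give AUC = 1.0; identical score distributions give 0.5.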

  19. Quantitative Characterization of Nanostructured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Frank

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  20. Iterative optimization method for design of quantitative magnetization transfer imaging experiments.

    PubMed

    Levesque, Ives R; Sled, John G; Pike, G Bruce

    2011-09-01

    Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.

  1. Quantitative experimental modelling of fragmentation during explosive volcanism

    NASA Astrophysics Data System (ADS)

    Thordén Haug, Ø.; Galland, O.; Gisler, G.

    2012-04-01

    Phreatomagmatic eruptions result from the violent interaction between magma and an external source of water, such as groundwater or a lake. This interaction causes fragmentation of the magma and/or the host rock, resulting in coarse-grained (lapilli) to very fine-grained (ash) material. The products of phreatomagmatic explosions are classically described by their fragment size distribution, which commonly follows a power law of exponent D. Such a descriptive approach, however, considers the final products only and does not provide information on the dynamics of fragmentation. The aim of this contribution is thus to address the following fundamental questions. What physics governs fragmentation processes? How does fragmentation occur through time? What mechanisms produce power-law fragment size distributions? And what scaling laws control the exponent D? To address these questions, we performed a quantitative experimental study. The setup consists of a Hele-Shaw cell filled with a layer of cohesive silica flour, at the base of which a pulse of pressurized air is injected, leading to fragmentation of the layer of flour. The fragmentation process is monitored through time using a high-speed camera. By varying systematically the air pressure (P) and the thickness of the flour layer (h), we observed two morphologies of fragmentation: "lift off", where the silica flour above the injection inlet is ejected upwards, and "channeling", where the air pierces through the layer along a sub-vertical conduit. By building a phase diagram, we show that the morphology is controlled by P/(dgh), where d is the density of the flour and g is the gravitational acceleration. To quantify the fragmentation process, we developed a Matlab image analysis program, which calculates the number and sizes of the fragments, and thus the fragment size distribution, during the experiments. The fragment size distributions are in general described by power law distributions of
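
A power-law exponent D of the kind described here is commonly estimated by a least-squares fit in log-log space, assuming counts N(s) ~ s^(-D) for fragment size s. The paper's exact fitting procedure is not stated, so the following is a generic sketch of that approach:

```python
import math

# Sketch: estimate a power-law exponent D from binned fragment sizes and
# counts by linear regression on log-log data, assuming N(s) ~ s^(-D).
def powerlaw_exponent(sizes, counts):
    xs = [math.log(s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope  # N ~ s^-D, so D is minus the log-log slope
```

For synthetic data generated with D = 2 (sizes 1, 2, 4, 8 with counts 64, 16, 4, 1), the fit recovers D = 2 exactly.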

  2. Quantitative measures of gingival recession and the influence of gender, race, and attrition.

    PubMed

    Handelman, Chester S; Eltink, Anthony P; BeGole, Ellen

    2018-01-29

    Gingival recession in dentitions with an otherwise healthy periodontium is a common occurrence in adults. Recession is clinically measured with a periodontal probe to the nearest millimeter. The aim of this study is to establish quantitative measures of recession: the clinical crown height and a new measure, the gingival margin-papillae measurement. The latter is the shortest apico-coronal distance measured from the depth of the gingival margin to a line connecting the tips of the two adjacent papillae. Measurements on all teeth up to and including the first molar were performed on pretreatment study models of 120 adult Caucasian and African-American subjects divided into four groups of 30 by gender and race. Both the clinical crown height and the gingival margin-papillae measurement gave a true positive result for changes associated with gingival recession. Tooth wear shortens the clinical crown, so the clinical crown height can give a false negative result when gingival recession is present. The gingival margin-papillae measurement, however, was not affected by tooth wear and gave a true positive result for gingival recession. Tooth wear (attrition) was not associated with an increase in gingival recession. These measures are also useful in detecting recession prior to cemental exposure. Measures for recession and tooth wear differed among the four demographic groups studied. These measures can be used as quantitative standards in clinical dentistry, research, and epidemiological studies.

  3. Quantitative impedance measurements for eddy current model validation

    NASA Astrophysics Data System (ADS)

    Khan, T. A.; Nakagawa, N.

    2000-05-01

    This paper reports on a series of laboratory-based impedance measurement data, collected by the use of a quantitatively accurate, mechanically controlled measurement station. The purpose of the measurement is to validate a BEM-based eddy current model against experiment. We have therefore selected two "validation probes," which are both split-D differential probes. Their internal structures and dimensions are extracted from x-ray CT scan data, and are thus known within the measurement tolerance. A series of measurements was carried out, using the validation probes and two Ti-6Al-4V block specimens, one containing two 1-mm long fatigue cracks, and the other containing six EDM notches of a range of sizes. A motor-controlled XY scanner performed raster scans over the cracks, with the probe riding on the surface on a spring-loaded mechanism to maintain the lift-off. Both an impedance analyzer and a commercial EC instrument were used in the measurement. The probes were driven in both differential and single-coil modes for the specific purpose of model validation. The differential measurements were done exclusively with the eddyscope, while the single-coil data were taken with both the impedance analyzer and the eddyscope. From the single-coil measurements, we obtained the transfer function to translate the voltage output of the eddyscope into impedance values, and then used it to translate the differential measurement data into impedance results. The presentation will highlight the schematics of the measurement procedure, representative raw data, an explanation of the post-processing procedure, and a series of resulting 2D flaw impedance results. A noise estimate will also be given, in order to quantify the accuracy of these measurements and for use in probability-of-detection estimation. This work was supported by the NSF Industry/University Cooperative Research Program.

  4. Measuring iron in the brain using quantitative susceptibility mapping and X-ray fluorescence imaging

    PubMed Central

    Zheng, Weili; Nichol, Helen; Liu, Saifeng; Cheng, Yu-Chung N.; Haacke, E. Mark

    2013-01-01

    Measuring iron content in the brain has important implications for a number of neurodegenerative diseases. Quantitative susceptibility mapping (QSM), derived from magnetic resonance images, has been used to measure total iron content in vivo and in post mortem brain. In this paper, we show how magnetic susceptibility from QSM correlates with total iron content measured by X-ray fluorescence (XRF) imaging and by inductively coupled plasma mass spectrometry (ICPMS). The relationship between susceptibility and ferritin iron was estimated at 1.10 ± 0.08 ppb susceptibility per μg iron/g wet tissue, similar to that of iron in fixed (frozen/thawed) cadaveric brain and previously published data from unfixed brains. We conclude that magnetic susceptibility can provide a direct and reliable quantitative measurement of iron content and that it can be used clinically at least in regions with high iron content. PMID:23591072
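
Given the linear slope reported in this abstract (1.10 ppb susceptibility per μg iron/g wet tissue), inverting the relationship to estimate iron content from a measured susceptibility shift is a one-line calculation. A sketch (the linear, zero-intercept form is an assumption for illustration):

```python
# Slope from the abstract: 1.10 ppb susceptibility per ug iron / g wet tissue.
SLOPE_PPB_PER_UG_G = 1.10

def iron_ug_per_g(susceptibility_ppb, slope=SLOPE_PPB_PER_UG_G):
    """Estimate iron concentration (ug iron / g wet tissue) from a QSM
    susceptibility shift (ppb), assuming a linear zero-intercept relation."""
    return susceptibility_ppb / slope
```

A measured shift of 110 ppb would correspond to roughly 100 μg iron per gram of wet tissue.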

  5. Quantitative targeting maps based on experimental investigations for a branched tube model in magnetic drug targeting

    NASA Astrophysics Data System (ADS)

    Gitter, K.; Odenbach, S.

    2011-12-01

    Magnetic drug targeting (MDT), because of its high targeting efficiency, is a promising approach for tumour treatment. Unwanted side effects are considerably reduced, since the nanoparticles are concentrated within the target region due to the influence of a magnetic field. Nevertheless, understanding the transport phenomena of nanoparticles in an artery system is still challenging. This work presents experimental results for a branched tube model. Quantitative results describe, for example, the net amount of nanoparticles that are targeted towards the chosen region due to the influence of a magnetic field. As a result of measurements, novel drug targeting maps, combining, e.g. the magnetic volume force, the position of the magnet and the net amount of targeted nanoparticles, are presented. The targeting maps are valuable for evaluation and comparison of setups and are also helpful for the design and the optimisation of a magnet system with an appropriate strength and distribution of the field gradient. The maps indicate the danger of accretion within the tube and also show the promising result of magnetic drug targeting that up to 97% of the nanoparticles were successfully targeted.

  6. Measuring the Beginning: A Quantitative Study of the Transition to Higher Education

    ERIC Educational Resources Information Center

    Brooman, Simon; Darwent, Sue

    2014-01-01

    This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…

  7. Technical note: quantitative measures of iris color using high resolution photographs.

    PubMed

    Edwards, Melissa; Gozdzik, Agnes; Ross, Kendra; Miles, Jon; Parra, Esteban J

    2012-01-01

    Our understanding of the genetic architecture of iris color is still limited. This is partly related to difficulties associated with obtaining quantitative measurements of eye color. Here we introduce a new automated method for measuring iris color using high resolution photographs. This method extracts color measurements in the CIE 1976 L*a*b* (CIELAB) color space from a 256 by 256 pixel square sampled from the 9:00 meridian of the iris. Color is defined across three dimensions: L* (the lightness coordinate), a* (the red-green coordinate), and b* (the blue-yellow coordinate). We applied this method to a sample of individuals of diverse ancestry (East Asian, European and South Asian) that was genotyped for the HERC2 rs12913832 polymorphism, which is strongly associated with blue eye color. We identified substantial variation in the CIELAB color space, not only in the European sample, but also in the East Asian and South Asian samples. As expected, rs12913832 was significantly associated with quantitative iris color measurements in subjects of European ancestry. However, this SNP was also strongly associated with iris color in the South Asian sample, although there were no participants with blue irides in this sample. The usefulness of this method is not restricted only to the study of iris pigmentation. High-resolution pictures of the iris will also make it possible to study the genetic variation involved in iris textural patterns, which show substantial heritability in human populations. Copyright © 2011 Wiley Periodicals, Inc.
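
The CIELAB coordinates described above can be derived from 8-bit sRGB pixel values via the standard sRGB → XYZ (D65) → CIELAB conversion. The sketch below averages L*, a*, b* over a caller-selected patch (e.g. the 256 by 256 pixel square at the 9:00 meridian); the sampling and function names are illustrative, and the conversion assumes standard sRGB constants rather than the paper's calibrated pipeline:

```python
def _srgb_to_lab(r, g, b):
    """Convert one 8-bit sRGB pixel to CIELAB (D65 reference white)."""
    def lin(c):  # undo the sRGB transfer function
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # sRGB to XYZ (D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    xn, yn, zn = 0.95047, 1.0, 1.08883  # D65 white point
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def mean_lab(pixels):
    """Mean (L*, a*, b*) over an iterable of (r, g, b) pixels."""
    labs = [_srgb_to_lab(*p) for p in pixels]
    n = len(labs)
    return tuple(sum(c[i] for c in labs) / n for i in range(3))
```

As a sanity check, a pure-white pixel maps to approximately (L* = 100, a* = 0, b* = 0).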

  8. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    PubMed

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1-3 μm thick plastic sections (glycolmethacrylate/ methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability
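
The correction described here follows from basic geometry: an oblique cut inflates apparent thickness by 1/cos(theta), so the obliquity angle can be recovered from the calibration foil (known vs. measured thickness) and the same cosine applied to the section. The exact formula below is an assumption based on that geometry, not quoted from the paper:

```python
import math

# Sketch of the oblique-sectioning correction: apparent thickness of an
# obliquely cut slab is true_thickness / cos(theta), so
# cos(theta) = factual_foil / measured_foil.
def corrected_thickness(measured_section, measured_foil, factual_foil):
    """Return (corrected section thickness, obliquity angle in degrees)."""
    cos_theta = factual_foil / measured_foil
    angle_deg = math.degrees(math.acos(cos_theta))
    return measured_section * cos_theta, angle_deg
```

For example, if a 1 um calibration foil measures as 2 um, the re-embedded block was cut at 60 degrees off orthogonal, and a section measuring 2 um is actually 1 um thick.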

  9. Practicable methods for histological section thickness measurement in quantitative stereological analyses

    PubMed Central

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1–3 μm thick plastic sections (glycolmethacrylate/ methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability

  10. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection" (ERLPS) in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Experimental equipment for measuring of rotary air motors parameters

    NASA Astrophysics Data System (ADS)

    Dvořák, Lukáš; Fojtášek, Kamil; Řeháček, Vojtěch

    The article describes the construction of an experimental device for measuring the parameters of small rotary air motors, along with the measurement methodology and the processing of the measured data. At the end of the article, characteristics of the chosen air motor are presented.

  12. Measurements of morphology and refractive indexes on human downy hairs using three-dimensional quantitative phase imaging.

    PubMed

    Lee, SangYun; Kim, Kyoohyun; Lee, Yuhyun; Park, Sungjin; Shin, Heejae; Yang, Jongwon; Ko, Kwanhong; Park, HyunJoo; Park, YongKeun

    2015-01-01

    We present optical measurements of the morphology and refractive indexes (RIs) of human downy arm hairs using three-dimensional (3-D) quantitative phase imaging techniques. 3-D RI tomograms and high-resolution two-dimensional synthetic aperture images of individual downy arm hairs were measured using a Mach–Zehnder laser interferometric microscope equipped with a two-axis galvanometer mirror. From the measured quantitative images, the RIs and morphological parameters of downy hairs were noninvasively quantified, including the mean RI, volume, cylinder radius, and effective radius of individual hairs. In addition, the effects of hydrogen peroxide on individual downy hairs were investigated.

  13. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and also detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines based on the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which derives detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 fell within the 95% confidence interval of the RSD obtained statistically from repeated measurements (n = 6). Our findings thus show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, reliable measurement RSDs were obtained stochastically, and the experimental time was remarkably reduced.
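    The statistical side of the comparison above (an RSD from n = 6 repeated injections, and the 95% confidence interval against which the stochastic ISO 11843-7 estimate is checked) can be sketched in a few lines. The peak areas below are hypothetical; the chi-square quantiles are standard table values for df = 5:

```python
import math
import statistics as st

# Hypothetical repeated peak areas for baicalin (n = 6 injections)
areas = [1052.1, 1048.7, 1055.3, 1050.9, 1047.2, 1053.6]

mean = st.mean(areas)
sd = st.stdev(areas)          # sample standard deviation
rsd = 100 * sd / mean         # relative standard deviation, %

# 95% confidence interval for the SD via the chi-square distribution
# (df = n - 1 = 5; quantiles taken from standard tables)
n = len(areas)
chi2_hi, chi2_lo = 12.8325, 0.8312   # chi2(0.975, 5), chi2(0.025, 5)
sd_lo = math.sqrt((n - 1) * sd**2 / chi2_hi)
sd_hi = math.sqrt((n - 1) * sd**2 / chi2_lo)
print(f"RSD = {rsd:.2f}%  (SD 95% CI: {sd_lo:.2f} to {sd_hi:.2f})")
```

    A stochastic RSD estimate in the sense of the paper is then judged consistent with the repeated measurements if it falls inside this interval.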

  15. Confirmatory Factor Analytic Structure and Measurement Invariance of Quantitative Autistic Traits Measured by the Social Responsiveness Scale-2

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Ratliff, Kristin R.; Gruber, Chris; Zhang, Yi; Law, Paul A.; Constantino, John N.

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large ("N" = 9635) accumulated collection of reports on quantitative autistic traits using…

  16. Quantitative nuclear magnetic resonance to measure body composition in infants and children

    USDA-ARS?s Scientific Manuscript database

    Quantitative Nuclear Magnetic Resonance (QMR) is being used in human adults to obtain measures of total body fat (FM) with high precision. The current study assessed a device specially designed to accommodate infants and children between 3 and 50 kg (EchoMRI-AH™). Body composition of 113 infants and...

  17. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labuda, Aleksander; Proksch, Roger

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity—a longstanding goal in the electromechanical community.

  18. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
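    For the linear, homoscedastic Gaussian case treated above, Currie's decision level and detection limit, together with a conventional quantitation limit, reduce to simple closed forms. The slope and blank-noise values below are made-up illustrations, and the factor k_Q = 10 follows the common IUPAC convention rather than the specific proposal of this paper:

```python
# Currie-style limits for a linear, univariate measurement system
# with homoscedastic Gaussian noise (all numbers hypothetical).
slope = 2.5e3        # calibration sensitivity: signal units per amount unit
sigma_b = 40.0       # standard deviation of the blank signal
z = 1.645            # one-sided 95% z-score (alpha = beta = 0.05)

L_C = z * sigma_b / slope        # decision level: "detected" threshold
L_D = 2 * z * sigma_b / slope    # detection limit (equal alpha and beta)
LOQ = 10 * sigma_b / slope       # quantitation limit, k_Q = 10 convention
print(L_C, L_D, LOQ)
```

    The paper's contribution is to replace the somewhat arbitrary k_Q with a limit grounded in significant figures and relative measurement error; the closed forms above are only the textbook baseline it starts from.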

  19. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation affecting the proximal femur and acetabulum, which may be subluxatable, dislocatable, or dislocated. Conventionally, physicians have made diagnoses and planned treatments based only on findings from two-dimensional (2D) images, manually calculating clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and acetabulum. To evaluate system performance, an experimental study of 22 patients with unilateral or bilateral affected hips was performed. The 3D acetabular index (AI) results provided automatically by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p<0.01).

  20. Quantitative and Isolated Measurement of Far-Field Light Scattering by a Single Nanostructure

    NASA Astrophysics Data System (ADS)

    Kim, Donghyeong; Jeong, Kwang-Yong; Kim, Jinhyung; Ee, Ho-Seok; Kang, Ju-Hyung; Park, Hong-Gyu; Seo, Min-Kyo

    2017-11-01

    Light scattering by nanostructures has facilitated research on various optical phenomena and applications by interfacing the near fields and free-propagating radiation. However, direct quantitative measurement of far-field scattering by a single nanostructure on the wavelength scale or less is highly challenging. Conventional back-focal-plane imaging covers only a limited solid angle determined by the numerical aperture of the objectives and suffers from optical aberration and distortion. Here, we present a quantitative measurement of the differential far-field scattering cross section of a single nanostructure over the full hemisphere. In goniometer-based far-field scanning with a high signal-to-noise ratio of approximately 27.4 dB, weak scattering signals are efficiently isolated and detected under total-internal-reflection illumination. Systematic measurements reveal that the total and differential scattering cross sections of a Au nanorod are determined by the plasmonic Fabry-Perot resonances and the phase-matching conditions to the free-propagating radiation, respectively. We believe that our angle-resolved far-field measurement scheme provides a way to investigate and evaluate the physical properties and performance of nano-optical materials and phenomena.

  1. Experimental setup for the measurement of induction motor cage currents

    NASA Astrophysics Data System (ADS)

    Bottauscio, Oriano; Chiampi, Mario; Donadio, Lorenzo; Zucca, Mauro

    2005-04-01

    An experimental setup for measuring the currents flowing in the rotor bars of induction motors during synchronous no-load tests is described in the paper. Experimental verification of the high-frequency phenomena in the rotor cage is fundamental for deep insight into the estimation of additional losses by numerical methods. Attention is mainly focused on the analysis and design of the transducers developed for the cage current measurement.

  2. Quantitative Primary Tumor Indocyanine Green Measurements Predict Osteosarcoma Metastatic Lung Burden in a Mouse Model.

    PubMed

    Fourman, Mitchell S; Mahjoub, Adel; Mandell, Jon B; Yu, Shibing; Tebbets, Jessica C; Crasto, Jared A; Alexander, Peter E; Weiss, Kurt R

    2018-03-01

    Current preclinical osteosarcoma (OS) models largely focus on quantifying primary tumor burden. However, most fatalities from OS are caused by metastatic disease. Quantification of metastatic OS currently relies on CT, which is limited by motion artifact, requires intravenous contrast, and can be technically demanding in the preclinical setting. We describe the ability of indocyanine green (ICG) fluorescence angiography to quantify primary and metastatic OS in a previously validated orthotopic, immunocompetent mouse model. (1) Can near-infrared ICG fluorescence be used to attach a comparable, quantitative value to the primary OS tumor in our experimental mouse model? (2) Will primary tumor fluorescence differ in mice that go on to develop metastatic lung disease? (3) Does primary tumor fluorescence correlate with tumor volume measured with CT? Six groups of 4- to 6-week-old immunocompetent Balb/c mice (n = 6 per group) received paraphyseal injections into the proximal tibia of the left hindlimb consisting of variable numbers of K7M2 mouse OS cells. A hindlimb transfemoral amputation was performed 4 weeks after injection, with euthanasia and lung extraction performed 10 weeks after injection. Histologic examination of lung and primary tumor specimens confirmed ICG localization only within the tumor bed. Mice with visible or palpable tumor growth had greater hindlimb fluorescence (3.5 ± 2.3 arbitrary perfusion units [APU], defined as the fluorescence pixel return normalized by the detector) compared with those with a negative examination (0.71 ± 0.38 APU, mean difference -2.7 ± 0.5, 95% confidence interval -3.7 to -1.8, p < 0.001). A strong linear trend (r = 0.81, p < 0.01) was observed between primary tumor and lung fluorescence, suggesting that quantitative ICG tumor fluorescence is directly related to eventual metastatic burden. We did not find a correlation (r = 0.04, p = 0.45) between normalized primary tumor fluorescence and CT volumetric measurements. We

  3. Estimating Coherence Measures from Limited Experimental Data Available

    NASA Astrophysics Data System (ADS)

    Zhang, Da-Jian; Liu, C. L.; Yu, Xiao-Dong; Tong, D. M.

    2018-04-01

    Quantifying coherence has received increasing attention, and considerable work has been directed towards finding coherence measures. While various coherence measures have been proposed in theory, an important follow-on issue is how to estimate these measures in experiments. This is a challenging task, since the state of a system is often unknown in practical applications and the accessible measurements in a real experiment are typically limited. In this Letter, we put forward an approach to estimating coherence measures of an unknown state from any limited experimental data available. Our approach is not only applicable to coherence measures but can be extended to other resource measures.

  4. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For this purpose, the fluorescent dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.

  5. Quantitative facial asymmetry: using three-dimensional photogrammetry to measure baseline facial surface symmetry.

    PubMed

    Taylor, Helena O; Morrison, Clinton S; Linden, Olivia; Phillips, Benjamin; Chang, Johnny; Byrne, Margaret E; Sullivan, Stephen R; Forrest, Christopher R

    2014-01-01

    Although symmetry is hailed as a fundamental goal of aesthetic and reconstructive surgery, our tools for measuring this outcome have been limited and subjective. With the advent of three-dimensional photogrammetry, surface geometry can be captured, manipulated, and measured quantitatively. Until now, few normative data existed with regard to facial surface symmetry. Here, we present a method for reproducibly calculating overall facial symmetry and present normative data on 100 subjects. We enrolled 100 volunteers who underwent three-dimensional photogrammetry of their faces in repose. We collected demographic data on age, sex, and race and subjectively scored facial symmetry. We calculated the root mean square deviation (RMSD) between the native and reflected faces, reflecting about a plane of maximum symmetry. We analyzed the interobserver reliability of the subjective assessment of facial asymmetry and the quantitative measurements and compared the subjective and objective values. We also classified areas of greatest asymmetry as localized to the upper, middle, or lower facial thirds. This cluster of normative data was compared with a group of patients with subtle but increasing amounts of facial asymmetry. We imaged 100 subjects by three-dimensional photogrammetry. There was a poor interobserver correlation between subjective assessments of asymmetry (r = 0.56). There was a high interobserver reliability for quantitative measurements of facial symmetry RMSD calculations (r = 0.91-0.95). The mean RMSD for this normative population was found to be 0.80 ± 0.24 mm. Areas of greatest asymmetry were distributed as follows: 10% upper facial third, 49% central facial third, and 41% lower facial third. Precise measurement permitted discrimination of subtle facial asymmetry within this normative group and distinguished norms from patients with subtle facial asymmetry, with placement of RMSDs along an asymmetry ruler. Facial surface symmetry, which is poorly assessed
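    The core RMSD computation above, reflecting the face about the plane of maximum symmetry and measuring how far the mirrored points fall from the original surface, can be illustrated with a toy landmark set. This sketch assumes the symmetry plane is x = 0 and matches each point to its nearest mirrored neighbour; the study instead fits the plane of maximum symmetry per subject and works on dense surface meshes:

```python
import math

# Toy facial landmarks (x, y, z) in mm; x = 0 is taken as the
# symmetry plane for this illustration.
points = [(1.0, 0.0, 0.2), (-1.1, 0.1, 0.2),
          (0.0, 1.5, 0.5), (0.05, -1.0, 0.3)]

def reflect_x(p):
    """Mirror a point about the plane x = 0."""
    x, y, z = p
    return (-x, y, z)

def rmsd(native, reflected):
    """RMSD between native points and the reflected set, matching
    each native point to its nearest reflected neighbour."""
    total = 0.0
    for p in native:
        d2 = min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in reflected)
        total += d2
    return math.sqrt(total / len(native))

mirror = [reflect_x(p) for p in points]
print(f"RMSD = {rmsd(points, mirror):.3f} mm")
```

    A perfectly symmetric face yields an RMSD of zero; the study's normative mean of 0.80 ± 0.24 mm quantifies how far real faces depart from that ideal.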

  6. Steps to achieve quantitative measurements of microRNA using two step droplet digital PCR.

    PubMed

    Stein, Erica V; Duewer, David L; Farkas, Natalia; Romsos, Erica L; Wang, Lili; Cole, Kenneth D

    2017-01-01

    Droplet digital PCR (ddPCR) is being advocated as a reference method to measure rare genomic targets. It has consistently been proven to be more sensitive and direct at discerning copy numbers of DNA than other quantitative methods. However, one of the largest obstacles to measuring microRNA (miRNA) using ddPCR is that reverse transcription efficiency depends upon the target, meaning that small RNA nucleotide composition directly affects primer specificity in a manner that prevents traditional quantitation optimization strategies. Additionally, the use of reagents that are optimized for miRNA measurements using quantitative real-time PCR (qRT-PCR) appears to cause either false positive or false negative detection of certain targets when used with traditional ddPCR quantification methods. False readings are often related to the use of inadequate enzymes, primers, and probes. Given that two-step miRNA quantification using ddPCR relies solely on reverse transcription and uses proprietary reagents previously optimized only for qRT-PCR, these barriers are substantial. Therefore, here we outline essential controls, optimization techniques, and an efficacy model to improve the quality of ddPCR miRNA measurements. We have applied two-step principles used for miRNA qRT-PCR measurements and leveraged the use of synthetic miRNA targets to evaluate ddPCR following cDNA synthesis with four different commercial kits. We have identified inefficiencies and limitations as well as proposed ways to circumvent identified obstacles. Lastly, we show that we can apply these criteria to a model system to confidently quantify miRNA copy number. Our measurement technique is a novel way to quantify specific miRNA copy number in a single sample, without using standard curves for individual experiments. Our methodology can be used for validation and control measurements, as well as a diagnostic technique that allows scientists, technicians, clinicians, and regulators to base miRNA measures on a single
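    Once the reverse-transcription and reagent issues above are controlled, the ddPCR readout itself is a standard Poisson count: each droplet is scored positive or negative, and the mean copies per droplet is recovered from the fraction of negative droplets. The droplet counts and nominal volume below are illustrative only:

```python
import math

# Standard ddPCR Poisson quantification (hypothetical counts).
n_droplets = 15000
n_positive = 4200
droplet_volume_nl = 0.85   # nominal droplet volume in nanoliters

# A droplet is negative only if it received zero copies, so the
# Poisson mean follows from the negative fraction.
frac_negative = 1 - n_positive / n_droplets
lam = -math.log(frac_negative)                 # mean copies per droplet
copies_per_ul = lam * 1000 / droplet_volume_nl # 1000 nL per uL of reaction
print(f"{lam:.3f} copies/droplet -> {copies_per_ul:.0f} copies/uL")
```

    This Poisson step is what lets ddPCR report absolute copy number without a standard curve, which is the property the authors exploit for miRNA.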

  7. Steps to achieve quantitative measurements of microRNA using two step droplet digital PCR

    PubMed Central

    Duewer, David L.; Farkas, Natalia; Romsos, Erica L.; Wang, Lili; Cole, Kenneth D.

    2017-01-01

    Droplet digital PCR (ddPCR) is being advocated as a reference method to measure rare genomic targets. It has consistently been proven to be more sensitive and direct at discerning copy numbers of DNA than other quantitative methods. However, one of the largest obstacles to measuring microRNA (miRNA) using ddPCR is that reverse transcription efficiency depends upon the target, meaning that small RNA nucleotide composition directly affects primer specificity in a manner that prevents traditional quantitation optimization strategies. Additionally, the use of reagents that are optimized for miRNA measurements using quantitative real-time PCR (qRT-PCR) appears to cause either false positive or false negative detection of certain targets when used with traditional ddPCR quantification methods. False readings are often related to the use of inadequate enzymes, primers, and probes. Given that two-step miRNA quantification using ddPCR relies solely on reverse transcription and uses proprietary reagents previously optimized only for qRT-PCR, these barriers are substantial. Therefore, here we outline essential controls, optimization techniques, and an efficacy model to improve the quality of ddPCR miRNA measurements. We have applied two-step principles used for miRNA qRT-PCR measurements and leveraged the use of synthetic miRNA targets to evaluate ddPCR following cDNA synthesis with four different commercial kits. We have identified inefficiencies and limitations as well as proposed ways to circumvent identified obstacles. Lastly, we show that we can apply these criteria to a model system to confidently quantify miRNA copy number. Our measurement technique is a novel way to quantify specific miRNA copy number in a single sample, without using standard curves for individual experiments. Our methodology can be used for validation and control measurements, as well as a diagnostic technique that allows scientists, technicians, clinicians, and regulators to base miRNA measures on a single

  8. Quantitative Measurements of X-ray Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugh, M. J., Schneider, M.

    This chapter describes the characterization of several X-ray sources and their use in calibrating different types of X-ray cameras at National Security Technologies, LLC (NSTec). The cameras are employed in experimental plasma studies at Lawrence Livermore National Laboratory (LLNL), including the National Ignition Facility (NIF). The sources provide X-rays in the energy range from several hundred eV to 110 keV. The key to this effort is measuring the X-ray beam intensity accurately and traceably to international standards. This is accomplished using photodiodes of several types that are calibrated using radioactive sources and a synchrotron source, using methods and materials that are traceable to the U.S. National Institute of Standards and Technology (NIST). The accreditation procedures are described. The chapter begins with an introduction to the fundamental concepts of X-ray physics. The types of X-ray sources that are used for device calibration are described. The next section describes the photodiode types that are used for measuring X-ray intensity: power-measuring photodiodes, energy-dispersive photodiodes, and cameras comprising photodiodes as pixel elements. Following their description, the methods used to calibrate the primary detectors (the power-measuring photodiodes and the energy-dispersive photodiodes), as well as the method used to obtain traceability to international standards, are described. The X-ray source beams can then be measured using the primary detectors. The final section describes the use of the calibrated X-ray beams to calibrate X-ray cameras. Many of the references are web sites that provide databases, explanations of the data and how it was generated, and data calculations for specific cases. Several general reference books related to the major topics are included. Papers expanding on some subjects are cited.

  9. Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.

    ERIC Educational Resources Information Center

    Fortune, Jim C.; Hutson, Barbara A.

    1984-01-01

    Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…

  10. Experimental validation of arthroscopic cartilage stiffness measurement using enzymatically degraded cartilage samples

    NASA Astrophysics Data System (ADS)

    Lyyra, T.; Arokoski, J. P. A.; Oksala, N.; Vihko, A.; Hyttinen, M.; Jurvelin, J. S.; Kiviranta, I.

    1999-02-01

    In order to evaluate the ability of the arthroscopic indentation instrument, originally developed to measure cartilage stiffness during arthroscopy, to detect cartilage degeneration, we compared changes in stiffness with the structural and constitutional alterations induced in the tissue by enzymes in vitro. Osteochondral plugs were cultured on Petri dishes in Minimum Essential Medium with Earle's salts, and the baseline stiffness was measured. The experimental specimens were then digested using trypsin for 24 h, or chondroitinase ABC or purified collagenase (type VII) for 24 h or 48 h (n = 8-15 per group). The control specimens were incubated in the medium. After the enzyme digestion, the end-point stiffness was measured and the specimens were processed for microscopic analyses. The proteoglycan (PG) distribution was analysed using quantitative microspectrophotometry, and the collagen network was evaluated quantitatively using a computer-based polarized light microscopy analysis. A decrease of cartilage stiffness was found after 24 h trypsin (36%) and 48 h chondroitinase ABC (24%) digestion, corresponding to decreases of up to 80% and up to 30% in the PG content, respectively. A decrease of the superficial zone collagen content or arrangement (78%) after 48 h collagenase digestion also induced a decrease (30%) in cartilage stiffness. We conclude that our instrument is capable of

  11. The Vermicelli Handling Test: A Simple Quantitative Measure of Dexterous Forepaw Function in Rats

    PubMed Central

    Allred, Rachel P.; Adkins, DeAnna L.; Woodlee, Martin T.; Husbands, Lincoln C.; Maldonado, Mónica A.; Kane, Jacqueline R.; Schallert, Timothy; Jones, Theresa A.

    2008-01-01

    Loss of function in the hands occurs with many brain disorders, but there are few measures of skillful forepaw use in rats available to model these impairments that are both sensitive and simple to administer. Whishaw and Coles (1996) previously described the dexterous manner in which rats manipulate food items with their paws, including thin pieces of pasta. We set out to develop a measure of this food handling behavior that would be quantitative, easy to administer, sensitive to the effects of damage to sensory and motor systems of the CNS and useful for identifying the side of lateralized impairments. When rats handle 7 cm lengths of vermicelli, they manipulate the pasta by repeatedly adjusting the forepaw hold on the pasta piece. As operationally defined, these adjustments can be easily identified and counted by an experimenter without specialized equipment. After unilateral sensorimotor cortex (SMC) lesions, transient middle cerebral artery occlusion (MCAO) and striatal dopamine depleting (6-hydroxydopamine, 6-OHDA) lesions in adult rats, there were enduring reductions in adjustments made with the contralateral forepaw. Additional pasta handling characteristics distinguished between the lesion types. MCAO and 6-OHDA lesions increased the frequency of several identified atypical handling patterns. Severe dopamine depletion increased eating time and adjustments made with the ipsilateral forepaw. However, contralateral forepaw adjustment number most sensitively detected enduring impairments across lesion types. Because of its ease of administration and sensitivity to lateralized impairments in skilled forepaw use, this measure may be useful in rat models of upper extremity impairment. PMID:18325597

  12. Quantitative Study of Emotional Intelligence and Communication Levels in Information Technology Professionals

    ERIC Educational Resources Information Center

    Hendon, Michalina

    2016-01-01

    This quantitative non-experimental correlational research analyzes the relationship between emotional intelligence and communication due to the lack of this research on information technology professionals in the U.S. One hundred and eleven (111) participants completed a survey that measures both the emotional intelligence and communication…

  13. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234

  14. Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.

    PubMed

    Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike

    2007-11-01

    Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as the excitation source. Quantitative partial least squares models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single partial least squares (PLS) component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed, by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
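    The headline figures of merit here (2.2% vs. 2.9%) are relative root-mean-square errors of prediction on an independent test set. Once a PLS model has produced predictions, the computation is straightforward; the reference and predicted API contents below are made-up stand-ins:

```python
import math

# Relative RMSEP of a calibration model on an independent test set.
# Reference values from the assay method, predictions from the model
# (all numbers hypothetical, in % w/w).
reference = [18.0, 20.0, 22.0, 24.0, 26.0]
predicted = [18.3, 19.6, 22.4, 23.7, 26.5]

rmsep = math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference))
                  / len(reference))
rel_rmsep = 100 * rmsep / (sum(reference) / len(reference))  # % of mean
print(f"relative RMSEP = {rel_rmsep:.1f}%")
```

    Expressing the error relative to the mean reference content is what makes the 2.2% and 2.9% figures directly comparable across the two sampling modes.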

  15. Quantitative indexes of aminonucleoside-induced nephrotic syndrome.

    PubMed Central

    Nevins, T. E.; Gaston, T.; Basgen, J. M.

    1984-01-01

    Aminonucleoside of puromycin (PAN) is known to cause altered glomerular permeability, resulting in a nephrotic syndrome in rats. The early sequence of this lesion was studied quantitatively, with the application of a new morphometric technique for determining epithelial foot process widths and a sensitive assay for quantifying urinary albumin excretion. Twenty-four hours following a single intraperitoneal injection of PAN, significant widening of foot processes was documented. Within 36 hours significant increases in urinary albumin excretion were observed. When control rats were examined, there was no clear correlation between epithelial foot process width and quantitative albumin excretion. However, in the PAN-treated animals, abnormal albuminuria only appeared in association with appreciable foot process expansion. These studies indicate that quantitative alterations occur in the rat glomerular capillary wall as early as 24 hours after PAN. Further studies of altered glomerular permeability may use these sensitive measures to more precisely define the temporal sequence and elucidate possible subgroups of experimental glomerular injury. PMID:6486243

  16. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and by problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both issues by combining carbon nanotube high-aspect-ratio probes with the highly sensitive frequency modulation (FM) detection technique, quantified using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  17. Sensitive and quantitative measurement of gene expression directly from a small amount of whole blood.

    PubMed

    Zheng, Zhi; Luo, Yuling; McMaster, Gary K

    2006-07-01

    Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.

  18. Feasibility of Quantitative Ultrasound Measurement of the Heel Bone in People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Mergler, S.; Lobker, B.; Evenhuis, H. M.; Penning, C.

    2010-01-01

    Low bone mineral density (BMD) and fractures are common in people with intellectual disabilities (ID). Reduced mobility in case of motor impairment and the use of anti-epileptic drugs contribute to the development of low BMD. Quantitative ultrasound (QUS) measurement of the heel bone is a non-invasive and radiation-free method for measuring bone…

  19. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields such as medicine, biology, and biotechnology. Accurate quantitative measurement of this analyte has been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation, and calibration against an appropriately characterized reference material (CRM) for sulfur would provide a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence interval, varied from 0.1% to 0.3% for the ICP-OES analyses and between 0.2% and 1.2% for IC. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814

  20. Quantitative measurement of feline colonic transit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krevsky, B.; Somers, M.B.; Maurer, A.H.

    1988-10-01

    Colonic transit scintigraphy, a method for quantitatively evaluating the movement of the fecal stream in vivo, was employed to evaluate colonic transit in the cat. Scintigraphy was performed in duplicate in five cats and repeated four times in one cat. After instillation of an 111In marker into the cecum through a surgically implanted silicone cecostomy tube, colonic movement of the instillate was quantitated for 24 h using gamma scintigraphy. Antegrade and retrograde motion of radionuclide was observed. The cecum and ascending colon emptied rapidly, with a half-emptying time of 1.68 +/- 0.56 h (mean +/- SE). After 24 h, 25.1 +/- 5.2% of the activity remained in the transverse colon. The progression of the geometric center was initially rapid, followed later by a delayed phase. Geometric center reproducibility was found to be high when analyzed using simple linear regression (slope = 0.92; r = 0.73; P less than 0.01). Atropine (0.1 mg/kg im) was found to delay cecum and ascending colon emptying and delay progression of the geometric center. These results demonstrate both 1) the ability of colonic transit scintigraphy to detect changes in transit induced by pharmacological manipulation and 2) the fact that muscarinic blockade inhibits antegrade transit of the fecal stream. We conclude that feline colonic transit may be studied in a quantitative and reproducible manner with colonic transit scintigraphy.
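    The geometric center used in colonic transit scintigraphy is conventionally the counts-weighted mean of numbered colonic regions, advancing as activity moves distally. A minimal sketch, with illustrative region counts rather than data from this study:

```python
# Regions are numbered from proximal to distal,
# e.g. 1 = cecum/ascending colon ... 5 = rectosigmoid/excreted.
def geometric_center(region_counts):
    """Counts-weighted mean region index (regions numbered from 1)."""
    total = sum(region_counts)
    return sum((i + 1) * c for i, c in enumerate(region_counts)) / total

# All activity in region 1 immediately after cecal instillation:
assert geometric_center([100, 0, 0, 0, 0]) == 1.0
# Activity spread distally some hours later (illustrative counts):
print(geometric_center([5, 25, 40, 20, 10]))  # -> 3.05
```

    Tracking this single number over time yields progression curves like the "initially rapid, followed later by a delayed phase" pattern described above.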

  1. An experimental design for quantification of cardiovascular responses to music stimuli in humans.

    PubMed

    Chang, S-H; Luo, C-H; Yeh, T-L

    2004-01-01

    There has been considerable research on the relationship between music and human physiological or psychological responses. However, some cardiovascular index factors have not been explored quantitatively, owing to the qualitative nature of acoustic stimuli. This study proposes and demonstrates an experimental design for the quantification of cardiovascular responses to music stimuli in humans. The system comprises two components: a unit for generating and monitoring quantitative acoustic stimuli, and a portable autonomic nervous system (ANS) analysis unit for quantitative recording and analysis of the cardiovascular responses. The experimental results indicate that the proposed system achieves full control and measurement of the music stimuli and effectively supports many quantitative indices of cardiovascular response in humans. In addition, the analysis results are discussed with a view to future clinical research.

  2. Para-Quantitative Methodology: Reclaiming Experimentalism in Educational Research

    ERIC Educational Resources Information Center

    Shabani Varaki, Bakhtiar; Floden, Robert E.; Javidi Kalatehjafarabadi, Tahereh

    2015-01-01

    This article focuses on the criticisms of current approaches in educational research methodology. It summarizes rationales for mixed methods and argues that the mixing quantitative paradigm and qualitative paradigm is problematic due to practical and philosophical arguments. It is also indicated that the current rise of mixed methods work has…

  3. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.

  4. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are regaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, and feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed using quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  5. Experimental validation of a transformation optics based lens for beam steering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Jianjia; Burokur, Shah Nawaz, E-mail: shah-nawaz.burokur@u-psud.fr; Lustrac, André de

    2015-10-12

    A transformation optics based lens for beam control is experimentally realized and measured at microwave frequencies. Laplace's equation is adopted to construct the mapping between the virtual and physical spaces. The metamaterial-based lens prototype is designed using electric LC resonators. A planar microstrip antenna source is used as a transverse-electric-polarized wave launcher for the lens. Both the far-field radiation patterns and the near-field distributions have been measured to experimentally demonstrate the beam steering properties. Measurements agree quantitatively and qualitatively with numerical simulations, and operation over a broad frequency bandwidth is observed.

  6. Quantitative fundus autofluorescence in mice: correlation with HPLC quantitation of RPE lipofuscin and measurement of retina outer nuclear layer thickness.

    PubMed

    Sparrow, Janet R; Blonska, Anna; Flynn, Erin; Duncker, Tobias; Greenberg, Jonathan P; Secondi, Roberta; Ueda, Keiko; Delori, François C

    2013-04-17

    Our study was conducted to establish procedures and protocols for quantitative autofluorescence (qAF) measurements in mice, and to report changes in qAF, A2E bisretinoid concentration, and outer nuclear layer (ONL) thickness in mice of different genotypes and age. Fundus autofluorescence (AF) images (55° lens, 488 nm excitation) were acquired in albino Abca4(-/-), Abca4(+/-), and Abca4(+/+) mice (ages 2-12 months) with a confocal scanning laser ophthalmoscope (cSLO). Gray levels (GLs) in each image were calibrated to an internal fluorescence reference. The bisretinoid A2E was measured by quantitative high performance liquid chromatography (HPLC). Histometric analysis of ONL thicknesses was performed. The Bland-Altman coefficient of repeatability (95% confidence interval) was ±18% for between-session qAF measurements. Mean qAF values increased with age (2-12 months) in all groups of mice. qAF was approximately 2-fold higher in Abca4(-/-) mice than in Abca4(+/+) mice and approximately 20% higher in heterozygous mice. HPLC measurements of the lipofuscin fluorophore A2E also revealed age-associated increases, and the fold difference between Abca4(-/-) and wild-type mice was more pronounced (approximately 3-4-fold) than measurable by qAF. Moreover, A2E levels declined after 8 months of age, a change not observed with qAF. The decline in A2E levels in the Abca4(-/-) mice corresponded to reduced photoreceptor cell viability as reflected in ONL thinning beginning at 8 months of age. The qAF method enables measurement of in vivo lipofuscin and the detection of genotype and age-associated differences. The use of this approach has the potential to aid in understanding retinal disease processes and will facilitate preclinical studies.
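    The ±18% between-session repeatability quoted above is a Bland-Altman coefficient of repeatability: 1.96 times the standard deviation of the paired between-session differences, here expressed relative to the overall mean. A sketch with invented paired readings (not the study's qAF values):

```python
import math

def repeatability_pct(session1, session2):
    """Bland-Altman 95% coefficient of repeatability, as % of the mean."""
    diffs = [a - b for a, b in zip(session1, session2)]
    means = [(a + b) / 2 for a, b in zip(session1, session2)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the paired differences.
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    cr = 1.96 * sd_d                      # 95% coefficient of repeatability
    return 100 * cr / (sum(means) / n)    # relative to the overall mean

# Illustrative paired measurements from two sessions:
s1 = [100, 120, 95, 130, 110]
s2 = [108, 113, 101, 124, 118]
print(f"{repeatability_pct(s1, s2):.1f}%")
```

    A result in this form means that, with 95% confidence, a repeat measurement of the same eye is expected to fall within that percentage of the first.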

  7. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  8. Validation of reference genes for quantitative gene expression analysis in experimental epilepsy.

    PubMed

    Sadangi, Chinmaya; Rosenow, Felix; Norwood, Braxton A

    2017-12-01

    To grasp the molecular mechanisms and pathophysiology underlying epilepsy development (epileptogenesis) and epilepsy itself, it is important to understand the gene expression changes that occur during these phases. Quantitative real-time polymerase chain reaction (qPCR) is a technique that rapidly and accurately determines gene expression changes. It is crucial, however, that stable reference genes are selected for each experimental condition to ensure that accurate values are obtained for genes of interest. If reference genes are unstably expressed, this can lead to inaccurate data and erroneous conclusions. To date, epilepsy studies have used mostly single, nonvalidated reference genes. This is the first study to systematically evaluate reference genes in male Sprague-Dawley rat models of epilepsy. We assessed 15 potential reference genes in hippocampal tissue obtained from 2 different models during epileptogenesis, 1 model during chronic epilepsy, and a model of noninjurious seizures. Reference gene ranking varied between models and also differed between epileptogenesis and chronic epilepsy time points. There was also some variance between the four mathematical models used to rank reference genes. Notably, we found novel reference genes to be more stably expressed than those most often used in experimental epilepsy studies. The consequence of these findings is that reference genes suitable for one epilepsy model may not be appropriate for others and that reference genes can change over time. It is, therefore, critically important to validate potential reference genes before using them as normalizing factors in expression analysis in order to ensure accurate, valid results. © 2017 Wiley Periodicals, Inc.

  9. Simple and cost-effective liquid chromatography-mass spectrometry method to measure dabrafenib quantitatively and six metabolites semi-quantitatively in human plasma.

    PubMed

    Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik

    2017-06-01

    Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma, but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and its major metabolites during treatment are needed to allow the development of individualized dosing strategies that reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of the dabrafenib metabolites. Compared to earlier methods, the presented method is a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract: combined multiple-reaction-monitoring transitions of dabrafenib and metabolites in a typical case sample.

  10. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies

    NASA Astrophysics Data System (ADS)

    Ali, E. S. M.; Spencer, B.; McEwen, M. R.; Rogers, D. W. O.

    2015-02-01

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy, i.e. 100 keV (orthovoltage) to 25 MeV, using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ~0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce the cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative 'envelope of uncertainty' of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol. 44 R1-22).
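    The optimum energy-independent scaling factor described above has a closed form under least squares: choosing s to minimize sum_i (m_i - s*c_i)^2 gives s = sum(m_i*c_i) / sum(c_i^2). A sketch with illustrative measured/calculated values (not the study's data):

```python
def optimal_scale(measured, calculated):
    """Least-squares scale factor s minimizing sum (m_i - s*c_i)^2."""
    num = sum(m * c for m, c in zip(measured, calculated))
    den = sum(c * c for c in calculated)
    return num / den

# Illustrative calculated transmissions, with measurements carrying
# a 0.4% systematic offset relative to the calculation:
calc = [0.983, 0.871, 0.642, 0.455, 0.310]
meas = [c * 1.004 for c in calc]
s = optimal_scale(meas, calc)
print(f"scale factor: {s:.4f}  deviation from unity: {100 * (s - 1):.2f}%")
```

    The deviation of s from unity is then one ingredient in deducing a cross section uncertainty of the kind quoted in the abstract.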

  11. Investigation of PACE™ software and VeriFax's Impairoscope device for quantitatively measuring the effects of stress

    NASA Astrophysics Data System (ADS)

    Morgenthaler, George W.; Nuñez, German R.; Botello, Aaron M.; Soto, Jose; Shrairman, Ruth; Landau, Alexander

    1998-01-01

    Many reaction time experiments have been conducted over the years to observe human responses. However, most of these experiments lacked quantitatively accurate instruments for measuring changes in reaction time under stress. There is a great need for quantitative instruments to measure neuromuscular reaction responses under stressful conditions such as distraction, disorientation, disease, alcohol, and drugs. The two instruments used in the experiments reported in this paper are such devices; their accuracy, portability, ease of use, and biometric character make them especially valuable. PACE™ is a software model used to measure reaction time. VeriFax's Impairoscope measures the deterioration of neuromuscular responses. During the 1997 Summer Semester, various reaction time experiments were conducted on University of Colorado faculty, staff, and students using the PACE™ system. The tests included both two-eye and one-eye unstressed trials, as well as trials with various stresses such as fatigue, distraction (subjects were asked to perform simple arithmetic during the PACE™ tests), and rotating-chair dizziness. Various VeriFax Impairoscope tests, both stressed and unstressed, were conducted to determine the Impairoscope's ability to quantitatively measure this impairment. In the 1997 Fall Semester, a Phase II effort was undertaken to increase test sample sizes in order to provide statistical precision and stability. More sophisticated statistical methods remain to be applied to better interpret the data.

  12. Experimental study of ERT monitoring ability to measure solute dispersion.

    PubMed

    Lekmine, Grégory; Pessel, Marc; Auradou, Harold

    2012-01-01

    This paper reports experimental measurements performed to test the ability of electrical resistivity tomography (ERT) imaging to provide quantitative information about transport parameters in porous media, such as the dispersivity α, the mixing front velocity u, and the retardation factor R(f) associated with the sorption or trapping of the tracers in the pore structure. The flow experiments are performed in a homogeneous porous column placed between two vertical sets of electrodes. Ionic and dyed tracers are injected from the bottom of the porous medium over its full width. Under such conditions, the mixing front is homogeneous in the transverse direction and shows an S-shaped variation in the flow direction. The transport parameters are inferred from the variation of the concentration curves and are compared with data obtained from video analysis of the dyed tracer front. The variations of the transport parameters obtained from an inversion performed by the Gauss-Newton method applied to smoothness-constrained least squares are studied in detail. While u and R(f) show a relatively small dependence on the inversion procedure, α is strongly dependent on the choice of the inversion parameters. Comparison with the video observations allows for the optimization of the parameters; these parameters are found to be robust with respect to changes in the flow condition and conductivity contrast. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.

  13. Effect of Pt Doping on Nucleation and Crystallization in Li2O·2SiO2 Glass: Experimental Measurements and Computer Modeling

    NASA Technical Reports Server (NTRS)

    Narayan, K. Lakshmi; Kelton, K. F.; Ray, C. S.

    1996-01-01

    Heterogeneous nucleation and its effects on the crystallization of lithium disilicate glass containing small amounts of Pt are investigated. Measurements of the nucleation frequencies and induction times with and without Pt are shown to be consistent with predictions based on the classical nucleation theory. A realistic computer model for the transformation is presented. Computed differential thermal analysis data (such as crystallization rates as a function of time and temperature) are shown to be in good agreement with experimental results. This modeling provides a new, more quantitative method for analyzing calorimetric data.

  14. Comparison of MPEG-1 digital videotape with digitized sVHS videotape for quantitative echocardiographic measurements

    NASA Technical Reports Server (NTRS)

    Garcia, M. J.; Thomas, J. D.; Greenberg, N.; Sandelski, J.; Herrera, C.; Mudd, C.; Wicks, J.; Spencer, K.; Neumann, A.; Sankpal, B.

    2001-01-01

    Digital format is rapidly emerging as a preferred method for displaying and retrieving echocardiographic studies. The qualitative diagnostic accuracy of Moving Pictures Experts Group (MPEG-1) compressed digital echocardiographic studies has been previously reported. The goal of the present study was to compare quantitative measurements derived from MPEG-1 recordings with the super-VHS (sVHS) videotape clinical standard. Six reviewers performed blinded measurements from still-frame images selected from 20 echocardiographic studies that were simultaneously acquired in sVHS and MPEG-1 formats. Measurements were obtainable in 1401 (95%) of 1486 MPEG-1 variables compared with 1356 (91%) of 1486 sVHS variables (P < .001). Excellent agreement existed between MPEG-1 and sVHS 2-dimensional linear measurements (r = 0.97; MPEG-1 = 0.95[sVHS] + 1.1 mm; P < .001; Delta = 9% +/- 10%), 2-dimensional area measurements (r = 0.89), color jet areas (r = 0.87, P < .001), and Doppler velocities (r = 0.92, P < .001). Interobserver variability was similar for both sVHS and MPEG-1 readings. Our results indicate that quantitative off-line measurements from MPEG-1 digitized echocardiographic studies are feasible and comparable to those obtained from sVHS.

  15. Epithelium percentage estimation facilitates epithelial quantitative protein measurement in tissue specimens.

    PubMed

    Chen, Jing; Toghi Eshghi, Shadi; Bova, George Steven; Li, Qing Kay; Li, Xingde; Zhang, Hui

    2013-12-01

    The rapid advancement of high-throughput tools for quantitative measurement of proteins has demonstrated the potential for the identification of proteins associated with cancer. However, quantitative results on cancer tissue specimens are usually confounded by tissue heterogeneity, e.g. regions with cancer usually have significantly higher epithelium content yet lower stromal content. It is therefore necessary to develop a tool to facilitate the interpretation of the results of protein measurements in tissue specimens. Epithelial cell adhesion molecule (EpCAM) and cathepsin L (CTSL) are two epithelial proteins whose expression in normal and tumorous prostate tissues was confirmed by measuring staining intensity with immunohistochemical (IHC) staining. The expression of these proteins was measured by ELISA in protein extracts from OCT-embedded frozen prostate tissues. To eliminate the influence of tissue heterogeneity on epithelial protein quantification measured by ELISA, a color-based segmentation method was developed in-house for estimation of epithelium content using H&E histology slides from the same prostate tissues, and the estimated epithelium percentage was used to normalize the ELISA results. The epithelium contents of the same slides were also estimated by a pathologist and used to normalize the ELISA results, and the computer-based results were compared with the pathologist's reading. We found that both EpCAM and CTSL levels, as measured by the ELISA assays themselves, were greatly affected by the epithelium content of the tissue specimens. Without adjusting for epithelium percentage, both EpCAM and CTSL levels appeared significantly higher in tumor tissues than normal tissues, with a p value less than 0.001. However, after normalization by the epithelium percentage, ELISA measurements of both EpCAM and CTSL were in agreement with the IHC staining results, showing a significant increase only in EpCAM, with no difference in CTSL expression in cancer tissues. These results…
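    The epithelium-percentage normalization described above amounts to dividing the bulk ELISA value by the epithelial fraction of the specimen, so that a difference driven purely by tissue composition disappears. A minimal sketch (values illustrative, not the study's measurements):

```python
def normalize_by_epithelium(elisa_value, epithelium_pct):
    """Divide a bulk ELISA concentration by the epithelial fraction."""
    if not 0 < epithelium_pct <= 100:
        raise ValueError("epithelium percentage must be in (0, 100]")
    return elisa_value / (epithelium_pct / 100)

# Illustrative case: a tumor specimen with 80% epithelium reads twice as
# high as a normal specimen with 40% epithelium, but the per-epithelium
# level is identical after normalization.
tumor = normalize_by_epithelium(8.0, 80)
normal = normalize_by_epithelium(4.0, 40)
print(tumor, normal)  # both 10.0
```

    This is the mechanism by which the apparent tumor/normal difference in CTSL vanished after adjustment, while the EpCAM difference persisted.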

  16. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    Solid-sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant; consequently, irradiating different portions of the microcrystal distribution yields an identical response. This result suggests the employment of SBD in the development of quantitative procedures. To this end, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219), present in the collagen-α-5(IV) chain precursor and differentially expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis, also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  17. Serial semi-quantitative measurement of fecal calprotectin in patients with ulcerative colitis in remission.

    PubMed

    Garcia-Planella, Esther; Mañosa, Míriam; Chaparro, María; Beltrán, Belén; Barreiro-de-Acosta, Manuel; Gordillo, Jordi; Ricart, Elena; Bermejo, Fernando; García-Sánchez, Valle; Piqueras, Marta; Llaó, Jordina; Gisbert, Javier P; Cabré, Eduard; Domènech, Eugeni

    2018-02-01

    Fecal calprotectin (FC) correlates with clinical and endoscopic activity in ulcerative colitis (UC), and it is a good predictor of relapse. However, its use in clinical practice is constrained by the need for patients to deliver stool samples and for their handling and processing in the laboratory. The availability of handheld devices might broaden the use of FC in clinical practice. To evaluate the usefulness of a rapid semi-quantitative test of FC in predicting relapse in patients with UC in remission. Prospective, multicenter study that included UC patients in clinical remission for ≥6 months on maintenance treatment with mesalamine. Patients were evaluated clinically, and semi-quantitative FC was measured using a monoclonal immunochromatography rapid test at baseline and every three months until relapse or 12 months of follow-up. One hundred and ninety-one patients had at least one determination of FC. At the end of follow-up, 33 patients (17%) experienced clinical relapse. Endoscopic activity at baseline (p = .043) and having had at least one FC > 60 μg/g during the study period (p = .03) were associated with a higher risk of relapse during follow-up. We obtained a total of 636 semi-quantitative FC determinations matched with a three-month follow-up clinical assessment. An undetectable FC was inversely associated with early relapse (within three months), with a negative predictive value of 98.6% and a sensitivity of 93.9%. Serial, rapid semi-quantitative measurement of FC may be a useful, easy and cheap monitoring tool for patients with UC in remission.
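
    The predictive statistics reported above (negative predictive value and sensitivity) come from a standard 2x2 contingency calculation, which can be sketched as follows (the counts below are illustrative, not the study's data):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of relapses that were preceded by a positive
    (detectable) FC determination."""
    return true_pos / (true_pos + false_neg)

def negative_predictive_value(true_neg, false_neg):
    """Fraction of negative (undetectable) FC determinations that were
    NOT followed by early relapse."""
    return true_neg / (true_neg + false_neg)

# Illustrative counts only: 31 relapses flagged by FC, 2 missed,
# and 140 negative determinations without subsequent relapse.
sens = sensitivity(31, 2)                  # ~0.94
npv = negative_predictive_value(140, 2)    # ~0.99
```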

  18. Quantitative FE-EPMA measurement of formation and inhibition of carbon contamination on Fe for trace carbon analysis.

    PubMed

    Tanaka, Yuji; Yamashita, Takako; Nagoshi, Masayasu

    2017-04-01

    Hydrocarbon contamination introduced during point, line and map analyses in field emission electron probe microanalysis (FE-EPMA) was investigated to enable reliable quantitative analysis of trace amounts of carbon in steels. The increment of contamination on pure iron in point analysis is proportional to the number of iterations of beam irradiation, but not to the accumulated irradiation time. A combination of a longer dwell time and a single measurement with a liquid-nitrogen (LN2) trap as an anti-contamination device (ACD) is sufficient for quantitative point analysis. However, in line and map analyses, contamination increases with irradiation time in addition to the number of iterations, even when the LN2 trap and a plasma cleaner are used as ACDs. Thus, a shorter dwell time and a single measurement are preferred for line and map analyses, although it is difficult to eliminate the influence of contamination. While ring-like contamination around the irradiation point grows during electron-beam irradiation, contamination at the irradiation point increases during the blanking time after irradiation. This can explain the increment of contamination in iterative point analysis as well as in line and map analyses. Among the ACDs tested in this study, specimen heating at 373 K had a significant contamination-inhibition effect. This technique makes it possible to obtain line and map analysis data with minimal influence of contamination. The above-mentioned FE-EPMA data are presented and discussed in terms of the contamination-formation mechanisms and the preferred experimental conditions for the quantification of trace carbon in steels. © The Author 2016. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Quantitative Species Measurements in Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.; Wood, William R.; Chen, Shin-Juh; Dahm, Werner J. A.; Piltch, Nancy D.

    2001-01-01

    Flame-vortex interactions are canonical configurations that can be used to study the underlying processes occurring in complicated turbulent reacting flows. The elegant simplicity of the flame-vortex interaction permits the study of these complex interactions under relatively controllable experimental configurations, in contrast to direct measurements in turbulent flames. The ability to measure and model the fundamental phenomena that occur in a turbulent flame, but with time and spatial scales which are amenable to our diagnostics, permits significant improvements in the understanding of turbulent combustion under both normal and reduced gravity conditions. In this paper, we report absolute mole fraction measurements of methane in a reacting vortex ring. These microgravity experiments are performed in the 2.2-sec drop tower at NASA Glenn Research Center. In collaboration with Drs. Chen and Dahm at the University of Michigan, measured methane absorbances are incorporated into a new model from which the temperature and concentrations of all major gases in the flame can be determined at all positions and times in the development of the vortex ring. This is the first demonstration of the ITAC (Iterative Temperature with Assumed Chemistry) approach, and the results of these computations and analyses are presented in a companion paper by Dahm and Chen at this Workshop. We believe that the ITAC approach will become a powerful tool in understanding a wide variety of combustion flames under both equilibrium and non-equilibrium conditions.

  20. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part II—Experimental Implementation

    PubMed Central

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model has been formulated (Part I). It integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on the direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features are accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table and roundness of a precision glass hemisphere are presented under a setup of repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as the practical use and the model capability to contribute in the improvement of current standard CMM measuring capabilities. PMID:27754441

  1. The importance of quantitative measurement methods for uveitis: laser flare photometry endorsed in Europe while neglected in Japan where the technology measuring quantitatively intraocular inflammation was developed.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur

    2017-06-01

    Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.

  2. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    PubMed

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
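
    The "percent leaf area covered by lesions" phenotype described above can be illustrated with a simple threshold-based sketch (the thresholds and image representation are assumptions of ours; the published workflow used an ImageJ batch macro):

```python
def percent_lesion_area(gray_image, leaf_thresh=200, lesion_thresh=120):
    """Percent of leaf pixels classified as lesion in a grayscale scan.
    Pixels at or above leaf_thresh count as scanner background; pixels
    at or below lesion_thresh count as lesion. Threshold values here
    are illustrative, not those of the published macro."""
    leaf_px = lesion_px = 0
    for row in gray_image:
        for px in row:
            if px >= leaf_thresh:          # white scanner background
                continue
            leaf_px += 1                   # part of the leaf
            if px <= lesion_thresh:
                lesion_px += 1             # dark lesion tissue
    return 100.0 * lesion_px / leaf_px if leaf_px else 0.0

# Tiny 2x3 "image": two lesion pixels out of four leaf pixels -> 50 %.
img = [[255, 150, 100],
       [ 90, 150, 255]]
pct = percent_lesion_area(img)    # 50.0
```

    Pycnidia counts and sizes would additionally require connected-component analysis, which ImageJ's particle analyzer provides.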

  3. Quantitative surface temperature measurement using two-color thermographic phosphors and video equipment

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M. (Inventor)

    1989-01-01

    A thermal imaging system provides quantitative temperature information and is particularly useful in hypersonic wind tunnel applications. An object to be measured is prepared by coating with a two-color, ultraviolet-activated, thermographic phosphor. The colors emitted by the phosphor are detected by a conventional color video camera. A phosphor emitting blue and green light with a ratio that varies depending on temperature is used so that the intensity of light in the blue and green wavelengths detected by the blue and green tubes in the video camera can be compared. Signals representing the intensity of blue and green light at points on the surface of a model in a hypersonic wind tunnel are used to calculate a ratio of blue to green light intensity which provides quantitative temperature information for the surface of the model.
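
    The blue-to-green ratio lookup described above can be sketched as follows (the calibration table is an invented placeholder; a real system would calibrate the phosphor's ratio against known temperatures):

```python
def temperature_from_ratio(blue, green, calibration):
    """Map a blue/green intensity ratio to temperature by linear
    interpolation in a monotonic calibration table [(ratio, temp_K), ...].
    Ratios outside the table are clamped to its end points."""
    r = blue / green
    pts = sorted(calibration)
    if r <= pts[0][0]:
        return pts[0][1]
    if r >= pts[-1][0]:
        return pts[-1][1]
    for (r0, t0), (r1, t1) in zip(pts, pts[1:]):
        if r0 <= r <= r1:
            return t0 + (t1 - t0) * (r - r0) / (r1 - r0)

# Hypothetical calibration points (ratio, temperature in kelvin).
cal = [(0.5, 300.0), (1.0, 350.0), (2.0, 450.0)]
t = temperature_from_ratio(3.0, 2.0, cal)    # ratio 1.5 -> 400.0 K
```

    Applying this per pixel to the blue and green video channels yields a quantitative surface-temperature map.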

  4. Counterfactual statements and weak measurements: an experimental proposal

    NASA Astrophysics Data System (ADS)

    Mølmer, Klaus

    2001-12-01

    A recent analysis suggests that weak measurements can be used to give observational meaning to counterfactual reasoning in quantum physics. A weak measurement is predicted to assign a negative unit population to a specific state in an interferometric Gedankenexperiment proposed by Hardy. We propose an experimental implementation with trapped ions of the Gedankenexperiment and of the weak measurement. In our standard quantum mechanical analysis of the proposal no states have negative population, but we identify the registration of a negative population by particles being displaced on average in the direction opposite to a force acting upon them.

  5. Quantitative measurement of eyestrain on 3D stereoscopic display considering the eye foveation model and edge information.

    PubMed

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-05-15

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we take this position as the gaze position with the higher probability of being correct. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with and without compensation for saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than the other factors.

  6. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
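
    The "within two dilutions" agreement criterion used above can be expressed as a difference in two-fold dilution steps; a minimal sketch (our formulation, not the authors' code):

```python
import math

def within_n_dilutions(titer_a, titer_b, n=2):
    """True if two endpoint titers (given as reciprocal dilutions:
    40, 80, 160, ...) agree within n two-fold dilution steps."""
    return abs(math.log2(titer_a / titer_b)) <= n

# 1:160 vs 1:640 differ by exactly two dilution steps -> acceptable;
# 1:40 vs 1:640 differ by four steps -> discordant.
```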

  7. Quantitative Measurement of Local Infrared Absorption and Dielectric Function with Tip-Enhanced Near-Field Microscopy.

    PubMed

    Govyadinov, Alexander A; Amenabar, Iban; Huth, Florian; Carney, P Scott; Hillenbrand, Rainer

    2013-05-02

    Scattering-type scanning near-field optical microscopy (s-SNOM) and Fourier transform infrared nanospectroscopy (nano-FTIR) are emerging tools for nanoscale chemical material identification. Here, we push s-SNOM and nano-FTIR one important step further by enabling them to quantitatively measure local dielectric constants and infrared absorption. Our technique is based on an analytical model, which allows for a simple inversion of the near-field scattering problem. It yields the dielectric permittivity and absorption of samples with 2 orders of magnitude improved spatial resolution compared to far-field measurements and is applicable to a large class of samples including polymers and biological matter. We verify the capabilities by determining the local dielectric permittivity of a PMMA film from nano-FTIR measurements, which is in excellent agreement with far-field ellipsometric data. We further obtain local infrared absorption spectra with unprecedented accuracy in peak position and shape, which is the key to quantitative chemometrics on the nanometer scale.

  8. Modern Projection of the Old Electroscope for Nuclear Radiation Quantitative Work and Demonstrations

    ERIC Educational Resources Information Center

    Bastos, Rodrigo Oliveira; Boch, Layara Baltokoski

    2017-01-01

    Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple…

  9. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and consideration of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1 the average relative expanded uncertainty is 2.2 %. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as compounds that contribute to global warming and ozone depletion.
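
    The Beer's-law regression described above can be sketched for a single wavenumber, under simplifying assumptions (regression through the origin, no detector-nonlinearity correction; this is not NIST's exact procedure):

```python
import math

def absorption_coefficient(transmittances, conc_times_path):
    """Least-squares slope through the origin of -ln(T) versus
    concentration x path length, per Beer's law: -ln T = alpha * c * L.
    With c in umol/mol and L in m, alpha is in (umol/mol)^-1 m^-1."""
    y = [-math.log(t) for t in transmittances]
    x = conc_times_path
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Synthetic check: data generated with alpha = 2e-4 recovers ~2e-4.
cl = [100.0, 200.0, 300.0]               # c * L values (umol/mol * m)
T = [math.exp(-2e-4 * v) for v in cl]    # ideal transmittances
alpha = absorption_coefficient(T, cl)
```

    Repeating this fit at every wavenumber yields the absorption coefficient spectrum, and the regression residuals feed the uncertainty estimate.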

  10. Electrons, Photons, and Force: Quantitative Single-Molecule Measurements from Physics to Biology

    PubMed Central

    2011-01-01

    Single-molecule measurement techniques have illuminated unprecedented details of chemical behavior, including observations of the motion of a single molecule on a surface, and even the vibration of a single bond within a molecule. Such measurements are critical to our understanding of entities ranging from single atoms to the most complex protein assemblies. We provide an overview of the strikingly diverse classes of measurements that can be used to quantify single-molecule properties, including those of single macromolecules and single molecular assemblies, and discuss the quantitative insights they provide. Examples are drawn from across the single-molecule literature, ranging from ultrahigh vacuum scanning tunneling microscopy studies of adsorbate diffusion on surfaces to fluorescence studies of protein conformational changes in solution. PMID:21338175

  11. Quantitative Experimental Determination of Primer-Dimer Formation Risk by Free-Solution Conjugate Electrophoresis

    PubMed Central

    Desmarais, Samantha M.; Leitner, Thomas; Barron, Annelise E.

    2012-01-01

    DNA barcodes are short, unique ssDNA primers that “mark” individual biomolecules. To gain better understanding of biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis (FSCE) approach are useful as quantitatively precise input data to parameterize computation models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates were designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer-barcodes with fully or partially complementary sequences were annealed, and then separated by free-solution conjugate CE at different temperatures, to assess effects on primer-dimer formation. When less than 30 out of 30 basepairs were bonded, dimerization was inversely correlated to temperature. Dimerization occurred when more than 15 consecutive basepairs formed, yet non-consecutive basepairs did not create stable dimers even when 20 out of 30 possible basepairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments to establish a new mobility shift assay for detection of primer-dimer formation. PMID:22331820

  12. Three-dimensional simulations of plasma turbulence in the RFX-mod scrape-off layer and comparison with experimental measurements

    NASA Astrophysics Data System (ADS)

    Riva, Fabio; Vianello, Nicola; Spolaore, Monica; Ricci, Paolo; Cavazzana, Roberto; Marrelli, Lionello; Spagnolo, Silvia

    2018-02-01

    The tokamak scrape-off layer (SOL) plasma dynamics is investigated in a circular limiter configuration with a low edge safety factor. Focusing on the experimental parameters of two ohmic tokamak inner-wall limited plasma discharges in RFX-mod [Sonato et al., Fusion Eng. Des. 74, 97 (2005)], nonlinear SOL plasma simulations are performed with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The numerical results are compared with the experimental measurements, assessing the reliability of the GBS model in describing the RFX-mod SOL plasma dynamics. It is found that the simulations are able to quantitatively reproduce the RFX-mod experimental measurements of the electron plasma density, electron temperature, and ion saturation current density (jsat) equilibrium profiles. Moreover, there are indications that the turbulent transport is driven by the same instability in the simulations and in the experiment, with coherent structures having similar statistical properties. On the other hand, it is found that the simulation results are not able to correctly reproduce the floating potential equilibrium profile and the jsat fluctuation level. It is likely that these discrepancies are, at least in part, related to simulating only the tokamak SOL region, without including the plasma dynamics inside the last closed flux surface, and to the limits of applicability of the drift approximation. The turbulence drive is then identified from the nonlinear simulations and from the linear theory. The analysis indicates that the inertial drift wave is the instability driving most of the turbulent transport in the considered discharges.

  13. Airborne radar and radiometer experiment for quantitative remote measurements of rain

    NASA Technical Reports Server (NTRS)

    Kozu, Toshiaki; Meneghini, Robert; Boncyk, Wayne; Wilheit, Thomas T.; Nakamura, Kenji

    1989-01-01

    An aircraft experiment has been conducted with a dual-frequency (10 GHz and 35 GHz) radar/radiometer system and an 18-GHz radiometer to test various rain-rate retrieval algorithms from space. In the experiment, which took place in the fall of 1988 at the NASA Wallops Flight Facility, VA, both stratiform and convective storms were observed. A ground-based radar and rain gauges were also used to obtain truth data. An external radar calibration is made with rain gauge data, thereby enabling quantitative reflectivity measurements. Comparisons between path attenuations derived from the surface return and from the radar reflectivity profile are made to test the feasibility of a technique to estimate the raindrop size distribution from simultaneous radar and path-attenuation measurements.

  14. Accuracy in streamflow measurements on the Fernow Experimental Forest

    Treesearch

    James W. Hornbeck

    1965-01-01

    Measurement of streamflow from small watersheds on the Fernow Experimental Forest at Parsons, West Virginia was begun in 1951. Stream-gaging stations are now being operated on 9 watersheds ranging from 29 to 96 acres in size; and 91 watershed-years of record have been collected. To determine how accurately streamflow is being measured at these stations, several of the...

  15. Experimental technique for measuring the isentrope of hydrogen to several megabars

    NASA Astrophysics Data System (ADS)

    Barker, L. M.; Truncano, T. G.; Wise, J. I.; Asay, J. R.

    The experimental measurement of the equation of state (EOS) of hydrogen has been of interest for some time because of the theoretical expectation of a transition to the metallic state in the multi-megabar pressure regime. Previous experiments have reported results which are consistent with a metallic transition, but experimental uncertainties have precluded positive identification of the metallic phase. In this paper we describe a new experimental approach to the measurement of the high-pressure EOS of hydrogen. A cryogenic hydrogen specimen, either liquid or solid, is located in the muzzle of a gun barrel between a tungsten anvil and another tungsten disk called a shim. Helium gas in the gun barrel cushions the impact and allows nearly isentropic compression of the hydrogen. The time-resolved pressure in the specimen is calculated from a laser interferometer (VISAR) measurement of the acceleration history of the anvil's free surface, and volume measurements at specific times are made by combining VISAR data, which define the position of the anvil, with flash X-ray photographs, which define the shim position.

  16. Strain measurement on stiff structures: experimental evaluation of three integrated measurement principles

    NASA Astrophysics Data System (ADS)

    Rausch, J.; Hatzfeld, C.; Karsten, R.; Kraus, R.; Millitzer, J.; Werthschützky, R.

    2012-06-01

    This paper presents an experimental evaluation of three different strain-measuring principles. Mounted on a steel beam resembling a car engine mount, metal foil strain gauges, piezoresistive silicon strain gauges and piezoelectric patches are investigated to measure structure-borne forces to control an active mounting structure. FEA simulation determines the strains to be measured to lie in the range of 10−8 up to 10−5 m·m−1. These low strains cannot be measured with conventional metal foil strain gauges, as shown in the experiment conducted. Both piezoresistive and piezoelectric gauges show good results compared to a conventional piezoelectric force sensor. Depending on bandwidth, overload capacity and primary electronics costs, these principles seem to be worth considering in an adaptronic system design. These parameters are described in detail for the principles investigated.

  17. A quantitative brain map of experimental cerebral malaria pathology.

    PubMed

    Strangward, Patrick; Haley, Michael J; Shaw, Tovah N; Schwartz, Jean-Marc; Greig, Rachel; Mironov, Aleksandr; de Souza, J Brian; Cruickshank, Sheena M; Craig, Alister G; Milner, Danny A; Allan, Stuart M; Couper, Kevin N

    2017-03-01

    The murine model of experimental cerebral malaria (ECM) has been utilised extensively in recent years to study the pathogenesis of human cerebral malaria (HCM). However, it has been proposed that the aetiologies of ECM and HCM are distinct, and, consequently, no useful mechanistic insights into the pathogenesis of HCM can be obtained from studying the ECM model. Therefore, in order to determine the similarities and differences in the pathology of ECM and HCM, we have performed the first spatial and quantitative histopathological assessment of the ECM syndrome. We demonstrate that the accumulation of parasitised red blood cells (pRBCs) in brain capillaries is a specific feature of ECM that is not observed during mild murine malaria infections. Critically, we show that individual pRBCs appear to occlude murine brain capillaries during ECM. As pRBC-mediated congestion of brain microvessels is a hallmark of HCM, this suggests that the impact of parasite accumulation on cerebral blood flow may ultimately be similar in mice and humans during ECM and HCM, respectively. Additionally, we demonstrate that cerebrovascular CD8+ T-cells appear to co-localise with accumulated pRBCs, an event that corresponds with development of widespread vascular leakage. As in HCM, we show that vascular leakage is not dependent on extensive vascular destruction. Instead, we show that vascular leakage is associated with alterations in transcellular and paracellular transport mechanisms. Finally, as in HCM, we observed axonal injury and demyelination in ECM adjacent to diverse vasculopathies. Collectively, our data therefore show that, despite very different presentation, and apparently distinct mechanisms, of parasite accumulation, there appear to be a number of comparable features of cerebral pathology in mice and in humans during ECM and HCM, respectively. Thus, when used appropriately, the ECM model may be useful for studying specific pathological features of HCM.

  18. A quantitative brain map of experimental cerebral malaria pathology

    PubMed Central

    Schwartz, Jean-Marc; Greig, Rachel; Mironov, Aleksandr; de Souza, J. Brian; Cruickshank, Sheena M.; Craig, Alister G.; Milner, Danny A.; Allan, Stuart M.

    2017-01-01

    The murine model of experimental cerebral malaria (ECM) has been utilised extensively in recent years to study the pathogenesis of human cerebral malaria (HCM). However, it has been proposed that the aetiologies of ECM and HCM are distinct, and, consequently, no useful mechanistic insights into the pathogenesis of HCM can be obtained from studying the ECM model. Therefore, in order to determine the similarities and differences in the pathology of ECM and HCM, we have performed the first spatial and quantitative histopathological assessment of the ECM syndrome. We demonstrate that the accumulation of parasitised red blood cells (pRBCs) in brain capillaries is a specific feature of ECM that is not observed during mild murine malaria infections. Critically, we show that individual pRBCs appear to occlude murine brain capillaries during ECM. As pRBC-mediated congestion of brain microvessels is a hallmark of HCM, this suggests that the impact of parasite accumulation on cerebral blood flow may ultimately be similar in mice and humans during ECM and HCM, respectively. Additionally, we demonstrate that cerebrovascular CD8+ T-cells appear to co-localise with accumulated pRBCs, an event that corresponds with development of widespread vascular leakage. As in HCM, we show that vascular leakage is not dependent on extensive vascular destruction. Instead, we show that vascular leakage is associated with alterations in transcellular and paracellular transport mechanisms. Finally, as in HCM, we observed axonal injury and demyelination in ECM adjacent to diverse vasculopathies. Collectively, our data therefore show that, despite very different presentation, and apparently distinct mechanisms, of parasite accumulation, there appear to be a number of comparable features of cerebral pathology in mice and in humans during ECM and HCM, respectively. Thus, when used appropriately, the ECM model may be useful for studying specific pathological features of HCM. PMID:28273147

  19. Strain measurement of objects subjected to aerodynamic heating using digital image correlation: experimental design and preliminary results.

    PubMed

    Pan, Bing; Jiang, Tianyun; Wu, Dafang

    2014-11-01

    In thermomechanical testing of hypersonic materials and structures, direct observation and quantitative strain measurement of the front surface of a test specimen directly exposed to severe aerodynamic heating has been considered a very challenging task. In this work, a novel quartz infrared heating device with an observation window is designed to reproduce the transient thermal environment experienced by hypersonic vehicles. The specially designed experimental system allows the capture of the test article's surface images at various temperatures using an optical system outfitted with a bandpass filter. The captured images are post-processed by digital image correlation to extract full-field thermal deformation. To verify the viability and accuracy of the established system, thermal strains of a chromium-nickel austenite stainless steel sample heated from room temperature up to 600 °C were determined. The preliminary results indicate that the air disturbance between the camera and the specimen due to heat haze induces apparent distortions in the recorded images and large errors in the measured strains, but the average values of the measured strains are accurate enough. Limitations and further improvements of the proposed technique are discussed.
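
    The displacement-tracking step of digital image correlation can be illustrated in one dimension: a subset of the reference intensity signal is located in the deformed signal by maximizing the zero-normalized cross-correlation. This is only a minimal sketch of the principle, not the authors' pipeline; real DIC uses 2-D subsets, subpixel interpolation, and differentiation to obtain strain, and all names and data below are illustrative.

```python
import math

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length intensity signals."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

def find_shift(reference, deformed, start, size, max_shift):
    """Integer displacement of the subset reference[start:start+size],
    found by exhaustive ZNCC search in the deformed signal."""
    subset = reference[start:start + size]
    return max(range(-max_shift, max_shift + 1),
               key=lambda s: zncc(subset, deformed[start + s:start + s + size]))
```

    In practice the correlation peak is interpolated to subpixel precision and the displacement field is differentiated to yield strain; heat-haze distortion perturbs exactly this correlation step, which is why averaging over many subsets recovers accurate mean strains.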

  20. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William B.

    1997-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation.
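
    The "computing the combined amounts of oil and gas present using Archie's Equations" step can be sketched as follows. The constants a, m, n below are assumed textbook defaults (tortuosity, cementation, and saturation exponents), not values taken from the patent.

```python
def water_saturation(rt, phi, rw, a=1.0, m=2.0, n=2.0):
    """Archie's equation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n),
    from formation resistivity Rt, porosity phi, and water resistivity Rw."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def hydrocarbon_saturation(rt, phi, rw, **kw):
    """Fraction of pore volume occupied by oil and gas combined (1 - Sw)."""
    return 1.0 - min(1.0, water_saturation(rt, phi, rw, **kw))
```

    This yields only the combined hydrocarbon saturation; apportioning it between oil and gas requires the additional cased-hole measurements the patent describes.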

  1. Reineke’s stand density index: a quantitative and non-unitless measure of stand density

    Treesearch

    Curtis L. VanderSchaaf

    2013-01-01

    When used as a measure of relative density, Reineke’s stand density index (SDI) can be made unitless by relating the current SDI to a standard density, but when used as a quantitative measure of stand density, SDI is not unitless. Reineke’s SDI relates the current stand density to an equivalent number of trees per unit area in a stand with a quadratic mean diameter (Dq)...
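
    The two uses of SDI described above can be illustrated with Reineke's original (imperial) form, which carries units of trees per acre at a 10-inch reference quadratic mean diameter; dividing by a species maximum SDI then yields a unitless relative density. The maximum SDI used below is a hypothetical value, not one from the article.

```python
def reineke_sdi(trees_per_acre, dq_inches):
    """Reineke's SDI: equivalent trees per acre in a stand whose quadratic
    mean diameter is the 10-inch reference (exponent 1.605)."""
    return trees_per_acre * (dq_inches / 10.0) ** 1.605

def relative_density(trees_per_acre, dq_inches, max_sdi=450.0):
    """Unitless relative density: current SDI over an assumed species maximum."""
    return reineke_sdi(trees_per_acre, dq_inches) / max_sdi
```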

  2. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  3. Evaluation of quantitative PCR measurement of bacterial colonization of epithelial cells.

    PubMed

    Schmidt, Marcin T; Olejnik-Schmidt, Agnieszka K; Myszka, Kamila; Borkowska, Monika; Grajek, Włodzimierz

    2010-01-01

    Microbial colonization is an important step in establishing pathogenic or probiotic relations to host cells and in biofilm formation on industrial or medical devices. The aim of this work was to verify the applicability of quantitative PCR (Real-Time PCR) to measure bacterial colonization of epithelial cells. Salmonella enterica and the Caco-2 intestinal epithelial cell line were used as a model. To verify the sensitivity of the assay, competition between the pathogen and a probiotic microorganism was tested. The qPCR method was compared to the plate count and radiolabel approaches, which are well-established techniques in this area of research. The three methods returned similar results. The radiolabel method had the best quantification accuracy, followed by qPCR. The plate count results showed a coefficient of variation twice as high as that of qPCR. Quantitative PCR proved to be a reliable method for enumerating microbes in colonization assays. It has several advantages that make it very useful when analyzing mixed populations, where several different species or even strains can be monitored at the same time.
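
    The precision comparison above rests on the coefficient of variation of replicate counts. A minimal helper (the replicate values in the test are hypothetical, not the study's data):

```python
import math

def coefficient_of_variation(xs):
    """CV = population standard deviation divided by the mean."""
    mean = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return sd / mean
```

    Computing this for replicate colonization counts from each method is how a "twice as high" CV claim would be quantified.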

  4. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    PubMed

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.

  5. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    PubMed

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated with p < 0.001. The strongest correlation with the motor function measure and its D1-subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values significantly correlated with all clinical assessments with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
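
    The mean fat fraction from a 2-point Dixon acquisition follows from the in-phase (water + fat) and opposed-phase (water − fat) signal magnitudes. A minimal sketch of that relation, ignoring the T2* decay and phase errors that practical Dixon reconstructions must correct:

```python
def dixon_fat_fraction(in_phase, opposed_phase):
    """Two-point Dixon: IP = W + F and OP = W - F, so F = (IP - OP) / 2
    and the fat fraction is F / (F + W)."""
    fat = (in_phase - opposed_phase) / 2.0
    water = (in_phase + opposed_phase) / 2.0
    return fat / (fat + water)
```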

  6. Retrieving the Quantitative Chemical Information at Nanoscale from Scanning Electron Microscope Energy Dispersive X-ray Measurements by Machine Learning

    NASA Astrophysics Data System (ADS)

    Jany, B. R.; Janas, A.; Krok, F.

    2017-11-01

    The quantitative composition of metal alloy nanowires on InSb(001) semiconductor surface and gold nanostructures on germanium surface is determined by blind source separation (BSS) machine learning (ML) method using non negative matrix factorization (NMF) from energy dispersive X-ray spectroscopy (EDX) spectrum image maps measured in a scanning electron microscope (SEM). The BSS method blindly decomposes the collected EDX spectrum image into three source components, which correspond directly to the X-ray signals coming from the supported metal nanostructures, bulk semiconductor signal and carbon background. The recovered quantitative composition is validated by detailed Monte Carlo simulations and is confirmed by separate cross-sectional TEM EDX measurements of the nanostructures. This shows that SEM EDX measurements together with machine learning blind source separation processing could be successfully used for the nanostructures quantitative chemical composition determination.

  7. Quantitative echocardiographic measures in the assessment of single ventricle function post-Fontan: Incorporation into routine clinical practice.

    PubMed

    Rios, Rodrigo; Ginde, Salil; Saudek, David; Loomba, Rohit S; Stelter, Jessica; Frommelt, Peter

    2017-01-01

    Quantitative echocardiographic measurements of single ventricular (SV) function have not been incorporated into routine clinical practice. A clinical protocol, which included quantitative measurements of SV deformation (global circumferential and longitudinal strain and strain rate), standard deviation of time to peak systolic strain, myocardial performance index (MPI), dP/dT from an atrioventricular valve regurgitant jet, and superior mesenteric artery resistance index, was instituted for all patients with a history of Fontan procedure undergoing echocardiography. All measures were performed in real time during clinically indicated studies and were included in clinical reports. A total of 100 consecutive patients (mean age = 11.95±6.8 years, range 17 months-31.3 years) completed the protocol between September 1, 2014 and April 29, 2015. Deformation measures were completed in 100% of the studies, MPI in 93%, dP/dT in 55%, and superior mesenteric artery Doppler in 82%. The studies were reviewed to assess for efficiency in completing the protocol. The average time for image acquisition was 27.4±8.8 minutes (range 10-62 minutes). The average time to perform deformation measures was 10.8±5.5 minutes (range 5-35 minutes) and time from beginning of imaging to report completion was 53.4±13.7 minutes (range 27-107 minutes). There was excellent inter-observer reliability when deformation indices were blindly repeated. Patients with a single left ventricle had significantly higher circumferential strain and strain rate, longitudinal strain and strain rate, and dP/dT compared to those with a single right ventricle. There were no differences in quantitative indices of ventricular function between patients <10 vs. >10 years post-Fontan. Advanced quantitative assessment of SV function post-Fontan can be consistently and efficiently performed in real time during clinically indicated echocardiograms with excellent reliability. © 2016, Wiley Periodicals, Inc.

  8. Quantitative measurements of intercellular adhesion between a macrophage and cancer cells using a cup-attached AFM chip.

    PubMed

    Kim, Hyonchol; Yamagishi, Ayana; Imaizumi, Miku; Onomura, Yui; Nagasaki, Akira; Miyagi, Yohei; Okada, Tomoko; Nakamura, Chikashi

    2017-07-01

    Intercellular adhesion between a macrophage and cancer cells was quantitatively measured using atomic force microscopy (AFM). Cup-shaped metal hemispheres were fabricated using polystyrene particles as a template, and a cup was attached to the apex of the AFM cantilever. The cup-attached AFM chip (cup-chip) was made to approach a murine macrophage cell (J774.2); the cell was captured on the inner concave surface of the cup and picked up by withdrawing the cup-chip from the substrate. The cell-attached chip was advanced towards a murine breast cancer cell (FP10SC2), and intercellular adhesion between the two cells was quantitatively measured. To compare cell adhesion strength, the work required to separate two adhered cells (separation work) was used as a parameter. Separation work was almost 2-fold larger between a J774.2 cell and an FP10SC2 cell than between a J774.2 cell and three additional different cancer cells (4T1E, MAT-LyLu, and U-2OS), two FP10SC2 cells, or two J774.2 cells. FP10SC2 was established from 4T1E as a highly metastatic cell line, indicating that separation work increased as the malignancy of the cancer cells became higher. One possible explanation of the strong adhesion of macrophages to cancer cells observed in this study is that the measurement conditions mimicked the microenvironment of tumor-associated macrophages (TAMs) in vivo, and that J774.2 cells strongly expressed CD204, which is a marker of TAMs. The results of the present study, which were obtained by measuring cell adhesion strength quantitatively, indicate that the fabricated cup-chip is a useful tool for measuring intercellular adhesion easily and quantitatively. Copyright © 2017 Elsevier B.V. All rights reserved.
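
    The "separation work" parameter is, in essence, the area enclosed by the adhesion portion of the force-distance retract curve. A hedged sketch using trapezoidal integration; the units convention and the sample curve in the test are illustrative, not data from the study:

```python
def separation_work(distance_nm, force_nN):
    """Trapezoidal integral of the (negative) adhesion force over tip-sample
    distance. With distance in nm and force in nN, the result is in
    attojoules, since 1 nN * 1 nm = 1e-18 J = 1 aJ."""
    work = 0.0
    for i in range(1, len(distance_nm)):
        dz = distance_nm[i] - distance_nm[i - 1]
        work += 0.5 * (force_nN[i] + force_nN[i - 1]) * dz
    return abs(work)
```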

  9. The quantitative and condition-dependent Escherichia coli proteome

    PubMed Central

    Schmidt, Alexander; Kochanowski, Karl; Vedelaar, Silke; Ahrné, Erik; Volkmer, Benjamin; Callipo, Luciano; Knoops, Kèvin; Bauer, Manuel; Aebersold, Ruedi; Heinemann, Matthias

    2016-01-01

    Measuring precise concentrations of proteins can provide insights into biological processes. Here, we use efficient protein extraction and sample fractionation and state-of-the-art quantitative mass spectrometry techniques to generate a comprehensive, condition-dependent protein abundance map of Escherichia coli. We measure cellular protein concentrations for 55% of predicted E. coli genes (>2300 proteins) under 22 different experimental conditions and identify methylation and N-terminal protein acetylations previously not known to be prevalent in bacteria. We uncover system-wide proteome allocation, expression regulation, and post-translational adaptations. These data provide a valuable resource for the systems biology and broader E. coli research communities. PMID:26641532

  10. Experimental study of stratified jet by simultaneous measurements of velocity and density fields

    NASA Astrophysics Data System (ADS)

    Xu, Duo; Chen, Jun

    2012-07-01

    Stratified flows with small density difference commonly exist in geophysical and engineering applications, which often involve interaction of turbulence and buoyancy effects. A combined particle image velocimetry (PIV) and planar laser-induced fluorescence (PLIF) system is developed to measure the velocity and density fields in a dense jet discharged horizontally into a tank filled with light fluid. The illumination of PIV particles and excitation of PLIF dye are achieved by a dual-head pulsed Nd:YAG laser and two CCD cameras with a set of optical filters. The procedure for matching refractive indexes of the two fluids and calibration of the combined system are presented, as well as a quantitative analysis of the measurement uncertainties. The flow structures and mixing dynamics within the central vertical plane are studied by examining the averaged parameters, turbulent kinetic energy budget, and modeling of momentum flux and buoyancy flux. Downstream, profiles of velocity and density display strong asymmetry with respect to the jet centerline. This is attributed to the fact that stable stratification reduces mixing and unstable stratification enhances mixing. In the stably stratified region, most of the turbulence production is consumed by mean-flow convection, whereas in the unstably stratified region, turbulence production is nearly balanced by viscous dissipation. Experimental data also indicate that, at downstream locations, the mixing-length model performs better in the mixing zone of the stably stratified region, whereas in other regions eddy viscosity/diffusivity models with static model coefficients effectively represent the momentum and buoyancy flux terms. The measured turbulent Prandtl number displays strong spatial variation in the stratified jet.

  11. From information theory to quantitative description of steric effects.

    PubMed

    Alipour, Mojtaba; Safari, Zahra

    2016-07-21

    Immense efforts have been made in the literature to apply information theory descriptors to the electronic structure theory of various systems. In the present study, information theoretic quantities, such as Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, have been used to present a quantitative description of one of the most widely used concepts in chemistry, namely steric effects. Taking the experimental steric scales for different compounds as benchmark sets, there are reasonable linear relationships between the experimental scales of the steric effects and the theoretical values of steric energies calculated from information theory functionals. Examining the results obtained from the information theoretic quantities with the two representations of electron density and shape function, the Shannon entropy has the best performance for this purpose. The usefulness of considering the steric energies and geometries of functional groups, and of dissecting the effects of global and local information measures simultaneously, has also been explored. Furthermore, the utility of the information functionals for the description of steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, has been analyzed. The functionals of information theory correlate remarkably with the stability of systems and with experimental scales. Overall, these findings show that information theoretic quantities can be introduced as quantitative measures of steric effects and provide further evidence of the capability of information theory to help theoreticians and experimentalists interpret different problems in real systems.

  12. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, or compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655

  13. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, or compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
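
    A simplified proxy for the regression-based stability measure described above (not the authors' exact ANOVA model): regress each candidate gene on the per-sample mean of all candidates, in the spirit of Finlay-Wilkinson stability analysis from plant breeding, and rank genes by residual variance. The gene names in the test are hypothetical.

```python
def stability_residual(gene, env_mean):
    """Residual mean square of a gene's expression regressed on the
    per-sample mean expression of all candidates (smaller = more stable)."""
    n = len(gene)
    mx = sum(env_mean) / n
    my = sum(gene) / n
    sxx = sum((x - mx) ** 2 for x in env_mean)
    sxy = sum((x - mx) * (y - my) for x, y in zip(env_mean, gene))
    b = sxy / sxx                      # regression slope
    a = my - b * mx                    # intercept
    return sum((y - (a + b * x)) ** 2 for x, y in zip(env_mean, gene)) / n

def rank_reference_genes(expr):
    """expr: dict of gene -> expression values over the same samples.
    Returns gene names sorted from most to least stable."""
    n = len(next(iter(expr.values())))
    env_mean = [sum(v[i] for v in expr.values()) / len(expr) for i in range(n)]
    return sorted(expr, key=lambda g: stability_residual(expr[g], env_mean))
```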

  14. Experimental measurement of flexion-extension movement in normal and corpse prosthetic elbow joint.

    PubMed

    Tarniță, Daniela; Tarniță, Dănuț Nicolae

    2016-01-01

    This paper presents a comparative experimental study of flexion-extension movement in the healthy elbow and in a prosthetic elbow joint fixed on an original experimental bench. Measurements were carried out in order to validate the functional morphology of a new ball-head type elbow prosthesis. The three-dimensional (3D) model and the physical prototype of our experimental bench, used to test the elbow endoprosthesis in flexion-extension and pronation-supination movements, are presented. The measurements were carried out on a group of nine healthy subjects and on the prosthetic corpse elbow, the experimental data being obtained for flexion-extension movement cycles. Experimental data for the two different flexion-extension tests for the nine subjects and for the corpse prosthetic elbow were acquired using the SimiMotion video system. The experimental data were processed statistically. The corresponding graphs were obtained for all subjects in the experimental group, and for the corpse prosthetic elbow, for both flexion-extension tests. The statistical analysis showed that the flexion angles of healthy elbows were close to the values measured at the prosthetic elbow fixed on the experimental bench. The studied elbow prosthesis manages to re-establish mobility of the elbow joint close to that of the normal one.

  15. Quantitative Measures of Swallowing Deficits in Patients With Parkinson's Disease.

    PubMed

    Ellerston, Julia K; Heller, Amanda C; Houtz, Daniel R; Kendall, Katherine A

    2016-05-01

    Dysphagia and associated aspiration pneumonia are commonly reported sequelae of Parkinson's disease (PD). Previous studies of swallowing in patients with PD have described prolonged pharyngeal transit time, delayed onset of pharyngeal transit, cricopharyngeal (CP) achalasia, reduced pharyngeal constriction, and slowed hyolaryngeal elevation. These studies used inconsistent evaluation methodology, relied on qualitative analysis, and lacked a large control group, raising concerns about diagnostic precision. The purpose of this study was to investigate swallowing function in patients with PD using a norm-referenced, quantitative approach. This retrospective study includes 34 patients with a diagnosis of PD referred to a multidisciplinary voice and swallowing clinic. Modified barium swallow studies were performed using quantitative measures of pharyngeal transit time, hyoid displacement, CP sphincter opening, area of the pharynx at maximal constriction, and timing of laryngeal vestibule closure relative to bolus arrival at the CP sphincter. Reduced pharyngeal constriction was found in 30.4% of patients, and a delay in airway closure relative to arrival of the bolus at the CP sphincter was the most common abnormality, present in 62% of patients. Previously reported findings of prolonged pharyngeal transit, poor hyoid elevation, and CP achalasia were not identified as prominent features. © The Author(s) 2015.

  16. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels.

    PubMed

    Arnold, Benjamin F; van der Laan, Mark J; Hubbard, Alan E; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L; Moss, Delynn M; Nutman, Thomas B; Priest, Jeffrey W; Lammie, Patrick J

    2017-05-01

    Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman's rho = 0.75). 
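
    Comparisons such as the reported Spearman's rho = 0.75 between mean antibody levels and the entomological inoculation rate rest on rank correlation. A self-contained sketch (average ranks for ties, then Pearson correlation of the ranks):

```python
def _ranks(xs):
    """1-based average ranks, with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx = sum(rx) / len(rx)
    my = sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```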

  17. A quantitative ELISA procedure for the measurement of membrane-bound platelet-associated IgG (PAIgG).

    PubMed

    Lynch, D M; Lynch, J M; Howe, S E

    1985-03-01

    A quantitative ELISA assay for the measurement of in vivo bound platelet-associated IgG (PAIgG) using intact patient platelets is presented. The assay requires quantitation and standardization of the number of platelets bound to microtiter plate wells and an absorbance curve using quantitated IgG standards. Platelet-bound IgG was measured using an F(ab')2 peroxidase labeled anti-human IgG and o-phenylenediamine dihydrochloride (OPD) as the substrate. Using this assay, PAIgG for normal individuals was 2.8 +/- 1.6 fg/platelet (mean +/- 1 SD; n = 30). Increased levels were found in 28 of 30 patients with clinical autoimmune thrombocytopenia (ATP) with a range of 7.0-80 fg/platelet. Normal PAIgG levels were found in 26 of 30 patients with nonimmune thrombocytopenia. In the sample population studied, the PAIgG assay showed a sensitivity of 93%, specificity of 90%, a positive predictive value of 0.90, and a negative predictive value of 0.93. The procedure is highly reproducible (CV = 6.8%) and useful in evaluating patients with suspected immune mediated thrombocytopenia.
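
    The reported sensitivity, specificity, and predictive values follow directly from the 2×2 confusion counts of the assay against the clinical diagnosis. A sketch with hypothetical counts (not the study's exact tallies):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard diagnostic-test metrics from true/false positive and
    negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of diseased detected
        "specificity": tn / (tn + fp),  # fraction of healthy correctly negative
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```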

  18. Remote measurements of the atmosphere using Raman scattering.

    NASA Technical Reports Server (NTRS)

    Melfi, S. H.

    1972-01-01

    Raman optical radar measurements of the atmosphere demonstrate that the technique may be used to obtain quantitative measurements of the spatial distribution of individual atmospheric molecular trace constituents (in particular water vapor) and of the major constituents. It is shown that monitoring Raman signals from atmospheric nitrogen aids in interpreting elastic scattering measurements by eliminating attenuation effects. In general, the experimental results show good agreement with independent meteorological measurements. Finally, experimental data are utilized to estimate the Raman backscatter cross section for water vapor excited at 3471.5 A.

  19. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  20. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
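
    The notion of a partially coherent wave field, i.e. a finite-band spectrum with random Fourier phases, can be sketched by synthesizing such a field and computing its fourth standardized moment: heavy tails of the kind described above push the kurtosis well beyond the Gaussian value of 3. This toy uses a flat narrow band and linear superposition only, not a JONSWAP spectrum or nonlinear evolution, so it stays near-Gaussian.

```python
import math
import random

def random_phase_field(n_samples, modes, seed=0):
    """Surface elevation as a sum of equal-amplitude Fourier modes with
    independent random phases (a crude stand-in for a finite-band spectrum)."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(modes)]
    freqs = [0.05 + 0.01 * k for k in range(modes)]  # narrow frequency band
    return [sum(math.cos(2.0 * math.pi * f * t + p) for f, p in zip(freqs, phases))
            for t in range(n_samples)]

def kurtosis(xs):
    """Fourth standardized moment; 3 for a Gaussian process, larger when the
    amplitude distribution develops heavy tails."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return sum((x - m) ** 4 for x in xs) / len(xs) / var ** 2
```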

  1. Quantitative and Sensitive Detection of Chloramphenicol by Surface-Enhanced Raman Scattering

    PubMed Central

    Ding, Yufeng; Yin, Hongjun; Meng, Qingyun; Zhao, Yongmei; Liu, Luo; Wu, Zhenglong; Xu, Haijun

    2017-01-01

    We used surface-enhanced Raman scattering (SERS) for the quantitative and sensitive detection of chloramphenicol (CAP). Using 30 nm colloidal Au nanoparticles (NPs), a low detection limit for CAP of 10^-8 M was obtained. The characteristic Raman peak of CAP centered at 1344 cm^-1 was used for the rapid quantitative detection of CAP in three different types of CAP eye drops, and the accuracy of the measurement result was verified by high-performance liquid chromatography (HPLC). The experimental results reveal that the SERS technique based on colloidal Au NPs is accurate and sensitive, and can be used for the rapid detection of various antibiotics. PMID:29261161

  2. Piezoelectric tuning fork biosensors for the quantitative measurement of biomolecular interactions

    NASA Astrophysics Data System (ADS)

    Gonzalez, Laura; Rodrigues, Mafalda; Benito, Angel Maria; Pérez-García, Lluïsa; Puig-Vidal, Manel; Otero, Jorge

    2015-12-01

    The quantitative measurement of biomolecular interactions is of great interest in molecular biology. Atomic force microscopy (AFM) has proved its capacity to act as a biosensor and determine the affinity between biomolecules of interest. Nevertheless, the detection scheme presents certain limitations when it comes to developing a compact biosensor. Recently, piezoelectric quartz tuning forks (QTFs) have been used as laser-free detection sensors for AFM. However, only a few studies along these lines have considered soft biological samples, and even fewer constitute quantified molecular recognition experiments. Here, we demonstrate the capacity of QTF probes to perform specific interaction measurements between biotin-streptavidin complexes in buffer solution. We propose in this paper a variant of dynamic force spectroscopy based on representing adhesion energies E (aJ) against pulling rates v (nm s-1). Our results are compared with conventional AFM measurements and show the great potential of these sensors in molecular interaction studies.

  3. Atom probe trajectory mapping using experimental tip shape measurements.

    PubMed

    Haley, D; Petersen, T; Ringer, S P; Smith, G D W

    2011-11-01

    Atom probe tomography is an accurate analytical and imaging technique which can reconstruct the complex structure and composition of a specimen in three dimensions. Despite providing locally high spatial resolution, atom probe tomography suffers from global distortions due to a complex projection function between the specimen and detector which is different for each experiment and can change during a single run. To aid characterization of this projection function, this work demonstrates a method for the reverse projection of ions from an arbitrary projection surface in 3D space back to an atom probe tomography specimen surface. Experimental data from transmission electron microscopy tilt tomography are combined with point cloud surface reconstruction algorithms and finite element modelling to generate a mapping back to the original tip surface in a physically and experimentally motivated manner. As a case study, aluminium tips are imaged using transmission electron microscopy before and after atom probe tomography, and the specimen profiles used as input in surface reconstruction methods. This reconstruction method is a general procedure that can be used to generate mappings between a selected surface and a known tip shape using numerical solutions to the electrostatic equation, with quantitative solutions to the projection problem readily achievable in tens of minutes on a contemporary workstation. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.

  4. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    It was the primary purpose of our study to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using a validated software. Two observers interpreted the SPECT/CT findings in all patients twice, with a six-week interval between interpretations, in random order. Semi-quantitative and quantitative measurements were compared in terms of reliability. In addition, the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions, which should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliabilities for all femoral and acetabular regions independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between both measurement methods was shown for the distal femur, the proximal femur and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct anatomical regions. These were the proximal femur, the distal femur and the acetabular cup region. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable independent from the measurement method used. Three clinically relevant anatomical regions (proximal femoral
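    The comparison step described above — semi-quantitative Likert scores against 3D volumetric BTU values — reduces to a Pearson correlation over paired readings. A sketch with hypothetical data for a single anatomical region:

```python
import numpy as np

# Hypothetical paired readings for one region:
likert = np.array([2, 3, 5, 4, 7, 6, 8, 3, 5, 9], dtype=float)  # 0-10 colour scale
volumetric = np.array([1.1, 1.4, 2.6, 2.0, 3.9, 3.1,
                       4.4, 1.5, 2.5, 5.0])                     # quantified BTU

# Pearson's correlation between the two measurement methods.
r = np.corrcoef(likert, volumetric)[0, 1]
```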

  5. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    PubMed Central

    Lin, Yunfeng

    2015-01-01

    Bacteria such as Salmonella and E. coli present a great challenge in public health care in today's society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Utilizing Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOPs) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay has the ability to measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay's photothermal efficiency, in order to reduce the amount of GNPOPs in the assay and ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. Results suggest that our assay may serve as a promising candidate for qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  6. Quantitative biology: where modern biology meets physical sciences.

    PubMed

    Shekhar, Shashank; Zhu, Lian; Mazutis, Linas; Sgro, Allyson E; Fai, Thomas G; Podolski, Marija

    2014-11-05

    Quantitative methods and approaches have been playing an increasingly important role in cell biology in recent years. They involve making accurate measurements to test a predefined hypothesis in order to compare experimental data with predictions generated by theoretical models, an approach that has benefited physicists for decades. Building quantitative models in experimental biology not only has led to discoveries of counterintuitive phenomena but has also opened up novel research directions. To make the biological sciences more quantitative, we believe a two-pronged approach needs to be taken. First, graduate training needs to be revamped to ensure biology students are adequately trained in physical and mathematical sciences and vice versa. Second, students of both the biological and the physical sciences need to be provided adequate opportunities for hands-on engagement with the methods and approaches necessary to be able to work at the intersection of the biological and physical sciences. We present the annual Physiology Course organized at the Marine Biological Laboratory (Woods Hole, MA) as a case study for a hands-on training program that gives young scientists the opportunity not only to acquire the tools of quantitative biology but also to develop the necessary thought processes that will enable them to bridge the gap between these disciplines. © 2014 Shekhar, Zhu, Mazutis, Sgro, Fai, and Podolski. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. Correlation of visual in vitro cytotoxicity ratings of biomaterials with quantitative in vitro cell viability measurements.

    PubMed

    Bhatia, Sujata K; Yetter, Ann B

    2008-08-01

    Medical devices and implanted biomaterials are often assessed for biological reactivity using visual scores of cell-material interactions. In such testing, biomaterials are assigned cytotoxicity ratings based on visual evidence of morphological cellular changes, including cell lysis, rounding, spreading, and proliferation. For example, ISO 10993 cytotoxicity testing of medical devices allows the use of a visual grading scale. The present study compared visual in vitro cytotoxicity ratings to quantitative in vitro cytotoxicity measurements for biomaterials to determine the level of correlation between visual scoring and a quantitative cell viability assay. Biomaterials representing a spectrum of biological reactivity levels were evaluated, including organo-tin polyvinylchloride (PVC; a known cytotoxic material), ultra-high molecular weight polyethylene (a known non-cytotoxic material), and implantable tissue adhesives. Each material was incubated in direct contact with mouse 3T3 fibroblast cell cultures for 24 h. Visual scores were assigned to the materials using a 5-point rating scale; the scorer was blinded to the material identities. Quantitative measurements of cell viability were performed using a 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) colorimetric assay; again, the assay operator was blinded to material identities. The investigation revealed a high degree of correlation between visual cytotoxicity ratings and quantitative cell viability measurements; a Pearson's correlation gave a correlation coefficient of 0.90 between the visual cytotoxicity score and the percent viable cells. An equation relating the visual cytotoxicity score and the percent viable cells was derived. The results of this study are significant for the design and interpretation of in vitro cytotoxicity studies of novel biomaterials.
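    Deriving an equation relating visual score to percent viable cells, as described above, amounts to a least-squares fit over blinded paired observations. A sketch with hypothetical data (the numbers are illustrative, not the study's measurements):

```python
import numpy as np

# Hypothetical paired observations: visual grade (0-4) vs. MTT % viability.
score = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4], dtype=float)
viability = np.array([98, 95, 85, 80, 62, 58, 35, 30, 12, 8], dtype=float)

# Linear relation: viability ~= slope * score + intercept
slope, intercept = np.polyfit(score, viability, 1)
r = np.corrcoef(score, viability)[0, 1]   # expect a strong negative correlation
```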

  8. Assessment of Renal Hemodynamics and Oxygenation by Simultaneous Magnetic Resonance Imaging (MRI) and Quantitative Invasive Physiological Measurements.

    PubMed

    Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas

    2016-01-01

    In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not yet been systematically examined and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.

  9. The Reliability and Validity of Discrete and Continuous Measures of Psychopathology: A Quantitative Review

    ERIC Educational Resources Information Center

    Markon, Kristian E.; Chmielewski, Michael; Miller, Christopher J.

    2011-01-01

    In 2 meta-analyses involving 58 studies and 59,575 participants, we quantitatively summarized the relative reliability and validity of continuous (i.e., dimensional) and discrete (i.e., categorical) measures of psychopathology. Overall, results suggest an expected 15% increase in reliability and 37% increase in validity through adoption of a…

  10. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, W.B. III

    1997-05-27

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of the formation water present, computing the combined amounts of oil and gas present using Archie's equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. 7 figs.
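    Archie's equation, which the patent uses to compute the combined oil and gas content, relates water saturation to formation resistivity, formation-water resistivity, and porosity. A sketch using common default values for the empirical constants (a = 1, m = n = 2); the input resistivities and porosity are illustrative:

```python
def archie_water_saturation(Rt, Rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: water saturation Sw from formation resistivity Rt
    (ohm-m), formation-water resistivity Rw (ohm-m), and porosity phi.
    a, m, n are the usual empirical constants (common defaults assumed)."""
    return ((a * Rw) / (phi**m * Rt)) ** (1.0 / n)

Sw = archie_water_saturation(Rt=20.0, Rw=0.05, phi=0.25)
hydrocarbon_fraction = 1.0 - Sw   # combined oil + gas pore-volume fraction
```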

  11. Measurement of lung expansion with computed tomography and comparison with quantitative histology.

    PubMed

    Coxson, H O; Mayo, J R; Behzad, H; Moore, B J; Verburgt, L M; Staples, C A; Paré, P D; Hogg, J C

    1995-11-01

    The total and regional lung volumes were estimated from computed tomography (CT), and the pleural pressure gradient was determined by using the milliliters of gas per gram of tissue estimated from the X-ray attenuation values and the pressure-volume curve of the lung. The data show that CT accurately estimated the volume of the resected lobe but overestimated its weight by 24 +/- 19%. The volume of gas per gram of tissue was less in the gravity-dependent regions due to a pleural pressure gradient of 0.24 +/- 0.08 cmH2O/cm of descent in the thorax. The proportion of tissue to air obtained with CT was similar to that obtained by quantitative histology. We conclude that the CT scan can be used to estimate total and regional lung volumes and that measurements of the proportions of tissue and air within the thorax by CT can be used in conjunction with quantitative histology to evaluate lung structure.
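    The millilitres-of-gas-per-gram-of-tissue quantity used above follows from partitioning each voxel's CT number between air (-1000 HU) and tissue (about 0 HU). A sketch of that partition, assuming a tissue density of 1.065 g/mL (an assumed textbook value, not taken from the paper):

```python
def gas_per_gram(hu, tissue_density=1.065):
    """Millilitres of gas per gram of tissue in a voxel, from its mean CT
    number. HU scale assumed: air = -1000, tissue ~ 0; tissue_density in
    g/mL is an assumed value."""
    air_frac = -hu / 1000.0        # volume fraction of gas in the voxel
    tissue_frac = 1.0 - air_frac   # volume fraction of tissue
    return air_frac / (tissue_frac * tissue_density)

# e.g. a voxel at -800 HU is 80% gas and 20% tissue by volume.
ratio = gas_per_gram(-800.0)
```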

  12. Experimental measurement of energy harvesting with backpack

    NASA Astrophysics Data System (ADS)

    Pavelkova, Radka; Vala, David; Suranek, Pavel; Mahdal, Miroslav

    2017-08-01

    This article deals with energy harvesting systems, especially the energy harvesting backpack, which appears to be a convenient means of harvesting energy to power mobile sensors. Before starting the experiment it was necessary to verify whether this energy would be sufficient, to become acquainted with human kinematics, and to analyze the problem itself. For this purpose, motion capture technology from Xsens was used. Data on the positions of points on a moving person and on the backpack during walking were measured; these data were then used for the experimental realization of the energy harvesting backpack and as input data for a simulation in Simulink, which provided a comparison between theoretical assumptions and the practical implementation. When measuring the characteristics of the energy harvesting system, the problem of taking measurements on the backpack was solved by adapting a hydraulic cylinder as a source of suitable movement corresponding to the amplitude and frequency of human walking.

  13. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  14. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  15. An experimental abdominal pressure measurement device for child ATDs.

    DOT National Transportation Integrated Search

    1995-12-01

    An experimental device to measure the abdominal pressure in child-size Anthropomorphic Test Dummies (ATDs) during dynamic tests was developed. A description is provided of the two ATDs in which the device was installed, the CRABI six-month-old and th...

  16. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    PubMed

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects. Copyright © 2014 Elsevier B.V. All rights reserved.
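    The "specific strength" measure used in this study normalises force to the contractile (non-fatty) cross-sectional area derived from the quantitative MRI fat fraction. A minimal sketch of that normalisation; function names and units are illustrative:

```python
def contractile_csa(total_csa_cm2, fat_fraction):
    """Contractile (non-fat) cross-sectional area (cm^2) from the total CSA
    and the MRI-derived fat fraction (0..1)."""
    return total_csa_cm2 * (1.0 - fat_fraction)

def specific_strength(force_n, total_csa_cm2, fat_fraction):
    """Force normalised to contractile area (N/cm^2): a measure of the
    quality of the remaining muscle fibers."""
    return force_n / contractile_csa(total_csa_cm2, fat_fraction)
```

    With this normalisation, two muscles producing the same force but with different fat fractions get different quality scores, which is the distinction the study draws between hypertrophy and fatty infiltration.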

  17. Experimental demonstration of a measurement-based realisation of a quantum channel

    NASA Astrophysics Data System (ADS)

    McCutcheon, W.; McMillan, A.; Rarity, J. G.; Tame, M. S.

    2018-03-01

    We introduce and experimentally demonstrate a method for realising a quantum channel using the measurement-based model. Using a photonic setup and modifying the basis of single-qubit measurements on a four-qubit entangled cluster state, representative channels are realised for the case of a single qubit in the form of amplitude and phase damping channels. The experimental results match the theoretical model well, demonstrating the successful performance of the channels. We also show how other types of quantum channels can be realised using our approach. This work highlights the potential of the measurement-based model for realising quantum channels which may serve as building blocks for simulations of realistic open quantum systems.

  18. Experimental determination of entanglement with a single measurement.

    PubMed

    Walborn, S P; Souto Ribeiro, P H; Davidovich, L; Mintert, F; Buchleitner, A

    2006-04-20

    Nearly all protocols requiring shared quantum information--such as quantum teleportation or key distribution--rely on entanglement between distant parties. However, entanglement is difficult to characterize experimentally. All existing techniques for doing so, including entanglement witnesses or Bell inequalities, disclose the entanglement of some quantum states but fail for other states; therefore, they cannot provide satisfactory results in general. Such methods are fundamentally different from entanglement measures that, by definition, quantify the amount of entanglement in any state. However, these measures suffer from the severe disadvantage that they typically are not directly accessible in laboratory experiments. Here we report a linear optics experiment in which we directly observe a pure-state entanglement measure, namely concurrence. Our measurement set-up includes two copies of a quantum state: these 'twin' states are prepared in the polarization and momentum degrees of freedom of two photons, and concurrence is measured with a single, local measurement on just one of the photons.
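    For a pure two-qubit state, the concurrence observed in this experiment has a closed form, C = |⟨ψ| σy⊗σy |ψ*⟩|. A numerical sketch of that formula (an illustration of the quantity itself, not of the optical twin-copy measurement):

```python
import numpy as np

sy = np.array([[0, -1j], [1j, 0]])  # Pauli sigma_y

def concurrence_pure(psi):
    """Concurrence C = |<psi| sigma_y (x) sigma_y |psi*>| of a pure
    two-qubit state given as a length-4 amplitude vector."""
    psi = np.asarray(psi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    return abs(psi.conj() @ (np.kron(sy, sy) @ psi.conj()))

bell = [1, 0, 0, 1]      # (|00> + |11>)/sqrt(2): maximally entangled
product = [1, 0, 0, 0]   # |00>: separable
```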

  19. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
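    Stripped of the inhomogeneity correction, partial-volume model, and MRF neighborhood penalty, the core of such a MAP segmentation is a per-voxel assignment of intensities to the most probable tissue class. A toy sketch with hypothetical Gaussian class parameters and flat priors (so MAP reduces to maximum likelihood):

```python
import numpy as np

# Hypothetical class means and standard deviations for T1-like intensities.
means = {"CSF": 30.0, "GM": 80.0, "WM": 120.0}
stds  = {"CSF": 10.0, "GM": 12.0, "WM": 10.0}

def classify(intensity):
    """Most likely tissue class for a voxel intensity under Gaussian
    likelihoods and flat priors."""
    def log_lik(c):
        m, s = means[c], stds[c]
        return -0.5 * ((intensity - m) / s) ** 2 - np.log(s)
    return max(means, key=log_lik)

labels = [classify(v) for v in (25.0, 85.0, 125.0)]
```

    The full method replaces the flat prior with an MRF term over neighboring labels and adds partial-volume classes, but the per-voxel decision rule has this shape.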

  20. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  1. Quantitative Measurement of Eyestrain on 3D Stereoscopic Display Considering the Eye Foveation Model and Edge Information

    PubMed Central

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-01-01

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we determine this position as the gaze position that has a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with or without the compensation of eye saccade movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than other factors. PMID:24834910

  2. An improved method for quantitatively measuring the sequences of total organic carbon and black carbon in marine sediment cores

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoming; Zhu, Qing; Zhou, Qianzhi; Liu, Jinzhong; Yuan, Jianping; Wang, Jianghai

    2018-01-01

    Understanding the global carbon cycle is critical to uncovering the mechanisms of global warming and remediating its adverse effects on human activities. Organic carbon in marine sediments is an indispensable part of the global carbon reservoir in global carbon cycling. Evaluating such a reservoir calls for quantitative studies of marine carbon burial, which closely depend on quantifying total organic carbon and black carbon in marine sediment cores and subsequently on obtaining their high-resolution temporal sequences. However, the conventional methods for measuring the contents of total organic carbon or black carbon cannot resolve the following specific difficulties: (1) the very limited amount of each subsample versus the diverse analytical items; (2) a low and fluctuating recovery rate of total organic carbon or black carbon versus the reproducibility of carbon data; and (3) a large number of subsamples versus the need for rapid batch measurements. In this work, by (i) adopting customized disposable ceramic crucibles with micropore-controlled ability, (ii) developing self-made or customized facilities for the procedures of acidification and chemothermal oxidation, and (iii) optimizing the procedures and the carbon-sulfur analyzer, we have built a novel Wang-Xu-Yuan method (the WXY method) for measuring the contents of total organic carbon or black carbon in marine sediment cores, which includes the procedures of pretreatment, weighing, acidification, chemothermal oxidation and quantification, and can fully meet the requirements of establishing their high-resolution temporal sequences in terms of recovery, experimental efficiency, accuracy and reliability of the measurements, and homogeneity of samples. In particular, the use of disposable ceramic crucibles evidently simplifies the experimental procedure, which further results in very high recovery rates for total organic carbon and black carbon. This new technique may provide significant support for

  3. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  4. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on the combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with similar accuracy as that of the combination of cervical length and qualitative fibronectin. Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk
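    The risk strata reported above can be expressed as a simple decision rule over cervical length and quantitative fibronectin thresholds. An illustrative, non-clinical sketch using the thresholds quoted in the abstract (15/30 mm for cervical length; 10/50 ng/mL for fibronectin):

```python
def risk_stratum(cervical_length_mm, ffn_ng_ml):
    """Illustrative stratification following the strata described in the
    abstract. A sketch only -- not a clinical tool."""
    if cervical_length_mm > 30:
        return "low"                       # low risk regardless of fFN
    if 15 <= cervical_length_mm <= 30 and ffn_ng_ml < 50:
        return "low"                       # negative qualitative fFN
    if cervical_length_mm < 15 and ffn_ng_ml < 10:
        return "low"                       # extra group found by quantitative fFN
    return "elevated"
```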

  5. Experimental investigation of measurement-induced disturbance and time symmetry in quantum physics

    NASA Astrophysics Data System (ADS)

    Curic, D.; Richardson, M. C.; Thekkadath, G. S.; Flórez, J.; Giner, L.; Lundeen, J. S.

    2018-04-01

    Unlike regular time evolution governed by the Schrödinger equation, standard quantum measurement appears to violate time-reversal symmetry. Measurement creates random disturbances (e.g., collapse) that prevent back-tracing the quantum state of the system. The effect of these disturbances is explicit in the results of subsequent measurements. In this way, the joint result of sequences of measurements depends on the order in time in which those measurements are performed. One might expect that if the disturbance could be eliminated this time-ordering dependence would vanish. Following a recent theoretical proposal [Bednorz, Franke, and Belzig, New J. Phys. 15, 023043 (2013), 10.1088/1367-2630/15/2/023043], we experimentally investigate this dependence for a kind of measurement that creates an arbitrarily small disturbance: weak measurement. We perform various sequences of a set of polarization weak measurements on photons. We experimentally demonstrate that, although the weak measurements are minimally disturbing, their time ordering affects the outcome of the measurement sequence for quantum systems.
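The order dependence described above can be reproduced in a toy model (an illustrative numpy sketch, not the authors' optical experiment; the Kraus-operator form and the strength parameter are assumptions). A weak measurement of a qubit observable is modeled by near-identity Kraus operators, and the joint probability of two "+" outcomes still depends on the temporal ordering of the two weak measurements:

```python
import numpy as np

# Toy model: weak measurement of a qubit observable O with strength eps,
# via Kraus operators
#   K_± = (I ± eps*O) / sqrt(2*(1 + eps^2)),
# which satisfy K_+^† K_+ + K_-^† K_- = I whenever O^2 = I.
I2 = np.eye(2)
sz = np.diag([1.0, -1.0])                 # polarization, H/V basis
sx = np.array([[0.0, 1.0], [1.0, 0.0]])   # polarization, diagonal basis

def kraus(obs, eps, outcome):
    return (I2 + outcome * eps * obs) / np.sqrt(2.0 * (1.0 + eps**2))

def joint_prob(state, first, second, eps):
    """P(+,+) when `first` is weakly measured before `second`."""
    v = kraus(second, eps, +1) @ kraus(first, eps, +1) @ state
    return float(np.vdot(v, v).real)

psi = np.array([1.0, 0.0])  # |H>
eps = 0.5                   # smaller eps -> arbitrarily small disturbance

p_zx = joint_prob(psi, sz, sx, eps)  # measure sz weakly, then sx
p_xz = joint_prob(psi, sx, sz, eps)  # measure sx weakly, then sz
print(p_zx, p_xz)  # 0.45 vs 0.37: ordering still affects the statistics
```

Shrinking `eps` shrinks the disturbance per measurement, yet the two orderings only agree in the limit, which mirrors the paper's point that minimal disturbance does not remove time-ordering dependence.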

  6. Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.

    PubMed

    Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia

    2016-01-01

    A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is utilized for capturing the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow for calculating the best match of a sample's color to a European Pharmacopoeia reference color solution. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate the assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in 3-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay are found suitable for use in assessing color of drug substances and drug products and are comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, a visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of color of the sample to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows
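The "best match" step can be sketched as a nearest-neighbor search in L*a*b* space using the CIE76 color difference (a minimal illustration; the reference names and L*a*b* values below are made up, not Ph. Eur. data):

```python
import math

# CIE76 color difference: dE = sqrt((dL*)^2 + (da*)^2 + (db*)^2).
def delta_e(lab1, lab2):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def best_match(sample_lab, references):
    """Return the name of the reference solution closest to the sample."""
    return min(references, key=lambda name: delta_e(sample_lab, references[name]))

refs = {  # made-up L*a*b* values standing in for a reference color series
    "B5": (95.0, -1.0, 8.0),
    "B6": (97.0, -0.5, 4.0),
    "B7": (98.5, -0.2, 2.0),
}
sample = (96.8, -0.6, 4.5)
print(best_match(sample, refs))  # -> B6
```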

  7. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  8. Quantitative image fusion in infrared radiometry

    NASA Astrophysics Data System (ADS)

    Romm, Iliya; Cukurel, Beni

    2018-05-01

    Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
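The per-pixel estimation idea can be illustrated with a linear sketch (a simplification of the paper's nonlinear weighted least squares; the exposure model below is an assumption): each exposure of duration t_i records y_i = q*t_i + b, where q is the photoquantity of interest and b the detector bias, so solving for (q, b) jointly fuses the multi-exposure stack and subtracts the bias in one step:

```python
import numpy as np

# Illustrative per-pixel weighted least squares fusion.
def fuse_pixel(y, t, w):
    A = np.column_stack([t, np.ones_like(t)])   # design matrix [t_i, 1]
    W = np.diag(w)                              # per-exposure weights
    qb, *_ = np.linalg.lstsq(np.sqrt(W) @ A, np.sqrt(W) @ y, rcond=None)
    return qb  # (photoquantity q, bias b)

t = np.array([1.0, 2.0, 4.0, 8.0])   # exposure times
q_true, b_true = 3.0, 10.0
y = q_true * t + b_true              # noise-free synthetic pixel readings
q_est, b_est = fuse_pixel(y, t, np.ones_like(t))
print(q_est, b_est)  # recovers q = 3, b = 10
```

In practice the weights would encode the per-exposure noise and saturation behavior, which is where the tonal-fidelity advantage of fusing before subtracting comes from.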

  9. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels

    PubMed Central

    van der Laan, Mark J.; Hubbard, Alan E.; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L.; Moss, Delynn M.; Nutman, Thomas B.; Priest, Jeffrey W.; Lammie, Patrick J.

    2017-01-01

    Background Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. Methods/Principal findings We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P

  10. Insights into Spray Development from Metered-Dose Inhalers Through Quantitative X-ray Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mason-Smith, Nicholas; Duke, Daniel J.; Kastengren, Alan L.

    Typical methods to study pMDI sprays employ particle sizing or visible light diagnostics, which suffer in regions of high spray density. X-ray techniques can be applied to pharmaceutical sprays to obtain information unattainable by conventional particle sizing and light-based techniques. We present a technique for obtaining quantitative measurements of spray density in pMDI sprays. A monochromatic focused X-ray beam was used to perform quantitative radiography measurements in the near-nozzle region and plume of HFA-propelled sprays. Measurements were obtained with a temporal resolution of 0.184 ms and spatial resolution of 5 μm. Steady flow conditions were reached after around 30 ms for the formulations examined with the spray device used. Spray evolution was affected by the inclusion of ethanol in the formulation and unaffected by the inclusion of 0.1% drug by weight. Estimation of the nozzle exit density showed that vapour is likely to dominate the flow leaving the inhaler nozzle during steady flow. Quantitative measurements in pMDI sprays allow the determination of nozzle exit conditions that are difficult to obtain experimentally by other means. Measurements of these nozzle exit conditions can improve understanding of the atomization mechanisms responsible for pMDI spray droplet and particle formation.

  11. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
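One of the techniques mentioned, optimal scaling, has a simple closed form worth making explicit (a minimal sketch; the toy model/data vectors are illustrative): when data are in arbitrary units, the scale s minimizing the residual sum of squares SSE(s) = Σ_i (d_i − s·m_i)² is s* = (m·d)/(m·m), after which the scaled SSE compares only the shapes of model and data:

```python
import numpy as np

def optimal_scale(model, data):
    """Closed-form minimizer s* = (m . d) / (m . m) of sum (d - s*m)^2."""
    model, data = np.asarray(model, float), np.asarray(data, float)
    return float(model @ data / (model @ model))

def scaled_sse(model, data):
    """Residual sum of squares after optimal scaling of the model."""
    s = optimal_scale(model, data)
    r = np.asarray(data, float) - s * np.asarray(model, float)
    return float(r @ r)

model = [1.0, 2.0, 3.0]
data = [2.0, 4.0, 6.0]             # same shape as the model, arbitrary units
print(optimal_scale(model, data))  # -> 2.0
print(scaled_sse(model, data))     # -> 0.0: the shapes agree perfectly
```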

  12. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  13. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  14. Quantitative measurement of piezoelectric coefficient of thin film using a scanning evanescent microwave microscope.

    PubMed

    Zhao, Zhenli; Luo, Zhenlin; Liu, Chihui; Wu, Wenbin; Gao, Chen; Lu, Yalin

    2008-06-01

    This article describes a new approach to quantitatively measure the piezoelectric coefficients of thin films at the microscopic level using a scanning evanescent microwave microscope. This technique can resolve 10 pm deformation caused by the piezoelectric effect and has the advantages of high scanning speed, large scanning area, submicron spatial resolution, and simultaneous access to many other related properties. Results from the test measurements on the longitudinal piezoelectric coefficient of PZT thin film agree well with those from other techniques reported in the literature.

  15. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  16. Measuring Filament Orientation: A New Quantitative, Local Approach

    NASA Astrophysics Data System (ADS)

    Green, C.-E.; Dawson, J. R.; Cunningham, M. R.; Jones, P. A.; Novak, G.; Fissel, L. M.

    2017-09-01

    The relative orientation between filamentary structures in molecular clouds and the ambient magnetic field provides insight into filament formation and stability. To calculate the relative orientation, a measurement of filament orientation is first required. We propose a new method to calculate the orientation of the one-pixel-wide filament skeleton that is output by filament identification algorithms such as filfinder. We derive the local filament orientation from the direction of the intensity gradient in the skeleton image using the Sobel filter and a few simple post-processing steps. We call this the “Sobel-gradient method.” The resulting filament orientation map can be compared quantitatively on a local scale with the magnetic field orientation map to then find the relative orientation of the filament with respect to the magnetic field at each point along the filament. It can also be used for constructing radial profiles for filament width fitting. The proposed method facilitates automation in analyses of filament skeletons, which is imperative in this era of “big data.”
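A simplified version of the Sobel-gradient step can be sketched as follows (an illustrative numpy implementation, not the paper's code; the post-processing steps are omitted, and because the gradient vanishes exactly on a symmetric one-pixel ridge, the demo samples a pixel adjacent to the skeleton where the gradient points across the filament):

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # Sobel x kernel
KY = KX.T                                                    # Sobel y kernel

def convolve3(img, k):
    """Plain 3x3 convolution on the image interior (borders left at 0)."""
    out = np.zeros_like(img, dtype=float)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.sum(img[i-1:i+2, j-1:j+2] * k[::-1, ::-1])
    return out

def orientation_deg(img):
    """Local orientation: gradient angle + 90 deg, folded into [0, 180)."""
    gx, gy = convolve3(img, KX), convolve3(img, KY)
    return (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0

# A horizontal one-pixel-wide skeleton:
img = np.zeros((5, 7))
img[2, 1:6] = 1.0
theta = orientation_deg(img)
ang = theta[1, 3]  # just above the skeleton, where the gradient is nonzero
print(min(ang, 180.0 - ang))  # ~0 (mod 180): the filament runs horizontally
```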

  17. Blood-brain barrier permeability and monocyte infiltration in experimental allergic encephalomyelitis: a quantitative MRI study.

    PubMed

    Floris, S; Blezer, E L A; Schreibelt, G; Döpp, E; van der Pol, S M A; Schadee-Eestermans, I L; Nicolay, K; Dijkstra, C D; de Vries, H E

    2004-03-01

    Enhanced cerebrovascular permeability and cellular infiltration mark the onset of early multiple sclerosis lesions. So far, the precise sequence of these events and their role in lesion formation and disease progression remain unknown. Here we provide quantitative evidence that blood-brain barrier leakage is an early event and precedes massive cellular infiltration in the development of acute experimental allergic encephalomyelitis (EAE), the animal correlate of multiple sclerosis. Cerebrovascular leakage and monocyte infiltrates were separately monitored by quantitative in vivo MRI during the course of the disease. Magnetic resonance enhancement of the contrast agent gadolinium diethylenetriaminepentaacetate (Gd-DTPA), reflecting vascular leakage, occurred concomitantly with the onset of neurological signs and was already at a maximal level at this stage of the disease. Immunohistochemical analysis also confirmed the presence of serum-derived proteins such as fibrinogen around the brain vessels early in the disease, whereas no cellular infiltrates could be detected. MRI further demonstrated that Gd-DTPA leakage clearly preceded monocyte infiltration as imaged by the contrast agent based on ultrasmall particles of iron oxide (USPIO), which was maximal only during full-blown EAE. Ultrastructural and immunohistochemical investigation revealed that USPIOs were present in newly infiltrated macrophages within the inflammatory lesions. To validate the use of USPIOs as a non-invasive tool to evaluate therapeutic strategies, EAE animals were treated with the immunomodulator 3-hydroxy-3-methylglutaryl Coenzyme A reductase inhibitor, lovastatin, which ameliorated clinical scores. MRI showed that the USPIO load in the brain was significantly diminished in lovastatin-treated animals. Data indicate that cerebrovascular leakage and monocytic trafficking into the brain are two distinct processes in the development of inflammatory lesions during multiple sclerosis, which can

  18. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    PubMed

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  19. Multi-spectral digital holographic microscopy for enhanced quantitative phase imaging of living cells

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Kastl, Lena; Schnekenburger, Jürgen; Ketelhut, Steffi

    2018-02-01

    The main restrictions on using laser light in digital holographic microscopy (DHM) are coherence-induced noise and parasitic reflections in the experimental setup, which limit resolution and measurement accuracy. We explored whether the coherence properties of partially coherent light sources can be generated synthetically utilizing spectrally tunable lasers. The concept of the method is demonstrated by label-free quantitative phase imaging of living pancreatic tumor cells, utilizing an experimental configuration that includes a commercial microscope and a laser source with a broad tunable spectral range of more than 200 nm.

  20. The measurement of liver fat from single-energy quantitative computed tomography scans

    PubMed Central

    Cheng, Xiaoguang; Brown, J. Keenan; Guo, Zhe; Zhou, Jun; Wang, Fengzhe; Yang, Liqiang; Wang, Xiaohong; Xu, Li

    2017-01-01

    Background Studies of soft tissue composition using computed tomography (CT) scans are often semi-quantitative and based on Hounsfield units (HU) measurements that have not been calibrated with a quantitative CT (QCT) phantom. We describe a study to establish the water (H2O) and dipotassium hydrogen phosphate (K2HPO4) basis set equivalent densities of fat and fat-free liver tissue. With this information liver fat can be accurately measured from any abdominal CT scan calibrated with a suitable phantom. Methods Liver fat content was measured by comparing single-energy QCT (SEQCT) HU measurements of the liver with predicted HU values for fat and fat-free liver tissue calculated from their H2O and K2HPO4 equivalent densities and calibration data from a QCT phantom. The equivalent densities of fat were derived from a listing of its constituent fatty acids, and those of fat-free liver tissue from a dual-energy QCT (DEQCT) study performed in 14 healthy Chinese subjects. This information was used to calculate liver fat from abdominal SEQCT scans performed in a further 541 healthy Chinese subjects (mean age 62 years; range, 31–95 years) enrolled in the Prospective Urban Rural Epidemiology (PURE) Study. Results The equivalent densities of fat were 941.75 mg/cm3 H2O and –43.72 mg/cm3 K2HPO4, and for fat-free liver tissue 1,040.13 mg/cm3 H2O and 21.34 mg/cm3 K2HPO4. Liver fat in the 14 subjects in the DEQCT study varied from 0–17.9% [median: 4.5%; interquartile range (IQR): 3.0–7.9%]. Liver fat in the 541 PURE study subjects varied from –0.3–29.9% (median: 4.9%; IQR: 3.4–6.9%). Conclusions We have established H2O and K2HPO4 equivalent densities for fat and fat-free liver tissue that allow a measurement of liver fat to be obtained from any abdominal CT scan acquired with a QCT phantom. Although radiation dose considerations preclude the routine use of QCT to measure liver fat, the method described here facilitates its measurement in patients having CT scans
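The final interpolation step can be made concrete (an illustrative sketch of only that step, using the equivalent densities reported in the abstract; the intermediate density value is made up): once a calibrated scan yields the liver's H2O equivalent density, the fat fraction follows by linear interpolation between the pure-fat and fat-free endpoints:

```python
# H2O equivalent densities from the abstract (mg/cm^3):
FAT_H2O = 941.75       # fat
FATFREE_H2O = 1040.13  # fat-free liver tissue

def liver_fat_percent(measured_h2o):
    """Fat fraction (%) from a calibrated H2O equivalent density."""
    return 100.0 * (FATFREE_H2O - measured_h2o) / (FATFREE_H2O - FAT_H2O)

print(liver_fat_percent(FATFREE_H2O))        # -> 0.0 (fat-free tissue)
print(liver_fat_percent(FAT_H2O))            # -> 100.0 (pure fat)
print(round(liver_fat_percent(1030.0), 1))   # -> 10.3 (intermediate example)
```

The same interpolation can be written in the K2HPO4 basis; the paper's full method combines both basis densities with the phantom calibration.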

  1. Heat Transfer in a Complex Trailing Edge Passage for a High Pressure Turbine Blade. Part 1: Experimental Measurements

    NASA Technical Reports Server (NTRS)

    Bunker, Ronald S.; Wetzel, Todd G.; Rigby, David L.; Reddy, D. R. (Technical Monitor)

    2000-01-01

    A combined experimental and computational study has been performed to investigate the detailed heat transfer coefficient distributions within a complex blade trailing edge passage. The experimental measurements are made using a steady liquid crystal thermography technique applied to one major side of the passage. The geometry of the trailing edge passage is that of a two-pass serpentine circuit with a sharp 180-degree turning region at the tip. The upflow channel is split by interrupted ribs into two major subchannels, one of which is turbulated. This channel has an average aspect ratio of roughly 14:1. The spanwise extent of the channel geometry includes both area convergence from root to tip, as well as taper towards the trailing edge apex. The average section Reynolds numbers tested in this upflow channel range from 55,000 to 98,000. The tip section contains a turning vane near the extreme corner. The downflow channel has an aspect ratio of about 5:1, and also includes convergence and taper. Turbulators of varying sizes are included in this channel as well. Both detailed heat transfer and pressure distribution measurements are presented. The pressure measurements are incorporated into a flow network model illustrating the major loss contributors.

  2. Impact of measurement uncertainty from experimental load distribution factors on bridge load rating

    NASA Astrophysics Data System (ADS)

    Gangone, Michael V.; Whelan, Matthew J.

    2018-03-01

    Load rating and testing of highway bridges is important in determining the capacity of the structure. Experimental load rating utilizes strain transducers placed at critical locations of the superstructure to measure normal strains. These strains are then used in computing diagnostic performance measures (neutral axis of bending, load distribution factor) and ultimately a load rating. However, it has been shown that experimentally obtained strain measurements contain uncertainties associated with the accuracy and precision of the sensor and sensing system. These uncertainties propagate through to the diagnostic indicators and, in turn, into the load rating calculation. This paper analyzes the effect that measurement uncertainties have on the experimental load rating results of a 3-span multi-girder/stringer steel and concrete bridge, with the focus limited to the uncertainty associated with the experimental distribution factor estimate. For the testing discussed, strain readings were gathered at the midspan of each span of both exterior girders and the center girder. Test vehicles of known weight were positioned at specified locations on each span to generate the maximum strain response for each of the five girders. The strain uncertainties were used in conjunction with a propagation formula developed by the authors to determine the standard uncertainty in the distribution factor estimates. This distribution factor uncertainty is then introduced into the load rating computation to determine the possible range of the load rating. The results show the importance of understanding measurement uncertainty in experimental load testing.
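The idea can be sketched with a generic first-order propagation (a standard textbook propagation, not necessarily the authors' exact formula; the strain values and sensor uncertainty below are made up): for a distribution factor DF_i = e_i / Σ_j e_j with independent strain uncertainties s_j, the partials are ∂DF_i/∂e_i = (S − e_i)/S² and ∂DF_i/∂e_j = −e_i/S² for j ≠ i, where S = Σ e:

```python
import math

def df_uncertainty(strains, sigmas, i):
    """First-order standard uncertainty of DF_i = e_i / sum(e)."""
    S = sum(strains)
    var = 0.0
    for j, (e_j, s_j) in enumerate(zip(strains, sigmas)):
        d = (S - strains[i]) / S**2 if j == i else -strains[i] / S**2
        var += (d * s_j) ** 2
    return math.sqrt(var)

strains = [120.0, 200.0, 150.0, 90.0, 60.0]  # midspan microstrain, 5 girders
sigmas = [2.0] * 5                           # +/- 2 microstrain per sensor
df = strains[1] / sum(strains)
print(round(df, 3), round(df_uncertainty(strains, sigmas, 1), 4))  # -> 0.323 0.003
```

The DF uncertainty then carries into the load rating range exactly as the paper describes.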

  3. Mid-infrared laser absorption tomography for quantitative 2D thermochemistry measurements in premixed jet flames

    NASA Astrophysics Data System (ADS)

    Wei, Chuyu; Pineda, Daniel I.; Paxton, Laurel; Egolfopoulos, Fokion N.; Spearrin, R. Mitchell

    2018-06-01

    A tomographic laser absorption spectroscopy technique, utilizing mid-infrared light sources, is presented as a quantitative method to spatially resolve species and temperature profiles in small-diameter reacting flows relevant to combustion systems. Here, tunable quantum and interband cascade lasers are used to spectrally resolve select rovibrational transitions near 4.98 and 4.19 μm to measure CO and CO2, respectively, as well as their vibrational temperatures, in piloted premixed jet flames. Signal processing methods are detailed for the reconstruction of axial and radial profiles of thermochemical structure in a canonical ethylene-air jet flame. The method is further demonstrated to quantitatively distinguish between different turbulent flow conditions.
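Temperature from absorption measurements is commonly extracted via two-line thermometry, which can be sketched as follows (a simplified textbook relation, not necessarily the authors' processing chain; partition-function and stimulated-emission corrections are neglected, and the linestrengths and lower-state energies are illustrative):

```python
import math

HCK = 1.4388  # cm*K, second radiation constant hc/k

def ratio(T, S0, E, T0=296.0):
    """Absorbance ratio of two lines with linestrengths S0 at T0 and
    lower-state energies E (cm^-1), keeping only the Boltzmann factor."""
    s1 = S0[0] * math.exp(-HCK * E[0] * (1.0 / T - 1.0 / T0))
    s2 = S0[1] * math.exp(-HCK * E[1] * (1.0 / T - 1.0 / T0))
    return s1 / s2

def temperature(R, S0, E, T0=296.0):
    """Invert the two-line ratio for temperature."""
    lhs = math.log(R * S0[1] / S0[0])
    return 1.0 / (1.0 / T0 - lhs / (HCK * (E[0] - E[1])))

S0 = (1.0, 0.5)       # relative linestrengths at T0 (illustrative)
E = (1000.0, 3000.0)  # lower-state energies in cm^-1 (illustrative)
print(round(temperature(ratio(1500.0, S0, E), S0, E), 3))  # -> 1500.0
```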

  4. Quantitative surface topography determination by Nomarski reflection microscopy. 2: Microscope modification, calibration, and planar sample experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1980-09-01

    The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, a physical insight, and a cookbook procedure for implementation to allow these results to be of value both to those interested in the microscope theory and to those interested in its practical usage in the metallography laboratory.

  5. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration in flames using laser-induced fluorescence (LIF). Detailed physical models of spectral absorption lineshape broadening, collisional transition and quenching at elevated pressure are built. The fine energy level structure of the OH molecule is illustrated to clarify the laser-induced fluorescence emission process alongside the non-radiative processes, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). On this basis, numerical results are obtained from simulations in order to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in the OH-LIF technique and for finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).
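The pressure dependence of the fluorescence yield follows a simple Stern-Volmer form worth stating explicitly (an illustrative sketch of the quenching problem, not the paper's detailed model; the rate values are made up): in the linear LIF regime the yield is φ = A/(A + Q), where A is the spontaneous emission rate and Q the collisional quenching rate, which grows roughly linearly with pressure:

```python
A21 = 1.4e6        # 1/s, spontaneous emission rate (illustrative)
Q_PER_ATM = 5.0e8  # 1/s per atm, quenching rate coefficient (illustrative)

def fluorescence_yield(p_atm):
    """Stern-Volmer fluorescence quantum yield phi = A / (A + Q(p))."""
    return A21 / (A21 + Q_PER_ATM * p_atm)

for p in (1.0, 5.0, 10.0):
    print(p, fluorescence_yield(p))
# The yield falls roughly as 1/p at elevated pressure, which is why
# quantitative OH-LIF needs a quenching correction or calibration.
```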

  6. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in Quantitative Fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. have proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg with each other, which can be expressed as R_S = mean(R_L · Ψ) (the mean taken over the sectioning planes), where Ψ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the applications of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
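The profile roughness parameter R_L used above is simply the true profile length divided by its projected length, which is easy to compute from digitized profile heights (a minimal sketch; the sample profiles are made up, and the structure factor Ψ is treated as a given constant rather than derived):

```python
import math

def profile_roughness(heights, dx):
    """R_L = arc length of the sampled profile / projected length."""
    arc = sum(math.hypot(dx, h2 - h1) for h1, h2 in zip(heights, heights[1:]))
    return arc / (dx * (len(heights) - 1))

flat = [0.0, 0.0, 0.0, 0.0]
saw = [0.0, 1.0, 0.0, 1.0]
print(profile_roughness(flat, 1.0))            # -> 1.0 (a flat profile)
print(round(profile_roughness(saw, 1.0), 3))   # -> 1.414, sqrt(2) for 45-deg teeth
```

Averaging R_L · Ψ over the three 120-degree sections then gives the surface roughness estimate R_S of the Gokhale-Underwood method.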

  7. Baseline Error Analysis and Experimental Validation for Height Measurement of Formation Insar Satellite

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, T.; Zhang, X.; Geng, X.

    2018-04-01

    In this paper, we propose a stochastic model of InSAR height measurement that accounts for the interferometric geometry. The model directly describes the relationship between baseline error and height-measurement error. A simulation analysis with TanDEM-X parameters was then implemented to quantitatively evaluate the influence of baseline error on height measurement. Furthermore, a full emulation-based validation of the InSAR stochastic model was performed on the basis of SRTM DEM data and TanDEM-X parameters. The spatial distribution characteristics and error-propagation behavior of InSAR height measurement were fully evaluated.
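
    As a hedged numerical sketch of the error propagation studied above, one can use the common first-order InSAR height relation h ≈ λ·R·sin(θ)·φ / (2π·B_perp) and perturb the perpendicular baseline; to first order the relative height error equals the relative baseline error. The parameter values below are loosely TanDEM-X-like assumptions, not mission specifications:

```python
import math

def insar_height(phi, lam, slant_range, theta, b_perp):
    """First-order InSAR height from unwrapped phase and geometry."""
    return lam * slant_range * math.sin(theta) * phi / (2 * math.pi * b_perp)

lam = 0.031                # X-band wavelength (m)
R = 600e3                  # slant range (m), assumed
theta = math.radians(35)   # look angle, assumed
B = 200.0                  # perpendicular baseline (m), assumed
phi = 2.0                  # unwrapped interferometric phase (rad), example

h = insar_height(phi, lam, R, theta, B)
dB = 0.01                  # 1 cm baseline error
h_err = insar_height(phi, lam, R, theta, B + dB) - h
# first-order propagation: dh/h = -dB/B
assert abs(h_err / h + dB / B) < 1e-3
```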

  8. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are in use, and they generate large datasets. Independent of the technique used, these large datasets require robust data analysis to ensure that valid conclusions are drawn from such studies. Approaches to the problems that arise with large datasets are discussed, giving insight into the types of statistical analyses appropriate for the various experimental strategies that quantitative proteomic studies can employ. This review also highlights the importance of a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed show how robust design and analysis lead to confident results and ensure that quantitative proteomics delivers.

  9. Comparison Between Numerically Simulated and Experimentally Measured Flowfield Quantities Behind a Pulsejet

    NASA Technical Reports Server (NTRS)

    Geng, Tao; Paxson, Daniel E.; Zheng, Fei; Kuznetsov, Andrey V.; Roberts, William L.

    2008-01-01

    Pulsed combustion is receiving renewed interest as a potential route to higher performance in air breathing propulsion systems. Pulsejets offer a simple experimental device with which to study unsteady combustion phenomena and validate simulations. Previous computational fluid dynamic (CFD) simulation work focused primarily on the pulsejet combustion and exhaust processes. This paper describes a new inlet sub-model which simulates the fluidic and mechanical operation of a valved pulsejet head. The governing equations for this sub-model are described. Sub-model validation is provided through comparisons of simulated and experimentally measured reed valve motion, and time averaged inlet mass flow rate. The updated pulsejet simulation, with the inlet sub-model implemented, is validated through comparison with experimentally measured combustion chamber pressure, inlet mass flow rate, operational frequency, and thrust. Additionally, the simulated pulsejet exhaust flowfield, which is dominated by a starting vortex ring, is compared with particle imaging velocimetry (PIV) measurements on the bases of velocity, vorticity, and vortex location. The results show good agreement between simulated and experimental data. The inlet sub-model is shown to be critical for the successful modeling of pulsejet operation. This sub-model correctly predicts both the inlet mass flow rate and its phase relationship with the combustion chamber pressure. As a result, the predicted pulsejet thrust agrees very well with experimental data.

  10. Quantitative flow and velocity measurements of pulsatile blood flow with 4D-DSA

    NASA Astrophysics Data System (ADS)

    Shaughnessy, Gabe; Hoffman, Carson; Schafer, Sebastian; Mistretta, Charles A.; Strother, Charles M.

    2017-03-01

    Time-resolved 3D angiographic data from 4D DSA provide a unique environment in which to explore the physical properties of blood flow. Utilizing the pulsatility of the contrast waveform, the Fourier components can be used to track the waveform motion through vessels. Areas of strong pulsatility are determined through the FFT power spectrum. Using this method, 4D-DSA flow measurements agree to within 7.6% and 6.8% RMSE with ICA PCVIPR and phantom flow-probe validation measurements, respectively. The availability of velocity and flow information with fast acquisition could provide a more quantitative approach to treatment planning and evaluation in interventional radiology.
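
    The pulsatility-detection idea above, locating the dominant cardiac frequency via the power spectrum, can be sketched on a synthetic contrast waveform. A plain DFT is used here for self-containment; the abstract's actual processing of 4D-DSA data is considerably more involved:

```python
import cmath
import math

def power_spectrum(samples):
    """Naive DFT power spectrum of a mean-subtracted real signal."""
    n = len(samples)
    mean = sum(samples) / n
    xs = [v - mean for v in samples]
    return [abs(sum(x * cmath.exp(-2j * math.pi * f * k / n)
                    for k, x in enumerate(xs))) ** 2
            for f in range(n // 2 + 1)]

fs = 30.0   # frame rate in frames/s (assumed)
n = 300     # 10 s of frames
hr = 1.2    # pulsatile frequency in Hz (~72 bpm), synthetic
waveform = [1.0 + 0.3 * math.sin(2 * math.pi * hr * k / fs) for k in range(n)]

power = power_spectrum(waveform)
dominant_bin = max(range(len(power)), key=power.__getitem__)
dominant_hz = dominant_bin * fs / n
assert abs(dominant_hz - hr) < fs / n   # recovered within one frequency bin
```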

  11. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.

  12. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    PubMed Central

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
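
    The abstract's key statistical point is that disease shifted the D-spacing distribution rather than its mean. A toy illustration of why a mean comparison can miss such a change while a spread measure detects it (synthetic values, not collagen measurements):

```python
import statistics

# Two synthetic D-spacing samples (nm): identical means, different spreads
control = [67 + d for d in (-1.0, -0.5, 0.0, 0.5, 1.0)]
diseased = [67 + d for d in (-3.0, -1.5, 0.0, 1.5, 3.0)]

assert statistics.mean(control) == statistics.mean(diseased)   # means agree
assert statistics.stdev(diseased) > statistics.stdev(control)  # spreads differ
```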

  13. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure

    PubMed Central

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve; Constantino, John; Povinelli, Daniel; Pruett, John R.

    2011-01-01

    Objective Comparative studies of social responsiveness, an ability that is impaired in autistic spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. Method We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimp SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n=29) with the Chimp SRS and typical and autistic spectrum disorder (ASD) human children (n=20) with the XSRS. Results The Chimp SRS demonstrated strong inter-rater reliability at the three sites (ranges for individual ICCs: .534–.866 and mean ICCs: .851–.970). As has been observed in humans, exploratory principal components analysis of Chimp SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r=.976, p=.001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Conclusion Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) humans and chimpanzees. PMID:21515200

  14. Resected Brain Tissue, Seizure Onset Zone and Quantitative EEG Measures: Towards Prediction of Post-Surgical Seizure Control

    PubMed Central

    Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar

    2015-01-01

    Background Epilepsy surgery is a potentially curative treatment option for pharmacoresistant patients. If non-invasive methods alone do not suffice to delineate the epileptogenic brain areas, the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used as the standard procedure to identify the seizure onset zone for targeted resection. Methods Despite its great potential to assess the epileptogenicity of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information, we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT, and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results In patients with favorable post-surgical seizure control, a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery. The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent

  15. Experimental methods in cryogenic spectroscopy: Stark effect measurements in substituted myoglobin

    NASA Astrophysics Data System (ADS)

    Moran, Bradley M.

    Arising from well-defined tertiary structure, the active regions of enzymatic proteins exist as specifically tailored electrostatic microenvironments capable of facilitating chemical interaction. The specific influence these charge distributions have on ligand-binding dynamics, and their impact on specificity, reactivity, and biological functionality, have yet to be fully understood. A quantitative determination of these intrinsic fields would offer insight into the mechanistic aspects of protein functionality. This work investigates the internal molecular electric fields present at the oxygen binding site of myoglobin. Experiments are performed at 1 K on samples located within a glassy matrix, using the high-resolution technique of spectral hole-burning. The internal electric field distributions can be explored by implementing a unique mathematical treatment for analyzing the effect that externally applied electric fields have on the spectral hole profiles. Precise control of the light field, the temperature, and the externally applied electric field at the site of the sample is crucial. Experimentally, the functionality of a custom cryogenic-temperature confocal scanning microscope was extended to allow collection of imaging and spectral data with the ability to modulate the polarization of the light at the sample. Operation of the instrumentation was integrated into a platform allowing seamless execution of input commands with high temporal inter-instrument resolution for collection of data streams. For the regulated control and cycling of the sample temperature, the thermal characteristics of the research Dewar were theoretically modeled to systematically predict heat flows throughout the system. A high-voltage feedthrough for delivering voltages of up to 5000 V to the sample as positioned within the Dewar was developed. The burning of spectral holes with this particular experimental setup is highly repeatable. The quantum mechanical

  16. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique, as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon-technique sequence (repetition time/echo times: 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic; 3 minutes 38 seconds). To test the reproducibility, a second MRI was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessment (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD.
The AUQV MR BD measurements were significantly lower than the currently used qualitative

  17. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William Banning

    2000-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. Resistivity measurements are obtained from within the cased well by conducting A.C. current from within the cased well to a remote electrode at a frequency that is within the frequency range of 0.1 Hz to 20 Hz.
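
    The Archie's-equation step invoked above can be sketched as follows: water saturation is computed from formation resistivity, water resistivity, and porosity, and the remaining pore fraction is hydrocarbon. The constants a, m, and n are generic textbook defaults, and the inputs are illustrative, not values from the patent:

```python
def archie_water_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: S_w = (a * R_w / (phi**m * R_t)) ** (1/n)."""
    return (a * rw / (phi ** m * rt)) ** (1.0 / n)

sw = archie_water_saturation(rt=40.0,   # formation resistivity (ohm-m), example
                             rw=0.1,    # formation-water resistivity (ohm-m)
                             phi=0.25)  # porosity (fraction)
hydrocarbon_fraction = 1.0 - sw         # combined oil + gas pore fraction
assert abs(sw - 0.2) < 1e-9             # 20% water, 80% hydrocarbon here
```

    Splitting the hydrocarbon fraction into separate oil and gas amounts requires the additional measurements the patent describes; it does not follow from Archie's equation alone.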

  18. OPPORTUNISTIC ASPERGILLUS PATHOGENS MEASURED IN HOME AND HOSPITAL TAP WATER BY MOLD SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    Opportunistic fungal pathogens are a concern because of the increasing number of immunocompromised patients. The goal of this research was to test a simple extraction method and rapid quantitative PCR (QPCR) measurement of the occurrence of potential pathogens, Aspergillus fumiga...

  19. A measure of state persecutory ideation for experimental studies.

    PubMed

    Freeman, Daniel; Pugh, Katherine; Green, Catherine; Valmaggia, Lucia; Dunn, Graham; Garety, Philippa

    2007-09-01

    Experimental research is increasingly important in developing the understanding of paranoid thinking. An assessment measure of persecutory ideation is necessary for such work. We report the reliability and validity of the first state measure of paranoia: The State Social Paranoia Scale. The items in the measure conform to a recent definition in which persecutory thinking has the 2 elements of feared harm and perpetrator intent. The measure was tested with 164 nonclinical participants and 21 individuals at high risk of psychosis with attenuated positive symptoms. The participants experienced a social situation presented in virtual reality and completed the new measure. The State Social Paranoia Scale was found to have excellent internal reliability, adequate test-retest reliability, clear convergent validity as assessed by both independent interviewer ratings and self-report measures, and showed divergent validity with measures of positive and neutral thinking. The measure of paranoia in a recent social situation has good psychometric properties.
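
    Internal reliability of a scale such as the one above is conventionally summarized with Cronbach's alpha, computed from per-item scores. A minimal sketch on made-up ratings (the abstract does not state which statistic the authors used, and these are not study data):

```python
import statistics

def cronbach_alpha(items):
    """items: one list of respondent scores per scale item."""
    k = len(items)
    item_var_sum = sum(statistics.variance(it) for it in items)
    totals = [sum(vals) for vals in zip(*items)]        # per-respondent totals
    total_var = statistics.variance(totals)
    return k / (k - 1) * (1 - item_var_sum / total_var)

items = [[1, 2, 3, 4, 5],
         [1, 2, 3, 3, 5],
         [2, 2, 3, 4, 4]]   # 3 items rated by 5 respondents (made up)
alpha = cronbach_alpha(items)
assert 0.9 < alpha < 1.0    # highly consistent items give alpha near 1
```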

  20. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases -- uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  1. Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests

    NASA Technical Reports Server (NTRS)

    Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.

    2010-01-01

    Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10-50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.

  2. How-to-Do-It: Apparatus & Experimental Design for Measuring Fermentation Rates in Yeast.

    ERIC Educational Resources Information Center

    Tatina, Robert

    1989-01-01

    Describes an apparatus that facilitates the quantitative study of fermentation in yeast by allowing simultaneous measurements of fermentation rates in several treatments and a control. Explains a laboratory procedure in which the apparatus is used. Several suggestions for further investigations are included. (Author/RT)

  3. Quantitative analyses of bifunctional molecules.

    PubMed

    Braun, Patrick D; Wandless, Thomas J

    2004-05-11

    Small molecules can be discovered or engineered to bind tightly to biologically relevant proteins, and these molecules have proven to be powerful tools for both basic research and therapeutic applications. In many cases, detailed biophysical analyses of the intermolecular binding events are essential for improving the activity of the small molecules. These interactions can often be characterized as straightforward bimolecular binding events, and a variety of experimental and analytical techniques have been developed and refined to facilitate these analyses. Several investigators have recently synthesized heterodimeric molecules that are designed to bind simultaneously with two different proteins to form ternary complexes. These heterodimeric molecules often display compelling biological activity; however, they are difficult to characterize. The bimolecular interaction between one protein and the heterodimeric ligand (primary dissociation constant) can be determined by a number of methods. However, the interaction between that protein-ligand complex and the second protein (secondary dissociation constant) is more difficult to measure due to the noncovalent nature of the original protein-ligand complex. Consequently, these heterodimeric compounds are often characterized in terms of their activity, which is an experimentally dependent metric. We have developed a general quantitative mathematical model that can be used to measure both the primary (protein + ligand) and secondary (protein-ligand + protein) dissociation constants for heterodimeric small molecules. These values are largely independent of the experimental technique used and furthermore provide a direct measure of the thermodynamic stability of the ternary complexes that are formed. 
Fluorescence polarization and this model were used to characterize the heterodimeric molecule, SLFpYEEI, which binds to both FKBP12 and the Fyn SH2 domain, demonstrating that the model is useful for both predictive as well as ex

  4. Control of experimental uncertainties in filtered Rayleigh scattering measurements

    NASA Technical Reports Server (NTRS)

    Forkey, Joseph N.; Finkelstein, N. D.; Lempert, Walter R.; Miles, Richard B.

    1995-01-01

    Filtered Rayleigh Scattering is a technique which allows for measurement of velocity, temperature, and pressure in unseeded flows, spatially resolved in 2-dimensions. We present an overview of the major components of a Filtered Rayleigh Scattering system. In particular, we develop and discuss a detailed theoretical model along with associated model parameters and related uncertainties. Based on this model, we then present experimental results for ambient room air and for a Mach 2 free jet, including spatially resolved measurements of velocity, temperature, and pressure.

  5. Hydrodynamic Radii of Intrinsically Disordered Proteins Determined from Experimental Polyproline II Propensities

    PubMed Central

    Tomasso, Maria E.; Tarver, Micheal J.; Devarajan, Deepa; Whitten, Steven T.

    2016-01-01

    The properties of disordered proteins are thought to depend on intrinsic conformational propensities for polyproline II (PPII) structure. While intrinsic PPII propensities have been measured for the common biological amino acids in short peptides, the ability of these experimentally determined propensities to quantitatively reproduce structural behavior in intrinsically disordered proteins (IDPs) has not been established. Presented here are results from molecular simulations of disordered proteins showing that the hydrodynamic radius (R_h) can be predicted from experimental PPII propensities with good agreement, even when charge-based considerations are omitted. The simulations demonstrate that R_h and chain propensity for PPII structure are linked via a simple power-law scaling relationship, which was tested using the experimental R_h of 22 IDPs covering a wide range of peptide lengths, net charge, and sequence composition. Charge effects on R_h were found to be generally weak when compared to PPII effects on R_h. Results from this study indicate that the hydrodynamic dimensions of IDPs are evidence of considerable sequence-dependent backbone propensities for PPII structure that qualitatively, if not quantitatively, match conformational propensities measured in peptides. PMID:26727467
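
    A power-law scaling relation of the kind described above, R_h = A · N^v, is conventionally recovered by linear regression in log-log space. A sketch on synthetic (N, R_h) pairs; the prefactor and exponent below are illustrative assumptions, not fitted values from the paper:

```python
import math

def fit_power_law_exponent(ns, rhs):
    """Least-squares slope in log-log space = scaling exponent v."""
    xs = [math.log(v) for v in ns]
    ys = [math.log(v) for v in rhs]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

A, true_v = 2.2, 0.55       # assumed prefactor and exponent, for illustration
ns = [50, 100, 200, 400]    # chain lengths (residues)
rhs = [A * n ** true_v for n in ns]
v_fit = fit_power_law_exponent(ns, rhs)
assert abs(v_fit - true_v) < 1e-6   # exponent recovered from noiseless data
```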

  6. Experimental measurement-device-independent verification of quantum steering

    NASA Astrophysics Data System (ADS)

    Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.

    2015-01-01

    Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  7. Experimental measurement-device-independent verification of quantum steering.

    PubMed

    Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J

    2015-01-07

    Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  8. Quantitative Rainbow Schlieren Deflectometry as a Temperature Diagnostic for Spherical Flames

    NASA Technical Reports Server (NTRS)

    Feikema, Douglas A.

    2004-01-01

    Numerical analysis and experimental results are presented to define a method for quantitatively measuring the temperature distribution of a spherical diffusion flame using Rainbow Schlieren Deflectometry in microgravity. First, a numerical analysis is completed to show the method can suitably determine temperature in the presence of spatially varying species composition. Also, a numerical forward-backward inversion calculation is presented to illustrate the types of calculations and deflections to be encountered. Lastly, a normal gravity demonstration of temperature measurement in an axisymmetric laminar, diffusion flame using Rainbow Schlieren deflectometry is presented. The method employed in this paper illustrates the necessary steps for the preliminary design of a Schlieren system. The largest deflections for the normal gravity flame considered in this paper are 7.4 x 10^-4 radians, which can be accurately measured with 2 meter focal length collimating and decollimating optics. The experimental uncertainty of deflection is less than 5 x 10^-5 radians.
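
    As a back-of-envelope check of the stated numbers: in the small-angle approximation, a ray deflected by angle ε is displaced by f·ε at the filter plane of a lens of focal length f, so the quoted optics put the largest deflection well above the stated uncertainty:

```python
f = 2.0            # decollimating focal length (m), from the abstract
eps_max = 7.4e-4   # largest measured deflection (rad)
eps_unc = 5e-5     # stated deflection uncertainty (rad)

displacement_mm = f * eps_max * 1e3   # beam displacement at the filter plane
assert abs(displacement_mm - 1.48) < 1e-6   # ~1.5 mm, readily resolvable
assert eps_max / eps_unc > 10               # signal well above uncertainty
```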

  9. Calculation of the Intensity of Physical Time Fluctuations Using the Standard Solar Model and its Comparison with the Results of Experimental Measurements

    NASA Astrophysics Data System (ADS)

    Morozov, A. N.

    2017-11-01

    The article reviews the possibility of describing physical time as a random Poisson process. An equation is proposed that allows the intensity of physical time fluctuations to be calculated from the entropy production density of irreversible natural processes. Based on the standard solar model, the work calculates the entropy production density inside the Sun and the dependence of the intensity of physical time fluctuations on the distance from the centre of the Sun. A free model parameter has been established, and a method for its evaluation is suggested. The calculations show that the entropy production density differs by 2-3 orders of magnitude in different parts of the Sun. The intensity of physical time fluctuations on the Earth's surface, as a function of the entropy production density during the conversion of sunlight into the Earth's thermal radiation, has been theoretically predicted. A method for evaluating the Kullback measure of voltage fluctuations in small amounts of electrolyte is proposed. Using a simple model of heat transfer from the Earth's surface to the upper atmosphere, the effective temperature of the Earth's thermal radiation has been determined. A comparison between the theoretical values of the Kullback measure derived from the fluctuating physical time model and the values measured experimentally for two independent electrolytic cells shows good qualitative and quantitative agreement between the theoretical model and the experimental data.

  10. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

    The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and Gibbs free energies of a series of m independent CID fragmentation processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and ab initio methods, both static and dynamic. The paper also examines quantitatively the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within this framework show excellent correspondence with theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the "GP" continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses. It also demonstrates the great promise of the complementary application of experimental CID-MS and computational quantum chemistry in studying chemical reactivity. To a considerable extent, this work underlines the place of computational quantum chemistry within experimental analytical chemistry, in particular for structural analysis.

  11. A device for rapid and quantitative measurement of cardiac myocyte contractility

    NASA Astrophysics Data System (ADS)

    Gaitas, Angelo; Malhotra, Ricky; Li, Tao; Herron, Todd; Jalife, José

    2015-03-01

    Cardiac contractility is the hallmark of cardiac function and is a predictor of healthy or diseased cardiac muscle. Despite advancements over the last two decades, the techniques and tools available to cardiovascular scientists are limited in their utility to accurately and reliably measure the amplitude and frequency of cardiomyocyte contractions. Isometric force measurements in the past have entailed cumbersome attachment of isolated and permeabilized cardiomyocytes to a force transducer followed by measurements of sarcomere lengths under conditions of submaximal and maximal Ca2+ activation. These techniques have the inherent disadvantages of being labor intensive and costly. We have engineered a micro-machined cantilever sensor with an embedded deflection-sensing element that, in preliminary experiments, has been shown to reliably measure cardiac cell contractions in real-time. Here, we describe this new bioengineering tool with applicability in the cardiovascular research field to effectively and reliably measure cardiac cell contractility in a quantitative manner. We measured contractility in both primary neonatal rat heart cardiomyocyte monolayers that demonstrated a beat frequency of 3 Hz as well as human embryonic stem cell-derived cardiomyocytes with a contractile frequency of about 1 Hz. We also employed the β-adrenergic agonist isoproterenol (100 nmol l-1) and observed that our cantilever demonstrated high sensitivity in detecting subtle changes in both chronotropic and inotropic responses of monolayers. This report describes the utility of our micro-device in both basic cardiovascular research as well as in small molecule drug discovery to monitor cardiac cell contractions.
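
    The beat frequencies quoted (≈3 Hz for monolayers, ≈1 Hz for stem-cell-derived cardiomyocytes) are the kind of quantity that can be extracted from a cantilever deflection trace by spectral analysis. A minimal sketch on synthetic data, not the authors' processing pipeline; signal parameters are invented:

```python
import numpy as np

def beat_frequency(signal: np.ndarray, fs: float) -> float:
    """Dominant frequency (Hz) of a contraction trace via the FFT peak."""
    sig = signal - signal.mean()               # drop the DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    return float(freqs[np.argmax(spectrum)])

# Synthetic stand-in for a monolayer deflection trace: 3 Hz beat,
# 100 Hz sampling, 10 s duration, mild sensor noise.
rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
trace = 0.5 * np.sin(2 * np.pi * 3.0 * t) + 0.05 * rng.standard_normal(t.size)
```

With 10 s of data the frequency resolution is 0.1 Hz, ample for distinguishing chronotropic shifts of the size an isoproterenol challenge would produce.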

  12. Development and psychometric evaluation of a quantitative measure of "fat talk".

    PubMed

    MacDonald Clarke, Paige; Murnen, Sarah K; Smolak, Linda

    2010-01-01

    Based on her anthropological research, Nichter (2000) concluded that it is normative for many American girls to engage in body self-disparagement in the form of "fat talk." The purpose of the present two studies was to develop a quantitative measure of fat talk. A series of 17 scenarios were created in which "Naomi" is talking with a female friend(s) and there is an expression of fat talk. College women respondents rated the frequency with which they would behave in a similar way as the women in each scenario. A nine-item one-factor scale was determined through principal components analysis and its scores yielded evidence of internal consistency reliability, test-retest reliability over a five-week time period, construct validity, discriminant validity, and incremental validity in that it predicted unique variance in body shame and eating disorder symptoms above and beyond other measures of self-objectification. Copyright 2009 Elsevier Ltd. All rights reserved.

  13. Linearization improves the repeatability of quantitative dynamic contrast-enhanced MRI.

    PubMed

    Jones, Kyle M; Pagel, Mark D; Cárdenas-Rodríguez, Julio

    2018-04-01

    The purpose of this study was to compare the repeatabilities of the linear and nonlinear Tofts and reference region models (RRM) for dynamic contrast-enhanced MRI (DCE-MRI). Simulated and experimental DCE-MRI data from 12 rats with a flank tumor of C6 glioma acquired over three consecutive days were analyzed using four quantitative and four semi-quantitative DCE-MRI metrics. The quantitative methods used were: 1) the linear Tofts model (LTM), 2) the non-linear Tofts model (NTM), 3) the linear RRM (LRRM), and 4) the non-linear RRM (NRRM). The following semi-quantitative metrics were used: 1) maximum enhancement ratio (MER), 2) time to peak (TTP), 3) initial area under the curve (iauc64), and 4) slope. LTM and NTM were used to estimate Ktrans, while LRRM and NRRM were used to estimate Ktrans relative to muscle (RKtrans). Repeatability was assessed by calculating the within-subject coefficient of variation (wSCV) and the percent intra-subject variation (iSV) determined with Gage R&R analysis. The iSV for RKtrans using LRRM was two-fold lower compared to NRRM under all simulated and experimental conditions. A similar trend was observed for the Tofts model, where LTM was at least 50% more repeatable than NTM under all experimental and simulated conditions. The semi-quantitative metrics iauc64 and MER were as repeatable as Ktrans and RKtrans estimated by LTM and LRRM, respectively. The iSV for iauc64 and MER were significantly lower than the iSV for slope and TTP. In simulations and experiments, linearization improved the repeatability of quantitative DCE-MRI by at least 30%, making it as repeatable as the semi-quantitative metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
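
    The within-subject coefficient of variation used for the repeatability assessment can be computed directly from a subjects × repeats table. A minimal sketch with invented values, not the study's data:

```python
import numpy as np

def within_subject_cv(measurements: np.ndarray) -> float:
    """wSCV: sqrt(mean within-subject variance) / grand mean.
    measurements has shape (n_subjects, n_repeats)."""
    within_var = measurements.var(axis=1, ddof=1).mean()
    return float(np.sqrt(within_var) / measurements.mean())

# Hypothetical Ktrans estimates (min^-1) for 3 rats over 3 consecutive days:
ktrans = np.array([[0.12, 0.13, 0.11],
                   [0.20, 0.22, 0.21],
                   [0.16, 0.15, 0.17]])
wscv = within_subject_cv(ktrans)
```

A lower wSCV means day-to-day scatter is small relative to the typical parameter value, which is what the linearized fits improve.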

  14. Remote measurements of the atmosphere using Raman scattering.

    PubMed

    Melfi, S H

    1972-07-01

    The Raman optical radar measurements of the atmosphere presented demonstrate that the technique may be used to obtain quantitative measurements of the spatial distribution of individual atmospheric molecular trace constituents, in particular water vapor, as well as those of the major constituents. In addition, it is shown that monitoring Raman signals from atmospheric nitrogen aids in interpreting elastic scattering measurements by eliminating attenuation effects. In general, the experimental results show good agreement with independent meteorological measurements. Finally, experimental data are utilized to estimate the Raman backscatter cross section for water vapor excited at 3471.5 Å as σ(H2O)/σ(N2) = 3.8 ± 25%.

  15. Selecting quantitative water management measures at the river basin scale in a global change context

    NASA Astrophysics Data System (ADS)

    Girard, Corentin; Rinaudo, Jean-Daniel; Caballero, Yvan; Pulido-Velazquez, Manuel

    2013-04-01

    One of the main challenges in the implementation of the Water Framework Directive (WFD) in the European Union is the definition of programmes of measures to reach the good status of European water bodies. In areas where water scarcity is an issue, one of these challenges is the selection of water conservation and capacity expansion measures to ensure minimum environmental in-stream flow requirements. At the same time, the WFD calls for the use of economic analysis to identify the most cost-effective combination of measures at the river basin scale to achieve its objective. In this respect, hydro-economic river basin models, which integrate economic, environmental and hydrological aspects at the river basin scale in a consistent framework, represent a promising approach. This article presents a least-cost river basin optimization model (LCRBOM) that selects the combination of quantitative water management measures needed to meet environmental flows under future scenarios of agricultural and urban demand, taking into account the impact of climate change. The model has been implemented in a case study on a Mediterranean basin in the south of France, the Orb River basin, which has been identified as in need of quantitative water management measures in order to reach the good status of its water bodies. The LCRBOM has been developed in GAMS, applying Mixed Integer Linear Programming. It is run to select the set of measures that minimizes the total annualized cost of the applied measures while meeting the demands and minimum in-stream flow constraints. For the economic analysis, the programme of measures is composed of water conservation measures on agricultural and urban water demands, compared with measures mobilizing new water resources from groundwater, inter-basin transfers and improvements in reservoir operating rules. The total annual cost of each measure is calculated for each demand unit considering operation, maintenance and
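
    The least-cost selection problem can be illustrated with a toy brute-force search over subsets of candidate measures. The real model is a GAMS MILP; the measure names, costs and volumes below are invented purely for illustration:

```python
from itertools import combinations

# Hypothetical measures: (name, annualized cost in k€/yr, water saved or
# mobilized in hm3/yr). All numbers are illustrative only.
MEASURES = [
    ("drip irrigation",      120, 1.5),
    ("urban leak repair",     80, 0.9),
    ("reservoir rule change", 40, 0.6),
    ("inter-basin transfer", 300, 3.0),
]

def least_cost_programme(deficit_hm3: float):
    """Cheapest subset of measures covering the flow deficit (brute force)."""
    best = None
    for r in range(len(MEASURES) + 1):
        for subset in combinations(MEASURES, r):
            saved = sum(m[2] for m in subset)
            cost = sum(m[1] for m in subset)
            if saved >= deficit_hm3 and (best is None or cost < best[0]):
                best = (cost, [m[0] for m in subset])
    return best
```

Exhaustive enumeration is exponential in the number of measures; a MILP solver as used in the paper scales to realistic programmes, but the objective and constraint are the same.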

  16. Experimental validation of the AVIVET trap, a tool to quantitatively monitor the dynamics of Dermanyssus gallinae populations in laying hens.

    PubMed

    Lammers, G A; Bronneberg, R G G; Vernooij, J C M; Stegeman, J A

    2017-06-01

    Dermanyssus gallinae (D. gallinae) infestation causes economic losses across the world due to impaired health and production of hens and the costs of parasite control. Moreover, infestations are associated with reduced welfare of hens and may cause itching in humans. To implement control methods effectively, it is crucially important to have high-quality information about D. gallinae populations in poultry houses in space and time. At present no validated tool is available to quantitatively monitor the dynamics of all four stages of D. gallinae (i.e., eggs, larvae, nymphs, and adults) in poultry houses. This article describes the experimental validation of the AVIVET trap, a device to quantitatively monitor the dynamics of D. gallinae infestations. We used the device to study D. gallinae in fully equipped cages housing two white specific-pathogen-free Leghorn laying hens experimentally exposed to three different infestation levels of D. gallinae (low to high). The AVIVET trap successfully detected D. gallinae at high (5,000 D. gallinae), medium (2,500 D. gallinae), and low (50 D. gallinae) infestation levels. The linear equation Y = 0.47 + 1.21X, with Y = log10 (total number of D. gallinae nymphs and adults in the cage) and X = log10 (total number of D. gallinae nymphs and adults in the AVIVET trap), explained 93.8% of the variation. The weight of D. gallinae in the AVIVET trap also appears to be a reliable parameter for quantifying D. gallinae infestation in a poultry house: it correlates 99.6% (P < 0.001) with the counted number of all stages of D. gallinae in the trap (i.e., eggs, larvae, nymphs, and adults), indicating that the trap is highly specific. From this experiment it can be concluded that the AVIVET trap is promising as a quantitative tool for monitoring D. gallinae dynamics in a poultry house. © 2016 Poultry Science Association Inc.
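
    Under the reported log-log regression (log10 of the cage count ≈ 0.47 + 1.21 × log10 of the trap count), a trap count converts into a predicted cage population as follows. This is a hypothetical helper, valid only within the infestation range the experiment covered:

```python
import math

def estimate_cage_population(trap_count: float) -> float:
    """Predicted number of D. gallinae nymphs and adults in the cage,
    via the reported fit log10(N_cage) = 0.47 + 1.21 * log10(N_trap)."""
    return 10 ** (0.47 + 1.21 * math.log10(trap_count))

# e.g. 100 nymphs/adults counted in the trap:
predicted = estimate_cage_population(100)
```

Because the slope exceeds 1, the cage population grows slightly faster than the trap count, so extrapolating far beyond the tested 50-5,000 range would be unreliable.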

  17. Evidence-based nursing: a stereotyped view of quantitative and experimental research could work against professional autonomy and authority.

    PubMed

    Bonell, C

    1999-07-01

    In recent years, there have been calls within the United Kingdom's National Health Service (NHS) for evidence-based health care. These resonate with long-standing calls for nursing to become a research-based profession. Evidence-based practice could enable nurses to demonstrate their unique contribution to health care outcomes, and support their seeking greater professionalization in terms of enhanced authority and autonomy. Nursing's professionalization project, and, within this, the various practices comprising the 'new nursing', whilst sometimes not delivering all that was hoped of them, have been important in developing conditions conducive to evidence-based practice, notably a critical perspective on practice and a reluctance merely to follow physicians' orders. However, nursing has often been hesitant in its adoption of quantitative and experimental research. This hesitancy, it is argued, has been influenced by some authors within the new nursing propounding a stereotyped view of quantitative/experimental methods, which equates them with a number of methodological and philosophical positions deemed, by at least some of these authors, to be inimical to, or problematic within, nursing research. It is argued that not only is the logic on which these stereotyped views are based flawed, but also that their wider influence on nurses could lead to a greater marginalization of nurses in research and evidence-based practice initiatives, perhaps leading to evidence-based nursing being led by other groups. In the longer term, this might result in a form of evidence-based nursing emphasizing routinization, thus--ironically--working against the strategies of professional authority and autonomy embedded in the new nursing. Nursing research should instead follow the example of nurse researchers who already embrace multiple methods.
While the paper describes United Kingdom experiences and debates, points raised about

  18. Multisite concordance of apparent diffusion coefficient measurements across the NCI Quantitative Imaging Network.

    PubMed

    Newitt, David C; Malyarenko, Dariya; Chenevert, Thomas L; Quarles, C Chad; Bell, Laura; Fedorov, Andriy; Fennessy, Fiona; Jacobs, Michael A; Solaiyappan, Meiyappan; Hectors, Stefanie; Taouli, Bachir; Muzi, Mark; Kinahan, Paul E; Schmainda, Kathleen M; Prah, Melissa A; Taber, Erin N; Kroenke, Christopher; Huang, Wei; Arlinghaus, Lori R; Yankeelov, Thomas E; Cao, Yue; Aryal, Madhava; Yen, Yi-Fen; Kalpathy-Cramer, Jayashree; Shukla-Dave, Amita; Fung, Maggie; Liang, Jiachao; Boss, Michael; Hylton, Nola

    2018-01-01

    Diffusion weighted MRI has become ubiquitous in many areas of medicine, including cancer diagnosis and treatment response monitoring. Reproducibility of diffusion metrics is essential for their acceptance as quantitative biomarkers in these areas. We examined the variability in the apparent diffusion coefficient (ADC) obtained from both postprocessing software implementations utilized by the NCI Quantitative Imaging Network and online scan time-generated ADC maps. Phantom and in vivo breast studies were evaluated for two ([Formula: see text]) and four ([Formula: see text]) [Formula: see text]-value diffusion metrics. Concordance of the majority of implementations was excellent for both phantom ADC measures and in vivo [Formula: see text], with relative biases [Formula: see text] ([Formula: see text]) and [Formula: see text] (phantom [Formula: see text]) but with higher deviations in ADC at the lowest phantom ADC values. In vivo [Formula: see text] concordance was good, with typical biases of [Formula: see text] to 3% but higher for online maps. Multiple b-value ADC implementations were separated into two groups determined by the fitting algorithm. Intergroup mean ADC differences ranged from negligible for phantom data to 2.8% for [Formula: see text] in vivo data. Some higher deviations were found for individual implementations and online parametric maps. Despite generally good concordance, implementation biases in ADC measures are sometimes significant and may be large enough to be of concern in multisite studies.

  19. Some observations on precipitation measurement on forested experimental watersheds

    Treesearch

    Raymond E. Leonard; Kenneth G. Reinhart

    1963-01-01

    Measurement of precipitation on forested experimental watersheds presents difficulties other than those associated with access to and from the gages in all kinds of weather. For instance, the tree canopy must be cleared above the gage. The accepted practice of keeping an unobstructed sky view of 45° around the gage involves considerable tree cutting. On a level...

  20. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectral lines follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for spectral line intensity changes caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, a support vector machine (SVM) was used for the regression algorithm. The experimental results showed that the method improves the stability and accuracy of quantitative analysis with LIBS; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. A data-fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, the background spectrum, etc., and provides a data-processing reference for real-time online quantitative LIBS analysis.
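
    The calibration step amounts to regressing concentration on the measured S/B ratio. In this sketch an ordinary least-squares line stands in for the SVM regression used in the paper, and every data point is invented:

```python
import numpy as np

# Hypothetical calibration set: known heavy-metal concentrations (mg/L)
# and the measured line-to-background intensity ratios (S/B).
conc = np.array([10.0, 20.0, 50.0, 100.0, 200.0])
sb   = np.array([0.8, 1.5, 3.6, 7.1, 14.0])

slope, intercept = np.polyfit(sb, conc, 1)   # linear stand-in for the SVM fit

def predict_conc(sb_ratio):
    """Concentration predicted from a measured S/B ratio."""
    return slope * sb_ratio + intercept

rel_err = np.abs(predict_conc(sb) - conc) / conc * 100  # relative error, %
```

Normalizing by the background is what makes the calibration transferable across shot-to-shot changes in laser power, since both line and background intensities scale together.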

  1. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm-2 has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
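
    Dual-energy extraction of iodine projected thickness amounts to solving a 2×2 linear system in the log-attenuation signals at the two beam energies. A sketch with invented attenuation coefficients, not the paper's calibration:

```python
import numpy as np

# Hypothetical mass-attenuation coefficients (cm^2/mg) for iodine and
# tissue at the low- and high-energy beams; real values depend on the spectra.
MU = np.array([[2.0e-2, 2.0e-4],    # low energy:  [iodine, tissue]
               [9.0e-3, 1.8e-4]])   # high energy: [iodine, tissue]

def iodine_thickness(log_low: float, log_high: float) -> float:
    """Iodine projected thickness (mg/cm^2) from dual-energy log signals."""
    t_iodine, _t_tissue = np.linalg.solve(MU, [log_low, log_high])
    return float(t_iodine)

# Forward-simulate a phantom with 3 mg/cm^2 iodine over 4 g/cm^2 tissue,
# then recover the iodine thickness from the two log signals:
t = np.array([3.0, 4000.0])          # [iodine, tissue] in mg/cm^2
logs = MU @ t
recovered = iodine_thickness(*logs)  # ≈ 3.0 mg/cm^2
```

In practice the scatter correction discussed above must be applied first; uncorrected scatter biases both log signals and breaks the linearity of the recovered thickness.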

  2. Experimental research of digital holographic microscopic measuring

    NASA Astrophysics Data System (ADS)

    Zhu, Xueliang; Chen, Feifei; Li, Jicheng

    2013-06-01

    Digital holography is an imaging technique developed from optical holography, digital processing, and computer techniques. It uses a CCD instead of conventional silver-halide film to record the hologram, and then reconstructs the 3D contour of the object by computer simulation. Compared with traditional optical holography, the whole process offers simple measurement, lower cost, faster imaging, and non-contact real-time measurement. At present it can be used for morphology detection of tiny objects, micro-deformation analysis, and shape measurement of biological cells, and it is a research hotspot at home and abroad. This paper introduces the basic principles and relevant theory of optical and digital holography, and investigates the basic factors that influence the reconstructed image during the recording and reconstruction steps of digital holographic microscopy. To obtain a clear digital hologram, the recording distance of the hologram was determined by analyzing the structure of the optical system. On the basis of these theoretical studies, we established a measurement system, analyzed the experimental conditions, and adjusted the system accordingly. To achieve precise three-dimensional measurement of a tiny object, a MEMS micro-device was measured as an example; its reconstructed three-dimensional contour was obtained, realizing three-dimensional profile measurement of a tiny object. From the experimental results, the influence of the zero-order term and the twin images was analyzed with respect to the choice of object and reference beams, the recording and reconstruction distances, and the characteristics of the reconstruction light, and the measurement errors were analyzed. The results show that the device has a certain reliability.
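
    Numerical reconstruction at a chosen recording distance is typically done by propagating the recorded hologram field, for example with the angular-spectrum method. A minimal propagation sketch; the wavelength, pixel pitch, and distance are illustrative, not the paper's values:

```python
import numpy as np

def angular_spectrum(field: np.ndarray, wavelength: float,
                     dx: float, z: float) -> np.ndarray:
    """Propagate a complex field by distance z (angular-spectrum method),
    as used to refocus a digital hologram numerically."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)            # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    prop = arg > 0                          # propagating components only
    kz = 2 * np.pi * np.sqrt(np.where(prop, arg, 0.0))
    H = np.where(prop, np.exp(1j * kz * z), 0)  # evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Scanning z around the nominal recording distance and picking the sharpest reconstruction is the numerical analogue of focusing the optical system.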

  3. Measuring experimental cyclohexane-water distribution coefficients for the SAMPL5 challenge

    NASA Astrophysics Data System (ADS)

    Rustenburg, Ariën S.; Dancer, Justin; Lin, Baiwei; Feng, Jianwen A.; Ortwine, Daniel F.; Mobley, David L.; Chodera, John D.

    2016-11-01

    Small molecule distribution coefficients between immiscible nonaqueous and aqueous phases—such as cyclohexane and water—measure the degree to which small molecules prefer one phase over another at a given pH. As distribution coefficients capture both thermodynamic effects (the free energy of transfer between phases) and chemical effects (protonation state and tautomer effects in aqueous solution), they provide an exacting test of the thermodynamic and chemical accuracy of physical models without the long correlation times inherent to the prediction of more complex properties of relevance to drug discovery, such as protein-ligand binding affinities. For the SAMPL5 challenge, we carried out a blind prediction exercise in which participants were tasked with the prediction of distribution coefficients to assess its potential as a new route for the evaluation and systematic improvement of predictive physical models. These measurements are typically performed for octanol-water, but we opted to utilize cyclohexane for the nonpolar phase. Cyclohexane was suggested to avoid issues with the high water content and persistent heterogeneous structure of water-saturated octanol phases, since it has greatly reduced water content and a homogeneous liquid structure. Using a modified shake-flask LC-MS/MS protocol, we collected cyclohexane/water distribution coefficients for a set of 53 druglike compounds at pH 7.4. These measurements were used as the basis for the SAMPL5 Distribution Coefficient Challenge, where 18 research groups predicted these measurements before the experimental values reported here were released. In this work, we describe the experimental protocol we utilized for measurement of cyclohexane-water distribution coefficients, report the measured data, propose a new bootstrap-based data analysis procedure to incorporate multiple sources of experimental error, and provide insights to help guide future iterations of this valuable exercise in predictive modeling.
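
    The bootstrap-based analysis of replicate measurements can be sketched as resampling replicate logD values with replacement and reading a confidence interval off the distribution of resampled means. A minimal illustration with invented triplicates, not the SAMPL5 data:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_logd(replicates, n_boot: int = 10_000):
    """Mean and 95% bootstrap CI for replicate log10 distribution coefficients."""
    reps = np.asarray(replicates, dtype=float)
    means = np.array([rng.choice(reps, size=reps.size, replace=True).mean()
                      for _ in range(n_boot)])
    return float(reps.mean()), np.percentile(means, [2.5, 97.5])

# Hypothetical triplicate logD measurements for one compound:
logd_mean, logd_ci = bootstrap_logd([1.21, 1.35, 1.28])
```

The full procedure in the paper layers additional error sources into the resampling; the pattern above is the core resampling step only.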

  4. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  5. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for a ninth-grade introductory course, Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth-grade general science course with a process-oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…

  6. Experimental Demonstration of In-Place Calibration for Time Domain Microwave Imaging System

    NASA Astrophysics Data System (ADS)

    Kwon, S.; Son, S.; Lee, K.

    2018-04-01

    In this study, the experimental demonstration of in-place calibration was conducted using the developed time domain measurement system. Experiments were conducted using three calibration methods—in-place calibration and two existing calibrations, that is, array rotation and differential calibration. The in-place calibration uses dual receivers located at an equal distance from the transmitter. The received signals at the dual receivers contain similar unwanted signals, that is, the directly received signal and antenna coupling. In contrast to the simulations, the antennas are not perfectly matched and there might be unexpected environmental errors. Thus, we experimented with the developed experimental system to demonstrate the proposed method. The possible problems with low signal-to-noise ratio and clock jitter, which may exist in time domain systems, were rectified by averaging repeatedly measured signals. The tumor was successfully detected using the three calibration methods according to the experimental results. The cross correlation was calculated using the reconstructed image of the ideal differential calibration for a quantitative comparison between the existing rotation calibration and the proposed in-place calibration. The mean value of cross correlation between the in-place calibration and ideal differential calibration was 0.80, and the mean value of cross correlation of the rotation calibration was 0.55. Furthermore, the results of simulation were compared with the experimental results to verify the in-place calibration method. A quantitative analysis was also performed, and the experimental results show a tendency similar to the simulation.
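
    The quantitative comparison above relies on cross correlation between reconstructed images. A minimal zero-mean normalized cross-correlation sketch (image values invented):

```python
import numpy as np

def ncc(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Zero-mean normalized cross correlation between two images:
    1.0 for identical structure, 0.0 for uncorrelated, -1.0 for inverted."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

# Toy "reconstructions": a reference image and a noisier copy of it.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
noisy = ref + 0.1 * rng.standard_normal(ref.shape)
similarity = ncc(ref, noisy)
```

On this scale the reported means (0.80 for in-place calibration vs 0.55 for rotation calibration, both against the ideal differential reconstruction) directly quantify how much image structure each calibration preserves.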

  7. Quantitative measurement of adhesion of ink on plastic films with a Nano Indenter and a Scanning Probe Microscope

    NASA Astrophysics Data System (ADS)

    Shen, Weidian

    2005-03-01

    Plastic film packaging is widely used these days, especially in the convenience food industry, due to its flexibility, boilability, and microwavability. Almost every package is printed with ink. The adhesion of ink on plastic films merits increasing attention to ensure quality packaging. However, inks and plastic films are polymeric materials with complicated molecular structures. The thickness of the jelly-like ink is only 500 nm or less, and the thickness of the soft and flexible film is no more than 50 μm, which makes the quantitative measurement of their adhesion very challenging. Up to now, no scientific quantitative measurement method for the adhesion of ink on plastic films has been documented. We have tried a technique in which a Nano-Indenter and a Scanning Probe Microscope were used to evaluate the adhesion strength of ink deposited on plastic films quantitatively, as well as to examine the configurations of adhesion failure. It was helpful in better understanding the adhesion mechanism, thus giving direction as to how to improve the adhesion.

  8. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons however render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be  <1%,-26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
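
    The TEW scatter correction mentioned above estimates the scatter in the main photopeak window from two narrow flanking windows by a trapezoid rule and subtracts it. A sketch with invented counts; the window widths are typical choices for the 364 keV I-131 photopeak, not values from the paper:

```python
def tew_primary(c_main: float, c_low: float, c_up: float,
                w_main: float, w_low: float, w_up: float) -> float:
    """Triple-energy-window (TEW) correction: scatter under the main window
    is approximated by a trapezoid spanning the two flanking windows,
    then subtracted from the main-window counts."""
    scatter = (c_low / w_low + c_up / w_up) * w_main / 2.0
    return c_main - scatter

# Hypothetical counts: 20% main window (72.8 keV) with 6 keV flanking windows.
primary = tew_primary(c_main=10_000, c_low=400, c_up=250,
                      w_main=72.8, w_low=6.0, w_up=6.0)
```

Dividing each flanking count by its window width converts counts to a count density, which is what makes the trapezoid estimate of the scatter area under the main window dimensionally consistent.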

  9. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    PubMed

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative, and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). In our cell-level analogue of the GLCM, each nucleus in the tissue image corresponds to one pixel. In this approach the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. For our method, one pixel corresponds to one nucleus feature, and we therefore named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM was shown to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological images.
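
    The GLCM machinery this record builds on can be sketched compactly. CFLCM replaces pixel intensity levels with per-nucleus feature levels and a spatial neighborhood between nuclei, but the matrix construction and the Haralick features applied to it are the same. A minimal pixel-level sketch, with an illustrative 3x3 image and the right-neighbor offset:

```python
import numpy as np

def glcm(image, levels):
    """Co-occurrence probabilities for horizontally adjacent pixel pairs."""
    m = np.zeros((levels, levels))
    for row in image:
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1          # count the (left, right) level pair
    m /= m.sum()                  # normalise to joint probabilities
    return m

def contrast(m):
    """Haralick contrast: sum over i, j of (i - j)^2 * p(i, j)."""
    i, j = np.indices(m.shape)
    return float(((i - j) ** 2 * m).sum())

img = np.array([[0, 0, 1],
                [1, 2, 2],
                [2, 2, 0]])
p = glcm(img, levels=3)
c = contrast(p)
```

    In the CFLCM setting, the rows of `img` would be replaced by pairs of neighboring nuclei and the levels by a quantized nucleus feature (e.g. binned nucleus size).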

  10. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    PubMed

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) was in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability

  11. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative powers for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects, and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
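
    The scoring idea in this record, collapsing a drug's binary side effect profile into a single weighted sum, is straightforward to sketch. The profiles and weights below are invented for illustration; the paper draws weights randomly in its simulation experiments.

```python
def quantitative_score(profile, weights):
    """Weighted sum over the side effects a drug exhibits."""
    return sum(w for present, w in zip(profile, weights) if present)

# Four hypothetical side effects with made-up severity weights:
weights = [0.9, 0.3, 0.5, 0.7]

# Presence/absence profiles for two hypothetical drugs:
drug_a = [1, 0, 1, 0]
drug_b = [0, 1, 1, 1]

score_a = quantitative_score(drug_a, weights)  # 0.9 + 0.5
score_b = quantitative_score(drug_b, weights)  # 0.3 + 0.5 + 0.7
```

    A higher score marks a drug carrying a heavier weighted burden of side effects, which is the quantity the regression models in the paper are trained to predict.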

  12. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known to be less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  13. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but, in parallel, precise and quantitative measurement methods were developed, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite safeguard in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  14. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  15. Deformation behavior of coherently strained InAs/GaAs(111)A heteroepitaxial systems: Theoretical calculations and experimental measurements

    NASA Astrophysics Data System (ADS)

    Zepeda-Ruiz, Luis A.; Pelzel, Rodney I.; Nosho, Brett Z.; Weinberg, W. Henry; Maroudas, Dimitrios

    2001-09-01

    A comprehensive, quantitative analysis is presented of the deformation behavior of coherently strained InAs/GaAs(111)A heteroepitaxial systems. The analysis combines a hierarchical theoretical approach with experimental measurements. Continuum linear elasticity theory is linked with atomic-scale calculations of structural relaxation for detailed theoretical studies of deformation in systems consisting of InAs thin films on thin GaAs(111)A substrates that are mechanically unconstrained at their bases. Molecular-beam epitaxy is used to grow very thin InAs films on both thick and thin GaAs buffer layers on epi-ready GaAs(111)A substrates. The deformation state of these samples is characterized by x-ray diffraction (XRD). The interplanar distances of thin GaAs buffer layers along the [220] and [111] crystallographic directions obtained from the corresponding XRD spectra indicate clearly that thin buffer layers deform parallel to the InAs/GaAs(111)A interfacial plane, thus aiding in the accommodation of the strain induced by lattice mismatch. The experimental measurements are in excellent agreement with the calculated lattice interplanar distances and the corresponding strain fields in the thin mechanically unconstrained substrates considered in the theoretical analysis. Therefore, this work contributes direct evidence in support of our earlier proposal that thin buffer layers in layer-by-layer semiconductor heteroepitaxy exhibit mechanical behavior similar to that of compliant substrates [see, e.g., B. Z. Nosho, L. A. Zepeda-Ruiz, R. I. Pelzel, W. H. Weinberg, and D. Maroudas, Appl. Phys. Lett. 75, 829 (1999)].

  16. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  17. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.

  18. Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.

    PubMed

    Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George

    2016-04-01

    We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound based safety thresholds that agreed with the t½ for washout. A best fit model was then derived for each threshold level of t½ that would be clinically relevant at 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed. Sensitivity, specificity and area under the receiver operating characteristic curve were determined. Improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered and at 100% sensitivity the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding area under the receiver operating characteristic curve values were 0.98, 0.94 and 0.94, respectively. Improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant (p <0.05 in all cases). Quantitative imaging analysis of renal sonograms in children with hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity to decrease the number of diuretic renograms in up to 62% of children. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
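
    The receiver operating characteristic analysis used in this record reduces to a rank statistic: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch with synthetic scores and labels (not data from the study):

```python
def roc_auc(scores, labels):
    """Rank-based AUC: fraction of positive/negative pairs ranked correctly,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]  # hypothetical predicted washout risk
labels = [1,   1,   0,   1,   0,   0]    # 1 = prolonged t1/2 on renography
auc = roc_auc(scores, labels)
```

    Sweeping a cutoff over `scores` and recording sensitivity and specificity at each value traces out the ROC curve itself; the study reports exactly these quantities at its three clinically relevant thresholds.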

  19. Can't Count or Won't Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment.

    PubMed

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2016-06-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention, alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and more likely to appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through 'doing' quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a 'magic bullet' and that a wider programme of content and assessment diversification across the curriculum is preferable.

  20. Experimental Techniques for Thermodynamic Measurements of Ceramics

    NASA Technical Reports Server (NTRS)

    Jacobson, Nathan S.; Putnam, Robert L.; Navrotsky, Alexandra

    1999-01-01

    Experimental techniques for thermodynamic measurements on ceramic materials are reviewed. For total molar quantities, calorimetry is used. Total enthalpies are determined with combustion calorimetry or solution calorimetry. Heat capacities and entropies are determined with drop calorimetry, differential thermal methods, and adiabatic calorimetry. Three major techniques for determining partial molar quantities are discussed: gas equilibration techniques, Knudsen cell methods, and electrochemical techniques. Throughout this report, issues unique to ceramics are emphasized. Ceramic materials encompass a wide range of stabilities, and this must be considered. In general, data at high temperatures are required, and the need for inert container materials presents a particular challenge.

  1. Quantitative Analysis of the Efficiency of OLEDs.

    PubMed

    Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo

    2016-12-07

    We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.

  2. Quantum tomography for measuring experimentally the matrix elements of an arbitrary quantum operation.

    PubMed

    D'Ariano, G M; Lo Presti, P

    2001-05-07

    Quantum operations describe any state change allowed in quantum mechanics, including the evolution of an open system or the state change due to a measurement. We present a general method based on quantum tomography for measuring experimentally the matrix elements of an arbitrary quantum operation. As input the method needs only a single entangled state. The feasibility of the technique for the electromagnetic field is shown, and the experimental setup is illustrated based on homodyne tomography of a twin beam.

  3. Online and offline experimental techniques for polycyclic aromatic hydrocarbons recovery and measurement.

    PubMed

    Comandini, A; Malewicki, T; Brezinsky, K

    2012-03-01

    The implementation of techniques aimed at improving engine performance and reducing particulate matter (PM) pollutant emissions is strongly influenced by the limited understanding of the polycyclic aromatic hydrocarbons (PAH) formation chemistry, in combustion devices, that produces the PM emissions. New experimental results which examine the formation of multi-ring compounds are required. The present investigation focuses on two techniques for such an experimental examination by recovery of PAH compounds from a typical combustion oriented experimental apparatus. The online technique discussed constitutes an optimal solution but not always feasible approach. Nevertheless, a detailed description of a new online sampling system is provided which can serve as reference for future applications to different experimental set-ups. In comparison, an offline technique, which is sometimes more experimentally feasible but not necessarily optimal, has been studied in detail for the recovery of a variety of compounds with different properties, including naphthalene, biphenyl, and iodobenzene. The recovery results from both techniques were excellent with an error in the total carbon balance of around 10% for the online technique and an uncertainty in the measurement of the single species of around 7% for the offline technique. Although both techniques proved to be suitable for measurement of large PAH compounds, the online technique represents the optimal solution in view of the simplicity of the corresponding experimental procedure. On the other hand, the offline technique represents a valuable solution in those cases where the online technique cannot be implemented.

  4. A quantitative model and the experimental evaluation of the liquid fuel layer for the downward flame spread of XPS foam.

    PubMed

    Luo, Shengfeng; Xie, Qiyuan; Tang, Xinyi; Qiu, Rong; Yang, Yun

    2017-05-05

    The objective of this work is to investigate the distinctive mechanisms of downward flame spread for XPS foam. Physically, the process was considered as a narrow pool fire moving downward rather than the downward surface flame spread typical of normal solids. A method was developed to quantitatively analyze the accumulated liquid fuel based on experimental measurement of the locations of flame tips and the burning rates. The results surprisingly showed that about 80% of the generated hot liquid fuel remained in the pool fire during a certain period. Most of the consumed solid XPS foam did not really burn away but was transformed into liquid fuel in the downward moving pool fire, which may be an important promoter of the fast fire development. The results also indicated that the dripping propensity of the hot liquid fuel depends on the total amount of hot liquid accumulated in the pool fire. The leading point of the flame front curve may be the breach of the accumulated hot liquid fuel if it is enough for dripping. Finally, it is suggested that horizontal noncombustible barriers for preventing the accumulation and dripping of liquid fuel are helpful for the vertical confinement of XPS fire. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Experimental search for Exact Coherent Structures in turbulent small aspect ratio Taylor-Couette flow

    NASA Astrophysics Data System (ADS)

    Crowley, Christopher J.; Krygier, Michael; Grigoriev, Roman O.; Schatz, Michael F.

    2017-11-01

    Recent theoretical and experimental work suggests that the dynamics of turbulent flows are guided by unstable nonchaotic solutions to the Navier-Stokes equations. These solutions, known as exact coherent structures (ECS), play a key role in a fundamentally deterministic description of turbulence. In order to quantitatively demonstrate that actual turbulence in 3D flows is guided by ECS, high resolution, 3D-3C experimental measurements of the velocity need to be compared to solutions from direct numerical simulation of the Navier-Stokes equations. In this talk, we will present experimental measurements of fully time resolved, velocity measurements in a volume of turbulence in a counter-rotating, small aspect ratio Taylor-Couette flow. This work is supported by the Army Research Office (Contract # W911NF-16-1-0281).

  6. Improving Middle School Students’ Quantitative Literacy through Inquiry Lab and Group Investigation

    NASA Astrophysics Data System (ADS)

    Aisya, N. S. M.; Supriatno, B.; Saefudin; Anggraeni, S.

    2017-02-01

    The purpose of this study was to analyze the application of metacognitive strategies learning based on the Vee Diagram, through Inquiry Lab and Group Investigation, on students' quantitative literacy. This study compared two treatments on learning activity in middle school. The metacognitive strategies were applied to the content of environmental pollution at the 7th grade. This study used a quantitative approach with a quasi-experimental method. The research sample comprised 7th grade students: 27 students in the experimental group taught through Inquiry Lab and 27 students taught through Group Investigation. The instruments used in this research were pretest and posttest quantitative literacy skills, learning step observation sheets, and a questionnaire of teacher and student responses. The average N-gain between pretest and posttest increased in both experimental groups. The average posttest score was 61.11 for the Inquiry Lab class and 54.01 for the Group Investigation class. The average N-gain in quantitative literacy skill was 0.492 for the Inquiry Lab class and 0.426 for the Group Investigation class. Both experimental classes showed an average N-gain in the medium category. The data were analyzed statistically using SPSS ver. 23, and the results showed that although both learning models can develop quantitative literacy, there was no significant difference in the improvement of students' quantitative literacy between Inquiry Lab and Group Investigation for the environmental pollution material.

  7. Basic quantitative polymerase chain reaction using real-time fluorescence measurements.

    PubMed

    Ares, Manuel

    2014-10-01

    This protocol uses quantitative polymerase chain reaction (qPCR) to measure the number of DNA molecules containing a specific contiguous sequence in a sample of interest (e.g., genomic DNA or cDNA generated by reverse transcription). The sample is subjected to fluorescence-based PCR amplification and, theoretically, during each cycle, two new duplex DNA molecules are produced for each duplex DNA molecule present in the sample. The progress of the reaction during PCR is evaluated by measuring the fluorescence of dsDNA-dye complexes in real time. In the early cycles, DNA duplication is not detected because inadequate amounts of DNA are made. At a certain threshold cycle, DNA-dye complexes double each cycle for 8-10 cycles, until the DNA concentration becomes so high and the primer concentration so low that the reassociation of the product strands blocks efficient synthesis of new DNA and the reaction plateaus. There are two types of measurements: (1) the relative change of the target sequence compared to a reference sequence and (2) the determination of molecule number in the starting sample. The first requires a reference sequence, and the second requires a sample of the target sequence with known numbers of the molecules of sequence to generate a standard curve. By identifying the threshold cycle at which a sample first begins to accumulate DNA-dye complexes exponentially, an estimation of the numbers of starting molecules in the sample can be extrapolated. © 2014 Cold Spring Harbor Laboratory Press.
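
    The second measurement type described in this record, estimating starting molecule number from a standard curve, can be sketched directly: known dilutions give a line of threshold cycle (Ct) against log10 starting copies, and an unknown sample's Ct is inverted through that line. The Ct values below are idealized (perfect doubling, so the slope is -1/log10(2) ≈ -3.32), not experimental data.

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    xs = [math.log10(n) for n in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate starting copy number."""
    return 10 ** ((ct - intercept) / slope)

# Ten-fold dilution series of a known standard:
standards = [1e6, 1e5, 1e4, 1e3]           # starting copies per reaction
cts       = [15.0, 18.32, 21.64, 24.96]    # observed threshold cycles
m, b = fit_standard_curve(standards, cts)

n0 = copies_from_ct(20.0, m, b)  # estimated starting copies for an unknown
```

    The first measurement type (relative change against a reference sequence) uses the same threshold cycles in a ratio rather than through a fitted curve.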

  8. Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.

    PubMed

    Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo

    2016-01-01

    The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous; the findings sometimes contradictory and poorly documented. This survey aims at collecting and presenting objective measurement methods and results from a variety of different studies in different fields, to contribute to build a unified model and taxonomy of laughter. This could be successfully used for advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry.

  9. A literature review of quantitative indicators to measure the quality of labor and delivery care.

    PubMed

    Tripathi, Vandana

    2016-02-01

    Strengthening measurement of the quality of labor and delivery (L&D) care in low-resource countries requires an understanding of existing approaches. To identify quantitative indicators of L&D care quality and assess gaps in indicators. PubMed, CINAHL Plus, and Embase databases were searched for research published in English between January 1, 1990, and October 31, 2013, using structured terms. Studies describing indicators for L&D care quality assessment were included. Those whose abstracts contained inclusion criteria underwent full-text review. Study characteristics, including indicator selection and data sources, were extracted via a standard spreadsheet. The structured search identified 1224 studies. After abstract and full-text review, 477 were included in the analysis. Most studies selected indicators by using literature review, clinical guidelines, or expert panels. Few indicators were empirically validated; most studies relied on medical record review to measure indicators. Many quantitative indicators have been used to measure L&D care quality, but few have been validated beyond expert opinion. There has been limited use of clinical observation in quality assessment of care processes. The findings suggest the need for validated, efficient consensus indicators of the quality of L&D care processes, particularly in low-resource countries. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure.

    PubMed

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E; Constantino, John N; Povinelli, Daniel J; Pruett, John R

    2011-05-01

    Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimpanzee SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n = 29) with the Chimpanzee SRS and typically developing children and children with autism spectrum disorder (ASD; n = 20) with the XSRS. The Chimpanzee SRS demonstrated strong interrater reliability at the three sites (ranges for individual ICCs: 0.534 to 0.866; mean ICCs: 0.851 to 0.970). As has been observed in human beings, exploratory principal components analysis of Chimpanzee SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r = 0.976, p = .001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) human beings and chimpanzees. Copyright © 2011 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  11. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Severe process margins in advanced technology nodes of semiconductor devices are controlled by e-beam metrology and e-beam inspection systems using scanning electron microscopy (SEM) images. With SEM, larger-area images with higher image quality are required to collect massive amounts of data for metrology and to detect defects over a large area for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns by SEM is both critical and troublesome, especially for large images: the charging effect of e-beam irradiation on photoresist patterns degrades image quality, affects CD variation in metrology, and makes it difficult to continue defect inspection over a large area for a long time. In this study, we established a quantitative approach for optimizing the e-beam condition with the "Die to Database" algorithm of NGR3500 on photoresist patterns to minimize the charging effect, and we enhanced the performance of measurement and inspection on photoresist patterns by using the optimized e-beam condition. NGR3500 is a geometry verification system based on a "Die to Database" algorithm which compares SEM images with design data [1]. By comparing SEM images and design data, key performance indicators (KPIs) of the SEM image such as "Sharpness", "S/N", "Gray level variation in FOV", and "Image shift" can be retrieved. These KPIs were analyzed under different e-beam conditions, which consist of "Landing Energy", "Probe Current", "Scanning Speed" and "Scanning Method", and the best e-beam condition could be selected for maximum image quality, maximum scanning speed and minimum image shift. Through this quantitative approach to optimizing the e-beam condition, we could observe the dependency of photoresist charging on SEM conditions. By using the optimized e-beam condition, measurement could be continued stably on photoresist patterns over 24 hours. KPIs of SEM image proved image quality during measurement and
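
    The KPI-based tuning described in the abstract amounts to scoring each candidate e-beam condition and keeping the best one. A minimal sketch of that selection step, where the KPI names follow the abstract but the scoring weights, condition labels, and measured values are purely illustrative assumptions:

```python
def score(kpis):
    """Combine KPIs into one figure of merit: reward sharpness, S/N, and
    scanning speed; penalize image shift (weights are assumptions)."""
    return (kpis["sharpness"] + kpis["snr"] + kpis["speed"]
            - 2.0 * kpis["image_shift"])

def best_condition(measured):
    """Pick the e-beam condition with the highest combined score."""
    return max(measured, key=lambda cond: score(measured[cond]))

# Hypothetical (landing energy, probe current) conditions and KPI readings
measured = {
    ("500eV", "8pA"):  {"sharpness": 0.7, "snr": 0.6, "speed": 0.5, "image_shift": 0.30},
    ("800eV", "8pA"):  {"sharpness": 0.9, "snr": 0.8, "speed": 0.5, "image_shift": 0.05},
    ("800eV", "16pA"): {"sharpness": 0.8, "snr": 0.9, "speed": 0.7, "image_shift": 0.40},
}
print(best_condition(measured))  # -> ('800eV', '8pA')
```

    In practice the scan over "Landing Energy", "Probe Current", "Scanning Speed" and "Scanning Method" would populate the dictionary from real KPI measurements rather than fixed numbers.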

  12. Calculated and scale model experimentally measured scattering from metallic structures in Instrument Landing System

    DOT National Transportation Integrated Search

    1974-03-01

    Comparison is made of theoretically calculated and experimentally determined scattering from metallic tilted rectangles and vertical cylindrical scatterers. The scattering was experimentally measured in a scale model range at the Watertown Arsenal, W...

  13. Jet measurements in heavy ion physics

    NASA Astrophysics Data System (ADS)

    Connors, Megan; Nattrass, Christine; Reed, Rosi; Salur, Sevil

    2018-04-01

    A hot, dense medium called a quark gluon plasma (QGP) is created in ultrarelativistic heavy ion collisions. Early in the collision, hard parton scatterings generate high momentum partons that traverse the medium, which then fragment into sprays of particles called jets. Understanding how these partons interact with the QGP and fragment into final state particles provides critical insight into quantum chromodynamics. Experimental measurements from high momentum hadrons, two particle correlations, and full jet reconstruction at the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC) continue to improve our understanding of energy loss in the QGP. Run 2 at the LHC recently began and there is a jet detector at RHIC under development. Now is the perfect time to reflect on what the experimental measurements have taught us so far, the limitations of the techniques used for studying jets, how the techniques can be improved, and how to move forward with the wealth of experimental data such that a complete description of energy loss in the QGP can be achieved. Measurements of jets to date clearly indicate that hard partons lose energy. Detailed comparisons of the nuclear modification factor between data and model calculations led to quantitative constraints on the opacity of the medium to hard probes. However, while there is substantial evidence for softening and broadening jets through medium interactions, the difficulties comparing measurements to theoretical calculations limit further quantitative constraints on energy loss mechanisms. Since jets are algorithmic descriptions of the initial parton, the same jet definitions must be used, including the treatment of the underlying heavy ion background, when making data and theory comparisons. 
An agreement is called for between theorists and experimentalists on the appropriate treatment of the background and on Monte Carlo generators that enable experimental algorithms to be applied to theoretical calculations.
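
    The nuclear modification factor mentioned above has a standard definition: the per-event yield in heavy ion collisions divided by the binary-collision-scaled pp yield. A small sketch of that arithmetic, with illustrative numbers rather than experimental data:

```python
def nuclear_modification_factor(yield_aa, n_coll, yield_pp):
    """R_AA = (per-event yield in AA) / (<N_coll> x pp yield).
    R_AA < 1 at high pT signals medium-induced energy loss."""
    return yield_aa / (n_coll * yield_pp)

# Illustrative numbers: suppression by a factor of ~5
r_aa = nuclear_modification_factor(yield_aa=2.0e-6, n_coll=1000, yield_pp=1.0e-8)
print(round(r_aa, 2))  # -> 0.2
```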

  14. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  15. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  16. Quantitative measurement of stream respiration using the resazurin-resorufin system

    NASA Astrophysics Data System (ADS)

    Gonzalez Pinzon, R. A.; Acker, S.; Haggerty, R.; Myrold, D.

    2011-12-01

    After three decades of active research in hydrology and stream ecology, the relationship between stream solute transport, metabolism, and nutrient dynamics is still unresolved. These knowledge gaps obscure the function of stream ecosystems and how they interact with other landscape processes. To date, rates of stream metabolism are measured with techniques that carry large uncertainties and are not spatially representative. These limitations mask the role of metabolism in nutrient processing. Clearly, more robust techniques are needed to develop mechanistic relationships that will ultimately improve our fundamental understanding of in-stream processes and how streams interact with other ecosystems. We investigated the "metabolic window of detection" of the Resazurin (Raz)-Resorufin (Rru) system (Haggerty et al., 2008, 2009). Although previous results have shown that the transformation of Raz to Rru is strongly correlated with respiration, a quantitative relationship between them is needed. We investigated this relationship using batch experiments with pure cultures (aerobic and anaerobic) and flow-through columns with incubated sediments from four different streams. The results suggest that the Raz-Rru system is a suitable approach that will enable hydrologists and stream ecologists to measure in situ and in vivo respiration at different scales, thus providing a reliable alternative for investigating how solute transport and stream metabolism control nutrient processing.
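
    If the Raz-to-Rru transformation behaves as a first-order reaction, the fitted rate constant serves as the respiration proxy. A sketch of that kinetic model under assumed parameter values (not the study's data); the constant transformation yield is also an assumption:

```python
import math

def raz_remaining(raz0, k_resp, t):
    """First-order loss of resazurin: Raz(t) = Raz0 * exp(-k*t),
    where the fitted rate k proxies respiration."""
    return raz0 * math.exp(-k_resp * t)

def rru_produced(raz0, k_resp, t, yield_frac=0.85):
    """Resorufin produced, assuming a constant transformation yield."""
    return yield_frac * (raz0 - raz_remaining(raz0, k_resp, t))

# Illustrative: 100 ug/L Raz, k = 0.1 per hour, after 10 hours
print(round(raz_remaining(100.0, 0.1, 10.0), 1))  # -> 36.8
```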

  17. NanoDrop Microvolume Quantitation of Nucleic Acids

    PubMed Central

    Desjardins, Philippe; Conklin, Deborah

    2010-01-01

    Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of instruments that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited sample mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with as little as 2 μL of material, allowing the volume requirements of fluorescent assays to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of
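
    The direct A260 method rests on the Beer-Lambert law: a short-path reading is normalized to the conventional 10 mm path and multiplied by the standard conversion factor (50 ng/μL per absorbance unit for dsDNA, 40 for RNA, 33 for ssDNA). A minimal sketch of that arithmetic:

```python
# Standard A260 conversion factors (ng/uL per absorbance unit at 10 mm path)
FACTORS = {"dsDNA": 50.0, "RNA": 40.0, "ssDNA": 33.0}

def concentration_ng_per_ul(a260, nucleic_acid="dsDNA", path_mm=10.0):
    """Concentration from absorbance via the Beer-Lambert relation,
    normalizing a short microvolume path length to the 10 mm standard."""
    a260_10mm = a260 * (10.0 / path_mm)
    return a260_10mm * FACTORS[nucleic_acid]

# A 1 mm path reading of 0.1 corresponds to A260 = 1.0 at 10 mm
print(concentration_ng_per_ul(0.1, "dsDNA", path_mm=1.0))  # -> 50.0
```

    The shorter path length is what widens the measurable concentration range: a sample too concentrated for a 10 mm cuvette still gives an on-scale absorbance at 1 mm or 0.2 mm.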

  18. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.
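
    As a concrete instance of the quantitative models such reviews survey, logistic growth is among the simplest tumour-growth models: growth slows as the population approaches a carrying capacity. A short numerical sketch (parameter values are illustrative, not drawn from the paper):

```python
def logistic_growth(n0, r, k, dt, steps):
    """Forward-Euler integration of dN/dt = r*N*(1 - N/K):
    exponential growth at first, saturating at carrying capacity K."""
    n = n0
    traj = [n0]
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / k)
        traj.append(n)
    return traj

# 1e6 cells, growth rate 0.5/day, capacity 1e9 cells, 20 days
traj = logistic_growth(n0=1e6, r=0.5, k=1e9, dt=0.1, steps=200)
print(traj[-1] < 1e9)  # -> True: growth saturates below the capacity
```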

  19. Boron concentration measurements by alpha spectrometry and quantitative neutron autoradiography in cells and tissues treated with different boronated formulations and administration protocols.

    PubMed

    Bortolussi, Silva; Ciani, Laura; Postuma, Ian; Protti, Nicoletta; Reversi, Luca; Bruschi, Piero; Ferrari, Cinzia; Cansolino, Laura; Panza, Luigi; Ristori, Sandra; Altieri, Saverio

    2014-06-01

    The possibility of measuring boron concentration with high precision in tissues that will be irradiated represents a fundamental step for a safe and effective BNCT treatment. In Pavia, two techniques have been used for this purpose: a quantitative method based on charged particle spectrometry and boron biodistribution imaging based on neutron autoradiography. A quantitative method to determine boron concentration by neutron autoradiography has recently been set up and calibrated for the measurement of biological samples, both solid and liquid, in the frame of the feasibility study of BNCT. This technique was calibrated and the obtained results were cross-checked against those of α spectrometry in order to validate them. The comparisons were performed using tissues taken from animals treated with different boron administration protocols. Subsequently, quantitative neutron autoradiography was employed to measure osteosarcoma cell samples treated with BPA and with new boronated formulations. © 2013 Published by Elsevier Ltd.

  20. Development of quantitative radioactive methodologies on paper to determine important lateral-flow immunoassay parameters.

    PubMed

    Mosley, Garrett L; Nguyen, Phuong; Wu, Benjamin M; Kamei, Daniel T

    2016-08-07

    The lateral-flow immunoassay (LFA) is a well-established diagnostic technology that has recently seen significant advancements due in part to the rapidly expanding fields of paper diagnostics and paper-fluidics. As LFA-based diagnostics become more complex, it becomes increasingly important to quantitatively determine important parameters during the design and evaluation process. However, current experimental methods for determining these parameters have certain limitations when applied to LFA systems. In this work, we describe our novel methods of combining paper and radioactive measurements to determine nanoprobe molarity, the number of antibodies per nanoprobe, and the forward and reverse rate constants for nanoprobe binding to immobilized target on the LFA test line. Using a model LFA system that detects the presence of the protein transferrin (Tf), we demonstrate the application of our methods, which involve quantitative experimentation and mathematical modeling. We also compare the results of our rate constant experiments with traditional experiments to demonstrate how our methods more appropriately capture the influence of the LFA environment on the binding interaction. Our novel experimental approaches can therefore more efficiently guide the research process for LFA design, leading to more rapid advancement of the field of paper-based diagnostics.
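
    The forward and reverse rate constants estimated in such work govern a standard bimolecular binding model, dC/dt = kon·P·T − koff·C, with P the free nanoprobe, T the free immobilized target, and C the bound complex. A numerical sketch under assumed, illustrative rate constants (not the paper's values):

```python
def bound_probe(p0, t0, kon, koff, dt, steps):
    """Euler integration of dC/dt = kon*P*T - koff*C with mass
    conservation P = p0 - C, T = t0 - C (concentrations in nM)."""
    c = 0.0
    for _ in range(steps):
        p, t = p0 - c, t0 - c
        c += dt * (kon * p * t - koff * c)
    return c

# 1 nM nanoprobe, 5 nM test-line target, assumed kon/koff; run to equilibrium
c_eq = bound_probe(p0=1.0, t0=5.0, kon=0.1, koff=0.05, dt=0.01, steps=20000)
print(c_eq < 1.0)  # complex cannot exceed the limiting nanoprobe amount
```

    Fitting kon and koff to measured binding curves with a model like this is the step where radioactive counting on paper gives cleaner concentration data than optical readout.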

  1. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires implementation of quantitative measures for evaluation of the treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3, and 5 °C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for an update time of 5 seconds, both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3 °C, while temperature uncertainty of 5 °C leads to a noticeable reduction in spatial accuracy and increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainties of 0 °C and 1 °C, while temperature uncertainties of 3 °C and 5 °C led to reduced spatial accuracy, increased potential damage to the rectal wall, and
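
    The noise model used in these simulations, zero-mean Gaussian uncertainty added to each MR temperature reading, can be sketched directly (the true temperature and sample count here are illustrative):

```python
import random

def noisy_temperature(true_temp_c, sigma_c, rng):
    """One MR temperature reading with zero-mean Gaussian uncertainty,
    as in the study's noise levels of 0, 1, 3, and 5 C."""
    return true_temp_c + rng.gauss(0.0, sigma_c)

rng = random.Random(42)  # fixed seed for reproducibility
readings = [noisy_temperature(55.0, 3.0, rng) for _ in range(10000)]
mean = sum(readings) / len(readings)
print(abs(mean - 55.0) < 0.2)  # sample mean stays near the true value
```

    In a feedback-controlled treatment the controller sees only single noisy readings at each update, which is why larger sigma degrades targeting accuracy even though the noise averages out over many samples.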

  2. Quantitative measurement of channel-block hydraulic interactions by experimental saturation of a large, natural, fissured rock mass.

    PubMed

    Guglielmi, Y; Mudry, J

    2001-01-01

    The hydrodynamic behavior of fissured media relies on the relationships between a few very conductive fractures (channels) and the remaining low-conductivity fractures and matrix (blocks). We made a quantitative measurement of those relationships and their effect on water drainage and storage in a 19,000 m3 natural reservoir consisting of karstified limestones. This reservoir was artificially saturated with water by closing a water gate on the main outlet during a varying time (delta t) fixed by the operator. The water gate was completely or partly closed until the water pressure reached a particular specified value. If the water gate was left completely closed long enough, the water pressure was fixed by the elevation of temporary outlets at the site boundaries. The water elevation within the reservoir was monitored by means of pressure cells located on single fractures representative of the bedding plane and the two families of fractures of the massif network. The comparison of pressure variations with the network geometry allows us to identify a typical double permeability characterized by a few very conductive channels along 10 vertical faults. These channels limit blocks consisting of low-conductivity bedding planes and a rather impervious matrix. Depending on the closure interval, delta t, of the water gate, the total volume of water stored in the reservoir can vary from 0.8 m3 (delta t = 5 minutes) to 18.6 m3 (delta t = 2 days). This variation of storage with closure time is explained by the reservoir's double permeability, characterized by blocks that saturate much more slowly than channels. If plotted versus time, the injected volume fits a power relationship, according to the saturation state of the blocks. In less than 34 minutes, storage is close to zero in the blocks and to 1.6 to 2 m3 in the channels. For closing times shorter than 1 hour, only 1% of the volume that flows in the channels is stored in the blocks. Depending on the water
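
    The reported power relationship between injected volume and closure time can be recovered by ordinary least squares in log-log space, since V = a·t^b becomes linear after taking logarithms. A sketch with synthetic data (the exponent and prefactor are illustrative, not the paper's fitted values):

```python
import math

def fit_power_law(times, volumes):
    """Least-squares fit of V = a * t**b in log-log space:
    log V = log a + b * log t is a straight line."""
    xs = [math.log(t) for t in times]
    ys = [math.log(v) for v in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data generated from V = 0.5 * t**0.8 (t in minutes)
times = [5, 30, 60, 240, 2880]
volumes = [0.5 * t ** 0.8 for t in times]
a, b = fit_power_law(times, volumes)
print(round(a, 2), round(b, 2))  # -> 0.5 0.8
```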

  3. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.

  4. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    PubMed

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectiveness to diagnostic strategies for caries. Few studies, however, have focused on the evaluation of caries activity. To evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status in occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of a biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using FBM: DIAGNOdent pen (LFpen) and Quantitative light-induced fluorescence (QLF). As the reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to caries activity. The measures from the FBM were higher in cavitated lesions. Only ∆F values distinguished active and inactive lesions. The LFpen measures were higher in active lesions at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, which is similar to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.

    PubMed

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C

    2016-07-21

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
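
    Per-particle plasmonic heat generation is commonly written as q = σ_abs · I, the relation underlying this kind of experiment-versus-theory comparison. A unit-conversion sketch with illustrative numbers (the cross-section and intensity below are assumptions, not the paper's values):

```python
def heat_generation_uw(sigma_abs_nm2, intensity_w_per_cm2):
    """Per-particle photothermal heat q = sigma_abs * I, converting an
    absorption cross-section in nm^2 and an intensity in W/cm^2 to uW."""
    sigma_cm2 = sigma_abs_nm2 * 1e-14           # 1 nm^2 = 1e-14 cm^2
    return sigma_cm2 * intensity_w_per_cm2 * 1e6  # W -> microwatts

# ~3e4 nm^2 absorption cross-section under 10 W/cm^2 illumination
q = heat_generation_uw(3.0e4, 10.0)
print(round(q, 3))  # -> 0.003 (microwatts per particle)
```

    Polydispersity enters through σ_abs: averaging the cross-section over the measured size and shape distribution, rather than using the nominal geometry, is what reconciles theory with the nanorod measurements.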

  7. Experimental measurement of interparticle acoustic radiation force in the Rayleigh limit

    NASA Astrophysics Data System (ADS)

    Mohapatra, Abhishek Ray; Sepehrirahnama, Shahrokh; Lim, Kian-Meng

    2018-05-01

    Acoustophoresis is a form of contact-free particle manipulation in microfluidic devices. The precision of manipulation can be enhanced with better understanding of the acoustic radiation force. In this paper we present the measurements of interparticle radiation force between a pair of polystyrene beads in the Rayleigh limit. The study is conducted for three different sizes of beads and the experimental results are of the same order of magnitude when compared with theoretical predictions. However, the experimental values are larger than the theoretical values. The trend of a decrease in the magnitude of the interparticle radiation force with decreasing particle size and increasing center-to-center distance between the particles is also observed experimentally. The experiments are conducted in the specific scenario where the pair of beads are in close proximity, but not in contact with each other, and the beads are approaching the pressure nodal plane with the center-to-center line aligned perpendicular to the incident wave. This scenario minimizes the presence of the primary radiation force, allowing accurate measurement of the interparticle force. The attractive nature of the interparticle force is observed, consistent with theoretical predictions.

  8. Quantitative mapping of intracellular cations in the human amniotic membrane

    NASA Astrophysics Data System (ADS)

    Moretto, Ph.; Llabador, Y.; Simonoff, M.; Razafindrabe, L.; Bara, M.; Guiet-Bara, A.

    1993-05-01

    The effect of magnesium and taurine on the permeability of cell membranes to monovalent cations has been investigated using the Bordeaux nuclear microprobe. PIXE and RBS techniques have been used to provide quantitative measurements and ion distributions in the isolated amniotic membrane. This physiological model for cellular exchanges allowed us to reveal the distribution of most elements involved in cellular pathways and their modifications under different experimental conditions of incubation in physiological fluids. The PIXE microanalysis provided an original viewpoint on these mechanisms. Following this first study, the amnion compact lamina was found to play a role which had not previously been taken into account in the interpretation of electrophysiological experiments. The release of some ionic species, such as K+, from the epithelial cells during immersion in isotonic fluids may hitherto have been underestimated.

  9. Experimental Test of Entropic Noise-Disturbance Uncertainty Relations for Spin-1/2 Measurements.

    PubMed

    Sulyok, Georg; Sponar, Stephan; Demirel, Bülent; Buscemi, Francesco; Hall, Michael J W; Ozawa, Masanao; Hasegawa, Yuji

    2015-07-17

    Information-theoretic definitions for noise and disturbance in quantum measurements were given in [Phys. Rev. Lett. 112, 050401 (2014)] and a state-independent noise-disturbance uncertainty relation was obtained. Here, we derive a tight noise-disturbance uncertainty relation for complementary qubit observables and carry out an experimental test. Successive projective measurements on the neutron's spin-1/2 system, together with a correction procedure which reduces the disturbance, are performed. Our experimental results saturate the tight noise-disturbance uncertainty relation for qubits when an optimal correction procedure is applied.

  10. Beyond Math Skills: Measuring Quantitative Reasoning in Context

    ERIC Educational Resources Information Center

    Grawe, Nathan D.

    2011-01-01

    It might be argued that quantitative and qualitative analyses are merely two alternative reflections of an overarching critical thinking. For instance, just as instructors of numeracy warn their charges to consider the construction of variables, teachers of qualitative approaches caution students to define terms. Similarly, an advocate of…

  11. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  12. A Musical Approach to Reading Fluency: An Experimental Study in First-Grade Classrooms

    ERIC Educational Resources Information Center

    Leguizamon, Daniel F.

    2010-01-01

    The purpose of this quantitative, quasi-experimental study was to investigate the relationship between Kodaly-based music instruction and reading fluency in first-grade classrooms. Reading fluency and overall reading achievement were measured for 109 participants at mid-point in the academic year pre- and post treatment. Tests were carried out to…

  13. Capsular Outcomes After Pediatric Cataract Surgery Without Intraocular Lens Implantation: Qualitative Classification and Quantitative Measurement.

    PubMed

    Tan, Xuhua; Lin, Haotian; Lin, Zhuoling; Chen, Jingjing; Tang, Xiangchen; Luo, Lixia; Chen, Weirong; Liu, Yizhi

    2016-03-01

The objective of this study was to investigate capsular outcomes 12 months after pediatric cataract surgery without intraocular lens implantation via qualitative classification and quantitative measurement. This cross-sectional study was approved by the institutional review board of Zhongshan Ophthalmic Center of Sun Yat-sen University in Guangzhou, China. Digital coaxial retro-illumination photographs (DCRPs) of 329 aphakic pediatric eyes were obtained 12 months after pediatric cataract surgery without intraocular lens implantation. Three capsular regions were assessed on the DCRPs: the anterior capsule opening area (ACOA), the posterior capsule opening area (PCOA), and the posterior capsule opening opacity (PCOO). Capsular outcomes were qualitatively classified into 3 types based on the PCOO: Type I, capsule with mild opacification but no invasion into the capsule opening; Type II, capsule with moderate opacification accompanied by contraction of the ACOA and invasion into the occluding part of the PCOA; and Type III, capsule with severe opacification accompanied by total occlusion of the PCOA. Software was developed to quantitatively measure the ACOA, PCOA, and PCOO from standardized DCRPs. The relationships between the intraoperative anterior and posterior capsulorhexis sizes and the qualitative capsular types were statistically analyzed. The DCRPs of 315 aphakic eyes (95.8%) of 191 children were included. Capsular outcomes were classified as Type I in 120 eyes (38.1%), Type II in 157 eyes (49.8%), and Type III in 38 eyes (12.1%). The capsular outcome scores were negatively correlated with intraoperative anterior capsulorhexis size (R = -0.572, P < 0.001), but showed no significant correlation with intraoperative posterior capsulorhexis size (R = -0.16, P = 0.122). The ACOA significantly decreased from Type I to Type II to Type III, the PCOA increased in size from Type I to Type II, and the PCOO increased
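The Spearman rank correlation used above to relate ordinal outcome types to capsulorhexis size can be sketched in a few lines. The outcome scores and rhexis sizes below are hypothetical illustrations, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical data: ordinal capsular-outcome scores (1 = Type I, 2 = Type II,
# 3 = Type III) versus intraoperative anterior capsulorhexis size in mm.
scores = np.array([1, 1, 2, 2, 2, 3, 3, 1, 2, 3])
acr_size_mm = np.array([5.2, 5.0, 4.6, 4.4, 4.5, 4.0, 3.9, 5.1, 4.3, 4.1])

rho, p = spearmanr(scores, acr_size_mm)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # negative rho: larger rhexis, milder outcome
```

`spearmanr` handles the tied ordinal scores by averaging ranks, which is why it suits a 3-level classification better than a plain Pearson correlation.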

  14. Quantitative computed tomography measurements of emphysema for diagnosing asthma-chronic obstructive pulmonary disease overlap syndrome

    PubMed Central

    Xie, Mengshuang; Wang, Wei; Dou, Shuang; Cui, Liwei; Xiao, Wei

    2016-01-01

Background The diagnostic criteria of asthma-COPD overlap syndrome (ACOS) are controversial. Emphysema is characteristic of COPD and is usually absent in typical asthma patients; emphysema in a patient with asthma therefore suggests coexisting COPD. Quantitative computed tomography (CT) allows repeated, noninvasive evaluation of emphysema. We investigated the value of quantitative CT measurements of emphysema in the diagnosis of ACOS. Methods This study included 404 participants: 151 asthma patients, 125 COPD patients, and 128 normal control subjects. All participants underwent pulmonary function tests and a high-resolution CT scan. Emphysema measurements were taken with the Airway Inspector software. The asthma patients were divided into high and low emphysema index (EI) groups based on the percentage of low attenuation areas less than -950 Hounsfield units. The characteristics of asthma patients with high EI were compared with those having low EI or COPD. Results The normal value of the percentage of low attenuation areas less than -950 Hounsfield units in Chinese aged >40 years was 2.79%±2.37%. COPD patients showed more severe emphysema and a more upper-zone-predominant distribution of emphysema than asthma patients or controls. Thirty-two (21.2%) of the 151 asthma patients had high EI. Compared with asthma patients with low EI, those with high EI were significantly older, more likely to be male, had more pack-years of smoking, had a more upper-zone-predominant distribution of emphysema, and had greater airflow limitation. There were no significant differences in sex ratios, pack-years of smoking, airflow limitation, or emphysema distribution between asthma patients with high EI and COPD patients. A greater number of acute exacerbations were seen in asthma patients with high EI compared with those with low EI or COPD. Conclusion Asthma patients with high EI fulfill the features of ACOS, as described in the Global Initiative for Asthma and Global
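The emphysema index used above is the percentage of lung voxels with attenuation below -950 HU. A minimal sketch of that computation, with synthetic HU values and a hypothetical high/low EI cutoff (the study's own cutoff is not restated here):

```python
import numpy as np

# Emphysema index (EI): percent of lung voxels below -950 HU (%LAA-950).
def emphysema_index(lung_hu: np.ndarray, threshold_hu: float = -950.0) -> float:
    """Percentage of lung voxels with attenuation below threshold_hu."""
    return 100.0 * np.mean(lung_hu < threshold_hu)

rng = np.random.default_rng(0)
lung_hu = rng.normal(-860, 40, size=100_000)  # synthetic lung voxels, not patient data
ei = emphysema_index(lung_hu)
high_ei = ei > 7.5  # hypothetical cutoff, e.g. a normal mean + 2 SD
print(f"EI = {ei:.2f}% -> {'high' if high_ei else 'low'} EI")
```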

  15. Experimental ion mobility measurements in Xe-CO2

    NASA Astrophysics Data System (ADS)

    Cortez, A. F. V.; Santos, M. A. G.; Veenhof, R.; Patra, R. N.; Neves, P. N. B.; Santos, F. P.; Borges, F. I. G. M.; Conde, C. A. N.

    2017-06-01

Data on ion mobility are important for improving the performance of large-volume gaseous detectors. In the present work the method, experimental setup, and results of ion mobility measurements in Xe-CO2 mixtures are presented. The results show a single mobility peak for all Xe-CO2 ratios at low reduced electric fields, E/N, of 10-25 Td (2.4-6.1 kV·cm⁻¹·bar⁻¹), low pressures of 6-8 Torr (8-10.6 mbar), and room temperature.
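The two sets of reduced-field units quoted above can be cross-checked: E/N in townsend (1 Td = 10⁻²¹ V·m²) follows from the field per bar of gas and the number density N = p/(k_B·T). A small sketch, assuming room temperature of about 300 K:

```python
# Convert a field quoted in kV/cm per bar of gas into a reduced field in Td.
KB = 1.380649e-23  # Boltzmann constant, J/K

def reduced_field_td(e_kv_per_cm_bar: float, t_kelvin: float = 300.0) -> float:
    e_v_per_m = e_kv_per_cm_bar * 1e3 / 1e-2   # kV/cm -> V/m
    n_per_m3 = 1e5 / (KB * t_kelvin)           # number density at 1 bar
    return (e_v_per_m / n_per_m3) / 1e-21      # -> townsend

print(round(reduced_field_td(2.4), 1))  # close to the quoted 10 Td
print(round(reduced_field_td(6.1), 1))  # close to the quoted 25 Td
```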

  16. Hydrogen Isotope Measurements of Organic Acids and Alcohols by Pyrolysis-GC-MS-TC-IRMS: Application to Analysis of Experimentally Derived Hydrothermal Mineral-Catalyzed Organic Products

    NASA Technical Reports Server (NTRS)

    Socki, Richard A.; Fu, Qi; Niles, Paul B.; Gibson, Everett K., Jr.

    2012-01-01

We report results of experiments to measure the H isotope composition of organic acids and alcohols. These experiments make use of a pyroprobe interfaced with a GC and a high-temperature extraction furnace to make quantitative H isotope measurements. This work complements our previous work that focused on the extraction and analysis of C isotopes from the same compounds [1]. Together with our carbon isotope analyses, these experiments serve as a "proof of concept" for making C and H isotope measurements on more complex mixtures of organic compounds on mineral surfaces in abiotic hydrocarbon formation processes at elevated temperatures and pressures. Our motivation for undertaking this work stems from observations of methane detected within the Martian atmosphere [2-5], coupled with evidence showing extensive water-rock interaction during Mars history [6-8]. Methane production on Mars could be the result of synthesis by mineral-surface-catalyzed reduction of CO2 and/or CO by Fischer-Tropsch Type (FTT) reactions during serpentinization [9,10]. Others have conducted experimental studies showing that FTT reactions are plausible mechanisms for low-molecular-weight hydrocarbon formation in hydrothermal systems at mid-ocean ridges [11-13]. Our H isotope measurements utilize an analytical technique combining Pyrolysis-Gas Chromatography-Mass Spectrometry-High Temperature Conversion-Isotope Ratio Mass Spectrometry (Py-GC-MS-TC-IRMS). This technique is designed to carry a split of the pyrolyzed, GC-separated product to a Thermo DSQII quadrupole mass spectrometer as a means of making qualitative and semi-quantitative compositional measurements of the separated organic compounds, so that both chemical and isotopic measurements can be carried out simultaneously on the same sample.
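Hydrogen isotope-ratio results of this kind are conventionally reported in delta notation: δD is the per-mil deviation of the sample's D/H ratio from the VSMOW standard. A minimal sketch (the sample ratio below is illustrative):

```python
# Delta notation for hydrogen isotope ratios relative to VSMOW.
R_VSMOW = 155.76e-6  # D/H ratio of the VSMOW reference water

def delta_d_permil(r_sample: float, r_std: float = R_VSMOW) -> float:
    """deltaD in permil: per-mil deviation of sample D/H from the standard."""
    return (r_sample / r_std - 1.0) * 1000.0

print(f"{delta_d_permil(140.0e-6):.1f} permil")  # strongly depleted (negative deltaD)
```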

  17. Imaging and quantitative measurement of corrosion in painted automotive and aircraft structures

    NASA Astrophysics Data System (ADS)

    Sun, G.; Wang, Xun; Feng, Z. J.; Jin, Huijia; Sui, Hua; Ouyang, Zhong; Han, Xiaoyan; Favro, L. D.; Thomas, R. L.; Bomback, J. L.

    2000-05-01

    Some of the authors have shown that it is possible to image and make rapid, quantitative measurements of metal thickness loss due to corrosion on the rear surface of a single layer structure, with an accuracy better than one percent. These measurements are complicated by the presence of thick and/or uneven layers of paint on either the front surface, the back surface, or both. We will discuss progress in overcoming these complications. Examples from both automotive and aircraft structures will be presented.—This material is based in part upon work performed at the FAA Center for Aviation Systems Reliability operated at Iowa State University and supported by the Federal Aviation Administration Technical Center, Atlantic City, New Jersey, under Grant number 95-G-025, and is also supported in part by the Institute for Manufacturing Research, Wayne State University, and by Ford Motor Company. Supported by a Grant from Ford Motor Company.

  18. Experimental measurement of structural power flow on an aircraft fuselage

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1989-01-01

An experimental technique is used to measure the structural power flow through an aircraft fuselage with the excitation near the wing attachment location. Because of the large number of measurements required to analyze the whole of an aircraft fuselage, a balance must be achieved between the number of measurement transducers, the mounting of these transducers, and the accuracy of the measurements. Using four transducers mounted on a bakelite platform, the structural intensity vectors at locations distributed throughout the fuselage are measured. To minimize the errors associated with the four-transducer technique, the measurement positions are selected away from bulkheads and stiffeners. Because four separate transducers are used, each with its own drive and conditioning amplifiers, phase errors can be introduced that are much greater than the phase differences being measured. To minimize these phase errors, two sets of measurements are taken at each position with the orientation of the transducers rotated by 180 deg, and an average is taken between the two sets of measurements. Results are presented and discussed.

  19. The development of NEdSERV: quantitative instrumentation to measure service quality in nurse education.

    PubMed

    Roberts, P

    1999-07-01

The political climate of health care provision and education for health care in the latter years of the 20th century is evolving from the uncertainty of newly created markets to a more clearly focused culture of collaboration and dissemination of good practice, with an increased emphasis on quality provision and its measurement. The need for provider units to prove and improve efficiency and effectiveness through evidence-based quality strategies, in order to stay firmly in the marketplace, has never been greater. The measurement of customer expectations and perceptions of delivered service quality is widely utilized as a basis for customer retention and business growth in both commercial and non-profit organizations. This paper describes the methodological development of NEdSERV--quantitative instrumentation designed to measure and respond to ongoing stakeholder expectations and perceptions of delivered service quality within nurse education.
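Instruments of this family typically score service quality as the gap between perception and expectation ratings on each dimension (the SERVQUAL logic). A minimal sketch of such a gap analysis; the dimension names and Likert means below are hypothetical, not NEdSERV items:

```python
import numpy as np

# SERVQUAL-style gap analysis: perception minus expectation per dimension.
# Dimension names and 1-7 Likert-scale means are hypothetical.
dimensions = ["reliability", "responsiveness", "assurance", "empathy", "tangibles"]
expectations = np.array([6.5, 6.2, 6.0, 5.8, 5.5])
perceptions  = np.array([5.9, 6.0, 6.1, 5.2, 5.6])

gaps = perceptions - expectations  # negative gap = service falls short of expectation
for name, gap in zip(dimensions, gaps):
    print(f"{name:15s} gap = {gap:+.1f}")
print(f"overall gap = {gaps.mean():+.2f}")
```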

  20. Quantitative measurements of in-cylinder gas composition in a controlled auto-ignition combustion engine

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Zhang, S.

    2008-01-01

One of the most effective means of achieving controlled auto-ignition (CAI) combustion in a gasoline engine is the residual gas trapping method. The amount of residual gas and the mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered, and a method has been developed to correct for the values overestimated due to this effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes, showing that SRS can provide valuable quantitative data on in-cylinder gas composition in a CAI combustion engine.

  1. Can’t Count or Won’t Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment

    PubMed Central

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2015-01-01

This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention, alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and more likely to appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through ‘doing’ quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a ‘magic bullet’ and that a wider programme of content and assessment diversification across the curriculum is preferable. PMID:27330225

  2. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  3. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    PubMed

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
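Approach (ii) above can be sketched directly: the laboratory's bias and standard uncertainty are estimated from the differences between its proficiency-test results and the participant means, with an expanded uncertainty at coverage factor k = 2. The blood-alcohol values below are illustrative:

```python
import numpy as np

# Uncertainty from proficiency-test differences (illustrative g/dL values).
lab_results      = np.array([0.081, 0.102, 0.149, 0.197, 0.252])
participant_mean = np.array([0.080, 0.100, 0.150, 0.200, 0.250])

diffs = lab_results - participant_mean
bias = diffs.mean()            # systematic offset of the laboratory
u = diffs.std(ddof=1)          # standard uncertainty from the scatter of differences
U95 = 2.0 * u                  # expanded uncertainty, coverage factor k = 2
print(f"bias = {bias:+.4f}, u = {u:.4f}, U(k=2) = {U95:.4f}")
```

This requires a reasonably large number of proficiency rounds; with only a few rounds the interlaboratory precision itself is the more defensible uncertainty estimate, as the abstract notes for breath alcohol.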

  4. Spin-up flow of ferrofluids: Asymptotic theory and experimental measurements

    NASA Astrophysics Data System (ADS)

    Chaves, Arlex; Zahn, Markus; Rinaldi, Carlos

    2008-05-01

We treat the flow of ferrofluid in a cylindrical container subjected to a uniform rotating magnetic field, commonly referred to as spin-up flow. A review of theoretical and experimental results published since the phenomenon was first observed in 1967 shows that experimental data from surface observations of tracer particles are inadequate for the assessment of bulk flow theories. We present direct measurements of the bulk flow using the ultrasound velocity profile method, together with torque measurements, for water- and kerosene-based ferrofluids, showing the fluid corotating with the field in a rigid-body-like fashion throughout most of the bulk region of the container, except near the air-fluid interface, where it was observed to counter-rotate. We obtain an extension of the spin diffusion theory of Zaitsev and Shliomis, using the regular perturbation method. The solution is rigorously valid for α_K ≪ √3/2, where α_K is the Langevin parameter evaluated using the applied field magnitude, and provides a means for obtaining successively higher contributions of the nonlinearity of the equilibrium magnetization response and the spin-magnetization coupling in the magnetization relaxation equation. Because of limitations in the sensitivity of our apparatus, experiments were carried out under conditions for which α ≈ 1. Still, under such conditions the predictions of the analysis are in good qualitative agreement with the experimental observations. An estimate of the spin viscosity is obtained by comparing flow measurements with the extrapolated wall velocity from the regular perturbation method. The estimated value lies in the range of 10⁻⁸-10⁻¹² kg·m·s⁻¹ and is several orders of magnitude higher than that obtained from dimensional analysis of a suspension of noninteracting particles in a Newtonian fluid.

  5. High heat flux measurements and experimental calibrations/characterizations

    NASA Technical Reports Server (NTRS)

    Kidd, Carl T.

    1992-01-01

Recent progress in techniques employed in the measurement of very high heat-transfer rates in reentry-type facilities at the Arnold Engineering Development Center (AEDC) is described. These advances include thermal analyses applied to the transducer concepts used to make these measurements; improved heat-flux sensor fabrication methods; equipment and procedures for determining the experimental time response of individual sensors; performance of absolute heat-flux calibrations at levels above 2,000 Btu/ft²-sec (2.27 kW/cm²); and innovative methods of performing in-situ run-to-run characterizations of heat-flux probes installed in the test facility. Graphical illustrations of the results of extensive thermal analyses of the null-point calorimeter and coaxial surface thermocouple concepts, with application to measurements in aerothermal test environments, are presented. Results of time response experiments and absolute calibrations of null-point calorimeters and coaxial thermocouples performed in the laboratory at intermediate to high heat-flux levels are shown. Typical AEDC high-enthalpy arc heater heat-flux data recently obtained with a Calspan-fabricated null-point probe model are included.
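Data reduction for null-point calorimeters and coaxial surface thermocouples typically inverts a measured surface temperature history into heat flux, assuming one-dimensional conduction into a semi-infinite solid; the Cook-Felderman discretization is a common choice for this. A sketch under those assumptions (the material properties and signal are synthetic, and this is not presented as AEDC's actual reduction code):

```python
import numpy as np

# Cook-Felderman inversion: surface heat flux q(t) on a semi-infinite solid
# from its surface temperature history T(t).
def cook_felderman(t, T, rho_c_k):
    """q at each sample time; rho_c_k = rho * cp * k in SI units."""
    coef = 2.0 * np.sqrt(rho_c_k / np.pi)
    q = np.zeros_like(T)
    for n in range(1, len(t)):
        dT = T[1:n + 1] - T[:n]
        denom = np.sqrt(t[n] - t[1:n + 1]) + np.sqrt(t[n] - t[:n])
        q[n] = coef * np.sum(dT / denom)
    return q

# Consistency check: a constant flux q gives T(t) = 2 q sqrt(t / (pi rho c k)),
# so inverting that temperature trace should recover q.
rho_c_k = 4.0e6 * 3.0        # illustrative rho*cp ~ 4e6 J/(m^3 K), k ~ 3 W/(m K)
t = np.linspace(0.0, 0.1, 2001)
q_true = 1.0e6               # 1 MW/m^2
T = 2.0 * q_true * np.sqrt(t / (np.pi * rho_c_k))
q = cook_felderman(t, T, rho_c_k)
print(f"recovered q at t = 0.1 s: {q[-1]:.3e} W/m^2")
```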

  6. Neutron field measurement at the Experimental Advanced Superconducting Tokamak using a Bonner sphere spectrometer

    NASA Astrophysics Data System (ADS)

    Hu, Zhimeng; Zhong, Guoqiang; Ge, Lijian; Du, Tengfei; Peng, Xingyu; Chen, Zhongjing; Xie, Xufei; Yuan, Xi; Zhang, Yimo; Sun, Jiaqi; Fan, Tieshuan; Zhou, Ruijie; Xiao, Min; Li, Kai; Hu, Liqun; Chen, Jun; Zhang, Hui; Gorini, Giuseppe; Nocente, Massimo; Tardocchi, Marco; Li, Xiangqing; Chen, Jinxiang; Zhang, Guohui

    2018-07-01

Neutron field measurements were performed in the Experimental Advanced Superconducting Tokamak (EAST) experimental hall using a Bonner sphere spectrometer (BSS) based on a 3He thermal neutron counter. The measured spectra, and the corresponding integrated neutron fluence and dose values deduced from them at two exposed positions, were compared to calculations obtained with the general Monte Carlo code MCNP5, and good agreement was found. The applicability of a homemade dose survey meter installed at EAST was also verified by comparing the ambient dose equivalent H*(10) values measured by the meter and by the BSS.
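Deducing a dose value from an unfolded Bonner-sphere spectrum amounts to folding the per-bin fluence with fluence-to-ambient-dose-equivalent conversion coefficients h(E) (ICRP 74-style). A sketch with a coarse three-bin spectrum; all numbers are illustrative, not EAST data:

```python
import numpy as np

# Fold an unfolded neutron spectrum with conversion coefficients to get H*(10).
# Fluences and h(E) values are illustrative order-of-magnitude numbers.
fluence = np.array([2.0e4, 5.0e4, 1.0e4])   # neutrons/cm^2 per energy bin
h_coeff = np.array([10.0, 100.0, 400.0])    # pSv*cm^2: thermal/epithermal/fast

h10_pSv = np.sum(fluence * h_coeff)
print(f"H*(10) = {h10_pSv / 1e6:.2f} uSv")
```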

  7. Point-of-Care Quantitative Measure of Glucose-6-Phosphate Dehydrogenase Enzyme Deficiency.

    PubMed

    Bhutani, Vinod K; Kaplan, Michael; Glader, Bertil; Cotten, Michael; Kleinert, Jairus; Pamula, Vamsee

    2015-11-01

Widespread newborn screening on a point-of-care basis could prevent bilirubin neurotoxicity in newborns with glucose-6-phosphate dehydrogenase (G6PD) deficiency. We evaluated a quantitative G6PD assay on a digital microfluidic platform by comparing its performance with standard clinical methods. G6PD activity was measured quantitatively using digital microfluidic fluorescence and the gold-standard fluorescence biochemical test on a convenience sample of 98 discarded blood samples, 24 of which were designated as G6PD deficient. Mean ± SD G6PD activity for normal samples was 9.7 ± 2.8 U/g hemoglobin (Hb) by the digital microfluidic method and 11.1 ± 3.0 U/g Hb by the standard method; for G6PD-deficient samples, it was 0.8 ± 0.7 and 1.4 ± 0.9 U/g Hb, respectively. Bland-Altman analysis determined a mean difference of -0.96 ± 1.8 U/g Hb between the digital microfluidic fluorescence results and the standard biochemical test results. The lower and upper limits for the digital microfluidic platform were 4.5 to 19.5 U/g Hb for normal samples and 0.2 to 3.7 U/g Hb for G6PD-deficient samples; for the standard (Stanford) method they were 5.5 to 20.7 U/g Hb for normal samples and 0.1 to 2.8 U/g Hb for G6PD-deficient samples. The measured activity discriminated between G6PD-deficient samples and normal samples with no overlap. Pending further validation, a digital microfluidics platform could be an accurate point-of-care screening tool for rapid newborn G6PD screening. Copyright © 2015 by the American Academy of Pediatrics.
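The Bland-Altman comparison reported above (mean difference and scatter between two methods) can be sketched as follows; the paired G6PD activities in U/g Hb are illustrative, not the study's data:

```python
import numpy as np

# Bland-Altman agreement analysis: bias and 95% limits of agreement (LoA)
# between two measurement methods. Paired values are illustrative.
microfluidic = np.array([9.1, 10.5, 8.0, 12.2, 0.9, 1.2, 11.0, 0.5])
standard     = np.array([10.2, 11.6, 9.3, 12.9, 1.5, 2.1, 12.4, 1.0])

diff = microfluidic - standard
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} U/g Hb, 95% LoA = [{loa_low:.2f}, {loa_high:.2f}]")
```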

  8. Direct and quantitative AFM measurements of the concentration and temperature dependence of the hydrophobic force law at nanoscopic contacts.

    PubMed

    Stock, Philipp; Utzig, Thomas; Valtiner, Markus

    2015-05-15

By virtue of its importance for the self-organization of biological matter, the hydrophobic force law and the range of hydrophobic interactions (HI) have been debated extensively over the last 40 years. Here, we directly measure and quantify the hydrophobic force-distance law over large temperature and concentration ranges. In particular, we study the HI between molecularly smooth hydrophobic self-assembled monolayers and similarly modified gold-coated AFM tips (radii ≈ 8-50 nm). We present quantitative and direct evidence that the hydrophobic force is both long-ranged and exponential down to distances of about 1-2 nm. We therefore introduce a self-consistent radius normalization for atomic force microscopy data. This approach allows quantitative fitting of AFM-based experimental data to the recently proposed Hydra model. With a goodness of fit of r² ≥ 0.96, our fits and data directly reveal an exponential HI decay length of 7.2 ± 1.2 Å that is independent of the salt concentration up to 750 mM. As such, electrostatic screening does not have a significant influence on the HI at electrolyte concentrations ranging from 1 mM to 750 mM. In 1 M solutions the observed instability during approach shifts to longer distances, indicating ion correlation/adsorption effects at high salt concentrations. With increasing temperature the magnitude of the HI decreases monotonically, while its range increases slightly. We compare our results to the large body of available literature, and shed new light on the range and magnitude of hydrophobic interactions at very close distances and over wide temperature and concentration regimes. Copyright © 2015 Elsevier Inc. All rights reserved.
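Extracting an exponential decay length from radius-normalized force-distance data of the form F/R = A·exp(-D/λ) reduces to a log-linear least-squares fit. A sketch on synthetic data generated with λ = 7.2 Å to mimic the reported value (this is not the paper's fitting code or data):

```python
import numpy as np

# Log-linear fit of F/R = A * exp(-D / lambda) to synthetic force-distance data.
rng = np.random.default_rng(1)
d_ang = np.linspace(10.0, 60.0, 30)                        # separation, angstrom
f_over_r = 5.0 * np.exp(-d_ang / 7.2) * rng.normal(1.0, 0.02, d_ang.size)

slope, intercept = np.polyfit(d_ang, np.log(f_over_r), 1)  # ln(F/R) = ln A - D/lambda
decay_len = -1.0 / slope
print(f"fitted decay length = {decay_len:.2f} angstrom")
```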

  9. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)

  10. Classical Experiments Revisited: Smartphones and Tablet PCs as Experimental Tools in Acoustics and Optics

    ERIC Educational Resources Information Center

    Klein, P.; Hirth, M.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Smartphones and tablets are used as experimental tools and for quantitative measurements in two traditional laboratory experiments for undergraduate physics courses. The Doppler effect is analyzed and the speed of sound is determined with an accuracy of about 5% using ultrasonic frequency and two smartphones, which serve as rotating sound emitter…

  11. A rapid and quantitative assay for measuring antibody-mediated neutralization of West Nile virus infection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, Theodore C.; Sanchez, Melissa D.; Puffer, Bridget A.

    2006-03-01

West Nile virus (WNV) is a neurotropic flavivirus within the Japanese encephalitis antigenic complex that is responsible for causing West Nile encephalitis in humans. The surface of WNV virions is covered by a highly ordered icosahedral array of envelope proteins that is responsible for mediating attachment and fusion with target cells. These envelope proteins are also primary targets for the generation of neutralizing antibodies in vivo. In this study, we describe a novel approach for measuring antibody-mediated neutralization of WNV infection using virus-like particles that measure infection as a function of reporter gene expression. These reporter virus particles (RVPs) are produced by complementation of a sub-genomic replicon with WNV structural proteins provided in trans using conventional DNA expression vectors. The precision and accuracy of this approach stem from an ability to measure the outcome of the interaction between antibody and viral antigens under conditions that satisfy the assumptions of the law of mass action as applied to virus neutralization. In addition to its quantitative strengths, this approach allows the production of WNV RVPs bearing the prM-E proteins of different WNV strains and mutants, offering considerable flexibility for the study of the humoral immune response to WNV in vitro. WNV RVPs are capable of only a single round of infection, can be used under BSL-2 conditions, and offer a rapid and quantitative approach for detecting virus entry and its inhibition by neutralizing antibody.

  12. Characterization and Comparison of Galactomannan Enzyme Immunoassay and Quantitative Real-Time PCR Assay for Detection of Aspergillus fumigatus in Bronchoalveolar Lavage Fluid from Experimental Invasive Pulmonary Aspergillosis

    PubMed Central

    Francesconi, Andrea; Kasai, Miki; Petraitiene, Ruta; Petraitis, Vidmantas; Kelaher, Amy M.; Schaufele, Robert; Hope, William W.; Shea, Yvonne R.; Bacher, John; Walsh, Thomas J.

    2006-01-01

Bronchoalveolar lavage (BAL) is widely used for evaluation of patients with suspected invasive pulmonary aspergillosis (IPA). However, the diagnostic yield of BAL for detection of IPA by culture and direct examination is limited. Earlier diagnosis may be facilitated by assays that can detect Aspergillus galactomannan antigen or DNA in BAL fluid. We therefore characterized and compared the diagnostic yields of a galactomannan enzyme immunoassay (GM EIA), quantitative real-time PCR (qPCR), and quantitative cultures in experiments using BAL fluid from neutropenic rabbits with experimentally induced IPA, defined as microbiologically and histologically evident invasion. The qPCR assay targeted the rRNA gene complex of Aspergillus fumigatus. The GM EIA and qPCR assay were characterized by receiver operating characteristic (ROC) curve analysis. With an optimal cutoff of 0.75, the GM EIA had a sensitivity and specificity of 100% in untreated controls. A decline in sensitivity (92%) was observed when antifungal therapy (AFT) was administered. The optimal cutoff for qPCR was a crossover of 36 cycles, with sensitivity and specificity of 80% and 100%, respectively. The sensitivity of qPCR also decreased with AFT, to 50%. Quantitative culture of BAL had a sensitivity of 46% and a specificity of 100%; its sensitivity decreased with AFT to 16%. The GM EIA and qPCR assay had greater sensitivity than culture in detection of A. fumigatus in BAL fluid in experimentally induced IPA (P ≤ 0.04). Use of the GM EIA and qPCR assay in conjunction with culture-based diagnostic methods applied to BAL fluid could facilitate accurate diagnosis and more-timely initiation of specific therapy. PMID:16825367
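Evaluating a qPCR crossing-cycle cutoff of the kind described above (a sample is called positive when its Ct is at or below the cutoff) reduces to counting true and false calls against known infection status. A sketch with illustrative Ct values, not the study's data:

```python
import numpy as np

# Sensitivity/specificity of a qPCR cutoff: Ct <= 36 calls a sample positive.
# Ct values and infection statuses are illustrative.
ct       = np.array([28.0, 33.5, 35.9, 37.2, 40.0, 39.1, 31.2, 38.8, 36.5, 34.0])
infected = np.array([1,    1,    1,    1,    1,    0,    1,    0,    0,    1], dtype=bool)

positive = ct <= 36.0
sensitivity = np.mean(positive[infected])     # fraction of infected called positive
specificity = np.mean(~positive[~infected])   # fraction of uninfected called negative
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```

Sweeping the cutoff over all observed Ct values and recording these two rates at each step is exactly what the ROC analysis in the study does.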

  13. Imaging samples in silica aerogel using an experimental point spread function.

    PubMed

    White, Amanda J; Ebel, Denton S

    2015-02-01

    Light microscopy is a powerful tool that allows for many types of samples to be examined in a rapid, easy, and nondestructive manner. Subsequent image analysis, however, is compromised by distortion of signal by instrument optics. Deconvolution of images prior to analysis allows for the recovery of lost information by procedures that utilize either a theoretically or experimentally calculated point spread function (PSF). Using a laser scanning confocal microscope (LSCM), we have imaged whole impact tracks of comet particles captured in silica aerogel, a low density, porous SiO2 solid, by the NASA Stardust mission. In order to understand the dynamical interactions between the particles and the aerogel, precise grain location and track volume measurement are required. We report a method for measuring an experimental PSF suitable for three-dimensional deconvolution of imaged particles in aerogel. Using fluorescent beads manufactured into Stardust flight-grade aerogel, we have applied a deconvolution technique standard in the biological sciences to confocal images of whole Stardust tracks. The incorporation of an experimentally measured PSF allows for better quantitative measurements of the size and location of single grains in aerogel and more accurate measurements of track morphology.
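The core of PSF-based deconvolution is division by the PSF's transfer function in the frequency domain, regularized so that noise-dominated frequencies are not amplified. The study uses an iterative algorithm with an experimentally measured 3-D PSF; below is a simpler frequency-domain Wiener-style sketch in 2-D, with a synthetic Gaussian PSF standing in for a measured one:

```python
import numpy as np

# Wiener-style deconvolution with a known PSF (2-D for brevity; synthetic data).
def wiener_deconvolve(image, psf, eps=1e-3):
    """Divide by the PSF's transfer function, regularized by eps."""
    H = np.fft.fft2(np.fft.ifftshift(psf))     # PSF centered at the origin
    X = np.fft.fft2(image) * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(X))

# Build a centered Gaussian PSF and blur a single bright "grain".
n = 64
y, x = np.mgrid[:n, :n] - n // 2
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

truth = np.zeros((n, n))
truth[20, 41] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))

restored = wiener_deconvolve(blurred, psf)
peak = np.unravel_index(np.argmax(restored), restored.shape)
print(f"recovered grain position: {peak}")
```

With a measured PSF the same machinery applies, but the regularization (or an iterative scheme) has to be tuned to the noise actually present in the instrument.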

  14. Measuring the Unmeasurable: Upholding Rigor in Quantitative Studies of Personal and Social Development in Outdoor Adventure Education

    ERIC Educational Resources Information Center

    Scrutton, Roger; Beames, Simon

    2015-01-01

    Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…

  15. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  16. Qualitative and quantitative evaluation of avian demineralized bone matrix in heterotopic beds.

    PubMed

    Reza Sanaei, M; Abu, Jalila; Nazari, Mojgan; A B, Mohd Zuki; Allaudin, Zeenathul N

    2013-11-01

    Objective: To evaluate the osteogenic potential of avian demineralized bone matrix (DBM) in the context of implant geometry. Study design: Experimental. Animals: Rock pigeons (n = 24). Tubular and chipped forms of DBM were prepared by acid demineralization of long bones from healthy allogeneic donors and implanted bilaterally into the pectoral region of 24 pigeons. After euthanasia at 1, 4, 6, 8, 10, and 12 weeks, explants were evaluated histologically and compared by means of quantitative (bone area) and semi-quantitative (scores) measures. All explants had new bone at retrieval, with the exception of tubular implants at the end of week 1. The most reactive part of both implants was the interior region between the periosteal and endosteal surfaces, followed by the area at the implant-muscle interface. Quantitative measurements demonstrated a significantly (P = .012) greater percentage of new bone formation induced by tubular implants (80.28 ± 8.94) compared with chip implants (57.64 ± 3.12). There was minimal inflammation. Avian DBM initiates heterotopic bone formation in allogeneic recipients with low grades of immunogenicity. Implant geometry affects this phenomenon, as osteoconduction appeared to augment the magnitude of the effects in larger tubular implants. © Copyright 2013 by The American College of Veterinary Surgeons.

  17. Measuring the Internal Structure and Physical Conditions in Star and Planet Forming Cloud Cores: Towards a Quantitative Description of Cloud Evolution

    NASA Technical Reports Server (NTRS)

    Lada, Charles J.

    2004-01-01

    This grant funds a research program to use infrared extinction measurements to probe the detailed structure of dark molecular cloud cores and investigate the physical conditions which give rise to star and planet formation. The goals of this program are to acquire, reduce and analyze deep infrared and molecular-line observations of a carefully selected sample of nearby dark clouds in order to determine the detailed initial conditions for star formation from quantitative measurements of the internal structure of starless cloud cores and to quantitatively investigate the evolution of such structure through the star and planet formation process.

  18. Single-cell quantitative HER2 measurement identifies heterogeneity and distinct subgroups within traditionally defined HER2-positive patients.

    PubMed

    Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S

    2013-11-01

    Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  19. Measuring landscape esthetics: the scenic beauty estimation method

    Treesearch

    Terry C. Daniel; Ron S. Boster

    1976-01-01

    The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...

  20. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even at the component design stage, so as to virtually verify its inspectability.

  1. Electrosurgical vessel sealing tissue temperature: experimental measurement and finite element modeling.

    PubMed

    Chen, Roland K; Chastagner, Matthew W; Dodde, Robert E; Shih, Albert J

    2013-02-01

    The temporal and spatial tissue temperature profile in electrosurgical vessel sealing was experimentally measured and modeled using finite element modeling (FEM). Vessel sealing procedures are often performed near the neurovascular bundle and may cause collateral neural thermal damage. Therefore, the heat generated during electrosurgical vessel sealing is of concern among surgeons. Tissue temperature in an in vivo porcine femoral artery sealed using a bipolar electrosurgical device was studied. Three FEM techniques were incorporated to model tissue evaporation, water loss, and fusion by manipulating the specific heat, electrical conductivity, and electrical contact resistance, respectively. These three techniques enable the FEM to accurately predict the vessel sealing tissue temperature profile. The average discrepancy between the experimentally measured temperature and the FEM-predicted temperature at three thermistor locations is less than 7%. The maximum error is 23.9%. The effects of the three FEM techniques are also quantified.

  2. Design and analysis of quantitative differential proteomics investigations using LC-MS technology.

    PubMed

    Bukhman, Yury V; Dharsee, Moyez; Ewing, Rob; Chu, Peter; Topaloglou, Thodoros; Le Bihan, Thierry; Goh, Theo; Duewel, Henry; Stewart, Ian I; Wisniewski, Jacek R; Ng, Nancy F

    2008-02-01

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics is becoming an increasingly important tool in characterizing the abundance of proteins in biological samples of various types and across conditions. Effects of disease or drug treatments on protein abundance are of particular interest for the characterization of biological processes and the identification of biomarkers. Although state-of-the-art instrumentation is available to make high-quality measurements and commercial software is available to process the data, the complexity of the technology and data presents challenges for bioinformaticians and statisticians. Here, we describe a pipeline for the analysis of quantitative LC-MS data. Key components of this pipeline include experimental design (sample pooling, blocking, and randomization) as well as deconvolution and alignment of mass chromatograms to generate a matrix of molecular abundance profiles. An important challenge in LC-MS-based quantitation is to accurately identify and assign abundance measurements to members of protein families. To address this issue, we implement a novel statistical method for inferring the relative abundance of related members of protein families from tryptic peptide intensities. This pipeline has been used to analyze quantitative LC-MS data from multiple biomarker discovery projects. We illustrate our pipeline here with examples from two of these studies, and show that the pipeline constitutes a complete workable framework for LC-MS-based differential quantitation. Supplementary material is available at http://iec01.mie.utoronto.ca/~thodoros/Bukhman/.

  3. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
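
    A toy sketch of the mutual-information idea above (invented data; not the authors' code). A 1-D "model" maps a sequence feature x to activity a·x; because mutual information is invariant under monotone reparameterization, the scale a is a "diffeomorphic" direction, and MI between model output and measurements is flat in a:

```python
# Plug-in mutual-information estimate from a 2-D histogram, used to show
# that MI does not constrain a monotone rescaling of the model output.
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=2000)                        # "sequence features"
y = x + 0.5 * rng.normal(size=2000)              # noisy activity measurements

mi_1 = mutual_information(1.0 * x, y)            # model output at a = 1.0
mi_2 = mutual_information(2.7 * x, y)            # rescaled model output
```

    The two MI values agree (up to histogram edge effects), illustrating why likelihood-based inference, which is not invariant under such rescalings, behaves differently along diffeomorphic modes.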

  4. Experimental measurement of structural power flow on an aircraft fuselage

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1991-01-01

    An experimental technique is used to measure structural intensity through an aircraft fuselage with an excitation load applied near one of the wing attachment locations. The fuselage is a relatively large structure, requiring a large number of measurement locations to analyze the whole of the structure. For the measurement of structural intensity, multiple point measurements are necessary at every location of interest. A tradeoff is therefore required between the number of measurement transducers, the mounting of these transducers, and the accuracy of the measurements. Using four transducers mounted on a bakelite platform, structural intensity vectors are measured at locations distributed throughout the fuselage. To minimize the errors associated with using the four transducer technique, the measurement locations are selected to be away from bulkheads and stiffeners. Furthermore, to eliminate phase errors between the four transducer measurements, two sets of data are collected for each position, with the orientation of the platform with the four transducers rotated by 180 degrees and an average taken between the two sets of data. The results of these measurements together with a discussion of the suitability of the approach for measuring structural intensity on a real structure are presented.

  5. Quantitative detection of caffeine in human skin by confocal Raman spectroscopy--A systematic in vitro validation study.

    PubMed

    Franzen, Lutz; Anderski, Juliane; Windbergs, Maike

    2015-09-01

    For rational development and evaluation of dermal drug delivery, knowledge of the rate and extent of substance penetration into human skin is essential. However, current analytical procedures are destructive and labor-intensive, and lack a defined spatial resolution. In this context, confocal Raman microscopy bears the potential to overcome current limitations in drug depth profiling. Confocal Raman microscopy has already proved its suitability for the acquisition of qualitative penetration profiles, but a comprehensive investigation regarding its suitability for quantitative measurements inside human skin is still missing. In this work, we present a systematic validation study to deploy confocal Raman microscopy for quantitative drug depth profiling in human skin. After validating our Raman microscopic setup, we successfully established an experimental procedure that allows correlating the Raman signal of a model drug with its controlled concentration in human skin. To overcome current drawbacks in drug depth profiling, we evaluated different modes of peak correlation for quantitative Raman measurements and offer a suitable operating procedure for quantitative drug depth profiling in human skin. In conclusion, we successfully demonstrate the potential of confocal Raman microscopy for quantitative drug depth profiling in human skin as a valuable alternative to destructive state-of-the-art techniques. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Phosphorescent nanoparticles for quantitative measurements of oxygen profiles in vitro and in vivo

    PubMed Central

    Choi, Nak Won; Verbridge, Scott S.; Williams, Rebecca M.; Chen, Jin; Kim, Ju-Young; Schmehl, Russel; Farnum, Cornelia E.; Zipfel, Warren R.; Fischbach, Claudia; Stroock, Abraham D.

    2012-01-01

    We present the development and characterization of nanoparticles loaded with a custom phosphor; we exploit these nanoparticles to perform quantitative measurements of the concentration of oxygen within three-dimensional (3-D) tissue cultures in vitro and blood vessels in vivo. We synthesized a customized ruthenium (Ru)-phosphor and incorporated it into polymeric nanoparticles via self-assembly. We demonstrate that the encapsulated phosphor is non-toxic with and without illumination. We evaluated two distinct modes of employing the phosphorescent nanoparticles for the measurement of concentrations of oxygen: 1) in vitro, in a 3-D microfluidic tumor model via ratiometric measurements of intensity with an oxygen-insensitive fluorophore as a reference, and 2) in vivo, in mouse vasculature using measurements of phosphorescence lifetime. With both methods, we demonstrated micrometer-scale resolution and absolute calibration to the dissolved oxygen concentration. Based on the ease and customizability of the synthesis of the nanoparticles and the flexibility of their application, these oxygen-sensing polymeric nanoparticles will find a natural home in a range of biological applications, benefiting studies of physiological as well as pathological processes in which oxygen availability and concentration play a critical role. PMID:22240511
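
    The ratiometric read-out described above can be sketched with a Stern–Volmer law, I0/I = 1 + Ksv·[O2], where I is the phosphor-to-reference intensity ratio. The calibration constants and intensities below are illustrative assumptions, not the paper's values:

```python
# Invert a Stern-Volmer calibration to convert a measured intensity
# ratio into a dissolved-oxygen concentration. All numbers are
# hypothetical, for illustration only.

def oxygen_from_ratio(i_phos, i_ref, i0_ratio, ksv):
    """[O2] = (I0/I - 1) / Ksv, with I = phosphor/reference ratio."""
    ratio = i_phos / i_ref
    return (i0_ratio / ratio - 1.0) / ksv

# Hypothetical calibration: zero-oxygen ratio of 2.0 and Ksv = 0.02 per uM.
o2_um = oxygen_from_ratio(i_phos=400.0, i_ref=400.0, i0_ratio=2.0, ksv=0.02)
```

    Dividing by the oxygen-insensitive reference is what makes the read-out robust to variations in illumination and nanoparticle loading.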

  7. Model based inference from microvascular measurements: Combining experimental measurements and model predictions using a Bayesian probabilistic approach

    PubMed Central

    Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif

    2017-01-01

    Objective: In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods: We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices associated with uncertainty. Results: We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses, and remained robust when measurements were omitted from model calibration. Conclusion: Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
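
    A toy version of this Bayesian idea, shrunk to one unknown instead of hundreds (all numbers invented): infer an unknown inlet pressure p from one noisy flow measurement, where a linear "network model" predicts flow q = (p − p_out)/R, by combining a flat prior with the Gaussian measurement likelihood on a grid:

```python
# Grid-based Bayesian posterior for a single unknown boundary condition.
import numpy as np

R, p_out = 2.0, 10.0                    # hypothetical resistance, outlet pressure
q_meas, sigma = 5.0, 0.5                # measured flow and its noise s.d.

p_grid = np.linspace(10.0, 40.0, 601)   # candidate inlet pressures
q_pred = (p_grid - p_out) / R           # model prediction for each candidate
likelihood = np.exp(-0.5 * ((q_meas - q_pred) / sigma) ** 2)
posterior = likelihood * np.ones_like(p_grid)   # flat prior
posterior /= posterior.sum()

p_map = p_grid[np.argmax(posterior)]    # posterior mode
p_mean = float((p_grid * posterior).sum())
```

    The full analysis in the paper replaces this grid with high-dimensional posterior sampling, but the structure is the same: model predictions enter through the likelihood, and the posterior carries the uncertainty.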

  8. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  9. A Versatile Panel of Reference Gene Assays for the Measurement of Chicken mRNA by Quantitative PCR

    PubMed Central

    Maier, Helena J.; Van Borm, Steven; Young, John R.; Fife, Mark

    2016-01-01

    Quantitative real-time PCR assays are widely used for the quantification of mRNA within avian experimental samples. Multiple stably expressed reference genes, selected for the lowest variation in representative samples, can be used to control random technical variation. Reference gene assays must be reliable, have high amplification specificity and efficiency, and not produce signals from contaminating DNA. Whilst recent research papers identify specific genes that are stable in particular tissues and experimental treatments, here we describe a panel of ten avian gene primer and probe sets that can be used to identify suitable reference genes in many experimental contexts. The panel was tested with TaqMan and SYBR Green systems in two experimental scenarios: a tissue collection and virus infection of cultured fibroblasts. GeNorm and NormFinder algorithms were able to select appropriate reference gene sets in each case. We show the effects of using the selected genes on the detection of statistically significant differences in expression. The results are compared with those obtained using 28S ribosomal RNA, at present the most widely accepted reference gene in chicken work, identifying circumstances where its use might provide misleading results. Methods for eliminating DNA contamination of RNA reduced, but did not completely remove, detectable DNA. We therefore attached special importance to testing each qPCR assay for absence of signal with a DNA template. The assays and analyses developed here provide a useful resource for selecting reference genes for investigations of avian biology. PMID:27537060
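
    The geNorm-style stability ranking mentioned above can be sketched as follows: for each gene, the measure M is the mean, over all other genes, of the standard deviation (across samples) of the pairwise log2 expression ratio; lower M means more stable. The expression values below are invented for illustration:

```python
# Sketch of the geNorm stability measure M for ranking reference genes.
import numpy as np

def genorm_m(expr):
    """expr: genes x samples array of linear-scale expression values."""
    log_expr = np.log2(expr)
    n = expr.shape[0]
    m = np.empty(n)
    for j in range(n):
        sds = [np.std(log_expr[j] - log_expr[k]) for k in range(n) if k != j]
        m[j] = np.mean(sds)
    return m

# Three hypothetical genes over four samples; gene 2 varies the most.
expr = np.array([
    [100.0, 110.0, 105.0,  95.0],
    [200.0, 215.0, 210.0, 190.0],
    [ 50.0, 400.0,  20.0, 300.0],
])
m = genorm_m(expr)
```

    Genes whose expression moves in proportion across samples get low M and would be retained as references; the erratic third gene gets the highest M and would be excluded first.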

  10. Measuring temperature and field profiles in heat assisted magnetic recording

    NASA Astrophysics Data System (ADS)

    Hohlfeld, J.; Zheng, X.; Benakli, M.

    2015-08-01

    We introduce a theoretical and experimental framework that enables quantitative measurements of the temperature and magnetic field profiles governing the thermo-magnetic write process in heat assisted magnetic recording. Since our approach allows the identification of the correct temperature dependence of the magneto-crystalline anisotropy field in the vicinity of the Curie point as well, it provides an unprecedented experimental foundation to assess our understanding of heat assisted magnetic recording.

  11. Volume measurement of the leg with the depth camera for quantitative evaluation of edema

    NASA Astrophysics Data System (ADS)

    Kiyomitsu, Kaoru; Kakinuma, Akihiro; Takahashi, Hiroshi; Kamijo, Naohiro; Ogawa, Keiko; Tsumura, Norimichi

    2017-02-01

    Volume measurement of the leg is important in the evaluation of leg edema. Methods for such measurement using depth cameras have recently been proposed; however, many depth cameras are expensive. We therefore propose a method using the Microsoft Kinect. We obtain a point cloud of the leg with the Kinect Fusion technique and calculate the volume from it. We measured the leg volume of three healthy students over three days. In each measurement, an increase in volume from morning to evening was confirmed. Leg volume is known to increase over a day of office work, so our experimental results meet this expectation.
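
    One generic way to turn a limb-surface point cloud into a volume (a sketch for roughly cylindrical limbs, not the Kinect Fusion pipeline itself): slice along the leg axis, approximate each slice's cross-section as a disc whose radius is the mean distance of the slice's points from their centroid, and sum the slice volumes:

```python
# Slice-wise volume estimate from a surface point cloud, checked against
# a synthetic cylinder of known volume.
import numpy as np

def volume_from_points(points, n_slices=50):
    """points: (N, 3) surface samples in meters; returns volume in m^3."""
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_slices + 1)
    dz = edges[1] - edges[0]
    volume = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        sl = points[(z >= lo) & (z < hi)]
        if len(sl) < 3:
            continue
        center = sl[:, :2].mean(axis=0)
        r = np.linalg.norm(sl[:, :2] - center, axis=1).mean()
        volume += np.pi * r * r * dz
    return volume

# Synthetic check: a cylinder of radius 5 cm and height 40 cm.
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 20000)
zs = rng.uniform(0.0, 0.4, 20000)
cloud = np.column_stack([0.05 * np.cos(theta), 0.05 * np.sin(theta), zs])
v = volume_from_points(cloud)
```

    For the synthetic cylinder, the estimate agrees with π r² h to within a few percent; real legs are less circular, so a mesh-based volume from the fused surface would be more accurate.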

  12. Quantitative theory of driven nonlinear brain dynamics.

    PubMed

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  14. Recent gyrokinetic turbulence insights with GENE and direct comparison with experimental measurements

    NASA Astrophysics Data System (ADS)

    Goerler, Tobias

    2017-10-01

    Over the last few years, direct comparisons between gyrokinetic turbulence simulations and experimental measurements have intensified substantially. Such studies are largely motivated by the urgent need for reliable transport predictions for future burning plasma devices and the associated necessity of validating the numerical tools. On the other hand, they can be helpful for assessing the way a particular diagnostic experiences turbulence, providing ideas for further optimization, and accessing physics that may not yet be measurable. Here, synthetic diagnostics, i.e., models that mimic the spatial and sometimes temporal response of the experimental diagnostic, play an important role. In the contribution at hand, we focus on recent gyrokinetic GENE simulations dedicated to ASDEX Upgrade L-mode plasmas and comparisons with various turbulence measurements. Particular emphasis will be given to density fluctuation spectra, which are experimentally accessible via Doppler reflectometry. A sophisticated synthetic diagnostic involving a full-wave code has recently been established and resolves the long-standing question of different spectral roll-overs in gyrokinetic and measured spectra, as well as the potentially different power laws in the O- and X-mode signals. The demonstrated agreement furthermore extends the validation database deep into spectral space and confirms proper coverage of the turbulence cascade physics. The flux-matched GENE simulations are then used to study the sensitivity of the latter to the main microinstability drive and to investigate the energetics at the various scales. Additionally, modifications of the high-k power-law spectra by electron-scale turbulence in such plasmas will be presented, and their visibility in measurable signals discussed.

  15. Optimizing Nanoscale Quantitative Optical Imaging of Subfield Scattering Targets

    PubMed Central

    Henn, Mark-Alexander; Barnes, Bryan M.; Zhou, Hui; Sohn, Martin; Silver, Richard M.

    2016-01-01

    The full 3-D scattered field above finite sets of features has been shown to contain a continuum of spatial frequency information, and with novel optical microscopy techniques and electromagnetic modeling, deep-subwavelength geometrical parameters can be determined. Similarly, by using simulations, scattering geometries and experimental conditions can be established to tailor scattered fields that yield lower parametric uncertainties while decreasing the number of measurements and the area of such finite sets of features. Such optimized conditions are reported through quantitative optical imaging in 193 nm scatterfield microscopy using feature sets up to four times smaller in area than state-of-the-art critical dimension targets. PMID:27805660

  16. Force Exertion Capacity Measurements in Haptic Virtual Environments

    ERIC Educational Resources Information Center

    Munih, Marko; Bardorfer, Ales; Ceru, Bojan; Bajd, Tadej; Zupan, Anton

    2010-01-01

    An objective test for evaluating the functional status of the upper limbs (ULs) in patients with muscular dystrophy (MD) is presented. The method allows for quantitative assessment of the UL functional state with an emphasis on force exertion capacity. The experimental measurement setup and the methodology for the assessment of maximal exertable force…

  17. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    DTIC Science & Technology

    2017-08-09

    Journal article; dates covered: January 2015 – July 2017. Performing organization: USAF School of Aerospace Medicine, Aeromedical Research Dept/FHOH, 2510 Fifth St., Bldg

  18. An experimental approach to identify dynamical models of transcriptional regulation in living cells

    NASA Astrophysics Data System (ADS)

    Fiore, G.; Menolascina, F.; di Bernardo, M.; di Bernardo, D.

    2013-06-01

    We describe an innovative experimental approach, and a proof-of-principle investigation, for the application of System Identification techniques to derive quantitative dynamical models of transcriptional regulation in living cells. Specifically, we constructed an experimental platform for System Identification based on a microfluidic device, a time-lapse microscope, and a set of automated syringes, all controlled by a computer. The platform allows the delivery of a time-varying concentration of any molecule of interest to the cells trapped in the microfluidic device (input) and real-time monitoring of a fluorescent reporter protein (output) at a high sampling rate. We tested this platform on the GAL1 promoter in the yeast Saccharomyces cerevisiae driving expression of a green fluorescent protein (Gfp) fused to the GAL1 gene. We demonstrated that the System Identification platform enables accurate measurements of the input (sugar concentrations in the medium) and output (Gfp fluorescence intensity) signals, thus making it possible to apply System Identification techniques to obtain a quantitative dynamical model of the promoter. We explored and compared linear and nonlinear model structures in order to select the most appropriate for deriving a quantitative model of the promoter dynamics. Our platform can be used to quickly obtain quantitative models of eukaryotic promoters, currently a complex and time-consuming process.
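
    A minimal system-identification sketch in the spirit of the approach described above: fit a discrete-time first-order linear model y[k+1] = a·y[k] + b·u[k] to input/output records by least squares. The data are synthetic; the actual GAL1 study compared richer linear and nonlinear model structures.

```python
# Least-squares identification of a first-order linear input/output model.
import numpy as np

def fit_first_order(u, y):
    """Least-squares estimate of (a, b) in y[k+1] = a*y[k] + b*u[k]."""
    X = np.column_stack([y[:-1], u[:-1]])        # regressors at step k
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta                                 # [a, b]

# Simulate a known system driven by a square-wave "sugar" input.
a_true, b_true = 0.9, 0.5
u = np.tile([1.0] * 25 + [0.0] * 25, 4)
y = np.zeros(len(u))
for k in range(len(u) - 1):
    y[k + 1] = a_true * y[k] + b_true * u[k]

a_est, b_est = fit_first_order(u, y)
```

    With noiseless data the least-squares fit recovers (a, b) essentially exactly; with fluorescence noise, the same regression yields the best linear model in the least-squares sense, which is the baseline against which nonlinear structures are compared.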

  19. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; even so, the quality and depth of the more recent quantitative proteomics studies is beginning to

  20. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; even so, the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  1. Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances

    NASA Astrophysics Data System (ADS)

    Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng

    2016-04-01

    Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.

  2. Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances.

    PubMed

    Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng

    2016-04-22

    Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.

  3. Experimental Investigations on Two Potential Sound Diffuseness Measures in Enclosures

    NASA Astrophysics Data System (ADS)

    Bai, Xin

    This study investigates two approaches to measuring sound-field diffuseness in enclosures from monophonic room impulse responses. The first quantifies diffuseness by calculating the kurtosis of the pressure samples of a room impulse response; kurtosis is a statistical measure of the peakedness, or tailedness, of a data distribution, and high kurtosis indicates low diffuseness of the sound field of interest. The second relies on multifractal detrended fluctuation analysis, which evaluates the statistical self-affinity of a signal as a measure of diffuseness. To test the two approaches, room impulse responses are obtained under varied room-acoustic diffuseness configurations, achieved by using varied degrees of diffusely reflecting interior surfaces. This paper analyzes the experimentally measured monophonic room impulse responses and discusses results from both approaches.
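    The kurtosis-based indicator described above is easy to sketch. In this toy illustration (our own construction, not the paper's data or code), a "diffuse" impulse-response tail is modelled as decaying Gaussian noise, and a less diffuse one adds sparse strong specular reflections, which raises the kurtosis:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)
fs = 8000
t = np.arange(fs) / fs                           # 1 s impulse response

# Diffuse tail: Gaussian noise under an exponential decay envelope.
diffuse = rng.normal(size=fs) * np.exp(-3 * t)

# Less diffuse: the same tail plus sparse, strong specular reflections.
specular = diffuse.copy()
specular[::400] += 8 * np.exp(-3 * t[::400])

# Excess kurtosis: heavy-tailed (spiky) responses score higher,
# i.e. high kurtosis indicates low diffuseness.
k_diffuse = kurtosis(diffuse, fisher=True)
k_specular = kurtosis(specular, fisher=True)
print(k_diffuse, k_specular)
```

    Note that even the "diffuse" response has positive excess kurtosis here because of the decay envelope; what matters for the indicator is the relative increase caused by discrete reflections.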

  4. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract

    PubMed Central

    2017-01-01

    Chemical standardization, along with morphological and DNA analysis, ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization through its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513

  5. Particulate exhaust emissions from an experimental combustor. [gas turbine engine

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Ingebo, R. D.

    1975-01-01

    The concentration of dry particulates (carbon) in the exhaust of an experimental gas turbine combustor was measured at simulated takeoff operating conditions and correlated with the standard smoke-number measurement. Carbon was determined quantitatively from a sample collected on a fiberglass filter by converting the carbon in the smoke sample to carbon dioxide and then measuring the volume of carbon dioxide formed by gas chromatography. At a smoke number of 25 (the threshold of visibility of the smoke plume for large turbojets) the carbon concentration was 2.8 mg carbon/cu m exhaust gas, which is equivalent to an emission index of 0.17 g carbon/kg fuel.

  6. Contextual Fraction as a Measure of Contextuality.

    PubMed

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-04

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
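    The linear-programming computation mentioned in the abstract can be sketched concretely. In this toy construction (our own, following the general definition: the noncontextual fraction NCF(e) is the largest weight of a noncontextual sub-model dominated by the empirical model e, and CF(e) = 1 - NCF(e)), we evaluate a PR-box, a standard strongly contextual example for which CF = 1:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

vars_ = ["a0", "a1", "b0", "b1"]
contexts = [("a0", "b0"), ("a0", "b1"), ("a1", "b0"), ("a1", "b1")]

# PR-box probabilities: outcomes are perfectly correlated (XOR = 0) in every
# context except (a1, b1), where they are anticorrelated (XOR = 1).
def pr_box(ctx, o):
    target = 1 if ctx == ("a1", "b1") else 0
    return 0.5 if (o[0] ^ o[1]) == target else 0.0

# Deterministic global assignments give the columns of the constraint matrix.
assignments = list(itertools.product([0, 1], repeat=4))
rows = [(ctx, o) for ctx in contexts for o in itertools.product([0, 1], repeat=2)]
M = np.zeros((len(rows), len(assignments)))
b = np.zeros(len(rows))
for i, (ctx, o) in enumerate(rows):
    b[i] = pr_box(ctx, o)
    for j, g in enumerate(assignments):
        restricted = (g[vars_.index(ctx[0])], g[vars_.index(ctx[1])])
        if restricted == o:
            M[i, j] = 1.0

# LP: maximize sum(lambda) subject to M @ lambda <= b, lambda >= 0.
res = linprog(c=-np.ones(len(assignments)), A_ub=M, b_ub=b,
              bounds=[(0, None)] * len(assignments))
ncf = -res.fun
cf = 1.0 - ncf
print(cf)   # -> 1.0: the PR-box is strongly contextual
```

    Every deterministic assignment violates at least one context's support (the four XOR targets cannot all be satisfied), so every LP variable is forced to zero and the contextual fraction is 1.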

  7. Contextual Fraction as a Measure of Contextuality

    NASA Astrophysics Data System (ADS)

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-01

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.

  8. Experimental uncertainty and drag measurements in the national transonic facility

    NASA Technical Reports Server (NTRS)

    Batill, Stephen M.

    1994-01-01

    This report documents the results of a study conducted to establish a framework for the quantitative description of the uncertainty in measurements made in the National Transonic Facility (NTF). The importance of uncertainty analysis in both experiment planning and the reporting of results has grown significantly in the past few years. Various methodologies have been proposed, and the engineering community appears to be 'converging' on certain accepted practices. The practical application of these methods to the complex wind tunnel testing environment at the NASA Langley Research Center was based upon terminology and methods established in the American National Standards Institute (ANSI) and American Society of Mechanical Engineers (ASME) standards. The report provides an overview of this methodology.
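    The core of the ANSI/ASME-style framework referenced above is the combination of systematic and random error estimates into a single uncertainty figure. A minimal sketch (the standard textbook root-sum-square combination, not NTF's actual error budget; the example numbers are invented):

```python
import math

def total_uncertainty(bias_limit, std_dev, t95=2.0):
    """U_RSS = sqrt(B^2 + (t*S)^2): root-sum-square combination of the
    bias (systematic) limit B and the precision (random) limit t*S."""
    return math.sqrt(bias_limit ** 2 + (t95 * std_dev) ** 2)

# E.g. a drag-coefficient measurement with an estimated bias limit of
# 0.0003 and a sample standard deviation of 0.0002 (hypothetical values):
print(round(total_uncertainty(0.0003, 0.0002), 6))   # -> 0.0005
```

    The choice of coverage factor (here t95 = 2.0 for roughly 95% coverage with large samples) and of additive versus root-sum-square combination is exactly the kind of convention the ANSI/ASME standards fix.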

  9. Quantitative measurement of pass-by noise radiated by vehicles running at high speeds

    NASA Astrophysics Data System (ADS)

    Yang, Diange; Wang, Ziteng; Li, Bing; Luo, Yugong; Lian, Xiaomin

    2011-03-01

    Accurately locating and quantifying the pass-by noise sources radiated by running vehicles has long been a challenge. In the present work, a system based on a microphone array is developed for this purpose. An acoustic-holography method for moving sound sources is designed to handle the Doppler effect effectively in the time domain. The effective sound pressure distribution is reconstructed on the surface of a running vehicle. The method achieves high calculation efficiency and is able to quantitatively measure the sound pressure at the sound source and identify the location of the main sound source. The method is also validated by simulation experiments and by measurement tests with known moving speakers. Finally, the engine noise, tire noise, exhaust noise and wind noise of a vehicle running at different speeds are successfully identified by this method.
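    One common way to handle the Doppler effect in the time domain, which the abstract alludes to, is to resample the recorded signal at the computed arrival times of the moving source. This is a generic sketch of that idea (our illustration with invented geometry and a single tone, not the authors' acoustic-holography method):

```python
import numpy as np

c = 343.0                      # speed of sound, m/s
fs = 8000
f0 = 1000.0                    # source tone, Hz
v = 60.0                       # vehicle speed, m/s
mic = np.array([0.0, 5.0])     # fixed microphone position, m

# Emission-time grid and the corresponding arrival times at the microphone.
t_e = np.arange(0, 1.0, 1 / fs)
src = np.column_stack([-30.0 + v * t_e, np.zeros_like(t_e)])   # source path
t_r = t_e + np.linalg.norm(src - mic, axis=1) / c              # arrival times

s = np.sin(2 * np.pi * f0 * t_e)               # signal in the source frame

# "Recorded" signal: uniform sampling in receiver time (Doppler-shifted).
t_u = np.arange(t_r[0], t_r[-1], 1 / fs)
rec = np.interp(t_u, t_r, s)

# De-Dopplerization: resample the recording back onto emission times.
recovered = np.interp(t_r, t_u, rec)

# The recovered spectrum peaks at the true source frequency again.
win = slice(fs // 4, 3 * fs // 4)              # interior, away from edges
freqs = np.fft.rfftfreq(len(t_e[win]), 1 / fs)
peak = freqs[np.argmax(np.abs(np.fft.rfft(recovered[win])))]
print(peak)
```

    Because the source moves slower than sound, the arrival-time map is monotonic and the interpolation is well defined; amplitude corrections and array beamforming, which the actual holography method needs, are omitted here.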

  10. Matrix Effects in Quantitative Assessment of Pharmaceutical Tablets Using Transmission Raman and Near-Infrared (NIR) Spectroscopy.

    PubMed

    Sparén, Anders; Hartman, Madeleine; Fransson, Magnus; Johansson, Jonas; Svensson, Olof

    2015-05-01

    Raman spectroscopy can be an alternative to near-infrared (NIR) spectroscopy for nondestructive quantitative analysis of solid pharmaceutical formulations. Compared with NIR spectra, Raman spectra offer much better selectivity, but subsampling has long been an issue for quantitative assessment. Raman spectroscopy in transmission mode has reduced this issue, since a large volume of the sample is probed. The sample matrix, such as the particle size of the drug substance in a tablet, may affect the Raman signal. In this work, matrix effects in transmission NIR and Raman spectroscopy were systematically investigated for a solid pharmaceutical formulation. Tablets were manufactured according to an experimental design, varying the particle size of the drug substance (DS), the particle size of the filler, the compression force, and the content of drug substance. All factors were varied at two levels plus a center point, except the drug substance content, which was varied at five levels. Six tablets from each experimental point were measured with transmission NIR and Raman spectroscopy, and the DS concentration was determined for a third of those tablets. Principal component analysis of NIR and Raman spectra showed that the drug substance content and particle size, the particle size of the filler, and the compression force affected both NIR and Raman spectra. For quantitative assessment, orthogonal partial least squares regression was applied. All factors varied in the experimental design influenced the prediction of the DS content to some extent, both for NIR and Raman spectroscopy, with the particle size of the filler having the largest effect. When all matrix variations were included in the multivariate calibrations, however, good predictions of all types of tablets were obtained, both for NIR and Raman spectroscopy. The prediction error using transmission Raman spectroscopy was about 30% lower than that obtained with transmission NIR spectroscopy.

  11. Unmixing of fluorescence spectra to resolve quantitative time-series measurements of gene expression in plate readers.

    PubMed

    Lichten, Catherine A; White, Rachel; Clark, Ivan B N; Swain, Peter S

    2014-02-03

    To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may serve to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour.
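    At its core, spectral unmixing models a measured emission scan as a linear combination of reference spectra and solves for the amplitudes. A minimal sketch (our illustration with made-up reference shapes, not the paper's calibration pipeline):

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.arange(500, 601)                                 # emission wavelengths, nm

# Hypothetical reference spectra: a GFP-like band and broad autofluorescence.
gfp_ref = np.exp(-((wl - 512) ** 2) / (2 * 15.0 ** 2))
auto_ref = np.exp(-(wl - 500) / 60.0)

# Simulated well: 0.8 units of GFP + 2.0 units of autofluorescence + noise.
well = 0.8 * gfp_ref + 2.0 * auto_ref + rng.normal(0, 0.01, wl.size)

# Least squares recovers the per-component amplitudes from the mixed scan.
A = np.column_stack([gfp_ref, auto_ref])
(coef_gfp, coef_auto), *_ = np.linalg.lstsq(A, well, rcond=None)
print(coef_gfp, coef_auto)   # close to 0.8 and 2.0
```

    In practice the autofluorescence reference comes from an unlabelled control strain, and the recovered GFP amplitude is normalized by cell density to give fluorescence per cell.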

  12. Unmixing of fluorescence spectra to resolve quantitative time-series measurements of gene expression in plate readers

    PubMed Central

    2014-01-01

    Background To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Results Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may serve to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Conclusions Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour. PMID:24495318

  13. Quantitative measurement of mitochondrial membrane potential in cultured cells: calcium-induced de- and hyperpolarization of neuronal mitochondria

    PubMed Central

    Gerencser, Akos A; Chinopoulos, Christos; Birket, Matthew J; Jastroch, Martin; Vitelli, Cathy; Nicholls, David G; Brand, Martin D

    2012-01-01

    Mitochondrial membrane potential (ΔΨM) is a central intermediate in oxidative energy metabolism. Although ΔΨM is routinely measured qualitatively or semi-quantitatively using fluorescent probes, its quantitative assay in intact cells has been limited mostly to slow, bulk-scale radioisotope distribution methods. Here we derive and verify a biophysical model of fluorescent potentiometric probe compartmentation and dynamics using a bis-oxonol-type indicator of plasma membrane potential (ΔΨP) and the ΔΨM probe tetramethylrhodamine methyl ester (TMRM) using fluorescence imaging and voltage clamp. Using this model we introduce a purely fluorescence-based quantitative assay to measure absolute values of ΔΨM in millivolts as they vary in time in individual cells in monolayer culture. The ΔΨP-dependent distribution of the probes is modelled by Eyring rate theory. Solutions of the model are used to deconvolute ΔΨP and ΔΨM in time from the probe fluorescence intensities, taking into account their slow, ΔΨP-dependent redistribution and Nernstian behaviour. The calibration accounts for matrix:cell volume ratio, high- and low-affinity binding, activity coefficients, background fluorescence and optical dilution, allowing comparisons of potentials in cells or cell types differing in these properties. In cultured rat cortical neurons, ΔΨM is −139 mV at rest, and is regulated between −108 mV and −158 mV by concerted increases in ATP demand and Ca2+-dependent metabolic activation. Sensitivity analysis showed that the standard error of the mean in the absolute calibrated values of resting ΔΨM including all biological and systematic measurement errors introduced by the calibration parameters is less than 11 mV. Between samples treated in different ways, the typical equivalent error is ∼5 mV. PMID:22495585
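    The Nernstian behaviour of TMRM invoked in the abstract connects the probe's accumulation ratio to a potential in millivolts. A minimal sketch of just that textbook relation (not the paper's full compartmentation and binding model):

```python
import math

R, F = 8.314, 96485.0          # J/(mol*K), C/mol
T = 310.15                     # 37 degrees C in kelvin

def nernst_mv(ratio_in_over_out):
    """Membrane potential (mV) sustaining a given equilibrium accumulation
    ratio of an ideal monovalent cationic probe such as TMRM."""
    return -1000.0 * (R * T / F) * math.log(ratio_in_over_out)

# A ~180-fold accumulation corresponds to roughly -139 mV, the resting
# mitochondrial potential reported in the abstract.
print(round(nernst_mv(181.0), 1))   # -> -138.9
```

    The paper's contribution is everything this sketch omits: correcting the apparent ratio for matrix volume, binding, activity coefficients and background before applying the Nernst relation.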

  14. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2008-01-01

    An experimental and numerical investigation into the static and dynamic response of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented in the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high-fidelity measurements, including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High-fidelity numerical results are obtained from the model using measured parameters, such as geometric imperfection and thermal load, as inputs. Excellent agreement is achieved between the predicted and measured static and dynamic thermomechanical responses, thereby providing quantitative validation of the numerical tool.

  15. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    DTIC Science & Technology

    2010-10-28

    design, and the time series with nonequivalent control group design. The experimental research approach will randomly assign participants... (Leedy & Ormrod, 2005). According to Fife-Schaw (2006) there are three quasi-experimental designs: the nonequivalent control group design, the time... that have controlled and isolated variables. A specific quantitative approach available to the researcher is the use of surveys. Surveys, in

  16. Genome-Wide Association Studies of Quantitatively Measured Skin, Hair, and Eye Pigmentation in Four European Populations

    PubMed Central

    Candille, Sophie I.; Absher, Devin M.; Beleza, Sandra; Bauchet, Marc; McEvoy, Brian; Garrison, Nanibaa’ A.; Li, Jun Z.; Myers, Richard M.; Barsh, Gregory S.; Tang, Hua; Shriver, Mark D.

    2012-01-01

    Pigmentation of the skin, hair, and eyes varies both within and between human populations. Identifying the genes and alleles underlying this variation has been the goal of many candidate gene studies and several genome-wide association studies (GWAS). Most GWAS for pigmentary traits to date have been based on subjective phenotypes using categorical scales, yet skin, hair, and eye pigmentation vary continuously. Here, we seek to characterize quantitative variation in these traits objectively and accurately and to determine their genetic basis. Objective and quantitative measures of skin, hair, and eye color were made using reflectance or digital spectroscopy in Europeans from Ireland, Poland, Italy, and Portugal. A GWAS was conducted for the three quantitative pigmentation phenotypes in 176 women across 313,763 SNP loci, and replication of the most significant associations was attempted in a sample of 294 European men and women from the same countries. We find that the pigmentation phenotypes are highly stratified along axes of European genetic differentiation. The country of sampling explains approximately 35% of the variation in skin pigmentation, 31% of the variation in hair pigmentation, and 40% of the variation in eye pigmentation. All three quantitative phenotypes are correlated with each other. In our two-stage association study, we reproduce the association of rs1667394 at the OCA2/HERC2 locus with eye color, but we do not identify new genetic determinants of skin and hair pigmentation, supporting the lack of major genes affecting skin and hair color variation within Europe and suggesting that not only careful phenotyping but also larger cohorts are required to understand the genetic architecture of these complex quantitative traits. Interestingly, we also see that in each of these four populations, men are more lightly pigmented in the unexposed skin of the inner arm than women, a fact that is underappreciated and may vary across the world. PMID:23118974
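    At its simplest, a GWAS for a quantitative trait like the pigmentation measures above is one linear regression of phenotype on genotype dosage per SNP. A toy sketch with simulated data (our illustration, not the study's pipeline; sample sizes, effect sizes and the causal index are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, m, causal = 500, 50, 17                     # samples, SNPs, causal SNP index
maf = rng.uniform(0.1, 0.5, size=m)            # minor-allele frequencies
G = rng.binomial(2, maf, size=(n, m))          # genotype dosages 0/1/2

# Quantitative phenotype: one causal SNP plus Gaussian noise.
pheno = 1.0 * G[:, causal] + rng.normal(0, 1.0, size=n)

# One regression per SNP; collect the association p-values.
pvals = np.array([stats.linregress(G[:, j], pheno).pvalue for j in range(m)])
print(int(np.argmin(pvals)))   # the causal SNP gives the smallest p-value
```

    A real analysis adds covariates for population structure (e.g. principal components), which matters here because the abstract shows the phenotypes are strongly stratified by country.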

  17. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

    The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods for obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) fine-grained modeling of complex signaling dynamics together with identification of the salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) experimental validation of dynamic models.
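    The model-training loop described above can be sketched on a deliberately tiny example. This is a generic one-state phosphorylation model of our own (dP/dt = k1*u*(1 - P) - k2*P, with invented rate constants), not the chapter's protocols, fitted to a simulated time course:

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

def simulate(t, k1, k2, u=1.0):
    """Fraction phosphorylated over time for stimulus strength u."""
    dP = lambda P, t: k1 * u * (1.0 - P) - k2 * P
    return odeint(dP, 0.0, t).ravel()

# Simulated "experimental" time course with measurement noise.
t = np.linspace(0, 10, 30)
rng = np.random.default_rng(5)
data = simulate(t, 1.2, 0.4) + rng.normal(0, 0.01, t.size)

# Training: nonlinear least squares recovers the rate constants.
(k1_hat, k2_hat), _ = curve_fit(simulate, t, data, p0=[0.5, 0.5])
print(k1_hat, k2_hat)   # close to the true values 1.2 and 0.4
```

    Both parameters are identifiable here because the time course constrains the relaxation rate (k1 + k2) and the steady state (k1/(k1 + k2)); checking identifiability before fitting is part of what the protocols formalize.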

  18. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water.

    PubMed

    Chung, S H; Cerussi, A E; Merritt, S I; Ruth, J; Tromberg, B J

    2010-07-07

    We describe the development of a non-invasive method for quantitative tissue temperature measurements using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound-water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in Intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96), with a difference of 1.1 +/- 0.91 degrees C over a range of 28-48 degrees C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in human subjects (forearm) during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.

  19. Experimental ion mobility measurements in Xe-CH4

    NASA Astrophysics Data System (ADS)

    Perdigoto, J. M. C.; Cortez, A. F. V.; Veenhof, R.; Neves, P. N. B.; Santos, F. P.; Borges, F. I. G. M.; Conde, C. A. N.

    2017-09-01

    Data on ion mobility are important for improving the performance of large-volume gaseous detectors. In the present work, the method, experimental setup and results for ion mobility measurements in Xe-CH4 mixtures are presented. The results for this mixture show the presence of two distinct groups of ions, whose nature depends on the mixture ratio, since the ions originate from both Xe and CH4. The results presented here were obtained for low reduced electric fields, E/N, of 10-25 Td (2.4-6.1 kV cm-1 bar-1), at low pressure (8 Torr, i.e. 10.6 mbar) and at room temperature.

  20. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories have focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, experts working in some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  1. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sampling plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure the GMO content of samples relative to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
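
    The genome-size dependence of the relative LOQ described above can be sketched as simple arithmetic: with a fixed absolute LOQ of ~30-50 target molecules, the relative LOQ is that number divided by the genome copies present in 200 ng of input DNA. The 1C genome masses below are rough illustrative literature values, not figures from the paper, so the outputs agree with the quoted 0.02%-0.7% range only in order of magnitude:

    ```python
    # Sketch of the relative-LOQ arithmetic; 1C genome masses are assumed,
    # illustrative values (pg per haploid genome), not taken from the study.
    GENOME_PG = {"rice": 0.5, "corn": 2.7, "wheat": 17.0}

    def relative_loq_percent(plant, loq_molecules=40, input_ng=200):
        copies = input_ng * 1000 / GENOME_PG[plant]  # genome copies in input (1 ng = 1000 pg)
        return 100 * loq_molecules / copies

    for plant in ("rice", "corn", "wheat"):
        print(plant, round(relative_loq_percent(plant), 3))
    ```

    The large wheat genome yields far fewer copies per 200 ng, which is why its relative LOQ is more than an order of magnitude worse than that of rice.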

  2. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties, such as motor fluctuations and dyskinesia, are taken into account. Least-squares estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
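
    The time-frequency analysis mentioned above rests on the fact that parkinsonian tremor concentrates spectral power in a narrow band (roughly 3-7 Hz). A minimal sketch, not the paper's least-squares models, that picks the dominant in-band frequency of a synthetic IMU trace via a plain DFT (the 100 Hz sampling rate is an assumption):

    ```python
    # Sketch: locate the dominant tremor frequency in the 3-7 Hz band of an
    # inertial-sensor trace with a direct DFT (pure stdlib, no numpy).
    import math

    def dominant_frequency(signal, fs, f_lo=3.0, f_hi=7.0):
        n = len(signal)
        best_f, best_p = None, -1.0
        for k in range(1, n // 2):
            f = k * fs / n
            if f_lo <= f <= f_hi:  # only evaluate bins inside the tremor band
                re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                p = re * re + im * im  # spectral power at bin k
                if p > best_p:
                    best_f, best_p = f, p
        return best_f

    fs = 100.0  # Hz, an assumed IMU sampling rate
    x = [math.sin(2 * math.pi * 5.0 * t / fs) for t in range(200)]  # 2 s of 5 Hz "tremor"
    print(dominant_frequency(x, fs))  # → 5.0
    ```

    In practice the in-band power (best_p) would feed an amplitude or severity estimate; here only the frequency estimate is shown.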

  3. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    PubMed Central

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753

  4. RADON PROGENY AS AN EXPERIMENTAL TOOL FOR DOSIMETRY OF NANOAEROSOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruzer, Lev; Ruzer, Lev S.; Apte, Michael G.

    2008-02-25

    The study of aerosol exposure and dosimetry measurements and the related quantitation of health effects are important to the understanding of the consequences of air pollution, and are discussed widely in the scientific literature. During the last 10 years the need to correlate aerosol exposure and biological effects has become especially important due to the rapid development of a new, revolutionary industry -- nanotechnology. Nanoproduct commerce is predicted to top $1 trillion by 2015. Quantitative assessment of aerosol particle behavior in air and in lung deposition, and dosimetry in different parts of the lung, particularly for nanoaerosols, remains poor despite several decades of study. Direct measurements on humans are still needed in order to validate the hollow cast, animal studies, and lung deposition modeling. We discuss here the use of nanoscale radon decay products as an experimental tool in the study of local deposition and lung dosimetry for nanoaerosols. The issue of the safe use of radon progeny in such measurements is discussed based on a comparison of measured exposure in 3 settings: the general population, miners, and a human experiment conducted at the Paul Scherrer Institute (PSI) in Switzerland. One of the properties of radon progeny is that they consist partly of 1 nm radioactive particles called unattached activity; having extremely small size and high diffusion coefficients, these particles can be potentially useful as radioactive tracers in the study of nanometer-sized aerosols. We present a theoretical and experimental study of the correlation between the unattached activity and aerosol particle surface area, together with a description of its calibration and a method for measurement of the unattached fraction.

  5. Real-time quantitative fluorescence measurement of microscale cell culture analog systems

    NASA Astrophysics Data System (ADS)

    Oh, Taek-il; Kim, Donghyun; Tatosian, Daniel; Sung, Jong Hwan; Shuler, Michael

    2007-02-01

    A microscale cell culture analog (μCCA) is a cell-based lab-on-a-chip assay that, as an animal surrogate, is applied in pharmacological studies for toxicology tests. A μCCA typically comprises multiple chambers, which represent animal organs, and microfluidics connecting the chambers to represent blood flow, so as to mimic animal metabolism more realistically. A μCCA is expected to provide a tool for high-throughput drug discovery. Previously, a portable fluorescence detection system was investigated for a single μCCA device in real time. In this study, we present a fluorescence-based imaging system that provides quantitative real-time data on the metabolic interactions in μCCAs, with an emphasis on measuring multiple μCCA samples simultaneously for high-throughput screening. The detection system is based on discrete optics components, with a high-power LED as a light source and a charge-coupled device (CCD) camera as a detector, for monitoring cellular status in the chambers of each μCCA sample. Multiple samples are scanned mechanically on a motorized linear stage, which is fully automated. Each μCCA sample has four chambers, in which the cell lines MES-SA/DX-5 and MES-SA (tumor cells of the human uterus) have been cultured. All cell lines have been transfected to express the fusion protein H2B-GFP, which is a human histone protein fused at the amino terminus to EGFP. As a model cytotoxic drug, 10 μM doxorubicin (DOX) was used. Real-time quantitative data on the loss of enhanced green fluorescent protein (EGFP) intensity during the death of target cells have been collected over periods from several minutes to 40 hours. Design issues and improvements are also discussed.

  6. Validation of a quantitative magnetic resonance method for measuring human body composition.

    PubMed

    Napolitano, Antonella; Miller, Sam R; Murgatroyd, Peter R; Coward, W Andrew; Wright, Antony; Finer, Nick; De Bruin, Tjerk W; Bullmore, Edward T; Nunez, Derek J

    2008-01-01

    To evaluate a novel quantitative magnetic resonance (QMR) methodology (EchoMRI-AH, Echo Medical Systems) for measurement of whole-body fat and lean mass in humans, we studied (i) the in vitro accuracy and precision by measuring 18 kg of canola oil with and without 9 kg of water, (ii) the accuracy and precision of measures of simulated fat mass changes in human subjects (n = 10), and (iii) QMR fat and lean mass measurements compared to those obtained using the established 4-compartment (4-C) model method (n = 30). (i) QMR represented 18 kg of oil at 40 degrees C as 17.1 kg fat and 1 kg lean, while at 30 degrees C, 15.8 kg fat and 4.7 kg lean were reported. The s.d. of repeated estimates was 0.13 kg for fat and 0.23 kg for lean mass. Adding 9 kg of water reduced the fat estimates, increased misrepresentation of fat as lean, and degraded the precision. (ii) The simulated change in the fat mass of human volunteers was accurately represented, independently of added water. (iii) Compared to the 4-C model, QMR underestimated fat and overestimated lean mass. The extent of the difference increased with body mass. The s.d. of repeated measurements increased with adiposity, from 0.25 kg (fat) and 0.51 kg (lean) with BMI <25 kg/m(2) to 0.43 kg and 0.81 kg, respectively, with BMI >30 kg/m(2). The EchoMRI-AH prototype showed shortcomings in the absolute accuracy and specificity of fat mass measures, but detected simulated body composition change accurately and with precision roughly three times better than current best measures. This methodology should reduce the study duration and cohort number needed to evaluate anti-obesity interventions.
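
    The closing claim, that better precision shrinks the required cohort, follows from the standard power calculation: subjects per arm scale with the square of the measurement s.d., so a roughly 3x precision gain cuts cohort size roughly 9-fold. A sketch with illustrative numbers (the s.d. and effect-size values below are hypothetical, not from the study):

    ```python
    # Sketch: approximate subjects per arm to detect a mean change `delta`
    # (two-sided alpha = 0.05, power = 0.8) given measurement s.d. `sd`,
    # via n = ((z_alpha + z_beta) * sd / delta)^2. Inputs are illustrative.
    import math

    def n_per_arm(sd, delta, z_alpha=1.96, z_beta=0.84):
        return math.ceil(((z_alpha + z_beta) * sd / delta) ** 2)

    print(n_per_arm(sd=1.5, delta=0.5))  # → 71  (hypothetical reference precision)
    print(n_per_arm(sd=0.5, delta=0.5))  # → 8   (3x better precision, ~9x fewer subjects)
    ```

    The quadratic dependence on sd is the whole argument: halving measurement noise quarters the cohort.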

  7. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  8. Experimental realization of generalized qubit measurements based on quantum walks

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan-yuan; Yu, Neng-kun; Kurzyński, Paweł; Xiang, Guo-yong; Li, Chuan-Feng; Guo, Guang-Can

    2015-04-01

    We report an experimental implementation of a single-qubit generalized measurement scenario, the positive-operator valued measure (POVM), based on a quantum walk model. The qubit is encoded in a single-photon polarization. The photon performs a quantum walk on an array of optical elements, where the polarization-dependent translation is performed via birefringent beam displacers and a change of the polarization is implemented with the help of wave plates. We implement: (i) trine POVM, i.e., the POVM elements uniformly distributed on an equatorial plane of the Bloch sphere; (ii) symmetric-informationally-complete (SIC) POVM; and (iii) unambiguous discrimination of two nonorthogonal qubit states.

  9. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    PubMed Central

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

    Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In an analysis of 80 plasma samples, the microarray platform performed well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay that analyzes several prostate cancer biomarkers simultaneously. PMID:22921878

  10. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy.

    PubMed

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-11-19

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis of the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.
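
    The donor-lifetime relation that conventional FRET lifetime analysis relies on (the baseline the FLEEM method improves upon, not FLEEM itself) is the standard formula E = 1 - τ_DA/τ_D: energy transfer opens an extra decay channel and shortens the donor lifetime. A sketch with illustrative lifetimes:

    ```python
    # Standard lifetime-based FRET efficiency: E = 1 - tau_DA / tau_D,
    # where tau_D is the donor lifetime alone and tau_DA with acceptor.
    # The nanosecond values below are illustrative, not from the paper.
    def fret_efficiency(tau_donor_alone_ns, tau_donor_with_acceptor_ns):
        return 1.0 - tau_donor_with_acceptor_ns / tau_donor_alone_ns

    print(fret_efficiency(2.5, 1.0))  # → 0.6
    ```

    The weakness the abstract points out is visible here: free donor fluorophores contribute their unquenched lifetime to the same donor channel and bias τ_DA upward, which is what the FRET-specific EEM channel avoids.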

  11. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    PubMed Central

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis of the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  12. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s, NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated the capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition. These uncertainties, in turn, came about because of limited pre-flight mission planning capability and a poor understanding of several factors, including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects and detector response characteristics. Operational details of sensor configuration, such as detector integration time and tracking system algorithms, were carried out ad hoc (following best practices), which led to a low probability of target acquisition and detector saturation. Leveraging from the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an

  13. Experimental contextuality in classical light

    NASA Astrophysics Data System (ADS)

    Li, Tao; Zeng, Qiang; Song, Xinbing; Zhang, Xiangdong

    2017-03-01

    The Klyachko, Can, Binicioglu, and Shumovsky (KCBS) inequality is an important contextuality inequality in a three-level system, which has been demonstrated experimentally using quantum states. Using the path and polarization degrees of freedom of classical optics fields, we have constructed the classical trit (cetrit) and tested the KCBS inequality and its geometrical form (Wright’s inequality) in this work. The projection measurement has been implemented, and clear violations of the KCBS inequality and its geometrical form have been observed. This means that the contextuality inequality, which is commonly used to test the conflict between quantum theory and noncontextual realism, may be used as a quantitative tool in classical optical coherence to describe correlation characteristics of classical fields.

  14. Quantitative measurement of electron number in nanosecond and picosecond laser-induced air breakdown

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yue; Sawyer, Jordan C.; Su, Liu

    2016-05-07

    Here we present quantitative measurements of total electron numbers in laser-induced air breakdown at pressures ranging from atmospheric to 40 bar(g) by 10 ns and 100 ps laser pulses. A quantifiable definition of the laser-induced breakdown threshold is identified by a sharp increase in the measurable total electron numbers via dielectric-calibrated coherent microwave scattering. For the 10 ns laser pulse, the threshold of laser-induced breakdown in atmospheric air is defined as a total electron number of ∼10^6. This breakdown threshold decreases with an increase of pressure and laser photon energy (shorter wavelength), which is consistent with the theory of initial multiphoton ionization and subsequent avalanche processes. For the 100 ps laser pulse cases, a clear threshold is not present and only marginal pressure effects can be observed, which is due to the short pulse duration leading to stronger multiphoton ionization and minimal collisional avalanche ionization.

  15. Quantitative measurement of carbon nanotubes released from their composites by thermal carbon analysis

    NASA Astrophysics Data System (ADS)

    Ogura, I.; Kotake, M.; Ata, S.; Honda, K.

    2017-06-01

    The release of free carbon nanotubes (CNTs) and CNTs partly embedded in matrix debris into the air may occur during mechanical and abrasion processes involving CNT composites. Since the harmful effects of CNT-matrix mixtures have not yet been fully evaluated, it is considered that any exposure to CNTs, including CNT-matrix mixtures, should be measured and controlled. Thermal carbon analysis, such as Method 5040 of the National Institute for Occupational Safety and Health, is one of the most reliable quantitative methods for measuring CNTs in the air. However, when CNTs are released together with polymer matrices, this technique may be inapplicable. In this study, we evaluated the potential for using thermal carbon analysis to determine CNTs in the presence of polymer matrices. Our results showed that thermal carbon analysis was potentially capable of determining CNTs in distinction from polyamide 12, polybutylene terephthalate, polypropylene, and polyoxymethylene. However, it was difficult to determine CNTs in the presence of polyethylene terephthalate, polycarbonate, polyetheretherketone, or polyamide 6.

  16. Friedreich and dominant ataxias: quantitative differences in cerebellar dysfunction measurements.

    PubMed

    Tanguy Melac, Audrey; Mariotti, Caterina; Filipovic Pierucci, Antoine; Giunti, Paola; Arpa, Javier; Boesch, Sylvia; Klopstock, Thomas; Müller Vom Hagen, Jennifer; Klockgether, Thomas; Bürk, Katrin; Schulz, Jörg B; Reetz, Kathrin; Pandolfo, Massimo; Durr, Alexandra; Tezenas du Montcel, Sophie

    2018-06-01

    Sensitive outcome measures for clinical trials on cerebellar ataxias are lacking. Most cerebellar ataxias progress very slowly and quantitative measurements are required to evaluate cerebellar dysfunction. We evaluated two scales for rating cerebellar ataxias: the Composite Cerebellar Functional Severity (CCFS) Scale and Scale for the Assessment and Rating of Ataxia (SARA), in patients with spinocerebellar ataxia (SCA) and controls. We evaluated these scales for different diseases and investigated the factors governing the scores obtained. All patients were recruited prospectively. There were 383 patients with Friedreich's ataxia (FRDA), 205 patients with SCA and 168 controls. In FRDA, 31% of the variance of cerebellar signs with the CCFS and 41% of that with SARA were explained by disease duration, age at onset and the shorter abnormal repeat in the FXN gene. Increases in CCFS and SARA scores per year were lower for FRDA than for SCA (CCFS index: 0.123±0.123 per year vs 0.163±0.179, P<0.001; SARA index: 1.5±1.2 vs 1.7±1.7, P<0.001), indicating slower cerebellar dysfunction indexes for FRDA than for SCA. Patients with SCA2 had higher CCFS scores than patients with SCA1 and SCA3, but similar SARA scores. Cerebellar dysfunction, as measured with the CCFS and SARA scales, was more severe in FRDA than in patients with SCA, but with lower progression indexes, within the limits of these types of indexes. Ceiling effects may occur at late stages, for both scales. The CCFS scale is rater-independent and could be used in a multicentre context, as it is simple, rapid and fully automated. ClinicalTrials.gov: NCT02069509. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE (ECBC-TR-1426; Vipin Rastogi). 1. INTRODUCTION: Members of the U.S. Environmental… 2.4 Experimental Design: Each quantitative method was performed three times on three consecutive days. For the CD runs, three

  18. Quantitative molecular characterization of bovine vitreous and lens with non-invasive dynamic light scattering

    NASA Technical Reports Server (NTRS)

    Ansari, R. R.; Suh, K. I.; Dunker, S.; Kitaya, N.; Sebag, J.

    2001-01-01

    The non-invasive technique of dynamic light scattering (DLS) was used to quantitatively characterize vitreous and lens structure on a molecular level by measuring the sizes of the predominant particles and mapping the three-dimensional topographic distribution of these structural macromolecules. The results of DLS measurements in five fresh adult bovine eyes were compared to DLS measurements in model solutions of hyaluronan (HA) and collagen (Coll). In the bovine eyes, DLS measurements were obtained from excised samples of gel and liquid vitreous and compared to the model solutions. Measurements in whole vitreous were obtained at multiple points posterior to the lens to generate a three-dimensional 'map' of molecular structure. The macromolecule distribution in bovine lens was similarly characterized. In each bovine vitreous (Bo Vit) specimen, DLS predominantly detected two distinct particles, which differed in diffusion properties and hence size. Comparisons with model vitreous solutions demonstrated that these most likely corresponded to the Coll and HA components of vitreous. Three-dimensional mapping of Bo Vit found heterogeneity throughout the vitreous body, with different particle size distributions for Coll and HA at different loci. In contrast, the three-dimensional distribution of lens macromolecules was more homogeneous. Thus, the non-invasive DLS technique can quantitate the average sizes of vitreous and lens macromolecules and map their three-dimensional distribution. This method for quantitatively assessing the macromolecular structure of the vitreous and lens should be useful for clinical as well as experimental applications in health and disease. Copyright 2001 Academic Press.
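
    DLS infers particle size from diffusion: the measured diffusion coefficient D maps to a hydrodynamic radius through the Stokes-Einstein relation R_h = k_B·T / (6π·η·D). A sketch with illustrative viscosity and D values (hypothetical, not from the study), showing how a slowly diffusing collagen-like scatterer and a faster HA-like one separate by size:

    ```python
    # Stokes-Einstein sizing as used in DLS: R_h = k_B*T / (6*pi*eta*D).
    # Temperature, viscosity and the two D values are illustrative assumptions.
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def hydrodynamic_radius_nm(d_m2_per_s, temperature_k=293.0, eta_pa_s=1.0e-3):
        r_m = K_B * temperature_k / (6 * math.pi * eta_pa_s * d_m2_per_s)
        return r_m * 1e9  # convert m to nm

    # Slow, collagen-like scatterer vs a tenfold-faster, HA-like one:
    print(round(hydrodynamic_radius_nm(4e-13), 1))
    print(round(hydrodynamic_radius_nm(4e-12), 1))
    ```

    Because R_h is inversely proportional to D, the two diffusion populations the study detects translate directly into two distinct particle sizes.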

  19. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

    By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither of these two methods can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines both vibration and SAW in order to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and in vivo on human forearm and palm skin. A ring actuator generated vibration while a line actuator was used to excite SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that by combining the vibration and SAW methods the bulk mechanical properties of the full skin can be quantitatively measured, and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.

  20. Universality and predictability in molecular quantitative genetics.

    PubMed

    Nourmohammad, Armita; Held, Torsten; Lässig, Michael

    2013-12-01

    Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to the predictability of trait evolution across lineages. We argue that universal trait statistics extend over a range of cellular scales and open new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.

  1. Quantitative measurements and modeling of cargo-motor interactions during fast transport in the living axon

    NASA Astrophysics Data System (ADS)

    Seamster, Pamela E.; Loewenberg, Michael; Pascal, Jennifer; Chauviere, Arnaud; Gonzales, Aaron; Cristini, Vittorio; Bearer, Elaine L.

    2012-10-01

    The kinesins have long been known to drive microtubule-based transport of sub-cellular components, yet the mechanisms of their attachment to cargo remain a mystery. Several different cargo-receptors have been proposed based on their in vitro binding affinities to kinesin-1. Only two of these—phosphatidyl inositol, a negatively charged lipid, and the carboxyl terminus of the amyloid precursor protein (APP-C), a trans-membrane protein—have been reported to mediate motility in living systems. A major question is how these many different cargo, receptors and motors interact to produce the complex choreography of vesicular transport within living cells. Here we describe an experimental assay that identifies cargo-motor receptors by their ability to recruit active motors and drive transport of exogenous cargo towards the synapse in living axons. Cargo is engineered by derivatizing the surface of polystyrene fluorescent nanospheres (100 nm diameter) with charged residues or with synthetic peptides derived from candidate motor receptor proteins, all designed to display a terminal COOH group. After injection into the squid giant axon, particle movements are imaged by laser-scanning confocal time-lapse microscopy. In this report we compare the motility of negatively charged beads with APP-C beads in the presence of glycine-conjugated non-motile beads using new strategies to measure bead movements. The ensuing quantitative analysis of time-lapse digital sequences reveals detailed information about bead movements: instantaneous and maximum velocities, run lengths, pause frequencies and pause durations. These measurements provide parameters for a mathematical model that predicts the spatiotemporal evolution of distribution of the two different types of bead cargo in the axon. The results reveal that negatively charged beads differ from APP-C beads in velocity and dispersion, and predict that at long time points APP-C will achieve greater progress towards the presynaptic

  2. Quantitative measurements and modeling of cargo–motor interactions during fast transport in the living axon

    PubMed Central

    Seamster, Pamela E; Loewenberg, Michael; Pascal, Jennifer; Chauviere, Arnaud; Gonzales, Aaron; Cristini, Vittorio; Bearer, Elaine L

    2013-01-01

    The kinesins have long been known to drive microtubule-based transport of sub-cellular components, yet the mechanisms of their attachment to cargo remain a mystery. Several different cargo-receptors have been proposed based on their in vitro binding affinities to kinesin-1. Only two of these—phosphatidyl inositol, a negatively charged lipid, and the carboxyl terminus of the amyloid precursor protein (APP-C), a trans-membrane protein—have been reported to mediate motility in living systems. A major question is how these many different cargo, receptors and motors interact to produce the complex choreography of vesicular transport within living cells. Here we describe an experimental assay that identifies cargo–motor receptors by their ability to recruit active motors and drive transport of exogenous cargo towards the synapse in living axons. Cargo is engineered by derivatizing the surface of polystyrene fluorescent nanospheres (100 nm diameter) with charged residues or with synthetic peptides derived from candidate motor receptor proteins, all designed to display a terminal COOH group. After injection into the squid giant axon, particle movements are imaged by laser-scanning confocal time-lapse microscopy. In this report we compare the motility of negatively charged beads with APP-C beads in the presence of glycine-conjugated non-motile beads using new strategies to measure bead movements. The ensuing quantitative analysis of time-lapse digital sequences reveals detailed information about bead movements: instantaneous and maximum velocities, run lengths, pause frequencies and pause durations. These measurements provide parameters for a mathematical model that predicts the spatiotemporal evolution of distribution of the two different types of bead cargo in the axon. The results reveal that negatively charged beads differ from APP-C beads in velocity and dispersion, and predict that at long time points APP-C will achieve greater progress towards the presynaptic

  3. Quantitative Targeted Proteomics and Electrochromic Shift for Measuring Photosystem Content of Marine Phytoplankton

    NASA Astrophysics Data System (ADS)

    Brown, C. M.; Bailleul, B.; Melanson, J. R.; Campbell, D. A.; Cockshutt, A. M.; Cardol, P.

    2016-02-01

    Abundance and stoichiometry data for the photosystems, the intersystem electron transport complexes and the Calvin cycle enzymes are rich in information about light and nutrient acclimation. Quantifying these complexes is essential for understanding limitations on and capacities for photosynthesis. Targeted quantitative immunodetection of conserved subunits (e.g., PsbA for PSII; PsaC for PSI) is becoming an established method for absolute measurement of these complexes. An advantage of protein measurements is that they can be made on non-living, flash-frozen samples and processed post-field. A pitfall of physical versus functional measures is that in some scenarios, such as during photoinhibition of photosystem II (PSII), the two give different values, but such disparities are often meaningful, informing targeted studies of regulation, repair and enzyme kinetics. Electrochromic shift (ECS) is an alternative, fast and noninvasive method that can be exploited to determine functional PSI:PSII ratios in living cells. The basis for ECS is that pigments in the photosynthetic membrane exhibit a shift in their absorption spectra when the electric component of the proton motive force is generated across the membrane in the light. Cross-validation by independent measures builds confidence in the results from both approaches and can be useful for ground-truthing underway or high-throughput optical measurements or functional measurements from bioassays. We present comparative data from immunoquantitation and ECS for an array of diatom taxa. The physical data fall within established ranges. The basis for similarities and disparities in the photosystem stoichiometries between the methods is discussed.

  4. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
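    The BRM definition above (maximum over users of the common logarithm of accessible records divided by authentication steps) can be sketched directly; the user list below is hypothetical, not from the paper:

```python
import math

def breach_risk_magnitude(users):
    """BRM: the maximum, over all system users, of log10(accessible
    database records) divided by the number of authentication steps
    needed to achieve that access."""
    return max(math.log10(records) / steps for records, steps in users)

# One user who can reach all 10^6 records behind a single
# authentication step yields BRM = log10(1e6) / 1 = 6.0.
```

    A second authentication step for the same user halves the BRM to 3.0, illustrating how the measure rewards layered access control.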

  5. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  6. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable relevant biological signal to be distinguished from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
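    The PCA step described above can be sketched with NumPy; this is a minimal SVD-based projection of samples onto the top components, with synthetic data standing in for DIGE abundance matrices:

```python
import numpy as np

def pca_scores(X, n_components=2):
    # Center each variable, then project the samples onto the top
    # principal components obtained from the SVD of the centered data.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

    Plotting the first two score columns per sample is the usual way such outliers and batch effects become visible.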

  7. Highly Accurate Quantitative Analysis Of Enantiomeric Mixtures from Spatially Frequency Encoded 1H NMR Spectra.

    PubMed

    Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas

    2018-02-06

    We propose an original concept for accurately measuring enantiomeric excesses from proton NMR spectra, which combines high-resolution techniques based on spatial encoding of the sample with the use of optically active, weakly orienting solvents. We show that it is possible to accurately simulate dipolar-edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations measured on experimental data in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method for determining enantiomeric excesses based on the observation of 1H nuclei.

  8. Procedures for experimental measurement and theoretical analysis of large plastic deformations

    NASA Technical Reports Server (NTRS)

    Morris, R. E.

    1974-01-01

    Theoretical equations are derived and analytical procedures are presented for the interpretation of experimental measurements of large plastic strains in the surface of a plate. Orthogonal gage lengths established on the metal surface are measured before and after deformation. The change in orthogonality after deformation is also measured. The equations yield the principal strains, the deviatoric stresses in the absence of surface friction forces, the true stresses if the stress normal to the surface is known, and the orientation angle between the deformed gage line and the principal stress-strain axes. Errors in the measurement of nominal strains greater than 3 percent are within engineering accuracy. Suggested applications for this strain measurement system include large-strain stress analysis of impact test models, burst tests of spherical or cylindrical pressure vessels, and augmenting small-strain instrumentation in tests where large strains are anticipated.
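    The principal strains implied by such orthogonal-gage measurements follow from Mohr's circle for plane strain; the sketch below uses the small-strain form as an illustration, not the large-strain equations derived in the report:

```python
import math

def principal_strains(eps_x, eps_y, gamma_xy):
    # In-plane principal strains from two normal strains measured on
    # orthogonal gage lines and the engineering shear strain inferred
    # from the change in orthogonality (Mohr's circle construction).
    center = (eps_x + eps_y) / 2.0
    radius = math.hypot((eps_x - eps_y) / 2.0, gamma_xy / 2.0)
    return center + radius, center - radius
```

    With no shear, the gage-line strains are themselves principal; a nonzero orthogonality change rotates the principal axes away from the gage lines.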

  9. Coherent versus incoherent resonant emission: an experimental method for easy discrimination and measurement

    NASA Astrophysics Data System (ADS)

    Ceccherini, S.; Colocci, M.; Gurioli, M.; Bogani, F.

    1998-11-01

    The distinction between the coherent and the incoherent component of the radiation emitted from resonantly excited material systems is difficult experimentally, particularly when ultra-short optical pulses are used for excitation. We propose an experimental procedure allowing an easy measurement of the two components. The method is completely general and applicable to any kind of physical system; its feasibility is demonstrated on the resonant emission from excitons in a semiconductor quantum well.

  10. [Quantitative relationships between various representatives of gastrointestinal microflora of experimental animals (rats) in normal conditions and after immunosuppression with imuran].

    PubMed

    Amanov, N A

    1983-06-01

    The influence of imuran (azathioprine) on the quantitative relationships between lactobacilli, bifidobacteria, bacteroides and the aerobic autoflora in different sections of the gastrointestinal tract of white rats was studied under experimental conditions. On days 7, 14 and 30 after the introduction of imuran, dysbacteriosis developed in the gastrointestinal tract; it was characterized by a decrease in the numbers of lactobacilli and asporogenic anaerobic microflora and an increase in the number of aerobic microorganisms. By days 60-90 the content of aerobic microbes in all sections of the gastrointestinal tract was still elevated, while the number of bacteroides was rapidly restored. Therefore, immunosuppressive therapy with imuran may give rise to autoinfectious complications caused by different representatives of the infective microflora.

  11. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    ERIC Educational Resources Information Center

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  12. Direct measurements of protein-stabilized gold nanoparticle interactions.

    PubMed

    Eichmann, Shannon L; Bevan, Michael A

    2010-09-21

    We report integrated video and total internal reflection microscopy measurements of protein-stabilized 110 nm Au nanoparticles confined in 280 nm gaps in physiological media. Measured potential energy profiles display quantitative agreement with Brownian dynamics simulations that include hydrodynamic interactions and camera exposure time and noise effects. Our results demonstrate agreement between measured nonspecific van der Waals and adsorbed protein interactions and theoretical potentials. Confined lateral nanoparticle diffusivity measurements also display excellent agreement with predictions. These findings provide a basis to interrogate specific biomacromolecular interactions in similar experimental configurations and to design future improved measurement methods.

  13. High Throughput Measurement of Extracellular DNA Release and Quantitative NET Formation in Human Neutrophils In Vitro.

    PubMed

    Sil, Payel; Yoo, Dae-Goon; Floyd, Madison; Gingerich, Aaron; Rada, Balazs

    2016-06-18

    Neutrophil granulocytes are the most abundant leukocytes in human blood and are the first to arrive at the site of infection. Neutrophils have developed several antimicrobial mechanisms, including phagocytosis, degranulation and the formation of neutrophil extracellular traps (NETs). NETs consist of a DNA scaffold decorated with histones and several granule markers, including myeloperoxidase (MPO) and human neutrophil elastase (HNE). NET release is an active process involving characteristic morphological changes of neutrophils that lead to the expulsion of their DNA into the extracellular space. NETs are essential to fight microbes, but uncontrolled NET release has been associated with several disorders. To learn more about the clinical relevance and mechanism of NET formation, reliable tools capable of quantitating NETs are needed. Here three methods are presented that can assess NET release from human neutrophils in vitro. The first is a high-throughput assay measuring extracellular DNA release from human neutrophils using a membrane-impermeable DNA-binding dye. In addition, two other methods are described that quantitate NET formation by measuring levels of NET-specific MPO-DNA and HNE-DNA complexes. In combination, these microplate-based methods provide efficient tools to study the mechanism and regulation of NET formation in human neutrophils.
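    Fluorescence readouts from such plate-based DNA-release assays are commonly normalized to a total-lysis control; a hypothetical sketch of that computation (the function name and the normalization to detergent-lysed cells are illustrative conventions, not taken from this protocol):

```python
def percent_dna_release(sample_rfu, blank_rfu, total_lysis_rfu):
    # Background-subtracted fluorescence (relative fluorescence units)
    # expressed as a percentage of the signal from fully lysed cells.
    return 100.0 * (sample_rfu - blank_rfu) / (total_lysis_rfu - blank_rfu)
```

    The same normalization applies per well, so a whole plate can be reduced to percent-release values for each stimulus and time point.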

  14. Experimental study of quantitative assessment of left ventricular mass with contrast enhanced real-time three-dimensional echocardiography.

    PubMed

    Zhuang, Lei; Wang, Xin-Fang; Xie, Ming-Xing; Chen, Li-Xin; Fei, Hong-Wen; Yang, Ying; Wang, Jing; Huang, Run-Qing; Chen, Ou-Di; Wang, Liang-Yu

    2004-01-01

    To evaluate the feasibility and accuracy of measurement of left ventricular mass with intravenous contrast enhanced real-time three-dimensional (RT3D) echocardiography in the experimental setting. RT3D echocardiography was performed in 13 open-chest mongrel dogs before and after intravenous infusion of a perfluorocarbon contrast agent. Left ventricular myocardium volume was measured according to the apical four-plane method provided by TomTec 4D cardio-View RT1.0 software, then the left ventricular mass was calculated as the myocardial volume multiplied by the relative density of myocardium. Correlative analysis and paired t-test were performed between left ventricular mass obtained from RT3D echocardiography and the anatomic measurements. Anatomic measurement of total left ventricular mass was 55.6 +/- 9.3 g, whereas RT3D echocardiographic calculation of left ventricular mass before and after intravenous perfluorocarbon contrast agent was 57.5 +/- 11.4 and 55.5 +/- 9.3 g, respectively. A significant correlation was observed between the RT3D echocardiographic estimates of total left ventricular mass and the corresponding anatomic measurements (r = 0.95). A strong correlation was found between RT3D echocardiographic estimates of left ventricular mass with perfluorocarbon contrast and the anatomic results (r = 0.99). Analysis of intraobserver and interobserver variability showed strong indexes of agreement in the measurement of left ventricular mass with pre and post-contrast RT3D echocardiography. Measurements of left ventricular mass derived from RT3D echocardiography with and without intravenous contrast showed a significant correlation with the anatomic results. Contrast enhanced RT3D echocardiography permitted better visualization of the endocardial border, which would provide a more accurate and reliable means of determining left ventricular myocardial mass in the experimental setting.
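    The mass calculation described (myocardial volume multiplied by the relative density of myocardium) is a one-line computation; the sketch below assumes the commonly used myocardial density of about 1.05 g/ml:

```python
def lv_mass_grams(myocardial_volume_ml, density_g_per_ml=1.05):
    # Left ventricular mass = myocardial volume (ml) x myocardial
    # density (g/ml); 1.05 g/ml is a standard assumed value.
    return myocardial_volume_ml * density_g_per_ml
```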

  15. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.
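    The two distance measures compared can be sketched for a pair of fatty-acid signatures (vectors of proportions); the Aitchison distance operates on centered log-ratio transforms, and the KL form shown is a symmetrized divergence written here as an illustration of the general idea rather than the exact estimator objective used in the study:

```python
import numpy as np

def aitchison_distance(x, y):
    # Euclidean distance between the centered log-ratio (clr)
    # transforms of two compositional signatures.
    clr_x = np.log(x) - np.log(x).mean()
    clr_y = np.log(y) - np.log(y).mean()
    return float(np.sqrt(np.sum((clr_x - clr_y) ** 2)))

def kl_distance(x, y):
    # Symmetrized Kullback-Leibler divergence between two signatures.
    return float(np.sum((x - y) * (np.log(x) - np.log(y))))
```

    Diet estimation then amounts to finding the prey mixture whose (calibration-adjusted) signature minimizes the chosen distance to the predator signature.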

  16. Calibrated Passive Sampling--Multi-plot Field Measurements of NH3 Emissions with a Combination of Dynamic Tube Method and Passive Samplers.

    PubMed

    Pacholski, Andreas

    2016-03-21

    Agricultural ammonia (NH3) emissions (90% of total EU emissions) are responsible for about 45% of airborne eutrophication, 31% of soil acidification and 12% of fine dust formation within the EU15. NH3 emissions also represent a considerable loss of nutrients. Many studies on NH3 emission from organic and mineral fertilizer application have been performed in recent decades. Nevertheless, research on NH3 emissions after fertilizer application is still limited, in particular with respect to the relationships between emissions, fertilizer type, site conditions and crop growth. Due to the variable response of crops to treatments, effects can only be validated in experimental designs that include field replication for statistical testing. The dominant ammonia loss methods yielding quantitative emissions require large field areas, expensive equipment or a power supply, which restricts their application in replicated field trials. This protocol describes a new methodology for measuring NH3 emissions on many plots, linking a simple semi-quantitative measuring method used in all plots with a quantitative method applied by simultaneous measurements with both methods on selected plots. The semi-quantitative measurements are made with passive samplers. The second method is a dynamic chamber method (Dynamic Tube Method) used to obtain a transfer quotient, which converts the semi-quantitative losses from the passive samplers to quantitative losses (kg nitrogen ha(-1)). The principle underlying this approach is that passive samplers placed in a homogeneous experimental field show the same NH3 absorption behavior under identical environmental conditions. Therefore, a transfer coefficient obtained from single passive samplers can be used to scale the values of all passive samplers used in the same field trial. The method proved valid under a wide range of experimental conditions and is recommended for use under conditions with bare soil or small canopies (<0.3 m). Results obtained from
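    The scaling step described (converting semi-quantitative passive-sampler values to quantitative losses via a transfer coefficient from paired chamber measurements) can be sketched as follows; function and variable names are illustrative:

```python
def transfer_coefficient(chamber_loss_kg_n_ha, passive_sampler_value):
    # Ratio of the quantitative loss (Dynamic Tube Method) to the
    # semi-quantitative passive-sampler reading on the same plot.
    return chamber_loss_kg_n_ha / passive_sampler_value

def quantitative_losses(passive_values, coefficient):
    # Scale every plot's passive-sampler value to kg N per hectare.
    return [v * coefficient for v in passive_values]
```

    Because all samplers in a homogeneous field share the same absorption behavior, one coefficient suffices to scale every plot in the trial.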

  17. Absolute Scale Quantitative Off-Axis Electron Holography at Atomic Resolution

    NASA Astrophysics Data System (ADS)

    Winkler, Florian; Barthel, Juri; Tavabi, Amir H.; Borghardt, Sven; Kardynal, Beata E.; Dunin-Borkowski, Rafal E.

    2018-04-01

    An absolute scale match between experiment and simulation in atomic-resolution off-axis electron holography is demonstrated, with unknown experimental parameters determined directly from the recorded electron wave function using an automated numerical algorithm. We show that the local thickness and tilt of a pristine thin WSe2 flake can be measured uniquely, whereas some electron optical aberrations cannot be determined unambiguously for a periodic object. The ability to determine local specimen and imaging parameters directly from electron wave functions is of great importance for quantitative studies of electrostatic potentials in nanoscale materials, in particular when performing in situ experiments and considering that aberrations change over time.

  18. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.

  19. Effective wavefront aberration measurement of spectacle lenses in as-worn status

    NASA Astrophysics Data System (ADS)

    Jia, Zhigang; Xu, Kai; Fang, Fengzhou

    2018-04-01

    An effective wavefront aberration analysis method for measuring spectacle lenses in as-worn status was proposed and verified using an experimental apparatus based on an eye rotation model. Two strategies were employed to improve the accuracy of measurement of the effective wavefront aberrations on the corneal sphere. The influences of three as-worn parameters (the vertex distance, pantoscopic angle, and face form angle), together with the eye rotation and the corresponding incident beams, were objectively and quantitatively obtained. Experimental measurements of spherical single-vision and freeform progressive addition lenses demonstrate the accuracy and validity of the proposed method and apparatus, which offer a potential means of achieving supernormal vision correction, with customization and personalization, by optimizing the as-worn design of spectacle lenses and evaluating their manufacturing and imaging quality.

  20. An experimental approach to the fundamental principles of hemodynamics.

    PubMed

    Pontiga, Francisco; Gaytán, Susana P

    2005-09-01

    An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, and the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
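    The two phenomena named first, pressure drop in a straight vessel and the laminar-turbulent transition, are governed by the Hagen-Poiseuille law and the Reynolds number; a minimal sketch (SI units, illustrative thresholds):

```python
import math

def poiseuille_pressure_drop(mu, length, flow_rate, radius):
    # Hagen-Poiseuille law for laminar flow in a straight vessel:
    # dP = 8 * mu * L * Q / (pi * r^4)
    return 8.0 * mu * length * flow_rate / (math.pi * radius ** 4)

def reynolds_number(rho, velocity, diameter, mu):
    # Re = rho * v * d / mu; values below roughly 2000 indicate
    # laminar flow in a straight tube.
    return rho * velocity * diameter / mu
```

    The fourth-power dependence on radius is what makes a modest stenosis so costly in pressure terms, the central point of the obstructed-vessel experiment.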

  1. Multidimensional analysis of the frequencies and rates of cytokine secretion from single cells by quantitative microengraving.

    PubMed

    Han, Qing; Bradshaw, Elizabeth M; Nilsson, Björn; Hafler, David A; Love, J Christopher

    2010-06-07

    The large diversity of cells that comprise the human immune system requires methods that can resolve the individual contributions of specific subsets to an immunological response. Microengraving is a process that uses a dense, elastomeric array of microwells to generate microarrays of proteins secreted from large numbers of individual live cells (approximately 10^4-10^5 cells/assay). In this paper, we describe an approach based on this technology to quantify the rates of secretion from single immune cells. Numerical simulations of the microengraving process indicated an operating regime between 30 min and 4 h that permits quantitative analysis of the rates of secretion. Through experimental validation, we demonstrate that microengraving can provide quantitative measurements of both the frequencies and the distribution of rates of secretion for up to four cytokines simultaneously released from individual viable primary immune cells. The experimental limits of detection ranged from 0.5 to 4 molecules/s for IL-6, IL-17, IFN-gamma, IL-2, and TNF-alpha. These multidimensional measures resolve the number and intensities of responses by cells exposed to stimuli with greater sensitivity than single-parameter assays for cytokine release. We show that cells from different donors exhibit distinct responses, based on both the frequency and magnitude of cytokine secretion, when stimulated under different activating conditions. Primary T cells with specific profiles of secretion can also be recovered after microengraving for subsequent expansion in vitro. These examples demonstrate the utility of quantitative, multidimensional profiles of single cells for analyzing the diversity and dynamics of immune responses in vitro and for identifying rare cells from clinical samples.

  2. Measurement and prediction of the thermomechanical response of shape memory alloy hybrid composite beams

    NASA Astrophysics Data System (ADS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-05-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  3. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  4. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    PubMed

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.
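The probability-of-detection reasoning described in this abstract can be sketched with a toy model: if the driver gets several opportunities to view the signal, each with some chance of fixating it and of identifying its aspect, the chance of missing it entirely is the product of the per-opportunity miss probabilities. This is only an illustration of the style of calculation; the probabilities below are hypothetical, not figures from the study.

```python
# Toy signal-detection model in the spirit of the SPAD analysis above.
# All probability values are hypothetical illustration values.

def p_signal_missed(glimpses):
    """Probability the driver misses the signal entirely, given a list of
    (p_fixate, p_identify) pairs, one per opportunity to view the signal.
    Opportunities are treated as independent."""
    p_miss = 1.0
    for p_fix, p_id in glimpses:
        p_miss *= 1.0 - p_fix * p_id
    return p_miss

# Three viewing opportunities under degraded sighting conditions
# (obstruction, glare, high approach speed) -- hypothetical values:
glimpses = [(0.6, 0.5), (0.4, 0.5), (0.3, 0.7)]
print(round(p_signal_missed(glimpses), 3))
```

Under these assumed values the signal is missed on roughly four approaches in ten, which is the kind of quantitative output that can then drive recommendations on signal placement.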

  5. Peripheral Quantitative Computed Tomography: Measurement Sensitivity in Persons With and Without Spinal Cord Injury

    PubMed Central

    Shields, Richard K.; Dudley-Javoroski, Shauna; Boaldin, Kathryn M.; Corey, Trent A.; Fog, Daniel B.; Ruen, Jacquelyn M.

    2012-01-01

Objectives To determine (1) the error attributable to external tibia-length measurements by using peripheral quantitative computed tomography (pQCT) and (2) the effect these errors have on scan location and tibia trabecular bone mineral density (BMD) after spinal cord injury (SCI). Design Blinded comparison and criterion standard in matched cohorts. Setting Primary care university hospital. Participants Eight able-bodied subjects underwent tibia length measurement. A separate cohort of 7 men with SCI and 7 able-bodied age-matched male controls underwent pQCT analysis. Interventions Not applicable. Main Outcome Measures The projected worst-case tibia-length-measurement error translated into a pQCT slice placement error of ±3 mm. We collected pQCT slices at the distal 4% tibia site, 3 mm proximal and 3 mm distal to that site, and then quantified BMD error attributable to slice placement. Results Absolute BMD error was greater for able-bodied than for SCI subjects (5.87 mg/cm3 vs 4.5 mg/cm3). However, the percentage error in BMD was larger for SCI than able-bodied subjects (4.56% vs 2.23%). Conclusions During cross-sectional studies of various populations, BMD differences up to 5% may be attributable to variation in limb-length-measurement error. PMID:17023249
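A quick back-of-envelope check clarifies why the percentage error is larger after SCI even though the absolute error is smaller: percentage error divides by mean BMD, which is much lower in the SCI group. The mean BMD values below are implied from the reported error figures, not taken directly from the paper.

```python
# Mean BMD implied by the reported absolute and percentage errors:
# pct_error = 100 * abs_error / mean_BMD, so mean_BMD = abs_error / (pct/100).

def implied_mean_bmd(abs_error, pct_error):
    """Mean BMD (mg/cm3) implied by an absolute and a percentage error."""
    return abs_error / (pct_error / 100.0)

able = implied_mean_bmd(5.87, 2.23)   # able-bodied group
sci = implied_mean_bmd(4.5, 4.56)     # SCI group
print(round(able, 1), round(sci, 1))
```

The implied mean BMD of the SCI group is roughly a third of the able-bodied value, which is consistent with the severe trabecular bone loss the abstract describes.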

  6. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  7. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  8. Medical student attitudes towards older people: a critical review of quantitative measures.

    PubMed

    Wilson, Mark A G; Kurrle, Susan; Wilson, Ian

    2018-01-24

    Further research into medical student attitudes towards older people is important, and requires accurate and detailed evaluative methodology. The two objectives for this paper are: (1) From the literature, to critically review instruments of measure for medical student attitudes towards older people, and (2) To recommend the most appropriate quantitative instrument for future research into medical student attitudes towards older people. A SCOPUS and Ovid cross search was performed using the keywords Attitude and medical student and aged or older or elderly. This search was supplemented by manual searching, guided by citations in articles identified by the initial literature search, using the SCOPUS and PubMed databases. International studies quantifying medical student attitudes have demonstrated neutral to positive attitudes towards older people, using various instruments. The most commonly used instruments are the Ageing Semantic Differential (ASD) and the University of California Los Angeles Geriatric Attitudes Scale, with several other measures occasionally used. All instruments used to date have inherent weaknesses. A reliable and valid instrument with which to quantify modern medical student attitudes towards older people has not yet been developed. Adaptation of the ASD for contemporary usage is recommended.

  9. A novel Raman spectrophotometric method for quantitative measurement of nucleoside triphosphate hydrolysis.

    PubMed

    Jenkins, R H; Tuma, R; Juuti, J T; Bamford, D H; Thomas, G J

    1999-01-01

    A novel spectrophotometric method, based upon Raman spectroscopy, has been developed for accurate quantitative determination of nucleoside triphosphate phosphohydrolase (NTPase) activity. The method relies upon simultaneous measurement in real time of the intensities of Raman marker bands diagnostic of the triphosphate (1115 cm(-1)) and diphosphate (1085 cm(-1)) moieties of the NTPase substrate and product, respectively. The reliability of the method is demonstrated for the NTPase-active RNA-packaging enzyme (protein P4) of bacteriophage phi6, for which comparative NTPase activities have been estimated independently by radiolabeling assays. The Raman-determined rate for adenosine triphosphate substrate (8.6 +/- 1.3 micromol x mg(-1) x min(-1) at 40 degrees C) is in good agreement with previous estimates. The versatility of the Raman method is demonstrated by its applicability to a variety of nucleotide substrates of P4, including the natural ribonucleoside triphosphates (ATP, GTP) and dideoxynucleoside triphosphates (ddATP, ddGTP). Advantages of the present protocol include conservative sample requirements (approximately 10(-6) g enzyme/protocol) and relative ease of data collection and analysis. The latter conveniences are particularly advantageous for the measurement of activation energies of phosphohydrolase activity.
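The real-time rate extraction described above amounts to fitting the decay of the triphosphate marker band against time and converting the slope to a turnover rate via calibration. The band position (1115 cm(-1)) is from the abstract; the time series and the least-squares step below are synthetic illustration, not the paper's data.

```python
# Sketch of extracting an initial hydrolysis rate from a Raman time series:
# the 1115 cm^-1 triphosphate band intensity decays approximately linearly
# in the initial-rate regime, and the slope (after calibration against a
# known NTP concentration) gives the rate. Data below are synthetic.

def fit_slope(t, y):
    """Ordinary least-squares slope of y against t."""
    n = len(t)
    mt = sum(t) / n
    my = sum(y) / n
    num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

times = [0, 2, 4, 6, 8, 10]                    # minutes
band = [1.00, 0.91, 0.82, 0.73, 0.64, 0.55]    # normalized 1115 cm^-1 intensity
slope = fit_slope(times, band)
print(round(-slope, 3))                        # fraction of substrate per min
```

Multiplying this fractional rate by the calibrated starting NTP concentration and dividing by the enzyme mass would yield a specific activity comparable to the micromol x mg(-1) x min(-1) figures quoted in the abstract.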

  10. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    PubMed

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C(2) have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C(2) chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C(2) in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C(3) and C(5) fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assist the further understanding of the role of C(2) and of its potential chemical interdependences with CH and other small radicals.

  11. Data Generated by Quantitative Liquid Chromatography-Mass Spectrometry Proteomics Are Only the Start and Not the Endpoint: Optimization of Quantitative Concatemer-Based Measurement of Hepatic Uridine-5'-Diphosphate-Glucuronosyltransferase Enzymes with Reference to Catalytic Activity.

    PubMed

    Achour, Brahim; Dantonio, Alyssa; Niosi, Mark; Novak, Jonathan J; Al-Majdoub, Zubida M; Goosen, Theunis C; Rostami-Hodjegan, Amin; Barber, Jill

    2018-06-01

Quantitative proteomic methods require optimization at several stages, including sample preparation, liquid chromatography-tandem mass spectrometry (LC-MS/MS), and data analysis, with the final analysis stage being less widely appreciated by end-users. Previously reported measurements of eight uridine-5'-diphospho-glucuronosyltransferases (UGTs) generated by two laboratories [using stable isotope-labeled (SIL) peptides or quantitative concatemer (QconCAT)] reflected significant disparity between proteomic methods. Initial analysis of QconCAT data showed a lack of correlation with catalytic activity for several UGTs (1A4, 1A6, 1A9, 2B15) and moderate correlations for UGTs 1A1, 1A3, and 2B7 (Rs = 0.40-0.79, P < 0.05; R2 = 0.30); good correlations were demonstrated between cytochrome P450 activities and abundances measured in the same experiments. Consequently, a systematic review of data analysis, starting from unprocessed LC-MS/MS data, was undertaken with the aim of improving accuracy, defined by correlation against activity. Three main criteria were found to be important: choice of monitored peptides and fragments, correction for isotope-label incorporation, and abundance normalization using fractional protein mass. Upon optimization, abundance-activity correlations improved significantly for six UGTs (Rs = 0.53-0.87, P < 0.01; R2 = 0.48-0.73); UGT1A9 showed moderate correlation (Rs = 0.47, P = 0.02; R2 = 0.34). No spurious abundance-activity relationships were identified. However, methods remained suboptimal for UGT1A3 and UGT1A9; here hydrophobicity of standard peptides is believed to be limiting. This commentary provides a detailed data analysis strategy and indicates, using examples, the significance of systematic data processing following acquisition. The proposed strategy offers significant improvement on existing guidelines applicable to clinically relevant proteins quantified using QconCAT.
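Two of the data-analysis corrections named above, correcting for isotope-label incorporation of the standard and normalizing abundance by fractional protein mass, can be sketched as below. This is one plausible reading of those steps for illustration only; the function, its parameters, and the numbers are hypothetical and not the paper's exact scheme.

```python
# Hedged sketch of two corrections from the abstract: (1) dividing by the
# measured heavy-label incorporation of the QconCAT standard, and
# (2) normalizing by the fraction of sample mass that is protein.
# All names and values here are illustrative assumptions.

def corrected_abundance(light_heavy_ratio, std_amount_fmol,
                        label_incorporation, protein_mass_fraction):
    """Return an abundance estimate per unit total sample mass.

    light_heavy_ratio     : observed analyte/standard peak-area ratio
    std_amount_fmol       : spiked QconCAT standard amount (fmol)
    label_incorporation   : fraction of standard carrying the heavy label
    protein_mass_fraction : protein mass / total sample mass
    """
    raw = light_heavy_ratio * std_amount_fmol / label_incorporation
    return raw * protein_mass_fraction

print(round(corrected_abundance(0.8, 100.0, 0.95, 0.5), 2))
```

The point of the sketch is the shape of the pipeline: skipping either correction biases every downstream abundance-activity correlation, which is the failure mode the commentary documents.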

  12. Experimental Measurement of In Situ Stress

    NASA Astrophysics Data System (ADS)

    Tibbo, Maria; Milkereit, Bernd; Nasseri, Farzine; Schmitt, Douglas; Young, Paul

    2016-04-01

The World Stress Map data is determined by stress indicators including earthquake focal mechanisms, in situ measurement in mining, oil and gas boreholes as well as the borehole cores, and geologic data. Unfortunately, these measurements are not only infrequent but sometimes infeasible, and do not provide nearly enough data points with high accuracy to correctly infer stress fields in deep mines around the world. Improvement in stress measurement of Earth's crust is fundamental to several industries such as oil and gas, mining, nuclear waste management, and enhanced geothermal systems. Quantifying the state of stress and the geophysical properties of different rock types is a major complication in geophysical monitoring of deep mines. Most stress measurement techniques involve either the boreholes or their cores, however these measurements usually only give stress along one axis, not the complete stress tensor. The goal of this project is to investigate a new method of acquiring a complete stress tensor of the in situ stress in the Earth's crust. This project is part of a comprehensive, exploration geophysical study in a deep, highly stressed mine located in Sudbury, Ontario, Canada, and focuses on two boreholes located in this mine. These boreholes are approximately 400 m long with NQ diameters and are located at depths of about 1300 - 1600 m and 1700 - 2000 m. Two borehole logging surveys were performed on both boreholes, October 2013 and July 2015, in order to perform a time-lapse analysis of the geophysical changes in the mine. These multi-parameter surveys include caliper, full waveform sonic, televiewer, chargeability (IP), and resistivity. Laboratory experiments have been performed on borehole core samples of varying geologies from each borehole. These experiments have measured the geophysical properties including elastic modulus, bulk modulus, P- and S-wave velocities, and density. The apparatuses used for this project are geophysical imaging cells capable

  13. Effects of Filtering on Experimental Blast Overpressure Measurements.

    PubMed

    Alphonse, Vanessa D; Kemper, Andrew R; Duma, Stefan M

    2015-01-01

When access to live-fire test facilities is limited, experimental studies of blast-related injuries necessitate the use of a shock tube or Advanced Blast Simulator (ABS) to mimic free-field blast overpressure. However, modeling blast overpressure in a laboratory setting potentially introduces experimental artifacts into measured responses. Due to the high sampling rates required to capture a blast overpressure event, proximity to alternating current (AC) powered electronics and poorly strain-relieved or unshielded wires can result in artifacts in the recorded overpressure trace. Data in this study were collected for tests conducted on an empty ABS (“Empty Tube”) using high-frequency pressure sensors specifically designed for blast loading rates (n=5). Additionally, intraocular overpressure data (“IOP”) were collected for porcine eyes potted inside synthetic orbits located inside the ABS using an unshielded miniature pressure sensor (n=3). All tests were conducted at a 30 psi static overpressure level. A 4th-order phaseless low-pass Butterworth software filter was applied to the data. Various cutoff frequencies were examined to determine whether the raw shock wave parameter values could be preserved while eliminating noise and artifacts. A Fast Fourier Transform (FFT) was applied to each test to examine the frequency spectra of the raw and filtered signals. Shock wave parameters (time of arrival, peak overpressure, positive duration, and positive impulse) were quantified using a custom MATLAB® script. Lower cutoff frequencies attenuated the raw signal, effectively decreasing the peak overpressure and increasing the positive duration. Rise time was not preserved in the filtered data. A CFC 6000 filter preserved the remaining shock wave parameters within ±2.5% of the average raw values for the Empty Tube test data. A CFC 7000 filter removed experimental high-frequency artifacts and preserved the remaining shock wave parameters within ±2.5% of the average raw values for
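A "phaseless" filter of the kind described runs a low-pass filter forward and then backward over the trace, so the phase shifts of the two passes cancel and the time of arrival is not displaced. In Python this is typically done with `scipy.signal.butter` plus `scipy.signal.filtfilt`; the dependency-free sketch below substitutes a simple first-order recursive low-pass for the 4th-order Butterworth, so it illustrates the forward-backward idea rather than the paper's exact filter.

```python
import math

def lowpass(x, alpha):
    """First-order recursive low-pass; smoothing factor 0 < alpha <= 1."""
    y, acc = [], x[0]
    for v in x:
        acc += alpha * (v - acc)
        y.append(acc)
    return y

def zero_phase_lowpass(x, alpha):
    """Apply the filter forward, then backward, so phase shifts cancel."""
    fwd = lowpass(x, alpha)
    return lowpass(fwd[::-1], alpha)[::-1]

# A crude Friedlander-like overpressure trace with alternating-sign
# high-frequency noise added (both entirely synthetic):
raw = [math.exp(-t / 20.0) * (1 - t / 40.0) + 0.2 * (-1) ** t
       for t in range(80)]
smooth = zero_phase_lowpass(raw, 0.3)

# The sample-to-sample jumps caused by the noise are strongly attenuated:
print(max(abs(a - b) for a, b in zip(smooth[1:], smooth[:-1])) <
      max(abs(a - b) for a, b in zip(raw[1:], raw[:-1])))
```

The trade-off the abstract quantifies follows directly from this structure: a lower cutoff (larger effective smoothing) rounds off the shock front, reducing peak overpressure and stretching positive duration, which is why the CFC 6000/7000 cutoffs were chosen to stay within ±2.5% of the raw parameters.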

  14. Quantitative confirmation of diffusion-limited oxidation theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, K.T.; Clough, R.L.

    1990-01-01

Diffusion-limited (heterogeneous) oxidation effects are often important for studies of polymer degradation. Such effects are common in polymers subjected to ionizing radiation at relatively high dose rate. To better understand the underlying oxidation processes and to aid in the planning of accelerated aging studies, it would be desirable to be able to monitor and quantitatively understand these effects. In this paper, we briefly review a theoretical diffusion approach which derives model profiles for oxygen-surrounded sheets of material by combining oxygen permeation rates with kinetically based oxygen consumption expressions. The theory leads to a simple governing expression involving the oxygen consumption and permeation rates together with two model parameters α and β. To test the theory, gamma-initiated oxidation of a sheet of commercially formulated EPDM rubber was performed under conditions which led to diffusion-limited oxidation. Profile shapes from the theoretical treatments are shown to accurately fit experimentally derived oxidation profiles. In addition, direct measurements on the same EPDM material of the oxygen consumption and permeation rates, together with values of α and β derived from the fitting procedure, allow us to quantitatively confirm for the first time the governing theoretical relationship.
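The qualitative shape of such profiles can be reproduced numerically from a steady-state diffusion-reaction balance. The sketch below assumes the consumption kinetics take the saturable two-parameter form rate ∝ αc/(1 + βc) mentioned in the abstract, with both faces of the sheet held at the surface oxygen concentration; the parameter values are illustrative, not the paper's EPDM measurements.

```python
# Steady-state balance D*c'' = alpha*c/(1 + beta*c) across a sheet
# (thickness normalized to 1) with both faces at the surface oxygen
# concentration, solved by Gauss-Seidel relaxation. Parameters are
# illustrative only.

def oxidation_profile(n=41, c_surf=1.0, D=1.0, alpha=200.0, beta=2.0,
                      sweeps=20000):
    h = 1.0 / (n - 1)
    c = [c_surf] * n
    for _ in range(sweeps):
        for i in range(1, n - 1):
            react = (h * h / D) * alpha * c[i] / (1.0 + beta * c[i])
            c[i] = 0.5 * (c[i - 1] + c[i + 1] - react)
            if c[i] < 0.0:
                c[i] = 0.0
    # Local oxidation rate follows the assumed kinetics, c/(1 + beta*c):
    rate = [ci / (1.0 + beta * ci) for ci in c]
    return c, rate

c, rate = oxidation_profile()
# Heterogeneous oxidation: the rate at the sheet center falls far below
# the rate at the oxygen-rich surfaces.
print(rate[len(rate) // 2] < 0.5 * rate[0])
```

When consumption outpaces permeation, as here, oxygen is exhausted before reaching the center and the oxidation profile develops the characteristic edge-heavy, U-shaped form that the paper fits to experimental profiles.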

  15. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). PIXE results are usually normalized in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples, based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. Results indicated the reliability of STIM mass measurement performed on biological samples and suggested that STIM should be performed for PIXE normalization. Further information deriving from direct confrontation of AFM and STIM analyses could also be obtained, such as in situ measurement of specific gravity within cell compartments (nucleolus and cytoplasm).

  16. Quantitative scanning thermal microscopy of ErAs/GaAs superlattice structures grown by molecular beam epitaxy

    NASA Astrophysics Data System (ADS)

    Park, K. W.; Nair, H. P.; Crook, A. M.; Bank, S. R.; Yu, E. T.

    2013-02-01

A proximal probe-based quantitative measurement of thermal conductivity with ~100-150 nm lateral and vertical spatial resolution has been implemented. Measurements on an ErAs/GaAs superlattice structure grown by molecular beam epitaxy with 3% volumetric ErAs content yielded thermal conductivity at room temperature of 9 ± 2 W/m K, approximately five times lower than that for GaAs. Numerical modeling of phonon scattering by ErAs nanoparticles yielded thermal conductivities in reasonable agreement with those measured experimentally and provides insight into the potential influence of nanoparticle shape on phonon scattering. Measurements of wedge-shaped samples created by focused ion beam milling provide direct confirmation of depth resolution achieved.

  17. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
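The coarse-to-fine fixed-point idea can be illustrated with a toy 1-D stand-in for the quantitative photoacoustic problem: the data are p = μ·Φ(μ) (absorbed energy equals absorption coefficient times fluence), so μ ← p / Φ(μ) is a natural fixed-point update, and iterating first on a coarse grid gives the fine grid a good starting point. The exponential-attenuation forward model below is an illustrative assumption, not the paper's diffusion model.

```python
import math

def fluence(mu, h):
    """Toy 1-D fluence: light enters at x=0 and decays as exp(-integral of mu)."""
    phi, s = [], 0.0
    for m in mu:
        phi.append(math.exp(-s))
        s += m * h
    return phi

def fixed_point(p, h, mu0, iters):
    """mu <- p / phi(mu): fixed-point iteration for data p = mu * phi(mu)."""
    mu = list(mu0)
    for _ in range(iters):
        phi = fluence(mu, h)
        mu = [pi / max(f, 1e-12) for pi, f in zip(p, phi)]
    return mu

def upsample(mu):
    """Linear interpolation onto a grid twice as fine."""
    out = []
    for a, b in zip(mu, mu[1:]):
        out += [a, 0.5 * (a + b)]
    return out + [mu[-1]]

n = 65
h = 1.0 / (n - 1)
mu_true = [0.5 + 0.4 * (i > n // 2) for i in range(n)]       # true absorption
p = [m * f for m, f in zip(mu_true, fluence(mu_true, h))]    # simulated data

mu_c = fixed_point(p[::2], 2 * h, [1.0] * ((n + 1) // 2), 10)  # coarse level
mu_f = fixed_point(p, h, upsample(mu_c), 6)                    # fine level
print(max(abs(a - b) for a, b in zip(mu_f, mu_true)) < 1e-2)
```

Because most of the convergence happens on the cheap coarse grid, only a few fine-grid iterations are needed, which is the cost saving the multigrid framework exploits.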

  18. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport of intensity (TIE) equation and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
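The TIE step can be illustrated in one dimension: for uniform intensity I0, the transport-of-intensity equation reduces to a Poisson equation, ∇²φ = -(k/I0)·∂I/∂z, which a finite-difference tridiagonal solve inverts directly. The 2-D VHM reconstruction follows the same pattern with a 2-D Poisson solver; the phase profile and derivative below are synthetic, with the constant k/I0 folded into the units.

```python
# 1-D TIE phase recovery sketch: build the right-hand side from a known
# phase, then recover that phase by solving the discrete Poisson equation
# with the Thomas (tridiagonal) algorithm and Dirichlet edge values.

def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solves a_i*x_{i-1} + b_i*x_i + c_i*x_{i+1} = d_i."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

n, h = 65, 1.0 / 64
phi_true = [(i * h) ** 2 * (1 - i * h) for i in range(n)]   # synthetic phase

# Synthetic axial intensity derivative consistent with the discrete TIE:
rhs = [-(phi_true[i - 1] - 2 * phi_true[i] + phi_true[i + 1]) / h ** 2
       for i in range(1, n - 1)]

a = [1.0 / h ** 2] * (n - 2)
b = [-2.0 / h ** 2] * (n - 2)
c = [1.0 / h ** 2] * (n - 2)
d = [-r for r in rhs]
d[0] -= phi_true[0] / h ** 2      # fold Dirichlet edge values into the RHS
d[-1] -= phi_true[-1] / h ** 2
phi = [phi_true[0]] + solve_tridiagonal(a, b, c, d) + [phi_true[-1]]
print(max(abs(p - q) for p, q in zip(phi, phi_true)) < 1e-9)
```

The VHM's contribution is supplying the multiple focal planes from a single exposure; once the axial intensity derivative is in hand, the phase recovery itself is this kind of deterministic Poisson solve.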

  19. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
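The OTU overlap statistic reported above can be computed as the shared fraction of OTUs across technical replicates. Here it is taken as |intersection| / |union|; the paper's exact definition may differ slightly, and the OTU labels below are placeholders.

```python
# OTU overlap between technical replicates: the percentage of OTUs found
# in every replicate, relative to all OTUs observed in any replicate.

def otu_overlap(*replicates):
    sets = [set(r) for r in replicates]
    common = set.intersection(*sets)
    union = set.union(*sets)
    return 100.0 * len(common) / len(union)

rep1 = {"otu1", "otu2", "otu3", "otu4", "otu7"}
rep2 = {"otu2", "otu3", "otu5", "otu6", "otu7"}
print(round(otu_overlap(rep1, rep2), 1))
```

Adding a third replicate to the intersection can only shrink the overlap, which is why the abstract reports a lower figure (8.2%) for three replicates than for two (17.2%).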

  20. A quantitative measure for degree of automation and its relation to system performance and mental load.

    PubMed

    Wei, Z G; Macwan, A P; Wieringa, P A

    1998-06-01

    In this paper we quantitatively model degree of automation (DofA) in supervisory control as a function of the number and nature of tasks to be performed by the operator and automation. This model uses a task weighting scheme in which weighting factors are obtained from task demand load, task mental load, and task effect on system performance. The computation of DofA is demonstrated using an experimental system. Based on controlled experiments using operators, analyses of the task effect on system performance, the prediction and assessment of task demand load, and the prediction of mental load were performed. Each experiment had a different DofA. The effect of a change in DofA on system performance and mental load was investigated. It was found that system performance became less sensitive to changes in DofA at higher levels of DofA. The experimental data showed that when the operator controlled a partly automated system, perceived mental load could be predicted from the task mental load for each task component, as calculated by analyzing a situation in which all tasks were manually controlled. Actual or potential applications of this research include a methodology to balance and optimize the automation of complex industrial systems.
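The weighted degree-of-automation idea can be sketched as follows: each task receives a weight derived from its demand load, mental load, and effect on system performance, and DofA is the weighted share of tasks handled by automation. This is one plausible reading for illustration; the task names and weights are hypothetical, not the paper's exact scheme.

```python
# Hedged sketch of a weighted degree-of-automation (DofA) computation.
# Weights would come from task demand load, mental load, and effect on
# system performance; the values here are illustrative.

def degree_of_automation(tasks):
    """tasks: list of (weight, automated) pairs; returns DofA in [0, 1]."""
    total = sum(w for w, _ in tasks)
    automated = sum(w for w, auto in tasks if auto)
    return automated / total

tasks = [
    (3.0, True),    # low-level control loop, automated
    (2.0, True),    # alarm filtering, automated
    (4.0, False),   # fault diagnosis, manual
    (1.0, False),   # logging, manual
]
print(degree_of_automation(tasks))
```

Under such a scheme, automating a heavily weighted task moves DofA much more than automating a trivial one, which matches the paper's finding that system performance responds unevenly to changes in DofA.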