Sample records for quantitative experimental results

  1. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial glucose assay strips is presented. (MVL)
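
    The quantitative step in such an exercise is a standard Beer-Lambert calculation: the absorbance read at 630 nm is converted to concentration through the slope of a standard curve. A minimal sketch (the slope value below is a made-up illustrative number, not from the article):

```python
def glucose_concentration(absorbance, slope=0.05):
    """Convert absorbance (AU) read at 630 nm to glucose concentration.

    Beer-Lambert: A = (epsilon * l) * c, so c = A / slope, where
    slope = epsilon * l is taken from a standard curve. The default
    slope (AU per mg/dL) is a hypothetical placeholder.
    """
    return absorbance / slope
```

    With this placeholder slope, an absorbance of 0.50 maps to 10 mg/dL; a sample reading zero absorbance is below the detection limit of the quantitative test, clarifying what a negative qualitative result can and cannot mean.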

  2. Modeling and experimental result analysis for high-power VECSELs

    NASA Astrophysics Data System (ADS)

    Zakharian, Aramais R.; Hader, Joerg; Moloney, Jerome V.; Koch, Stephan W.; Lutgen, Stephan; Brick, Peter; Albrecht, Tony; Grotsch, Stefan; Luft, Johann; Spath, Werner

    2003-06-01

    We present a comparison of experimental and microscopically based model results for optically pumped vertical external cavity surface emitting semiconductor lasers. The quantum well gain model is based on a quantitative ab initio approach that allows calculation of the complex material susceptibility as a function of wavelength, carrier density, and lattice temperature. The gain model is coupled to macroscopic thermal transport, spatially resolved in both the radial and longitudinal directions, with temperature- and carrier-density-dependent pump absorption. The radial distributions of the refractive index and gain due to temperature variation are computed. Thermal management issues, highlighted by the experimental data, are discussed. Experimental results indicate that the input power at which thermal roll-over occurs depends critically on the thermal resistance of the device; this requires minimization of the substrate thickness and optimization of the design and placement of the heatsink. The dependence of the model results on the radiative and non-radiative carrier recombination lifetimes and on the cavity losses is also evaluated.
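
    The roll-over behavior described above follows from a lumped thermal picture: the active-region temperature rise equals thermal resistance times dissipated power, so lowering the thermal resistance (thinner substrate, better heatsink) pushes roll-over to higher pump powers. A hedged sketch (function name and numbers are illustrative, not from the paper):

```python
def active_region_temperature(t_heatsink_k, r_th_k_per_w, p_dissipated_w):
    """Lumped-element estimate: T_active = T_heatsink + R_th * P_diss.

    Arguments in kelvin, K/W, and watts; the model ignores the spatial
    resolution of the full thermal transport calculation.
    """
    return t_heatsink_k + r_th_k_per_w * p_dissipated_w
```

    For example, at 30 K/W and 2 W dissipated, the active region sits 60 K above the heatsink; halving R_th halves that rise.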

  3. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other, leading to new insights into biomolecular dynamics and function.
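
    The FRET efficiencies such simulations predict come from the Förster relation, which maps a donor-acceptor distance r and a Förster radius R0 to a transfer efficiency. A minimal, generic sketch (not the authors' coarse-grained code):

```python
def fret_efficiency(r, r0):
    """Förster equation: E = 1 / (1 + (r / r0)**6).

    r and r0 must share units; E falls steeply from ~1 to ~0
    around r = r0, which is what makes FRET a molecular ruler.
    """
    return 1.0 / (1.0 + (r / r0) ** 6)
```

    At r = R0 the efficiency is exactly 0.5, which is the usual working definition of the Förster radius.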

  4. Quantitative nuclear magnetic resonance imaging: characterisation of experimental cerebral oedema.

    PubMed Central

    Barnes, D; McDonald, W I; Johnson, G; Tofts, P S; Landon, D N

    1987-01-01

    Magnetic resonance imaging (MRI) has been used quantitatively to define the characteristics of two different models of experimental cerebral oedema in cats: vasogenic oedema produced by cortical freezing and cytotoxic oedema induced by triethyl tin. The MRI results have been correlated with the ultrastructural changes. The images accurately delineated the anatomical extent of the oedema in the two lesions, but did not otherwise discriminate between them. The patterns of measured increase in T1 and T2 were, however, characteristic for each type of oedema, and reflected the protein content. The magnetisation decay characteristics of both normal and oedematous white matter were monoexponential for T1 but biexponential for T2 decay. The relative sizes of the two component exponentials of the latter corresponded with the physical sizes of the major tissue water compartments. Quantitative MRI data can provide reliable information about the physico-chemical environment of tissue water in normal and oedematous cerebral tissue, and are useful for distinguishing between acute and chronic lesions in multiple sclerosis. PMID:3572428
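
    The mono- versus biexponential decay described above can be written directly as a forward model: two water compartments with fractions f and 1 − f decaying at different T2 rates. A generic sketch (parameter values are illustrative, not the cat-model measurements):

```python
import math

def t2_decay(t, f_fast=0.3, t2_fast=20.0, t2_slow=80.0):
    """Biexponential transverse relaxation, S(t)/S(0), times in ms.

    A fraction f_fast of the signal decays with time constant t2_fast,
    the remainder with t2_slow; a monoexponential is the special case
    f_fast = 0. The two fractions correspond to the relative sizes of
    the tissue water compartments.
    """
    return f_fast * math.exp(-t / t2_fast) + (1.0 - f_fast) * math.exp(-t / t2_slow)
```

    Plotted on a semi-log axis, this curve is visibly non-linear, which is how the biexponential character of oedematous white matter shows up in the decay data.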

  5. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for Introduction to Quantitative Science (IQS), a ninth-grade introductory course. It is intended to replace a traditional ninth-grade general science course with a process-oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…

  6. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and through the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focused on the geospace modules: each uses the BATS-R-US magnetohydrodynamic model and the Ridley Ionosphere Model, one with and one without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value of Dst, to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
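
    The skill scores quoted above come from a standard 2x2 contingency table of hits, false alarms, misses, and correct negatives. A sketch of the two metrics using their generic forecast-verification definitions (not the authors' code):

```python
def pod_and_hss(hits, false_alarms, misses, correct_negs):
    """Probability of detection and Heidke Skill Score from a 2x2 table.

    POD = hits / (hits + misses) is the fraction of observed events
    the model caught; HSS measures accuracy relative to random chance,
    with 1 a perfect forecast and 0 no skill.
    """
    a, b, c, d = hits, false_alarms, misses, correct_negs
    pod = a / (a + c)
    hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, hss
```

    A perfect table (no misses, no false alarms) returns (1.0, 1.0); a model that never crosses the -50 nT threshold scores zero hits and hence POD = 0 regardless of its quiet-time accuracy.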

  7. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semiquantitative classification system for grading the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semiquantitative MRI-based Goutallier Classification, applied by 5 different raters, to experimental MR spectroscopic quantitative fat measurement in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semiquantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05); dichotomizing the scale raised the correlation to 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semiquantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
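
    The statistic used above, Spearman's rank correlation, is simply a Pearson correlation computed on tie-averaged ranks rather than on the raw grades and ratios. A self-contained sketch of that computation:

```python
def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of average ranks.

    Ties receive the mean of the ranks they span, the convention that
    makes the statistic well defined for ordinal data such as
    Goutallier grades.
    """
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank over the tie run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Feeding in one rater's grades against the spectroscopic fat/water ratios would reproduce the kind of R = 0.35 figure reported above.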

  8. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model so as to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
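
    For the simplest AB phase design, one common predictor specification is: an intercept, a baseline time trend, a level-change indicator at the start of treatment, and a slope-change term thereafter. A hypothetical sketch of such a design matrix (one coding choice among the several the article compares; names are mine, not the authors'):

```python
def ab_design_matrix(n_baseline, n_treatment):
    """Rows [1, time, phase, phase * (time - n_baseline)] for an AB
    single-subject design.

    Columns: intercept, baseline trend, immediate level change at the
    treatment phase, and change in slope during treatment. The effect
    sizes of interest are the coefficients on the last two columns.
    """
    rows = []
    for t in range(n_baseline + n_treatment):
        phase = 1 if t >= n_baseline else 0
        rows.append([1, t, phase, phase * (t - n_baseline)])
    return rows
```

    Re-centering the time variable (e.g. at the first treatment session, as here) changes what the level-change coefficient estimates, which is exactly the kind of specification choice the article shows can alter treatment effect estimates.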

  9. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    PubMed

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments, and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Experimentally validated quantitative linear model for the device physics of elastomeric microfluidic valves

    NASA Astrophysics Data System (ADS)

    Kartalov, Emil P.; Scherer, Axel; Quake, Stephen R.; Taylor, Clive R.; Anderson, W. French

    2007-03-01

    A systematic experimental study and theoretical modeling of the device physics of polydimethylsiloxane "push-down" microfluidic valves are presented. The phase space is charted by 1587 dimension combinations and encompasses 45–295 μm lateral dimensions, 16–39 μm membrane thickness, and 1–28 psi closing pressure. Three linear models are developed and tested against the empirical data, and then combined into a fourth-power-polynomial superposition. The experimentally validated final model offers a useful quantitative prediction for a valve's properties as a function of its dimensions. Typical valves (80–150 μm width) are shown to behave like thin springs.

  11. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true

  12. Quantitative experimental modelling of fragmentation during explosive volcanism

    NASA Astrophysics Data System (ADS)

    Thordén Haug, Ø.; Galland, O.; Gisler, G.

    2012-04-01

    Phreatomagmatic eruptions result from the violent interaction between magma and an external source of water, such as ground water or a lake. This interaction causes fragmentation of the magma and/or the host rock, resulting in coarse-grained (lapilli) to very fine-grained (ash) material. The products of phreatomagmatic explosions are classically described by their fragment size distribution, which commonly follows a power law of exponent D. Such a descriptive approach, however, considers only the final products and does not provide information on the dynamics of fragmentation. The aim of this contribution is thus to address the following fundamental questions. What physics governs fragmentation processes? How does fragmentation occur through time? What mechanisms produce power-law fragment size distributions? And what scaling laws control the exponent D? To address these questions, we performed a quantitative experimental study. The setup consists of a Hele-Shaw cell filled with a layer of cohesive silica flour, at the base of which a pulse of pressurized air is injected, leading to fragmentation of the layer of flour. The fragmentation process is monitored through time using a high-speed camera. By systematically varying the air pressure (P) and the thickness of the flour layer (h), we observed two morphologies of fragmentation: "lift off", where the silica flour above the injection inlet is ejected upwards, and "channeling", where the air pierces through the layer along a sub-vertical conduit. By building a phase diagram, we show that the morphology is controlled by P/(dgh), where d is the density of the flour and g is the gravitational acceleration. To quantify the fragmentation process, we developed a Matlab image analysis program, which calculates the number and sizes of the fragments, and thus the fragment size distribution, during the experiments. The fragment size distributions are in general described by power law distributions of
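
    A fragment size distribution following a power law N(s) ∝ s^(−D) plots as a straight line in log-log space, so D can be estimated as minus the least-squares slope of log(count) against log(size). A generic sketch of that estimate (binning and cutoff choices, which matter in practice, are omitted):

```python
import math

def powerlaw_exponent(sizes, counts):
    """Estimate D for counts ~ sizes**(-D) by a log-log linear fit.

    Returns minus the least-squares slope of log(count) vs log(size);
    for data that follow the power law exactly, this recovers D.
    """
    xs = [math.log(s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return -slope
```

    Applying this to the fragment counts extracted frame by frame from the high-speed images would give the time evolution of D during an experiment, not just its final value.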

  13. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.
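
    Underlying the sO2 estimate is spectral unmixing: the absorption recovered at (at least) two wavelengths is a linear combination of oxy- and deoxyhemoglobin contributions, and sO2 is the oxygenated fraction. A two-wavelength sketch solved by Cramer's rule (the extinction values in the test are arbitrary placeholders, not real hemoglobin spectra, and fluence correction, the hard part addressed by the Monte-Carlo inversion, is assumed already done):

```python
def unmix_so2(mu_a, eps_hbo2, eps_hb):
    """Solve mu_a(lambda) = eps_hbo2*C_hbo2 + eps_hb*C_hb at two
    wavelengths and return sO2 = C_hbo2 / (C_hbo2 + C_hb).

    mu_a, eps_hbo2, eps_hb are pairs (value at lambda1, at lambda2);
    the 2x2 linear system is solved with Cramer's rule.
    """
    (m1, m2), (a1, a2), (b1, b2) = mu_a, eps_hbo2, eps_hb
    det = a1 * b2 - a2 * b1
    c_hbo2 = (m1 * b2 - m2 * b1) / det
    c_hb = (a1 * m2 - a2 * m1) / det
    return c_hbo2 / (c_hbo2 + c_hb)
```

    Because sO2 is a ratio of concentrations, wavelength-independent scale errors in the fluence cancel, which is why the abstract calls it a more robust target than absolute concentrations.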

  14. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics

    NASA Astrophysics Data System (ADS)

    El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first was performed in a 270 m wave tank and the second in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
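
    A partially coherent wave field of the kind launched in both experiments is conventionally synthesized by assigning random, uniformly distributed phases to the Fourier amplitudes of a prescribed spectrum; heavy tails then show up as excess kurtosis of the elevation record. A generic sketch with a flat toy spectrum (not JONSWAP):

```python
import math
import random

def random_phase_wave(amplitudes, n=1024, seed=None):
    """Surface elevation from a discrete spectrum with random Fourier
    phases: eta(t) = sum_k a_k * cos(2*pi*(k+1)*t/n + phi_k)."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in amplitudes]
    return [sum(a * math.cos(2.0 * math.pi * (k + 1) * t / n + p)
                for k, (a, p) in enumerate(zip(amplitudes, phases)))
            for t in range(n)]

def kurtosis(x):
    """Fourth standardized moment: 3 for Gaussian statistics, above 3
    when the distribution develops the heavy tails associated with
    rogue waves."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / m2 ** 2
```

    In the linear superposition above the statistics stay near-Gaussian; it is the nonlinear evolution in the tank or fiber that drives the kurtosis above 3.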

  15. Spontaneous emergence of rogue waves in partially coherent waves: A quantitative experimental comparison between hydrodynamics and optics.

    PubMed

    El Koussaifi, R; Tikan, A; Toffoli, A; Randoux, S; Suret, P; Onorato, M

    2018-01-01

    Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first was performed in a 270 m wave tank and the second in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.

  16. Quantitative experimental assessment of hot carrier-enhanced solar cells at room temperature

    NASA Astrophysics Data System (ADS)

    Nguyen, Dac-Trung; Lombez, Laurent; Gibelli, François; Boyer-Richard, Soline; Le Corre, Alain; Durand, Olivier; Guillemoles, Jean-François

    2018-03-01

    In common photovoltaic devices, the part of the incident energy above the absorption threshold quickly ends up as heat, which limits their maximum achievable efficiency to far below the thermodynamic limit for solar energy conversion. Conversely, converting the excess kinetic energy of the photogenerated carriers into additional free energy would be sufficient to approach the thermodynamic limit. This is the principle of hot carrier devices. Unfortunately, such device operation has never been demonstrated under conditions relevant for practical use. Here, we show that a quantitative thermodynamic study of the hot carrier population, using luminance measurements, allows us to assess the hot carrier contribution to solar cell performance. We demonstrate that the voltage and current can be enhanced in a semiconductor heterostructure due to the presence of a hot carrier population in a single InGaAsP quantum well at room temperature. These experimental results substantiate the potential of increasing photovoltaic performance in the hot carrier regime.

  17. Direct Numerical Simulation of Liquid Nozzle Spray with Comparison to Shadowgraphy and X-Ray Computed Tomography Experimental Results

    NASA Astrophysics Data System (ADS)

    van Poppel, Bret; Owkes, Mark; Nelson, Thomas; Lee, Zachary; Sowell, Tyler; Benson, Michael; Vasquez Guzman, Pablo; Fahrig, Rebecca; Eaton, John; Kurman, Matthew; Kweon, Chol-Bum; Bravo, Luis

    2014-11-01

    In this work, we present high-fidelity Computational Fluid Dynamics (CFD) results of liquid fuel injection from a pressure-swirl atomizer and compare the simulations to experimental results obtained using both shadowgraphy and phase-averaged X-ray computed tomography (CT) scans. The CFD and experimental results focus on the dense near-nozzle region to identify the dominant mechanisms of breakup during primary atomization. Simulations are performed using the NGA code of Desjardins et al. (JCP 227, 2008) and employ the volume of fluid (VOF) method proposed by Owkes and Desjardins (JCP 270, 2013), a second-order accurate, un-split, conservative, three-dimensional VOF scheme providing second-order density fluxes and capable of robust and accurate high-density-ratio simulations. Qualitative features and quantitative statistics are assessed and compared for the simulation and experimental results, including the onset of atomization, spray cone angle, and drop size and distribution.

  18. A thorough experimental study of CH/π interactions in water: quantitative structure-stability relationships for carbohydrate/aromatic complexes.

    PubMed

    Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis

    2015-11-13

    CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.
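
    The free energies extracted from such association experiments follow from the measured association constant via the standard relation ΔG° = −RT ln K. A minimal sketch:

```python
import math

def binding_free_energy(k_assoc, temp=298.15):
    """Standard binding free energy in kJ/mol from an association
    constant: dG = -R * T * ln(K). Negative values mean the complex
    is more stable than the separated partners.
    """
    R = 8.314462618e-3  # gas constant, kJ/(mol*K)
    return -R * temp * math.log(k_assoc)
```

    K = 1 corresponds to ΔG° = 0, and each factor of ~10 in K at room temperature is worth about 5.7 kJ/mol, which sets the scale for comparing the 117 complexes in the data set.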

  19. Experimental study of oscillating plates in viscous fluids: Qualitative and quantitative analysis of the flow physics and hydrodynamic forces

    NASA Astrophysics Data System (ADS)

    Shrestha, Bishwash; Ahsan, Syed N.; Aureli, Matteo

    2018-01-01

    In this paper, we present a comprehensive experimental study on harmonic oscillations of a submerged rigid plate in a quiescent, incompressible, Newtonian, viscous fluid. The fluid-structure interaction problem is analyzed from both qualitative and quantitative perspectives via a detailed particle image velocimetry (PIV) experimental campaign conducted over a broad range of oscillation frequency and amplitude parameters. Our primary goal is to identify the effect of the oscillation characteristics on the mechanisms of fluid-structure interaction and on the dynamics of vortex shedding and convection, and to elucidate the behavior of hydrodynamic forces on the oscillating structure. Towards this goal, we study the flow in terms of qualitative aspects of its pathlines, vortex shedding, and symmetry breaking phenomena, and identify distinct hydrodynamic regimes in the vicinity of the oscillating structure. Based on these experimental observations, we produce a novel phase diagram detailing the occurrence of distinct hydrodynamic regimes as a function of relevant governing nondimensional parameters. We further study the hydrodynamic forces associated with each regime using both PIV and direct force measurement via a load cell. Our quantitative results on experimental estimation of hydrodynamic forces show good agreement with predictions from the literature, where numerical and semi-analytical models are available. The findings and observations in this work shed light on the relationship between flow physics, vortex shedding, and convection mechanisms and the hydrodynamic forces acting on a rigid oscillating plate and, as such, have relevance to various engineering applications, including energy harvesting devices, biomimetic robotic systems, and micro-mechanical sensors and actuators.

  20. Magnetic Guarding: Experimental and Numerical Results

    NASA Astrophysics Data System (ADS)

    Heinrich, Jonathon; Font, Gabriel; Garrett, Michael; Rose, D.; Genoni, T.; Welch, D.; McGuire, Thomas

    2017-10-01

    The magnetic field topology of Lockheed Martin's Compact Fusion Reactor (CFR) concept requires internal magnetic field coils. Internal coils for similar devices have leveraged levitating coils or coils with magnetically guarded supports. Magnetic guarding of supports has been investigated for multipole devices (theoretically and experimentally) without conclusive results. One outstanding question regarding magnetic guarding of supports is the magnitude and behavior of secondary plasma drifts resulting from magnetic guard fields (grad-B drifts, etc.). We present magnetic-implicit PIC modeling results and preliminary proof-of-concept experimental results on magnetic guarding of internal supports and the subsequent reduction in total plasma losses.
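
    The secondary drifts mentioned above include the grad-B drift, whose single-particle magnitude is v = m v⊥² |∇B| / (2 q B²). A minimal sketch in SI units (a generic textbook formula, not the PIC model):

```python
def grad_b_drift_speed(mass, charge, v_perp, b, grad_b):
    """Magnitude of the grad-B drift for a charged particle:
    v = m * v_perp**2 * |grad B| / (2 * q * B**2).

    Stronger guard-field gradients (larger grad_b) drive faster drifts,
    which is why these drifts matter for guarded-support losses.
    """
    return mass * v_perp ** 2 * grad_b / (2.0 * charge * b ** 2)
```

    The drift direction (along B × ∇B) determines whether guarded-support plasma is carried toward or away from the support, which a scalar magnitude alone does not capture; that is part of what the PIC modeling resolves.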

  21. The mzTab Data Exchange Format: Communicating Mass-spectrometry-based Proteomics and Metabolomics Experimental Results to a Wider Audience*

    PubMed Central

    Griss, Johannes; Jones, Andrew R.; Sachsenberg, Timo; Walzer, Mathias; Gatto, Laurent; Hartler, Jürgen; Thallinger, Gerhard G.; Salek, Reza M.; Steinbeck, Christoph; Neuhauser, Nadin; Cox, Jürgen; Neumann, Steffen; Fan, Jun; Reisinger, Florian; Xu, Qing-Wei; del Toro, Noemi; Pérez-Riverol, Yasset; Ghali, Fawaz; Bandeira, Nuno; Xenarios, Ioannis; Kohlbacher, Oliver; Vizcaíno, Juan Antonio; Hermjakob, Henning

    2014-01-01

    The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adopted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online. PMID:24980485
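
    mzTab's tabular layout is what makes such final-results files easy to consume outside specialist tools: each line begins with a section code (e.g. MTD for metadata, PRH/PRT for the protein header and rows), with tab-separated fields. A deliberately simplified reader sketch (not a conformant mzTab parser; consult the specification for the full section and column rules):

```python
def parse_mztab_proteins(text):
    """Collect protein rows from an mzTab-like text.

    'PRH' lines carry the protein column headers and 'PRT' lines carry
    values, both tab-separated; everything else is ignored here.
    Returns a list of {column: value} dicts.
    """
    header, rows = None, []
    for line in text.splitlines():
        fields = line.rstrip("\n").split("\t")
        if fields[0] == "PRH":
            header = fields[1:]
        elif fields[0] == "PRT" and header:
            rows.append(dict(zip(header, fields[1:])))
    return rows

# A toy two-column protein section (illustrative content only).
example = (
    "MTD\tmzTab-version\t1.0\n"
    "PRH\taccession\tdescription\n"
    "PRT\tP12345\tExample protein\n"
)
```

    Because the format is plain tab-separated text, the same file also opens directly in Excel or loads into R with a one-line read call, which is precisely the accessibility goal stated above.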

  2. Quantitative comparison of PZT and CMUT probes for photoacoustic imaging: Experimental validation.

    PubMed

    Vallet, Maëva; Varray, François; Boutet, Jérôme; Dinten, Jean-Marc; Caliano, Giosuè; Savoia, Alessandro Stuart; Vray, Didier

    2017-12-01

    Photoacoustic (PA) signals are short ultrasound (US) pulses typically characterized by a single-cycle shape, often referred to as N-shape. The spectral content of such wideband signals ranges from a few hundred kilohertz to several tens of megahertz. Typical reception frequency responses of classical piezoelectric US imaging transducers, based on PZT technology, are not sufficiently broadband to fully preserve the entire information contained in PA signals, which are then filtered, thus limiting PA imaging performance. Capacitive micromachined ultrasonic transducers (CMUTs) are rapidly emerging as a valid alternative to conventional PZT transducers in several medical ultrasound imaging applications. Compared to PZT transducers, CMUTs exhibit both higher sensitivity and a significantly broader frequency response in reception, making their use attractive in PA imaging applications. This paper explores the advantages of the larger CMUT bandwidth in PA imaging by carrying out an experimental comparative study using various CMUT and PZT probes from different research laboratories and manufacturers. PA acquisitions are performed on a suture wire and on several home-made bimodal phantoms with both PZT and CMUT probes. Three criteria, based on the evaluation of the pure receive impulse response, the signal-to-noise ratio (SNR), and the contrast-to-noise ratio (CNR), respectively, are used for a quantitative comparison of imaging results. The measured fractional bandwidths of the CMUT arrays are larger than those of the PZT probes. Moreover, both SNR and CNR are enhanced by at least 6 dB with CMUT technology. This work highlights the potential of CMUT technology for PA imaging through qualitative and quantitative parameters.
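The SNR and CNR criteria take only a few lines to compute. The definitions below (peak amplitude over noise RMS for SNR, absolute mean difference between a region of interest and the background over the background standard deviation for CNR, both in dB) are common conventions, offered here as an assumption rather than the study's exact formulas:

```python
import numpy as np

def snr_db(signal_peak, noise):
    """SNR in dB: peak signal amplitude over the noise RMS (one common choice)."""
    return 20.0 * np.log10(signal_peak / np.sqrt(np.mean(np.square(noise))))

def cnr_db(roi, background):
    """CNR in dB: absolute mean difference over the background standard deviation."""
    return 20.0 * np.log10(abs(np.mean(roi) - np.mean(background)) / np.std(background))

rng = np.random.default_rng(0)
noise = 0.01 * rng.standard_normal(4096)   # ~1% RMS noise floor
print(snr_db(1.0, noise))                  # close to 40 dB for a unit-amplitude peak
```

On these logarithmic scales, the reported 6 dB enhancement corresponds to a factor of two in amplitude.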

  3. Experimental results of use of triple-energy X-ray beam with K-edge filter in multi-energy imaging

    NASA Astrophysics Data System (ADS)

    Kim, D.; Lee, S.; Jeon, P.-H.

    2016-04-01

    Multi-energy imaging is useful for contrast enhancement of lesions, quantitative analysis of specific materials and material separation in the human body. Generally, dual-energy methods are applied to discriminate two materials, but they cannot discriminate more than two materials. Photon-counting detectors provide spectral information from polyenergetic X-rays using multiple energy bins. In this work, we developed triple-energy X-ray beams using a filter with K-edge energy and applied them experimentally. The energy spectra of the triple-energy X-ray beams were assessed by using a spectrometer. The designed triple-energy X-ray beams were validated by quantitative evaluation of the mean energy ratio (MER), contrast variation ratio (CVR) and exposure efficiency (EE). Then, the triple-energy X-ray beams were used to extract density maps of three materials: iodine (I), aluminum (Al) and polymethyl methacrylate (PMMA). The thickness density maps obtained with the developed triple-energy X-ray beams were compared to those acquired using the photon-counting method. As a result, it was found experimentally that the proposed triple-energy X-ray beam technique can separate the three materials as well as the photon-counting method.
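In the simplest linearized view, separating three materials from three energy measurements reduces to inverting a 3 x 3 attenuation matrix per pixel via the Beer-Lambert law: the log-attenuation at each energy is a linear combination of the three material thicknesses. The sketch below illustrates that solve with invented placeholder coefficients, not the study's measured spectra:

```python
import numpy as np

# Hypothetical linear attenuation coefficients (1/cm) of iodine, Al, and PMMA
# at three beam energies; real values depend on the measured spectra.
M = np.array([[3.0, 1.0, 0.20],
              [1.5, 0.8, 0.18],
              [0.9, 0.6, 0.15]])

true_t = np.array([0.05, 0.4, 2.0])   # thicknesses (cm) to recover
log_att = M @ true_t                  # -ln(I/I0) measured at each energy

t = np.linalg.solve(M, log_att)       # one 3x3 solve per pixel
print(t)                              # recovers the true thicknesses
```

This also shows why two energies cannot separate three materials: the corresponding 2 x 3 system is underdetermined.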

  4. The mzTab data exchange format: communicating mass-spectrometry-based proteomics and metabolomics experimental results to a wider audience.

    PubMed

    Griss, Johannes; Jones, Andrew R; Sachsenberg, Timo; Walzer, Mathias; Gatto, Laurent; Hartler, Jürgen; Thallinger, Gerhard G; Salek, Reza M; Steinbeck, Christoph; Neuhauser, Nadin; Cox, Jürgen; Neumann, Steffen; Fan, Jun; Reisinger, Florian; Xu, Qing-Wei; Del Toro, Noemi; Pérez-Riverol, Yasset; Ghali, Fawaz; Bandeira, Nuno; Xenarios, Ioannis; Kohlbacher, Oliver; Vizcaíno, Juan Antonio; Hermjakob, Henning

    2014-10-01

    The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adopted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. Approaches to quantitating the results of differentially dyed cottons

    USDA-ARS?s Scientific Manuscript database

    The differential dyeing (DD) method has served as a subjective method for visually determining immature cotton fibers. In an attempt to quantitate the results of the differential dyeing method, and thus offer an efficient means of elucidating cotton maturity without visual discretion, image analysi...

  6. Validation of reference genes for quantitative gene expression analysis in experimental epilepsy.

    PubMed

    Sadangi, Chinmaya; Rosenow, Felix; Norwood, Braxton A

    2017-12-01

    To grasp the molecular mechanisms and pathophysiology underlying epilepsy development (epileptogenesis) and epilepsy itself, it is important to understand the gene expression changes that occur during these phases. Quantitative real-time polymerase chain reaction (qPCR) is a technique that rapidly and accurately determines gene expression changes. It is crucial, however, that stable reference genes are selected for each experimental condition to ensure that accurate values are obtained for genes of interest. If reference genes are unstably expressed, this can lead to inaccurate data and erroneous conclusions. To date, epilepsy studies have used mostly single, nonvalidated reference genes. This is the first study to systematically evaluate reference genes in male Sprague-Dawley rat models of epilepsy. We assessed 15 potential reference genes in hippocampal tissue obtained from 2 different models during epileptogenesis, 1 model during chronic epilepsy, and a model of noninjurious seizures. Reference gene ranking varied between models and also differed between epileptogenesis and chronic epilepsy time points. There was also some variance between the four mathematical models used to rank reference genes. Notably, we found novel reference genes to be more stably expressed than those most often used in experimental epilepsy studies. The consequence of these findings is that reference genes suitable for one epilepsy model may not be appropriate for others and that reference genes can change over time. It is, therefore, critically important to validate potential reference genes before using them as normalizing factors in expression analysis in order to ensure accurate, valid results. © 2017 Wiley Periodicals, Inc.
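The ranking step can be illustrated with a deliberately simplified stability measure: ordering candidate reference genes by the standard deviation of their Cq values across samples. This is only a stand-in for the four mathematical models mentioned above (geNorm-style tools use more sophisticated pairwise measures), and the gene names and Cq values are invented:

```python
import statistics

# Hypothetical qPCR Cq values for three candidate reference genes measured
# across four samples; lower spread suggests more stable expression.
cq = {
    "GeneA": [20.1, 20.3, 20.2, 20.2],
    "GeneB": [18.0, 19.5, 17.2, 20.1],
    "GeneC": [22.4, 22.6, 22.3, 22.9],
}

# Rank genes from most stable (smallest Cq spread) to least stable.
ranked = sorted(cq, key=lambda g: statistics.stdev(cq[g]))
print(ranked)    # most stable first: ['GeneA', 'GeneC', 'GeneB']
```

Whatever measure is used, the point of the paper stands: the ranking must be recomputed for each model and time point rather than assumed.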

  7. The Bigfoot Drive; Experimental Results

    NASA Astrophysics Data System (ADS)

    Baker, Kevin; Thomas, Cliff; Khan, Shahab; Casey, Daniel; Spears, Brian; Nora, Ryan; Munro, Davis; Eder, David; Milovich, Jose; Berger, Dick; Strozzi, David; Goyon, Clement; Turnbull, David; Ma, Tammy; Izumi, Nobuhiko; Benedetti, Robin; Millot, Marius; Celliers, Peter; Yeamans, Charles; Hatarik, Robert; Landen, Nino; Hurricane, Omar; Callahan, Debbie

    2016-10-01

    The Bigfoot platform was developed on the National Ignition Facility to investigate low convergence, high adiabat, high ρR hot-spot implosions. This platform was designed to be less susceptible to wall motion, LPI, and CBET and to be more robust against capsule hydrodynamic instabilities. To date, experimental studies have been carried out at two hohlraum scales, with 5.75 and 5.4 mm diameter hohlraums. We will present experimental results from these tuning campaigns, including the shape vs. cone fraction, surrogacy comparisons of self-emission from the capsules vs. radiography of the imploding capsule, and doped vs. undoped capsules. Prepared by LLNL under Contract DE-AC52-07NA27344.

  8. Theory and preliminary experimental verification of quantitative edge illumination x-ray phase contrast tomography.

    PubMed

    Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A

    2014-04-07

    X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.

  9. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    PubMed

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
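The nonoverlap core of Tau-U is straightforward to compute: count the cross-phase pairs in which the intervention point exceeds the baseline point, subtract the opposite cases, and divide by the total number of pairs. The sketch below implements this basic Tau (A vs. B) without the baseline-trend correction that full Tau-U adds; the probe scores are invented:

```python
def tau_ab(baseline, treatment):
    """Tau (A vs. B nonoverlap): improving minus deteriorating cross-phase
    pairs over all pairs. Simplified: no baseline-trend correction."""
    pos = sum(b > a for a in baseline for b in treatment)
    neg = sum(b < a for a in baseline for b in treatment)
    return (pos - neg) / (len(baseline) * len(treatment))

# Hypothetical writing-probe scores: 4 baseline and 5 intervention sessions.
A = [2, 3, 2, 3]
B = [4, 5, 4, 6, 5]
print(tau_ab(A, B))    # 1.0: every intervention point exceeds every baseline point
```

Values near 0 indicate chance-level overlap between phases, which is why a significance test accompanies the effect size in practice.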

  10. Quantitative Experimental Determination of Primer-Dimer Formation Risk by Free-Solution Conjugate Electrophoresis

    PubMed Central

    Desmarais, Samantha M.; Leitner, Thomas; Barron, Annelise E.

    2012-01-01

    DNA barcodes are short, unique ssDNA primers that “mark” individual biomolecules. To gain better understanding of biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis (FSCE) approach are useful as quantitatively precise input data to parameterize computation models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates were designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer-barcodes with fully or partially complementary sequences were annealed, and then separated by free-solution conjugate CE at different temperatures, to assess effects on primer-dimer formation. When less than 30 out of 30 basepairs were bonded, dimerization was inversely correlated to temperature. Dimerization occurred when more than 15 consecutive basepairs formed, yet non-consecutive basepairs did not create stable dimers even when 20 out of 30 possible basepairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments to establish a new mobility shift assay for detection of primer-dimer formation. PMID:22331820

  11. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed here show how robust design and analysis lead to confident results, ensuring that quantitative proteomics delivers.

  12. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)

  13. Improving Middle School Students’ Quantitative Literacy through Inquiry Lab and Group Investigation

    NASA Astrophysics Data System (ADS)

    Aisya, N. S. M.; Supriatno, B.; Saefudin; Anggraeni, S.

    2017-02-01

    The purpose of this study was to analyze the application of Vee Diagram-based metacognitive learning strategies through Inquiry Lab and Group Investigation toward students' quantitative literacy. This study compared two treatments on learning activity in middle school. The metacognitive strategies were applied to the content of environmental pollution at 7th grade. This study used a quantitative approach with a quasi-experimental method. The research sample consisted of 7th grade students: 27 in the Inquiry Lab experimental group and 27 in the Group Investigation experimental group. The instruments used in this research were pretest and posttest quantitative literacy skills, learning step observation sheets, and a questionnaire of teacher and student responses. As a result, the average N-gain between pretest and posttest increased in both experimental groups. The average posttest score was 61.11 for the Inquiry Lab class and 54.01 for the Group Investigation class. The average N-gain in quantitative literacy skill was 0.492 for the Inquiry Lab class and 0.426 for the Group Investigation class. Both experimental classes showed an average N-gain in the medium category. The data were analyzed statistically using SPSS ver. 23; the results showed that although both learning models can develop quantitative literacy, there is no significant difference between Inquiry Lab and Group Investigation in improving students' quantitative literacy on the environmental pollution material.
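The N-gain reported above is conventionally Hake's normalized gain: the achieved improvement expressed as a fraction of the improvement still possible at pretest. A minimal sketch (the pretest score here is invented, since the study's pretest averages are not quoted):

```python
def n_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: achieved improvement over possible improvement."""
    return (post - pre) / (max_score - pre)

# Hypothetical pretest of 30 paired with the reported posttest average of 61.11.
print(round(n_gain(30.0, 61.11), 3))   # 0.444
```

Under the commonly used Hake categories, gains of at least 0.7 count as high, 0.3 to 0.7 as medium, and below 0.3 as low, which is consistent with both classes falling in the medium category.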

  14. Quantitative Experimental Study of Defects Induced by Process Parameters in the High-Pressure Die Cast Process

    NASA Astrophysics Data System (ADS)

    Sharifi, P.; Jamali, J.; Sadayappan, K.; Wood, J. T.

    2018-05-01

    A quantitative experimental study of the effects of process parameters on the formation of defects during solidification of high-pressure die cast magnesium alloy components is presented. The parameters studied are slow-stage velocity, fast-stage velocity, intensification pressure, and die temperature. The amounts of various defects are quantitatively characterized. Multiple runs of the commercial casting simulation package, ProCAST™, are used to model the mold-filling and solidification events. Several locations in the component, including knit lines, the last-to-fill region, and the last-to-solidify region, are identified as critical regions that have a high concentration of defects. The area fractions of total porosity, shrinkage porosity, gas porosity, and externally solidified grains are separately measured. This study shows that the process parameters, fluid flow, and local solidification conditions play major roles in the formation of defects during the HPDC process.

  15. Quantitative cell biology: the essential role of theory.

    PubMed

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or to new discoveries, when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. Quantitative targeting maps based on experimental investigations for a branched tube model in magnetic drug targeting

    NASA Astrophysics Data System (ADS)

    Gitter, K.; Odenbach, S.

    2011-12-01

    Magnetic drug targeting (MDT), because of its high targeting efficiency, is a promising approach for tumour treatment. Unwanted side effects are considerably reduced, since the nanoparticles are concentrated within the target region due to the influence of a magnetic field. Nevertheless, understanding the transport phenomena of nanoparticles in an artery system is still challenging. This work presents experimental results for a branched tube model. Quantitative results describe, for example, the net amount of nanoparticles that are targeted towards the chosen region due to the influence of a magnetic field. As a result of the measurements, novel drug targeting maps, combining, e.g., the magnetic volume force, the position of the magnet and the net amount of targeted nanoparticles, are presented. The targeting maps are valuable for the evaluation and comparison of setups and are also helpful for the design and optimisation of a magnet system with an appropriate strength and distribution of the field gradient. The maps indicate the danger of accretion within the tube and also show the promising result that up to 97% of the nanoparticles were successfully targeted.

  17. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  18. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport of intensity (TIE) equation and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.

  19. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  20. Experimental design and quantitative analysis of microbial community multiomics.

    PubMed

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  1. Fuel-rich, catalytic reaction experimental results

    NASA Technical Reports Server (NTRS)

    Rollbuhler, R. James

    1991-01-01

    Future aeropropulsion gas turbine combustion requirements call for operating at very high inlet temperatures, pressures, and large temperature rises. At the same time, the combustion process is to have minimum pollution effects on the environment. Aircraft gas turbine engines utilize liquid hydrocarbon fuels which are difficult to uniformly atomize and mix with combustion air. An approach for minimizing fuel-related problems is to transform the liquid fuel into gaseous form prior to the completion of the combustion process. Experimentally obtained results are presented for vaporizing and partially oxidizing a liquid hydrocarbon fuel into burnable gaseous components. The experimental data show that 1200 to 1300 K reaction product gas, rich in hydrogen, carbon monoxide, and light-end hydrocarbons, is formed when fuel-air mixtures with fuel-to-air ratios of 0.3 to 0.6 flow through a catalyst reactor. The reaction temperatures are kept low enough that nitrogen oxides and carbon particles (soot) do not form. Results are reported for tests using different catalyst types and configurations, mass flowrates, input temperatures, and fuel-to-air ratios.

  2. High Contrast Imaging in the Visible: First Experimental Results at the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Pedichini, F.; Stangalini, M.; Ambrosino, F.; Puglisi, A.; Pinna, E.; Bailey, V.; Carbonaro, L.; Centrone, M.; Christou, J.; Esposito, S.; Farinato, J.; Fiore, F.; Giallongo, E.; Hill, J. M.; Hinz, P. M.; Sabatini, L.

    2017-08-01

    In February 2014, the System for High contrast And coronography from R to K at VISual bands (SHARK-VIS) Forerunner, a high contrast experimental imager operating at visible wavelengths, was installed at the Large Binocular Telescope (LBT). Here we report on the first results obtained by recent on-sky tests. These results show the extremely good performance of the LBT Extreme Adaptive Optics (ExAO) system at visible wavelengths, both in terms of spatial resolution and contrast achieved. Similarly to what was done by Amara & Quanz (2012), we used the SHARK-VIS Forerunner data to quantitatively assess the contrast enhancement. This is done by injecting several different synthetic faint objects into the acquired data and applying the angular differential imaging (ADI) technique. A contrast of the order of 5 × 10-5 is obtained at 630 nm for angular separations from the star larger than 100 mas. These results are discussed in light of the future development of SHARK-VIS and compared to those obtained by other high contrast imagers operating at similar wavelengths.

  3. Linearization improves the repeatability of quantitative dynamic contrast-enhanced MRI.

    PubMed

    Jones, Kyle M; Pagel, Mark D; Cárdenas-Rodríguez, Julio

    2018-04-01

    The purpose of this study was to compare the repeatabilities of the linear and nonlinear Tofts and reference region models (RRM) for dynamic contrast-enhanced MRI (DCE-MRI). Simulated and experimental DCE-MRI data from 12 rats with a flank tumor of C6 glioma acquired over three consecutive days were analyzed using four quantitative and semi-quantitative DCE-MRI metrics. The quantitative methods used were: 1) linear Tofts model (LTM), 2) non-linear Tofts model (NTM), 3) linear RRM (LRRM), and 4) non-linear RRM (NRRM). The following semi-quantitative metrics were used: 1) maximum enhancement ratio (MER), 2) time to peak (TTP), 3) initial area under the curve (iauc64), and 4) slope. LTM and NTM were used to estimate Ktrans, while LRRM and NRRM were used to estimate Ktrans relative to muscle (RKtrans). Repeatability was assessed by calculating the within-subject coefficient of variation (wSCV) and the percent intra-subject variation (iSV) determined with the Gage R&R analysis. The iSV for RKtrans using LRRM was two-fold lower compared to NRRM at all simulated and experimental conditions. A similar trend was observed for the Tofts model, where LTM was at least 50% more repeatable than the NTM under all experimental and simulated conditions. The semi-quantitative metrics iauc64 and MER were as repeatable as Ktrans and RKtrans estimated by LTM and LRRM, respectively. The iSV for iauc64 and MER were significantly lower than the iSV for slope and TTP. In simulations and experimental results, linearization improves the repeatability of quantitative DCE-MRI by at least 30%, making it as repeatable as semi-quantitative metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
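A within-subject coefficient of variation can be sketched as the square root of the mean within-subject variance divided by the grand mean, one common definition; the subject count and parameter values below are invented, not the study's data:

```python
import numpy as np

def wscv(measurements):
    """Within-subject coefficient of variation: root-mean within-subject
    variance over the grand mean (one common definition)."""
    m = np.asarray(measurements, dtype=float)   # rows: subjects, cols: repeat scans
    within_var = m.var(axis=1, ddof=1).mean()   # average within-subject variance
    return np.sqrt(within_var) / m.mean()

# Hypothetical parameter estimates for 3 subjects scanned on 3 consecutive days.
ktrans = [[0.10, 0.11, 0.09],
          [0.20, 0.22, 0.21],
          [0.15, 0.14, 0.16]]
print(f"{100 * wscv(ktrans):.1f}%")   # 6.5%
```

A lower wSCV means a metric varies less across repeat scans of the same subject, which is exactly the sense in which linearization is said to improve repeatability.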

  4. Evidence-based nursing: a stereotyped view of quantitative and experimental research could work against professional autonomy and authority.

    PubMed

    Bonell, C

    1999-07-01

    In recent years, there have been calls within the United Kingdom's National Health Service (NHS) for evidence-based health care. These resonate with long-standing calls for nursing to become a research-based profession. Evidence-based practice could enable nurses to demonstrate their unique contribution to health care outcomes, and support their seeking greater professionalization, in terms of enhanced authority and autonomy. Nursing's professionalization project, and, within this, various practices comprising the 'new nursing', whilst sometimes not delivering all that was hoped of them, have been important in developing certain conditions conducive to developing evidence-based practice, notably a critical perspective on practice and a reluctance merely to follow physicians' orders. However, nursing has often been hesitant in its adoption of quantitative and experimental research. This hesitancy, it is argued, has been influenced by the propounding, by some authors within the new nursing, of a stereotyped view of quantitative/experimental methods which equates them with a number of methodological and philosophical points which are deemed, by at least some of these authors, inimical to, or problematic within, nursing research. It is argued that, not only is the logic on which the various stereotyped views are based flawed, but further, that the wider influence of these viewpoints on nurses could lead to a greater marginalization of nurses in research and evidence-based practice initiatives, thus perhaps leading to evidence-based nursing being led by other groups. In the longer term, this might result in a form of evidence-based nursing emphasizing routinization, thus, ironically, working against strategies of professional authority and autonomy embedded in the new nursing. Nursing research should instead follow the example of nurse researchers who already embrace multiple methods. While the paper describes United Kingdom experiences and debates, points raised about

  5. A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Phillips, John S.; Leary, James J.

    1986-01-01

    Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)

  6. Comparison of the Results of Numerical Simulation and Experimental Results for the Amirkabir Plasma Focus Facility

    NASA Astrophysics Data System (ADS)

    Goudarzi, Shervin; Amrollahi, R.; Niknam Sharak, M.

    2014-06-01

In this paper the results of the numerical simulation for the Amirkabir Mather-type Plasma Focus Facility (16 kV, 36 μF and 115 nH) in several experiments with argon as the working gas at different working conditions (different discharge voltages and gas pressures) are presented and compared with the experimental results. Two different models have been used for the simulation: the five-phase model of Lee and the lumped-parameter model of Gonzalez. The results (optimum pressures and current signals) of the Lee model show better agreement with the experimental values at different working conditions than those of the lumped-parameter model.
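As a rough illustration of what such circuit-level plasma focus models integrate, here is a minimal sketch of the underlying capacitor-bank discharge using the quoted bank parameters. The stray resistance is an assumed, illustrative value, and this is neither the five-phase Lee model nor the Gonzalez lumped-parameter model, both of which add a dynamic plasma load to this basic circuit.

```python
import math

# Bank parameters quoted for the facility; R is an assumed stray resistance.
V0, C, L = 16e3, 36e-6, 115e-9   # charging voltage [V], capacitance [F], inductance [H]
R = 5e-3                          # illustrative resistance [ohm]

ideal = V0 * math.sqrt(C / L)     # undamped peak current, about 283 kA

def discharge(dt=1e-9, t_end=15e-6):
    """Integrate the series RLC discharge with semi-implicit Euler."""
    Q, I = C * V0, 0.0            # initial capacitor charge, zero current
    peak, t = 0.0, 0.0
    while t < t_end:
        I += (Q / C - R * I) / L * dt   # L dI/dt = V_C - R I
        Q -= I * dt                      # dQ/dt = -I
        peak = max(peak, I)
        t += dt
    return peak

print(f"peak current ~ {discharge() / 1e3:.0f} kA (undamped limit {ideal / 1e3:.0f} kA)")
```

The damped peak comes out slightly below the undamped estimate V0·sqrt(C/L), as expected for a lightly damped bank.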

  7. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  8. Para-Quantitative Methodology: Reclaiming Experimentalism in Educational Research

    ERIC Educational Resources Information Center

    Shabani Varaki, Bakhtiar; Floden, Robert E.; Javidi Kalatehjafarabadi, Tahereh

    2015-01-01

This article focuses on criticisms of current approaches in educational research methodology. It summarizes rationales for mixed methods and argues that mixing the quantitative and qualitative paradigms is problematic on practical and philosophical grounds. It is also indicated that the current rise of mixed methods work has…

  9. Photoionization microscopy: Hydrogenic theory in semiparabolic coordinates and comparison with experimental results

    NASA Astrophysics Data System (ADS)

    Kalaitzis, P.; Danakas, S.; Lépine, F.; Bordas, C.; Cohen, S.

    2018-05-01

Photoionization microscopy (PM) is an experimental method allowing for high-resolution measurements of the electron current probability density in the case of photoionization of an atom in an external uniform static electric field. PM is based on high-resolution velocity-map imaging and offers the unique opportunity to observe the quantum oscillatory spatial structure of the outgoing electron flux. We present the basic elements of the quantum-mechanical theoretical framework of PM for hydrogenic systems near threshold. Our development is based on the computationally more convenient semiparabolic coordinate system. Theoretical results are first subjected to a quantitative comparison with hydrogenic images corresponding to quasibound states and a qualitative comparison with nonresonant images of multielectron atoms. Subsequently, particular attention is paid to the structure of the electron's momentum distribution transverse to the static field (i.e., the angularly integrated differential cross-section as a function of electron energy and radius of impact on the detector). Such 2D maps provide at a glance a complete picture of the peculiarities of the differential cross-section over the entire near-threshold energy range. Hydrogenic transverse momentum distributions are computed for ground and excited initial states and for single- and two-photon ionization schemes. Their general characteristics are identified by comparing the hydrogenic distributions among themselves, as well as with a presently recorded experimental distribution for the magnesium atom. Finally, features specific to different target atoms, initial states, and excitation scenarios are also discussed, along with directions for further work.

  10. Experimental results on chiral magnetic and vortical effects

    DOE PAGES

    Wang, Gang; Wen, Liwen

    2017-01-12

Various novel transport phenomena in chiral systems result from the interplay of quantum anomalies with magnetic field and vorticity in high-energy heavy-ion collisions and could survive the expansion of the fireball and be detected in experiments. Among them are the chiral magnetic effect, the chiral vortical effect, and the chiral magnetic wave, the experimental searches for which have aroused extensive interest. The goal of this review is to describe the current status of experimental studies at the Relativistic Heavy-Ion Collider at BNL and the Large Hadron Collider at CERN and to outline the future experimental work needed to eliminate the existing uncertainties in the interpretation of the data.

  11. Experimental study of flash boiling spray vaporization through quantitative vapor concentration and liquid temperature measurements

    NASA Astrophysics Data System (ADS)

    Zhang, Gaoming; Hung, David L. S.; Xu, Min

    2014-08-01

Flash boiling sprays of liquid injection under superheated conditions provide novel solutions for fast vaporization and better air-fuel mixture formation in internal combustion engines. However, the physical mechanisms of flash boiling spray vaporization are more complicated than droplet surface vaporization due to the unique bubble generation and boiling process inside a superheated bulk liquid, which are not well understood. In this study, the vaporization of flash boiling sprays was investigated experimentally through quantitative measurements of vapor concentration and liquid temperature. Specifically, the laser-induced exciplex fluorescence technique was applied to distinguish the liquid and vapor distributions. Quantitative vapor concentration was obtained by correlating the intensity of vapor-phase fluorescence with vapor concentration through systematic corrections and calibrations. The intensities of two wavelengths were captured simultaneously from the liquid-phase fluorescence spectra, and their intensity ratios were correlated with liquid temperature. The results show that both the liquid and vapor phases of multi-hole sprays collapse toward the centerline of the spray with different mass distributions under flash boiling conditions. A large amount of vapor aggregates along the centerline of the spray to form a "gas jet" structure, whereas the liquid distributes more uniformly, with large vortices formed in the vicinity of the spray tip. The vaporization process under the flash boiling condition is greatly enhanced due to the intense bubble generation and bursting. The liquid temperature measurements show strong temperature variations inside the flash boiling sprays, with hot zones present in the "gas jet" structure and vortex region. In addition, high vapor concentration and closed vortex motion seem to have inhibited the heat and mass transfer in these regions.
In summary, the vapor concentration and liquid temperature provide detailed information
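The two-wavelength intensity-ratio thermometry described above amounts to reading temperature off a calibration curve that maps the fluorescence intensity ratio to liquid temperature. The sketch below shows that lookup with invented calibration points, not the study's actual calibration.

```python
# Hypothetical calibration: (two-color intensity ratio, liquid temperature in K).
# Real calibrations are measured for the specific dopant and optics.
calib = [(0.80, 300.0), (1.00, 330.0), (1.20, 360.0)]

def temperature(ratio):
    """Linearly interpolate between bracketing calibration points."""
    for (r0, t0), (r1, t1) in zip(calib, calib[1:]):
        if r0 <= ratio <= r1:
            return t0 + (t1 - t0) * (ratio - r0) / (r1 - r0)
    raise ValueError("ratio outside calibration range")

print(round(temperature(1.10), 2))   # -> 345.0
```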

  12. Experimental results for correlation-based wavefront sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, L A; Palmer, D W; LaFortune, K N

    2005-07-01

Correlation wave-front sensing can improve Adaptive Optics (AO) system performance in two key areas. For point-source-based AO systems, Correlation is more accurate, more robust to changing conditions, and lower in noise than a centroiding algorithm. Experimental results from the Lick AO system and the SSHCL laser AO system confirm this. For remote imaging, Correlation enables the use of extended objects for wave-front sensing. Results from short horizontal-path experiments demonstrate algorithm properties and requirements.
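The two spot-location approaches contrasted above can be sketched side by side: centroiding computes an intensity-weighted mean position, while correlation finds the shift that best matches a reference spot. The spot model, array sizes, and brute-force integer-shift search below are illustrative assumptions; real sensors refine the correlation peak to sub-pixel accuracy by interpolation.

```python
import numpy as np

def gaussian_spot(cx, cy, n=16, sigma=1.5):
    """Synthetic sensor subaperture image: one Gaussian spot."""
    y, x = np.mgrid[0:n, 0:n]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def centroid(img):
    """Intensity-weighted mean position (x, y)."""
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    s = img.sum()
    return (x * img).sum() / s, (y * img).sum() / s

def correlate_peak(img, ref):
    """Integer shift (dx, dy) of ref that maximizes overlap with img."""
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-4, 5):
        for dx in range(-4, 5):
            c = (np.roll(np.roll(ref, dy, 0), dx, 1) * img).sum()
            if c > best:
                best, best_shift = c, (dx, dy)
    return best_shift

ref = gaussian_spot(8.0, 8.0)
img = gaussian_spot(10.0, 7.0)      # spot displaced by (+2, -1) pixels
print(correlate_peak(img, ref))      # -> (2, -1)
```

Centroiding is a one-pass sum, while the correlation search costs more but degrades gracefully for noisy or extended scenes, which is the trade-off the abstract highlights.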

  13. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results, suggesting that the proposed technique has potential applications for terahertz spectral identification of drug mixture components, and they also agree with results obtained using other experimental and numerical techniques.
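A micro-genetic algorithm of the general kind referenced above (a tiny elitist population with restarts on convergence) can be sketched on a toy two-component mixture-fitting problem. The reference "spectra", weights, and algorithm details are all invented for illustration; the paper's adaptive-range and variable-internal-population machinery is not reproduced here.

```python
import random

random.seed(1)

comp_a = [1.0, 0.2, 0.0, 0.5]   # invented reference spectrum, component A
comp_b = [0.1, 0.9, 0.6, 0.0]   # invented reference spectrum, component B
true_w = (0.3, 0.7)             # ground-truth mixture fractions
mix = [true_w[0] * a + true_w[1] * b for a, b in zip(comp_a, comp_b)]

def error(w):
    """Sum-of-squares misfit between weighted components and the mixture."""
    return sum((w[0] * a + w[1] * b - m) ** 2
               for a, b, m in zip(comp_a, comp_b, mix))

def micro_ga(pop_size=5, gens=300):
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=error)
        elite = pop[0]
        children = [elite]                     # elitism: always keep the best
        for p in pop[1:]:
            if abs(p[0] - elite[0]) + abs(p[1] - elite[1]) < 1e-3:
                # micro-GA restart: reseed individuals that converged on the elite
                children.append((random.random(), random.random()))
            else:
                t = random.random()            # blend crossover with the elite
                children.append((t * elite[0] + (1 - t) * p[0],
                                 t * elite[1] + (1 - t) * p[1]))
        pop = children
    return min(pop, key=error)

best = micro_ga()
print(f"fitted fractions ~ ({best[0]:.2f}, {best[1]:.2f})")
```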

  14. Methods of experimentation with models and utilization of results

    NASA Technical Reports Server (NTRS)

Robert

    1924-01-01

The present report treats the subject of testing small models in a wind tunnel and of the methods employed for rendering the results consistent, accurate and comparable with one another. Detailed experimental results are given.

  15. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  16. A quantitative brain map of experimental cerebral malaria pathology.

    PubMed

    Strangward, Patrick; Haley, Michael J; Shaw, Tovah N; Schwartz, Jean-Marc; Greig, Rachel; Mironov, Aleksandr; de Souza, J Brian; Cruickshank, Sheena M; Craig, Alister G; Milner, Danny A; Allan, Stuart M; Couper, Kevin N

    2017-03-01

    The murine model of experimental cerebral malaria (ECM) has been utilised extensively in recent years to study the pathogenesis of human cerebral malaria (HCM). However, it has been proposed that the aetiologies of ECM and HCM are distinct, and, consequently, no useful mechanistic insights into the pathogenesis of HCM can be obtained from studying the ECM model. Therefore, in order to determine the similarities and differences in the pathology of ECM and HCM, we have performed the first spatial and quantitative histopathological assessment of the ECM syndrome. We demonstrate that the accumulation of parasitised red blood cells (pRBCs) in brain capillaries is a specific feature of ECM that is not observed during mild murine malaria infections. Critically, we show that individual pRBCs appear to occlude murine brain capillaries during ECM. As pRBC-mediated congestion of brain microvessels is a hallmark of HCM, this suggests that the impact of parasite accumulation on cerebral blood flow may ultimately be similar in mice and humans during ECM and HCM, respectively. Additionally, we demonstrate that cerebrovascular CD8+ T-cells appear to co-localise with accumulated pRBCs, an event that corresponds with development of widespread vascular leakage. As in HCM, we show that vascular leakage is not dependent on extensive vascular destruction. Instead, we show that vascular leakage is associated with alterations in transcellular and paracellular transport mechanisms. Finally, as in HCM, we observed axonal injury and demyelination in ECM adjacent to diverse vasculopathies. Collectively, our data therefore shows that, despite very different presentation, and apparently distinct mechanisms, of parasite accumulation, there appear to be a number of comparable features of cerebral pathology in mice and in humans during ECM and HCM, respectively. Thus, when used appropriately, the ECM model may be useful for studying specific pathological features of HCM.

  17. A quantitative brain map of experimental cerebral malaria pathology

    PubMed Central

    Schwartz, Jean-Marc; Greig, Rachel; Mironov, Aleksandr; de Souza, J. Brian; Cruickshank, Sheena M.; Craig, Alister G.; Milner, Danny A.; Allan, Stuart M.

    2017-01-01

    The murine model of experimental cerebral malaria (ECM) has been utilised extensively in recent years to study the pathogenesis of human cerebral malaria (HCM). However, it has been proposed that the aetiologies of ECM and HCM are distinct, and, consequently, no useful mechanistic insights into the pathogenesis of HCM can be obtained from studying the ECM model. Therefore, in order to determine the similarities and differences in the pathology of ECM and HCM, we have performed the first spatial and quantitative histopathological assessment of the ECM syndrome. We demonstrate that the accumulation of parasitised red blood cells (pRBCs) in brain capillaries is a specific feature of ECM that is not observed during mild murine malaria infections. Critically, we show that individual pRBCs appear to occlude murine brain capillaries during ECM. As pRBC-mediated congestion of brain microvessels is a hallmark of HCM, this suggests that the impact of parasite accumulation on cerebral blood flow may ultimately be similar in mice and humans during ECM and HCM, respectively. Additionally, we demonstrate that cerebrovascular CD8+ T-cells appear to co-localise with accumulated pRBCs, an event that corresponds with development of widespread vascular leakage. As in HCM, we show that vascular leakage is not dependent on extensive vascular destruction. Instead, we show that vascular leakage is associated with alterations in transcellular and paracellular transport mechanisms. Finally, as in HCM, we observed axonal injury and demyelination in ECM adjacent to diverse vasculopathies. Collectively, our data therefore shows that, despite very different presentation, and apparently distinct mechanisms, of parasite accumulation, there appear to be a number of comparable features of cerebral pathology in mice and in humans during ECM and HCM, respectively. Thus, when used appropriately, the ECM model may be useful for studying specific pathological features of HCM. PMID:28273147

  18. Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results

    NASA Technical Reports Server (NTRS)

    Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.

    2006-01-01

    A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.

  19. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    PubMed

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.
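The experimental-design idea above, selecting a small training set from a fractional factorial scheme rather than running the full factorial, can be illustrated with a generic two-level 2^(3-1) design. Factors A, B and C here are placeholders, not the paper's principal-component substituent parameters.

```python
from itertools import product

def fractional_factorial():
    """2^(3-1) design with defining relation I = ABC: four runs, not eight."""
    runs = []
    for a, b in product((-1, 1), repeat=2):
        runs.append((a, b, a * b))   # C = AB, i.e. aliased with the interaction
    return runs

for run in fractional_factorial():
    print(run)   # (-1, -1, 1), (-1, 1, -1), (1, -1, -1), (1, 1, 1)
```

Each run sets every factor to a low (-1) or high (+1) level, and the half-fraction still varies every factor across both levels while halving the experimental effort, which is the point of design-based training-set selection.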

  20. An experimental design for quantification of cardiovascular responses to music stimuli in humans.

    PubMed

    Chang, S-H; Luo, C-H; Yeh, T-L

    2004-01-01

There have been several studies of the relationship between music and human physiological or psychological responses. However, some cardiovascular index factors have not been explored quantitatively due to the qualitative nature of acoustic stimuli. This study proposes and demonstrates an experimental design for the quantification of cardiovascular responses to music stimuli in humans. The system comprises two components: a unit for generating and monitoring quantitative acoustic stimuli and a portable autonomic nervous system (ANS) analysis unit for quantitative recording and analysis of the cardiovascular responses. The experimental results indicate that the proposed system achieves full control and measurement of the music stimuli and effectively supports many quantitative indices of cardiovascular response in humans. In addition, the analysis results are discussed in the context of future clinical research.

  1. On collisional disruption - Experimental results and scaling laws

    NASA Technical Reports Server (NTRS)

    Davis, Donald R.; Ryan, Eileen V.

    1990-01-01

    Both homogeneous and inhomogeneous targets have been addressed by the present experimental consideration of the impact strengths, fragment sizes, and fragment velocities generated by cement mortar targets whose crushing strengths vary by an order of magnitude, upon impact of projectiles in the velocity range of 50-5700 m/sec. When combined with additional published data, dynamic impact strength is found to correlate with quasi-static material strengths for materials ranging in character from basalt to ice; two materials not following this trend, however, are weak mortar and clay targets. Values consistent with experimental results are obtainable with a simple scaling algorithm based on impact energy, material properties, and collisional strain rate.

  2. Heat Transfer Enhancement for Finned-Tube Heat Exchangers with Vortex Generators: Experimental and Numerical Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, James Edward; Sohal, Manohar Singh; Huff, George Albert

    2002-08-01

A combined experimental and numerical investigation is under way to evaluate heat transfer enhancement techniques that may be applicable to large-scale air-cooled condensers such as those used in geothermal power applications. The research is focused on whether air-side heat transfer can be improved through the use of fin-surface vortex generators (winglets) while maintaining low heat exchanger pressure drop. A transient heat transfer visualization and measurement technique has been employed in order to obtain detailed distributions of local heat transfer coefficients on model fin surfaces. Pressure drop measurements have also been acquired in a separate multiple-tube-row apparatus. In addition, numerical modeling techniques have been developed to allow prediction of local and average heat transfer for these low-Reynolds-number flows with and without winglets. Representative experimental and numerical results presented in this paper reveal quantitative details of local fin-surface heat transfer in the vicinity of a circular tube with a single delta winglet pair downstream of the cylinder. The winglets were triangular (delta) with a 1:2 height/length aspect ratio and a height equal to 90% of the channel height. Overall mean fin-surface Nusselt-number results indicate a significant level of heat transfer enhancement (average enhancement ratio 35%) associated with the deployment of the winglets with oval tubes. Pressure drop measurements have also been obtained for a variety of tube and winglet configurations using a single-channel flow apparatus that includes four tube rows in a staggered array. Comparisons of heat transfer and pressure-drop results for the elliptical tube versus a circular tube with and without winglets are provided. Heat transfer and pressure-drop results have been obtained for flow Reynolds numbers, based on channel height and mean flow velocity, ranging from 700 to 6500.

  3. DoSSiER: Database of scientific simulation and experimental results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  4. DoSSiER: Database of scientific simulation and experimental results

    DOE PAGES

    Wenzel, Hans; Yarba, Julia; Genser, Krzystof; ...

    2016-08-01

The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  5. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

Unexpected side effects of drugs are a great concern in drug development, and their identification is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
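The scoring scheme the abstract describes, a weighted sum over a drug's side-effect profile plus average-scoring ensembling of per-feature models, can be sketched as follows. All weights, profiles, and model outputs below are invented placeholders, not values from the paper.

```python
def quantitative_score(profile, weights):
    """Collapse a binary side-effect profile to one score: sum of weights
    for the side effects the drug exhibits."""
    return sum(p * w for p, w in zip(profile, weights))

def ensemble(predictions):
    """Average-scoring combination of several models' predicted scores."""
    return sum(predictions) / len(predictions)

weights = [3.0, 1.0, 0.5]        # assumed severity-like weights per side effect
drug_profile = [1, 0, 1]         # drug exhibits side effects 1 and 3
score = quantitative_score(drug_profile, weights)
print(score)                     # -> 3.5

# hypothetical per-feature models (substructures, targets, indications):
print(round(ensemble([3.2, 3.6, 3.7]), 3))   # -> 3.5
```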

  6. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less effect on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is neither fully reproducible nor quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
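The OTU-overlap statistic quoted above (e.g. 17.2% between paired technical replicates) can be computed as below. This sketch uses a shared-over-union definition on toy OTU sets; the paper's exact overlap definition may differ.

```python
def otu_overlap(rep1, rep2):
    """Percent of the combined OTU pool detected in both replicates."""
    shared = len(rep1 & rep2)
    return 100.0 * shared / len(rep1 | rep2)

# Invented OTU inventories for two technical replicates of one sample:
rep_a = {"OTU1", "OTU2", "OTU3", "OTU4", "OTU5"}
rep_b = {"OTU4", "OTU5", "OTU6", "OTU7", "OTU8"}

print(otu_overlap(rep_a, rep_b))   # 2 shared of 8 total -> 25.0
```

A low value like this, between replicates of the same DNA, is exactly the random-sampling effect the abstract attributes the poor technical reproducibility to.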

  7. Experimental Psychological Stress on Quantitative Sensory Testing Response in Patients with Temporomandibular Disorders.

    PubMed

    Araújo Oliveira Ferreira, Dyna Mara; Costa, Yuri Martins; de Quevedo, Henrique Müller; Bonjardim, Leonardo Rigoldi; Rodrigues Conti, Paulo César

    2018-05-15

    To assess the modulatory effects of experimental psychological stress on the somatosensory evaluation of myofascial temporomandibular disorder (TMD) patients. A total of 20 women with myofascial TMD and 20 age-matched healthy women were assessed by means of a standardized battery of quantitative sensory testing. Cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical pain threshold (MPT), wind-up ratio (WUR), and pressure pain threshold (PPT) were performed on the facial skin overlying the masseter muscle. The variables were measured in three sessions: before (baseline) and immediately after the Paced Auditory Serial Addition Task (PASAT) (stress) and then after a washout period of 20 to 30 minutes (poststress). Mixed analysis of variance (ANOVA) was applied to the data, and the significance level was set at P = .050. A significant main effect of the experimental session on all thermal tests was found (ANOVA: F > 4.10, P < .017), where detection tests presented an increase in thresholds in the poststress session compared to baseline (CDT, P = .012; WDT, P = .040) and pain thresholds were reduced in the stress (CPT, P < .001; HPT, P = .001) and poststress sessions (CPT, P = .005; HPT, P = .006) compared to baseline. In addition, a significant main effect of the study group on all mechanical tests (MPT, WUR, and PPT) was found (ANOVA: F > 4.65, P < .037), where TMD patients were more sensitive than healthy volunteers. Acute mental stress conditioning can modulate thermal sensitivity of the skin overlying the masseter in myofascial TMD patients and healthy volunteers. Therefore, psychological stress should be considered in order to perform an unbiased somatosensory assessment of TMD patients.

  8. Blood-brain barrier permeability and monocyte infiltration in experimental allergic encephalomyelitis: a quantitative MRI study.

    PubMed

    Floris, S; Blezer, E L A; Schreibelt, G; Döpp, E; van der Pol, S M A; Schadee-Eestermans, I L; Nicolay, K; Dijkstra, C D; de Vries, H E

    2004-03-01

Enhanced cerebrovascular permeability and cellular infiltration mark the onset of early multiple sclerosis lesions. So far, the precise sequence of these events and their role in lesion formation and disease progression remain unknown. Here we provide quantitative evidence that blood-brain barrier leakage is an early event and precedes massive cellular infiltration in the development of acute experimental allergic encephalomyelitis (EAE), the animal correlate of multiple sclerosis. Cerebrovascular leakage and monocyte infiltrates were separately monitored by quantitative in vivo MRI during the course of the disease. Magnetic resonance enhancement of the contrast agent gadolinium diethylenetriaminepentaacetate (Gd-DTPA), reflecting vascular leakage, occurred concomitantly with the onset of neurological signs and was already at a maximal level at this stage of the disease. Immunohistochemical analysis also confirmed the presence of serum-derived proteins such as fibrinogen around the brain vessels early in the disease, whereas no cellular infiltrates could be detected. MRI further demonstrated that Gd-DTPA leakage clearly preceded monocyte infiltration as imaged by the contrast agent based on ultrasmall particles of iron oxide (USPIO), which was maximal only during full-blown EAE. Ultrastructural and immunohistochemical investigation revealed that USPIOs were present in newly infiltrated macrophages within the inflammatory lesions. To validate the use of USPIOs as a non-invasive tool to evaluate therapeutic strategies, EAE animals were treated with the immunomodulatory 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor lovastatin, which ameliorated clinical scores. MRI showed that the USPIO load in the brain was significantly diminished in lovastatin-treated animals.
Data indicate that cerebrovascular leakage and monocytic trafficking into the brain are two distinct processes in the development of inflammatory lesions during multiple sclerosis, which can

  9. Experimental investigations of recent anomalous results in superconductivity

    NASA Astrophysics Data System (ADS)

    Souw, Victor K.

    2000-12-01

This thesis examines three recent anomalous results associated with irreversibility in type-II superconductivity: (1) the magnetic properties of the predicted superconductors LiBeH3 and Li2BeH4, (2) the paramagnetic transition near T = Tc in Nb, and (3) a noise transition in a YBa2Cu3O7-delta thin film near the vortex-solid transition. The investigation of Li2BeH4 and LiBeH3 was prompted by theoretical predictions of room-temperature superconductivity for Li2BeH4 and LiBeH3 and a recent report that Li2BeH4 showed magnetic irreversibilities similar to those of type-II superconductors. A modified experimental method is introduced in order to avoid artifacts due to background signals. The resulting data are suggestive of a superparamagnetic impurity from one of the reagents used in the synthesis, and after subtracting this contribution, the temperature-dependent susceptibilities of Li2BeH4 and LiBeH3 are estimated. However, no magnetic irreversibility suggestive of superconductivity is observed. The anomalous paramagnetic transition in Nb is intriguing because Nb does not share the d-wave order-parameter symmetry often invoked to explain the phenomenon in other superconductors. A modified experimental method was developed in order to avoid instrumental artifacts known to produce a similar apparently paramagnetic response, but the results of this method indicate that the paramagnetic response is a physical property of the sample. Finally, a very sharp noise transition in a YBa2Cu3O7-delta thin film was found to be distinct from previously reported features in the voltage noise commonly associated with vortex fluctuations near the irreversibility line. In each of these three cases the examination of experimental techniques is an integral part of the investigation of novel vortex behavior near the onset of irreversibility.

  10. Does contraceptive treatment in wildlife result in side effects? A review of quantitative and anecdotal evidence.

    PubMed

    Gray, Meeghan E; Cameron, Elissa Z

    2010-01-01

    The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, before a contraceptive is used for population manipulation, these formulations should minimally impact the behavior of individuals and populations, and these effects have received less attention. Potential side effects have been identified theoretically, and we reviewed published studies that have investigated side effects on the behavior and physiology of individuals, as well as population-level effects; these provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects, but were usually based on qualitative data or anecdotes. A meta-analysis of quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient to investigate secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about secondary effects of contraceptive treatment and that experiments are fundamental to drawing conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex and fertility on the behavior of wildlife species.

  11. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...
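    Interlaboratory qPCR comparisons like this one hinge on each laboratory's standard curve. As a hedged illustration (the slope and intercept below are hypothetical placeholders, not values from the EPA study), a measured Ct value converts to a copy number as follows:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Convert a qPCR Ct value to a copy number via a standard curve
    of the form Ct = intercept + slope * log10(copies).
    The slope/intercept defaults are hypothetical, not from any study."""
    return 10 ** ((ct - intercept) / slope)

# A Ct of 28 on this hypothetical standard curve
copies = copies_from_ct(28.0)
```

    A slope near -3.32 corresponds to ~100% amplification efficiency; laboratories using different curves will map the same Ct to different copy numbers, which is one way interlaboratory discrepancies arise.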

  12. Comparison of 99mTc-MDP SPECT qualitative vs quantitative results in patients with suspected condylar hyperplasia.

    PubMed

    López Buitrago, D F; Ruiz Botero, J; Corral, C M; Carmona, A R; Sabogal, A

    To compare qualitative vs quantitative results of Single Photon Emission Computerised Tomography (SPECT), calculated from the percentage of 99mTc-MDP (methylene diphosphonate) uptake, in condyles of patients with a presumptive clinical diagnosis of condylar hyperplasia. A retrospective, descriptive study was conducted on the 99mTc-MDP SPECT bone scintigraphy reports from 51 patients with a clinical impression of facial asymmetry related to condylar hyperplasia, referred by their specialist in orthodontics or maxillofacial surgery to a nuclear medicine department for this type of test. Quantitative data on 99mTc-MDP condylar uptake for each patient were obtained and compared with the qualitative image interpretation reported by a nuclear medicine expert. The concordance between the 51 qualitative and quantitative report results was established. The total sample included 32 women (63%) and 19 men (37%). The patient age range was 13-45 years (21±8 years). According to the qualitative reports, 19 patients were positive for right-side condylar hyperplasia and 12 for left-side condylar hyperplasia, with 8 bilateral and 12 negative. The quantitative reports diagnosed 16 positives for right-side condylar hyperplasia, 10 for left-side condylar hyperplasia, and 25 negatives. Nuclear medicine images are an important diagnostic tool, but the qualitative interpretation of the images is not as reliable as the quantitative calculation. The agreement between the two types of report is low (39.2%, Kappa=0.13; P>.2). The main limitation of quantitative reports is that they do not register bilateral condylar hyperplasia cases. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
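    The agreement statistics reported above (percent agreement and Cohen's kappa) can be computed for any pair of paired report lists. A minimal sketch, using hypothetical report labels rather than the study's patient data:

```python
from collections import Counter

def cohen_kappa(a, b):
    """Percent agreement and Cohen's kappa for two paired lists of
    categorical ratings (chance-corrected agreement)."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return observed, (observed - expected) / (1 - expected)

# Hypothetical paired reports: R = right CH, L = left CH, B = bilateral, N = negative
qual  = ["R", "R", "L", "B", "N", "R", "L", "N", "B", "N"]
quant = ["R", "N", "L", "R", "N", "N", "L", "N", "N", "N"]
agreement, kappa = cohen_kappa(qual, quant)
```

    A kappa near 0 indicates agreement no better than chance even when raw percent agreement looks moderate, which is the distinction the study's low Kappa=0.13 illustrates.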

  13. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

    One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and their host tissue have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity to study interactions repeatedly in individual animals, along with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging. This includes the presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for treatment of a cleft-like maxillary bone defect has been evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant bias in the case of the bone volume (bias Histo-MRI, bone volume = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias Histo-MRI, defect width = -6.73%, p << 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result. It could be shown that the bias of the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional independent animal study. PMID:28666026
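    The graphical concordance analysis described above is commonly implemented as a Bland-Altman computation of mean bias and 95% limits of agreement. A minimal sketch with hypothetical paired measurements (not the study's data):

```python
import math

def bland_altman(histo, mri):
    """Mean bias and 95% limits of agreement for paired measurements
    from two methods (Bland-Altman style concordance analysis)."""
    diffs = [h - m for h, m in zip(histo, mri)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired bone-volume measurements (%) from histology and MRI
histo = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3]
mri   = [40.0, 36.2, 41.9, 38.0, 37.1, 42.0]
bias, (lo, hi) = bland_altman(histo, mri)
```

    Plotting each pair's difference against its mean, with these three horizontal lines overlaid, gives the usual concordance plot; a bias line away from zero corresponds to the systematic offset the study reports.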

  14. Quantitative indexes of aminonucleoside-induced nephrotic syndrome.

    PubMed Central

    Nevins, T. E.; Gaston, T.; Basgen, J. M.

    1984-01-01

    Aminonucleoside of puromycin (PAN) is known to cause altered glomerular permeability, resulting in a nephrotic syndrome in rats. The early sequence of this lesion was studied quantitatively, with the application of a new morphometric technique for determining epithelial foot process widths and a sensitive assay for quantifying urinary albumin excretion. Twenty-four hours following a single intraperitoneal injection of PAN, significant widening of foot processes was documented. Within 36 hours significant increases in urinary albumin excretion were observed. When control rats were examined, there was no clear correlation between epithelial foot process width and quantitative albumin excretion. However, in the PAN-treated animals, abnormal albuminuria only appeared in association with appreciable foot process expansion. These studies indicate that quantitative alterations occur in the rat glomerular capillary wall as early as 24 hours after PAN. Further studies of altered glomerular permeability may use these sensitive measures to more precisely define the temporal sequence and elucidate possible subgroups of experimental glomerular injury. PMID:6486243

  15. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  16. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  17. Clinical and experimental study of TMJ distraction: preliminary results.

    PubMed

    Festa, F; Galluccio, G

    1998-01-01

    A physiotherapeutic approach, with manual maneuvers and/or distraction appliances, is indicated in the treatment of temporomandibular joint disorders (TMDs) to prevent the progressive fibrosis of the muscle fibers. In this article, the authors report preliminary results of experimental and clinical studies conducted to assess the real effect of distraction in temporomandibular joint disorders. The experimental in vivo studies confirmed the structural alteration due to compression and distraction on the capsular and condylar tissues. Clinical cases are reported to show the increase of the intraarticular vertical dimension, with a forward and downward movement of the condyles in a more physiologic condition.

  18. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed with quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  19. Experimental results for a hypersonic nozzle/afterbody flow field

    NASA Technical Reports Server (NTRS)

    Spaid, Frank W.; Keener, Earl R.; Hui, Frank C. L.

    1995-01-01

    This study was conducted to experimentally characterize the flow field created by the interaction of a single-expansion ramp-nozzle (SERN) flow with a hypersonic external stream. Data were obtained from a generic nozzle/afterbody model in the 3.5 Foot Hypersonic Wind Tunnel at the NASA Ames Research Center, in a cooperative experimental program involving Ames and McDonnell Douglas Aerospace. The model design and test planning were performed in close cooperation with members of the Ames computational fluid dynamics (CFD) team for the National Aerospace Plane (NASP) program. This paper presents experimental results consisting of oil-flow and shadowgraph flow-visualization photographs, afterbody surface-pressure distributions, rake boundary-layer measurements, Preston-tube skin-friction measurements, and flow field surveys with five-hole and thermocouple probes. The probe data consist of impact pressure, flow direction, and total temperature profiles in the interaction flow field.

  20. A quantitative study to design an experimental setup for photoacoustic imaging.

    PubMed

    Marion, Adrien; Boutet, Jérôme; Debourdeau, Mathieu; Dinten, Jean-Marc; Vray, Didier

    2011-01-01

    During the last decade, a new modality called photoacoustic imaging has emerged. The increasing interest in this new modality is due to the fact that it combines the advantages of ultrasound and optical imaging, i.e. the high contrast due to optical absorption and the low acoustic attenuation in biological tissues. It is thus possible to study vascularization because blood has a high optical absorption coefficient. Papers in the literature often focus on applications and rarely discuss quantitative parameters. The goal of this paper is to provide quantitative elements to design an acquisition setup. By defining the targeted resolution and penetration depth, it is then possible to evaluate which kind of excitation and reception systems have to be used. First, we recall the theoretical background related to the photoacoustic effect before describing the experiments, based on a nanosecond laser at 1064 nm and 2.25-5 MHz transducers. Second, we present results on the relation linking laser fluence to signal amplitude and on the axial and lateral resolutions of our acquisition setup. We verify the linear relation between fluence and amplitude before estimating the axial resolution at 550 μm for a 2.25 MHz ultrasonic transducer. Concerning lateral resolution, we show that a reconstruction technique based on curvilinear acquisition of 30 lines improves it by a factor of 3 compared to a lateral displacement. Future works will include improvement of lateral resolution using probes, as in ultrasound imaging, instead of single-element transducers.
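    The linearity check between laser fluence and photoacoustic signal amplitude reduces to an ordinary least-squares fit of amplitude against fluence. A minimal sketch with hypothetical fluence/amplitude pairs (not the authors' measurements):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y vs. x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical fluence (mJ/cm^2) vs. photoacoustic amplitude (a.u.)
fluence   = [2.0, 4.0, 6.0, 8.0, 10.0]
amplitude = [0.21, 0.39, 0.62, 0.80, 1.01]
slope, intercept = linear_fit(fluence, amplitude)
```

    An intercept near zero and residuals with no systematic curvature are what support the linear fluence-amplitude relation the paper verifies.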

  1. Mechanical properties of triaxially braided composites: Experimental and analytical results

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Foye, Raymond L.; Pastore, Christopher M.; Gowayed, Yasser A.

    1992-01-01

    This paper investigates the unnotched tensile properties of two-dimensional triaxial braid reinforced composites from both an experimental and analytical viewpoint. The materials are graphite fibers in an epoxy matrix. Three different reinforcing fiber architectures were considered. Specimens were cut from resin transfer molded (RTM) composite panels made from each braid. There were considerable differences in the observed elastic constants from different size strain gage and extensometer readings. Larger strain gages gave more consistent results and correlated better with the extensometer readings. Experimental strains correlated reasonably well with analytical predictions in the longitudinal, zero degree, fiber direction but not in the transverse direction. Tensile strength results were not always predictable even in reinforcing directions. Minor changes in braid geometry led to disproportionate strength variations. The unit cell structure of the triaxial braid was discussed with the assistence of computer analysis of the microgeometry. Photomicrographs of the braid geometry were used to improve upon the computer graphics representations of unit cells. These unit cells were used to predict the elastic moduli with various degrees of sophistication. The simple and the complex analyses were generally in agreement but none adequately matched the experimental results for all the braids.

  2. Mechanical properties of triaxially braided composites: Experimental and analytical results

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Foye, Raymond L.; Pastore, Christopher M.; Gowayed, Yasser A.

    1992-01-01

    The unnotched tensile properties of 2-D triaxial braid reinforced composites from both an experimental and an analytical viewpoint are studied. The materials are graphite fibers in an epoxy matrix. Three different reinforcing fiber architectures were considered. Specimens were cut from resin transfer molded (RTM) composite panels made from each braid. There were considerable differences in the observed elastic constants from different size strain gage and extensometer reading. Larger strain gages gave more consistent results and correlated better with the extensometer reading. Experimental strains correlated reasonably well with analytical predictions in the longitudinal, 0 degrees, fiber direction but not in the transverse direction. Tensile strength results were not always predictable even in reinforcing directions. Minor changes in braid geometry led to disproportionate strength variations. The unit cell structure of the triaxial braid was discussed with the assistance of computer analysis of the microgeometry. Photomicrographs of braid geometry were used to improve upon the computer graphics representations of unit cells. These unit cells were used to predict the elastic moduli with various degrees of sophistication. The simple and the complex analyses were generally in agreement but none adequately matched the experimental results for all the braids.

  3. Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM

    DTIC Science & Technology

    2016-09-30

    NUWC-NPT Technical Report 12,218, 30 September 2016: Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM. ...availability of measurement techniques, experimental testing of composite materials has largely outpaced the computational modeling ability, forcing...

  4. Quantitative detection of Toxoplasma gondii in tissues of experimentally infected turkeys and in retail turkey products by magnetic-capture PCR.

    PubMed

    Koethe, Martin; Straubinger, Reinhard K; Pott, Susan; Bangoura, Berit; Geuthner, Anne-Catrin; Daugschies, Arwid; Ludewig, Martina

    2015-12-01

    Magnetic-capture PCR was applied for the quantitative detection of Toxoplasma gondii in tissues of experimentally infected turkeys and in retail turkey meat products. For experimental infection, three T. gondii strains (ME49, CZ-Tiger, NED) and varying infectious doses in different matrices (organisms in single mouse brains, or 10^3, 10^5, or 10^6 oocysts in buffer) were used. From all animals, breast, thigh, and drumstick muscle tissues were analyzed; for CZ-Tiger-infected animals, brains and hearts were analyzed additionally. Using magnetic-capture PCR, large sample volumes of up to 100 g were examined. Our results show that most T. gondii parasites are present in brain and heart tissue. Of the three skeletal muscle types, drumsticks were affected at the highest and breast at the lowest level. The type III strain (NED) seems to be less efficient in infecting turkeys than the type II strains, because only a few tissues of NED-infected animals contained T. gondii DNA. Furthermore, the number of detected parasitic stages increased with the infectious dose. The infection mode, by either the oocyst or the tissue cyst stage, did not have an effect on the amount of T. gondii present in tissues. In retail turkey meat products T. gondii DNA was not detectable, although contact with the parasite was inferred by serology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  6. Experimental validation of the AVIVET trap, a tool to quantitatively monitor the dynamics of Dermanyssus gallinae populations in laying hens.

    PubMed

    Lammers, G A; Bronneberg, R G G; Vernooij, J C M; Stegeman, J A

    2017-06-01

    Dermanyssus gallinae (D. gallinae) infestation causes economic losses across the world due to impaired health and production of hens and the costs of parasite control. Moreover, infestations are associated with reduced welfare of hens and may cause itching in humans. To effectively implement control methods it is crucially important to have high-quality information about the D. gallinae populations in poultry houses in space and time. At present no validated tool is available to quantitatively monitor the dynamics of all four stages of D. gallinae (i.e., eggs, larvae, nymphs, and adults) in poultry houses. This article describes the experimental validation of the AVIVET trap, a device to quantitatively monitor the dynamics of D. gallinae infestations. We used the device to study D. gallinae in fully equipped cages with two white specific-pathogen-free Leghorn laying hens experimentally exposed to three different infestation levels of D. gallinae (low to high). The AVIVET trap successfully detected D. gallinae at high (5,000 D. gallinae), medium (2,500 D. gallinae), and low (50 D. gallinae) levels of infestation. The linear equation Y = 0.47 + 1.21X, with Y = log10 (total number of D. gallinae nymphs and adults in the cage) and X = log10 (total number of D. gallinae nymphs and adults in the AVIVET trap), explained 93.8% of the variation. The weight of D. gallinae in the AVIVET trap also appears to be a reliable parameter for quantifying D. gallinae infestation in a poultry house: it correlates 99.6% (P < 0.001) with the counted number of all stages of D. gallinae in the trap (i.e., eggs, larvae, nymphs, and adults), indicating that the trap is highly specific. From this experiment it can be concluded that the AVIVET trap is promising as a quantitative tool for monitoring D. gallinae dynamics in a poultry house. © 2016 Poultry Science Association Inc.
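    Read in log10 units, the abstract's fitted regression (Y = 0.47 + 1.21X, with Y and X the log10 mite counts in the cage and in the trap) can be used to estimate the cage population from a trap catch. A sketch assuming the abstract's coefficients:

```python
import math

def estimated_mites_in_cage(trap_count, a=0.47, b=1.21):
    """Predict total nymphs + adults in the cage from a trap count via
    the reported log10-log10 regression: log10(total) = a + b * log10(trap).
    Defaults are the coefficients quoted in the abstract."""
    return 10 ** (a + b * math.log10(trap_count))

# e.g. a hypothetical trap catch of 100 mites
est = estimated_mites_in_cage(100)
```

    Because the slope exceeds 1, the trap captures a smaller fraction of the population at higher infestation levels, so predictions grow faster than proportionally with the trap count.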

  7. Quantitative and Sensitive Detection of Chloramphenicol by Surface-Enhanced Raman Scattering

    PubMed Central

    Ding, Yufeng; Yin, Hongjun; Meng, Qingyun; Zhao, Yongmei; Liu, Luo; Wu, Zhenglong; Xu, Haijun

    2017-01-01

    We used surface-enhanced Raman scattering (SERS) for the quantitative and sensitive detection of chloramphenicol (CAP). Using 30 nm colloidal Au nanoparticles (NPs), a low detection limit for CAP of 10^-8 M was obtained. The characteristic Raman peak of CAP centered at 1344 cm^-1 was used for the rapid quantitative detection of CAP in three different types of CAP eye drops, and the accuracy of the measurement result was verified by high-performance liquid chromatography (HPLC). The experimental results reveal that the SERS technique based on colloidal Au NPs is accurate and sensitive, and can be used for the rapid detection of various antibiotics. PMID:29261161

  8. Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.

    PubMed

    Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan

    2017-01-01

    Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, compared with qualitative MR imaging, which relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. The article presents a review of the basic physical concepts used in MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging. Subsequently, the article reviews the hierarchical organization of major applicable pulse sequences used in this article, with the sequences organized into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. While this new concept offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences used for quantitative MR imaging, with a focus on the hierarchical organization of these sequences. © RSNA, 2017.
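    As an illustration of the quantitative-MRI idea of computing tissue parameters rather than interpreting contrast, the sketch below fits a mono-exponential T2 decay, S(TE) = S0·exp(-TE/T2), to multi-echo signals via log-linear regression. The data are synthetic, not from any pulse sequence in the article:

```python
import math

def fit_t2(echo_times_ms, signals):
    """Estimate T2 (ms) and S0 from S(TE) = S0 * exp(-TE / T2)
    by linear regression on log(signal) vs. echo time."""
    logs = [math.log(s) for s in signals]
    n = len(echo_times_ms)
    mt = sum(echo_times_ms) / n
    ml = sum(logs) / n
    stt = sum((t - mt) ** 2 for t in echo_times_ms)
    stl = sum((t - mt) * (l - ml) for t, l in zip(echo_times_ms, logs))
    slope = stl / stt              # slope = -1 / T2
    s0 = math.exp(ml - slope * mt)
    return -1.0 / slope, s0

# Noiseless synthetic multi-echo data: S0 = 1000, T2 = 80 ms
tes = [10, 20, 40, 80, 160]
sig = [1000 * math.exp(-te / 80) for te in tes]
t2, s0 = fit_t2(tes, sig)
```

    Applying such a fit voxel by voxel across a multi-echo acquisition yields a T2 map, i.e. a parameter image that is independent of the particular pulse-sequence settings used to acquire it.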

  9. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
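    The replicate-number calculation mentioned above is often done with a normal-approximation power formula. A hedged sketch (the generic two-group formula, not the authors' specific procedure; the effect size and SD below are hypothetical):

```python
import math
from statistics import NormalDist

def replicates_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Normal-approximation sample size for a two-group comparison:
    replicates per group needed to detect a mean difference `delta`
    given within-group standard deviation `sigma`."""
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    n = 2 * ((za + zb) * sigma / delta) ** 2
    return math.ceil(n)

# Hypothetical: detect a 2-fold change (1 unit on the log2 scale), SD 1.2
n = replicates_per_group(delta=1.0, sigma=1.2)
```

    Raising the desired power or shrinking the detectable effect increases the required replicates quadratically, which is why blocking and variance reduction matter so much in proteomic designs.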

  10. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
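    For a homoscedastic Gaussian measurement system, Currie-style levels can be checked by simulation of blank measurements. The sketch below uses the textbook coefficients (1.645σ, 3.29σ, 10σ for known σ, in the net-signal domain) rather than the modified quantitation limit this paper proposes:

```python
import random
import statistics

random.seed(42)

# Simulated blank responses from a homoscedastic Gaussian system
sigma_true = 0.05
blanks = [random.gauss(0.0, sigma_true) for _ in range(10000)]
sigma = statistics.stdev(blanks)

# Classic Currie levels (net signal domain, sigma assumed known):
L_C = 1.645 * sigma        # decision level (5% false positives)
L_D = 3.29 * sigma         # detection limit (5% false negatives at L_C)
L_Q = 10.0 * sigma         # quantitation limit (10% relative std. dev.)
```

    Dividing each level by the calibration slope converts it to concentration units; the paper's contribution is to tie L_Q to relative measurement error and significant figures rather than the fixed 10σ convention.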

  11. Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations

    NASA Astrophysics Data System (ADS)

    Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara

    2017-11-01

    Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration that is very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the half-life of 220Rn emanating from lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.
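    The 220Rn half-life measurement described above amounts to fitting an exponential decay to the electroscope's activity readings over time. A minimal sketch on noiseless synthetic data (half-life set to the accepted ~55.6 s; real leaf-discharge data would be noisy):

```python
import math

def half_life(times_s, activities):
    """Estimate a half-life from activity readings via a log-linear fit
    of A(t) = A0 * exp(-lambda * t)."""
    logs = [math.log(a) for a in activities]
    n = len(times_s)
    mt = sum(times_s) / n
    ml = sum(logs) / n
    stt = sum((t - mt) ** 2 for t in times_s)
    stl = sum((t - mt) * (l - ml) for t, l in zip(times_s, logs))
    lam = -stl / stt               # decay constant (1/s)
    return math.log(2) / lam

# Noiseless synthetic readings for 220Rn (half-life ~55.6 s)
ts = [0, 30, 60, 90, 120, 150]
acts = [100 * 0.5 ** (t / 55.6) for t in ts]
t_half = half_life(ts, acts)
```

    In the classroom version, the "activity" at each time is taken from the projected leaf's discharge rate, and the scatter of the fit gives students a concrete feel for measurement uncertainty.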

  12. [Interactions of DNA bases with individual water molecules. Molecular mechanics and quantum mechanics computation results vs. experimental data].

    PubMed

    Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I

    2013-01-01

    To elucidate details of DNA-water interactions, we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two force fields of molecular mechanics (MM) and the correlated ab initio method MP2/6-31G(d, p) of quantum mechanics (QM) have been compared with one another and with experimental data. The calculations demonstrated a qualitative agreement between the geometry characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base and forming hydrogen bonds with them. Nevertheless, the relative depth of some minima and the peculiarities of mutual water-base positions in these minima depend on the method used. The analysis revealed that some differences between the results of the different methods are insignificant, while others are important for the description of DNA hydration. The calculations via MM methods enable us to reproduce quantitatively all the experimental data on the enthalpies of complex formation of a single water molecule with the set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in the crystals of DNA duplex fragments, while some of these data cannot be rationalized by QM calculations.

  13. Correlation of analytical and experimental hot structure vibration results

    NASA Technical Reports Server (NTRS)

    Kehoe, Michael W.; Deaton, Vivian C.

    1993-01-01

    High surface temperatures and temperature gradients can affect the vibratory characteristics and stability of aircraft structures. Aircraft designers are relying more on finite-element model analysis methods to ensure sufficient vehicle structural dynamic stability throughout the desired flight envelope. Analysis codes that predict these thermal effects must be correlated and verified with experimental data. Experimental modal data for aluminum, titanium, and fiberglass plates heated at uniform, nonuniform, and transient heating conditions are presented. The data show the effect of heat on each plate's modal characteristics, a comparison of predicted and measured plate vibration frequencies, the measured modal damping, and the effect of modeling material property changes and thermal stresses on the accuracy of the analytical results at nonuniform and transient heating conditions.

  14. Experimental Results for Titan Aerobot Thermo-Mechanical Subsystem Development

    NASA Technical Reports Server (NTRS)

    Pauken, Michael T.; Hall, Jeffery L.

    2006-01-01

    This paper presents experimental results on a set of four thermo-mechanical research tasks aimed at Titan and Venus aerobots: 1. A cryogenic balloon materials development program culminating in the fabrication and testing of a 4.6 m long blimp prototype at 93 K. 2. A combined computational and experimental thermal analysis of the effect of radioisotope power system (RPS) waste heat on the behavior of a helium-filled blimp hull. 3. Aerial deployment and inflation testing using a blimp. 4. A proof-of-concept experiment with an aerobot-mounted steerable high-gain antenna. These tasks were supported with JPL internal R&D funds and executed by JPL engineers with substantial industry collaboration for Task #1, the cryogenic balloon materials

  15. Quantitative surface topography determination by Nomarski reflection microscopy. 2: Microscope modification, calibration, and planar sample experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1980-09-01

    The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously, plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, physical insight, and a cookbook procedure for implementation, to allow these results to be of value both to those interested in the microscope theory and to those interested in its practical usage in the metallography laboratory.

  16. Experimental and computational results from a large low-speed centrifugal impeller

    NASA Technical Reports Server (NTRS)

    Hathaway, M. D.; Chriss, R. M.; Wood, J. R.; Strazisar, A. J.

    1993-01-01

    An experimental and computational investigation of the NASA Low-Speed Centrifugal Compressor (LSCC) flow field has been conducted using laser anemometry and Dawes' 3D viscous code. The experimental configuration consists of a backswept impeller followed by a vaneless diffuser. Measurements of the three-dimensional velocity field were acquired at several measurement planes through the compressor. The measurements describe both the throughflow and secondary velocity field along each measurement plane and in several cases provide details of the flow within the blade boundary layers. The experimental and computational results provide a clear understanding of the development of the throughflow momentum wake which is characteristic of centrifugal compressors.

  17. Quantitative Determination of Isotope Ratios from Experimental Isotopic Distributions

    PubMed Central

    Kaur, Parminder; O’Connor, Peter B.

    2008-01-01

    Isotope variability due to natural processes provides important information for studying a variety of complex natural phenomena from the origins of a particular sample to the traces of biochemical reaction mechanisms. These measurements require high-precision determination of isotope ratios of a particular element involved. Isotope Ratio Mass Spectrometers (IRMS) are widely employed tools for such a high-precision analysis, which have some limitations. This work aims at overcoming the limitations inherent to IRMS by estimating the elemental isotopic abundance from the experimental isotopic distribution. In particular, a computational method has been derived which allows the calculation of 13C/12C ratios from the whole isotopic distributions, given certain caveats, and these calculations are applied to several cases to demonstrate their utility. The limitations of the method in terms of the required number of ions and S/N ratio are discussed. For high-precision estimates of the isotope ratios, this method requires very precise measurement of the experimental isotopic distribution abundances, free from any artifacts introduced by noise, sample heterogeneity, or other experimental sources. PMID:17263354
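As a rough illustration of estimating an isotope ratio from a whole isotopic distribution (a simplified stand-in, not the authors' actual algorithm): if carbon were the only element contributing heavy isotopes, the number of 13C atoms per molecule would be binomially distributed, and the mean mass shift of the distribution directly yields the ratio. The molecule size and abundances below are hypothetical.

```python
from math import comb

def c13_c12_ratio(abundances, n_carbons):
    """Estimate 13C/12C from an isotopic distribution, assuming (simplistically)
    that carbon is the only heavy-isotope contributor, so the count of heavy
    atoms per molecule is binomial with mean n_carbons * p."""
    total = sum(abundances)
    mean_heavy = sum(k * a for k, a in enumerate(abundances)) / total
    p = mean_heavy / n_carbons        # fraction of carbon atoms that are 13C
    return p / (1.0 - p)

# Hypothetical distribution for a 20-carbon molecule built with p = 0.0107
p_true, n = 0.0107, 20
dist = [comb(n, k) * p_true**k * (1 - p_true)**(n - k) for k in range(n + 1)]
print(c13_c12_ratio(dist, n))  # recovers p_true / (1 - p_true)
```

In practice, as the abstract stresses, the observed abundances must be essentially noise-free for this inversion to reach IRMS-level precision, and contributions from other elements (H, N, O, S) have to be deconvolved first.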

  18. Quantitative EEG analysis in minimally conscious state patients during postural changes.

    PubMed

    Greco, A; Carboncini, M C; Virgillito, A; Lanata, A; Valenza, G; Scilingo, E P

    2013-01-01

    Mobilization and postural changes of patients with cognitive impairment are standard clinical practices useful for both the psychic and the physical rehabilitation process. During this process, several physiological signals, such as the electroencephalogram (EEG), electrocardiogram (ECG), photoplethysmogram (PPG), respiration activity (RESP), and electrodermal activity (EDA), are monitored and processed. In this paper we investigated how quantitative EEG (qEEG) changes with postural modifications in minimally conscious state patients. This study is quite novel, and no similar experimental data can be found in the current literature; therefore, although the results are very encouraging, a quantitative analysis of the cortical areas activated by such postural changes still needs to be investigated in depth. More specifically, this paper shows EEG power spectra and brain symmetry index modifications during a verticalization procedure, from 0 to 60 degrees, of three patients in a minimally conscious state (MCS) with a focal region of impairment. Experimental results show a significant increase of the power in the β band (12-30 Hz), commonly associated with human alertness, thus suggesting that mobilization and postural changes can have beneficial effects in MCS patients.
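The β-band (12-30 Hz) power reported above is a standard qEEG quantity: integrate the signal's power spectrum over that frequency range. A minimal sketch using a plain periodogram; the synthetic "EEG" below is illustrative only, and real qEEG pipelines typically use Welch averaging and artifact rejection:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power in the [f_lo, f_hi) Hz band from a one-sided periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

# Hypothetical 1 s of "EEG" at 256 Hz: a 20 Hz (beta) tone plus a 6 Hz (theta) tone
fs = 256
t = np.arange(fs) / fs
eeg = 2.0 * np.sin(2 * np.pi * 20 * t) + 1.0 * np.sin(2 * np.pi * 6 * t)
beta = band_power(eeg, fs, 12, 30)
theta = band_power(eeg, fs, 4, 8)
print(beta > theta)  # True: the beta tone dominates this synthetic signal
```

Comparing such band powers across verticalization angles (and between hemispheres, for the brain symmetry index) is the kind of computation the study describes.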

  19. Iterative optimization method for design of quantitative magnetization transfer imaging experiments.

    PubMed

    Levesque, Ives R; Sled, John G; Pike, G Bruce

    2011-09-01

    Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.
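The iterative idea, reduced to a toy setting: start from a dense candidate sampling and greedily discard the point whose removal least degrades a design criterion. The sketch below substitutes a mono-exponential signal model and D-optimality (log-determinant of the Fisher information) for the actual QMTI model and constraints, which are considerably more elaborate:

```python
import numpy as np

def fim(times, A=1.0, R=1.0):
    """Fisher information for s(t) = A*exp(-R*t) with unit noise variance."""
    t = np.asarray(times, float)
    J = np.column_stack([np.exp(-R * t),            # ds/dA
                         -A * t * np.exp(-R * t)])  # ds/dR
    return J.T @ J

def reduce_design(times, n_keep, **model):
    """Greedy backward elimination: repeatedly drop the sample point whose
    removal costs the least D-optimality (log det of the Fisher information)."""
    design = list(times)
    while len(design) > n_keep:
        scores = []
        for i in range(len(design)):
            trial = design[:i] + design[i + 1:]
            scores.append(np.linalg.slogdet(fim(trial, **model))[1])
        design.pop(int(np.argmax(scores)))  # keep the best remaining design
    return design

dense = list(np.linspace(0.1, 5.0, 25))   # dense candidate sampling
optimal = reduce_design(dense, n_keep=4)
print(sorted(optimal))
```

Because each step removes a point from a discrete candidate set, pragmatic constraints (no repeated measures, no clustering) are naturally respected, which mirrors the advantage the abstract claims over free-form optimal design.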

  20. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  1. Experimental Assessment and Enhancement of Planar Laser-Induced Fluorescence Measurements of Nitric Oxide in an Inverse Diffusion Flame

    NASA Technical Reports Server (NTRS)

    Partridge, William P.; Laurendeau, Normand M.

    1997-01-01

    We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally-based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally-based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically-based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally-based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally-based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically-based PLIF enhancement procedures.

  2. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  3. A quantitative experimental phantom study on MRI image uniformity.

    PubMed

    Felemban, Doaa; Verdonschot, Rinus G; Iwamoto, Yuri; Uchiyama, Yuka; Kakimoto, Naoya; Kreiborg, Sven; Murakami, Shumei

    2018-05-23

    Our goal was to assess MR image uniformity by investigating aspects influencing said uniformity via a method laid out by the National Electrical Manufacturers Association (NEMA). Six metallic materials embedded in a glass phantom were scanned (i.e. Au, Ag, Al, Au-Ag-Pd alloy, Ti and Co-Cr alloy) as well as a reference image. Sequences included spin echo (SE) and gradient echo (GRE) scanned in three planes (i.e. axial, coronal, and sagittal). Moreover, three surface coil types (i.e. head and neck, brain, and temporomandibular joint coils) and two image correction methods (i.e. surface coil intensity correction or SCIC, phased array uniformity enhancement or PURE) were employed to evaluate their effectiveness on image uniformity. Image uniformity was assessed using the NEMA peak-deviation non-uniformity method. Results showed that temporomandibular joint coils elicited the least uniform image and brain coils outperformed head and neck coils when metallic materials were present. Additionally, when metallic materials were present, spin echo outperformed gradient echo, especially for Co-Cr (particularly in the axial plane). Furthermore, both SCIC and PURE improved image uniformity compared to uncorrected images, and SCIC slightly surpassed PURE when metallic materials were present. Lastly, Co-Cr elicited the least uniform image, while other metallic materials generally showed similar patterns (i.e. no significant deviation from images without metallic materials). Overall, a quantitative understanding of the factors influencing MR image uniformity (e.g. coil type, imaging method, metal susceptibility, and post-hoc correction method) is advantageous to optimize image quality, assists clinical interpretation, and may result in improved medical and dental care.
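The NEMA peak-deviation (percent integral) uniformity used here is computed from the extreme signal values inside the region of interest; a minimal sketch:

```python
def nema_uniformity(roi_values):
    """NEMA peak-deviation (percent integral) uniformity:
    100 * (1 - (Smax - Smin) / (Smax + Smin)); 100% means perfectly uniform."""
    s_max, s_min = max(roi_values), min(roi_values)
    return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))

print(nema_uniformity([100, 100, 100, 100]))  # 100.0 (perfectly uniform ROI)
print(nema_uniformity([90, 100, 110]))        # 90.0
```

In the study, the `roi_values` would be the pixel intensities of the phantom ROI for each combination of sequence, plane, coil, metal, and correction method.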

  4. Wageningen Urban Rainfall Experiment 2014 (WURex14): Experimental Setup and First Results

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, R.; Overeem, A.; Leijnse, H.; Hazenberg, P.

    2014-12-01

    Microwave links from cellular communication networks have been shown to be able to provide valuable information concerning the space-time variability of rainfall. In particular over urban areas, where network densities are generally high, they have the potential to complement existing dedicated infrastructure to measure rainfall (gauges, radars). In addition, microwave links provide a great opportunity for ground-based rainfall measurement for those land surface areas of the world where gauges and radars are generally lacking, e.g. Africa, Latin America, and large parts of Asia. Such information is not only crucial for water management and agriculture, but also for instance for ground validation of space-borne rainfall estimates such as those provided by the recently launched core satellite of the GPM (Global Precipitation Measurement) mission. WURex14 is dedicated to address several errors and uncertainties associated with such quantitative precipitation estimates in detail. The core of the experiment is provided by two co-located microwave links installed between two major buildings on the Wageningen University campus, approximately 2 km apart: a 38 GHz commercial microwave link, kindly provided to us by T-Mobile NL, and a 38 GHz dual-polarization research microwave link from RAL. Transmitting and receiving antennas have been attached to masts installed on the roofs of the two buildings, about 30 m above the ground. This setup has been complemented with a Scintec infrared Large-Aperture Scintillometer, installed over the same path, as well as a Parsivel optical disdrometer, located close to the mast on the receiving end of the links. During the course of the experiment, a 26 GHz RAL research microwave link was added to the experimental setup. Temporal sampling of the received signals was performed at a rate of 20 Hz. In addition, two time-lapse cameras have been installed on either side of the path to monitor the wetness of the antennas as well as the state of
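Retrieving rain rates from such links rests on the near power-law relation between specific attenuation k (dB/km) and rain rate R (mm/h), R = a·k^b, after subtracting a wet-antenna contribution. The coefficients and offset below are illustrative placeholders, not values from this experiment; real values depend on frequency, polarization, and the link hardware:

```python
def rain_rate(total_atten_db, path_km, wet_antenna_db=1.0, a=3.1, b=1.0):
    """Path-averaged rain rate from link attenuation via R = a * k**b.
    a, b and the wet-antenna offset are placeholder values for illustration."""
    excess = max(total_atten_db - wet_antenna_db, 0.0)  # rain-induced part
    k = excess / path_km                                # specific attenuation, dB/km
    return a * k ** b

print(rain_rate(7.2, path_km=2.0))  # (7.2-1)/2 = 3.1 dB/km -> about 9.6 mm/h
print(rain_rate(0.5, path_km=2.0))  # below the wet-antenna offset -> 0.0
```

The time-lapse cameras mentioned in the abstract exist precisely to constrain the wet-antenna term, one of the main uncertainties this experiment targets.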

  5. Wageningen Urban Rainfall Experiment 2014 (WURex14): Experimental Setup and First Results

    NASA Astrophysics Data System (ADS)

    van Leth, Thomas; Uijlenhoet, Remko; Overeem, Aart; Leijnse, Hidde; Hazenberg, Pieter

    2015-04-01

    Microwave links from cellular communication networks have been shown to be able to provide valuable information concerning the space-time variability of rainfall. In particular over urban areas, where network densities are generally high, they have the potential to complement existing dedicated infrastructure to measure rainfall (gauges, radars). In addition, microwave links provide a great opportunity for ground-based rainfall measurement for those land surface areas of the world where gauges and radars are generally lacking, e.g. Africa, Latin America, and large parts of Asia. Such information is not only crucial for water management and agriculture, but also for instance for ground validation of space-borne rainfall estimates such as those provided by the recently launched core satellite of the GPM (Global Precipitation Measurement) mission. WURex14 is dedicated to address several errors and uncertainties associated with such quantitative precipitation estimates in detail. The core of the experiment is provided by two co-located microwave links installed between two major buildings on the Wageningen University campus, approximately 2 km apart: a 38 GHz commercial microwave link, kindly provided to us by T-Mobile NL, and a 38 GHz dual-polarization research microwave link from RAL. Transmitting and receiving antennas have been attached to masts installed on the roofs of the two buildings, about 30 m above the ground. This setup has been complemented with a Scintec infrared Large-Aperture Scintillometer, installed over the same path, as well as a Parsivel optical disdrometer, located close to the mast on the receiving end of the links. During the course of the experiment, a 26 GHz RAL research microwave link was added to the experimental setup. Temporal sampling of the received signals was performed at a rate of 20 Hz. In addition, two time-lapse cameras have been installed on either side of the path to monitor the wetness of the antennas as well as the state of

  6. Experimental results on current-driven turbulence in plasmas - a survey

    NASA Astrophysics Data System (ADS)

    de Kluiver, H.; Perepelkin, N. F.; Hirose, A.

    1991-01-01

    The experimental consequences of plasma turbulence driven by a current parallel to a magnetic field and concurrent anomalous plasma heating are reviewed, with an attempt to deduce universalities in key parameters such as the anomalous electrical conductivities observed in diverse devices. It has been found that the nature of plasma turbulence and turbulent heating depends on several parameters including the electric field, current and magnetic fields. A classification of turbulence regimes based on these parameters has been made. Experimental observations of the anomalous electrical conductivity, plasma heating, skin effect, runaway electron braking and turbulent fluctuations are surveyed, and current theoretical understanding is briefly reviewed. Experimental results recently obtained in stellarators (SIRIUS, URAGAN at Kharkov), and in tokamaks (TORTUR at Nieuwegein, STOR-1M at Saskatoon) are presented in some detail in the light of investigating the feasibility of using turbulent heating as a means of injecting a large power into toroidal devices.

  7. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
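The core of selecting "linearly correlated response peptides" can be sketched as a correlation filter across loading amounts. The threshold, peptide names, and intensities below are hypothetical, and the actual ERLPS rule is more involved than a single Pearson cutoff:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def select_linear_peptides(loadings, peptide_intensities, r_min=0.99):
    """Keep peptides whose response is linear in loading amount (r_min is a
    hypothetical threshold; the paper's actual selection rule may differ)."""
    return {pep: ints for pep, ints in peptide_intensities.items()
            if pearson_r(loadings, ints) >= r_min}

loadings = [1, 2, 4, 8]                 # relative loading amounts
intensities = {
    "PEPTIDEA": [10, 20, 41, 79],       # nearly proportional -> keep
    "PEPTIDEB": [30, 31, 30, 95],       # saturating/erratic -> drop
}
kept = select_linear_peptides(loadings, intensities)
print(sorted(kept))  # ['PEPTIDEA']
```

Quantifying a protein only from peptides that pass such a filter is what reduces the measurability bias the abstract describes.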

  8. Characterization and Comparison of Galactomannan Enzyme Immunoassay and Quantitative Real-Time PCR Assay for Detection of Aspergillus fumigatus in Bronchoalveolar Lavage Fluid from Experimental Invasive Pulmonary Aspergillosis

    PubMed Central

    Francesconi, Andrea; Kasai, Miki; Petraitiene, Ruta; Petraitis, Vidmantas; Kelaher, Amy M.; Schaufele, Robert; Hope, William W.; Shea, Yvonne R.; Bacher, John; Walsh, Thomas J.

    2006-01-01

    Bronchoalveolar lavage (BAL) is widely used for evaluation of patients with suspected invasive pulmonary aspergillosis (IPA). However, the diagnostic yield of BAL for detection of IPA by culture and direct examination is limited. Earlier diagnosis may be facilitated by assays that can detect Aspergillus galactomannan antigen or DNA in BAL fluid. We therefore characterized and compared the diagnostic yields of a galactomannan enzyme immunoassay (GM EIA), quantitative real-time PCR (qPCR), and quantitative cultures in experiments using BAL fluid from neutropenic rabbits with experimentally induced IPA, defined as microbiologically and histologically evident invasion. The qPCR assay targeted the rRNA gene complex of Aspergillus fumigatus. The GM EIA and qPCR assay were characterized by receiver operator curve analysis. With an optimal cutoff of 0.75, the GM EIA had a sensitivity and specificity of 100% in untreated controls. A decline in sensitivity (92%) was observed when antifungal therapy (AFT) was administered. The optimal cutoff for qPCR was a crossover of 36 cycles, with sensitivity and specificity of 80% and 100%, respectively. The sensitivity of qPCR also decreased with AFT to 50%. Quantitative culture of BAL had a sensitivity of 46% and a specificity of 100%. The sensitivity of quantitative culture decreased with AFT to 16%. The GM EIA and qPCR assay had greater sensitivity than culture in detection of A. fumigatus in BAL fluid in experimentally induced IPA (P ≤ 0.04). Use of the GM EIA and qPCR assay in conjunction with culture-based diagnostic methods applied to BAL fluid could facilitate accurate diagnosis and more-timely initiation of specific therapy. PMID:16825367
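The receiver operating characteristic analysis used to pick assay cutoffs like the 0.75 above can be sketched as a scan over candidate thresholds maximizing Youden's J = sensitivity + specificity − 1. The index values below are invented for illustration, not data from this study:

```python
def best_cutoff(positives, negatives):
    """Scan candidate cutoffs; return (cutoff, sensitivity, specificity)
    maximizing Youden's J. 'Test positive' means score >= cutoff."""
    best = None
    for c in sorted(set(positives) | set(negatives)):
        sens = sum(p >= c for p in positives) / len(positives)
        spec = sum(n < c for n in negatives) / len(negatives)
        j = sens + spec - 1
        if best is None or j > best[3]:
            best = (c, sens, spec, j)
    return best[:3]

# Hypothetical GM EIA indices for infected vs. control animals
infected = [0.9, 1.4, 2.1, 3.0, 0.8]
controls = [0.2, 0.3, 0.5, 0.6, 0.4]
cutoff, sens, spec = best_cutoff(infected, controls)
print(cutoff, sens, spec)  # 0.8 1.0 1.0 on this toy data
```

With perfectly separated toy data the optimum reaches 100% sensitivity and specificity, analogous to the untreated-control result reported above; antifungal therapy blurs the separation and drags sensitivity down.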

  9. Experimental Results on the Feasibility of an Aerospike for Hypersonic Missiles

    NASA Technical Reports Server (NTRS)

    Huebner, Lawrence D.; Mitchell, Anthony M.; Boudreaux, Ellis J.

    1995-01-01

    A series of wind tunnel tests has been performed on an aerospike-protected missile dome at a Mach number of 6 to obtain quantitative surface pressure and temperature-rise data, as well as qualitative flow visualization data. These data were used to determine aerospike concept feasibility and will also provide a database to be used for calibration of computational fluid dynamics codes. Data were obtained on the hemispherical missile dome with and without an aerospike that protrudes ahead of the dome along the axisymmetric center line. Data were obtained on two models (one pressure, one temperature) in the NASA Langley 20-Inch Mach 6 Tunnel at a freestream Reynolds number of 8.0 × 10^6 per foot and angles of attack from 0 to 40 degrees. Surface pressure and temperature-rise results indicate that the aerospike is effective for very low angles of attack (less than 5 degrees) at Mach 6. Above 5 degrees, impingement of the aerospike bow shock and the flow separation shock from the recirculation region created by the aerospike causes pressure and temperature increases on the windward side of the dome which exceed values observed in the same region with the aerospike removed. Flow characterization obtained via oil-flow and schlieren photographs provides some insight into the quantitative surface data results, including vortical flow and shock-wave impingement.

  10. A quantitative model and the experimental evaluation of the liquid fuel layer for the downward flame spread of XPS foam.

    PubMed

    Luo, Shengfeng; Xie, Qiyuan; Tang, Xinyi; Qiu, Rong; Yang, Yun

    2017-05-05

    The objective of this work is to investigate the distinctive mechanisms of downward flame spread for XPS foam. The process was physically considered as the moving down of a narrow pool fire rather than the downward surface flame spread typical of normal solids. A method was developed to quantitatively analyze the accumulated liquid fuel based on the experimental measurement of flame-tip locations and burning rates. The results surprisingly showed that about 80% of the generated hot liquid fuel remained in the pool fire during a certain period. Most of the consumed solid XPS foam did not actually burn away but was transformed into liquid fuel in the downward-moving pool fire, which might be an important promoter of the fast fire development. The results also indicated that the dripping propensity of the hot liquid fuel depends on the total amount of hot liquid accumulated in the pool fire. The leading point of the flame front curve might be the breach point of the accumulated hot liquid fuel if enough has accumulated for dripping. Finally, it is suggested that horizontal noncombustible barriers preventing the accumulation and dripping of liquid fuel are helpful for the vertical confinement of XPS fires. Copyright © 2017 Elsevier B.V. All rights reserved.
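The reported ~80% retention is, at heart, a mass balance between the rate at which solid XPS melts into the pool and the rate at which the pool burns it off. A toy sketch with arbitrary illustrative rates (not measurements from the paper):

```python
def liquid_pool_balance(melt_rate, burn_rate, duration):
    """Toy mass balance for the moving pool fire (rates in g/s): solid XPS
    melts into the pool faster than the pool burns it, so liquid accumulates.
    Returns (accumulated mass, fraction of the melt retained)."""
    melted = melt_rate * duration
    burned = burn_rate * duration
    accumulated = melted - burned
    return accumulated, accumulated / melted

# Illustrative numbers only: if burning consumes 20% of the melt rate,
# 80% of the generated liquid remains in the pool, matching the reported ratio.
acc, fraction = liquid_pool_balance(melt_rate=1.0, burn_rate=0.2, duration=60)
print(acc, fraction)
```

In the paper the two rates are inferred experimentally from flame-tip locations and measured burning rates rather than assumed.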

  11. Mastitomics, the integrated omics of bovine milk in an experimental model of Streptococcus uberis mastitis: 2. Label-free relative quantitative proteomics.

    PubMed

    Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C; McNeilly, Tom N; Weidt, Stefan K; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P David; Zadoks, Ruth N

    2016-08-16

    Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis as one of the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies as well as peptidomic and metabolomic analysis of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between pre-infection and resolution stages (0 and 312 h post challenge), early infection stages (36 and 42 h post challenge) and late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with dominance of different acute phase proteins at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with 10 000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological and quantitative proteomics and other-omic data provides a more detailed systems level view of the host response to mastitis than has been achieved previously.

  12. Experimental verification of Pyragas-Schöll-Fiedler control.

    PubMed

    von Loewenich, Clemens; Benner, Hartmut; Just, Wolfram

    2010-09-01

    We present an experimental realization of time-delayed feedback control proposed by Schöll and Fiedler. The scheme enables us to stabilize torsion-free periodic orbits in autonomous systems, and to overcome the so-called odd number limitation. The experimental control performance is in quantitative agreement with the bifurcation analysis of simple model systems. The results uncover some general features of the control scheme which are deemed to be relevant for a large class of setups.
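Time-delayed feedback control adds a term K·[x(t−τ) − x(t)] that vanishes on the target orbit, so the orbit itself is unmodified. A minimal discrete-time sketch of the principle (a chaotic logistic map with a one-step delay, gain chosen from linear stability analysis; this illustrates the idea, not the authors' experimental setup):

```python
def logistic(x, r=3.8):
    return r * x * (1.0 - x)

def delayed_feedback(x0, K, n_steps, r=3.8):
    """Pyragas-type control for a map: the control u_n = K*(x_{n-1} - x_n)
    vanishes on the period-1 orbit, leaving the target orbit unchanged."""
    xs = [x0, logistic(x0, r)]
    for _ in range(n_steps):
        x_prev, x = xs[-2], xs[-1]
        xs.append(logistic(x, r) + K * (x_prev - x))
    return xs

# Control is only locally stabilizing, so start near the unstable fixed point.
x_star = 1.0 - 1.0 / 3.8               # unstable period-1 orbit of the map
orbit = delayed_feedback(0.73, K=-0.5, n_steps=500)
print(abs(orbit[-1] - x_star) < 1e-6)  # True: the unstable orbit is stabilized
```

For r = 3.8 the uncontrolled fixed-point multiplier is f'(x*) = 2 − r = −1.8 (unstable); with K = −0.5 the linearized characteristic roots have modulus √0.5 < 1, so the orbit becomes stable, mirroring the bifurcation analysis the abstract compares against.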

  13. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  14. Human Factors Experimental Design and Analysis Reference

    DTIC Science & Technology

    2007-07-01

    [Table-of-contents excerpt: 1.4.5 Pretesting; 1.5 Research Design Alternatives; 1.6 Analyzing Results; 2.1.1 Threats to Validity; 2.1.2 Quantitative Research Approach; 2.2 Experimental Design Alternatives] ...with a discussion of quantitative models in research that are used to predict human performance. Next empirical model building using polynomial

  15. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
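Digital volume correlation recovers displacements by locating the peak of the cross-correlation between reference and deformed images. A 2-D, integer-pixel sketch of the core computation (real DVC works in 3-D, subset by subset, with subpixel refinement):

```python
import numpy as np

def shift_by_correlation(ref, cur):
    """Integer-pixel displacement of `cur` relative to `ref` from the peak of
    their FFT-based circular cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(ref).conj() * np.fft.fft2(cur)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrapped peak indices to signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                       # stand-in for a fiber image
cur = np.roll(ref, shift=(3, -5), axis=(0, 1))   # known rigid displacement
print(shift_by_correlation(ref, cur))  # (3, -5)
```

Applied subset-wise to confocal stacks of the fluorescently labeled matrix, this yields the displacement fields from which cell-induced matrix deformation is quantified.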

  16. A general way for quantitative magnetic measurement by transmitted electrons

    NASA Astrophysics Data System (ADS)

    Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing

    2016-01-01

    EMCD (electron magnetic circular dichroism) opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from the Fe atoms sitting at nonequivalent crystallographic planes in NiFe2O4; however, it places strict demands on the crystallographic structure of the sample under test. Here, we have further improved and tested the method for quantitative site-specific magnetic measurement so that it applies to more complex crystallographic structures, by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in designing the experimental geometry and the quantitative measurement, etc.), and we take yttrium iron garnet (Y3Fe5O12, YIG), with its more complex crystallographic structure, as an example to demonstrate the method's applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurement with high spatial resolution by transmitted electrons.

  17. A Review of Out-of-School Time Program Quasi-Experimental and Experimental Evaluation Results. Out-of-School Time Evaluation Snapshot.

    ERIC Educational Resources Information Center

    Little, Priscilla M. D.; Harris, Erin

    As the amount of resources allocated to out-of-school (OST) programming and policymakers' demands for research-based results increase, there is increasing interest in rigorous research designs to examine OST program outcomes. This issue of "Out-of-School Time Evaluation Snapshots" reviews 27 quasi-experimental and experimental OST…

  18. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.

  19. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655
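    The stability comparison described above can be illustrated with a deliberately simplified stand-in (hypothetical expression values; the paper's actual measures come from ANOVA and linear regression): rank candidate reference genes by the variance of their expression across conditions, lower variance meaning a more stable internal control.

```python
import numpy as np

# Hypothetical expression values: rows = candidate reference genes,
# columns = experimental conditions (tissues, stages, environments).
expression = np.array([
    [10.1, 10.3,  9.9, 10.2],   # gene A: nearly constant
    [12.0, 14.5, 11.0, 16.0],   # gene B: highly variable
    [ 8.0,  8.4,  8.1,  8.3],   # gene C: nearly constant
])
genes = ["geneA", "geneB", "geneC"]

# Simple stability measure: sample variance across conditions.
# Lower variance = more stable = better internal control.
variances = expression.var(axis=1, ddof=1)
ranking = [genes[i] for i in np.argsort(variances)]
print(ranking)  # most stable candidate first
```

A real analysis would partition the variance by condition (the ANOVA step) rather than pooling it, but the ranking idea is the same.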

  20. Electrical and thermal behavior of unsaturated soils: experimental results

    NASA Astrophysics Data System (ADS)

    Nouveau, Marie; Grandjean, Gilles; Leroy, Philippe; Philippe, Mickael; Hedri, Estelle; Boukcim, Hassan

    2016-05-01

    When soil is affected by a heat source, some of its properties are modified; in particular, the electrical resistivity changes with water content. These changes in turn affect the thermal properties of the soil, i.e., its thermal conductivity and diffusivity. We experimentally examine the changes in electrical resistivity and thermal conductivity for four soils with different grain size distributions and clay contents over a wide range of temperatures, from 20 to 100 °C. This temperature range corresponds to the thermal conditions in the vicinity of a buried high-voltage cable or a geothermal system. Experiments were conducted at the field scale, at a geothermal test facility, and in the laboratory using geophysical devices and probing systems. The results show that the electrical resistivity decreases and the thermal conductivity increases with temperature up to a critical temperature that depends on soil type. At this critical temperature, the air volume in the pore space increases with temperature, and the resulting electrical resistivity also increases. For higher temperatures, the thermal conductivity increases sharply with temperature up to a second temperature limit. Beyond it, the thermal conductivity drops drastically. This limit corresponds to the temperature at which most of the water evaporates from the soil pore space. Once evaporation is complete, the thermal conductivity stabilizes. To explain these experimental results, we modeled the electrical resistivity variations with temperature and water content in the range 20-100 °C, showing that two critical temperatures influence the main processes occurring during heating at temperatures below 100 °C.

  1. Advanced Supersonic Nozzle Concepts: Experimental Flow Visualization Results Paired With LES

    NASA Astrophysics Data System (ADS)

    Berry, Matthew; Magstadt, Andrew; Stack, Cory; Gaitonde, Datta; Glauser, Mark; Syracuse University Team; The Ohio State University Team

    2015-11-01

    Advanced supersonic nozzle concepts are currently under investigation, utilizing multiple bypass streams and airframe integration to bolster performance and efficiency. This work focuses on the parametric study of a supersonic, multi-stream jet with an aft deck. The rectangular nozzle, with a single plane of symmetry, displays very complex and unique flow characteristics. Flow visualization techniques in the form of PIV and schlieren capture flow features at various deck lengths and Mach numbers. LES is compared to the experimental results to both validate the computational model and identify limitations of the simulation. By comparing experimental results to LES, this study will help create a foundation of knowledge for advanced nozzle designs in future aircraft. SBIR Phase II with Spectral Energies, LLC under direction of Barry Kiel.

  2. Non-interferometric quantitative phase imaging of yeast cells

    NASA Astrophysics Data System (ADS)

    Poola, Praveen K.; Pandiyan, Vimal Prabhu; John, Renu

    2015-12-01

    Real-time imaging of live cells is quite difficult without the addition of external contrast agents. Various methods for quantitative phase imaging of living cells have been proposed, such as digital holographic microscopy and diffraction phase microscopy. In this paper, we report theoretical and experimental results of quantitative phase imaging of live yeast cells with nanometric precision using the transport of intensity equation (TIE). We demonstrate nanometric depth sensitivity in imaging live yeast cells using this technique. Being noninterferometric, the technique does not need any coherent light source, and images can be captured through a regular bright-field microscope. This real-time imaging technique delivers depth or 3-D volume information of cells and is highly promising for real-time digital pathology applications, screening of pathogens, and staging of diseases such as malaria, as it does not need any preprocessing of samples.

  3. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is able to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in a wet laboratory. In this way, natural biochemical systems can be better understood.

  4. Experimental results in evolutionary fault-recovery for field programmable analog devices

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Keymeulen, Didier; Duong, Vu; Guo, Xin; Ferguson, M. I.; Stoica, Adrian

    2003-01-01

    This paper presents experimental results of fast intrinsic evolutionary design and evolutionary fault recovery of a 4-bit Digital to Analog Converter (DAC) using the JPL stand-alone board-level evolvable system (SABLES).

  5. Experimental results on the enhanced backscatter phenomenon and its dynamics

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Nelson, William; Ko, Jonathan; Davis, Christopher C.

    2014-10-01

    Enhanced backscatter effects have long been predicted theoretically and demonstrated experimentally. The reciprocity of a turbulent channel generates a group of paired rays with identical trajectory and phase information, which leads to a region in phase space with double the intensity and scintillation index. Though simulation work based on phase screen models has demonstrated the existence of the phenomenon, few experimental results have been published describing its characteristics, and possible applications of the enhanced backscatter phenomenon are still unclear. With the development of commercially available high-powered lasers and advanced cameras with high frame rates, we have successfully captured the enhanced backscatter effects from different reflection surfaces. In addition to static observations, we have also tilted and pre-distorted the transmitted beam at various frequencies to track the dynamic properties of the enhanced backscatter phenomenon and to verify its possible application in guidance and in beam and image correction through atmospheric turbulence. In this paper, experimental results will be described, and discussions of the principle and applications of the phenomenon will be included. Enhanced backscatter effects are best observed at certain levels of turbulence (Cn^2 ≈ 10^-13 m^-2/3), and show significant potential for providing self-guidance in beam correction that doesn't introduce additional costs (unlike providing a beacon laser). Possible applications of this phenomenon include tracking fast-moving objects with lasers, long-distance (>1 km) alignment, and focusing a high-power corrected laser beam over long distances.

  6. Experimental Demonstration of In-Place Calibration for Time Domain Microwave Imaging System

    NASA Astrophysics Data System (ADS)

    Kwon, S.; Son, S.; Lee, K.

    2018-04-01

    In this study, an experimental demonstration of in-place calibration was conducted using the developed time domain measurement system. Experiments were conducted using three calibration methods: in-place calibration and two existing calibrations, namely array rotation and differential calibration. The in-place calibration uses dual receivers located at an equal distance from the transmitter. The signals received at the dual receivers contain similar unwanted components, that is, the directly received signal and antenna coupling. In contrast to the simulations, the antennas are not perfectly matched and there may be unexpected environmental errors; we therefore used the developed experimental system to demonstrate the proposed method. The possible problems of low signal-to-noise ratio and clock jitter, which may exist in time domain systems, were mitigated by averaging repeatedly measured signals. According to the experimental results, the tumor was successfully detected using all three calibration methods. The cross correlation was calculated against the reconstructed image of the ideal differential calibration for a quantitative comparison between the existing rotation calibration and the proposed in-place calibration. The mean cross correlation between the in-place calibration and the ideal differential calibration was 0.80, while the mean cross correlation for the rotation calibration was 0.55. Furthermore, the simulation results were compared with the experimental results to verify the in-place calibration method. A quantitative analysis was also performed, and the experimental results show a tendency similar to the simulation.
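    The cross-correlation figure of merit used above can be sketched as a zero-mean normalized cross correlation between two reconstructed images (synthetic arrays standing in for the microwave reconstructions; all values hypothetical):

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross correlation of two equal-size images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

# Reference: an idealized reconstruction with a single tumor response.
reference = np.zeros((64, 64))
reference[30:34, 30:34] = 1.0

# A noisier reconstruction of the same scene (assumed noise level).
rng = np.random.default_rng(1)
noisy = reference + rng.normal(0.0, 0.05, reference.shape)

print(f"cross correlation: {ncc(reference, noisy):.2f}")
```

Values near 1 indicate that a reconstruction closely matches the ideal differential-calibration image, which is how the 0.80 vs. 0.55 comparison is read.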

  7. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    DTIC Science & Technology

    2010-10-28

    …design, and the time series with nonequivalent control group design. The experimental research approach will randomly assign participants… (Leedy & Ormrod, 2005). According to Fife-Schaw (2006) there are three quasi-experimental designs: the nonequivalent control group design, the time… that have controlled and isolated variables. A specific quantitative approach available to the researcher is the use of surveys. Surveys, in…

  8. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

    The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) both the fine-grained modeling of complex signaling dynamics and identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) experimental validation of dynamic models.

  9. Experimental results of temperature response to stress change: An indication of the physics of earthquake rupture propagation

    NASA Astrophysics Data System (ADS)

    Lin, W.; Yang, X.; Tadai, O.; Zeng, X.; Yeh, E. C.; Yu, C.; Hatakeda, K.; Xu, H.; Xu, Z.

    2016-12-01

    As an earthquake rupture propagates, stress on the fault and in the hanging wall and footwall drops coseismically. Based on thermo-elasticity theory, the temperature of rocks may change together with their elastic deformation in association with this coseismic stress change. This coseismic temperature change is part of the physics of earthquake rupture propagation, but it has not previously been noted and expressly addressed. To understand this issue, we conducted laboratory experiments to quantitatively investigate the temperature response of various typical rocks to rapid stress changes. To this end, we developed a hydrostatic compression apparatus for rock samples with a high-resolution temperature measuring system, which enables us to rapidly load and/or unload the confining pressure. As experimental rock samples, we collected 15 representative rocks from various scientific drilling projects, outcrops of earthquake faults, and quarries around the world. The rock types include sandstone, siltstone, limestone, granite, basalt, tuff, etc. Based on classical thermo-elastic theory, the relationship between the temperature change (dT) of a rock sample and the confining pressure change (dP) in the hydrostatic compression system under adiabatic conditions can be expressed as a linear function. Therefore, we can measure the adiabatic pressure derivative of temperature (dT/dP) directly by monitoring changes in sample temperature and confining pressure during rapid loading and unloading. As preliminary results, the data for the 15 rock samples showed that i) the adiabatic pressure derivative of temperature (dT/dP) of most rocks is about 1.5-6.2 mK/MPa; ii) the dT/dP of sedimentary rocks is larger than that of igneous and metamorphic rocks; and iii) a good linear correlation between dT/dP and the rock's bulk modulus was recognized.
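    Since dT = (dT/dP)·dP is linear under adiabatic conditions, dT/dP can be read directly off the slope of a least-squares fit to the monitored temperature and pressure records. A sketch with made-up loading data:

```python
import numpy as np

# Hypothetical data from one rapid-loading run: confining-pressure
# steps (MPa) and the sample's measured temperature change (mK).
dP = np.array([0.0, 10.0, 20.0, 30.0, 40.0])    # MPa
dT = np.array([0.0, 31.0, 59.5, 90.2, 120.8])   # mK

# Adiabatic thermo-elasticity predicts dT = (dT/dP) * dP, so the
# derivative is the slope of a first-order least-squares fit.
slope, intercept = np.polyfit(dP, dT, 1)
print(f"dT/dP ≈ {slope:.2f} mK/MPa")
```

With these illustrative numbers the slope falls inside the 1.5-6.2 mK/MPa range the abstract reports for most rocks.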

  10. Analytical, Numerical, and Experimental Results on Turbulent Boundary Layers

    DTIC Science & Technology

    1976-07-01

    …a pitot pressure rake where the spacing between probe centers was 0.5 in. near the wall and 1.0 in. away from the wall. Recently, measurements have… "Pressure Gradient, Part II. Analysis of the Experimental Data." BRL R 1543, June 1971. 51. Allen, J. M. "Pitot-Probe Displacement in a Supersonic Turbulent…" …numbers; (4) a description of the data reduction of pitot pressure measurements utilizing these analytical results in order to obtain velocity…

  11. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  12. Closed Loop Two-Phase Thermosyphon of Small Dimensions: a Review of the Experimental Results

    NASA Astrophysics Data System (ADS)

    Franco, Alessandro; Filippeschi, Sauro

    2012-06-01

    A bibliographical review of heat and mass transfer in gravity-assisted Closed Loop Two Phase Thermosyphons (CLTPT) with channels having a hydraulic diameter of the order of a few millimetres and input power below 1 kW is proposed. The available experimental works in the literature are critically analysed in order to highlight the main results and the correlation between mass flow rate and heat input in natural circulation loops. A comparison of different experimental apparatuses and results is made. It is observed that the results differ widely among studies, and in many cases the experimental data disagree with the conventional theory developed for an imposed flow rate. The paper analyses the main differences among the experimental devices and tries to explain these disagreements. From the present analysis it is evident that further systematic studies are required to build a meaningful body of knowledge of the heat and mass transport mechanisms in these devices for practical applications in cooling devices or energy systems.

  13. Nuclear medicine and imaging research: Quantitative studies in radiopharmaceutical science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copper, M.; Beck, R.N.

    1991-06-01

    During the past three years the program has undergone a substantial revitalization. There has been no significant change in the scientific direction of this grant, in which emphasis continues to be placed on developing new or improved methods of obtaining quantitative data from radiotracer imaging studies. However, considerable scientific progress has been made in the three areas of interest: Radiochemistry, Quantitative Methodologies, and Experimental Methods and Feasibility Studies, resulting in a sharper focus of perspective and improved integration of the overall scientific effort. Changes in faculty and staff, including development of new collaborations, have contributed to this, as has acquisition of additional and new equipment and renovation and expansion of the core facilities. 121 refs., 30 figs., 2 tabs.

  14. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  15. An experimental/computational study of sharp fin induced shock wave/turbulent boundary layer interactions at Mach 5 - Experimental results

    NASA Technical Reports Server (NTRS)

    Rodi, Patrick E.; Dolling, David S.

    1992-01-01

    A combined experimental/computational study has been performed of sharp fin induced shock wave/turbulent boundary layer interactions at Mach 5. The current paper focuses on the experiments and analysis of the results. The experimental data include mean surface heat transfer, mean surface pressure distributions and surface flow visualization for fin angles of attack of 6, 8, 10, 12, 14 and 16-degrees at Mach 5 under a moderately cooled wall condition. Comparisons between the results and correlations developed earlier show that Scuderi's correlation for the upstream influence angle (recast in a conical form) is superior to other such correlations in predicting the current results, that normal Mach number based correlations for peak pressure heat transfer are adequate and that the initial heat transfer peak can be predicted using pressure-interaction theory.

  16. Quantitatively differentiating microstructural variations of skeletal muscle tissues by multispectral Mueller matrix imaging

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2016-10-01

    Polarized light is sensitive to the microstructures of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire backscattering polarization Mueller matrix images of bovine skeletal muscle tissues over a 24-hour experimental period and analyze their multispectral behavior using quantitative Mueller matrix parameters. During the rigor mortis and proteolysis of the muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements can reveal rich qualitative structural information. In addition, we analyze the temporal variations of the sample using the multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis of bovine skeletal muscle samples can be distinguished by these MMT parameters. The results presented in this work show that, combined with the multispectral technique, the FDHs and MMT parameters can characterize the microstructural variations of skeletal muscle tissues. These techniques have the potential to be used as tools for quantitative assessment of meat quality in the food industry.

  17. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE, ECBC-TR-1426, Vipin Rastogi… 1. INTRODUCTION: Members of the U.S. Environmental… 2.4 Experimental Design: Each quantitative method was performed three times on three consecutive days. For the CD runs, three…

  18. Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.

    PubMed

    Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike

    2007-11-01

    Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as excitation source. Quantitative partial least squares models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single partial least squares (PLS) component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
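    A single-component PLS1 calibration of the kind the transmission-mode data required can be sketched on synthetic "spectra" (all data simulated; a real workflow would use a library PLS implementation with cross-validation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for transmission Raman spectra: 40 "tablets",
# 200 wavenumber channels; the analyte concentration drives one band.
y = rng.uniform(5.0, 15.0, size=40)          # concentration, % w/w
X = rng.normal(0.0, 0.05, size=(40, 200))    # baseline noise
X[:, 80:90] += y[:, None] * 0.1              # analyte band (assumed)

# Single-component PLS1: center, project onto the covariance
# direction, then regress the resulting score against concentration.
Xc, yc = X - X.mean(axis=0), y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)                       # weight vector
t = Xc @ w                                   # scores
b = (t @ yc) / (t @ t)                       # inner regression coefficient

y_pred = y.mean() + b * ((X - X.mean(axis=0)) @ w)
rmsep = np.sqrt(np.mean((y_pred - y) ** 2))
print(f"calibration RMSE: {rmsep:.2f} % w/w")
```

Because the synthetic signal lives in one spectral direction, one PLS component suffices, mirroring the paper's observation that the transmission-mode model needed only a single component.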

  19. [Quantitative relationships between various representatives of gastrointestinal microflora of experimental animals (rats) in normal conditions and after immunosuppression with imuran].

    PubMed

    Amanov, N A

    1983-06-01

    The influence of imuran (an analog of nitrogen ioprin) on the quantitative relationship between lactobacilli, bifidobacteria, bacteroids and aerobic autoflora in different sections of the gastrointestinal tract of white rats was studied under experimental conditions. On days 7-14-30 after the introduction of imuran into the gastrointestinal tract dysbacteriosis developed; it was characterized by a decrease in the number of lactobacilli and asporogenic anaerobic microflora and an increase in the number of aerobic microorganisms. By days 60-90 the content of aerobic microbes in all sections of the gastrointestinal tract was still elevated, while the rapid restoration of the number of bacteroids took place. Therefore, immunosuppression therapy with imuran may give rise to autoinfectious complications caused by different representatives of infective microflora.

  20. The state of RT-quantitative PCR: firsthand observations of implementation of minimum information for the publication of quantitative real-time PCR experiments (MIQE).

    PubMed

    Taylor, Sean C; Mrkusich, Eli M

    2014-01-01

    In the past decade, the techniques of quantitative PCR (qPCR) and reverse transcription (RT)-qPCR have become accessible to virtually all research labs, producing valuable data for peer-reviewed publications and supporting exciting research conclusions. However, the experimental design and validation processes applied to the associated projects are the result of historical biases adopted by individual labs that have evolved and changed since the inception of the techniques and associated technologies. This has resulted in wide variability in the quality, reproducibility and interpretability of published data as a direct result of how each lab has designed their RT-qPCR experiments. The 'minimum information for the publication of quantitative real-time PCR experiments' (MIQE) was published to provide the scientific community with a consistent workflow and key considerations to perform qPCR experiments. We use specific examples to highlight the serious negative ramifications for data quality when the MIQE guidelines are not applied and include a summary of good and poor practices for RT-qPCR. © 2013 S. Karger AG, Basel.

  1. Adsorption of methanol molecule on graphene: Experimental results and first-principles calculations

    NASA Astrophysics Data System (ADS)

    Zhao, X. W.; Tian, Y. L.; Yue, W. W.; Chen, M. N.; Hu, G. C.; Ren, J. F.; Yuan, X. B.

    2018-04-01

    Adsorption properties of methanol molecules on a graphene surface are studied both theoretically and experimentally. The adsorption geometries, adsorption energies, band structures, densities of states and effective masses are obtained by means of first-principles calculations. It is found that the electronic characteristics and conductivity of graphene are sensitive to methanol adsorption: after a methanol molecule is adsorbed, a bandgap appears. As the adsorption distance increases, the bandgap, adsorption energy and effective mass of the adsorption system decrease, and hence the resistivity of the system decreases gradually; these trends are consistent with the experimental results. Together, the calculations and experiments indicate that graphene-based sensors have a wide range of applications in detecting particular molecules.

  2. A Novel Approach to Teach the Generation of Bioelectrical Potentials from a Descriptive and Quantitative Perspective

    ERIC Educational Resources Information Center

    Rodriguez-Falces, Javier

    2013-01-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are…

  3. Theoretical versus experimental results for the rotordynamic coefficients of eccentric, smooth annular gas seals

    NASA Technical Reports Server (NTRS)

    Childs, Dara W.; Alexander, Chis

    1994-01-01

    This viewgraph presentation presents the following results: (1) The analytical results overpredict the experimental results for the direct stiffness values and incorrectly predict increasing stiffness with decreasing pressure ratios. (2) Theory correctly predicts increasing cross-coupled stiffness, K(sub YX), with increasing eccentricity and inlet preswirl. (3) Direct damping, C(sub XX), underpredicts the experimental results, but the analytical results do correctly show that damping increases with increasing eccentricity. (4) The whirl frequency values predicted by theory are insensitive to changes in the static eccentricity ratio. Although these values match perfectly with the experimental results at 16,000 rpm, the results at the lower speed do not correspond. (5) Theoretical and experimental mass flow rates match at 5000 rpm, but at 16,000 rpm the theoretical results overpredict the experimental mass flow rates. (6) Theory correctly shows the linear pressure profiles and the associated entrance losses with the specified rotor positions.

  4. Strain measurement of objects subjected to aerodynamic heating using digital image correlation: experimental design and preliminary results.

    PubMed

    Pan, Bing; Jiang, Tianyun; Wu, Dafang

    2014-11-01

    In thermomechanical testing of hypersonic materials and structures, direct observation and quantitative strain measurement of the front surface of a test specimen directly exposed to severe aerodynamic heating has been considered a very challenging task. In this work, a novel quartz infrared heating device with an observation window is designed to reproduce the transient thermal environment experienced by hypersonic vehicles. The specially designed experimental system allows the capture of the test article's surface images at various temperatures using an optical system outfitted with a bandpass filter. The captured images are post-processed by digital image correlation to extract full-field thermal deformation. To verify the viability and accuracy of the established system, thermal strains of a chromium-nickel austenitic stainless steel sample heated from room temperature up to 600 °C were determined. The preliminary results indicate that the air disturbance between the camera and the specimen due to heat haze induces apparent distortions in the recorded images and large errors in the measured strains, but the average values of the measured strains are accurate enough. Limitations and further improvements of the proposed technique are discussed.

  5. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception, and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state of the art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research, and there is a clear trend to apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Dissociation coefficients of protein adsorption to nanoparticles as quantitative metrics for description of the protein corona: A comparison of experimental techniques and methodological relevance.

    PubMed

    Hühn, Jonas; Fedeli, Chiara; Zhang, Qian; Masood, Atif; Del Pino, Pablo; Khashab, Niveen M; Papini, Emanuele; Parak, Wolfgang J

    2016-06-01

    Protein adsorption to nanoparticles is described as a chemical reaction in which proteins attach to binding sites on the nanoparticle surface. This process is characterized by a dissociation coefficient, which gives the number of proteins adsorbed per nanoparticle as a function of the protein concentration. Different techniques to experimentally determine dissociation coefficients of protein adsorption to nanoparticles are reviewed. Results of more than 130 experiments in which dissociation coefficients have been determined are compared. The data show that different methods, nanoparticle systems, and proteins can lead to significantly different dissociation coefficients. However, we observed a clear tendency toward smaller dissociation coefficients as the zeta potential of the nanoparticles shifts from negative to positive. The zeta potential is thus a key parameter influencing protein adsorption to the surface of nanoparticles. Our analysis highlights the importance of characterizing the parameters governing protein-nanoparticle interaction for quantitative evaluation and objective literature comparison. Copyright © 2015 Elsevier Ltd. All rights reserved.
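    Dissociation coefficients of the kind compared in this review are typically extracted by fitting a binding isotherm. As a minimal sketch (not the review's own analysis), the snippet below assumes Langmuir-type adsorption, N(c) = Nmax·c/(KD + c), and recovers KD and Nmax from synthetic, noise-free data via a double-reciprocal linearization:

```python
import numpy as np

def fit_dissociation_coefficient(c, n_bound):
    """Estimate K_D and N_max for a Langmuir-type isotherm
    N(c) = N_max * c / (K_D + c) using the double-reciprocal
    linearization 1/N = 1/N_max + (K_D / N_max) * (1/c)."""
    slope, intercept = np.polyfit(1.0 / c, 1.0 / n_bound, 1)
    n_max = 1.0 / intercept
    k_d = slope * n_max
    return k_d, n_max

# Synthetic isotherm: 50 binding sites per nanoparticle, K_D = 2e-6 M
c = np.array([0.5e-6, 1e-6, 2e-6, 5e-6, 1e-5, 2e-5])  # protein concentration, M
n_bound = 50.0 * c / (2e-6 + c)                        # proteins per nanoparticle
k_d, n_max = fit_dissociation_coefficient(c, n_bound)
print(k_d, n_max)  # recovers ~2e-6 M and ~50 sites on noise-free data
```

    For noisy experimental data a direct nonlinear fit is preferable, since the reciprocal transform amplifies errors at low concentrations.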

  7. Identification and Quantitative Analysis of Acetaminophen, Acetylsalicylic Acid, and Caffeine in Commercial Analgesic Tablets by LC-MS

    ERIC Educational Resources Information Center

    Fenk, Christopher J.; Hickman, Nicole M.; Fincke, Melissa A.; Motry, Douglas H.; Lavine, Barry

    2010-01-01

    An undergraduate LC-MS experiment is described for the identification and quantitative determination of acetaminophen, acetylsalicylic acid, and caffeine in commercial analgesic tablets. This inquiry-based experimental procedure requires minimal sample preparation and provides good analytical results. Students are provided sufficient background…

  8. A conductive grating sensor for online quantitative monitoring of fatigue crack.

    PubMed

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue cracks, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability, and the experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue cracks. A finite element model of the sensor is also developed to optimize the sensitivity of crack monitoring, defined as the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensitivity can be enhanced by reducing the number of grating channels, increasing their resistance, and reducing the resistance of the conductive bars.

  9. A conductive grating sensor for online quantitative monitoring of fatigue crack

    NASA Astrophysics Data System (ADS)

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue cracks, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability, and the experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue cracks. A finite element model of the sensor is also developed to optimize the sensitivity of crack monitoring, defined as the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensitivity can be enhanced by reducing the number of grating channels, increasing their resistance, and reducing the resistance of the conductive bars.
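    The sensitivity trends reported in these two records can be reproduced with a simple lumped-resistance picture: n identical grating channels in parallel, in series with the conductive bar. This is an illustrative sketch only, not the authors' finite element model:

```python
def sensitivity(n_channels, r_channel, r_bar):
    """Relative change in total sensor resistance when the first of
    n parallel grating channels breaks; the conductive bar is modeled
    as a series resistance (idealized lumped-element sketch)."""
    r_before = r_bar + r_channel / n_channels
    r_after = r_bar + r_channel / (n_channels - 1)
    return (r_after - r_before) / r_before

base = sensitivity(10, 100.0, 1.0)
# The three trends reported in the abstract:
assert sensitivity(5, 100.0, 1.0) > base    # fewer grating channels
assert sensitivity(10, 200.0, 1.0) > base   # higher channel resistance
assert sensitivity(10, 100.0, 0.1) > base   # lower conductive-bar resistance
print(base)
```

    In the limit of a negligible bar resistance, the sensitivity reduces to 1/(n - 1), which makes the dependence on the channel count explicit.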

  10. Quantitative characterization of genetic parts and circuits for plant synthetic biology.

    PubMed

    Schaumberg, Katherine A; Antunes, Mauricio S; Kassaw, Tessema K; Xu, Wenlong; Zalewski, Christopher S; Medford, June I; Prasad, Ashok

    2016-01-01

    Plant synthetic biology promises immense technological benefits, including the potential development of a sustainable bio-based economy through the predictive design of synthetic gene circuits. Such circuits are built from quantitatively characterized genetic parts; however, this characterization is a significant obstacle in work with plants because of the time required for stable transformation. We describe a method for rapid quantitative characterization of genetic plant parts using transient expression in protoplasts and dual luciferase outputs. We observed experimental variability in transient-expression assays and developed a mathematical model to describe, as well as statistical normalization methods to account for, this variability, which allowed us to extract quantitative parameters. We characterized >120 synthetic parts in Arabidopsis and validated our method by comparing transient expression with expression in stably transformed plants. We also tested >100 synthetic parts in sorghum (Sorghum bicolor) protoplasts, and the results showed that our method works in diverse plant groups. Our approach enables the construction of tunable gene circuits in complex eukaryotic organisms.

  11. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    PubMed Central

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-01-01

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, broncho-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to survey the most recent clinical research conducted with atypical samples and analyzed by quantitative proteomics. Taking the most recent approaches used with non-traditional sources as a reference allows us to compare new strategies in the development of novel experimental models. These references also contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine. PMID:28248241

  12. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    PubMed

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, broncho-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to survey the most recent clinical research conducted with atypical samples and analyzed by quantitative proteomics. Taking the most recent approaches used with non-traditional sources as a reference allows us to compare new strategies in the development of novel experimental models. These references also contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  13. Experimental segregation of iron-nickel metal, iron-sulfide, and olivine in a thermal gradient: Preliminary results

    NASA Technical Reports Server (NTRS)

    Jurewicz, Stephen R.; Jones, J. H.

    1993-01-01

    Speculation about the possible mechanisms for core formation in small asteroids raises more questions than answers. Petrologic evidence from iron meteorites, pallasites, and astronomical observations of M asteroids suggests that many small bodies were capable of core formation. Recent work by Taylor reviews the geochemical evidence and examines the possible physical/mechanical constraints on segregation processes. Taylor's evaluation suggests that extensive silicate partial melting (preferably 50 vol. percent or greater) is required before metal can segregate from the surrounding silicate and form a metal core. The arguments for large degrees of silicate partial melting are two-fold: (1) elemental trends in iron meteorites require that the metal was at is liquidus; and (2) experimental observations of metal/sulfide inclusions in partially molten silicate meteorites show that the metal/sulfide tends to form spherules in the liquid silicate due to surface tension effects. Taylor points out that for these metal spherules to sink through a silicate mush, high degrees of silicate partial melting are required to lower the silicate yield strength. Although some qualitative experimental data exists, little is actually known about the behavior of metals and liquid sulfides dispersed in silicate systems. In addition, we have been impressed with the ability of cumulative olivine to expel trapped liquid when placed in a thermal gradient. Consequently, we undertook to accomplish the following: (1) experimentally evaluate the potential for metal/sulfide/silicate segregation in a thermal gradient; and (2) obtain quantitative data of the wetting parameters of metal-sulfide melts among silicate grains.

  14. Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Jiang, Dejun; Zhao, Shusen; Shen, Jingling

    2008-03-01

    A method was proposed to quantitatively analyze mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals: benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid. An illicit drug and a common adulterant, methamphetamine and flour, were then selected for the experiment. The experimental results were in close agreement with the actual contents, suggesting that this could be an effective method for the quantitative identification of illicit drugs.
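    The linear-regression step described above amounts to an ordinary least-squares decomposition of a mixture spectrum into known component spectra. The sketch below uses synthetic Gaussian absorption features, not measured THz data, to show the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
freq = np.linspace(0.2, 2.6, 200)  # frequency axis in THz (illustrative)

def peak(center, width=0.08, height=1.0):
    """Gaussian stand-in for an absorption feature."""
    return height * np.exp(-((freq - center) / width) ** 2)

# Reference absorption spectra of three known components (one per column)
components = np.column_stack([
    peak(0.8) + peak(1.9, height=0.5),
    peak(1.2) + peak(2.3, height=0.7),
    peak(1.6),
])

true_fractions = np.array([0.5, 0.3, 0.2])
mixture = components @ true_fractions + rng.normal(0.0, 0.005, freq.size)

# Least-squares decomposition of the mixture spectrum
coeffs, *_ = np.linalg.lstsq(components, mixture, rcond=None)
mass_pct = 100.0 * coeffs / coeffs.sum()
print(mass_pct)  # close to [50, 30, 20]
```

    Real spectra also require baseline correction and, because mass fractions cannot be negative, a non-negative least-squares solver is often the safer choice.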

  15. At Odds: Reconciling Experimental and Theoretical Results in High School Physics

    ERIC Educational Resources Information Center

    Gates, Joshua

    2009-01-01

    For this experiment, students are divided into 2 groups and presented with a static equilibrium force-balance problem to solve. One group works entirely experimentally and the other group theoretically, using Newton's laws. The groups present their seemingly dissimilar results and must reconcile them through discussion. (Contains 3 figures.)

  16. Rumination and modes of processing around meal times in women with anorexia nervosa: qualitative and quantitative results from a pilot study.

    PubMed

    Cowdrey, Felicity A; Stewart, Anne; Roberts, Jill; Park, Rebecca J

    2013-09-01

    The primary aim of this exploratory study was to examine, qualitatively and quantitatively, the effects of rumination, mindful breathing, and distraction on processing styles and the meal time experience in women with a history of anorexia nervosa (AN). A quasi-experimental within-participant design was employed. Thirty-seven women with a history of AN, all experiencing current eating disorder psychopathology, each listened to a single rumination, mindful breathing, and distraction exercise before a meal time. Qualitative and quantitative analyses were employed. Specific themes were extracted for each exercise, including avoidance, being in the moment, and rumination. The rumination exercise led to significantly greater analytical self-focus. Mindful breathing led to significantly greater experiential self-focus compared with distraction in partially weight-restored AN participants. In AN, self-material is processed in a ruminative way and avoidance is valued. It is difficult to shift individuals with AN out of rumination around meal times using brief mindful breathing. Future research should investigate at what stage of AN illness mindfulness-based and acceptance-based strategies are useful and how these strategies could be incorporated into treatment. Copyright © 2013 John Wiley & Sons, Ltd and Eating Disorders Association.

  17. Phase transition kinetics in DIET of vanadium pentoxide. I. Experimental results

    NASA Astrophysics Data System (ADS)

    Ai, R.; Fan, H.-J.; Marks, L. D.

    1993-01-01

    Experimental results of the kinetics of phase transformation in vanadium pentoxide during surface loss of oxygen from electron irradiation are described. Phase transformations under three different regimes were examined: (a) low flux; (b) intermediate flux and (c) high flux. Different phase transformation routes were observed under different fluxes. In a companion paper, numerical calculations are presented demonstrating that these results are due to a mixed interface/diffusion controlled phase transition pumped by surface oxygen loss.

  18. Anthropometric and quantitative EMG status of femoral quadriceps before and after conventional kinesitherapy with and without magnetotherapy.

    PubMed

    Graberski Matasović, M; Matasović, T; Markovac, Z

    1997-06-01

    The frequency of femoral quadriceps muscle hypotrophy has made it a significant therapeutic problem. Efforts are being made to improve the standard scheme of kinesitherapeutic treatment by adding more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients of both sexes, of similar age and intensity of hypotrophy, were included in the study. They were divided into groups A and B, the experimental and the control group (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Quantitative electromyographic (EMG) analysis was used to check the treatment results achieved after 5 and 6 weeks of the treatment period. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain and a decreased risk of complications: the same results were obtained in the experimental group one week earlier than in the control group. However, the quantitative EMG analysis did not prove to be a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.

  19. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  20. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)

    NASA Astrophysics Data System (ADS)

    Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, including electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results demonstrate the stable operation of BETA at 1 T and are compared to both analytical design and finite element calculations. The experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are also explored.

  1. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA).

    PubMed

    Bates, E M; Birmingham, W J; Romero-Talamás, C A

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, including electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results demonstrate the stable operation of BETA at 1 T and are compared to both analytical design and finite element calculations. The experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are also explored.

  2. [Influence and correlation of attitude, availability and institutional support to research implementation in nursing practice – results from an exploratory, cross-sectional quantitative study].

    PubMed

    Haslinger-Baumann, Elisabeth; Lang, Gert; Müller, Gerhard

    2015-06-01

    The concrete application of research findings in nursing practice is a multidimensional process. In Austria, there are currently no results available that explain the factors influencing and associated with the implementation of research in hospitals. The aim of the study was to investigate the influence and interrelationship of individual attitudes towards research utilization, the availability of research results, and institutional support among nurses in Austrian hospitals with respect to research application. In a non-experimental quantitative cross-sectional design, a multi-centre study (n = 10) was performed in 2011. The sample comprises 178 certified nurses who were interviewed with a survey questionnaire. The multiple regression analysis shows that a positive attitude towards research use (β = 0.388, p < 0.001), the availability of processed research results (β = 0.470, p < 0.001), and adequate institutional support (β = 0.142, p < 0.050) have a significant influence on the application of research results. The path analysis shows that course attendance in evidence-based nursing has a strong positive influence on research application (β = 0.464; p < 0.001). Health institutions are, in line with their legal mandate, called on to make use of this positive attitude and to supply supporting measures in order to introduce research results into daily nursing practice.
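    Standardized regression coefficients (β) like those reported here are ordinary least-squares coefficients computed on z-scored variables. A minimal sketch on synthetic data follows; the variable names merely mimic the study's constructs, and the numbers are invented:

```python
import numpy as np

def standardized_betas(X, y):
    """OLS coefficients after z-scoring predictors and outcome,
    i.e. the standardized betas reported in regression tables."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    coef, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return coef

rng = np.random.default_rng(1)
n = 2000
attitude = rng.normal(size=n)      # hypothetical predictor scores
availability = rng.normal(size=n)
support = rng.normal(size=n)
# Invented outcome loosely mimicking the reported effect pattern
research_use = (0.4 * attitude + 0.45 * availability + 0.15 * support
                + rng.normal(scale=0.3, size=n))

betas = standardized_betas(
    np.column_stack([attitude, availability, support]), research_use)
print(betas)  # availability > attitude > support, as in the study
```

    Because the predictors and outcome are centered before fitting, no intercept column is needed; the coefficients are directly comparable across predictors measured on different scales.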

  3. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria for pulmonary impairments. In general, rib movement is assessed with fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images to form a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of the ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: limited rib movements appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Dynamic bone images can thus be a new diagnostic tool for the quantitative analysis of rib movements without additional radiation dose.

  4. Hydrodynamic Radii of Intrinsically Disordered Proteins Determined from Experimental Polyproline II Propensities

    PubMed Central

    Tomasso, Maria E.; Tarver, Micheal J.; Devarajan, Deepa; Whitten, Steven T.

    2016-01-01

    The properties of disordered proteins are thought to depend on intrinsic conformational propensities for polyproline II (PPII) structure. While intrinsic PPII propensities have been measured for the common biological amino acids in short peptides, the ability of these experimentally determined propensities to quantitatively reproduce structural behavior in intrinsically disordered proteins (IDPs) has not been established. Presented here are results from molecular simulations of disordered proteins showing that the hydrodynamic radius (Rh) can be predicted from experimental PPII propensities with good agreement, even when charge-based considerations are omitted. The simulations demonstrate that Rh and chain propensity for PPII structure are linked via a simple power-law scaling relationship, which was tested using the experimental Rh of 22 IDPs covering a wide range of peptide lengths, net charge, and sequence composition. Charge effects on Rh were found to be generally weak when compared to PPII effects on Rh. Results from this study indicate that the hydrodynamic dimensions of IDPs are evidence of considerable sequence-dependent backbone propensities for PPII structure that qualitatively, if not quantitatively, match conformational propensities measured in peptides. PMID:26727467
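    A power-law scaling relationship of the kind linking Rh to chain properties is conventionally fit by linear regression in log-log space. The constants below are illustrative only, not the values derived in the paper:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by linear regression in log-log space."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

# Synthetic chain lengths and hydrodynamic radii obeying Rh = 2.2 * N**0.55
# (illustrative constants, not the parameterization from the paper)
n_res = np.array([30.0, 50.0, 80.0, 120.0, 200.0, 300.0])
r_h = 2.2 * n_res ** 0.55
a, b = fit_power_law(n_res, r_h)
print(a, b)  # recovers ~2.2 and ~0.55 on noise-free data
```

    The log-log transform turns the scaling exponent into a slope, so the quality of a power-law description can be judged directly from the linearity of the transformed data.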

  5. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  6. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  7. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results.

    PubMed

    Humada, Ali M; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M; Ahmed, Mushtaq N

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed, mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the ideality factor of the diode, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions.
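    The three parameters named in the abstract (IL, Io, n) are those of the standard single-diode PV model; a minimal sketch of that model, assuming a module of 36 series cells and illustrative parameter values (none of the numbers below come from the paper):

```python
import math

def pv_current(v, il=8.0, io=1e-9, n=1.3, ns=36, t_cell=298.15):
    """Single-diode PV model: I = IL - Io * (exp(V / (n * Ns * Vt)) - 1),
    with Vt = k*T/q the thermal voltage. Parameter values are illustrative."""
    k_b = 1.380649e-23   # Boltzmann constant, J/K
    q = 1.602176634e-19  # elementary charge, C
    vt = k_b * t_cell / q
    return il - io * (math.exp(v / (n * ns * vt)) - 1.0)

# At short circuit (V = 0) the output current equals the photocurrent IL;
# as V approaches the open-circuit voltage, the current falls to zero.
i_sc = pv_current(0.0)
```

    Sweeping `v` from 0 toward the open-circuit voltage traces the I-V characteristic curve the abstract refers to.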

  8. Experimental Results from a Resonant Dielectric Laser Accelerator

    NASA Astrophysics Data System (ADS)

    Yoder, Rodney; McNeur, Joshua; Sozer, Esin; Travish, Gil; Hazra, Kiran Shankar; Matthews, Brian; England, Joel; Peralta, Edgar; Wu, Ziran

    2015-04-01

    Laser-powered accelerators have the potential to operate with very large accelerating gradients (~ GV/m) and represent a path toward extremely compact colliders and accelerator technology. Optical-scale laser-powered devices based on field-shaping structures (known as dielectric laser accelerators, or DLAs) have been described and demonstrated recently. Here we report on the first experimental results from the Micro-Accelerator Platform (MAP), a DLA based on a slab-symmetric resonant optical-scale structure. As a resonant (rather than near-field) device, the MAP is distinct from other DLAs. Its cavity resonance enhances its accelerating field relative to the incoming laser fields, which are coupled efficiently through a diffractive optic on the upper face of the device. The MAP demonstrated modest accelerating gradients in recent experiments, in which it was powered by a Ti:Sapphire laser well below its breakdown limit. More detailed results and some implications for future developments will be discussed. Supported in part by the U.S. Defense Threat Reduction Agency (UCLA); U.S. Dept of Energy (SLAC); and DARPA (SLAC).

  9. From Deuterium to Free Neutrons - Recent Experimental Results

    NASA Astrophysics Data System (ADS)

    Kuhn, Sebastian

    2009-05-01

    Lepton scattering has long been used to gather data on the internal structure of both protons and neutrons. Assuming isospin symmetry, these data can be used to pin down the contributions of both u and d quarks to the spatial and momentum-spin structure of the nucleon and its excitations. In this context, information on the neutron is crucial and is typically obtained from experiments on few-body nuclear targets (predominantly ^3He and deuterium). However, the need to account for binding effects complicates the interpretation of these experiments. On the other hand, detailed studies of the reaction mechanism can yield important new information on the structure of few-body nuclei and the interplay of nuclear and quark degrees of freedom. Recent theoretical and experimental advances have allowed us to make significant progress on both fronts: a cleaner extraction of neutron properties from nuclear data and a better understanding of nuclear modifications of the bound neutron structure. I will concentrate on recent results on the deuteron. I will present a new extraction of neutron spin structure functions in the resonance and large-x region (from the EG1 experiment with CLAS at Jefferson Lab). The same data can also be used for a detailed comparison with modern calculations of quasi-elastic spin-dependent scattering on the deuteron. A second experimental program with CLAS uses the technique of "spectator tagging" to extract the unpolarized structure functions of the neutron with minimal uncertainties from nuclear effects. By mapping out the dependence of the cross section on the "spectator" momentum, we can learn about final state interactions between the struck nucleon and the spectator, as well as modifications of the neutron structure due to nuclear binding. I will present preliminary results from the "BoNuS" experiment which pushed the detection limit of the spectator proton down to momenta of 70 MeV/c, where nuclear corrections should become small.

  10. From information theory to quantitative description of steric effects.

    PubMed

    Alipour, Mojtaba; Safari, Zahra

    2016-07-21

    Immense efforts have been made in the literature to apply information theory descriptors to investigating the electronic structure theory of various systems. In the present study, information theoretic quantities, such as Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, have been used to present a quantitative description of one of the most widely used concepts in chemistry, namely steric effects. Taking the experimental steric scales for different compounds as benchmark sets, there are reasonable linear relationships between the experimental scales of the steric effects and theoretical values of steric energies calculated from information theory functionals. Examining the results obtained from the information theoretic quantities with the two representations of electron density and shape function shows that the Shannon entropy has the best performance for this purpose. The usefulness of considering the contributions of functional-group steric energies and geometries, as well as of dissecting the effects of both global and local information measures simultaneously, has also been explored. Furthermore, the utility of the information functionals for the description of steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, has been analyzed. The functionals of information theory correlate remarkably with the stability of systems and experimental scales. Overall, these findings show that the information theoretic quantities can be introduced as quantitative measures of steric effects and provide further evidence of the quality of information theory in helping theoreticians and experimentalists interpret different problems in real systems.

  11. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that expression levels of reference genes are adequately consistent among the samples used, or compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results indicate that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
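    The idea of screening candidate reference genes for expression stability can be sketched with a much simpler surrogate than the ANOVA/regression measures the study uses: rank genes by the standard deviation of their log2 expression across samples. All gene names and expression values below are hypothetical.

```python
import math

def rank_by_stability(expression):
    """Rank candidate reference genes by the sample standard deviation of
    their log2 expression across conditions (lower = more stable).
    A crude surrogate for ANOVA/regression-based stability measures."""
    scores = {}
    for gene, values in expression.items():
        logs = [math.log2(v) for v in values]
        mean = sum(logs) / len(logs)
        var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
        scores[gene] = math.sqrt(var)
    return sorted(scores, key=scores.get)

# Hypothetical expression of three candidates across four conditions.
candidates = {
    "UBQ": [102.0, 98.0, 101.0, 99.0],   # nearly constant
    "ACT": [80.0, 120.0, 95.0, 110.0],   # moderately variable
    "EF1": [10.0, 400.0, 55.0, 220.0],   # highly variable
}
ranking = rank_by_stability(candidates)  # most stable gene first
```

    A real analysis would, as in the study, also separate between-condition effects from residual variation, but the ranking step has this shape.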

  12. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes for plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task in which students analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test using the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test according to Marzano and a questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  13. A Quantitative Model of Keyhole Instability Induced Porosity in Laser Welding of Titanium Alloy

    NASA Astrophysics Data System (ADS)

    Pang, Shengyong; Chen, Weidong; Wang, Wen

    2014-06-01

    Quantitative prediction of the porosity defects in deep penetration laser welding has generally been considered as a very challenging task. In this study, a quantitative model of porosity defects induced by keyhole instability in partial penetration CO2 laser welding of a titanium alloy is proposed. The three-dimensional keyhole instability, weld pool dynamics, and pore formation are determined by direct numerical simulation, and the results are compared to prior experimental results. It is shown that the simulated keyhole depth fluctuations could represent the variation trends in the number and average size of pores for the studied process conditions. Moreover, it is found that it is possible to use the predicted keyhole depth fluctuations as a quantitative measure of the average size of porosity. The results also suggest that due to the shadowing effect of keyhole wall humps, the rapid cooling of the surface of the keyhole tip before keyhole collapse could lead to a substantial decrease in vapor pressure inside the keyhole tip, which is suggested to be the mechanism by which shielding gas enters into the porosity.

  14. Physical mechanism of comet outbursts - An experimental result

    NASA Technical Reports Server (NTRS)

    Hartmann, William K.

    1993-01-01

    Attention is given to an experimental investigation of the physical mechanism of comet outbursts which is consistent with the general picture of mantle presence on comets and clarifies the relation of mantles to eruptive activity. The experiment and closeup observation of Comet P/Halley suggest a result different from most mathematical models in that the release of gas pressure does not occur only from uniform gas flow out of the entire surface. In some active comets near perihelion within a few AU of the sun, gas production rates and disturbance of the surface may be so high that the outflow is nearly continuous, with the regolith being entirely stripped away, as in many of the models. The present model provides a cyclic eruption and recharge mechanism which is lacking in most other models.

  15. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.

  16. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  17. Quantitation of cholesterol incorporation into extruded lipid bilayers.

    PubMed

    Ibarguren, Maitane; Alonso, Alicia; Tenchov, Boris G; Goñi, Felix M

    2010-09-01

    Cholesterol incorporation into lipid bilayers, in the form of multilamellar vesicles or extruded large unilamellar vesicles, has been quantitated. To this end, the cholesterol contents of bilayers prepared from phospholipid:cholesterol mixtures containing 33-75 mol% cholesterol were measured and compared with the original mixture before lipid hydration. There is a great diversity of cases, but under most conditions the actual cholesterol proportion present in the extruded bilayers is much lower than predicted. A quantitative analysis of the vesicles is thus required before any experimental study is undertaken. © 2010 Elsevier B.V. All rights reserved.

  18. Development of quantitative radioactive methodologies on paper to determine important lateral-flow immunoassay parameters.

    PubMed

    Mosley, Garrett L; Nguyen, Phuong; Wu, Benjamin M; Kamei, Daniel T

    2016-08-07

    The lateral-flow immunoassay (LFA) is a well-established diagnostic technology that has recently seen significant advancements due in part to the rapidly expanding fields of paper diagnostics and paper-fluidics. As LFA-based diagnostics become more complex, it becomes increasingly important to quantitatively determine important parameters during the design and evaluation process. However, current experimental methods for determining these parameters have certain limitations when applied to LFA systems. In this work, we describe our novel methods of combining paper and radioactive measurements to determine nanoprobe molarity, the number of antibodies per nanoprobe, and the forward and reverse rate constants for nanoprobe binding to immobilized target on the LFA test line. Using a model LFA system that detects the presence of the protein transferrin (Tf), we demonstrate the application of our methods, which involve quantitative experimentation and mathematical modeling. We also compare the results of our rate constant experiments with traditional experiments to demonstrate how our methods more appropriately capture the influence of the LFA environment on the binding interaction. Our novel experimental approaches can therefore more efficiently guide the research process for LFA design, leading to more rapid advancement of the field of paper-based diagnostics.

  19. PRIORITIZING FUTURE RESEARCH ON OFF-LABEL PRESCRIBING: RESULTS OF A QUANTITATIVE EVALUATION

    PubMed Central

    Walton, Surrey M.; Schumock, Glen T.; Lee, Ky-Van; Alexander, G. Caleb; Meltzer, David; Stafford, Randall S.

    2015-01-01

    Background Drug use for indications not approved by the Food and Drug Administration exceeds 20% of prescribing. Available compendia indicate that a minority of off-label uses are well supported by evidence. Policy makers, however, lack information to identify where systematic reviews of the evidence or other research would be most valuable. Methods We developed a quantitative model for prioritizing individual drugs for future research on off-label uses. The base model incorporated three key factors: (1) the volume of off-label use with inadequate evidence, (2) safety, and (3) cost and market considerations. Nationally representative prescribing data were used to estimate the number of off-label drug uses by indication from 1/2005 through 6/2007 in the United States, and these indications were then categorized according to the adequacy of scientific support. Black box warnings and safety alerts were used to quantify drug safety. Drug cost, date of market entry, and marketing expenditures were used to quantify cost and market considerations. Each drug was assigned a relative value for each factor, and the factors were then weighted in the final model to produce a priority score. Sensitivity analyses were conducted by varying the weightings and model parameters. Results Drugs that were consistently ranked highly in both our base model and sensitivity analyses included quetiapine, warfarin, escitalopram, risperidone, montelukast, bupropion, sertraline, venlafaxine, celecoxib, lisinopril, duloxetine, trazodone, olanzapine, and epoetin alfa. Conclusion Future research into off-label drug use should focus on drugs used frequently with inadequate supporting evidence, particularly if further concerns are raised by known safety issues, high drug cost, recent market entry, and extensive marketing. Based on quantitative measures of these factors, we have prioritized drugs where targeted research and policy activities have high potential value. PMID:19025425
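    The weighted-factor prioritization described in the Methods can be sketched as a simple weighted sum; the normalization of each factor to [0, 1] and the weights below are illustrative assumptions, not the study's actual parameterization.

```python
def priority_score(volume, safety, cost_market, weights=(0.5, 0.25, 0.25)):
    """Weighted priority score from three factors, each normalized to [0, 1]:
    volume of inadequately supported off-label use, safety concerns, and
    cost/market considerations. Weights are illustrative, not the study's."""
    w_vol, w_safe, w_cost = weights
    return w_vol * volume + w_safe * safety + w_cost * cost_market

# A drug with heavy unsupported off-label use and safety alerts outranks
# one with little off-label use (all values hypothetical).
high = priority_score(volume=0.9, safety=0.8, cost_market=0.7)
low = priority_score(volume=0.2, safety=0.1, cost_market=0.3)
```

    The study's sensitivity analyses correspond to re-running such a ranking with varied `weights` and checking which drugs stay near the top.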

  20. Does Training in Table Creation Enhance Table Interpretation? A Quasi-Experimental Study with Follow-Up

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.; Wong, Kendal

    2016-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents results of a quasi-experimental study with longitudinal follow-up that tested the…

  1. Wind Code Application to External Forebody Flowfields with Comparisons to Experimental Results

    NASA Technical Reports Server (NTRS)

    Frate, F. C.; Kim, H. D.

    2001-01-01

    The WIND Code, a general purpose Navier-Stokes solver, has been utilized to obtain supersonic external flowfield Computational Fluid Dynamics (CFD) solutions over an axisymmetric, parabolic forebody with comparisons made to wind tunnel experimental results. Various cases have been investigated at supersonic freestream conditions ranging from Mach 2.0 to 3.5, at 0 deg and 3 deg angles-of-attack, and with either a sharp-nose or blunt-nose forebody configuration. Both a turbulent (Baldwin-Lomax algebraic turbulence model) and a laminar model have been implemented in the CFD. Obtaining the solutions involved utilizing either the parabolized- or full-Navier-Stokes analyses supplied in WIND. Comparisons have been made with static pressure measurements, with boundary-layer rake and flowfield rake pitot pressure measurements, and with temperature sensitive paint experimental results. Using WIND's parabolized Navier-Stokes capability, grid sequencing, and the Baldwin-Lomax algebraic turbulence model allowed for significant reductions in computational time while still providing good agreement with experiment. Given that CFD and experiment compare well, WIND is found to be a good computational platform for solving this type of forebody problem, and the grids developed in conjunction with it will be used in the future to investigate varying freestream conditions not tested experimentally.

  2. A quantitative model of optimal data selection in Wason's selection task.

    PubMed

    Hattori, Masasi

    2002-10-01

    The optimal data selection model proposed by Oaksford and Chater (1994) successfully formalized Wason's selection task (Wason, 1966). The model, however, involved some questionable assumptions and was also not sufficient as a model of the task because it could not provide quantitative predictions of the card selection frequencies. In this paper, the model was revised to provide quantitative fits to the data. The model can predict the selection frequencies of cards based on a selection tendency function (STF), or conversely, it enables the estimation of subjective probabilities from data. Past experimental data were first re-analysed based on the model. In Experiment 1, the superiority of the revised model was shown. However, when the relationship between antecedent and consequent was forced to deviate from the biconditional form, the model was not supported. In Experiment 2, it was shown that sufficient emphasis on probabilistic information can affect participants' performance. A detailed experimental method to sort participants by probabilistic strategies was introduced. Here, the model was supported by a subgroup of participants who used the probabilistic strategy. Finally, the results were discussed from the viewpoint of adaptive rationality.
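    The selection tendency function (STF) mentioned above maps a card's informativeness to a predicted selection frequency; one way to sketch such a mapping is a logistic function. The logistic form and the slope and threshold values below are assumptions for illustration, not the function fitted in the paper.

```python
import math

def selection_probability(info_gain, slope=8.0, threshold=0.25):
    """Illustrative selection tendency function (STF): a logistic map from
    a card's expected information gain to its predicted selection frequency.
    The logistic form, slope, and threshold are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(-slope * (info_gain - threshold)))

# Cards expected to be more informative are predicted to be chosen more
# often (hypothetical gain values for two cards).
p_card_a = selection_probability(0.60)
p_card_b = selection_probability(0.15)
```

    Conversely, fitting such a function to observed selection frequencies allows subjective probabilities to be estimated from data, as the abstract notes.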

  3. Quantitative phase imaging and complex field reconstruction by pupil modulation differential phase contrast

    PubMed Central

    Lu, Hangwen; Chung, Jaebum; Ou, Xiaoze; Yang, Changhuei

    2016-01-01

    Differential phase contrast (DPC) is a non-interferometric quantitative phase imaging method achieved by using an asymmetric imaging procedure. We report a pupil modulation differential phase contrast (PMDPC) imaging method by filtering a sample’s Fourier domain with half-circle pupils. A phase gradient image is captured with each half-circle pupil, and a quantitative high resolution phase image is obtained after a deconvolution process with a minimum of two phase gradient images. Here, we introduce the PMDPC quantitative phase image reconstruction algorithm and realize it experimentally in a 4f system with an SLM placed at the pupil plane. In our current experimental setup with a numerical aperture of 0.36, we obtain a quantitative phase image with a resolution of 1.73 μm after computationally removing system aberrations and refocusing. We also extend the depth of field digitally by 20 times to ±50 μm with a resolution of 1.76 μm. PMID:27828473

  4. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    PubMed Central

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed, mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the ideality factor of the diode, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions. PMID:27035575

  5. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogenous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys for near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  6. Experimental Investigation of Gas/Slag/Matte/Tridymite Equilibria in the Cu-Fe-O-S-Si System in Controlled Gas Atmospheres: Experimental Results at 1473 K (1200 °C) and P(SO2) = 0.25 atm

    NASA Astrophysics Data System (ADS)

    Fallah-Mehrjardi, Ata; Hidayat, Taufiq; Hayes, Peter C.; Jak, Evgueni

    2017-12-01

    Experimental studies were undertaken to determine the gas/slag/matte/tridymite equilibria in the Cu-Fe-O-S-Si system at 1473 K (1200 °C), P(SO2) = 0.25 atm, and a range of oxygen partial pressures, P(O2). The experimental methodology involved high-temperature equilibration using a substrate support technique in controlled gas atmospheres (CO/CO2/SO2/Ar), rapid quenching of equilibrium phases, followed by direct measurement of the chemical compositions of the phases with Electron Probe X-ray Microanalysis (EPMA). The experimental data for slag and matte were presented as a function of copper concentration in matte (matte grade). The data provided are essential for evaluating the effect of oxygen potential under controlled atmosphere on the matte grade, the liquidus composition of the slag, and chemically dissolved copper in the slag. The new data provide an accurate and reliable quantitative foundation for improvement of the thermodynamic databases for copper-containing systems.

  7. Experimental results for characterization of a tapered plastic optical fiber sensor based on SPR

    NASA Astrophysics Data System (ADS)

    Cennamo, N.; Galatus, R.; Zeni, L.

    2015-05-01

    The experimental results obtained with two different Plastic Optical Fiber (POF) geometries, tapered and not-tapered, for a sensor based on Surface Plasmon Resonance (SPR) are presented. SPR is used for determining the refractive index variations at the interface between a gold layer and a dielectric medium (aqueous medium). In this work, SPR sensors in POF configurations, useful for bio-sensing applications, have been realized for the optimization of the sensitivity and experimentally tested. The results show that the sensitivity increases in the tapered POF configuration as the refractive index of the aqueous medium increases.

  8. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

    Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and
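    The agreement between mAs settings reported above rests on Bland-Altman statistics: the mean of the paired differences (bias) and its 95% limits of agreement. A minimal sketch with hypothetical paired tissue measurements (none of the values below come from the study):

```python
import math

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = sum(diffs) / len(diffs)
    var = sum((d - bias) ** 2 for d in diffs) / (len(diffs) - 1)
    sd = math.sqrt(var)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical aerated-tissue percentages from scans at two mAs settings.
scan_60 = [12.0, 35.5, 48.0, 22.4, 60.1]
scan_15 = [12.4, 35.0, 48.5, 22.0, 60.6]
bias, lower, upper = bland_altman(scan_60, scan_15)
```

    A bias near zero with narrow limits of agreement, as in the 60 mAs vs 15 mAs comparison, indicates that the two dose settings yield interchangeable quantitative results.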

  9. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  10. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data*

    PubMed Central

    Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy

    2011-01-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data, we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for, e.g., GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510

  11. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors’ two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, ex vivo, and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the authors discovered that the errors ranged from 0% for relatively small-size targets to 26% for relatively large-size targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular

  12. Propagation effects for land mobile satellite systems: Overview of experimental and modeling results

    NASA Technical Reports Server (NTRS)

    Goldhirsh, Julius; Vogel, Wolfhard J.

    1992-01-01

    Models developed and experiments performed to characterize the propagation environment associated with land mobile communication using satellites are discussed. Experiments were carried out with transmitters on stratospheric balloons, remotely piloted aircraft, helicopters, and geostationary satellites. This text is comprised of compiled experimental results for the expressed use of communications engineers, designers of planned Land Mobile Satellite Systems (LMSS), and modelers of propagation effects. The results presented here are mostly derived from systematic studies of propagation effects for LMSS geometries in the United States associated with rural and suburban regions. Where applicable, the authors also draw liberally from the results of other related investigations in Canada, Europe, and Australia. Frequencies near 1500 MHz are emphasized to coincide with frequency bands allocated for LMSS by the International Telecommunication Union, although earlier experimental work at 870 MHz is also included.

  13. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small-scale plant cultivation, HT systems place much higher demands, from both a conceptual and a logistical point of view, on experimental design, on the actual plant cultivation conditions, and on the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed to elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications.

  14. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
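    The mutual-information-based inference discussed above ultimately rests on estimating the mutual information between model predictions and measurements. A minimal plug-in (histogram) estimator over discrete bins, with made-up count data rather than anything from the paper:

```python
# Plug-in (histogram) estimate of mutual information I(X; Y) between
# a model's predicted bin X and a measured activity bin Y, the quantity
# maximized in mutual-information-based inference of sequence-function models.
from math import log2
from collections import Counter

def mutual_information(pairs):
    """I(X; Y) in bits from a list of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly dependent pairs give I = 1 bit; independent pairs give 0.
dependent = [(0, 0)] * 50 + [(1, 1)] * 50
print(mutual_information(dependent))  # 1.0
```

Because mutual information is invariant under invertible transformations of the model's output, maximizing it leaves the diffeomorphic modes mentioned above unconstrained, which is exactly the effect described in the abstract.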

  15. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis, and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  16. Experimental Results for Titan Aerobot Thermo-Mechanical Subsystem Development

    NASA Technical Reports Server (NTRS)

    Hall, Jeffrey L.; Jones, J. A.; Kerzhanovich, V. V.; Lachenmeier, T.; Mahr, P.; Pauken, M.; Plett, G. A.; Smith, L.; VanLuvender, M. L.; Yavrouian, A. H.

    2006-01-01

    This paper describes experimental results from a development program focused on maturing Titan aerobot technology in the areas of mechanical and thermal subsystems. Results from four key activities are described: first, a cryogenic balloon materials development program involving coupon and cylinder tests and culminating in the fabrication and testing of an inflated 4.6 m long prototype blimp at 93 K; second, a combined lab experiment and numerical simulation effort to assess potential problems resulting from radioisotope thermal generator waste heat generation near an inflated blimp; third, an aerial deployment and inflation development program consisting of laboratory and helicopter drop tests on a near full scale (11 m long) prototype blimp; and fourth, a proof-of-concept experiment demonstrating the viability of using a mechanically steerable high gain antenna on a floating blimp to perform direct-to-Earth telecommunications from Titan. The paper provides details on all of these successful activities and discusses their impact on the overall effort to produce mature systems technology for future Titan aerobot missions.

  17. Defining an Analytic Framework to Evaluate Quantitative MRI Markers of Traumatic Axonal Injury: Preliminary Results in a Mouse Closed Head Injury Model

    PubMed Central

    Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.

    2017-01-01

    Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972

  18. NanoDrop Microvolume Quantitation of Nucleic Acids

    PubMed Central

    Desjardins, Philippe; Conklin, Deborah

    2010-01-01

    Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of

  19. Finite Element Analysis of Quantitative Percussion Diagnostics for Evaluating the Strength of Bonds Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott; Malcolm, Doug; Earthman, James

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was adopted based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Results indicate that this technology is capable of detecting weak ('kiss') bonds between flat composite laminates. Specifically, the local value of the probe force determined from quantitative percussion testing was predicted to be significantly lower for a laminate that contained a 'kiss' bond compared to that for a well-bonded sample, which is in agreement with experimental findings. Experimental results were compared to a finite element analysis (FEA) using MSC PATRAN/NASTRAN to understand the viscoelastic behavior of the laminates during percussion testing. The dynamic FEA models were used to directly predict changes in the probe force, as well as effective stress distributions across the bonded panels, as a function of time.

  20. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  1. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning. Primary Topic: Track 5 – Experimentation and Analysis. Walter A. Powell [STUDENT], GMU. Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet

  2. Quantitative theory of driven nonlinear brain dynamics.

    PubMed

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. PMID:25864377

  4. Preliminary experimental results from a MARS Micro-CT system.

    PubMed

    He, Peng; Yu, Hengyong; Thayer, Patrick; Jin, Xin; Xu, Qiong; Bennett, James; Tappenden, Rachael; Wei, Biao; Goldstein, Aaron; Renaud, Peter; Butler, Anthony; Butler, Phillip; Wang, Ge

    2012-01-01

    The Medipix All Resolution System (MARS) is a commercial spectral/multi-energy micro-CT scanner designed and assembled by MARS Bioimaging Ltd. in New Zealand. This system utilizes the state-of-the-art Medipix photon-counting, energy-discriminating detector technology developed by a collaboration at the European Organization for Nuclear Research (CERN). In this paper, we report our preliminary experimental results using this system, including geometrical alignment, photon energy characterization, protocol optimization, and spectral image reconstruction. We produced our scan datasets with a multi-material phantom, and then applied the ordered-subset simultaneous algebraic reconstruction technique (OS-SART) to reconstruct images in different energy ranges and principal component analysis (PCA) to evaluate spectral deviation among the energy ranges.
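    The PCA evaluation step can be illustrated on a toy two-bin case (the data and setup below are invented; the actual MARS processing works on full reconstructed volumes): the eigenvalues of the covariance between two energy-bin images separate shared structure from spectral deviation.

```python
# Minimal two-channel PCA sketch: given per-pixel values reconstructed in
# two energy bins, the leading eigenvalue of their covariance captures the
# shared structure while the trailing one reflects spectral deviation.
import math
import statistics

def pca_2d(xs, ys):
    """Return (leading, trailing) eigenvalues of the 2x2 covariance matrix."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cxx = statistics.mean([(x - mx) ** 2 for x in xs])
    cyy = statistics.mean([(y - my) ** 2 for y in ys])
    cxy = statistics.mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    # Closed-form eigenvalues of [[cxx, cxy], [cxy, cyy]].
    tr, det = cxx + cyy, cxx * cyy - cxy ** 2
    root = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + root, tr / 2 - root

bin_lo = [1.0, 2.0, 3.0, 4.0]  # hypothetical low-energy-bin values
bin_hi = [1.1, 2.1, 2.9, 4.2]  # nearly proportional high-energy-bin values
lead, trail = pca_2d(bin_lo, bin_hi)
```

When the bins are nearly proportional, almost all variance lands on the leading component, so a large trailing eigenvalue flags genuine spectral differences between energy ranges.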

  5. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
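    The MAP step at the core of such a segmentation can be sketched, ignoring the paper's inhomogeneity, partial-volume, and MRF terms, as assigning each voxel to the tissue class whose Gaussian likelihood times prior is largest. All class parameters below are invented for illustration:

```python
# Toy MAP tissue classification: pick the class (CSF, GM, WM) maximizing
# log prior + log Gaussian likelihood of the voxel intensity.
import math

CLASSES = {            # class: (mean intensity, std dev, prior)
    "CSF": (30.0, 10.0, 0.2),
    "GM":  (80.0, 12.0, 0.4),
    "WM":  (130.0, 12.0, 0.4),
}

def classify(intensity):
    def log_post(params):
        mu, sd, prior = params
        return math.log(prior) - math.log(sd) - 0.5 * ((intensity - mu) / sd) ** 2
    return max(CLASSES, key=lambda c: log_post(CLASSES[c]))

labels = [classify(v) for v in (25, 78, 128)]
```

The full model in the paper additionally smooths these per-voxel decisions with an MRF neighborhood penalty, which suppresses isolated noise-driven misclassifications.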

  6. A Quantitative Model of Early Atherosclerotic Plaques Parameterized Using In Vitro Experiments.

    PubMed

    Thon, Moritz P; Ford, Hugh Z; Gee, Michael W; Myerscough, Mary R

    2018-01-01

    There are a growing number of studies that model immunological processes in the artery wall that lead to the development of atherosclerotic plaques. However, few of these models use parameters that are obtained from experimental data even though data-driven models are vital if mathematical models are to become clinically relevant. We present the development and analysis of a quantitative mathematical model for the coupled inflammatory, lipid and macrophage dynamics in early atherosclerotic plaques. Our modeling approach is similar to the biologists' experimental approach where the bigger picture of atherosclerosis is put together from many smaller observations and findings from in vitro experiments. We first develop a series of three simpler submodels which are least-squares fitted to various in vitro experimental results from the literature. Subsequently, we use these three submodels to construct a quantitative model of the development of early atherosclerotic plaques. We perform a local sensitivity analysis of the model with respect to its parameters that identifies critical parameters and processes. Further, we present a systematic analysis of the long-term outcome of the model which produces a characterization of the stability of model plaques based on the rates of recruitment of low-density lipoproteins, high-density lipoproteins and macrophages. The analysis of the model suggests that further experimental work quantifying the different fates of macrophages as a function of cholesterol load and the balance between free cholesterol and cholesterol ester inside macrophages may give valuable insight into long-term atherosclerotic plaque outcomes. This model is an important step toward models applicable in a clinical setting.
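    The submodel-fitting step described above amounts to least-squares estimation of model parameters from in vitro data points. A minimal closed-form sketch for a linear submodel (the data are synthetic; the paper's submodels are nonlinear and fitted numerically):

```python
# Closed-form least squares for y = a*x + b, the simplest instance of the
# least-squares fitting used to parameterize the three submodels.
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Noise-free synthetic data on y = 2x + 1 should be recovered exactly.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
a, b = fit_line(xs, ys)
```

Fitting each submodel separately to its own in vitro data set, and only then coupling the submodels, mirrors the bottom-up modeling approach described in the abstract.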

  7. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = ⟨R_L·Ψ⟩ (an average over the sections), where Ψ is the profile structure factor. This method is based on classical stereological principles and has been verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the applications of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
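    The Gokhale-Underwood relation averages the product of profile roughness and the profile structure factor over the three sections taken 120 degrees apart. A minimal sketch with invented section values:

```python
# Surface roughness estimate R_S = mean(R_L * Psi) over the three vertical
# sections (120 degrees apart) of the Gokhale-Underwood method.
def surface_roughness(profiles):
    """profiles: list of (R_L, Psi) pairs, one per section."""
    return sum(rl * psi for rl, psi in profiles) / len(profiles)

# Illustrative (R_L, Psi) values for the three sections.
sections = [(1.20, 1.05), (1.35, 1.02), (1.28, 1.08)]
r_s = surface_roughness(sections)
```

Since R_L >= 1 and Psi >= 1 by definition (a flat profile gives R_L = 1), the estimate is bounded below by 1, consistent with R_S being a surface-area ratio.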

  8. Shuttle Return To Flight Experimental Results: Cavity Effects on Boundary Layer Transition

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Horvath, Thomas J.; Berry, Scott A.

    2006-01-01

    The effect of an isolated rectangular cavity on hypersonic boundary layer transition of the windward surface of the Shuttle Orbiter has been experimentally examined in the Langley Aerothermodynamics Laboratory in support of an agency-wide effort to prepare the Shuttle Orbiter for return to flight. This experimental study was initiated to provide a cavity effects database for developing hypersonic transition criteria to support on-orbit decisions to repair a damaged thermal protection system. Boundary layer transition results were obtained using 0.0075-scale Orbiter models with simulated tile damage (rectangular cavities) of varying length, width, and depth. The database contained within this report will be used to formulate cavity-induced transition correlations using predicted boundary layer edge parameters.

  9. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess these future non-scientists' attitudes toward science, as well as the basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes toward science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  10. Internal wave emission from baroclinic jets: experimental results

    NASA Astrophysics Data System (ADS)

    Borcia, Ion D.; Rodda, Costanza; Harlander, Uwe

    2016-04-01

    Large-scale balanced flows can spontaneously radiate meso-scale inertia-gravity waves (IGWs) and are thus, in fact, unbalanced. While flow-dependent parameterizations for the radiation of IGWs from orographic and convective sources do exist, the situation is less developed for spontaneously emitted IGWs. Observations identify increased IGW activity in the vicinity of jet exit regions. A direct interpretation of these observations in terms of geostrophic adjustment might be tempting. However, directly applying this concept to the parameterization of spontaneous imbalance is difficult, since the dynamics itself continuously re-establishes an unbalanced flow which then sheds the imbalance by GW radiation. Examining spontaneous IGW emission in the atmosphere and validating parameterization schemes confronts the scientist with particular challenges. Due to its extreme complexity, GW emission will always be embedded in the interaction of a multitude of interdependent processes, many of which are hardly detectable from analysis or campaign data. The benefits of repeated and more detailed measurements, while representing the only source of information about the real atmosphere, are limited by the non-repeatability of an atmospheric situation: the same event never occurs twice. This argues for complementary laboratory experiments, which can provide a more focused dialogue between experiment and theory. Indeed, life cycles are also examined in rotating-annulus laboratory experiments. Thus, these experiments might form a useful empirical benchmark for theoretical and modeling work that is also independent of any sort of subgrid model. In addition, the more direct correspondence between experimental and model data, together with the reproducibility of the data, makes lab experiments a powerful testbed for parameterizations. Here we show first results from a small rotating-annulus experiment, and we further present our new experimental facility for studying wave emission from jets and fronts.

  11. Electromagnetic Vortex-Based Radar Imaging Using a Single Receiving Antenna: Theory and Experimental Results

    PubMed Central

    Yuan, Tiezhu; Wang, Hongqiang; Cheng, Yongqiang; Qin, Yuliang

    2017-01-01

    Radar imaging based on electromagnetic vortex waves can achieve azimuth resolution without relative motion. The present paper investigates this imaging technique with a single receiving antenna through theoretical analysis and experiments. In contrast to the case of multiple receiving antennas, the echoes from a single receiver cannot be used directly for image reconstruction by the Fourier method; the reason is revealed by examining the point spread function. An additional phase, determined by the array parameters and the elevation of the targets, is therefore compensated for each mode before the imaging process. A proof-of-concept imaging system based on a circular phased array was built, and imaging experiments on corner-reflector targets were performed in an anechoic chamber. The azimuthal image is reconstructed by means of Fourier transform and spectral estimation methods, and the azimuth resolution of the two methods is analyzed and compared using the experimental data. The experimental results verify the principle of azimuth resolution and the proposed phase compensation method. PMID:28335487
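
    The mode-azimuth Fourier relationship behind this reconstruction can be illustrated with a toy numerical sketch. All parameters below are hypothetical, and the array-induced phase is reduced to a known per-mode offset: a point target at azimuth phi0 imprints a phase exp(-i*l*phi0) on OAM mode l, so once the per-mode offset is compensated, a discrete Fourier transform over the mode index focuses the target at its true azimuth.

```python
import cmath
import math

# Toy model: a point target at azimuth phi0 contributes exp(-1j*l*phi0) to
# the echo in OAM mode l, plus a mode-dependent phase offset standing in
# for the effect of array geometry and target elevation (hypothetical).
modes = range(-8, 9)                           # transmitted OAM modes
phi0 = 1.2                                     # rad, true target azimuth
offset = {l: 0.3 * l * l for l in modes}       # assumed known from the array
echo = {l: cmath.exp(complex(0.0, -l * phi0 + offset[l])) for l in modes}

# Compensate the per-mode phase, then reconstruct the azimuth profile by a
# discrete Fourier transform over the mode index.
comp = {l: echo[l] * cmath.exp(complex(0.0, -offset[l])) for l in modes}
best_phi, best_mag = 0.0, -1.0
for i in range(720):
    phi = -math.pi + 2.0 * math.pi * i / 720.0
    mag = abs(sum(comp[l] * cmath.exp(complex(0.0, l * phi)) for l in modes))
    if mag > best_mag:
        best_phi, best_mag = phi, mag
print(best_phi)
```

    Without the offset compensation the same Fourier sum smears the target over azimuth, which is the single-receiver problem the paper addresses.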

  12. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions, and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and consideration of other error sources such as nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1, the average relative expanded uncertainty is 2.2%. This quantitative infrared database is an ongoing project at NIST; additional spectra will be added as they are acquired. Our current plans include continued data acquisition for the compounds listed in the CAAA, as well as compounds that contribute to global warming and ozone depletion.
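
    The Beer's law regression described above can be sketched as follows. This is a minimal illustration with made-up numbers, not the NIST procedure itself: absorbance A = -log10(T) = alpha*c*L is fitted against concentration c for a series of standards, and the slope divided by the path length recovers the absorption coefficient alpha.

```python
import math

# Hypothetical absorption coefficient and path length for the demo.
alpha_true = 2.5e-4      # (umol/mol)^-1 m^-1
L = 10.0                 # m, optical path length
conc = [50.0 * i for i in range(1, 10)]          # nine standards, umol/mol

# Simulated transmittance spectra values via Beer's law, then absorbance.
trans = [10.0 ** (-alpha_true * c * L) for c in conc]
absorb = [-math.log10(t) for t in trans]

# Ordinary least squares slope through the nine (c, A) points.
c_mean = sum(conc) / len(conc)
a_mean = sum(absorb) / len(absorb)
slope = sum((c - c_mean) * (a - a_mean) for c, a in zip(conc, absorb)) \
        / sum((c - c_mean) ** 2 for c in conc)
alpha_est = slope / L
print(alpha_est)
```

    In practice the residuals of this regression, plus instrument effects such as nonlinear detector response, feed the expanded-uncertainty estimate the abstract quotes.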

  13. Stand-off thermal IR minefield survey: system concept and experimental results

    NASA Astrophysics Data System (ADS)

    Cremer, Frank; Nguyen, Thanh T.; Yang, Lixin; Sahli, Hichem

    2005-06-01

    A detailed description of the CLEARFAST system for thermal IR stand-off minefield survey is given. The system allows (i) stand-off diurnal observation of a hazardous area, (ii) detection of anomalies, i.e. locating and searching for targets which are thermally and spectrally distinct from their surroundings, (iii) estimation of the physical parameters, i.e. depth and thermal diffusivity, of the detected anomalies, and (iv) provision of panoramic (mosaic) images indicating the locations of suspect objects and known markers. The CLEARFAST demonstrator was successfully deployed and operated, in November 2004, in a real minefield within the United Nations Buffer Zone in Cyprus. The paper describes the main principles of the system and illustrates the processing chain on a set of real minefield images, together with qualitative and quantitative results.

  14. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and its output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively, and the simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Characterisation and optimisation of flexible transfer lines for liquid helium. Part I: Experimental results

    NASA Astrophysics Data System (ADS)

    Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.

    2016-04-01

    The transfer of liquid helium (LHe) into mobile dewars or transport vessels is a common and unavoidable process at LHe decant stations. During this transfer appreciable amounts of LHe evaporate due to heat leak and pressure drop. The helium gas generated in this way needs to be collected and reliquefied, which requires a large amount of electrical energy. Therefore, the design of the transfer lines used at LHe decant stations has been optimised to establish LHe transfer with minor evaporation losses, which increases the overall efficiency and capacity of LHe decant stations. This paper presents the experimental results achieved during the thermohydraulic optimisation of a flexible LHe transfer line. An extensive measurement campaign with a set of dedicated transfer lines equipped with pressure and temperature sensors yielded unique experimental data on this specific transfer process. The experimental results cover the heat leak, the pressure drop, the transfer rate, the outlet quality, and the cool-down and warm-up behaviour of the examined transfer lines. Based on the obtained results the design of the considered flexible transfer line has been optimised, featuring reduced heat leak and pressure drop.
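
    The link between heat leak and evaporation loss is direct: at steady state the boil-off rate is the heat leak divided by the latent heat of vaporization. A minimal sketch, using approximate property values for saturated LHe at 1 bar (not figures from the paper):

```python
# Steady-state boil-off from a transfer-line heat leak: mdot = Q / h_fg.
# Property values are approximate figures for saturated LHe at 1 bar.
h_fg = 20.7e3        # J/kg, latent heat of vaporization of helium
rho_liq = 125.0      # kg/m^3, liquid helium density

def boil_off(q_leak_w):
    """Evaporation rate (kg/s) and equivalent liquid loss (litres/hour)
    caused by a heat leak q_leak_w (W) into saturated liquid helium."""
    mdot = q_leak_w / h_fg                       # kg/s
    litres_per_hour = mdot / rho_liq * 1e3 * 3600.0
    return mdot, litres_per_hour

mdot, lph = boil_off(1.0)   # a 1 W heat leak
print(lph)
```

    The tiny latent heat of helium is why even a watt-level heat leak costs over a litre of liquid per hour, motivating the optimisation described above.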

  16. Integrating quantitative thinking into an introductory biology course improves students' mathematical reasoning in biological contexts.

    PubMed

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students' understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students' inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students' biology learning.

  17. Integrating Quantitative Thinking into an Introductory Biology Course Improves Students’ Mathematical Reasoning in Biological Contexts

    PubMed Central

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students’ apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students’ understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students’ inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students’ biology learning. PMID:24591504

  18. An Overview of NSTX Research Facility and Recent Experimental Results

    NASA Astrophysics Data System (ADS)

    Ono, Masayuki

    2006-10-01

    The 2006 NSTX experimental campaign yielded significant new experimental results in many areas. Improved plasma control achieved the highest elongation of 2.9 and a plasma shape factor q95·Ip/(a·BT) = 42 MA/(m·T). Active feedback correction of error fields sustained the plasma rotation and increased the pulse length of high-beta discharges. Active feedback stabilization of the resistive wall mode in high-beta, low-rotation plasmas was demonstrated for ~100 resistive wall times. Operation at higher toroidal field showed favorable trends in plasma confinement and HHFW heating efficiency with the field. A broader current profile, measured by the 12-channel MSE diagnostic in high-beta discharges, revealed an outward anomalous diffusivity of energetic ions due to n=1 MHD modes. A tangential microwave scattering diagnostic measured localized electron gyro-scale fluctuations in L-mode, H-mode, and reversed-shear plasmas. Evaporation of lithium onto plasma-facing surfaces yielded lower density, higher temperature, and improved confinement. A strong dependence of the divertor heat load and ELM behavior on the plasma triangularity was observed. Coaxial helicity injection produced a start-up current of 160 kA on closed flux surfaces.

  19. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978

  20. Application of an Unstructured Grid Navier-Stokes Solver to a Generic Helicopter Body: Comparison of Unstructured Grid Results with Structured Grid Results and Experimental Results

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.

    1999-01-01

    An unstructured-grid Navier-Stokes solver was used to predict the surface pressure distribution, the off-body flow field, the surface flow pattern, and integrated lift and drag coefficients on the ROBIN configuration (a generic helicopter body) without a rotor at four angles of attack. The results are compared to those predicted by two structured-grid Navier-Stokes solvers and to experimental surface pressure distributions. The surface pressure distributions from the unstructured-grid Navier-Stokes solver are in good agreement with the results from the structured-grid Navier-Stokes solvers. Agreement with the experimental pressure coefficients is good over the forward portion of the body but poor on the lower portion of its mid-section. Comparison of the predicted surface flow patterns showed similar regions of separated flow. Predicted lift and drag coefficients were in fair agreement with each other.

  1. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with the internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including the relaxation delay time, the number of scans, and the pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by the high performance liquid chromatography method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Robotic follower experimentation results: ready for FCS increment I

    NASA Astrophysics Data System (ADS)

    Jaczkowski, Jeffrey J.

    2003-09-01

    Robotics is a fundamental enabling technology required to meet the U.S. Army's vision of a strategically responsive force capable of dominance across the entire spectrum of conflict. The U.S. Army Research, Development and Engineering Command (RDECOM) Tank Automotive Research, Development & Engineering Center (TARDEC), in partnership with the U.S. Army Research Laboratory, is developing a leader-follower capability for Future Combat Systems. The Robotic Follower Advanced Technology Demonstration (ATD) utilizes a manned leader to provide high-level proofing of the follower's path, which is then traversed with minimal user intervention. This paper gives a programmatic overview and discusses both the technical approach and the operational experimentation results obtained during testing conducted at Ft. Bliss, New Mexico in February-March 2003.

  3. Epistemology and expectations survey about experimental physics: Development and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hirokawa, Takako; Finkelstein, Noah; Lewandowski, H. J.

    2014-06-01

    In response to national calls to better align physics laboratory courses with the way physicists engage in research, we have developed an epistemology and expectations survey to assess how students perceive the nature of physics experiments in the contexts of laboratory courses and the professional research laboratory. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) evaluates students' epistemology at the beginning and end of a semester. Students respond to paired questions about how they personally perceive doing experiments in laboratory courses and how they perceive an experimental physicist might respond regarding their research. Also, at the end of the semester, the E-CLASS assesses a third dimension of laboratory instruction, students' reflections on their course's expectations for earning a good grade. By basing survey statements on widely embraced learning goals and common critiques of teaching labs, the E-CLASS serves as an assessment tool for lab courses across the undergraduate curriculum and as a tool for physics education research. We present the development, evidence of validation, and initial formative assessment results from a sample that includes 45 classes at 20 institutions. We also discuss feedback from instructors and reflect on the challenges of large-scale online administration and distribution of results.

  4. Tilted wheel satellite attitude control with air-bearing table experimental results

    NASA Astrophysics Data System (ADS)

    Inumoh, Lawrence O.; Forshaw, Jason L.; Horri, Nadjim M.

    2015-12-01

    Gyroscopic actuators for satellite control have attracted significant research interest over the years, but their viability for the control of small satellites has only recently started to become clear. Research on variable-speed gyroscopic actuators has long been focused on single-gimbal actuators; double-gimbal actuators typically operate at a constant wheel spin rate and allow tilt angle ranges far larger than those needed for most satellite missions. This research examines a tilted wheel, a newly proposed type of inertial actuator that can generate torques about all three principal axes of a rigid satellite using a spinning wheel and a double tilt mechanism. The tilt mechanism tilts the angular momentum vector about two axes, providing two-degree-of-freedom control, while variation of the wheel speed provides the third. The equations of motion of the system lead to a singularity-free system during nominal operation, avoiding the need for complex steering logic. This paper describes the hardware design of the tilted wheel and the experimental setup behind both the standalone and spherical air-bearing tables used to test it. Experimental results from the air-bearing table are provided, depicting the high torque-generation performance of the proposed actuator.

  5. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine bond integrity within a composite structure and are costly to use on large and complex-shaped surfaces. To overcome current NDT limitations, a new technology based on quantitative percussion diagnostics (QPD) was utilized to better quantify bond quality in fiber-reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release-coated panel than for a well-bonded sample. Also, the local value of the probe force, or the force returned to the probe after impact, was observed to be lower for the release-coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable-cure NDT standards were compared and analyzed against lap shear strength measurements taken on mechanical test specimens. Finally, the experimental results were compared to a finite element analysis to understand the viscoelastic behavior of the laminates during percussion testing. This comparison shows how a lower-quality bond leads to a reduction in the percussion force by biasing strain toward the percussion-tested side of the panel.

  6. Circular Samples as Objects for Magnetic Resonance Imaging - Mathematical Simulation, Experimental Results

    NASA Astrophysics Data System (ADS)

    Frollo, Ivan; Krafčík, Andrej; Andris, Peter; Přibil, Jiří; Dermek, Tomáš

    2015-12-01

    Circular samples are frequent objects of in-vitro investigation using imaging methods based on magnetic resonance principles. The goal of our investigation is the imaging of thin planar layers without using the slice-selection procedure, i.e. plain 2D imaging, as well as the imaging of selected layers of samples in circular vessels, Eppendorf tubes, etc., which necessarily requires slice selection. Although standard imaging methods were used, some specific issues arise when mathematical modelling of these procedures is introduced. In the paper several mathematical models are presented and compared with real experimental results. Circular magnetic samples were placed into the homogeneous magnetic field of a low-field imager based on nuclear magnetic resonance. For experimental verification a 0.178 T ESAOTE Opera MRI imager was used.

  7. Model wall and recovery temperature effects on experimental heat transfer data analysis

    NASA Technical Reports Server (NTRS)

    Throckmorton, D. A.; Stone, D. R.

    1974-01-01

    Basic analytical procedures are used to illustrate, both qualitatively and quantitatively, the relative impact upon heat transfer data analysis of certain factors which may affect the accuracy of experimental heat transfer data. Inaccurate knowledge of adiabatic wall conditions results in a corresponding inaccuracy in the measured heat transfer coefficient. The magnitude of the resulting error is extreme for data obtained at wall temperatures approaching the adiabatic condition. High model wall temperatures and wall temperature gradients affect the level and distribution of heat transfer to an experimental model. The significance of each of these factors is examined and its impact upon heat transfer data analysis is assessed.
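
    The core sensitivity discussed above follows from the definition h = q / (Tw − Taw): a first-order uncertainty dTaw in the adiabatic wall temperature produces a relative error in h of roughly dTaw / (Tw − Taw), which diverges as the wall temperature approaches the adiabatic condition. A minimal sketch with illustrative temperatures (not data from the report):

```python
# Relative error in the measured heat transfer coefficient h = q/(Tw - Taw)
# caused by an uncertainty dt_aw in the adiabatic wall temperature.
def h_relative_error(t_wall, t_aw, dt_aw):
    """First-order relative error in h from an uncertainty dt_aw in t_aw."""
    return dt_aw / (t_wall - t_aw)

# A fixed 2 K uncertainty in Taw = 400 K: harmless at Tw = 300 K,
# catastrophic at Tw = 398 K (illustrative values).
for t_wall in (300.0, 380.0, 398.0):
    print(t_wall, h_relative_error(t_wall, 400.0, 2.0))
```

    The same expression explains why data taken with wall temperatures far from adiabatic conditions are far less sensitive to imperfect knowledge of the recovery temperature.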

  8. Quantitative biology: where modern biology meets physical sciences.

    PubMed

    Shekhar, Shashank; Zhu, Lian; Mazutis, Linas; Sgro, Allyson E; Fai, Thomas G; Podolski, Marija

    2014-11-05

    Quantitative methods and approaches have been playing an increasingly important role in cell biology in recent years. They involve making accurate measurements to test a predefined hypothesis in order to compare experimental data with predictions generated by theoretical models, an approach that has benefited physicists for decades. Building quantitative models in experimental biology not only has led to discoveries of counterintuitive phenomena but has also opened up novel research directions. To make the biological sciences more quantitative, we believe a two-pronged approach needs to be taken. First, graduate training needs to be revamped to ensure biology students are adequately trained in physical and mathematical sciences and vice versa. Second, students of both the biological and the physical sciences need to be provided adequate opportunities for hands-on engagement with the methods and approaches necessary to be able to work at the intersection of the biological and physical sciences. We present the annual Physiology Course organized at the Marine Biological Laboratory (Woods Hole, MA) as a case study for a hands-on training program that gives young scientists the opportunity not only to acquire the tools of quantitative biology but also to develop the necessary thought processes that will enable them to bridge the gap between these disciplines. © 2014 Shekhar, Zhu, Mazutis, Sgro, Fai, and Podolski. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  9. Summary of experimental heat-transfer results from the turbine hot section facility

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.; Yeh, Fredrick C.

    1993-01-01

    Experimental data from the turbine Hot Section Facility are presented and discussed. These data include full-coverage film-cooled airfoil results as well as special instrumentation results obtained at simulated real engine conditions. Local measurements of airfoil wall temperature, airfoil gas-path static-pressure distribution, and local heat-transfer coefficient distributions are presented and discussed. In addition, measured gas and coolant temperatures and pressures are presented. These data are also compared with analyses from Euler and boundary-layer codes.

  10. Quantitative Reappraisal of the Helmholtz-Guyton Resonance Theory of Frequency Tuning in the Cochlea

    PubMed Central

    Babbs, Charles F.

    2011-01-01

    To explore the fundamental biomechanics of sound frequency transduction in the cochlea, a two-dimensional analytical model of the basilar membrane was constructed from first principles. Quantitative analysis showed that axial forces along the membrane are negligible, condensing the problem to a set of ordered one-dimensional models in the radial dimension, for which all parameters can be specified from experimental data. Solutions of the radial models for asymmetrical boundary conditions produce realistic deformation patterns. The resulting second-order differential equations, based on the original concepts of Helmholtz and Guyton, and including viscoelastic restoring forces, predict a frequency map and amplitudes of deflections that are consistent with classical observations. They also predict the effects of an observation hole drilled in the surrounding bone, the effects of curvature of the cochlear spiral, as well as apparent traveling waves under a variety of experimental conditions. A quantitative rendition of the classical Helmholtz-Guyton model captures the essence of cochlear mechanics and unifies the competing resonance and traveling wave theories. PMID:22028708

  11. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    NASA Astrophysics Data System (ADS)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten human health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used here for the quantitative determination of AO, combined with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz time-domain spectroscopy (THz-TDS) in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for selecting the input interval, and improved the accuracy of the detection results. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
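
    The PLSR calibration step can be sketched with a minimal NIPALS implementation of PLS1 on synthetic spectra. This is a generic sketch, not the paper's method: the 2DCOS band-selection step is omitted, and the band shapes, concentrations, and noise level are invented for the demo.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1: returns the regression vector B such that
    y_centered ~= X_centered @ B."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = X.T @ y
        w /= np.linalg.norm(w)          # weight vector
        t = X @ w                        # scores
        tt = t @ t
        p = X.T @ t / tt                 # X loadings
        qk = y @ t / tt                  # y loading
        X = X - np.outer(t, p)           # deflate
        y = y - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

# Synthetic absorbance spectra: analyte band plus an interfering band
# (Gaussian shapes on a 0.2-1.6 THz grid; all values made up).
rng = np.random.default_rng(1)
freq = np.linspace(0.2, 1.6, 120)
band = lambda f0, w: np.exp(-((freq - f0) / w) ** 2)
c = rng.uniform(0.0, 1.0, 30)           # analyte concentration (target)
interf = rng.uniform(0.0, 1.0, 30)      # interfering species
X = np.outer(c, band(0.8, 0.1)) + np.outer(interf, band(1.2, 0.15))
X += rng.normal(0.0, 0.002, X.shape)    # measurement noise

B = pls1_fit(X, c, n_comp=2)
c_hat = (X - X.mean(axis=0)) @ B + c.mean()
rmse = np.sqrt(np.mean((c_hat - c) ** 2))
print(rmse)
```

    Two latent components suffice here because the synthetic spectra span a two-dimensional signal space; interval selection (as done via 2DCOS in the paper) matters when the informative bands are a small part of a noisy spectrum.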

  12. Stimulating Contributions to Public Goods through Information Feedback: Some Experimental Results

    PubMed Central

    Janssen, Marco A.; Lee, Allen; Sundaram, Hari

    2016-01-01

    In traditional public good experiments participants receive an endowment from the experimenter that can be invested in a public good or kept in a private account. In this paper we present an experimental environment in which participants can invest time over five days to contribute to a public good. Participants contribute to a linear public good by logging into a web application and performing virtual actions. We compared four treatments with different group sizes and different information about the (relative) performance of other groups. We find that feedback about the performance of other groups has a small positive effect once we control for various attributes of the groups. Moreover, we find a significant effect of the contributions of others in the group on the previous day on the number of points earned in the current day. Our results confirm that people participate more when other participants in their group participate more, and that they are influenced by information about the relative performance of other groups. PMID:27459070
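
    "Linear public good" means each contributed unit returns a constant marginal per-capita benefit to every group member. A minimal payoff sketch with illustrative parameter values (not those of the study):

```python
# Payoff in a linear public good game: member i keeps endowment - c_i and
# receives mpcr * (sum of all contributions), where the marginal per-capita
# return satisfies mpcr < 1 < n * mpcr, so contributing is socially optimal
# but individually costly.  Endowment and mpcr here are illustrative.
def payoffs(contributions, endowment=20.0, mpcr=0.4):
    total = sum(contributions)
    return [endowment - c + mpcr * total for c in contributions]

group = [0.0, 5.0, 10.0, 20.0]
print(payoffs(group))
```

    The full free-rider (contribution 0) always earns the most within the group, which is the tension that information feedback is meant to counteract.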

  13. FDTD-based quantitative analysis of terahertz wave detection for multilayered structures.

    PubMed

    Tu, Wanli; Zhong, Shuncong; Shen, Yaochun; Zhou, Qing; Yao, Ligang

    2014-10-01

    Experimental investigations have shown that terahertz pulsed imaging (TPI) is able to quantitatively characterize a range of multilayered media (e.g., biological tissues, pharmaceutical tablet coatings, layered polymer composites, etc.). Advanced modeling of the interaction of terahertz radiation with a multilayered medium is required to enable the wide application of terahertz technology in a number of emerging fields, including nondestructive testing. Indeed, there have already been many theoretical analyses performed on the propagation of terahertz radiation in various multilayered media. However, to date, most of these studies used 1D or 2D models, and the dispersive nature of the dielectric layers was not considered or was simplified. In the present work, the theoretical framework of using terahertz waves for the quantitative characterization of multilayered media was established. A 3D model based on the finite difference time domain (FDTD) method is proposed. A batch of pharmaceutical tablets with a single coating layer of different coating thicknesses and different refractive indices was modeled. The reflected terahertz wave from such a sample was computed using the FDTD method, assuming that the incident terahertz wave is broadband, covering a frequency range up to 3.5 THz. The simulated results for all of the pharmaceutical-coated tablets considered were found to be in good agreement with the experimental results obtained using a commercial TPI system. In addition, we studied a three-layered medium to mimic the occurrence of defects in the sample.

  14. Experimental Keratitis Due to Pseudomonas aeruginosa: Model for Evaluation of Antimicrobial Drugs

    PubMed Central

    Davis, Starkey D.; Chandler, John W.

    1975-01-01

    An improved method for experimental keratitis due to Pseudomonas aeruginosa is described. Essential features of the method are use of inbred guinea pigs, intracorneal injection of bacteria, subconjunctival injection of antibiotics, “blind” evaluation of results, and statistical analysis of data. Untreated ocular infections were most severe 5 to 7 days after infection. Sterilized bacterial suspensions caused no abnormalities on day 5. Tobramycin and polymyxin B were more active than gentamicin against two strains of Pseudomonas. This model is suitable for many types of quantitative studies on experimental keratitis. PMID:810084

  15. Design and Experimental Results for the S406 Airfoil

    DTIC Science & Technology

    2010-08-01

    [Extraction residue from the scanned report: Figures 14 and 15 compare theoretical and experimental section characteristics with transition free and transition fixed, respectively, at Reynolds numbers R = 0.50 × 10^6 and 0.70 × 10^6; the remainder is tabulated section data.]

  16. Pyrolysis process for the treatment of scrap tyres: preliminary experimental results.

    PubMed

    Galvagno, S; Casu, S; Casabianca, T; Calabrese, A; Cornacchia, G

    2002-01-01

    The aim of this work is the evaluation, on a pilot scale, of scrap tyre pyrolysis process performance and the characteristics of the products under different process parameters, such as temperature, residence time, pressure, etc. In this frame, a series of tests were carried out at varying process temperatures between 550 and 680 degrees C, other parameters being equal. Pyrolysis plant process data are collected by an acquisition system; scrap tyre samples used for the treatment, solid and liquid by-products and produced syngas were analysed through both on-line monitoring (for gas) and laboratory analyses. Results show that process temperature, in the explored range, does not seem to seriously influence the volatilisation reaction yield, at least from a quantitative point of view, while it observably influences the distribution of the volatile fraction (liquid and gas) and by-products characteristics.

  17. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

    Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of genome shared among individuals in natural populations of virtually any species, which could promise (more) accurate estimates of quantitative genetic parameters in virtually any species. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers a greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Supersonic Retropropulsion Experimental Results from the NASA Langley Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Berry, Scott A.; Rhode, Matthew N.; Edquist, Karl T.; Player, Charles J.

    2011-01-01

    A new supersonic retropropulsion experimental effort, intended to provide code validation data, was recently completed in the Langley Research Center Unitary Plan Wind Tunnel Test Section 2 over the Mach number range from 2.4 to 4.6. The experimental model was designed using insights gained from pre-test computations, which were instrumental for sizing and refining the model to minimize tunnel wall interference and internal flow separation concerns. A 5-in diameter 70-deg sphere-cone forebody with a roughly 10-in long cylindrical aftbody was the baseline configuration selected for this study. The forebody was designed to accommodate up to four 4:1 area ratio supersonic nozzles. Primary measurements for this model were a large number of surface pressures on the forebody and aftbody. Supplemental data included high-speed Schlieren video and internal pressures and temperatures. The run matrix was developed to allow for the quantification of various sources of experimental uncertainty, such as random errors due to run-to-run variations and bias errors due to flow field or model misalignments. Preliminary results and observations from the test are presented, while detailed data and uncertainty analyses are ongoing.

  19. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected ΔCt method, and the comparative cycle time (ΔΔCt) method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
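    The comparative cycle time method named above reduces to a one-line calculation when amplification efficiency is assumed to be ~100% (fold change = 2^-ΔΔCt); the Ct values below are invented for illustration:

```python
# Minimal sketch of ΔΔCt relative quantitation, assuming ~100% efficiency.
def ddct_fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression of a target gene, test sample vs. control sample."""
    dct_test = ct_target_test - ct_ref_test      # normalize to reference gene
    dct_ctrl = ct_target_ctrl - ct_ref_ctrl
    ddct = dct_test - dct_ctrl
    return 2.0 ** (-ddct)

# Example: target Ct drops by 2 cycles relative to the reference gene
print(ddct_fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```

    Efficiency-corrected variants replace the base 2.0 with the measured amplification efficiency per primer pair.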

  20. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    PubMed

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-05-05

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.

  1. Identification of ginseng root using quantitative X-ray microtomography.

    PubMed

    Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao

    2017-07-01

    The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. X-ray phase-contrast quantitative microtomography was used to investigate the microstructure of ginseng, and a phase-retrieval method was employed to process the experimental data. Four different ginseng samples, classified according to species, production area, and growth pattern, were collected and investigated. The characteristic internal microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and number of the COCCs in the different samples, obtained by quantitative analysis of the three-dimensional microstructures, show obvious differences among the four species of ginseng. This study is the first to use X-ray phase-contrast quantitative microtomography to characterize the distribution of COCCs for identifying the four types of ginseng, with regard to species authentication and age identification. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.
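    The per-cluster count and volume measurements described above can be sketched with generic connected-component labelling on a segmented volume; the toy array below stands in for a real phase-retrieved tomogram, and the cluster sizes are arbitrary:

```python
# Hedged sketch: counting and sizing bright inclusions (stand-ins for COCCs)
# in a binarized 3-D volume via connected-component labelling. Synthetic data.
import numpy as np
from scipy import ndimage

vol = np.zeros((30, 30, 30), dtype=bool)
vol[2:6, 2:6, 2:6] = True          # one 4x4x4 "crystal cluster" (64 voxels)
vol[20:23, 20:23, 20:23] = True    # one 3x3x3 cluster (27 voxels)

labels, n = ndimage.label(vol)                       # label connected regions
sizes = ndimage.sum(vol, labels, index=range(1, n + 1))  # voxels per cluster
print(n, sorted(int(s) for s in sizes))  # → 2 [27, 64]
```

    Voxel counts scale to physical volumes once multiplied by the voxel size of the reconstruction.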

  2. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  3. Quantitative investigation of red blood cell three-dimensional geometric and chemical changes in the storage lesion using digital holographic microscopy.

    PubMed

    Jaferzadeh, Keyvan; Moon, Inkyu

    2015-11-01

    Quantitative phase information obtained by digital holographic microscopy (DHM) can provide new insight into the functions and morphology of single red blood cells (RBCs). Since the functionality of a RBC is related to its three-dimensional (3-D) shape, quantitative 3-D geometric changes induced by storage time can help hematologists realize its optimal functionality period. We quantitatively investigate RBC 3-D geometric changes in the storage lesion using DHM. Our experimental results show that the substantial geometric transformation of the biconcave-shaped RBCs to the spherocyte occurs due to RBC storage lesion. This transformation leads to progressive loss of cell surface area, surface-to-volume ratio, and functionality of RBCs. Furthermore, our quantitative analysis shows that there are significant correlations between chemical and morphological properties of RBCs.

  4. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods for these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast-to-noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
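    The TEW scatter correction compared above has a compact standard form: counts in two narrow windows flanking the photopeak estimate, trapezoidally, the scatter inside the photopeak window. A sketch with invented window widths and counts (not values from this study):

```python
# Hedged illustration of the triple-energy-window (TEW) scatter estimate.
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Estimated scatter counts inside the photopeak window (trapezoid rule)."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

scatter = tew_scatter(c_lower=800.0, c_upper=200.0,   # flanking-window counts
                      w_lower=6.0, w_upper=6.0,       # flanking widths (keV)
                      w_peak=60.0)                    # photopeak width (keV)
print(round(scatter, 3))  # → 5000.0
```

    The scatter estimate is then subtracted from the photopeak counts per projection before or during reconstruction.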

  5. Music and suicidality: a quantitative review and extension.

    PubMed

    Stack, Steven; Lester, David; Rosenberg, Jonathan S

    2012-12-01

    This article provides the first quantitative review of the literature on music and suicidality. Multivariate logistic regression techniques are applied to 90 findings from 21 studies. Investigations employing ecological data on suicide completions are 19.2 times more apt than other studies to report a link between music and suicide. More recent studies and those with large samples are also more apt than their counterparts to report significant results. Further, none of the studies based on experimental research designs found a link between music and suicide ideation, prompting us to do a brief content analysis of 24 suicide songs versus 24 nonsuicide songs from the same album. Using Linguistic Inquiry and Word Count software, we found no difference in the content of the suicide songs and controls, including the percentage of sad words, negative affect, and mentions of death, thus providing an explanation for nonfindings from experimental research. In summary, ecologically based investigations (which capture at-risk persons not in typical school-based samples) and more recent investigations (which have used superior or new methodologies) tend to demonstrate a linkage between music and suicidality. Experimental research is needed with a control group of songs from an alternative genre with low suicidogenic content. © 2012 The American Association of Suicidology.

  6. Quantitative fetal fibronectin and cervical length in symptomatic women: results from a prospective blinded cohort study.

    PubMed

    Levine, Lisa D; Downes, Katheryne L; Romero, Julie A; Pappas, Hope; Elovitz, Michal A

    2018-05-15

    Our objectives were to determine whether quantitative fetal fibronectin (fFN) and cervical length (CL) screening can be used alone or in combination as prognostic tests to identify symptomatic women at the highest or lowest risk for spontaneous preterm birth (sPTB). A prospective, blinded cohort study of women presenting with a singleton gestation to our triage unit between 22-33w6d with preterm labor symptoms was performed. Women with ruptured membranes, moderate/severe bleeding, and dilation >2 cm were excluded. The primary outcome was sPTB <37 weeks. We evaluated test characteristics of quantitative fFN and CL assessment, both separately and in combination, considering traditionally reported cut-points (fFN ≥50 and CL <25), as well as cut-points above and below these measures. We found interactions between fFN >50 and CL <25 and sPTB by parity and obstetric history (p < .05) and therefore stratified results. Test characteristics are presented with positive predictive value (PPV) and negative predictive value (NPV). Five hundred eighty women were enrolled and 537 women were available for analysis. Overall sPTB rate was 11.1%. Among nulliparous women, increasing levels of fFN were associated with increasing risk of sPTB, with PPV going from 26.5% at ≥20 ng/mL to 44.4% at ≥200 ng/mL. A cut-point of 20 ng/mL had higher sensitivity (69.2%) and higher NPV (96.8%) and therefore identified a "low-risk" group. fFN was not informative for multiparous women regardless of prior obstetrical history or quantitative level chosen. For all women, a shorter CL was associated with an increased sPTB risk. Among nulliparas and multiparas without a prior sPTB, a CL <20 mm optimized test characteristics (PPV 25 and 20%, NPV 95.5, and 92.7%, respectively). For multiparas with a prior sPTB, CL <25 mm was more useful. Using fFN and CL in combination for nulliparas did not improve test characteristics over using the individual fFN (p = .74) and CL (p = .31

  7. Experimental and Theoretical Results in Output Trajectory Redesign for Flexible Structures

    NASA Technical Reports Server (NTRS)

    Dewey, J. S.; Leang, K.; Devasia, S.

    1998-01-01

    In this paper we study the optimal redesign of output trajectories for linear invertible systems. This is particularly important for tracking control of flexible structures because the input-state trajectories that achieve tracking of the required output may cause excessive vibrations in the structure. We pose and solve this problem, in the context of linear systems, as the minimization of a quadratic cost function. The theory is developed and applied to the output tracking of a flexible structure, and experimental results are presented.

  8. Cluster dynamics modeling and experimental investigation of the effect of injected interstitials

    NASA Astrophysics Data System (ADS)

    Michaut, B.; Jourdan, T.; Malaplate, J.; Renault-Laborne, A.; Sefta, F.; Décamps, B.

    2017-12-01

    The effect of injected interstitials on loop and cavity microstructures is investigated experimentally and numerically for 304L austenitic stainless steel irradiated at 450 °C with 10 MeV Fe5+ ions up to about 100 dpa. A cluster dynamics model is parametrized on experimental results obtained by transmission electron microscopy (TEM) in a region where injected interstitials can be safely neglected. It is then used to model the damage profile and study the impact of self-ion injection. Results are compared to TEM observations on cross-sections of specimens. It is shown that injected interstitials have a significant effect on cavity density and mean size, even in the sink-dominated regime. To quantitatively match the experimental data in the self-ions injected area, a variation of some parameters is necessary. We propose that the fraction of freely migrating species may vary as a function of depth. Finally, we show that simple rate theory considerations do not seem to be valid for these experimental conditions.

  9. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
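    One plausible form of such a nonlinear calibration (a hedged sketch, not the authors' exact function): if analyte isotopes contribute a fixed fraction to the internal-standard channel, the measured area ratio saturates as a·x/(1 + f·a·x), and fitting that form recovers the calibration constants:

```python
# Hedged sketch of a nonlinear calibration for analyte-to-internal-standard
# isotopic cross talk. The functional form and constants are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def response(x, a, f):
    # a: linear response factor; f: cross-talk term from analyte isotopes
    return a * x / (1.0 + f * a * x)

a_true, f_true = 0.8, 0.05
x = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # concentrations
r = response(x, a_true, f_true)                        # noise-free ratios

(a_fit, f_fit), _ = curve_fit(response, x, r, p0=[1.0, 0.01])
print(round(a_fit, 3), round(f_fit, 3))  # recovers ≈ 0.8 and 0.05
```

    In practice the cross-talk constant would be determined experimentally for each analyte/internal standard pair, as the abstract describes.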

  10. Vibration Based Crack Detection in a Rotating Disk. Part 2; Experimental Results

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Martin, Richard E.; Haase, Wayne C.; Baaklini, George

    2005-01-01

    This paper describes the experimental results concerning the detection of a crack in a rotating disk. The goal was to utilize blade tip clearance and shaft vibration measurements to monitor changes in the system's center of mass and/or blade deformation behaviors. The concept of the approach is based on the fact that the development of a disk crack results in a distorted strain field within the component. As a result, a minute deformation in the disk's geometry as well as a change in the system's center of mass occurs. Here, a notch was used to simulate an actual crack. The vibration based experimental results failed to identify the existence of a notch when utilizing the approach described above, even with a rather large, circumferential notch (1.2 in.) located approximately mid-span on the disk (disk radius = 4.63 in. with notch at r = 2.12 in.). This was somewhat expected, since the finite element based results in Part 1 of this study predicted changes in blade tip clearance as well as center of mass shifts due to a notch to be less than 0.001 in. Therefore, the small changes incurred by the notch could not be differentiated from the mechanical and electrical noise of the rotor system. Although the crack detection technique of interest failed to identify the existence of the notch, the vibration data produced and captured here will be utilized in upcoming studies that will focus on different data mining techniques concerning damage detection in a disk.

  11. Quantitative inhibition of soil C and N cycling by ectomycorrhizal fungi under field condition

    NASA Astrophysics Data System (ADS)

    Averill, C.; Hawkes, C.

    2014-12-01

    Ectomycorrhizal (ECM) ecosystems store more carbon than non-ectomycorrhizal ecosystems at global scale. Recent theoretical and empirical work suggests the presence of ECM fungi allows plants to compete directly with decomposers for soil nitrogen (N) via exo-enzyme synthesis. Experimental ECM exclusion often results in a release from competition of saprotrophic decomposers, allowing for increased C-degrading enzyme production, increased microbial biomass, and eventually declines in soil C stocks. Our knowledge of this phenomenon is limited, however, to the presence or absence of ECM fungi. It remains unknown if competitive repression of saprotrophic microbes and soil C cycling by ECM fungi varies with ECM abundance. This is particularly relevant to global change experiments when manipulations alter plant C allocation to ECM symbionts. To test if variation in ECM abundance alters the competitive inhibition of saprotrophic soil microbes (quantitative inhibition) we established experimental ECM exclusion treatments along an ECM abundance gradient. We dug trenches to experimentally exclude ECM fungi, allowing us to test for competitive release of soil saprotrophs from competition. To control for disturbance we placed in-growth bags both inside and outside of trenches. Consistent with the quantitative inhibition hypothesis, sites with more ECM fungi had significantly less microbial biomass per unit soil C and lower rates of N mineralization. Consistent with a release from competition, C-degrading enzyme activities were higher and gross proteolytic rates were lower per unit microbial biomass inside compared to outside trenches. We interpret this to reflect increased microbial investment in C-acquisition and decreased investment in N-acquisition in the absence of ECM fungi. Furthermore, the increase in C-degrading enzymes per unit microbial biomass was significantly greater in sites with the most abundant ECM fungi. 
Based on these results, ECM-saprotroph competition does

  12. Uncertainties and understanding of experimental and theoretical results regarding reactions forming heavy and superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Giardina, G.; Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; Fazio, G.

    2018-02-01

    Experimental and theoretical results for the fusion probability P_CN of reactants in the entrance channel and the survival probability W_sur against fission during deexcitation of the compound nucleus formed in heavy-ion collisions are discussed. The theoretical results for a set of nuclear reactions leading to formation of compound nuclei (CNs) with charge number Z = 102-122 reveal a strong sensitivity of P_CN to the characteristics of the colliding nuclei in the entrance channel, the dynamics of the reaction mechanism, and the excitation energy of the system. We discuss the validity of assumptions and procedures for the analysis of experimental data, as well as the limits of validity of theoretical results obtained with phenomenological models. The comparison of results obtained in many investigated reactions reveals serious limits of validity of the data analysis and calculation procedures.

  13. A Quantitative Tunneling/Desorption Model for the Exchange Current at the Porous Electrode/Beta-Alumina/Alkali Metal Gas Three-Phase Zone at 700-1300K

    NASA Technical Reports Server (NTRS)

    Williams, R. M.; Ryan, M. A.; Saipetch, C.; LeDuc, H. G.

    1996-01-01

    The exchange current observed at porous metal electrodes on sodium or potassium beta-alumina solid electrolytes in alkali metal vapor is quantitatively modeled as a multi-step process, in good agreement with experimental results.

  14. Experimental and raytrace results for throat-to-throat compound parabolic concentrators

    NASA Technical Reports Server (NTRS)

    Leviton, D. B.; Leitch, J. W.

    1986-01-01

    Compound parabolic concentrators are nonimaging cone-shaped optics with useful angular transmission characteristics. Two cones used throat-to-throat accept radiant flux within one well-defined acceptance angle and redistribute it into another. If the entrance cone is fed with Lambertian flux, the exit cone produces a beam whose half-angle is the exit cone's acceptance angle and whose cross section shows uniform irradiance from near the exit mouth to infinity. (The pair is a beam angle transformer.) The design of one pair of cones is discussed, along with an experiment to map the irradiance of the emergent beam and a raytracing program that models the cones fed by Lambertian flux. Experimental results compare favorably with raytrace results.
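    The beam-angle-transformer behavior follows from étendue conservation: for an ideal 3-D CPC the mouth diameter D and throat diameter d satisfy d = D·sin(θ), so two cones sharing a throat map acceptance angle θ1 to θ2 with D1·sin(θ1) = D2·sin(θ2). A sketch with illustrative numbers (not the dimensions of the cones in this study):

```python
# Hedged sketch of the étendue relation for an ideal 3-D CPC pair.
import math

def mouth_diameter(throat_d, half_angle_deg):
    """Mouth diameter of an ideal CPC with the given throat and acceptance angle."""
    return throat_d / math.sin(math.radians(half_angle_deg))

d = 1.0                               # shared throat diameter (arbitrary units)
D1 = mouth_diameter(d, 10.0)          # entrance cone, 10 deg acceptance
D2 = mouth_diameter(d, 30.0)          # exit cone, 30 deg acceptance

# étendue check: D*sin(theta) is the same at both mouths
print(round(D1 * math.sin(math.radians(10.0)), 6),
      round(D2 * math.sin(math.radians(30.0)), 6))  # → 1.0 1.0
```

    Feeding the narrow-angle cone with Lambertian flux thus yields a wider, uniform beam at the wide-angle cone's mouth, as the abstract describes.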

  15. LBE water interaction in sub-critical reactors: First experimental and modelling results

    NASA Astrophysics Data System (ADS)

    Ciampichetti, A.; Agostini, P.; Benamati, G.; Bandini, G.; Pellini, D.; Forgione, N.; Oriolo, F.; Ambrosini, W.

    2008-06-01

    This paper concerns the study of the phenomena involved in the interaction between LBE and pressurised water that could occur in some hypothetical accidents in accelerator driven system type reactors. The LIFUS 5 facility was designed and built at ENEA-Brasimone to reproduce this kind of interaction over a wide range of conditions. The first test of the experimental program was carried out by injecting water at 70 bar and 235 °C into a reaction vessel containing LBE at 1 bar and 350 °C. A pressurisation up to 80 bar was observed in the test section during the considered transient. The SIMMER III code was used to simulate the performed test. The calculated data agree satisfactorily with the experimental results, giving confidence in the use of this code for safety analyses of heavy liquid metal cooled reactors.

  16. Drying in porous media with gravity-stabilized fronts: experimental results.

    PubMed

    Yiotis, A G; Salin, D; Tajer, E S; Yortsos, Y C

    2012-08-01

    In a recent paper [Yiotis et al., Phys. Rev. E 85, 046308 (2012)] we developed a model for the drying of porous media in the presence of gravity. It incorporated the effects of corner film flow and internal and external mass transfer. Analytical results were derived when gravity opposes drying and hence leads to a stable percolation drying front. In this paper, we test the theory using laboratory experiments. A series of isothermal drying experiments in glass bead packings saturated with volatile hydrocarbons is conducted. The transparent glass cells containing the packing allow for the visual monitoring of the phase distribution patterns below the surface, including the formation of liquid films, as the gaseous phase invades the pore space, and for the control of the thickness of the diffusive mass boundary layer over the packing. The experimental results agree very well with theory, provided that the latter is generalized to account for the effects of corner roundness in the film region (which was neglected in the theoretical part). We demonstrate the existence of an early constant rate period (CRP), which lasts as long as the films saturate the surface of the packing, and of a subsequent falling rate period (FRP), which begins practically after the detachment of the film tips from the external surface. During the CRP, the process is controlled by diffusion within the stagnant gaseous phase in the upper part of the cells, yielding a Stefan tube problem solution. During the FRP, the process is controlled by diffusion within the packing, with a drying rate inversely proportional to the observed position of the film tips in the cell. Theoretical and experimental results compare favorably for a specific value of the roundness of the films, which is found to be constant and equal to 0.2 for various conditions, and verify the theoretical dependence on the capillary (Ca_f), Bond (Bo), and Sherwood (Sh) numbers.
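
    During the CRP the rate is set by the Stefan tube solution mentioned above: quasi-steady diffusion of vapour through a stagnant gas column of height L gives a molar flux N = (D P / (R T L)) ln[(P − p_top)/(P − p_surf)]. A sketch of that classical result (symbols and numbers are generic, not the paper's):

```python
import math

R_GAS = 8.314  # J/(mol K)

def stefan_tube_flux(D, L, p_total, p_surf, p_top, T):
    """Molar evaporation flux (mol/(m^2 s)) through a stagnant gas column:
    N = D*P/(R*T*L) * ln((P - p_top)/(P - p_surf))  (Stefan tube solution)."""
    return (D * p_total / (R_GAS * T * L)) * math.log(
        (p_total - p_top) / (p_total - p_surf))
```

    In the dilute limit (p_surf ≪ P) this reduces to the familiar Fickian flux D·p_surf/(R·T·L).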

  17. PREDICTING TOXICOLOGICAL ENDPOINTS OF CHEMICALS USING QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIPS (QSARS)

    EPA Science Inventory

    Quantitative structure-activity relationships (QSARs) are being developed to predict the toxicological endpoints for untested chemicals similar in structure to chemicals that have known experimental toxicological data. Based on a very large number of predetermined descriptors, a...

  18. Quantitative photoacoustic assessment of red blood cell aggregation under pulsatile blood flow: experimental and theoretical approaches

    NASA Astrophysics Data System (ADS)

    Bok, Tae-Hoon; Hysi, Eno; Kolios, Michael C.

    2017-03-01

    In the present paper, the optical wavelength dependence of the photoacoustic (PA) assessment of pulsatile blood flow was investigated by means of experimental and theoretical approaches analyzing PA radiofrequency spectral parameters such as the spectral slope (SS) and mid-band fit (MBF). For the experimental approach, the pulsatile flow of human whole blood at 60 bpm was imaged using the VevoLAZR system (40-MHz linear-array probe, 700-900 nm illumination). For the theoretical approach, a Monte Carlo simulation of light transport into a layered tissue phantom and a Green's function based method for the PA wave generation were implemented for illumination wavelengths of 700, 750, 800, 850 and 900 nm. The SS and MBF for the experimental results were compared to theoretical ones as a function of the illumination wavelength. The MBF increased with the optical wavelength in both theory and experiment. This was expected because the MBF is representative of the PA magnitude, and the PA signal from red blood cells (RBCs) depends on the molar extinction coefficient of oxyhemoglobin. On the other hand, the SS decreased with the wavelength, even though the RBC size (the absorber size, which is related to the SS) cannot depend on the illumination wavelength. This conflicting result can be interpreted by means of the changes in the fluence pattern for different illumination wavelengths. The SS decrease with increasing illumination wavelength should be further investigated.
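
    The spectral parameters used above are conventionally obtained by linear regression of the calibrated power spectrum (in dB) over the analysis bandwidth: the slope of the fit is the SS, and the fit evaluated at the band centre is the MBF. A minimal sketch (the band limits are illustrative, not those of the VevoLAZR analysis):

```python
import numpy as np

def spectral_slope_mbf(freq_mhz, power_db, band=(10.0, 35.0)):
    """Linear fit of the power spectrum (dB) vs frequency over the analysis band.
    Returns (spectral slope in dB/MHz, mid-band fit in dB)."""
    lo, hi = band
    mask = (freq_mhz >= lo) & (freq_mhz <= hi)
    slope, intercept = np.polyfit(freq_mhz[mask], power_db[mask], 1)
    mbf = slope * 0.5 * (lo + hi) + intercept  # fit value at band centre
    return slope, mbf
```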

  19. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  20. Quantitative imaging biomarker ontology (QIBO) for knowledge representation of biomedical imaging biomarkers.

    PubMed

    Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David

    2013-08-01

    A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.

  1. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation
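
    The Bland-Altman limits of agreement used above are straightforward to compute from paired density estimates: the bias is the mean of the differences and the limits are bias ± 1.96 SD. A generic sketch (not the trial's validated software):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements a and b."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```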

  2. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006

    PubMed Central

    Chen, Lin; Ray, Shonket; Keller, Brad M.; Pertuz, Said; McDonald, Elizabeth S.; Conant, Emily F.

    2016-01-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88–0.95; weighted κ = 0.83–0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76–0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density

  3. Experimental Results From the Thermal Energy Storage-1 (TES-1) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Jacqmin, David

    1995-01-01

    The Thermal Energy Storage (TES) experiments are designed to provide data to help researchers understand the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data, which have never been obtained before, have direct application to space-based solar dynamic power systems. These power systems will store solar energy in a thermal energy storage salt, such as lithium fluoride (LiF) or a eutectic of lithium fluoride/calcium difluoride (LiF-CaF2) (which melts at a lower temperature). The energy will be stored as the latent heat of fusion when the salt is melted by absorbing solar thermal energy. The stored energy will then be extracted during the shade portion of the orbit, enabling the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed to predict the performance of a space-based solar dynamic power system. However, the analytical predictions must be verified experimentally before the analytical results can be used for future space power design applications. Four TES flight experiments will be used to obtain the needed experimental data. This article focuses on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code.

  4. Beyond the group mind: a quantitative review of the interindividual-intergroup discontinuity effect.

    PubMed

    Wildschut, Tim; Pinter, Brad; Vevea, Jack L; Insko, Chester A; Schopler, John

    2003-09-01

    This quantitative review of 130 comparisons of interindividual and intergroup interactions in the context of mixed-motive situations reveals that intergroup interactions are generally more competitive than interindividual interactions. The authors identify 4 moderators of this interindividual-intergroup discontinuity effect, each based on the theoretical perspective that the discontinuity effect flows from greater fear and greed in intergroup relative to interindividual interactions. Results reveal that each moderator shares a unique association with the magnitude of the discontinuity effect. The discontinuity effect is larger when (a) participants interact with an opponent whose behavior is unconstrained by the experimenter or constrained by the experimenter to be cooperative rather than constrained by the experimenter to be reciprocal, (b) group members make a group decision rather than individual decisions, (c) unconstrained communication between participants is present rather than absent, and (d) conflict of interest is severe rather than mild.

  5. Out-of-plane buckling of pantographic fabrics in displacement-controlled shear tests: experimental results and model validation

    NASA Astrophysics Data System (ADS)

    Barchiesi, Emilio; Ganzosch, Gregor; Liebold, Christian; Placidi, Luca; Grygoruk, Roman; Müller, Wolfgang H.

    2018-01-01

    Due to the latest advancements in 3D printing technology and rapid prototyping techniques, the production of materials with complex geometries has become more affordable than ever. Pantographic structures, because of their attractive features, both in dynamics and statics and both in elastic and inelastic deformation regimes, deserve to be thoroughly investigated with experimental and theoretical tools. Herein, experimental results relative to displacement-controlled large deformation shear loading tests of pantographic structures are reported. In particular, five differently sized samples are analyzed up to first rupture. Results show that the deformation behavior is strongly nonlinear, and the structures are capable of undergoing large elastic deformations without reaching complete failure. Finally, a cutting edge model is validated by means of these experimental results.

  6. Experimental and Theoretical Results in Output-Trajectory Redesign for Flexible Structures

    NASA Technical Reports Server (NTRS)

    Dewey, J. S.; Devasia, Santosh

    1996-01-01

    In this paper we study the optimal redesign of output trajectories for linear invertible systems. This is particularly important for tracking control of flexible structures because the input-state trajectories that achieve the required output may cause excessive vibrations in the structure. A trade-off is then required between tracking and vibration reduction. We pose and solve this problem as the minimization of a quadratic cost function. The theory is developed and applied to the output tracking of a flexible structure, and experimental results are presented.

  7. Transient segregation behavior in Cd1-xZnxTe with low Zn content-A qualitative and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Neubert, M.; Jurisch, M.

    2015-06-01

    The paper analyzes experimental compositional profiles, found in the literature, of Vertical Bridgman (VB, VGF) grown (Cd,Zn)Te crystals. The origin of the observed axial ZnTe-distribution profiles is attributed to dendritic growth after initial nucleation from supercooled melts. The analysis was done by utilizing a boundary layer model, which provides a very good approximation of the experimental data. Besides the discussion of the qualitative results, a quantitative analysis of the fitted model parameters is also presented, as far as the utilized model permits.
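
    Boundary layer models of segregation of the kind used here commonly reduce to the Burton-Prim-Slichter expression for the effective distribution coefficient, k_eff = k0 / (k0 + (1 − k0) e^(−Vδ/D)); whether this is the exact form fitted in the paper is an assumption. A sketch:

```python
import math

def k_eff(k0, V, delta, D):
    """Burton-Prim-Slichter effective segregation coefficient.
    k0: equilibrium coefficient, V: growth rate, delta: boundary layer
    thickness, D: solute diffusivity in the melt."""
    return k0 / (k0 + (1.0 - k0) * math.exp(-V * delta / D))
```

    The two limits are instructive: k_eff → k0 for slow growth (complete mixing) and k_eff → 1 for fast growth (no mixing), which brackets the transient segregation behaviour.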

  8. Swinging Atwood Machine: Experimental and numerical results, and a theoretical study

    NASA Astrophysics Data System (ADS)

    Pujol, O.; Pérez, J. P.; Ramis, J. P.; Simó, C.; Simon, S.; Weil, J. A.

    2010-06-01

    A Swinging Atwood Machine (SAM) is built and some experimental results concerning its dynamic behaviour are presented. Experiments clearly show that pulleys play a role in the motion of the pendulum, since they can rotate and have non-negligible radii and masses. Equations of motion must therefore take into account the moment of inertia of the pulleys, as well as the winding of the rope around them. Their influence is compared to previous studies. A preliminary discussion of the role of dissipation is included. The theoretical behaviour of the system with pulleys is illustrated numerically, and the relevance of different parameters is highlighted. Finally, the integrability of the dynamic system is studied, the main result being that the machine with pulleys is non-integrable. The status of the results on integrability of the pulley-less machine is also recalled.
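
    For the idealized pulley-less machine recalled at the end of the abstract (point masses, massless pulleys of negligible radius), the equations of motion in the polar coordinates (r, θ) of the swinging mass m with counterweight M are (M+m) r̈ = m r θ̇² + g (m cos θ − M) and θ̈ = −(2 ṙ θ̇ + g sin θ)/r. A minimal RK4 integration sketch with the conserved energy as a sanity check (this is the textbook idealization, not the authors' model with pulley inertia):

```python
import numpy as np

G = 9.81  # m/s^2

def sam_rhs(state, M=3.0, m=1.0):
    """Right-hand side for the ideal swinging Atwood machine.
    state = [r, rdot, theta, thetadot]."""
    r, rd, th, thd = state
    rdd = (m * r * thd**2 + G * (m * np.cos(th) - M)) / (M + m)
    thdd = -(2.0 * rd * thd + G * np.sin(th)) / r
    return np.array([rd, rdd, thd, thdd])

def rk4_step(state, dt):
    """One classical Runge-Kutta 4 step."""
    k1 = sam_rhs(state)
    k2 = sam_rhs(state + 0.5 * dt * k1)
    k3 = sam_rhs(state + 0.5 * dt * k2)
    k4 = sam_rhs(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(state, M=3.0, m=1.0):
    """Total mechanical energy, conserved for the ideal machine."""
    r, rd, th, thd = state
    return (0.5 * (M + m) * rd**2 + 0.5 * m * (r * thd)**2
            + G * r * (M - m * np.cos(th)))
```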

  9. Using a PC and external media to quantitatively investigate electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Bonanno, A.; Bozzo, G.; Camarca, M.; Sapia, P.

    2011-07-01

    In this article we describe an experimental learning path about electromagnetic induction which uses an Atwood machine where one of the two hanging bodies is a cylindrical magnet falling through a plexiglass guide, surrounded either by a coil or by a copper pipe. The first configuration (magnet falling across a coil) allows students to quantitatively study the Faraday-Neumann-Lenz law, while the second configuration (falling through a copper pipe) permits learners to investigate the complex phenomena of induction by quantifying the amount of electric power dissipated through the pipe as a result of Foucault eddy currents, when the magnet travels through the pipe. The magnet's fall acceleration can be set by adjusting the counterweight of the Atwood machine so that both the kinematic quantities associated with it and the electromotive force induced within the coil are continuously and quantitatively monitored (respectively, by a common personal computer (PC) equipped with a webcam and by freely available software that makes it possible to use the audio card to convert the PC into an oscilloscope). Measurements carried out when the various experimental parameters are changed provide a useful framework for a thorough understanding and clarification of the conceptual nodes related to electromagnetic induction. The proposed learning path is under evaluation in various high schools participating in the project 'Lauree Scientifiche' promoted by the Italian Department of Education.
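
    The coil configuration measures the Faraday-Neumann-Lenz law directly, emf = −N dΦ/dt. Given flux samples reconstructed from the magnet's tracked position, the induced emf can be estimated by numerical differentiation; a generic sketch (the actual acquisition chain is the webcam and audio-card oscilloscope described above):

```python
import numpy as np

def induced_emf(t, flux, turns=1):
    """emf(t) = -N * dPhi/dt, using second-order finite differences."""
    return -turns * np.gradient(flux, t)
```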

  10. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches in evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  11. Shuttle Return To Flight Experimental Results: Protuberance Effects on Boundary Layer Transition

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Berry, Scott A.; Horvath, Thomas J.

    2006-01-01

    The effect of isolated roughness elements on the windward boundary layer of the Shuttle Orbiter has been experimentally examined in the Langley Aerothermodynamic Laboratory in support of an agency-wide effort to prepare the Shuttle Orbiter for return to flight. This experimental effort was initiated to provide a roughness effects database for developing transition criteria to support on-orbit decisions to repair damage to the thermal protection system. Boundary layer transition results were obtained using trips of varying heights and locations along the centerline and attachment lines of 0.0075-scale models. Global heat transfer images using phosphor thermography of the Orbiter windward surface and the corresponding heating distributions were used to infer the state of the boundary layer (laminar, transitional, or turbulent). The database contained within this report will be used to formulate protuberance-induced transition correlations using predicted boundary layer edge parameters.

  12. Experimental BCAS Performance Results

    DOT National Transportation Integrated Search

    1978-07-01

    The results of the (Litchford) Beacon-based Collision Avoidance System concept feasibility evaluation are reported. Included are a description of the concept, analysis and flight test results. The system concept is based on the range and bearing meas...

  13. Experimental and simulation results on multipacting in a 112 MHz QWR injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xin, T.; Ben-Zvi, I.; Belomestnykh, S.

    2015-05-03

    The first RF commissioning of the 112 MHz QWR superconducting electron gun was done in late 2014. The coaxial Fundamental Power Coupler (FPC) and Cathode Stalk (stalk) were installed and tested for the first time. During this experiment, we observed several multipacting barriers at different gun voltage levels. The simulation work was done within the same range. The comparison between the experimental observations and the simulation results is presented in this paper. The observations during the test are consistent with the simulation predictions. We were able to overcome most of the multipacting barriers and reach 1.8 MV gun voltage under pulsed mode after several rounds of conditioning.

  14. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    There are many factors influencing the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectral lines follow approximately the same trend as the temperature changes; signal-to-background ratio (S/B) measurement combined with regression analysis can therefore compensate for spectral line intensity changes caused by system parameters such as laser power and receiving spectral efficiency. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method could improve the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. The data fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectra, etc., and provides a data processing reference for real-time online quantitative LIBS analysis.
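
    In its simplest linear form, the S/B compensation reduces to calibrating concentration against the signal-to-background ratio instead of the raw line intensity; a minimal sketch with the abstract's error metric (the paper itself uses SVM regression, for which scikit-learn's SVR would be a natural drop-in):

```python
import numpy as np

def fit_sbr_calibration(sbr, conc):
    """Least-squares linear calibration: concentration ~ a * (S/B) + b."""
    a, b = np.polyfit(np.asarray(sbr, dtype=float),
                      np.asarray(conc, dtype=float), 1)
    return lambda x: a * np.asarray(x, dtype=float) + b

def average_relative_error(pred, true):
    """Mean |pred - true| / |true|, the figure of merit quoted above."""
    pred, true = np.asarray(pred, dtype=float), np.asarray(true, dtype=float)
    return float(np.mean(np.abs(pred - true) / np.abs(true)))
```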

  15. Blanking and piercing theory, applications and recent experimental results

    NASA Astrophysics Data System (ADS)

    Zaid, Adnan l. O.

    2014-06-01

    Blanking and piercing are manufacturing processes by which certain geometrical shapes are sheared off a sheet metal. If the sheared-off part is the one required, the process is referred to as blanking; if the remaining part of the sheet is the one required, the process is referred to as piercing. In this paper, the theory and practice of these processes are reviewed and discussed. The main parameters affecting these processes are presented and discussed, including the radial clearance percentage and punch and die geometrical parameters, for example punch and die profile radii. The effects of these parameters on the force and energy required to effect blanking, together with their effect on the quality of the products, are also presented and discussed. Recent experimental results together with photomacrographs and photomicrographs are included and discussed. Finally, the effect of punch and die wear on the quality of the blanks is also given and discussed.

  16. Recent experimental results of KSTAR RF heating and current drive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, S. J., E-mail: sjwang@nfri.re.kr; Kim, J.; Jeong, J. H.

    2015-12-10

    The overview of KSTAR activities on ICRH, LHCD and ECH/CD including the last experimental results and future plan aiming for long-pulse high-beta plasma will be presented. Recently we achieved reasonable coupling of ICRF power to H-mode plasma through several efforts to increase system reliability. Power balance will be discussed on this experiment. LHCD is still struggling in the low power regime. Review of antenna spectrum for the higher coupling in H-mode plasma will be tried. ECH/CD provides 41 sec, 0.8 MW of heating power to support high-performance long-pulse discharge. Also, 170 GHz ECH system is integrated with the Plasma Control System (PCS) for the feedback controlling of NTM. Status and plan of ECH/CD will be discussed. Finally, helicon current drive is being prepared for the next stage of KSTAR operation. The hardware preparation and the calculation results of helicon current drive in KSTAR plasma will be discussed.

  17. Recent experimental results of KSTAR RF heating and current drive

    NASA Astrophysics Data System (ADS)

    Wang, S. J.; Kim, J.; Jeong, J. H.; Kim, H. J.; Joung, M.; Bae, Y. S.; Kwak, J. G.

    2015-12-01

    The overview of KSTAR activities on ICRH, LHCD and ECH/CD including the last experimental results and future plan aiming for long-pulse high-beta plasma will be presented. Recently we achieved reasonable coupling of ICRF power to H-mode plasma through several efforts to increase system reliability. Power balance will be discussed on this experiment. LHCD is still struggling in the low power regime. Review of antenna spectrum for the higher coupling in H-mode plasma will be tried. ECH/CD provides 41 sec, 0.8 MW of heating power to support high-performance long-pulse discharge. Also, 170 GHz ECH system is integrated with the Plasma Control System (PCS) for the feedback controlling of NTM. Status and plan of ECH/CD will be discussed. Finally, helicon current drive is being prepared for the next stage of KSTAR operation. The hardware preparation and the calculation results of helicon current drive in KSTAR plasma will be discussed.

  18. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  19. Experimental results on atomic oxygen corrosion of silver

    NASA Technical Reports Server (NTRS)

    Fromhold, Albert T.

    1988-01-01

    The results of an experimental study of the reaction kinetics of silver with atomic oxygen, in 10 degree increments over the temperature range of 0 to 70 °C, are reported. The silver specimens, of the order of 10,000 Å in thickness, were prepared by thermal evaporation onto 3 inch diameter polished silicon wafers. These were later sliced into pieces having surface areas of the order of 1/4 to 1/2 square inch. Atomic oxygen was generated by a gas discharge in a commercial Plasmod asher operating in the megahertz frequency range. The sample temperature within the chamber was controlled by means of a thermoelectric unit. Exposure of the silver specimens to atomic oxygen was incremental, with oxide film thickness measurements being carried out between exposures by means of an automated ellipsometer. For the early growth phase, the data can be described satisfactorily by a logarithmic growth law: the oxide film thickness increases as the logarithm of the exposure time. Furthermore, the oxidation process is thermally activated, the rate increasing with increasing temperature. However, the empirical activation energy parameter deduced from Arrhenius plots is quite low, being of the order of 0.1 eV.
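
    The quoted ~0.1 eV comes from an Arrhenius analysis: with rate r = A exp(−Ea/(kB T)), rates measured at two absolute temperatures fix Ea. A sketch of that standard extraction (the numbers in the test are illustrative, not the paper's data):

```python
import math

KB_EV = 8.617333e-5  # Boltzmann constant in eV/K

def activation_energy_ev(r1, T1, r2, T2):
    """Ea in eV from rates r1, r2 at absolute temperatures T1, T2 (K),
    assuming an Arrhenius law r = A * exp(-Ea / (kB * T))."""
    return KB_EV * math.log(r1 / r2) / (1.0 / T2 - 1.0 / T1)
```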

  20. Shuttle Upper Atmosphere Mass Spectrometer Experimental Flight Results

    NASA Technical Reports Server (NTRS)

    Blanchard, R. C.; Ozoroski, Thomas A.; Nicholson, John Y.

    1994-01-01

Calibrated pressure measurements for species with mass-to-charge ratios up to 50 amu/e(-) were obtained from the shuttle upper atmosphere mass spectrometer experiment during re-entry on the STS-35 mission. The principal experimental objective was to obtain measurements of freestream density in the hypersonic rarefied-flow flight regime. Data were collected from 180 to about 87 km. However, data above 115 km were contaminated by a source of gas emanating from pressure transducers connected in parallel to the mass spectrometer. At lower altitudes, the pressure transducer data agree excellently with the mass spectrometer total pressure. Near the orifice entrance, a significant amount of CO2 was generated by chemical reactions. The freestream density in the rarefied-flow flight regime is calculated using an orifice pressure coefficient model based upon direct simulation Monte Carlo results. This density, when compared with the 1976 U.S. Standard Atmosphere model, exhibits the wavelike nature seen on previous flights using accelerometry. Selected spectra are presented at higher altitudes (320 km) showing the effects of the ingestion of gases from a forward fuselage fuel dump.

  1. Artificial cochlea and acoustic black hole travelling waves observation: Model and experimental results

    NASA Astrophysics Data System (ADS)

    Foucaud, Simon; Michon, Guilhem; Gourinat, Yves; Pelat, Adrien; Gautier, François

    2014-07-01

An inhomogeneous fluid-structure waveguide reproducing the passive behaviour of the inner ear is modelled with the help of the Wentzel-Kramers-Brillouin method. A physical setup is designed and built, and the experimental results show good correlation with the theoretical ones. The experimental setup is a varying-width plate immersed in fluid and terminated with an acoustic black hole. The varying-width plate provides a spatial repartition of the vibration depending on the excitation frequency. The acoustic black hole is made by decreasing the plate's thickness with a quadratic profile and by covering this region with a thin film of viscoelastic material. Such a termination attenuates flexural wave reflection at the end of the waveguide, turning standing waves into travelling waves.

  2. EASE (Experimental Assembly of Structures in EVA) overview of selected results

    NASA Technical Reports Server (NTRS)

    Akin, David L.

    1987-01-01

Experimental Assembly of Structures in EVA (EASE) objectives, experimental protocol, neutral buoyancy simulation, task time distribution, assembly task performance, and metabolic rate/biomedical readouts are summarized. The presentation consists of charts, figures, and graphs.

  3. Simulation and the Development of Clinical Judgment: A Quantitative Study

    ERIC Educational Resources Information Center

    Holland, Susan

    2015-01-01

    The purpose of this quantitative pretest posttest quasi-experimental research study was to explore the effect of the NESD on clinical judgment in associate degree nursing students and compare the differences between groups when the Nursing Education Simulation Design (NESD) guided simulation in order to identify educational strategies promoting…

  4. Quantitative systems toxicology

    PubMed Central

    Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.

    2017-01-01

The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology (QSP/QST) modeling endeavors have been initiated amongst numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell- and organ-specific QST models are likely to become an essential component of modern toxicity testing, and provide a solid foundation for determining individualized therapeutic windows to improve patient safety. PMID:29308440

  5. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and free Gibbs energies of a series of m-independent CID fragmentation processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and by static and dynamic ab initio methods. The paper also quantitatively examines the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes; experiments conducted within this framework show excellent correspondence with theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the "GP" continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is also presented. The study contributes to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses, and demonstrates the prospects of the complementary application of experimental CID-MS and computational quantum chemistry in studying chemical reactivity. To a considerable extent, this work underlines the place of computational quantum chemistry in experimental analytical chemistry, in particular for structural analysis.

  6. Evaluation of Quantitative Literacy Series: Exploring Data and Exploring Probability. Program Report 87-5.

    ERIC Educational Resources Information Center

    Day, Roger P.; And Others

    A quasi-experimental design with two experimental groups and one control group was used to evaluate the use of two books in the Quantitative Literacy Series, "Exploring Data" and "Exploring Probability." Group X teachers were those who had attended a workshop on the use of the materials and were using the materials during the…

  7. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  8. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km2), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated with different inventories, classifiers and predictors appeared different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. 
However, the highest predictive performances were obtained for
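The contrast between single random holdout and spatial cross-validation can be sketched with scikit-learn on synthetic data; GroupKFold over invented block labels stands in for true spatial partitioning, and none of this is the authors' code:

```python
# Sketch, not the authors' code: contrast a single random-holdout AUROC with a
# grouped ("spatial") cross-validation on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)
blocks = np.repeat(np.arange(10), 50)        # 10 contiguous "spatial" blocks

# Single random holdout
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
auc_holdout = roc_auc_score(
    yte, LogisticRegression().fit(Xtr, ytr).predict_proba(Xte)[:, 1])

# Leave-blocks-out CV: test folds never share a block with the training data
aucs = []
for tr, te in GroupKFold(n_splits=5).split(X, y, groups=blocks):
    model = LogisticRegression().fit(X[tr], y[tr])
    aucs.append(roc_auc_score(y[te], model.predict_proba(X[te])[:, 1]))
print(auc_holdout, sum(aucs) / len(aucs))
```

With spatially autocorrelated predictors, the grouped estimate is typically lower than the random-holdout one, which is the kind of discrepancy the study exploits.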

  9. WE-FG-207B-12: Quantitative Evaluation of a Spectral CT Scanner in a Phantom Study: Results of Spectral Reconstructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, X; Arbique, G; Guild, J

Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was only performed on iodine rods using the images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50∼150 keV) were close to theoretical values, with an average difference of 2.4±6.9 HU. Compared with theoretical values, the average differences for iodine concentration, VNC CT number and effective Z of the iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1) and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present and their impact on patient images needs further investigation. YY is an employee of

  10. Recent advances in continuum plasticity: phenomenological modeling and experimentation using X-ray diffraction

    NASA Astrophysics Data System (ADS)

    Edmiston, John Kearney

    This work explores the field of continuum plasticity from two fronts. On the theory side, we establish a complete specification of a phenomenological theory of plasticity for single crystals. The model serves as an alternative to the popular crystal plasticity formulation. Such a model has been previously proposed in the literature; the new contribution made here is the constitutive framework and resulting simulations. We calibrate the model to available data and use a simple numerical method to explore resulting predictions in plane strain boundary value problems. Results show promise for further investigation of the plasticity model. Conveniently, this theory comes with a corresponding experimental tool in X-ray diffraction. Recent advances in hardware technology at synchrotron sources have led to an increased use of the technique for studies of plasticity in the bulk of materials. The method has been successful in qualitative observations of material behavior, but its use in quantitative studies seeking to extract material properties is open for investigation. Therefore in the second component of the thesis several contributions are made to synchrotron X-ray diffraction experiments, in terms of method development as well as the quantitative reporting of constitutive parameters. In the area of method development, analytical tools are developed to determine the available precision of this type of experiment—a crucial aspect to determine if the method is to be used for quantitative studies. We also extract kinematic information relating to intragranular inhomogeneity which is not accessible with traditional methods of data analysis. In the area of constitutive parameter identification, we use the method to extract parameters corresponding to the proposed formulation of plasticity for a titanium alloy (HCP) which is continuously sampled by X-ray diffraction during uniaxial extension. 
These results and the lessons learned from the efforts constitute early reporting

  11. Design and Experimental Results for a Natural-Laminar-Flow Airfoil for General Aviation Applications

    NASA Technical Reports Server (NTRS)

    Somers, D. M.

    1981-01-01

    A natural-laminar-flow airfoil for general aviation applications, the NLF(1)-0416, was designed and analyzed theoretically and verified experimentally in the Langley Low-Turbulence Pressure Tunnel. The basic objective of combining the high maximum lift of the NASA low-speed airfoils with the low cruise drag of the NACA 6-series airfoils was achieved. The safety requirement that the maximum lift coefficient not be significantly affected with transition fixed near the leading edge was also met. Comparisons of the theoretical and experimental results show excellent agreement. Comparisons with other airfoils, both laminar flow and turbulent flow, confirm the achievement of the basic objective.

  12. Design and Experimental Results for the S825 Airfoil; Period of Performance: 1998-1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somers, D. M.

    2005-01-01

A 17%-thick, natural-laminar-flow airfoil, the S825, for the 75% blade radial station of 20- to 40-meter, variable-speed and variable-pitch (toward feather), horizontal-axis wind turbines has been designed and analyzed theoretically and verified experimentally in the NASA Langley Low-Turbulence Pressure Tunnel. The two primary objectives, high maximum lift relatively insensitive to roughness and low profile drag, have been achieved. The airfoil exhibits a rapid, trailing-edge stall, which does not meet the design goal of a docile stall. The constraints on the pitching moment and the airfoil thickness have been satisfied. Comparisons of the theoretical and experimental results generally show good agreement.

  13. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm3, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
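Odds ratios of the kind quoted above can be illustrated with a minimal sketch; the 2x2-table route below uses invented counts (not ROMICAT II data) and the standard Woolf logit interval:

```python
# Minimal sketch of a univariable odds ratio with a 95% CI from a 2x2 table
# (counts are invented, not ROMICAT II data), the tabular analogue of the
# logistic-regression odds ratios reported above.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: feature present/absent among high-risk segments,
    c/d: the same among the remaining segments; Woolf logit interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

print(odds_ratio_ci(120, 271, 80, 417))
```

A continuous predictor handled by logistic regression generalizes this: exponentiating the fitted coefficient (scaled by the reporting unit, e.g. per 1 mm3) yields the per-unit odds ratio.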

  14. Acoustic analysis in Mudejar-Gothic churches: Experimental results

    NASA Astrophysics Data System (ADS)

    Galindo, Miguel; Zamarreño, Teófilo; Girón, Sara

    2005-05-01

This paper describes the preliminary results of research work in acoustics, conducted in a set of 12 Mudejar-Gothic churches in the city of Seville in the south of Spain. Despite common architectural style, the churches feature individual characteristics and have volumes ranging from 3947 to 10 708 m3. Acoustic parameters were measured in unoccupied churches according to the ISO-3382 standard. An extensive experimental study was carried out using impulse response analysis through a maximum length sequence measurement system in each church. It covered aspects such as reverberation (reverberation times, early decay times), distribution of sound levels (sound strength); early to late sound energy parameters derived from the impulse responses (center time, clarity for speech, clarity, definition, lateral energy fraction), and speech intelligibility (rapid speech transmission index), which all take both spectral and spatial distribution into account. Background noise was also measured to obtain the NR indices. The study describes the acoustic field inside each temple and establishes a discussion for each one of the acoustic descriptors mentioned by using the theoretical models available and the principles of architectural acoustics. Analysis of the quality of the spaces for music and speech is carried out according to the most widespread criteria for auditoria.
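Reverberation parameters such as those listed above are conventionally derived from measured impulse responses by Schroeder backward integration; the sketch below uses a synthetic exponentially decaying response, not the church data:

```python
# Illustrative sketch (synthetic impulse response, not measured data):
# estimate reverberation time via Schroeder backward integration, the
# standard route from an impulse response to ISO 3382 decay parameters.
import numpy as np

fs = 8000
t = np.arange(0, 2.0, 1 / fs)
rt_true = 1.2                                    # assumed reverberation time, s
h = np.random.default_rng(1).normal(size=t.size) * 10 ** (-3 * t / rt_true)

edc = np.cumsum(h[::-1] ** 2)[::-1]              # Schroeder energy decay curve
edc_db = 10 * np.log10(edc / edc[0])

# T30-style estimate: fit the -5 to -35 dB span, extrapolate to -60 dB
i5, i35 = np.argmax(edc_db <= -5), np.argmax(edc_db <= -35)
slope = np.polyfit(t[i5:i35], edc_db[i5:i35], 1)[0]   # dB per second
rt60 = -60 / slope
print(round(rt60, 2))                            # close to rt_true
```

Early decay time (EDT) comes from the same curve by fitting the 0 to -10 dB span instead.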

  15. Acoustic analysis in Mudejar-Gothic churches: experimental results.

    PubMed

    Galindo, Miguel; Zamarreño, Teófilo; Girón, Sara

    2005-05-01

    This paper describes the preliminary results of research work in acoustics, conducted in a set of 12 Mudejar-Gothic churches in the city of Seville in the south of Spain. Despite common architectural style, the churches feature individual characteristics and have volumes ranging from 3947 to 10 708 m3. Acoustic parameters were measured in unoccupied churches according to the ISO-3382 standard. An extensive experimental study was carried out using impulse response analysis through a maximum length sequence measurement system in each church. It covered aspects such as reverberation (reverberation times, early decay times), distribution of sound levels (sound strength); early to late sound energy parameters derived from the impulse responses (center time, clarity for speech, clarity, definition, lateral energy fraction), and speech intelligibility (rapid speech transmission index), which all take both spectral and spatial distribution into account. Background noise was also measured to obtain the NR indices. The study describes the acoustic field inside each temple and establishes a discussion for each one of the acoustic descriptors mentioned by using the theoretical models available and the principles of architectural acoustics. Analysis of the quality of the spaces for music and speech is carried out according to the most widespread criteria for auditoria.

  16. Controls-structures interaction guest investigator program: Overview and phase 1 experimental results and future plans

    NASA Technical Reports Server (NTRS)

    Smith-Taylor, Rudeen; Tanner, Sharon E.

    1993-01-01

The NASA Controls-Structures Interaction (CSI) Guest Investigator program is described in terms of its support of the development of CSI technologies. The program is based on the introduction of CSI researchers from industry and academia to available test facilities for experimental validation of technologies and methods. Phase 1 experimental results are reviewed with attention given to their use of the Mini-MAST test facility and the facility for the Advanced Control Evaluation of Structures. Experiments were conducted regarding the following topics: collocated/noncollocated controllers, nonlinear math modeling, controller design, passive/active suspension systems design, and system identification and fault isolation. The results demonstrate that significantly enhanced performance from the control techniques can be achieved by integrating knowledge of the structural dynamics under consideration into the approaches.

  17. A two-factor error model for quantitative steganalysis

    NASA Astrophysics Data System (ADS)

    Böhme, Rainer; Ker, Andrew D.

    2006-02-01

    Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.

  18. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa identified healthy from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in term of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.
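The linear-unmixing step can be illustrated with a hedged sketch: given known end-member decays (the paper estimates their number blindly, which this sketch does not attempt), relative abundances follow from non-negative least squares. All lifetimes and abundances below are invented:

```python
# Hedged illustration of linear unmixing of a multi-exponential decay into
# non-negative end-member abundances; end-members are assumed known here,
# and all values are invented.
import numpy as np
from scipy.optimize import nnls

t = np.linspace(0, 10, 200)
lifetimes = (0.5, 2.0, 6.0)                  # assumed end-member lifetimes, ns
E = np.column_stack([np.exp(-t / tau) for tau in lifetimes])
true_a = np.array([0.2, 0.5, 0.3])           # true relative abundances
y = E @ true_a                               # noiseless mixed decay

a_est, residual = nnls(E, y)
print(np.round(a_est, 3))                    # recovers true_a in this noiseless case
```

With noisy data the abundances are estimated rather than recovered exactly, and model-order selection (how many end-members) becomes the hard part the paper addresses.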

  19. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    PubMed

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. 
This framework is quite general, and many enhancements are likely
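The sampling idea can be sketched as follows; the count-to-probability link and the scoring function are invented stand-ins, not the paper's actual model:

```python
# Toy sketch of the sampling framework (the count-to-probability link and the
# scoring function are invented stand-ins): turn spectral counts into detection
# probabilities, sample an ensemble of binary interaction outcomes, apply a
# binary-data method to each, and aggregate over the ensemble.
import numpy as np

rng = np.random.default_rng(42)
counts = np.array([0, 1, 2, 5, 20])          # spectral counts per candidate pair
p_detect = 1 - np.exp(-0.5 * counts)         # assumed saturating link function

def binary_method(interactions):
    """Stand-in for any existing method that consumes binary PPI data."""
    return interactions.sum()

ensemble = rng.random((1000, counts.size)) < p_detect
scores = [binary_method(row) for row in ensemble]
print(np.mean(scores))                       # aggregate over the ensemble
```

Pairs with high counts appear in nearly every sampled outcome, while low-count pairs contribute proportionally to their uncertainty, which is the effect the framework is designed to capture.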

  20. Spectral Analysis and Experimental Modeling of Ice Accretion Roughness

    NASA Technical Reports Server (NTRS)

    Orr, D. J.; Breuer, K. S.; Torres, B. E.; Hansman, R. J., Jr.

    1996-01-01

    A self-consistent scheme for relating wind tunnel ice accretion roughness to the resulting enhancement of heat transfer is described. First, a spectral technique of quantitative analysis of early ice roughness images is reviewed. The image processing scheme uses a spectral estimation technique (SET) which extracts physically descriptive parameters by comparing scan lines from the experimentally-obtained accretion images to a prescribed test function. Analysis using this technique for both streamwise and spanwise directions of data from the NASA Lewis Icing Research Tunnel (IRT) are presented. An experimental technique is then presented for constructing physical roughness models suitable for wind tunnel testing that match the SET parameters extracted from the IRT images. The icing castings and modeled roughness are tested for enhancement of boundary layer heat transfer using infrared techniques in a "dry" wind tunnel.
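A toy version of a spectral scan-line analysis can convey the idea; the sampling interval, test signal, and noise level below are all invented, and the actual SET compares scan lines to a prescribed test function rather than taking a raw FFT:

```python
# Toy spectral scan-line analysis in the spirit of the SET described above
# (all parameters invented): find the dominant spatial wavelength of a
# roughness profile from its power spectrum.
import numpy as np

dx = 0.1                                      # assumed sample spacing, mm
x = np.arange(0, 51.2, dx)                    # one scan line, 512 samples
rng = np.random.default_rng(3)
profile = np.sin(2 * np.pi * x / 4.0) + 0.2 * rng.normal(size=x.size)

power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
freqs = np.fft.rfftfreq(x.size, d=dx)
dominant_wavelength = 1 / freqs[np.argmax(power[1:]) + 1]
print(round(dominant_wavelength, 2))          # near the 4.0 mm roughness period
```

Running the same analysis along streamwise and spanwise scan lines, as the paper does, reveals any directionality in the accretion roughness.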

  1. A Direct, Quantitative Connection between Molecular Dynamics Simulations and Vibrational Probe Line Shapes.

    PubMed

    Xu, Rosalind J; Blasiak, Bartosz; Cho, Minhaeng; Layfield, Joshua P; Londergan, Casey H

    2018-05-17

    A quantitative connection between molecular dynamics simulations and vibrational spectroscopy of probe-labeled systems would enable direct translation of experimental data into structural and dynamical information. To constitute this connection, all-atom molecular dynamics (MD) simulations were performed for two SCN probe sites (solvent-exposed and buried) in a calmodulin-target peptide complex. Two frequency calculation approaches with substantial nonelectrostatic components, a quantum mechanics/molecular mechanics (QM/MM)-based technique and a solvatochromic fragment potential (SolEFP) approach, were used to simulate the infrared probe line shapes. While QM/MM results disagreed with experiment, SolEFP results matched experimental frequencies and line shapes and revealed the physical and dynamic bases for the observed spectroscopic behavior. The main determinant of the CN probe frequency is the exchange repulsion between the probe and its local structural neighbors, and there is a clear dynamic explanation for the relatively broad probe line shape observed at the "buried" probe site. This methodology should be widely applicable to vibrational probes in many environments.

  2. Experimental and computational results on exciton/free-carrier ratio, hot/thermalized carrier diffusion, and linear/nonlinear rate constants affecting scintillator proportionality

    NASA Astrophysics Data System (ADS)

    Williams, R. T.; Grim, Joel Q.; Li, Qi; Ucer, K. B.; Bizarri, G. A.; Kerisit, S.; Gao, Fei; Bhattacharya, P.; Tupitsyn, E.; Rowe, E.; Buliga, V. M.; Burger, A.

    2013-09-01

    Models of nonproportional response in scintillators have highlighted the importance of parameters such as branching ratios, carrier thermalization times, diffusion, kinetic order of quenching, associated rate constants, and radius of the electron track. For example, the fraction ηeh of excitations that are free carriers versus excitons was shown by Payne and coworkers to have strong correlation with the shape of electron energy response curves from Compton-coincidence studies. Rate constants for nonlinear quenching are implicit in almost all models of nonproportionality, and some assumption about track radius must invariably be made if one is to relate linear energy deposition dE/dx to volume-based excitation density n (eh/cm3) in terms of which the rates are defined. Diffusion, affecting time-dependent track radius and thus density of excitations, has been implicated as an important factor in nonlinear light yield. Several groups have recently highlighted diffusion of hot electrons in addition to thermalized carriers and excitons in scintillators. However, experimental determination of many of these parameters in the insulating crystals used as scintillators has seemed difficult. Subpicosecond laser techniques including interband z-scan light yield, fluence-dependent decay time, and transient optical absorption are now yielding experimental values for some of the missing rates and ratios needed for modeling scintillator response. First principles calculations and Monte Carlo simulations can fill in additional parameters still unavailable from experiment. As a result, quantitative modeling of scintillator electron energy response from independently determined material parameters is becoming possible on an increasingly firm database. This paper describes recent laser experiments, calculations, and numerical modeling of scintillator response.
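    The role of the kinetic order of quenching can be made concrete with a toy rate equation. In the sketch below (all rate constants and densities are illustrative placeholders, not values from this work), second-order quenching competes with first-order radiative decay, so the radiative fraction falls as excitation density rises, which is the essence of nonproportional light yield.

```python
def radiative_fraction(n0, tau=30e-9, k2=1e-9, dt=1e-12, t_max=3e-7):
    """Fraction of an initial excitation density n0 (cm^-3) that decays radiatively
    when first-order emission (rate n/tau) competes with second-order quenching
    (rate k2*n^2): dn/dt = -n/tau - k2*n**2, integrated with explicit Euler steps."""
    n = n0
    emitted = 0.0
    t = 0.0
    while t < t_max and n > 0.0:
        radiative = n / tau
        n += dt * (-radiative - k2 * n * n)
        emitted += dt * radiative
        t += dt
    return emitted / n0

low = radiative_fraction(1e17)   # sparse region of the track
high = radiative_fraction(1e20)  # dense track core: quenching dominates
print(low, high)
```

For this ODE the radiative fraction has the closed form ln(1 + k2·n0·tau)/(k2·n0·tau), so the two values printed sit near 0.46 and 0.003; the point is only the qualitative density dependence, not any particular material.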

  3. Experimental and computational results on exciton/free-carrier ratio, hot/thermalized carrier diffusion, and linear/nonlinear rate constants affecting scintillator proportionality

    DOE PAGES

    Williams, R. T.; Grim, Joel Q.; Li, Qi; ...

    2013-09-26

    Models of nonproportional response in scintillators have highlighted the importance of parameters such as branching ratios, carrier thermalization times, diffusion, kinetic order of quenching, associated rate constants, and radius of the electron track. For example, the fraction ηeh of excitations that are free carriers versus excitons was shown by Payne and coworkers to have strong correlation with the shape of electron energy response curves from Compton-coincidence studies. Rate constants for nonlinear quenching are implicit in almost all models of nonproportionality, and some assumption about track radius must invariably be made if one is to relate linear energy deposition dE/dx to volume-based excitation density n (eh/cm3) in terms of which the rates are defined. Diffusion, affecting time-dependent track radius and thus density of excitations, has been implicated as an important factor in nonlinear light yield. Several groups have recently highlighted diffusion of hot electrons in addition to thermalized carriers and excitons in scintillators. However, experimental determination of many of these parameters in the insulating crystals used as scintillators has seemed difficult. Subpicosecond laser techniques including interband z-scan light yield, fluence-dependent decay time, and transient optical absorption are now yielding experimental values for some of the missing rates and ratios needed for modeling scintillator response. First principles calculations and Monte Carlo simulations can fill in additional parameters still unavailable from experiment. As a result, quantitative modeling of scintillator electron energy response from independently determined material parameters is becoming possible on an increasingly firm database. This study describes recent laser experiments, calculations, and numerical modeling of scintillator response.

  4. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a main structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorous quantitative evaluation of interfacial weakness. The model is implemented within the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that the predicted variations in SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results obtained by the v(z) technique and SAW velocity reconstruction verify the prediction.

  5. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  6. Automated detection of discourse segment and experimental types from the text of cancer pathway results sections.

    PubMed

    Burns, Gully A P C; Dasigi, Pradeep; de Waard, Anita; Hovy, Eduard H

    2016-01-01

    Automated machine-reading biocuration systems typically use sentence-by-sentence information extraction to construct meaning representations for use by curators. This does not directly reflect the typical discourse structure used by scientists to construct an argument from the experimental data available within an article, and is therefore less likely to correspond to representations typically used in biomedical informatics systems (let alone to the mental models that scientists have). In this study, we develop Natural Language Processing methods to locate, extract, and classify the individual passages of text from articles' Results sections that refer to experimental data. In our domain of interest (molecular biology studies of cancer signal transduction pathways), individual articles may contain as many as 30 small-scale individual experiments describing a variety of findings, upon which authors base their overall research conclusions. Our system automatically classifies discourse segments in these texts into seven categories (fact, hypothesis, problem, goal, method, result, implication) with an F-score of 0.68. These segments describe the essential building blocks of scientific discourse to (i) provide context for each experiment, (ii) report experimental details and (iii) explain the data's meaning in context. We evaluate our system on text passages from articles that were curated in molecular biology databases (the Pathway Logic Datum repository, the Molecular Interaction MINT and INTACT databases) linking individual experiments in articles to the type of assay used (coprecipitation, phosphorylation, translocation etc.). We use supervised machine learning techniques on text passages containing unambiguous references to experiments to obtain baseline F1 scores of 0.59 for MINT, 0.71 for INTACT and 0.63 for Pathway Logic. Although preliminary, these results support the notion that targeting information extraction methods to experimental results could provide
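    As a toy illustration of what discourse-segment classification means in practice, here is a minimal keyword-scoring sketch. The keyword profiles and example sentences below are invented; the actual system used supervised machine learning over annotated passages, not keyword lists.

```python
from collections import Counter

# Invented keyword profiles for three of the seven discourse categories.
profiles = {
    "method":      Counter({"incubated": 2, "transfected": 2, "performed": 1}),
    "result":      Counter({"increased": 2, "observed": 2, "shows": 1}),
    "implication": Counter({"suggests": 2, "therefore": 2, "indicating": 1}),
}

def classify(sentence):
    """Score a sentence against each category profile; return the best match."""
    tokens = Counter(sentence.lower().split())
    scores = {label: sum(tokens[w] * wt for w, wt in prof.items())
              for label, prof in profiles.items()}
    return max(scores, key=scores.get)

print(classify("Cells were transfected and incubated overnight"))          # -> method
print(classify("Phosphorylation increased twofold as observed by blotting"))  # -> result
```

A real classifier replaces the hand-written profiles with weights learned from labeled Results-section passages, but the decision structure (score each category, pick the maximum) is the same.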

  7. Beta decay and the origins of biological chirality - Experimental results

    NASA Technical Reports Server (NTRS)

    Gidley, D. W.; Rich, A.; Van House, J.; Zitzewitz, P. W.

    1982-01-01

    Preliminary experimental results are presented of an investigation of the possible role of preferential radiolysis by electrons emitted in the beta decay of radionuclides, a parity-nonconserving process, in the universal causation of the optical activity of biological compounds. Experiments were designed to measure the asymmetry in the production of triplet positronium upon the bombardment of an amino acid powder target by a collimated beam of positrons as positron helicity or target chirality is reversed. No asymmetry down to a level of 0.0007 is found in experiments on the D and L forms of cystine and tryptophan, indicating an asymmetry in positronium formation cross section of less than 0.01, while an asymmetry of 0.0031 is found for leucine, corresponding to a formation cross section asymmetry of about 0.04.

  8. Solving and Learning Soft Temporal Constraints: Experimental Setting and Results

    NASA Technical Reports Server (NTRS)

    Rossi, F.; Sperduti, A.; Venable, K. B.; Khatib, L.; Morris, P.; Morris, R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Soft temporal constraint problems allow one to describe, in a natural way, scenarios where events happen over time and preferences are associated with event distances and durations. However, such local preferences are sometimes difficult to set, and it may be easier instead to associate preferences with some complete solutions of the problem. Machine learning techniques can be useful in this respect. In this paper we describe two solvers (one more general, the other more efficient) for tractable subclasses of soft temporal problems, and we show some experimental results. The random generator used to build the problems on which the tests are performed is also described. We also compare the two solvers, highlighting the tradeoff between performance and representational power. Finally, we present a learning module and show its behavior on randomly generated examples.

  9. Quantitative Prediction of Systemic Toxicity Points of Departure (OpenTox USA 2017)

    EPA Science Inventory

    Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative models based on chemical structure information, are c...

  10. Legionella in water samples: how can you interpret the results obtained by quantitative PCR?

    PubMed

    Ditommaso, Savina; Ricciardi, Elisa; Giacomuzzi, Monica; Arauco Rivera, Susan R; Zotti, Carla M

    2015-02-01

    Evaluation of the potential risk associated with Legionella has traditionally been determined from culture-based methods. Quantitative polymerase chain reaction (qPCR) is an alternative tool that offers rapid, sensitive and specific detection of Legionella in environmental water samples. In this study we compare the results obtained by conventional qPCR (iQ-Check™ Quanti Legionella spp.; Bio-Rad) and by the culture method on artificial samples prepared in Page's saline by addition of Legionella pneumophila serogroup 1 (ATCC 33152), and we analyse the selective quantification of viable Legionella cells by the qPCR-PMA method. The amount of Legionella DNA (GU) determined by qPCR was 28-fold higher than the load detected by culture (CFU). Applying qPCR combined with PMA treatment, we obtained a reduction of 98.5% of the qPCR signal from dead cells. We observed a dissimilarity in the ability of PMA to suppress the PCR signal in samples with different amounts of bacteria: the effective elimination of detection signals by PMA depended on the concentration of GU, and increasing amounts of cells resulted in higher values of reduction. Using the results from this study we created an algorithm to facilitate the interpretation of viable cell level estimation with qPCR-PMA.
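    The quoted figures reduce to simple arithmetic. In the sketch below, the 28-fold GU/CFU ratio and the 98.5% suppression are taken from the abstract above, while the CFU count and the live/dead split are invented for illustration.

```python
def genomic_units(cfu, gu_per_cfu=28):
    """qPCR signal (GU) implied by a culture count, using the study's
    observed ~28-fold GU/CFU ratio."""
    return cfu * gu_per_cfu

def pma_signal(total_gu, dead_gu, suppression=0.985):
    """Residual qPCR signal after PMA treatment: live-cell DNA amplifies
    normally, while ~98.5% of the dead-cell signal is suppressed."""
    live_gu = total_gu - dead_gu
    return live_gu + dead_gu * (1 - suppression)

total = genomic_units(1000)              # 1000 CFU implies 28000 GU
residual = pma_signal(total, dead_gu=20000)
print(total, residual)                   # -> 28000 8300.0
```

The gap between the residual PMA-treated signal and the culture count is exactly the viable-but-nonculturable ambiguity that the study's interpretation algorithm is meant to address.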

  11. Transient Lift-Off Test Results for an Experimental Hybrid Bearing in Air

    DTIC Science & Technology

    2009-12-01

    High-Speed Hydrostatic Bearings," ASME Journal of Tribology, Vol. 116, n2, 1994, pp. 337-344. [2] Scharrer, J.K., Tellier, J. and Hibbs, R., "A Study of the Transient Performance of Hydrostatic Journal Bearings: Part II-Experimental Results," STLE Paper 91...TC-3B-2, 1991. [4] Sharrer, J., Tellier, J. and Hibbs, R., "Start Transient Testing of an Annular Hydrostatic Bearing in Liquid Oxygen," AIAA

  12. Design and experimental results for a flapped natural-laminar-flow airfoil for general aviation applications

    NASA Technical Reports Server (NTRS)

    Somers, D. M.

    1981-01-01

    A flapped natural laminar flow airfoil for general aviation applications, the NLF(1)-0215F, has been designed and analyzed theoretically and verified experimentally in the Langley Low Turbulence Pressure Tunnel. The basic objective of combining the high maximum lift of the NASA low speed airfoils with the low cruise drag of the NACA 6 series airfoils has been achieved. The safety requirement that the maximum lift coefficient not be significantly affected with transition fixed near the leading edge has also been met. Comparisons of the theoretical and experimental results show generally good agreement.

  13. Qualitative and Quantitative Analysis for Facial Complexion in Traditional Chinese Medicine

    PubMed Central

    Zhao, Changbo; Li, Guo-zheng; Li, Fufeng; Wang, Zhi; Liu, Chang

    2014-01-01

    Facial diagnosis is an important and very intuitive diagnostic method in Traditional Chinese Medicine (TCM). However, owing to its qualitative and experience-based subjective character, traditional facial diagnosis has certain limitations in clinical medicine. Computerized inspection methods provide classification models to recognize facial complexion (including color and gloss), but previous works studied only the classification of facial complexion, which we regard as qualitative analysis; the severity or degree of facial complexion has not yet been quantified. This paper aims to provide both qualitative and quantitative analysis of facial complexion. We propose a novel feature representation of facial complexion from the whole face of patients. The features are established with four chromaticity bases split up by luminance distribution on the CIELAB color space. The chromaticity bases are constructed from the facial dominant color using two-level clustering; the optimal luminance distribution is determined through experimental comparisons. The features prove more distinctive than previous facial complexion feature representations. Complexion recognition proceeds by training an SVM classifier with the optimal model parameters. In addition, improved features are developed by weighted fusion of five local regions. Extensive experimental results show that the proposed features achieve the highest facial color recognition performance, with a total accuracy of 86.89%. Furthermore, the proposed recognition framework can analyze both the color and gloss degrees of facial complexion by learning a ranking function. PMID:24967342

  14. The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.

    PubMed

    Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A

    2018-02-01

    Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy, and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1 a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow quantitative measurement of numerous toolmark properties. The findings from Experiment 1 demonstrated that the two knives produced statistically different cut mark widths, wall angles and shapes. Experiment 2 examined knife marks created on fleshed pig torsos under conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle, and that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than from light microscopy images. The potential to combine micro-CT data, medical grade CT data and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated.
This is the first study to statistically explore simulated
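    As an illustration of how knife type might be predicted from two toolmark properties, here is a minimal nearest-centroid sketch. The training numbers are invented, not the study's measurements, and the study itself used statistical exploration rather than this particular classifier.

```python
import math

# Hypothetical training measurements: (cut-mark width in mm, wall angle in degrees).
marks = {
    "plain":    [(0.30, 65.0), (0.35, 62.0), (0.28, 68.0)],
    "serrated": [(0.55, 48.0), (0.60, 45.0), (0.52, 50.0)],
}

def centroid(points):
    """Per-coordinate mean of a list of feature tuples."""
    return tuple(sum(c) / len(points) for c in zip(*points))

def classify(mark):
    """Nearest-centroid prediction of knife type from (width, angle)."""
    cents = {label: centroid(pts) for label, pts in marks.items()}
    return min(cents, key=lambda lb: math.dist(mark, cents[lb]))

print(classify((0.58, 47.0)))  # -> serrated
print(classify((0.31, 64.0)))  # -> plain
```

With real data the two features would first be put on comparable scales before computing distances; the toy values here are already well separated, so the raw Euclidean distance suffices.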

  15. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. 
This

  16. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
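    A minimal sketch of the cFL idea follows (transfer functions and gate logic only; the parameter values and the normalization choice are illustrative assumptions, not the trained models from the paper): a normalized Hill curve maps an upstream activity in [0, 1] to a graded downstream activity, and fuzzy AND/OR gates combine multiple inputs.

```python
def hill(x, n=3.0, k=0.5):
    """Normalized Hill transfer function: maps x in [0, 1] to a graded
    activity in [0, 1], scaled so that hill(1.0) == 1.0."""
    raw = x ** n / (k ** n + x ** n)
    return raw / (1.0 / (k ** n + 1.0))  # divide by the value at x = 1

def and_gate(a, b):
    return min(a, b)   # fuzzy AND

def or_gate(a, b):
    return max(a, b)   # fuzzy OR

# Graded activation of a node with two upstream cytokine inputs (values hypothetical).
tnf, il6 = 0.8, 0.3
node = hill(or_gate(tnf, il6))
print(node)
```

Training a cFL model then amounts to choosing the gate structure and fitting the transfer-function parameters (here n and k) so that the computed node activities match the measured protein activation data.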

  17. Quantitative analysis of macrophages in wound healing of rat skin subjected to loud noise stress.

    PubMed

    Rafi, Aisha; Khan, Muhammad Yunus; Minhas, Liaqat Ali

    2014-01-01

    Factors affecting skin wound healing have always been a central consideration in medical practice. Loud noise is a biological stressor affecting body systems at various levels. The present study was undertaken to examine the effect of loud noise stress on macrophages during the wound healing process in male rat skin. One hundred and eighty male Sprague Dawley rats were randomly divided into control group A and experimental group B. Each group comprised 90 animals. The control and experimental groups were further subdivided into three subgroups of 30 animals each, corresponding to the day of sacrifice, i.e., day 3, 5 and 7 after surgery. After induction of local anaesthesia, a linear full-thickness incision paravertebral to the thoracic spine was made on the dorsum of each rat. Experimental group B was exposed to a loud noise stimulus (recorded noise of aeroplanes and gunfire) set at 97 dBA to 102 dBA with a sound level meter. The animals were decapitated on days 3, 5 and 7 after surgery. Tissue was processed for paraffin embedding and stained with Hematoxylin and Eosin and Mallory's trichrome stain. Data were collected for the incisional space of the wound. Quantitative data on the number of macrophages were analysed with Student's t-test for the detection of any significant differences between the mean numbers in the experimental and control groups. All quantitative data were expressed as means ± SE. A p-value of ≤ 0.05 was considered statistically significant. In this study macrophage numbers were significantly decreased on day 3 after surgery and thereafter significantly increased on days 5 and 7 in the experimental subgroups compared with their matched control subgroups. These results show that loud noise stress affects the cells (macrophages) involved in wound healing and is therefore expected to have an impact on the stages of wound healing.
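    The group comparison described above can be sketched with a two-sample t statistic. Welch's variant is shown here for simplicity, and the macrophage counts are illustrative, not the study's data.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing two group means."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

control      = [10, 12, 11, 13, 14]   # macrophages per field (hypothetical counts)
noise_stress = [20, 22, 21, 23, 24]
t = welch_t(noise_stress, control)
print(t)  # -> 10.0
```

A t statistic this large, with these sample sizes, corresponds to a p-value far below the study's 0.05 threshold; in practice the statistic is compared against the t distribution with the appropriate degrees of freedom.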

  18. Direct Measurements of Quantum Kinetic Energy Tensor in Stable and Metastable Water near the Triple Point: An Experimental Benchmark.

    PubMed

    Andreani, Carla; Romanelli, Giovanni; Senesi, Roberto

    2016-06-16

    This study presents the first direct and quantitative measurement of the nuclear momentum distribution anisotropy and the quantum kinetic energy tensor in stable and metastable (supercooled) water near its triple point, using deep inelastic neutron scattering (DINS). From the experimental spectra, accurate line shapes of the hydrogen momentum distributions are derived using an anisotropic Gaussian and a model-independent framework. The experimental results, benchmarked against those obtained for the solid phase, provide state-of-the-art directional values of the hydrogen mean kinetic energy in metastable water. The determinations of the directional kinetic energies in the supercooled phase provide accurate and quantitative measurements of these dynamical observables in metastable and stable phases, that is, key insight into the physical mechanisms of the hydrogen quantum state in both disordered and polycrystalline systems. The remarkable findings of this study establish novel insight that will further expand the capacity and accuracy of DINS investigations of nuclear quantum effects in water, and represent reference experimental values for theoretical investigations.

  19. REVIEW OF NUMERICAL MODELS FOR PREDICTING THE ENERGY DEPOSITION AND RESULTANT THERMAL RESPONSE OF HUMANS EXPOSED TO ELECTROMAGNETIC FIELDS

    EPA Science Inventory

    For humans exposed to electromagnetic (EM) radiation, the resulting thermophysiologic response is not well understood. Because it is unlikely that this information will be determined from quantitative experimentation, it is necessary to develop theoretical models which predict th...

  20. Simultaneous experimental determination of labile proton fraction ratio and exchange rate with irradiation radio frequency power-dependent quantitative CEST MRI analysis.

    PubMed

    Sun, Phillip Zhe; Wang, Yu; Xiao, Gang; Wu, Renhua

    2013-01-01

    Chemical exchange saturation transfer (CEST) imaging is sensitive to dilute proteins/peptides and microenvironmental properties, and has been increasingly evaluated for molecular imaging and in vivo applications. However, the experimentally measured CEST effect depends on the CEST agent concentration, exchange rate and relaxation time. In addition, there may be non-negligible direct radio-frequency (RF) saturation effects, particularly severe for diamagnetic CEST (DIACEST) agents owing to their relatively small chemical shift difference from that of the bulk water resonance. As such, the commonly used asymmetry analysis only provides CEST-weighted information. Recently, it has been shown with numerical simulation that both labile proton concentration and exchange rate can be determined by evaluating the RF power dependence of the DIACEST effect. To validate the simulation results, we prepared and imaged two CEST phantoms: a pH phantom of serially titrated pH at a fixed creatine concentration and a concentration phantom of serially varied creatine concentration titrated to the same pH, and solved for the labile proton fraction ratio and exchange rate per pixel. For the concentration phantom, we showed that the labile proton fraction ratio is proportional to the CEST agent concentration with negligible change in the exchange rate. Additionally, we found that the exchange rate of the pH phantom is dominantly base-catalyzed, with little difference in the labile proton fraction ratio. In summary, our study demonstrated quantitative DIACEST MRI, which remains promising for augmenting conventional CEST-weighted MRI analysis.
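    The idea of extracting both parameters from the RF-power dependence can be sketched as follows. The model here is a simplified QUESP-style expression with labeling efficiency α = ω1²/(ω1² + k²), an assumption made for illustration rather than the authors' exact equations, and all parameter values are synthetic.

```python
import itertools

R1 = 1.0  # water longitudinal relaxation rate, s^-1 (assumed)

def cest_effect(w1, f, k):
    """Simplified RF-power-dependent CEST effect for labile proton fraction f
    and exchange rate k (s^-1), with labeling efficiency w1^2/(w1^2 + k^2)."""
    alpha = w1 ** 2 / (w1 ** 2 + k ** 2)
    return f * k * alpha / R1

# Synthetic "measurements" at several saturation powers w1 (rad/s).
f_true, k_true = 0.001, 500.0
powers = [100.0, 250.0, 500.0, 1000.0, 2000.0]
data = [cest_effect(w1, f_true, k_true) for w1 in powers]

# Brute-force least-squares fit over an (f, k) grid.
grid_f = [x * 1e-4 for x in range(1, 31)]
grid_k = [x * 50.0 for x in range(1, 31)]
best = min(itertools.product(grid_f, grid_k),
           key=lambda fk: sum((cest_effect(w, *fk) - d) ** 2
                              for w, d in zip(powers, data)))
print(best)  # recovers values near (f_true, k_true)
```

Because f and k enter the power dependence differently (f scales the curve while k sets where the labeling efficiency saturates), several powers suffice to separate them, which is the essence of the simultaneous determination described above.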

  1. Growth of wormlike micelles in nonionic surfactant solutions: Quantitative theory vs. experiment.

    PubMed

    Danov, Krassimir D; Kralchevsky, Peter A; Stoyanov, Simeon D; Cook, Joanne L; Stott, Ian P; Pelan, Eddie G

    2018-06-01

    Despite the considerable advances of molecular-thermodynamic theory of micelle growth, agreement between theory and experiment has been achieved only in isolated cases. A general theory that can provide a self-consistent quantitative description of the growth of wormlike micelles in mixed surfactant solutions, including the experimentally observed high peaks in viscosity and aggregation number, is still missing. As a step toward the creation of such a theory, here we consider the simplest system: nonionic wormlike surfactant micelles from polyoxyethylene alkyl ethers, CiEj. Our goal is to construct a molecular-thermodynamic model that is in agreement with the available experimental data. To this end, we systematized data for the micelle mean mass aggregation number, from which the micelle growth parameter was determined at various temperatures. None of the available models can give a quantitative description of these data. We constructed a new model, which is based on theoretical expressions for the interfacial-tension, headgroup-steric and chain-conformation components of micelle free energy, along with appropriate expressions for the parameters of the model, including their temperature and curvature dependencies. Special attention was paid to the surfactant chain-conformation free energy, for which a new, more general formula was derived. As a result, relatively simple theoretical expressions are obtained. All parameters that enter these expressions are known, which facilitates the theoretical modeling of micelle growth for various nonionic surfactants in excellent agreement with experiment. The constructed model can serve as a basis for further extension to obtain a quantitative description of micelle growth in more complicated systems, including binary and ternary mixtures of nonionic, ionic and zwitterionic surfactants, which determine the viscosity and stability of various formulations in personal-care and household detergency.

  2. Numerical simulation of granular flows : comparison with experimental results

    NASA Astrophysics Data System (ADS)

    Pirulli, M.; Mangeney-Castelnau, A.; Lajeunesse, E.; Vilotte, J.-P.; Bouchut, F.; Bristeau, M. O.; Perthame, B.

    2003-04-01

    Granular avalanches such as rock or debris flows regularly cause severe human and material damage. Numerical simulation of granular avalanches should provide a useful tool for investigating, within realistic geological contexts, the dynamics of these flows and of their arrest phase, and for improving the risk assessment of such natural hazards. Validation of a debris avalanche numerical model against granular experiments on an inclined plane is performed here. The comparison is performed by simulating the granular flow of glass beads from a reservoir through a gate down an inclined plane. This unsteady situation evolves toward the steady state observed in the laboratory. Furthermore, the simulation exactly reproduces the arrest phase obtained by suddenly closing the gate of the reservoir once a thick flow has developed. The spreading of a granular mass released from rest at the top of a rough inclined plane is also investigated. The evolution of the avalanche shape, the velocity and the characteristics of the arrest phase are compared with experimental results, and the forces involved are analysed for various flow laws.

  3. Quantitative comparisons between experimentally measured 2D carbon radiation and Monte Carlo impurity (MCI) code simulations

    NASA Astrophysics Data System (ADS)

    Evans, T. E.; Finkenthal, D. F.; Fenstermacher, M. E.; Leonard, A. W.; Porter, G. D.; West, W. P.

    Experimentally measured carbon line emissions and total radiated power distributions from the DIII-D divertor and scrape-off layer (SOL) are compared to those calculated with the Monte Carlo impurity (MCI) model. A UEDGE [T.D. Rognlien et al., J. Nucl. Mater. 196-198 (1992) 347] background plasma is used in MCI with the Roth and Garcia-Rosales (RG-R) chemical sputtering model [J. Roth, C. García-Rosales, Nucl. Fusion 36 (1992) 196] and/or one of six physical sputtering models. While results from these simulations do not reproduce all of the features seen in the experimentally measured radiation patterns, the total radiated power calculated in MCI is in relatively good agreement with that measured by the DIII-D bolometric system when the Smith78 [D.L. Smith, J. Nucl. Mater. 75 (1978) 20] physical sputtering model is coupled to RG-R chemical sputtering in an unaltered UEDGE plasma. Alternatively, in MCI simulations with the UEDGE background ion temperatures along the divertor target plates adjusted to better match those measured in the experiment, three physical sputtering models, when coupled to the RG-R model, gave a total radiated power within 10% of the measured value.

  4. Quantitative Rainbow Schlieren Deflectometry as a Temperature Diagnostic for Spherical Flames

    NASA Technical Reports Server (NTRS)

    Feikema, Douglas A.

    2004-01-01

    Numerical analysis and experimental results are presented to define a method for quantitatively measuring the temperature distribution of a spherical diffusion flame using Rainbow Schlieren Deflectometry in microgravity. First, a numerical analysis is completed to show the method can suitably determine temperature in the presence of spatially varying species composition. Also, a numerical forward-backward inversion calculation is presented to illustrate the types of calculations and deflections to be encountered. Lastly, a normal-gravity demonstration of temperature measurement in an axisymmetric laminar diffusion flame using Rainbow Schlieren Deflectometry is presented. The method employed in this paper illustrates the necessary steps for the preliminary design of a Schlieren system. The largest deflections for the normal-gravity flame considered in this paper are 7.4 × 10⁻⁴ radians, which can be accurately measured with 2 m focal length collimating and decollimating optics. The experimental uncertainty of deflection is less than 5 × 10⁻⁵ radians.
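
    The closing numbers imply a simple small-angle sizing of the optics. Under the usual schlieren relation that a ray deflection ε displaces the focused spot by d = f·ε at the filter plane (an assumption here; the paper's full design procedure is not reproduced in the abstract), the quoted 2 m optics map the largest deflection to roughly 1.5 mm:

```python
def filter_plane_displacement(deflection_rad, focal_length_m):
    """Small-angle displacement d = f * epsilon of the focused beam
    at the rainbow filter for a ray deflected by epsilon radians."""
    return focal_length_m * deflection_rad

d_max = filter_plane_displacement(7.4e-4, 2.0)   # largest quoted deflection
d_unc = filter_plane_displacement(5.0e-5, 2.0)   # quoted deflection uncertainty
```

    A longer focal length thus trades field of view for a larger, more easily resolved displacement at the filter.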

  5. LOD significance thresholds for QTL analysis in experimental populations of diploid species

    PubMed

    Van Ooijen JW

    1999-11-01

    Linkage analysis with molecular genetic markers is a very powerful tool in the biological research of quantitative traits. However, the rate of false positives is hard to predict, because there is no easy way to determine which regions of the genome can be declared statistically significant for containing a gene affecting the quantitative trait of interest. In this paper four tables, obtained by large-scale simulations, are presented that can be used with a simple formula to obtain the false-positive rate for analyses of the standard types of experimental populations of diploid species with any genome size. A new definition of the term 'suggestive linkage' is proposed that allows a more objective comparison of results across species.
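
    The paper's tables come from large-scale simulations of null genome scans. As an illustration only (not Van Ooijen's procedure, which accounts for marker linkage and population type), a minimal Monte Carlo sketch of a genome-wide LOD threshold under a simplifying independent-loci assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def genomewide_lod_threshold(n_scans=2000, n_loci=200, alpha=0.05):
    """Monte Carlo estimate of a genome-wide LOD significance threshold.

    Under the null, 2*ln(10)*LOD is approximately chi-square(1), so
    LOD = z**2 / (2*ln(10)) for a standard-normal test statistic z.
    Loci are treated as independent -- a simplification; linked markers
    would require correlated draws, which is what the paper's tables
    properly account for.
    """
    z = rng.standard_normal((n_scans, n_loci))
    lod = z ** 2 / (2 * np.log(10))
    max_lod = lod.max(axis=1)            # genome-wide maximum per scan
    return float(np.quantile(max_lod, 1 - alpha))

threshold = genomewide_lod_threshold()   # roughly 3 for these settings
```

    The threshold grows slowly with the number of effective tests, which is why genome size enters the paper's simple formula.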

  6. Wires in the soup: quantitative models of cell signaling

    PubMed Central

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and unraveling them requires sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  7. Cyclic Inelastic Deformation and Fatigue Resistance of a Rail Steel : Experimental Results and Mathematical Models

    DOT National Transportation Integrated Search

    1981-10-01

    Experimental results developed from tests of uniaxial, smooth specimens obtained from the head of an unused section of rail have been reported. Testing encompassed a broad range of conditions - monotonic tension, monotonic compression, and fully reve...

  8. Initial development of the two-dimensional ejector shear layer - Experimental results

    NASA Technical Reports Server (NTRS)

    Benjamin, M. A.; Dufflocq, M.; Roan, V. P.

    1993-01-01

    An experimental investigation designed to study the development of shear layers in a two-dimensional single-nozzle ejector has been completed. In this study, combinations of air/air, argon/air, helium/air, and air/helium were used as the supersonic primary and subsonic secondary, respectively. Mixing of the gases occurred in a constant-area tube 39.1 mm high by 25.4 mm wide, where the inlet static pressure was maintained at 35 kPa. The cases studied resulted in convective Mach numbers between 0.058 and 1.64, density ratios between 0.102 and 3.49, and velocity ratios between 0.065 and 0.811. The resulting data show the differences in shear-layer development for the various combinations of independent variables utilized in the investigation. The normalized growth rates in the near field were found to be similar to those of two-dimensional mixing layers. These results have enhanced the ability to analyze and design ejector systems and provide a better understanding of the physics.
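
    The abstract quotes convective Mach numbers up to 1.64. Under the standard Papamoschou-Roshko definition (an assumption here; the paper's exact formula is not given in the abstract), the convective Mach number of a two-stream shear layer is:

```python
def convective_mach(u1, u2, a1, a2):
    """Convective Mach number Mc = (U1 - U2) / (a1 + a2) for a
    two-stream shear layer with equal static pressures; u1, u2 are
    the primary/secondary velocities and a1, a2 the sound speeds."""
    return (u1 - u2) / (a1 + a2)

# illustrative values only, not measurements from the study
mc = convective_mach(u1=600.0, u2=100.0, a1=340.0, a2=340.0)
```

    Gas pairs like helium/air change a1 and a2 sharply, which is how the study spans such a wide Mc range at similar velocities.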

  9. Object impedance control for cooperative manipulation - Theory and experimental results

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Cannon, Robert H., Jr.

    1992-01-01

    This paper presents the dynamic control module of the Dynamic and Strategic Control of Cooperating Manipulators (DASCCOM) project at Stanford University's Aerospace Robotics Laboratory. First, the cooperative manipulation problem is analyzed from a systems perspective, and the desirable features of a control system for cooperative manipulation are discussed. Next, a control policy is developed that enforces a controlled impedance not of the individual arm endpoints, but of the manipulated object itself. A parallel implementation for a multiprocessor system is presented. The controller fully compensates for the system dynamics and directly controls the object internal forces. Most importantly, it presents a simple, powerful, intuitive interface to higher level strategic control modules. Experimental results from a dual two-link-arm robotic system are used to compare the object impedance controller with other strategies, both for free-motion slews and environmental contact.

  10. Laser long-range remote-sensing program experimental results

    NASA Astrophysics Data System (ADS)

    Highland, Ronald G.; Shilko, Michael L.; Fox, Marsha J.; Gonglewski, John D.; Czyzak, Stanley R.; Dowling, James A.; Kelly, Brian; Pierrottet, Diego F.; Ruffatto, Donald; Loando, Sharon; Matsuura, Chris; Senft, Daniel C.; Finkner, Lyle; Rae, Joe; Gallegos, Joe

    1995-12-01

    A laser long-range remote sensing (LRS) program is being conducted by the United States Air Force Phillips Laboratory (AF/PL). As part of this program, AF/PL is testing the feasibility of developing a long-path CO₂ laser-based DIAL system for remote sensing. In support of this program, the AF/PL has recently completed an experimental series using a 21 km slant-range path (3.05 km ASL transceiver height to 0.067 km ASL target height) at its Phillips Laboratory Air Force Maui Optical Station (AMOS) facility located on Maui, Hawaii. The DIAL system uses a 3-joule ¹³C isotope laser coupled into a 0.6 m diameter telescope. The atmospheric optical characterization incorporates information from an infrared scintillometer co-aligned to the laser path, atmospheric profiles from weather balloons launched from the target site, and meteorological data from ground stations at AMOS and the target site. In this paper, we describe the experiment configuration and summarize the results, the atmospheric conditions, and their implications for the LRS program. The capability of such a system for long-range, low-angle, slant-path remote sensing is discussed. System performance issues relating to both coherent and incoherent detection methods, atmospheric limitations, as well as the development of advanced models to predict performance of long-range scenarios are presented.

  11. Experimental and Computational Aerothermodynamics of a Mars Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.

    1996-01-01

    An aerothermodynamic database has been generated through both experimental testing and computational fluid dynamics simulations for a 70 deg sphere-cone configuration based on the NASA Mars Pathfinder entry vehicle. The aerothermodynamics of several related parametric configurations were also investigated. Experimental heat-transfer data were obtained at hypersonic test conditions in both a perfect gas air wind tunnel and in a hypervelocity, high-enthalpy expansion tube in which both air and carbon dioxide were employed as test gases. In these facilities, measurements were made with thin-film temperature-resistance gages on both the entry vehicle models and on the support stings of the models. Computational results for freestream conditions equivalent to those of the test facilities were generated using an axisymmetric/2D laminar Navier-Stokes solver with both perfect-gas and nonequilibrium thermochemical models. Forebody computational and experimental heating distributions agreed to within the experimental uncertainty for both the perfect-gas and high-enthalpy test conditions. In the wake, quantitative differences between experimental and computational heating distributions for the perfect-gas conditions indicated transition of the free shear layer near the reattachment point on the sting. For the high enthalpy cases, agreement to within, or slightly greater than, the experimental uncertainty was achieved in the wake except within the recirculation region, where further grid resolution appeared to be required. Comparisons between the perfect-gas and high-enthalpy results indicated that the wake remained laminar at the high-enthalpy test conditions, for which the Reynolds number was significantly lower than that of the perfect-gas conditions.

  12. Results of Experimental Study on Flexitime and Family Life.

    ERIC Educational Resources Information Center

    Winett, Richard A.; Neale, Michael S.

    1980-01-01

    According to two small experimental studies of flexible working hours, federal workers with young children choose to arrive at and depart from work earlier, allowing them to increase the time spent with their families and to engage in recreational, educational, and household activities. (Author/SK)

  13. Director gliding in a nematic liquid crystal layer: Quantitative comparison with experiments

    NASA Astrophysics Data System (ADS)

    Mema, E.; Kondic, L.; Cummings, L. J.

    2018-03-01

    The interaction between nematic liquid crystals and polymer-coated substrates may lead to slow reorientation of the easy axis (so-called "director gliding") when a prolonged external field is applied. We consider the experimental evidence of zenithal gliding observed by Joly et al. [Phys. Rev. E 70, 050701 (2004), 10.1103/PhysRevE.70.050701] and Buluy et al. [J. Soc. Inf. Disp. 14, 603 (2006), 10.1889/1.2235686] as well as azimuthal gliding observed by S. Faetti and P. Marianelli [Liq. Cryst. 33, 327 (2006), 10.1080/02678290500512227], and we present a simple, physically motivated model that captures the slow dynamics of gliding, both in the presence of an electric field and after the electric field is turned off. We make a quantitative comparison of our model results and the experimental data and conclude that our model explains the gliding evolution very well.

  14. Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan

    A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and the physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives, for the variance of the buffer-subtracted SAXS intensity, σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q, and k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
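
    The stated error model is easy to apply directly. A minimal sketch evaluating σ²(q) = [I(q) + const.]/(kq) on a synthetic profile (the values of k and const. below are placeholders, not fitted parameters from the paper):

```python
import numpy as np

def saxs_variance(q, intensity, k, const):
    """Variance of the buffer-subtracted SAXS intensity,
    sigma^2(q) = [I(q) + const] / (k * q), with setup-specific
    fitting parameters k and const."""
    q = np.asarray(q, dtype=float)
    return (np.asarray(intensity, dtype=float) + const) / (k * q)

# synthetic Guinier-like intensity profile, illustrative values only
q = np.linspace(0.01, 0.5, 50)              # momentum transfer
I = 1e3 * np.exp(-(q * 20.0) ** 2 / 3.0)    # decaying scattering intensity
sigma = np.sqrt(saxs_variance(q, I, k=5e4, const=10.0))
```

    Attaching such σ(q) values to simulated profiles is exactly the "realistic measurement errors" use case the abstract describes.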

  15. Development of a relational database to capture and merge clinical history with the quantitative results of radionuclide renography.

    PubMed

    Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T

    2012-12-01

    Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). We have developed a relational database system to integrate the quantitative results of MAG3 image

  16. VX Hydrolysis by Human Serum Paraoxonase 1: A Comparison of Experimental and Computational Results

    PubMed Central

    Peterson, Matthew W.; Fairchild, Steven Z.; Otto, Tamara C.; Mohtashemi, Mojdeh; Cerasoli, Douglas M.; Chang, Wenling E.

    2011-01-01

    Human serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals, including the nerve agent VX. While wild-type HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorus hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism for VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VXts) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VXts complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations which have the attacking hydroxyl group of VXts coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results. The results from this study can be used for further characterization of how HuPON1 hydrolyzes VX and for the design of HuPON1 variants with increased activity against VX. PMID:21655255

  17. Progress in Quantitative Viral Load Testing: Variability and Impact of the WHO Quantitative International Standards

    PubMed Central

    Sun, Y.; Tang, L.; Procop, G. W.; Hillyard, D. R.; Young, S. A.; Caliendo, A. M.

    2016-01-01

    ABSTRACT It has been hoped that the recent availability of WHO quantitative standards would improve interlaboratory agreement for viral load testing; however, insufficient data are available to evaluate whether this has been the case. Results from 554 laboratories participating in proficiency testing surveys for quantitative PCR assays of cytomegalovirus (CMV), Epstein-Barr virus (EBV), BK virus (BKV), adenovirus (ADV), and human herpesvirus 6 (HHV6) were evaluated to determine overall result variability and then were stratified by assay manufacturer. The impact of calibration to international units/ml (CMV and EBV) on variability was also determined. Viral loads showed a high degree of interlaboratory variability for all tested viruses, with interquartile ranges as high as 1.46 log10 copies/ml and the overall range for a given sample up to 5.66 log10 copies/ml. Some improvement in result variability was seen when international units were adopted. This was particularly the case for EBV viral load results. Variability in viral load results remains a challenge across all viruses tested here; introduction of international quantitative standards may help reduce variability and does so more or less markedly for certain viruses. PMID:27852673

  18. Experimental macroevolution†

    PubMed Central

    Bell, Graham

    2016-01-01

    The convergence of several disparate research programmes raises the possibility that the long-term evolutionary processes of innovation and radiation may become amenable to laboratory experimentation. Ancestors might be resurrected directly from naturally stored propagules or tissues, or indirectly from the expression of ancestral genes in contemporary genomes. New kinds of organisms might be evolved through artificial selection of major developmental genes. Adaptive radiation can be studied by mimicking major ecological transitions in the laboratory. All of these possibilities are subject to severe quantitative and qualitative limitations. In some cases, however, laboratory experiments may be capable of illuminating the processes responsible for the evolution of new kinds of organisms. PMID:26763705

  19. [A novel approach to NIR spectral quantitative analysis: semi-supervised least-squares support vector regression machine].

    PubMed

    Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da

    2011-10-01

    In near-infrared spectral quantitative analysis, the precision of the measured chemical values of the samples sets the theoretical limit on the precision attainable by quantitative analysis with mathematical models. However, only a few samples can have their chemical values measured accurately. Many models exclude samples without chemical values and consider only those with chemical values when modeling the contents of sample compositions. To address this problem, a semi-supervised LS-SVR (S2 LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with LS-SVR, training this model is equivalent to solving a linear system. Finally, samples of flue-cured tobacco were taken as experimental material, and corresponding quantitative analysis models were constructed for the contents of four sample compositions (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S2 LS-SVR. For the S2 LS-SVR model, the average relative errors between actual and predicted values for the four contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show that the S2 LS-SVR model outperforms the other two, which verifies its feasibility and efficiency.
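
    As the abstract notes, training an LS-SVR reduces to solving one linear system. A minimal sketch of the supervised core (a Suykens-style LS-SVR; the semi-supervised extension with unlabeled samples, and all data values below, are illustrative assumptions rather than the paper's implementation):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def train_lssvr(X, y, C=100.0, gamma=1.0):
    """LS-SVR training: solve the (n+1)x(n+1) linear system
    [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                 # bias b, dual weights alpha

def predict_lssvr(X_train, b, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b

# toy 1-D regression standing in for spectra -> composition content
X = np.linspace(0.0, 3.0, 25).reshape(-1, 1)
y = np.sin(X).ravel()
b, alpha = train_lssvr(X, y)
y_hat = predict_lssvr(X, b, alpha, X)
```

    The semi-supervised variant augments this system with terms from the unlabeled spectra, but the one-shot linear solve is what makes both models cheap to train.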

  20. How measurement science can improve confidence in research results.

    PubMed

    Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T

    2018-04-01

    The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.

  1. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
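
    Echo intensity in the study is obtained by computer-assisted grayscale analysis. A minimal sketch of that idea (mean 8-bit gray level inside a muscle region of interest; the synthetic frame below is an assumption for illustration, not study data):

```python
import numpy as np

def echo_intensity(image, mask):
    """Mean grayscale echo intensity (0-255) over the pixels of a
    muscle region of interest given by a boolean mask."""
    return float(image[mask].mean())

# synthetic 8-bit ultrasound frame with a brighter rectangular "muscle"
rng = np.random.default_rng(1)
frame = rng.integers(20, 60, size=(100, 100)).astype(np.uint8)
frame[30:60, 40:80] = rng.integers(90, 140, size=(30, 40))
roi = np.zeros((100, 100), dtype=bool)
roi[30:60, 40:80] = True
ei = echo_intensity(frame, roi)
```

    The same ROI can also yield the thickness and cross-sectional area measures the study pairs with EI.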

  2. The Next Frontier: Quantitative Biochemistry in Living Cells.

    PubMed

    Honigmann, Alf; Nadler, André

    2018-01-09

    Researchers striving to convert biology into an exact science foremost rely on structural biology and biochemical reconstitution approaches to obtain quantitative data. However, cell biological research is moving at an ever-accelerating speed into areas where these approaches lose much of their edge. Intrinsically unstructured proteins and biochemical interaction networks composed of interchangeable, multivalent, and unspecific interactions pose unique challenges to quantitative biology, as do processes that occur in discrete cellular microenvironments. Here we argue that a conceptual change in our way of conducting biochemical experiments is required to take on these new challenges. We propose that reconstitution of cellular processes in vitro should be much more focused on mimicking the cellular environment in vivo, an approach that requires detailed knowledge of the material properties of cellular compartments, essentially requiring a material science of the cell. In a similar vein, we suggest that quantitative biochemical experiments in vitro should be accompanied by corresponding experiments in vivo, as many newly relevant cellular processes are highly context-dependent. In essence, this constitutes a call for chemical biologists to convert their discipline from a proof-of-principle science to an area that could rightfully be called quantitative biochemistry in living cells. In this essay, we discuss novel techniques and experimental strategies with regard to their potential to fulfill such ambitious aims.

  3. [Navigated drilling for femoral head necrosis. Experimental and clinical results].

    PubMed

    Beckmann, J; Tingart, M; Perlick, L; Lüring, C; Grifka, J; Anders, S

    2007-05-01

    In the early stages of osteonecrosis of the femoral head, core decompression by exact drilling into the ischemic areas can reduce pain and achieve reperfusion. Using computer-aided surgery, the precision of the drilling can be improved while simultaneously lowering radiation exposure time for both staff and patients. We describe the experimental and clinical results of drilling under the guidance of the fluoroscopically based VectorVision navigation system (BrainLAB, Munich, Germany). A total of 70 sawbones were prepared mimicking an osteonecrosis of the femoral head. In two experimental models, bone only and obesity, as well as in a clinical setting involving ten patients with osteonecrosis of the femoral head, the precision and the duration of radiation exposure were compared between the VectorVision system and conventional drilling. No target was missed. For both models, there was a statistically significant difference in precision, the number of drilling corrections, and the radiation exposure time. The average distance to the desired midpoint of the lesion in both models was 0.48 mm for navigated drilling and 1.06 mm for conventional drilling, the average numbers of drilling corrections were 0.175 and 2.1, and the radiation exposure times were less than 1 s and 3.6 s, respectively. In the clinical setting, the reduction in radiation exposure (below 1 s for navigation compared to 56 s for the conventional technique) as well as in drilling corrections (0.2 compared to 3.4) was also significant. Computer-guided drilling using the fluoroscopically based VectorVision navigation system shows clearly improved precision with an enormous simultaneous reduction in radiation exposure. It is therefore recommended for clinical routine.

  4. Guidelines for Initiating a Research Agenda: Research Design and Dissemination of Results.

    PubMed

    Delost, Maria E; Nadder, Teresa S

    2014-01-01

    Successful research outcomes require selection and implementation of the appropriate research design. A realistic sampling plan appropriate for the design is essential. Qualitative or quantitative methodology may be utilized, depending on the research question and goals. Quantitative research may be experimental where there is an intervention, or nonexperimental, if no intervention is included in the design. Causation can only be established with experimental research. Popular types of nonexperimental research include descriptive and survey research. Research findings may be disseminated via presentations, posters, and publications, such as abstracts and manuscripts.

  5. Reflectivity of 1D photonic crystals: A comparison of computational schemes with experimental results

    NASA Astrophysics Data System (ADS)

    Pérez-Huerta, J. S.; Ariza-Flores, D.; Castro-García, R.; Mochán, W. L.; Ortiz, G. P.; Agarwal, V.

    2018-04-01

    We report the reflectivity of one-dimensional finite and semi-infinite photonic crystals, computed through the coupling to Bloch modes (BM) and through a transfer matrix method (TMM), and their comparison to the experimental spectral line shapes of porous silicon (PS) multilayer structures. Both methods reproduce a forbidden photonic bandgap (PBG), but slowly converging oscillations are observed in the TMM as the number of layers increases to infinity, while a smooth, converged behavior is obtained with BM. The experimental reflectivity spectra are in good agreement with the TMM results for multilayer structures with a small number of periods. However, for structures with a large number of periods, the measured spectral line shapes exhibit better agreement with the smooth behavior predicted by BM.
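
    The TMM side of such a comparison is compact to implement. A minimal normal-incidence sketch using the standard characteristic-matrix formulation (textbook form; the indices and thicknesses below are placeholder quarter-wave values, not the paper's porous-silicon parameters):

```python
import numpy as np

def tmm_reflectivity(n_layers, d_layers, wavelength, n_in=1.0, n_out=1.0):
    """Normal-incidence reflectivity of a 1D multilayer from the
    product of per-layer characteristic matrices."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength      # phase thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B = M[0, 0] + M[0, 1] * n_out                   # field amplitudes at entry
    C = M[1, 0] + M[1, 1] * n_out
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# quarter-wave stack at 1500 nm: reflectivity deepens with more periods
nH, nL, lam0 = 2.0, 1.4, 1500.0
stack = [nH, nL] * 10
thick = [lam0 / (4 * nH), lam0 / (4 * nL)] * 10
R = tmm_reflectivity(stack, thick, lam0)            # near-unity in the PBG
```

    Sweeping `wavelength` over a range traces out the bandgap and the finite-size oscillations the paper contrasts with the Bloch-mode result.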

  6. Experimental and computational flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1989-01-01

    A comprehensive test program is defined which is being implemented in the NASA/Ames 3.5 foot Hypersonic Wind Tunnel for obtaining data on a generic all-body hypersonic vehicle for computational fluid dynamics (CFD) code validation. Computational methods (approximate inviscid methods and an upwind parabolized Navier-Stokes code) currently being applied to the all-body model are outlined. Experimental and computational results on surface pressure distributions and Pitot-pressure surveys for the basic sharp-nose model (without control surfaces) at a free-stream Mach number of 7 are presented.

  7. Experimental light scattering by small particles: first results with a novel Mueller matrix scatterometer

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Maconi, Göran; Kassamakov, Ivan; Gritsevich, Maria; Helander, Petteri; Puranen, Tuomas; Hæggström, Edward; Muinonen, Karri

    2017-06-01

    We describe a setup for measuring the full angular Mueller matrix profile of a single mm- to μm-sized sample, and verify the experimental results against a theoretical model. The scatterometer has a fixed or levitating sample, illuminated with a laser beam whose full polarization state is controlled. The scattered light is detected with a combination of wave retarder, linear polarizer, and photomultiplier tube that is attached to a rotational stage. The first results are reported.

  8. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  9. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs, better informing clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
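
    The decision rule at the heart of the RV-NNT comparison can be sketched in a few lines. This is a minimal illustration with hypothetical event rates and a hypothetical relative-value weight; the exact adjustment convention used in the paper may differ:

```python
# Hedged sketch of an RV-NNT-style decision rule (all numbers hypothetical).
# NNT = 1 / absolute benefit increase; NNH = 1 / absolute harm increase.
# The relative value rv weights how patients trade the adverse event
# against the benefit (adjustment convention assumed here, not from the paper).
benefit_rate_drug, benefit_rate_ctrl = 0.40, 0.20
harm_rate_drug, harm_rate_ctrl = 0.08, 0.03
rv = 0.5  # hypothetical patient-derived utility weight

nnt = 1.0 / (benefit_rate_drug - benefit_rate_ctrl)      # patients treated per extra responder
nnh = 1.0 / (harm_rate_drug - harm_rate_ctrl)            # patients treated per extra harm
rv_nnh = nnh / rv                                        # relative-value adjusted NNH

# Treatment is favored when NNT < RV-NNH.
treat = nnt < rv_nnh
```

With these illustrative rates, NNT = 5 and RV-NNH = 40, so the rule favors treatment.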

  10. Experimental and Numerical Modeling of Fluid Flow Processes in Continuous Casting: Results from the LIMMCAST-Project

    NASA Astrophysics Data System (ADS)

    Timmel, K.; Kratzsch, C.; Asad, A.; Schurmann, D.; Schwarze, R.; Eckert, S.

    2017-07-01

    The present paper reports on numerical simulations and model experiments concerned with the fluid flow in the continuous casting process of steel. This work was carried out in the LIMMCAST project in the framework of the Helmholtz alliance LIMTECH. A brief description of the LIMMCAST facilities used for the experimental modeling at HZDR is given here. Ultrasonic and inductive techniques and X-ray radioscopy were employed for flow measurements or visualizations of two-phase flow regimes occurring in the submerged entry nozzle and the mold. Corresponding numerical simulations were performed at TUBAF taking into account the dimensions and properties of the model experiments. Numerical models were successfully validated using the experimental database. The reasonable and in many cases excellent agreement of numerical with experimental data allows the models to be extrapolated to real casting configurations. Exemplary results are presented showing the effect of electromagnetic brakes or electromagnetic stirrers on the flow in the mold, and illustrating the properties of two-phase flows resulting from an Ar injection through the stopper rod.

  11. RFI in hybrid loops - Simulation and experimental results.

    NASA Technical Reports Server (NTRS)

    Ziemer, R. E.; Nelson, D. R.; Raghavan, H. R.

    1972-01-01

    A digital simulation of an imperfect second-order hybrid phase-locked loop (HPLL) operating in radio frequency interference (RFI) is described. Its performance is characterized in terms of phase error variance and phase error probability density function (PDF). Monte Carlo simulation is used to show that the HPLL can be superior to conventional phase-locked loops in RFI backgrounds when minimum phase error variance is the goodness criterion. Similar experimentally obtained data are given in support of the simulation data.

  12. Boston Community Information System 1986 Experimental Test Results.

    DTIC Science & Technology

    1987-08-01

    self-selected participants have a strong technical orientation and high educational achievement. In addition, five visually impaired people use the...group. The experimental test of the system was performed on a self-selected population of computer-literate volunteers. In order to simplify the test...for fast response.' - 1041 'I haven't used it yet.' - 1046 'No modem yet. New version installed 11/2/86.' - 1047 'Not yet tried. Will do so soon.' - 1061

  13. Insulator-based dielectrophoresis of microorganisms: theoretical and experimental results.

    PubMed

    Moncada-Hernandez, Hector; Baylon-Cardiel, Javier L; Pérez-González, Victor H; Lapizco-Encinas, Blanca H

    2011-09-01

    Dielectrophoresis (DEP) is the motion of particles due to polarization effects in nonuniform electric fields. DEP has great potential for handling cells and is a non-destructive phenomenon. It has been utilized for different cell analyses, from viability assessments to concentration enrichment and separation. Insulator-based DEP (iDEP) provides an attractive alternative to conventional electrode-based systems; in iDEP, insulating structures are used to generate nonuniform electric fields, resulting in simpler and more robust devices. Despite the rapid development of iDEP microdevices for applications with cells, the fundamentals behind the dielectrophoretic behavior of cells have not been fully elucidated. Understanding the theory behind iDEP is necessary to continue the progress in this field. This work presents the manipulation and separation of bacterial and yeast cells with iDEP. A computational model in COMSOL Multiphysics was employed to predict the effect of direct current-iDEP on cells suspended in a microchannel containing an array of insulating structures. The model allowed prediction of particle behavior, pathlines, and the regions where dielectrophoretic immobilization should occur. Experimental work was performed at the same operating conditions employed with the model and the results were compared, obtaining good agreement. This is the first report on the mathematical modeling of the dielectrophoretic response of yeast and bacterial cells in a DC-iDEP microdevice. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Static properties of ferromagnetic quantum chains: Numerical results and experimental data on two S=1/2 systems (invited)

    NASA Astrophysics Data System (ADS)

    Kopinga, K.; Delica, T.; Leschke, H.

    1990-05-01

    New results of a variant of the numerically exact quantum transfer matrix method have been compared with experimental data on the static properties of [C6H11NH3]CuBr3(CHAB), a ferromagnetic system with about 5% easy-plane anisotropy. Above T=3.5 K, the available data on the zero-field heat capacity, the excess heat capacity ΔC=C(B)-C(B=0), and the magnetization are described with an accuracy comparable to the experimental error. Calculations of the spin-spin correlation functions reveal that the good description of the experimental correlation length in CHAB by a classical spin model is largely accidental. The zero-field susceptibility, which can be deduced from these correlation functions, is in fair agreement with the reported experimental data between 4 and 100 K. The method also seems to yield accurate results for the chlorine isomorph, CHAC, a system with about 2% uniaxial anisotropy.

  15. Quantitative measurement of the near-field enhancement of nanostructures by two-photon polymerization.

    PubMed

    Geldhauser, Tobias; Kolloch, Andreas; Murazawa, Naoki; Ueno, Kosei; Boneberg, Johannes; Leiderer, Paul; Scheer, Elke; Misawa, Hiroaki

    2012-06-19

    The quantitative determination of the strength of the near-field enhancement in and around nanostructures is essential for optimizing and using these structures for applications. We combine the Gaussian intensity distribution of a laser profile with two-photon polymerization of SU-8 to obtain a suitable tool for the quantitative experimental measurement of the near-field enhancement of a nanostructure. Our results provide feedback on the results obtained by finite-difference time-domain (FDTD) simulations. The structures under investigation are gold nanotriangles on a glass substrate with 85 nm side length and a thickness of 40 nm. We compare the threshold fluence for polymerization for areas of the Gaussian intensity profile with and without the near-field enhancement of the nanostructures. The experimentally obtained value of the near-field intensity enhancement is 600 ± 140, independent of the laser power, irradiation time, and spot size. The FDTD simulation shows a pointlike maximum of 2600 at the tip. In a more extended area with an approximate size close to the smallest polymerized structure of 25 nm in diameter, we find a value between 600 and 800. Using our novel approach, we determine the threshold fluence for polymerization of the commercially available photopolymerizable resin SU-8 by a femtosecond laser working at a wavelength of 795 nm and a repetition rate of 82 MHz to be 0.25 J/cm(2), almost independent of the irradiation time and the laser power used. This finding is important for future applications of the method because it enables one to use varying laser systems.
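
    The fluence-ratio logic of the measurement can be sketched as follows. The peak fluence, beam waist, and triangle position below are illustrative assumptions, and the linear threshold comparison is a simplification of the paper's analysis:

```python
import numpy as np

# Hedged sketch of the threshold-fluence comparison (illustrative numbers).
# Polymerization at a site means the local fluence there just reached the
# threshold; comparing the free-resin threshold with the much lower incident
# fluence that suffices at the nanostructure tips gives an estimate of the
# near-field intensity enhancement (simplified, linear-threshold view).
F_th = 0.25  # J/cm^2, SU-8 polymerization threshold reported in the paper

def gaussian_fluence(f_peak, r, w):
    """Fluence of a Gaussian spot of waist w at radius r."""
    return f_peak * np.exp(-2.0 * r**2 / w**2)

# Hypothetical: triangles sitting at r = 0.3 um inside a 1 um-waist spot
# polymerize at an incident peak fluence of 5e-4 J/cm^2.
F_at_site = gaussian_fluence(5.0e-4, 0.3e-6, 1.0e-6)
enhancement = F_th / F_at_site  # order of magnitude ~600 with these inputs
```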

  16. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even at the component design stage, so that inspectability can be verified virtually.

  17. A Quantitative Review of Ethnic Group Differences in Experimental Pain Response: Do Biology, Psychology and Culture Matter?

    PubMed Central

    Riley, Joseph L.; Williams, Ameenah K.K.; Fillingim, Roger B.

    2012-01-01

    Objective Pain is a subjectively complex and universal experience. We examine research investigating ethnic group differences in experimental pain response, and factors contributing to group differences. Method We conducted a systematic literature review and analysis of studies using experimental pain stimuli to assess pain sensitivity across multiple ethnic groups. Our search covered the period from 1944-2011, and utilized the PUBMED bibliographic database, a reference source containing over 17 million citations. We calculated effect sizes, identified ethnic/racial group categories, pain stimuli and measures, and examined findings regarding biopsychosociocultural factors contributing to ethnic/racial group differences. Results We found 472 studies investigating ethnic group differences and pain. Twenty-six of these met our review inclusion criteria of investigating ethnic group differences in experimental pain. The majority of studies included comparisons between African Americans (AA) and non-Hispanic Whites (NHW). There were consistently moderate to large effect sizes for pain tolerance across multiple stimulus modalities; African Americans demonstrated lower pain tolerance. For pain threshold, findings were generally in the same direction, but effect sizes were small to moderate across ethnic groups. Limited data were available for suprathreshold pain ratings. A subset of studies comparing NHW and other ethnic groups showed a variable range of effect sizes for pain threshold and tolerance. Conclusion There are potentially important ethnic/racial group differences in experimental pain perception. Elucidating ethnic group differences has translational merit for culturally competent clinical care and for addressing and reducing pain treatment disparities among ethnically/racially diverse groups. PMID:22390201

  18. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  19. A critical review of RHIC experimental results

    NASA Astrophysics Data System (ADS)

    Trainor, Thomas A.

    2014-07-01

    The relativistic heavy-ion collider (RHIC) was constructed to achieve an asymptotic state of nuclear matter in heavy-ion collisions, a near-ideal gas of deconfined quarks and gluons denoted quark-gluon plasma or QGP. RHIC collisions are indeed very different from the hadronic processes observed at the Bevalac and AGS, but high-energy elementary-collision mechanisms are also non-hadronic. The two-component model (TCM) combines measured properties of elementary collisions with the Glauber eikonal model to provide an alternative asymptotic limit for A-A collisions. RHIC data have been interpreted to indicate formation of a strongly-coupled QGP (sQGP) or "perfect liquid". In this review, I consider the experimental evidence that seems to support such conclusions and alternative evidence that may conflict with those conclusions and suggest different interpretations.

  20. Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David

    1995-01-01

    The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62), in March 1994, as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application for the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy storage salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper will focus on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).

  1. The Design of a Quantitative Western Blot Experiment

    PubMed Central

    Taylor, Sean C.; Posch, Anton

    2014-01-01

    Western blotting is a technique, in practice for more than three decades, that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
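
    The fold-change arithmetic that quantitative western blotting ultimately reduces to can be sketched with hypothetical band densities (arbitrary units from blot imaging software). Normalization to a loading control follows general practice; it is not the authors' exact workflow:

```python
# Hedged sketch of a densitometric fold-change calculation.
# All densities are hypothetical arbitrary units.
target_ctrl, target_treated = 1200.0, 3000.0      # target-protein band densities
loading_ctrl, loading_treated = 5000.0, 4800.0    # loading-control densities

# Normalize each target band by its lane's loading control,
# then express the treated sample relative to the control.
norm_ctrl = target_ctrl / loading_ctrl
norm_treated = target_treated / loading_treated
fold_change = norm_treated / norm_ctrl            # ~2.6-fold increase here
```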

  2. Direct and Quantitative Characterization of Dynamic Ligand Exchange between Coordination-Driven Self-Assembled Supramolecular Polygons

    PubMed Central

    Zheng, Yao-Rong; Stang, Peter J.

    2009-01-01

    The direct observation of dynamic ligand exchange between Pt-N coordination-driven self-assembled supramolecular polygons (triangles and rectangles) has been achieved using stable isotope labeling (1H/2D) of the pyridyl donors and electrospray ionization mass spectrometry (ESI-MS) together with NMR spectroscopy. Both the thermodynamic and kinetic aspects of such exchange processes have been established based on quantitative mass spectral results. Further investigation showed that the exchange is highly dependent on experimental conditions such as temperature, solvent, and the counter anions. PMID:19243144

  3. Direct and quantitative characterization of dynamic ligand exchange between coordination-driven self-assembled supramolecular polygons.

    PubMed

    Zheng, Yao-Rong; Stang, Peter J

    2009-03-18

    The direct observation of dynamic ligand exchange between Pt-N coordination-driven self-assembled supramolecular polygons (triangles and rectangles) has been achieved using stable (1)H/(2)D isotope labeling of the pyridyl donors and electrospray ionization mass spectrometry combined with NMR spectroscopy. Both the thermodynamic and kinetic aspects of such exchange processes have been established on the basis of quantitative mass spectral results. Further investigation has shown that the exchange is highly dependent on experimental conditions such as temperature, solvent, and the counteranions.

  4. Artifacts, assumptions, and ambiguity: Pitfalls in comparing experimental results to numerical simulations when studying electrical stimulation of the heart.

    PubMed

    Roth, Bradley J.

    2002-09-01

    Insidious experimental artifacts and invalid theoretical assumptions complicate the comparison of numerical predictions and observed data. Such difficulties are particularly troublesome when studying electrical stimulation of the heart. During unipolar stimulation of cardiac tissue, the artifacts include nonlinearity of membrane dyes, optical signals blocked by the stimulating electrode, averaging of optical signals with depth, lateral averaging of optical signals, limitations of the current source, and the use of excitation-contraction uncouplers. The assumptions involve electroporation, membrane models, electrode size, the perfusing bath, incorrect model parameters, the applicability of a continuum model, and tissue damage. Comparisons of theory and experiment during far-field stimulation are limited by many of these same factors, plus artifacts from plunge and epicardial recording electrodes and assumptions about the fiber angle at an insulating boundary. These pitfalls must be overcome in order to understand quantitatively how the heart responds to an electrical stimulus. (c) 2002 American Institute of Physics.

  5. Quantitative Study of Emotional Intelligence and Communication Levels in Information Technology Professionals

    ERIC Educational Resources Information Center

    Hendon, Michalina

    2016-01-01

    This quantitative non-experimental correlational research analyzes the relationship between emotional intelligence and communication, addressing the lack of such research on information technology professionals in the U.S. One hundred and eleven (111) participants completed a survey that measures both the emotional intelligence and communication…

  6. Development of a quantitative PCR assay for monitoring Streptococcus agalactiae colonization and tissue tropism in experimentally infected tilapia.

    PubMed

    Su, Y-L; Feng, J; Li, Y-W; Bai, J-S; Li, A-X

    2016-02-01

    Streptococcus agalactiae has become one of the most important emerging pathogens in the aquaculture industry and has resulted in large economic losses for tilapia farms in China. In this study, three pairs of specific primers were designed and tested for their specificities and sensitivities in quantitative real-time polymerase chain reactions (qPCRs) after optimization of the annealing temperature. The primer pair IGS-s/IGS-a, which targets the 16S-23S rRNA intergenic spacer region, was finally chosen, having a detection limit of 8.6 copies of S. agalactiae DNA in a 20 μL reaction mixture. Bacterial tissue tropism was demonstrated by qPCR in Oreochromis niloticus 5 days post-injection with a virulent S. agalactiae strain. Bacterial loads were detected at the highest level in brain, followed by moderately high levels in kidney, heart, spleen, intestines, and eye. Significantly lower bacterial loads were observed in muscle, gill and liver. In addition, significantly lower bacterial loads were observed in the brain of convalescent O. niloticus 14 days post-injection with several different S. agalactiae strains. The qPCR for the detection of S. agalactiae developed in this study provides a quantitative tool for investigating bacterial tissue tropism in infected fish, as well as for monitoring bacterial colonization in convalescent fish. © 2015 John Wiley & Sons Ltd.
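
    Copy-number quantification of this kind rests on a log-linear standard curve. A minimal sketch with hypothetical Ct values follows; the real IGS-s/IGS-a curve parameters are not given in the abstract:

```python
import numpy as np

# qPCR standard curve: Ct is linear in log10(template copies),
# Ct = slope * log10(N) + intercept. Values below are illustrative.
log_copies = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # serial dilutions
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])     # hypothetical Ct values

slope, intercept = np.polyfit(log_copies, ct, 1)

# Amplification efficiency: 1.0 means perfect doubling per cycle.
efficiency = 10.0 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct_sample):
    """Interpolate an unknown sample's copy number from its Ct."""
    return 10.0 ** ((ct_sample - intercept) / slope)
```

A sample with Ct near 26.4 maps back to roughly 10^3 copies on this hypothetical curve.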

  7. Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.

    PubMed

    Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao

    2018-04-02

    Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in the simulation study. The crack profile and position are identified in the thermal image based on the Canny edge detection algorithm. Then, one or more trajectories are determined through the crack profile in order to determine the crack boundary through its temperature distribution. The slope curve along the trajectory is obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. The experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
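
    The slope-curve sizing idea can be illustrated with synthetic data: across a crack, the surface temperature profile is a step blurred at each boundary by lateral heat diffusion, and the extrema of its spatial derivative mark the crack edges. Everything below (crack width, blur scale) is a made-up 1-D toy, not the authors' model:

```python
import numpy as np

# Synthetic temperature trajectory across a 2 mm crack. The crack
# region runs hotter; edge smearing from heat diffusion is modelled
# here with a simple Gaussian blur (an assumption for illustration).
x = np.linspace(-5e-3, 5e-3, 2001)             # position along trajectory, m
true_width = 2e-3                               # assumed crack width, m
profile = np.where(np.abs(x) < true_width / 2, 1.0, 0.0)

kernel = np.exp(-0.5 * (x / 0.2e-3) ** 2)       # 0.2 mm blur scale (assumed)
kernel /= kernel.sum()
temp = np.convolve(profile, kernel, mode="same")

# Slope curve: crack boundaries sit at the extrema of dT/dx.
slope = np.gradient(temp, x)
left = x[np.argmax(slope)]                      # rising edge
right = x[np.argmin(slope)]                     # falling edge
width = right - left                            # recovered crack width
```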

  8. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  9. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    NASA Astrophysics Data System (ADS)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermographic depth estimation. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the thermal response with a suitable post-processing approach to resolve subsurface details. Conventional Fourier-transform-based post-processing, however, unscrambles the frequencies with limited frequency resolution and therefore yields only finite depth resolution. The spectral zooming provided by the chirp z transform offers enhanced frequency resolution, which further improves the depth resolution for axially exploring the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest possible estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
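
    The spectral-zooming idea can be illustrated without the full chirp z machinery by evaluating the DFT directly on a fine grid inside a narrow band (the CZT computes the same narrow-band spectrum efficiently). The signal below is illustrative: two tones that a 1 Hz-bin FFT would report only to the nearest integer bin are located on a millihertz grid:

```python
import numpy as np

# Two tones in a 1 s record: a plain 1000-point FFT has 1 Hz bins, so
# the true frequencies (100.3 Hz and 104.6 Hz, illustrative) would be
# quantized to the nearest integer bin.
fs, n = 1000.0, 1000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 100.3 * t) + np.sin(2 * np.pi * 104.6 * t)

# Zoomed spectrum: brute-force DFT on a 1 mHz grid over 99-106 Hz only.
# This dense narrow-band evaluation is what the chirp z transform
# delivers at FFT-like cost.
f_zoom = np.linspace(99.0, 106.0, 7001)
dft = np.exp(-2j * np.pi * np.outer(f_zoom, t)) @ x
mag = np.abs(dft)

# Pick the peak in each half of the zoom band.
lo_band = f_zoom < 102.5
f1 = f_zoom[lo_band][np.argmax(mag[lo_band])]
f2 = f_zoom[~lo_band][np.argmax(mag[~lo_band])]
```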

  10. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  11. Experimental Demonstration of Frequency Regulation by Commercial Buildings – Part II: Results and Performance Evaluation

    DOE PAGES

    Vrettos, Evangelos; Kara, Emre Can; MacDonald, Jason; ...

    2016-11-15

    This paper is the second part of a two-part series presenting the results from an experimental demonstration of frequency regulation in a commercial building test facility. We developed relevant building models and designed a hierarchical controller for reserve scheduling, building climate control and frequency regulation in Part I. In Part II, we introduce the communication architecture and experiment settings, and present extensive experimental results under frequency regulation. More specifically, we compute the day-ahead reserve capacity of the test facility under different assumptions and conditions. Furthermore, we demonstrate the ability of model predictive control to satisfy comfort constraints under frequency regulation, and show that fan speed control can track the fast-moving RegD signal of the Pennsylvania-New Jersey-Maryland (PJM) power market very accurately. In addition, we discuss potential effects of frequency regulation on building operation (e.g., increase in energy consumption, oscillations in supply air temperature, and effect on chiller cycling), and provide suggestions for real-world implementation projects. Our results show that hierarchical control is appropriate for frequency regulation from commercial buildings.

  12. Classical experiments revisited: smartphones and tablet PCs as experimental tools in acoustics and optics

    NASA Astrophysics Data System (ADS)

    Klein, P.; Hirth, M.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-07-01

    Smartphones and tablets are used as experimental tools and for quantitative measurements in two traditional laboratory experiments for undergraduate physics courses. The Doppler effect is analyzed and the speed of sound is determined with an accuracy of about 5% using ultrasonic frequency and two smartphones, which serve as rotating sound emitter and stationary sound detector. Emphasis is put on the investigation of measurement errors in order to judge experimentally derived results and to sensitize undergraduate students to the methods of error estimates. The distance dependence of the illuminance of a light bulb is investigated using an ambient light sensor of a mobile device. Satisfactory results indicate that the spectrum of possible smartphone experiments goes well beyond those already published for mechanics.
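
    The rotating-emitter geometry admits a clean closed form: in the idealized case of a source circling with tangential speed v and a distant detector in the plane of rotation, the observed frequency sweeps between f± = f0·c/(c ∓ v), so c = v(f+ + f-)/(f+ - f-). A minimal sketch with illustrative numbers (the paper's actual frequencies and rotation speed are not restated in the abstract):

```python
# Idealized rotating-source Doppler sketch (illustrative numbers).
# Extreme observed frequencies occur when the source moves directly
# toward or away from the detector:
#   f_max = f0 * c / (c - v),   f_min = f0 * c / (c + v)
f0 = 20000.0      # Hz, ultrasonic emitter frequency (assumed)
c_true = 343.0    # m/s, speed of sound we hope to recover
v = 4.0           # m/s, tangential speed of the rotating phone (assumed)

f_max = f0 * c_true / (c_true - v)
f_min = f0 * c_true / (c_true + v)

# Inverting the pair: (f_max + f_min) / (f_max - f_min) = c / v
c_est = v * (f_max + f_min) / (f_max - f_min)
```

In an actual measurement f_max and f_min come from the smartphone's spectrum; error in v then propagates linearly into c_est, which is one reason the authors emphasize error estimates.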

  13. Determination of quantitative trait variants by concordance via application of the a posteriori granddaughter design to the U.S. Holstein population

    USDA-ARS?s Scientific Manuscript database

    Experimental designs that exploit family information can provide substantial predictive power in quantitative trait variant discovery projects. Concordance between quantitative trait locus genotype as determined by the a posteriori granddaughter design and marker genotype was determined for 29 trai...

  14. Hardening of particle/oil/water suspensions due to capillary bridges: Experimental yield stress and theoretical interpretation.

    PubMed

    Danov, Krassimir D; Georgiev, Mihail T; Kralchevsky, Peter A; Radulova, Gergana M; Gurkov, Theodor D; Stoyanov, Simeon D; Pelan, Eddie G

    2018-01-01

    Suspensions of colloid particles possess the remarkable property of solidifying upon the addition of a minimal amount of a second liquid that preferentially wets the particles. The hardening is due to the formation of capillary bridges (pendular rings), which connect the particles. Here, we review works on the mechanical properties of such suspensions and related works on the capillary-bridge force, and present new rheological data for the weakly studied concentration range of 30-55 vol% particles. The mechanical strength of the solidified capillary suspensions, characterized by the yield stress Y, is measured at the elastic limit for various volume fractions of the particles and the preferentially wetting liquid. A quantitative theoretical model is developed, which relates Y to the maximum of the capillary-bridge force projected on the shear plane. A semi-empirical expression for the mean number of capillary bridges per particle is proposed. The model agrees very well with the experimental data and gives a quantitative description of the yield stress, which increases with rising interfacial tension and with the volume fractions of particles and capillary bridges, but decreases with rising particle radius and contact angle. The quantitative description of the capillary force is based on the exact theory and numerical calculation of the capillary-bridge profile at various bridge volumes and contact angles. An analytical formula for Y is also derived. The comparison of the theoretical and experimental strain at the elastic limit reveals that fluidization of the capillary suspension takes place only in a deformation zone of thickness up to several hundred particle diameters, adjacent to the rheometer's mobile plate. The reported experimental results refer to water-continuous suspensions with hydrophobic particles and oily capillary bridges. The comparison of data for bridges from soybean oil and hexadecane surprisingly indicates that the yield strength is
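    The qualitative trends reported above (Y rising with interfacial tension and volume fraction, falling with particle radius and contact angle) can be illustrated with a generic Rumpf-type capillary scaling, Y ~ k·φ·γ·cos(θ)/R. This is a back-of-envelope sketch, not the paper's exact model; the prefactor k and all numerical values below are assumed for illustration only.

```python
import math

def yield_stress_estimate(phi, gamma, theta_deg, radius, k=6.0):
    """Rumpf-type scaling for the yield stress of a capillary suspension:
    Y ~ k * phi * gamma * cos(theta) / R.
    k lumps the mean number of bridges per particle and geometric factors
    (an assumed constant here, not the paper's semi-empirical expression)."""
    return k * phi * gamma * math.cos(math.radians(theta_deg)) / radius

# Invented example: 40 vol% hydrophobic particles (R = 1 micron) in water,
# oily bridges with gamma ~ 30 mN/m, contact angle 60 degrees.
Y = yield_stress_estimate(phi=0.40, gamma=0.030, theta_deg=60.0, radius=1e-6)
```

The scaling reproduces the abstract's monotonic trends: larger γ or φ raises the estimate, larger R or θ lowers it.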

  15. VX hydrolysis by human serum paraoxonase 1: a comparison of experimental and computational results.

    PubMed

    Peterson, Matthew W; Fairchild, Steven Z; Otto, Tamara C; Mohtashemi, Mojdeh; Cerasoli, Douglas M; Chang, Wenling E

    2011-01-01

    Human serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals, including the nerve agent VX. While wild-type HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorus hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism of VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VX(ts)) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VX(ts) complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of the HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations in which the attacking hydroxyl group of VX(ts) is coordinated by the sidechain oxygen of D269 show a significant correlation with experimental results. The results from this study can be used to further characterize how HuPON1 hydrolyzes VX and to design HuPON1 variants with increased activity against VX.
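    The final step described above, testing whether a cluster's mean docking energies track measured variant activities, amounts to a simple correlation analysis. A minimal sketch with invented interaction energies and log-activities (not the study's data):

```python
# Hypothetical illustration: Pearson correlation between mean docking
# interaction energies of a conformational cluster and measured
# log-activities of enzyme variants.  All numbers are invented.
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

energies = [-7.2, -6.8, -6.5, -6.1, -5.9]   # kcal/mol, invented
log_activity = [2.1, 1.8, 1.4, 1.1, 0.7]    # invented

# Stronger (more negative) binding pairing with higher activity
# shows up as a strong negative correlation.
r = pearson_r(energies, log_activity)
```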

  16. Comparison of Experimental Surface and Flow Field Measurements to Computational Results of the Juncture Flow Model

    NASA Technical Reports Server (NTRS)

    Roozeboom, Nettie H.; Lee, Henry C.; Simurda, Laura J.; Zilliac, Gregory G.; Pulliam, Thomas H.

    2016-01-01

    Wing-body juncture flow fields on commercial aircraft configurations are challenging to compute accurately. The NASA Advanced Air Vehicle Program's juncture flow committee is designing an experiment to provide data to improve Computational Fluid Dynamics (CFD) modeling in the juncture flow region. Preliminary design of the model was done using CFD, yet CFD tends to over-predict the separation in the juncture flow region. Risk-reduction wind tunnel tests were requisitioned by the committee to obtain a better understanding of the flow characteristics of the designed models. NASA Ames Research Center's Fluid Mechanics Lab performed one of the risk-reduction tests. The results of one case, accompanied by CFD simulations, are presented in this paper. Experimental results suggest the wall-mounted wind tunnel model produces a thicker boundary layer on the fuselage than the CFD predictions, resulting in a larger wing horseshoe vortex that suppresses the side-of-body separation in the juncture flow region. Compared to the experimental results, CFD predicts that a thinner boundary layer on the fuselage generates a weaker wing horseshoe vortex, resulting in a larger side-of-body separation.

  17. Quantitatively characterizing the microstructural features of breast ductal carcinoma tissues in different progression stages by Mueller matrix microscope.

    PubMed

    Dong, Yang; Qi, Ji; He, Honghui; He, Chao; Liu, Shaoxiong; Wu, Jian; Elson, Daniel S; Ma, Hui

    2017-08-01

    Polarization imaging has been recognized as a potentially powerful technique for probing the microstructural information and optical properties of complex biological specimens. Recently, we have reported a Mueller matrix microscope by adding the polarization state generator and analyzer (PSG and PSA) to a commercial transmission-light microscope, and applied it to differentiate human liver and cervical cancerous tissues with fibrosis. In this paper, we apply the Mueller matrix microscope for quantitative detection of human breast ductal carcinoma samples at different stages. The Mueller matrix polar decomposition and transformation parameters of the breast ductal tissues in different regions and at different stages are calculated and analyzed. For more quantitative comparisons, several widely-used image texture feature parameters are also calculated to characterize the difference in the polarimetric images. The experimental results indicate that the Mueller matrix microscope and the polarization parameters can facilitate the quantitative detection of breast ductal carcinoma tissues at different stages.

  18. Experimental Results of the EU ITER Prototype Gyrotrons

    NASA Astrophysics Data System (ADS)

    Gantenbein, G.; Albajar, F.; Alberti, S.; Avramidis, K.; Bin, W.; Bonicelli, T.; Bruschi, A.; Chelis, J.; Fanale, F.; Legrand, F.; Hermann, V.; Hogge, J.-P.; Illy, S.; Ioannidis, Z. C.; Jin, J.; Jelonnek, J.; Kasparek, W.; Latsas, G. P.; Lechte, C.; Lontano, M.; Pagonakis, I. G.; Rzesnicki, T.; Schlatter, C.; Schmid, M.; Tigelis, I. G.; Thumm, M.; Tran, M. Q.; Vomvoridis, J. L.; Zein, A.; Zisis, A.

    2017-10-01

    The European 1 MW, 170 GHz CW industrial prototype gyrotron for ECRH&CD on ITER was under test at the KIT test facility during 2016. In order to optimize the gyrotron operation, the tube was thoroughly tested in the short-pulse regime, with pulse lengths below 10 ms, for a wide range of operational parameters. The operation was extended to longer pulses with a duration of up to 180 s. In this work we present in detail the achievements and the challenges that were faced during the long-pulse experimental campaign.

  19. Beam dynamics studies at DAΦNE: from ideas to experimental results

    NASA Astrophysics Data System (ADS)

    Zobov, M.; DAΦNE Team

    2017-12-01

    DAΦNE is the electron-positron collider operating at the energy of the Φ-resonance, 1 GeV in the center of mass. The presently achieved luminosity is about two orders of magnitude higher than that obtained at any other collider ever operated at this energy. Careful beam dynamics studies, such as vacuum chamber design with low beam coupling impedance, suppression of different kinds of beam instabilities, investigation of beam-beam interaction, and optimization of the nonlinear beam motion, have been the key ingredients in reaching this impressive result. Many novel ideas in accelerator physics have been proposed and/or tested experimentally at DAΦNE for the first time. In this paper we discuss the advanced accelerator physics studies performed at DAΦNE.

  20. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years applying emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). Although the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge, the quality and depth of the more recent quantitative proteomics studies is beginning to

  1. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years applying emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). Although the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge, the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  2. Integrated experimental and theoretical approach for the structural characterization of Hg2+ aqueous solutions

    NASA Astrophysics Data System (ADS)

    D'Angelo, Paola; Migliorati, Valentina; Mancini, Giordano; Barone, Vincenzo; Chillemi, Giovanni

    2008-02-01

    The structural and dynamic properties of the solvated Hg2+ ion in aqueous solution have been investigated by a combined experimental-theoretical approach employing x-ray absorption spectroscopy and molecular dynamics (MD) simulations. This method allows one to perform a quantitative analysis of the x-ray absorption near-edge structure (XANES) spectra of ionic solutions using a proper description of the thermal and structural fluctuations. XANES spectra have been computed starting from the MD trajectory, without carrying out any minimization in the structural parameter space. The XANES experimental data are accurately reproduced by a first-shell heptacoordinated cluster only if the second hydration shell is included in the calculations. These results confirm at the same time the existence of a sevenfold first hydration shell for the Hg2+ ion in aqueous solution and the reliability of the potentials used in the MD simulations. The combination of MD and XANES is found to be very helpful to get important new insights into the quantitative estimation of structural properties of disordered systems.

  3. Experimental and QSAR study on the surface activities of alkyl imidazoline surfactants

    NASA Astrophysics Data System (ADS)

    Kong, Xiangjun; Qian, Chengduo; Fan, Weiyu; Liang, Zupei

    2018-03-01

    Fifteen alkyl imidazoline surfactants with different structures were synthesized, and their critical micelle concentration (CMC) and surface tension at the CMC (σcmc) in aqueous solution were measured at 298 K. Fifty-four molecular structure descriptors were selected as independent variables, and the quantitative structure-activity relationship (QSAR) between the surface activities of the alkyl imidazolines and their molecular structures was built through the genetic function approximation (GFA) method. Experimental results showed that, with increasing carbon number (NC) of the hydrophobic chain or decreasing hydrophilicity of the counterions, the maximum surface excess of alkyl imidazoline molecules at the gas-liquid interface increased while the area occupied by each surfactant molecule and the free energy of micellization ΔGm decreased, resulting in a decrease of the CMC and σcmc; log CMC and NC showed a negative linear correlation. The GFA-QSAR model, generated by GFA regression analysis from a training set of 13 alkyl imidazolines, gave predicted CMC values highly correlated with the experimental ones, with a correlation coefficient R of 0.9991, indicating high prediction accuracy. The prediction error for the CMCs of the 2 alkyl imidazolines in the validation set, used to quantitatively assess the influence of molecular structure on the CMC, was less than 4%.
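    The linear, negatively correlated log CMC vs. NC relationship reported above is the classic Klevens-type homolog rule, log CMC = A − B·NC. A least-squares sketch on invented, perfectly linear data (not the paper's measurements):

```python
# Klevens-type fit: log CMC = A - B * NC for a homologous surfactant series.
# The data points below are invented and exactly linear for illustration.
from statistics import mean

nc = [10, 12, 14, 16]              # carbon numbers of the hydrophobic chain
log_cmc = [-2.1, -2.7, -3.3, -3.9]  # invented log CMC values

def linfit(xs, ys):
    """Ordinary least-squares line; returns (intercept, slope)."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

a, b = linfit(nc, log_cmc)   # slope b is negative: CMC falls with chain length
```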

  4. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  5. The 2.5 bit/detected photon demonstration program: Phase 2 and 3 experimental results

    NASA Technical Reports Server (NTRS)

    Katz, J.

    1982-01-01

    The experimental program for laboratory demonstration of an energy-efficient optical communication channel operating at a rate of 2.5 bits/detected photon is described. Results of the uncoded PPM channel performance are presented. It is indicated that the throughput efficiency can be achieved not only with a Reed-Solomon code, as originally predicted, but with a less complex code as well.
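    The headline figure of 2.5 bits per detected photon follows from simple PPM accounting: an M-ary PPM pulse carries log2(M) bits, so the uncoded efficiency is log2(M) divided by the mean number of detected photons per pulse. A sketch (the slot count and photon number below are illustrative assumptions, not the paper's exact link budget):

```python
import math

# Back-of-envelope PPM photon efficiency.  An M-ary PPM symbol conveys
# log2(M) bits in a single pulse; with an average of k detected signal
# photons per pulse, the uncoded efficiency is log2(M) / k bits/photon.
def ppm_bits_per_photon(m_slots, photons_per_pulse):
    return math.log2(m_slots) / photons_per_pulse

# Invented operating point: 256-ary PPM at 3.2 detected photons/pulse
# gives 8 / 3.2 = 2.5 bits per detected photon.
eff = ppm_bits_per_photon(m_slots=256, photons_per_pulse=3.2)
```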

  6. Normalization of Reverse Transcription Quantitative PCR Data During Ageing in Distinct Cerebral Structures.

    PubMed

    Bruckert, G; Vivien, D; Docagne, F; Roussel, B D

    2016-04-01

    Reverse transcription quantitative polymerase chain reaction (RT-qPCR) has become a routine method in many laboratories. Normalization of data from experimental conditions is critical for data processing and is usually achieved by the use of a single reference gene. Nevertheless, as pointed out by the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines, several reference genes should be used for reliable normalization. Ageing is a physiological process that results in a decline in the expression of many genes; reliable normalization of RT-qPCR data therefore becomes crucial when studying ageing. Here, we propose an RT-qPCR study of four mouse brain regions (cortex, hippocampus, striatum and cerebellum) at different ages (from 8 weeks to 22 months), in which we studied the expression of nine commonly used reference genes. Using two different algorithms, we found that all brain structures need at least two genes for a good normalization step. We propose specific pairs of genes for efficient data normalization in the four brain regions studied. These results underline the importance of reliable reference genes for specific brain regions in ageing.
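    Normalizing against several reference genes, as recommended above, typically means dividing by their geometric mean; since Cq values are already on a log2 scale, that geometric mean is just the arithmetic mean of the Cq values. A minimal sketch assuming 100% PCR efficiency, with invented Cq values (not the study's data):

```python
# Multi-reference-gene normalization sketch (geNorm-style geometric mean).
# Assumes ideal doubling per cycle (100% PCR efficiency); all Cq values
# below are invented for illustration.
def relative_expression(cq_target, cq_refs):
    """Expression of a target relative to the geometric mean of several
    reference genes.  Averaging Cq values (log2 scale) is equivalent to
    taking the geometric mean of the linear quantities."""
    ref_mean = sum(cq_refs) / len(cq_refs)
    return 2.0 ** (ref_mean - cq_target)

# Invented example: target gene measured against two reference genes.
expr = relative_expression(cq_target=24.0, cq_refs=[20.0, 22.0])
```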

  7. Quantitative Study on Corrosion of Steel Strands Based on Self-Magnetic Flux Leakage.

    PubMed

    Xia, Runchuan; Zhou, Jianting; Zhang, Hong; Liao, Leng; Zhao, Ruiqiang; Zhang, Zeyu

    2018-05-02

    This paper proposes a new computational method to quantitatively and non-destructively determine the corrosion of steel strands by analyzing the self-magnetic flux leakage (SMFL) signals from them. A magnetic dipole model and three growth models (logistic, exponential, and linear) were proposed to theoretically analyze the characteristic value of the SMFL. An experimental study of corrosion detection with a magnetic sensor was then carried out; the setup of the magnetic scanning device and the signal collection method are also introduced. The results show that the logistic growth model is the optimal model for calculating the magnetic field, with good fitting effects. Combined with the experimental data analysis, the amplitudes of the calculated B_xL(x, z) curves generally agree with the measured values. This method provides significant application prospects for evaluating the corrosion and the residual bearing capacity of steel strands.
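    The three candidate growth laws can be written down directly; the abstract reports that the logistic law, which saturates at an upper limit, fit the measured SMFL field best. A sketch with invented parameter values (not the paper's fitted coefficients):

```python
import math

# The three candidate growth laws for the SMFL characteristic value as
# corrosion progresses.  Parameter values used below are invented.
def linear(t, a, b):
    return a + b * t

def exponential(t, a, k):
    return a * math.exp(k * t)

def logistic(t, upper, k, t0):
    """Logistic (S-shaped) growth: saturates at `upper`, unlike the
    linear and exponential laws, which grow without bound."""
    return upper / (1.0 + math.exp(-k * (t - t0)))

# At long times the logistic curve levels off at its upper limit:
late = logistic(1000.0, upper=5.0, k=0.05, t0=50.0)
```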

  8. A novel approach to teach the generation of bioelectrical potentials from a descriptive and quantitative perspective.

    PubMed

    Rodriguez-Falces, Javier

    2013-12-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches aimed at describing extracellular potentials consist of complex mathematical machinery that gives no chance for physical interpretation. The aim of the present study is to present a new method to teach the formation of extracellular potentials around a muscle fiber from both a descriptive and quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source are translated into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.

  9. Second-order sliding mode control with experimental application.

    PubMed

    Eker, Ilyas

    2010-07-01

    In this article, a second-order sliding mode control (2-SMC) is proposed for second-order uncertain plants, using an equivalent control approach to improve the performance of control systems. A proportional + integral + derivative (PID) sliding surface is used for the sliding mode. The sliding mode control law is derived using a direct Lyapunov stability approach, and asymptotic stability is proved theoretically. The performance of the closed-loop system is analysed through an experimental application to an electromechanical plant, to show the feasibility and effectiveness of the proposed second-order sliding mode control and the factors involved in the design. The second-order plant parameters are experimentally determined using measured input-output data. The results of the experimental application are presented to make a quantitative comparison with traditional (first-order) sliding mode control (SMC) and PID control. It is demonstrated that the proposed 2-SMC system improves the performance of the closed-loop system, with better tracking under external disturbances, better behavior of the output and faster convergence of the sliding surface, while maintaining stability.
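    The controller structure named in the abstract, a PID sliding surface combined with an equivalent-control term plus a switching term, can be sketched as follows. This is the generic SMC form, not the article's exact design; the gains, states and switching gain η below are invented:

```python
# Generic sliding mode control sketch: PID sliding surface plus an
# equivalent-control-with-switching law.  All gains and state values
# are invented for illustration.
def pid_surface(e, e_int, e_dot, kp=1.0, ki=0.5, kd=0.2):
    """PID sliding surface s = kp*e + ki*int(e) + kd*de/dt."""
    return kp * e + ki * e_int + kd * e_dot

def smc_law(s, u_eq, eta=2.0):
    """u = u_eq - eta * sign(s): the switching term drives the state
    toward the sliding surface s = 0."""
    sign = (s > 0) - (s < 0)
    return u_eq - eta * sign

# Invented instantaneous tracking-error state:
s = pid_surface(e=0.4, e_int=0.1, e_dot=-0.3)
u = smc_law(s, u_eq=0.0)
```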

  10. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  11. [Quantitative study of the antibacterial effect of cefotaxime and ceftriaxone during experimental Escherichia coli K1 bacteremia in chickens].

    PubMed

    Labarthe, J C; Guillot, J F; Mouline, C; Bree, A

    1989-06-01

    In order to assess the in vivo antibacterial activity of two third-generation cephalosporins, cefotaxime and ceftriaxone, we used the model of experimental bacteremia in chickens that we had developed over the previous few years. Ninety-three chickens were inoculated with 10⁷ E. coli K1 isolated from a case of meningitis in a newborn baby. Nineteen chickens served as a control group; 29 were given ceftriaxone (50 mg/kg), 28 cefotaxime (50 mg/kg) and 17 cefotaxime (100 mg/kg). The antibiotics were injected 4 hours after inoculation. The bacterial concentrations found in capillary blood by quantitative blood cultures were significantly lower in the 3 treated groups than in the control group at 24, 48 and 72 hours after inoculation. At 24 hours after inoculation, the bacterial concentration in the chickens treated with ceftriaxone (50 mg/kg) was significantly lower than that found in chickens treated with cefotaxime (50 mg/kg); at 48 and 72 hours, the differences in bacterial concentration between the three groups were not significant. Over the 72 hours following inoculation, 4 control chickens and only one treated chicken died. The efficient clearance of E. coli K1 by a single dose of ceftriaxone, observed 24 hours after inoculation, confirms the possibility of using ceftriaxone once daily for serious infections.

  12. Comparison of experimental data with results of some drying models for regularly shaped products

    NASA Astrophysics Data System (ADS)

    Kaya, Ahmet; Aydın, Orhan; Dincer, Ibrahim

    2010-05-01

    This paper presents an experimental and theoretical investigation of the drying of moist slab, cylindrical and spherical products, to study dimensionless moisture content distributions and compare them. The experimental study includes measurement of the moisture content distributions of slab and cylindrical carrot, slab and cylindrical pumpkin, and spherical blueberry during drying at various temperatures (30, 40, 50 and 60°C) at a constant velocity (U = 1 m/s) and relative humidity φ = 30%. In the theoretical analysis, two moisture transfer models are used to determine drying process parameters (drying coefficient and lag factor) and moisture transfer parameters (moisture diffusivity and moisture transfer coefficient), and to calculate the dimensionless moisture content distributions. The calculated results are then compared with the experimental moisture data, and considerably high agreement is obtained between the calculations and the experimental measurements for the cases considered. The effective diffusivity values were evaluated as 0.741 × 10⁻⁵ to 5.981 × 10⁻⁵ m²/h for slab products, 0.818 × 10⁻⁵ to 6.287 × 10⁻⁵ m²/h for cylindrical products and 1.213 × 10⁻⁷ to 7.589 × 10⁻⁷ m²/h for spherical products using Model-I, and 0.316 × 10⁻⁵ to 5.072 × 10⁻⁵ m²/h for slab products, 0.580 × 10⁻⁵ to 9.587 × 10⁻⁵ m²/h for cylindrical products and 1.408 × 10⁻⁷ to 13.913 × 10⁻⁷ m²/h for spherical products using Model-II.
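    The link between a measured drying coefficient S and an effective moisture diffusivity follows from the one-term series solution of Fick's second law for a slab, MR = (8/π²)·exp(−π²Dt/4L²), so S = π²D/4L². A sketch with invented values; the paper's two models also involve a lag factor, which is omitted here:

```python
import math

# One-term slab solution of Fick's second law:
#   MR = (8/pi^2) * exp(-pi^2 * D * t / (4 * L^2))
# Matching the exponent to a measured drying coefficient S in MR ~ exp(-S t)
# gives S = pi^2 * D / (4 L^2), i.e. D = 4 S L^2 / pi^2.
# The numbers below are invented for illustration.
def diffusivity_from_drying_coeff(s_per_hour, half_thickness_m):
    return 4.0 * s_per_hour * half_thickness_m**2 / math.pi**2

# Invented slab: drying coefficient 0.35 1/h, half-thickness 5 mm.
D = diffusivity_from_drying_coeff(s_per_hour=0.35, half_thickness_m=5e-3)
```

The result lands in the 10⁻⁶ to 10⁻⁵ m²/h range, the same order as the slab values reported above.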

  13. Experimental results on the ω- and η'-nucleus potential - on the way to mesic states

    NASA Astrophysics Data System (ADS)

    Nanova, Mariana

    2015-06-01

    Different experimental approaches to determining the meson-nucleus optical potential are discussed. The experiments were performed with the Crystal Barrel/TAPS detector system at the ELSA accelerator in Bonn and with the Crystal Ball/TAPS at the MAMI accelerator in Mainz. Experimental results on the real and imaginary parts of the η'- and ω-nucleus optical potentials are presented. The imaginary part of the meson-nucleus optical potential is determined from the in-medium width of the meson via measurement of the transparency ratio. Information on the real part of the optical potential is deduced from measurements of the excitation function and momentum distribution, which are sensitive to the sign and depth of the potential. The results are discussed and compared to theoretical predictions. The data for both mesons are consistent with a weakly attractive potential. The formation and population of ω-nucleus and η'-nucleus bound states are additionally discussed.

  14. Optimization and automation of quantitative NMR data extraction.

    PubMed

    Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos

    2013-06-18

    NMR is routinely used to quantitate chemical species. The experimental procedures needed to acquire quantitative data are well known, but relatively little attention has been paid to data processing and analysis. We describe here a robust expert system that can automatically choose the best signals in a sample for overall concentration determination and determine analyte concentration using all accepted methods. The algorithm is based on complete deconvolution of the spectrum, which makes it tolerant of cases where signals lie very close to one another, and it includes robust methods for the automatic classification of NMR resonances and for molecule-to-spectrum multiplet assignment. With this functionality in place and optimized, it is then relatively simple to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and its applicability to NMR data acquired for very large sample sets.
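    Once signals are deconvolved and assigned, the concentration determination itself reduces to the textbook internal-standard relation, with signal integrals normalized by the number of contributing protons. A sketch with invented values (this illustrates the standard qNMR relation, not the expert system's algorithm):

```python
# Textbook qNMR internal-standard relation:
#   c_analyte = c_std * (I_analyte / N_analyte) / (I_std / N_std)
# where I is the signal integral and N the number of protons giving rise
# to that signal.  All values below are invented for illustration.
def qnmr_concentration(c_std, i_analyte, n_analyte, i_std, n_std):
    """Analyte molar concentration from proton-normalized integrals."""
    return c_std * (i_analyte / n_analyte) / (i_std / n_std)

# Invented example: 10 mM standard, analyte signal from 2 protons,
# standard signal from 9 protons (e.g. a tert-butyl-type singlet).
c = qnmr_concentration(c_std=10.0, i_analyte=3.0, n_analyte=2, i_std=5.0, n_std=9)
```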

  15. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction

    PubMed Central

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond

    2014-01-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with those ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM) as a common but hardly illuminated clinical pathology has been identified in rabbit models, and a necrosis avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utilities in clinical cardiology. These outcomes may interest the researchers in the related fields and help strengthen translational research in cardiovascular diseases. PMID:25392822

  16. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction.

    PubMed

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond; Ni, Yicheng

    2014-10-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with those ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM) as a common but hardly illuminated clinical pathology has been identified in rabbit models, and a necrosis avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utilities in clinical cardiology. These outcomes may interest the researchers in the related fields and help strengthen translational research in cardiovascular diseases.

  17. Manufacturing of hybrid aluminum copper joints by electromagnetic pulse welding - Identification of quantitative process windows

    NASA Astrophysics Data System (ADS)

    Psyk, Verena; Scheffler, Christian; Linnemann, Maik; Landgrebe, Dirk

    2017-10-01

Compared to conventional joining techniques, electromagnetic pulse welding offers important advantages, especially for dissimilar material connections such as copper-aluminum welds. However, because guidelines and tools for process design are lacking, the process has not yet been widely implemented in industrial production. To help overcome this obstacle, a combined numerical and experimental process analysis for electromagnetic pulse welding of Cu-DHP and EN AW-1050 was carried out, and the results were consolidated into a quantitative, collision-parameter-based process window.

  18. Comparison between maximum radial expansion of ultrasound contrast agents and experimental postexcitation signal results.

    PubMed

    King, Daniel A; O'Brien, William D

    2011-01-01

Experimental postexcitation signal data from collapsing Definity microbubbles are compared with the Marmottant theoretical model for large-amplitude oscillations of ultrasound contrast agents (UCAs). After taking into account the insonifying pulse characteristics and the size distribution of the UCA population, good agreement between simulated results and previously measured experimental data is obtained by determining a threshold maximum radial expansion (Rmax) that indicates the onset of postexcitation. This threshold Rmax is found to range from 3.4 to 8.0 times the initial bubble radius, R0, depending on insonification frequency. These values are well above the typical free-bubble inertial cavitation threshold, commonly chosen at 2R0. The close agreement between experiment and model suggests that lipid-shelled UCAs behave as unshelled bubbles during most of a large-amplitude cavitation cycle, as proposed in the Marmottant equation.

  19. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement the resulting decisions, the QRA results must be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. The paper therefore argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
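The Bayes-theorem updating the paper advocates can be sketched numerically. In this minimal illustration the candidate failure rates, the prior weights, and the observed evidence (2 failures in 10 plant-years, treated as Poisson) are all invented for demonstration, not taken from the paper:

```python
import math

# Discrete Bayes' theorem update of a failure-frequency estimate.
# Candidate rates, prior, and evidence are illustrative assumptions.
frequencies = [0.01, 0.1, 0.5]          # candidate failure rates (per year)
prior = [0.5, 0.3, 0.2]                 # analyst's prior belief in each

def poisson_likelihood(rate, events, exposure):
    """P(observing `events` failures in `exposure` years | rate)."""
    mu = rate * exposure
    return math.exp(-mu) * mu**events / math.factorial(events)

events, exposure = 2, 10.0              # evidence: 2 failures in 10 years
likelihoods = [poisson_likelihood(f, events, exposure) for f in frequencies]

# Bayes' theorem: posterior is proportional to prior times likelihood.
unnorm = [p * l for p, l in zip(prior, likelihoods)]
posterior = [u / sum(unnorm) for u in unnorm]

for f, post in zip(frequencies, posterior):
    print(f"rate={f}: posterior={post:.3f}")
```

The evidence (an observed rate of 0.2/year) shifts belief toward the 0.1/year hypothesis regardless of which analyst supplied the prior, which is the "evidence dependent" behavior the paper calls for.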

  20. Experimental approaches to well controlled studies of thin-film nucleation and growth.

    NASA Technical Reports Server (NTRS)

    Poppa, H.; Moorhead, R. D.; Heinemann, K.

    1972-01-01

    Particular features and the performance of two experimental systems are described for quantitative studies of thin-film nucleation and growth processes including epitaxial depositions. System I consists of a modified LEED-Auger instrument combined with high-resolution electron microscopy. System II is a UHV electron microscope adapted for in-situ deposition studies. The two systems complement each other ideally, and the combined use of both can result in a comprehensive investigation of vapor deposition processes not obtainable with any other known method.

  1. Quantitative Characterization of Nanostructured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Frank

The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium, as they attract a broad audience of researchers that represents a cross-section of the state of the art in synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  2. Portable low-coherence interferometry for quantitatively imaging fast dynamics with extended field of view

    NASA Astrophysics Data System (ADS)

    Shaked, Natan T.; Girshovitz, Pinhas; Frenklach, Irena

    2014-06-01

We present our recent advances in the development of compact, highly portable and inexpensive wide-field interferometric modules. By a careful design of the interferometric system, including the use of low-coherence illumination sources and a common-path off-axis geometry, the spatial and temporal noise levels of the resulting quantitative thickness profile can be sub-nanometric, while the phase profile is processed in real time. In addition, due to novel experimentally implemented multiplexing methods, we can capture low-coherence off-axis interferograms with a significantly extended field of view and at faster acquisition rates. Using these techniques, we quantitatively imaged the rapid dynamics of live biological cells, including sperm cells and unicellular microorganisms. We then demonstrated dynamic profiling during lithography of microscopic elements, with thicknesses that may vary from several nanometers to hundreds of microns. Finally, we present new algorithms for fast reconstruction (including digital phase unwrapping) of off-axis interferograms, which allow real-time processing at greater than video rate on regular single-core computers.

  3. Investigation of sonar transponders for offshore wind farms: modeling approach, experimental setup, and results.

    PubMed

    Fricke, Moritz B; Rolfes, Raimund

    2013-11-01

The installation of offshore wind farms in the German Exclusive Economic Zone requires the deployment of sonar transponders to prevent collisions with submarines. The general requirements for these systems were previously worked out by the Research Department for Underwater Acoustics and Marine Geophysics of the Bundeswehr. In this article, the major results of the research project "Investigation of Sonar Transponders for Offshore Wind Farms" are presented. For the theoretical investigations, a hybrid approach was implemented using the boundary element method to calculate the source directivity and a three-dimensional ray-tracing algorithm to estimate the transmission loss. The angle dependence of the sound field as well as the weather dependence of the transmission loss are compared to experimental results gathered at the offshore wind farm alpha ventus, located 45 km north of the island of Borkum. While theoretical and experimental results are in general agreement, the implemented model slightly underestimates scattering at the rough sea surface. It is found that a source level of 200 dB re 1 μPa at 1 m is adequate to ensure detectability of the warning sequence at distances up to 2 NM (≈3.7 km) within a horizontal sector of ±60° if realistic assumptions about signal processing and noise are made. An arrangement to enlarge the angular coverage is discussed.

  4. Multiplicative effects model with internal standard in mobile phase for quantitative liquid chromatography-mass spectrometry.

    PubMed

    Song, Mi; Chen, Zeng-Ping; Chen, Yao; Jin, Jing-Wen

    2014-07-01

Liquid chromatography-mass spectrometry assays suffer from signal instability caused by the gradual fouling of the ion source, vacuum instability, aging of the ion multiplier, etc. To address this issue, in this contribution an internal standard was added to the mobile phase. The internal standard was therefore ionized and detected together with the analytes of interest by the mass spectrometer, ensuring that variations in measurement conditions and/or instrument state have similar effects on the signal contributions of both the analytes of interest and the internal standard. Subsequently, based on this strategy of adding an internal standard to the mobile phase, a multiplicative effects model was developed for quantitative LC-MS assays and tested on a proof-of-concept model system: the determination of amino acids in water by LC-MS. The experimental results demonstrated that the proposed method can efficiently mitigate the detrimental effects of continuous signal variation, achieving quantitative results with average relative predictive error values in the range of 8.0-15.0%, much more accurate than the corresponding results of the conventional internal standard method based on the peak height ratio and of the partial least squares method (whose average relative predictive error values were as high as 66.3% and 64.8%, respectively). It is therefore expected that the proposed method can be developed and extended to quantitative LC-MS analysis of more complex systems. Copyright © 2014 Elsevier B.V. All rights reserved.
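The core idea behind the mobile-phase internal standard can be sketched as follows. In this illustration (response factors, concentrations, and the drift range are invented, and this is not the authors' actual multiplicative effects model), a run-to-run multiplicative drift scales both signals equally, so the analyte/standard ratio cancels it:

```python
import random

# Simulate LC-MS runs with multiplicative instrument drift; the internal
# standard in the mobile phase sees the same drift as the analyte.
random.seed(0)
true_conc = [1.0, 2.0, 4.0]          # analyte concentrations (arbitrary units)
k_analyte, k_is = 100.0, 50.0        # assumed nominal response factors
is_conc = 1.0                        # internal standard, constant in mobile phase

raw, ratio = [], []
for c in true_conc:
    drift = random.uniform(0.5, 1.5)     # multiplicative drift for this run
    s_analyte = k_analyte * c * drift
    s_is = k_is * is_conc * drift        # same drift multiplies both signals
    raw.append(s_analyte)
    ratio.append(s_analyte / s_is)       # drift cancels: ratio = 2 * c here

print(ratio)  # proportional to true_conc regardless of drift
```

The raw signals vary with the random drift, while the ratios stay exactly proportional to concentration, which is why the ratio-based model remains quantitative under signal instability.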

  5. A comparison of experimental and theoretical results for labyrinth gas seals with honeycomb stators. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hawkins, Lawrence Allen

    1988-01-01

Experimental results for the rotordynamic stiffness and damping coefficients of a labyrinth-rotor/honeycomb-stator seal are presented. The coefficients are compared both to those of a labyrinth-rotor/smooth-stator seal having the same geometry and to analytical results from a two-control-volume compressible flow model. The experimental results show that the honeycomb-stator configuration is more stable than the smooth-stator configuration at low rotor speeds; at high rotor speeds and low clearance, the smooth-stator seal is more stable. The theoretical model predicts the cross-coupled stiffness of the honeycomb-stator seal to within 25 percent of measured values, and provides accurate predictions of direct damping for large-clearance seals. Overall, the model does not perform as well for low-clearance seals as for high-clearance seals.

  6. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information about tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions, and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography, are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active-matrix flat-panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm-2 has been performed. Beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for the thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from unit slope when the measurement is compared with the actual iodine projected thickness; scatter correction before analysis of the dual-energy images restores accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses below 3 mg cm-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information on iodinated contrast media can be used to indirectly measure tumour microvessel density and to determine contrast uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to in situ x-ray biopsy and to assessment of the oncolytic effect of anticancer agents is foreseeable.
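The dual-energy decomposition underlying such measurements reduces to a small linear system: two log-attenuation measurements at two energies determine two projected thicknesses. The attenuation coefficients and thicknesses below are placeholder magnitudes, not calibrated values from the paper:

```python
# Dual-energy decomposition sketch: solve a 2x2 system for iodine and
# soft-tissue projected thicknesses. Coefficients are illustrative only.

# mass attenuation coefficients (cm^2/mg): rows = [low E, high E],
# columns = [iodine, tissue]
mu = [[0.030, 0.00025],
      [0.010, 0.00020]]

t_iodine, t_tissue = 2.5, 4000.0     # mg/cm^2, "ground truth" for the demo
# simulated log-attenuation measurements at the two energies
m = [mu[i][0] * t_iodine + mu[i][1] * t_tissue for i in range(2)]

# solve the 2x2 linear system by Cramer's rule
det = mu[0][0] * mu[1][1] - mu[0][1] * mu[1][0]
est_iodine = (m[0] * mu[1][1] - mu[0][1] * m[1]) / det
est_tissue = (mu[0][0] * m[1] - m[0] * mu[1][0]) / det

print(round(est_iodine, 6), round(est_tissue, 3))  # recovers ~2.5 and ~4000
```

In practice the measurements would be scatter-corrected first, since (as the abstract notes) uncorrected scatter biases the recovered iodine thickness away from unit slope.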

  7. Full skin quantitative optical coherence elastography achieved by combining vibration and surface acoustic wave methods

    NASA Astrophysics Data System (ADS)

    Li, Chunhui; Guan, Guangying; Huang, Zhihong; Wang, Ruikang K.; Nabi, Ghulam

    2015-03-01

By combining with phase-sensitive optical coherence tomography (PhS-OCT), vibration and surface acoustic wave (SAW) methods have each been reported to provide elastography of skin tissue. However, neither method alone can provide elastography over the full skin depth in current systems. This paper presents a feasibility study of an optical coherence elastography method that combines vibration and SAW measurements to give the quantitative mechanical properties of skin tissue over the full depth range, including epidermis, dermis and subcutaneous fat. Experiments were carried out on layered tissue-mimicking phantoms and on in vivo human forearm and palm skin. A ring actuator generated the vibration, while a line actuator was used to excite the SAWs. A PhS-OCT system was employed to provide ultrahigh-sensitivity measurement of the generated waves. The experimental results demonstrate that, by combining the vibration and SAW methods, the bulk mechanical properties of full-thickness skin can be quantitatively measured, and elastography can be obtained with a sensing depth from ~0 mm to ~4 mm. This method is promising for clinical applications where the quantitative elasticity of localized skin diseases is needed to aid diagnosis and treatment.

  8. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize those boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated, and allocates resources in a manner that manages risk to an acceptable level.
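A probability-of-failure calculation of the kind the abstract describes can be sketched with a plain Monte Carlo loop. The response model, parameter distributions, and specification limit below are all hypothetical stand-ins, not the paper's ciprofloxacin formulation model:

```python
import random

# Monte Carlo estimate of the probability that a critical quality
# attribute falls out of specification. Everything here is illustrative.
random.seed(1)

def dissolution(compression_force, lubricant_pct):
    # hypothetical response surface: % dissolved at 30 min
    return 95.0 - 2.0 * (compression_force - 10.0) - 8.0 * lubricant_pct

N, failures = 100_000, 0
for _ in range(N):
    force = random.gauss(10.0, 1.0)       # kN, assumed process variability
    lube = random.gauss(0.5, 0.1)         # % w/w, assumed variability
    if dissolution(force, lube) < 85.0:   # assumed spec: >= 85% dissolved
        failures += 1

p_fail = failures / N
print(f"estimated probability of failure: {p_fail:.4f}")
```

Repeating this estimate over a grid of nominal parameter settings maps out where the failure probability stays acceptably low, which is one way to draw a quantitative design-space boundary.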

  9. Noise characteristics of upper surface blown configurations. Experimental program and results

    NASA Technical Reports Server (NTRS)

    Brown, W. H.; Searle, N.; Blakney, D. F.; Pennock, A. P.; Gibson, J. S.

    1977-01-01

    An experimental data base was developed from the model upper surface blowing (USB) propulsive lift system hardware. While the emphasis was on far field noise data, a considerable amount of relevant flow field data were also obtained. The data were derived from experiments in four different facilities resulting in: (1) small scale static flow field data; (2) small scale static noise data; (3) small scale simulated forward speed noise and load data; and (4) limited larger-scale static noise flow field and load data. All of the small scale tests used the same USB flap parts. Operational and geometrical variables covered in the test program included jet velocity, nozzle shape, nozzle area, nozzle impingement angle, nozzle vertical and horizontal location, flap length, flap deflection angle, and flap radius of curvature.

  10. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
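The arithmetic behind isotope dilution can be illustrated in a few lines. This is a deliberately simplified sketch that assumes equal MS response per mole for the labeled and unlabeled isotopologues; the spike amount and measured ratio are invented, not from the article:

```python
# Simplified isotope-dilution calculation: spike the sample with a known
# amount of isotopically labeled caffeine, measure the unlabeled/labeled
# ion-abundance ratio by GC-MS, and recover the analyte amount.

spike_amount_mg = 2.0      # labeled caffeine added to the sample (assumed)
measured_ratio = 1.75      # unlabeled/labeled peak-area ratio (assumed)

# assuming equal response per mole for the two isotopologues:
# amount_analyte / amount_spike = measured_ratio
amount_analyte_mg = measured_ratio * spike_amount_mg
print(amount_analyte_mg)  # -> 3.5
```

Because the labeled spike experiences the same losses during extraction and injection as the analyte, the ratio, and hence the result, is insensitive to incomplete recovery, which is the method's main appeal.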

  11. Characterization and Application of a Grazing Angle Objective for Quantitative Infrared Reflection Microspectroscopy

    NASA Technical Reports Server (NTRS)

    Pepper, Stephen V.

    1995-01-01

A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be straightforwardly integrated over the angular aperture without accounting for non-uniform incident intensity. This quantitative approach is applied to determining the thickness of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. This paper therefore illustrates a method for more quantitative use of a grazing angle objective in infrared reflectance microspectroscopy.
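The aperture-averaged Fresnel calculation described above can be sketched numerically under the paper's uniform-intensity assumption. The refractive index (a rough mid-IR value for Si) and the 65-85 degree aperture are illustrative choices, not the instrument's actual specifications:

```python
import math

def fresnel_unpolarized(n, theta):
    """Reflectance of a non-absorbing surface of index n at incidence theta."""
    ct = math.cos(theta)
    st = math.sin(theta)
    ct2 = math.sqrt(1.0 - (st / n) ** 2)     # cosine of refraction angle
    rs = (ct - n * ct2) / (ct + n * ct2)     # s-polarization amplitude
    rp = (n * ct - ct2) / (n * ct + ct2)     # p-polarization amplitude
    return 0.5 * (rs ** 2 + rp ** 2)         # unpolarized reflectance

n_si = 3.42                                  # Si in the mid-infrared (approx.)
lo, hi = math.radians(65.0), math.radians(85.0)   # assumed angular aperture

# trapezoidal average over the aperture, uniform incident intensity
N = 1000
thetas = [lo + (hi - lo) * i / N for i in range(N + 1)]
vals = [fresnel_unpolarized(n_si, t) for t in thetas]
avg_R = (sum(vals) - 0.5 * (vals[0] + vals[-1])) / N

print(f"aperture-averaged reflectance: {avg_R:.3f}")
```

For an absorbing film system such as Krytox on gold, the same integration would use complex optical constants; the real-index version here shows only the structure of the calculation.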

  12. Quantitation of active pharmaceutical ingredients and excipients in powder blends using designed multivariate calibration models by near-infrared spectroscopy.

    PubMed

    Li, Weiyong; Worosila, Gregory D

    2005-05-13

This research note demonstrates the simultaneous quantitation of a pharmaceutical active ingredient and three excipients in a simulated powder blend containing acetaminophen, Prosolv and Crospovidone. An experimental design approach was used to generate a 5-level (%, w/w) calibration sample set of 125 samples. The samples were prepared by weighing suitable amounts of powder into separate 20-mL scintillation vials and mixing manually. Partial least squares (PLS) regression was used for calibration model development. The models generated accurate results for quantitation of Crospovidone (at 5%, w/w) and magnesium stearate (at 0.5%, w/w). Further testing demonstrated that 2-level models were as effective as the 5-level ones, reducing the number of calibration samples to 50. The models had a small bias for quantitation of acetaminophen (at 30%, w/w) and Prosolv (at 64.5%, w/w) in the blend. The implication of this bias is discussed.
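The calibration workflow can be sketched on synthetic data. The paper used PLS regression; as a minimal stand-in this sketch fits an ordinary least-squares calibration, which behaves equivalently on noise-free, full-rank simulated spectra. The "spectra" and blend compositions below are invented:

```python
import numpy as np

# Multivariate calibration sketch: map mixture spectra to the fraction of
# one component. Synthetic data; OLS used as a stand-in for PLS.
rng = np.random.default_rng(0)
wavelengths = 50
# made-up pure-component "spectra" for APAP, Prosolv, Crospovidone
S = rng.random((3, wavelengths))
# calibration design: fractions of the three components in each blend
C = np.array([[0.30, 0.645, 0.055],
              [0.25, 0.700, 0.050],
              [0.35, 0.590, 0.060],
              [0.30, 0.655, 0.045],
              [0.28, 0.670, 0.050]])
X = C @ S                      # Beer-Lambert-style mixture spectra

# calibrate: coefficients mapping a spectrum to the APAP fraction
y = C[:, 0]
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# predict an unseen blend from its spectrum
c_test = np.array([0.32, 0.625, 0.055])
y_hat = (c_test @ S) @ b
print(round(float(y_hat), 4))
```

With real NIR spectra (noise, baseline drift, collinearity), PLS is preferred precisely because it regularizes where plain least squares becomes unstable.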

  13. Arenavirus budding resulting from viral-protein-associated cell membrane curvature

    PubMed Central

    Schley, David; Whittaker, Robert J.; Neuman, Benjamin W.

    2013-01-01

    Viral replication occurs within cells, with release (and onward infection) primarily achieved through two alternative mechanisms: lysis, in which virions emerge as the infected cell dies and bursts open; or budding, in which virions emerge gradually from a still living cell by appropriating a small part of the cell membrane. Virus budding is a poorly understood process that challenges current models of vesicle formation. Here, a plausible mechanism for arenavirus budding is presented, building on recent evidence that viral proteins embed in the inner lipid layer of the cell membrane. Experimental results confirm that viral protein is associated with increased membrane curvature, whereas a mathematical model is used to show that localized increases in curvature alone are sufficient to generate viral buds. The magnitude of the protein-induced curvature is calculated from the size of the amphipathic region hypothetically removed from the inner membrane as a result of translation, with a change in membrane stiffness estimated from observed differences in virion deformation as a result of protein depletion. Numerical results are based on experimental data and estimates for three arenaviruses, but the mechanisms described are more broadly applicable. The hypothesized mechanism is shown to be sufficient to generate spontaneous budding that matches well both qualitatively and quantitatively with experimental observations. PMID:23864502

  14. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.

    PubMed

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C

    2016-07-21

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.

  15. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods

    NASA Astrophysics Data System (ADS)

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.

    2016-07-01

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.

  16. Novel Sessile Drop Software for Quantitative Estimation of Slag Foaming in Carbon/Slag Interactions

    NASA Astrophysics Data System (ADS)

    Khanna, Rita; Rahman, Mahfuzur; Leow, Richard; Sahajwalla, Veena

    2007-08-01

    Novel video-processing software has been developed for the sessile drop technique for a rapid and quantitative estimation of slag foaming. The data processing was carried out in two stages: the first stage involved the initial transformation of digital video/audio signals into a format compatible with computing software, and the second stage involved the computation of slag droplet volume and area of contact in a chosen video frame. Experimental results are presented on slag foaming from synthetic graphite/slag system at 1550 °C. This technique can be used for determining the extent and stability of foam as a function of time.

  17. Quantitative nephelometry

    MedlinePlus

Quantitative nephelometry is a lab test to quickly and ...

  18. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  19. Viscoelastic modeling and quantitative experimental characterization of normal and osteoarthritic human articular cartilage using indentation.

    PubMed

    Richard, F; Villars, M; Thibaud, S

    2013-08-01

The viscoelastic behavior of articular cartilage changes with the progression of osteoarthritis. The objective of this study is to quantify this progression and to propose a viscoelastic model of articular cartilage that takes the degree of osteoarthritis into account and can easily be used in predictive numerical simulations of hip joint behavior. To quantify the effects of osteoarthritis (OA) on the viscoelastic behavior of human articular cartilage, samples were obtained during hip arthroplasty performed for femoral neck fracture (normal cartilage) or advanced coxarthrosis (OA cartilage). Experimental data were obtained from instrumented indentation tests on unfrozen femoral cartilage collected and studied on the day after the hip replacement surgery. Using an inverse method coupled with finite element modeling (FEM) of all experimental indentation data, the viscoelastic properties of the two states were quantified. Mean values of the viscoelastic parameters (instantaneous and relaxed tension moduli, viscosity coefficient) were significantly lower for OA cartilage than for normal cartilage. Based on these results, and within a thermodynamic framework, a constitutive viscoelastic model is proposed that treats the degree of osteoarthritis as an internal damage variable. The isotropic phenomenological viscoelastic model including degradation provides an accurate prediction of the mechanical response of normal human cartilage and of OA cartilage with advanced coxarthrosis, but should be further validated for intermediate degrees of osteoarthritis. Copyright © 2013 Elsevier Ltd. All rights reserved.
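The kind of viscoelastic description fitted in such studies can be sketched with a standard-linear-solid relaxation modulus plus a scalar damage factor mimicking OA degradation. The parameter values and the factor-of-two damage below are illustrative assumptions, not the paper's fitted values:

```python
import math

def relaxation_modulus(t, E_inst, E_rel, tau):
    """E(t) for a standard linear solid under a step strain."""
    return E_rel + (E_inst - E_rel) * math.exp(-t / tau)

# assumed normal-cartilage parameters (MPa, s) -- not the paper's values
E_inst, E_rel, tau = 8.0, 2.0, 30.0
damage = 0.5            # internal damage variable: 0 = intact, 1 = fully degraded

E_normal = relaxation_modulus(10.0, E_inst, E_rel, tau)
# degradation scales both moduli down, as reported for OA cartilage
E_oa = relaxation_modulus(10.0, (1 - damage) * E_inst, (1 - damage) * E_rel, tau)

print(round(E_normal, 3), round(E_oa, 3))
```

Making the damage variable an explicit model input is what lets the same constitutive law cover normal and osteoarthritic tissue in a predictive simulation.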

  20. Using Performance Tasks to Improve Quantitative Reasoning in an Introductory Mathematics Course

    ERIC Educational Resources Information Center

    Kruse, Gerald; Drews, David

    2013-01-01

A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active-learning projects that had been used previously. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant…

  1. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models
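The statistical comparison the abstract describes (Kolmogorov-Smirnov distance between simulated and measured distributions) can be sketched with a hand-rolled two-sample statistic. The data points below are synthetic placeholders, not GEANT4-DNA cross sections:

```python
# Two-sample Kolmogorov-Smirnov distance: the maximum gap between the
# empirical CDFs of two samples (here tie-free synthetic data).

def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    d, i, j = 0.0, 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

simulated = [0.10, 0.22, 0.31, 0.40, 0.55, 0.61, 0.70, 0.84]
measured = [0.12, 0.25, 0.33, 0.41, 0.52, 0.66, 0.72, 0.80]

print(f"KS distance: {ks_statistic(simulated, measured):.3f}")
```

A small distance indicates compatibility between model and data; a dedicated statistical toolkit (as used in the paper) would additionally convert the distance into a p-value against the sample sizes.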

  2. Risk assessment of false-positive quantitative real-time PCR results in food, due to detection of DNA originating from dead cells.

    PubMed

    Wolffs, Petra; Norling, Börje; Rådström, Peter

    2005-03-01

Real-time PCR technology is increasingly used for detection and quantification of pathogens in food samples. A main disadvantage of nucleic acid detection is the inability to distinguish between signals originating from viable cells and from DNA released by dead cells. To gain knowledge of the risk of false-positive results due to detection of DNA originating from dead cells, quantitative PCR (qPCR) was used to investigate the degradation kinetics of free DNA in four types of meat samples. The fastest degradation rate was observed in chicken homogenate (1 log unit per 0.5 h), and the slowest in pork rinse (1 log unit per 120.5 h). Overall, the results indicated that degradation occurred faster in chicken samples than in pork samples, and faster at higher temperatures. Based on these results, it was concluded that there is a risk of false-positive PCR results, especially in pork samples. This was confirmed in a quantitative study of cell death and signal persistence over a period of 28 days, employing three different methods: viable counts, direct qPCR, and flotation, a recently developed discontinuous density centrifugation method, followed by qPCR. Direct qPCR overestimated the number of cells in the samples by up to 10-fold compared to viable counts, due to detection of DNA from dead cells. After using flotation prior to qPCR, however, the results resembled the viable count data. This indicates that by using flotation as a sample treatment step prior to qPCR, the risk of false-positive PCR results due to detection of dead cells can be minimized.
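The reported rates (1 log unit per 0.5 h in chicken homogenate versus 1 log unit per 120.5 h in pork rinse) translate directly into how long dead-cell DNA remains detectable. In this back-of-the-envelope sketch the starting copy number is an invented assumption:

```python
# Log-linear degradation of free DNA using the rates from the abstract.

def remaining_log_units(initial_log10, hours, hours_per_log):
    """log10 of signal left after log-linear degradation."""
    return initial_log10 - hours / hours_per_log

initial = 6.0                                          # assumed log10 copies
chicken = remaining_log_units(initial, 24.0, 0.5)      # gone long before 24 h
pork = remaining_log_units(initial, 24.0, 120.5)       # barely degraded

print(f"chicken homogenate after 24 h: {chicken:.1f} log10")
print(f"pork rinse after 24 h: {pork:.2f} log10")
```

After a day, the chicken signal has degraded far below any detection limit, while the pork-rinse signal has lost less than a quarter of a log unit, which is exactly the false-positive risk the authors flag for pork.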

  3. Vascular Corrosion Casting: Review of Advantages and Limitations in the Application of Some Simple Quantitative Methods.

    PubMed

    Hossler, Fred E.; Douglas, John E.

    2001-05-01

    Vascular corrosion casting has been used for about 40 years to produce replicas of normal and abnormal vasculature and microvasculature of various tissues and organs that could be viewed at the ultrastructural level. In combination with scanning electron microscopy (SEM), the primary application of corrosion casting has been to describe the morphology and anatomical distribution of blood vessels in these tissues. However, such replicas should also contain quantitative information about that vasculature. This report summarizes some simple quantitative applications of vascular corrosion casting. Casts were prepared by infusing Mercox resin or diluted Mercox resin into the vasculature. Surrounding tissues were removed with KOH, hot water, and formic acid, and the resulting dried casts were observed with routine SEM. The orientation, size, and frequency of vascular endothelial cells were determined from endothelial nuclear imprints on various cast surfaces. Vascular volumes of heart, lung, and avian salt gland were calculated using tissue and resin densities, and weights. Changes in vascular volume and functional capillary density in an experimentally induced emphysema model were estimated from confocal images of casts. Clearly, corrosion casts lend themselves to quantitative analysis. However, because blood vessels differ in their compliances, in their responses to the toxicity of casting resins, and in their response to varying conditions of corrosion casting procedures, it is prudent to use care in interpreting this quantitative data. Some of the applications and limitations of quantitative methodology with corrosion casts are reviewed here.

  4. Water-waves on linear shear currents. A comparison of experimental and numerical results.

    NASA Astrophysics Data System (ADS)

    Simon, Bruno; Seez, William; Touboul, Julien; Rey, Vincent; Abid, Malek; Kharif, Christian

    2016-04-01

    Propagation of water waves can be described for uniformly sheared current conditions. Indeed, some mathematical simplifications remain applicable to the study of waves both in the absence of current and in the presence of a linearly sheared current. However, the widespread use of mathematical wave theories including shear has rarely been backed by experimental studies of such flows. New experimental and numerical methods were both recently developed to study wave-current interactions at constant vorticity. On one hand, the numerical code can simulate, in two dimensions, arbitrary non-linear waves. On the other hand, the experimental methods can be used to generate waves under various shear conditions. Taking advantage of the simplicity of the experimental protocol and the versatility of the numerical code, experimental and numerical data are compared with each other and with linear theory to validate both methods. Acknowledgements: The DGA (Direction Générale de l'Armement, France) is acknowledged for its financial support through ANR grant N° ANR-13-ASTR-0007.

  5. Analysis of Dynamic Fracture Compliance Based on Poroelastic Theory - Part II: Results of Numerical and Experimental Tests

    NASA Astrophysics Data System (ADS)

    Wang, Ding; Ding, Pin-bo; Ba, Jing

    2018-03-01

    In Part I, a dynamic fracture compliance model (DFCM) was derived based on poroelastic theory. The normal compliance of fractures is frequency-dependent and closely associated with the connectivity of the porous medium. In this paper, we first compare the DFCM with previous fractured-media theories from the literature over the full frequency range. Furthermore, experimental tests are performed on synthetic rock specimens, and the DFCM is compared with the experimental data in the ultrasonic frequency band. Relative to those used in previous works, the water-saturated synthetic rock specimens have mineral compositions and pore structures closer to those of natural reservoir rocks. The fracture/pore geometrical and physical parameters can be controlled to approximately replicate those of natural rocks. P- and S-wave anisotropy characteristics with different fracture and pore properties are calculated, and the numerical results are compared with the experimental data. Although the measurement frequency is relatively high, the DFCM results are appropriate for explaining the experimental data. The characteristic frequency of fluid pressure equilibration calculated from the specimen parameters is not substantially less than the measurement frequency. In the dynamic fracture model, wave-induced fluid flow is an important factor in the fracture-wave interaction process, which differs from models at the high-frequency limit, for instance, Hudson's un-relaxed model.

  6. Fines migration during CO2 injection: Experimental results interpreted using surface forces

    DOE PAGES

    Xie, Quan; Saeedi, Ali; Delle Piane, Claudio; ...

    2017-09-04

    The South West Hub project is one of the Australian Flagship Carbon Capture and Storage projects, located in the south-west of Western Australia. To evaluate the injectivity potential during the forthcoming full-scale CO2 injection, we conducted three core-flooding experiments using reservoir core plugs from the well Harvey-1. We aimed to investigate whether the injection of CO2 leads to fines migration and permeability reduction due to the relatively high kaolinite content (up to 13%) in the injection interval of the target formation (i.e. the Wonnerup Member of the Lesueur Formation). We imaged the core samples before flooding to verify the presence of kaolinite at the pore scale using scanning electron microscopy (SEM). We also examined the pore network of the core plugs before and after the core-flooding experiments using Nuclear Magnetic Resonance (NMR). Moreover, to gain a better understanding of any kaolinite fines migration, we delineated the surface forces using two models based on Derjaguin-Landau-Verwey-Overbeek (DLVO) theory coupled with hydrodynamic forces: (1) a sphere/flat model representing the kaolinite/quartz interaction, and (2) a flat/flat model representing the kaolinite/kaolinite interaction. Our core-flooding experimental results showed that CO2/brine injection triggered a moderate to significant reduction in the permeability of the core samples with a negligible porosity change. NMR measurements supported the core-flooding results, suggesting that the relatively large pores disappeared in favour of a higher proportion of medium to small pores after flooding. The DLVO calculations showed that some kaolinite particles probably lifted off and detached from neighbouring kaolinite particles rather than from quartz grains. Moreover, the modelling results showed that kaolinite fines migration would not occur under normal reservoir multiphase flow conditions. This is not because of the low hydrodynamic force. It is
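    As an illustration of the sphere/flat interaction model invoked above, the following sketch evaluates the two classical DLVO terms (non-retarded van der Waals attraction plus weak-overlap electrical double-layer repulsion) for a sphere near a flat plate. All parameter values are hypothetical placeholders, not those used in the study:

```python
import math

# Physical constants
KB = 1.380649e-23      # Boltzmann constant, J/K
E0 = 8.8541878128e-12  # vacuum permittivity, F/m
QE = 1.602176634e-19   # elementary charge, C

def dlvo_sphere_flat(h, R, A_hamaker, psi1, psi2, kappa, T=298.0, eps_r=78.5, z=1):
    """Total DLVO interaction energy (J) for a sphere of radius R at
    separation h from a flat plate: non-retarded van der Waals attraction
    plus the weak-overlap double-layer repulsion."""
    v_vdw = -A_hamaker * R / (6.0 * h)
    g1 = math.tanh(z * QE * psi1 / (4.0 * KB * T))
    g2 = math.tanh(z * QE * psi2 / (4.0 * KB * T))
    v_edl = (64.0 * math.pi * eps_r * E0 * R
             * (KB * T / (z * QE)) ** 2 * g1 * g2 * math.exp(-kappa * h))
    return v_vdw + v_edl

# Hypothetical kaolinite/quartz-like parameters (not from the paper):
# 300 nm particle, A = 1e-20 J, -30 mV potentials, 3 nm Debye length.
v = dlvo_sphere_flat(h=5e-9, R=3e-7, A_hamaker=1e-20,
                     psi1=-0.03, psi2=-0.03, kappa=1.0 / 3e-9)
print(v / (KB * 298.0))  # interaction energy in units of kT
```

    With these placeholder values the energy is attractive at very small separations (van der Waals dominates) and repulsive at a few nanometres (a double-layer barrier), the competition that determines whether a fine detaches.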

  7. Fines migration during CO2 injection: Experimental results interpreted using surface forces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Quan; Saeedi, Ali; Delle Piane, Claudio

    The South West Hub project is one of the Australian Flagship Carbon Capture and Storage projects, located in the south-west of Western Australia. To evaluate the injectivity potential during the forthcoming full-scale CO2 injection, we conducted three core-flooding experiments using reservoir core plugs from the well Harvey-1. We aimed to investigate whether the injection of CO2 leads to fines migration and permeability reduction due to the relatively high kaolinite content (up to 13%) in the injection interval of the target formation (i.e. the Wonnerup Member of the Lesueur Formation). We imaged the core samples before flooding to verify the presence of kaolinite at the pore scale using scanning electron microscopy (SEM). We also examined the pore network of the core plugs before and after the core-flooding experiments using Nuclear Magnetic Resonance (NMR). Moreover, to gain a better understanding of any kaolinite fines migration, we delineated the surface forces using two models based on Derjaguin-Landau-Verwey-Overbeek (DLVO) theory coupled with hydrodynamic forces: (1) a sphere/flat model representing the kaolinite/quartz interaction, and (2) a flat/flat model representing the kaolinite/kaolinite interaction. Our core-flooding experimental results showed that CO2/brine injection triggered a moderate to significant reduction in the permeability of the core samples with a negligible porosity change. NMR measurements supported the core-flooding results, suggesting that the relatively large pores disappeared in favour of a higher proportion of medium to small pores after flooding. The DLVO calculations showed that some kaolinite particles probably lifted off and detached from neighbouring kaolinite particles rather than from quartz grains. Moreover, the modelling results showed that kaolinite fines migration would not occur under normal reservoir multiphase flow conditions. This is not because of the low hydrodynamic force. It is

  8. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they show a strong bias for the most populous compounds along the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information of close near neighbors in the training sets is a key determinant of acute toxicity predictions.
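    The masking effect described here is easy to reproduce: a global RMSE computed over an uneven sample distribution is dominated by the populous mid-range and hides poor performance on the rare, highly toxic tail. A toy illustration with invented numbers (not the paper's data):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# 95 mid-toxicity compounds predicted well, 5 highly toxic ones predicted badly.
mid_true,  mid_pred  = [3.0] * 95, [3.1] * 95   # error 0.1 log units each
high_true, high_pred = [6.0] * 5,  [4.0] * 5    # error 2.0 log units each

global_rmse = rmse(mid_true + high_true, mid_pred + high_pred)
tail_rmse   = rmse(high_true, high_pred)

print(round(global_rmse, 2))  # 0.46: looks acceptable overall
print(round(tail_rmse, 2))    # 2.0: the highly toxic tail is badly predicted
```

    Stratifying the error metric along the activity spectrum, as the authors do, exposes exactly this discrepancy.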

  9. Experimental study on melting and flowing behavior of thermoplastics combustion based on a new setup with a T-shape trough.

    PubMed

    Xie, Qiyuan; Zhang, Heping; Ye, Ruibo

    2009-07-30

    The objective of this work is to quantitatively study the burning characteristics of thermoplastics. A new experimental setup with a T-shape trough is designed. Based on this setup, the loop mechanism between the wall fire and the pool fires induced by the melting and dripping of thermoplastics can be well simulated and studied. Additionally, the flowing characteristics of the pool fires can be quantitatively analyzed. Experiments are conducted on PP and PE sheets with different thicknesses. The maximum distances reached by the induced flowing pool flame in the T-shape trough are recorded and analyzed. Typical fire parameters, such as heat release rate (HRR) and CO concentration, are also monitored. The results show that the softening and clinging of the thermoplastic sheets play a considerable role in their vertical wall burning. The clinging of a burning thermoplastic sheet appears to be mainly related to the softening and ignition temperatures of the thermoplastics, as well as their viscosity coefficients. Comparison of the maximum flowing-flame distances of the induced pool fires in the T-shape trough for thermoplastic sheets of different thicknesses indicates that the pool fires induced by PE materials flow away more readily than those induced by PP materials. Therefore, PE materials may be more dangerous because of their faster pool fire spread on the floor. These experimental results preliminarily illustrate that this new experimental setup is helpful for quantitatively studying the special burning features of thermoplastics, although further modifications to the setup are needed in the future.

  10. Optimal active vibration absorber: Design and experimental results

    NASA Technical Reports Server (NTRS)

    Lee-Glauser, Gina; Juang, Jer-Nan; Sulla, Jeffrey L.

    1992-01-01

    An optimal active vibration absorber can provide guaranteed closed-loop stability and control for large flexible space structures with collocated sensors/actuators. The active vibration absorber is a second-order dynamic system designed to suppress any unwanted structural vibration, and it can be designed with minimal knowledge of the controlled system. Two methods for optimizing the active vibration absorber parameters are illustrated: minimum resonant amplitude and frequency-matched active controllers. The Controls-Structures Interaction Phase-1 Evolutionary Model at NASA LaRC is used to demonstrate the effectiveness of the active vibration absorber for vibration suppression. Performance is compared numerically and experimentally using acceleration feedback.

  11. Experimental modelling of fragmentation applied to volcanic explosions

    NASA Astrophysics Data System (ADS)

    Haug, Øystein Thordén; Galland, Olivier; Gisler, Galen R.

    2013-12-01

    Explosions during volcanic eruptions cause fragmentation of magma and host rock, resulting in fragments with sizes ranging from boulders to fine ash. The products can be described by fragment size distributions (FSD), which commonly follow power laws with exponent D. The processes that lead to power-law distributions and the physical parameters that control D remain unknown. We developed a quantitative experimental procedure to study the physics of the fragmentation process through time. The apparatus consists of a Hele-Shaw cell containing a layer of cohesive silica flour that is fragmented by a rapid injection of pressurized air. The evolving fragmentation of the flour is monitored with a high-speed camera, and the images are analysed to obtain the evolution of the number of fragments (N), their average size (A), and the FSD. Using the results from our image-analysis procedure, we find transient empirical laws for N, A and the exponent D of the power-law FSD as functions of the initial air pressure. We show that our experimental procedure is a promising tool for unravelling the complex physics of fragmentation during phreatomagmatic and phreatic eruptions.
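    The exponent D of a power-law FSD can be estimated from measured fragment sizes with the standard maximum-likelihood estimator, D = 1 + n / Σ ln(x_i / x_min). A sketch on synthetic data (not the experimental measurements):

```python
import math
import random

def powerlaw_exponent(sizes, x_min):
    """Maximum-likelihood estimate of the exponent D for a continuous
    power-law distribution p(x) ~ x^(-D), restricted to x >= x_min."""
    tail = [x for x in sizes if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic fragment sizes drawn from p(x) ~ x^(-2.5) by
# inverse-transform sampling: x = x_min * (1 - u)^(-1/(D-1)).
random.seed(42)
d_true, x_min = 2.5, 1.0
sizes = [x_min * (1.0 - random.random()) ** (-1.0 / (d_true - 1.0))
         for _ in range(20000)]

print(round(powerlaw_exponent(sizes, x_min), 2))  # close to 2.5
```

    Applied frame by frame to the image-analysis output, such an estimator yields the time evolution of D described in the abstract.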

  12. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing methods widely applicable in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results for the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information characteristic of each rock type and its history. These could be used for mineral identification and studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.

  13. Rotating Fluidized Bed Reactor for Space Nuclear Propulsion. Annual Report; Design Studies and Experimental Results

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The rotating fluidized bed reactor concept is being investigated for possible application in nuclear propulsion systems. Physics calculations show U-233 to be superior to U-235 as a fuel for a cavity reactor of this type. Preliminary estimates of the effect of hydrogen in the reactor, reflector material, and power peaking are given. A preliminary engineering analysis was made for U-235 and U-233 fueled systems. An evaluation of the parameters affecting the design of the system is given, along with the thrust-to-weight ratios. The experimental equipment is described, as are the special photographic techniques and procedures. Characteristics of the fluidized bed and experimental results are given, including photographic evidence of bed fluidization at high rotational velocities.

  14. [Experimental Methods and Result Analysis of a Variety of Spectral Reflectance Properties of the Thin Oil Film].

    PubMed

    Ye, Zhou; Liu, Li; Wei, Chuan-xin; Gu, Qun; An, Ping-ao; Zhao, Yue-jiao; Yin, Da-yi

    2015-06-01

    In order to analyze oil spill situations from data obtained in airborne aerial work, the spectral reflectance characteristics of oil films of different oil types and thicknesses are needed as support, and an appropriate operating band must be selected. An experiment was set up to measure the reflectance spectra, from ultraviolet to near-infrared, of films of five target samples (petrol, diesel, lubricating oil, kerosene, and crude oil) using a spectral measurement device. The results were compared with the reflectance spectrum of water in the same experimental environment, which shows that the spectral reflection characteristics of an oil film are related to its thickness and to the type of oil. At the same thickness, the spectral reflectance curves of different oil types differ markedly, and for the same oil type the reflectance curve changes with film thickness; therefore, for a single oil type, different film thicknesses can be distinguished by their reflectance curves. The results also show that, at the same film thickness, the reflectance of diesel, kerosene, and lubricating oil peaks around 380 nm, clearly different from that of water, and that the reflectance of crude oil is far lower than that of water at wavelengths above 340 nm, so the measured reflection spectra can be used to distinguish between different types of oil film to some extent. The experiment covers the main types of spilled oil, with data comprehensively covering the commonly used detection bands, and provides a quantitative description of the spectral reflectance properties of the films. It offers comprehensive theoretical and data support for selecting the working band of airborne oil spill detection and for detecting and analyzing water-surface oil spills.

  15. Survey of Experimental Results in High-Contrast Imaging for Future Exoplanet Missions

    NASA Technical Reports Server (NTRS)

    Lawson, P. R.; Belikov, R.; Cash, W.; Clampin, M.; Glassman, T.; Guyon, O.; Kasdin, N. J.; Kern, B. D.; Lyon, R.; Mawet, D.

    2013-01-01

    We present and compare experimental results in high-contrast imaging representing the state of the art in coronagraph and starshade technology. These experiments have been undertaken with the goal of demonstrating the capability of detecting Earth-like planets around nearby Sun-like stars. The contrast of an Earth seen in reflected light around a Sun-like star would be about 1.2 × 10^-10. Several of the current candidate technologies now yield raw contrasts of 1.0 × 10^-9 or better, and so should enable the detection of Earths, assuming a factor-of-10 gain in sensitivity from post-processing. We present results of coronagraph and starshade experiments conducted at visible and infrared wavelengths. Cross-sections of dark fields are directly compared as a function of field angle and bandwidth. The strengths and differences of the techniques are compared.
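    The quoted figures combine by simple arithmetic: a raw contrast of 1.0 × 10^-9 divided by the assumed factor-of-10 post-processing gain gives an effective contrast of 1.0 × 10^-10, just below the ~1.2 × 10^-10 of an Earth analogue:

```python
raw_contrast = 1.0e-9        # demonstrated raw dark-field contrast
post_processing_gain = 10.0  # assumed sensitivity gain from post-processing
earth_contrast = 1.2e-10     # Earth in reflected light around a Sun-like star

effective_contrast = raw_contrast / post_processing_gain
print(effective_contrast <= earth_contrast)  # True: Earth analogue reachable
```
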

  16. Reinventing the ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
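    The quantitative step added to the classical Ames test is the reversion frequency, i.e. revertant colonies per viable cell plated. A minimal sketch of that calculation with hypothetical counts (not from the article):

```python
def reversion_frequency(revertant_colonies, viable_cells_plated):
    """Revertants per viable cell: colonies counted on the selective
    plate divided by the number of viable cells actually plated."""
    return revertant_colonies / viable_cells_plated

# Hypothetical counts: 150 revertant colonies on the selective plate,
# 2e8 viable cells plated (determined from a parallel viable count).
freq = reversion_frequency(150, 2e8)
print(f"{freq:.1e}")  # 7.5e-07 revertants per viable cell
```

    Comparing such frequencies between treated and untreated cultures gives the fold induction of mutagenesis that students can then connect to the sequenced revertant lesions.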

  17. Quantitative analysis on PUVA-induced skin photodamages using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Zhai, Juan; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Zeng, Changchun; Jin, Ying

    2009-08-01

    Psoralen plus ultraviolet A radiation (PUVA) therapy is an important clinical treatment for skin diseases such as vitiligo and psoriasis, but it is associated with an increased risk of skin photodamage, especially photoaging. Since skin biopsy alters the original skin morphology and always involves iatrogenic trauma, optical coherence tomography (OCT) appears to be a promising technique for studying skin damage in vivo. In this study, Balb/c mice treated with 8-methoxypsoralen (8-MOP) prior to UVA irradiation were used as the PUVA-induced photodamage model. In vivo OCT images of the dorsal skin of the photodamaged (model) and normal (control) groups were obtained at 0, 24, 48, and 72 hours after irradiation. The results were then quantitatively analyzed in combination with histological information. The experimental results showed that PUVA-damaged skin had an increased epidermal thickness (ET), a reduced attenuation coefficient of the OCT signal, and an increased brightness of the epidermis layer compared with the control group. In conclusion, noninvasive high-resolution imaging techniques such as OCT may be a promising tool for photobiological studies aimed at assessing photodamage and repair processes in vivo. OCT can be used for quantitative analysis of changes in photodamaged skin, such as the ET and dermal collagen, providing a theoretical basis for the treatment and prevention of skin photodamage.

  18. Transport of fluorobenzoate tracers in a vegetated hydrologic control volume: 1. Experimental results

    NASA Astrophysics Data System (ADS)

    Queloz, Pierre; Bertuzzo, Enrico; Carraro, Luca; Botter, Gianluca; Miglietta, Franco; Rao, P. S. C.; Rinaldo, Andrea

    2015-04-01

    This paper reports experimental evidence on the transport of five fluorobenzoate tracers injected under controlled conditions into a vegetated hydrologic volume: a large lysimeter (fitted with load cells, sampling ports, and an underground chamber) in which two willows sustaining large evapotranspiration fluxes had been grown. The relevance of the study lies in the direct and indirect measurement of the ways in which hydrologic fluxes, in this case evapotranspiration from the upper surface and discharge from the bottom drainage, sample water and solutes in storage at different times under variable hydrologic forcings. The methods involve accurate control of hydrologic inputs and outputs and a large number of chemical analyses of discharge water samples. Mass extraction from biomass was also performed ex post. The results of the 2 year long experiment established that our initial premise about the tracers' behavior (that they are sorption-free under saturated conditions, which we had verified in column leaching tests) was unsuitable, as large differences in mass recovery appeared. Issues of reactivity thus arose and are addressed in the paper; here they are attributed to microbial degradation and plant uptake of solutes. Our results suggest previously unknown features of fluorobenzoate compounds as hydrologic tracers, which remain potentially interesting for catchment studies owing to their suitability for distinguishable multiple injections, and offer an outlook on direct experimental closures of mass balance in hydrologic transport volumes involving fluxes that are likely to sample differently stored water and solutes.

  19. Experimental simulation of the effects of sudden increases in geomagnetic activity upon quantitative measures of human brain activity: validation of correlational studies.

    PubMed

    Mulligan, Bryce P; Persinger, Michael A

    2012-05-10

    Previous correlations between geomagnetic activity and quantitative changes in electroencephalographic power revealed particular associations with the right parietal lobe for theta activity and the right frontal region for gamma activity. In the present experiment, subjects were exposed for 30 min to either no field (sham condition) or to a 20 nT or 70 nT, 7 Hz, amplitude-modulated (mHz range) magnetic field. Quantitative electroencephalographic (QEEG) measurements were completed before, during, and after the field exposures. After about 10 min of exposure, theta power over the right parietal region was enhanced by the 20 nT exposure but suppressed by the 70 nT exposure relative to sham exposures. The effect dissipated by the end of the exposure. These results support the contention that magnetic field fluctuations were primarily responsible for the significant geomagnetic-QEEG correlations reported in several studies. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Infectious bovine rhinotracheitis: study on the experimentally induced disease and its prevention using an inactivated, adjuvanted vaccine.

    PubMed

    Soulebot, J P; Guillemin, F; Brun, A; Dubourget, P; Espinasse, J; Terre, J

    1982-01-01

    Experimentally induced IBR was studied in calves. Intranasal challenge enabled reproducible results to be obtained from both qualitative (clinical aspect) and quantitative points of view (virus excretion, temperature); local and general immunity were also evaluated. This challenge method is useful when studying IBR vaccines. The disease was also experimentally induced by putting healthy animals into contact with diffusor calves. A single injection of the inactivated vaccine in oily adjuvant already conferred good protection; protection was 100% against the experimentally induced disease when the vaccine was administered twice at a 7- or 14-day interval. The immunity obtained was long-lasting, persisting up to one year. This vaccine is therefore advised for vaccination in both contaminated and high-risk areas. The results obtained for both safety and potency suggest that this killed vaccine should be used rather than live vaccines.

  1. Development of two-photon fluorescence microscopy for quantitative imaging in turbid tissues

    NASA Astrophysics Data System (ADS)

    Coleno, Mariah Lee

    Two-photon laser scanning fluorescence microscopy (TPM) is a high resolution, non-invasive biological imaging technique that can be used to image turbid tissues both in vitro and in vivo at depths of several hundred microns. Although TPM has been widely used to image tissue structures, no one has focused on using TPM to extract quantitative information from turbid tissues at depth. As a result, this thesis addresses the quantitative characterization of two-photon signals in turbid media. Initially, a two-photon microscope system is constructed, and two-photon images that validate system performance are obtained. Then TPM is established as an imaging technique that can be used to validate theoretical observations already listed in the literature. In particular, TPM is found to validate the exponential dependence of the fluorescence intensity decay with depth in turbid tissue model systems. Results from these studies next prompted experimental investigation into whether TPM could be used to determine tissue optical properties. Comparing the exponential dependence of the decay with a Monte Carlo model involving tissue optical properties, TPM is shown to be useful for determining the optical properties (total attenuation coefficient) of thick, turbid tissues on a small spatial scale. Next, a role for TPM for studying and optimizing wound healing is demonstrated. In particular, TPM is used to study the effects of perturbations (growth factors, PDT) on extracellular matrix remodeling in artificially engineered skin tissues. Results from these studies combined with tissue contraction studies are shown to demonstrate ways to modulate tissues to optimize the wound healing immune response and reduce scarring. In the end, TPM is shown to be an extremely important quantitative biological imaging technique that can be used to optimize wound repair.
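    The exponential depth dependence validated in the thesis, F(z) ≈ F0·exp(-μ·z) with an effective attenuation coefficient μ, means the coefficient can be recovered from a measured depth profile by a log-linear fit; relating μ to the tissue optical properties then requires the Monte Carlo model described above. A sketch of the fitting step on synthetic data (values illustrative, not from the thesis):

```python
import math

def fit_attenuation(depths_um, intensities):
    """Least-squares fit of ln(F) = ln(F0) - mu * z, returning the
    effective attenuation coefficient mu (per micron) and F0."""
    n = len(depths_um)
    xs, ys = depths_um, [math.log(f) for f in intensities]
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return -slope, math.exp(mean_y - slope * mean_x)

# Synthetic depth profile with mu = 0.01 /um (attenuation length 100 um)
depths = [0, 50, 100, 150, 200]
signal = [1000.0 * math.exp(-0.01 * z) for z in depths]

mu, f0 = fit_attenuation(depths, signal)
print(round(mu, 4), round(f0, 1))  # 0.01 1000.0
```
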

  2. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    The analysis of screening results (n = 3208; sexually active citizens aged 18 to 59 years) was carried out to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. High-oncogenic-risk human papilloma virus was detected in 65% and 68.4% of females and in 48.6% and 53% of males, respectively. Among the 12 types of human papilloma virus, the most frequently diagnosed was type 16, independently of the gender of those examined and of the technique of analysis. In females, the rate of human papilloma virus 16 was 18.3% (n = 280) with qualitative tests and 14.9% (n = 126; p ≤ 0.05) with quantitative tests. In males, the rate of human papilloma virus 16 was 8.3% (n = 60) with qualitative tests and 12.2% (n = 14; p ≥ 0.05) with quantitative tests. With qualitative tests, the detection rate of the remaining oncogenic types of human papilloma virus varied from 3.4% to 8.4% in females and from 1.8% to 5.9% in males. With quantitative tests in females, the rate of human papilloma virus with high viral load was 68.4%, with medium viral load 2.85% (n = 24), and with low viral load 0.24% (n = 2). With quantitative tests in males, the detection rate of human papilloma virus types was 53%, and in all cases a high viral load was established. In females, most oncogenic types of human papilloma virus (except types 31, 39, and 59) are detected significantly more often than in males.

  3. Microfabricated Air-Microfluidic Sensor for Personal Monitoring of Airborne Particulate Matter: Design, Fabrication, and Experimental Results

    EPA Science Inventory

    We present the design and fabrication of a micro electro mechanical systems (MEMS) air-microfluidic particulate matter (PM) sensor, and show experimental results obtained from exposing the sensor to concentrations of tobacco smoke and diesel exhaust, two commonly occurring P...

  4. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
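
    The core ASE test described above, the null hypothesis of a 1:1 allelic ratio at a heterozygous site, can be sketched as an exact binomial test (stdlib only; the read counts are hypothetical, and QuASAR itself additionally models genotype uncertainty, base-call error, and over-dispersion):

```python
from math import comb

def binom_two_sided_p(k, n, p0=0.5):
    """Exact two-sided binomial test of H0: allele fraction = p0."""
    pmf = [comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(n + 1)]
    # Sum the probabilities of all outcomes at most as likely as the observed k.
    return min(1.0, sum(q for q in pmf if q <= pmf[k] + 1e-12))

# Hypothetical counts at one heterozygous site: 16 reference vs 4 alternate reads.
p = binom_two_sided_p(16, 16 + 4)
print(p < 0.05)  # allelic imbalance detected → True
```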

  5. Natural bacterial communities serve as quantitative geochemical biosensors.

    PubMed

    Smith, Mark B; Rocha, Andrea M; Smillie, Chris S; Olesen, Scott W; Paradis, Charles; Wu, Liyou; Campbell, James H; Fortney, Julian L; Mehlhorn, Tonia L; Lowe, Kenneth A; Earles, Jennifer E; Phillips, Jana; Techtmann, Steve M; Joyner, Dominique C; Elias, Dwayne A; Bailey, Kathryn L; Hurt, Richard A; Preheim, Sarah P; Sanders, Matthew C; Yang, Joy; Mueller, Marcella A; Brooks, Scott; Watson, David B; Zhang, Ping; He, Zhili; Dubinsky, Eric A; Adams, Paul D; Arkin, Adam P; Fields, Matthew W; Zhou, Jizhong; Alm, Eric J; Hazen, Terry C

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts. Copyright © 2015 Smith et al.

  6. Natural bacterial communities serve as quantitative geochemical biosensors

    DOE PAGES

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; ...

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.

  7. Identification of internal control genes for quantitative expression analysis by real-time PCR in bovine peripheral lymphocytes.

    PubMed

    Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice

    2011-09-01

    Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following flow cytometric characterization of the cell populations obtained from blood through a density gradient procedure, three popular software packages were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
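
    Stability rankings like the one above can be illustrated with a geNorm-style measure: gene j's stability value M is the mean standard deviation of its pairwise log-ratios with every other candidate gene (lower M = more stable). This is a sketch of the idea on synthetic data, not a reimplementation of the software packages used in the study:

```python
import numpy as np

def stability_m(expr):
    """geNorm-style M values; expr is a (samples x genes) array of relative expression."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m_values = []
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m_values.append(np.mean(sds))
    return np.array(m_values)

rng = np.random.default_rng(0)
stable = 2.0 ** rng.normal(0.0, 0.05, size=(15, 2))    # two tightly regulated genes
unstable = 2.0 ** rng.normal(0.0, 0.80, size=(15, 1))  # one highly variable gene
m = stability_m(np.hstack([stable, unstable]))
print(int(m.argmax()))  # the variable gene gets the largest M → 2
```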

  8. Quantitative Analysis of the Efficiency of OLEDs.

    PubMed

    Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo

    2016-12-07

    We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
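
    The multiplicative structure of the model described above can be sketched in one line: the external quantum efficiency is the product of the charge balance, the fraction of emissive excitons, the effective radiative quantum efficiency, and the out-coupling efficiency. The numbers below are illustrative, not values from the paper:

```python
def external_quantum_efficiency(charge_balance, emissive_fraction,
                                radiative_eff, outcoupling):
    """EQE as a product of loss factors (all dimensionless, in 0..1)."""
    return charge_balance * emissive_fraction * radiative_eff * outcoupling

# Hypothetical phosphorescent device: triplets are harvested, so the
# emissive exciton fraction is taken as 1.
eqe = external_quantum_efficiency(0.95, 1.0, 0.90, 0.25)
print(f"{eqe:.3f}")  # → 0.214
```

    Measuring how each factor moves with current density is what lets the paper attribute efficiency roll-off to imbalance, quenching, or optical design loss.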

  9. Quantitative prediction of solute strengthening in aluminium alloys.

    PubMed

    Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F

    2010-09-01

    Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.

  10. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    PubMed

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C(2) have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C(2) chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C(2) in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C(3) and C(5) fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assist the further understanding of the role of C(2) and of its potential chemical interdependences with CH and other small radicals.

  11. Quantitative Investigation of Protein-Nucleic Acid Interactions by Biosensor Surface Plasmon Resonance.

    PubMed

    Wang, Shuo; Poon, Gregory M K; Wilson, W David

    2015-01-01

    Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
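
    The kinetic characterization mentioned above usually rests on the 1:1 Langmuir interaction model, dR/dt = ka·C·(Rmax − R) − kd·R. A sketch with hypothetical rate constants (not values from the chapter), showing that the long-time response reproduces the steady-state relation R_eq/Rmax = C/(C + KD):

```python
import math

def association_response(t, conc, ka, kd, rmax):
    """Sensorgram during association for dR/dt = ka*C*(Rmax - R) - kd*R."""
    k_obs = ka * conc + kd
    r_eq = ka * conc * rmax / k_obs
    return r_eq * (1.0 - math.exp(-k_obs * t))

ka, kd = 1.0e5, 1.0e-3       # 1/(M*s) and 1/s, so KD = kd/ka = 10 nM (assumed)
conc, rmax = 50.0e-9, 100.0  # 50 nM analyte; Rmax in response units

r_eq = association_response(1.0e9, conc, ka, kd, rmax)  # long-time limit
# Steady-state check: 50 / (50 + 10) * 100 ≈ 83.3
print(round(r_eq, 1))  # → 83.3
```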

  12. Mueller matrix microscope: a quantitative tool to facilitate detections and fibrosis scorings of liver cirrhosis and cancer tissues.

    PubMed

    Wang, Ye; He, Honghui; Chang, Jintao; He, Chao; Liu, Shaoxiong; Li, Migao; Zeng, Nan; Wu, Jian; Ma, Hui

    2016-07-01

    Today the increasing cancer incidence rate is becoming one of the biggest threats to human health. Among all types of cancers, liver cancer ranks in the top five in both frequency and mortality rate all over the world. During the development of liver cancer, fibrosis often evolves as part of a healing process in response to liver damage, resulting in cirrhosis of liver tissues. In a previous study, we applied the Mueller matrix microscope to pathological liver tissue samples and found that both the Mueller matrix polar decomposition (MMPD) and Mueller matrix transformation (MMT) parameters are closely related to the fibrous microstructures. In this paper, we take this one step further to quantitatively facilitate the fibrosis detections and scorings of pathological liver tissue samples in different stages from cirrhosis to cancer using the Mueller matrix microscope. The experimental results of MMPD and MMT parameters for the fibrotic liver tissue samples in different stages are measured and analyzed. We also conduct Monte Carlo simulations based on the sphere birefringence model to examine in detail the influence of structural changes in different fibrosis stages on the imaging parameters. Both the experimental and simulated results indicate that the polarized light microscope and transformed Mueller matrix parameters can provide additional quantitative information helpful for fibrosis detections and scorings of liver cirrhosis and cancers. Therefore, the polarized light microscope and transformed Mueller matrix parameters have a good application prospect in liver cancer diagnosis.

  13. Mueller matrix microscope: a quantitative tool to facilitate detections and fibrosis scorings of liver cirrhosis and cancer tissues

    NASA Astrophysics Data System (ADS)

    Wang, Ye; He, Honghui; Chang, Jintao; He, Chao; Liu, Shaoxiong; Li, Migao; Zeng, Nan; Wu, Jian; Ma, Hui

    2016-07-01

    Today the increasing cancer incidence rate is becoming one of the biggest threats to human health. Among all types of cancers, liver cancer ranks in the top five in both frequency and mortality rate all over the world. During the development of liver cancer, fibrosis often evolves as part of a healing process in response to liver damage, resulting in cirrhosis of liver tissues. In a previous study, we applied the Mueller matrix microscope to pathological liver tissue samples and found that both the Mueller matrix polar decomposition (MMPD) and Mueller matrix transformation (MMT) parameters are closely related to the fibrous microstructures. In this paper, we take this one step further to quantitatively facilitate the fibrosis detections and scorings of pathological liver tissue samples in different stages from cirrhosis to cancer using the Mueller matrix microscope. The experimental results of MMPD and MMT parameters for the fibrotic liver tissue samples in different stages are measured and analyzed. We also conduct Monte Carlo simulations based on the sphere birefringence model to examine in detail the influence of structural changes in different fibrosis stages on the imaging parameters. Both the experimental and simulated results indicate that the polarized light microscope and transformed Mueller matrix parameters can provide additional quantitative information helpful for fibrosis detections and scorings of liver cirrhosis and cancers. Therefore, the polarized light microscope and transformed Mueller matrix parameters have a good application prospect in liver cancer diagnosis.

  14. An improved transmutation method for quantitative determination of the components in multicomponent overlapping chromatograms.

    PubMed

    Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong

    2004-06-01

    An improved method is proposed for the quantitative determination of multicomponent overlapping chromatograms based on a known transmutation method. To overcome the main limitation of the transmutation method caused by the oscillation generated in the transmutation process, two techniques--wavelet transform smoothing and the cubic spline interpolation for reducing data points--were adopted, and a new criterion was also developed. By using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both the simulated and experimental overlapping chromatograms is successfully obtained.

  15. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on the singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using analyses of beer and marzipan spectra. The derivative spectra of the beer and marzipan samples are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
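
    The Savitzky-Golay baseline mentioned above can be written out from first principles: each derivative estimate is the linear coefficient of a least-squares polynomial fit on a sliding window. Window and order values here are illustrative:

```python
import numpy as np

def savgol_first_derivative(y, window=7, polyorder=2, delta=1.0):
    """Savitzky-Golay first derivative via explicit least-squares window fits."""
    half = window // 2
    offsets = np.arange(-half, half + 1)
    # Row 1 of the pseudoinverse gives the derivative weights at the window center.
    vander = np.vander(offsets, polyorder + 1, increasing=True)
    coeffs = np.linalg.pinv(vander)[1]
    deriv = np.convolve(y, coeffs[::-1], mode="same") / delta
    deriv[:half] = deriv[half]               # crude edge handling
    deriv[-half:] = deriv[-half - 1]
    return deriv

x = np.linspace(0.0, 1.0, 101)
d = savgol_first_derivative(x**2, delta=x[1] - x[0])
# A quadratic is fit exactly by a 2nd-order window, so interior points give 2x.
print(bool(np.allclose(d[10:90], 2 * x[10:90])))  # → True
```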

  16. Comparison of Computational and Experimental Microphone Array Results for an 18%-Scale Aircraft Model

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Humphreys, William M.; Khorrami, Mehdi R.; Fares, Ehab; Casalino, Damiano; Ravetta, Patricio A.

    2015-01-01

    An 18%-scale, semi-span model is used as a platform for examining the efficacy of microphone array processing using synthetic data from numerical simulations. Two hybrid RANS/LES codes coupled with Ffowcs Williams-Hawkings solvers are used to calculate 97 microphone signals at the locations of an array employed in the NASA LaRC 14x22 tunnel. Conventional, DAMAS, and CLEAN-SC array processing is applied in an identical fashion to the experimental and computational results for three different configurations involving deploying and retracting the main landing gear and a part span flap. Despite the short time records of the numerical signals, the beamform maps are able to isolate the noise sources, and the appearance of the DAMAS synthetic array maps is generally better than those from the experimental data. The experimental CLEAN-SC maps are similar in quality to those from the simulations indicating that CLEAN-SC may have less sensitivity to background noise. The spectrum obtained from DAMAS processing of synthetic array data is nearly identical to the spectrum of the center microphone of the array, indicating that for this problem array processing of synthetic data does not improve spectral comparisons with experiment. However, the beamform maps do provide an additional means of comparison that can reveal differences that cannot be ascertained from spectra alone.

  17. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  18. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity is observed. Additionally, calculations are performed to simulate the electric near field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  19. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
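
    The dry-thickness route listed above reduces to the widely used one-line estimate σ = h·ρ·N_A/M_n in chains per nm². A sketch with illustrative values, not numbers from the paper:

```python
AVOGADRO = 6.02214076e23  # 1/mol

def grafting_density(dry_thickness_nm, density_g_cm3, mn_g_mol):
    """Chains per nm^2 from dry film thickness h, bulk density rho, and M_n."""
    rho_g_nm3 = density_g_cm3 / 1.0e21   # 1 cm^3 = 1e21 nm^3
    return dry_thickness_nm * rho_g_nm3 * AVOGADRO / mn_g_mol

# e.g. a hypothetical 10 nm dry polystyrene-like layer (rho ~ 1.05 g/cm^3)
# of 50 kg/mol chains:
sigma = grafting_density(10.0, 1.05, 50_000.0)
print(f"{sigma:.3f} chains/nm^2")  # → 0.126 chains/nm^2
```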

  20. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational methods, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.

  1. Improved Methods for Capture, Extraction, and Quantitative Assay of Environmental DNA from Asian Bigheaded Carp (Hypophthalmichthys spp.)

    PubMed Central

    Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207

  2. Improved methods for capture, extraction, and quantitative assay of environmental DNA from Asian bigheaded carp (Hypophthalmichthys spp.).

    PubMed

    Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel

    2014-01-01

    Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.

  3. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products; ANALYSE QUANTITATIVE DU THORIUM DANS LES MINERAIS ET LES PRODUITS THORIFERES PAR UNE METHODE BASEE SUR LA RADIOACTIVITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

    A brief resume of the classical methods of quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the utilization of the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements is discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or smaller. (J.S.R.)

  4. QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LAO,LL; SNYDER,PB; LEONARD,AW

    2003-03-01

    Several testable features of the working model of edge localized modes (ELMs) as intermediate toroidal mode number peeling-ballooning modes are evaluated quantitatively using DIII-D and JT-60U experimental data and the ELITE MHD stability code. These include the hypotheses that ELM sizes are related to the radial widths of the unstable MHD modes, that the unstable modes have a strong ballooning character localized in the outboard bad-curvature region, and that ELM size generally becomes smaller at high edge collisionality. ELMs are triggered when the growth rates of the unstable MHD modes become significantly large. These testable features are consistent with many ELM observations in DIII-D and JT-60U discharges.

  5. The experimental results of AMTEC and a study of its terrestrial applications in IEE of China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ni, Q.; Tong, J.; Kan, Y.

    1997-12-31

    The R and D activities in the field of AMTEC research at the Institute of Electrical Engineering, Chinese Academy of Sciences are introduced. The experimental facility with a single-tube cell is outlined. The experimental results obtained so far are reported, followed by an analysis of the electrical characteristics, in particular an evaluation of the BASE/porous-electrode interface in terms of the effective sheet resistivity and the electrode efficiency. Approaches for improving device performance are discussed. The terrestrial applications of AMTEC in China are considered as an alternative to conventional diesel generators, and the possibility of AMTEC power supply for some separate sites is predicted.

  6. Computer-aided design and experimental investigation of a hydrodynamic device: the microwire electrode

    PubMed

    Fulian; Gooch; Fisher; Stevens; Compton

    2000-08-01

    The development and application of a new electrochemical device using a computer-aided design strategy is reported. This novel design is based on the flow of electrolyte solution past a microwire electrode situated centrally within a large duct. In the design stage, finite element simulations were employed to evaluate feasible working geometries and mass transport rates. The computer-optimized designs were then exploited to construct experimental devices. Steady-state voltammetric measurements were performed for a reversible one-electron-transfer reaction to establish the experimental relationship between electrolysis current and solution velocity. The experimental results are compared to those predicted numerically, and good agreement is found. The numerical studies are also used to establish an empirical relationship between the mass transport limited current and the volume flow rate, providing a simple and quantitative alternative for workers who would prefer to exploit this device without the need to develop the numerical aspects.

  7. Secondary emission from dust grains: Comparison of experimental and model results

    NASA Astrophysics Data System (ADS)

    Richterova, I.; Pavlu, J.; Nemecek, Z.; Safrankova, J.; Zilavy, P.

    The motion, coalescence, and other processes in dust clouds are determined by the dust charge. Since dust grains in space are bombarded by energetic electrons, secondary emission is an important process contributing to their charge. It is generally expected that the secondary emission yield is related to the surface properties of the bombarded body. However, it is well known that secondary emission from small bodies is determined not only by their composition: an effect of dimension can be very important when the penetration depth of primary electrons is comparable with the grain size. This implies that the secondary emission yield can be influenced by the substrate material if the surface layer is thin enough. We have developed a simple Monte Carlo model of secondary emission that was successfully applied to dust simulants made of glass and melamine formaldehyde (MF) resin and matched experimental results very well. In order to check the influence of surface layers, we have modified the model for spheres covered by a layer with different material properties. The results of the model simulations are compared with measurements on MF spheres covered by different metals.

  8. Experimental design matters for statistical analysis: how to handle blocking.

    PubMed

    Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian

    2018-03-01

    Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type 1 error rates, confirming that misleading conclusions can easily happen when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately, avoiding misleading conclusions, the experimental design should duly be reflected in the choice of statistical approaches and models. We recommend that author guidelines should explicitly point out that authors need to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
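
    The core point of the record above — that ignoring blocking pools the large between-block variance into the error term — can be illustrated with a minimal, entirely hypothetical numeric sketch: the same blocked data analysed with a naive two-sample standard error versus the paired (block-aware) one. The data values below are invented for illustration.

```python
import statistics as st

# Hypothetical randomized complete block data: each block contributes one
# control and one treated plot; blocks differ strongly in baseline level
# (e.g. soil fertility), while the treatment effect is small and consistent.
control = [10.1, 14.8, 12.3, 17.9, 11.2, 15.5]
treated = [10.9, 15.5, 13.2, 18.5, 12.1, 16.3]
n = len(control)

# Block-aware analysis: work with within-block differences, so the
# between-block variation cancels out of the error term.
diffs = [t - c for t, c in zip(treated, control)]
se_paired = st.stdev(diffs) / n ** 0.5

# Naive two-sample analysis: treats all observations as independent and
# pools the large between-block variance into the standard error.
se_pooled = (st.variance(control) / n + st.variance(treated) / n) ** 0.5
```

With these numbers the naive standard error is more than an order of magnitude larger than the block-aware one, so the same treatment effect that is clearly detectable in the paired analysis would be dismissed by the two-sample test — the qualitative discrepancy the abstract describes.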

  9. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography.

    PubMed

    Montanini, R; Freni, F; Rossi, G L

    2012-09-01

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude-modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
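
    The histogram-based segmentation step mentioned in the record above is not described in detail here, so the sketch below substitutes a generic between-class-variance (Otsu-style) threshold applied to synthetic phase values; it is an illustrative stand-in, not the authors' algorithm, and all numbers are made up.

```python
def otsu_threshold(values, bins=64):
    """Histogram-based threshold that maximizes the between-class variance
    (Otsu's method) -- one common way to classify pixels automatically."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    centers = [lo + (i + 0.5) * width for i in range(bins)]
    total = len(values)
    sum_all = sum(h * c for h, c in zip(hist, centers))
    best_t, best_var, w0, sum0 = lo, -1.0, 0, 0.0
    for i in range(bins - 1):
        w0 += hist[i]
        sum0 += hist[i] * centers[i]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, lo + (i + 1) * width
    return best_t

# Synthetic bimodal "phase image": background near 4-6 deg, defect near 39-41.
pixels = [4.0 + (i % 10) * 0.2 for i in range(200)] \
       + [39.0 + (i % 10) * 0.2 for i in range(60)]
threshold = otsu_threshold(pixels)
defect_area_px = sum(p > threshold for p in pixels)
```

On this toy data the threshold lands between the two modes, so the pixel count above it recovers the size of the synthetic "defect" cluster.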

  10. Experimental Aerodynamic Characteristics of the Pegasus Air-Launched Booster and Comparisons with Predicted and Flight Results

    NASA Technical Reports Server (NTRS)

    Rhode, M. N.; Engelund, Walter C.; Mendenhall, Michael R.

    1995-01-01

    Experimental longitudinal and lateral-directional aerodynamic characteristics were obtained for the Pegasus and Pegasus XL configurations over a Mach number range from 1.6 to 6 and angles of attack from -4 to +24 degrees. Angle of sideslip was varied from -6 to +6 degrees, and control surfaces were deflected to obtain elevon, aileron, and rudder effectiveness. Experimental data for the Pegasus configuration are compared with engineering code predictions performed by Nielsen Engineering & Research, Inc. (NEAR) in the aerodynamic design of the Pegasus vehicle, and with results from the Aerodynamic Preliminary Analysis System (APAS) code. Comparisons of experimental results are also made with longitudinal flight data from Flight #2 of the Pegasus vehicle. Results show that the longitudinal aerodynamic characteristics of the Pegasus and Pegasus XL configurations are similar, having the same lift-curve slope and drag levels across the Mach number range. Both configurations are longitudinally stable, with stability decreasing towards neutral levels as Mach number increases. Directional stability is negative at moderate to high angles of attack due to separated flow over the vertical tail. Dihedral effect is positive for both configurations, but is reduced 30-50 percent for the Pegasus XL configuration because of the horizontal tail anhedral. Predicted longitudinal characteristics and both longitudinal and lateral-directional control effectiveness are generally in good agreement with experiment. Due to the complex leeside flowfield, lateral-directional characteristics are not as well predicted by the engineering codes. Experiment and flight data are in good agreement across the Mach number range.

  11. Modern Projection of the Old Electroscope for Nuclear Radiation Quantitative Work and Demonstrations

    ERIC Educational Resources Information Center

    Bastos, Rodrigo Oliveira; Boch, Layara Baltokoski

    2017-01-01

    Although quantitative measurements in radioactivity teaching and research are only believed to be possible with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple…

  12. Power Analysis of Artificial Selection Experiments Using Efficient Whole Genome Simulation of Quantitative Traits

    PubMed Central

    Kessner, Darren; Novembre, John

    2015-01-01

    Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50–100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. PMID:25672748
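
    A drastically simplified, single-locus version of the kind of simulation described above — truncation selection on a trait with one additive QTL plus environmental noise — might look as follows. All parameters (population size, effect size, selected fraction) are hypothetical, and this sketch omits the whole-genome linkage and recombination that the paper shows to be crucial.

```python
import random

random.seed(1)

def next_allele_freq(freq, pop_size=500, effect=1.0, env_sd=1.0,
                     selected_fraction=0.2):
    """One generation of truncation selection on a single additive QTL.

    Each individual gets a diploid genotype (0, 1, or 2 copies of the
    '+' allele) drawn at the current frequency, a phenotype equal to
    genotype * effect plus Gaussian noise, and the top fraction by
    phenotype is selected.  Returns the allele frequency among parents.
    """
    genotypes = [sum(random.random() < freq for _ in range(2))
                 for _ in range(pop_size)]
    phenotypes = [g * effect + random.gauss(0.0, env_sd) for g in genotypes]
    ranked = sorted(range(pop_size), key=lambda i: phenotypes[i], reverse=True)
    parents = ranked[:int(pop_size * selected_fraction)]
    return sum(genotypes[i] for i in parents) / (2.0 * len(parents))

freq = 0.1
for _ in range(20):
    freq = next_allele_freq(freq)
```

Under these (strong) selection parameters the '+' allele frequency rises well above its starting value of 0.1 within 20 generations, mirroring the allele-frequency trajectories the evolve-and-resequence analysis tracks.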

  13. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE Corporation found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of that technique is too great for many applications, and a better quantitative PCR method is needed. The mathematical model submitted here draws on related work and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and the other reaction conditions, and accurately reflects the accumulation of PCR product. Accurate quantitative PCR analysis can be performed using this relation: the initial template number can be recovered from the accumulated product quantity. With this model, the result error depends only on the accuracy of the measured fluorescence intensity, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number is between 100 and 1,000,000, the quantitative result is accurate to better than 99%. The error differs markedly between analysis methods under the same conditions on the same instrument; processing the data with the proposed system yields results about 80 times more accurate than the CT method.
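
    The exponential amplification model underlying both the CT method and the alternative proposed above can be sketched in a few lines. The efficiency value and fluorescence threshold below are illustrative assumptions, not parameters from the paper.

```python
import math

def cycles_to_threshold(n0, threshold_copies, efficiency=0.95):
    """Cycle at which N(c) = n0 * (1 + E)^c first reaches the
    detection threshold (the quantity the CT method measures)."""
    return math.log(threshold_copies / n0) / math.log(1.0 + efficiency)

def ct_to_initial_copies(ct, threshold_copies, efficiency=0.95):
    """Invert the amplification model: recover the initial template
    number from the observed threshold cycle."""
    return threshold_copies / (1.0 + efficiency) ** ct

# At efficiency E = 0.95, a 10-fold dilution shifts the threshold cycle
# by log(10)/log(1.95) ~ 3.45 cycles.
ct_1000 = cycles_to_threshold(1_000, 1e10)
ct_100 = cycles_to_threshold(100, 1e10)
recovered = ct_to_initial_copies(ct_1000, 1e10)
```

Any quantitation scheme built on this model, including the one proposed in the record, ultimately differs in how it estimates the effective efficiency and threshold from the fluorescence data rather than in the underlying exponential relation.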

  14. Shuttle Damage/Repair from the Perspective of Hypersonic Boundary Layer Transition - Experimental Results

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Berry, Scott A.; Merski, N. Ronald; Berger, Karen T.; Buck, Gregory M.; Liechty, Derek S.; Schneider, Steven P.

    2006-01-01

    An overview is provided of the experimental wind tunnel program conducted at the NASA Langley Research Center Aerothermodynamics Laboratory in support of an agency-wide effort to prepare the Shuttle Orbiter for Return-to-Flight. The effect of an isolated protuberance and an isolated rectangular cavity on hypersonic boundary layer transition onset on the windward surface of the Shuttle Orbiter has been experimentally characterized. These experimental studies were initiated to provide a protuberance and cavity effects database for developing hypersonic transition criteria to support on-orbit disposition of thermal protection system damage or repair. In addition, a synergistic experimental investigation was undertaken to assess the impact of an isolated mass-flow entrainment source (simulating pyrolysis/outgassing from a proposed tile repair material) on boundary layer transition. A brief review of the relevant literature regarding hypersonic boundary layer transition induced from cavities and localized mass addition from ablation is presented. Boundary layer transition results were obtained using 0.0075-scale Orbiter models with simulated tile damage (rectangular cavities) of varying length, width, and depth and simulated tile damage or repair (protuberances) of varying height. Cavity and mass addition effects were assessed at a fixed location (x/L = 0.3) along the model centerline in a region of near zero pressure gradient. Cavity length-to-depth ratio was systematically varied from 2.5 to 17.7 and length-to-width ratio of 1 to 8.5. Cavity depth-to-local boundary layer thickness ranged from 0.5 to 4.8. Protuberances were located at several sites along the centerline and port/starboard attachment lines along the chine and wing leading edge. Protuberance height-to-boundary layer thickness was varied from approximately 0.2 to 1.1. Global heat transfer images and heating distributions of the Orbiter windward surface using phosphor thermography were used to infer the

  15. Physico-chemical properties of aqueous drug solutions: From the basic thermodynamics to the advanced experimental and simulation results.

    PubMed

    Bellich, Barbara; Gamini, Amelia; Brady, John W; Cesàro, Attilio

    2018-04-05

    The physical chemical properties of aqueous solutions of model compounds are illustrated in relation to hydration and solubility issues by using three perspectives: thermodynamic, spectroscopic and molecular dynamics simulations. The thermodynamic survey of the fundamental backgrounds of concentration dependence and experimental solubility results show some peculiar behavior of aqueous solutions with several types of similar solutes. Secondly, the use of a variety of experimental spectroscopic devices, operating under different experimental conditions of dimension and frequency, has produced a large amount of structural and dynamic data on aqueous solutions showing the richness of the information produced, depending on where and how the experiment is carried out. Finally, the use of molecular dynamics computational work is presented to highlight how the different types of solute functional groups and surface topologies organize adjacent water molecules differently. The highly valuable contribution of computer simulation studies in providing molecular explanations for experimental deductions, either of a thermodynamic or spectroscopic nature, is shown to have changed the current knowledge of many aqueous solution processes. While this paper is intended to provide a collective view on the latest literature results, still the presentation aims at a tutorial explanation of the potentials of the three methodologies in the field of aqueous solutions of pharmaceutical molecules. Copyright © 2018. Published by Elsevier B.V.

  16. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
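
    The genome-size dependence of the LOQ described above follows from simple arithmetic: with a fixed DNA input and a fixed minimum number of detectable target copies, a larger genome means fewer genome copies in the tube and hence a higher percentage LOQ. The 1C genome masses used below (roughly 0.5 pg for rice, 17 pg for wheat) are illustrative literature values, not figures taken from this study.

```python
def loq_percent(lod_copies, genome_mass_pg, total_dna_ng=200.0):
    """GM fraction (in %) at which the sample contains exactly
    lod_copies of the target in total_dna_ng of genomic plant DNA."""
    genomes_in_sample = total_dna_ng * 1000.0 / genome_mass_pg  # ng -> pg
    return 100.0 * lod_copies / genomes_in_sample

rice = loq_percent(50, 0.5)    # small genome: many copies per 200 ng
wheat = loq_percent(50, 17.0)  # large genome: few copies per 200 ng
```

With a 50-copy detection limit this gives roughly 0.01% for rice and 0.4% for wheat, the same order of magnitude as the 0.02-0.7% range reported in the record.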

  17. Building quantitative, three-dimensional atlases of gene expression and morphology at cellular resolution.

    PubMed

    Knowles, David W; Biggin, Mark D

    2013-01-01

    Animals comprise dynamic three-dimensional arrays of cells that express gene products in intricate spatial and temporal patterns that determine cellular differentiation and morphogenesis. A rigorous understanding of these developmental processes requires automated methods that quantitatively record and analyze complex morphologies and their associated patterns of gene expression at cellular resolution. Here we summarize light microscopy-based approaches to establish permanent, quantitative datasets-atlases-that record this information. We focus on experiments that capture data for whole embryos or large areas of tissue in three dimensions, often at multiple time points. We compare and contrast the advantages and limitations of different methods and highlight some of the discoveries made. We emphasize the need for interdisciplinary collaborations and integrated experimental pipelines that link sample preparation, image acquisition, image analysis, database design, visualization, and quantitative analysis. Copyright © 2013 Wiley Periodicals, Inc.

  18. Multi-spectral digital holographic microscopy for enhanced quantitative phase imaging of living cells

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Kastl, Lena; Schnekenburger, Jürgen; Ketelhut, Steffi

    2018-02-01

    The main restrictions on the use of laser light in digital holographic microscopy (DHM) are coherence-induced noise and parasitic reflections in the experimental setup, which limit resolution and measurement accuracy. We explored whether the coherence properties of partially coherent light sources can be generated synthetically utilizing spectrally tunable lasers. The concept of the method is demonstrated by label-free quantitative phase imaging of living pancreatic tumor cells, utilizing an experimental configuration that includes a commercial microscope and a laser source with a broad tunable spectral range of more than 200 nm.

  19. School Context and Educational Outcomes: Results from a Quasi-Experimental Study

    PubMed Central

    Casciano, Rebecca; Massey, Douglas S.

    2013-01-01

    In this study we draw on data from a quasi-experimental study to test whether moving into a subsidized housing development in an affluent suburb yields educational benefits to the children of residents, compared to the educations they would have received had they not moved into the development. Results suggest that resident children experienced a significant improvement in school quality compared with a comparison group of students whose parents also had applied for residence. Parents who were residents of the development also displayed higher levels of school involvement compared with the comparison group of non-resident parents, and their children were exposed to significantly lower levels of school disorder and violence within school and spent more time reading outside of school. Living in the development did not influence GPA directly, but did indirectly increase GPA by increasing the time residents spent reading outside of school. PMID:25342878

  20. An experimental study of stratospheric gravity waves - Design and preliminary results

    NASA Astrophysics Data System (ADS)

    Talagrand, O.; Ovarlez, H.

    1984-02-01

    The design of balloon-borne experimental apparatus for long-term gravitational-wave measurements in the stratosphere is reported, and preliminary results of a first test flight are presented. Two gondolas (each containing a pressure sensor; a temperature sensor; horizontal and vertical sonic anemometers; a fin equipped with crossed magnetometers; and data-processing, data-transmission, and control electronics) are suspended 100 and 300 m below a solar/terrestrial-IR-absorption-heated hot-air balloon drifting between altitudes 22 km (night) and 28 km (day); power is supplied by NiCd batteries recharged by solar cells. The path of the first flight, a circumnavigation beginning in Pretoria, South Africa and crossing South America and northern Australia, from December 11, 1982, to February 2, 1983 (when transmission ceased over southern Africa) is shown on a map, and sample data for a 36-h period are summarized in a graph.

  1. Experimental and Theoretical Studies of Atmospheric Inorganic Chlorine Chemistry

    NASA Technical Reports Server (NTRS)

    Sander, Stanley P.; Friedl, Randall R.

    1993-01-01

    Over the last five years substantial progress has been made in defining the realm of new chlorine chemistry in the polar stratosphere. Application of existing experimental techniques to potentially important chlorine-containing compounds has yielded quantitative kinetic and spectroscopic data as well as qualitative mechanistic insights into the relevant reactions.

  2. Experimental and Numerical Analysis of Notched Composites Under Tension Loading

    NASA Astrophysics Data System (ADS)

    Aidi, Bilel; Case, Scott W.

    2015-12-01

    Experimental quasi-static tests were performed on center-notched carbon fiber reinforced polymer (CFRP) composites having different stacking sequences made of G40-600/5245C prepreg. The three-dimensional Digital Image Correlation (DIC) technique was used during quasi-static tests conducted on quasi-isotropic notched samples to obtain the distribution of strains as a function of applied stress. A finite element model was built within Abaqus to predict the notched strength and the strain profiles for comparison with measured results. A user-material subroutine using the multi-continuum theory (MCT) as a failure initiation criterion and an energy-based damage evolution law as implemented by Autodesk Simulation Composite Analysis (ASCA) was used to conduct a quantitative comparison of strain components predicted by the analysis and obtained in the experiments. Good agreement between the experimental data and the numerical results is observed. Modal analysis was carried out to investigate the effect of static damage on the dominant frequencies of the notched structure, using the resulting degraded material elements. The first in-plane mode was found to be a good candidate for tracking the level of damage.

  3. Early Results in Capella's Prior Learning Assessment Experimental Site Initiative

    ERIC Educational Resources Information Center

    Klein, Jillian

    2017-01-01

    In July 2014, the U.S. Department of Education announced a new round of experimental sites focusing on competency-based education. Capella University was selected to participate in three of the Department of Education's competency-based education (CBE) experiments and began by implementing the prior learning assessment experiment, which allows…

  4. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
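
    The reconstruction chain described above (measured refractive-index gradient, integrated to a refractive-index field, converted to density via the Gladstone-Dale relationship) can be sketched in one dimension. The gradient profile is synthetic, and the Gladstone-Dale constant for air (about 2.26e-4 m^3/kg) and ambient index are typical textbook values used here as assumptions.

```python
K_AIR = 2.26e-4       # Gladstone-Dale constant for air, m^3/kg (approx.)
N_AMBIENT = 1.000271  # refractive index at the undisturbed boundary

def integrate_gradient(dndx, dx, n0=N_AMBIENT):
    """Cumulative trapezoidal integration of a measured dn/dx profile,
    starting from a known refractive index at the field boundary."""
    n = [n0]
    for i in range(1, len(dndx)):
        n.append(n[-1] + 0.5 * (dndx[i - 1] + dndx[i]) * dx)
    return n

def density(n_profile, k=K_AIR):
    """Gladstone-Dale relationship: n - 1 = K * rho  =>  rho = (n - 1) / K."""
    return [(n - 1.0) / k for n in n_profile]

# Toy "schlieren measurement": a constant gradient of -1e-3 per metre
# sampled every 1 mm across a 10 mm region (e.g. a heated layer).
dndx = [-1.0e-3] * 11
n = integrate_gradient(dndx, dx=1.0e-3)
rho = density(n)
```

Two-dimensional schlieren data add a calibration step (e.g. the weak-lens standard or the BOS background displacement) to turn image intensity into dn/dx, but the integration and Gladstone-Dale conversion proceed as in this sketch.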

  5. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  6. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  7. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  8. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  9. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  10. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    PubMed

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  11. Nucleate pool boiling in subcooled liquid under microgravity: Results of TEXUS experimental investigations

    NASA Astrophysics Data System (ADS)

    Zell, M.; Straub, J.; Weinzierl, A.

    1984-12-01

Experiments on subcooled nucleate pool boiling in microgravity were carried out to isolate gravity-driven effects on heat transfer within the boiling process. Ballistic trajectories on sounding rocket flights (TEXUS 5 and 10) achieved a gravity level of a/g = 0.0001 for 360 s. To determine geometrical effects on heat transport, two different experimental configurations (platinum wire and flat plate) were employed. Boiling curves and bubble dynamics recorded by cinematography led to gravity-independent modelling of the boiling phenomena. The results confirm the applicability and high efficiency of nucleate pool boiling for heat exchangers in space laboratories.

  12. Dissemination Strategies to Improve Implementation of the PHS Smoking Cessation Guideline in MCH Public Health Clinics: Experimental Evaluation Results and Contextual Factors

    ERIC Educational Resources Information Center

    Manfredi, Clara; Cho, Young Ik; Warnecke, Richard; Saunders, Stephen; Sullivan, Myrtis

    2011-01-01

    We report results from an experimental study that tested the effectiveness of dissemination interventions to improve implementation of smoking cessation guidelines in maternal and child public health clinics. We additionally examine individual clinic results for contextual explanations not apparent from the experimental findings alone. Twelve…

  13. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach that enables data sharing and promotes reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. A variety of tools are currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Development of one novel multiple-target plasmid for duplex quantitative PCR analysis of roundup ready soybean.

    PubMed

    Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing

    2008-07-23

To enforce labeling regulations for genetically modified organisms (GMOs), the use of reference molecules as calibrators is becoming essential for practical quantification of GMOs. However, previously reported reference molecules carrying tandem multiple targets have proved unsuitable for duplex PCR analysis. In this study, we developed a unique plasmid molecule based on a pMD-18T vector carrying three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, the CaMV35S, NOS, and RRS event fragments, plus one fragment of the soybean endogenous Lectin gene. The Lectin gene fragment was separated from the three exogenous RRS target fragments by inserting a 2.6 kb DNA fragment unrelated to the RRS detection targets into the resultant plasmid. We then showed that this design allows quantification of RRS using three duplex real-time PCR assays targeting the CaMV35S, NOS, and RRS events, with this reference molecule as the calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For quantitative analysis of practical RRS samples, accuracy and precision were similar to those of simplex PCR assays; for instance, for quantitative results at the 1% level, the mean bias of the simplex and duplex PCR assays was 4.0% and 4.6%, respectively, and statistical analysis (t-test) showed no significant discrepancy between the duplex and simplex PCR data for each soybean sample. Duplex PCR analysis thus has the advantages of lowering PCR reagent costs and reducing the experimental errors of simplex PCR testing. The strategy reported here should aid the development of new reference molecules suitable for duplex quantitative PCR assays of GMOs.
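The quantification step the record describes, calibrating against a reference molecule, can be sketched generically: invert a real-time PCR standard curve (Ct versus log10 copies) to estimate copy numbers, then express GMO content as event-specific copies relative to endogenous (Lectin) copies. The slope and intercept values below are made-up examples, not the paper's calibration:

```python
# Illustrative standard-curve quantification; slope/intercept are assumed.
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert Ct = slope * log10(copies) + intercept to estimate copies."""
    return 10 ** ((ct - intercept) / slope)

def gmo_content(ct_event, ct_endogenous):
    """GMO %: event-specific copies relative to endogenous-gene copies."""
    return 100.0 * copies_from_ct(ct_event) / copies_from_ct(ct_endogenous)

# A ~3.32-cycle delay of the event target corresponds to ~10x fewer copies,
# i.e. roughly 10% GMO content under these assumed curve parameters.
print(round(gmo_content(33.32, 30.0), 2))
```

In practice each target (CaMV35S, NOS, event, Lectin) gets its own standard curve from serial dilutions of the calibrator plasmid.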

  15. Experimental results of active control on a large structure to suppress vibration

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1991-01-01

Three design methods, Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR), H-infinity, and mu-synthesis, are used to obtain compensators for suppressing the vibrations of a 10-bay vertical truss structure, a component typical of what may be used to build a large space structure. For the design process, the plant dynamic characteristics of the structure were determined experimentally using an identification method. The resulting compensators were implemented on a digital computer and tested for their ability to suppress the first bending mode response of the 10-bay vertical truss. Time histories of the measured motion are presented, and modal damping values obtained during the experiments are compared with analytical predictions. The advantages and disadvantages of the various design methods are discussed.
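The LQ core of an LQG design like the one in this record can be sketched for a single lightly damped bending mode modeled as a mass-spring-damper. The modal frequency, damping, and weights below are illustrative assumptions, not the truss parameters from the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# One bending mode: x'' + 2*zeta*wn*x' + wn^2*x = u (assumed numbers).
wn, zeta = 2 * np.pi * 1.5, 0.005          # 1.5 Hz mode, 0.5% damping
A = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
B = np.array([[0.0], [1.0]])
Q = np.diag([wn**2, 1.0])                  # weight position and velocity
R = np.array([[0.01]])                     # control-effort weight

P = solve_continuous_are(A, B, Q, R)       # algebraic Riccati solution
K = np.linalg.solve(R, B.T @ P)            # optimal state-feedback gain
closed_loop = A - B @ K
print(np.linalg.eigvals(closed_loop).real) # all negative: mode is damped
```

LQG/LTR adds a Kalman-filter state estimator and recovers loop-shape robustness; H-infinity and mu-synthesis instead optimize worst-case norms against model uncertainty.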

  16. Experimental Whole-Ecosystem Warming Alters Vegetation Phenology in a Boreal Spruce Bog: Initial Results from the SPRUCE Experiment

    NASA Astrophysics Data System (ADS)

    Richardson, A. D.

    2016-12-01

Phenology is one of the most robust indicators of the biological impacts of global change. However, the response of phenology to future environmental conditions remains highly uncertain because of the challenges of conducting realistic manipulative experiments. At the SPRUCE (Spruce and Peatland Responses Under Climatic and Environmental Change) experiment in the north-central United States, experimental temperature (0 to +9°C above ambient) and CO2 (ambient and elevated) treatments are being applied to mature, intact Picea mariana-Sphagnum spp. bog communities in their native habitat through the use of ten large (approximately 12 m wide, 10 m high) open-topped enclosures. We are tracking vegetation green-up and senescence in these chambers using repeat digital photography. Within each chamber, images are recorded every 30 minutes and uploaded to PhenoCam (http://phenocam.sr.unh.edu), where they are processed to yield quantitative measures of canopy color. These data are complemented by on-the-ground phenological observations by human observers. Air warming treatments at SPRUCE began in August 2015. We observed a delay in senescence during autumn 2015 (2-5 days per degree of warming) and an advance in onset during spring 2016 (1-4 days per degree of warming). These patterns are robust across species and methods of phenological observation (i.e., camera-based vs. human observer), and our results show very little evidence for photoperiod acting as a constraint on the response to warming. Early spring onset and the consequent loss of frost hardiness in the warmest chambers proved disadvantageous when a brief period of extreme cold (to -12°C in the control chambers, to -3°C in the +9°C chambers) followed a month of generally mild weather. Foliage mortality for both Larix and Picea was immediate and severe, although both species subsequently re-flushed. These results support the hypothesis that warming may enhance the likelihood of spring frost
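Canopy "greenness" from repeat digital photographs is commonly summarized as the green chromatic coordinate, GCC = G / (R + G + B), averaged over a region of interest to produce one value per image. This is a generic sketch of that idea, not the PhenoCam processing chain itself:

```python
import numpy as np

def green_chromatic_coordinate(image):
    """image: H x W x 3 array of R, G, B digital numbers.
    Returns the mean GCC over the image (one point in a time series)."""
    img = np.asarray(image, dtype=float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b
    # Guard against all-black pixels to avoid division by zero.
    gcc = np.where(total > 0, g / np.where(total > 0, total, 1.0), 0.0)
    return gcc.mean()

# A pure-green test image yields GCC = 1; a gray image yields GCC = 1/3.
green = np.zeros((4, 4, 3)); green[..., 1] = 255
print(green_chromatic_coordinate(green))
```

Green-up and senescence dates are then extracted from the seasonal trajectory of this one-value-per-image series.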

  17. Experimental Design for Parameter Estimation of Gene Regulatory Networks

    PubMed Central

    Timmer, Jens

    2012-01-01

    Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
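The profile likelihood idea described above can be sketched on a toy model: fix the parameter of interest on a grid, re-optimize the remaining parameters at each grid point, and inspect the resulting profile for identifiability. Assuming Gaussian noise, the profile reduces to a residual-sum-of-squares scan; the decay model and data here are illustrative, not a DREAM6 network:

```python
import numpy as np

# Toy model y = A * exp(-k * t); simulated data with true A=2.0, k=0.7.
t = np.linspace(0, 4, 20)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(-0.7 * t) + 0.01 * rng.standard_normal(t.size)

def profile_rss(k):
    """Profile out the linear amplitude A at fixed rate k (closed form)."""
    basis = np.exp(-k * t)
    A_hat = (y * basis).sum() / (basis * basis).sum()
    return ((y - A_hat * basis) ** 2).sum()

k_grid = np.linspace(0.3, 1.2, 91)
profile = np.array([profile_rss(k) for k in k_grid])
k_best = k_grid[profile.argmin()]
print(round(k_best, 2))   # close to the true rate 0.7
```

A flat profile would signal non-identifiability, and the steepness of the profile around its minimum indicates which new experiment would be most informative, which is the design criterion the record exploits.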

  18. La Methode Experimentale en Pedagogie (The Experimental Method in Pedagogy)

    ERIC Educational Resources Information Center

    Rouquette, Michel-Louis

    1975-01-01

The pedagogue is caught between the qualitative and the quantitative or regularized aspects of his work, a situation not automatically conducive to scientific study. The article refreshes the instructor on the elementary principles of experimentation: observation, systematization, elaboration of hypotheses, and strategies of comparison. (Text is in…

  19. Experimental results for a two-dimensional supersonic inlet used as a thrust deflecting nozzle

    NASA Technical Reports Server (NTRS)

    Johns, Albert L.; Burstadt, Paul L.

    1984-01-01

    Nearly all supersonic V/STOL aircraft concepts are dependent on the thrust deflecting capability of a nozzle. In one unique concept, referred to as the reverse flow dual fan, not only is there a thrust deflecting nozzle for the fan and core engine exit flow, but because of the way the propulsion system operates during vertical takeoff and landing, the supersonic inlet is also used as a thrust deflecting nozzle. This paper presents results of an experimental study to evaluate the performance of a supersonic inlet used as a thrust deflecting nozzle for this reverse flow dual fan concept. Results are presented in terms of nozzle thrust coefficient and thrust vector angle for a number of inlet/nozzle configurations. Flow visualization and nozzle exit flow survey results are also shown.

  20. An experimental approach to identify dynamical models of transcriptional regulation in living cells

    NASA Astrophysics Data System (ADS)

    Fiore, G.; Menolascina, F.; di Bernardo, M.; di Bernardo, D.

    2013-06-01

We describe an innovative experimental approach, and a proof-of-principle investigation, for the application of System Identification techniques to derive quantitative dynamical models of transcriptional regulation in living cells. Specifically, we constructed an experimental platform for System Identification based on a microfluidic device, a time-lapse microscope, and a set of automated syringes, all controlled by a computer. The platform allows delivery of a time-varying concentration of any molecule of interest to the cells trapped in the microfluidic device (input) and real-time monitoring of a fluorescent reporter protein (output) at a high sampling rate. We tested this platform on the GAL1 promoter in the yeast Saccharomyces cerevisiae driving expression of a green fluorescent protein (Gfp) fused to the GAL1 gene. We demonstrated that the System Identification platform enables accurate measurement of the input (sugar concentrations in the medium) and output (Gfp fluorescence intensity) signals, making it possible to apply System Identification techniques to obtain a quantitative dynamical model of the promoter. We explored and compared linear and nonlinear model structures in order to select the most appropriate one for deriving a quantitative model of the promoter dynamics. Our platform can be used to quickly obtain quantitative models of eukaryotic promoters, currently a complex and time-consuming process.
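The linear-model branch of the identification workflow described above can be sketched as a first-order ARX fit: regress the next output sample on the current output and input by least squares. The signals here are simulated stand-ins for the sugar input and Gfp output, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(0, 1, 200)                  # input: e.g. sugar concentration
y = np.zeros(201)                           # output: e.g. Gfp fluorescence
for t in range(200):                        # "true" system: a=0.9, b=0.5
    y[t + 1] = 0.9 * y[t] + 0.5 * u[t] + 0.001 * rng.standard_normal()

# ARX model y[t+1] = a*y[t] + b*u[t], fit by ordinary least squares.
X = np.column_stack([y[:-1], u])            # regressors [y[t], u[t]]
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
a_hat, b_hat = theta
print(round(a_hat, 2), round(b_hat, 2))     # near the true (0.9, 0.5)
```

Comparing the prediction error of this linear fit against nonlinear (e.g. Hill-type) structures on held-out data is one way to perform the model-structure selection the record mentions.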