Science.gov

Sample records for "accurately reproduce experimental"

  1. Accurate measurements of dynamics and reproducibility in small genetic networks

    PubMed Central

    Dubuis, Julien O; Samanta, Reba; Gregor, Thomas

    2013-01-01

    Quantification of gene expression has become a central tool for understanding genetic networks. In many systems, the only viable way to measure protein levels is by immunofluorescence, which is notorious for its limited accuracy. Using the early Drosophila embryo as an example, we show that careful identification and control of experimental error allows for highly accurate gene expression measurements. We generated antibodies in different host species, allowing for simultaneous staining of four Drosophila gap genes in individual embryos. Careful error analysis of hundreds of expression profiles reveals that less than ∼20% of the observed embryo-to-embryo fluctuations stem from experimental error. These measurements make it possible to extract not only very accurate mean gene expression profiles but also their naturally occurring fluctuations of biological origin and corresponding cross-correlations. We use this analysis to extract gap gene profile dynamics with ∼1 min accuracy. The combination of these new measurements and analysis techniques reveals a twofold increase in profile reproducibility owing to a collective network dynamics that relays positional accuracy from the maternal gradients to the pair-rule genes. PMID:23340845
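
    A minimal sketch (toy numbers, not the paper's pipeline) of the variance decomposition implied above: if each embryo is measured twice with independent stainings, half the mean squared replicate difference estimates the experimental variance, which can then be compared with the embryo-to-embryo variance of single measurements.

      import statistics as st

      def experimental_fraction(rep1, rep2):
          """rep1/rep2: one value per embryo from two independent stainings.
          Var(a - b) = 2 * sigma_exp^2 when both share the biological value."""
          exp_var = sum((a - b) ** 2 for a, b in zip(rep1, rep2)) / (2 * len(rep1))
          obs_var = st.variance(rep1)   # embryo-to-embryo variance, one staining
          return exp_var / obs_var

      rep1 = [0.52, 0.61, 0.47, 0.58, 0.50]   # hypothetical normalized levels
      rep2 = [0.50, 0.63, 0.49, 0.55, 0.52]
      print(f"experimental share ~ {experimental_fraction(rep1, rep2):.0%}")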

  2. Accurate and reproducible determination of lignin molar mass by acetobromination.

    PubMed

    Asikkala, Janne; Tamminen, Tarja; Argyropoulos, Dimitris S

    2012-09-12

    The accurate and reproducible determination of lignin molar mass by using size exclusion chromatography (SEC) is challenging. The lignin association effects, known to dominate underivatized lignins, have been thoroughly addressed by reaction with acetyl bromide in an excess of glacial acetic acid. The combination of a concerted acetylation with the introduction of bromine within the lignin alkyl side chains is thought to be responsible for the observed excellent solubilization characteristics acetobromination imparts to a variety of lignin samples. The proposed methodology was compared and contrasted to traditional lignin derivatization methods. In addition, side reactions that could possibly be induced under the acetobromination conditions were explored with native softwood (milled wood lignin, MWL) and technical (kraft) lignin. These efforts lend support toward the use of room temperature acetobromination being a facile, effective, and universal lignin derivatization medium proposed to be employed prior to SEC measurements. PMID:22870925

  3. Experimental challenges to reproduce seismic fault motion

    NASA Astrophysics Data System (ADS)

    Shimamoto, T.

    2011-12-01

This presentation briefly reviews scientific and technical developments in the study of intermediate- to high-velocity frictional properties of faults and summarizes the remaining technical challenges in reproducing the nucleation-to-growth processes of large earthquakes in the laboratory. Nearly 10 high-velocity or low- to high-velocity friction apparatuses have been built worldwide in the last several years, and it is now possible to produce slip rates from sub-plate velocities to seismic rates in a single machine. Despite the spread of high-velocity friction studies, reproducing seismic fault motion at high P and T conditions covering the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. The frictional melting process was solved as a Stefan problem, with very good agreement with experimental results. Thermal pressurization has been solved theoretically on the basis of measured transport properties and has been included successfully in models of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions, partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies have demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new way of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to build a pressure vessel for wet high-velocity experiments in the last several years. A technical

  4. Cycle accurate and cycle reproducible memory for an FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-03-15

A method, system and computer program product are disclosed for using a Field Programmable Gate Array (FPGA) to simulate operations of a device under test (DUT). The DUT includes a device memory having a first number of input ports, and the FPGA is associated with a target memory having a second number of input ports, the second number being less than the first number. In one embodiment, a given set of inputs is applied to the device memory at a frequency Fd and in a defined cycle of time, and the given set of inputs is applied to the target memory at a frequency Ft. Ft is greater than Fd, and cycle accuracy is maintained between the device memory and the target memory. In an embodiment, a cycle accurate model of the DUT memory is created by separating the DUT memory interface protocol from the target memory storage array.
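
    An illustrative sketch of the time-multiplexing idea (an assumption-level reading of the abstract, not the patented design): the port requests issued in one device cycle at Fd are served sequentially by a one-port target memory running faster, so all results are available at the next device clock edge and cycle accuracy is preserved.

      class TargetMemory:
          """One-port memory standing in for the FPGA target memory (runs at
          Ft = P * Fd, so it can absorb P requests per device cycle)."""
          def __init__(self):
              self.data = {}

          def access(self, op, addr, value=None):
              if op == "write":
                  self.data[addr] = value
                  return None
              return self.data.get(addr, 0)    # read

      def run_device_cycle(target, port_requests):
          """Serve the P port requests of one device cycle (frequency Fd)
          sequentially in P fast target cycles; the simulated device sees all
          results at the next Fd edge, preserving cycle accuracy. The serial
          ordering of same-cycle accesses is a modeling choice here."""
          return [target.access(*req) for req in port_requests]

      mem = TargetMemory()
      # Two ports active in the same device cycle: one write, one read.
      print(run_device_cycle(mem, [("write", 0x10, 42), ("read", 0x10)]))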

  5. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

Asaad, Sameh W.; Kapur, Mohit

    2016-01-05

A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.
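
    A small sketch of keeping generated clocks at a defined frequency ratio (illustrative only, not the patent's circuit): deriving every clock from one common fast base clock makes the ratio exact by construction.

      from math import gcd

      def edge_schedule(freqs_hz, n_base_cycles=12):
          base = 0
          for f in freqs_hz:   # base clock frequency = LCM of all clock rates
              base = f if base == 0 else base * f // gcd(base, f)
          dividers = [base // f for f in freqs_hz]
          edges = [[t for t in range(n_base_cycles) if t % d == 0]
                   for d in dividers]
          return base, edges

      base, edges = edge_schedule([100_000_000, 400_000_000])   # 1:4 ratio
      print(base, edges)   # 400 MHz base; slow clock ticks every 4th base cycle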

  6. A cricoid cartilage compression device for the accurate and reproducible application of cricoid pressure.

    PubMed

    Taylor, R J; Smurthwaite, G; Mehmood, I; Kitchen, G B; Baker, R D

    2015-01-01

We describe the development and laboratory assessment of a refined prototype tactile feedback device for the safe and accurate application of cricoid pressure. We recruited 20 operating department practitioners and compared their performance of cricoid pressure on a training simulator using both the device and a manual unaided technique. The device significantly reduced the spread of the applied force: average (SE) root mean squared error decreased from 8.23 (0.48) N to 5.23 (0.32) N (p < 0.001). The average (SE) upwards bias in applied force also decreased, from 2.30 (0.74) N to 0.88 (0.48) N (p < 0.01). Most importantly, the percentage of force applications that deviated from target by more than 10 N decreased from 18% to 7% (p < 0.01). The device requires no prior training, is cheap to manufacture, is single-use and requires no power to operate, whilst ensuring that the correct force is applied consistently. PMID:25267415
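
    The reported metrics reduce to elementary statistics; a minimal sketch, assuming a 30 N target force (the value commonly recommended for cricoid pressure) and hypothetical force samples:

      import math

      def force_metrics(applied_newtons, target=30.0, tolerance=10.0):
          errors = [f - target for f in applied_newtons]
          rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
          bias = sum(errors) / len(errors)   # mean signed error (upward if > 0)
          out_of_band = sum(abs(e) > tolerance for e in errors) / len(errors)
          return rmse, bias, out_of_band

      rmse, bias, frac = force_metrics([28.1, 33.5, 41.2, 25.0, 31.8])
      print(f"RMSE {rmse:.2f} N, bias {bias:+.2f} N, >10 N deviations {frac:.0%}")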

  7. A simple safe, reliable and reproducible mechanism for producing experimental bite marks.

    PubMed

    Chinni, S Subramanyeswara; Al-Ibrahim, Anas; Forgie, Andrew H

    2013-12-01

With improving technology, it should be possible to develop an objective, reliable and valid method that can be undertaken by most forensic odontologists without recourse to expensive or bulky equipment. One of the main factors affecting the physical appearance of a bite mark is the amount of force applied during biting. There is little evidence relating the appearance of a bite mark to the amount of force applied and how that force relates to the biter's maximal bite force. This paper describes a simple apparatus that can be used to inflict experimental bites on living subjects reproducibly and with minimal risk. The aims of this study are to report on the development of a mechanical apparatus that produces experimental bite marks on living human subjects with a known force in a safe, reliable and reproducible manner, and to relate the force applied during production of the experimental bite mark to the maximum bite force of the biter. The maximum bite force of one of the authors was determined as 324 N. Experimental bite marks were inflicted on living subjects with known weights. Weights of up to 10 kg were well tolerated by the subjects. The relation between the forces used to inflict bites and the maximum bite force of the author is reported, with 10 kg being approximately one third of the maximum bite force. The apparatus was well tolerated and the results were reliable and reproducible. The results from this study could help in determining the severity of bite marks. This apparatus could help researchers develop objective bite mark analysis techniques. PMID:24776438
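
    The force arithmetic quoted above is easy to verify; a two-line check assuming g = 9.81 m/s²:

      # A 10 kg dead weight exerts 10 * 9.81 = 98.1 N, roughly one third of the
      # author's 324 N maximum bite force, as stated in the abstract.
      weight_kg, g, max_bite_N = 10, 9.81, 324
      applied_N = weight_kg * g
      print(applied_N, applied_N / max_bite_N)   # 98.1 N, ratio ~0.30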

  8. Reproducibility and variability of the cost functions reconstructed from experimental recordings in multifinger prehension.

    PubMed

    Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2012-01-01

    The study examines whether the cost functions reconstructed from experimental recordings are reproducible over time. Participants repeated the trials on three days. By following Analytical Inverse Optimization procedures, the cost functions of finger forces were reconstructed for each day. The cost functions were found to be reproducible over time: application of a cost function C(i) to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) the 2nd order coefficients of the cost function showed negative linear relations with finger force magnitudes; (b) the finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space for all subjects and all testing sessions; (c) the data agreed well with the principle of superposition, i.e. the action of object prehension can be decoupled into the control of rotational equilibrium and slipping prevention. PMID:22364441

  9. Randomized block experimental designs can increase the power and reproducibility of laboratory animal experiments.

    PubMed

    Festing, Michael F W

    2014-01-01

    Randomized block experimental designs have been widely used in agricultural and industrial research for many decades. Usually they are more powerful, have higher external validity, are less subject to bias, and produce more reproducible results than the completely randomized designs typically used in research involving laboratory animals. Reproducibility can be further increased by using time as a blocking factor. These benefits can be achieved at no extra cost. A small experiment investigating the effect of an antioxidant on the activity of a liver enzyme in four inbred mouse strains, which had two replications (blocks) separated by a period of two months, illustrates this approach. The widespread failure to use these designs more widely in research involving laboratory animals has probably led to a substantial waste of animals, money, and scientific resources and slowed down the development of new treatments for human and animal diseases.
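
    A toy simulation (not from the paper) of why blocking on time helps: measurements taken in two runs share a nuisance shift that inflates the pooled error variance of a completely randomized analysis, but cancels when variance is estimated within blocks.

      import random
      random.seed(1)

      # Two runs ("blocks") two months apart; the second run carries an assumed
      # nuisance shift of +2 on top of unit measurement noise.
      block_shift = [0.0, 2.0]
      control = [block_shift[b] + random.gauss(0, 1)
                 for b in (0, 1) for _ in range(10)]

      def var(xs):
          m = sum(xs) / len(xs)
          return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

      pooled = var(control)   # CRD error estimate: block effect leaks in
      within = 0.5 * (var(control[:10]) + var(control[10:]))   # RBD: it cancels
      print(pooled, within)   # pooled is inflated; within stays near 1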

  10. Reproducibility and Variability of the Cost Functions Reconstructed from Experimental Recordings in Multi-Finger Prehension

    PubMed Central

    Niu, Xun; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2012-01-01

    The main goal of the study is to examine whether the cost (objective) functions reconstructed from experimental recordings in multi-finger prehension tasks are reproducible over time, i.e., whether the functions reflect stable preferences of the subjects and can be considered personal characteristics of motor coordination. Young, healthy participants grasped an instrumented handle with varied values of external torque, load and target grasping force and repeated the trials on three days: Day 1, Day 2, and Day 7. By following Analytical Inverse Optimization (ANIO) computation procedures, the cost functions for individual subjects were reconstructed from the experimental recordings (individual finger forces) for each day. The cost functions represented second-order polynomials of finger forces with non-zero linear terms. To check whether the obtained cost functions were reproducible over time a cross-validation was performed: a cost function obtained on Day i was applied to experimental data observed on Day j (i≠j). In spite of the observed day-to-day variability of the performance and the cost functions, the ANIO reconstructed cost functions were found to be reproducible over time: application of a cost function Ci to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) The 2nd order coefficients Ki of the cost function showed negative linear relations with finger force magnitudes. This fact may be interpreted as encouraging involvement of stronger fingers in tasks requiring higher total force magnitude production. (b) The finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space, which has been confirmed for all subjects and all testing sessions. (c) The discovered principal components in the principal component analysis of the finger forces agreed well with the principle of superposition, i.e. the complex action of
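
    A minimal sketch of the cross-validation step, assuming a quadratic cost C(f) = Σj (kj fj² + wj fj) minimized under a total-force constraint (a simplification of the full ANIO procedure; all coefficients and forces below are hypothetical):

      def optimal_sharing(k, w, total_force):
          # Lagrange solution of min sum(k_j f_j^2 + w_j f_j) s.t. sum f_j = F:
          # f_j = (lam - w_j) / (2 k_j), with lam fixed by the constraint.
          inv2k = [1.0 / (2.0 * kj) for kj in k]
          lam = (total_force + sum(wj * c for wj, c in zip(w, inv2k))) / sum(inv2k)
          return [(lam - wj) * c for wj, c in zip(w, inv2k)]

      def rms_deviation(pred, obs):
          return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

      k_day1, w_day1 = [0.9, 1.1, 1.6, 2.4], [0.2, 0.1, 0.3, 0.5]  # day-i fit
      observed_day2 = [7.1, 5.8, 3.9, 2.6]                         # day-j data, N
      pred = optimal_sharing(k_day1, w_day1, sum(observed_day2))
      print(pred, rms_deviation(pred, observed_day2))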

  11. Reproducible Science

    PubMed Central

    Casadevall, Arturo; Fang, Ferric C.

    2010-01-01

    The reproducibility of an experimental result is a fundamental assumption in science. Yet, results that are merely confirmatory of previous findings are given low priority and can be difficult to publish. Furthermore, the complex and chaotic nature of biological systems imposes limitations on the replicability of scientific experiments. This essay explores the importance and limits of reproducibility in scientific manuscripts. PMID:20876290

  12. Panel-based Genetic Diagnostic Testing for Inherited Eye Diseases is Highly Accurate and Reproducible and More Sensitive for Variant Detection Than Exome Sequencing

    PubMed Central

    Bujakowska, Kinga M.; Sousa, Maria E.; Fonseca-Kelly, Zoë D.; Taub, Daniel G.; Janessian, Maria; Wang, Dan Yi; Au, Elizabeth D.; Sims, Katherine B.; Sweetser, David A.; Fulton, Anne B.; Liu, Qin; Wiggs, Janey L.; Gai, Xiaowu; Pierce, Eric A.

    2015-01-01

Purpose: Next-generation sequencing (NGS) based methods are being adopted broadly for genetic diagnostic testing, but the performance characteristics of these techniques have not been fully defined with regard to test accuracy and reproducibility. Methods: We developed a targeted enrichment and NGS approach for genetic diagnostic testing of patients with inherited eye disorders, including inherited retinal degenerations (IRDs), optic atrophy and glaucoma. In preparation for providing this Genetic Eye Disease (GEDi) test on a CLIA-certified basis, we performed experiments to measure the sensitivity, specificity and reproducibility, as well as the clinical sensitivity, of the test. Results: The GEDi test is highly reproducible and accurate, with sensitivity and specificity for single nucleotide variant detection of 97.9% and 100%, respectively. The sensitivity for variant detection was notably better than the 88.3% achieved by whole exome sequencing (WES) using the same metrics, due to better coverage of targeted genes in the GEDi test compared to commercially available exome capture sets. Prospective testing of 192 patients with IRDs indicated that the clinical sensitivity of the GEDi test is high, with a diagnostic rate of 51%. Conclusion: The data suggest that, based on quantified performance metrics, selective targeted enrichment is preferable to WES for genetic diagnostic testing. PMID:25412400
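
    For reference, the headline accuracy figures reduce to simple count ratios; a toy computation with hypothetical variant counts chosen to reproduce the quoted 97.9%/100%:

      def sens_spec(tp, fn, tn, fp):
          return tp / (tp + fn), tn / (tn + fp)

      sens, spec = sens_spec(tp=940, fn=20, tn=5000, fp=0)  # hypothetical counts
      print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 97.9%, 100.0%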

  13. Accurate theoretical and experimental characterization of optical grating coupler.

    PubMed

    Fesharaki, Faezeh; Hossain, Nadir; Vigne, Sebastien; Chaker, Mohamed; Wu, Ke

    2016-09-01

Periodic structures, acting as reflectors, filters, and couplers, are a fundamental building block in many optical devices. In this paper, a three-dimensional simulation of a grating coupler, a well-known periodic structure, is conducted. Guided-wave and leakage characteristics of an out-of-plane grating coupler are studied in detail, and its coupling efficiency is examined. Furthermore, a numerical calibration analysis is performed with a commercial software package, on the basis of a full-wave finite-element method, to calculate the complex propagation constant of the structure and to evaluate the radiation pattern. For experimental evaluation, an optimized grating coupler is fabricated using electron-beam lithography and plasma etching. Excellent agreement between simulations and measurements is observed, validating the demonstrated method. PMID:27607706

  14. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    NASA Astrophysics Data System (ADS)

    Filice, Luigino; Gagliardi, Francesco; Shivpuri, Rajiv; Umbrello, Domenico

    2007-05-01

Metallic foams are among the most exciting materials introduced into manufacturing in recent years. In the study addressed here, experimental and numerical investigations of the forging of a simple foam billet into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests aimed at producing a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study because of its industrial demand and its particular 3D complex shape. A finite element code (Deform 3D®) was used to model the foam behavior during forging, with an accurate material rheology description based on a porous material model that includes the measured local density. Once the effectiveness of the finite element model was verified by comparison with the experimental evidence, the influence of the foam density was studied numerically. The numerical results show that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash, and the estimation of the punch load.

  15. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    SciTech Connect

    Filice, Luigino; Gagliardi, Francesco; Umbrello, Domenico; Shivpuri, Rajiv

    2007-05-17

Metallic foams are among the most exciting materials introduced into manufacturing in recent years. In the study addressed here, experimental and numerical investigations of the forging of a simple foam billet into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests aimed at producing a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study because of its industrial demand and its particular 3D complex shape. A finite element code (Deform 3D) was used to model the foam behavior during forging, with an accurate material rheology description based on a porous material model that includes the measured local density. Once the effectiveness of the finite element model was verified by comparison with the experimental evidence, the influence of the foam density was studied numerically. The numerical results show that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash, and the estimation of the punch load.

  16. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    NASA Astrophysics Data System (ADS)

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-01

Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015), 10.1038/nature14048]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data

  17. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions.

    PubMed

    Nagayama, T; Bailey, J E; Loisel, G; Rochau, G A; MacFarlane, J J; Golovkin, I

    2016-02-01

Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data interpretation and the

  18. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    DOE PAGES

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-05

Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the

  19. Elusive reproducibility.

    PubMed

    Gori, Gio Batta

    2014-08-01

    Reproducibility remains a mirage for many biomedical studies because inherent experimental uncertainties generate idiosyncratic outcomes. The authentication and error rates of primary empirical data are often elusive, while multifactorial confounders beset experimental setups. Substantive methodological remedies are difficult to conceive, signifying that many biomedical studies yield more or less plausible results, depending on the attending uncertainties. Real life applications of those results remain problematic, with important exceptions for counterfactual field validations of strong experimental signals, notably for some vaccines and drugs, and for certain safety and occupational measures. It is argued that industrial, commercial and public policies and regulations could not ethically rely on unreliable biomedical results; rather, they should be rationally grounded on transparent cost-benefit tradeoffs. PMID:24882687

  1. Evaluation of new reference genes in papaya for accurate transcript normalization under different experimental conditions.

    PubMed

    Zhu, Xiaoyang; Li, Xueping; Chen, Weixin; Chen, Jianye; Lu, Wangjin; Chen, Lei; Fu, Danwen

    2012-01-01

Real-time reverse transcription PCR (RT-qPCR) is a preferred method for rapid and accurate quantification of gene expression. Appropriate application of RT-qPCR requires accurate normalization through the use of reference genes. As no single reference gene is universally suitable for all experiments, validation of reference genes under different experimental conditions is crucial for RT-qPCR analysis. To date, only a few studies on reference genes have been done in other plants, and none in papaya. In the present work, we selected 21 candidate reference genes and evaluated their expression stability in 246 papaya fruit samples using three algorithms: geNorm, NormFinder and RefFinder. The samples consisted of 13 sets collected under different experimental conditions, including various tissues, different storage temperatures, different cultivars, developmental stages, postharvest ripening, modified atmosphere packaging, 1-methylcyclopropene (1-MCP) treatment, hot water treatment, biotic stress and hormone treatment. Our results demonstrated that expression stability varied greatly between reference genes and that suitable reference genes, or combinations of reference genes, should be validated for the specific experimental conditions at hand. In general, the reference genes EIF (eukaryotic initiation factor 4A), TBP1 (TATA binding protein 1) and TBP2 (TATA binding protein 2) performed well under most experimental conditions, whereas the most widely used reference genes, ACTIN (Actin 2), 18S rRNA (18S ribosomal RNA) and GAPDH (glyceraldehyde-3-phosphate dehydrogenase), were unsuitable under many conditions. In addition, the two commonly used programs, geNorm and NormFinder, proved sufficient for the validation. This work provides the first systematic analysis for the selection of superior reference genes for accurate transcript normalization in papaya under different experimental conditions.
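
    A compact sketch of the geNorm stability measure used in studies like this one (expression values below are hypothetical relative quantities): each gene's M value is the average standard deviation of its pairwise log2 expression ratios with every other candidate, and lower M means more stable (Vandesompele et al., 2002).

      import math

      def genorm_m(expr):
          """expr: {gene: [relative quantity per sample]} -> {gene: M value}."""
          def sd(xs):
              m = sum(xs) / len(xs)
              return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
          genes, n = list(expr), len(next(iter(expr.values())))
          M = {}
          for g in genes:
              vs = [sd([math.log2(expr[g][i] / expr[h][i]) for i in range(n)])
                    for h in genes if h != g]
              M[g] = sum(vs) / len(vs)
          return M

      expr = {"EIF":   [1.00, 1.10, 0.90, 1.00],   # hypothetical values
              "TBP1":  [0.50, 0.56, 0.44, 0.52],
              "ACTIN": [1.00, 2.40, 0.30, 1.60]}
      print(genorm_m(expr))   # ACTIN gets the largest (least stable) M value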

  2. Implementation of an experimental pilot reproducing the fouling of the exhaust gas recirculation system in diesel engines

    NASA Astrophysics Data System (ADS)

    Gaborieau, Cécile; Sommier, Alain; Toutain, Jean; Anguy, Yannick; Crepeau, Gérald; Gobin, Benoît

    2012-04-01

The European emission standards EURO 5 and EURO 6 define more stringent acceptable limits for exhaust emissions of new vehicles. The Exhaust Gas Recirculation (EGR) system is a partial but essential solution for lowering the emission of nitrogen oxides and soot particulates. Yet, due to a more intensive use than in the past, the fouling of the EGR system is increased, and ensuring its reliability becomes a main challenge. In partnership with PSA Peugeot Citroën, we designed an experimental setup that mimics an operating EGR system. Its distinctive features are (1) its ability to reproduce the operating conditions precisely, and (2) its ability to measure the temperature field on the heat exchanger surface with an infrared camera, detecting in real time the evolution of the fouling deposit through its thermal resistance. Numerical codes are used in conjunction with this experimental setup to determine the evolution of the fouling thickness from its thermal resistance.
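
    A back-of-envelope sketch of how a deposit thickness can be inferred from the thermal signal (all property values are assumptions, not the project's data): the infrared camera gives the exchanger surface temperature, a known heat flux converts the temperature shift into a fouling resistance, and an assumed deposit conductivity converts that resistance into a thickness.

      def fouling_thickness(t_fouled, t_clean, heat_flux, k_deposit=0.3):
          """k_deposit in W/(m.K) is an assumed soot-deposit conductivity."""
          r_fouling = (t_fouled - t_clean) / heat_flux   # m^2.K/W
          return r_fouling * k_deposit                   # metres

      e = fouling_thickness(t_fouled=458.0, t_clean=452.0, heat_flux=20_000.0)
      print(f"{e * 1e6:.0f} micrometres")   # ~90 um for these assumed values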

  3. Two-dimensional discrete element models of debris avalanches: Parameterization and the reproducibility of experimental results

    NASA Astrophysics Data System (ADS)

    Banton, J.; Villard, P.; Jongmans, D.; Scavia, C.

    2009-11-01

Application of the discrete element method (DEM) to model avalanches of granular materials requires determining the correct geometric and rheological parameters for and between the particles as well as for the basal surface. The use of spherical (circular in 2-D) particles enhances particle rolling, yielding excessive runout values. The solution usually adopted to correct this effect is to introduce a drag force which artificially slows down the particle velocities. The aim of this study is to test the capability of the DEM to simulate well-controlled unsteady channelized granular flows, considering the measured properties of the particles and of the basal surface which naturally contribute to dissipating energy. We first performed a parametric analysis on a simple 2-D model in order to estimate the influence of particle shape, friction parameters, and restitution coefficients on the dynamics of the flow and on the deposit geometry. We then simulated three channelized laboratory experiments performed with two materials and two bed linings. Using the geometrical layout and the values of the mechanical parameters provided by the authors, we obtained a remarkable agreement between the observed and 2-D simulated deposit shapes for the three experiments. Also, the computed mass evolution with time was very consistent with the experimental snapshots in all cases. These results highlight the capability of the DEM technique for modeling avalanches of granular material when the particle shape as well as the friction and restitution coefficients are properly considered.

  4. All-atom molecular dynamics analysis of multi-peptide systems reproduces peptide solubility in line with experimental observations

    PubMed Central

    Kuroda, Yutaka; Suenaga, Atsushi; Sato, Yuji; Kosuda, Satoshi; Taiji, Makoto

    2016-01-01

In order to investigate the contribution of individual amino acids to protein and peptide solubility, we carried out 100 ns molecular dynamics (MD) simulations of 10⁶ Å³ cubic boxes containing ~3 × 10⁴ water molecules and 27 tetra-peptides regularly positioned at 23 Å from each other and composed of a single amino acid type for all natural amino acids but cysteine and glycine. The calculations were performed using Amber with a standard force field on a special purpose MDGRAPE-3 computer, without introducing any “artificial” hydrophobic interactions. Tetra-peptides composed of I, V, L, M, N, Q, F, W, Y, and H formed large amorphous clusters, and those containing A, P, S, and T formed smaller ones. Tetra-peptides made of D, E, K, and R did not cluster at all. These observations correlated well with experimental solubility tendencies as well as hydrophobicity scales with correlation coefficients of 0.5 to > 0.9. Repulsive Coulomb interactions were dominant in ensuring high solubility, whereas both Coulomb and van der Waals (vdW) energies contributed to the aggregations of low solubility amino acids. Overall, this very first all-atom molecular dynamics simulation of a multi-peptide system appears to reproduce the basic properties of peptide solubility, essentially in line with experimental observations. PMID:26817663
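
    The reported agreement can be summarized with a Pearson correlation; a minimal sketch with toy cluster sizes and the Kyte-Doolittle hydrophobicity scale:

      def pearson_r(xs, ys):
          n = len(xs)
          mx, my = sum(xs) / n, sum(ys) / n
          sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          sxx = sum((x - mx) ** 2 for x in xs)
          syy = sum((y - my) ** 2 for y in ys)
          return sxy / (sxx * syy) ** 0.5

      cluster_size = [27, 24, 25, 3, 1, 2, 14, 9]   # I, L, F, D, E, K, A, T (toy)
      kyte_doolittle = [4.5, 3.8, 2.8, -3.5, -3.5, -3.9, 1.8, -0.7]
      print(f"r = {pearson_r(cluster_size, kyte_doolittle):.2f}")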

  5. Reproducibility blues.

    PubMed

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation. PMID:26538323

  7. Continuous nanoflow-scanning electrochemical microscopy: voltammetric characterization and application for accurate and reproducible imaging of enzyme-labeled protein microarrays.

    PubMed

    Kai, Tianhan; Chen, Shu; Monterroso, Estuardo; Zhou, Feimeng

    2015-04-21

The coupling of scanning electrochemical microscopy (SECM) to a continuous nanoflow (CNF) system is accomplished with the use of a microconcentric ring electrode/injector probe. The gold microring electrode encapsulated by a glass sheath is robust and can be beveled and polished. The CNF system, comprising a precision gas displacement pump and a rotary valve, is capable of delivering solution to the center of the SECM probe in the range of 1-150 nL/min. Major advantages of the CNF-SECM imaging mode over the conventional SECM generation/collection (G/C) mode include higher imaging resolution, immunity from interferences by species in the bulk solution or at other sites of the substrate, elimination of the feedback current that could interfere with the G/C data interpretation, and versatility of initiating surface reactions/processes via introducing different reactants into the flowing stream. Parameters such as flow rates, probe/substrate separations, and collection efficiencies are examined and optimized. Higher resolution, reproducibility, and accuracy are demonstrated through the application of CNF-SECM to horseradish peroxidase (HRP)-amplified imaging of protein microarrays. By flowing H2O2 and ferrocenemethanol through the injector and detecting the surface-generated ferriceniummethanol, human IgG spots covered with HRP-labeled antihuman IgG can be detected in the range of 13 nM to 1.333 μM with a detection limit of 3.0 nM. In addition, consistent images of microarray spots for selective and high-density detection of analytes can be attained. PMID:25831146
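
    The quoted detection limit follows from a standard calibration-line calculation; a generic sketch (hypothetical currents; LOD taken as 3 × blank noise / slope):

      def lod_from_calibration(conc_nM, current_nA, blank_sd_nA):
          n = len(conc_nM)
          mx, my = sum(conc_nM) / n, sum(current_nA) / n
          slope = (sum((x - mx) * (y - my) for x, y in zip(conc_nM, current_nA))
                   / sum((x - mx) ** 2 for x in conc_nM))
          return 3.0 * blank_sd_nA / slope   # LOD in nM

      print(lod_from_calibration([13, 133, 400, 1333], [0.7, 6.9, 20.5, 68.0],
                                 blank_sd_nA=0.05))   # ~3 nM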

  8. Identification and Evaluation of Reference Genes for Accurate Transcription Normalization in Safflower under Different Experimental Conditions.

    PubMed

    Li, Dandan; Hu, Bo; Wang, Qing; Liu, Hongchang; Pan, Feng; Wu, Wei

    2015-01-01

Safflower (Carthamus tinctorius L.) has received a significant amount of attention as a medicinal plant and oilseed crop. Gene expression studies provide a theoretical molecular biology foundation for improving new traits and developing new cultivars. Real-time quantitative PCR (RT-qPCR) has become a crucial approach for gene expression analysis, and appropriate reference genes (RGs) are essential for accurate and rapid relative quantification of gene expression. In this study, fifteen candidate RGs involved in multiple metabolic pathways of plants were selected and validated under different experimental treatments, at different seed development stages, and in different cultivars and tissues for real-time PCR experiments. These genes were ABCS, 60SRPL10, RANBP1, UBCL, MFC, UBCE2, EIF5A, COA, EF1-β, EF1, GAPDH, ATPS, MBF1, GTPB and GST. The suitability evaluation was performed with the geNorm and NormFinder programs. Overall, EF1, UBCE2, EIF5A, ATPS and 60SRPL10 were the most stable genes, and MBF1, as well as MFC, were the most unstable genes according to both programs across all experimental samples. To verify the RGs selected by the two programs, expression analysis of 7 CtFAD2 genes in safflower seeds at different developmental stages under cold stress was carried out using different RGs for normalization in RT-qPCR experiments. The results showed similar expression patterns when the most stable RGs selected by geNorm or NormFinder were used, whereas differences were detected with the most unstable reference genes. The most stable combination of genes selected in this study will help to achieve more accurate and reliable results in a wide variety of samples in safflower.

  9. Identification and Evaluation of Reference Genes for Accurate Transcription Normalization in Safflower under Different Experimental Conditions

    PubMed Central

    Li, Dandan; Hu, Bo; Wang, Qing; Liu, Hongchang; Pan, Feng; Wu, Wei

    2015-01-01

Safflower (Carthamus tinctorius L.) has received a significant amount of attention as a medicinal plant and oilseed crop. Gene expression studies provide a theoretical molecular biology foundation for improving new traits and developing new cultivars. Real-time quantitative PCR (RT-qPCR) has become a crucial approach for gene expression analysis, and appropriate reference genes (RGs) are essential for accurate and rapid relative quantification of gene expression. In this study, fifteen candidate RGs involved in multiple metabolic pathways of plants were selected and validated under different experimental treatments, at different seed development stages, and in different cultivars and tissues for real-time PCR experiments. These genes were ABCS, 60SRPL10, RANBP1, UBCL, MFC, UBCE2, EIF5A, COA, EF1-β, EF1, GAPDH, ATPS, MBF1, GTPB and GST. The suitability evaluation was performed with the geNorm and NormFinder programs. Overall, EF1, UBCE2, EIF5A, ATPS and 60SRPL10 were the most stable genes, and MBF1, as well as MFC, were the most unstable genes according to both programs across all experimental samples. To verify the RGs selected by the two programs, expression analysis of 7 CtFAD2 genes in safflower seeds at different developmental stages under cold stress was carried out using different RGs for normalization in RT-qPCR experiments. The results showed similar expression patterns when the most stable RGs selected by geNorm or NormFinder were used, whereas differences were detected with the most unstable reference genes. The most stable combination of genes selected in this study will help to achieve more accurate and reliable results in a wide variety of samples in safflower. PMID:26457898

  11. The NHLBI-Sponsored Consortium for preclinicAl assESsment of cARdioprotective Therapies (CAESAR): A New Paradigm for Rigorous, Accurate, and Reproducible Evaluation of Putative Infarct-Sparing Interventions in Mice, Rabbits, and Pigs

    PubMed Central

    Jones, Steven P.; Tang, Xian-Liang; Guo, Yiru; Steenbergen, Charles; Lefer, David J.; Kukreja, Rakesh C.; Kong, Maiying; Li, Qianhong; Bhushan, Shashi; Zhu, Xiaoping; Du, Junjie; Nong, Yibing; Stowers, Heather L.; Kondo, Kazuhisa; Hunt, Gregory N.; Goodchild, Traci T.; Orr, Adam; Chang, Carlos C.; Ockaili, Ramzi; Salloum, Fadi N.; Bolli, Roberto

    2014-01-01

    rigorous, accurate, and reproducible. PMID:25499773

  12. A SELDI mass spectrometry study of experimental autoimmune encephalomyelitis: sample preparation, reproducibility, and differential protein expression patterns

    PubMed Central

    2013-01-01

Background: Experimental autoimmune encephalomyelitis (EAE) is an autoimmune, inflammatory disease of the central nervous system that is widely used as a model of multiple sclerosis (MS). Mitochondrial dysfunction appears to play a role in the development of neuropathology in MS and may also play a role in disease pathology in EAE. Here, surface enhanced laser desorption ionization mass spectrometry (SELDI-MS) has been employed to obtain protein expression profiles from mitochondrially enriched fractions derived from EAE and control mouse brain. To gain insight into experimental variation, the reproducibility of sub-cellular fractionation, anion exchange fractionation as well as spot-to-spot and chip-to-chip variation using pooled samples from brain tissue was examined. Results: Variability of SELDI mass spectral peak intensities indicates a coefficient of variation (CV) of 15.6% and 17.6% between spots on a given chip and between different chips, respectively. Thinly slicing tissue prior to homogenization with a rotor homogenizer showed better reproducibility (CV = 17.0%) than homogenization of blocks of brain tissue with a Teflon® pestle (CV = 27.0%). Fractionation of proteins with anion exchange beads prior to SELDI-MS analysis gave overall CV values from 16.1% to 18.6%. SELDI mass spectra of mitochondrial fractions obtained from brain tissue from EAE mice and controls displayed 39 differentially expressed proteins (p ≤ 0.05) out of a total of 241 protein peaks observed in anion exchange fractions. Hierarchical clustering analysis showed that protein fractions from EAE animals with severe disability clearly segregated from controls. Several components of electron transport chain complexes (cytochrome c oxidase subunit 6b1, subunit 6C, and subunit 4; NADH dehydrogenase flavoprotein 3, alpha subcomplex subunit 2, Fe-S protein 4, and Fe-S protein 6; and ATP synthase subunit e) were identified as possible differentially expressed proteins. Myelin Basic Protein
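
    The variability figures above are coefficients of variation; a tiny sketch with hypothetical peak intensities:

      import statistics as st

      peaks = [104.2, 87.5, 96.1, 110.3, 92.8]   # one m/z peak across spots
      cv = 100.0 * st.stdev(peaks) / st.mean(peaks)
      print(f"CV = {cv:.1f}%")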

  13. Developing a reproducible non-line-of-sight experimental setup for testing wireless medical device coexistence utilizing ZigBee.

    PubMed

    LaSorte, Nickolas J; Rajab, Samer A; Refai, Hazem H

    2012-11-01

    The integration of heterogeneous wireless technologies is believed to aid revolutionary healthcare delivery in hospitals and residential care. Wireless medical device coexistence is a growing concern given the ubiquity of wireless technology. In spite of this, a consensus standard that addresses risks associated with wireless heterogeneous networks has not been adopted. This paper serves as a starting point by recommending a practice for assessing the coexistence of a wireless medical device in a non-line-of-sight environment utilizing 802.15.4 in a practical, versatile, and reproducible test setup. This paper provides an extensive survey of other coexistence studies concerning 802.15.4 and 802.11 and reports on the authors' coexistence testing inside and outside an anechoic chamber. Results are compared against a non-line-of-sight test setup. Findings relative to co-channel and adjacent channel interference were consistent with results reported in the literature. PMID:22907957

  14. Accurate metal-site structures in proteins obtained by combining experimental data and quantum chemistry.

    PubMed

    Ryde, Ulf

    2007-02-14

    The use of molecular mechanics calculations to supplement experimental data in standard X-ray crystallography and NMR refinements is discussed and it is shown that structures can be locally improved by the use of quantum chemical calculations. Such calculations can also be used to interpret the structures, e.g. to decide the protonation state of metal-bound ligands. They have shown that metal sites in crystal structures are frequently photoreduced or disordered, which makes the interpretation of the structures hard. Similar methods can be used for EXAFS refinements to obtain a full atomic structure, rather than a set of metal-ligand distances.

  15. An experimental correction proposed for an accurate determination of mass diffusivity of wood in steady regime

    NASA Astrophysics Data System (ADS)

    Zohoun, Sylvain; Agoua, Eusèbe; Degan, Gérard; Perre, Patrick

    2002-08-01

This paper presents an experimental study of mass diffusion in the hygroscopic region of four temperate wood species and three tropical ones. In order to simplify the interpretation of the phenomena, a dimensionless parameter called the reduced diffusivity is defined; it varies from 0 to 1. The method is based, first, on determining this parameter from measurements of the mass flux, taking into account the operating conditions of a standard device (tightness, dimensional variations and easy installation of the wood samples, good stability of temperature and humidity). Second, the reasons why this parameter has to be corrected are presented. An abacus for this correction of the mass diffusivity of wood in the steady regime has been plotted. This work represents a practical advance in the characterisation of forest species.

  16. An experimental device for accurate ultrasounds measurements in liquid foods at high pressure

    NASA Astrophysics Data System (ADS)

    Hidalgo-Baltasar, E.; Taravillo, M.; Baonza, V. G.; Sanz, P. D.; Guignon, B.

    2012-12-01

The use of high hydrostatic pressure to ensure safe, high-quality products has markedly increased in the food industry during the last decade. Ultrasonic sensors can be employed to control such processes in much the same way as they are currently used in processes carried out at ambient pressure. However, their installation, calibration and use are particularly challenging in a high-pressure environment. Besides, data on the acoustic properties of food under pressure, and even of water, are quite scarce in the pressure range of interest for food treatment (namely, above 200 MPa). The objective of this work was to establish a methodology for determining the speed of sound in foods under pressure. An ultrasonic sensor using the multiple-reflections method was fitted to lab-scale high-hydrostatic-pressure equipment to determine the speed of sound in water between 253.15 and 348.15 K and at pressures up to 700 MPa. The experimental speed-of-sound data were compared to data calculated from the equation of state of water (IAPWS-95 formulation). From this analysis, the procedure for calibrating the cell path length was validated. After this calibration, the speed of sound could be determined in liquid foods using this sensor, with a relative uncertainty between (0.22 and 0.32) % at a confidence level of 95 % over the whole pressure domain.
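
    A sketch of the multiple-reflections principle described above (numbers are hypothetical): successive echoes are separated by the round-trip time 2L/c, so a reference fluid with a known speed of sound (water via IAPWS-95) calibrates the path length L, which is then reused for the sample liquid under the same conditions.

      def calibrate_path(echo_spacing_s, c_reference):
          return 0.5 * echo_spacing_s * c_reference   # L = c * dt / 2

      def speed_of_sound(echo_spacing_s, path_m):
          return 2.0 * path_m / echo_spacing_s

      # Calibrate with water (~1497 m/s near 25 C at 0.1 MPa, IAPWS-95), then
      # measure the sample liquid in the same cell.
      L = calibrate_path(echo_spacing_s=13.4e-6, c_reference=1496.7)
      print(speed_of_sound(echo_spacing_s=12.9e-6, path_m=L))   # ~1555 m/s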

  17. The use of experimental bending tests to more accurate numerical description of TBC damage process

    NASA Astrophysics Data System (ADS)

    Sadowski, T.; Golewski, P.

    2016-04-01

Thermal barrier coatings (TBCs) have been extensively used in aircraft engines to protect critical engine parts, such as blades and combustion chambers, which are exposed to high temperatures and a corrosive environment. The blades of turbine engines are additionally exposed to high mechanical loads, created by the high rotational speed of the rotor (30 000 rpm) and causing tensile and bending stresses. Therefore, experimental testing of coated samples is necessary in order to determine the strength properties of TBCs. Beam samples with dimensions 50 × 10 × 2 mm were used in these studies. The TBC system consisted of a 150 μm thick bond coat (NiCoCrAlY) and a 300 μm thick top coat (YSZ) made by the APS (air plasma spray) process. Samples were tested in three-point bending under various loads. After the bending tests, the samples were subjected to microscopic observation to determine the number of cracks and their depth. These results were used to build a numerical model and calibrate the material data in the Abaqus program. A brittle cracking damage model was applied to the TBC layer, which allows elements to be removed once the damage criterion is reached. Surface-based cohesive behavior was used to model the delamination which may occur at the boundary between the bond coat and the top coat.
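
    A quick check of the loading implied by such a test: for a beam of width b and thickness d on a span L, the peak stress in three-point bending is σ = 3FL/(2bd²); the span and load below are hypothetical, chosen only to illustrate the magnitude.

      F, L, b, d = 20.0, 0.040, 0.010, 0.002   # N, m, m, m (F and L assumed)
      sigma = 3 * F * L / (2 * b * d ** 2)     # peak bending stress, Pa
      print(f"{sigma / 1e6:.0f} MPa")          # 30 MPa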

  18. Theory of bi-molecular association dynamics in 2D for accurate model and experimental parameterization of binding rates

    NASA Astrophysics Data System (ADS)

    Yogurtcu, Osman N.; Johnson, Margaret E.

    2015-08-01

The dynamics of association between diffusing and reacting molecular species are routinely quantified using simple rate-equation kinetics that assume both well-mixed concentrations of species and a single rate constant for parameterizing the binding rate. In two dimensions (2D), however, even when systems are well mixed, the assumption of a single characteristic rate constant for describing association is not generally accurate, due to the properties of diffusional searching in dimensions d ≤ 2. Establishing rigorous bounds for discriminating between 2D reactive systems that will be accurately described by rate equations with a single rate constant, and those that will not, is critical for both modeling and experimentally parameterizing binding reactions restricted to surfaces such as cellular membranes. We show here that in regimes of intrinsic reaction rate (ka) and diffusion (D) parameters with ka/D > 0.05, a single rate constant cannot be fit to the dynamics of concentrations of associating species independently of the initial conditions. Instead, a more sophisticated multi-parametric description than rate equations is necessary to robustly characterize bimolecular reactions from experiment. Our quantitative bounds derive from our new analysis of 2D rate behavior predicted from Smoluchowski theory. Using a recently developed single-particle reaction-diffusion algorithm, which we extend here to 2D, we are able to test and validate the predictions of Smoluchowski theory and several other theories of reversible reaction dynamics in 2D for the first time. Finally, our results also mean that simulations of reactive systems in 2D using rate equations must be undertaken with caution when reactions have ka/D > 0.05, regardless of the simulation volume. We introduce here a simple formula for an adaptive, concentration-dependent rate constant for these chemical kinetics simulations, which improves on existing formulas to better capture non-equilibrium reaction dynamics from dilute
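
    The paper's rule of thumb can be phrased as a one-line criterion; a sketch assuming ka and D are given in consistent 2D units (e.g. both in μm²/s, so the ratio is dimensionless):

      def single_rate_constant_ok(ka, D, threshold=0.05):
          """True if simple rate-equation kinetics should suffice in 2D."""
          return ka / D < threshold

      print(single_rate_constant_ok(ka=0.02, D=1.0))   # True: rate equation fine
      print(single_rate_constant_ok(ka=0.20, D=1.0))   # False: needs more detail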

  19. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    NASA Astrophysics Data System (ADS)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, which has been experimentally verified. It was found that the frequency response of the two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained from the current density of the electrodes via FEM simulations. The two SAW structures were based on XY lithium niobate (LiNbO3) substrates and had two and four electrode finger pairs in their interdigital transducers, respectively. SAW devices were then fabricated in accordance with the simulated models, and their measured frequency responses were found to correlate well with the simulation results. The results indicated that a better match between calculated and measured frequency responses can be obtained when one of the input electrode finger pairs is set to zero volts and all the current density components are taken into account when calculating the frequency response of the simulated SAW device structures.

  20. A fast experimental beam hardening correction method for accurate bone mineral measurements in 3D μCT imaging system.

    PubMed

    Koubar, Khodor; Bekaert, Virgile; Brasse, David; Laquerriere, Patrice

    2015-06-01

    Bone mineral density plays an important role in the determination of bone strength and fracture risks. Consequently, it is very important to obtain accurate bone mineral density measurements. The micro-computed tomography (μCT) system provides 3D information about the architectural properties of bone. Quantitative analysis accuracy is decreased by the presence of artefacts in the reconstructed images, mainly beam hardening artefacts (such as cupping artefacts). In this paper, we introduce a new beam-hardening correction method based on a post-reconstruction technique that uses experimentally determined off-line water and bone linearization curves, aiming to take the non-homogeneity of the scanned animal into account. To evaluate the mass correction rate, a calibration line was established to convert the reconstructed linear attenuation coefficients into bone masses. The correction method was then applied to a multi-material cylindrical phantom and to mouse skeleton images. Mass correction rates of up to 18% between uncorrected and corrected images were obtained, along with a marked improvement in the calculated mouse femur mass. Results were also compared to those obtained with the simple water linearization technique, which does not account for non-homogeneity in the object.
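
    The water-linearization idea the authors build on can be sketched in a few lines: measure polychromatic projections of known material thicknesses, fit a curve mapping them onto the ideal linear (monochromatic) response, and apply that curve to raw projections before reconstruction. All numbers below are made-up stand-ins for a real calibration, and this single-material sketch omits the paper's separate bone curve:

        import numpy as np

        # Hypothetical calibration: polychromatic projections measured for
        # known water thicknesses (e.g. with a step phantom).
        t = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])            # cm
        p_poly = np.array([0.0, 0.09, 0.17, 0.31, 0.43, 0.54])  # -ln(I/I0)

        mu_eff = 0.2                  # assumed effective mu of water (1/cm)
        p_mono = mu_eff * t           # ideal beam-hardening-free response

        # Cubic linearization curve mapping measured onto ideal projections.
        coeffs = np.polyfit(p_poly, p_mono, deg=3)

        def linearize(projection):
            return np.polyval(coeffs, projection)

        sinogram = np.random.uniform(0.0, 0.5, (180, 256))  # stand-in data
        corrected = linearize(sinogram)   # feed this into reconstruction
        print("max correction:", float(np.abs(corrected - sinogram).max()))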

  1. Automatic, accurate, and reproducible segmentation of the brain and cerebro-spinal fluid in T1-weighted volume MRI scans and its application to serial cerebral and intracranial volumetry

    NASA Astrophysics Data System (ADS)

    Lemieux, Louis

    2001-07-01

    A new fully automatic algorithm for the segmentation of the brain and cerebro-spinal fluid (CSF) from T1-weighted volume MRI scans of the head was specifically developed in the context of serial intra-cranial volumetry. The method is an extension of a previously published brain extraction algorithm. The brain mask is used as a basis for CSF segmentation based on morphological operations, automatic histogram analysis and thresholding. Brain segmentation is then obtained by iterative tracking of the brain-CSF interface. Grey matter (GM), white matter (WM) and CSF volumes are calculated based on a model of intensity probability distribution that includes partial volume effects. Accuracy was assessed using a digital phantom scan. Reproducibility was assessed by segmenting pairs of scans from 20 normal subjects scanned 8 months apart and 11 patients with epilepsy scanned 3.5 years apart. Segmentation accuracy as measured by overlap was 98% for the brain and 96% for the intra-cranial tissues. The volume errors were: total brain (TBV): -1.0%; intra-cranial (ICV): +0.1%; CSF: +4.8%. For repeated scans, matching resulted in improved reproducibility. In the controls, the coefficient of reliability (CR) was 1.5% for the TBV and 1.0% for the ICV. In the patients, the CR for the ICV was 1.2%.
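
    The histogram-thresholding-plus-morphology stage described above can be sketched generically with SciPy and scikit-image; this is an illustration of the idea, not Lemieux's actual algorithm:

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu

        def segment_brain_csf(volume, head_mask):
            """Split masked T1 intensities into brain tissue and CSF."""
            thresh = threshold_otsu(volume[head_mask])  # histogram threshold
            brain = head_mask & (volume >= thresh)      # CSF is dark on T1
            brain = ndi.binary_opening(brain, iterations=2)  # detach bridges
            brain = ndi.binary_fill_holes(brain)             # fill interior
            csf = head_mask & ~brain
            return brain, csf

        # Smoke test on synthetic data with two intensity populations:
        rng = np.random.default_rng(0)
        vol = rng.normal(40.0, 5.0, (64, 64, 64))
        vol[16:48, 16:48, 16:48] = rng.normal(100.0, 10.0, (32, 32, 32))
        brain, csf = segment_brain_csf(vol, np.ones(vol.shape, dtype=bool))
        print("brain voxels:", int(brain.sum()), "csf voxels:", int(csf.sum()))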

  2. Impact of the H275Y and I223V Mutations in the Neuraminidase of the 2009 Pandemic Influenza Virus In Vitro and Evaluating Experimental Reproducibility

    PubMed Central

    Paradis, Eric G.; Pinilla, Lady Tatiana; Holder, Benjamin P.; Abed, Yacine; Boivin, Guy; Beauchemin, Catherine A.A.

    2015-01-01

    The 2009 pandemic H1N1 (H1N1pdm09) influenza virus is naturally susceptible to neuraminidase (NA) inhibitors, but mutations in the NA protein can cause oseltamivir resistance. The H275Y and I223V amino acid substitutions in the NA of the H1N1pdm09 influenza strain have been separately observed in patients exhibiting oseltamivir resistance. Here, we apply mathematical modelling techniques to compare the fitness of the wild-type H1N1pdm09 strain relative to each of these two mutants. We find that both the H275Y and I223V mutations in the H1N1pdm09 background significantly lengthen the duration of the eclipse phase (by 2.5 h and 3.6 h, respectively), consistent with these NA mutations delaying the release of viral progeny from newly infected cells. Cells infected by H1N1pdm09 virus carrying the I223V mutation display a disadvantageously shorter infectious lifespan (by 17 h) than those infected with the wild-type or MUT-H275Y strains. In terms of compensating traits, the H275Y mutation in the H1N1pdm09 background results in increased virus infectiousness, as we reported previously, whereas the I223V mutation exhibits no such compensation, leaving it overall less fit than both its wild-type counterpart and the MUT-H275Y strain. Using computer-simulated competition experiments, we determine that in the presence of oseltamivir at doses even below standard therapy, both the MUT-H275Y and MUT-I223V dominate their wild-type counterpart in all aspects, and the MUT-H275Y outcompetes the MUT-I223V. The H275Y mutation should therefore be more commonly observed than the I223V mutation in circulating H1N1pdm09 strains, assuming both mutations have a similar impact or no significant impact on between-host transmission. We also show that mathematical modelling offers a relatively inexpensive and reliable means to quantify inter-experimental variability and assess the reproducibility of results. PMID:25992792
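
    The kind of within-host model used for such fitness comparisons can be sketched with a standard target-cell-limited model that includes an eclipse phase, with strains differing only in a few kinetic parameters. The values below are illustrative, not the paper's fitted estimates; only the 2.5 h eclipse lengthening is taken from the abstract:

        import numpy as np
        from scipy.integrate import solve_ivp

        def teiv(t, y, beta, tau_E, tau_I, p, c):
            """Target cells T, eclipse cells E, infectious cells I, virus V."""
            T, E, I, V = y
            return [-beta * T * V,
                    beta * T * V - E / tau_E,   # eclipse phase, mean tau_E
                    E / tau_E - I / tau_I,      # infectious lifespan tau_I
                    p * I - c * V]

        y0 = [4e8, 0.0, 0.0, 10.0]              # initial cells and virions
        wt = dict(beta=1e-9, tau_E=6.0, tau_I=25.0, p=0.05, c=0.1)
        mut = dict(wt, tau_E=wt["tau_E"] + 2.5)  # H275Y: eclipse +2.5 h

        for name, pars in [("wild type", wt), ("H275Y", mut)]:
            sol = solve_ivp(teiv, (0.0, 240.0), y0, args=tuple(pars.values()),
                            rtol=1e-8, atol=1e-6)
            print(f"{name}: peak log10 viral load = "
                  f"{np.log10(sol.y[3].max()):.2f}")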

  3. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing that makes research papers available to the general public free of charge; it also refers to a trend in science toward making the act of doing research more open and transparent. When science transforms to open access, we mean not only access to papers and to the research data being collected or generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns about the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, including the lack of standard procedures to publish such information and the lack of benefits after publishing it. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  4. Meteorite Atmospheric Entry Reproduced in Plasmatron

    NASA Astrophysics Data System (ADS)

    Pittarello, L.; McKibbin, S.; Goderis, S.; Soens, B.; Bariselli, F.; Barros Dias, B. R.; Zavalan, F. L.; Magin, T.; Claeys, Ph.

    2016-08-01

    The Plasmatron facility provides experimental conditions that reproduce the atmospheric entry of meteorites. Tests on basalt, as a meteorite analogue, have been performed. Preliminary results highlight melting and evaporation effects.

  5. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  6. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558
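
    The 'single serialized object under version control' design described above can be sketched generically; this is not ReproPhylo's actual API, and it assumes the working directory is already a Git repository with changes to commit:

        import pickle
        import subprocess

        class Analysis:
            """Toy stand-in for a serialized experiment object."""
            def __init__(self):
                self.params = {"aligner": "mafft", "bootstrap": 100}
                self.provenance = []        # record of every step taken

            def run_step(self, name):
                # A real system would also capture tool versions, inputs, etc.
                self.provenance.append(name)

        exp = Analysis()
        exp.run_step("align")
        exp.run_step("build_tree")

        with open("experiment.pkl", "wb") as fh:
            pickle.dump(exp, fh)            # the single reproducibility artifact

        # Let version control track each iteration of the analysis.
        subprocess.run(["git", "add", "experiment.pkl"], check=True)
        subprocess.run(["git", "commit", "-m", "analysis iteration"], check=True)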

  7. In the pursuit of small "red shift" of C-H stretching vibrational frequency of C-H...pi interactions for benzene dimer: How to amend MP2 calculations to reproduce the experimental results.

    PubMed

    Dinadayalane, T C; Leszczynski, Jerzy

    2009-02-28

    For the bent T-shaped benzene dimer, the vibrational frequencies at the MP2/aug-cc-pVDZ level with counterpoise correction reproduce experimental results of the small "red shifts" of C-H stretching, while those without counterpoise correction yield considerable "blue shift." Counterpoise correction also affects the C-H bond distances of C-H...pi interactions as well as intermoiety distances.

  8. Reproducibility of NIF hohlraum measurements

    NASA Astrophysics Data System (ADS)

    Moody, J. D.; Ralph, J. E.; Turnbull, D. P.; Casey, D. T.; Albert, F.; Bachmann, B. L.; Doeppner, T.; Divol, L.; Grim, G. P.; Hoover, M.; Landen, O. L.; MacGowan, B. J.; Michel, P. A.; Moore, A. S.; Pino, J. E.; Schneider, M. B.; Tipton, R. E.; Smalyuk, V. A.; Strozzi, D. J.; Widmann, K.; Hohenberger, M.

    2015-11-01

    The strategy of experimentally ``tuning'' the implosion in a NIF hohlraum ignition target towards increasing hot-spot pressure, areal density of compressed fuel, and neutron yield relies on a level of experimental reproducibility. We examine the reproducibility of experimental measurements for a collection of 15 identical NIF hohlraum experiments. The measurements include incident laser power, backscattered optical power, x-ray measurements, hot-electron fraction and energy, and target characteristics. We use exact statistics to set 1-sigma confidence levels on the variations in each of the measurements. Of particular interest is the backscatter and laser-induced hot-spot locations on the hohlraum wall. Hohlraum implosion designs typically include variability specifications [S. W. Haan et al., Phys. Plasmas 18, 051001 (2011)]. We describe our findings and compare with the specifications. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
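
    The abstract does not specify the exact-statistics procedure; one simple example of such an interval is the chi-square confidence bound on the shot-to-shot standard deviation of N = 15 normally distributed measurements. A sketch on stand-in data, not NIF measurements:

        import numpy as np
        from scipy import stats

        x = np.random.default_rng(2).normal(100.0, 3.0, 15)  # stand-in shots
        n, s = len(x), x.std(ddof=1)

        # Two-sided interval with 1-sigma (68.27%) coverage for sigma.
        q_hi, q_lo = stats.chi2.ppf([0.8413, 0.1587], df=n - 1)
        lo = s * np.sqrt((n - 1) / q_hi)
        hi = s * np.sqrt((n - 1) / q_lo)
        print(f"s = {s:.2f}; 1-sigma CI on sigma: ({lo:.2f}, {hi:.2f})")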

  9. Blast-induced biomechanical loading of the rat: an experimental and anatomically accurate computational blast injury model.

    PubMed

    Sundaramurthy, Aravind; Alai, Aaron; Ganpule, Shailesh; Holmberg, Aaron; Plougonven, Erwan; Chandra, Namas

    2012-09-01

    Blast waves generated by improvised explosive devices (IEDs) cause traumatic brain injury (TBI) in soldiers and civilians. In vivo animal models that use shock tubes are extensively used in laboratories to simulate field conditions, to identify mechanisms of injury, and to develop injury thresholds. In this article, we place rats in different locations along the length of the shock tube (i.e., inside, outside, and near the exit), to examine the role of animal placement location (APL) in the biomechanical load experienced by the animal. We found that the biomechanical load on the brain and internal organs in the thoracic cavity (lungs and heart) varied significantly depending on the APL. When the specimen is positioned outside, organs in the thoracic cavity experience a higher pressure for a longer duration, in contrast to APL inside the shock tube. This in turn will possibly alter the injury type, severity, and lethality. We found that the optimal APL is where the Friedlander waveform is first formed inside the shock tube. Once the optimal APL was determined, the effect of the incident blast intensity on the surface and intracranial pressure was measured and analyzed. Noticeably, surface and intracranial pressures increase linearly with the incident peak overpressure, though surface pressures are significantly higher than the other two. Further, we developed and validated an anatomically accurate finite element model of the rat head. With this model, we determined that the main pathway of pressure transmission to the brain was through the skull and not through the snout; however, the snout plays a secondary role in diffracting the incoming blast wave towards the skull.

  10. Blast-induced biomechanical loading of the rat: an experimental and anatomically accurate computational blast injury model.

    PubMed

    Sundaramurthy, Aravind; Alai, Aaron; Ganpule, Shailesh; Holmberg, Aaron; Plougonven, Erwan; Chandra, Namas

    2012-09-01

    Blast waves generated by improvised explosive devices (IEDs) cause traumatic brain injury (TBI) in soldiers and civilians. In vivo animal models that use shock tubes are extensively used in laboratories to simulate field conditions, to identify mechanisms of injury, and to develop injury thresholds. In this article, we place rats in different locations along the length of the shock tube (i.e., inside, outside, and near the exit), to examine the role of animal placement location (APL) in the biomechanical load experienced by the animal. We found that the biomechanical load on the brain and internal organs in the thoracic cavity (lungs and heart) varied significantly depending on the APL. When the specimen is positioned outside, organs in the thoracic cavity experience a higher pressure for a longer duration, in contrast to APL inside the shock tube. This in turn will possibly alter the injury type, severity, and lethality. We found that the optimal APL is where the Friedlander waveform is first formed inside the shock tube. Once the optimal APL was determined, the effect of the incident blast intensity on the surface and intracranial pressure was measured and analyzed. Noticeably, surface and intracranial pressures increase linearly with the incident peak overpressure, though surface pressures are significantly higher than the other two. Further, we developed and validated an anatomically accurate finite element model of the rat head. With this model, we determined that the main pathway of pressure transmission to the brain was through the skull and not through the snout; however, the snout plays a secondary role in diffracting the incoming blast wave towards the skull. PMID:22620716

  11. Experimental scale and dimensionality requirements for reproducing and studying coupled land-atmosphere-vegetative processes in the intermediate scale laboratory settings

    NASA Astrophysics Data System (ADS)

    Trautz, Andrew; Illangasekare, Tissa; Rodriguez-Iturbe, Ignacio; Helmig, Rainer; Heck, Katharina

    2016-04-01

    Past investigations of coupled land-atmosphere-vegetative processes have been constrained to two extremes: small bench-scale laboratory testing and field-scale testing. In recognition of the limitations of studying the scale-dependency of these fundamental processes at either extreme, researchers have recently begun to promote the use of experimentation at intermediary scales between the bench and field scales. A requirement for employing intermediate-scale testing to refine heat and mass transport theory regarding land-atmosphere-vegetative processes is high spatial-temporal resolution datasets generated under carefully controlled experimental conditions in which both small and field scale phenomena can be observed. Field experimentation often fails to meet these criteria as a result of sensor network limitations as well as the natural complexities and uncertainties introduced by heterogeneity and constantly changing atmospheric conditions. Laboratory experimentation, which is used to study three-dimensional (3-D) processes, is often conducted in 2-D test systems as a result of space, instrumentation, and cost constraints. In most flow and transport problems, 2-D testing is not considered a serious limitation because the bypassing of flow and transport due to geo-biochemical heterogeneities can still be studied. Constraining the study of atmosphere-soil-vegetation interactions to 2-D systems introduces a new challenge given that the soil moisture dynamics associated with these interactions occur in three dimensions. This is an important issue that needs to be addressed as ever more intricate and specialized experimental apparatuses like the climate-controlled wind tunnel-porous media test system at CESEP are being constructed and used for these types of studies. The purpose of this study is therefore to investigate the effects of laboratory experimental dimensionality on observed soil moisture dynamics in the context of bare-soil evaporation and evapotranspiration

  12. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
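
    The point that reproducibility is inseparable from hypothesis testing can be made concrete with a small simulation: for a modestly powered two-sample experiment, the chance that an exact replication is again 'significant' is simply the statistical power, which can easily fall below one half. A sketch with made-up effect size and sample size:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        effect, n, runs = 0.5, 20, 5000   # true effect 0.5 SD, 20 per group

        def is_significant():
            a = rng.normal(0.0, 1.0, n)
            b = rng.normal(effect, 1.0, n)
            return stats.ttest_ind(a, b).pvalue < 0.05

        first = np.array([is_significant() for _ in range(runs)])
        second = np.array([is_significant() for _ in range(runs)])
        # Runs are independent, so this conditional rate equals the power.
        print(f"replication rate given initial significance: "
              f"{second[first].mean():.2f}")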

  13. Accurate calculation of mutational effects on the thermodynamics of inhibitor binding to p38α MAP kinase: a combined computational and experimental study.

    PubMed

    Zhu, Shun; Travis, Sue M; Elcock, Adrian H

    2013-07-01

    A major current challenge for drug design efforts focused on protein kinases is the development of drug resistance caused by spontaneous mutations in the kinase catalytic domain. The ubiquity of this problem means that it would be advantageous to develop fast, effective computational methods that could be used to determine the effects of potential resistance-causing mutations before they arise in a clinical setting. With this long-term goal in mind, we have conducted a combined experimental and computational study of the thermodynamic effects of active-site mutations on a well-characterized and high-affinity interaction between a protein kinase and a small-molecule inhibitor. Specifically, we developed a fluorescence-based assay to measure the binding free energy of the small-molecule inhibitor, SB203580, to the p38α MAP kinase and used it to measure the inhibitor's affinity for five different kinase mutants involving two residues (Val38 and Ala51) that contact the inhibitor in the crystal structure of the inhibitor-kinase complex. We then conducted long, explicit-solvent thermodynamic integration (TI) simulations in an attempt to reproduce the experimental relative binding affinities of the inhibitor for the five mutants; in total, a combined simulation time of 18.5 μs was obtained. Two widely used force fields - OPLS-AA/L and Amber ff99SB-ILDN - were tested in the TI simulations. Both force fields produced excellent agreement with experiment for three of the five mutants; simulations performed with the OPLS-AA/L force field, however, produced qualitatively incorrect results for the constructs that contained an A51V mutation. Interestingly, the discrepancies with the OPLS-AA/L force field could be rectified by the imposition of position restraints on the atoms of the protein backbone and the inhibitor without destroying the agreement for other mutations; the ability to reproduce experiment depended, however, upon the strength of the restraints' force constant
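
    The thermodynamic-integration estimate at the heart of the computational side reduces to a one-dimensional quadrature, ΔG = ∫₀¹ ⟨∂U/∂λ⟩ dλ. A minimal sketch with fabricated window averages standing in for values that would be harvested from the MD trajectories:

        import numpy as np

        # Alchemical coupling windows and made-up ensemble averages of
        # dU/dlambda (kcal/mol); real values come from MD trajectories.
        lambdas = np.linspace(0.0, 1.0, 11)
        dU = np.array([12.1, 9.8, 7.4, 5.6, 3.9, 2.5,
                       1.3, 0.4, -0.3, -0.9, -1.4])

        # Trapezoid-rule quadrature of <dU/dlambda> over lambda.
        dG = np.sum(0.5 * (dU[1:] + dU[:-1]) * np.diff(lambdas))
        print(f"Delta G = {dG:.2f} kcal/mol")

        # A relative binding free energy then follows from a thermodynamic
        # cycle: ddG = dG(mutation in complex) - dG(mutation in free protein).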

  14. Refinement of the experimental energy levels of higher ²D Rydberg states of the lithium atom with very accurate quantum mechanical calculations

    SciTech Connect

    Sharkey, Keeper L.; Bubin, Sergiy; Adamowicz, Ludwik

    2011-05-21

    Very accurate variational non-relativistic calculations are performed for four higher Rydberg ²D states (1s²nd¹, n = 8, ..., 11) of the lithium atom (⁷Li). The wave functions of the states are expanded in terms of all-electron explicitly correlated Gaussian functions, and a finite nuclear mass is used. The exponential parameters of the Gaussians are optimized using the variational method with the aid of the analytical energy gradient determined with respect to those parameters. The results of the calculations allow for refining the experimental energy levels determined with respect to the ²S 1s²2s¹ ground state.

  15. Experimental study on the application of a compressed-sensing (CS) algorithm to dental cone-beam CT (CBCT) for accurate, low-dose image reconstruction

    NASA Astrophysics Data System (ADS)

    Oh, Jieun; Cho, Hyosung; Je, Uikyu; Lee, Minsik; Kim, Hyojeong; Hong, Daeki; Park, Yeonok; Lee, Seonhwa; Cho, Heemoon; Choi, Sungil; Koo, Yangseo

    2013-03-01

    In practical applications of three-dimensional (3D) tomographic imaging, there are often challenges for image reconstruction from insufficient data. In computed tomography (CT), for example, image reconstruction from few views would enable fast scanning with reduced doses to the patient. In this study, we investigated and implemented an efficient reconstruction method based on a compressed-sensing (CS) algorithm, which exploits the sparseness of the gradient image, for accurate, low-dose dental cone-beam CT (CBCT) reconstruction. We applied the algorithm to a commercially available dental CBCT system (Expert7™, Vatech Co., Korea) and performed experiments to demonstrate the algorithm for image reconstruction in insufficient sampling problems. We successfully reconstructed CBCT images from several undersampled datasets and evaluated the reconstruction quality in terms of the universal quality index (UQI). The experimental demonstrations indicate that the CS-based reconstruction algorithm can be applied to current dental CBCT systems for reducing imaging doses and improving image quality.
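
    The core of such CS algorithms, sparsity of the gradient image, can be sketched as alternating a data-consistency gradient step with a total-variation (TV) descent step. The toy problem below uses a random sensing matrix rather than a real cone-beam projector, so it illustrates the principle only:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 64                                            # image is n x n
        x_true = np.zeros((n, n))
        x_true[20:44, 20:44] = 1.0                        # piecewise-constant
        A = rng.standard_normal((n * n // 4, n * n)) / n  # 4x undersampled
        b = A @ x_true.ravel()                            # simulated data

        def tv_grad(img, eps=1e-8):
            """Gradient of a smoothed anisotropic total-variation penalty."""
            dx = np.diff(img, axis=0, append=img[-1:, :])
            dy = np.diff(img, axis=1, append=img[:, -1:])
            gx = dx / np.sqrt(dx**2 + eps)
            gy = dy / np.sqrt(dy**2 + eps)
            return -(np.diff(gx, axis=0, prepend=gx[:1, :]) +
                     np.diff(gy, axis=1, prepend=gy[:, :1]))

        x = np.zeros(n * n)
        for _ in range(200):
            x -= 0.5 * (A.T @ (A @ x - b))                # data consistency
            x -= 0.002 * tv_grad(x.reshape(n, n)).ravel() # gradient sparsity

        err = np.linalg.norm(x - x_true.ravel()) / np.linalg.norm(x_true)
        print(f"relative reconstruction error: {err:.3f}")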

  16. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  17. Magnetogastrography (MGG) Reproducibility Assessments

    NASA Astrophysics Data System (ADS)

    de la Roca-Chiapas, J. M.; Córdova, T.; Hernández, E.; Solorio, S.; Solís Ortiz, S.; Sosa, M.

    2006-09-01

    Seven healthy subjects underwent a magnetic pulse of 32 mT for 17 ms, seven times in 90 minutes. The procedure was repeated one and two weeks later. Gastric emptying was assessed for each measurement, and an ANOVA was performed for every group of data. The gastric emptying time was 19.22 ± 5 min. The reproducibility estimate was above 85%. Magnetogastrography therefore appears to be an excellent technique for implementation in routine clinical trials.

  18. Reproducible measurements of MPI performance characteristics.

    SciTech Connect

    Gropp, W.; Lusk, E.

    1999-06-25

    In this paper we describe the difficulties inherent in making accurate, reproducible measurements of message-passing performance. We describe some of the mistakes often made in attempting such measurements and the consequences of such mistakes. We describe mpptest, a suite of performance measurement programs developed at Argonne National Laboratory, that attempts to avoid such mistakes and obtain reproducible measures of MPI performance that can be useful to both MPI implementers and MPI application writers. We include a number of illustrative examples of its use.
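
    One of the pitfalls mpptest guards against, trusting a single timing, can be illustrated with a generic ping-pong benchmark. The sketch below uses mpi4py, not mpptest itself:

        # Run with: mpiexec -n 2 python pingpong.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        nbytes, reps = 1 << 16, 100
        buf = np.zeros(nbytes, dtype=np.uint8)

        times = []
        for _ in range(reps):
            comm.Barrier()                    # align both ranks before timing
            t0 = MPI.Wtime()
            if rank == 0:
                comm.Send(buf, dest=1)
                comm.Recv(buf, source=1)
            else:
                comm.Recv(buf, source=0)
                comm.Send(buf, dest=0)
            times.append(MPI.Wtime() - t0)

        if rank == 0:
            # The minimum round trip is less sensitive to system noise than
            # the mean; reporting only an average is one classic mistake.
            print(f"{nbytes} B: min RTT = {min(times) * 1e6:.1f} us")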

  19. Accurate experimental determination of the isotope effects on the triple point temperature of water. I. Dependence on the 2H abundance

    NASA Astrophysics Data System (ADS)

    Faghihi, V.; Peruzzi, A.; Aerts-Bijma, A. T.; Jansen, H. G.; Spriensma, J. J.; van Geel, J.; Meijer, H. A. J.

    2015-12-01

    Variation in the isotopic composition of water is one of the major contributors to uncertainty in the realization of the triple point of water (TPW). Although the dependence of the TPW on the isotopic composition of the water has been known for years, a detailed and accurate experimental determination of the values of the correction constants is still lacking. This paper is the first of two articles (Part I and Part II) that address quantification of isotope abundance effects on the triple point temperature of water. In this paper, we describe our experimental assessment of the 2H isotope effect. We manufactured five triple point cells with prepared water mixtures spanning a range of 2H isotopic abundances that widely encompasses the natural abundance range, while the 18O and 17O isotopic abundances were kept approximately constant and the 18O-17O relation was close to the Meijer-Li relationship for natural waters. The selected range of 2H isotopic abundances led to cells that realised TPW temperatures between approximately -140 μK and +2500 μK with respect to the TPW temperature as realized by VSMOW (Vienna Standard Mean Ocean Water). Our experiment led to a value for the δ2H correction parameter of A2H = 673 μK per ‰ deviation of δ2H from VSMOW, with a combined uncertainty of 4 μK (k = 1, or 1σ).
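
    Applying the reported correction constant is then a one-line calculation; the sample composition below is hypothetical:

        A_2H = 673.0     # microkelvin per permil of delta2H (value from above)
        delta_2H = 2.0   # permil vs VSMOW, hypothetical sample water

        dT = A_2H * delta_2H
        print(f"TPW realized {dT:+.0f} microkelvin relative to VSMOW water")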

  20. Reproducing in cities.

    PubMed

    Mace, Ruth

    2008-02-01

    Reproducing in cities has always been costly, leading to lower fertility (that is, lower birth rates) in urban than in rural areas. Historically, although cities provided job opportunities, initially residents incurred the penalty of higher infant mortality, but as mortality rates fell at the end of the 19th century, European birth rates began to plummet. Fertility decline in Africa only started recently and has been dramatic in some cities. Here it is argued that both historical and evolutionary demographers are interpreting fertility declines across the globe in terms of the relative costs of child rearing, which increase to allow children to outcompete their peers. Now largely free from the fear of early death, postindustrial societies may create an environment that generates runaway parental investment, which will continue to drive fertility ever lower.

  1. Reproducible Experiment Platform

    NASA Astrophysics Data System (ADS)

    Likhomanenko, Tatiana; Rogozhnikov, Alex; Baranov, Alexander; Khairullin, Egor; Ustyuzhanin, Andrey

    2015-12-01

    Data analysis in fundamental sciences nowadays is an essential process that pushes frontiers of our knowledge and leads to new discoveries. At the same time, the complexity of these analyses is increasing fast due to (a) the enormous volumes of the datasets being analyzed, (b) the variety of techniques and algorithms one has to check inside a single analysis, and (c) the distributed nature of research teams, which requires special communication media for knowledge and information exchange between individual researchers. There is a lot of resemblance between the techniques and problems arising in the areas of industrial information retrieval and particle physics. To address these problems we propose Reproducible Experiment Platform (REP), a software infrastructure to support a collaborative ecosystem for computational science. It is a Python-based solution for research teams that allows running computational experiments on shared datasets, obtaining repeatable results, and making consistent comparisons of the obtained results. We present some key features of REP based on case studies, which include trigger optimization and physics analysis studies at the LHCb experiment.

  2. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order (NNLO) accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than in the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross Section Working Group.

  3. First accurate experimental study of Mu reactivity from a state-selected reactant in the gas phase: the Mu + H2(v = 1) reaction rate at 300 K

    NASA Astrophysics Data System (ADS)

    Bakule, Pavel; Sukhorukov, Oleksandr; Ishida, Katsuhiko; Pratt, Francis; Fleming, Donald; Momose, Takamasa; Matsuda, Yasuyuki; Torikai, Eiko

    2015-02-01

    This paper reports on the experimental background and methodology leading to recent results on the first accurate measurement of the reaction rate of the muonium (Mu) atom from a state-selected reactant in the gas phase: the Mu + H2(v = 1) → MuH + H reaction at 300 K, and its comparison with rigorous quantum rate theory, Bakule et al (2012 J. Phys. Chem. Lett. 3 2755). Stimulated Raman pumping, induced by 532 nm light from the 2nd harmonic of a Nd:YAG laser, was used to produce H2 in its first vibrational (v = 1) state, H2(1), in a single Raman/reaction cell. A pulsed muon beam (from 'ISIS', at 50 Hz) matched the 25 Hz repetition rate of the laser, allowing data taking in equal 'Laser-On/Laser-Off' modes of operation. The signal to noise was improved by over an order of magnitude in comparison with an earlier proof-of-principle experiment. The success of the present experiment also relied on optimizing the overlap of the laser profile with the extended stopping distribution of the muon beam at 50 bar H2 pressure, in which Monte Carlo simulations played a central role. The rate constant, found from the analysis of three separate measurements, which includes a correction for the loss of H2(1) concentration due to collisional relaxation with unpumped H2 during the time of each measurement, is kMu(1) = 9.9 (+1.7/−1.4) × 10⁻¹³ cm³ s⁻¹ at 300 K. This is in good to excellent agreement with rigorous quantum rate calculations on the complete configuration interaction/Born-Huang surface, as reported earlier by Bakule et al, and which are also briefly commented on herein.

  4. Assessing the reproducibility of discriminant function analyses.

    PubMed

    Andrew, Rose L; Albert, Arianne Y K; Renaut, Sebastien; Rennison, Diana J; Bock, Dan G; Vines, Tim

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the

  5. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
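
    Recomputing the three summary statistics is straightforward once a usable dataset is in hand; a sketch using scikit-learn's LDA on the iris data as a stand-in for the surveyed datasets:

        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        X, y = load_iris(return_X_y=True)
        lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

        pct_var = 100 * lda.explained_variance_ratio_[0]  # % variance, DF1
        pct_correct = 100 * lda.score(X, y)               # % correctly assigned
        largest_coef = np.abs(lda.scalings_[:, 0]).max()  # largest DF1 coefficient

        print(f"DF1 variance explained: {pct_var:.1f}%")
        print(f"correctly assigned:     {pct_correct:.1f}%")
        print(f"largest |coefficient|:  {largest_coef:.2f}")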

  6. Combined experimental and numerical approach to evaluate impact scaling relations and reproducibility of craters produced at the Experimental Projectile Impact Chamber (E.P.I.C., Centro de Astrobiología, Spain.)

    NASA Astrophysics Data System (ADS)

    Ormö, J.; Wünnemann, K.; Collins, G.; Melero Asensio, I.

    2012-04-01

    The Experimental Projectile Impact Chamber at Centro de Astrobiología, Spain, consists of a 7 m wide, funnel-shaped test bed and a 20.5 mm caliber compressed-N2 gas gun. The test bed can be filled with any type of target material but is especially designed for wet-target experiments. The shape and size are chosen to decrease disturbance from reflected surface waves in wet-target experiments. Experiments are done under 1 atm pressure. The gas gun can launch projectiles of any material and dimensions <20 mm (smaller diameters using sabots), and at any angle from vertical to near horizontal. The projectile velocities are of the order of a few hundred meters per second, depending mainly on the gas pressure as well as the projectile diameter and density. When using a dry sand target, a transient crater about 30 cm wide is produced. Wet-target experiments have not yet been performed in this newly installed test chamber, but transient cavities in water are expected to be on the order of 50-70 cm wide. The large scale allows for detailed study of the dynamics of cratering motions during the stages of crater growth and subsequent collapse, especially in wet targets. These observations provide valuable benchmark data for numerical simulations and for comparison with field studies. Here we describe the results of ten impact experiments using three different gas pressures (100 bar, 180 bar, 200 bar), two projectile compositions (20 mm, 5.7 g delrin; 20 mm, 16.3 g Al2O3), and two different impact angles (90° and 53° over the horizontal plane). Nine of the experiments were done in a quarter-space geometry using a specially designed camera tank with a 45 mm thick glass window. One experiment was done in half-space geometry as a reference. The experiments were recorded with a high-speed digital video camera, and the resulting craters were documented with a digital still-frame camera. Projectile velocities are estimated with a combination of tracking software and a Shooting Chrony Alpha M-1
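
    For comparing such experiments with theory, transient crater size is commonly estimated with pi-group scaling; a sketch using textbook-style constants for a water-like target (not values derived from these experiments), with an assumed impact velocity:

        import numpy as np

        def transient_radius(a, v, rho_p, rho_t, g=9.81, K1=0.93, mu=0.564):
            """Gravity-regime pi-scaling: pi_R = K1 * pi_2**(-mu/(2+mu))."""
            pi2 = g * a / v**2                      # gravity-scaled size
            m = rho_p * (4.0 / 3.0) * np.pi * a**3  # projectile mass
            return K1 * pi2 ** (-mu / (2.0 + mu)) * (m / rho_t) ** (1.0 / 3.0)

        # 20 mm, 5.7 g delrin sphere at an assumed 400 m/s into water:
        R = transient_radius(a=0.010, v=400.0, rho_p=1360.0, rho_t=1000.0)
        print(f"predicted transient radius: {R:.2f} m")   # roughly 0.4 m

    Under these assumptions, the predicted transient diameter of roughly 0.8 m is of the same order as the 50-70 cm cavities anticipated in the abstract.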

  7. Reproducibility in a multiprocessor system

    DOEpatents

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  8. Study of resonance interactions in polyatomic molecules on the basis of highly accurate experimental data: Set of strongly interacting Bands ν10(B1), ν7(B2), ν4(A2), ν8(B2), ν3(A1) and ν6(B1) of CH2=CD2

    NASA Astrophysics Data System (ADS)

    Ulenikov, O. N.; Gromova, O. V.; Bekhtereva, E. S.; Berezkin, K. B.; Kashirina, N. V.; Tan, T. L.; Sydow, C.; Maul, C.; Bauerecker, S.

    2016-09-01

    The highly accurate (experimental accuracy in line positions ~(1-3) × 10⁻⁴ cm⁻¹) FTIR ro-vibrational spectra of CH2=CD2 in the region of 600-1300 cm⁻¹, where the fundamental bands ν10, ν7, ν4, ν8, ν3, and ν6 are located, were recorded and analyzed with a Hamiltonian model that takes into account resonance interactions between all six studied bands. About 12 200 ro-vibrational transitions belonging to these bands (considerably more than in the preceding studies of the bands ν10, ν7, ν8, ν3 and ν6; transitions belonging to the ν4 band were assigned for the first time) were assigned in the experimental spectra, with maximum quantum numbers Jmax/Ka,max equal to 31/20, 46/18, 33/11, 50/26, 44/20 and 42/21 for the bands ν10, ν7, ν4, ν8, ν3, and ν6, respectively. On that basis, a set of 133 vibrational, rotational, centrifugal distortion and resonance interaction parameters was obtained from a weighted fit. These parameters reproduce the 3920 initial "experimental" ro-vibrational energy levels (positions of about 12 200 experimentally recorded and assigned transitions) with an rms error of d_rms = 2.3 × 10⁻⁴ cm⁻¹.

  9. Accurate Molecular Polarizabilities Based on Continuum Electrostatics

    PubMed Central

    Truchon, Jean-François; Nicholls, Anthony; Iftimie, Radu I.; Roux, Benoît; Bayly, Christopher I.

    2013-01-01

    A novel approach for representing the intramolecular polarizability as a continuum dielectric is introduced to account for molecular electronic polarization. It is shown, using a finite-difference solution to the Poisson equation, that the Electronic Polarization from Internal Continuum (EPIC) model yields accurate gas-phase molecular polarizability tensors for a test set of 98 challenging molecules composed of heteroaromatics, alkanes and diatomics. The electronic polarization originates from a high intramolecular dielectric that produces polarizabilities consistent with B3LYP/aug-cc-pVTZ and experimental values when surrounded by vacuum dielectric. In contrast to other approaches to modeling electronic polarization, this simple model avoids the polarizability catastrophe and accurately calculates molecular anisotropy with the use of very few fitted parameters and without resorting to auxiliary sites or anisotropic atomic centers. On average, the unsigned errors in the average polarizability and anisotropy compared to B3LYP are 2% and 5%, respectively. The correlation between the polarizability components from B3LYP and this approach leads to an R2 of 0.990 and a slope of 0.999. Even the F2 anisotropy, shown to be a difficult case for existing polarizability models, can be reproduced within 2% error. In addition to providing new parameters for a rapid method directly applicable to the calculation of polarizabilities, this work extends the widely used Poisson equation to areas where accurate molecular polarizabilities matter. PMID:23646034
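
    The physical idea, a polarizability that emerges from a high internal dielectric, has a closed form for the simplest geometry: a dielectric sphere of radius R and permittivity ε has α = R³(ε − 1)/(ε + 2) in Gaussian units (the Clausius-Mossotti result, not the EPIC method itself). A worked example with illustrative numbers:

        eps_in = 4.0    # high intramolecular dielectric (illustrative)
        R = 2.0         # spherical cavity radius in Angstrom (illustrative)

        alpha = R**3 * (eps_in - 1.0) / (eps_in + 2.0)   # Gaussian units
        print(f"alpha = {alpha:.2f} A^3")   # ~4 A^3, a molecular magnitude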

  10. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  11. Reproducible quantitative proteotype data matrices for systems biology.

    PubMed

    Röst, Hannes L; Malmström, Lars; Aebersold, Ruedi

    2015-11-01

    Historically, many mass spectrometry-based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals.

  12. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  13. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced published scientific findings, and that therefore document their provenance, are rarely available. We argue that in order to advance, and make more robust, the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-usable and easy to find through consistent use of metadata; second, publish well-documented workflows that combine re-usable code with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-usable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.

  14. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high-resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways1,2 and several improved algorithms for objective quantification of airway wall thickness have been proposed.1-8 In this paper, we describe an algorithm based on a closed-form solution proposed by Weinheimer et al.7 We locally estimate the lung density parameter required for the closed-form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.
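
    The FWHM measurement that the text describes as biased for small airways is simple to state: walk outward from the wall's intensity peak until the profile falls below half its height above background. A sketch on a synthetic profile (the wall model and spacing are made up):

        import numpy as np

        def fwhm_thickness(profile, spacing_mm):
            """Wall thickness from a 1-D intensity profile across the wall."""
            base = profile.min()
            half = base + 0.5 * (profile.max() - base)
            peak = int(np.argmax(profile))
            inner, outer = peak, peak
            while inner > 0 and profile[inner] > half:
                inner -= 1                    # walk towards the lumen
            while outer < len(profile) - 1 and profile[outer] > half:
                outer += 1                    # walk towards the parenchyma
            return (outer - inner) * spacing_mm

        # Synthetic wall: Gaussian-blurred thin wall, sampled at 0.25 mm.
        x = np.arange(-5.0, 5.0, 0.25)
        profile = -1000.0 + 1050.0 * np.exp(-x**2 / (2 * 0.6**2))
        print(f"FWHM thickness: {fwhm_thickness(profile, 0.25):.2f} mm")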

  15. Reproducibility of ERG responses obtained with the DTL electrode.

    PubMed

    Hébert, M; Vaegan; Lachapelle, P

    1999-03-01

    Previous investigators have suggested that the DTL fibre electrode might not be suitable for the recording of replicable electroretinograms. We present experimental evidence that when used adequately, this electrode does permit the recording of highly reproducible retinal potentials.

  16. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387
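
    For readers unfamiliar with the toolkit, here is a minimal sketch of ITK's Python interface (the functional API of ITK 5+); the file names are hypothetical.

    ```python
    import itk  # pip install itk

    # read an image, denoise it, and write the result; each call wraps
    # the corresponding C++ pipeline stage
    image = itk.imread("input.png")
    smoothed = itk.median_image_filter(image, radius=2)
    itk.imwrite(smoothed, "output.png")
    ```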

  17. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  18. Can Contemporary Density Functional Theory Predict Energy Spans in Molecular Catalysis Accurately Enough To Be Applicable for in Silico Catalyst Design? A Computational/Experimental Case Study for the Ruthenium-Catalyzed Hydrogenation of Olefins.

    PubMed

    Rohmann, Kai; Hölscher, Markus; Leitner, Walter

    2016-01-13

    The catalytic hydrogenation of cyclohexene and 1-methylcyclohexene is investigated experimentally and by means of density functional theory (DFT) computations using novel ruthenium Xantphos(Ph) (4,5-bis(diphenylphosphino)-9,9-dimethylxanthene) and Xantphos(Cy) (4,5-bis(dicyclohexylphosphino)-9,9-dimethylxanthene) precatalysts [Ru(Xantphos(Ph))(PhCO2)(Cl)] (1) and [Ru(Xantphos(Cy))(PhCO2)(Cl)] (2), the synthesis, characterization, and crystal structures of which are reported. The intention of this work is to (i) understand the reaction mechanisms on the microscopic level and (ii) compare experimentally observed activation barriers with computed barriers. The Gibbs free activation energy ΔG(⧧) was obtained experimentally with precatalyst 1 from Eyring plots for the hydrogenation of cyclohexene (ΔG(⧧) = 17.2 ± 1.0 kcal/mol) and 1-methylcyclohexene (ΔG(⧧) = 18.8 ± 2.4 kcal/mol), while the Gibbs free activation energy ΔG(⧧) for the hydrogenation of cyclohexene with precatalyst 2 was determined to be 21.1 ± 2.3 kcal/mol. Plausible activation pathways and catalytic cycles were computed in the gas phase (M06-L/def2-SVP). A variety of popular density functionals (ωB97X-D, LC-ωPBE, CAM-B3LYP, B3LYP, B97-D3BJ, B3LYP-D3, BP86-D3, PBE0-D3, M06-L, MN12-L) were used to reoptimize the turnover determining states in the solvent phase (DF/def2-TZVP; IEF-PCM and/or SMD) to investigate how well the experimentally obtained activation barriers can be reproduced by the calculations. The density functionals B97-D3BJ, MN12-L, M06-L, B3LYP-D3, and CAM-B3LYP reproduce the experimentally observed activation barriers for both olefins very well, with very small (0.1 kcal/mol) to moderate (3.0 kcal/mol) mean deviations from the experimental values, indicating that, in the field of hydrogenation catalysis, most of these functionals are useful for in silico catalyst design prior to experimental work. PMID:26713773
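
    For reference, the Eyring analysis used above to extract ΔG(⧧) from temperature-dependent rate constants follows the standard transition-state-theory relations (textbook form, not quoted from the paper):

    ```latex
    k \;=\; \frac{k_{\mathrm{B}} T}{h}\, e^{-\Delta G^{\ddagger}/(RT)},
    \qquad
    \ln\frac{k}{T} \;=\; -\frac{\Delta H^{\ddagger}}{R}\,\frac{1}{T}
      \;+\; \ln\frac{k_{\mathrm{B}}}{h} \;+\; \frac{\Delta S^{\ddagger}}{R},
    \qquad
    \Delta G^{\ddagger} \;=\; \Delta H^{\ddagger} - T\,\Delta S^{\ddagger}
    ```

    so that a linear fit of ln(k/T) against 1/T yields ΔH(⧧) from the slope and ΔS(⧧) from the intercept.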

  19. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-12-10

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

  20. Latent fingermark pore area reproducibility.

    PubMed

    Gupta, A; Buckley, K; Sutton, R

    2008-08-01

    The study of the reproducibility of friction ridge pore detail in fingermarks is a measure of their usefulness in personal identification. Pore area in latent prints developed using cyanoacrylate and ninhydrin were examined and measured by photomicrography using appropriate software tools. The data were analysed statistically and the results showed that pore area is not reproducible in developed latent prints, using either of the development techniques. The results add further support to the lack of reliability of pore area in personal identification. PMID:18617339

  1. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  2. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: Application to pure copper, platinum, tungsten, and nickel at very high temperatures

    SciTech Connect

    Abadlia, L.; Mayoufi, M.; Gasser, F.; Khalouk, K.; Gasser, J. G.

    2014-09-15

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in the present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with good accuracy the thermal conductivity, using the relations between the three electronic transport coefficients and going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new, very accurate results for pure copper, platinum, and nickel, especially at very high temperatures. But resistivity and absolute thermopower measurements can be more than objectives in themselves. Resistivity characterizes the bulk of a material, while absolute thermoelectric power characterizes the material at the point where electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power advantageously characterizes changes of phase, probably as well as DSC (if not better), since changes of phase can easily be followed for several hours/days at constant temperature.
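
    For context, the "classical Wiedemann-Franz law" that the authors go beyond relates the electronic thermal conductivity to the electrical conductivity through the Sommerfeld value of the Lorenz number (standard form; the paper's refined relations involving the thermopower are not reproduced here):

    ```latex
    \frac{\kappa_e}{\sigma T} \;=\; L_0 \;=\; \frac{\pi^2}{3}\left(\frac{k_{\mathrm{B}}}{e}\right)^{2}
    \;\approx\; 2.44\times10^{-8}\ \mathrm{W\,\Omega\,K^{-2}}
    ```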

  3. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: application to pure copper, platinum, tungsten, and nickel at very high temperatures.

    PubMed

    Abadlia, L; Gasser, F; Khalouk, K; Mayoufi, M; Gasser, J G

    2014-09-01

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in the present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with good accuracy the thermal conductivity, using the relations between the three electronic transport coefficients and going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new, very accurate results for pure copper, platinum, and nickel, especially at very high temperatures. But resistivity and absolute thermopower measurements can be more than objectives in themselves. Resistivity characterizes the bulk of a material, while absolute thermoelectric power characterizes the material at the point where electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power advantageously characterizes changes of phase, probably as well as DSC (if not better), since changes of phase can easily be followed for several hours/days at constant temperature.

  4. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  5. Reproducible Research in Computational Science

    PubMed Central

    Peng, Roger D.

    2012-01-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible. PMID:22144613

  6. Reproducible research in computational science.

    PubMed

    Peng, Roger D

    2011-12-01

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  7. Performance reproducibility index for classification

    PubMed Central

    Yousefi, Mohammadmahdi R.; Dougherty, Edward R.

    2012-01-01

    Motivation: A common practice in biomarker discovery is to decide whether a large laboratory experiment should be carried out based on the results of a preliminary study on a small set of specimens. Consideration of the efficacy of this approach motivates the introduction of a probabilistic measure, for whether a classifier showing promising results in a small-sample preliminary study will perform similarly on a large independent sample. Given the error estimate from the preliminary study, if the probability of reproducible error is low, then there is really no purpose in substantially allocating more resources to a large follow-on study. Indeed, if the probability of the preliminary study providing likely reproducible results is small, then why even perform the preliminary study? Results: This article introduces a reproducibility index for classification, measuring the probability that a sufficiently small error estimate on a small sample will motivate a large follow-on study. We provide a simulation study based on synthetic distribution models that possess known intrinsic classification difficulties and emulate real-world scenarios. We also set up similar simulations on four real datasets to show the consistency of results. The reproducibility indices for different distributional models, real datasets and classification schemes are empirically calculated. The effects of reporting and multiple-rule biases on the reproducibility index are also analyzed. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routine and error estimation methods. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi12a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:22954625
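
    The idea behind the index can be illustrated with a small Monte Carlo sketch (ours, not the authors' C implementation, and with a simpler Gaussian model than theirs): estimate the error of a classifier trained on a small sample, then check how often that estimate lies within a tolerance of the error measured on a large independent sample.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def sample(n, d=5, delta=1.0):
        """Balanced two-class Gaussian data with mean separation delta."""
        y = np.repeat([0, 1], n // 2)
        X = rng.normal(0.0, 1.0, (y.size, d)) + delta * y[:, None] / np.sqrt(d)
        return X, y

    def reproducibility_index(n_small=30, n_large=2000, eps=0.05, trials=200):
        """Fraction of trials where the small-sample CV error estimate lies
        within eps of the error measured on a large independent sample."""
        hits = 0
        for _ in range(trials):
            Xs, ys = sample(n_small)
            clf = LinearDiscriminantAnalysis().fit(Xs, ys)
            e_small = 1.0 - cross_val_score(LinearDiscriminantAnalysis(), Xs, ys, cv=5).mean()
            Xl, yl = sample(n_large)
            e_large = 1.0 - clf.score(Xl, yl)
            hits += abs(e_small - e_large) <= eps
        return hits / trials

    print(reproducibility_index())
    ```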

  8. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

    Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120

  9. Cloning to reproduce desired genotypes.

    PubMed

    Westhusin, M E; Long, C R; Shin, T; Hill, J R; Looney, C R; Pryor, J H; Piedrahita, J A

    2001-01-01

    Cloned sheep, cattle, goats, pigs and mice have now been produced using somatic cells for nuclear transplantation. Animal cloning is still very inefficient with on average less than 10% of the cloned embryos transferred resulting in a live offspring. However successful cloning of a variety of different species and by a number of different laboratory groups has generated tremendous interest in reproducing desired genotypes. Some of these specific genotypes represent animal cell lines that have been genetically modified. In other cases there is a significant demand for cloning animals characterized by their inherent genetic value, for example prize livestock, household pets and rare or endangered species. A number of different variables may influence the ability to reproduce a specific genotype by cloning. These include species, source of recipient ova, cell type of nuclei donor, treatment of donor cells prior to nuclear transfer, and the techniques employed for nuclear transfer. At present, there is no solid evidence that suggests cloning will be limited to only a few specific animals, and in fact, most data collected to date suggests cloning will be applicable to a wide variety of different animals. The ability to reproduce any desired genotype by cloning will ultimately depend on the amount of time and resources invested in research.

  10. Reproducibility of sterilized rubber impressions.

    PubMed

    Abdelaziz, Khalid M; Hassan, Ahmed M; Hodges, J S

    2004-01-01

    Impressions, dentures and other dental appliances may be contaminated with oral micro-flora or other organisms of varying pathogenicity from patient's saliva and blood. Several approaches have been tried to control the transmission of infectious organisms via dental impressions and because disinfection is less effective and has several drawbacks for impression characterization, several sterilization methods have been suggested. This study evaluated the reproducibility of rubber impressions after sterilization by different methods. Dimensional accuracy and wettability of two rubber impression materials (vinyl polysiloxane and polyether) were evaluated after sterilization by each of three well-known methods (immersion in 2% glutaraldehyde for 10 h, autoclaving and microwave radiation). Non-sterilized impressions served as control. The effect of the tray material on impression accuracy and the effect of topical surfactant on the wettability were also evaluated. One-way ANOVA with Dunnett's method was used for statistical analysis. All sterilizing methods reduced the reproducibility of rubber impressions, although not always significantly. Microwave sterilization had a small effect on both accuracy and wettability. The greater effects of the other methods could usually be overcome by using ceramic trays and by spraying impression surfaces with surfactant before pouring the gypsum mix. There was one exception: glutaraldehyde still degraded dimensional accuracy even with ceramic trays and surfactant. We conclude that a) sterilization of rubber impressions made on acrylic trays was usually associated with a degree of dimensional change; b) microwave energy seems to be a suitable technique for sterilizing rubber impressions; c) topical surfactant application helped restore wettability of sterilized impressions. PMID:15798825

  11. Evaluation of guidewire path reproducibility.

    PubMed

    Schafer, Sebastian; Hoffmann, Kenneth R; Noël, Peter B; Ionita, Ciprian N; Dmochowski, Jacek

    2008-05-01

    The number of minimally invasive vascular interventions is increasing. In these interventions, a variety of devices are directed to and placed at the site of intervention. The device used in almost all of these interventions is the guidewire, acting as a monorail for all devices which are delivered to the intervention site. However, even with the guidewire in place, clinicians still experience difficulties during the interventions. As a first step toward understanding these difficulties and facilitating guidewire and device guidance, we have investigated the reproducibility of the final paths of the guidewire in vessel phantom models with respect to different factors: user, materials and geometry. Three vessel phantoms (vessel diameters approximately 4 mm) were constructed from silicone tubing and encased in Sylgard elastomer, with tortuosity similar to the internal carotid artery. Several trained users repeatedly passed two guidewires of different flexibility through the phantoms under pulsatile flow conditions. After the guidewire had been placed, rotational c-arm image sequences were acquired (9 in. II mode, 0.185 mm pixel size), and the phantom and guidewire were reconstructed (512³, 0.288 mm voxel size). The reconstructed volumes were aligned. The centerlines of the guidewire and the phantom vessel were then determined using region-growing techniques. Guidewire paths appear similar across users but not across materials. The average root mean square difference of the repeated placement was 0.17 ± 0.02 mm (plastic-coated guidewire), 0.73 ± 0.55 mm (steel guidewire) and 1.15 ± 0.65 mm (steel versus plastic-coated). For a given guidewire, these results indicate that the guidewire path is relatively reproducible in shape and position.
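
    The headline numbers above are root-mean-square distances between repeatedly placed guidewire centerlines. A minimal sketch of that comparison (ours; it assumes the two paths are already co-registered, as the aligned volumes are in the study):

    ```python
    import numpy as np

    def rms_path_difference(path_a, path_b, n=200):
        """RMS distance between two 3-D centerlines after resampling each
        to n points equally spaced in arc length."""
        def resample(p):
            p = np.asarray(p, dtype=float)
            s = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(p, axis=0), axis=1))]
            t = np.linspace(0.0, s[-1], n)
            return np.column_stack([np.interp(t, s, p[:, i]) for i in range(p.shape[1])])

        a, b = resample(path_a), resample(path_b)
        return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))
    ```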

  12. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. PMID:26315443

  13. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

  14. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  15. Wire like link for cycle reproducible and cycle accurate hardware accelerator

    SciTech Connect

    Asaad, Sameh; Kapur, Mohit; Parker, Benjamin D

    2015-04-07

    First and second field programmable gate arrays are provided which implement first and second blocks of a circuit design to be simulated. The field programmable gate arrays are operated at a first clock frequency, and a wire-like link is provided to send a plurality of signals between them. The wire-like link includes a serializer, on the first field programmable gate array, to serialize the plurality of signals; a deserializer, on the second field programmable gate array, to deserialize the plurality of signals; and a connection between the serializer and the deserializer. The serializer and the deserializer are operated at a second clock frequency, greater than the first clock frequency, and the second clock frequency is selected such that the latency of transmission and reception of the plurality of signals is less than the period corresponding to the first clock frequency.
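
    The timing condition in the last sentence is simple to check numerically. A toy sketch (our illustration; the signal count, clock rates, and fixed pipeline overhead are hypothetical):

    ```python
    def link_is_cycle_accurate(n_signals, f_design_hz, f_serdes_hz, overhead_cycles=4):
        """True if n_signals can be serialized, transmitted and deserialized
        within one period of the slower design clock, assuming one fast-clock
        cycle per serialized signal plus a fixed pipeline overhead."""
        latency_s = (n_signals + overhead_cycles) / f_serdes_hz
        return latency_s < 1.0 / f_design_hz

    # e.g. 64 signals, 10 MHz design clock, 1 GHz serial clock: 68 ns < 100 ns
    print(link_is_cycle_accurate(64, 10e6, 1e9))  # True
    ```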

  16. Applicability of density functional theory in reproducing accurate vibrational spectra of surface bound species.

    PubMed

    Matanović, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando H; Henson, Neil J

    2014-10-01

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, the Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and the Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, the nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing the energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that the TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism, with errors of -2.62 and -1.1% for the N-N stretching and Rh-H stretching frequency. Thus, the TPSS functional was proposed as a method of choice for obtaining vibrational spectra of low weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving the one- and two-dimensional Schrödinger equation associated with the Rh-H, Rh-N, and N-N potential energy, we calculated the anharmonic corrections for the N-N and Rh-H stretching modes as -31 cm(-1) and -77 cm(-1) at the PBE level. Anharmonic vibrational frequencies calculated with the use of the hybrid HSE06 functional are in best agreement with available experiments. PMID:25164265
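
    The anharmonic step described above amounts to solving a one-dimensional Schrödinger equation on a computed potential-energy curve. A self-contained finite-difference sketch (ours; the Morse-like potential and mass stand in for the actual DFT data):

    ```python
    import numpy as np

    HBAR = 1.0  # atomic-style units for this sketch

    def bound_states(x, V, mass, k=3):
        """Lowest k eigenvalues of -(hbar^2/2m) d^2/dx^2 + V(x) on a uniform
        grid, using the standard three-point finite-difference Hamiltonian.
        The anharmonic fundamental is E1 - E0, to be compared with the
        harmonic frequency at the potential minimum."""
        dx = x[1] - x[0]
        diag = HBAR**2 / (mass * dx**2) + V
        off = -HBAR**2 / (2.0 * mass * dx**2) * np.ones(x.size - 1)
        H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
        return np.linalg.eigvalsh(H)[:k]

    x = np.linspace(-1.0, 4.0, 800)
    V = 0.2 * (1.0 - np.exp(-1.2 * x)) ** 2   # toy Morse-like stretch potential
    print(bound_states(x, V, mass=1836.0))    # mass ~ H atom in atomic units
    ```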

  17. Applicability of Density Functional Theory in Reproducing Accurate Vibrational Spectra of Surface Bound Species

    SciTech Connect

    Matanovic, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando; Henson, Neil J.

    2014-10-05

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, the Tao, Perdew, Staroverov, and Scuseria functional (TPSS), and the Heyd, Scuseria, and Ernzerhof (HSE06) functional in the plane wave formalism. Among the five tested functionals, the nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing the energetics of dinitrogen physisorption to the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that the TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism, with errors of -2.62 and -1.1% for the N-N stretching and Rh-H stretching frequency. Thus, the TPSS functional was proposed as a method of choice for obtaining vibrational spectra of low weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh-H and N-N stretching modes from the bulk phonons and by solving the one- and two-dimensional Schrödinger equation associated with the Rh-H, Rh-N, and N-N potential energy, we calculated the anharmonic corrections for the N-N and Rh-H stretching modes as -31 cm(-1) and -77 cm(-1) at the PBE level. Anharmonic vibrational frequencies calculated with the use of the hybrid HSE06 functional are in best agreement with available experiments.

  18. Susceptibility testing: accurate and reproducible minimum inhibitory concentration (MIC) and non-inhibitory concentration (NIC) values.

    PubMed

    Lambert, R J; Pearson, J

    2000-05-01

    Measuring the minimum inhibitory concentration (MIC) of a substance by current methods is straightforward, whereas obtaining useful comparative information from the tests can be more difficult. A simple technique and a method of data analysis are reported which give the experimentalist more useful information from susceptibility testing. This method makes use of a 100-well microtitre plate and the analysis uses all the growth information, obtained by turbidometry, from each and every well of the microtitre plate. A modified Gompertz function is used to fit the data, from which a more exact value can be obtained for the MIC. The technique also showed that at certain concentrations of inhibitor, there was no effect on growth relative to a control well (zero inhibitor). Above a threshold value, which has been termed the non-inhibitory concentration or NIC, growth becomes limiting until it reaches the MIC, where no growth relative to the control is observed.
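
    A sketch of the fitting step (ours; this is one common parameterization of the modified Gompertz curve, and the paper's exact form may differ). Growth, measured by turbidometry, is modeled as a function of log10 inhibitor concentration; the MIC and NIC then fall out of the tangent at the inflection point intersecting the lower and upper asymptotes.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(logc, A, C, M, B):
        """Decay of growth with log10 concentration: upper asymptote A + C
        (uninhibited growth), lower asymptote A (no growth)."""
        return A + C * np.exp(-np.exp(B * (logc - M)))

    def fit_mic_nic(logc, growth):
        p0 = [growth.min(), np.ptp(growth), np.median(logc), 2.0]
        (A, C, M, B), _ = curve_fit(gompertz, logc, growth, p0=p0, maxfev=10000)
        mic = 10 ** (M + 1.0 / B)            # tangent meets lower asymptote
        nic = 10 ** (M - (np.e - 1.0) / B)   # tangent meets upper asymptote
        return mic, nic
    ```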

  19. Statistical analysis of accurate prediction of local atmospheric optical attenuation with a new model according to weather together with beam wandering compensation system: a season-wise experimental investigation

    NASA Astrophysics Data System (ADS)

    Arockia Bazil Raj, A.; Padmavathi, S.

    2016-07-01

    Atmospheric parameters strongly affect the performance of Free Space Optical Communication (FSOC) systems when the optical wave is propagating through the inhomogeneous turbulent medium. Developing a model that accurately predicts optical attenuation from meteorological parameters is significant for understanding the behaviour of the FSOC channel during different seasons. A dedicated free space optical link experimental set-up was developed for a range of 0.5 km at an altitude of 15.25 m. The diurnal profile of received power and the corresponding meteorological parameters are continuously measured using the developed optoelectronic assembly and a weather station, respectively, and stored in a data-logging computer. Measured meteorological parameters (as input factors) and optical attenuation (as response factor) of size [177147 × 4] are used for linear regression analysis and to design the mathematical model that is most suitable for predicting the atmospheric optical attenuation at our test field. A model that exhibits an R² value of 98.76% and an average percentage deviation of 1.59% is considered for practical implementation. The prediction accuracy of the proposed model is investigated, along with comparative results obtained from some of the existing models in terms of Root Mean Square Error (RMSE), during different local seasons over a one-year period. An average RMSE value of 0.043 dB/km is obtained over the full dynamic range of meteorological parameter variations.
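
    The model-building step is ordinary multiple linear regression scored by R² and RMSE. A minimal sketch (ours; the file name and column layout are hypothetical):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error, r2_score

    # hypothetical columns: temperature, humidity, wind speed -> attenuation (dB/km)
    data = np.loadtxt("weather_attenuation.csv", delimiter=",")
    X, y = data[:, :3], data[:, 3]

    model = LinearRegression().fit(X, y)
    pred = model.predict(X)
    print("R^2 :", r2_score(y, pred))
    print("RMSE:", mean_squared_error(y, pred) ** 0.5)
    ```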

  20. Combining Dissimilarities in a Hyper Reproducing Kernel Hilbert Space for Complex Human Cancer Prediction

    PubMed Central

    Martín-Merino, Manuel; Blanco, Ángela; De Las Rivas, Javier

    2009-01-01

    DNA microarrays provide rich profiles that are used in cancer prediction considering the gene expression levels across a collection of related samples. Support Vector Machines (SVM) have been applied to the classification of cancer samples with encouraging results. However, they rely on Euclidean distances that fail to reflect accurately the proximities among sample profiles. Non-Euclidean dissimilarities therefore provide additional information that should be considered to reduce the misclassification errors. In this paper, we incorporate into the ν-SVM algorithm a linear combination of non-Euclidean dissimilarities. The weights of the combination are learnt in a Hyper Reproducing Kernel Hilbert Space (HRKHS) using a Semidefinite Programming algorithm. This approach allows us to incorporate a smoothing term that penalizes the complexity of the family of distances and avoids overfitting. The experimental results suggest that the proposed method helps to reduce the misclassification errors in several human cancer problems. PMID:19584909

  1. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H. J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions.

  2. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H. J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions. PMID:27196963

  3. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools for modeling sediment transport and morphological evolution have been developed in recent years and can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for simulating such a complex geometry, while the latter often require prohibitive computational effort. 2D models, however, are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphological evolution caused by sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach used to reproduce channel bed roughness has been modified to take into account the morphological effect of replenishment, which causes fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface and travel distance of placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The roughness-fining effect is better reproduced with the approach proposed herein. However, the sediment clusters found in the experiments are not well reproduced numerically in regions of the channel with a limited number of sediment grains.
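
    For reference, the governing system named above, in one standard form (notation ours): the 2D shallow water equations with bed-slope and friction sources, weakly coupled to the Exner equation for bed evolution,

    ```latex
    \partial_t h + \nabla\!\cdot(h\mathbf{u}) = 0, \qquad
    \partial_t(h\mathbf{u}) + \nabla\!\cdot\!\left(h\,\mathbf{u}\otimes\mathbf{u}
      + \tfrac{1}{2} g h^2\,\mathbb{I}\right)
      = -\,g h \nabla z_b - \frac{\boldsymbol{\tau}_b}{\rho}, \qquad
    (1-\lambda)\,\partial_t z_b + \nabla\!\cdot\mathbf{q}_s = 0,
    ```

    where h is the flow depth, u the depth-averaged velocity, z_b the bed elevation, λ the bed porosity, τ_b the bed shear stress and q_s the bedload flux.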

  4. A simple and reproducible breast cancer prognostic test

    PubMed Central

    2013-01-01

    Background A small number of prognostic and predictive tests based on gene expression are currently offered as reference laboratory tests. In contrast to such success stories, a number of flaws and errors have recently been identified in other genomic-based predictors and the success rate for developing clinically useful genomic signatures is low. These errors have led to widespread concerns about the protocols for conducting and reporting of computational research. As a result, a need has emerged for a template for reproducible development of genomic signatures that incorporates full transparency, data sharing and statistical robustness. Results Here we present the first fully reproducible analysis of the data used to train and test MammaPrint, an FDA-cleared prognostic test for breast cancer based on a 70-gene expression signature. We provide all the software and documentation necessary for researchers to build and evaluate genomic classifiers based on these data. As an example of the utility of this reproducible research resource, we develop a simple prognostic classifier that uses only 16 genes from the MammaPrint signature and is equally accurate in predicting 5-year disease free survival. Conclusions Our study provides a prototypic example for reproducible development of computational algorithms for learning prognostic biomarkers in the era of personalized medicine. PMID:23682826

  5. Towards Accurate Molecular Modeling of Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Chantawansri, T. L.; Andzelm, J.; Taylor, D.; Byrd, E.; Rice, B.

    2010-03-01

    There is substantial interest in identifying the controlling factors that influence the susceptibility of polymer bonded explosives (PBXs) to accidental initiation. Numerous Molecular Dynamics (MD) simulations of PBXs using the COMPASS force field have been reported in recent years, where the validity of the force field in modeling the solid EM fill has been judged solely on its ability to reproduce lattice parameters, which is an insufficient metric. Performance of the COMPASS force field in modeling EMs and the polymeric binder has been assessed by calculating structural, thermal, and mechanical properties, where only fair agreement with experimental data is obtained. We performed MD simulations using the COMPASS force field for the polymer binder hydroxyl-terminated polybutadiene and five EMs: cyclotrimethylenetrinitramine, 1,3,5,7-tetranitro-1,3,5,7-tetra-azacyclo-octane, 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane, 2,4,6-trinitro-1,3,5-benzenetriamine, and pentaerythritol tetranitrate. Predicted EM crystallographic and molecular structural parameters, as well as calculated properties for the binder, will be compared with experimental results for different simulation conditions. We also present novel simulation protocols, which improve agreement between experimental and computational results, thus leading to the accurate modeling of PBXs.

  6. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants, which preserve monotonicity as well as uniform third and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
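
    SciPy's PCHIP interpolator implements a monotone piecewise-cubic scheme of the classical kind discussed here (and therefore shows the second-order behavior near extrema that the paper's algorithms improve upon). A small illustration, ours:

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # monotone data: an unconstrained cubic spline would overshoot here
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.0, 0.1, 0.2, 5.0, 5.1])

    f = PchipInterpolator(x, y)       # monotonicity-preserving cubic
    xs = np.linspace(0.0, 4.0, 9)
    print(np.round(f(xs), 3))         # values stay within [0, 5.1], no wiggles
    ```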

  7. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high order and high resolution algorithms can produce accurate results after O(10⁶) periods of propagation with eight grid points per wavelength.

  8. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  9. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  10. Reproducibility and uncertainty of wastewater turbidity measurements.

    PubMed

    Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L; Joannis, C; Ruban, G

    2008-01-01

    Turbidity monitoring is a valuable tool for operating sewer systems, but it is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the sensor model, and even on the operator. This paper details the main components of the uncertainty in turbidity measurements, with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater through proper calibration procedures. Calibration appears to be the main source of uncertainty, and proper procedures must account for uncertainties in standard solutions as well as the non-linearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept lower than 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used.
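
    A sketch of the calibration step the abstract emphasizes (ours; the standard concentrations and raw readings are made up): fitting a low-order polynomial rather than a straight line accounts for the non-linearity of the calibration curve.

    ```python
    import numpy as np

    # hypothetical calibration pairs: formazin standards (FAU) vs raw readings
    standards = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])
    readings = np.array([2.0, 55.0, 112.0, 230.0, 470.0, 990.0])

    # quadratic calibration captures the curvature a linear fit would miss
    coeffs = np.polyfit(readings, standards, deg=2)

    def to_fau(raw):
        return np.polyval(coeffs, raw)

    print(to_fau(300.0))
    ```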

  11. Thou Shalt Be Reproducible! A Technology Perspective.

    PubMed

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  12. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity.
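
    The frequency-dependent decay of the spanwise correlation length mentioned above is commonly described by the empirical Corcos model of the TBL wall-pressure cross-spectrum (shown for illustration; the paper does not necessarily use this exact form):

    ```latex
    S_{pp}(\xi_x,\xi_y,\omega) \;=\; \Phi_{pp}(\omega)\,
    \exp\!\left(-\frac{\alpha_x \omega |\xi_x|}{U_c}\right)
    \exp\!\left(-\frac{\alpha_y \omega |\xi_y|}{U_c}\right)
    e^{\,\mathrm{i}\,\omega \xi_x / U_c},
    ```

    where ξx and ξy are the streamwise and spanwise separations, U_c the convection velocity, and α_x ≈ 0.1, α_y ≈ 0.7 empirical decay constants; the spanwise correlation length Λ_y ≈ U_c/(α_y ω) indeed shrinks as frequency grows, which is why near-field source arrays struggle at high frequency.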

  13. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity. PMID:21302997

  14. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  15. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  16. Reproducibility responsibilities in the HPC arena

    SciTech Connect

    Fahey, Mark R; McLay, Robert

    2014-01-01

    Expecting bit-for-bit reproducibility in the HPC arena is not feasible because of the ever-changing hardware and software. No user's application is an island; it lives in an HPC ecosystem that changes over time. Old hardware stops working, and even old software won't run on new hardware. Further, software libraries change over time, either in their internals or even in their interfaces. So bit-for-bit reproducibility should not be expected. A reasonable expectation, rather, is that results are reproducible within error bounds, or that the answers are close (which is its own debate). For a researcher to reproduce their own results or the results of others within some error bounds, there must be enough information to recreate all the details of the experiment. This requires complete documentation of all phases of the researcher's workflow: from code to versioning to programming and runtime environments to publishing of data. This argument is the core statement of the Yale 2009 Declaration on Reproducible Research [1]. Although the HPC ecosystem is often outside the researcher's control, the application code can be built almost identically, giving a chance of very similar results with only round-off differences. To achieve complete documentation at every step, the researcher, the computing center, and the funding agencies all have a role. Here, the role of the researcher is expanded upon relative to the Yale report, and the role of the computing centers is described.
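
    A minimal sketch of the "reproducible within error bounds" check the authors advocate, assuming two result files produced by the same application on different software stacks (the file names are hypothetical):

        import numpy as np

        # Outputs of "the same" simulation built on two HPC software stacks.
        run_a = np.loadtxt("results_build1.txt")  # hypothetical output files
        run_b = np.loadtxt("results_build2.txt")

        # Bit-for-bit equality will usually fail across stacks; agreement
        # within accumulated round-off is the realistic criterion.
        bitwise = np.array_equal(run_a, run_b)
        within_bounds = np.allclose(run_a, run_b, rtol=1e-10, atol=1e-12)
        print(f"bit-for-bit: {bitwise}, within error bounds: {within_bounds}")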

  17. The Economics of Reproducibility in Preclinical Research

    PubMed Central

    Freedman, Leonard P.; Cockburn, Iain M.; Simcoe, Timothy S.

    2015-01-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible—in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures. PMID:26057340
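
    The headline figure is a simple product; a back-of-envelope sketch, assuming (as the paper's analysis roughly does) about US$56B of annual US preclinical spend:

        # Back-of-envelope reproduction of the ~US$28B/year estimate.
        annual_us_preclinical_spend = 56.4e9  # assumed, US dollars/year
        irreproducibility_rate = 0.50         # cumulative prevalence (abstract)
        wasted = annual_us_preclinical_spend * irreproducibility_rate
        print(f"~US${wasted / 1e9:.1f}B/year")  # ~US$28.2B/year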

  18. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost.

  19. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for lignin degradation, which was lower than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. PMID:21982279
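
    A sketch of the reproducibility metric used in both versions of this study, the coefficient of variation across the six replicate reactors; the sample values below are invented:

        import numpy as np

        # Total organic matter loss (%) in six replicate reactors (toy values).
        tom_loss = np.array([44.8, 46.5, 47.1, 45.2, 46.9, 45.5])
        cv = 100 * tom_loss.std(ddof=1) / tom_loss.mean()
        print(f"CV = {cv:.1f}%")  # reported CVs were generally below 19%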

  20. Design Procedure and Fabrication of Reproducible Silicon Vernier Devices for High-Performance Refractive Index Sensing

    PubMed Central

    Troia, Benedetto; Khokhar, Ali Z.; Nedeljkovic, Milos; Reynolds, Scott A.; Hu, Youfang; Mashanovich, Goran Z.; Passaro, Vittorio M. N.

    2015-01-01

    In this paper, we propose a generalized procedure for the design of integrated Vernier devices for high-performance chemical and biochemical sensing. In particular, we demonstrate accurate control of the most critical design and fabrication parameters of silicon-on-insulator cascade-coupled racetrack resonators operating in the second regime of the Vernier effect, around 1.55 μm. The experimental implementation of our design strategies has allowed a rigorous and reliable investigation of the influence of racetrack resonator and directional coupler dimensions, as well as of waveguide process variability, on the operation of Vernier devices. Figures of merit of our Vernier architectures have been measured experimentally, showing high reproducibility and very good agreement with theoretical predictions, confirmed by relative errors below 1%. Finally, a Vernier gain as high as 30.3, an average insertion loss of 2.1 dB and an extinction ratio up to 30 dB have been achieved. PMID:26067193
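
    For context, the textbook Vernier relations (stated generically, not taken from the paper): cascading two resonators whose free spectral ranges differ slightly gives

        \mathrm{FSR}_{\mathrm{Vernier}} = \frac{\mathrm{FSR}_1\,\mathrm{FSR}_2}{\lvert \mathrm{FSR}_1 - \mathrm{FSR}_2 \rvert},
        \qquad
        G = \frac{\mathrm{FSR}_2}{\lvert \mathrm{FSR}_1 - \mathrm{FSR}_2 \rvert}

    so a measured gain G of 30.3 fixes how closely the two racetrack free spectral ranges must be matched, which is why the fabrication control described above matters.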

  1. Accurate Optical Reference Catalogs

    NASA Astrophysics Data System (ADS)

    Zacharias, N.

    2006-08-01

    Current and near future all-sky astrometric catalogs on the ICRF are reviewed with the emphasis on reference star data at optical wavelengths for user applications. The standard error of a Hipparcos Catalogue star position is now about 15 mas per coordinate. For the Tycho-2 data it is typically 20 to 100 mas, depending on magnitude. The USNO CCD Astrograph Catalog (UCAC) observing program was completed in 2004 and reductions toward the final UCAC3 release are in progress. This all-sky reference catalogue will have positional errors of 15 to 70 mas for stars in the 10 to 16 mag range, with a high degree of completeness. Proper motions for the roughly 60 million UCAC stars will be derived by combining UCAC astrometry with available early-epoch data, including as yet unpublished scans of the complete set of AGK2, Hamburg Zone astrograph and USNO Black Birch programs. Accurate positional and proper motion data are combined in the Naval Observatory Merged Astrometric Dataset (NOMAD), which includes Hipparcos, Tycho-2, UCAC2, USNO-B1 and NPM+SPM plate scan data for astrometry, and is supplemented by multi-band optical photometry as well as 2MASS near infrared photometry. The Milli-Arcsecond Pathfinder Survey (MAPS) mission is currently being planned at USNO. This is a micro-satellite to obtain 1 mas positions, parallaxes, and 1 mas/yr proper motions for all bright stars down to about 15th magnitude. This program will be supplemented by a ground-based program to reach 18th magnitude on the 5 mas level.
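
    A sketch of why the early-epoch plate scans matter for proper motions: the motion and its error both scale inversely with the epoch baseline. All values below are illustrative.

        import numpy as np

        # One coordinate of one star measured at two epochs (mas, relative).
        pos_early, err_early, t_early = 120.0, 80.0, 1935.0  # photographic plate
        pos_late,  err_late,  t_late  = 410.0, 20.0, 2002.0  # CCD observation

        baseline = t_late - t_early                    # 67-year epoch difference
        mu = (pos_late - pos_early) / baseline         # proper motion, mas/yr
        mu_err = np.hypot(err_early, err_late) / baseline
        print(f"mu = {mu:.2f} +/- {mu_err:.2f} mas/yr")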

  2. Reproducibility, Controllability, and Optimization of Lenr Experiments

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  3. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  4. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we set about reproducing the medicine in a laboratory using early modern pharmaceutical methods, and analysing it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  5. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  6. Reproducibility and quantitation of amplicon sequencing-based detection.

    PubMed

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-08-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
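
    One plausible way to compute the technical-replicate OTU overlap quoted above (an assumption; the authors' exact definition may differ) is the shared fraction of all OTUs detected in either replicate:

        # OTU identifiers detected in two technical replicates (toy data).
        rep1 = {f"otu{i:04d}" for i in range(0, 500)}
        rep2 = {f"otu{i:04d}" for i in range(400, 900)}

        overlap = 100 * len(rep1 & rep2) / len(rep1 | rep2)
        print(f"OTU overlap: {overlap:.1f}%")  # cf. ~17% between replicates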

  7. A Physical Activity Questionnaire: Reproducibility and Validity

    PubMed Central

    Barbosa, Nicolas; Sanchez, Carlos E.; Vera, Jose A.; Perez, Wilson; Thalabard, Jean-Christophe; Rieu, Michel

    2007-01-01

    This study evaluates the reproducibility and validity of the Quantification de l'Activité Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 students, who completed the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of the peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80) respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake. Key points: The presence of a supervisor and the limited size of the group, with the possibility of answering their questions, could explain the high reproducibility of this questionnaire. No study in the literature had directly addressed the issue of estimating a yearly average PA including school and vacation periods. A two-step procedure, in the population of schoolchildren of Bogotá, gives confidence in the use of the QAPACE questionnaire in a large epidemiological survey in related populations. PMID:24149485

  8. Reproducibility of electroretinograms recorded with DTL electrodes.

    PubMed

    Hébert, M; Lachapelle, P; Dumont, M

    The purpose of this study was to examine whether the use of the DTL fiber electrode yields stable and reproducible electroretinographic recordings. To do so, a luminance-response function, derived from dark-adapted electroretinograms, was obtained from both eyes of 10 normal subjects at two recording sessions spaced 7-14 days apart. The data thus generated were used to calculate the Naka-Rushton Vmax and k parameters, and the values obtained at the two recording sessions were compared. Our results showed no significant difference in the values of Vmax and k calculated from the data generated at the two recording sessions. These results clearly demonstrate that the use of the DTL fiber electrode does not jeopardize, in any way, the stability and reproducibility of ERG responses.
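
    A sketch of the parameter extraction described here: fitting the Naka-Rushton function to a dark-adapted luminance-response series to recover Vmax and k. The data points are invented and the exponent is fixed at 1 for simplicity.

        import numpy as np
        from scipy.optimize import curve_fit

        def naka_rushton(I, Vmax, k):
            # V(I) = Vmax * I / (I + k): saturating response vs. luminance
            return Vmax * I / (I + k)

        lum = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])           # cd.s/m^2 (toy)
        amp = np.array([55.0, 120.0, 230.0, 330.0, 410.0, 450.0])  # uV (toy)

        popt, _ = curve_fit(naka_rushton, lum, amp, p0=[500.0, 0.1])
        print("Vmax = %.0f uV, k = %.3f" % tuple(popt))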

  9. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    NASA Astrophysics Data System (ADS)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This configuration was considered conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integrated on a logarithmic scale, and the excitation power is automatically corrected. The device generates a stable and reproducible cavitation level for a wide range of setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of chemical bubble-collapse effects: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, they are highly reproducible. This device was implemented for the study of sonodynamic effects, where the regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of different particles (Quantum Dots), molecules (SiRNA) or plasmids (GFP, DsRed) into different
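
    A minimal sketch of the closed-loop idea: proportionally correcting the excitation power toward a cavitation-index setpoint at each measurement interval. The plant model, gain and setpoint are invented.

        import random

        def cavitation_index(power):
            # Stand-in for the hydrophone-derived spectral index: noisy and
            # roughly increasing with excitation power (toy model).
            return 10.0 * power + random.gauss(0.0, 0.4)

        setpoint, power, gain = 6.0, 0.30, 0.02
        for _ in range(200):  # one correction every 5 ms in the real device
            error = setpoint - cavitation_index(power)
            power = min(max(power + gain * error, 0.08), 1.09)  # W/cm^2 limits
        print(f"steady-state power ~ {power:.2f} W/cm^2")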

  10. Reproducing Actual Morphology of Planetary Lava Flows

    NASA Astrophysics Data System (ADS)

    Miyamoto, H.; Sasaki, S.

    1996-03-01

    Assuming that lava flows behave as non-isothermal laminar Bingham fluids, we developed a numerical code for lava flows. We take self-gravity effects and cooling mechanisms into account. The calculation method is a kind of cellular automaton using a reduced random space method, which can eliminate mesh-shape dependence. We can calculate large-scale lava flows precisely without numerical instability and reproduce the morphology of actual lava flows.
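
    For context, the standard Bingham constitutive law assumed by such models (stated generically, not copied from the paper):

        \dot{\gamma} = 0 \quad \text{for } \tau \le \tau_y, \qquad
        \tau = \tau_y + \mu_B\,\dot{\gamma} \quad \text{for } \tau > \tau_y

    where \tau_y is the yield stress and \mu_B the plastic viscosity; in cooling lava both typically increase, which is what eventually arrests the flow front.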

  11. Reproducibility of liquid oxygen impact test results

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1975-01-01

    Results for 12,000 impacts on a wide range of materials were studied to determine the reproducibility of the liquid oxygen impact test method. Standard deviations representing the overall variability of results were in close agreement with the expected values for a binomial process. This indicates that the major source of variability is the go/no-go nature of the test method and that variations due to sampling and testing operations were not significant.
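
    For reference, the binomial spread behind this conclusion: with n impacts and per-impact reaction probability p, the standard deviation of the observed number of reactions is

        \sigma = \sqrt{n\,p\,(1-p)}

    so, for example, a 20-impact series at p = 0.1 already carries \sigma \approx 1.3 reactions, a variability floor set by the go/no-go nature of the test itself.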

  12. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data don't always reference and cite their data sources adequately for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper describes some of the best practices in data identifiers, reference and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it explores issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It describes the difference between granule- and collection-level identifiers, using UUIDs and DOIs to illustrate recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could affect their usage of the data. By citing the data when publishing their research, authors enable others to retrieve the precise data used and to reproduce the analyses and experiments to confirm the results. Describing the experiment to a sufficient extent to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers depending on that research.
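
    A sketch of the granule-versus-collection distinction described above, using UUIDs for individual granules and a DOI-style string for the collection; all identifier values and field names are invented.

        import uuid

        # Collection-level identifier: one DOI for the whole versioned dataset.
        collection_doi = "10.5067/EXAMPLE/SENSOR_L2.V3"  # hypothetical DOI

        # Granule-level identifier: mint a UUID per product as it is created,
        # recording provenance so later users can trace and re-run the chain.
        granule = {
            "id": str(uuid.uuid4()),
            "collection": collection_doi,
            "inputs": ["<uuid-of-L1-input-granule>"],  # provenance chain
            "software": "processor v2.4.1",            # hypothetical version
        }
        print(granule["id"])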

  13. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting, viewpoint, and camera changes, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.
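
    A minimal sketch of segmentation-driven tissue classification with a support vector machine, assuming color-corrected per-region feature vectors and expert labels; the arrays are placeholders and scikit-learn is an assumed tool choice, not the authors'.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X_train = rng.random((120, 6))     # color features per region (toy)
        y_train = rng.integers(0, 3, 120)  # 0 granulation, 1 slough, 2 necrosis
        X_new = rng.random((10, 6))        # regions from a new wound image

        clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # RBF-kernel SVM
        clf.fit(X_train, y_train)
        print(clf.predict(X_new))          # predicted tissue class per region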

  14. The Vienna LTE simulators - Enabling reproducibility in wireless communications research

    NASA Astrophysics Data System (ADS)

    Mehlführer, Christian; Colom Ikuno, Josep; Šimko, Michal; Schwarz, Stefan; Wrulich, Martin; Rupp, Markus

    2011-12-01

    In this article, we introduce MATLAB-based link and system level simulation environments for UMTS Long-Term Evolution (LTE). The source codes of both simulators are available under an academic non-commercial use license, allowing researchers full access to standard-compliant simulation environments. Owing to the open source availability, the simulators enable reproducible research in wireless communications and comparison of novel algorithms. In this study, we explain how link and system level simulations are connected and show how the link level simulator serves as a reference to design the system level simulator. We compare the accuracy of the PHY modeling at system level by means of simulations performed both with bit-accurate link level simulations and PHY-model-based system level simulations. We highlight some of the currently most interesting research questions for LTE, and explain by some research examples how our simulators can be applied.

  15. A Framework for Reproducible Latent Fingerprint Enhancements

    PubMed Central

    Carasso, Alfred S.

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028
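
    A sketch of the smoothing step described here: damping Fourier modes with the kernel of a Lévy stable fractional diffusion equation, exp(-t|k|^alpha), and keeping each stage for the audit trail. The parameter values are illustrative, not Carasso's.

        import numpy as np

        def fractional_diffusion_smooth(img, t, alpha=1.0):
            # One "slow motion" step: multiply the 2-D FFT by exp(-t |k|^alpha);
            # alpha = 2 would reduce to ordinary Gaussian (heat-equation) blur.
            ky = np.fft.fftfreq(img.shape[0])[:, None]
            kx = np.fft.fftfreq(img.shape[1])[None, :]
            kernel = np.exp(-t * np.hypot(kx, ky) ** alpha)
            return np.real(np.fft.ifft2(np.fft.fft2(img) * kernel))

        img = np.random.rand(256, 256)  # stand-in for an enhanced latent print
        suite = [fractional_diffusion_smooth(img, t, alpha=0.8)
                 for t in (0.5, 1.0, 2.0, 4.0)]  # progressively smoother stages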

  16. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.

  17. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684

  20. A highly accurate ab initio potential energy surface for methane

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-01

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of 12CH4 reproduced with a root-mean-square error of 0.70 cm-1. The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement.
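
    For context, complete-basis-set extrapolation of correlation energies is commonly done (the specific form used in the paper is an assumption here) with a two-point inverse-cubic formula in the basis-set cardinal number X:

        E_{\mathrm{CBS}} \approx \frac{X^{3}E_{X} - (X-1)^{3}E_{X-1}}{X^{3} - (X-1)^{3}}

    which follows from assuming E_X = E_CBS + A X^{-3} and eliminating A between two consecutive basis sets.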

  1. A highly accurate ab initio potential energy surface for methane.

    PubMed

    Owens, Alec; Yurchenko, Sergei N; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-14

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of (12)CH4 reproduced with a root-mean-square error of 0.70 cm(-1). The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement. PMID:27634258

  3. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    for inclusion in standard atmospheric and planetary spectroscopic databases. The methods involved in computing the ab initio potential energy and dipole moment surfaces involved minor corrections to the equilibrium S-O distance, which produced good agreement with experimentally determined rotational energies. However, the purely ab initio method was not able to reproduce an equally spectroscopically accurate representation of vibrational motion. We therefore present an empirical refinement of this original ab initio potential surface, based on the available experimental data. This will not only be used to reproduce the room-temperature spectrum to a greater degree of accuracy, but is essential in the production of the larger, accurate line list necessary for the simulation of higher-temperature spectra: we aim for coverage suitable for T ≤ 800 K. Our preliminary studies on SO3 have also shown it to exhibit an interesting "forbidden" rotational spectrum and "clustering" of rotational states; to our knowledge this phenomenon has not been observed in other trigonal planar molecules and is an investigative avenue we wish to pursue. Finally, the IR absorption bands for SO2 and SO3 overlap strongly, and we intend to include SO2 as a complement to these studies in the near future.

  4. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm; Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  5. Nonlinear sequential laminates reproducing hollow sphere assemblages

    NASA Astrophysics Data System (ADS)

    Idiart, Martín I.

    2007-07-01

    A special class of nonlinear porous materials with isotropic 'sequentially laminated' microstructures is found to reproduce exactly the hydrostatic behavior of 'hollow sphere assemblages'. It is then argued that this result supports the conjecture that Gurson's approximate criterion for plastic porous materials, and its viscoplastic extension of Leblond et al. (1994), may actually yield rigorous upper bounds for the hydrostatic flow stress of porous materials containing an isotropic, but otherwise arbitrary, distribution of porosity. To cite this article: M.I. Idiart, C. R. Mecanique 335 (2007).
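
    For reference, Gurson's approximate yield criterion for a porous plastic solid with porosity f (standard form, quoted here for context):

        \Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_0}\right)^{2}
               + 2f\cosh\!\left(\frac{3\,\sigma_m}{2\,\sigma_0}\right) - 1 - f^{2} = 0

    where \sigma_{\mathrm{eq}} is the von Mises stress, \sigma_m the mean stress and \sigma_0 the matrix flow stress; the conjecture discussed above concerns the hydrostatic limit \sigma_{\mathrm{eq}} = 0.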

  6. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface for calling the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge of web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform for domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  7. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves: how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  8. Reproducibility and reliability of fetal cardiac time intervals using magnetocardiography.

    PubMed

    van Leeuwen, P; Lange, S; Klein, A; Geue, D; Zhang, Y; Krause, H J; Grönemeyer, D

    2004-04-01

    We investigated several factors which may affect the accuracy of fetal cardiac time intervals (CTI) determined in magnetocardiographic (MCG) recordings: observer differences, the number of available recording sites and the type of sensor used in acquisition. In 253 fetal MCG recordings, acquired using different biomagnetometer devices between the 15th and 42nd weeks of gestation, P-wave, QRS complex and T-wave onsets and ends were identified in signal averaged data sets independently by different observers. Using a defined procedure for setting signal events, interobserver reliability was high. Increasing the number of registration sites led to more accurate identification of the events. The differences in wave morphology between magnetometer and gradiometer configurations led to deviations in timing whereas the differences between low and high temperature devices seemed to be primarily due to noise. Signal-to-noise ratio played an important overall role in the accurate determination of CTI and changes in signal amplitude associated with fetal maturation may largely explain the effects of gestational age on reproducibility. As fetal CTI may be of value in the identification of pathologies such as intrauterine growth retardation or fetal cardiac hypertrophy, their reliable estimation will be enhanced by strategies which take these factors into account.

  9. Response to Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither is yet warranted.

  11. Reproducibility Data on SUMMiT

    SciTech Connect

    Irwin, Lloyd; Jakubczak, Jay; Limary, Siv; McBrayer, John; Montague, Stephen; Smith, James; Sniegowski, Jeffry; Stewart, Harold; de Boer, Maarten

    1999-07-16

    SUMMiT (Sandia Ultra-planar Multi-level MEMS Technology) at the Sandia National Laboratories' MDL (Microelectronics Development Laboratory) is a standardized MEMS (Microelectromechanical Systems) technology that allows designers to fabricate concept prototypes. This technology provides four polysilicon layers plus three sacrificial oxide layers (with the third oxide layer being planarized) to enable fabrication of complex mechanical systems-on-a-chip. Quantified reproducibility of the SUMMiT process is important for process engineers as well as designers. Summary statistics for critical MEMS technology parameters such as film thickness, line width, and sheet resistance are reported for the SUMMiT process. Additionally, data from Van der Pauw test structures are presented. Data on film thickness, film uniformity and critical dimensions of etched line widths are collected from both process and monitor wafers during manufacturing using film thickness metrology tools and SEM tools. A standardized diagnostic module is included in each SUMMiT run to obtain post-processing parametric data for monitoring run-to-run reproducibility, such as Van der Pauw structures for measuring sheet resistance. This characterization of the SUMMiT process enables design for manufacturability in the SUMMiT technology.
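
    A sketch of how sheet resistance is extracted from a Van der Pauw structure by solving the standard implicit relation numerically; the two measured cross-resistances below are invented.

        import numpy as np
        from scipy.optimize import brentq

        # Van der Pauw relation: exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1
        R_A, R_B = 10.2, 11.1  # ohms, two measured configurations (toy values)

        f = lambda Rs: (np.exp(-np.pi * R_A / Rs)
                        + np.exp(-np.pi * R_B / Rs) - 1.0)
        R_sheet = brentq(f, 1.0, 1000.0)  # bracket chosen to enclose the root
        print(f"sheet resistance ~ {R_sheet:.1f} ohms/sq")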

  12. Nonempirically Tuned Range-Separated DFT Accurately Predicts Both Fundamental and Excitation Gaps in DNA and RNA Nucleobases

    PubMed Central

    2012-01-01

    Using a nonempirically tuned range-separated DFT approach, we study both the quasiparticle properties (HOMO–LUMO fundamental gaps) and excitation energies of DNA and RNA nucleobases (adenine, thymine, cytosine, guanine, and uracil). Our calculations demonstrate that a physically motivated, first-principles tuned DFT approach accurately reproduces results from both experimental benchmarks and more computationally intensive techniques such as many-body GW theory. Furthermore, in the same set of nucleobases, we show that the nonempirical range-separated procedure also leads to significantly improved results for excitation energies compared to conventional DFT methods. The present results emphasize the importance of a nonempirically tuned range-separation approach for accurately predicting both fundamental and excitation gaps in DNA and RNA nucleobases. PMID:22904693
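
    For context, the standard range-separation construction behind such calculations (generic form, not details specific to this paper): the Coulomb operator is partitioned with the error function, and the range parameter omega is tuned nonempirically so that the HOMO eigenvalue matches the negative ionization potential,

        \frac{1}{r_{12}} = \frac{\operatorname{erfc}(\omega r_{12})}{r_{12}}
                         + \frac{\operatorname{erf}(\omega r_{12})}{r_{12}},
        \qquad \varepsilon_{\mathrm{HOMO}}(\omega) = -\,\mathrm{IP}(\omega)

    with the first (short-range) term typically treated semilocally and the second (long-range) term with exact exchange.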

  13. Effect of Population Heterogenization on the Reproducibility of Mouse Behavior: A Multi-Laboratory Study

    PubMed Central

    Richter, S. Helene; Garner, Joseph P.; Zipser, Benjamin; Lewejohann, Lars; Sachser, Norbert; Touma, Chadi; Schindler, Britta; Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter; van Stipdonk, Niek; van der Harst, Johanneke; Spruijt, Berry; Võikar, Vootele; Wolfer, David P.; Würbel, Hanno

    2011-01-01

    In animal experiments, animals, husbandry and test procedures are traditionally standardized to maximize test sensitivity and minimize animal use, assuming that this will also guarantee reproducibility. However, by reducing within-experiment variation, standardization may limit inference to the specific experimental conditions. Indeed, we have recently shown in mice that standardization may generate spurious results in behavioral tests, accounting for poor reproducibility, and that this can be avoided by population heterogenization through systematic variation of experimental conditions. Here, we examined whether a simple form of heterogenization effectively improves reproducibility of test results in a multi-laboratory situation. Each of six laboratories independently ordered 64 female mice of two inbred strains (C57BL/6NCrl, DBA/2NCrl) and examined them for strain differences in five commonly used behavioral tests under two different experimental designs. In the standardized design, experimental conditions were standardized as much as possible in each laboratory, while they were systematically varied with respect to the animals' test age and cage enrichment in the heterogenized design. Although heterogenization tended to improve reproducibility by increasing within-experiment variation relative to between-experiment variation, the effect was too weak to account for the large variation between laboratories. However, our findings confirm the potential of systematic heterogenization for improving reproducibility of animal experiments and highlight the need for effective and practicable heterogenization strategies. PMID:21305027

  14. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.
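
    A sketch of the agreement statistic used in this study: Cohen's kappa corrects the observed agreement between two grading occasions for the agreement expected by chance. The grade lists below are invented.

        from collections import Counter

        # Grannum grades (0-3) for 30 images on two occasions (toy data).
        occasion1 = [0,1,1,2,3,2,1,0,2,3,1,2,2,3,0,1,2,2,3,1,0,2,1,3,2,1,2,0,3,2]
        occasion2 = [0,1,2,2,3,2,1,1,2,3,1,1,2,3,0,1,2,3,3,1,0,2,2,3,2,1,2,0,2,2]

        n = len(occasion1)
        p_o = sum(a == b for a, b in zip(occasion1, occasion2)) / n  # observed
        c1, c2 = Counter(occasion1), Counter(occasion2)
        p_e = sum(c1[g] * c2[g] for g in set(c1) | set(c2)) / n**2   # by chance
        kappa = (p_o - p_e) / (1 - p_e)
        print(f"kappa = {kappa:.2f}")  # ~0.5 counts as moderate agreement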

  15. Datathons and Software to Promote Reproducible Research

    PubMed Central

    2016-01-01

    Background: Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. Objective: We report on Chatto, an open-source software suite created by members of our group in the context of the second international Critical Care Datathon, held in September 2015. Methods: Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Results: Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes, a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. Conclusions: This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility. PMID:27558834

  16. Selection on soil microbiomes reveals reproducible impacts on plant function.

    PubMed

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-04-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system with Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools.

  18. Selection on soil microbiomes reveals reproducible impacts on plant function

    PubMed Central

    Panke-Buisse, Kevin; Poole, Angela C; Goodrich, Julia K; Ley, Ruth E; Kao-Kniffin, Jenny

    2015-01-01

    Soil microorganisms found in the root zone impact plant growth and development, but the potential to harness these benefits is hampered by the sheer abundance and diversity of the players influencing desirable plant traits. Here, we report a high level of reproducibility of soil microbiomes in altering plant flowering time and soil functions when partnered within and between plant hosts. We used a multi-generation experimental system using Arabidopsis thaliana Col to select for soil microbiomes inducing earlier or later flowering times of their hosts. We then inoculated the selected microbiomes from the tenth generation of plantings into the soils of three additional A. thaliana genotypes (Ler, Be, RLD) and a related crucifer (Brassica rapa). With the exception of Ler, all other plant hosts showed a shift in flowering time corresponding with the inoculation of early- or late-flowering microbiomes. Analysis of the soil microbial community using 16S rRNA gene sequencing showed distinct microbiota profiles assembling by flowering time treatment. Plant hosts grown with the late-flowering-associated microbiomes showed consequent increases in inflorescence biomass for three A. thaliana genotypes and an increase in total biomass for B. rapa. The increase in biomass was correlated with two- to five-fold enhancement of microbial extracellular enzyme activities associated with nitrogen mineralization in soils. The reproducibility of the flowering phenotype across plant hosts suggests that microbiomes can be selected to modify plant traits and coordinate changes in soil resource pools. PMID:25350154

  19. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method to a water with surfactant and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two other water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone. PMID:9140874
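
    Whichever displacement medium is used, the underlying computation is the same: the displacement measurement yields the material volume, which is divided by the specimen's envelope volume. A minimal sketch of the water-based arithmetic follows (function and variable names are ours, for illustration only, not the authors' protocol):

        # Sketch of a volume-fraction calculation via Archimedes' principle.
        # Illustrative only; masses in grams, volumes in cm^3, density in g/cm^3.
        def archimedes_volume_fraction(dry_mass_g, submerged_mass_g,
                                       envelope_volume_cm3, fluid_density=0.9982):
            """Bone volume fraction = material volume / envelope volume.

            Buoyancy gives the volume of fluid displaced by the bone
            material itself: V = (m_dry - m_submerged) / rho_fluid.
            """
            material_volume = (dry_mass_g - submerged_mass_g) / fluid_density
            return material_volume / envelope_volume_cm3

        # Example: a 2.0 cm^3 cancellous specimen, 1.10 g dry, 0.65 g submerged
        print(archimedes_volume_fraction(1.10, 0.65, 2.0))  # ~0.225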

  1. Retention Projection Enables Accurate Calculation of Liquid Chromatographic Retention Times Across Labs and Methods

    PubMed Central

    Abate-Pella, Daniel; Freund, Dana M.; Ma, Yan; Simón-Manso, Yamil; Hollender, Juliane; Broeckling, Corey D.; Huhman, David V.; Krokhin, Oleg V.; Stoll, Dwight R.; Hegeman, Adrian D.; Kind, Tobias; Fiehn, Oliver; Schymanski, Emma L.; Prenni, Jessica E.; Sumner, Lloyd W.; Boswell, Paul G.

    2015-01-01

    Identification of small molecules by liquid chromatography-mass spectrometry (LC-MS) can be greatly improved if the chromatographic retention information is used along with mass spectral information to narrow down the lists of candidates. Linear retention indexing remains the standard for sharing retention data across labs, but it is unreliable because it cannot properly account for differences in the experimental conditions used by various labs, even when the differences are relatively small and unintentional. On the other hand, an approach called “retention projection” properly accounts for many intentional differences in experimental conditions, and when combined with a “back-calculation” methodology described recently, it also accounts for unintentional differences. In this study, the accuracy of this methodology is compared with linear retention indexing across eight different labs. When each lab ran a test mixture under a range of multi-segment gradients and flow rates they selected independently, retention projections averaged 22-fold more accurate for uncharged compounds because they properly accounted for these intentional differences, which were more pronounced in steep gradients. When each lab ran the test mixture under nominally the same conditions, which is the ideal situation to reproduce linear retention indices, retention projections still averaged 2-fold more accurate because they properly accounted for many unintentional differences between the LC systems. To the best of our knowledge, this is the most successful study to date aiming to calculate (or even just to reproduce) LC gradient retention across labs, and it is the only study in which retention was reliably calculated under various multi-segment gradients and flow rates chosen independently by labs. PMID:26292625
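
    For context, a linear retention index places an analyte on a scale defined by bracketing reference standards through simple interpolation of retention times; a common form is

        \[ I_x = 100\left[\, n + \frac{t_x - t_n}{t_{n+1} - t_n} \,\right] \]

    where t_x is the analyte's retention time and t_n, t_{n+1} are the retention times of the reference standards assigned indices n and n+1. The study's point is that this interpolation cannot absorb differences in gradient shape or flow rate between labs, whereas retention projection models the gradient explicitly.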

  2. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets.
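
    To make the idea concrete, a dataset entry in such an exchange format ties each descriptor value to an ontology identifier and a versioned software implementation, so the setup can be replayed exactly. The sketch below assembles a minimal record with Python's standard library; the element and attribute names are purely hypothetical, not the actual QSAR-ML schema:

        # Hypothetical, simplified illustration of an ontology-referenced,
        # versioned descriptor declaration; NOT the real QSAR-ML schema.
        import xml.etree.ElementTree as ET

        dataset = ET.Element("dataset")
        ET.SubElement(dataset, "structure", id="mol001",
                      inchi="InChI=1S/CH4/h1H4")
        descriptor = ET.SubElement(dataset, "descriptor",
                                   ontologyRef="descriptor-ontology:XLogP",
                                   implementation="cdk:XLogPDescriptor",
                                   version="1.4.19")
        value = ET.SubElement(descriptor, "value", structureRef="mol001")
        value.text = "0.32"
        print(ET.tostring(dataset, encoding="unicode"))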

  3. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
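
    The modification is simple to state algorithmically: every occupied site records its occupation time, and each perimeter site is chosen with probability weighted by the youth of its newest occupied neighbour, so growth persists in recently active directions. A minimal sketch of such a rule follows (the weighting form and parameter are our assumptions, not the paper's exact model):

        import random

        # Age-biased Eden growth on a square lattice (illustrative sketch).
        # occupied maps site -> occupation time; perimeter sites adjacent to
        # recently occupied sites are favoured ("persistence" in growth).
        def grow(steps=500, persistence=2.0):
            occupied = {(0, 0): 0}
            for t in range(1, steps + 1):
                perimeter = {}
                for (x, y), t_occ in occupied.items():
                    for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                        if nb not in occupied:
                            w = (1.0 / (t - t_occ)) ** persistence  # younger -> larger
                            perimeter[nb] = max(perimeter.get(nb, 0.0), w)
                sites = list(perimeter)
                weights = [perimeter[s] for s in sites]
                occupied[random.choices(sites, weights=weights)[0]] = t
            return occupied

        print(len(grow()), "cluster sites grown")  # 501 sites after 500 steps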

  4. Reproducing kernel Hilbert space based single infrared image super resolution

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang; Deng, Liangjian; Shen, Wei; Xi, Ning; Zhou, Zhanxin; Song, Bo; Yang, Yongliang; Cheng, Yu; Dong, Lixin

    2016-07-01

    The spatial resolution of infrared (IR) images is limited by lens optical diffraction, sensor array pitch size, and pixel dimension. In this work, a robust model is proposed to reconstruct a high resolution infrared image from a single low resolution sampling, where the image features are discussed and classified as reflective, cooled emissive, and uncooled emissive based on the infrared irradiation source. A spline-based reproducing kernel Hilbert space and an approximate Heaviside function are deployed to model the smooth part and the edge component of the image, respectively. By adjusting the parameters of the Heaviside function, the proposed model can enhance distinct parts of the image. The experimental results show that the model is applicable to both reflective and emissive low resolution infrared images and improves thermal contrast. The overall outcome is a high resolution IR image, which gives the IR camera better measurement accuracy and reveals more detail at long distance.
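
    The "approximate Heaviside function" is a smoothed step; one standard smooth surrogate, given here purely to illustrate the idea (not necessarily the paper's exact choice), is

        \[ \psi_\varepsilon(x) = \frac{1}{2} + \frac{1}{\pi}\arctan\!\left(\frac{x}{\varepsilon}\right) \]

    where \varepsilon controls edge sharpness: small \varepsilon approaches a hard step and yields crisper reconstructed edges, while larger \varepsilon gives smoother transitions.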

  5. Improving the batch-to-batch reproducibility of microbial cultures during recombinant protein production by regulation of the total carbon dioxide production.

    PubMed

    Jenzsch, Marco; Gnoth, Stefan; Kleinschmidt, Martin; Simutis, Rimvydas; Lübbert, Andreas

    2007-03-10

    Batch-to-batch reproducibility of fermentation processes performed during the manufacture of biologics can be increased by operating the cultures at feed rate profiles that are robust against typically arising disturbances. Remaining randomly appearing deviations from the desired path should be suppressed automatically by manipulating the feed rate. With respect to the cells' physiology, it is best to guide the cultivations along an optimal profile of the specific biomass growth rate mu(t). However, two problems call for further investigation. First, upon severe disturbances during the fermentation, the biomass concentration X may significantly deviate from its desired value; a fixed mu-profile then leads to diminished batch-to-batch reproducibility. Second, the specific growth rate cannot easily be estimated online to sufficiently high accuracy, hence it is difficult to determine the deviations of mu from the desired profile. The alternative discussed here solves both problems by keeping the process on the corresponding profile of total cumulative carbon dioxide production: it is robust against distortions in X, and the controlled variable can be measured accurately online during cultivations of all relevant sizes. Compared to the fermentation practice currently used in industry, the experimental results, presented for the example of recombinant protein production with Escherichia coli cells, show that CPR-based corrections lead to considerably improved batch-to-batch reproducibility.
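
    The control idea reduces to a simple feedback law: integrate the measured carbon dioxide production rate online, compare the running total with the precomputed setpoint profile, and correct the scheduled feed rate in proportion to the deviation. A bare-bones sketch (the gain, profiles, and names are assumptions for illustration, not the authors' controller):

        # Illustrative proportional correction of the feed rate from the
        # deviation of cumulative CO2 production (tCPR) from its setpoint.
        def corrected_feed_rate(t_h, feed_profile, tcpr_setpoint,
                                tcpr_measured, gain=0.05):
            """feed_profile, tcpr_setpoint: functions of time [h];
            tcpr_measured: online integral of the CO2 production rate [mol]."""
            error = tcpr_setpoint(t_h) - tcpr_measured
            return max(0.0, feed_profile(t_h) * (1.0 + gain * error))

        feed = lambda t: 0.2 * 1.25 ** t   # L/h, assumed exponential profile
        tcpr = lambda t: 0.8 * t ** 2      # mol, assumed setpoint profile
        # Culture lags its setpoint (19 vs 20 mol), so feeding is increased:
        print(corrected_feed_rate(5.0, feed, tcpr, tcpr_measured=19.0))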

  6. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed a Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit. More precisely, the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken in order to accurately reproduce the geometrical details of the gas chamber and the activity sources, each of which is different in shape and enclosed in a unique container. Both relative calibration factors and ionization currents obtained with simulations were compared against experimental measurements; further tests were carried out, such as the comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with discrepancies lower than 4% for all the tested parameters. This shows that accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The obtained Monte Carlo model establishes a powerful tool for the first-instance determination of new calibration factors for non-standard radionuclides or custom containers when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regard to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the container configuration.

  7. Data-Driven and Predefined ROI-Based Quantification of Long-Term Resting-State fMRI Reproducibility.

    PubMed

    Song, Xiaomu; Panych, Lawrence P; Chen, Nan-Kuei

    2016-03-01

    Resting-state functional magnetic resonance imaging (fMRI) is a promising tool for neuroscience and clinical studies. However, there exist significant variations in strength and spatial extent of resting-state functional connectivity over repeated sessions in a single or multiple subjects with identical experimental conditions. Reproducibility studies have been conducted for resting-state fMRI where the reproducibility was usually evaluated in predefined regions-of-interest (ROIs). It was possible that reproducibility measures strongly depended on the ROI definition. In this work, this issue was investigated by comparing data-driven and predefined ROI-based quantification of reproducibility. In the data-driven analysis, the reproducibility was quantified using functionally connected voxels detected by a support vector machine (SVM)-based technique. In the predefined ROI-based analysis, all voxels in the predefined ROIs were included when estimating the reproducibility. Experimental results show that (1) a moderate to substantial within-subject reproducibility and a reasonable between-subject reproducibility can be obtained using functionally connected voxels identified by the SVM-based technique; (2) in the predefined ROI-based analysis, an increase in ROI size does not always result in higher reproducibility measures; (3) ROI pairs with high connectivity strength have a higher chance to exhibit high reproducibility; (4) ROI pairs with high reproducibility do not necessarily have high connectivity strength; (5) the reproducibility measured from the identified functionally connected voxels is generally higher than that measured from all voxels in predefined ROIs with typical sizes. The findings (2) and (5) suggest that conventional ROI-based analyses would underestimate the resting-state fMRI reproducibility.

  8. Reproducibility of the cutoff probe for the measurement of electron density

    NASA Astrophysics Data System (ADS)

    Kim, D. W.; You, S. J.; Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H.; Yoon, J.-S.; Oh, W. Y.

    2016-06-01

    Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement is quite important in cutoff probe application research. To test this reproducibility, a comparative study among different cutoff probe measurements was performed. The comparison revealed a remarkable result: the cutoff probe has excellent reproducibility for electron density measurement, i.e., there is little difference among measurements made with different probes built by different experimenters. The reason for this result is discussed in the paper on the basis of the measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.

  9. Accurate and Efficient Resolution of Overlapping Isotopic Envelopes in Protein Tandem Mass Spectra

    PubMed Central

    Xiao, Kaijie; Yu, Fan; Fang, Houqin; Xue, Bingbing; Liu, Yan; Tian, Zhixin

    2015-01-01

    It has long been an analytical challenge to accurately and efficiently resolve extremely dense overlapping isotopic envelopes (OIEs) in protein tandem mass spectra to confidently identify proteins. Here, we report a computationally efficient method, called OIE_CARE, to resolve OIEs by calculating the relative deviation between the ideal and observed experimental abundance. In the OIE_CARE method, the ideal experimental abundance of a particular overlapping isotopic peak (OIP) is first calculated for all the OIEs sharing this OIP. The relative deviation (RD) of the overall observed experimental abundance of this OIP relative to the summed ideal value is then calculated. The final individual abundance of the OIP for each OIE is the individual ideal experimental abundance multiplied by 1 + RD. Initial studies were performed using higher-energy collisional dissociation tandem mass spectra on myoglobin (with direct infusion) and the intact E. coli proteome (with liquid chromatographic separation). Comprehensive data at the protein and proteome levels, high confidence and good reproducibility were achieved. The resolving method reported here can, in principle, be extended to resolve any envelope-type overlapping data for which the corresponding theoretical reference values are available. PMID:26439836
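
    The redistribution rule itself is compact enough to sketch directly: for one overlapping isotopic peak (OIP), each contributing envelope has an ideal abundance, the relative deviation (RD) of the observed abundance from the summed ideal value is computed, and every envelope's share is scaled by 1 + RD (variable names below are ours, not those of the OIE_CARE code):

        # Sketch of the relative-deviation redistribution described above.
        # ideal: ideal experimental abundances of this OIP, one per OIE
        # observed: overall observed experimental abundance of the shared OIP
        def resolve_overlapping_peak(ideal, observed):
            total_ideal = sum(ideal)
            rd = (observed - total_ideal) / total_ideal  # relative deviation
            return [a * (1.0 + rd) for a in ideal]       # per-OIE abundances

        # Two envelopes predicted at 600 and 300, but 1000 observed:
        print(resolve_overlapping_peak([600.0, 300.0], 1000.0))
        # -> [666.7, 333.3]; the shares now sum to the observed abundance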

  10. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189

  11. Laboratory 20-km cycle time trial reproducibility.

    PubMed

    Zavorsky, G S; Murias, J M; Gow, J; Kim, D J; Poulin-Harnois, C; Kubow, S; Lands, L C

    2007-09-01

    This study evaluated the reproducibility of laboratory-based 20-km time trials in well-trained versus recreational cyclists. Eighteen cyclists (age = 34 +/- 8 yrs; body mass index = 23.1 +/- 2.2 kg/m^2; VO2max = 4.19 +/- 0.65 L/min) completed three 20-km time trials over a month on a Velotron cycle ergometer. Average power output (PO) (W), speed, and heart rate (HR) were significantly lower in the first time trial compared to the second and third time trials. The coefficients of variation (CV) between the second and third trial of the top eight performers for average PO, time to completion, and speed were 1.2%, 0.6%, and 0.5%, respectively, compared to 4.8%, 2.0%, and 2.3% for the bottom ten. In addition, the average HR, VO2, and percentage of VO2max were similar between trials. This study demonstrated that (1) a familiarization session improves the reliability of the measurements (i.e., average PO, time to completion and speed), and (2) the CV was much smaller for the best performers.
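
    The reliability statistic used here, the coefficient of variation, is just the between-trial standard deviation divided by the mean, computed per rider over the repeated trials and then averaged across riders. A minimal version (numbers invented for illustration):

        from statistics import mean, stdev

        # Between-trial coefficient of variation (%), one list per rider.
        def cv_percent(trials):
            return 100.0 * stdev(trials) / mean(trials)

        # Two riders' 20-km completion times (min) in trials 2 and 3:
        riders = [[29.8, 30.1], [33.5, 34.9]]
        per_rider = [cv_percent(r) for r in riders]
        print([round(c, 1) for c in per_rider])  # [0.7, 2.9]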

  12. The reproducible radio outbursts of SS Cygni

    NASA Astrophysics Data System (ADS)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disc material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  13. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
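
    The agreement metric quoted here, the Dice coefficient, is twice the overlap of two segmentations divided by the sum of their sizes; identical outputs across operating systems would give 1.0. A one-function sketch over voxel index sets:

        # Dice coefficient between two segmentations of the same structure,
        # each given as the set of voxel indices labelled on one OS.
        def dice(a: set, b: set) -> float:
            return 2.0 * len(a & b) / (len(a) + len(b))

        os1 = {(10, 12, 8), (10, 13, 8), (11, 12, 8), (11, 13, 8)}
        os2 = {(10, 12, 8), (10, 13, 8), (11, 12, 8), (12, 12, 8)}
        print(dice(os1, os2))  # 0.75; values this low signal real divergence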

  14. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
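
    For reference, the RUSLE family estimates mean annual soil loss as a product of empirical factors; the classical form, on top of which the paper's extension adds an ensemble-based erosivity term and a stoniness correction, is

        \[ A = R \cdot K \cdot LS \cdot C \cdot P \]

    with A the annual soil loss (t ha^-1 yr^-1), R rainfall erosivity, K soil erodibility, LS the slope length and steepness factor, C the cover management factor, and P the support practice factor.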

  15. Modelling soil erosion at European scale: towards harmonization and reproducibility

    NASA Astrophysics Data System (ADS)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2014-04-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale. A new approach for modelling soil erosion at large spatial scale is here proposed. It is based on the joint use of low data demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available datasets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country level statistics of pre-existing European maps of soil erosion by water is also provided.

  16. Accurate comparison of antibody expression levels by reproducible transgene targeting in engineered recombination-competent CHO cells.

    PubMed

    Mayrhofer, Patrick; Kratzer, Bernhard; Sommeregger, Wolfgang; Steinfellner, Willibald; Reinhart, David; Mader, Alexander; Turan, Soeren; Qiao, Junhua; Bode, Juergen; Kunert, Renate

    2014-12-01

    Over the years, Chinese hamster ovary (CHO) cells have emerged as the major host for expressing biotherapeutic proteins. Traditional methods to generate high-producer cell lines rely on random integration(s) of the gene of interest but have thereby left the identification of bottlenecks as a challenging task. For comparison of different producer cell lines derived from various transfections, a system that provides control over transgene expression behavior is highly needed. This motivated us to develop a novel "DUKX-B11 F3/F" cell line to target different single-chain antibody fragments into the same chromosomal target site by recombinase-mediated cassette exchange (RMCE) using the flippase (FLP)/FLP recognition target (FRT) system. The RMCE-competent cell line contains a gfp reporter fused to a positive/negative selection system flanked by heterospecific FRT (F) variants under control of an external CMV promoter, constructed as "promoter trap". The expression stability and FLP accessibility of the tagged locus was demonstrated by successive rounds of RMCE. As a proof of concept, we performed RMCE using cassettes encoding two different anti-HIV single-chain Fc fragments, 3D6scFv-Fc and 2F5scFv-Fc. Both targeted integrations yielded homogenous cell populations with comparable intracellular product contents and messenger RNA (mRNA) levels but product related differences in specific productivities. These studies confirm the potential of the newly available "DUKX-B11 F3/F" cell line to guide different transgenes into identical transcriptional control regions by RMCE and thereby generate clones with comparable amounts of transgene mRNA. This new host is a prerequisite for cell biology studies of independent transfections and transgenes.

  17. Accurate Evaluation of Ion Conductivity of the Gramicidin A Channel Using a Polarizable Force Field without Any Corrections.

    PubMed

    Peng, Xiangda; Zhang, Yuebin; Chu, Huiying; Li, Yan; Zhang, Dinglin; Cao, Liaoran; Li, Guohui

    2016-06-14

    Classical molecular dynamics (MD) simulation of membrane proteins faces significant challenges in accurately reproducing and predicting experimental observables such as ion conductance and permeability, due to its inability to precisely describe the electronic interactions in heterogeneous systems. In this work, the free energy profiles of K(+) and Na(+) permeating through the gramicidin A channel are characterized by using the AMOEBA polarizable force field with a total sampling time of 1 μs. Our results indicate that by explicitly introducing the multipole terms and polarization into the electrostatic potentials, the permeation free energy barrier of K(+) through the gA channel is considerably reduced compared to the overestimated results obtained from the fixed-charge model. Moreover, the estimated maximum conductances, without any corrections, for both K(+) and Na(+) passing through the gA channel are much closer to the experimental results than those from any classical MD simulation, demonstrating the power of AMOEBA in investigating membrane proteins. PMID:27171823

  18. High Reproducibility of ELISPOT Counts from Nine Different Laboratories.

    PubMed

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem; BenHamouda, Nadine; Gottwein, Judith; Laxmanan, Sreenivas; Levine, Steven M; Loffredo, John T; McArdle, Stephanie; Neudoerfl, Christine; Roen, Diana; Silina, Karina; Welch, Mackenzie; Lehmann, Paul V

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable background intensities. Due to the subjective nature of judging maximal and minimal spot sizes, different investigators come up with different numbers. This study aims to determine whether statistics-based, automated size-gating can harmonize the number of spot counts calculated between different laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate, and an image file of the plate, were counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™, relying on subjective counting parameters set by the respective investigators, and (B) SmartCount™, an automated counting protocol by the ImmunoSpot® Software that uses statistics-based spot size auto-gating with spot intensity auto-thresholding. The average coefficient of variation (CV) for the mean values between independent laboratories was 26.7% when counting with Basic Count™, and 6.7% when counting with SmartCount™. Our data indicate that SmartCount™ allows harmonization of counting ELISPOT results between different laboratories and investigators. PMID:25585297

  19. Reproducibility and utility of dune luminescence chronologies

    NASA Astrophysics Data System (ADS)

    Leighton, Carly L.; Thomas, David S. G.; Bailey, Richard M.

    2014-02-01

    Optically stimulated luminescence (OSL) dating of dune deposits has increasingly been used as a tool to investigate the response of aeolian systems to environmental change. Amalgamation of individual dune accumulation chronologies has been employed in order to distinguish regional from local geomorphic responses to change. However, advances in dating have produced chronologies of increasing complexity. In particular, questions regarding the interpretation of dune ages have been raised, including how best to evaluate the significance of suites of OSL ages when local 'noisy' and discontinuous records are combined. In this paper, these issues are reviewed and the reproducibility of dune chronologies is assessed. OSL ages from two cores sampled from the same dune in the northeast Rub' al Khali, United Arab Emirates, are presented and compared, alongside an analysis of previously published dune ages dated to within the last 30 ka. Distinct periods of aeolian activity and preservation are identified, which can be tied to regional climatic and environmental changes. This case study is used to address fundamental questions that are persistently asked of dune dating studies, including the appropriate spatial scale over which to infer environmental and climatic change based on dune chronologies, whether chronological hiatuses can be interpreted, how to most appropriately combine and display datasets, and the relationship between geomorphic and palaeoclimatic signals. Chronological profiles reflect localised responses to environmental variability and climatic forcing, and amalgamation of datasets, with consideration of sampling resolution, is required; otherwise local factors are always likely to dominate. Using net accumulation rates to display ages may provide an informative approach for analysing and presenting dune OSL chronologies that is less susceptible to biases resulting from insufficient sampling resolution.

  20. Accurate calculation of chemical shifts in highly dynamic H2@C60 through an integrated quantum mechanics/molecular dynamics scheme.

    PubMed

    Jiménez-Osés, Gonzalo; García, José I; Corzana, Francisco; Elguero, José

    2011-05-20

    A new protocol combining classical MD simulations and DFT calculations is presented to accurately estimate the (1)H NMR chemical shifts of highly mobile guest-host systems and their thermal dependence. This strategy has been successfully applied to the hydrogen molecule trapped inside C(60) fullerene, an unresolved and challenging prototypical case for which experimental values have never been reproduced. The dependence of the final values on the theoretical method, and its implications for avoiding overinterpretation of the obtained results, are carefully described.

  1. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in the geosciences and received 136 responses from research institutions and universities in the Americas, Asia, Europe, and other parts of the world. The survey examined (1) the current state of research reproducibility in the geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on the issue. The results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors leading to unsuccessful replication attempts are insufficient detail in the instructions of published literature and inaccessibility of the data, code, and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in the geosciences. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  2. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings per inch (about 28 openings per centimeter)].

  3. The importance of accurate experimental data to marginal field development

    SciTech Connect

    Overa, S.J.; Lingelem, M.N.

    1997-12-31

    Since exploration started in the Norwegian North Sea in 1965, a total of 196 fields have been discovered. Less than one-third of these fields have been developed. The marginal fields cannot be developed economically with current technology, even though some of them have significant reserves. The total cost to develop one of the large installations is estimated to be 2--5 billion US dollars. Therefore, new technology is needed to lower the design and installation costs of each unit. The need for new physical property data is shown. The value of valid operating data from present units is also pointed out.

  4. A Mechanical System to Reproduce Cardiovascular Flows

    NASA Astrophysics Data System (ADS)

    Lindsey, Thomas; Valsecchi, Pietro

    2010-11-01

    Within the framework of the "Pumps&Pipes" collaboration between ExxonMobil Upstream Research Company and The DeBakey Heart and Vascular Center in Houston, a hydraulic control system was developed to accurately simulate general cardiovascular flows. The final goal of the development of the apparatus was the reproduction of the periodic flow of blood through the heart cavity with the capability of varying frequency and amplitude, as well as designing the systolic/diastolic volumetric profile over one period. The system consists of a computer-controlled linear actuator that drives hydraulic fluid in a closed loop to a secondary hydraulic cylinder. The test section of the apparatus is located inside a MRI machine, and the closed loop serves to physically separate all metal moving parts (control system and actuator cylinder) from the MRI-compatible pieces. The secondary cylinder is composed of nonmetallic elements and directly drives the test section circulatory flow loop. The circulatory loop consists of nonmetallic parts and several types of Newtonian and non-Newtonian fluids, which model the behavior of blood. This design allows for a periodic flow of blood-like fluid pushed through a modeled heart cavity capable of replicating any healthy heart condition as well as simulating anomalous conditions. The behavior of the flow inside the heart can thus be visualized by MRI techniques.

  5. Fast and Accurate Exhaled Breath Ammonia Measurement

    PubMed Central

    Solga, Steven F.; Mudalel, Matthew L.; Spacek, Lisa A.; Risby, Terence H.

    2014-01-01

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations. PMID:24962141

  6. Automated curve matching techniques for reproducible, high-resolution palaeomagnetic dating

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Channell, James

    2016-04-01

    High-resolution relative palaeointensity (RPI) and palaeosecular variation (PSV) data are increasingly important for accurate dating of sedimentary sequences, often in combination with oxygen isotope (δ18O) measurements. A chronology is established by matching a measured downcore signal to a dated reference curve, but there is no standard methodology for performing this correlation. Traditionally, matching is done by eye, but this becomes difficult when two parameters (e.g. RPI and δ18O) are being matched simultaneously, and cannot be done entirely objectively or repeatably. More recently, various automated techniques have appeared for matching one or more signals. We present Scoter, a user-friendly program for dating by signal matching and for comparing different matching techniques. Scoter is a cross-platform application implemented in Python, and consists of a general-purpose signal processing and correlation library linked to a graphical desktop front-end. RPI, PSV, and other records can be opened, pre-processed, and automatically matched with reference curves. A Scoter project can be exported as a self-contained bundle, encapsulating the input data, pre-processing steps, and correlation parameters, as well as the program itself. The analysis can be automatically replicated by anyone using only the resources in the bundle, ensuring full reproducibility. The current version of Scoter incorporates an experimental signal-matching algorithm based on simulated annealing, as well as an interface to the well-established Match program of Lisiecki and Lisiecki (2002), enabling results of the two approaches to be compared directly.

  7. Exploring predictive and reproducible modeling with the single-subject FIAC dataset.

    PubMed

    Chen, Xu; Pereira, Francisco; Lee, Wayne; Strother, Stephen; Mitchell, Tom

    2006-05-01

    Predictive modeling of functional magnetic resonance imaging (fMRI) has the potential to expand the amount of information extracted and to enhance our understanding of brain systems by predicting brain states, rather than emphasizing the standard spatial mapping. Based on the block datasets of Functional Imaging Analysis Contest (FIAC) Subject 3, we demonstrate the potential and pitfalls of predictive modeling in fMRI analysis by investigating the performance of five models (linear discriminant analysis, logistic regression, linear support vector machine, Gaussian naive Bayes, and a variant) as a function of preprocessing steps and feature selection methods. We found that: (1) independent of the model, temporal detrending and feature selection assisted in building a more accurate predictive model; (2) the linear support vector machine and logistic regression often performed better than either of the Gaussian naive Bayes models in terms of the optimal prediction accuracy; and (3) the optimal prediction accuracy obtained in a feature space using principal components was typically lower than that obtained in a voxel space, given the same model and same preprocessing. We show that due to the existence of artifacts from different sources, high prediction accuracy alone does not guarantee that a classifier is learning a pattern of brain activity that might be usefully visualized, although cross-validation methods do provide fairly unbiased estimates of true prediction accuracy. The trade-off between the prediction accuracy and the reproducibility of the spatial pattern should be carefully considered in predictive modeling of fMRI. We suggest that unless the experimental goal is brain-state classification of new scans on well-defined spatial features, prediction alone should not be used as an optimization procedure in fMRI data analysis.

  8. Profitable capitation requires accurate costing.

    PubMed

    West, D A; Hicks, L L; Balas, E A; West, T D

    1996-01-01

    In the name of costing accuracy, nurses are asked to track inventory use on a per-treatment basis, while more significant costs, such as general overhead and nursing salaries, are usually allocated to patients or treatments on an average cost basis. Accurate treatment costing and financial viability require analysis of all resources actually consumed in treatment delivery, including nursing services and inventory. More precise costing information enables more profitable decisions, as is demonstrated by comparing the ratio-of-cost-to-treatment method (aggregate costing) with alternative activity-based costing (ABC) methods. Nurses must participate in this costing process to assure that capitation bids are based upon accurate costs rather than simple averages. PMID:8788799
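
    The aggregate-versus-ABC contrast is easiest to see numerically: ratio-of-cost methods spread pooled costs evenly, while activity-based costing traces each resource to the treatments that actually consume it. A toy comparison (all figures invented for illustration):

        # Toy contrast of aggregate vs. activity-based treatment costing.
        treatments = {"simple": {"nurse_min": 15, "supplies": 8},
                      "complex": {"nurse_min": 90, "supplies": 55}}
        nurse_rate = 0.75        # $ per nursing minute (assumed)
        overhead_total = 400.0   # $ pooled overhead for these treatments

        # Aggregate: pool every cost, then split evenly per treatment.
        total = overhead_total + sum(t["nurse_min"] * nurse_rate + t["supplies"]
                                     for t in treatments.values())
        print("aggregate:", round(total / len(treatments), 2))  # 270.88 each

        # ABC: trace nursing time and supplies; allocate overhead by minutes.
        minutes = sum(t["nurse_min"] for t in treatments.values())
        for name, t in treatments.items():
            cost = (t["nurse_min"] * nurse_rate + t["supplies"]
                    + overhead_total * t["nurse_min"] / minutes)
            print("abc:", name, round(cost, 2))  # simple 76.39, complex 465.36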

  9. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    NASA Astrophysics Data System (ADS)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights both in the river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regards to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  10. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy.

    PubMed

    Brinkmann, Benjamin H; Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C; Chen, Min; Tieng, Quang M; He, Jialune; Muñoz-Almaraz, F J; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E; Litt, Brian; Worrell, Gregory A

    2016-06-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article. Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting areas under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human epilepsy.
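
    The contest metric, area under the ROC curve, can be reproduced from any participant's submitted preictal probabilities against the hidden clip labels; a minimal sketch with scikit-learn (all values invented):

        # AUC scoring of interictal (0) vs. preictal (1) clip classifications,
        # as on the contest leader board; labels and scores are made up.
        from sklearn.metrics import roc_auc_score

        true_labels   = [0, 0, 0, 1, 1, 0, 1, 0, 1, 1]
        preictal_prob = [0.1, 0.3, 0.2, 0.8, 0.4, 0.6, 0.9, 0.2, 0.7, 0.5]
        print(roc_auc_score(true_labels, preictal_prob))  # 0.92 here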

  11. Reproducibility of urinary phthalate metabolites in first morning urine samples.

    PubMed Central

    Hoppin, Jane A; Brock, John W; Davis, Barbara J; Baird, Donna D

    2002-01-01

    Phthalates are ubiquitous in our modern environment because of their use in plastics and cosmetic products. Phthalate monoesters--primarily monoethylhexyl phthalate and monobutyl phthalate--are reproductive and developmental toxicants in animals. Accurate measures of phthalate exposure are needed to assess their human health effects. Phthalate monoesters have a biologic half-life of approximately 12 hr, and little is known about the temporal variability and daily reproducibility of urinary measures in humans. To explore these aspects, we measured seven phthalate monoesters and creatinine concentration in two consecutive first-morning urine specimens from 46 African-American women, ages 35-49 years, residing in the Washington, DC, area in 1996-1997. We measured phthalate monoesters using high-pressure liquid chromatography followed by tandem mass spectrometry on a triple quadrupole instrument using atmospheric pressure chemical ionization. We detected four phthalate monoesters in all subjects, with median levels of 31 ng/mL for monobenzyl phthalate (mBzP), 53 ng/mL for monobutyl phthalate (mBP), 211 ng/mL for monoethyl phthalate (mEP), and 7.3 ng/mL for monoethylhexyl phthalate (mEHP). These were similar to concentrations reported for other populations using spot urine specimens. Phthalate levels did not differ between the two sampling days. The Pearson correlation coefficient between the concentrations on the 2 days was 0.8 for mBP, 0.7 for mEHP, 0.6 for mEP, and 0.5 for mBzP. These results suggest that even with the short half-lives of phthalates, women's patterns of exposure may be sufficiently stable to assign an exposure level based on a single first morning void urine measurement. PMID:12003755
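
    The between-day stability quoted above is an ordinary Pearson correlation across subjects; a toy illustration of the computation (concentrations hypothetical, in ng/mL; strongly skewed analytes are often log-transformed first):

```python
import numpy as np

day1 = np.array([31.0, 55.0, 210.0, 7.5, 48.0, 120.0])   # hypothetical subject values
day2 = np.array([28.0, 61.0, 190.0, 9.0, 40.0, 135.0])   # same subjects, next morning

r = np.corrcoef(np.log(day1), np.log(day2))[0, 1]   # Pearson r on the log scale
print(f"between-day r = {r:.2f}")
```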

  12. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and the results broadcast on a public leaderboard. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting areas under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and human epilepsy.

  14. Enhancing reproducibility of ultrasonic measurements by new users

    NASA Astrophysics Data System (ADS)

    Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee

    2013-03-01

    Operator perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and to identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1 ± 1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; (3) automated measurements were more precise, accurate and less sensitive to different presets than manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.

  15. Accurate documentation and wound measurement.

    PubMed

    Hampton, Sylvie

    This article, part 4 in a series on wound management, addresses the sometimes routine yet crucial task of documentation. Clear and accurate records of a wound enable its progress to be determined so the appropriate treatment can be applied. Thorough records mean any practitioner picking up a patient's notes will know when the wound was last checked, how it looked and what dressing and/or treatment was applied, ensuring continuity of care. Documenting every assessment also has legal implications, demonstrating due consideration and care of the patient and the rationale for any treatment carried out. Part 5 in the series discusses wound dressing characteristics and selection.

  16. Reproducibility study of TLD-100 micro-cubes at radiotherapy dose level.

    PubMed

    da Rosa, L A; Regulla, D F; Fill, U A

    1999-03-01

    The precision of the thermoluminescent response of Harshaw micro-cube dosimeters (TLD-100), evaluated in both the Harshaw 5500 and 3500 thermoluminescent readers at a dose of 1 Gy, was investigated. The mean reproducibility for micro-cubes pre-readout annealed at 100 degrees C for 15 min and evaluated with the manual planchet reader 3500 is 0.61% (1 standard deviation). When micro-cubes are evaluated with the automated hot-gas reader 5500, reproducibility is markedly worse, the mean for numerically stabilised dosimeters being 3.27% (1 standard deviation). These results indicate that the reader model 5500, or at least the instrument used for the present measurements, is not adequate for micro-cube evaluation if precise and accurate dosimetry is required. The difference in precision is apparently due to geometry inconsistencies in the orientation of the imperfect micro-cube faces during readout; the manual reader allows a careful, reproducible arrangement of the selected micro-cube faces in contact with the planchet.

  17. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  18. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  19. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  20. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  1. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3... APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type of piping joint used shall be suitable for the design conditions and shall be selected with consideration...

  2. 10 CFR 95.43 - Authority to reproduce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Authority to reproduce. 95.43 Section 95.43 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.43 Authority to reproduce. (a) Each...

  3. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  4. SPLASH: Accurate OH maser positions

    NASA Astrophysics Data System (ADS)

    Walsh, Andrew; Gomez, Jose F.; Jones, Paul; Cunningham, Maria; Green, James; Dawson, Joanne; Ellingsen, Simon; Breen, Shari; Imai, Hiroshi; Lowe, Vicki; Jones, Courtney

    2013-10-01

    The hydroxyl (OH) 18 cm lines are powerful and versatile probes of diffuse molecular gas that may trace a largely unstudied component of the Galactic ISM. SPLASH (the Southern Parkes Large Area Survey in Hydroxyl) is a large, unbiased and fully-sampled survey of OH emission, absorption and masers in the Galactic Plane that will achieve sensitivities an order of magnitude better than previous work. In this proposal, we request ATCA time to follow up OH maser candidates. This will give us accurate (~10") positions for the masers, which can be compared to other maser positions from HOPS, MMB and MALT-45, and will provide full polarisation measurements towards a sample of OH masers that have not been observed in MAGMO.

  5. Accurate thickness measurement of graphene

    NASA Astrophysics Data System (ADS)

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  6. Accurate thickness measurement of graphene.

    PubMed

    Shearer, Cameron J; Slattery, Ashley D; Stapleton, Andrew J; Shapter, Joseph G; Gibson, Christopher T

    2016-03-29

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  7. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  8. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or better accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and to develop ways of applying the phenomenon of Fresnel diffraction to micron-accurate measurement. This report discusses past research on the phenomenon and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research on extending the effective range of Fresnel diffraction systems, and the equipment this requires, is also described.
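
    For orientation, the distance dependence such a system exploits is visible in the standard scalar-diffraction result for a circular aperture under plane-wave illumination (a textbook identity, not a result from this report):

```latex
% On-axis intensity a distance z behind a circular aperture of radius a,
% illuminated by a plane wave of wavelength \lambda and intensity I_0:
I(z) = 4 I_0 \sin^{2}\!\left( \frac{\pi a^{2}}{2 \lambda z} \right),
\qquad N_F = \frac{a^{2}}{\lambda z}.
% Maxima occur at odd Fresnel numbers N_F and nulls at even ones, so
% tracking the fringe count as z varies constrains the absolute distance.
```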

  9. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  10. An Effective and Reproducible Model of Ventricular Fibrillation in Crossbred Yorkshire Swine (Sus scrofa) for Use in Physiologic Research

    PubMed Central

    Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C

    2015-01-01

    Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF. PMID:26473349

  11. Accurate Cross Sections for Microanalysis

    PubMed Central

    Rez, Peter

    2002-01-01

    To calculate the intensity of x-ray emission in electron beam microanalysis requires a knowledge of the energy distribution of the electrons in the solid, the energy variation of the ionization cross section of the relevant subshell, the fraction of ionization events producing the x rays of interest, and the absorption coefficient of the x rays on the path to the detector. The theoretical predictions and experimental data available for ionization cross sections are limited mainly to the K shells of a few elements. Results of systematic plane-wave Born approximation calculations, with exchange, for K, L, and M shell ionization cross sections over the range of electron energies used in microanalysis are presented. Comparisons are made with experimental measurements for selected K shells, and it is shown that the plane-wave theory is not appropriate for overvoltages less than 2.5. PMID:27446747
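
    At sufficiently high overvoltage, such Born-approximation results reduce to the familiar Bethe form, shown here for orientation (the shell-dependent constants b_s and c_s are fitted, and none of the values are from this paper):

```latex
% Bethe ionization cross section of shell s with n_s electrons and binding
% energy E_s, at overvoltage U = E_0 / E_s (E_0 = beam energy; b_s, c_s fitted):
\sigma_s = \frac{\pi e^{4} n_s b_s}{E_s^{2}} \, \frac{\ln(c_s U)}{U}.
% The \ln(U)/U shape also hints at why a plane-wave treatment degrades
% at low overvoltage (U < 2.5), as reported above.
```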

  12. Easy and accurate calculation of programmed temperature gas chromatographic retention times by back-calculation of temperature and hold-up time profiles.

    PubMed

    Boswell, Paul G; Carr, Peter W; Cohen, Jerry D; Hegeman, Adrian D

    2012-11-01

    Linear retention indices are commonly used to identify compounds in programmed-temperature gas chromatography (GC), but they are unreliable unless the original experimental conditions used to measure them are stringently reproduced. Differences in many experimental conditions may, however, be properly taken into account by calculating the programmed-temperature retention times of compounds from their measured isothermal retention vs. temperature relationships, an approach we call "retention projection". Until now, retention projection has been impractical because it required very precise, meticulous measurement of the temperature vs. time and hold-up time vs. temperature profiles actually produced by a specific GC instrument. Here we present a new, easy-to-use methodology to measure those profiles precisely: we spike a sample with 25 n-alkanes and use their measured programmed-temperature retention times to back-calculate what the instrument profiles must have been. When we then used those back-calculated profiles to project the retention times of 63 chemically diverse compounds, we found the projections to be extremely accurate (e.g. to ±0.9 s in a 40 min ramp). They remained accurate with different temperature programs, GC instruments, inlet pressures, flow rates, and with columns taken from different batches of stationary phase, whereas the accuracy of retention indices worsened the more the experimental conditions were changed from those originally used to measure them. We also developed new, open-source software (http://www.retentionprediction.org/gc) to demonstrate the system.
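
    The projection step itself is a simple quadrature: a compound elutes when its accumulated fractional migration reaches one. A minimal sketch under an assumed isothermal retention model and a linear ramp (not the authors' software, and ignoring the hold-up profile they back-calculate separately):

```python
import numpy as np

def isothermal_tr(T):
    """Hypothetical isothermal retention time (s) vs. temperature T (K),
    e.g. from fitting ln t_r = A + B/T to measured isothermal data."""
    A, B = -14.0, 7000.0          # illustrative fit coefficients
    return np.exp(A + B / T)

def project_retention(T_of_t, dt=0.01, t_max=3600.0):
    """Integrate d(migration)/dt = 1 / t_r(T(t)) until it reaches 1."""
    migrated, t = 0.0, 0.0
    while migrated < 1.0 and t < t_max:
        migrated += dt / isothermal_tr(T_of_t(t))
        t += dt
    return t

ramp = lambda t: 323.15 + (10.0 / 60.0) * t   # 50 degC start, 10 degC/min
print(f"projected retention time: {project_retention(ramp):.1f} s")
```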

  13. Using the mouse to model human disease: increasing validity and reproducibility

    PubMed Central

    Justice, Monica J.; Dhillon, Paraminder

    2016-01-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings. PMID:26839397

  14. Reproducible and controllable induction voltage adder for scaled beam experiments

    NASA Astrophysics Data System (ADS)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  15. Reproducible and controllable induction voltage adder for scaled beam experiments.

    PubMed

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments. PMID:27587112

  16. Sympathetic neural reactivity to mental stress in humans: test-retest reproducibility.

    PubMed

    Fonkoue, Ida T; Carter, Jason R

    2015-12-01

    Mental stress consistently increases arterial blood pressure, but this reliable pressor response is often associated with highly variable muscle sympathetic nerve activity (MSNA) responsiveness between individuals. Although MSNA has been shown to be reproducible within individuals at rest and during the cold pressor test (CPT), intraindividual reproducibility of MSNA responsiveness to mental stress has not been adequately explored. The purpose of this study was to examine MSNA reactivity to mental stress across three experimental sessions. Sixteen men and women (age 21 ± 1 yr) performed two experimental sessions within a single laboratory visit and a third experimental session 1 mo later. Each experimental session consisted of a mental stress trial via mental arithmetic and a CPT trial. Blood pressure, heart rate (HR), and MSNA were measured, and the consistencies of these variables were determined using intraclass correlation (Cronbach's α coefficient). MSNA, mean arterial pressure (MAP), and HR were highly reproducible across the baselines preceding mental stress (Cronbach's α ≥ 0.816, P ≤ 0.001) and CPT (Cronbach's α ≥ 0.782, P ≤ 0.001). Across the three mental stress trials, changes in MSNA (Cronbach's α = 0.875; P = 0.001), MAP (Cronbach's α = 0.749; P < 0.001), and HR (Cronbach's α = 0.919; P < 0.001) were reproducible. During CPT, changes in MSNA (Cronbach's α = 0.805; P = 0.008), MAP (Cronbach's α = 0.878; P < 0.001), and HR (Cronbach's α = 0.927; P < 0.001) remained consistent across the three sessions. In conclusion, our findings demonstrate that MSNA reactivity to mental stress is consistent within a single laboratory visit and across laboratory sessions conducted on separate days. PMID:26400186
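
    The consistency statistic used above, Cronbach's α, treats each session as an "item" and each subject as an observation; a small sketch of the computation on hypothetical MSNA responses:

```python
import numpy as np

def cronbach_alpha(X):
    """X: subjects x sessions matrix. alpha = k/(k-1) * (1 - sum of
    per-session variances / variance of the per-subject totals)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    session_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - session_var / total_var)

# hypothetical delta-MSNA (bursts/min) for 5 subjects across 3 sessions
X = [[8, 9, 7], [3, 4, 4], [12, 11, 13], [6, 5, 6], [9, 10, 9]]
print(f"alpha = {cronbach_alpha(X):.3f}")   # values near 1 = highly consistent
```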

  17. Engineering preliminaries to obtain reproducible mixtures of atelocollagen and polysaccharides.

    PubMed

    Lefter, Cristina-Mihaela; Maier, Stelian Sergiu; Maier, Vasilica; Popa, Marcel; Desbrieres, Jacques

    2013-05-01

    The critical stage in producing blends of biomacromolecules is the mixing of component solutions to generate homogeneous diluted colloidal systems. Simple experimental investigations allow the establishment of design rules for recipes and procedures for preparing homogeneous and compositionally reproducible mixtures. Starting from purified solutions of atelocollagen, hyaluronan and native gellan, with as low an inorganic salt content as possible, initial binary and ternary mixtures can be prepared up to a total dry matter content of 0.150 g/dL under non-co-precipitating conditions. Two ways of manipulating pH are feasible for homogeneous mixing: (i) unbuffered prior correction to pH 5.5, and (ii) "rigid" buffering at pH 9.0 using organic species. Atelocollagen-including co-precipitates can be obtained in the presence of one or both polysaccharides, preferably in pH domains far from the isoelectric point of the scleroprotein. A critical behavior was observed in mixtures containing gellan, owing to its macromolecular dissimilarity to atelocollagen. In optimal binary mixtures, the coordinates of the threshold points on the phase diagrams are 0.028% w/w atelocollagen/0.025% w/w hyaluronan, and 0.022% w/w atelocollagen/0.020% w/w gellan. Uni- or biphasic ternary systems with equilibrated ratios of co-precipitated components can be prepared starting from initial mixtures containing up to 0.032 g/dL atelocollagen, associated with, for example, 0.040 g/dL hyaluronan and 0.008 g/dL gellan, following the first pH-manipulation route.

  18. Is the Sciatic Functional Index always reliable and reproducible?

    PubMed

    Monte-Raso, Vanessa Vilela; Barbieri, Cláudio Henrique; Mazzer, Nilton; Yamasita, Alexandre Calura; Barbieri, Giuliano

    2008-05-30

    The Sciatic Functional Index (SFI) is a quite useful tool for evaluating functional recovery of the sciatic nerve of rats across a number of experimental injuries and treatments. Although it is an objective method, it depends on the examiner's ability to adequately recognize and mark the previously established footprint key points, an entirely subjective step that can interfere with the calculations prescribed by the mathematical formulae proposed by different authors. An interpersonal evaluation of the reproducibility of a computer-aided SFI method was therefore carried out here to study data variability. A severe crush injury was produced on a 5 mm-long segment of the right sciatic nerve of 20 Wistar rats (a 5000 g load applied directly for 10 min) and the SFI was measured by four different examiners (one experienced and three newcomers) preoperatively and at weekly intervals from the 1st to the 8th postoperative week. Three measurements were made for each print and the average was calculated and used for statistical analysis. The results showed that interpersonal correlation was high (0.82) in the 3rd, 4th, 5th, 7th and 8th weeks, with an unexpected but significant (p<0.01) drop in the 6th week. There was virtually no interpersonal correlation (correlation index close to 0) in the 1st and 2nd weeks, a period during which the variability between animals and between examiners (p=0.24 and 0.32, respectively) was similar, most likely owing to poor definition of the footprints. The authors conclude that the SFI method studied here is reliable only from the 3rd week onward after a severe lesion of the sciatic nerve of rats.
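
    For reference, the most widely used of those formulae is the Bain-Mackinnon-Hunter index, computed from paired experimental (E) and normal (N) footprint measurements; a direct transcription (values illustrative; this may not be the exact variant used in this study):

```python
def sfi(epl, npl, ets, nts, eit, nit):
    """Bain-Mackinnon-Hunter Sciatic Functional Index.
    PL = print length, TS = 1st-5th toe spread, IT = 2nd-4th toe spread;
    E = experimental (injured) side, N = normal side.
    ~0 = normal function, ~-100 = complete loss of function."""
    return (-38.3 * (epl - npl) / npl
            + 109.5 * (ets - nts) / nts
            + 13.3 * (eit - nit) / nit
            - 8.8)

# example: longer print and narrower toe spread on the injured side (mm)
print(round(sfi(epl=32.0, npl=22.0, ets=11.0, nts=20.0, eit=6.0, nit=10.0), 1))  # -80.8
```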

  19. 15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. REPRODUCED FROM 'GRIST WIND MILLS AT EAST HAMPTON,' PICTURESQUE AMERICA NEW YORK, 1872. THE HOOD WINDMILL IS IN THE FOREGROUND AND THE PANTIGO WINDMILL IS IN THE BACKGROUND - Pantigo Windmill, James Lane, East Hampton, Suffolk County, NY

  20. 8. Historic American Buildings Survey Reproduced from the collections of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Historic American Buildings Survey Reproduced from the collections of the Library of Congress, Accession No. 45041 Geographical File ('Nantucket, Mass.') Division of Prints and Photographs c. 1880 - Jethro Coffin House, Sunset Hill, Nantucket, Nantucket County, MA

  1. 223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    223. FREQUENTLY REPRODUCED VIEW OF GWMP SHOWING VARIABLE WIDTH MEDIANS WITH INDEPENDENT ALIGNMENTS FROM KEY BRIDGE LOOKING NORTHWEST, 1953. - George Washington Memorial Parkway, Along Potomac River from McLean to Mount Vernon, VA, Mount Vernon, Fairfax County, VA

  2. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  3. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-01

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  4. Toward more accurate loss tangent measurements in reentrant cavities

    SciTech Connect

    Moyer, R. D.

    1980-05-01

    Karpova has described an absolute method for measurement of dielectric properties of a solid in a coaxial reentrant cavity. His cavity resonance equation yields very accurate results for dielectric constants. However, he presented only approximate expressions for the loss tangent. This report presents more exact expressions for that quantity and summarizes some experimental results.

  5. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded, the former case being pertinent to climate simulation and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also fluctuates markedly in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate are also elaborated.
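
    The reproducibility measure described, correlation across paired realizations with the seasonal cycle included or excluded, is easy to illustrate; in this synthetic sketch (random data standing in for model output) a shared cycle drives high reproducibility while the anomalies decorrelate:

```python
import numpy as np

rng = np.random.default_rng(0)
months = 120                                        # one decadal monthly series
cycle = 10.0 * np.sin(2 * np.pi * np.arange(months) / 12)

# paired realizations = shared seasonal cycle + independent internal noise
run_a = cycle + rng.normal(0, 3, months)
run_b = cycle + rng.normal(0, 3, months)

corr = lambda x, y: np.corrcoef(x, y)[0, 1]
print(f"cycle included: r = {corr(run_a, run_b):.2f}")                  # high
print(f"anomalies only: r = {corr(run_a - cycle, run_b - cycle):.2f}")  # near 0
# with real model output the cycle would be removed via the monthly climatology
```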

  6. Triploid planarian reproduces truly bisexually with euploid gametes produced through a different meiotic system between sex.

    PubMed

    Chinone, Ayako; Nodono, Hanae; Matsumoto, Midori

    2014-06-01

    Although polyploids are common among plants and some animals, polyploidization often causes reproductive failure. Triploids, in particular, suffer problems of chromosomal pairing and segregation during meiosis, which may produce aneuploid gametes and result in sterility; they are therefore generally considered to reproduce only asexually. In the case of the platyhelminth Dugesia ryukyuensis, populations with triploid karyotypes are normally found in nature as both fissiparous and oviparous triploids. Fissiparous triploids can also be experimentally sexualized if they are fed sexual planarians, developing both gonads and other reproductive organs. Fully sexualized worms begin reproducing by copulation rather than fission. In this study, we examined the genotypes of the offspring obtained by breeding sexualized triploids and found that the offspring inherited genes from both parents, i.e., they reproduced truly bisexually. Furthermore, meiotic chromosome behavior in sexualized triploid planarians differed significantly between the male and female germ lines: female germ line cells remained triploid until prophase I, whereas male germ line cells appeared to become diploid before entry into meiosis. Oocytes at the late diplotene stage contained not only paired bivalents but also unpaired univalents, which were suggested to produce diploid eggs if they persisted through subsequent stages. Triploid planarians may therefore form euploid gametes through different meiotic systems in the female and male germ lines, and are thus able to reproduce sexually, in contrast to many other triploid organisms. PMID:24402417

  7. Standardization of Hemagglutination Inhibition Assay for Influenza Serology Allows for High Reproducibility between Laboratories.

    PubMed

    Zacour, Mary; Ward, Brian J; Brewer, Angela; Tang, Patrick; Boivin, Guy; Li, Yan; Warhuus, Michelle; McNeil, Shelly A; LeBlanc, Jason J; Hatchette, Todd F

    2016-03-01

    Standardization of the hemagglutination inhibition (HAI) assay for influenza serology is challenging. Poor reproducibility of HAI results from one laboratory to another is widely cited, limiting comparisons between candidate vaccines in different clinical trials and posing challenges for licensing authorities. In this study, we standardized HAI assay materials, methods, and interpretive criteria across five geographically dispersed laboratories of a multidisciplinary influenza research network and then evaluated intralaboratory and interlaboratory variations in HAI titers by repeatedly testing standardized panels of human serum samples. Duplicate precision and reproducibility from comparisons between assays within laboratories were 99.8% (99.2% to 100%) and 98.0% (93.3% to 100%), respectively. The results for 98.9% (95% to 100%) of the samples were within 2-fold of all-laboratory consensus titers, and the results for 94.3% (85% to 100%) of the samples were within 2-fold of our reference laboratory data. Low-titer samples showed the greatest variability in comparisons between assays and between sites. Classification of seroprotection (titer ≥ 40) was accurate in 93.6% or 89.5% of cases in comparison to the consensus or reference laboratory classification, respectively. This study showed that with carefully chosen standardization processes, high reproducibility of HAI results between laboratories is indeed achievable. PMID:26818953
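
    Because HAI titers live on a doubling-dilution scale, "within 2-fold of consensus" is a difference of at most one log2 step, and seroprotection is a simple threshold; a minimal sketch with hypothetical titers:

```python
import numpy as np

consensus = np.array([40, 80, 160, 20, 640, 10])    # hypothetical consensus titers
lab       = np.array([40, 160, 160, 40, 320, 10])   # one laboratory's results

within_2fold = np.abs(np.log2(lab / consensus)) <= 1   # at most one dilution apart
seroprotected = lab >= 40                              # titer >= 40 criterion

print(f"within 2-fold of consensus: {within_2fold.mean():.0%}")
print(f"seroprotection calls: {seroprotected.tolist()}")
```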

  8. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  9. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  10. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  11. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  12. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  13. Reproducibility of MRI-Determined Proton Density Fat Fraction Across Two Different MR Scanner Platforms

    PubMed Central

    Kang, Geraldine H.; Cruite, Irene; Shiehmorteza, Masoud; Wolfson, Tanya; Gamst, Anthony C.; Hamilton, Gavin; Bydder, Mark; Middleton, Michael S.; Sirlin, Claude B.

    2016-01-01

    Purpose To evaluate magnetic resonance imaging (MRI)-determined proton density fat fraction (PDFF) reproducibility across two MR scanner platforms and, using MR spectroscopy (MRS)-determined PDFF as reference standard, to confirm MRI-determined PDFF estimation accuracy. Materials and Methods This prospective, cross-sectional, crossover, observational pilot study was approved by an Institutional Review Board. Twenty-one subjects gave written informed consent and underwent liver MRI and MRS at both 1.5T (Siemens Symphony scanner) and 3T (GE Signa Excite HD scanner). MRI-determined PDFF was estimated using an axial 2D spoiled gradient-recalled echo sequence with low flip-angle to minimize T1 bias and six echo-times to permit correction of T2* and fat-water signal interference effects. MRS-determined PDFF was estimated using a stimulated-echo acquisition mode sequence with long repetition time to minimize T1 bias and five echo times to permit T2 correction. Interscanner reproducibility of MRI determined PDFF was assessed by correlation analysis; accuracy was assessed separately at each field strength by linear regression analysis using MRS-determined PDFF as reference standard. Results 1.5T and 3T MRI-determined PDFF estimates were highly correlated (r = 0.992). MRI-determined PDFF estimates were accurate at both 1.5T (regression slope/intercept = 0.958/−0.48) and 3T (slope/intercept = 1.020/0.925) against the MRS-determined PDFF reference. Conclusion MRI-determined PDFF estimation is reproducible and, using MRS-determined PDFF as reference standard, accurate across two MR scanner platforms at 1.5T and 3T. PMID:21769986

  14. Reproducibility of the Structural Connectome Reconstruction across Diffusion Methods.

    PubMed

    Prčkovska, Vesna; Rodrigues, Paulo; Puigdellivol Sanchez, Ana; Ramos, Marc; Andorra, Magi; Martinez-Heras, Eloy; Falcon, Carles; Prats-Galino, Albert; Villoslada, Pablo

    2016-01-01

    Analysis of structural connectomes can lead to powerful insights about the brain's organization and damage. However, the accuracy and reproducibility of structural connectomes constructed with different acquisition and reconstruction techniques are not well defined. In this work, we evaluated the reproducibility of structural connectome techniques by performing test-retest (same day) and longitudinal (after 1 month) studies, as well as analyzing graph-based measures, on data acquired from 22 healthy volunteers (6 subjects were used for the longitudinal study). We compared connectivity matrices and tract reconstructions obtained with the acquisition schemes most typical in clinical application: diffusion tensor imaging (DTI), high angular resolution diffusion imaging (HARDI), and diffusion spectrum imaging (DSI). All techniques showed high reproducibility in the test-retest analysis (correlation > 0.9). However, HARDI was the only technique with low variability (2%) in the longitudinal assessment (1-month interval). The intraclass coefficient analysis showed the highest reproducibility for the DTI connectome, albeit with sparser connections than HARDI and DSI. Qualitative (neuroanatomical) assessment of selected tracts confirmed the quantitative results, showing that HARDI detected most of the analyzed fiber groups and fanning fibers. In conclusion, HARDI acquisition showed the most balanced trade-off between high reproducibility of the connectome, a higher rate of detection of paths and fanning fibers, and intermediate acquisition times (10-15 minutes), although at the cost of more aberrant fibers. PMID:26464179

  15. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  16. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J. |

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [18F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 µg/kg). The same double FDG procedure was repeated 6-8 wk later to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism, and the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and the replication conditions. This study measured the test-retest reproducibility of regional brain metabolic responses; although the global and regional metabolic values were significantly lower on repeated evaluation, the response to lorazepam was highly reproducible.

  17. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  18. Self-reproducing systems: structure, niche relations and evolution.

    PubMed

    Sharov, A A

    1991-01-01

    A formal definition of a self-reproducing system is proposed using Petri nets. A potential self-reproducing system is a set of places in the Petri net such that the number of tokens in each place increases due to some sequence of internal transitions (a transition is called internal to the marked subset of places if at least one of its starting places and one of its terminating places belongs to that subset). An actual self-reproducing system is a system that compensates the outflow of its components by reproduction. In a suitable environment every potential self-reproducing system becomes an actual one. Each Petri net can be considered as an ecosystem with the web of ecological niches bound together with trophic and other relations. The stationary dynamics of the ecosystem is characterized by the set of filled niches. The process of evolution is described in terms of niche composition change. Perspectives of the theory of self-reproducing systems in biology are discussed.
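
    The definition lends itself to a direct computational check: pick a subset of places, restrict attention to transitions internal to it, and ask whether some firing sequence strictly increases every place's token count. A toy sketch of that test (the two-transition net is hypothetical, chosen only so the check succeeds):

```python
# transitions as (consumed, produced) token multisets over place names
transitions = {
    "t1": ({"A": 1}, {"A": 1, "B": 1}),   # A catalyses the creation of B
    "t2": ({"B": 1}, {"A": 2}),           # B converts into two A
}

def internal(t, subset):
    """Internal to S: at least one input place and one output place in S."""
    consumed, produced = transitions[t]
    return bool(set(consumed) & subset) and bool(set(produced) & subset)

def fire(marking, seq):
    """Fire a transition sequence; return the new marking, or None if some
    transition in the sequence is not enabled."""
    m = dict(marking)
    for t in seq:
        consumed, produced = transitions[t]
        for p, k in consumed.items():
            if m.get(p, 0) < k:
                return None
            m[p] -= k
        for p, k in produced.items():
            m[p] = m.get(p, 0) + k
    return m

S = {"A", "B"}                              # candidate self-reproducing system
m0 = {"A": 1, "B": 0}
seq = ["t1", "t1", "t2"]
assert all(internal(t, S) for t in seq)     # only internal transitions used
m1 = fire(m0, seq)
print(m1, all(m1[p] > m0[p] for p in S))    # every place in S gained tokens
```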

  19. Skill, reproducibility and potential predictability of the West African monsoon in coupled GCMs

    NASA Astrophysics Data System (ADS)

    Philippon, N.; Doblas-Reyes, F. J.; Ruti, P. M.

    2010-07-01

    In the framework of the ENSEMBLES FP6 project, an ensemble prediction system based on five different state-of-the-art European coupled models has been developed. This study evaluates the performance of these models for forecasting the West African monsoon (WAM) at the monthly time scale. From simulations started on 1 May of each year and covering the period 1991-2001, the reproducibility and potential predictability (PP) of key parameters of the WAM—rainfall, zonal and meridional wind at four levels from the surface to 200 hPa, and specific humidity, from July to September—are assessed. The Sahelian rainfall mode of variability is not accurately reproduced, contrary to the Guinean rainfall mode: the correlations between observations (from CMAP) and the multi-model ensemble mean are 0.17 and 0.55, respectively. For the Sahelian mode, the correlation is consistent with a low PP of ~6%. The PP of the Guinean mode is higher, ~44%, suggesting a stronger forcing of the sea surface temperature on rainfall variability over this region. Parameters relating to the atmospheric dynamics are on average much more skillful and reproducible than rainfall. Among them, the first mode of variability of the zonal wind at 200 hPa, which depicts the Tropical Easterly Jet, is correlated at 0.79 with its “observed” counterpart (from the NCEP/DOE2 reanalyses) and has a PP of 39%. Moreover, the models reproduce the correlations between all the atmospheric dynamics parameters and Sahelian rainfall in a satisfactory way. In that context, a statistical adaptation of the atmospheric dynamics forecasts, using a linear regression model with the leading principal components of the atmospheric dynamical parameters studied, leads to moderate receiver operating characteristic areas under the curve and correlation skill scores for Sahelian rainfall. These scores are, however, much higher than those obtained using the modelled rainfall.

  20. Voxel size dependency, reproducibility and sensitivity of an in vivo bone loading estimation algorithm.

    PubMed

    Christen, Patrik; Schulte, Friederike A; Zwahlen, Alexander; van Rietbergen, Bert; Boutroy, Stephanie; Melton, L Joseph; Amin, Shreyasee; Khosla, Sundeep; Goldhahn, Jörg; Müller, Ralph

    2016-01-01

    A bone loading estimation algorithm was previously developed that provides the in vivo loading conditions required for in vivo bone remodelling simulations. The algorithm derives a bone's loading history from its microstructure as assessed by high-resolution (HR) computed tomography (CT). This reverse engineering approach showed accurate and realistic results based on micro-CT and HR-peripheral quantitative CT (HR-pQCT) images. However, its voxel size dependency, reproducibility and sensitivity still need to be investigated, which is the purpose of this study. Voxel size dependency was tested on cadaveric distal radii with micro-CT images scanned at 25 µm and downscaled to 50, 61, 75, 82, 100, 125 and 150 µm. Reproducibility was calculated with repeated in vitro as well as in vivo HR-pQCT measurements at 82 µm. Sensitivity was defined using HR-pQCT images from women with fracture versus non-fracture, and low versus high bone volume fraction, expecting similar and different loading histories, respectively. Our results indicate that the algorithm is voxel size independent within an average (maximum) error of 8.2% (32.9%) at 61 µm, but that the dependency increases considerably at voxel sizes larger than 82 µm. In vitro and in vivo reproducibility errors are up to 4.5% and 10.2%, respectively, which is comparable to other in vitro studies and slightly higher than in other in vivo studies. Subjects with different bone volume fraction were clearly distinguished, but not subjects with and without fracture. This is in agreement with bone adapting to customary loading but not to fall loads. We conclude that the in vivo bone loading estimation algorithm provides reproducible, sensitive and fairly voxel size independent results at voxel sizes up to 82 µm, but that smaller voxel sizes would be advantageous.

  1. Generalized RAICAR: discover homogeneous subject (sub)groups by reproducibility of their intrinsic connectivity networks.

    PubMed

    Yang, Zhi; Zuo, Xi-Nian; Wang, Peipei; Li, Zhihao; LaConte, Stephen M; Bandettini, Peter A; Hu, Xiaoping P

    2012-10-15

    Existing spatial independent component analysis (ICA) methods for multi-subject fMRI datasets have mainly focused on detecting common components across subjects, under the assumption that all the subjects in a group share the same (identical) components. However, as a data-driven approach, ICA could potentially serve as an exploratory tool at the multi-subject level and help us uncover inter-subject differences in patterns of connectivity (e.g., find subtypes in patient populations). In this work, we propose a methodology named gRAICAR that exploits the data-driven nature of ICA to allow discovery of sub-groupings of subjects based on the reproducibility of their ICA components. This technique allows us not only to find highly reproducible common components across subjects but also to explore (without a priori subject groupings) components that could classify all subjects into sub-groups. gRAICAR generalizes the reproducibility framework previously developed for single subjects (Ranking and Averaging Independent Component Analysis by Reproducibility, RAICAR; Yang et al., Hum Brain Mapp, 2008) to multiple-subject analysis. For each group-level component, gRAICAR generates a reproducibility matrix and further computes two metrics, inter-subject consistency and intra-subject reliability, to characterize inter-subject variability and reflect contributions from individual subjects. Nonparametric tests are employed to examine the significance of both the inter-subject consistency and the separation of subject groups reflected in the component. Our validations based on simulated and experimental resting-state fMRI datasets demonstrated the advantage of gRAICAR in extracting features reflecting potential subject groupings. It may facilitate discovery of the underlying brain functional networks, with substantial potential to inform our understanding of development, neurodegenerative conditions, and psychiatric disorders.
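
    The reproducibility matrix and one of its summary metrics can be illustrated with synthetic data. In this sketch, all data and names are hypothetical, and the matching of components across subjects is assumed to have been done already; one group-level component is represented by one spatial map per subject:

        # Sketch: reproducibility matrix for one group-level component, built
        # from pairwise spatial correlations of matched subject components.
        import numpy as np

        rng = np.random.default_rng(5)
        n_sub, n_vox = 8, 2000
        shared = rng.normal(size=n_vox)                   # common spatial map
        comps = np.array([shared + rng.normal(scale=0.7, size=n_vox)
                          for _ in range(n_sub)])         # one map per subject
        R = np.corrcoef(comps)                            # 8 x 8 reproducibility matrix
        off_diag = R[~np.eye(n_sub, dtype=bool)]
        inter_subject_consistency = off_diag.mean()       # how alike subjects are
        per_subject_contribution = (R.sum(axis=1) - 1.0) / (n_sub - 1)
        print(inter_subject_consistency, per_subject_contribution.round(2))

    A sub-group structure would show up as blocks in R: high correlations within a block, low correlations between blocks.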

  2. Accurate Measurement of the Effects of All Amino-Acid Mutations on Influenza Hemagglutinin

    PubMed Central

    Doud, Michael B.; Bloom, Jesse D.

    2016-01-01

    Influenza genes evolve mostly via point mutations, and so knowing the effect of every amino-acid mutation provides information about evolutionary paths available to the virus. We and others have combined high-throughput mutagenesis with deep sequencing to estimate the effects of large numbers of mutations to influenza genes. However, these measurements have suffered from substantial experimental noise due to a variety of technical problems, the most prominent of which is bottlenecking during the generation of mutant viruses from plasmids. Here we describe advances that ameliorate these problems, enabling us to measure with greatly improved accuracy and reproducibility the effects of all amino-acid mutations to an H1 influenza hemagglutinin on viral replication in cell culture. The largest improvements come from using a helper virus to reduce bottlenecks when generating viruses from plasmids. Our measurements confirm at much higher resolution the results of previous studies suggesting that antigenic sites on the globular head of hemagglutinin are highly tolerant of mutations. We also show that other regions of hemagglutinin—including the stalk epitopes targeted by broadly neutralizing antibodies—have a much lower inherent capacity to tolerate point mutations. The ability to accurately measure the effects of all influenza mutations should enhance efforts to understand and predict viral evolution. PMID:27271655

  3. A colorimetric-based accurate method for the determination of enterovirus 71 titer.

    PubMed

    Pourianfar, Hamid Reza; Javadi, Arman; Grollo, Lara

    2012-12-01

    The 50% tissue culture infectious dose (TCID50) is still one of the most commonly used techniques for estimating virus titers. However, the traditional TCID50 assay is time consuming, susceptible to subjective errors and generates only quantal data. Here, we describe a colorimetric-based approach for the titration of Enterovirus 71 (EV71) using a modified method for making virus dilutions. In summary, the titration of EV71 using MTT or MTS staining with a modified virus dilution method decreased the time of the assay and eliminated the subjectivity of observational results, improving the accuracy, reproducibility and reliability of virus titration in comparison with the conventional TCID50 approach (p < 0.01). In addition, the results provided evidence of better correlation between a plaquing assay and our approach than with the traditional TCID50 approach. This increased accuracy also improved the ability to predict the number of virus plaque-forming units present in a solution. These improvements could be of use for any virological experimentation where a quick, accurate titration of a virus capable of causing cell destruction is required, or where a sensible estimate of the number of viral plaques based on the TCID50 of a virus is desired.
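
    For reference, the conventional TCID50 endpoint that the colorimetric method is benchmarked against is commonly computed with the Reed-Muench method. A minimal sketch follows (this is the standard textbook calculation, not code from the study, and the well counts are hypothetical):

        # Sketch: Reed-Muench estimate of log10(TCID50) from quantal CPE data.
        def reed_muench_log10_tcid50(log10_dilutions, infected, total):
            uninfected = [t - i for i, t in zip(infected, total)]
            rates = []
            for k in range(len(infected)):
                ci = sum(infected[k:])        # cumulative infected, toward low dose
                cu = sum(uninfected[:k + 1])  # cumulative uninfected, toward high dose
                rates.append(ci / (ci + cu))
            for k in range(len(rates) - 1):
                if rates[k] >= 0.5 > rates[k + 1]:
                    prop = (rates[k] - 0.5) / (rates[k] - rates[k + 1])
                    step = log10_dilutions[k + 1] - log10_dilutions[k]
                    return log10_dilutions[k] + prop * step
            raise ValueError("50% endpoint not bracketed by the dilution series")

        # Example: 8 wells per 10-fold dilution; endpoint falls at 10^-2.5.
        print(reed_muench_log10_tcid50([-1, -2, -3, -4],
                                       [8, 6, 2, 0], [8, 8, 8, 8]))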

  4. Accurate Measurement of the Effects of All Amino-Acid Mutations on Influenza Hemagglutinin.

    PubMed

    Doud, Michael B; Bloom, Jesse D

    2016-01-01

    Influenza genes evolve mostly via point mutations, and so knowing the effect of every amino-acid mutation provides information about evolutionary paths available to the virus. We and others have combined high-throughput mutagenesis with deep sequencing to estimate the effects of large numbers of mutations to influenza genes. However, these measurements have suffered from substantial experimental noise due to a variety of technical problems, the most prominent of which is bottlenecking during the generation of mutant viruses from plasmids. Here we describe advances that ameliorate these problems, enabling us to measure with greatly improved accuracy and reproducibility the effects of all amino-acid mutations to an H1 influenza hemagglutinin on viral replication in cell culture. The largest improvements come from using a helper virus to reduce bottlenecks when generating viruses from plasmids. Our measurements confirm at much higher resolution the results of previous studies suggesting that antigenic sites on the globular head of hemagglutinin are highly tolerant of mutations. We also show that other regions of hemagglutinin-including the stalk epitopes targeted by broadly neutralizing antibodies-have a much lower inherent capacity to tolerate point mutations. The ability to accurately measure the effects of all influenza mutations should enhance efforts to understand and predict viral evolution. PMID:27271655

  5. History and progress on accurate measurements of the Planck constant.

    PubMed

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10⁻³⁴ J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that reflected the influence of h combined with other physical constants: the elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h improved as well. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred-year-old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10⁸ from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of the kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the

  6. Must Kohn-Sham oscillator strengths be accurate at threshold?

    SciTech Connect

    Yang Zenghui; Burke, Kieron; Faassen, Meta van

    2009-09-21

    The exact ground-state Kohn-Sham (KS) potential for the helium atom is known from accurate wave function calculations of the ground-state density. The threshold for photoabsorption from this potential matches the physical system exactly. By carefully studying its absorption spectrum, we show the answer to the title question is no. To address this problem in detail, we generate a highly accurate simple fit of a two-electron spectrum near the threshold, and apply the method to both the experimental spectrum and that of the exact ground-state Kohn-Sham potential.

  7. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nano-meter range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensors, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature space set in the spatial and frequency domains, known from signal processing, and test its suitability for six different classifiers classifying scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
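
    The classification step described above can be sketched with a generic bagged-tree classifier. Everything below (the 6-dimensional feature vectors, the labels, and the sample size) is hypothetical and merely stands in for the paper's spatial- and frequency-domain features:

        # Sketch: classifying scan pairs as reproducible vs. non-reproducible
        # from simple features; data are synthetic placeholders.
        import numpy as np
        from sklearn.ensemble import BaggingClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        X = rng.normal(size=(96, 6))      # e.g., mean, variance, FFT band energies
        y = (X[:, 0] + 0.5 * X[:, 3] + 0.3 * rng.normal(size=96)) > 0

        clf = BaggingClassifier(n_estimators=50, random_state=0)
        print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy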

  8. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and a lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  9. Fast and accurate generation of ab initio quality atomic charges using nonparametric statistical regression.

    PubMed

    Rai, Brajesh K; Bakken, Gregory A

    2013-07-15

    We introduce a class of partial atomic charge assignment methods that provide an ab initio quality description of the electrostatics of bioorganic molecules. The method uses a set of models that neither have a fixed functional form nor require a fixed set of parameters, and therefore are capable of capturing the complexities of the charge distribution in great detail. Random Forest regression is used to build separate charge models for the elements H, C, N, O, F, S, and Cl, using training data consisting of partial charges along with a description of their surrounding chemical environments; training set charges are generated by fitting to the b3lyp/6-31G* electrostatic potential (ESP) and are subsequently refined to improve the consistency and transferability of the charge assignments. Using a set of 210 neutral, small organic molecules, the absolute hydration free energy calculated using these charges in conjunction with the Generalized Born solvation model shows a low mean unsigned error, close to 1 kcal/mol, from the experimental data. Using another large and independent test set of chemically diverse organic molecules, the method is shown to accurately reproduce charge-dependent observables (ESP and dipole moment) from ab initio calculations. The method presented here automatically provides an estimate of potential errors in the charge assignment, enabling systematic improvement of these models using additional data. This work has implications not only for the future development of charge models but also for developing methods to describe many other chemical properties that require accurate representation of the electronic structure of the system.
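
    The regression machinery itself is standard. A minimal sketch of one per-element model follows; the three-column descriptor and the synthetic training targets are placeholders, not the paper's environment encoding or its b3lyp/6-31G* charges:

        # Sketch: a per-element Random Forest charge model with a built-in
        # error estimate; training data are synthetic placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        # Rows = atoms of one element (say, oxygen); columns = environment
        # descriptors; targets = ESP-fitted partial charges.
        X_train = rng.normal(size=(500, 3))
        q_train = -0.5 + 0.1 * X_train[:, 0] + 0.02 * rng.normal(size=500)

        model = RandomForestRegressor(n_estimators=200, oob_score=True,
                                      random_state=0).fit(X_train, q_train)
        print(model.oob_score_)           # out-of-bag R^2, a free accuracy check

        # Tree-to-tree spread gives a per-atom uncertainty estimate, one way
        # to realize the "estimate of potential errors" mentioned above.
        x_new = rng.normal(size=(1, 3))
        preds = np.array([t.predict(x_new) for t in model.estimators_])
        print(preds.mean(), preds.std())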

  10. Next-generation sequencing data interpretation: enhancing reproducibility and accessibility.

    PubMed

    Nekrutenko, Anton; Taylor, James

    2012-09-01

    Areas of life sciences research that were previously distant from each other in ideology, analysis practices and toolkits, such as microbial ecology and personalized medicine, have all embraced techniques that rely on next-generation sequencing instruments. Yet the capacity to generate the data greatly outpaces our ability to analyse it. Existing sequencing technologies are more mature and accessible than the methodologies that are available for individual researchers to move, store, analyse and present data in a fashion that is transparent and reproducible. Here we discuss currently pressing issues with analysis, interpretation, reproducibility and accessibility of these data, and we present promising solutions and venture into potential future developments.

  11. Dynamic pseudos: How accurate outside their parent case?

    SciTech Connect

    Ekrann, S.; Mykkeltveit, J.

    1995-12-31

    If properly constructed, dynamic pseudos allow the parent solution from which they were derived to be exactly reproduced, in a certain well-defined sense, in a subsequent coarse grid simulation. The paper reports extensive numerical experimentation, in 1D homogeneous and heterogeneous media, to determine the performance of pseudos when used outside their parent case. The authors perturb fluid viscosities and injection rate, as well as realization. Parent solutions are produced analytically, via a generalization of the Buckley-Leverett technique, as are true solutions in off-parent cases. Capillarity is neglected in these experiments, while gravity is sometimes retained in order to force rate sensitivity.
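
    The classical Buckley-Leverett construction that underlies the analytic parent solutions can be sketched as follows. The Corey exponents, viscosities, and zero connate water are assumptions chosen for illustration; the authors use a generalized version of the technique, which is not reproduced here:

        # Sketch: Buckley-Leverett fractional flow and the Welge front.
        import numpy as np

        def fractional_flow(Sw, mu_w=1.0, mu_o=5.0, nw=2, no=2):
            """Water fractional flow with simple Corey relative permeabilities."""
            krw, kro = Sw**nw, (1.0 - Sw)**no
            return (krw / mu_w) / (krw / mu_w + kro / mu_o)

        # Welge tangent from Sw = 0: the front saturation maximizes fw/Sw, and
        # the dimensionless front speed equals that maximal chord slope.
        Sw = np.linspace(1e-6, 1.0 - 1e-6, 100001)
        fw = fractional_flow(Sw)
        i = int(np.argmax(fw / Sw))
        print(f"front saturation ~ {Sw[i]:.3f}, front speed ~ {fw[i] / Sw[i]:.3f}")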

  12. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general, for all subjects, global normalization, ETV and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, and 0.005, respectively). When the tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, and 0.46, respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improved from 57.3% before correction to 66.3% after global normalization, and to 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement in reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate for correcting the ventilation map when changes in respiratory effort are large.

  13. Can quasiclassical trajectory calculations reproduce the extreme kinetic isotope effect observed in the muonic isotopologues of the H + H2 reaction?

    NASA Astrophysics Data System (ADS)

    Jambrina, P. G.; García, Ernesto; Herrero, Víctor J.; Sáez-Rábanos, Vicente; Aoiz, F. J.

    2011-07-01

    Rate coefficients for the mass-extreme isotopologues of the H + H2 reaction, namely, Mu + H2, where Mu is muonium, and Heμ + H2, where Heμ is a He atom in which one of the electrons has been replaced by a negative muon, have been calculated in the 200-1000 K temperature range by means of accurate quantum mechanical (QM) and quasiclassical trajectory (QCT) calculations and compared with the experimental and theoretical results recently reported by Fleming et al. [Science 331, 448 (2011)], 10.1126/science.1199421. The QCT calculations can reproduce the experimental and QM rate coefficients and the kinetic isotope effect (KIE), kMu(T)/kHeμ(T), if the Gaussian binning procedure (QCT-GB), which weights the trajectories according to their proximity to the correct quantal vibrational action, is applied. The analysis of the results shows that the large zero-point energy of the MuH product is the key factor behind the large KIE observed.
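
    The Gaussian-binning idea is simple to state in code. In the minimal sketch below, the Gaussian width and the trajectory actions are illustrative assumptions, not values from the study:

        # Sketch: Gaussian-binning weights for QCT product vibrational actions.
        import numpy as np

        def gaussian_binning_weights(actions, sigma=0.1):
            """Weight each trajectory by how close its final vibrational action
            is to an integer quantum number (the QCT-GB prescription)."""
            nearest = np.rint(actions)
            return np.exp(-((actions - nearest) / sigma) ** 2)

        actions = np.array([0.04, 0.48, 0.97, 1.51])  # hypothetical final actions
        print(gaussian_binning_weights(actions))
        # Near-integer actions keep weight ~1; half-integer actions are strongly
        # suppressed, which enforces the product zero-point energy constraint.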

  14. Ruggedness and reproducibility of the MBEC biofilm disinfectant efficacy test.

    PubMed

    Parker, A E; Walker, D K; Goeres, D M; Allan, N; Olson, M E; Omar, A

    2014-07-01

    The MBEC™ Physiology & Genetics Assay recently became the first approved ASTM standardized biofilm disinfectant efficacy test method. This report summarizes the results of the standardization process using Pseudomonas aeruginosa biofilms. Initial ruggedness testing of the MBEC method suggests that the assay is rugged (i.e., insensitive) to small changes to the protocol with respect to 4 factors: incubation time of the bacteria (when varied from 16 to 18 h), treatment temperature (20-24 °C), sonication duration (25-35 min), and sonication power (130-480 W). In order to assess the repeatability of MBEC results across multiple tests in the same laboratory and the reproducibility across multiple labs, an 8-lab study was conducted in which 8 concentrations of each of 3 disinfectants (a non-chlorine oxidizer, a phenolic, and a quaternary ammonium compound) were applied to biofilms using the MBEC method. The repeatability and reproducibility of the untreated control biofilms were acceptable, as indicated by small repeatability and reproducibility standard deviations (SD) (0.33 and 0.67 log10(CFU/mm²), respectively). The repeatability SDs of the biofilm log reductions after application of the 24 combinations of disinfectant and concentration ranged from 0.22 to 1.61, and the reproducibility SDs ranged from 0.27 to 1.70. In addition, for each of the 3 disinfectant types considered, the assay was statistically significantly responsive to the increasing treatment concentrations.
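
    The repeatability and reproducibility SDs quoted above are standard one-way variance components: within-lab scatter, and within-plus-between-lab scatter. A minimal sketch with hypothetical log reductions (not the study's data):

        # Sketch: repeatability and reproducibility SDs from a balanced
        # one-way (labs x replicate tests) layout; data are hypothetical.
        import numpy as np

        labs = np.array([          # log10 reductions, 3 tests in each of 4 labs
            [2.1, 2.4, 2.2],
            [2.6, 2.5, 2.8],
            [2.0, 2.3, 2.1],
            [2.7, 2.9, 2.6],
        ])
        n_labs, n_reps = labs.shape
        ms_within = labs.var(axis=1, ddof=1).mean()           # pooled within-lab MS
        ms_between = n_reps * labs.mean(axis=1).var(ddof=1)   # between-lab MS
        var_lab = max((ms_between - ms_within) / n_reps, 0.0)
        sd_repeat = np.sqrt(ms_within)               # repeatability SD
        sd_reprod = np.sqrt(ms_within + var_lab)     # reproducibility SD
        print(round(sd_repeat, 2), round(sd_reprod, 2))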

  15. The reproducibility of dipping status: beyond the cutoff points.

    PubMed

    Chaves, Hilton; Campello de Souza, Fernando Menezes; Krieger, Eduardo Moacyr

    2005-08-01

    Limited reproducibility has been ascribed to 24-h ambulatory blood pressure monitoring, especially in relation to the dipper and nondipper phenomena. This study examined the reproducibility of 24-h ambulatory blood pressure monitoring in three recordings of pressure at intervals of 8-15 days in 101 study participants (73% treated hypertensive patients) residing in the city of Recife, Pernambuco, Brazil. SpaceLabs 90207 monitors were used, and the minimum number of valid measurements was 80. No significant differences were found between the mean systolic and diastolic pressures of the second and third recordings when the normotensive and hypertensive patients were assessed jointly (P=0.44). Likewise, no significant differences were present when the normotensive patients were analyzed separately (P=0.96). In the hypertensive group, a significant difference existed only between the first and second ambulatory blood pressure readings (135.1 vs. 132.9 mmHg, respectively; P=0.0005). Regarding declines in pressure during sleep, no significant differences occurred when continuous percentage values were considered (P=0.27). The values obtained from 24-h ambulatory blood pressure monitoring are reproducible when tested at intervals of 8-15 days. Small differences, when significantly present, always involved the first ambulatory blood pressure monitoring. The reproducibility of the dipper and nondipper patterns is of greater complexity because it considers cutoff points rather than continuous values to characterize these states.

  16. Artificially reproduced image of earth photographed by UV camera

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A reproduction of a color enhancement of a picture photographed in far-ultraviolet light by Astronaut John W. Young, Apollo 16 commander, showing the Earth. Note this is an artificially reproduced image. The three auroral belts, the sunlit atmosphere and background stars are visible.

  17. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  18. Reproducibility of polycarbonate reference material in toxicity evaluation

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Huttlinger, P. A.

    1981-01-01

    A specific lot of bisphenol A polycarbonate has been used for almost four years as the reference material for the NASA-USF-PSC toxicity screening test method. The reproducibility of the test results over this period of time indicates that certain plastics may be more suitable reference materials than the more traditional cellulosic materials.

  19. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    46 CFR 56.30-3: Piping joints (reproduces 110). Shipping; COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED); MARINE ENGINEERING; PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  20. Slide rule-type color chart predicts reproduced photo tones

    NASA Technical Reports Server (NTRS)

    Griffin, J. D.

    1966-01-01

    Slide rule-type color chart determines the final reproduced gray tones in the production of briefing charts that are photographed in black and white. The chart shows both the color, by drafting paint manufacturer's name and mixture number, and the gray tone resulting from black and white photographic reproduction.

  1. The reproducibility of dipping status: beyond the cutoff points.

    PubMed

    Chaves, Hilton; Campello de Souza, Fernando Menezes; Krieger, Eduardo Moacyr

    2005-08-01

    Limited reproducibility has been ascribed to 24-h ambulatory blood pressure monitoring, especially in relation to the dipper and nondipper phenomena. This study examined the reproducibility of 24-h ambulatory blood pressure monitoring in three recordings of pressure at intervals of 8-15 days in 101 study participants (73% treated hypertensive patients) residing in the city of Recife, Pernambuco, Brazil. SpaceLabs 90207 monitors were used, and the minimum number of valid measurements was 80. No significant differences were found between the mean systolic and diastolic pressures of the second and third recordings when the normotensive and hypertensive patients were assessed jointly (P=0.44). Likewise, no significant differences were present when the normotensive patients were analyzed separately (P=0.96). In the hypertensive group, a significant difference existed only between the first and second ambulatory blood pressure readings (135.1 vs. 132.9 mmHg, respectively; P=0.0005). Regarding declines in pressure during sleep, no significant differences occurred when continuous percentage values were considered (P=0.27). The values obtained from 24-h ambulatory blood pressure monitoring are reproducible when tested at intervals of 8-15 days. Small differences, when significantly present, always involved the first ambulatory blood pressure monitoring. The reproducibility of the dipper and nondipper patterns is of greater complexity because it considers cutoff points rather than continuous values to characterize these states. PMID:16077266

  2. Reproducibility of topographic measures of the glaucomatous optic nerve head

    PubMed Central

    Geyer, O; Michaeli-Cohen, A; Silver, D; Versano, D; Neudorfer, M; Dzhanov, R; Lazar, M

    1998-01-01

    AIMS/BACKGROUND—Laser scanning tomography provides an assessment of three-dimensional optic disc topography. For the clinical purpose of follow-up of glaucoma patients, the repeatability of the various measured variables is essential. In the present study, the reproducibility of morphometric variables calculated by the topographic scanning system, TopSS (Laser Diagnostic Technology, San Diego, CA), was investigated.
METHODS—Two independent measurements (30 minutes apart) each consisting of three complete images of the optic disc were performed on 16 eyes of 16 glaucoma patients using a TopSS. The instrument calculates 14 morphometric variables for the characterisation of the optic disc.
RESULTS—From the two-tailed paired tests, all variables were seen to have good reproducibility. However, correlation and regression analyses showed that only three variables, volume below, half depth area, and average cup depth, are acceptably reproducible.
CONCLUSION—The TopSS provides three variables which describe the physiological shape of the optic disc that have high reproducibility. These three variables might be useful for following the progression of optic disc changes in glaucoma patients.

 Keywords: optic nerve head; scanning laser; glaucoma; tomography PMID:9536873

  3. The United States Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Black and white maps, graphs and tables that may be reproduced are presented in this volume focusing on the United States. Some of the features of the United States depicted are: size, population, agriculture and resources, manufactures, trade, citizenship, employment, income, poverty, the federal budget, energy, health, education, crime, and the…

  4. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  5. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (besides KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  6. Accurate Determination of Conformational Transitions in Oligomeric Membrane Proteins

    PubMed Central

    Sanz-Hernández, Máximo; Vostrikov, Vitaly V.; Veglia, Gianluigi; De Simone, Alfonso

    2016-01-01

    The structural dynamics governing collective motions in oligomeric membrane proteins play key roles in vital biomolecular processes at cellular membranes. In this study, we present a structural refinement approach that combines solid-state NMR experiments and molecular simulations to accurately describe concerted conformational transitions identifying the overall structural, dynamical, and topological states of oligomeric membrane proteins. The accuracy of the structural ensembles generated with this method is shown to reach the statistical error limit, and is further demonstrated by correctly reproducing orthogonal NMR data. We demonstrate the accuracy of this approach by characterising the pentameric state of phospholamban, a key player in the regulation of calcium uptake in the sarcoplasmic reticulum, and by probing its dynamical activation upon phosphorylation. Our results underline the importance of using an ensemble approach to characterise the conformational transitions that are often responsible for the biological function of oligomeric membrane protein states. PMID:26975211

  7. Calculation of Accurate Hexagonal Discontinuity Factors for PARCS

    SciTech Connect

    Pounders, J.; Bandini, B. R.; Xu, Y.; Downar, T. J.

    2007-11-01

    In this study we derive a methodology for calculating discontinuity factors consistent with the Triangle-based Polynomial Expansion Nodal (TPEN) method implemented in PARCS for hexagonal reactor geometries. The accuracy of coarse-mesh nodal methods is greatly enhanced by permitting flux discontinuities at node boundaries, but the practice of calculating discontinuity factors from infinite-medium (zero-current) single bundle calculations may not be sufficiently accurate for more challenging problems in which there is a large amount of internodal neutron streaming. The authors therefore derive a TPEN-based method for calculating discontinuity factors that are exact with respect to generalized equivalence theory. The method is validated by reproducing the reference solution for a small hexagonal core.

  8. Tract Specific Reproducibility of Tractography Based Morphology and Diffusion Metrics

    PubMed Central

    Besseling, René M. H.; Jansen, Jacobus F. A.; Overvliet, Geke M.; Vaessen, Maarten J.; Braakman, Hilde M. H.; Hofman, Paul A. M.; Aldenkamp, Albert P.; Backes, Walter H.

    2012-01-01

    Introduction—The reproducibility of tractography is important to determine its sensitivity to pathological abnormalities. The reproducibility of tract morphology has not yet been systematically studied, and the recently developed tractography contrast Tract Density Imaging (TDI) has not yet been assessed at the tract-specific level.
Materials and Methods—Diffusion tensor imaging (DTI) and probabilistic constrained spherical deconvolution (CSD) tractography are performed twice in 9 healthy subjects. Tractography is based on common-space seed and target regions and performed for several major white matter tracts. Tractograms are converted to tract segmentations, and inter-session reproducibility of tract morphology is assessed using the Dice similarity coefficient (DSC). The coefficient of variation (COV) and intraclass correlation coefficient (ICC) are calculated for the following tract metrics: fractional anisotropy (FA), apparent diffusion coefficient (ADC), volume, and TDI. Analyses are performed both for proximal (deep white matter) and extended (including subcortical white matter) tract segmentations.
Results—Proximal DSC values were 0.70–0.92. DSC values were 5–10% lower in extended compared to proximal segmentations. COV/ICC values of FA, ADC, volume and TDI were 1–4%/0.65–0.94, 2–4%/0.62–0.94, 3–22%/0.53–0.96 and 8–31%/0.48–0.70, respectively, with the lower COV and higher ICC values found in the proximal segmentations.
Conclusion—For all investigated metrics, reproducibility depended on the segmented tract. FA and ADC had relatively low COV and relatively high ICC, indicating clinical potential. Volume had higher COV, but its moderate to high ICC values in most tracts still suggest subject-differentiating power. Tract TDI had high COV and relatively low ICC, which reflects unfavorable reproducibility. PMID:22485157
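
    For concreteness, the three agreement measures used above can be sketched as follows. The FA values and session noise are synthetic, and the ICC shown is the one-way, single-measures variant, which may differ from the paper's exact choice:

        # Sketch: Dice, within-subject COV, and a one-way ICC for test-retest data.
        import numpy as np

        def dice(mask_a, mask_b):
            """Dice similarity coefficient of two boolean tract segmentations."""
            return 2.0 * np.logical_and(mask_a, mask_b).sum() / (mask_a.sum() + mask_b.sum())

        def cov_percent(x1, x2):
            """Within-subject coefficient of variation (%) for paired metrics."""
            sd_within = np.abs(x1 - x2) / np.sqrt(2.0)
            return 100.0 * np.mean(sd_within / ((x1 + x2) / 2.0))

        def icc_one_way(x1, x2):
            """ICC(1,1): one-way random effects, single measures, k = 2 sessions."""
            data = np.stack([x1, x2], axis=1)
            bms = 2.0 * data.mean(axis=1).var(ddof=1)  # between-subject mean square
            wms = data.var(axis=1, ddof=1).mean()      # within-subject mean square
            return (bms - wms) / (bms + wms)

        a = np.zeros((4, 4), bool); a[1:3, 1:3] = True
        b = np.zeros((4, 4), bool); b[1:3, 2:4] = True
        print(dice(a, b))                            # 0.5 for these toy masks

        rng = np.random.default_rng(1)
        fa1 = rng.normal(0.45, 0.03, 9)              # session-1 FA, 9 subjects
        fa2 = fa1 + rng.normal(0.0, 0.005, 9)        # session 2, test-retest noise
        print(cov_percent(fa1, fa2), icc_one_way(fa1, fa2))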

  9. Reproducibility of dual-photon absorptiometry using a clinical phantom

    SciTech Connect

    DaCosta, M.; DeLaney, M.; Goldsmith, S.J.

    1985-05-01

    The use of dual-photon absorptiometry (DPA) bone mineral density (BMD) measurements for the diagnosis of osteoporosis and the monitoring of therapy has been established. The objective of this study is to determine the reproducibility of DPA measurements. A phantom was constructed using a section of human bony pelvis and lumbo-sacral spine. Provisions were made to mimic changes in patient girth. To evaluate the DPA reproducibility within a single day, 12 consecutive studies were performed on the phantom using standard acquisition and processing procedures. The mean BMD ± 1 SD, in g/cm², of lumbar vertebrae 2-4 was 0.771 ± 0.007, with a 0.97% coefficient of variation (1 SD) (CV). This evaluation was repeated 7 times over the next 4 months with the performance of 3 to 6 studies each time; the maximum CV found was 1.93%. In order to evaluate the DPA reproducibility over time, phantom studies were performed over a 7-month period which included a 153-Gd source change. The mean BMD was 0.770 ± 0.017, with a 2.15% CV. DPA reproducibility with patient girth changes was evaluated by performing the phantom studies at water depths of 12.5, 17.0 and 20.0 cm. Five studies of each were performed using standard acquisition and processing procedures. The mean BMD was 0.779 ± 0.012, with a 1.15% CV. Based on these results, BMD measurements by DPA are reproducible within 2%. This reliability is maintained for studies performed over an extended period of time and is independent of changes in patient girth.

  10. EMG analysis of trapezius and masticatory muscles: experimental protocol and data reproducibility.

    PubMed

    Sforza, C; Rosati, R; De Menezes, M; Musto, F; Toma, M

    2011-09-01

    We aimed to define a standardised protocol for the electromyographic evaluation of the trapezius muscle in dentistry and to assess its within- and between-session repeatability. Surface electromyography of the trapezius, masseter and temporal muscles was performed in 40 healthy subjects aged 20-35 years during shoulder elevation and maximum teeth clenching with and without cotton rolls. Two repetitions were made both within sessions (same electrodes) and between sessions (different electrodes). Maximum voluntary clench on cotton rolls was used to standardise the potentials of the six analysed muscles with tooth contact; shoulder elevation was used to standardise the upper trapezius potentials. From the standardised electromyographic potentials, several indices (muscle symmetry; masticatory muscle torque and relative activity; total masticatory muscle activity; trapezius cervical load; percentage co-contraction of trapezius during teeth clenching) were computed; random (technical error of measurement) and systematic (Student's t-test, analysis of variance) errors were assessed. For all indices, no systematic errors were found between the two separate data collection sessions. Within sessions, limited (lower than 8%) technical errors of measurement were found for the temporalis and masseter symmetry, torque and activity indices, and for the trapezius cervical load. Larger random errors were obtained for trapezius symmetry and total masticatory muscle activity (up to 20%). Between sessions, no significant differences were found for trapezius co-contraction. In conclusion, a protocol for the standardised electromyographic evaluation of the trapezius muscle that may be used in dental clinical applications was defined, and the repeatability of masseter, temporalis and trapezius electromyographic recordings for serial assessments was established in healthy subjects.

  11. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  12. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  13. Ab initio molecular dynamics of liquid water using embedded-fragment second-order many-body perturbation theory towards its accurate property prediction

    PubMed Central

    Willow, Soohaeng Yoo; Salim, Michael A.; Kim, Kwang S.; Hirata, So

    2015-01-01

    A direct, simultaneous calculation of properties of a liquid using an ab initio electron-correlated theory has long been unthinkable. Here we present structural, dynamical, and response properties of liquid water calculated by ab initio molecular dynamics using the embedded-fragment spin-component-scaled second-order many-body perturbation method with the aug-cc-pVDZ basis set. This level of theory is chosen as it accurately and inexpensively reproduces the water dimer potential energy surface from the coupled-cluster singles, doubles, and noniterative triples with the aug-cc-pVQZ basis set, which is nearly exact. The calculated radial distribution function, self-diffusion coefficient, coordination number, and dipole moment, as well as the infrared and Raman spectra, are in excellent agreement with experimental results. The shapes and widths of the OH stretching bands in the infrared and Raman spectra and their isotropic-anisotropic Raman noncoincidence, which reflect the diverse local hydrogen-bond environment, are also reproduced computationally. The simulation also reveals intriguing dynamic features of the environment, which are difficult to probe experimentally, such as a surprisingly large fluctuation in the coordination number and the detailed mechanism by which the hydrogen donating water molecules move across the first and second shells, thereby causing this fluctuation. PMID:26400690

  14. Ab initio molecular dynamics of liquid water using embedded-fragment second-order many-body perturbation theory towards its accurate property prediction.

    PubMed

    Willow, Soohaeng Yoo; Salim, Michael A; Kim, Kwang S; Hirata, So

    2015-01-01

    A direct, simultaneous calculation of properties of a liquid using an ab initio electron-correlated theory has long been unthinkable. Here we present structural, dynamical, and response properties of liquid water calculated by ab initio molecular dynamics using the embedded-fragment spin-component-scaled second-order many-body perturbation method with the aug-cc-pVDZ basis set. This level of theory is chosen as it accurately and inexpensively reproduces the water dimer potential energy surface from the coupled-cluster singles, doubles, and noniterative triples with the aug-cc-pVQZ basis set, which is nearly exact. The calculated radial distribution function, self-diffusion coefficient, coordination number, and dipole moment, as well as the infrared and Raman spectra, are in excellent agreement with experimental results. The shapes and widths of the OH stretching bands in the infrared and Raman spectra and their isotropic-anisotropic Raman noncoincidence, which reflect the diverse local hydrogen-bond environment, are also reproduced computationally. The simulation also reveals intriguing dynamic features of the environment, which are difficult to probe experimentally, such as a surprisingly large fluctuation in the coordination number and the detailed mechanism by which the hydrogen donating water molecules move across the first and second shells, thereby causing this fluctuation.

  15. Accurate theoretical chemistry with coupled pair models.

    PubMed

    Neese, Frank; Hansen, Andreas; Wennmohs, Frank; Grimme, Stefan

    2009-05-19

    Quantum chemistry has found its way into the everyday work of many experimental chemists. Calculations can predict the outcome of chemical reactions, afford insight into reaction mechanisms, and be used to interpret structure and bonding in molecules. Thus, contemporary theory offers tremendous opportunities in experimental chemical research. However, even with present-day computers and algorithms, we cannot solve the many particle Schrödinger equation exactly; inevitably some error is introduced in approximating the solutions of this equation. Thus, the accuracy of quantum chemical calculations is of critical importance. The affordable accuracy depends on molecular size and particularly on the total number of atoms: for orientation, ethanol has 9 atoms, aspirin 21 atoms, morphine 40 atoms, sildenafil 63 atoms, paclitaxel 113 atoms, insulin nearly 800 atoms, and quaternary hemoglobin almost 12,000 atoms. Currently, molecules with up to approximately 10 atoms can be very accurately studied by coupled cluster (CC) theory, approximately 100 atoms with second-order Møller-Plesset perturbation theory (MP2), approximately 1000 atoms with density functional theory (DFT), and beyond that number with semiempirical quantum chemistry and force-field methods. The overwhelming majority of present-day calculations in the 100-atom range use DFT. Although these methods have been very successful in quantum chemistry, they do not offer a well-defined hierarchy of calculations that allows one to systematically converge to the correct answer. Recently a number of rather spectacular failures of DFT methods have been found, even for seemingly simple systems such as hydrocarbons, fueling renewed interest in wave function-based methods that incorporate the relevant physics of electron correlation in a more systematic way. Thus, it would be highly desirable to fill the gap between 10 and 100 atoms with highly correlated ab initio methods. We have found that one of the earliest (and now

  16. Accurate evaluation of homogenous and nonhomogeneous gas emissivities

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Lee, K. P.

    1984-01-01

    Spectral transmittance and total band absorptance of selected infrared bands of carbon dioxide and water vapor are calculated by using the line-by-line and quasi-random band models, and these are compared with available experimental results to establish the validity of the quasi-random band model. Various wide-band model correlations are employed to calculate the total band absorptance and total emissivity of these two gases under homogeneous and nonhomogeneous conditions. These results are compared with available experimental results under identical conditions. From these comparisons, it is found that the quasi-random band model can provide quite accurate results and is quite suitable for most atmospheric applications.

  17. Accurately measuring dynamic coefficient of friction in ultraform finishing

    NASA Astrophysics Data System (ADS)

    Briggs, Dennis; Echaves, Samantha; Pidgeon, Brendan; Travis, Nathan; Ellis, Jonathan D.

    2013-09-01

    UltraForm Finishing (UFF) is a deterministic sub-aperture computer numerically controlled grinding and polishing platform designed by OptiPro Systems. UFF is used to grind and polish a variety of optics from simple spherical to fully freeform, and numerous materials from glasses to optical ceramics. The UFF system consists of an abrasive belt around a compliant wheel that rotates and contacts the part to remove material. This work aims to accurately measure the dynamic coefficient of friction (μ), how it changes as a function of belt wear, and how this ultimately affects material removal rates. The coefficient of friction has been examined in terms of contact mechanics and Preston's equation to determine accurate material removal rates. By accurately predicting changes in μ, polishing iterations can be more accurately predicted, reducing the total number of iterations required to meet specifications. We have established an experimental apparatus that can accurately measure μ by measuring triaxial forces during translating loading conditions or while manufacturing the removal spots used to calculate material removal rates. Using this system, we will demonstrate μ measurements for UFF belts during different states of their lifecycle and assess the material removal function from spot diagrams as a function of wear. Ultimately, we will use this system for qualifying belt-wheel-material combinations to develop a spot-morphing model to better predict instantaneous material removal functions.
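
    In its simplest form, the dynamic coefficient of friction extracted from triaxial force data is the ratio of in-plane (tangential) force to normal load. The sketch below is a generic illustration with synthetic traces; the channel layout and all numbers are assumptions, not OptiPro's data:

        # Sketch: dynamic coefficient of friction from triaxial force traces,
        # mu = |tangential| / |normal|. All traces here are synthetic.
        import numpy as np

        def dynamic_mu(fx, fy, fz):
            """fx, fy: in-plane (tangential) forces; fz: normal load."""
            return np.hypot(fx, fy) / np.abs(fz)

        t = np.linspace(0.0, 1.0, 1000)
        fz = -20.0 + 0.5 * np.sin(8 * np.pi * t)   # ~20 N normal load (hypothetical)
        fx = 6.0 + 0.3 * np.random.default_rng(2).normal(size=t.size)
        fy = np.zeros_like(t)
        mu = dynamic_mu(fx, fy, fz)
        print(mu.mean())                            # ~0.3 for this synthetic trace

    Tracking how this mean drifts over a belt's lifecycle is what connects the friction measurement to Preston-style removal-rate predictions.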

  18. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the downscaling schemes developed involved complex weather identification procedures, while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
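
    Of the two approaches compared above, the first (nonparametric distribution mapping, often called empirical quantile mapping) is easy to sketch: each model value is mapped to the observed value at the same empirical quantile. The gamma-distributed series below are synthetic stand-ins for wet-day rainfall, not data from the study:

        # Sketch: empirical quantile mapping of daily CM rainfall onto the
        # observed distribution, x_corr = F_obs^-1(F_mod(x)).
        import numpy as np

        def quantile_map(x, model_ref, obs_ref, n_q=99):
            q = np.linspace(1, 99, n_q)
            mod_q = np.percentile(model_ref, q)   # model climatology quantiles
            obs_q = np.percentile(obs_ref, q)     # observed quantiles
            # Locate each value on the model quantile curve, then read off the
            # observed value at the same quantile (linear between knots).
            return np.interp(x, mod_q, obs_q)

        rng = np.random.default_rng(3)
        obs = rng.gamma(0.6, 9.0, 5000)   # observed wet-day rainfall (hypothetical)
        mod = rng.gamma(0.9, 4.0, 5000)   # CM rainfall: too many light events
        corrected = quantile_map(mod, mod, obs)
        print(obs.mean(), mod.mean(), corrected.mean())

    A known limitation, which motivates the comparison above, is that quantile mapping calibrated on historical data need not preserve extremes outside the calibration range, which is where the stochastic downscaling scheme proved more robust.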

  19. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  20. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  1. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, network analysis has become a useful tool for studying the brain under various states and conditions. With the ever-expanding popularity of network science in the neuroimaging community, there is increasing interest in validating the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  2. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  3. Implementation of a portable and reproducible parallel pseudorandom number generator

    SciTech Connect

    Pryor, D.V.; Cuccaro, S.A.; Mascagni, M.; Robinson, M.L.

    1994-12-31

The authors describe in detail the parallel implementation of a family of additive lagged-Fibonacci pseudorandom number generators. The theoretical structure of these generators is exploited to preserve their well-known randomness properties and to provide a parallel system of distinct cycles. The algorithm presented here solves the reproducibility problem for a far larger class of parallel Monte Carlo applications than has been previously possible. In particular, Monte Carlo applications that undergo "splitting" can be coded to be reproducible, independent both of the number of processors and the execution order of the parallel processes. A library of portable C routines (available from the authors) that implements these ideas is also described.
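
    The core recurrence of an additive lagged-Fibonacci generator is x[n] = (x[n-s] + x[n-r]) mod 2^m with lags r > s. The sketch below shows a minimal serial version with the common (r, s) = (17, 5) lags; the paper's actual contribution, assigning provably distinct cycles to each parallel process, is not reproduced here.

```python
class LaggedFibonacci:
    """Minimal additive lagged-Fibonacci generator:
    x[n] = (x[n-s] + x[n-r]) mod 2**m, with lags r > s.

    Illustrative parameters only; the parallel scheme described in the
    paper seeds each process onto a distinct cycle, which this serial
    sketch does not attempt.
    """
    def __init__(self, seed_state, s=5, r=17, m=32):
        assert len(seed_state) == r, "need r seed words"
        self.state = list(seed_state)  # the r most recent values
        self.s, self.r, self.mask = s, r, (1 << m) - 1

    def next(self):
        new = (self.state[-self.s] + self.state[-self.r]) & self.mask
        self.state.pop(0)      # discard the oldest value
        self.state.append(new)
        return new

gen = LaggedFibonacci(seed_state=range(1, 18))
print([gen.next() for _ in range(5)])
```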

  4. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography units and film processors as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the reproducibility problems of the AEC are smaller than those of the film processors, because almost all processors fall outside the acceptable variation limits, which can affect mammography image quality and the dose to the breast. Only four mammography units met the minimum score established by the ACR and FDA for the phantom image.

  5. Reproducing kernel particle method for free and forced vibration analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J. X.; Zhang, H. Y.; Zhang, L.

    2005-01-01

A reproducing kernel particle method (RKPM) is presented to analyze the natural frequencies of Euler-Bernoulli beams as well as Kirchhoff plates. In addition, RKPM is also used to predict the forced vibration responses of buried pipelines due to longitudinal travelling waves. Two different approaches, Lagrange multipliers and the transformation method, are employed to enforce essential boundary conditions. Based on the reproducing kernel approximation, the domain of interest is discretized by a set of particles without the employment of a structured mesh, which constitutes an advantage over the finite element method. Meanwhile, RKPM also exhibits advantages over the classical Rayleigh-Ritz method and its counterparts. Numerical results presented here demonstrate the effectiveness of this novel approach for both free and forced vibration analysis.
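
    For a flavor of the reproducing kernel approximation itself, the 1D sketch below builds RKPM shape functions with a linear basis: a moment matrix corrects a cubic B-spline kernel so that constants and linear fields are reproduced exactly. The kernel choice and support size are illustrative assumptions, not necessarily those of the paper.

```python
import numpy as np

def cubic_spline(z):
    """Cubic B-spline kernel with support |z| <= 2 (one common choice)."""
    z = np.abs(z)
    return np.where(z < 1, 2/3 - z**2 + z**3/2,
           np.where(z < 2, (2 - z)**3 / 6, 0.0))

def rkpm_shape(x, nodes, a):
    """1D reproducing kernel shape functions with a linear basis.

    Psi_I(x) = H(0)^T M(x)^{-1} H(x - x_I) phi_a(x - x_I), where
    H(u) = [1, u]^T and M(x) is the moment matrix. The correction
    enforces exact reproduction of constant and linear fields.
    """
    d = x - nodes                        # shifts to all nodes
    phi = cubic_spline(d / a)            # kernel values
    H = np.vstack([np.ones_like(d), d])  # 2 x n basis evaluated at shifts
    M = (H * phi) @ H.T                  # 2 x 2 moment matrix
    c = np.linalg.solve(M, np.array([1.0, 0.0]))  # M^{-1} H(0)
    return (c @ H) * phi                 # shape functions Psi_I(x)

nodes = np.linspace(0.0, 1.0, 11)
psi = rkpm_shape(0.37, nodes, a=0.15)
# Reproducing conditions: partition of unity and exact linear reproduction.
print(psi.sum(), psi @ nodes)  # ~1.0 and ~0.37
```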

  6. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-01

Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales. PMID:24805343

  7. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    SciTech Connect

    2010-07-02

As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  8. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  9. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2016-07-12

As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  10. Reproducibility of self-paced treadmill performance of trained endurance runners.

    PubMed

    Schabort, E J; Hopkins, W G; Hawley, J A

    1998-01-01

The reproducibility of performance in a laboratory test impacts on the statistical power of that test to detect changes of performance in experiments. The purpose of this study was to determine the reproducibility of performance of distance runners completing a 60 min time trial (TT) on a motor-driven treadmill. Eight trained distance runners (age 27 +/- 7 yrs, peak oxygen consumption [VO2peak] 66 +/- 5 ml x min(-1) x kg(-1), mean +/- SD) performed the TT on three occasions separated by 7-10 days. Throughout each TT the runners controlled the speed of the treadmill and could view current speed and elapsed time, but they did not know the elapsed or final distance. On the basis of heart rate, it is estimated that the subjects ran at an average intensity equivalent to 80-83% of VO2peak. The distance run in 1 h did not vary substantially between trials (16.2 +/- 1.4 km, 15.9 +/- 1.4 km, and 16.1 +/- 1.2 km for TTs 1-3 respectively, p = 0.5). The coefficient of variation (CV) for individual runners was 2.7% (95% CI = 1.8-4.0%) and the test-retest reliability expressed as an intraclass correlation coefficient was 0.90 (95% CI = 0.72-0.98). Reproducibility of performance in this test was therefore acceptable. However, higher reproducibility is required for experimental studies aimed at detecting the smallest worthwhile changes in performance with realistic sample sizes. PMID:9506800
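
    The two reliability statistics used here, the within-subject CV and an intraclass correlation coefficient, are easy to compute from a subjects-by-trials matrix. The sketch below is a minimal illustration on hypothetical distances; the study's exact CV formulation (e.g., log-transformed typical error) may differ.

```python
import numpy as np

def typical_error_cv(trials):
    """Within-subject CV (%) from repeated trials: per-subject SD over
    per-subject mean, averaged across subjects (an illustrative choice)."""
    cv = trials.std(axis=1, ddof=1) / trials.mean(axis=1)
    return 100 * cv.mean()

def icc_oneway(trials):
    """ICC(1,1) from a one-way random-effects ANOVA decomposition."""
    n, k = trials.shape
    grand = trials.mean()
    ms_between = k * ((trials.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((trials - trials.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical distances (km) run in 1 h by 8 runners over 3 trials.
rng = np.random.default_rng(1)
true_ability = rng.normal(16.1, 1.3, size=(8, 1))
trials = true_ability + rng.normal(0, 0.35, size=(8, 3))
print(f"CV = {typical_error_cv(trials):.1f}%, ICC = {icc_oneway(trials):.2f}")
```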

  11. Intersubject variability and reproducibility of 15O PET studies.

    PubMed

    Coles, Jonathan P; Fryer, Tim D; Bradley, Peter G; Nortje, Jurgens; Smielewski, Peter; Rice, Kenneth; Clark, John C; Pickard, John D; Menon, David K

    2006-01-01

    Oxygen-15 positron emission tomography (15O PET) can provide important data regarding patients with head injury. We provide reference data on intersubject variability and reproducibility of cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolism (CMRO2) and oxygen extraction fraction (OEF) in patients and healthy controls, and explored alternative ways of assessing reproducibility within the context of a single PET study. In addition, we used independent measurements of CBF and CMRO2 to investigate the effect of mathematical correlation on the relationship between flow and metabolism. In patients, intersubject coefficients of variation (CoV) for CBF, CMRO2 and OEF were larger than in controls (32.9%+/-2.2%, 23.2%+/-2.0% and 22.5%+/-3.4% versus 13.5%+/-1.4%, 12.8%+/-1.1% and 7.3%+/-1.2%), while CoV for CBV were lower (15.2%+/-2.1% versus 22.5%+/-2.8%) (P<0.001). The CoV for the test-retest reproducibility of CBF, CBV, CMRO2 and OEF in patients were 2.1%+/-1.5%, 3.8%+/-3.0%, 3.7%+/-3.0% and 4.6%+/-3.5%, respectively. These were much lower than the intersubject CoV figures, and were similar to alternative measures of reproducibility obtained by fractionating data from a single study. The physiological relationship between flow and metabolism was preserved even when mathematically independent measures were used for analysis. These data provide a context for the design and interpretation of interventional PET studies. While ideally each centre should develop its own bank of such data, the figures provided will allow initial generic approximations of sample size for such studies.

  12. Highly reproducible Bragg grating acousto-ultrasonic contact transducers

    NASA Astrophysics Data System (ADS)

    Saxena, Indu Fiesler; Guzman, Narciso; Lieberman, Robert A.

    2014-09-01

Fiber optic acousto-ultrasonic transducers offer numerous applications as embedded sensors for impact and damage detection in industrial and aerospace applications as well as non-destructive evaluation. Superficial contact transducers incorporating a sheet of fiber optic Bragg gratings have been demonstrated for guided-wave ultrasound measurements. It is reported here that this measurement method provides highly reproducible guided ultrasound data for the test composite component, even though the optical fiber transducers are not permanently embedded in it.

  13. Reproducing the assembly of massive galaxies within the hierarchical cosmogony

    NASA Astrophysics Data System (ADS)

    Fontanot, Fabio; Monaco, Pierluigi; Silva, Laura; Grazian, Andrea

    2007-12-01

In order to gain insight into the physical mechanisms leading to the formation of stars and their assembly in galaxies, we compare the predictions of the MOdel for the Rise of GAlaxies aNd Active nuclei (MORGANA) to the properties of K- and 850-μm-selected galaxies (such as number counts, redshift distributions and luminosity functions) by combining MORGANA with the spectrophotometric model GRASIL. We find that it is possible to reproduce the K- and 850-μm-band data sets at the same time and with a standard Salpeter initial mass function, and ascribe this success to our improved modelling of cooling in DM haloes. We then predict that massively star-forming discs are common at z ~ 2 and dominate the star formation rate, but most of them merge with other galaxies within ~100 Myr. Our preferred model produces an overabundance of bright galaxies at z < 1; this overabundance might be connected to the build-up of the diffuse stellar component in galaxy clusters, as suggested by Monaco et al., but a naive implementation of the mechanism suggested in that paper does not produce a sufficient slowdown of the evolution of these objects. Moreover, our model overpredicts the number of 10^10-10^11 Msolar galaxies at z ~ 1; this is a common behaviour of theoretical models, as shown by Fontana et al. These findings show that, while the overall build-up of the stellar mass is correctly reproduced by galaxy formation models, the 'downsizing' trend of galaxies is not fully reproduced yet. This hints at some missing feedback mechanism needed to reproduce, at the same time, the formation of both the massive and the small galaxies.

  14. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The Dice coefficient is used to examine topological similarity of constant-density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
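
    As an illustration of the analysis pipeline, the sketch below thresholds a (synthetic) connectivity matrix to a fixed edge density and computes several of the summary metrics named above using networkx; repeating this per session and feeding the values to an ICC (as in the running-performance example earlier in this listing) yields the reproducibility estimate. The density and matrix are illustrative assumptions.

```python
import networkx as nx
import numpy as np

def graph_metrics(adjacency, density=0.15):
    """Threshold a symmetric connectivity matrix to a fixed edge density
    and compute a few common graph-theoretic summary metrics."""
    n = adjacency.shape[0]
    triu = np.triu_indices(n, k=1)
    weights = adjacency[triu]
    n_keep = int(density * len(weights))
    cutoff = np.sort(weights)[-n_keep]     # keep only the strongest edges
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i, j, w in zip(*triu, weights):
        if w >= cutoff:
            G.add_edge(i, j)
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    return {
        "clustering": nx.average_clustering(G),
        "char_path_length": nx.average_shortest_path_length(giant),
        "largest_component": giant.number_of_nodes(),
        "global_efficiency": nx.global_efficiency(G),
        "assortativity": nx.degree_assortativity_coefficient(G),
    }

rng = np.random.default_rng(2)
A = rng.random((30, 30)); A = (A + A.T) / 2  # synthetic symmetric connectivity
print(graph_metrics(A))
```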

  15. High-Reproducibility and High-Accuracy Method for Automated Topic Classification

    NASA Astrophysics Data System (ADS)

    Lancichinetti, Andrea; Sirer, M. Irmak; Wang, Jane X.; Acuna, Daniel; Körding, Konrad; Amaral, Luís A. Nunes

    2015-01-01

    Much of human knowledge sits in large databases of unstructured text. Leveraging this knowledge requires algorithms that extract and record metadata on unstructured text documents. Assigning topics to documents will enable intelligent searching, statistical characterization, and meaningful classification. Latent Dirichlet allocation (LDA) is the state of the art in topic modeling. Here, we perform a systematic theoretical and numerical analysis that demonstrates that current optimization techniques for LDA often yield results that are not accurate in inferring the most suitable model parameters. Adapting approaches from community detection in networks, we propose a new algorithm that displays high reproducibility and high accuracy and also has high computational efficiency. We apply it to a large set of documents in the English Wikipedia and reveal its hierarchical structure.
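
    A simple way to probe the reproducibility issue is to fit LDA twice with different random seeds on the same corpus and compare the topic top-word sets; poor overlap across seeds is the instability the analysis points to. The sketch below does this on a synthetic corpus with known topics, using scikit-learn's LatentDirichletAllocation; all parameters are illustrative, and this is not the authors' network-based algorithm.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Build a synthetic corpus with known topic-word structure.
rng = np.random.default_rng(0)
n_topics, n_words, n_docs = 5, 200, 400
true_topics = rng.dirichlet(np.full(n_words, 0.05), size=n_topics)  # topic-word dists
doc_mix = rng.dirichlet(np.full(n_topics, 0.2), size=n_docs)        # doc-topic dists
X = np.vstack([rng.multinomial(150, p / p.sum())                    # word counts per doc
               for p in doc_mix @ true_topics])

def top_words(model, n=15):
    """Top-n word indices for each inferred topic."""
    return [frozenset(np.argsort(row)[-n:]) for row in model.components_]

# Fit the same model with two random seeds and compare inferred topics.
runs = [LatentDirichletAllocation(n_components=n_topics, random_state=s).fit(X)
        for s in (0, 1)]
overlap = np.mean([max(len(a & b) / 15 for b in top_words(runs[1]))
                   for a in top_words(runs[0])])
print(f"mean best-match top-word overlap between seeds: {overlap:.2f}")
```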

  16. Mechanostructure and composition of highly reproducible decellularized liver matrices.

    PubMed

    Mattei, G; Di Patria, V; Tirella, A; Alaimo, A; Elia, G; Corti, A; Paolicchi, A; Ahluwalia, A

    2014-02-01

    Despite the increasing number of papers on decellularized scaffolds, there is little consensus on the optimum method of decellularizing biological tissue such that the micro-architecture and protein content of the matrix are conserved as far as possible. Focusing on the liver, the aim of this study was therefore to develop a method for the production of well-characterized and reproducible matrices that best preserves the structure and composition of the native extra cellular matrix (ECM). Given the importance of matrix stiffness in regulating cell response, the mechanical properties of the decellularized tissue were also considered. The testing and analysis framework is based on the characterization of decellularized and untreated samples in the same reproducible initial state (i.e., the equilibrium swollen state). Decellularized ECM (dECM) were characterized using biochemical, histological, mechanical and structural analyses to identify the best procedure to ensure complete cell removal while preserving most of the native ECM structure and composition. Using this method, sterile decellularized porcine ECM with highly conserved intra-lobular micro-structure and protein content were obtained in a consistent and reproducible manner using the equilibrium swollen state of tissue or matrix as a reference. A significant reduction in the compressive elastic modulus was observed for liver dECM with respect to native tissue, suggesting a re-examination of design parameters for ECM-mimicking scaffolds for engineering tissues in vitro.

  17. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers. PMID:9787445
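
    Both estimates can be computed directly from 0/1 band-presence matrices. Below is a hedged sketch on synthetic data; a Jaccard-type similarity is assumed here for illustration, and the authors' exact coefficients may differ.

```python
import numpy as np

def band_repeatability(rep1, rep2):
    """Fraction of scored bands present in both replicate reactions.

    rep1, rep2: (individuals x bands) 0/1 matrices of band presence.
    """
    both = np.logical_and(rep1, rep2).sum()
    either = np.logical_or(rep1, rep2).sum()
    return both / either

def autosimilarity(rep1, rep2):
    """Mean per-individual similarity between an individual's two
    replicate banding patterns (Jaccard-type; an illustrative choice)."""
    sims = []
    for a, b in zip(rep1, rep2):
        either = np.logical_or(a, b).sum()
        if either:
            sims.append(np.logical_and(a, b).sum() / either)
    return float(np.mean(sims))

# Hypothetical data: 10 individuals, 40 scorable bands, and a 25% chance
# that any given band fails to amplify in a replicate.
rng = np.random.default_rng(3)
true_bands = rng.random((10, 40)) < 0.3
rep1 = true_bands & (rng.random((10, 40)) >= 0.25)
rep2 = true_bands & (rng.random((10, 40)) >= 0.25)
print(f"repeatability {band_repeatability(rep1, rep2):.2f}, "
      f"autosimilarity {autosimilarity(rep1, rep2):.2f}")
```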

  18. Representativity and reproducibility of DNA malignancy grading in different carcinomas.

    PubMed

    Böcking, A; Chatelain, R; Homge, M; Daniel, R; Gillissen, A; Wohltmann, D

    1989-04-01

    The reproducibility of the determination of the "DNA malignancy grade" (DNA-MG) was tested in 56 carcinomas of the colon, breast and lung while its representativity was tested on 195 slides from 65 tumors of the colon, breast and lung. DNA measurements were performed on Feulgen-stained smears with the TAS Plus TV-based image analysis system combined with an automated microscope. The variance of the DNA values of tumor cells around the 2c peak, the "2c deviation index" (2cDI), was taken as a basis for the computation of the DNA-MG, which ranges on a continuous scale from 0.01 to 3.00. The representativity, analyzed by comparison of the DNA-MGs measured in three different areas of the same tumor greater than or equal to 1.5 cm apart from each other, yielded an 81% agreement. No significant differences between DNA-MGs of these areas were found. The intraobserver and interobserver reproducibilities of the DNA grading system, investigated by repeated DNA measurements, were 83.9% and 82.2%, respectively. In comparison, histopathologic grading of the 27 breast cancers studied yielded 65% intraobserver and 57% interobserver reproducibilities and 66% representativity.

  19. An evaluation of RAPD fragment reproducibility and nature.

    PubMed

    Pérez, T; Albornoz, J; Domínguez, A

    1998-10-01

    Random amplified polymorphic DNA (RAPD) fragment reproducibility was assayed in three animal species: red deer (Cervus elaphus), wild boar (Sus scrofa) and fruit fly (Drosophila melanogaster). Ten 10-mer primers (Operon) were tested in two replicate reactions per individual under different stringency conditions (annealing temperatures of 35 degrees C or 45 degrees C). Two estimates were generated from the data: autosimilarity, which tests the reproducibility of overall banding patterns, and band repeatability, which tests the reproducibility of specific bands. Autosimilarity (the similarity of individuals with themselves) was lower than 1 for all three species ranging between values of 0.66 for Drosophila at 45 degrees C and 0.88 for wild boar at 35 degrees C. Band repeatability was estimated as the proportion of individuals showing homologous bands in both replicates. The fraction of repeatable bands was 23% for deer, 36% for boar and 26% for fruit fly, all at an annealing temperature of 35 degrees C. Raising the annealing temperature did not improve repeatability. Phage lambda DNA was subjected to amplification and the pattern of bands compared with theoretical expectations based on nucleotide sequence. Observed fragments could not be related to expected ones, even if a 2 bp mismatch is allowed. Therefore, the nature of genetic variation uncovered by the RAPD method is unclear. These data demonstrate that prudence should guide inferences about population structure and nucleotide divergence based on RAPD markers.

  20. Reproducibility of LCA models of crude oil production.

    PubMed

    Vafi, Kourosh; Brandt, Adam R

    2014-11-01

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply because of avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We study seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of the results of LCA GHG emissions is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and more reliable understanding of emissions.

  1. Git can facilitate greater reproducibility and increased transparency in science

    PubMed Central

    2013-01-01

    Background Reproducibility is the hallmark of good science. Maintaining a high degree of transparency in scientific reporting is essential not just for gaining trust and credibility within the scientific community but also for facilitating the development of new ideas. Sharing data and computer code associated with publications is becoming increasingly common, motivated partly in response to data deposition requirements from journals and mandates from funders. Despite this increase in transparency, it is still difficult to reproduce or build upon the findings of most scientific publications without access to a more complete workflow. Findings Version control systems (VCS), which have long been used to maintain code repositories in the software industry, are now finding new applications in science. One such open source VCS, Git, provides a lightweight yet robust framework that is ideal for managing the full suite of research outputs such as datasets, statistical code, figures, lab notes, and manuscripts. For individual researchers, Git provides a powerful way to track and compare versions, retrace errors, explore new approaches in a structured manner, while maintaining a full audit trail. For larger collaborative efforts, Git and Git hosting services make it possible for everyone to work asynchronously and merge their contributions at any time, all the while maintaining a complete authorship trail. In this paper I provide an overview of Git along with use-cases that highlight how this tool can be leveraged to make science more reproducible and transparent, foster new collaborations, and support novel uses. PMID:23448176

  2. Interrater reproducibility of clinical tests for rotator cuff lesions

    PubMed Central

    Ostor, A; Richards, C; Prevost, A; Hazleman, B; Speed, C

    2004-01-01

    Background: Rotator cuff lesions are common in the community but reproducibility of tests for shoulder assessment has not been adequately appraised and there is no uniform approach to their use. Objective: To study interrater reproducibility of standard tests for shoulder evaluation among a rheumatology specialist, rheumatology trainee, and research nurse. Methods: 136 patients were reviewed over 12 months at a major teaching hospital. The three assessors examined each patient in random order and were unaware of each other's evaluation. Each shoulder was examined in a standard manner by recognised tests for specific lesions and a diagnostic algorithm was used. Between-observer agreement was determined by calculating Cohen's κ coefficients (measuring agreement beyond that expected by chance). Results: Fair to substantial agreement was obtained for the observations of tenderness, painful arc, and external rotation. Tests for supraspinatus and subscapularis also showed at least fair agreement between observers. 40/55 (73%) κ coefficient assessments were rated at >0.2, indicating at least fair concordance between observers; 21/55 (38%) were rated at >0.4, indicating at least moderate concordance between observers. Conclusion: The reproducibility of certain tests, employed by observers of varying experience, in the assessment of the rotator cuff and general shoulder disease was determined. This has implications for delegation of shoulder assessment to nurse specialists, the development of a simplified evaluation schedule for general practitioners, and uniformity in epidemiological research studies. PMID:15361389

  3. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

In this work isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) curves and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, the whole geometry of an 18 MV LINAC and a water phantom were constructed, the simulations were run with the MCNPX code, and the PDD and profiles were obtained for all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose for any voxel size was also reproduced at any point of the irradiated volume, even when the voxels are considered to be of a pixel's size. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam over a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo. PMID:25045398
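
    The reconstruction idea, combining a PDD curve with a lateral profile to obtain a per-point dose percentage, can be sketched as below. This is a generic separable-dose illustration with hypothetical 18 MV-like curves, not the paper's algorithm.

```python
import numpy as np

def dose_percent(z, x, pdd, depths, profile, offsets):
    """Per-point dose percentage assuming separability:
    D(x, z) = PDD(z) * OAR(x) / 100, with OAR the off-axis ratio
    normalized to 100 on the central axis. A generic reconstruction,
    not the paper's exact algorithm.
    """
    pdd_z = np.interp(z, depths, pdd)       # percentage depth dose at depth z
    oar_x = np.interp(x, offsets, profile)  # off-axis ratio (%) at offset x
    return pdd_z * oar_x / 100.0

# Hypothetical 18 MV-like curves: build-up to ~3.2 cm, then exponential falloff.
depths = np.linspace(0, 30, 61)
pdd = np.where(depths < 3.2, 100 * depths / 3.2,
               100 * np.exp(-0.045 * (depths - 3.2)))
offsets = np.linspace(-10, 10, 41)
profile = 100 * (np.abs(offsets) < 5) + 5 * (np.abs(offsets) >= 5)  # crude 10x10 field

print(dose_percent(z=10.0, x=0.0, pdd=pdd, depths=depths,
                   profile=profile, offsets=offsets))  # ~74% on the central axis
```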

  4. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

In this work isodose curves are obtained with a new dosimetric algorithm that uses numerical data from percentage depth dose (PDD) curves and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, the whole geometry of an 18 MV LINAC and a water phantom were constructed, the simulations were run with the MCNPX code, and the PDD and profiles were obtained for all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose for any voxel size was also reproduced at any point of the irradiated volume, even when the voxels are considered to be of a pixel's size. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam over a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo. PMID:25045398

  5. Planar heterojunction perovskite solar cells with superior reproducibility.

    PubMed

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  6. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  7. Feasibility and Reproducibility of Echo Planar Spectroscopic Imaging on the Quantification of Hepatic Fat

    PubMed Central

    Lin, Yi-Ru; Chiu, Jian-Jia; Tsai, Shang-Yueh

    2014-01-01

Objectives 1H-MRS is widely regarded as the most accurate noninvasive method to quantify hepatic fat content (HFC). When a practical breath-hold period and the acquisition of HFC over multiple liver areas are considered, a fast MR spectroscopic imaging technique is desirable. The aim of this study is to examine the feasibility and reproducibility of echo planar spectroscopic imaging (EPSI) for the quantification of HFC in subjects with various HFCs. Methods Twenty-two volunteers were examined in a 3T MR system. The acquisition time of the proposed EPSI protocol was 18 seconds. The EPSI scans were repeated 8 times for each subject to test reproducibility. The peak of water and the individual peaks of fat, including the methyl, methylene, and allylic peaks at 0.9, 1.3, and 2.0 ppm, were fitted. Calculated amounts of water and fat content were corrected for T2 relaxation. The total HFC was defined as the combination of the individual peaks. The standard deviation (SD), coefficient of variance (COV) and fitting reliability of HFC quantified by LCModel were calculated. Results Our results show that the SDs of total HFC for all subjects are less than 2.5%. Fitting reliability is mostly under 10% and positively correlates with COV. Subjects separated into three subgroups according to quantified total HFC show that improved fitting reliability and reproducibility can be achieved in subjects with higher total HFC. Conclusions We have demonstrated the feasibility of the proposed EPSI protocol for the quantification of HFC over a whole slice of liver with a scan time within a single breath hold. PMID:25514348
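
    Once the water and fat peaks are fitted, the T2-corrected fat content reduces to a simple ratio. The sketch below assumes mono-exponential T2 correction; the relaxation times, echo time, and peak amplitudes are hypothetical numbers for illustration, not values from the study.

```python
import numpy as np

def hepatic_fat_fraction(water, fat_peaks, te_ms, t2_water_ms=25.0, t2_fat_ms=60.0):
    """T2-corrected hepatic fat content (%) from fitted peak amplitudes.

    water:     fitted water peak amplitude
    fat_peaks: amplitudes of the methyl (0.9 ppm), methylene (1.3 ppm)
               and allylic (2.0 ppm) fat peaks
    The T2 values are hypothetical literature-style numbers.
    """
    # Measured amplitude S(TE) = S0 * exp(-TE/T2); back-correct to S0.
    w = water * np.exp(te_ms / t2_water_ms)
    f = sum(fat_peaks) * np.exp(te_ms / t2_fat_ms)
    return 100.0 * f / (f + w)

print(f"HFC = {hepatic_fat_fraction(1000.0, [12.0, 70.0, 9.0], te_ms=15.0):.1f}%")
```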

  8. Reproducible and Consistent Quantification of the Saccharomyces cerevisiae Proteome by SWATH-mass spectrometry

    PubMed Central

    Selevsek, Nathalie; Chang, Ching-Yun; Gillet, Ludovic C.; Navarro, Pedro; Bernhardt, Oliver M.; Reiter, Lukas; Cheng, Lin-Yang; Vitek, Olga; Aebersold, Ruedi

    2015-01-01

    Targeted mass spectrometry by selected reaction monitoring (S/MRM) has proven to be a suitable technique for the consistent and reproducible quantification of proteins across multiple biological samples and a wide dynamic range. This performance profile is an important prerequisite for systems biology and biomedical research. However, the method is limited to the measurements of a few hundred peptides per LC-MS analysis. Recently, we introduced SWATH-MS, a combination of data independent acquisition and targeted data analysis that vastly extends the number of peptides/proteins quantified per sample, while maintaining the favorable performance profile of S/MRM. Here we applied the SWATH-MS technique to quantify changes over time in a large fraction of the proteome expressed in Saccharomyces cerevisiae in response to osmotic stress. We sampled cell cultures in biological triplicates at six time points following the application of osmotic stress and acquired single injection data independent acquisition data sets on a high-resolution 5600 tripleTOF instrument operated in SWATH mode. Proteins were quantified by the targeted extraction and integration of transition signal groups from the SWATH-MS datasets for peptides that are proteotypic for specific yeast proteins. We consistently identified and quantified more than 15,000 peptides and 2500 proteins across the 18 samples. We demonstrate high reproducibility between technical and biological replicates across all time points and protein abundances. In addition, we show that the abundance of hundreds of proteins was significantly regulated upon osmotic shock, and pathway enrichment analysis revealed that the proteins reacting to osmotic shock are mainly involved in the carbohydrate and amino acid metabolism. Overall, this study demonstrates the ability of SWATH-MS to efficiently generate reproducible, consistent, and quantitatively accurate measurements of a large fraction of a proteome across multiple samples. PMID

  9. The validity and reproducibility of clinical assessment of nutritional status in the elderly.

    PubMed

    Duerksen, D R; Yeo, T A; Siemens, J L; O'Connor, M P

    2000-09-01

Malnutrition is an important predictor of morbidity and mortality. In the non-elderly, a subjective global assessment (SGA) has been developed. It has a high inter-rater agreement, correlates with other measures of nutritional status, and predicts subsequent morbidity. The purpose of this study was to determine the validity and reproducibility of the SGA in a group of patients older than 70 y of age. Consecutive patients from four geriatric/rehabilitation units were considered for the study. Each patient underwent independent nutritional assessments by a geriatrician and a senior medical resident. At the completion of the assessment, skinfold caliper measurements were obtained and the patient was reclassified according to the results, which were then compared with objective measures of nutritional status. Six-month follow-up was obtained on all patients. The agreement between the two clinicians was 0.48 +/- 0.17 (unweighted kappa), which represents moderate agreement and is less than the reported agreement in non-elderly subjects. Skinfold calipers improved the agreement between clinicians but did not improve the correlation with other nutritional markers or the prediction of morbidity and mortality. There was a correlation between a patient's severely malnourished state and mortality. In addition, patients with a body mass index (BMI) of <75% or >150% of age/sex-standardized norms had increased mortality. The SGA is a reproducible and valid tool for determining nutritional status in the elderly. The reproducibility is less than in the non-elderly, which may relate to changes in body composition or the ability to obtain an accurate nutritional history.

  10. REPRODUCIBILITY OF INTRA-ABDOMINAL PRESSURE MEASURED DURING PHYSICAL ACTIVITIES VIA A WIRELESS VAGINAL TRANSDUCER

    PubMed Central

    Egger, Marlene J.; Hamad, Nadia M.; Hitchcock, Robert W.; Coleman, Tanner J.; Shaw, Janet M.; Hsu, Yvonne; Nygaard, Ingrid E.

    2014-01-01

Aims In the urodynamics laboratory setting, a wireless pressure transducer, developed to facilitate research exploring intra-abdominal pressure (IAP) and pelvic floor disorders, was highly accurate. We aimed to study the reproducibility of IAP measured using this transducer in women during activities performed in an exercise science laboratory. Methods Fifty-seven women (mean ± SD: age 30.4 ± 9.3 years; body mass index 22.4 ± 2.68 kg/m2) completed two standardized activity sessions using the same transducer at least three days apart. Pressure data for 31 activities were transmitted wirelessly to a base station and analyzed for mean net maximal IAP, area under the curve (AUC) and first moment of the area (FMA). Activities included typical exercises, lifting 13.6 to 18.2 kg, and simulated household tasks. Analysis for test-retest reliability included Bland-Altman plots with absolute limits of agreement (ALOA), Wilcoxon signed rank tests to assess significant differences between sessions, intraclass correlations, and kappa statistics to assess inter-session agreement in the highest vs. other quintiles of maximal IAP. Results Few activities exhibited significant differences between sessions in maximal IAP, or in AUC and FMA values. For 13 activities, the agreement between repeat measures of maximal IAP was better than ± 10 cm H2O; for 20 activities, better than ± 15 cm H2O. ALOA increased with mean IAP. The highest quintile of IAP demonstrated fair/substantial agreement between sessions in 25 of 30 activities. Conclusion Reproducibility of IAP depends on the activity undertaken. Interventions geared towards lowering IAP should account for this, and maximize efforts to improve IAP reproducibility. PMID:25730430
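
    The Bland-Altman absolute limits of agreement used here are straightforward to compute: the mean difference between sessions plus or minus 1.96 SD of the differences. A minimal sketch on hypothetical session data follows.

```python
import numpy as np

def bland_altman_aloa(session1, session2):
    """Bias and absolute limits of agreement between two sessions:
    mean difference +/- 1.96 SD of the differences."""
    diff = np.asarray(session2) - np.asarray(session1)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical maximal IAP (cm H2O) for one activity across two sessions.
rng = np.random.default_rng(4)
s1 = rng.normal(45, 12, size=57)
s2 = s1 + rng.normal(0, 5, size=57)
bias, (lo, hi) = bland_altman_aloa(s1, s2)
print(f"bias {bias:.1f} cm H2O, limits of agreement ({lo:.1f}, {hi:.1f})")
```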

  11. Submicroscopic malaria parasite carriage: how reproducible are polymerase chain reaction-based methods?

    PubMed

    Costa, Daniela Camargos; Madureira, Ana Paula; Amaral, Lara Cotta; Sanchez, Bruno Antônio Marinho; Gomes, Luciano Teixeira; Fontes, Cor Jésus Fernandes; Limongi, Jean Ezequiel; Brito, Cristiana Ferreira Alves de; Carvalho, Luzia Helena

    2014-02-01

The polymerase chain reaction (PCR)-based methods for the diagnosis of malaria infection are expected to accurately identify submicroscopic parasite carriers. Although a significant number of PCR protocols have been described, few studies have addressed the performance of PCR amplification in cases of field samples with submicroscopic malaria infection. Here, the reproducibility of two well-established PCR protocols (nested-PCR and real-time PCR for the Plasmodium 18S small subunit rRNA gene) was evaluated in a panel of 34 blood field samples from individuals that are potential reservoirs of malaria infection, but were negative for malaria by optical microscopy. Regardless of the PCR protocol, a large variation between the PCR replicates was observed, leading to alternating positive and negative results in 38% (13 out of 34) of the samples. These findings were quite different from those obtained from the microscopy-positive patients or the unexposed individuals; the diagnosis of these individuals could be confirmed based on the high reproducibility and specificity of the PCR-based protocols. The limitation of PCR amplification was restricted to the field samples with very low levels of parasitaemia, because titrations of the DNA templates were able to detect < 3 parasites/µL in the blood. In conclusion, conventional PCR protocols require careful interpretation in cases of submicroscopic malaria infection, as inconsistent and false-negative results can occur.

  12. Reproducibility of transcranial magnetic stimulation metrics in the study of proximal upper limb muscles

    PubMed Central

    Sankarasubramanian, Vishwanath; Roelle, Sarah; Bonnett, Corin E; Janini, Daniel; Varnerin, Nicole; Cunningham, David A; Sharma, Jennifer S; Potter-Baker, Kelsey A; Wang, Xiaofeng; Yue, Guang H; Plow, Ela B

    2015-01-01

Objective Reproducibility of transcranial magnetic stimulation (TMS) metrics is essential for accurately tracking recovery and disease. However, the majority of evidence pertains to reproducibility of metrics for distal upper limb muscles. We investigate, for the first time, the reliability of corticospinal physiology for a large proximal muscle, the biceps brachii, and relate how varying statistical analyses can influence interpretations. Methods 14 young right-handed healthy participants completed two sessions assessing resting motor threshold (RMT), motor evoked potentials (MEPs), motor map and intra-cortical inhibition (ICI) from the left biceps brachii. Analyses included paired t-tests, Pearson's, intra-class (ICC) and concordance correlation coefficients (CCC) and Bland-Altman plots. Results Unlike paired t-tests, ICC, CCC and Pearson's coefficients were >0.6, indicating good reliability for RMTs, MEP intensities and map locations; however, values were <0.3 for MEP responses and ICI. Conclusions Corticospinal physiology, defining excitability and output in terms of the intensity of the TMS device, and spatial loci are the most reliable metrics for the biceps. MEPs and variables based on MEPs are less reliable, since the biceps receives fewer cortico-motor-neuronal projections. Statistical tests of agreement and association are more powerful reliability indices than inferential tests. Significance Reliable metrics of proximal muscles, when translated to a larger number of participants, would serve to sensitively track and prognosticate function in neurological disorders such as stroke, where proximal recovery precedes distal. PMID:26111434

  13. A novel methodology to reproduce previously recorded six-degree of freedom kinematics on the same diarthrodial joint.

    PubMed

    Moore, Susan M; Thomas, Maribeth; Woo, Savio L-Y; Gabriel, Mary T; Kilger, Robert; Debski, Richard E

    2006-01-01

The objective of this study was to develop a novel method to more accurately reproduce previously recorded 6-DOF kinematics of the tibia with respect to the femur using robotic technology. Furthermore, the effect of performing only a single registration versus multiple registrations, and the effect of robot joint configuration, were investigated. A single registration consisted of registering the tibia and femur with respect to the robot at full extension and reproducing all kinematics, while multiple registrations consisted of registering the bones at each flexion angle and reproducing only the kinematics of the corresponding flexion angle. Kinematics of the knee in response to an anterior (134 N) and combined internal/external (+/-10 N m) and varus/valgus (+/-5 N m) loads were collected at 0 degrees, 15 degrees, 30 degrees, 60 degrees, and 90 degrees of flexion. A six-axis, serial-articulated robotic manipulator (PUMA Model 762) was calibrated and the working volume was reduced to improve the robot's accuracy. The effect of the robot joint configuration was determined by performing single and multiple registrations for three selected configurations. For each robot joint configuration, the accuracy in position of the reproduced kinematics improved after multiple registrations (0.7+/-0.3, 1.2+/-0.5, and 0.9+/-0.2 mm, respectively) when compared to only a single registration (1.3+/-0.9, 2.0+/-1.0, and 1.5+/-0.7 mm, respectively) (p<0.05). The accuracy in position of each robot joint configuration was unique, as significant differences were detected between each of the configurations. These data demonstrate that the number of registrations and the robot joint configuration both affect the accuracy of the reproduced kinematics. Therefore, when using robotic technology to reproduce previously recorded kinematics, it may be necessary to perform these analyses for each individual robotic system and for each diarthrodial joint, as different joints will require the robot to be placed in

  14. Accuracy and reproducibility of low dose insulin administration using pen-injectors and syringes

    PubMed Central

    Gnanalingham, M; Newland, P; Smith, C

    1998-01-01

Many children with diabetes require small doses of insulin administered with syringes or pen-injector devices (at the Booth Hall Paediatric Diabetic Clinic, 20% of children aged 0-5 years receive 1-2 U insulin doses). To determine how accurately and reproducibly small doses are delivered, 1, 2, 5, and 10 U doses of soluble insulin (100 U/ml) were dispensed in random order 15 times from five new NovoPens (1.5 ml), five BD-Pens (1.5 ml), and by five nurses using 30 U syringes. Each dose was weighed, and intended and actual doses compared. The two pen-injectors delivered less insulin than syringes, the differences being inversely proportional to dose. For 1 U (mean (SD)): 0.89 (0.04) U (NovoPen), 0.92 (0.03) U (BD-Pen), 1.23 (0.09) U (syringe); and for 10 U: 9.8 (0.1) U (NovoPen), 9.9 (0.1) U (BD-Pen), 10.1 (0.1) U (syringe). The accuracy (percentage errors) of the two pen-injectors was similar, and both were more accurate than syringes when delivering 1, 2, and 5 U of insulin. Errors for 1 U: 11(4)% (NovoPen), 8(3)% (BD-Pen), 23(9)% (syringe). The reproducibility (coefficient of variation) of actual doses was similar (< 7%) for all three devices, which were equally consistent at underdosing (pen-injectors) or overdosing (syringes) insulin. All three devices, especially syringes, are unacceptably inaccurate when delivering 1 U doses of insulin. Patients on low doses need to be educated that their dose may alter when they transfer from one device to another. PMID:9771255

  15. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well. PMID:26930054

  16. Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

    SciTech Connect

    Jimenez, Ivo; Maltzahn, Carlos; Lofstead, Jay; Moody, Adam; Mohror, Kathryn; Arpaci-Dusseau, Remzi; Arpaci-Dusseau, Andrea

    2015-05-04

    Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining if an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, envisioning techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that an experiment re-executioner goes through. We notice that validating results is done on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment as well as describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.
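
    The declarative format itself is not reproduced in this record; as a rough sketch of the idea (field names hypothetical, Python assumed), an experiment can be described by its high-level components plus generic, testable conditions, and validation then reduces to evaluating those conditions against measured results rather than against exact quantitative metrics:

        import operator

        # Hypothetical declarative spec: components plus testable conditions.
        SPEC = {
            "components": {"workload": "sequential-write", "storage": "ceph"},
            "conditions": [
                # (metric, comparator, reference metric or constant)
                ("throughput_local", ">=", "throughput_network"),
                ("cpu_utilization", "<", 0.9),
            ],
        }

        OPS = {">=": operator.ge, "<": operator.lt, ">": operator.gt}

        def validate(results, spec):
            """Check each declared condition against measured results."""
            outcomes = {}
            for metric, op, ref in spec["conditions"]:
                rhs = results[ref] if isinstance(ref, str) else ref
                outcomes[f"{metric} {op} {ref}"] = OPS[op](results[metric], rhs)
            return outcomes

        # Re-execution produces raw numbers; validation only tests conditions.
        measured = {"throughput_local": 480.0, "throughput_network": 430.0,
                    "cpu_utilization": 0.62}
        print(validate(measured, SPEC))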

  17. ACCURATE ORBITAL INTEGRATION OF THE GENERAL THREE-BODY PROBLEM BASED ON THE D'ALEMBERT-TYPE SCHEME

    SciTech Connect

    Minesaki, Yukitaka

    2013-03-15

    We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.

  18. Accurate Orbital Integration of the General Three-body Problem Based on the d'Alembert-type Scheme

    NASA Astrophysics Data System (ADS)

    Minesaki, Yukitaka

    2013-03-01

    We propose an accurate orbital integration scheme for the general three-body problem that retains all conserved quantities except angular momentum. The scheme is provided by an extension of the d'Alembert-type scheme for constrained autonomous Hamiltonian systems. Although the proposed scheme is merely second-order accurate, it can precisely reproduce some periodic, quasiperiodic, and escape orbits. The Levi-Civita transformation plays a role in designing the scheme.
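
    The d'Alembert-type scheme itself is not spelled out in these records. For contrast, a generic second-order symplectic (kick-drift-kick leapfrog) integrator for the three-body problem is sketched below (Python with numpy, G = 1 units assumed); note it conserves linear and angular momentum exactly but energy only approximately, roughly the complement of the scheme described above, and the energy diagnostic is the kind of check such conservation claims are tested against:

        import numpy as np

        def leapfrog_three_body(x, v, m, dt, steps, G=1.0):
            """Second-order kick-drift-kick leapfrog for three bodies.
            x, v: (3, dim) arrays of positions and velocities; m: (3,) masses."""
            def acc(x):
                a = np.zeros_like(x)
                for i in range(3):
                    for j in range(3):
                        if i != j:
                            r = x[j] - x[i]
                            a[i] += G * m[j] * r / np.linalg.norm(r)**3
                return a
            for _ in range(steps):
                v += 0.5 * dt * acc(x)
                x += dt * v
                v += 0.5 * dt * acc(x)
            return x, v

        def energy(x, v, m, G=1.0):
            kin = 0.5 * np.sum(m * np.sum(v**2, axis=1))
            pot = sum(-G * m[i] * m[j] / np.linalg.norm(x[i] - x[j])
                      for i in range(3) for j in range(i + 1, 3))
            return kin + pot

        # Well-known figure-eight choreography initial conditions (approximate).
        x = np.array([[0.97000436, -0.24308753],
                      [-0.97000436, 0.24308753],
                      [0.0, 0.0]])
        v = np.array([[0.46620368, 0.43236573],
                      [0.46620368, 0.43236573],
                      [-0.93240737, -0.86473146]])
        m = np.ones(3)
        e0 = energy(x, v, m)
        x, v = leapfrog_three_body(x, v, m, dt=1e-3, steps=6000)
        print("relative energy drift:", abs(energy(x, v, m) - e0) / abs(e0))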

  19. An evaluation of WRF's ability to reproduce the surface wind over complex terrain based on typical circulation patterns

    NASA Astrophysics Data System (ADS)

    Jiménez, P. A.; Dudhia, J.; González-Rouco, J. F.; Montávez, J. P.; García-Bustamante, E.; Navarro, J.; Vilà-Guerau de Arellano, J.; Muñoz-Roldán, A.

    2013-07-01

    The performance of the Weather Research and Forecasting (WRF) model in reproducing surface wind circulations over complex terrain is examined. The atmospheric evolution is simulated using two versions of the WRF model over a 13-year period (1992 to 2005) in a complex terrain region located in the northeast of the Iberian Peninsula. A high horizontal resolution of 2 km is used to provide an accurate representation of the terrain features. The multiyear evaluation focuses on the accuracy with which the WRF simulations reproduce the wind field of the six typical wind patterns (WPs) identified over the area in a previous observational work. Each pattern contains a large number of days, which allows solid conclusions to be reached regarding model performance. The accuracy of the simulations in reproducing the wind field under representative synoptic situations, or pressure patterns (PPs), of the Iberian Peninsula is also inspected in order to diagnose errors as a function of the large-scale situation. The evaluation is accomplished using daily averages in order to inspect the ability of WRF to reproduce the surface flow as a result of the interaction between the synoptic scale and the regional topography. Results indicate that model errors can originate from problems in the initial and lateral boundary conditions, misrepresentations at the synoptic scale, or the realism of the topographic features.

  20. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  1. Accurate 12D dipole moment surfaces of ethylene

    NASA Astrophysics Data System (ADS)

    Delahaye, Thibault; Nikitin, Andrei V.; Rey, Michael; Szalay, Péter G.; Tyuterev, Vladimir G.

    2015-10-01

    Accurate ab initio full-dimensional dipole moment surfaces of ethylene are computed using the coupled-cluster approach and its explicitly correlated counterpart CCSD(T)-F12, combined with the cc-pVQZ and cc-pVTZ-F12 basis sets, respectively. Their analytical representations are provided through 4th-order normal mode expansions. First-principles predictions of the line intensities, obtained with a variational method up to J = 30, are in excellent agreement with the experimental data in the range of 0-3200 cm-1. Errors of 0.25-6.75% in integrated intensities for fundamental bands are comparable with experimental uncertainties. The overall calculated C2H4 opacity in the 600-3300 cm-1 range agrees with the experimental determination to better than 0.5%.

  2. On the reproducibility of protein crystal structures: five atomic resolution structures of trypsin

    SciTech Connect

    Liebschner, Dorothee; Dauter, Miroslawa; Brzuszkiewicz, Anna; Dauter, Zbigniew

    2013-08-01

    Details of five very high-resolution accurate structures of bovine trypsin are compared in the context of the reproducibility of models obtained from crystals grown under identical conditions. Structural studies of proteins usually rely on a model obtained from one crystal. By investigating the details of this model, crystallographers seek to obtain insight into the function of the macromolecule. It is therefore important to know which details of a protein structure are reproducible or to what extent they might differ. To address this question, the high-resolution structures of five crystals of bovine trypsin obtained under analogous conditions were compared. Global parameters and structural details were investigated. All of the models were of similar quality and the pairwise merged intensities had large correlation coefficients. The Cα and backbone atoms of the structures superposed very well. The occupancy of ligands in regions of low thermal motion was reproducible, whereas solvent molecules containing heavier atoms (such as sulfur) or those located on the surface could differ significantly. The coordination lengths of the calcium ion were conserved. A large proportion of the multiple conformations refined to similar occupancies and the residues adopted similar orientations. More than three quarters of the water-molecule sites were conserved within 0.5 Å and more than one third were conserved within 0.1 Å. An investigation of the protonation states of histidine residues and carboxylate moieties was consistent for all of the models. Radiation-damage effects to disulfide bridges were observed for the same residues and to similar extents. Main-chain bond lengths and angles averaged to similar values and were in agreement with the Engh and Huber targets. Other features, such as peptide flips and the double conformation of the inhibitor molecule, were also reproducible in all of the trypsin structures. Therefore, many details are similar in models obtained

  3. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E administrations made one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E administrations was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878
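
    The two statistics used above are straightforward to compute. A minimal sketch follows (Python with numpy; the intake values are invented for illustration, and ICC(1,1) is only one of several ICC variants, the record does not state which was used):

        import numpy as np

        def pearson_r(a, b):
            return np.corrcoef(a, b)[0, 1]

        def icc_oneway(a, b):
            """One-way random-effects ICC(1,1) for two repeated measurements."""
            data = np.column_stack([a, b])   # n subjects x 2 administrations
            n, k = data.shape
            grand = data.mean()
            ms_between = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
            ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # Hypothetical energy intakes (kcal) from two administrations a year apart.
        first = np.array([2100, 1850, 2400, 1990, 2250], dtype=float)
        second = np.array([2050, 1900, 2350, 2080, 2210], dtype=float)
        print(pearson_r(first, second), icc_oneway(first, second))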

  4. Repeatability and Reproducibility of Decisions by Latent Fingerprint Examiners

    PubMed Central

    Ulery, Bradford T.; Hicklin, R. Austin; Buscaglia, JoAnn; Roberts, Maria Antonia

    2012-01-01

    The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as “difficult” than for “easy” or “moderate” comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases. PMID:22427888

  5. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style in which we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version-controlled) digital notebooks that illustrate and record analysis of output. These have the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
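
    One of the techniques mentioned, checksums on experiment output, can be illustrated with a short sketch (Python; the directory layout and file pattern are hypothetical, not MOM6/SIS2 conventions):

        import hashlib
        import json
        import pathlib

        def checksum_manifest(output_dir, pattern="*.nc"):
            """Record a sha256 checksum for every model output file.

            The manifest is small enough to keep under version control next
            to run-time parameters, so solution changes show up as diffs."""
            manifest = {}
            for path in sorted(pathlib.Path(output_dir).glob(pattern)):
                manifest[path.name] = hashlib.sha256(path.read_bytes()).hexdigest()
            return manifest

        # e.g. write experiment/output checksums to a version-controlled file:
        # json.dump(checksum_manifest("experiment/output"),
        #           open("checksums.json", "w"), indent=2)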

  6. Venusian Polar Vortex reproduced by a general circulation model

    NASA Astrophysics Data System (ADS)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degrees latitude, a unique feature of the Venus atmosphere called the 'cold collar' [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, the formation mechanism is still unknown. In addition, an axi-asymmetric feature is always seen in the warm polar vortex. It changes temporally and sometimes shows a hot polar dipole or S-shaped structure, as shown by many infrared measurements [e.g. Garate-Lopez et al. 2013; 2015]. However, its vertical structure has not been investigated. To address these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced structures of the atmosphere and the axi-asymmetric feature are then compared with previous observational results. In addition, a quasi-periodic zonal-mean zonal wind fluctuation is seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that polar vacillation might also occur in the Venus atmosphere, similar to the Earth's polar atmosphere. We will also show some initial results on this point in this presentation.

  7. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with electron densities ranging from 10^9 to 2.5 × 10^11 cm^-3. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in the UHF-, L-, and S-bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath and for research on communications, navigation, electromagnetic mitigation, and antenna compensation in plasma sheaths.
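
    The reported density range is consistent with cutoff frequencies spanning these bands, since an electromagnetic wave is reflected below the electron plasma frequency. A quick check with the standard approximation f_p ≈ 8980 · sqrt(n_e) Hz, for n_e in cm^-3 (Python sketch):

        import math

        def plasma_frequency_hz(n_e_cm3):
            """Electron plasma frequency, f_p ~ 8980 * sqrt(n_e) Hz (n_e in cm^-3)."""
            return 8980.0 * math.sqrt(n_e_cm3)

        for n_e in (1e9, 2.5e11):
            print(f"n_e = {n_e:.1e} cm^-3 -> f_p = {plasma_frequency_hz(n_e)/1e9:.2f} GHz")
        # ~0.28 GHz at 1e9 cm^-3 and ~4.5 GHz at 2.5e11 cm^-3,
        # i.e. cutoffs sweeping from UHF- up through S-band.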

  8. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763
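
    Concordance here is simply the fraction of replicate assays with matching overall classifications. A minimal sketch (Python; the +/- carcinogenicity calls below are invented for illustration):

        def concordance(labels_a, labels_b):
            """Fraction of replicate experiments with matching classifications."""
            assert len(labels_a) == len(labels_b)
            matches = sum(a == b for a, b in zip(labels_a, labels_b))
            return matches / len(labels_a)

        # Hypothetical calls for the same chemicals from the two CPDB parts.
        nci_ntp = ["+", "-", "+", "+", "-", "+", "-"]
        literature = ["+", "+", "+", "-", "-", "+", "-"]
        print(f"concordance = {concordance(nci_ntp, literature):.0%}")  # 71% here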

  9. Inter-study reproducibility of cardiovascular magnetic resonance tagging

    PubMed Central

    2013-01-01

    Background The aim of this study is to determine the test-retest reliability of the measurement of regional myocardial function by cardiovascular magnetic resonance (CMR) tagging using spatial modulation of magnetization. Methods Twenty-five participants underwent CMR tagging twice over 12 ± 7 days. To assess the role of slice orientation on strain measurement, two healthy volunteers had a first exam, followed by image acquisition repeated with slices rotated ±15 degrees out of true short axis, followed by a second exam in the true short axis plane. To assess the role of slice location, two healthy volunteers had whole heart tagging. The harmonic phase (HARP) method was used to analyze the tagged images. Peak midwall circumferential strain (Ecc), radial strain (Err), Lambda 1, Lambda 2, and Angle α were determined in basal, mid and apical slices. LV torsion, systolic and early diastolic circumferential strain and torsion rates were also determined. Results LV Ecc and torsion had excellent intraobserver, interobserver, and inter-study intraclass correlation coefficients (ICC range, 0.7 to 0.9). Err, Lambda 1, Lambda 2 and angle had better intra- and interobserver ICCs than inter-study ICCs. Angle had the least inter-study reproducibility. Torsion rates had superior intra-, interobserver, and inter-study reproducibility to strain rates. The measurements of LV Ecc were comparable in all three slices with different short axis orientations (standard deviation of mean Ecc was 0.09, 0.18 and 0.16 at basal, mid and apical slices, respectively). The mean difference in LV Ecc between slices was more pronounced in most of the basal slices compared to the rest of the heart. Conclusions Intraobserver and interobserver reproducibility of all strain and torsion parameters was excellent. Inter-study reproducibility of CMR tagging by SPAMM varied between different parameters as described in the results above and was superior for Ecc and LV torsion. The variation in LV Ecc

  10. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    PubMed Central

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60-minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region-of-interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher (below ~10%). Some sequences are particularly variable in specific structures (ASL exhibited variation of 28% in the cerebral white matter) or in thin structures (quantitative T2 varied by up to 73% in the caudate) due, in large part, to variability in automated ROI placement.

  11. Right ventricular ejection fraction measured by first-pass intravenous krypton-81m: reproducibility and comparison with technetium-99m.

    PubMed

    Wong, D F; Natarajan, T K; Summer, W; Tibbits, P A; Beck, T; Koller, D; Kasecamp, W; Lamb, J; Olynyk, J; Philp, M S

    1985-11-01

    Study of the effects of various diseases and of therapeutic manipulation of pulmonary vascular resistance on the right ventricle has been restricted by methodologic limitations. The radioactive gas in solution, krypton-81m, was used to study the right ventricle, and the technique was compared with a technetium-99m method. In 22 subjects, first-pass krypton-81m right ventricular ejection fraction, acquired both in list mode and in electrocardiogram-gated frame mode, correlated well (r = 0.81 and 0.86, respectively, p < 0.01) with that determined by technetium-99m first-pass studies over a broad range of ventricular function. The reproducibility of the technique was excellent (r = 0.84 and 0.95 for each acquisition mode, respectively). Krypton-81m first-pass studies provide accurate and reproducible estimates of right ventricular function. Use of krypton allows multiple measurements, with or without perturbations, over a short period of time.

  12. Quantum theory as the most robust description of reproducible experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel

    2014-08-01

    suggests that quantum theory is a powerful language to describe a certain class of statistical experiments but remains vague about the properties of the class. Similar views were expressed by other fathers of quantum mechanics, e.g., Max Born and Wolfgang Pauli [50]. They can be summarized as "Quantum theory describes our knowledge of the atomic phenomena rather than the atomic phenomena themselves". Our aim is, in a sense, to replace the philosophical components of these statements by well-defined mathematical concepts and to carefully study their relevance for physical phenomena. Specifically, by applying the general formalism of logical inference to a well-defined class of statistical experiments, the present paper shows that quantum theory is indeed the kind of language envisaged by Bohr. Theories such as Newtonian mechanics, Maxwell's electrodynamics, and Einstein's (general) relativity are deductive in character. Starting from a few axioms, abstracted from experimental observations and additional assumptions about the irrelevance of a large number of factors for the description of the phenomena of interest, deductive reasoning is used to prove or disprove unambiguous statements, propositions, about the mathematical objects which appear in the theory. The method of deductive reasoning conforms to the Boolean algebra of propositions. The deductive, reductionist methodology has the appealing feature that one can be sure that the propositions are either right or wrong, and disregarding the possibility that some of the premises on which the deduction is built may not apply, there is no doubt that the conclusions are correct. Clearly, these theories successfully describe a wide range of physical phenomena in a manner and language which is unambiguous and independent of the individual. At the same time, the construction of a physical theory, and a scientific theory in general, from "first principles" is, for sure, not something self-evident, and not even safe. Our basic

  13. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire

    PubMed Central

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes, so the ability to quickly assess an individual’s dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual’s total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal-weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool for classifying individuals based on their habitual dietary fibre intakes. PMID:27626442

  14. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire.

    PubMed

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes, so the ability to quickly assess an individual's dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual's total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal-weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool for classifying individuals based on their habitual dietary fibre intakes.

  16. General theory of experiment containing reproducible data: The reduction to an ideal experiment

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Raoul R.; Zhang, Wei; Striccoli, Domenico

    2015-10-01

    The authors suggest a general theory that treats all experiments involving measurements of reproducible data within one unified scheme. The suggested algorithm does not contain unjustified suppositions, and the final function extracted from the measurements can be compared with the hypothesis suggested by the theory adopted to explain the object/phenomenon studied. This true function is free from the influence of the apparatus (instrumental) function and, when the "best fit" or most acceptable hypothesis is absent, can be presented as a segment of the Fourier series. The discrete set of decomposition coefficients describes the final function quantitatively and can serve as an intermediate model that coincides with the amplitude-frequency response (AFR) of the object studied. It can also be used by theoreticians for comparison of the suggested theory with experimental observations. Two examples (Raman spectra of distilled water and packet exchange between two wireless sensor nodes) confirm the basic elements of this general theory. From this general theory the following important conclusions follow: 1. Prony's decomposition should be used for detection of quasi-periodic processes and for quantitative description of reproducible data. 2. A segment of the Fourier series should be used as the fitting function for description of observable data corresponding to an ideal experiment. The transition from the initial Prony decomposition to the conventional Fourier transform also implies the elimination of the apparatus function, which plays an important role in reproducible data processing. 3. The suggested theory will be helpful for creation of a unified metrological standard (UMS) that should be used in comparison of similar data obtained from the same object but in different laboratories with different equipment. 4. Many cases when the conventional theory confirms the experimental
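
    As a small numerical illustration of conclusion 2 (a truncated Fourier series as the fitting function), the coefficients can be obtained by ordinary least squares. This sketch uses synthetic data, not the authors' algorithm (Python with numpy):

        import numpy as np

        def fit_fourier_segment(t, y, n_harmonics, period):
            """Least-squares fit of a truncated Fourier series to data."""
            w = 2.0 * np.pi / period
            cols = [np.ones_like(t)]
            for k in range(1, n_harmonics + 1):
                cols += [np.cos(k * w * t), np.sin(k * w * t)]
            A = np.column_stack(cols)
            coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coeffs, A @ coeffs

        # Synthetic reproducible signal: one harmonic plus measurement noise.
        t = np.linspace(0.0, 10.0, 500)
        y = (1.0 + 0.8 * np.sin(2 * np.pi * t / 10)
             + 0.1 * np.random.default_rng(1).normal(size=t.size))
        coeffs, y_fit = fit_fourier_segment(t, y, n_harmonics=5, period=10.0)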

  17. Light Field Imaging Based Accurate Image Specular Highlight Removal.

    PubMed

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into "unsaturated" and "saturated" categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of our proposed algorithm. PMID:27253083

  18. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of our proposed algorithm. PMID:27253083

  19. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors (a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis) for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than in the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable. PMID:26919473
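
    The Bayes factor computation in the paper accounts for publication bias and is considerably more involved; as a crude illustration of the quantity itself, the Schwarz (BIC) approximation for a one-sample test can be sketched as follows (Python with numpy; simulated data):

        import numpy as np

        def bf10_bic(x):
            """Crude Bayes factor BF10 for H1: mean != 0 vs H0: mean = 0,
            via the Schwarz (BIC) approximation BF10 ~ exp((BIC0 - BIC1)/2)."""
            n = len(x)
            rss0 = np.sum(x ** 2)               # H0: mean fixed at 0
            rss1 = np.sum((x - x.mean()) ** 2)  # H1: one extra free parameter
            bic0 = n * np.log(rss0 / n)
            bic1 = n * np.log(rss1 / n) + np.log(n)
            return np.exp((bic0 - bic1) / 2.0)

        x = np.random.default_rng(7).normal(0.3, 1.0, 40)  # simulated effect
        print(bf10_bic(x))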

  1. [Study of the validity and reproducibility of passive ozone monitors].

    PubMed

    Cortez-Lugo, M; Romieu, I; Palazuelos-Rendón, E; Hernández-Avila, M

    1995-01-01

    The aim of this study was to evaluate the validity and reproducibility of ozone measurements obtained with passive ozone monitors against those registered with a continuous ozone monitor, to determine the applicability of passive monitors in epidemiological research. The study was carried out during November and December 1992. Indoor and outdoor classroom air ozone concentrations were analyzed using 28 passive monitors and a continuous monitor. The correlation between both measurements was highly significant (r = 0.89, p < 0.001), indicating very good validity. Also, the correlation between the measurements obtained with two different passive monitors exposed concurrently was very high (r = 0.97, p < 0.001), indicating good reproducibility of the passive monitors' measurements. The relative error between the concentrations measured by the passive monitors and those from the continuous monitor tended to decrease with increasing ozone concentrations. The results suggest that, when used to analyze indoor air, passive monitors should be used to determine cumulative ozone exposures exceeding 100 ppb, corresponding to exposure periods greater than five days.

  2. Data reproducibility of pace strategy in a laboratory test run.

    PubMed

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-06-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the maximum RTI value was either the RTI chosen in the first test (maximum RTI value) or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p > 0.05) between test and retest, and they demonstrated good intraclass correlation. PMID:27081672

  3. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and the subsequent low-cost production of 3D landscapes for assessing landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, a semi-arid medium-relief locale. The area comprises terraced agricultural fields that were abandoned about 40 years ago and have suffered subsequent damage through piping and gully erosion. In this work we focused on variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when they are used for monitoring changes in elevation.
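
    Comparisons of this kind reduce to differencing co-registered surface models and summarizing the vertical discrepancies. A minimal sketch (Python with numpy; the grids and numbers are synthetic stand-ins for two UAV flights):

        import numpy as np

        def dem_difference_stats(dem_a, dem_b):
            """Elevation differences between two co-registered surface models.
            dem_a, dem_b: 2D arrays on the same grid; NaN marks no-data cells."""
            d = dem_a - dem_b
            d = d[np.isfinite(d)]
            return {"mean": d.mean(),                  # systematic vertical offset
                    "rmse": np.sqrt(np.mean(d ** 2)),  # overall vertical discrepancy
                    "p95": np.percentile(np.abs(d), 95)}

        # Hypothetical 1000 x 1000 elevation grids from two flights.
        rng = np.random.default_rng(3)
        flight1 = rng.normal(420.0, 5.0, (1000, 1000))
        flight2 = flight1 + rng.normal(0.05, 0.08, (1000, 1000))
        print(dem_difference_stats(flight1, flight2))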

  4. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  5. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381

  6. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the maximum RTI value was either the RTI chosen in the first test (maximum RTI value) or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p > 0.05) between test and retest, and they demonstrated good intraclass correlation. PMID:27081672

  7. Effect of Soil Moisture Content on the Splash Phenomenon Reproducibility

    PubMed Central

    Ryżak, Magdalena; Bieganowski, Andrzej; Polakowski, Cezary

    2015-01-01

    One of the methods for testing splash (the first phase of water erosion) may be the analysis of photos taken using so-called high-speed cameras. The aim of this study was to determine the reproducibility of measurements of the splash produced by single drops of simulated precipitation. The drops fell from a height of 1.5 m. Tests were carried out using two types of soil: Eutric Cambisol (loamy silt) and Orthic Luvisol (sandy loam); three initial pressure heads were applied: 16 kPa, 3.1 kPa, and 0.1 kPa. Images for one, five, and 10 drops were recorded at a rate of 2000 frames per second. It was found that (i) the dispersion of soil caused by the striking of the 1st drop was significantly different from the splash impact caused by subsequent drops; (ii) with every drop, the splash phenomenon proceeded more reproducibly, that is, the numbers of particles of soil and/or water that splashed became increasingly close to each other; (iii) the number of particles detached during the splash was strongly correlated with its surface area; and (iv) the higher the water film on the surface, the smaller the width of the crown. PMID:25785859

  8. Reproducibility of the 6-minute walk test in obese adults.

    PubMed

    Beriault, K; Carpentier, A C; Gagnon, C; Ménard, J; Baillargeon, J-P; Ardilouze, J-L; Langlois, M-F

    2009-10-01

    The six-minute walk test (6MWT) is an inexpensive, quick and safe tool to evaluate the functional capacity of patients with heart failure and chronic obstructive pulmonary disease. The aim of this study was to determine the reproducibility of the 6MWT in overweight and obese individuals. We thus undertook a prospective repeated-measure validity study conducted in our academic weight management outpatient clinic. The 6MWT was conducted twice on the same day in 21 overweight or obese adult subjects (15 females and 6 males). Repeatability of walking distance was the primary outcome. Anthropometric measures, blood pressure and heart rate were also recorded. Participants' mean BMI was 37.2 ± 9.8 kg/m(2) (range: 27.0-62.3 kg/m(2)). Walking distance in the morning (mean = 452 ± 90 m) and in the afternoon (mean = 458 ± 97 m) were highly correlated (r = 0.948; 95% Confidence Interval 0.877-0.978; p < 0.001). Walking distance was negatively correlated with BMI (r = -0.47, p = 0.03), waist circumference (r = -0.43, p = 0.05) and pre-test heart rate (r = -0.54, p = 0.01). Our findings indicate that the 6MWT is highly reproducible in obese subjects and could thus be used as a fitness indicator in clinical studies and clinical care in this population.

  9. Accurate ab initio prediction of propagation rate coefficients in free-radical polymerization: Acrylonitrile and vinyl chloride

    NASA Astrophysics Data System (ADS)

    Izgorodina, Ekaterina I.; Coote, Michelle L.

    2006-05-01

    A systematic methodology for calculating accurate propagation rate coefficients in free-radical polymerization was designed and tested for vinyl chloride and acrylonitrile polymerization. For small to medium-sized polymer systems, theoretical reaction barriers are calculated using G3(MP2)-RAD. For larger systems, G3(MP2)-RAD barriers can be approximated (to within 1 kJ mol-1) via an ONIOM-based approach in which the core is studied at G3(MP2)-RAD and the substituent effects are modeled with ROMP2/6-311+G(3df,2p). DFT methods (including BLYP, B3LYP, MPW1B95, BB1K and MPWB1K) failed to reproduce the correct trends in the reaction barriers and enthalpies with molecular size, though KMLYP showed some promise as a low-cost option for very large systems. Reaction rates are calculated via standard transition state theory in conjunction with the one-dimensional hindered rotor model. The harmonic oscillator approximation was shown to introduce an error of a factor of 2-3, and would be suitable for "order-of-magnitude" estimates. A systematic study of chain length effects indicated that rate coefficients had largely converged to their long chain limit at the dimer radical stage, and the inclusion of the primary substituent of the penultimate unit was sufficient for practical purposes. Solvent effects, as calculated using the COSMO model, were found to be relatively minor. The overall methodology reproduced the available experimental data for both of these monomers within a factor of 2.
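
    The standard transition state theory expression used above, k = (kB·T/h)·exp(-ΔG‡/RT), is easy to evaluate; a sketch follows (Python; the 30 kJ/mol barrier is illustrative only, and the unimolecular form shown omits the standard-state and partition-function factors a bimolecular propagation step additionally requires):

        import math

        KB = 1.380649e-23   # Boltzmann constant, J/K
        H = 6.62607015e-34  # Planck constant, J*s
        R = 8.314462618     # gas constant, J/(mol*K)

        def tst_rate(delta_g_kj_mol, temperature=298.15):
            """Transition state theory: k = (kB*T/h) * exp(-DeltaG / (R*T))."""
            return (KB * temperature / H) * math.exp(
                -delta_g_kj_mol * 1e3 / (R * temperature))

        # Illustrative only: a ~30 kJ/mol free-energy barrier at 298 K.
        print(f"k = {tst_rate(30.0):.3e} s^-1")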

  10. Accurate global potential energy surface for the H + OH+ collision

    NASA Astrophysics Data System (ADS)

    Gannouni, M. A.; Jaidane, N. E.; Halvick, P.; Stoecklin, T.; Hochlaf, M.

    2014-05-01

    We mapped the global three-dimensional potential energy surface (3D-PES) of the water cation at the MRCI/aug-cc-pV5Z level, including the basis set superposition error (BSSE) correction. This PES covers the molecular region and the long-range regions close to the H + OH+(X3Σ-), the O + H2+(X2Σg+), and the hydrogen exchange channels. The quality of the PES is checked by comparison with previous experimental and theoretical results for the spectroscopic constants of H2O+(tilde X2B1) and of the diatomic fragments, the vibronic spectrum, the dissociation energy, and the barrier to linearity for H2O+(tilde X2B1). Our data nicely approach those measured and computed previously. The long-range parts reproduce the diatomic potentials quite well. On the whole, good agreement is found, which validates our 3D-PES.

  11. An extended mathematical model for reproducing the phase response of Arabidopsis thaliana under various light conditions.

    PubMed

    Ohara, Takayuki; Fukuda, Hirokazu; Tokuda, Isao T

    2015-10-01

    Experimental studies have shown that light qualities such as color and intensity influence the phase response properties of plant circadian systems. These effects, however, have yet to be properly addressed in theoretical models of plant circadian systems. To fill this gap, the present paper develops a mathematical model of a plant circadian clock that takes into account the intensity and wavelength of the input light. Based on experimental knowledge, we model three photoreceptors, Phytochrome A, Phytochrome B, and Cryptochrome 1, which respond to red and/or blue light, in Arabidopsis thaliana. The three photoreceptors are incorporated into a standard mathematical model of the plant system, in which activator and repressor genes form a single feedback loop. The model capability is examined by a phase response curve (PRC), which plots the phase shifts elicited by the light perturbation as a function of the perturbation phase. Numerical experiments demonstrate that the extended model reproduces the essential features of the PRCs measured experimentally under various light conditions. In particular, unlike conventional models, the model generates the inherent shape of the PRC under dark pulse stimuli. The outcome of our modeling approach may motivate future theoretical and experimental studies of plant circadian rhythms.
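
    A PRC of the kind described plots the phase shift caused by a pulse against the phase at which the pulse arrives. For a textbook radial-isochron oscillator (not the gene-network model of the paper), the shift can even be written in closed form; a sketch (Python with numpy):

        import numpy as np

        def prc_radial_isochron(pulse_strength, n_phases=48):
            """PRC of a radial-isochron (Poincare) oscillator.

            The limit cycle is the unit circle and isochrons are radial, so a
            horizontal 'light pulse' of size b applied at phase phi maps the
            state (cos phi, sin phi) to angle atan2(sin phi, cos phi + b)."""
            phis = np.linspace(0.0, 2 * np.pi, n_phases, endpoint=False)
            new = np.arctan2(np.sin(phis), np.cos(phis) + pulse_strength)
            shift = (new - phis + np.pi) % (2 * np.pi) - np.pi  # wrap to (-pi, pi]
            return phis, shift

        phases, shifts = prc_radial_isochron(pulse_strength=0.3)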

  12. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow.
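
    The workflow pattern described (organize recordings in datasets, annotate with metadata, select by metadata) can be mimicked with a toy stand-in; the class and method names below are hypothetical and deliberately not the real gnodeclient API (Python):

        # Illustrative stand-in only; not the G-Node client library.
        class LocalStore:
            def __init__(self):
                self.datasets = []

            def add(self, data, **metadata):
                """Store a recording together with flexible annotations."""
                self.datasets.append({"data": data, "meta": metadata})

            def select(self, **criteria):
                """Select recordings whose metadata match all key/value pairs."""
                return [d for d in self.datasets
                        if all(d["meta"].get(k) == v for k, v in criteria.items())]

        store = LocalStore()
        store.add([0.1, 0.4, 0.2], subject="rat01", stimulus="grating", channel=3)
        store.add([0.3, 0.2, 0.5], subject="rat02", stimulus="blank", channel=3)
        print(store.select(stimulus="grating"))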

  13. Modified chemiluminescent NO analyzer accurately measures NOX

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1978-01-01

    Installation of molybdenum nitric oxide (NO)-to-higher oxides of nitrogen (NOx) converter in chemiluminescent gas analyzer and use of air purge allow accurate measurements of NOx in exhaust gases containing as much as thirty percent carbon monoxide (CO). Measurements using conventional analyzer are highly inaccurate for NOx if as little as five percent CO is present. In modified analyzer, molybdenum has high tolerance to CO, and air purge substantially quenches NOx destruction. In test, modified chemiluminescent analyzer accurately measured NO and NOx concentrations for over 4 months with no degradation in performance.

  14. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test to determine nonexpert swimmers’ endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer’s heart rate was recorded before the test (BHR) and again after the test (AHR). The rate of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson’s linear correlation (r) and intraclass coefficient correlation (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of Bland–Altman plots, suggesting correlation of the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
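
    The reproducibility statistics used here (paired t-test, Pearson's r, SEM, CV, Bland–Altman limits of agreement) are straightforward to compute. A minimal sketch follows, assuming two same-length arrays of test/retest values; the numbers are invented, not the study's data.

```python
# Test-retest reproducibility statistics as in the study: paired t-test,
# Pearson's r, Bland-Altman limits, SEM and CV. The data are invented
# stand-ins for a variable such as the number of laps performed (NLP).
import numpy as np
from scipy import stats

day1 = np.array([20, 22, 25, 18, 30, 27, 24, 21], dtype=float)
day2 = np.array([21, 22, 26, 19, 31, 26, 25, 22], dtype=float)

t_stat, p_val = stats.ttest_rel(day1, day2)   # systematic bias between days
r, _ = stats.pearsonr(day1, day2)             # relative reproducibility

diff = day1 - day2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                 # Bland-Altman limits of agreement

sem = diff.std(ddof=1) / np.sqrt(2)           # absolute reproducibility
cv = 100 * sem / np.concatenate([day1, day2]).mean()

print(f"paired t-test p = {p_val:.3f}, Pearson r = {r:.2f}")
print(f"bias = {bias:.2f}, limits of agreement = {bias:.2f} +/- {loa:.2f}")
print(f"SEM = {sem:.2f}, CV = {cv:.1f}%")
```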

  15. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    NASA Astrophysics Data System (ADS)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed aimed at reproducing various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. These latter were reproduced in the models by silicone. The sand forming the models has been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  16. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    As geochemists, we have traditionally developed standard methods for generating, presenting, and using data through input from the community, the results of seminal studies, and a variety of authoritative bodies, a process that has required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model no longer satisfies the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, work through the professional societies

  17. Toward Transparent and Reproducible Science: Using Open Source "Big Data" Tools for Water Resources Assessment

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Zulkafli, Z. D.; Vitolo, C.

    2014-12-01

    Transparency and reproducibility are fundamental properties of good science. In the current era of large and diverse datasets and long and complex workflows for data analysis and inference, ensuring such transparency and reproducibility is challenging. Hydrological science is a good case in point, because the discipline typically uses a large variety of datasets ranging from local observations to large-scale remotely sensed products. These data are often obtained from various different sources and integrated using complex yet uncertain modelling tools. In this paper, we present and discuss methods of ensuring transparency and reproducibility in scientific workflows for hydrological data analysis for the purpose of water resources assessment, using relevant examples of emerging open source "big data" tools. First, we discuss standards for data storage, access, and processing that allow improving the modularity of a hydrological analysis workflow. In particular, standards emerging from the Open Geospatial Consortium, such as the Sensor Observation Service and the Web Coverage Service, hold promise. However, some bottlenecks, such as the availability of data models and the ability to work with spatio-temporal subsets of large datasets, need further development. Next, we focus on available methods to build transparent data processing workflows. Again, standards such as OGC's Web Processing Service are being developed to facilitate web-based analytics. Yet, in practice, the experimental nature of these standards and of web services in general often requires a more pragmatic approach. The availability of web technologies in popular open source data analysis environments such as R and Python often makes them an attractive solution for workflow creation and sharing. Lastly, we elaborate on the potential that open source solutions hold in the context of participatory approaches to data collection and knowledge generation. Using examples from the tropical Andes and the Himalayas, we

  18. The effect of saline iontophoresis on skin integrity in human volunteers. I. Methodology and reproducibility.

    PubMed

    Camel, E; O'Connell, M; Sage, B; Gross, M; Maibach, H

    1996-08-01

    This study, conducted in 36 human volunteers, was an evaluation of the effects of saline iontophoresis on skin temperature, irritation, and barrier function. The major objectives were to assess the effects of low-level ionic currents, to validate the proposed methodology of assessment, and to establish reproducibility in repeated saline iontophoresis applications. This was the first of a multistage study designed to assess the safety of 24-hr saline iontophoresis episodes at selected currents and current densities. Since an iontophoresis patch challenges the skin barrier both by occluding the skin surface and by passing ionic current through the skin, the experimental protocol was designed to permit measurement of the contribution of each of these processes to the overall response. In this first stage we investigated the effect of 10 min of current delivery, at 0.1 mA/cm2 on a 1-cm2 area patch and 0.2 mA/cm2 on a 6.5-cm2 area patch compared to unpowered control patches. Twelve subjects were tested under each condition on two separate occasions to examine reproducibility of the response variable measurements. A further 12 subjects were tested once under the 0.2 mA/cm2, 6.5-cm2 condition. Skin irritation was evaluated via repeated measurements of transepidermal water loss, capacitance, skin temperature, skin color, and a visual scoring system, before the iontophoresis episode and after patch removal. No damage to skin barrier function in terms of skin-water loss or skin-water content was detected. Slight, subclinical, short-lasting erythema was observed for both conditions. Assessment of correlation coefficients showed highly statistically significant indications of reproducibility for all five response variables measured. The experimental design, in combination with a repeated measures analysis, provided clear separation of the occlusion and ionic current components of the iontophoretic patch challenge. Further, the repeated measures analysis gave a highly sensitive

  19. Reproducible, Scalable Fusion Gene Detection from RNA-Seq.

    PubMed

    Arsenijevic, Vladan; Davis-Dusenbery, Brandi N

    2016-01-01

    Chromosomal rearrangements resulting in the creation of novel gene products, termed fusion genes, have been identified as driving events in the development of multiple types of cancer. As these gene products typically do not exist in normal cells, they represent valuable prognostic and therapeutic targets. Advances in next-generation sequencing and computational approaches have greatly improved our ability to detect and identify fusion genes. Nevertheless, these approaches require significant computational resources. Here we describe an approach which leverages cloud computing technologies to perform fusion gene detection from RNA sequencing data at any scale. We additionally highlight methods to enhance reproducibility of bioinformatics analyses which may be applied to any next-generation sequencing experiment. PMID:26667464

  20. [Expansion of undergraduate nursing and the labor market: reproducing inequalities?].

    PubMed

    Silva, Kênia Lara; de Sena, Roseni Rosângela; Tavares, Tatiana Silva; Wan der Maas, Lucas

    2012-01-01

    This study aimed to analyze the relationship between the increase in the number of degree courses in nursing and the nursing job market. It is a descriptive exploratory study with a quantitative approach, which used data on Undergraduate Nursing courses, the supply of nurses, connection with health facilities, and formal jobs in nursing in the state of Minas Gerais. The evolution of Undergraduate Nursing courses reveals a decline in supply and demand in recent years. This context is shaped by a nursing labor market marked by the contradiction between a quantitative surplus of professionals, particularly in the state's less developed areas, and a low ratio of nurses available to care for the population's health. These characteristics of the nursing labor market reproduce inequalities; furthermore, aspects such as the regulation of nursing education and the creation of new jobs need further discussion.

  1. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
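
    The comparison metric behind such benchmarks is an RMS difference between fitted equations of state. The sketch below illustrates the idea under simplifying assumptions (a third-order Birch–Murnaghan fit, invented E(V) data for a single fictitious crystal, a ±6% volume window); it is not the study's exact Δ-gauge protocol.

```python
# Delta-style comparison of two codes' energy-volume curves: fit a
# third-order Birch-Murnaghan EOS to each and take the RMS difference
# of the fits over a +/-6% volume window. Data here are invented for a
# single fictitious crystal; the study averages over 71 elements.
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, Bp):
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * ((eta - 1) ** 3 * Bp
                                        + (eta - 1) ** 2 * (6 - 4 * eta))

rng = np.random.default_rng(6)
V = np.linspace(14.0, 22.0, 15)                    # volumes (A^3/atom)
E1 = birch_murnaghan(V, -5.00, 18.0, 0.60, 4.5) + 1e-4 * rng.normal(size=V.size)
E2 = birch_murnaghan(V, -5.00, 18.1, 0.59, 4.6) + 1e-4 * rng.normal(size=V.size)

p1, _ = curve_fit(birch_murnaghan, V, E1, p0=(-5.0, 18.0, 0.6, 4.0))
p2, _ = curve_fit(birch_murnaghan, V, E2, p0=(-5.0, 18.0, 0.6, 4.0))

Vg = np.linspace(0.94 * 18.0, 1.06 * 18.0, 400)    # +/-6% around V0
dE = birch_murnaghan(Vg, *p1) - birch_murnaghan(Vg, *p2)
dE -= dE.mean()                                    # align arbitrary energy zeros
print(f"Delta ~ {1000 * np.sqrt((dE ** 2).mean()):.2f} meV/atom")
```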

  2. GigaDB: promoting data dissemination and reproducibility

    PubMed Central

    Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612

  4. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily Kara; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  5. Whole blood metal ion measurement reproducibility between different laboratories.

    PubMed

    Rahmé, Michel; Lavigne, Martin; Barry, Janie; Cirtiu, Ciprian Mihai; Bélanger, Patrick; Vendittoli, Pascal-André

    2014-11-01

    Monitoring patients' metal ion blood concentrations can be useful in cases of problematic metal-on-metal hip implants. Our objective was to evaluate the reproducibility of metal ion level values measured by two different laboratories. Whole blood samples were collected from 46 patients with metal-on-metal hip arthroplasty. For each patient, two whole blood samples were collected and analyzed by two laboratories. Laboratory 1 had higher results than laboratory 2. There was a clinically significant absolute difference between the two laboratories, above the predetermined threshold, in 35% of Cr samples and 38% of Co samples. Not all laboratories use the same technologies for their measurements. Therefore, the decision to revise a metal-on-metal hip arthroplasty should rely on metal ion trends measured in the same laboratory.

  6. Reproducing the kinematics of damped Lyman α systems

    NASA Astrophysics Data System (ADS)

    Bird, Simeon; Haehnelt, Martin; Neeleman, Marcel; Genel, Shy; Vogelsberger, Mark; Hernquist, Lars

    2015-02-01

    We examine the kinematic structure of damped Lyman α systems (DLAs) in a series of cosmological hydrodynamic simulations using the AREPO code. We are able to match the distribution of velocity widths of associated low-ionization metal absorbers substantially better than earlier work. Our simulations produce a population of DLAs dominated by haloes with virial velocities around 70 km s-1, consistent with a picture of relatively small, faint objects. In addition, we reproduce the observed correlation between velocity width and metallicity and the equivalent width distribution of Si II. Some discrepancies of moderate statistical significance remain; too many of our spectra show absorption concentrated at the edge of the profile and there are slight differences in the exact shape of the velocity width distribution. We show that the improvement over previous work is mostly due to our strong feedback from star formation and our detailed modelling of the metal ionization state.

  7. Initial evaluations of the reproducibility of vapor-diffusion crystallization.

    PubMed

    Newman, Janet; Xu, Jian; Willis, Michael C

    2007-07-01

    Experiments were set up to test how the crystallization drop size affects the crystallization process; in the test cases studied, increasing the drop size led to increasing numbers of crystals. Other data produced from a high-throughput automation-system run were analyzed in order to gauge the effect of replication on the success of crystallization screening. With over 40-fold multiplicity, lysozyme was found to crystallize in over half of the conditions in a standard 96-condition screen. However, despite the fact that industry-standard lysozyme was used in our tests, it was rare that we obtained crystals reproducibly; this suggests that replication whilst screening might improve the success rate of macromolecular crystallization.

  8. Investigating the reproducibility of a complex multifocal radiosurgery treatment

    PubMed Central

    Niebanck, M; Juang, T; Newton, J; Adamovics, J; Wang, Z; Oldham, M

    2013-01-01

    Stereotactic radiosurgery has become a widely used technique to treat solid tumors and secondary metastases of the brain. Multiple targets can be simultaneously treated with a single isocenter in order to reduce the set-up time to improve patient comfort and workflow. In this study, a 5-arc multifocal RapidArc treatment was delivered to multiple PRESAGE® dosimeters in order to explore the repeatability of the treatment. The three delivery measurements agreed well with each other, with less than 3% standard deviation of dose in the target. The deliveries also agreed well with the treatment plan, with gamma passing rates greater than 90% (5% dose-difference, and 2 mm distance-to-agreement criteria). The optical-CT PRESAGE® system provided a reproducible measurement for treatment verification, provided measurements were made immediately following treatment. PMID:27081397
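
    The agreement criterion quoted (gamma passing rate at 5% dose difference and 2 mm distance-to-agreement) can be illustrated with a simplified one-dimensional, brute-force gamma calculation; real verification is three-dimensional, and the profiles below are synthetic.

```python
# Simplified 1-D global gamma analysis with 5%/2 mm criteria, brute force
# over all measured points; real verification is 3-D and the profiles
# here are synthetic.
import numpy as np

x = np.arange(0.0, 100.0, 0.5)                        # position (mm)
dose_plan = np.exp(-((x - 50.0) / 12.0) ** 2)         # planned profile
dose_meas = 1.02 * np.exp(-((x - 50.8) / 12.0) ** 2)  # measured: shifted, scaled

dd = 0.05 * dose_plan.max()                           # 5% global dose criterion
dta = 2.0                                             # 2 mm distance criterion

def gamma_at(i):
    # Minimum combined dose-difference/distance metric over measured points.
    g2 = ((dose_meas - dose_plan[i]) / dd) ** 2 + ((x - x[i]) / dta) ** 2
    return np.sqrt(g2.min())

gamma = np.array([gamma_at(i) for i in range(x.size)])
mask = dose_plan > 0.1 * dose_plan.max()              # ignore the low-dose tail
print(f"gamma passing rate: {100 * np.mean(gamma[mask] <= 1.0):.1f}%")
```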

  9. New model for datasets citation and extraction reproducibility in VAMDC

    NASA Astrophysics Data System (ADS)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated to individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce traceability of datasets, favor reproducibility of datasets extraction, and facilitate the systematic citation of the authors having originally measured and/or calculated the extracted atomic and molecular data.

  10. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions, according to the skill scores defined by Perkins et al. (2013) for the evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
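
    The core of a distribution-mapping bias correction is mapping each model value through the model CDF and back through the inverse observed CDF. The sketch below is a minimal KDDM-style implementation assuming synthetic gamma-distributed data in place of the NARCCAP and METDATA series; it is not the authors' exact code.

```python
# KDDM-style bias correction sketch: KDE-based CDFs for model and
# observations, then map model values through F_model and back through
# the inverse of F_obs. Synthetic gamma-distributed data stand in for
# the NARCCAP and METDATA series.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
obs = rng.gamma(2.0, 4.0, 3000)          # "observed" daily values
mod = rng.gamma(2.5, 3.0, 3000)          # biased "model" values

grid = np.linspace(0.0, max(obs.max(), mod.max()), 2000)

def kde_cdf(sample):
    cdf = np.cumsum(gaussian_kde(sample)(grid))
    return cdf / cdf[-1]

F_obs, F_mod = kde_cdf(obs), kde_cdf(mod)

quantiles = np.interp(mod, grid, F_mod)        # model value -> quantile
corrected = np.interp(quantiles, F_obs, grid)  # quantile -> observed scale

print(f"means obs/model/corrected: "
      f"{obs.mean():.2f} / {mod.mean():.2f} / {corrected.mean():.2f}")
```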

  11. Reproducibility of MRI segmentation using a feature space method

    NASA Astrophysics Data System (ADS)

    Soltanian-Zadeh, Hamid; Windham, Joe P.; Scarpace, Lisa; Murnock, Tanya

    1998-06-01

    This paper presents reproducibility studies for the segmentation results obtained by our optimal MRI feature space method. The steps of the work accomplished are as follows. (1) Eleven patients with brain tumors were imaged by a 1.5 T General Electric Signa MRI System. Four T2- weighted and two T1-weighted images (before and after Gadolinium injection) were acquired for each patient. (2) Images of a slice through the center of the tumor were selected for processing. (3) Patient information was removed from the image headers and new names (unrecognizable by the image analysts) were given to the images. These images were blindly analyzed by the image analysts. (4) Segmentation results obtained by the two image analysts at two time points were compared to assess the reproducibility of the segmentation method. For each tissue segmented in each patient study, a comparison was done by kappa statistics and a similarity measure (an approximation of kappa statistics used by other researchers), to evaluate the number of pixels that were in both of the segmentation results obtained by the two image analysts (agreement) relative to the number of pixels that were not in both (disagreement). An overall agreement comparison was done by finding means and standard deviations of kappa statistics and the similarity measure found for each tissue type in the studies. The kappa statistics for white matter was the largest (0.80) followed by those of gray matter (0.68), partial volume (0.67), total lesion (0.66), and CSF (0.44). The similarity measure showed the same trend but it was always higher than kappa statistics. It was 0.85 for white matter, 0.77 for gray matter, 0.73 for partial volume, 0.72 for total lesion, and 0.47 for CSF.
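
    Both agreement measures reported here are simple to reproduce for a pair of binary masks. The sketch below computes Cohen's kappa and a Dice-style similarity index, assuming random masks in place of the real tissue segmentations.

```python
# Agreement between two analysts' binary segmentation masks: Cohen's
# kappa and a Dice-style similarity index. Random masks stand in for
# the real tissue labels.
import numpy as np

rng = np.random.default_rng(0)
a = rng.random((128, 128)) > 0.5    # analyst 1 mask
b = a.copy()
flip = rng.random(a.shape) < 0.1    # 10% disagreement
b[flip] = ~b[flip]                  # analyst 2 mask

def cohens_kappa(m1, m2):
    po = np.mean(m1 == m2)                      # observed agreement
    p1, p2 = m1.mean(), m2.mean()
    pe = p1 * p2 + (1 - p1) * (1 - p2)          # chance agreement
    return (po - pe) / (1 - pe)

def dice(m1, m2):
    return 2 * np.logical_and(m1, m2).sum() / (m1.sum() + m2.sum())

print(f"kappa = {cohens_kappa(a, b):.2f}, similarity (Dice) = {dice(a, b):.2f}")
```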

  12. Galaxy Zoo: reproducing galaxy morphologies via machine learning

    NASA Astrophysics Data System (ADS)

    Banerji, Manda; Lahav, Ofer; Lintott, Chris J.; Abdalla, Filipe B.; Schawinski, Kevin; Bamford, Steven P.; Andreescu, Dan; Murray, Phil; Raddick, M. Jordan; Slosar, Anze; Szalay, Alex; Thomas, Daniel; Vandenberg, Jan

    2010-07-01

    We present morphological classifications obtained using machine learning for objects in the Sloan Digital Sky Survey DR6 that have been classified by Galaxy Zoo into three classes, namely early types, spirals and point sources/artefacts. An artificial neural network is trained on a subset of objects classified by the human eye, and we test whether the machine-learning algorithm can reproduce the human classifications for the rest of the sample. We find that the success of the neural network in matching the human classifications depends crucially on the set of input parameters chosen for the machine-learning algorithm. The colours and parameters associated with profile fitting are reasonable in separating the objects into three classes. However, these results are considerably improved when adding adaptive shape parameters as well as concentration and texture. The adaptive moments, concentration and texture parameters alone cannot distinguish between early type galaxies and the point sources/artefacts. Using a set of 12 parameters, the neural network is able to reproduce the human classifications to better than 90 per cent for all three morphological classes. We find that using a training set that is incomplete in magnitude does not degrade our results given our particular choice of the input parameters to the network. We conclude that it is promising to use machine-learning algorithms to perform morphological classification for the next generation of wide-field imaging surveys and that the Galaxy Zoo catalogue provides an invaluable training set for such purposes. This publication has been made possible by the participation of more than 100000 volunteers in the Galaxy Zoo project. Their contributions are individually acknowledged at http://www.galaxyzoo.org/Volunteers.aspx.
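
    The classification setup (train on human-labelled objects, test agreement on held-out objects) can be sketched with standard tools. The example below uses scikit-learn's MLPClassifier with 12 random features standing in for the photometric and shape parameters; the labels follow an invented rule, not Galaxy Zoo data.

```python
# Train/test split on "human" labels with a small neural network;
# 12 random features and an invented labelling rule stand in for the
# photometric/shape parameters and Galaxy Zoo classifications.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 12))                     # 12 input parameters
# 0 = early type, 1 = spiral, 2 = point source/artefact (invented rule)
y = (X[:, 0] + 0.5 * X[:, 3] > 0.5).astype(int) + (X[:, 1] > 1.2)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                                  random_state=0))
clf.fit(X_tr, y_tr)
print(f"agreement with 'human' labels: {clf.score(X_te, y_te):.1%}")
```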

  13. A workflow for reproducing mean benthic gas fluxes

    NASA Astrophysics Data System (ADS)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  14. Can Appraisers Rate Work Performance Accurately?

    ERIC Educational Resources Information Center

    Hedge, Jerry W.; Laue, Frances J.

    The ability of individuals to make accurate judgments about others is examined and literature on this subject is reviewed. A wide variety of situational factors affects the appraisal of performance. It is generally accepted that the purpose of the appraisal influences the accuracy of the appraiser. The instrumentation, or tools, available to the…

  15. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  16. Spatial mapping and statistical reproducibility of an array of 256 one-dimensional quantum wires

    SciTech Connect

    Al-Taie, H. Kelly, M. J.; Smith, L. W.; Lesage, A. A. J.; Griffiths, J. P.; Beere, H. E.; Jones, G. A. C.; Ritchie, D. A.; Smith, C. G.; See, P.

    2015-08-21

    We utilize a multiplexing architecture to measure the conductance properties of an array of 256 split gates. We investigate the reproducibility of the pinch-off and one-dimensional definition voltages as a function of spatial location on two different cooldowns, and after illuminating the device. The reproducibility of both these properties on the two cooldowns is high, the result of the density of the two-dimensional electron gas returning to a similar state after thermal cycling. The spatial variation of the pinch-off voltage reduces after illumination; however, the variation of the one-dimensional definition voltage increases due to an anomalous feature in the center of the array. A technique which quantifies the homogeneity of split-gate properties across the array is developed which captures the experimentally observed trends. In addition, the one-dimensional definition voltage is used to probe the density of the wafer at each split gate in the array on a micron scale using a capacitive model.

  18. Reproducibility of histological cell type in high-grade endometrial carcinoma.

    PubMed

    Han, Guangming; Sidhu, Davinder; Duggan, Máire A; Arseneau, Jocelyne; Cesari, Matthew; Clement, Philip B; Ewanowich, Carol A; Kalloger, Steve E; Köbel, Martin

    2013-12-01

    Subclassification of endometrial carcinoma according to histological type shows variable interobserver agreement. The aim of this study was to assess specifically the interobserver agreement of histological type in high-grade endometrial carcinomas, recorded by gynecological pathologists from five academic centers across Canada. In a secondary aim, the agreement of consensus diagnosis with immunohistochemical marker combinations was assessed including six routine (TP53, CDKN2A (p16), ER, PGR, Ki67, and VIM) and six experimental immunohistochemical markers (PTEN, ARID1A, CTNNB1, IGF2BP3, HNF1B, and TFF3). The paired interobserver agreement ranged from κ 0.50 to 0.63 (median 0.58) and the intraobserver agreement from κ 0.49 to 0.67 (median 0.61). Consensus about histological type based on morphological assessment was reached in 72% of high-grade endometrial carcinomas. A seven-marker immunohistochemical panel differentiated FIGO grade 3 endometrioid from serous carcinoma with a 100% concordance rate compared with the consensus diagnosis. More practically, a three-marker panel including TP53, ER, and CDKN2A (p16) can aid in the differential diagnosis of FIGO grade 3 endometrioid from endometrial serous carcinoma. Our study demonstrates that the inter- and intraobserver reproducibility of histological type based on morphology alone are mostly moderate. Ancillary techniques such as immunohistochemical marker panels are likely needed to improve diagnostic reproducibility of histological types within high-grade endometrial carcinomas.

  19. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever

    PubMed Central

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Krasemann, Susanne

    2016-01-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology. PMID:27191716

  20. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of meteor velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between measured meteoroid orbits and theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is then imperative to work out how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integration using the Borovicka et al. (2007) model. Different optimization techniques have also been investigated in order to pick the most suitable one to solve the MPF, and the influence of the geometry of the trajectory on the result is also presented. We present here the results of an improved implementation of the multi-parameter fitting that allows accurate orbit computation of meteors with CABERNET. The comparison of different velocity computations seems to show that while the MPF is by far the best method to solve for the trajectory and velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimation errors for noisy
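
    The flavour of the multi-parameter fit can be conveyed with a one-dimensional toy: model the along-track position with an exponential deceleration term (a Whipple/Jacchia-type propagation model, of the kind used by Gural 2012) and solve for all parameters at once by nonlinear least squares. The data and noise level below are invented.

```python
# Toy multi-parameter fit of a meteor trajectory: along-track position
# s(t) = s0 + v0*t - a1*exp(a2*t), solved for all parameters at once.
# The exponential term mimics atmospheric deceleration; the fit is
# notoriously ill-conditioned, which is the difficulty discussed above.
import numpy as np
from scipy.optimize import least_squares

def position(t, s0, v0, a1, a2):
    return s0 + v0 * t - a1 * np.exp(a2 * t)

t = np.linspace(0.0, 0.8, 60)                    # time (s)
truth = (0.0, 35.0, 0.002, 6.0)                  # km, km/s, km, 1/s
rng = np.random.default_rng(5)
s_obs = position(t, *truth) + 0.01 * rng.normal(size=t.size)  # ~10 m noise

fit = least_squares(lambda p: position(t, *p) - s_obs,
                    x0=(0.0, 30.0, 0.01, 5.0))
print(f"fitted pre-atmospheric velocity v0 = {fit.x[1]:.2f} km/s (true 35.00)")
```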

  1. An index of parameter reproducibility accounting for estimation uncertainty: theory and case study on β-cell responsivity and insulin sensitivity.

    PubMed

    Dalla Man, Chiara; Pillonetto, Gianluigi; Riz, Michela; Cobelli, Claudio

    2015-06-01

    Parameter reproducibility is necessary to perform longitudinal studies where parameters are assessed to monitor disease progression or the effect of therapy; it is also useful in powering a study, i.e., in defining how many subjects should be studied to observe a given effect. The assessment of parameter reproducibility is usually accomplished by methods that do not take into account the fact that these parameters are estimated with uncertainty. This is particularly relevant in physiological and clinical studies, where reproducibility usually cannot be assessed by multiple testing and is instead assessed from a single replication of the test. Working in a suitable stochastic framework, here we propose a new index (S) to measure reproducibility that takes into account parameter uncertainty and is particularly suited to the normal testing conditions of physiological and clinical investigations. Simulation results prove that S, by properly taking into account parameter uncertainty, is more accurate and robust than the methods available in the literature. The new metric is applied to assess the reproducibility of insulin sensitivity and β-cell responsivity from a mixed-meal tolerance test, using data obtained in the same subjects retested 1 wk apart. Results show that the indices of insulin sensitivity and β-cell responsivity to glucose are well reproducible. We conclude that the oral minimal models provide useful indices that can be used safely in prospective studies or to assess the efficacy of a given therapy.
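
    While the S index itself is defined in the paper's stochastic framework, the underlying point is easy to illustrate: a test-retest difference should be judged against the combined estimation uncertainty, not at face value. The sketch below is a hedged illustration with invented per-subject values and standard errors; it is not the authors' index.

```python
# Hedged illustration (not the paper's S index): judge each test-retest
# difference against the combined standard error of the two estimates.
import numpy as np

rng = np.random.default_rng(2)
true_si = rng.lognormal(mean=2.0, sigma=0.3, size=30)  # true per-subject values
se = 0.15 * true_si                                    # per-visit estimation SE
visit1 = true_si + se * rng.normal(size=30)            # parameter estimated
visit2 = true_si + se * rng.normal(size=30)            # twice, 1 wk apart

raw_cv = 100 * np.abs(visit1 - visit2) / ((visit1 + visit2) / 2)
z = (visit1 - visit2) / (np.sqrt(2) * se)              # uncertainty-aware score

print(f"median raw test-retest CV: {np.median(raw_cv):.1f}%")
print("fraction of subjects compatible with perfect reproducibility "
      f"(|z| < 1.96): {np.mean(np.abs(z) < 1.96):.0%}")
```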

  2. Validating lipid force fields against experimental data: Progress, challenges and perspectives.

    PubMed

    Poger, David; Caron, Bertrand; Mark, Alan E

    2016-07-01

    Biological membranes display a great diversity in lipid composition and lateral structure that is crucial in a variety of cellular functions. Simulations of membranes have contributed significantly to the understanding of the properties, functions and behaviour of membranes and membrane-protein assemblies. This success relies on the ability of the force field used to describe lipid-lipid and lipid-environment interactions accurately, reproducibly and realistically. In this review, we present some recent progress in lipid force-field development and validation strategies. In particular, we highlight how a range of properties obtained from various experimental techniques on lipid bilayers and membranes, can be used to assess the quality of a force field. We discuss the limitations and assumptions that are inherent to both computational and experimental approaches and how these can influence the comparison between simulations and experimental data. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov.

  3. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the UK Meteorological Office Hadley Centre's climate model's domain size are firstly presented. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes is assessed, again by comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  4. Reproducibility of UAV-based earth surface topography based on structure-from-motion algorithms.

    NASA Astrophysics Data System (ADS)

    Clapuyt, François; Vanacker, Veerle; Van Oost, Kristof

    2014-05-01

    A representation of the earth surface at very high spatial resolution is crucial to accurately map small geomorphic landforms with high precision. Very high resolution digital surface models (DSM) can then be used to quantify changes in earth surface topography over time, based on differencing of DSMs taken at various moments in time. However, it is essential to have both high accuracy for each topographic representation and consistency between measurements over time, as DSM differencing automatically leads to error propagation. This study investigates the reproducibility of reconstructions of earth surface topography based on structure-from-motion (SFM) algorithms. To this end, we equipped an eight-propeller drone with a standard reflex camera. This equipment can easily be deployed in the field, as it is a lightweight, low-cost system in comparison with classic aerial photo surveys and terrestrial or airborne LiDAR scanning. Four sets of aerial photographs were created for one test field. The sets of airphotos differ in focal length and viewing angle, i.e. nadir view versus ground-level view. In addition, the importance of the accuracy of ground control points for the construction of a georeferenced point cloud was assessed using two different GPS devices with horizontal accuracies at the sub-meter and sub-decimeter level, respectively. Airphoto datasets were processed with the SFM algorithm and the resulting point clouds were georeferenced. Then, the surface representations were compared with each other to assess the reproducibility of the earth surface topography. Finally, consistency between independent datasets is discussed.
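
    Once two DSMs are georeferenced to a common grid, change detection reduces to differencing with propagated error. A minimal sketch follows, assuming invented grids, change signal, and per-survey vertical errors.

```python
# DSM differencing with a 95% level of detection from propagated survey
# errors; grids, change signal, and per-survey errors are invented.
import numpy as np

rng = np.random.default_rng(3)
dsm_t1 = rng.normal(100.0, 5.0, (200, 200))            # survey 1 elevations (m)
dsm_t2 = dsm_t1 + 0.12                                 # survey 2: 12 cm change
dsm_t2 = dsm_t2 + rng.normal(0.0, 0.03, dsm_t2.shape)  # plus survey noise

sigma1, sigma2 = 0.04, 0.03                    # per-survey vertical RMSE (m)
lod = 1.96 * np.hypot(sigma1, sigma2)          # 95% level of detection

dod = dsm_t2 - dsm_t1                          # DSM of difference
detectable = np.abs(dod) > lod
print(f"LoD = {100 * lod:.1f} cm; "
      f"{detectable.mean():.0%} of cells show detectable change")
```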

  5. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  6. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions

    PubMed Central

    2016-01-01

    Objectives To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Methods Lateral radiographs of 65 clinical TPLO procedures using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons using a 24 mm radial saw blade between Dec 2005–Dec 2007 and Nov 2013–Nov 2015 were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Results Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3–4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). Clinical Significance The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy position for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance. PMID:27556230

  7. The determination of accurate dipole polarizabilities alpha and gamma for the noble gases

    NASA Technical Reports Server (NTRS)

    Rice, Julia E.; Taylor, Peter R.; Lee, Timothy J.; Almlof, Jan

    1991-01-01

    Accurate static dipole polarizabilities alpha and gamma of the noble gases He through Xe were determined using wave functions of similar quality for each system. Good agreement with experimental data for the static polarizability gamma was obtained for Ne and Xe, but not for Ar and Kr. Calculations suggest that the experimental values for these latter gases are too low.

  8. Accurate and occlusion-robust multi-view stereo

    NASA Astrophysics Data System (ADS)

    Zhu, Zhaokun; Stamatopoulos, Christos; Fraser, Clive S.

    2015-11-01

    This paper proposes an accurate multi-view stereo method for image-based 3D reconstruction that features robustness in the presence of occlusions. The new method offers improvements in dealing with two fundamental image matching problems. The first concerns the selection of the support window model, while the second centers upon accurate visibility estimation for each pixel. The support window model is based on an approximate 3D support plane described by a depth and two per-pixel depth offsets. For the visibility estimation, the multi-view constraint is initially relaxed by generating separate support plane maps for each support image using a modified PatchMatch algorithm. Then the most likely visible support image, which represents the minimum visibility of each pixel, is extracted via a discrete Markov Random Field model and it is further augmented by parameter clustering. Once the visibility is estimated, multi-view optimization taking into account all redundant observations is conducted to achieve optimal accuracy in the 3D surface generation for both depth and surface normal estimates. Finally, multi-view consistency is utilized to eliminate any remaining observational outliers. The proposed method is experimentally evaluated using well-known Middlebury datasets, and results obtained demonstrate that it is amongst the most accurate of the methods thus far reported via the Middlebury MVS website. Moreover, the new method exhibits a high completeness rate.

  9. Fractionating Polymer Microspheres as Highly Accurate Density Standards.

    PubMed

    Bloxham, William H; Hennek, Jonathan W; Kumar, Ashok A; Whitesides, George M

    2015-07-21

    This paper describes a method of isolating small, highly accurate density-standard beads and characterizing their densities using accurate and experimentally traceable techniques. Density standards have a variety of applications, including the characterization of density gradients, which are used to separate objects in a variety of fields. Glass density-standard beads can be very accurate (±0.0001 g cm(-3)) but are too large (3-7 mm in diameter) for many applications. When smaller density standards are needed, commercial polymer microspheres are often used. These microspheres have standard deviations in density ranging from 0.006 to 0.021 g cm(-3); these distributions in density make these microspheres impractical for applications demanding small steps in density. In this paper, commercial microspheres are fractionated using aqueous multiphase systems (AMPS), aqueous mixtures of polymers and salts that spontaneously separate into phases having molecularly sharp steps in density, to isolate microspheres having much narrower distributions in density (standard deviations from 0.0003 to 0.0008 g cm(-3)) than the original microspheres. By reducing the heterogeneity in densities, this method reduces the uncertainty in the density of any specific bead and, therefore, improves the accuracy within the limits of the calibration standards used to characterize the distributions in density.

  10. Bipedal spring-damper-mass model reproduces external mechanical power of human walking.

    PubMed

    Etenzi, Ettore; Monaco, Vito

    2015-01-01

    Previous authors have long investigated the behavior of different models of passive walkers with stiff or compliant limbs. We investigated a model of a bipedal mechanism whose limbs are provided with damping and elastic elements. The model is designed to walk along an inclined plane, so that the energy lost to the damping elements is made up by the energy gained through the lowering of the CoM. The proposed model is hence able to walk steadily. In particular, we investigated the stability of this model by using the Poincaré return map for different dynamical configurations. We then compared the estimated external mechanical power with experimental data from the literature in order to validate the model. Results show that the model is able to reproduce the main features of the time course of the external mechanical power during the gait cycle. Accordingly, dissipative elements coupled with the limbs' compliant behavior represent a suitable paradigm to mimic human locomotion.
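    For intuition, a single stance phase of a spring-damper leg on an incline can be integrated directly. This is a one-leg caricature with arbitrary parameters, not the authors' bipedal model:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # One stance phase of a spring-damper leg (illustrative parameters).
    m, k, c, L0, g = 80.0, 15_000.0, 300.0, 1.0, 9.81
    slope = np.deg2rad(4.0)          # incline that re-supplies dissipated energy

    def stance(t, s):
        # s = [leg length l, rate ldot]; force balance along the leg axis.
        l, ldot = s
        f_spring = k * (L0 - l)      # elastic element (compression spring)
        f_damper = -c * ldot         # dissipative element
        return [ldot, (f_spring + f_damper) / m - g * np.cos(slope)]

    sol = solve_ivp(stance, (0.0, 0.4), [L0, -0.3], max_step=1e-3)

    # External mechanical power of the leg force acting on the CoM:
    f_leg = k * (L0 - sol.y[0]) - c * sol.y[1]
    power = f_leg * sol.y[1]
    print("peak positive / negative power [W]:",
          power.max().round(1), power.min().round(1))
    ```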

  11. Question 8: From a Set of Chemical Reactions to Reproducing Cells

    NASA Astrophysics Data System (ADS)

    Kaneko, Kunihiko

    2007-10-01

    As a prerequisite for a reproducing cell, we first discuss how non-equilibrium states are sustained endogenously in a catalytic reaction network. A negative correlation between the abundances of resource chemicals and of the catalysts that consume them is shown to hinder relaxation to equilibrium. Mutual reinforcement among this sustainment of the non-equilibrium state, spatial structure formation, and reproduction of a compartment is discussed as a mechanism that further suppresses relaxation to equilibrium. As a next step toward a protocell, the consistency between cell reproduction and replication of the constituent molecules is studied theoretically; it leads to a power law for the abundance of molecules and a log-normal distribution over cells, both of which are shown to be universal and are confirmed experimentally in present-day cells.

  12. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    NASA Astrophysics Data System (ADS)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform developed over the last few years to stochastically simulate the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks under heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc.) coexist with aqueous domains. These conditions are of special relevance for understanding the origins of cellular, self-reproducing compartments in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, aiming to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
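    Stochastic kinetics platforms of this kind are commonly built on Gillespie's direct method. The sketch below applies it to a toy autocatalytic lipid reaction pair; the network and rate constants are illustrative and unrelated to ENVIRONMENT's actual models or API:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy network: P + L -> 2 L (autocatalytic lipid production, rate k1)
    #              L     -> 0   (loss/decay, rate k2)
    k1, k2 = 0.001, 0.1
    P, L, t = 1000, 10, 0.0
    trajectory = [(t, L)]

    while t < 50.0 and L > 0:
        a1 = k1 * P * L                 # propensities (direct method)
        a2 = k2 * L
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)  # time to next reaction event
        if rng.uniform() * a0 < a1:     # choose which reaction fires
            P, L = P - 1, L + 1
        else:
            L -= 1
        trajectory.append((t, L))

    print("final time %.2f, lipid count %d" % (t, L))
    ```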

  13. Bipedal spring-damper-mass model reproduces external mechanical power of human walking.

    PubMed

    Etenzi, Ettore; Monaco, Vito

    2015-08-01

    Previous authors have long investigated the behavior of different models of passive walkers with stiff or compliant limbs. We investigated a model of bipedal mechanism whose limba are provided with damping and elastic elements. This model is designed for walking along an inclined plane, in order to make up the energy lost due to the damping element with that gained thanks to the lowering the CoM. The proposed model is hence able to steadily walk. In particular we investigated the stability of this model by using the Poincaré return map for different dynamical configurations. Then we compared the estimated external mechanical power with experimental data from literature in order to validate the model. Results show that the model is able to reproduce the main features of the time course of the external mechanical power during the gait cycle. Accordingly, dissipative elements coupled with limbs' compliant behavior represent a suitable paradigm, to mimic human locomotion. PMID:26736788

  14. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
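    A clock emulator with skew models might look like the following sketch: a constant parts-per-million frequency offset whose value itself drifts as a random walk. The class and parameter names here are assumptions for illustration, not Symphony's API:

    ```python
    import numpy as np

    class SkewedClock:
        """Local node clock: constant skew plus a random-walk drift term."""
        def __init__(self, skew_ppm=50.0, drift_step_ppm=0.5, seed=0):
            self.skew = skew_ppm * 1e-6
            self.drift_step = drift_step_ppm * 1e-6
            self.rng = np.random.default_rng(seed)
            self.local = 0.0

        def advance(self, dt):
            # Each interval runs at rate (1 + skew); the skew itself
            # wanders, mimicking temperature-dependent oscillators.
            self.skew += self.rng.normal(0.0, self.drift_step)
            self.local += dt * (1.0 + self.skew)
            return self.local

    clock = SkewedClock()
    for _ in range(3600):            # one simulated hour, 1 s steps
        clock.advance(1.0)
    print(f"offset after 1 h: {clock.local - 3600.0:+.4f} s")
    ```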

  15. Symphony: a framework for accurate and holistic WSN simulation.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles.

  16. Accurate ab initio vibrational energies of methyl chloride

    SciTech Connect

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH{sub 3}{sup 35}Cl and CH{sub 3}{sup 37}Cl. The respective PESs, CBS-35{sup HL} and CBS-37{sup HL}, are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY{sub 3}Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35{sup HL} and CBS-37{sup HL} PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm{sup −1}, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH{sub 3}Cl without empirical refinement of the respective PESs.
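    CBS extrapolation is often done with a two-point inverse-cubic formula; the abstract does not state which scheme the paper uses, so the following is a generic sketch with made-up energies:

    ```python
    # Two-point CBS extrapolation, E(X) = E_CBS + A * X**-3 (a common
    # form for correlation energies; the paper's exact scheme may differ).
    def cbs_two_point(e_x, e_y, x, y):
        a = (e_x - e_y) / (x**-3 - y**-3)
        return e_x - a * x**-3

    # Illustrative correlation energies (hartree) from X = 3, 4 basis sets.
    print(cbs_two_point(-0.3450, -0.3575, 3, 4))   # extrapolated E_CBS
    ```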

  17. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    PubMed

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements.
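    The nested Fourier models can be reproduced with an ordinary least-squares fit of sine-cosine regressors at the task frequency and its harmonics. The sketch below uses a synthetic signal and illustrative timing parameters, not the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    TR, n_vols, task_period = 2.0, 150, 60.0      # illustrative timing
    t = np.arange(n_vols) * TR
    f0 = 1.0 / task_period

    # Synthetic BOLD signal: task frequency + 2nd harmonic + noise.
    y = (np.sin(2*np.pi*f0*t) + 0.4*np.cos(2*np.pi*2*f0*t)
         + rng.normal(0, 0.5, n_vols))

    def adjusted_r2(y, order):
        # Nested design matrix: intercept + sine/cosine pairs up to `order`.
        cols = [np.ones_like(t)]
        for h in range(1, order + 1):
            cols += [np.sin(2*np.pi*h*f0*t), np.cos(2*np.pi*h*f0*t)]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        r2 = 1 - resid.var() / y.var()
        n, p = len(y), X.shape[1]
        return 1 - (1 - r2) * (n - 1) / (n - p)

    for order in (1, 2, 3, 4):
        print(f"order {order}: adjusted R^2 = {adjusted_r2(y, order):.3f}")
    ```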

  18. Refined Dummy Atom Model of Mg(2+) by Simple Parameter Screening Strategy with Revised Experimental Solvation Free Energy.

    PubMed

    Jiang, Yang; Zhang, Haiyang; Feng, Wei; Tan, Tianwei

    2015-12-28

    Metal ions play an important role in the catalysis of metalloenzymes. To investigate metalloenzymes via molecular modeling, a set of accurate force field parameters for metal ions is essential. To extend its application range and improve its performance, the dummy atom model of metal ions was refined through a simple parameter screening strategy, using the Mg(2+) ion as an example. Using the AMBER ff03 force field with the TIP3P model, the refined model accurately reproduced the experimental geometric and thermodynamic properties of Mg(2+). Compared with point charge models and previous dummy atom models, the refined dummy atom model yields an enhanced performance for producing reliable ATP/GTP-Mg(2+)-protein conformations in three metalloenzyme systems with single or double metal centers. Similar to other unbonded models, the refined model failed to reproduce the Mg-Mg distance and favored a monodentate binding of carboxylate groups; these drawbacks need to be considered with care. The outperformance of the refined model is mainly attributed to the use of a revised (more accurate) experimental solvation free energy and a suitable free energy correction protocol. This work provides a parameter screening strategy that can be readily applied to refine the dummy atom models for metal ions.
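    A parameter screening strategy of this kind amounts to scanning a grid of nonbonded parameters and scoring each candidate against target observables. In this sketch the predictor is an invented closed-form stand-in for the free-energy simulations a real screen would run, and the target values are placeholders:

    ```python
    import numpy as np

    # Illustrative targets for Mg2+ (stand-ins, not the paper's values):
    target_dG  = -437.4        # hydration free energy, kcal/mol
    target_IOD = 2.09          # ion-oxygen distance, angstrom

    def predict(sigma, epsilon):
        # Stand-in for an MD free-energy / RDF calculation with a
        # dummy-atom model; a real screen would call the MD engine here.
        dG  = -500.0 + 30.0 * sigma - 5.0 * np.log(epsilon)
        iod = 0.9 * sigma + 0.4
        return dG, iod

    best = None
    for sigma in np.linspace(1.6, 2.2, 31):           # parameter grid
        for epsilon in np.linspace(0.01, 0.5, 50):
            dG, iod = predict(sigma, epsilon)
            score = abs(dG - target_dG) + 100 * abs(iod - target_IOD)
            if best is None or score < best[0]:
                best = (score, sigma, epsilon)

    print("best (sigma, epsilon):", best[1:])
    ```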

  19. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  20. Reversibility and reproducibility of histamine induced plasma leakage in nasal airways.

    PubMed Central

    Svensson, C; Baumgarten, C R; Pipkorn, U; Alkner, U; Persson, C G

    1989-01-01

    Plasma exudation is one cardinal factor in airways defence and inflammation. In inflammatory airway diseases such as rhinitis and asthma, however, plasma leakage may also have a pathogenetic role. Experimental data from animals indicate that highly sensitive, active, and reversible processes regulate the vascular and mucosal permeability to macromolecules. With the use of a nasal lavage model for the recovery of liquids on the mucosal surface the effect of histamine on the macromolecular permeability of the airway endothelial-epithelial barriers was studied in normal subjects. The concentrations of albumin, kinins, and N-alpha-tosyl-L-arginine methyl ester (TAME) esterase in nasal lavage fluid were measured and nasal symptoms assessed by a scoring technique. The reproducibility of three repeated challenges with 30 minute intervals on the same day was studied in 12 subjects and compared with the same procedure (three challenges) on a different day. Sneezing decreased significantly (p less than 0.05) after the first histamine challenge but was maintained thereafter. Otherwise, the mean values for symptoms and for markers of vascular leakage were very similar both for the three challenges in the same session and for the two challenge sessions on a different day. Sneezing, blockage, and secretions were associated with increased concentrations of TAME esterase (maximum 9000 cpm/ml), kinins (1.4 ng/ml), and albumin (0.3 g/l) in lavage fluid. Both the symptoms and the measures of plasma exudation were reversible and reproducible in the three repeat histamine challenges and at two challenge sessions on different days. These findings support the view that non-injurious, active processes regulate the inflammatory flow of macromolecules across airways endothelial-epithelial barriers. The present experimental approach would be suitable for studies of the modulatory effects of inflammatory stimulus induced plasma leakage and symptoms in human airways. PMID:2648641

  1. Classical signal model reproducing quantum probabilities for single and coincidence detections

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei; Nilsson, Börje; Nordebo, Sven

    2012-05-01

    We present a simple classical (random) signal model reproducing Born's rule. The crucial point of our approach is that the presence of a detector threshold and a calibration procedure have to be treated not as mere experimental technicalities, but as basic counterparts of the theoretical model. We call this approach the threshold signal detection model (TSD). The experiment on coincidence detection performed by Grangier in 1986 [22] played a crucial role in the rejection of (semi-)classical field models in favour of quantum mechanics (QM): the impossibility of resolving the wave-particle duality in favour of a purely wave model. QM predicts that the relative probability of coincidence detection, the coefficient g(2)(0), is zero (for one-photon states), but in (semi-)classical models g(2)(0) >= 1. In TSD the coefficient g(2)(0) decreases as 1/ɛd², where ɛd > 0 is the detection threshold. Hence, by increasing this threshold an experimenter can make the coefficient g(2)(0) essentially less than 1. The TSD prediction can be tested experimentally in new Grangier-type experiments presenting a detailed monitoring of the dependence of the coefficient g(2)(0) on the detection threshold. Structurally our model has some similarity with the prequantum model of Grossing et al. Subquantum stochasticity is composed of two counterparts: a stationary process in the space of internal degrees of freedom and a random-walk-type motion describing the temporal dynamics.

  2. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    PubMed

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  3. Two highly accurate methods for pitch calibration

    NASA Astrophysics Data System (ADS)

    Kniel, K.; Härtig, F.; Osawa, S.; Sato, O.

    2009-11-01

    Among profile, helix and tooth thickness, pitch is one of the most important parameters in the evaluation of an involute gear measurement. In principle, coordinate measuring machines (CMMs) and CNC-controlled gear measuring machines, as a variant of the CMM, are suited for these kinds of gear measurements. The Japan National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) and the German national metrology institute, the Physikalisch-Technische Bundesanstalt (PTB), have each independently developed highly accurate pitch calibration methods applicable to CMMs or gear measuring machines. Both calibration methods are based on the so-called closure technique, which allows the separation of the systematic errors of the measurement device from the errors of the gear. For the verification of both calibration methods, NMIJ/AIST and PTB performed measurements on a specially designed pitch artifact. The comparison of the results shows that both methods can be used for highly accurate calibrations of pitch standards.
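    The closure technique can be demonstrated on synthetic data: re-measuring the artifact in every rotated mounting makes the instrument and artifact errors separable by averaging. A toy version:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    N = 36                                     # angular positions / teeth

    artifact = rng.normal(0.0, 1.0, N)         # true pitch deviations of the gear
    artifact -= artifact.mean()
    machine = rng.normal(0.0, 1.0, N)          # systematic instrument errors
    machine -= machine.mean()

    # Closure: measure the artifact in all N rotated mountings; at
    # rotation r, tooth j lands on instrument position (j + r) mod N.
    meas = np.array([[artifact[j] + machine[(j + r) % N] for j in range(N)]
                     for r in range(N)])

    # Averaging over rotations cancels the (zero-mean) instrument error ...
    art_est = meas.mean(axis=0)
    # ... and averaging over the teeth seen at one position recovers it.
    mach_est = np.array([np.mean([meas[r, (p - r) % N] for r in range(N)])
                         for p in range(N)])

    print("max artifact recovery error:", np.abs(art_est - artifact).max())
    print("max machine recovery error :", np.abs(mach_est - machine).max())
    ```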

  4. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081
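    The beat cue the subjects exploited is simply the amplitude envelope at the difference frequency of two near-unison tones, as a quick numerical illustration shows:

    ```python
    import numpy as np

    fs, dur = 8000, 4.0
    t = np.arange(0, dur, 1 / fs)
    f_ref, f_string = 110.0, 111.0     # reference tone vs slightly sharp string

    x = np.sin(2*np.pi*f_ref*t) + np.sin(2*np.pi*f_string*t)

    # Amplitude envelope via max-|x| in 50 ms windows; beats appear as
    # envelope minima at |f_string - f_ref| = 1 Hz.
    win = int(0.05 * fs)
    env = np.array([np.abs(x[i:i + win]).max()
                    for i in range(0, len(x) - win, win)])
    minima = (env[1:-1] < env[:-2]) & (env[1:-1] < env[2:])
    print("estimated beat rate ~ %.1f Hz" % (minima.sum() / dur))
    ```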

  5. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  6. Preparation and accurate measurement of pure ozone.

    PubMed

    Janssen, Christof; Simone, Daniela; Guinet, Mickaël

    2011-03-01

    Preparation of high-purity ozone as well as precise and accurate measurement of its pressure are metrological requirements that are difficult to meet due to ozone decomposition occurring in pressure sensors. The most stable and precise transducer heads are heated and, therefore, prone to accelerated ozone decomposition, limiting measurement accuracy and compromising purity. Here, we describe a vacuum system and a method for ozone production suitable for accurately determining the pressure of pure ozone by avoiding the problem of decomposition. We use an inert gas in a specially designed buffer volume and can thus achieve high measurement accuracy and negligible degradation of ozone, with purities of 99.8% or better. The high degree of purity is ensured by comprehensive compositional analyses of ozone samples. The method may also be applied to other reactive gases. PMID:21456766

  7. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  8. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
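    A minimal example of the partitioning decision such performance models inform: a prefix-sum cut of a 1D grid with irregular per-cell workloads. The weights are synthetic, not taken from the paper's fluids code:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    work = rng.uniform(0.5, 2.0, 1000)     # per-cell workload, irregular grid
    P = 8                                  # processors

    # Prefix-sum partition: cut where cumulative work crosses k/P of total.
    cum = np.cumsum(work)
    cuts = np.searchsorted(cum, np.arange(1, P) * cum[-1] / P)
    parts = np.split(work, cuts)

    loads = [p.sum() for p in parts]
    print("max/mean load imbalance: %.3f" % (max(loads) / (sum(loads) / P)))
    ```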

  9. Line gas sampling system ensures accurate analysis

    SciTech Connect

    Not Available

    1992-06-01

    Tremendous changes in the natural gas business have resulted in new approaches to the way natural gas is measured. Electronic flow measurement has altered the business forever, with developments in instrumentation and a new sensitivity to the importance of proper natural gas sampling techniques. This paper reports that YZ Industries Inc., Snyder, Texas, combined its 40 years of sampling experience with the latest in microprocessor-based technology to develop the KynaPak 2000 series, the first on-line natural gas sampling system that is both compact and extremely accurate. Accurate analysis requires that the composition of the sampled gas be representative of the whole and related to flow. If so, measurement and sampling techniques are married, gas volumes are accurately accounted for, and adjustments to composition can be made.

  10. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the optical model's imprecision on top of modeling resist development. The optical model imprecision may result from mask topography effects and real mask information, including mask e-beam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model, enabling its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  11. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-10-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  12. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-04-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  13. Accurate vessel segmentation with constrained B-snake.

    PubMed

    Yuanzhi Cheng; Xin Hu; Ji Wang; Yadong Wang; Tamura, Shinichi

    2015-08-01

    We describe an active contour framework with accurate shape and size constraints on the vessel cross-sectional planes to produce the vessel segmentation. It starts with multiscale vessel axis tracing in 3D computed tomography (CT) data, followed by vessel boundary delineation on the cross-sectional planes derived from the extracted axis. The vessel boundary surface is deformed under constrained movements on the cross sections and is voxelized to produce the final vascular segmentation. The novelty of this paper lies in the accurate contour point detection of thin vessels based on the CT scanning model, in the efficient implementation of missing contour points in the problematic regions, and in the active contour model with accurate shape and size constraints. The main advantage of our framework is that it avoids disconnected and incomplete segmentation of the vessels in the problematic regions that contain touching vessels (vessels in close proximity to each other), diseased portions (pathologic structure attached to a vessel), and thin vessels. It is particularly suitable for accurate segmentation of thin and low contrast vessels. Our method is evaluated and demonstrated on CT data sets from our partner site, and its results are compared with three related methods. Our method is also tested on two publicly available databases and its results are compared with a recently published method. The applicability of the proposed method to some challenging clinical problems, the segmentation of vessels in the problematic regions, is demonstrated with good results in both quantitative and qualitative experiments; our segmentation algorithm can delineate vessel boundaries with a level of variability similar to that obtained manually.

  14. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed; 2) the probability distributions of these performance metrics are exponential rather than normal; and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  15. Ambipolar transition voltage spectroscopy: Analytical results and experimental agreement

    NASA Astrophysics Data System (ADS)

    Bâldea, Ioan

    2012-01-01

    This work emphasizes that the transition voltages Vt± for both bias polarities (V ≷ 0) should be used to properly determine the energy offset ɛ0 of the molecular orbital closest to the electrodes' Fermi level and the bias asymmetry γ in molecular junctions. Accurate analytical formulas are deduced to estimate ɛ0 and γ solely in terms of Vt±. These estimates are validated against experiments, by showing that full experimental I-V curves measured by Beebe et al. [Phys. Rev. Lett. 97, 026801 (2006)] and Tan et al. [Appl. Phys. Lett. 96, 013110 (2010)] for both bias polarities can be excellently reproduced.

  16. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models. PMID:27111139
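    In phase-shift velocimetry the voxel velocity is recovered from the flow-induced phase of a bipolar gradient pair, phi = gamma * g * delta * Delta * v. A sketch of that inversion with illustrative acquisition parameters follows; the abstract's point is precisely that this inversion is biased when the intra-voxel displacement distribution is asymmetric:

    ```python
    # PFG phase-shift velocimetry: for a bipolar gradient pair, the
    # flow-induced phase is phi = gamma * g * delta * Delta * v.
    gamma = 2.675e8        # 1H gyromagnetic ratio, rad s^-1 T^-1
    g     = 0.05           # gradient amplitude, T m^-1
    delta = 2e-3           # gradient pulse duration, s
    Delta = 20e-3          # gradient pulse separation, s

    def velocity_from_phase(phi):
        """Invert the phase-velocity relation (assumes a symmetric
        displacement distribution, which is exactly what fails in rock)."""
        return phi / (gamma * g * delta * Delta)

    phi_measured = 0.268   # rad, illustrative voxel phase difference
    print("voxel velocity ~ %.2f mm/s" % (1e3 * velocity_from_phase(phi_measured)))
    ```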

  17. Accurate phase-shift velocimetry in rock

    NASA Astrophysics Data System (ADS)

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.

  18. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.

  19. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder

    PubMed Central

    Richardson, Jason R.; Taylor, Michele M.; Shalat, Stuart L.; Guillot, Thomas S.; Caudle, W. Michael; Hossain, Muhammad M.; Mathews, Tiffany A.; Jones, Sara R.; Cory-Slechta, Deborah A.; Miller, Gary W.

    2015-01-01

    Attention-deficit hyperactivity disorder (ADHD) is estimated to affect 8–12% of school-age children worldwide. ADHD is a complex disorder with significant genetic contributions. However, no single gene has been linked to a significant percentage of cases, suggesting that environmental factors may contribute to ADHD. Here, we used behavioral, molecular, and neurochemical techniques to characterize the effects of developmental exposure to the pyrethroid pesticide deltamethrin. We also used epidemiologic methods to determine whether there is an association between pyrethroid exposure and diagnosis of ADHD. Mice exposed to the pyrethroid pesticide deltamethrin during development exhibit several features reminiscent of ADHD, including elevated dopamine transporter (DAT) levels, hyperactivity, working memory and attention deficits, and impulsive-like behavior. Increased DAT and D1 dopamine receptor levels appear to be responsible for the behavioral deficits. Epidemiologic data reveal that children aged 6–15 with detectable levels of pyrethroid metabolites in their urine were more than twice as likely to be diagnosed with ADHD. Our epidemiologic finding, combined with the recapitulation of ADHD behavior in pesticide-treated mice, provides a mechanistic basis to suggest that developmental pyrethroid exposure is a risk factor for ADHD.—Richardson, J. R., Taylor, M. M., Shalat, S. L., Guillot III, T. S., Caudle, W. M., Hossain, M. M., Mathews, T. A., Jones, S. R., Cory-Slechta, D. A., Miller, G. W. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder. PMID:25630971

  20. How to Obtain Reproducible Results for Lithium Sulfur Batteries

    SciTech Connect

    Zheng, Jianming; Lu, Dongping; Gu, Meng; Wang, Chong M.; Zhang, Jiguang; Liu, Jun; Xiao, Jie

    2013-01-01

    The basic requirements for obtaining reliable Li-S battery data are discussed in this work. Unlike in Li-ion batteries, an electrolyte-rich environment significantly affects the cycling stability of Li-S batteries prepared and tested under otherwise identical conditions. The reason has been assigned to the different concentrations of polysulfide-containing electrolytes in the cells, which have profound influences on both the sulfur cathode and the lithium anode. At an optimized S/E ratio of 50 g L-1, a good balance among electrolyte viscosity, wetting ability, diffusion rate of dissolved polysulfide, and nucleation/growth of short-chain Li2S/Li2S2 has been established, along with largely reduced contamination on the lithium anode side. Accordingly, good cyclability, high reversible capacity, and Coulombic efficiency are achieved in a Li-S cell with controlled S/E ratio without any additive. Other factors such as the sulfur content in the composite and the sulfur loading on the electrode also require careful attention in the Li-S system in order to generate reproducible results and gauge the various methods used to improve Li-S battery technology.

  1. The rapid reproducers paradox: population control and individual procreative rights.

    PubMed

    Wissenburg, M

    1998-01-01

    This article argues that population policies need to be evaluated from macro and micro perspectives and to consider individual rights. Ecological arguments that are stringent conditions of liberal democracy are assessed against a moral standard. The moral standard is applied to a series of reasons for limiting procreative rights in the cause of sustainability. The focus is directly on legally enforced antinatalist measures and not on indirect policies with incentives and disincentives. The explicit assumption is that population policy violates the fairness to individuals for societal gain and that population policies are incompatible with stringent conditions of liberal democracy. The author identifies the individual-societal tradeoff as the "rapid reproducers paradox." The perfect sustainable population level is either not possible or is a repugnant alternative. 12 ecological arguments are presented, and none are found compatible with notions of a liberal democracy. Three alternative antinatalist options are the acceptance of less rigid and still coercive policies, amendments to the conception of liberal democracy, or loss of hope and choice of noncoercive solutions to sustainability, none of which is found viable. If voluntary abstinence and distributive solutions fail, then frugal demand options and technological supply options both will be necessary.

  2. A silicon retina that reproduces signals in the optic nerve.

    PubMed

    Zaghloul, Kareem A; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor, and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  3. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced, based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of post-spin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  4. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

    This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water-insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. The organisms were exposed to zinc dithiophosphate in static tests using an oil-water dispersion method for the fathead minnows, and a water-accommodated-fraction method for the Daphnia magna. Testing was conducted in moderately hard water with pre-determined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among 3 separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability based on the organism type, and emphasize the need for further study and caution in the design and implementation of aquatic testing for insoluble materials.

  5. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles.

  6. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-04-27

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert. Thus, it is virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Next to gross tumor volume (GTV) we also investigate subcategories like edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters always lead to an improvement w.r.t. inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics.
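    Of the three agreement metrics mentioned, the Dice score is the most segmentation-specific; a small sketch on toy masks:

    ```python
    import numpy as np

    def dice(a, b):
        """Dice overlap between two binary masks."""
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    rng = np.random.default_rng(6)
    rater1 = rng.random((64, 64)) > 0.7          # toy "rater 1" tumour mask
    rater2 = rater1.copy()
    flip = rng.random(rater1.shape) < 0.05       # 5% voxel disagreement
    rater2[flip] = ~rater2[flip]

    print("Dice agreement: %.3f" % dice(rater1, rater2))
    ```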

  7. Reproducibility of tactile assessments for children with unilateral cerebral palsy.

    PubMed

    Auld, Megan Louise; Ware, Robert S; Boyd, Roslyn Nancy; Moseley, G Lorimer; Johnston, Leanne Marie

    2012-05-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were Semmes-Weinstein monofilaments: 75% (90%); single-point localization: 69% (97%); static two-point discrimination: 93% (97%); and moving two-point discrimination: 87% (97%). Test-retest reliability for registration and unilateral spatial tactile perception tests was high in children with CP (intraclass correlation coefficient [ICC] = 0.79-0.96). Two tests demonstrated a learning effect for children with CP, double simultaneous and tactile texture perception. Stereognosis had a ceiling effect for TDC (ICC = 0) and variability for children with CP (% exact agreement = 47%-50%). The Semmes-Weinstein monofilaments, single-point localization, and both static and moving two-point discrimination are recommended for use in practice and research. Although recommended to provide a comprehensive assessment, the measures of double simultaneous, stereognosis, and tactile texture perception may not be responsive to change over time in children with unilateral CP.

  8. Magnetofection: A Reproducible Method for Gene Delivery to Melanoma Cells

    PubMed Central

    Prosen, Lara; Prijic, Sara; Music, Branka; Lavrencak, Jaka; Cemazar, Maja; Sersa, Gregor

    2013-01-01

    Magnetofection is a nanoparticle-mediated approach for the transfection of cells, tissues, and tumors. Of specific interest is the use of superparamagnetic iron oxide nanoparticles (SPIONs) as a delivery system for therapeutic genes. Magnetofection has already been described in some proof-of-principle studies; however, fine tuning of the synthesis of SPIONs is necessary for its broader application. The physicochemical properties of SPIONs, synthesized by co-precipitation in an alkaline aqueous medium, were tested after varying different parameters of the synthesis procedure. The storage time of the iron(II) sulfate salt, the type of purified water, and the synthesis temperature did not affect the physicochemical properties of SPIONs. Likewise, varying the parameters of the synthesis procedure did not influence magnetofection efficacy. However, for pronounced expression of the gene encoded by the plasmid DNA, it was crucial to functionalize poly(acrylic) acid-stabilized SPIONs (SPIONs-PAA) with polyethyleneimine (PEI) without adjusting the initially alkaline pH of its aqueous solution to physiological pH. In conclusion, the co-precipitation of iron(II) and iron(III) sulfate salts with subsequent PAA stabilization, PEI functionalization, and plasmid DNA binding is a robust method resulting in reproducible and efficient magnetofection. To achieve high gene expression, however, the pH of the PEI aqueous solution used for SPIONs-PAA functionalization is important and should be in the alkaline range. PMID:23862136

  9. Reproducibility of Vibrionaceae population structure in coastal bacterioplankton

    PubMed Central

    Szabo, Gitta; Preheim, Sarah P; Kauffman, Kathryn M; David, Lawrence A; Shapiro, Jesse; Alm, Eric J; Polz, Martin F

    2013-01-01

    How reproducibly microbial populations assemble in the wild remains poorly understood. Here, we assess evidence for ecological specialization and predictability of fine-scale population structure and habitat association in coastal ocean Vibrionaceae across years. We compare Vibrionaceae lifestyles in the bacterioplankton (combinations of free-living, particle, or zooplankton associations) measured using the same sampling scheme in 2006 and 2009 to assess whether the same groups show the same environmental association year after year. This reveals complex dynamics with populations falling primarily into two categories: (i) nearly equally represented in each of the two samplings and (ii) highly skewed, often to an extent that they appear exclusive to one or the other sampling times. Importantly, populations recovered at the same abundance in both samplings occupied highly similar habitats suggesting predictable and robust environmental association while skewed abundances of some populations may be triggered by shifts in ecological conditions. The latter is supported by difference in the composition of large eukaryotic plankton between years, with samples in 2006 being dominated by copepods, and those in 2009 by diatoms. Overall, the comparison supports highly predictable population-habitat linkage but highlights the fact that complex, and often unmeasured, environmental dynamics in habitat occurrence may have strong effects on population dynamics. PMID:23178668

  10. Reproducibility of cold provocation in patients with Raynaud's phenomenon.

    PubMed

    Wigley, F M; Malamet, R; Wise, R A

    1987-08-01

    Twenty-five patients with Raynaud's phenomenon had serial cold challenges during a double blinded drug trial. The data were analyzed to determine the reproducibility of cold provocation in the induction of critical closure of the digital artery in patients with Raynaud's phenomenon. Finger systolic pressure (FSP) was measured after local digital cooling using a digital strain gauge placed around the distal phalanx. Nineteen of 25 patients completed the study. The prevalence of inducing a Raynaud's attack decreased with each successive cold challenge from 74% of patients at initial challenge to 42% at the 3rd challenge. A lower temperature was required to induce a Raynaud's attack at last challenge (10.6 +/- 0.6 degrees C) compared to the first cold challenge (13.2 +/- 1.0 degrees C). Our data demonstrate adaptation to a laboratory cold challenge through the winter months in patients with Raynaud's phenomenon and show it is an important factor in objectively assessing drug efficacy in the treatment of Raynaud's phenomenon.

  11. Reproducing natural spider silks' copolymer behavior in synthetic silk mimics.

    PubMed

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-12-10

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced, based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of post-spin stretching treatment in helping the fiber to form the proper spatial structure. PMID:23110450

  12. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to vary widely across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers, as well as chimeric silk fibers combining MaSp1 and MaSp2 sequences in a single protein, were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of post-spin stretching treatment in helping the fiber form the proper spatial structure.

  13. A silicon retina that reproduces signals in the optic nerve

    NASA Astrophysics Data System (ADS)

    Zaghloul, Kareem A.; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor—and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  14. Stochastic simulations of minimal self-reproducing cellular systems.

    PubMed

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2007-10-29

    This paper is a theoretical attempt to gain insight into the problem of how self-assembling vesicles (closed bilayer structures) could progressively turn into minimal self-producing and self-reproducing cells, i.e. into interesting candidates for (proto)biological systems. With this aim, we make use of a recently developed object-oriented platform to carry out stochastic simulations of chemical reaction networks that take place in dynamic cellular compartments. We apply this new tool to study the behaviour of different minimal cell models, making realistic assumptions about the physico-chemical processes and conditions involved (e.g. thermodynamic equilibrium/non-equilibrium, variable volume-to-surface relationship, osmotic pressure, solute diffusion across the membrane due to concentration gradients, buffering effect). The new programming platform has been designed to analyse not only how a single protometabolic cell could maintain itself, grow or divide, but also how a collection of these cells could 'evolve' as a result of their mutual interactions in a common environment. PMID:17510021
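    The core of such stochastic simulations is Gillespie's direct method for chemical reaction networks. A minimal sketch follows (Python; the toy reaction network, species counts and rate constants are hypothetical illustrations, not the authors' object-oriented platform):

      import math, random

      # Toy protocell network (illustrative rates only):
      #   P + L -> P + 2L   (autocatalytic lipid production, rate k1)
      #   L -> waste        (first-order decay, rate k2)
      def gillespie(p=100, l=50, k1=0.001, k2=0.05, t_end=100.0):
          t, traj = 0.0, [(0.0, l)]
          while t < t_end:
              a1, a2 = k1 * p * l, k2 * l            # reaction propensities
              a0 = a1 + a2
              if a0 == 0:
                  break
              t += -math.log(random.random()) / a0   # exponential waiting time
              l += 1 if random.random() * a0 < a1 else -1
              traj.append((t, l))
          return traj

      print(gillespie()[-1])   # final (time, lipid count)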

  15. Virtual Raters for Reproducible and Objective Assessments in Radiology.

    PubMed

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these measurements more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson correlation, the intra-class correlation coefficient (ICC) and the Dice score. Virtual raters consistently improved inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379
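    Of the agreement metrics mentioned, the Dice score is the simplest to state: twice the overlap of two binary masks divided by the sum of their sizes. A minimal sketch (Python; the example masks are synthetic, not data from this study):

      import numpy as np

      def dice(a, b):
          """Dice overlap of two binary segmentation masks."""
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      rater1 = np.zeros((64, 64), bool); rater1[20:40, 20:40] = True
      rater2 = np.zeros((64, 64), bool); rater2[22:42, 22:42] = True
      print(f"Dice = {dice(rater1, rater2):.3f}")   # ~0.81 for these masks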

  16. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke's) and inferior frontal (Broca's) cortical areas, respectively. However, recent resting-state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca's and Wernicke's areas, we recapitulate this extended network, which includes adjoining prefrontal, temporal and parietal regions but also the bilateral caudate and left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has a predominance of short-range functional connectivity (except the posterior Wernicke's area, which exhibited predominant long-range connectivity), consistent with reliance on local processing. The long-range connectivity was predominantly left-lateralized (except the anterior Wernicke's area, which exhibited rightward lateralization). The language network also exhibited anticorrelated activity with auditory (only for Wernicke's area) and visual cortices, suggesting integrated sequential activity with regions involved in listening or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction are needed to determine its potential as a biomarker for language disorders. PMID:22212597

  17. Optimizing reproducibility evaluation for random amplified polymorphic DNA markers.

    PubMed

    Ramos, J R; Telles, M P C; Diniz-Filho, J A F; Soares, T N; Melo, D B; Oliveira, G

    2008-01-01

    The random amplified polymorphic DNA (RAPD) technique is often criticized because it usually shows low levels of repeatability and thus can generate spurious bands. These problems can be partially overcome by rigid laboratory protocols and by performing repeatability tests. However, because it is expensive and time-consuming to obtain genetic data twice for all individuals, a few randomly chosen individuals are usually selected for a priori repeatability analysis, introducing a potential bias in genetic parameter estimates. We developed a procedure to optimize repeatability analysis based on RAPD data, which was applied to evaluate genetic variability in three local populations of Tibouchina papyrus, an endemic Cerrado plant found in elevated rocky fields in Brazil. We used a simulated annealing procedure to select the smallest number of individuals that together contain all bands and repeated the analyses only for those bands that were reproduced in these individuals. We compared genetic parameter estimates obtained with the HICKORY and POPGENE software packages on the unreduced data set and on data sets in which we eliminated bands based on the repeatability of individuals selected by simulated annealing, and based on three randomly selected individuals. Genetic parameter estimates were very similar when we used the optimization procedure to reduce the number of bands analyzed, but, as expected, selecting only three individuals to evaluate the repeatability of bands produced very different estimates. We conclude that the problems of repeatability attributed to RAPD markers could be due to bias in the selection of loci and primers and not necessarily to the RAPD technique per se. PMID:19065774
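    Selecting the smallest set of individuals that jointly display all bands is an instance of minimum set cover, which simulated annealing can attack heuristically. A minimal sketch (Python; the cost weighting, move set and cooling schedule are illustrative choices, not the authors' exact procedure):

      import math, random

      def sa_min_cover(bands_per_ind, n_bands, steps=20000, t0=2.0):
          """bands_per_ind: one set of observed band indices per individual.
          Returns a small subset of individuals covering all n_bands bands."""
          n = len(bands_per_ind)
          def cost(s):
              covered = set().union(*(bands_per_ind[i] for i in s)) if s else set()
              return len(s) + 1000 * (n_bands - len(covered))  # penalize lost bands
          state = set(range(n))                      # start from all individuals
          c = cost(state)
          for step in range(steps):
              temp = t0 * (1 - step / steps) + 1e-9
              cand = state ^ {random.randrange(n)}   # toggle one individual
              cc = cost(cand)
              if cc < c or random.random() < math.exp(-(cc - c) / temp):
                  state, c = cand, cc
          return sorted(state)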

  18. Diet rapidly and reproducibly alters the human gut microbiome.

    PubMed

    David, Lawrence A; Maurice, Corinne F; Carmody, Rachel N; Gootenberg, David B; Button, Julie E; Wolfe, Benjamin E; Ling, Alisha V; Devlin, A Sloan; Varma, Yug; Fischbach, Michael A; Biddinger, Sudha B; Dutton, Rachel J; Turnbaugh, Peter J

    2014-01-23

    Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  19. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538

  20. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  1. Periotest values: Its reproducibility, accuracy, and variability with hormonal influence

    PubMed Central

    Chakrapani, Swarna; Goutham, Madireddy; Krishnamohan, Thota; Anuparthy, Sujitha; Tadiboina, Nagarjuna; Rambha, Somasekhar

    2015-01-01

    Tooth mobility can be assessed by both subjective and objective means. The use of subjective measures may lead to bias, and hence it becomes imperative to use objective means to assess tooth mobility. It has also been observed that hormonal fluctuations may significantly influence tooth mobility. Aims: The study was undertaken to assess the reproducibility of the Periotest in the assessment of tooth mobility and to unravel the obscurity associated with the hormonal influence on tooth mobility. Materials and Methods: 100 subjects were included in the study and were divided equally into two groups based on their age: group I (11-14 years) and group II (16-22 years). Results: There was no statistically significant difference between the Periotest values (PTV) taken at two different time periods with a time difference of 20 minutes. Group I was found to have a significantly greater PTV than group II. Conclusion: The Periotest can reliably measure tooth mobility. Tooth mobility is greater during puberty than during adolescence, and during adolescence mobility was slightly greater in males. PMID:25684904

  2. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these measurements more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. In addition to gross tumor volume (GTV), we also investigate subcategories such as edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson correlation, the intra-class correlation coefficient (ICC) and the Dice score. Virtual raters consistently improved inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  3. Reproducing stone monument photosynthetic-based colonization under laboratory conditions.

    PubMed

    Miller, Ana Zélia; Laiz, Leonila; Gonzalez, Juan Miguel; Dionísio, Amélia; Macedo, Maria Filomena; Saiz-Jimenez, Cesareo

    2008-11-01

    In order to understand the biodeterioration processes occurring on stone monuments, we analyzed the microbial communities involved in these processes and studied their ability to colonize stones in controlled laboratory experiments. In this study, a natural green biofilm from a limestone monument was cultivated, inoculated on stone probes of the same lithotype and incubated in a laboratory chamber. This incubation system, which exposes stone samples to intermittently sprinkling water, allowed the development of photosynthetic biofilms similar to those occurring on stone monuments. Denaturing gradient gel electrophoresis (DGGE) analysis was used to evaluate the major microbial components of the laboratory biofilms. Cyanobacteria, green microalgae, bacteria and fungi were identified by DNA-based molecular analysis targeting the 16S and 18S ribosomal RNA genes. The natural green biofilm was mainly composed of the Chlorophyta Chlorella, Stichococcus, and Trebouxia, and of Cyanobacteria belonging to the genera Leptolyngbya and Pleurocapsa. A number of bacteria belonging to Alphaproteobacteria, Bacteroidetes and Verrucomicrobia were identified, as well as fungi from the Ascomycota. The laboratory colonization experiment on stone probes showed a colonization pattern similar to that occurring on stone monuments. The methodology described in this paper allowed us to reproduce colonization equivalent to the natural biodeterioration process.

  4. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g., >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high-dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to first estimate the univariate distributions of the imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
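    The copula construction itself can be sketched compactly: fit each margin empirically, couple the margins through a Gaussian copula, then sample. A minimal sketch (Python with NumPy/SciPy; this is the generic Gaussian-copula recipe with an empirical-quantile back-mapping, not necessarily the authors' estimator):

      import numpy as np
      from scipy import stats

      def sample_gaussian_copula(data, n_samples=10000, seed=0):
          """data: rows = subjects, cols = imaging measures (sparse sample).
          Returns dense synthetic samples with the empirical margins of data."""
          rng = np.random.default_rng(seed)
          n, d = data.shape
          # Normal scores of the empirical CDF ranks of each margin.
          z = stats.norm.ppf(stats.rankdata(data, axis=0) / (n + 1))
          corr = np.corrcoef(z, rowvar=False)       # copula dependence structure
          z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_samples)
          u_new = stats.norm.cdf(z_new)
          out = np.empty((n_samples, d))
          for j in range(d):                        # back through the margins
              out[:, j] = np.quantile(data[:, j], u_new[:, j])
          return out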

  5. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operating characteristic (ROC) curve of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at the optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC curve = 0.77 and 0.77, respectively) than the HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under the ROC curve = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥ 40 points and ≥ 445 ms, respectively. In conclusion, 12-lead HF QRS ECG employing

  6. Accurate Molecular Dimensions from Stearic Acid Monolayers.

    ERIC Educational Resources Information Center

    Lane, Charles A.; And Others

    1984-01-01

    Discusses modifications in the fatty acid monolayer experiment to reduce the inaccurate molecular data students usually obtain. Copies of the experimental procedure used and a Pascal computer program to work up the data are available from the authors. (JN)

  7. Procedure for accurate fabrication of tissue compensators with high-density material

    NASA Astrophysics Data System (ADS)

    Mejaddem, Younes; Lax, Ingmar; Adakkai K, Shamsuddin

    1997-02-01

    An accurate method for producing compensating filters using high-density material (Cerrobend) is described. The procedure consists of two cutting steps in a Styrofoam block: (i) levelling a surface of the block to a reference level; (ii) depth-modulated milling of the levelled block in accordance with pre-calculated thickness profiles of the compensator. The calculated thickness (generated by a dose planning system) can be reproduced within acceptable accuracy. The desired compensator thickness manufactured according to this procedure is reproduced to within 0.1 mm, corresponding to a 0.5% change in dose at a beam quality of 6 MV. The results of our quality control checks performed with the technique of stylus profiling measurements show an accuracy of 0.04 mm in the milling process over an arbitrary profile along the milled-out Styrofoam block.

  8. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all." In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and with desiderata based on our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  9. Optimized PCR Conditions and Increased shRNA Fold Representation Improve Reproducibility of Pooled shRNA Screens

    PubMed Central

    Strezoska, Žaklina; Licon, Abel; Haimes, Josh; Spayd, Katie Jansen; Patel, Kruti M.; Sullivan, Kevin; Jastrzebski, Katarzyna; Simpson, Kaylene J.; Leake, Devin; van Brabant Smith, Anja; Vermeulen, Annaleen

    2012-01-01

    RNAi screening using pooled shRNA libraries is a valuable tool for identifying genetic regulators of biological processes. However, for a successful pooled shRNA screen, it is imperative to thoroughly optimize experimental conditions to obtain reproducible data. Here we performed viability screens with a library of ∼10 000 shRNAs at two different fold representations (100- and 500-fold at transduction) and report the reproducibility of shRNA abundance changes between screening replicates determined by microarray and next-generation sequencing analyses. We show that the technical reproducibility between PCR replicates from a pooled screen can be drastically improved by ensuring that PCR amplification steps are kept within the exponential phase and by using an amount of genomic DNA input in the reaction that maintains the average template copies per shRNA used during library transduction. Using these optimized PCR conditions, we then show that higher reproducibility of biological replicates is obtained by both microarray and next-generation sequencing when screening with higher average shRNA fold representation. shRNAs that change abundance reproducibly in biological replicates (primary hits) are identified from screens performed with both 100- and 500-fold shRNA representation; however, a higher percentage of primary hit overlap between screening replicates is obtained from 500-fold representation screens. While strong hits with larger changes in relative abundance were generally identified in both screens, hits with smaller changes were identified only in the screens performed with the higher shRNA fold representation at transduction. PMID:22870320
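    The genomic DNA input needed to maintain the average template copies per shRNA is straightforward to estimate. A back-of-the-envelope sketch (Python; it assumes roughly 6.6 pg of DNA per diploid human genome and about one integrated provirus per cell, both standard approximations rather than values stated in this paper):

      PG_PER_DIPLOID_GENOME = 6.6   # approximate, human

      def gdna_input_ug(n_shrnas, fold_representation):
          """Micrograms of genomic DNA carrying fold_representation template
          copies of each of n_shrnas shRNAs (assuming one provirus per cell)."""
          return n_shrnas * fold_representation * PG_PER_DIPLOID_GENOME / 1e6

      print(gdna_input_ug(10_000, 100))   # ~6.6 ug at 100-fold representation
      print(gdna_input_ug(10_000, 500))   # ~33 ug at 500-fold representation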

  10. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate.

    PubMed

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-09-02

    Ag semishell (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. The high SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the "V"-shaped or "U"-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications.

  11. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate

    PubMed Central

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-01-01

    Ag semishell (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. The high SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the “V”-shaped or “U”-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications. PMID:27586562

  12. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate.

    PubMed

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-01-01

    Ag semishell (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. The high SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the "V"-shaped or "U"-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications. PMID:27586562

  13. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate

    NASA Astrophysics Data System (ADS)

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-09-01

    Ag semishell (AgSS) ordered arrays for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. The high SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the “V”-shaped or “U”-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications.

  14. Reproducibility of magnetic resonance spectroscopy in correlation with signal-to-noise ratio.

    PubMed

    Okada, Tomohisa; Sakamoto, Setsu; Nakamoto, Yuji; Kohara, Nobuo; Senda, Michio

    2007-11-15

    An increased amount of myoinositol (mI) relative to creatine (Cr), measured by proton MR spectroscopy (¹H-MRS), is a useful aid in the diagnosis of Alzheimer's disease (AD). Previous test-retest measurements of mI, however, have shown variability more than twice as large as for other metabolites. The aim of this study was to analyze the test-retest variability of ¹H-MRS measurements in correlation with the signal-to-noise ratio (SNR). Ten subjects clinically suspected of mild AD were examined twice (2-14 days apart) with ¹H-MRS measurements of voxels placed in the anterior and posterior cingulate cortex. The percent differences between the two measurements (%differences) of mI/Cr showed a significant linear trend to decrease as average SNR increased, but the %differences of N-acetylaspartate (NAA)/Cr and choline (Cho)/Cr did not. The average %difference was 10.5, 15.0 and 20.8 for NAA/Cr, Cho/Cr, and mI/Cr, respectively, indicating a prominent deterioration of mI/Cr measurement reproducibility; these values decreased to 6.96, 15.4 and 9.87, respectively, when the analysis was limited to measurements with SNR over 25. The results indicate that MRS measurements with high SNR should be used to obtain reliable assessments of mI/Cr as an accurate diagnostic indicator of AD in clinical MR examinations. PMID:17900878
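    A common definition of the test-retest %difference is the absolute difference expressed relative to the mean of the two measurements; a one-line sketch (Python; the example values are invented, and the paper may normalize slightly differently):

      def percent_difference(x1, x2):
          """Test-retest %difference of a metabolite ratio, relative to the mean."""
          return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

      print(round(percent_difference(0.62, 0.75), 1))   # ~19.0 for these values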

  15. Short-term evaluation of plaque area measurement reproducibility from computer-assisted sonography.

    PubMed

    Massonneau, M; Caillard, P; de Chassey, L; Mouren, X; Thébault, B; Cloarec, M

    1995-05-01

    One reason why quantifying plaque regression is difficult is the poor spatial control of the imaging angle, whether in angiography or ultrasonography. A computer-assisted technique has been developed to assess absolute carotid plaque dimensions from B-mode ultrasonography, with enhanced capability for comparative examinations at large time intervals. Plaque area is measured from arterial lumen to adventitia with a real-time tissue-detection program. Further measurements on the same patient are made using an echo-specific mask automatically generated by the computer from the original section. For an average sonographer, the manipulation takes no more than ten minutes for each view. In order to determine the reproducibility of this technique, a repeated-measurement study (T0, T1, T2) was carried out on 8 patients with moderate to severe atherosclerotic lesions at carotid localizations. The plaque areas ranged from 52.7 to 202.3 mm² (120.7 ± 61 mm²). The coefficients of correlation between the measurements (T0-T1, T0-T2) were respectively 0.93 and 0.96 (P < 0.0001). The mean coefficient of variation (± SD) was 9.8% ± 4.8. This study shows the feasibility of accurate follow-up for atherosclerotic patients, with a two-dimensional plaque quantification closer to the true evolution of the pathology than the usual scoring systems.

  16. Scan-rescan reproducibility of CT densitometric measures of emphysema

    NASA Astrophysics Data System (ADS)

    Chong, D.; van Rikxoort, E. M.; Kim, H. J.; Goldin, J. G.; Brown, M. S.

    2011-03-01

    This study investigated the reproducibility of HRCT densitometric measures of emphysema in patients scanned twice one week apart. 24 emphysema patients from a multicenter study were scanned at full inspiration (TLC) and expiration (RV), then again a week later, for four scans in total. Scans for each patient used the same scanner and protocol, except for tube current in three patients. Lung segmentation with gross airway removal was performed on the scans. Volume, weight, mean lung density (MLD), relative area below -950 HU (RA-950), and 15th percentile density (PD-15) were calculated for TLC, and volume and an air-trapping mask (RA-air) between -950 and -850 HU for RV. For each measure, absolute differences were computed for each scan pair, and linear regression was performed against volume difference in a subgroup with volume difference <500 mL. Two TLC scan pairs were excluded due to segmentation failure. The mean lung volumes were 5802 ± 1420 mL for TLC and 3878 ± 1077 mL for RV. The mean absolute differences were 169 mL for TLC volume, 316 mL for RV volume, 14.5 g for weight, 5.0 HU for MLD, 0.66 p.p. for RA-950, 2.4 HU for PD-15, and 3.1 p.p. for RA-air. The <500 mL subgroup had 20 scan pairs for TLC and RV. The R² values were 0.8 for weight, 0.60 for MLD, 0.29 for RA-950, 0.31 for PD-15, and 0.64 for RA-air. Our results indicate that considerable variability exists in densitometric measures over one week that cannot be attributed to breathhold or physiology. This has implications for clinical trials relying on these measures to assess emphysema treatment efficacy.
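    The densitometric indices themselves are simple summaries of the segmented lung voxel values in Hounsfield units; a minimal sketch (Python; the thresholds follow the abstract, and the input is assumed to be an array of already-segmented lung voxels):

      import numpy as np

      def densitometry_tlc(lung_hu):
          """MLD (HU), RA-950 (%) and PD-15 (HU) from inspiratory lung voxels."""
          lung_hu = np.asarray(lung_hu, float)
          mld = lung_hu.mean()                       # mean lung density
          ra950 = 100.0 * (lung_hu < -950).mean()    # % of voxels below -950 HU
          pd15 = np.percentile(lung_hu, 15)          # 15th percentile density
          return mld, ra950, pd15

      def ra_air_rv(lung_hu):
          """Expiratory air-trapping fraction (%) between -950 and -850 HU."""
          lung_hu = np.asarray(lung_hu, float)
          return 100.0 * ((lung_hu >= -950) & (lung_hu < -850)).mean()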

  17. Reproducibility and intraindividual variability of the pattern electroretinogram.

    PubMed

    Jacobi, P C; Walter, P; Brunner, R; Krieglstein, G K

    1994-08-01

    The human pattern electroretinogram (PERG) is a contrast-specific potential presumably reflecting the functional integrity of ganglion cells. Many studies have devised criteria that enable PERG measurements to distinguish established glaucomatous (hypertonic) eyes from normal controls. As there are relatively few reports concerning the reproducibility and reliability of the PERG, we studied the intraindividual variability of the PERG in 20 healthy subjects. Both transient and steady-state responses were recorded using a high-contrast (98%), black-and-white, counterphasing checkerboard pattern (average luminance, 80 cd/m²) generated by a television monitor (subtending angle, 13.8° x 10.8°) using three different check sizes (15', 30', and 60'). Recordings were performed in both eyes simultaneously at a 7-day interval under test-retest conditions. Responses at the 30' check size were the most consistent and resulted in a mean amplitude (± SD) of 2.18 ± 0.95 microV (P50) and 4.00 ± 1.69 microV (N95) for transient patterns and 1.84 ± 1.25 microV for steady-state patterns. No statistically significant difference was observed between right and left eyes, test and retest conditions, or 1st- and 7th-day recording sessions for PERG parameters. In linear correlation analysis there was an adequate, positive correlation between the right and left eyes (r = 0.78); a weak correlation between test and retest conditions (r = 0.58); and no correlation between measurements made at a 7-day interval. As a consequence, we conclude that the follow-up of patients (e.g., glaucoma, ocular hypertension) by means of the PERG is critical, especially when therapeutic consequences may be based on the physiological variability of a weak retinal signal. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:7804106

  18. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real, physical uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that will measure meteoroid mass indices and the associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a, b, X1, X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar in the mass range 10⁻³ > m > 10⁻⁵ g, was s = -2.10 ± 0.08. Optical data in the 10⁻¹ > m > 10⁻³ g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10⁻⁴ < m < 10⁻¹ g range. Our own code, with a simple manual and a sample dataset, can be found at ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
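    At its core the fit relates the cumulative number of meteoroids to mass through a power law, N(>m) ∝ m^(1-s) in one common convention (the paper quotes the index with the opposite sign). A least-squares sketch on synthetic data (Python; the paper's actual pipeline uses MultiNest and bias corrections, not this naive fit):

      import numpy as np

      rng = np.random.default_rng(0)
      s_true = 2.1
      # Synthetic Pareto-distributed masses above m_min = 1e-5 g.
      masses = 1e-5 * (1 + rng.pareto(s_true - 1, size=5000))

      m = np.sort(masses)
      n_cum = np.arange(len(m), 0, -1)          # N(>m) at each sampled mass
      keep = slice(0, int(0.9 * len(m)))        # drop the noisy extreme tail
      slope, _ = np.polyfit(np.log10(m[keep]), np.log10(n_cum[keep]), 1)
      print(f"estimated s = {1 - slope:.2f}")   # slope = 1 - s, so s ~ 2.1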

  19. Development of a Consistent and Reproducible Porcine Scald Burn Model

    PubMed Central

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations; 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1, 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  20. Reproducing American Sign Language sentences: cognitive scaffolding in working memory.

    PubMed

    Supalla, Ted; Hauser, Peter C; Bavelier, Daphne

    2014-01-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and biases across groups. Subjects' recall reflected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies when they failed to recall the sentence correctly. A qualitative error analysis allows us to capture generalizations about the relationship between error patterns and the cognitive scaffolding that governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with meaning equivalent to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less-fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are considered. PMID

  1. Soft and hard classification by reproducing kernel Hilbert space methods.

    PubMed

    Wahba, Grace

    2002-12-24

    Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set {yi, ti, i = 1, …, n}, where yi is the response for the ith subject, and ti is a vector of attributes for this subject. The value of yi is a label that indicates which category it came from. For the first problem, we wish to build a model from the training set that assigns to each t in an attribute domain of interest an estimate of the probability pj(t) that a (future) subject with attribute vector t is in category j. The second problem is in some sense less ambitious; it is to build a model that assigns to each t a label, which classifies a future subject with that t into one of the categories or possibly "none of the above." The approach to the first of these two problems discussed here is a special case of what is known as penalized likelihood estimation. The approach to the second problem is known as the support vector machine. We also note some alternate but closely related approaches to the second problem. These approaches are all obtained as solutions to optimization problems in RKHS. Many other problems, in particular the solution of ill-posed inverse problems, can be obtained as solutions to optimization problems in RKHS and are mentioned in passing. We caution the reader that although a large literature exists in all of these topics, in this inaugural article we are selectively highlighting work of the author, former students, and other collaborators.
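    As a concrete instance of penalized estimation in an RKHS: with a Gaussian reproducing kernel, the representer theorem reduces the infinite-dimensional optimization to a linear system in n coefficients. A minimal sketch (Python; the kernel choice, bandwidth and regularization values are illustrative, not taken from the article):

      import numpy as np

      def gaussian_kernel(A, B, gamma=1.0):
          """K(s, t) = exp(-gamma * ||s - t||^2), a reproducing kernel."""
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def fit_rkhs_ridge(T, y, lam=0.1, gamma=1.0):
          """Penalized least squares in the RKHS: the minimizer has the form
          f(t) = sum_i c_i K(t, t_i), with c solving (K + n*lam*I) c = y."""
          K = gaussian_kernel(T, T, gamma)
          n = len(T)
          c = np.linalg.solve(K + n * lam * np.eye(n), y)
          return lambda T_new: gaussian_kernel(T_new, T, gamma) @ c

      T = np.random.rand(50, 2)                         # attribute vectors t_i
      y = np.sin(4 * T[:, 0]) + 0.1 * np.random.randn(50)
      f = fit_rkhs_ridge(T, y)
      print(f(T[:3]))                                   # fitted values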

  2. Color accuracy and reproducibility in whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Hulsken, Bas

    2014-01-01

    We propose a workflow for color reproduction in whole slide imaging (WSI) scanners such that the colors in the scanned images match the actual slide color and the inter-scanner variation is minimal. We describe a new method of preparation and verification of the color phantom slide, consisting of a standard IT8-target transmissive film, which is used in color calibrating and profiling the WSI scanner. We explore several International Color Consortium (ICC) compliant techniques in color calibration/profiling and rendering intents for translating the scanner-specific colors to the standard display (sRGB) color space. Based on the quality of the color reproduction in histopathology slides, we propose the matrix-based calibration/profiling and absolute colorimetric rendering approach. The main advantage of the proposed workflow is that it is compliant with the ICC standard, applicable to color management systems on different platforms, and involves no external color measurement devices. We quantify color difference using the CIE-DeltaE2000 metric, where DeltaE values below 1 are considered imperceptible. Our evaluation on 14 phantom slides, manufactured according to the proposed method, shows an average inter-slide color difference below 1 DeltaE. The proposed workflow is implemented and evaluated in 35 WSI scanners developed at Philips, called the Ultra Fast Scanners (UFS). The color accuracy, measured as DeltaE between the scanner-reproduced colors and the reference colorimetric values of the phantom patches, is improved on average to 3.5 DeltaE in calibrated scanners, from 10 DeltaE in uncalibrated scanners. The average inter-scanner color difference is found to be 1.2 DeltaE. The improvement in color performance upon using the proposed method is apparent in the visual color quality of the tissue scans. PMID:26158041
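    The matrix-based profiling step amounts to a least-squares fit from device responses to reference colorimetry over the phantom patches. A minimal sketch (Python; the patch data and device distortion are synthetic, and this shows only the generic matrix fit, not Philips' calibration pipeline or the CIEDE2000 evaluation):

      import numpy as np

      def fit_calibration_matrix(device_rgb, ref_xyz):
          """3x3 matrix M minimizing ||device_rgb @ M - ref_xyz|| over patches."""
          M, *_ = np.linalg.lstsq(device_rgb, ref_xyz, rcond=None)
          return M

      # Synthetic IT8-like data: 24 patches seen through a known distortion.
      rng = np.random.default_rng(1)
      ref = rng.random((24, 3))
      distort = np.array([[1.1, 0.02, 0.0], [0.05, 0.9, 0.01], [0.0, 0.03, 1.05]])
      dev = ref @ distort
      M = fit_calibration_matrix(dev, ref)
      print(np.abs(dev @ M - ref).max())   # residual ~0 after calibration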

  3. Development of a Consistent and Reproducible Porcine Scald Burn Model.

    PubMed

    Andrews, Christine J; Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations; 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1, 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  4. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS (and WFPC2 parallel) observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical (or "Einstein") timescale of each microlensing event, rather than an effective ("FWHM") timescale, allowing masses to be determined more than twice as accurately as without HST data. The Einstein timescale is the ratio of the lensing cross-sectional radius and the relative velocity. Velocities are known from kinematics, and the cross-section is directly proportional to the (unknown) lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo (for the same number of microlensing events) due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database (about 350 nights). For the whole survey (and a delta-function mass distribution) the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity
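    The quantities involved are tied together by one relation: the Einstein radius obeys R_E^2 = (4GM/c^2) * D_l (D_s - D_l) / D_s, so the timescale t_E = R_E / v scales as the square root of the lens mass. An illustrative calculation (Python; the distances and velocity below are rough M31 self-lensing values chosen for the example, not the proposal's measurements):

      import math

      G, C = 6.674e-11, 2.998e8            # SI units
      KPC, MSUN = 3.086e19, 1.989e30       # m per kpc, kg per solar mass

      def einstein_time_days(m_sun, d_l_kpc, d_s_kpc, v_kms):
          """t_E = R_E / v_rel, with R_E^2 proportional to the lens mass M."""
          d_l, d_s = d_l_kpc * KPC, d_s_kpc * KPC
          r_e = math.sqrt(4 * G * m_sun * MSUN / C**2 * d_l * (d_s - d_l) / d_s)
          return r_e / (v_kms * 1e3) / 86400.0

      print(einstein_time_days(0.5, 769.0, 770.0, 200.0))   # roughly 17 days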

  5. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one-intermediate-state model is employed. A modification of this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
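    The role of the median function in such reconstructions can be made concrete through the identity median(a, b, c) = a + minmod(b - a, c - a), where a minmod of one-sided differences yields a monotonicity-preserving slope. A minimal sketch (Python; this is the generic MUSCL-type minmod limiter, not the paper's exact constraint):

      import numpy as np

      def minmod(x, y):
          """Zero on sign disagreement, else the smaller-magnitude argument."""
          return 0.5 * (np.sign(x) + np.sign(y)) * np.minimum(np.abs(x), np.abs(y))

      def median3(a, b, c):
          """Median of three values, written via minmod."""
          return a + minmod(b - a, c - a)

      def limited_slopes(u):
          """Slopes for piecewise-linear reconstruction in interior cells,
          limited so the reconstructed data stay monotone."""
          return minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])

      print(median3(0.0, 0.5, -0.2))                     # 0.0, the middle value
      u = np.array([0.0, 0.0, 0.2, 1.0, 1.0, 0.9])
      print(limited_slopes(u))   # slopes vanish at plateaus and extrema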

  6. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2016-07-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. At the beginning the thermometers are at ambient temperature, and then they are immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter, equal to 15 mm. One of them is a K-type industrial thermometer that is widely available commercially. The temperature indicated by this thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was also proposed and used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheathed thermocouple located at its center. The temperature of the fluid was determined from measurements taken on the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results than measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model of the thermometer. By comparing the results, it was demonstrated that the new thermometer allows the fluid temperature to be obtained much faster and with higher accuracy than the industrial thermometer. Accurate measurements of fast-changing fluid temperature are possible due to the low-inertia thermometer and the fast space marching method applied for solving the inverse heat conduction problem.
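    The first-order correction mentioned above is a one-line inversion of the sensor model tau * dT_ind/dt + T_ind = T_fluid. A minimal sketch on a synthetic step change (Python; the time constant tau must be identified for the real thermometer, and the second-order model adds a further derivative term):

      import numpy as np

      def correct_first_order(t, t_ind, tau):
          """Recover fluid temperature from the indicated temperature trace:
          T_fluid = T_ind + tau * dT_ind/dt (first-order inertia model)."""
          return t_ind + tau * np.gradient(t_ind, t)

      # Synthetic check: step from 20 C to 100 C seen through a tau = 5 s sensor.
      t = np.linspace(0.0, 30.0, 301)
      tau = 5.0
      t_ind = 100.0 + (20.0 - 100.0) * np.exp(-t / tau)
      print(correct_first_order(t, t_ind, tau)[[10, 150, 300]])   # ~100 C each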

  7. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  8. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  9. Accurate density functional thermochemistry for larger molecules.

    SciTech Connect

    Raghavachari, K.; Stefanov, B. B.; Curtiss, L. A.; Lucent Tech.

    1997-06-20

    Density functional methods are combined with isodesmic bond separation reaction energies to yield accurate thermochemistry for larger molecules. Seven different density functionals are assessed for the evaluation of heats of formation, ΔHf° (298 K), for a test set of 40 molecules composed of H, C, O and N. The use of bond separation energies results in a dramatic improvement in the accuracy of all the density functionals. The B3-LYP functional has the smallest mean absolute deviation from experiment (1.5 kcal mol⁻¹).

  11. Universality: Accurate Checks in Dyson's Hierarchical Model

    NASA Astrophysics Data System (ADS)

    Godina, J. J.; Meurice, Y.; Oktay, M. B.

    2003-06-01

    In this talk we present high-accuracy calculations of the susceptibility near βc for Dyson's hierarchical model in D = 3. Using linear fitting, we estimate the leading (γ) and subleading (Δ) exponents. Independent estimates are obtained by calculating the first two eigenvalues of the linearized renormalization group transformation. We found γ = 1.29914073 ± 10⁻⁸ and Δ = 0.4259469 ± 10⁻⁷, independently of the choice of local integration measure (Ising or Landau-Ginzburg). After a suitable rescaling, the approximate fixed points for a large class of local measures coincide accurately with a fixed point constructed by Koch and Wittwer.

  12. Methods for Computing Accurate Atomic Spin Moments for Collinear and Noncollinear Magnetism in Periodic and Nonperiodic Materials.

    PubMed

    Manz, Thomas A; Sholl, David S

    2011-12-13

    The partitioning of electron spin density among atoms in a material gives atomic spin moments (ASMs), which are important for understanding magnetic properties. We compare ASMs computed using different population analysis methods and introduce a method for computing density derived electrostatic and chemical (DDEC) ASMs. Bader and DDEC ASMs can be computed for periodic and nonperiodic materials with either collinear or noncollinear magnetism, while natural population analysis (NPA) ASMs can be computed for nonperiodic materials with collinear magnetism. Our results show Bader, DDEC, and (where applicable) NPA methods give similar ASMs, but different net atomic charges. Because they are optimized to reproduce both the magnetic field and the chemical states of atoms in a material, DDEC ASMs are especially suitable for constructing interaction potentials for atomistic simulations. We describe the computation of accurate ASMs for (a) a variety of systems using collinear and noncollinear spin DFT, (b) highly correlated materials (e.g., magnetite) using DFT+U, and (c) various spin states of ozone using coupled cluster expansions. The computed ASMs are in good agreement with available experimental results for a variety of periodic and nonperiodic materials. Examples considered include the antiferromagnetic metal organic framework Cu3(BTC)2, several ozone spin states, mono- and binuclear transition metal complexes, ferri- and ferro-magnetic solids (e.g., Fe3O4, Fe3Si), and simple molecular systems. We briefly discuss the theory of exchange-correlation functionals for studying noncollinear magnetism. A method for finding the ground state of systems with highly noncollinear magnetism is introduced. We use these methods to study the spin-orbit coupling potential energy surface of the single molecule magnet Fe4C40H52N4O12, which has highly noncollinear magnetism, and find that it contains unusual features that give a new interpretation to experimental data.

  13. NEAMS Experimental Support for Code Validation, INL FY2009

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatore; C. Rabiti

    2009-09-01

    The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Whereas the Verification part of the process does not rely on experiment, the Validation part requires as much relevant and precise experimental data as possible to ensure that the models reproduce reality as closely as possible. Hence, this report presents a limited selection of experimental data that could be used to validate the codes devoted mainly to Fast Neutron Reactor calculations in the US. Emphasis has been put on existing data for thermal-hydraulics, fuel and reactor physics. The principles of a new "smart" experiment that could be used to improve our knowledge of neutron cross-sections are presented as well. In short, it consists of irradiating a few milligrams of actinides and analyzing the results with Accelerator Mass Spectroscopy to infer the neutron cross-sections. Finally, the wealth of experimental data relevant to Fast Neutron Reactors in the US should not be taken for granted: efforts should be put into saving these 30- to 40-year-old data and into making sure they are validation-worthy, i.e., that the experimental conditions and uncertainties are well documented.

  14. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  15. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012); doi:10.1021/ct300544e] to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  16. Accurate determination of characteristic relative permeability curves

    NASA Astrophysics Data System (ADS)

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory-measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However, this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests, as sketched below. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided the flow rate is sufficiently high and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.
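
    As a reminder of the standard steady-state interpretation referenced above, effective relative permeability follows directly from Darcy's law. The core and fluid values in this sketch are hypothetical, not data from the paper.

        def effective_rel_perm(q, mu, L, k_abs, A, dP):
            # Darcy's law for one phase at steady state:
            # kr = q * mu * L / (k_abs * A * dP).
            # SI units: q [m^3/s], mu [Pa.s], L [m], k_abs [m^2], A [m^2], dP [Pa].
            return q * mu * L / (k_abs * A * dP)

        # Hypothetical coreflood: 2 mL/min of water through a 5 cm core
        # of ~200 mD absolute permeability and 3.8 cm diameter.
        q = 2e-6 / 60.0              # m^3/s
        mu = 1.0e-3                  # Pa.s
        L = 0.05                     # m
        A = 3.1416 * 0.019 ** 2      # m^2
        kr_w = effective_rel_perm(q, mu, L, k_abs=200e-15, A=A, dP=50e3)  # ~0.15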

  17. How Accurately can we Calculate Thermal Systems?

    SciTech Connect

    Cullen, D; Blomquist, R N; Dean, C; Heinrichs, D; Kalugin, M A; Lee, M; Lee, Y; MacFarlan, R; Nagaya, Y; Trkov, A

    2004-04-20

    I would like to determine how accurately a variety of neutron transport code packages (code and cross section libraries) can calculate simple integral parameters, such as k_eff, for systems that are sensitive to thermal neutron scattering. Since we will only consider theoretical systems, we cannot really determine absolute accuracy compared to any real system. Therefore, rather than accuracy, it would be more precise to say that I would like to determine the spread in answers that we obtain from a variety of code packages. This spread should serve as an excellent indicator of how accurately we can really model and calculate such systems today. Hopefully, this will eventually lead to improvements in both our codes and the thermal scattering models that they use. In order to accomplish this, I propose a number of extremely simple systems that involve thermal neutron scattering and that can be easily modeled and calculated by a variety of neutron transport codes. These are theoretical systems designed to emphasize the effects of thermal scattering, since that is what we are interested in studying. I have attempted to keep these systems very simple, and yet at the same time they include most, if not all, of the important thermal scattering effects encountered in a large, water-moderated, uranium-fueled thermal system, i.e., our typical thermal reactors.

  18. Accurate Stellar Parameters for Exoplanet Host Stars

    NASA Astrophysics Data System (ADS)

    Brewer, John Michael; Fischer, Debra; Basu, Sarbani; Valenti, Jeff A.

    2015-01-01

    A large impediment to our understanding of planet formation is obtaining a clear picture of planet radii and densities. Although determining precise ratios between a planet and its stellar host is relatively easy, determining accurate stellar parameters is still a difficult and costly undertaking. High-resolution spectral analysis has traditionally yielded precise values for some stellar parameters, but stars in common between catalogs from different authors, or analyzed using different techniques, often show offsets far in excess of their uncertainties. Most analyses now use some external constraint, when available, to break observed degeneracies between surface gravity, effective temperature, and metallicity, which can otherwise lead to correlated errors in results. However, these external constraints are impossible to obtain for all stars and can require more costly observations than the initial high-resolution spectra. We demonstrate that these discrepancies can be mitigated by use of a larger line list that has carefully tuned atomic line data. We use an iterative modeling technique that does not require external constraints. We compare the surface gravity obtained with our spectral synthesis modeling to asteroseismically determined values for 42 Kepler stars. Our analysis agrees well, with only a 0.048 dex offset and an rms scatter of 0.05 dex. Such accurate stellar gravities can reduce the primary source of uncertainty in radii by almost an order of magnitude over unconstrained spectral analysis.

  19. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture.

    PubMed

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-09-22

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain.

  20. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    PubMed Central

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681

  1. Research Elements: new article types by Elsevier to facilitate reproducibility in science

    NASA Astrophysics Data System (ADS)

    Zudilova-Seinstra, Elena; van Hensbergen, Kitty; Wacek, Bart

    2016-04-01

    When researchers start to make plans for new experiments, this marks the beginning of a whole cycle of work: experimental design, tweaking of existing methods, developing protocols, writing code, collecting and processing experimental data, and so on. A large part of this very useful information rarely gets published, which makes experiments difficult to reproduce. The same holds for experimental data, which is not always provided in a reusable format and often lacks descriptive information. Furthermore, many types of data, such as replication data, negative datasets or data from "intermediate experiments", often don't get published because they have no place in a research journal. To address this concern, Elsevier launched a series of peer-reviewed journal titles grouped under the umbrella of Research Elements (https://www.elsevier.com/books-and-journals/research-elements) that allow researchers to publish their data, software, materials and methods and other elements of the research cycle in a brief article format. To facilitate reproducibility, Research Elements have thoroughly thought-out submission templates that include all necessary information and metadata, as well as peer-review criteria defined per article type. Research Elements can be applicable to multiple research areas; for example, a number of multidisciplinary journals (Data in Brief, SoftwareX, MethodsX) welcome submissions from a large number of subject areas. At other times, these elements are better served within a single field; therefore, a number of domain-specific journals (e.g., Genomics Data, Chemical Data Collections, Neurocomputing) support the new article formats, too. Upon publication, all Research Elements are assigned persistent identifiers for direct citation and easy discoverability. Persistent identifiers are also used for interlinking Research Elements and relevant research papers published in traditional journals. Some Research Elements allow post-publication article updates.

  2. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records

    PubMed Central

    Zozus, Meredith N.; Richesson, Rachel L.; Walden, Anita; Tenenbaum, Jessie D.; Hammond, W.E.

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  3. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records.

    PubMed

    Zozus, Meredith N; Richesson, Rachel L; Walden, Anita; Tenenbaum, Jessie D; Hammond, W E

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  4. Accurate spectral modeling for infrared radiation

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gupta, S. K.

    1977-01-01

    Direct line-by-line integration and quasi-random band model techniques are employed to calculate the spectral transmittance and total band absorptance of the 4.7 micron CO, 4.3 micron CO2, 15 micron CO2, and 5.35 micron NO bands. Results are obtained for different pressures, temperatures, and path lengths, and are compared with available theoretical and experimental investigations. For each gas, extensive tabulations of results are presented for comparative purposes. In almost all cases, line-by-line results are found to be in excellent agreement with the experimental values. The range of validity of the other models and correlations is discussed.
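
    The line-by-line idea reduces, for a single spectral line, to integrating 1 - exp(-kappa*u) across the band. The Lorentz line parameters below are illustrative only, not values from the study.

        import numpy as np

        def band_absorptance(nu, kappa, u):
            # Total band absorptance A = integral of (1 - exp(-kappa(nu)*u)) dnu,
            # evaluated with a hand-rolled trapezoid rule; nu in cm^-1.
            f = 1.0 - np.exp(-kappa * u)
            return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(nu))

        # Illustrative Lorentz line near the 4.7 micron CO band.
        nu = np.linspace(2100.0, 2200.0, 2001)
        S, gamma, nu0 = 50.0, 0.1, 2150.0
        kappa = S * gamma / np.pi / ((nu - nu0) ** 2 + gamma ** 2)
        A = band_absorptance(nu, kappa, u=0.5)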

  5. SOPROLIFE System: An Accurate Diagnostic Enhancer

    PubMed Central

    Zeitouny, Mona; Feghali, Mireille; Nasr, Assaad; Abou-Samra, Philippe; Saleh, Nadine; Bourgeois, Denis; Farge, Pierre

    2014-01-01

    Objectives. The aim of this study was to evaluate a light-emitting diode fluorescence tool, the SOPROLIFE light-induced fluorescence evaluator, and compare it to the International Caries Detection and Assessment System II (ICDAS-II) in the detection of occlusal caries. Methods. A total of 219 permanent posterior teeth in 21 subjects, with ages ranging from 15 to 65 years, were examined. An intraclass correlation coefficient (ICC) was computed to assess the reliability between the two diagnostic methods. Results. The results showed a high reliability between the two methods (ICC = 0.92; CI = 0.901–0.940; P < 0.001). The SOPROLIFE blue fluorescence mode had a high sensitivity (87%) and a high specificity (99%) when compared to ICDAS-II. Conclusion. Compared to the most widely used visual method for diagnosing occlusal caries lesions, the findings from this study suggest that SOPROLIFE can be used as a reproducible and reliable assessment tool. At a cut-off point categorizing noncarious lesions and visual change in enamel, SOPROLIFE shows a high sensitivity and specificity. In terms of cost, ICDAS is preferable to SOPROLIFE; however, SOPROLIFE is easier for clinicians, since it involves a simple evaluation of images. Finally, in terms of efficiency, SOPROLIFE is not superior to ICDAS but tends to be equivalent, with the same advantages. PMID:25401161

  6. An electrostatic mechanism closely reproducing observed behavior in the bacterial flagellar motor.

    PubMed

    Walz, D; Caplan, S R

    2000-02-01

    A mechanism coupling the transmembrane flow of protons to the rotation of the bacterial flagellum is studied. The coupling is accomplished by means of an array of tilted rows of positive and negative charges around the circumference of the rotor, which interacts with a linear array of proton binding sites in channels. We present a rigorous treatment of the electrostatic interactions using minimal assumptions. Interactions with the transition states are included, as well as proton-proton interactions in and between channels. In assigning values to the parameters of the model, experimentally determined structural characteristics of the motor have been used. According to the model, switching and pausing occur as a consequence of modest conformational changes in the rotor. In contrast to similar approaches developed earlier, this model closely reproduces a large number of experimental findings from different laboratories, including the nonlinear behavior of the torque-frequency relation in Escherichia coli, the stoichiometry of the system in Streptococcus, and the pH-dependence of swimming speed in Bacillus subtilis. PMID:10653777

  7. An electrostatic mechanism closely reproducing observed behavior in the bacterial flagellar motor.

    PubMed Central

    Walz, D; Caplan, S R

    2000-01-01

    A mechanism coupling the transmembrane flow of protons to the rotation of the bacterial flagellum is studied. The coupling is accomplished by means of an array of tilted rows of positive and negative charges around the circumference of the rotor, which interacts with a linear array of proton binding sites in channels. We present a rigorous treatment of the electrostatic interactions using minimal assumptions. Interactions with the transition states are included, as well as proton-proton interactions in and between channels. In assigning values to the parameters of the model, experimentally determined structural characteristics of the motor have been used. According to the model, switching and pausing occur as a consequence of modest conformational changes in the rotor. In contrast to similar approaches developed earlier, this model closely reproduces a large number of experimental findings from different laboratories, including the nonlinear behavior of the torque-frequency relation in Escherichia coli, the stoichiometry of the system in Streptococcus, and the pH-dependence of swimming speed in Bacillus subtilis. PMID:10653777

  8. Reproducing kernel potential energy surfaces in biomolecular simulations: Nitric oxide binding to myoglobin

    SciTech Connect

    Soloviov, Maksym; Meuwly, Markus

    2015-09-14

    Multidimensional potential energy surfaces based on reproducing kernel interpolation are employed to explore the energetics and dynamics of free and bound nitric oxide in myoglobin (Mb). Combining a force field description for the majority of degrees of freedom with the higher-accuracy representation for the NO ligand and the Fe out-of-plane motion allows for a simulation approach akin to a mixed quantum mechanics/molecular mechanics treatment. However, the kernel representation can be evaluated at conventional force-field speed. With the explicit inclusion of the Fe out-of-plane (Fe-oop) coordinate, the dynamics and structural equilibrium after photodissociation of the ligand are correctly described compared to experiment. Experimentally, the Fe-oop coordinate plays an important role for the ligand dynamics. This is also found here, where the isomerization dynamics between the Fe–ON and Fe–NO states is significantly affected by whether or not this coordinate is explicitly included. Although the Fe–ON conformation is metastable when considering only the bound ²A state, it may disappear once the ⁴A state is included. This explains the absence of the Fe–ON state in previous experimental investigations of MbNO.
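
    As a schematic of kernel-based potential energy surface interpolation, the sketch below fits energies at reference geometries and evaluates smoothly elsewhere. It uses a Gaussian kernel over a hypothetical one-dimensional descriptor for brevity; the kernels and coordinates used in the paper differ.

        import numpy as np

        def kernel(X, Y, sigma=1.0):
            # Gaussian reproducing kernel between rows of descriptors X and Y.
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def fit_pes(X_train, E_train, sigma=1.0, reg=1e-10):
            # Solve (K + reg*I) c = E for the interpolation coefficients c.
            K = kernel(X_train, X_train, sigma)
            return np.linalg.solve(K + reg * np.eye(len(E_train)), E_train)

        def eval_pes(X_new, X_train, c, sigma=1.0):
            # Interpolated energy: E(x) = sum_i c_i k(x, x_i).
            return kernel(X_new, X_train, sigma) @ c

        # Toy 1-D bond-distance scan with Morse-like synthetic energies.
        X = np.linspace(0.9, 2.5, 9)[:, None]
        E = (1.0 - np.exp(-(X[:, 0] - 1.2))) ** 2
        c = fit_pes(X, E)
        E_interp = eval_pes(np.array([[1.5]]), X, c)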

  9. An accurate equation of state for fluids and solids.

    PubMed

    Parsafar, G A; Spohr, H V; Patey, G N

    2009-09-01

    A simple functional form for a general equation of state based on an effective near-neighbor pair interaction of an extended Lennard-Jones (12,6,3) type is given and tested against experimental data for a wide variety of fluids and solids. Computer simulation results for ionic liquids are used for further evaluation. For fluids, there appears to be no upper density limitation on the equation of state. The lower density limit for isotherms near the critical temperature is the critical density. The equation of state gives a good description of all types of fluids, nonpolar (including long-chain hydrocarbons), polar, hydrogen-bonded, and metallic, at temperatures ranging from the triple point to the highest temperature for which there are experimental data. For solids, the equation of state is very accurate for all types considered, including covalent, molecular, metallic, and ionic systems. The experimental pVT data available for solids do not reveal any pressure or temperature limitations. An analysis of the importance and possible underlying physical significance of the terms in the equation of state is given. PMID:19678647
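
    For orientation, the effective near-neighbor pair function named above has the (12,6,3) form sketched here. The coefficients are free fit parameters, and the values shown are purely illustrative, not ones from the paper.

        def extended_lj_1263(r, a12, a6, a3):
            # Extended Lennard-Jones (12,6,3) effective pair energy:
            # E(r) = a12/r^12 + a6/r^6 + a3/r^3 (signs absorbed into the
            # coefficients, which are regressed from pVT data in practice).
            return a12 / r**12 + a6 / r**6 + a3 / r**3

        # Illustrative evaluation at a reduced separation of 1.1.
        e = extended_lj_1263(1.1, a12=1.0, a6=-2.0, a3=0.1)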

  10. An accurate potential energy curve for helium based on ab initio calculations

    NASA Astrophysics Data System (ADS)

    Janzen, A. R.; Aziz, R. A.

    1997-07-01

    Korona, Williams, Bukowski, Jeziorski, and Szalewicz [J. Chem. Phys. 106, 1 (1997)] constructed a completely ab initio potential for He2 by fitting their calculations using infinite order symmetry adapted perturbation theory at intermediate range, existing Green's function Monte Carlo calculations at short range and accurate dispersion coefficients at long range to a modified Tang-Toennies potential form. The potential with retardation added to the dipole-dipole dispersion is found to predict accurately a large set of microscopic and macroscopic experimental data. The potential with a significantly larger well depth than other recent potentials is judged to be the most accurate characterization of the helium interaction yet proposed.

  11. Quantitative real-time PCR for rapid and accurate titration of recombinant baculovirus particles.

    PubMed

    Hitchman, Richard B; Siaterli, Evangelia A; Nixon, Clare P; King, Linda A

    2007-03-01

    We describe the use of quantitative PCR (QPCR) to titer recombinant baculoviruses. Custom primers and a probe were designed against gp64 and used to calculate a standard curve of QPCR-derived titers from dilutions of a previously titrated baculovirus stock. Each dilution was titrated by both plaque assay and QPCR, producing a consistent and reproducible inverse relationship between Ct and plaque-forming units per milliliter. No significant difference was observed between titers produced by QPCR and plaque assay for 12 recombinant viruses, confirming the validity of this technique as a rapid and accurate method of baculovirus titration.
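
    The titration reduces to a standard curve of Ct against log10 titer built from a plaque-assayed reference stock. The dilution series and Ct values below are hypothetical.

        import numpy as np

        def fit_standard_curve(log10_titer, ct):
            # Linear standard curve: Ct = slope * log10(titer) + intercept.
            slope, intercept = np.polyfit(log10_titer, ct, 1)
            return slope, intercept

        def titer_from_ct(ct, slope, intercept):
            # Invert the curve to estimate pfu/mL for an unknown sample.
            return 10.0 ** ((ct - intercept) / slope)

        # Hypothetical dilution series of a plaque-titrated stock.
        log10_titer = np.array([8.0, 7.0, 6.0, 5.0, 4.0])
        ct = np.array([12.1, 15.4, 18.8, 22.1, 25.5])
        slope, intercept = fit_standard_curve(log10_titer, ct)
        unknown = titer_from_ct(17.0, slope, intercept)  # ~3e6 pfu/mL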

  12. Accurate bulk density determination of irregularly shaped translucent and opaque aerogels

    NASA Astrophysics Data System (ADS)

    Petkov, M. P.; Jones, S. M.

    2016-05-01

    We present a volumetric method for accurate determination of bulk density of aerogels, calculated from extrapolated weight of the dry pure solid and volume estimates based on the Archimedes' principle of volume displacement, using packed 100 μm-sized monodispersed glass spheres as a "quasi-fluid" media. Hard particle packing theory is invoked to demonstrate the reproducibility of the apparent density of the quasi-fluid. Accuracy rivaling that of the refractive index method is demonstrated for both translucent and opaque aerogels with different absorptive properties, as well as for aerogels with regular and irregular shapes.
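
    The arithmetic of the method is simple: the mass of quasi-fluid spheres excluded by the sample, divided by the packing's apparent density, gives the sample volume. A sketch with invented numbers:

        def bulk_density(m_sample, m_spheres_alone, m_spheres_with_sample,
                         rho_packed):
            # Displaced volume = sphere mass excluded by the sample divided by
            # the apparent density of the packed quasi-fluid; bulk density is
            # then sample mass over that volume.
            v_sample = (m_spheres_alone - m_spheres_with_sample) / rho_packed
            return m_sample / v_sample

        # Invented example: 0.150 g aerogel; the filled container holds 1.95 g
        # less glass than without the sample; packed density 1.50 g/cm^3.
        rho = bulk_density(0.150, 25.00, 23.05, 1.50)  # ~0.115 g/cm^3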

  13. Arthroscopically assisted Latarjet procedure: A new surgical approach for accurate coracoid graft placement and compression

    PubMed Central

    Taverna, Ettore; Ufenast, Henri; Broffoni, Laura; Garavaglia, Guido

    2013-01-01

    The Latarjet procedure is a confirmed method for the treatment of shoulder instability in the presence of bone loss. It is a challenging procedure for which a key point is the correct placement of the coracoid graft onto the glenoid neck. We here present our technique for an arthroscopically assisted Latarjet procedure with a new drill guide, permitting accurate and reproducible positioning of the coracoid graft, with optimal compression of the graft onto the glenoid neck due to the perfect position of the screws: perpendicular to the graft and the glenoid neck, and parallel to each other. PMID:24167405

  14. Arthroscopically assisted Latarjet procedure: A new surgical approach for accurate coracoid graft placement and compression.

    PubMed

    Taverna, Ettore; Ufenast, Henri; Broffoni, Laura; Garavaglia, Guido

    2013-07-01

    The Latarjet procedure is a confirmed method for the treatment of shoulder instability in the presence of bone loss. It is a challenging procedure for which a key point is the correct placement of the coracoid graft onto the glenoid neck. We here present our technique for an arthroscopically assisted Latarjet procedure with a new drill guide, permitting accurate and reproducible positioning of the coracoid graft, with optimal compression of the graft onto the glenoid neck due to the perfect position of the screws: perpendicular to the graft and the glenoid neck, and parallel to each other.

  15. Interlaboratory reproducibility of standard accelerated aging methods for oxidation of UHMWPE.

    PubMed

    Kurtz, S M; Muratoglu, O K; Buchanan, F; Currier, B; Gsell, R; Greer, K; Gualtieri, G; Johnson, R; Schaffner, S; Sevo, K; Spiegelberg, S; Shen, F W; Yau, S S

    2001-07-01

    During accelerated aging, experimental uncertainty may arise due to variability in the oxidation process, or due to limitations in the technique that is ultimately used to measure oxidation. The purpose of the present interlaboratory study was to quantify the repeatability and reproducibility of standard accelerated aging methods for ultra-high molecular weight polyethylene (UHMWPE). Sections (200 μm thick) were microtomed from the center of an extruded rod of GUR 4150 HP, gamma irradiated in air or nitrogen, and circulated to 12 institutions in the United States and Europe for characterization of oxidation before and after accelerated aging. Specimens were aged for 3 weeks at 80 °C in an air-circulating oven or for 2 weeks at 70 °C in an oxygen bomb (maintained at 503 kPa (5 atm) of O2) in accordance with the two standard protocols described in ASTM F 2003-00. FTIR spectra were collected from each specimen within 24 h of the start and finish of accelerated aging, and oxidation indices were calculated by normalizing the peak area of the carbonyl region by the reference peak areas at 1370 or 2022 cm⁻¹. The mean relative interlaboratory uncertainty of the oxidation data was 78.5% after oven aging and 129.1% after bomb aging. The oxidation index measurement technique was not found to be a significant factor in the reproducibility. Comparable relative intrainstitutional uncertainty was observed after oven aging and bomb aging. For both aging methods, institutions successfully discriminated between air-irradiated and control specimens. However, the large interinstitutional variation suggests that absolute performance standards for the oxidation index of UHMWPE after accelerated aging may not be practical at the present time.
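
    An oxidation index of the kind described is just a ratio of integrated band areas from the FTIR spectrum. The band limits in this sketch are illustrative, not the ones specified by the standard.

        import numpy as np

        def peak_area(wavenumber, absorbance, lo, hi):
            # Trapezoidal area of a band between lo and hi (cm^-1).
            m = (wavenumber >= lo) & (wavenumber <= hi)
            x, y = wavenumber[m], absorbance[m]
            return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

        def oxidation_index(wavenumber, absorbance, ref_center=1370.0):
            # Carbonyl band area (illustrative limits, ~1650-1850 cm^-1)
            # normalized by a reference band at 1370 or 2022 cm^-1.
            carbonyl = peak_area(wavenumber, absorbance, 1650.0, 1850.0)
            reference = peak_area(wavenumber, absorbance,
                                  ref_center - 15.0, ref_center + 15.0)
            return carbonyl / reference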

  16. Comprehensive and Reproducible Phosphopeptide Enrichment Using Iron Immobilized Metal Ion Affinity Chromatography (Fe-IMAC) Columns

    PubMed Central

    Ruprecht, Benjamin; Koch, Heiner; Medard, Guillaume; Mundt, Max; Kuster, Bernhard; Lemeer, Simone

    2015-01-01

    Advances in phosphopeptide enrichment methods enable the identification of thousands of phosphopeptides from complex samples. Current offline enrichment approaches using TiO2, Ti, and Fe immobilized metal ion affinity chromatography (IMAC) material in batch or microtip format are widely used, but they suffer from irreproducibility and compromised selectivity. To address these shortcomings, we revisited the merits of performing phosphopeptide enrichment in an HPLC column format. We found that Fe-IMAC columns enabled the selective, comprehensive, and reproducible enrichment of phosphopeptides out of complex lysates. Column enrichment did not suffer from bead-to-sample ratio issues and scaled linearly from 100 μg to 5 mg of digest. Direct measurements on an Orbitrap Velos mass spectrometer identified >7500 unique phosphopeptides with 90% selectivity and good quantitative reproducibility (median CV of 15%). The number of unique phosphopeptides could be increased to more than 14,000 when the IMAC eluate was subjected to a subsequent hydrophilic strong anion exchange separation. Fe-IMAC columns outperformed Ti-IMAC and TiO2 in batch or tip mode in terms of phosphopeptide identification and intensity. Permutation enrichments of flow-throughs showed that all materials largely bound the same phosphopeptide species, independent of physicochemical characteristics. However, binding capacity and elution efficiency did profoundly differ among the enrichment materials and formats. As a result, the often quoted orthogonality of the materials has to be called into question. Our results strongly suggest that insufficient capacity, inefficient elution, and the stochastic nature of data-dependent acquisition in mass spectrometry are the causes of the experimentally observed complementarity. The Fe-IMAC enrichment workflow using an HPLC format developed here enables rapid and comprehensive phosphoproteome analysis that can be applied to a wide range of biological systems. PMID

  17. Reproducible voluntary muscle performance during constant work rate dynamic leg exercise.

    PubMed

    Fulco, C S; Rock, P B; Muza, S R; Lammi, E; Cymerman, A; Lewis, S F

    2000-02-01

    During constant-intensity treadmill or cycle exercise, progressive muscle fatigue is not readily quantified and endurance time is poorly reproducible. However, integration of dynamic knee extension (DKE) exercise with serial measurement of maximal voluntary contraction (MVC) force of the knee extensor muscles permits close tracking of leg fatigue. We studied the reproducibility of four performance indices: MVC force of rested muscle (MVC(rest)), rate of MVC force fall, time to exhaustion, and percentage of MVC(rest) (%MVC(rest)) at exhaustion in 11 healthy women (22 ± 1 yr) during identical constant work rate one-leg DKE (1 Hz) on 2 separate days at sea level (30 m). Means ± SD for the two test days, and the correlations (r), standard errors of estimate, and coefficients of variation (CV%) between days were, respectively: a) MVC(rest) (N), 524 ± 99 vs 517 ± 111, 0.91, 43.0, 4.9%; b) MVC force fall (N·min⁻¹), -10.77 ± 9.3 vs -11.79 ± 12.1, 0.94, 3.6, 26.5%; c) time to exhaustion (min), 22.6 ± 12 vs 23.9 ± 14, 0.98, 2.7, 7.5%; and d) %MVC(rest) at exhaustion, 65 ± 13 vs 62 ± 14, 0.85, 7.8, 5.6%. There were no statistically significant mean differences between the two test days for any of the performance measures. To demonstrate the potential benefits of evaluating multiple effects of an experimental intervention, nine of the women were again tested within 24 h of arriving at 4,300 m altitude using the identical force, velocity, power output, and energy requirement during constant work rate dynamic leg exercise. The low variability of each performance index enhanced the ability to describe the effects of acute altitude exposure on voluntary muscle function.
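
    Between-day statistics of the kind reported can be computed from paired test-retest data as follows. This is a generic sketch, not the authors' analysis code.

        import numpy as np

        def reproducibility_stats(day1, day2):
            # Pearson r, standard error of estimate (SEE) of day-2 on day-1
            # values, and mean within-subject coefficient of variation (CV%).
            r = np.corrcoef(day1, day2)[0, 1]
            slope, intercept = np.polyfit(day1, day2, 1)
            resid = day2 - (slope * day1 + intercept)
            see = np.sqrt(np.sum(resid ** 2) / (len(day1) - 2))
            within_sd = np.abs(day1 - day2) / np.sqrt(2.0)  # SD of two tests
            cv = 100.0 * np.mean(within_sd / ((day1 + day2) / 2.0))
            return r, see, cv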

  18. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both ⁴He and ¹²C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  19. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both ⁴He and ¹²C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  20. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.
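
    As a schematic of the coordinate processing means, a single revolute joint maps encoder counts to a cylindrical coordinate triple for the probe tip. The geometry values here are invented, not from the patent.

        import math

        def probe_tip_cylindrical(counts, counts_per_rev, arm_length, z_offset):
            # One-joint arm: the encoder wheel gives the joint angle, the rigid
            # probe arm fixes the radius, and z is the joint height (meters).
            theta = 2.0 * math.pi * counts / counts_per_rev
            return arm_length, theta, z_offset

        # Invented geometry: 8192-count encoder, 0.25 m probe arm, 0.10 m column.
        r, theta, z = probe_tip_cylindrical(2048, 8192, 0.250, 0.100)
        # theta = pi/2 rad; tip at radius 0.25 m, height 0.10 m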