Science.gov

Sample records for "accurately reproduce experimental"

  1. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs on the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and can ultimately lead to misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  2. Single-sideband modulator accurately reproduces phase information in 2-Mc signals

    NASA Technical Reports Server (NTRS)

    Strenglein, H. F.

    1966-01-01

    Phase-locked oscillator system employing solid state components acts as a single-sideband modulator to accurately reproduce phase information in 2-Mc signals. This system is useful in telemetry, aircraft communications and position-finding stations, and VHF test circuitry.

  3. Cycle accurate and cycle reproducible memory for an FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameh W.; Kapur, Mohit

    2016-03-15

    A method, system and computer program product are disclosed for using a Field Programmable Gate Array (FPGA) to simulate operations of a device under test (DUT). The DUT includes a device memory having a first number of input ports, and the FPGA is associated with a target memory having a second number of input ports, the second number being less than the first number. In one embodiment, a given set of inputs is applied to the device memory at a frequency Fd and in a defined cycle of time, and the given set of inputs is applied to the target memory at a frequency Ft. Ft is greater than Fd, and cycle accuracy is maintained between the device memory and the target memory. In an embodiment, a cycle accurate model of the DUT memory is created by separating the DUT memory interface protocol from the target memory storage array.
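
The arithmetic behind emulating a many-port device memory with a fewer-port target memory can be sketched as follows. The port counts and clock frequencies are invented examples, not values from the patent; the only claim carried over is that Ft must exceed Fd enough for all device-port accesses to complete within one device cycle.

```python
import math

# Hedged sketch: a DUT memory with N input ports is emulated by a target
# memory with M < N ports by time-multiplexing the accesses, i.e. by
# running the target fast enough to absorb ceil(N / M) accesses per
# device cycle. Numbers below are illustrative only.

dut_ports, target_ports = 4, 1
fd_mhz = 100                                    # hypothetical device clock Fd

multiplex_factor = math.ceil(dut_ports / target_ports)
ft_mhz = fd_mhz * multiplex_factor              # Ft > Fd keeps cycle accuracy

print(multiplex_factor, ft_mhz)  # -> 4 400
```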

  4. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    SciTech Connect

    Asaad, Sameh W.; Kapur, Mohit

    2016-01-05

    A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.
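
The frequency-ratio idea above can be sketched minimally: the accelerator clocks run slower than the device clocks in absolute terms, but preserve the same ratio. The device frequencies and FPGA base clock below are invented examples; the patent does not give concrete numbers here.

```python
from math import gcd
from functools import reduce

# Hedged sketch: extract the defined frequency ratio of the (hypothetical)
# device clocks, then re-scale it from an FPGA-friendly base clock so the
# accelerator clocks keep the same ratio at lower absolute frequencies.

device_mhz = [800, 400, 200]        # hypothetical DUT clock frequencies

g = reduce(gcd, device_mhz)         # common factor of all device clocks
ratio = [f // g for f in device_mhz]

fpga_base_mhz = 50                  # hypothetical accelerator base clock
accel_mhz = [r * fpga_base_mhz for r in ratio]

print(ratio, accel_mhz)  # -> [4, 2, 1] [200, 100, 50]
```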

  5. Highly accurate nuclear and electronic stopping cross sections derived using Monte Carlo simulations to reproduce measured range data

    NASA Astrophysics Data System (ADS)

    Wittmaack, Klaus; Mutzke, Andreas

    2017-03-01

    We have examined and confirmed the previously unexplored concept of using Monte Carlo calculations in combination with measured projected ranges of ions implanted in solids to derive a quantitative description of nuclear interaction and electronic stopping. The study involved 98 ranges of 11B in Si between 1 keV and 8 MeV, contained in 12 data sets from 10 different groups. Systematic errors of up to ±8% were removed to establish a refined database of 93 ranges featuring only statistical uncertainties (±1.8%). The Monte Carlo calculations could be set up to reproduce the refined ranges with a mean ratio of 1.002 ± 1.7%. The input parameters required for this very high level of agreement are as follows. Nuclear interaction is best described by the Kr-C potential, but in obligatory combination with the Lindhard-Scharff (LS) screening length. Up to 300 keV, the electronic stopping cross section is proportional to the projectile velocity, Se = kSe,LS, with k = 1.46 ± 0.01. At higher energies, Se falls progressively short of kSe,LS. Around the Bragg peak, i.e., between 0.8 and 10 MeV, Se is modeled by an adjustable function serving to tailor the peak shape properly. Calculated and measured isotope effects for ranges of 10B and 11B in Si agree within the experimental uncertainty (±0.25%). The range-based Se,R(E) reported here predicts the scarce experimental data derived from the energy loss in projectile transmission through thin Si foils to within 2% or better. By contrast, Se(E) data from available stopping power tables exhibit deviations from Se,R(E) of between -40% and +14%.
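
The low-energy stopping model described above, Se = k·Se,LS with k = 1.46, can be sketched as follows. Velocity proportionality means Se scales with sqrt(E) at fixed projectile mass; the normalization constant C_LS below is hypothetical, since the real Lindhard-Scharff prefactor depends on the projectile/target atomic numbers and the screening length, which are not reproduced here.

```python
import math

# Hedged sketch of velocity-proportional electronic stopping below ~300 keV:
# Se(E) = k * C_LS * sqrt(E), with k = 1.46 as reported for 11B in Si.
# C_LS is a made-up normalization in arbitrary units.

K = 1.46        # scaling factor from the abstract (k = 1.46 +/- 0.01)
C_LS = 1.0      # hypothetical normalization constant

def se_low_energy(energy_kev: float) -> float:
    """Electronic stopping cross section; sketch valid only below ~300 keV."""
    return K * C_LS * math.sqrt(energy_kev)

# Velocity proportionality: quadrupling the energy doubles Se.
ratio = se_low_energy(400.0) / se_low_energy(100.0)
print(ratio)  # -> 2.0
```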

  6. Reproducibility and variability of the cost functions reconstructed from experimental recordings in multifinger prehension.

    PubMed

    Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2012-01-01

    The study examines whether the cost functions reconstructed from experimental recordings are reproducible over time. Participants repeated the trials on three days. By following Analytical Inverse Optimization procedures, the cost functions of finger forces were reconstructed for each day. The cost functions were found to be reproducible over time: application of a cost function C(i) to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) the 2nd order coefficients of the cost function showed negative linear relations with finger force magnitudes; (b) the finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space for all subjects and all testing sessions; (c) the data agreed well with the principle of superposition, i.e. the action of object prehension can be decoupled into the control of rotational equilibrium and slipping prevention.
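
The cross-validation idea above can be sketched with a quadratic cost of finger forces. A day-i cost C_i(f) = Σ_j (k_j f_j² + w_j f_j) is applied to day-j data by computing the force sharing that minimizes C_i under a total-force constraint (Lagrange condition: 2 k_j f_j + w_j equal for all fingers) and measuring the deviation from the observed forces. All coefficients and "observed" forces below are invented illustrations, not the paper's ANIO results.

```python
# Hedged sketch: analytic minimizer of sum(k*f^2 + w*f) s.t. sum(f) = F.
# From 2*k_j*f_j + w_j = lam for every finger, f_j = (lam - w_j)/(2*k_j),
# with lam fixed by the total-force constraint.

def optimal_sharing(k, w, total_force):
    lam = (total_force + sum(wi / (2 * ki) for ki, wi in zip(k, w))) \
          / sum(1 / (2 * ki) for ki in k)
    return [(lam - wi) / (2 * ki) for ki, wi in zip(k, w)]

k_day1 = [1.0, 1.5, 2.0, 3.0]       # hypothetical 2nd-order coefficients
w_day1 = [0.1, 0.0, -0.1, 0.0]      # hypothetical linear terms
observed_day2 = [6.1, 4.2, 3.1, 2.0]  # hypothetical finger forces, N

predicted = optimal_sharing(k_day1, w_day1, sum(observed_day2))
deviation = sum((o - p) ** 2 for o, p in zip(observed_day2, predicted)) ** 0.5
print(round(sum(predicted), 6), round(deviation, 3))  # -> 15.4 0.106
```

A small deviation when a day-1 cost is applied to day-2 data is what the study reads as reproducibility of the cost function over time.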

  7. Randomized block experimental designs can increase the power and reproducibility of laboratory animal experiments.

    PubMed

    Festing, Michael F W

    2014-01-01

    Randomized block experimental designs have been widely used in agricultural and industrial research for many decades. Usually they are more powerful, have higher external validity, are less subject to bias, and produce more reproducible results than the completely randomized designs typically used in research involving laboratory animals. Reproducibility can be further increased by using time as a blocking factor. These benefits can be achieved at no extra cost. A small experiment investigating the effect of an antioxidant on the activity of a liver enzyme in four inbred mouse strains, which had two replications (blocks) separated by a period of two months, illustrates this approach. The widespread failure to use these designs more widely in research involving laboratory animals has probably led to a substantial waste of animals, money, and scientific resources and slowed down the development of new treatments for human and animal diseases.
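
A layout like the one described, four inbred strains, two treatments, and two time blocks two months apart, can be sketched as follows. The strain names and treatment labels are illustrative, not taken from the study; the point is only that randomization happens independently within each block-by-strain cell.

```python
import random

# Hedged sketch of a randomized block allocation: time is a blocking
# factor (two replications two months apart), each strain contributes
# one animal per treatment in each block, and treatment order is
# randomized independently per cell.

strains = ["A/J", "BALB/c", "C57BL/6", "DBA/2"]   # illustrative strains
treatments = ["control", "antioxidant"]
blocks = ["block 1 (month 0)", "block 2 (month 2)"]

random.seed(42)   # fixed seed so the allocation itself is reproducible
layout = {}
for block in blocks:
    for strain in strains:
        order = treatments[:]
        random.shuffle(order)          # independent randomization per cell
        layout[(block, strain)] = order

for (block, strain), order in sorted(layout.items()):
    print(f"{block} | {strain}: {' then '.join(order)}")
```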

  8. Accurate and reproducible detection of proteins in water using an extended-gate type organic transistor biosensor

    NASA Astrophysics Data System (ADS)

    Minamiki, Tsukuru; Minami, Tsuyoshi; Kurita, Ryoji; Niwa, Osamu; Wakida, Shin-ichi; Fukuda, Kenjiro; Kumaki, Daisuke; Tokito, Shizuo

    2014-06-01

    In this Letter, we describe an accurate antibody detection method using a fabricated extended-gate type organic field-effect transistor (OFET), which can be operated below 3 V. The protein-sensing portion of the designed device is the gate electrode functionalized with streptavidin. Streptavidin possesses high molecular recognition ability for biotin, which specifically allows for the detection of biotinylated proteins. Here, we attempted to detect biotinylated immunoglobulin G (IgG) and observed a shift in the threshold voltage of the OFET upon addition of the antibody in an aqueous solution containing a competing bovine serum albumin interferent. The detection limit for the biotinylated IgG was 8 nM, which indicates the potential utility of the designed device in healthcare applications.

  9. Reproducibility and Variability of the Cost Functions Reconstructed from Experimental Recordings in Multi-Finger Prehension

    PubMed Central

    Niu, Xun; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2012-01-01

    The main goal of the study is to examine whether the cost (objective) functions reconstructed from experimental recordings in multi-finger prehension tasks are reproducible over time, i.e., whether the functions reflect stable preferences of the subjects and can be considered personal characteristics of motor coordination. Young, healthy participants grasped an instrumented handle with varied values of external torque, load and target grasping force and repeated the trials on three days: Day 1, Day 2, and Day 7. By following Analytical Inverse Optimization (ANIO) computation procedures, the cost functions for individual subjects were reconstructed from the experimental recordings (individual finger forces) for each day. The cost functions represented second-order polynomials of finger forces with non-zero linear terms. To check whether the obtained cost functions were reproducible over time, a cross-validation was performed: a cost function obtained on Day i was applied to experimental data observed on Day j (i≠j). In spite of the observed day-to-day variability of the performance and the cost functions, the ANIO reconstructed cost functions were found to be reproducible over time: application of a cost function Ci to the data of Day j (i≠j) resulted in smaller deviations from the experimental observations than using other commonly used cost functions. Other findings are: (a) The 2nd order coefficients Ki of the cost function showed negative linear relations with finger force magnitudes. This fact may be interpreted as encouraging involvement of stronger fingers in tasks requiring the production of higher total forces. (b) The finger forces were distributed on a 2-dimensional plane in the 4-dimensional finger force space, which has been confirmed for all subjects and all testing sessions. (c) The discovered principal components in the principal component analysis of the finger forces agreed well with the principle of superposition, i.e. the complex action of

  10. Accurate Control of 17β-Estradiol Long-Term Release Increases Reliability and Reproducibility of Preclinical Animal Studies.

    PubMed

    Gérard, Céline; Gallez, Anne; Dubois, Charline; Drion, Pierre; Delahaut, Philippe; Quertemont, Etienne; Noël, Agnès; Pequeux, Christel

    2017-03-01

    Estrogens are the subject of intensive research aiming to elucidate their mechanism of action on the various tissues they target, especially the mammary gland and breast cancer. The use of ready-to-use slow-releasing devices to administer steroids, especially estrogens, to small experimental animals remains the method of choice in terms of animal well-being and of safety for both the researcher and the animal. In this study, we evaluated and compared, in vitro and in vivo, the release kinetics of estradiol (E2) over sixty days from two different slow-releasing systems: the matrix pellet (MP) and the reservoir implant (RI). We compared the impact of these systems in three E2-sensitive mouse models: mammary gland development, human MCF7 adenocarcinoma xenograft and mouse melanoma progression. The actual amount of E2 released from both types of devices can differ from manufacturer specifications, owing to inadequate release for MP and an initial burst effect for RI. Compared with MP, interindividual variability was reduced with RI thanks to superior control of the E2 release. Depending on the dose-dependent sensitivity of the physiological or pathological readout studied, this can improve the statistical power of in vivo experiments and thus reduce the required number of animals. Altogether, our data draw attention to the importance of selecting the slow-releasing device most appropriate for a specific experiment, to better fulfill the 3Rs rule (Replacement, Reduction, Refinement) related to animal welfare and protection.

  11. Panel-based Genetic Diagnostic Testing for Inherited Eye Diseases is Highly Accurate and Reproducible and More Sensitive for Variant Detection Than Exome Sequencing

    PubMed Central

    Bujakowska, Kinga M.; Sousa, Maria E.; Fonseca-Kelly, Zoë D.; Taub, Daniel G.; Janessian, Maria; Wang, Dan Yi; Au, Elizabeth D.; Sims, Katherine B.; Sweetser, David A.; Fulton, Anne B.; Liu, Qin; Wiggs, Janey L.; Gai, Xiaowu; Pierce, Eric A.

    2015-01-01

    Purpose Next-generation sequencing (NGS) based methods are being adopted broadly for genetic diagnostic testing, but the performance characteristics of these techniques have not been fully defined with regard to test accuracy and reproducibility. Methods We developed a targeted enrichment and NGS approach for genetic diagnostic testing of patients with inherited eye disorders, including inherited retinal degenerations (IRDs), optic atrophy and glaucoma. In preparation for providing this Genetic Eye Disease (GEDi) test on a CLIA-certified basis, we performed experiments to measure the sensitivity, specificity and reproducibility, as well as the clinical sensitivity, of the test. Results The GEDi test is highly reproducible and accurate, with sensitivity and specificity for single nucleotide variant detection of 97.9% and 100%, respectively. The sensitivity for variant detection was notably better than the 88.3% achieved by whole exome sequencing (WES) using the same metrics, owing to better coverage of targeted genes in the GEDi test compared with commercially available exome capture sets. Prospective testing of 192 patients with IRDs indicated that the clinical sensitivity of the GEDi test is high, with a diagnostic rate of 51%. Conclusion The data suggest that, based on quantified performance metrics, selective targeted enrichment is preferable to WES for genetic diagnostic testing. PMID:25412400
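
The two headline metrics quoted above are simple ratios of call counts. The counts below are invented to reproduce the reported 97.9% sensitivity and 100% specificity; they are not taken from the paper.

```python
# Hedged sketch of the performance metrics: sensitivity = TP / (TP + FN)
# over known variant positions, specificity = TN / (TN + FP) over known
# reference positions. Counts are illustrative only.

def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

tp, fn = 979, 21    # hypothetical variant positions: detected vs missed
tn, fp = 5000, 0    # hypothetical reference positions: no false calls

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # -> sensitivity = 97.9%
print(f"specificity = {specificity(tn, fp):.1%}")  # -> specificity = 100.0%
```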

  12. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    SciTech Connect

    Filice, Luigino; Gagliardi, Francesco; Umbrello, Domenico; Shivpuri, Rajiv

    2007-05-17

    Metallic foams represent one of the most exciting materials introduced in the manufacturing scenario in recent years. In the study addressed here, experimental and numerical investigations of the forging process of a simple foam billet shaped into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests in order to produce a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study due to its industrial demand and its particular complex 3D shape. A finite element code (Deform 3D) was utilized to model the foam behavior during the forging process, and an accurate material rheology description was used based on a porous material model that includes the measured local density. Once the effectiveness of the utilized finite element model was verified through comparison with the experimental evidence, a numerical study of the influence of the foam density was carried out. The numerical results showed that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash and the estimation of the punch load.

  13. Experimental and Numerical Investigation of Forging Process to Reproduce a 3D Aluminium Foam Complex Shape

    NASA Astrophysics Data System (ADS)

    Filice, Luigino; Gagliardi, Francesco; Shivpuri, Rajiv; Umbrello, Domenico

    2007-05-01

    Metallic foams represent one of the most exciting materials introduced in the manufacturing scenario in recent years. In the study addressed here, experimental and numerical investigations of the forging process of a simple foam billet shaped into complex sculptured parts were carried out. In particular, the deformation behavior of metallic foams and the development of density gradients were investigated through a series of experimental forging tests in order to produce a selected portion of a hip prosthesis. The human bone replacement was chosen as a case study due to its industrial demand and its particular complex 3D shape. A finite element code (Deform 3D®) was utilized to model the foam behavior during the forging process, and an accurate material rheology description was used based on a porous material model that includes the measured local density. Once the effectiveness of the utilized finite element model was verified through comparison with the experimental evidence, a numerical study of the influence of the foam density was carried out. The numerical results showed that the initial billet density plays an important role in the prediction of the final shape, the optimization of the flash and the estimation of the punch load.

  14. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    NASA Astrophysics Data System (ADS)

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-01

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015), 10.1038/nature14048]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data

  15. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions.

    PubMed

    Nagayama, T; Bailey, J E; Loisel, G; Rochau, G A; MacFarlane, J J; Golovkin, I

    2016-02-01

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. These simulations bridge the static-uniform picture of the data interpretation and the

  16. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    DOE PAGES

    Nagayama, T.; Bailey, J. E.; Loisel, G.; ...

    2016-02-05

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170-200 eV and electron densities of (0.7-4.0) × 10²² cm⁻³ revealed a 30-400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static-uniform picture of the

  17. Impact of genetic background and experimental reproducibility on identifying chemical compounds with robust longevity effects

    PubMed Central

    Lucanic, Mark; Plummer, W. Todd; Chen, Esteban; Harke, Jailynn; Foulger, Anna C.; Onken, Brian; Coleman-Hulbert, Anna L.; Dumas, Kathleen J.; Guo, Suzhen; Johnson, Erik; Bhaumik, Dipa; Xue, Jian; Crist, Anna B.; Presley, Michael P.; Harinath, Girish; Sedore, Christine A.; Chamoli, Manish; Kamat, Shaunak; Chen, Michelle K.; Angeli, Suzanne; Chang, Christina; Willis, John H.; Edgar, Daniel; Royal, Mary Anne; Chao, Elizabeth A.; Patel, Shobhna; Garrett, Theo; Ibanez-Ventoso, Carolina; Hope, June; Kish, Jason L; Guo, Max; Lithgow, Gordon J.; Driscoll, Monica; Phillips, Patrick C.

    2017-01-01

    Limiting the debilitating consequences of ageing is a major medical challenge of our time. Robust pharmacological interventions that promote healthy ageing across diverse genetic backgrounds may engage conserved longevity pathways. Here we report results from the Caenorhabditis Intervention Testing Program in assessing longevity variation across 22 Caenorhabditis strains spanning 3 species, using multiple replicates collected across three independent laboratories. Reproducibility between test sites is high, whereas individual trial reproducibility is relatively low. Of ten pro-longevity chemicals tested, six significantly extend lifespan in at least one strain. Three reported dietary restriction mimetics are mainly effective across C. elegans strains, indicating species and strain-specific responses. In contrast, the amyloid dye Thioflavin T is both potent and robust across the strains. Our results highlight promising pharmacological leads and demonstrate the importance of assessing lifespans of discrete cohorts across repeat studies to capture biological variation in the search for reproducible ageing interventions. PMID:28220799

  18. A rapid, reproducible, on-the-fly orthogonal array optimization method for targeted protein quantification by LC/MS and its application for accurate and sensitive quantification of carbonyl reductases in human liver.

    PubMed

    Cao, Jin; Gonzalez-Covarrubias, Vanessa; Covarrubias, Vanessa M; Straubinger, Robert M; Wang, Hao; Duan, Xiaotao; Yu, Haoying; Qu, Jun; Blanco, Javier G

    2010-04-01

    Liquid chromatography (LC)/mass spectrometry (MS) in selected reaction monitoring (SRM) mode provides a powerful tool for targeted protein quantification. However, efficient, high-throughput strategies for proper selection of signature peptides (SP) for protein quantification and accurate optimization of their SRM conditions remain elusive. Here we describe an on-the-fly, orthogonal array optimization (OAO) approach that enables rapid, comprehensive, and reproducible SRM optimization of a large number of candidate peptides in a single nanoflow-LC/MS run. With the optimized conditions, many peptide candidates can be evaluated in biological matrices for selection of the final SP. The OAO strategy employs a systematic experimental design that strategically varies product ions, declustering energy, and collision energy in a cycle of 25 consecutive SRM trials, which accurately reveals the effects of these factors on the signal-to-noise ratio of a candidate peptide and optimizes each. As proof of concept, we developed a highly sensitive, accurate, and reproducible method for the quantification of carbonyl reductases CBR1 and CBR3 in human liver. Candidate peptides were identified by nano-LC/LTQ/Orbitrap, filtered using a stringent set of criteria, and subjected to OAO. After evaluating both sensitivity and stability of the candidates, two SP were selected for quantification of each protein. As a result of the accurate OAO of assay conditions, sensitivities of 80 and 110 amol were achieved for CBR1 and CBR3, respectively. The method was validated and used to quantify the CBRs in 33 human liver samples. The mean level of CBR1 was 93.4 ± 49.7 (range: 26.2-241) ppm of total protein, and of CBR3 was 7.69 ± 4.38 (range: 1.26-17.9) ppm. Key observations of this study: (i) evaluation of peptide stability in the target matrix is essential for final selection of the SP; (ii) utilization of two unique SP contributes to high reliability of target protein quantification; (iii
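
A 25-run cycle varying three factors in a balanced way can be realized with an orthogonal array for three five-level factors, in the spirit of the OAO design described above. The construction below indexes rows by (a, b) over Z₅ and takes the third column as (a + b) mod 5, so any pair of columns contains each level combination exactly once. The ion names, voltages, and energies are illustrative placeholders, not the paper's settings.

```python
from itertools import product

# Hedged sketch of an OA(25, 5^3) orthogonal array: columns a, b, (a+b) mod 5
# over Z_5 are pairwise balanced, giving 25 trials that jointly vary
# (hypothetical) product ions, declustering energies and collision energies.

product_ions = ["y4", "y5", "y6", "b3", "b4"]   # hypothetical fragment ions
declustering = [40, 55, 70, 85, 100]            # hypothetical volts
collision = [15, 20, 25, 30, 35]                # hypothetical eV

trials = [(product_ions[a], declustering[b], collision[(a + b) % 5])
          for a, b in product(range(5), repeat=2)]

print(len(trials))  # -> 25 (one cycle of consecutive SRM trials)
```

Balance is what lets one cycle disentangle the main effect of each factor on signal-to-noise without testing all 125 combinations.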

  19. Experimental and Numerical Models of Complex Clinical Scenarios; Strategies to Improve Relevance and Reproducibility of Joint Replacement Research

    PubMed Central

    Bechtold, Joan E.; Swider, Pascal; Goreham-Voss, Curtis; Soballe, Kjeld

    2016-01-01

    This research review aims to focus attention on the effect of specific surgical and host factors on implant fixation, and on the importance of accounting for them in experimental and numerical models. These factors affect (a) eventual clinical applicability and (b) reproducibility of findings across research groups. Proper function and longevity of orthopedic joint replacement implants rely on secure fixation to the surrounding bone. Technology and surgical technique have improved over the last 50 years, and robust ingrowth and decades of implant survival are now routinely achieved for healthy patients and first-time (primary) implantation. Second-time (revision) implantation presents with bone loss and interfacial bone gaps in areas vital for secure mechanical fixation. Patients with medical comorbidities such as infection, smoking, congestive heart failure, kidney disease, and diabetes have a diminished healing response, poorer implant fixation, and greater revision risk. It is these more difficult clinical scenarios that require research to evaluate more advanced treatment approaches. Such treatments can include osteogenic or antimicrobial implant coatings, allo- or autogenous cellular or tissue-based approaches, local and systemic drug delivery, and surgical approaches. Regarding implant-related approaches, most experimental and numerical models do not generally impose conditions that represent mechanical instability at the implant interface, or recalcitrant healing. Many treatments will work well in forgiving settings, but fail in complex human settings with disease, bone loss, or previous surgery. Ethical considerations mandate that we justify and limit the number of animals tested, which restricts experimental permutations of treatments. Numerical models provide the flexibility to evaluate multiple parameters and combinations, but generally need to employ simplifying assumptions. The objectives of this paper are (a) to highlight the importance of mechanical

  20. Experimental and theoretical oscillator strengths of Mg i for accurate abundance analysis

    NASA Astrophysics Data System (ADS)

    Pehlivan Rhodin, A.; Hartman, H.; Nilsson, H.; Jönsson, P.

    2017-02-01

    Context. With the aid of stellar abundance analysis, it is possible to study Galactic formation and evolution. Magnesium is an important element for tracing the α-element evolution in our Galaxy. For chemical abundance analysis, such as magnesium abundance, accurate and complete atomic data are essential. Inaccurate atomic data lead to uncertain abundances and prevent discrimination between different evolution models. Aims: We study the spectrum of neutral magnesium from laboratory measurements and theoretical calculations. Our aim is to improve the oscillator strengths (f-values) of Mg i lines and to create a complete set of accurate atomic data, particularly for the near-IR region. Methods: We derived oscillator strengths by combining the experimental branching fractions with radiative lifetimes reported in the literature and computed in this work. A hollow cathode discharge lamp was used to produce free atoms in the plasma, and a Fourier transform spectrometer recorded the intensity-calibrated high-resolution spectra. In addition, we performed theoretical calculations using the multiconfiguration Hartree-Fock program ATSP2K. Results: This project provides a set of experimental and theoretical oscillator strengths. We derived 34 experimental oscillator strengths. Except for the Mg i optical triplet lines (3p 3P°0,1,2-4s 3S1), these oscillator strengths are measured for the first time. The theoretical oscillator strengths are in very good agreement with the experimental data and complement the missing transitions of the experimental data up to n = 7 from even and odd parity terms. We present an evaluated set of oscillator strengths, gf, with uncertainties as small as 5%. The new oscillator strengths of the Mg i optical triplet lines (3p 3P°0,1,2-4s 3S1) are 0.08 dex larger than the previous measurements.
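
The combination of branching fractions and upper-level lifetimes used above follows a standard emission relation, which can be sketched as follows (a generic illustration, not the authors' calibration; the numerical inputs in the usage note are hypothetical):

```python
def gf_from_lifetime(wavelength_A, g_upper, branching_fraction, lifetime_s):
    """Weighted oscillator strength gf from an emission branching fraction
    and the radiative lifetime of the upper level (standard relations):
        A_ul = BF / tau_u                           (transition probability, s^-1)
        gf   = 1.4992e-16 * lambda^2 * g_u * A_ul   (lambda in Angstrom)
    """
    A_ul = branching_fraction / lifetime_s
    return 1.4992e-16 * wavelength_A ** 2 * g_upper * A_ul
```

For a hypothetical line at 5183.6 Å with g_u = 3, a branching fraction of 1 and a 10 ns lifetime, this gives gf of roughly 1.2; real measurements distribute the decay over several branches, so each BF < 1.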

  1. Reproducibility blues.

    PubMed

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation.

  2. Electrical detection of C-reactive protein using a single free-standing, thermally controlled piezoresistive microcantilever for highly reproducible and accurate measurements.

    PubMed

    Yen, Yi-Kuang; Lai, Yu-Cheng; Hong, Wei-Ting; Pheanpanitporn, Yotsapoom; Chen, Chuin-Shan; Huang, Long-Sun

    2013-07-29

    This study demonstrates a novel method for electrical detection of C-reactive protein (CRP) as a means of identifying an infection in the body, or as a cardiovascular disease risk assay. The method uses a single free-standing, thermally controlled piezoresistive microcantilever biosensor. In a commonly used sensing arrangement of conventional dual cantilevers in the Wheatstone bridge circuit, reference and gold-coated sensing cantilevers that inherently have heterogeneous surface materials and different multilayer structures may yield independent responses to the liquid environmental changes of chemical substances, flow field and temperature, leading to unwanted signal disturbance for biosensing targets. In this study, a single free-standing microcantilever is employed for biosensing to resolve the dual-beam problem of individual responses in chemical solutions, and a thermally controlled system maintains sensor performance despite the sensitive temperature effect. With this type of single temperature-controlled microcantilever sensor, the electrical detection of various CRP concentrations from 1 µg/mL to 200 µg/mL was performed, which covers the clinically relevant range. Induced surface stresses were measured at between 0.25 N/m and 3.4 N/m with high reproducibility. Moreover, the binding affinity (KD) of the CRP and anti-CRP interaction was found to be 18.83 ± 2.99 µg/mL, which agreed with results in previously reported studies. This biosensing technique thus proves valuable in detecting inflammation, and in cardiovascular disease risk assays.

  3. Calibrated simulations of Z opacity experiments that reproduce the experimentally measured plasma conditions

    SciTech Connect

    Nagayama, T.; Bailey, J. E.; Loisel, G.; Rochau, G. A.; MacFarlane, J. J.; Golovkin, I.

    2016-02-05

    Recently, frequency-resolved iron opacity measurements at electron temperatures of 170–200 eV and electron densities of (0.7–4.0) × 10²² cm⁻³ revealed a 30–400% disagreement with the calculated opacities [J. E. Bailey et al., Nature (London) 517, 56 (2015)]. The discrepancies have a high impact on astrophysics, atomic physics, and high-energy density physics, and it is important to verify our understanding of the experimental platform with simulations. Reliable simulations are challenging because the temporal and spatial evolution of the source radiation and of the sample plasma are both complex and incompletely diagnosed. In this article, we describe simulations that reproduce the measured temperature and density in recent iron opacity experiments performed at the Sandia National Laboratories Z facility. The time-dependent spectral irradiance at the sample is estimated using the measured time- and space-dependent source radiation distribution, in situ source-to-sample distance measurements, and a three-dimensional (3D) view-factor code. The inferred spectral irradiance is used to drive 1D sample radiation hydrodynamics simulations. The images recorded by slit-imaged space-resolved spectrometers are modeled by solving radiation transport of the source radiation through the sample. We find that the same drive radiation time history successfully reproduces the measured plasma conditions for eight different opacity experiments. These results provide a quantitative physical explanation for the observed dependence of both temperature and density on the sample configuration. Simulated spectral images for the experiments without the FeMg sample show quantitative agreement with the measured spectral images. The agreement in spectral profile, spatial profile, and brightness provides further confidence in our understanding of the backlight-radiation time history and image formation. Furthermore, these simulations bridge the static

  4. Accurate rotational constants for linear interstellar carbon chains: achieving experimental accuracy

    NASA Astrophysics Data System (ADS)

    Etim, Emmanuel E.; Arunan, Elangannan

    2017-01-01

    Linear carbon chain molecular species remain the dominant theme in interstellar chemistry. Their continuous astronomical observation depends on the availability of accurate spectroscopic parameters. Accurate rotational constants are reported for hundreds of molecular species of astrophysical, spectroscopic and chemical interest from the different linear carbon chains: CnH, CnH-, CnN, CnN-, CnO, CnS, HCnS, CnSi, CH3(CC)nH, HCnN, DC2n+1N, HC2nNC, and CH3(C≡C)nCN, using three to four moments of inertia calculated from the experimental rotational constants coupled with those obtained from the optimized geometries at the Hartree-Fock level. The calculated rotational constants are obtained from the corrected moments of inertia at the Hartree-Fock geometries. The calculated rotational constants are accurate to a few kHz or better, irrespective of the chain length and terminating groups. This accuracy makes these rotational constants excellent tools for both astronomical and laboratory detection of these molecular species of astrophysical interest. Among the numerous unidentified lines from different astronomical surveys, transitions corresponding to known and new linear carbon chains could be found using these rotational constants. The astrophysical, spectroscopic and chemical implications of these results are discussed.
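
The underlying relation between a moment of inertia and a rotational constant, together with the experiment/theory scaling idea described above, can be sketched as follows (the `scaled_B` correction here is a simplified stand-in for the authors' procedure, and the usage values are illustrative):

```python
H_OVER_8PI2 = 505379.0  # h/(8*pi^2) in MHz * amu * Angstrom^2 (approximate)

def rotational_constant_MHz(I_amu_A2):
    """Rotational constant of a linear rotor, B = h / (8*pi^2 * I),
    with the moment of inertia given in amu * Angstrom^2."""
    return H_OVER_8PI2 / I_amu_A2

def scaled_B(B_calc, B_exp_ref, B_calc_ref):
    """Scale an ab initio rotational constant by the experiment/theory
    ratio of a reference member of the same chain -- a simplified
    stand-in for the moment-of-inertia correction in the abstract."""
    return B_calc * (B_exp_ref / B_calc_ref)
```

For example, a moment of inertia of 10 amu Å² corresponds to B of about 50.5 GHz, and the scaling transfers the systematic error of the reference molecule onto the longer chain member.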

  5. Identification and Evaluation of Reference Genes for Accurate Transcription Normalization in Safflower under Different Experimental Conditions

    PubMed Central

    Li, Dandan; Hu, Bo; Wang, Qing; Liu, Hongchang; Pan, Feng; Wu, Wei

    2015-01-01

    Safflower (Carthamus tinctorius L.) has received a significant amount of attention as a medicinal plant and oilseed crop. Gene expression studies provide a theoretical molecular biology foundation for improving new traits and developing new cultivars. Real-time quantitative PCR (RT-qPCR) has become a crucial approach for gene expression analysis. In addition, appropriate reference genes (RGs) are essential for accurate and rapid relative quantification analysis of gene expression. In this study, fifteen candidate RGs involved in multiple metabolic pathways of plants were selected and validated under different experimental treatments, at different seed development stages and in different cultivars and tissues for real-time PCR experiments. These genes were ABCS, 60SRPL10, RANBP1, UBCL, MFC, UBCE2, EIF5A, COA, EF1-β, EF1, GAPDH, ATPS, MBF1, GTPB and GST. The suitability evaluation was performed with the geNorm and NormFinder programs. Overall, EF1, UBCE2, EIF5A, ATPS and 60SRPL10 were the most stable genes, and MBF1, as well as MFC, were the most unstable genes by geNorm and NormFinder software in all experimental samples. To verify the validity of the RGs selected by the two programs, expression analysis of 7 CtFAD2 genes in safflower seeds at different developmental stages under cold stress was performed using different RGs for normalization in RT-qPCR experiments. The results showed similar expression patterns when the most stable RGs selected by geNorm or NormFinder software were used. However, differences were detected when the most unstable reference genes were used. The most stable combination of genes selected in this study will help to achieve more accurate and reliable results in a wide variety of samples in safflower. PMID:26457898
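
geNorm's stability measure M, used above to rank the candidate reference genes, is the average standard deviation of a gene's pairwise log2 expression ratios against all other candidates. A minimal reimplementation of the idea (not the published software; the gene names and expression values in the test are made up):

```python
import math
from statistics import pstdev

def genorm_M(expr):
    """geNorm-style gene stability measure M: for each candidate reference
    gene, the average standard deviation of its log2 expression ratios
    against every other gene, taken across all samples. Lower M = more
    stable. `expr` maps gene name -> list of expression values (same
    sample order for every gene)."""
    M = {}
    for g, vg in expr.items():
        sds = []
        for h, vh in expr.items():
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(vg, vh)]
            sds.append(pstdev(ratios))
        M[g] = sum(sds) / len(sds)
    return M
```

Genes whose expression co-varies with a constant ratio across samples receive a low M, which is why a pair of stably expressed genes always ranks well together.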

  6. Surfactant-aided precipitation/on-pellet-digestion (SOD) procedure provides robust and rapid sample preparation for reproducible, accurate and sensitive LC/MS quantification of therapeutic protein in plasma and tissues.

    PubMed

    An, Bo; Zhang, Ming; Johnson, Robert W; Qu, Jun

    2015-04-07

    For targeted protein quantification by liquid chromatography mass spectrometry (LC/MS), an optimal approach for efficient, robust and high-throughput sample preparation is critical, but often remains elusive. Here we describe a straightforward surfactant-aided-precipitation/on-pellet-digestion (SOD) strategy that provides effective sample cleanup and enables high and constant peptide yields in various matrices, allowing reproducible, accurate and sensitive protein quantification. This strategy was developed using quantification of a monoclonal antibody in tissues and plasma as the model system. Surfactant treatment before precipitation substantially increased peptide recovery and reproducibility from plasma/tissue, likely because the surfactant permits extensive denaturation/reduction/alkylation of proteins and inactivation of endogenous protease inhibitors, and facilitates removal of matrix components. The subsequent precipitation procedure effectively eliminates the surfactant and nonprotein matrix components, and the thorough denaturation by both surfactant and precipitation enabled very rapid on-pellet digestion (45 min at 37 °C) with high peptide recovery. The performance of SOD was systematically compared against in-solution digestion, in-gel digestion and filter-aided sample preparation (FASP) in plasma/tissues, and then examined in a full pharmacokinetic study in rats. SOD achieved the best peptide recovery (∼21.0-700% higher than the other three methods across various matrices), reproducibility (3.75-10.9%) and sensitivity (28-30 ng/g across plasma and tissue matrices), and its performance was independent of matrix type. Finally, in validation and pharmacokinetic studies in rats, SOD outperformed the other methods and provided highly accurate and precise quantification in all plasma samples without using a stable isotope labeled (SIL) protein internal standard (I.S.). In summary, the SOD method has proven to be highly robust, efficient and rapid, making it readily

  7. Developing a reproducible non-line-of-sight experimental setup for testing wireless medical device coexistence utilizing ZigBee.

    PubMed

    LaSorte, Nickolas J; Rajab, Samer A; Refai, Hazem H

    2012-11-01

    The integration of heterogeneous wireless technologies is believed to aid revolutionary healthcare delivery in hospitals and residential care. Wireless medical device coexistence is a growing concern given the ubiquity of wireless technology. In spite of this, a consensus standard that addresses risks associated with wireless heterogeneous networks has not been adopted. This paper serves as a starting point by recommending a practice for assessing the coexistence of a wireless medical device in a non-line-of-sight environment utilizing 802.15.4 in a practical, versatile, and reproducible test setup. This paper provides an extensive survey of other coexistence studies concerning 802.15.4 and 802.11 and reports on the authors' coexistence testing inside and outside an anechoic chamber. Results are compared against a non-line-of-sight test setup. Findings relative to co-channel and adjacent channel interference were consistent with results reported in the literature.

  8. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  9. Material response mechanisms are needed to obtain highly accurate experimental shock wave data

    NASA Astrophysics Data System (ADS)

    Forbes, Jerry W.

    2017-01-01

    The field of shock wave compression of matter has provided a simple set of equations, the Rankine-Hugoniot (R-H) relations, connecting the thermodynamic and kinematic parameters that describe the conservation of mass, momentum and energy across a steady plane shock wave with one-dimensional flow. Well-known condensed matter shock wave experimental results will be reviewed to see whether the assumptions required for deriving these simple R-H equations are satisfied. Note that a material compression model is not required for deriving the 1-D conservation flow equations across a steady plane shock front. However, this statement is misleading from a practical experimental viewpoint, since obtaining small systematic errors in shock wave measured parameters requires the material compression and release mechanisms to be known. A review will be presented on errors in shock wave data from common experimental techniques for elastic-plastic solids. Issues related to time scales of experiments, steady waves with long rise times and detonations will also be discussed.
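
For reference, the 1-D R-H jump conditions are compact enough to state in code (standard relations in SI units; the numbers in the test are illustrative, not data from the review):

```python
def hugoniot_state(rho0, Us, up, P0=0.0):
    """Rankine-Hugoniot jump conditions across a steady plane shock.
    Inputs: initial density rho0 (kg/m^3), shock velocity Us (m/s),
    particle velocity up (m/s), initial pressure P0 (Pa).
    Returns shocked density, shocked pressure, and the specific
    internal energy rise (J/kg)."""
    rho1 = rho0 * Us / (Us - up)                   # mass conservation
    P1 = P0 + rho0 * Us * up                       # momentum conservation
    dE = 0.5 * (P1 + P0) * (1.0/rho0 - 1.0/rho1)   # energy conservation
    return rho1, P1, dE
```

With P0 = 0 the energy jump reduces identically to up²/2, a useful consistency check when validating measured (Us, up) pairs.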

  10. Reproducibility in Chemical Research.

    PubMed

    Bergman, Robert G; Danheiser, Rick L

    2016-10-04

    "… To what extent is reproducibility a significant issue in chemical research? How can problems involving irreproducibility be minimized? … Researchers should be aware of the dangers of unconscious investigator bias, all papers should provide adequate experimental detail, and Reviewers have a responsibility to carefully examine papers for adequacy of experimental detail and support for the conclusions …" Read more in the Editorial by Robert G. Bergman and Rick L. Danheiser.

  11. The accurate measurement of second virial coefficients using self-interaction chromatography: experimental considerations.

    PubMed

    Quigley, A; Heng, J Y Y; Liddell, J M; Williams, D R

    2013-11-01

    Measurement of B22, the second virial coefficient, is an important technique for describing the solution behaviour of proteins, especially as it relates to precipitation, aggregation and crystallisation phenomena. This paper describes best practice for calculating B22 values from self-interaction chromatography (SIC) for aqueous protein solutions. Detailed analysis of SIC peak shapes for lysozyme shows that non-Gaussian peaks are commonly encountered in SIC, with typical peak asymmetries of 10%. This asymmetry reflects a non-linear chromatographic retention process, in this case heterogeneity of the protein-protein interactions. Therefore, it is important to use centre-of-mass calculations for determining accurate retention volumes and thus B22 values. Empirical peak-maximum chromatogram analysis, often reported in the literature, can result in errors of up to 50% in B22 values. A methodology is reported here for determining both the mean and the variance in B22 from SIC experiments, which includes a correction for normal longitudinal peak broadening. The variance in B22 due to chemical effects is quantified statistically and is a measure of the heterogeneity of protein-protein interactions in solution. In the case of lysozyme, a wide range of B22 values are measured which can vary significantly from the average B22 values.
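
The centre-of-mass retention estimator recommended above is simply the first moment of the chromatogram (a generic sketch; the subsequent conversion of retention volume to B22 involves calibration steps not shown here):

```python
def centroid_retention_volume(volumes, signal):
    """First-moment (centre-of-mass) retention volume of a peak:
    sum(V_i * S_i) / sum(S_i). For asymmetric SIC peaks this differs
    from the peak-maximum estimate, which is the bias discussed in
    the abstract."""
    return sum(v * s for v, s in zip(volumes, signal)) / sum(signal)
```

For a tailing peak, the centroid sits later than the peak maximum, so using the maximum systematically underestimates retention and hence biases B22.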

  12. An efficient and reproducible method for quantifying macrophages in different experimental models of central nervous system pathology

    PubMed Central

    Donnelly, Dustin J.; Gensel, John C.; Ankeny, Daniel P.; van Rooijen, Nico; Popovich, Phillip G.

    2009-01-01

    Historically, microglia/macrophages are quantified in the pathological central nervous system (CNS) by counting cell profiles then expressing the data as cells/mm2. However, because it is difficult to visualize individual cells in dense clusters and in most cases it is unimportant to know the absolute number of macrophages within lesioned tissue, alternative methods may be more efficient for quantifying the magnitude of the macrophage response in the context of different experimental variables (e.g., therapeutic intervention or time post-injury/infection). The present study provides the first in-depth comparison of different techniques commonly used to quantify microglial/macrophage reactions in the pathological spinal cord. Individuals from the same and different laboratories applied techniques of digital image analysis (DIA), standard cell profile counting and a computer-assisted cell counting method with unbiased sampling to quantify macrophages in focal inflammatory lesions, disseminated lesions caused by autoimmune inflammation or at sites of spinal trauma. Our goal was to find a simple, rapid and sensitive method with minimal variability between trials and users. DIA was consistently the least variable and most time-efficient method for assessing the magnitude of macrophage responses across lesions and between users. When used to evaluate the efficacy of an anti-inflammatory treatment, DIA was 5–35x faster than cell counting and was sensitive enough to detect group differences while eliminating inter-user variability. Since lesions are clearly defined and single profiles of microglia/macrophages are difficult to discern in most pathological specimens of brain or spinal cord, DIA offers significant advantages over other techniques for quantifying activated macrophages. PMID:19393692
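
The DIA readout described above amounts to a thresholded area fraction over the outlined lesion. A minimal sketch (real analyses involve calibrated imaging and careful threshold selection):

```python
def proportional_area(pixels, threshold):
    """Digital image analysis (DIA) readout: the fraction of pixels in a
    lesion region whose immunolabel intensity exceeds a fixed threshold,
    used as a proxy for the magnitude of the macrophage response.
    `pixels` is a 2-D list of intensity values."""
    flat = [p for row in pixels for p in row]
    return sum(p > threshold for p in flat) / len(flat)
```

Because the measure never requires resolving individual cell profiles, it stays well defined even in the dense clusters where profile counting breaks down.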

  13. The use of experimental bending tests to more accurate numerical description of TBC damage process

    NASA Astrophysics Data System (ADS)

    Sadowski, T.; Golewski, P.

    2016-04-01

    Thermal barrier coatings (TBCs) have been extensively used in aircraft engines to protect critical engine parts such as blades and combustion chambers, which are exposed to high temperatures and a corrosive environment. The blades of turbine engines are additionally exposed to high mechanical loads. These loads are created by the high rotational speed of the rotor (30 000 rot/min), causing tensile and bending stresses. Therefore, experimental testing of coated samples is necessary in order to determine the strength properties of TBCs. Beam samples with dimensions 50×10×2 mm were used in these studies. The TBC system consisted of a 150 μm thick bond coat (NiCoCrAlY) and a 300 μm thick top coat (YSZ) made by the APS (air plasma spray) process. Samples were tested by three-point bending with various loads. After the bending tests, the samples were subjected to microscopic observation to determine the number of cracks and their depth. These results were used to build a numerical model and calibrate material data in the Abaqus program. A brittle cracking damage model was applied to the TBC layer, which allows elements to be removed once a failure criterion is reached. Surface-based cohesive behavior was used to model the delamination that may occur at the boundary between the bond coat and the top coat.

  14. An experimental device for accurate ultrasounds measurements in liquid foods at high pressure

    NASA Astrophysics Data System (ADS)

    Hidalgo-Baltasar, E.; Taravillo, M.; Baonza, V. G.; Sanz, P. D.; Guignon, B.

    2012-12-01

    The use of high hydrostatic pressure (HHP) to ensure safe and high-quality products has markedly increased in the food industry during the last decade. Ultrasonic sensors can be employed to control such processes in an equivalent way as they are currently used in processes carried out at room pressure. However, their installation, calibration and use are particularly challenging in a high-pressure environment. Besides, data about the acoustic properties of food under pressure, and even of water, are quite scarce in the pressure range of interest for food treatment (namely, above 200 MPa). The objective of this work was to establish a methodology to determine the speed of sound in foods under pressure. An ultrasonic sensor using the multiple-reflections method was adapted to a lab-scale HHP equipment to determine the speed of sound in water between 253.15 and 348.15 K, and at pressures up to 700 MPa. The experimental speed-of-sound data were compared to the data calculated from the equation of state of water (IAPWS-95 formulation). From this analysis, the procedure for calibrating the cell path length was validated. After this calibration procedure, the speed of sound could be determined in liquid foods by using this sensor with a relative uncertainty between (0.22 and 0.32) % at a confidence level of 95 % over the whole pressure domain.
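
The multiple-reflections method reduces to c = 2d/Δt, with the acoustic path d calibrated against a reference fluid, much as described above (a sketch of the two relations; the actual echo-interval extraction from the waveform is not shown):

```python
def speed_of_sound(path_m, echo_interval_s):
    """Multiple-reflections method: between two successive echoes the
    pulse crosses the cell twice, so c = 2 * d / delta_t."""
    return 2.0 * path_m / echo_interval_s

def calibrate_path(c_ref, echo_interval_s):
    """Invert the same relation with a reference speed of sound
    (e.g. water from the IAPWS-95 equation of state) to obtain the
    effective acoustic path length at each pressure."""
    return 0.5 * c_ref * echo_interval_s
```

Calibrating d at each pressure absorbs the elastic deformation of the cell, which would otherwise bias the speed measured in the food sample.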

  15. Accurate modeling of antennas for radiating short pulses, FDTD analysis and experimental measurements

    NASA Astrophysics Data System (ADS)

    Maloney, James G.; Smith, Glenn S.

    1993-01-01

    Antennas used to radiate short pulses often require design rules different from those used to radiate essentially time-harmonic signals. The finite-difference time-domain (FDTD) method is a very flexible numerical approach that can be used to treat a variety of electromagnetic problems in the time domain. It is well suited to the analysis and design of antennas for radiating short pulses; however, several advances had to be made before the method could be applied to this problem. In this paper, we illustrate the use of the FDTD method with two antennas designed for the radiation of short pulses. The first is a simple, two-dimensional geometry, an open-ended parallel-plate waveguide, while the second is a three-dimensional, rotationally symmetric geometry, a conical monopole fed through an image by a coaxial transmission line. Both antennas are 'optimized' according to given criteria by adjusting geometrical parameters and including resistive loading that varies continuously with position along the antenna. The predicted performance for the conical monopole antenna is compared with experimental measurements; this verifies the optimization and demonstrates the practicality of the design.
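
The structure of an FDTD time-stepping loop can be illustrated with a minimal 1-D example (normalized units, soft Gaussian source, reflecting boundaries; this is a textbook sketch, not the authors' antenna models, which add absorbing boundaries, a feed model, and continuously varying resistive loading):

```python
import math

def fdtd_1d(n_cells=200, n_steps=300):
    """Minimal 1-D FDTD (Yee) leapfrog update in normalized units.
    E and H live on staggered grids and are advanced alternately;
    a soft Gaussian source injects a short pulse."""
    ez = [0.0] * n_cells   # electric field samples
    hy = [0.0] * n_cells   # magnetic field samples
    S = 0.5                # Courant number (stable for S <= 1 in 1-D)
    for t in range(n_steps):
        for i in range(n_cells - 1):
            hy[i] += S * (ez[i + 1] - ez[i])
        for i in range(1, n_cells):
            ez[i] += S * (hy[i] - hy[i - 1])
        ez[n_cells // 4] += math.exp(-((t - 30.0) / 10.0) ** 2)
    return ez
```

Because the update is explicit and local, arbitrary time-domain excitations such as short pulses are handled directly, which is exactly why FDTD suits pulse-radiating antenna design.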

  16. Theory of bi-molecular association dynamics in 2D for accurate model and experimental parameterization of binding rates

    PubMed Central

    Yogurtcu, Osman N.; Johnson, Margaret E.

    2015-01-01

    The dynamics of association between diffusing and reacting molecular species are routinely quantified using simple rate-equation kinetics that assume both well-mixed concentrations of species and a single rate constant for parameterizing the binding rate. In two dimensions (2D), however, even when systems are well mixed, the assumption of a single characteristic rate constant for describing association is not generally accurate, due to the properties of diffusional searching in dimensions d ≤ 2. Establishing rigorous bounds for discriminating between 2D reactive systems that will be accurately described by rate equations with a single rate constant, and those that will not, is critical for both modeling and experimentally parameterizing binding reactions restricted to surfaces such as cellular membranes. We show here that in regimes of intrinsic reaction rate (ka) and diffusion (D) parameters where ka/D > 0.05, a single rate constant cannot be fit to the dynamics of concentrations of associating species independently of the initial conditions. Instead, a more sophisticated multi-parametric description than rate equations is necessary to robustly characterize bimolecular reactions from experiment. Our quantitative bounds derive from our new analysis of 2D rate behavior predicted from Smoluchowski theory. Using a recently developed single-particle reaction-diffusion algorithm that we extend here to 2D, we are able to test and validate the predictions of Smoluchowski theory and several other theories of reversible reaction dynamics in 2D for the first time. Finally, our results also mean that simulations of reactive systems in 2D using rate equations must be undertaken with caution when reactions have ka/D > 0.05, regardless of the simulation volume. We introduce here a simple formula for an adaptive concentration-dependent rate constant for these chemical kinetics simulations which improves on existing formulas to better capture non-equilibrium reaction dynamics from dilute
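
The regime bound quoted above translates into a one-line check (the 0.05 threshold is the value from the abstract; in 2D both ka and D carry units of area per time, so the ratio is dimensionless):

```python
def single_rate_constant_ok(ka, D, threshold=0.05):
    """Check the regime bound for 2D association kinetics: ordinary
    rate equations with a single rate constant are reliable only when
    ka / D < ~0.05, where ka is the intrinsic reaction rate and D the
    relative diffusion constant (both in area/time, e.g. um^2/s)."""
    return ka / D < threshold
```

A modeler can apply this test before choosing between rate-equation kinetics and an explicit spatial (particle-based) simulation for a membrane-bound reaction.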

  17. A fast experimental beam hardening correction method for accurate bone mineral measurements in 3D μCT imaging system.

    PubMed

    Koubar, Khodor; Bekaert, Virgile; Brasse, David; Laquerriere, Patrice

    2015-06-01

    Bone mineral density plays an important role in the determination of bone strength and fracture risk. Consequently, it is very important to obtain accurate bone mineral density measurements. The microcomputerized tomography system provides 3D information about the architectural properties of bone. Quantitative analysis accuracy is decreased by the presence of artefacts in the reconstructed images, mainly beam hardening artefacts (such as cupping artefacts). In this paper, we introduce a new beam hardening correction method based on a postreconstruction technique using off-line water and bone linearization curves calculated experimentally, aiming to take into account the nonhomogeneity of the scanned animal. To evaluate the mass correction rate, a calibration line was established to convert the reconstructed linear attenuation coefficients into bone masses. The correction method was then applied to a multimaterial cylindrical phantom and to mouse skeleton images. Mass correction rates of up to 18% between uncorrected and corrected images were obtained, and a marked improvement in the calculated mass of a mouse femur was observed. Results were also compared with those obtained using the simple water linearization technique, which does not take the nonhomogeneity of the object into account.
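
Applying an experimentally measured linearization curve amounts to mapping measured polychromatic projection values back onto the ideal linear ones, typically by interpolation (a generic sketch of the idea, not the authors' exact procedure; the curve values in the test are invented):

```python
from bisect import bisect_left

def linearize(measured_value, curve_measured, curve_ideal):
    """Apply a beam-hardening linearization curve by piecewise-linear
    interpolation: map a measured polychromatic projection value onto the
    monochromatic-equivalent value it would have without beam hardening.
    `curve_measured` must be sorted ascending; `curve_ideal` holds the
    corresponding ideal (linear) values from the calibration step wedge."""
    i = bisect_left(curve_measured, measured_value)
    if i == 0:
        return curve_ideal[0]
    if i == len(curve_measured):
        return curve_ideal[-1]
    x0, x1 = curve_measured[i - 1], curve_measured[i]
    y0, y1 = curve_ideal[i - 1], curve_ideal[i]
    return y0 + (y1 - y0) * (measured_value - x0) / (x1 - x0)
```

Separate water and bone curves, as in the paper, let the correction account for the different spectral hardening of the two materials instead of assuming a water-only object.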

  18. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    NASA Astrophysics Data System (ADS)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing the SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, which has been experimentally verified. It was found that the frequency response of the two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained utilizing the current density of the electrodes via FEM simulations. The two SAW structures were based on XY Lithium Niobate (LiNbO3) substrates and had two and four electrode finger pairs in both of their interdigital transducers, respectively. Later, SAW devices were fabricated in accordance with the simulated models and their measured frequency responses were found to correlate well with the obtained simulations results. The results indicated that better match between calculated and measured frequency response can be obtained when one of the input electrode finger pairs was set at zero volts and all the current density components were taken into account when calculating the frequency response of the simulated SAW device structures.

  19. Opening Reproducible Research

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing in which research papers become available to the general public free of charge; it also refers to a trend in science towards the act of doing research becoming more open and transparent. When science transforms to open access, we mean not only access to papers, but also access to the research data collected or generated, and to the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are carried out entirely computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns about the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, are technical, and include the lack of standard procedures to publish such information and the lack of benefits after publishing it. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying the reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  20. ReproPhylo: An Environment for Reproducible Phylogenomics.

    PubMed

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration, because the system stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file and a Git repository are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
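The 'single serialized Python object' idea described above can be sketched in a few lines. This is an illustration only, not ReproPhylo's actual API; every field name below is invented for the example.

```python
# Illustrative sketch only (not ReproPhylo's actual API): persisting an
# entire analysis, with provenance and environment information, as one
# serialized Python object.
import pickle
import platform
import sys

workflow = {
    "parameters": {"aligner": "muscle", "bootstrap_replicates": 100},  # hypothetical settings
    "provenance": ["loaded sequences", "aligned", "built tree"],       # steps already run
    "environment": {
        "python": sys.version.split()[0],
        "platform": platform.system(),
    },
}

# Serialize the entire experiment state to one byte string; in practice
# this would be written to a single file and committed to Git, so that
# every iteration of the analysis is versioned automatically.
blob = pickle.dumps(workflow)

# Anyone holding that file can restore the full state and continue or
# audit the analysis.
restored = pickle.loads(blob)
assert restored == workflow
```

Because the whole experiment lives in one object, provenance cannot drift away from the results it describes: reloading the file recovers parameters, history, and environment together.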

  1. ReproPhylo: An Environment for Reproducible Phylogenomics

    PubMed Central

    Szitenberg, Amir; John, Max; Blaxter, Mark L.; Lunt, David H.

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration, because the system stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This ‘single file’ approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file and a Git repository are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution. PMID:26335558

  2. The Need for Reproducibility

    SciTech Connect

    Robey, Robert W.

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer-review process.
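The question of bitwise reproducibility raised above hinges, among other things, on the fact that floating-point addition is not associative, so the reduction order chosen by a parallel machine can change the bits of a result. A minimal single-threaded illustration:

```python
# Floating-point addition is not associative: changing the order in which
# partial sums are combined (as happens when work is split across
# processors) can change the result, even though the inputs are identical.
values = [1e16, 1.0, -1e16, 1.0]

# Strict left-to-right accumulation: the lone 1.0 added to 1e16 is lost
# to rounding before the -1e16 cancels.
left_to_right = ((values[0] + values[1]) + values[2]) + values[3]

# A different reduction order, like a pairwise/tree reduction would use.
pairwise = (values[0] + values[2]) + (values[1] + values[3])

print(left_to_right)  # 1.0
print(pairwise)       # 2.0
```

The exact mathematical sum is 2.0, but the serial order yields 1.0: bitwise reproducibility therefore requires fixing not just the inputs but the entire order of operations.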

  3. Combining computer algorithms with experimental approaches permits the rapid and accurate identification of T cell epitopes from defined antigens.

    PubMed

    Schirle, M; Weinschenk, T; Stevanović, S

    2001-11-01

    The identification of T cell epitopes from immunologically relevant antigens remains a critical step in the development of vaccines and methods for monitoring of T cell responses. This review presents an overview of strategies that employ computer algorithms for the selection of candidate peptides from defined proteins and subsequent verification of their in vivo relevance by experimental approaches. Several computer algorithms are currently being used for epitope prediction of various major histocompatibility complex (MHC) class I and II molecules, based either on the analysis of natural MHC ligands or on the binding properties of synthetic peptides. Moreover, the analysis of proteasomal digests of peptides and whole proteins has led to the development of algorithms for the prediction of proteasomal cleavages. In order to verify the generation of the predicted peptides during antigen processing in vivo as well as their immunogenic potential, several experimental approaches have been pursued in the recent past. Mass spectrometry-based bioanalytical approaches have been used specifically to detect predicted peptides among isolated natural ligands. Other strategies employ various methods for the stimulation of primary T cell responses against the predicted peptides and subsequent testing of the recognition pattern towards target cells that express the antigen.

  4. Experimental scale and dimensionality requirements for reproducing and studying coupled land-atmosphere-vegetative processes in the intermediate scale laboratory settings

    NASA Astrophysics Data System (ADS)

    Trautz, Andrew; Illangasekare, Tissa; Rodriguez-Iturbe, Ignacio; Helmig, Rainer; Heck, Katharina

    2016-04-01

    Past investigations of coupled land-atmosphere-vegetative processes have been constrained to two extremes: small laboratory bench-scale and field-scale testing. In recognition of the limitations of studying the scale-dependency of these fundamental processes at either extreme, researchers have recently begun to promote the use of experimentation at intermediary scales between the bench and field scales. A requirement for employing intermediate-scale testing to refine heat and mass transport theory regarding land-atmosphere-vegetative processes is high spatial-temporal resolution datasets generated under carefully controlled experimental conditions in which both small- and field-scale phenomena can be observed. Field experimentation often fails to meet these criteria as a result of sensor network limitations as well as the natural complexities and uncertainties introduced by heterogeneity and constantly changing atmospheric conditions. Laboratory experimentation, which is used to study three-dimensional (3-D) processes, is often conducted in 2-D test systems as a result of space, instrumentation, and cost constraints. In most flow and transport problems, 2-D testing is not considered a serious limitation because the bypassing of flow and transport due to geo-biochemical heterogeneities can still be studied. Constraining the study of atmosphere-soil-vegetation interactions to 2-D systems introduces a new challenge, given that the soil moisture dynamics associated with these interactions occur in three dimensions. This is an important issue that needs to be addressed as evermore intricate and specialized experimental apparatuses, like the climate-controlled wind tunnel-porous media test system at CESEP, are being constructed and used for these types of studies. The purpose of this study is therefore to investigate the effects of laboratory experimental dimensionality on observed soil moisture dynamics in the context of bare-soil evaporation and evapotranspiration

  5. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn that this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
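The link between hypothesis testing and reproducibility noted above can be made concrete with a small simulation: even a faithful replication of a real effect succeeds (p < 0.05) only about as often as its statistical power allows. The effect size and sample size below are arbitrary assumptions chosen for the demonstration.

```python
# Simulate many exact replications of a two-group study with a true effect,
# and count how often each replication reaches p < 0.05. The success rate
# approximates the study's statistical power, not 100%.
import random
from math import sqrt
from statistics import NormalDist

random.seed(42)
norm = NormalDist()

d = 0.5          # true standardized effect size (assumed)
n = 64           # participants per group (assumed)
se = sqrt(2.0 / n)  # standard error of the standardized mean difference
trials = 20000

successes = 0
for _ in range(trials):
    diff = random.gauss(d, se)                 # observed mean difference (SD units)
    z = diff / se
    p = 2.0 * (1.0 - norm.cdf(abs(z)))         # two-sided p-value
    if p < 0.05:
        successes += 1

observed_power = successes / trials

# Analytical power of a two-sided z test at alpha = 0.05, for comparison.
crit = 1.959964
expected_power = (1.0 - norm.cdf(crit - d / se)) + norm.cdf(-crit - d / se)
print(round(observed_power, 2), round(expected_power, 2))
```

With these assumptions the power is roughly 0.8, so about one replication in five "fails" even though the effect is real and the method identical, which is exactly why power belongs in any discussion of reproducibility.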

  6. Impact of Surface Water Layers on Protein-Ligand Binding: How Well Are Experimental Data Reproduced by Molecular Dynamics Simulations in a Thermolysin Test Case?

    PubMed

    Betz, Michael; Wulsdorf, Tobias; Krimmer, Stefan G; Klebe, Gerhard

    2016-01-25

    Drug binding involves changes of the local water structure around proteins, including water rearrangements across surface-solvation layers around protein and ligand portions exposed to the newly formed complex surface. For a series of thermolysin-binding phosphonamidates, we discovered that variations of the partly exposed P2'-substituents modulate binding affinity by up to 10 kJ mol^-1, with even larger enthalpy/entropy partitioning of the binding signature. The observed profiles cannot be completely explained by desolvation effects. Instead, the quality and completeness of the surface water network wrapping around the formed complexes provide an explanation for the observed structure-activity relationship. We used molecular dynamics to compute surface water networks and predict solvation sites around the complexes. A fairly good correspondence with experimental difference electron densities in high-resolution crystal structures is achieved; in detail, however, some problems with the potentials were discovered. Charge-assisted contacts to waters appeared exaggerated in AMBER, and stabilizing contributions of water-to-methyl contacts were underestimated.

  7. Experimental Toxoplasma gondii infections in pigs: Humoral immune response, estimation of specific IgG avidity and the challenges of reproducing vertical transmission in sows.

    PubMed

    Basso, Walter; Grimm, Felix; Ruetten, Maja; Djokic, Vitomir; Blaga, Radu; Sidler, Xaver; Deplazes, Peter

    2017-03-15

    Ten pregnant sows were experimentally inoculated per os with T. gondii in order to investigate vertical and galactogenic transmission of the parasite and the evolution and maturation of the specific IgG humoral response in the sows and piglets. Five seronegative sows received 10^4 T. gondii (CZ isolate clone H3) sporulated oocysts during late pregnancy (Exp. 1), three sows received 10^4 oocysts during mid-pregnancy (Exp. 2), and three sows from Exp. 1 (and two seronegative sows) were re-inoculated with 10^5 oocysts during a further pregnancy (late pregnancy) (Exp. 3). In addition, six 4.5-week-old piglets inoculated per os with 5×10^3 oocysts were also included in the serological investigations. All animals seroconverted (PrioCHECK Toxoplasma Ab porcine ELISA, Prionics, Switzerland) by 2-3 weeks post inoculation (wpi) and remained seropositive for at least 38 weeks or until euthanasia. Four chronically infected sows from Exp. 1 and 2 were serologically monitored during a further pregnancy, and no reactivation, but a decrease in antibody levels, was observed at farrowing (Exp. 4). In all experiments, the specific IgG avidity was initially low and increased during the course of infection and after re-inoculations. An avidity index (AI) ≥40% could be used to rule out recent infections (<8 weeks) in most (15 of 16) animals. In some piglets (18.6% of 70) delivered by inoculated sows (Exp. 1 and 2), maternal antibodies were still detectable at 2 months (but not by 3 months) of age, with constantly high avidity values comparable to those of the dams at farrowing. In all experiments, the sows remained asymptomatic and delivered non-infected offspring at term. A total of 208 normal and 5 stillborn piglets delivered by the inoculated sows (Exp. 1-4) tested serologically negative before colostrum uptake. Placentas (n=88) from all sows and tissues (brain, liver, lung, heart, and masseter muscle) from 56 delivered piglets were analysed histopathologically and by real-time PCR

  8. Reproducible research in palaeomagnetism

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  9. Patent Law's Reproducibility Paradox.

    PubMed

    Sherkow, Jacob S

    2017-01-01

    Clinical research faces a reproducibility crisis. Many recent clinical and preclinical studies appear to be irreproducible: their results cannot be verified by outside researchers. This is problematic for not only scientific reasons but also legal ones: patents grounded in irreproducible research appear to fail their constitutional bargain of property rights in exchange for working disclosures of inventions. The culprit is likely patent law’s doctrine of enablement. Although the doctrine requires patents to enable others to make and use their claimed inventions, current difficulties in applying the doctrine hamper or even actively dissuade reproducible data in patents. This Article assesses the difficulties in reconciling these basic goals of scientific research and patent law. More concretely, it provides several examples of irreproducibility in patents on blockbuster drugs (Prempro, Xigris, Plavix, and Avastin) and discusses some of the social costs of the misalignment between good clinical practice and patent doctrine. Ultimately, this analysis illuminates several current debates concerning innovation policy. It strongly suggests that a proper conception of enablement should take into account after-arising evidence. It also sheds light on the true purpose, and limits, of patent disclosure. And lastly, it untangles the doctrines of enablement and utility.

  10. First accurate experimental study of Mu reactivity from a state-selected reactant in the gas phase: the Mu + H2(v = 1) reaction rate at 300 K

    NASA Astrophysics Data System (ADS)

    Bakule, Pavel; Sukhorukov, Oleksandr; Ishida, Katsuhiko; Pratt, Francis; Fleming, Donald; Momose, Takamasa; Matsuda, Yasuyuki; Torikai, Eiko

    2015-02-01

    This paper reports on the experimental background and methodology leading to recent results on the first accurate measurement of the reaction rate of the muonium (Mu) atom from a state-selected reactant in the gas phase: the Mu + H2(v = 1) → MuH + H reaction at 300 K, and its comparison with rigorous quantum rate theory, Bakule et al (2012 J. Phys. Chem. Lett. 3 2755). Stimulated Raman pumping, induced by 532 nm light from the 2nd harmonic of a Nd:YAG laser, was used to produce H2 in its first vibrational (v = 1) state in a single Raman/reaction cell. A pulsed muon beam (from ‘ISIS’, at 50 Hz) matched the 25 Hz repetition rate of the laser, allowing data taking in equal ‘Laser-On/Laser-Off’ modes of operation. The signal to noise was improved by over an order of magnitude in comparison with an earlier proof-of-principle experiment. The success of the present experiment also relied on optimizing the overlap of the laser profile with the extended stopping distribution of the muon beam at 50 bar H2 pressure, in which Monte Carlo simulations played a central role. The rate constant, found from the analysis of three separate measurements, which includes a correction for the loss of H2(v = 1) concentration due to collisional relaxation with unpumped H2 during the time of each measurement, is k_Mu(v = 1) = 9.9 (+1.7/−1.4) × 10^−13 cm^3 s^−1 at 300 K. This is in good to excellent agreement with rigorous quantum rate calculations on the complete configuration interaction/Born-Huang surface, as reported earlier by Bakule et al, and which are also briefly commented on herein.

  11. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons included incomprehensible or absent variable labels, DFAs performed on an unspecified subset of the data, and incomplete datasets. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned, and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
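For readers unfamiliar with the summary statistics mentioned above, here is a from-scratch two-group Fisher discriminant analysis on made-up data, reporting the percentage correctly assigned and the largest discriminant function coefficient. It sketches the technique in general, not any dataset from the study.

```python
# Two-group linear discriminant analysis (Fisher's method) on toy 2-D data,
# using only the standard library. The 2x2 scatter matrix is inverted in
# closed form.
class_a = [(1, 2), (2, 3), (3, 3), (2, 1)]
class_b = [(6, 5), (7, 7), (8, 6), (7, 5)]

def mean(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

m_a, m_b = mean(class_a), mean(class_b)

# Pooled within-class scatter matrix Sw (2x2, symmetric).
sxx = sxy = syy = 0.0
for points, m in ((class_a, m_a), (class_b, m_b)):
    for x, y in points:
        dx, dy = x - m[0], y - m[1]
        sxx += dx * dx
        sxy += dx * dy
        syy += dy * dy

# Fisher direction w = Sw^-1 (m_a - m_b), via the closed-form 2x2 inverse.
det = sxx * syy - sxy * sxy
dmx, dmy = m_a[0] - m_b[0], m_a[1] - m_b[1]
w = ((syy * dmx - sxy * dmy) / det, (-sxy * dmx + sxx * dmy) / det)

def project(p):
    return w[0] * p[0] + w[1] * p[1]

# Classify by projecting onto w and thresholding halfway between the
# projected class means.
threshold = (project(m_a) + project(m_b)) / 2.0
sign = 1.0 if project(m_a) > threshold else -1.0

correct = sum(sign * (project(p) - threshold) > 0 for p in class_a)
correct += sum(sign * (project(p) - threshold) < 0 for p in class_b)
percent_correct = 100.0 * correct / (len(class_a) + len(class_b))

print(percent_correct)                   # 100.0 on this cleanly separated toy data
print(max(abs(c) for c in w))            # largest discriminant function coefficient
```

Reproducing a published DFA amounts to recovering numbers like these from the archived dataset; as the study found, that is only possible when variable labels and the analyzed subset are documented.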

  12. Reproducibility and Comparability of Computational Models for Astrocyte Calcium Excitability

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2017-01-01

    The scientific community across all disciplines faces the same challenges of ensuring accessibility, reproducibility, and efficient comparability of scientific results. Computational neuroscience is a rapidly developing field, where reproducibility and comparability of research results have gained increasing interest over the past years. As the number of computational models of brain functions is increasing, we chose to address reproducibility using four previously published computational models of astrocyte excitability as an example. Although not conventionally taken into account when modeling neuronal systems, astrocytes have been shown to take part in a variety of in vitro and in vivo phenomena, including synaptic transmission. Two of the selected astrocyte models describe spontaneous calcium excitability, and the other two neurotransmitter-evoked calcium excitability. We specifically addressed how well the original simulation results can be reproduced with a reimplementation of the models. Additionally, we studied how well the selected models can be reused and whether they are comparable in other stimulation conditions and research settings. Unexpectedly, we found that three of the model publications did not give all the information necessary to reimplement the models. In addition, we were able to completely reproduce the original results of only one of the models based on the information given in the original publications and in the errata. We actually found errors in the equations provided by two of the model publications; after modifying the equations accordingly, the original results were reproduced more accurately. Even though the selected models were developed to describe the same biological event, namely astrocyte calcium excitability, the models behaved quite differently compared to one another. Our findings on a specific set of published astrocyte models stress the importance of proper validation of the models against experimental wet

  13. Reproducibility in a multiprocessor system

    SciTech Connect

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state; a single system clock; phase alignment of clocks in the system; system-wide synchronization events; reproducible execution of system components; deterministic chip interfaces; zero-impact communication with the system; precise stop of the system; and a scan of the system state.

  14. Reproducible quantitative proteotype data matrices for systems biology

    PubMed Central

    Röst, Hannes L.; Malmström, Lars; Aebersold, Ruedi

    2015-01-01

    Historically, many mass spectrometry–based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. PMID:26543201

  15. Contextual sensitivity in scientific reproducibility.

    PubMed

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researchers' degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  16. Open Science and Research Reproducibility

    PubMed Central

    Munafò, Marcus

    2016-01-01

    Many scientists, journals and funders are concerned about the low reproducibility of many scientific findings. One approach that may serve to improve the reliability and robustness of research is open science. Here I argue that the process of pre-registering study protocols, sharing study materials and data, and posting preprints of manuscripts may serve to improve quality control procedures at every stage of the research pipeline, and in turn improve the reproducibility of published work. PMID:27350794

  17. Tissue Doppler imaging reproducibility during exercise.

    PubMed

    Bougault, V; Nottin, S; Doucende, G; Obert, P

    2008-05-01

    Tissue Doppler imaging (TDI) is an echocardiographic technique used during exercise to improve the accuracy of cardiovascular diagnosis. The validity of TDI requires reproducibility, which had never been assessed during moderate- to maximal-intensity exercise. The present study was specifically designed to assess transmitral Doppler and pulsed TDI reproducibility in 19 healthy men who underwent two identical semi-supine maximal exercise tests on a cycle ergometer. Systolic (S') and diastolic (E') tissue velocities at the septal and lateral walls, as well as early transmitral velocities (E), were assessed during exercise up to maximal effort. The data were compared between the two tests at 40%, 60%, 80% and 100% of maximal aerobic power. Despite upper-body movements and hyperventilation, good-quality echocardiographic images were obtained in each case. Regardless of exercise intensity, no differences were noticed between the two tests for any measurement. The variation coefficients for Doppler variables ranged from 3% to 9% over the transition from rest to maximal exercise. The random measurement error was, on average, 5.8 cm/s for E' and 4.4 cm/s for S'. Overall, the reproducibility of TDI was acceptable. Tissue Doppler imaging can thus be used to accurately evaluate LV diastolic and/or systolic function over this range of exercise intensity.

  18. Contextual sensitivity in scientific reproducibility

    PubMed Central

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researchers’ degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  19. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore document the provenance of, published scientific findings are rarely available. We argue that in order to advance and make more robust the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-useable and easy to find through consistent use of metadata; second, publish well documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-useable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.
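One lightweight way to implement the practices described above is to publish, alongside each result, a machine-readable manifest tying it to the exact code and data that produced it. This is an illustrative sketch only; the field names and the persistent identifier below are invented for the example.

```python
# A minimal provenance manifest: a persistent identifier for the code, the
# code version, a cryptographic hash of the raw data, and the workflow
# steps. Publishing this alongside a figure lets others verify they are
# reproducing the result from the same inputs.
import hashlib
import json

data = b"streamflow,mm/day\n1.2\n0.9\n1.4\n"   # stand-in for a raw data file

manifest = {
    "code_identifier": "doi:10.0000/example.workflow.v1",  # hypothetical DOI
    "code_version": "v1.2.0",
    "data_sha256": hashlib.sha256(data).hexdigest(),
    "steps": ["load gauge data", "calibrate model", "plot figure"],
}

record = json.dumps(manifest, indent=2, sort_keys=True)
print(record)

# A later reader can check that the data they obtained matches the data
# the authors analysed, byte for byte.
assert hashlib.sha256(data).hexdigest() == manifest["data_sha256"]
```

The hash makes silent data substitutions detectable, and the persistent identifier keeps the code findable even if repository URLs change.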

  20. Reproducibility of airway wall thickness measurements

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael; Kuhnigk, Jan-Martin; Krass, Stefan; Owsijewitsch, Michael; de Hoop, Bartjan; Peitgen, Heinz-Otto

    2010-03-01

    Airway remodeling and accompanying changes in wall thickness are known to be a major symptom of chronic obstructive pulmonary disease (COPD), associated with reduced lung function in diseased individuals. Further investigation of this disease, as well as monitoring of disease progression and treatment effect, demands accurate and reproducible assessment of airway wall thickness in CT datasets. With wall thicknesses in the sub-millimeter range, this task remains challenging even with today's high resolution CT datasets. To provide accurate measurements, taking partial volume effects into account is mandatory. The Full-Width-at-Half-Maximum (FWHM) method has been shown to be inappropriate for small airways [1,2] and several improved algorithms for objective quantification of airway wall thickness have been proposed [1-8]. In this paper, we describe an algorithm based on a closed form solution proposed by Weinheimer et al. [7]. We locally estimate the lung density parameter required for the closed form solution to account for possible variations of parenchyma density between different lung regions, inspiration states and contrast agent concentrations. The general accuracy of the algorithm is evaluated using basic tubular software and hardware phantoms. Furthermore, we present results on the reproducibility of the algorithm with respect to clinical CT scans, varying reconstruction kernels, and repeated acquisitions, which is crucial for longitudinal observations.

  1. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  2. Rotary head type reproducing apparatus

    DOEpatents

    Takayama, Nobutoshi; Edakubo, Hiroo; Kozuki, Susumu; Takei, Masahiro; Nagasawa, Kenichi

    1986-01-01

    In an apparatus of the kind arranged to reproduce, with a plurality of rotary heads, an information signal from a record bearing medium having many recording tracks which are parallel to each other with the information signal recorded therein and with a plurality of different pilot signals of different frequencies also recorded one by one, one in each of the recording tracks, a plurality of different reference signals of different frequencies are simultaneously generated. A tracking error is detected by using the different reference signals together with the pilot signals which are included in signals reproduced from the plurality of rotary heads.

  3. An open investigation of the reproducibility of cancer biology research.

    PubMed

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-12-10

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

  4. An open investigation of the reproducibility of cancer biology research

    PubMed Central

    Errington, Timothy M; Iorns, Elizabeth; Gunn, William; Tan, Fraser Elisabeth; Lomax, Joelle; Nosek, Brian A

    2014-01-01

    It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high impact cancer biology articles published in the period 2010-2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility. DOI: http://dx.doi.org/10.7554/eLife.04333.001 PMID:25490932

  5. Reproducible Bioinformatics Research for Biologists

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  6. Reproducible research in computational science.

    PubMed

    Peng, Roger D

    2011-12-02

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  7. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: Application to pure copper, platinum, tungsten, and nickel at very high temperatures

    NASA Astrophysics Data System (ADS)

    Abadlia, L.; Gasser, F.; Khalouk, K.; Mayoufi, M.; Gasser, J. G.

    2014-09-01

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with a good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new very accurate results for pure copper, platinum, and nickel especially at very high temperatures. But resistivity and absolute thermopower measurement can be more than an objective in itself. Resistivity characterizes the bulk of a material while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power characterizes advantageously the (change of) phase, probably as well as DSC (if not better), since the change of phases can be easily followed during several hours/days at constant temperature.

  8. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: application to pure copper, platinum, tungsten, and nickel at very high temperatures.

    PubMed

    Abadlia, L; Gasser, F; Khalouk, K; Mayoufi, M; Gasser, J G

    2014-09-01

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with a good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new very accurate results for pure copper, platinum, and nickel especially at very high temperatures. But resistivity and absolute thermopower measurement can be more than an objective in itself. Resistivity characterizes the bulk of a material while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power characterizes advantageously the (change of) phase, probably as well as DSC (if not better), since the change of phases can be easily followed during several hours/days at constant temperature.

  9. New experimental methodology, setup and LabView program for accurate absolute thermoelectric power and electrical resistivity measurements between 25 and 1600 K: Application to pure copper, platinum, tungsten, and nickel at very high temperatures

    SciTech Connect

    Abadlia, L.; Mayoufi, M.; Gasser, F.; Khalouk, K.; Gasser, J. G.

    2014-09-15

    In this paper we describe an experimental setup designed to measure simultaneously and very accurately the resistivity and the absolute thermoelectric power, also called absolute thermopower or absolute Seebeck coefficient, of solid and liquid conductors/semiconductors over a wide range of temperatures (room temperature to 1600 K in present work). A careful analysis of the existing experimental data allowed us to extend the absolute thermoelectric power scale of platinum to the range 0-1800 K with two new polynomial expressions. The experimental device is controlled by a LabView program. A detailed description of the accurate dynamic measurement methodology is given in this paper. We measure the absolute thermoelectric power and the electrical resistivity and deduce with a good accuracy the thermal conductivity using the relations between the three electronic transport coefficients, going beyond the classical Wiedemann-Franz law. We use this experimental setup and methodology to give new very accurate results for pure copper, platinum, and nickel especially at very high temperatures. But resistivity and absolute thermopower measurement can be more than an objective in itself. Resistivity characterizes the bulk of a material while absolute thermoelectric power characterizes the material at the point where the electrical contact is established with a couple of metallic elements (forming a thermocouple). In a forthcoming paper we will show that the measurement of resistivity and absolute thermoelectric power characterizes advantageously the (change of) phase, probably as well as DSC (if not better), since the change of phases can be easily followed during several hours/days at constant temperature.

  10. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

    Abstract. Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120
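
    The within-subject coefficient of variation reported above is a standard reproducibility summary; as an illustration (the data below are hypothetical, not the authors' measurements), it can be computed as the RMS of each subject's std/mean across repeated acquisitions:

```python
import numpy as np

def within_subject_cv(measurements):
    """Root-mean-square within-subject CV over repeated measurements.

    `measurements` is an (n_subjects, n_repeats) array; each subject's
    CV is std/mean, and the summary is the RMS of those CVs.
    """
    m = np.asarray(measurements, dtype=float)
    cv = m.std(axis=1, ddof=1) / m.mean(axis=1)
    return float(np.sqrt(np.mean(cv ** 2)))

# Two repeated stiffness estimates per subject (arbitrary units)
data = [[1.00, 1.04], [0.90, 0.94], [1.20, 1.10]]
print(round(within_subject_cv(data), 3))
```

    A bulk-level CV like this, computed per subject and summarized, is what allows a later study to distinguish biological change from measurement noise.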

  11. Construction of Spectroscopically Accurate IR Linelists for NH3 and CO2

    NASA Astrophysics Data System (ADS)

    Huang, X.; Schwenke, D. W.; Lee, T. J.

    2011-05-01

    The strategy of using the best theory together with high-resolution experiment was applied to NH3 and CO2: that is, refine a highly accurate ab initio PES with the most reliable HITRAN or pure experimental data. With 0.01 - 0.02 cm-1 accuracy, our calculations are clearly far beyond simply reproducing experimental data, but are also capable of revealing many deficiencies in the current experimental analysis of the various isotopologues, as well as providing reliable predictions with similar accuracy.

  12. Reproducibility of 3D chromatin configuration reconstructions

    PubMed Central

    Segal, Mark R.; Xiong, Hao; Capurso, Daniel; Vazquez, Mariel; Arsuaga, Javier

    2014-01-01

    It is widely recognized that the three-dimensional (3D) architecture of eukaryotic chromatin plays an important role in processes such as gene regulation and cancer-driving gene fusions. Observing or inferring this 3D structure at even modest resolutions had been problematic, since genomes are highly condensed and traditional assays are coarse. However, recently devised high-throughput molecular techniques have changed this situation. Notably, the development of a suite of chromatin conformation capture (CCC) assays has enabled elicitation of contacts—spatially close chromosomal loci—which have provided insights into chromatin architecture. Most analysis of CCC data has focused on the contact level, with less effort directed toward obtaining 3D reconstructions and evaluating the accuracy and reproducibility thereof. While questions of accuracy must be addressed experimentally, questions of reproducibility can be addressed statistically—the purpose of this paper. We use a constrained optimization technique to reconstruct chromatin configurations for a number of closely related yeast datasets and assess reproducibility using four metrics that measure the distance between 3D configurations. The first of these, Procrustes fitting, measures configuration closeness after applying reflection, rotation, translation, and scaling-based alignment of the structures. The others base comparisons on the within-configuration inter-point distance matrix. Inferential results for these metrics rely on suitable permutation approaches. Results indicate that distance matrix-based approaches are preferable to Procrustes analysis, not because of the metrics per se but rather on account of the ability to customize permutation schemes to handle within-chromosome contiguity. It has recently been emphasized that the use of constrained optimization approaches to 3D architecture reconstruction are prone to being trapped in local minima. Our methods of reproducibility assessment provide a
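
    Procrustes fitting of the kind described, which aligns two configurations by reflection, rotation, translation, and scaling before measuring their residual distance, is available off the shelf; a minimal sketch on synthetic 3D coordinates (not the paper's yeast data) using `scipy.spatial.procrustes`:

```python
import numpy as np
from scipy.spatial import procrustes

# Two hypothetical 3D reconstructions of the same 20 loci (n_loci x 3).
rng = np.random.default_rng(0)
config_a = rng.normal(size=(20, 3))

# config_b is the same shape, but rotated, scaled, and translated;
# Procrustes alignment should remove these nuisance transformations.
theta = 0.5
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
config_b = 2.0 * config_a @ rot.T + 5.0

_, _, disparity = procrustes(config_a, config_b)
print(disparity)  # near zero: the two configurations have the same shape
```

    The `disparity` returned is the sum of squared pointwise differences after alignment, which is exactly why the abstract notes that Procrustes comparisons cannot easily accommodate within-chromosome contiguity constraints the way permutation schemes on distance matrices can.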

  13. Accurate Prediction of Glucuronidation of Structurally Diverse Phenolics by Human UGT1A9 Using Combined Experimental and In Silico Approaches

    PubMed Central

    Wu, Baojian; Wang, Xiaoqiang; Zhang, Shuxing; Hu, Ming

    2012-01-01

    Purpose The catalytic selectivity of human UGT1A9, an important membrane-bound enzyme catalyzing glucuronidation of xenobiotics, was determined experimentally using 145 phenolics, and analyzed by 3D-QSAR methods. Methods The catalytic efficiency of UGT1A9 was determined by kinetic profiling. Quantitative structure activity relationships were analyzed using the CoMFA and CoMSIA techniques. Molecular alignment of the substrate structures was made by superimposing the glucuronidation site and its adjacent aromatic ring to achieve maximal steric overlap. For a substrate with multiple active glucuronidation sites, each site was considered as a separate substrate. Results The 3D-QSAR analyses produced statistically reliable models with good predictive power (CoMFA: q2 = 0.548, r2 = 0.949, r2pred = 0.775; CoMSIA: q2 = 0.579, r2 = 0.876, r2pred = 0.700). The contour coefficient maps were applied to elucidate structural features among substrates that are responsible for the selectivity differences. Furthermore, the contour coefficient maps were overlaid in the catalytic pocket of a homology model of UGT1A9; this enabled us to identify the UGT1A9 catalytic pocket with a high degree of confidence. Conclusion The CoMFA/CoMSIA models can predict the substrate selectivity and in vitro clearance of UGT1A9. Our findings also provide a possible molecular basis for understanding UGT1A9 functions and its substrate selectivity. PMID:22302521

  14. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
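
    One of the criteria above, whether the original effect size falls inside the 95% confidence interval of the replication effect, is simple to compute; a sketch with made-up effect estimates (the function and the numbers are illustrative, not data from the project):

```python
def in_replication_ci(original_effect, replication_effect, replication_se,
                      z=1.96):
    """True if the original effect lies inside the replication's 95% CI."""
    lo = replication_effect - z * replication_se
    hi = replication_effect + z * replication_se
    return lo <= original_effect <= hi

# Hypothetical (original effect, replication effect, replication SE) triples
studies = [(0.60, 0.28, 0.12), (0.45, 0.40, 0.10), (0.80, 0.15, 0.09)]
covered = sum(in_replication_ci(o, r, se) for o, r, se in studies)
print(f"{covered}/{len(studies)} originals inside replication 95% CI")
```

    The abstract's observation that replication effects were roughly half the size of originals shows up in exactly this kind of check: a shrunken replication estimate with a tight CI will often exclude the original effect.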

  15. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  16. Wire like link for cycle reproducible and cycle accurate hardware accelerator

    DOEpatents

    Asaad, Sameh; Kapur, Mohit; Parker, Benjamin D

    2015-04-07

    First and second field programmable gate arrays are provided which implement first and second blocks of a circuit design to be simulated. The field programmable gate arrays are operated at a first clock frequency and a wire like link is provided to send a plurality of signals between them. The wire like link includes a serializer, on the first field programmable gate array, to serialize the plurality of signals; a deserializer on the second field programmable gate array, to deserialize the plurality of signals; and a connection between the serializer and the deserializer. The serializer and the deserializer are operated at a second clock frequency, greater than the first clock frequency, and the second clock frequency is selected such that latency of transmission and reception of the plurality of signals is less than the period corresponding to the first clock frequency.
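
    The timing constraint in the claim, that serializing, transmitting, and deserializing all signals must finish within one period of the slower design clock, reduces to simple arithmetic; a hypothetical back-of-envelope check (the signal and overhead counts are assumptions for illustration, not figures from the patent):

```python
def min_serial_clock(n_signals, f_design_hz, overhead_cycles=2):
    """Smallest serializer clock f2 (Hz) such that shifting `n_signals`
    bits serially, plus a fixed overhead, completes within one design-clock
    period: (n_signals + overhead_cycles) / f2 < 1 / f_design_hz."""
    return (n_signals + overhead_cycles) * f_design_hz

# 32 signals between FPGAs emulating the design at 1 MHz:
f2 = min_serial_clock(32, 1_000_000)
print(f2)  # the serializer clock must exceed 34 MHz
```

    This is why the second clock frequency in the claim must be "greater than the first": the serial link trades pin count for clock rate while keeping the emulation cycle-accurate.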

  17. Applicability of Density Functional Theory in Reproducing Accurate Vibrational Spectra of Surface Bound Species

    SciTech Connect

    Matanovic, Ivana; Atanassov, Plamen; Kiefer, Boris; Garzon, Fernando; Henson, Neil J.

    2014-10-05

    The structural equilibrium parameters, the adsorption energies, and the vibrational frequencies of the nitrogen molecule and the hydrogen atom adsorbed on the (111) surface of rhodium have been investigated using different generalized-gradient approximation (GGA), nonlocal correlation, meta-GGA, and hybrid functionals, namely, the Perdew, Burke, and Ernzerhof (PBE), Revised-RPBE, vdW-DF, Tao, Perdew, Staroverov, and Scuseria (TPSS), and Heyd, Scuseria, and Ernzerhof (HSE06) functionals in the plane wave formalism. Among the five tested functionals, the nonlocal vdW-DF and meta-GGA TPSS functionals are most successful in describing the energetics of dinitrogen physisorption on the Rh(111) surface, while the PBE functional provides the correct chemisorption energy for the hydrogen atom. It was also found that the TPSS functional produces the best vibrational spectra of the nitrogen molecule and the hydrogen atom on rhodium within the harmonic formalism, with errors of −2.62 and −1.1% for the N–N and Rh–H stretching frequencies. Thus, the TPSS functional was proposed as a method of choice for obtaining vibrational spectra of low-weight adsorbates on metallic surfaces within the harmonic approximation. At the anharmonic level, by decoupling the Rh–H and N–N stretching modes from the bulk phonons and by solving the one- and two-dimensional Schrödinger equations associated with the Rh–H, Rh–N, and N–N potential energies, we calculated the anharmonic corrections for the N–N and Rh–H stretching modes as −31 cm−1 and −77 cm−1 at the PBE level. Anharmonic vibrational frequencies calculated with the use of the hybrid HSE06 functional are in best agreement with available experiments.

  18. Vapor Pressure of Aqueous Solutions of Electrolytes Reproduced with Coarse-Grained Models without Electrostatics.

    PubMed

    Perez Sirkin, Yamila A; Factorovich, Matías H; Molinero, Valeria; Scherlis, Damian A

    2016-06-14

    The vapor pressure of water is a key property in a large class of applications from the design of membranes for fuel cells and separations to the prediction of the mixing state of atmospheric aerosols. Molecular simulations have been used to compute vapor pressures, and a few studies on liquid mixtures and solutions have been reported on the basis of the Gibbs Ensemble Monte Carlo method in combination with atomistic force fields. These simulations are costly, making them impractical for the prediction of the vapor pressure of complex materials. The goal of the present work is twofold: (1) to demonstrate the use of the grand canonical screening approach (Factorovich, M. H., J. Chem. Phys. 2014, 140, 064111) to compute the vapor pressure of solutions and to extend the methodology for the treatment of systems without a liquid-vapor interface and (2) to investigate the ability of computationally efficient high-resolution coarse-grained models based on the mW monatomic water potential and ions described exclusively with short-range interactions to reproduce the relative vapor pressure of aqueous solutions. We find that coarse-grained models of LiCl and NaCl solutions faithfully reproduce the experimental relative pressures up to high salt concentrations, despite the inability of these models to predict cohesive energies of the solutions or the salts. A thermodynamic analysis reveals that the coarse-grained models achieve the experimental activity coefficients of water in solution through a compensation of severely underestimated hydration and vaporization free energies of the salts. Our results suggest that coarse-grained models developed to replicate the hydration structure and the effective ion-ion attraction in solution may lead to this compensation. Moreover, they suggest an avenue for the design of coarse-grained models that accurately reproduce the activity coefficients of solutions.

  19. Reliability and reproducibility of Kienbock's disease staging.

    PubMed

    Goeminne, S; Degreef, I; De Smet, L

    2010-09-01

    We evaluated the interobserver reliability and intraobserver reproducibility of the Lichtman et al. classification for Kienböck's disease by getting four observers with different experience to look at 70 sets of wrist radiographs at different points in time. These observers staged each set of radiographs. Paired comparisons of the observations identified an agreement in 63% of cases and a mean weighted kappa coefficient of 0.64 confirming interobserver reliability. The stage of the involved lunate was reproduced in 78% of the observations with a mean weighted kappa coefficient of 0.81 showing intraobserver reproducibility. This classification for Kienböck's disease has good reliability and reproducibility.

  20. Statistical analysis of accurate prediction of local atmospheric optical attenuation with a new model according to weather together with beam wandering compensation system: a season-wise experimental investigation

    NASA Astrophysics Data System (ADS)

    Arockia Bazil Raj, A.; Padmavathi, S.

    2016-07-01

    Atmospheric parameters strongly affect the performance of Free Space Optical Communication (FSOC) systems when the optical wave is propagating through the inhomogeneous turbulent medium. Developing a model to obtain an accurate prediction of optical attenuation from meteorological parameters is significant for understanding the behaviour of the FSOC channel during different seasons. A dedicated free space optical link experimental set-up is developed for a range of 0.5 km at an altitude of 15.25 m. The diurnal profile of received power and the corresponding meteorological parameters are continuously measured using the developed optoelectronic assembly and a weather station, respectively, and stored in a data logging computer. Measured meteorological parameters (as input factors) and optical attenuation (as response factor) of size [177147 × 4] are used for linear regression analysis and to design the mathematical model that is most suitable for predicting the atmospheric optical attenuation at our test field. A model that exhibits an R2 value of 98.76% and an average percentage deviation of 1.59% is considered for practical implementation. The prediction accuracy of the proposed model is investigated, along with comparative results obtained from some of the existing models, in terms of Root Mean Square Error (RMSE) during different local seasons over a one-year period. An average RMSE of 0.043 dB/km is obtained across the wide dynamic range of meteorological parameter variations.
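
    The workflow described, an ordinary least-squares fit of attenuation against weather variables scored by R2 and RMSE, can be sketched on synthetic data (the predictors, coefficients, and noise level below are invented for illustration, not the paper's measurements):

```python
import numpy as np

# Hypothetical: fit attenuation (dB/km) against three scaled weather
# predictors (e.g. temperature, humidity, visibility) by least squares.
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 3.0 + rng.normal(scale=0.05, size=200)

A = np.column_stack([X, np.ones(len(X))])   # add an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w

rmse = float(np.sqrt(np.mean((y - pred) ** 2)))
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3), round(rmse, 3))
```

    Reporting both metrics, as the abstract does, is worthwhile: R2 measures the fraction of variance explained, while RMSE keeps the error in physical units (dB/km) that can be compared across seasons and models.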

  1. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools for modeling sediment transport and morphology evolution have been developed in recent years and can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for such a complex geometry, while the latter often require a prohibitive computational effort. 2D models, however, are computationally efficient and may already provide sufficiently accurate predictions of the morphology evolution caused by sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weak-coupled strategy. The classical friction approach used to reproduce the bed channel roughness has been modified to take into account the morphological effect of replenishment, which causes a fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface and travel distance of placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The effect of fining roughness is better reproduced with the approach proposed herein. However, it is also highlighted that the sediment clusters found in the experiment are not well reproduced numerically in the regions of the channel with a limited number of sediment grains.

  2. Accurate measurement of the specific absorption rate using a suitable adiabatic magnetothermal setup

    NASA Astrophysics Data System (ADS)

    Natividad, Eva; Castro, Miguel; Mediano, Arturo

    2008-03-01

    Accurate measurements of the specific absorption rate (SAR) of solids and fluids were obtained by a calorimetric method, using a special-purpose setup working under adiabatic conditions. Unlike in current nonadiabatic setups, the weak heat exchange with the surroundings allowed a straightforward determination of temperature increments, avoiding the usual initial-time approximations. The measurements performed on a commercial magnetite aqueous ferrofluid revealed a good reproducibility (4%). Also, the measurements on a copper sample allowed comparison between experimental and theoretical values: adiabatic conditions gave SAR values only 3% higher than the theoretical ones, while the typical nonadiabatic method underestimated SAR by 21%.
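Under adiabatic conditions the SAR follows directly from the slope of the full temperature record, with no initial-time approximation. A minimal sketch of that calculation (function name, units and normalization are assumptions, not the authors' code):

```python
import numpy as np

def sar_adiabatic(times, temps, c_p, m_sample, m_magnetic):
    """SAR from an adiabatic heating run: fit dT/dt over the whole
    record (possible because heat losses are negligible), then
    normalize. With c_p in J/(g K) and masses in g, the result is
    W per gram of magnetic material."""
    slope, _ = np.polyfit(times, temps, 1)   # least-squares dT/dt in K/s
    return c_p * m_sample / m_magnetic * slope
```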

  3. Interobserver reproducibility of radiographic evaluation of lumbar spine instability

    PubMed Central

    Segundo, Saulo de Tarso de Sá Pereira; Valesin, Edgar Santiago; Lenza, Mario; Santos, Durval do Carmo Barros; Rosemberg, Laercio Alberto; Ferretti, Mario

    2016-01-01

    ABSTRACT Objective: To measure the interobserver reproducibility of the radiographic evaluation of lumbar spine instability. Methods: Measurements were made on dynamic radiographs of the lumbar spine in lateral view, evaluating the anterior translation and the angulation between the vertebral bodies. The images were evaluated at the institution's workstations using the Carestream Health Vue RIS (PACS) system, version 11.0.12.14 (Inc. 2009©). Results: Agreement in detecting cases of radiographic instability among the observers varied from 88.1 to 94.4%, and the AC1 agreement coefficients were all above 0.8, indicating excellent agreement. Conclusion: The interobserver analysis performed among orthopedic surgeons with different levels of training on dynamic radiographs of the spine showed high reproducibility and agreement. However, some factors, such as the manual method of measurement and the presence of vertebral osteophytes, might have generated a few less accurate results in this comparative evaluation of measurements. PMID:27759827

  4. Towards Accurate Molecular Modeling of Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Chantawansri, T. L.; Andzelm, J.; Taylor, D.; Byrd, E.; Rice, B.

    2010-03-01

    There is substantial interest in identifying the controlling factors that influence the susceptibility of polymer bonded explosives (PBXs) to accidental initiation. Numerous Molecular Dynamics (MD) simulations of PBXs using the COMPASS force field have been reported in recent years, where the validity of the force field in modeling the solid EM fill has been judged solely on its ability to reproduce lattice parameters, which is an insufficient metric. Performance of the COMPASS force field in modeling EMs and the polymeric binder has been assessed by calculating structural, thermal, and mechanical properties, where only fair agreement with experimental data is obtained. We performed MD simulations using the COMPASS force field for the polymer binder hydroxyl-terminated polybutadiene and five EMs: cyclotrimethylenetrinitramine, 1,3,5,7-tetranitro-1,3,5,7-tetra-azacyclo-octane, 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane, 2,4,6-trinitro-1,3,5-benzenetriamine, and pentaerythritol tetranitrate. Predicted EM crystallographic and molecular structural parameters, as well as calculated properties for the binder, will be compared with experimental results for different simulation conditions. We also present novel simulation protocols that improve agreement between experimental and computational results, thus leading to the accurate modeling of PBXs.

  5. The Economics of Reproducibility in Preclinical Research.

    PubMed

    Freedman, Leonard P; Cockburn, Iain M; Simcoe, Timothy S

    2015-06-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B)/year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.
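The headline figure follows from simple arithmetic on the prevalence estimate; a sketch, assuming a total US preclinical spend of roughly $56B/year (an illustrative number consistent with the cited result, not a figure taken from this abstract):

```python
def irreproducible_spend(total_annual_spend, irreproducibility_rate):
    """Annual dollars spent on research that is not reproducible,
    given a total spend and an irreproducibility prevalence."""
    return total_annual_spend * irreproducibility_rate

# With an assumed ~$56e9/year spend and the >50% prevalence cited above,
# this reproduces the ~US$28B figure:
wasted = irreproducible_spend(56e9, 0.5)
```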

  6. INFRARED IMAGING OF CARBON AND CERAMIC COMPOSITES: DATA REPRODUCIBILITY

    SciTech Connect

    Knight, B.; Howard, D. R.; Ringermacher, H. I.; Hudson, L. D.

    2010-02-22

    Infrared NDE techniques have proven to be superior for imaging of flaws in ceramic matrix composites (CMC) and carbon silicon carbide composites (C/SiC). Not only can one obtain accurate depth gauging of flaws such as delaminations and layered porosity in complex-shaped components such as airfoils and other aeronautical components, but also excellent reproducibility of image data is obtainable using the STTOF (Synthetic Thermal Time-of-Flight) methodology. The imaging of large complex shapes is fast and reliable. This methodology as applied to large C/SiC flight components at the NASA Dryden Flight Research Center will be described.

  7. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order, high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
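The claimed orders of accuracy for schemes like these can be checked numerically by grid refinement. A generic sketch using standard second- and fourth-order central differences (not the paper's algorithms):

```python
import numpy as np

def convergence_order(deriv, f, dfdx, x=1.0, hs=(1e-2, 5e-3, 2.5e-3)):
    """Observed order of accuracy of a derivative approximation:
    halving h should reduce the error by 2**p for a p-th order scheme."""
    errs = [abs(deriv(f, x, h) - dfdx(x)) for h in hs]
    return [np.log(errs[i] / errs[i+1]) / np.log(hs[i] / hs[i+1])
            for i in range(len(errs) - 1)]

def central2(f, x, h):
    """Second-order central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

def central4(f, x, h):
    """Fourth-order central difference."""
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12 * h)
```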

  8. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
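For context, the classical second-order-limited construction that the paper improves upon can be sketched as follows: a piecewise cubic Hermite interpolant whose knot slopes are limited (here with a harmonic mean, one common Fritsch-Carlson-style choice) so that monotone data produce a monotone curve. This is a generic illustration, not the paper's higher-order algorithm:

```python
import numpy as np

def monotone_slopes(x, y):
    """Limited knot slopes: harmonic mean of adjacent secant slopes
    when they share a sign, zero at local extrema."""
    h = np.diff(x)
    d = np.diff(y) / h                       # secant slopes
    dl, dr = d[:-1], d[1:]
    denom = dl + dr
    safe = np.where(denom == 0, 1.0, denom)  # avoid 0/0 in the unused branch
    m = np.zeros_like(y)
    m[1:-1] = np.where(dl * dr > 0, 2 * dl * dr / safe, 0.0)
    m[0], m[-1] = d[0], d[-1]
    return m

def hermite_eval(x, y, m, xq):
    """Evaluate the piecewise cubic Hermite interpolant at xq."""
    i = np.clip(np.searchsorted(x, xq) - 1, 0, len(x) - 2)
    h = x[i+1] - x[i]
    t = (xq - x[i]) / h
    return ((2*t**3 - 3*t**2 + 1) * y[i] + (t**3 - 2*t**2 + t) * h * m[i]
            + (-2*t**3 + 3*t**2) * y[i+1] + (t**3 - t**2) * h * m[i+1])
```

The harmonic-mean slope never exceeds twice the smaller secant slope, which satisfies the classical sufficient condition for monotonicity on each interval.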

  9. Reproducible research in vadose zone sciences

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  10. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open-source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  11. How reproducible is the acoustical characterization of porous media?

    PubMed

    Pompoli, Francesco; Bonfiglio, Paolo; Horoshenkov, Kirill V; Khan, Amir; Jaouen, Luc; Bécot, François-Xavier; Sgard, Franck; Asdrubali, Francesco; D'Alessandro, Francesco; Hübelt, Jörn; Atalla, Noureddine; Amédin, Celse K; Lauriks, Walter; Boeckx, Laurens

    2017-02-01

    There is a considerable number of research publications on the characterization of porous media carried out in accordance with ISO 10534-2 (International Standards Organization, Geneva, Switzerland, 2001) and/or ISO 9053 (International Standards Organization, Geneva, Switzerland, 1991). According to the Web of Science(TM) (last accessed 22 September 2016) there were 339 publications in the Journal of the Acoustical Society of America alone which deal with the acoustics of porous media. However, the reproducibility of these characterization procedures is not well understood. This paper deals with the reproducibility of some standard characterization procedures for acoustic porous materials. The paper is an extension of the work published by Horoshenkov, Khan, Bécot, Jaouen, Sgard, Renault, Amirouche, Pompoli, Prodi, Bonfiglio, Pispola, Asdrubali, Hübelt, Atalla, Amédin, Lauriks, and Boeckx [J. Acoust. Soc. Am. 122(1), 345-353 (2007)]. In this paper, independent laboratory measurements were performed on the same material specimens so that the naturally occurring inhomogeneity between materials was controlled. It also presents reproducibility data for the characteristic impedance, the complex wavenumber, and some related pore structure properties. This work can help to better understand the tolerances of these material characterization procedures so that improvements can be developed to reduce experimental errors and improve reproducibility between laboratories.

  12. A synthesis approach for reproducing the response of aircraft panels to a turbulent boundary layer excitation.

    PubMed

    Bravo, Teresa; Maury, Cédric

    2011-01-01

    Random wall-pressure fluctuations due to the turbulent boundary layer (TBL) are a feature of the air flow over an aircraft fuselage under cruise conditions, creating undesirable effects such as cabin noise annoyance. In order to test potential solutions to reduce the TBL-induced noise, a cost-efficient alternative to in-flight or wind-tunnel measurements involves the laboratory simulation of the response of aircraft sidewalls to high-speed subsonic TBL excitation. Previously published work has shown that TBL simulation using a near-field array of loudspeakers is only feasible in the low frequency range due to the rapid decay of the spanwise correlation length with frequency. This paper demonstrates through theoretical criteria how the wavenumber filtering capabilities of the radiating panel reduces the number of sources required, thus dramatically enlarging the frequency range over which the response of the TBL-excited panel is accurately reproduced. Experimental synthesis of the panel response to high-speed TBL excitation is found to be feasible over the hydrodynamic coincidence frequency range using a reduced set of near-field loudspeakers driven by optimal signals. Effective methodologies are proposed for an accurate reproduction of the TBL-induced sound power radiated by the panel into a free-field and when coupled to a cavity.

  13. Relevance relations for the concept of reproducibility

    PubMed Central

    Atmanspacher, H.; Bezzola Lambert, L.; Folkers, G.; Schubiger, P. A.

    2014-01-01

    The concept of reproducibility is widely considered a cornerstone of scientific methodology. However, recent problems with the reproducibility of empirical results in large-scale systems and in biomedical research have cast doubts on its universal and rigid applicability beyond the so-called basic sciences. Reproducibility is a particularly difficult issue in interdisciplinary work where the results to be reproduced typically refer to different levels of description of the system considered. In such cases, it is mandatory to distinguish between more and less relevant features, attributes or observables of the system, depending on the level at which they are described. For this reason, we propose a scheme for a general ‘relation of relevance’ between the level of complexity at which a system is considered and the granularity of its description. This relation implies relevance criteria for particular selected aspects of a system and its description, which can be operationally implemented by an interlevel relation called ‘contextual emergence’. It yields a formally sound and empirically applicable procedure to translate between descriptive levels and thus construct level-specific criteria for reproducibility in an overall consistent fashion. Relevance relations merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, our proposal is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments. PMID:24554574

  14. Accurate calculated optical properties of substituted quaterphenylene nanofibers.

    PubMed

    Finnerty, Justin J; Koch, Rainer

    2010-01-14

    The accurate prediction of both excitation and emission energies of substituted p-quaterphenylenes using a variety of established and newly developed density functional methods is evaluated and compared against experimental data, both from single molecules and from nanofibers. For calculation of the UV-vis excitation the MPW1K functional is the best performing method (with the employed TZVP basis set). After a linear scaling factor is applied, mPW2-PLYP, CIS and the very fast INDO/S also reproduce the experimental data correctly. For the fluorescence relaxation energies MPW1K, mPW2-PLYP, and INDO/S give good results, even without scaling. However, mPW2-PLYP involves second-order perturbation to introduce nonlocal electron correlation and therefore requires significantly more resources, so the recommended level of theory for a single methodology to investigate the optical properties of substituted phenylenes and related systems is MPW1K/6-311+G(2d,p), followed by INDO/S as a low-cost alternative. As an extension of a previous work on predicting first hyperpolarisabilities, we can now demonstrate that the chosen approach (HF/6-31G(d)//B3LYP/6-31G(d)) produces data that correlate well with the susceptibilities derived from measurements on nanofibers.

  15. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  16. Tractography of the optic radiation: a repeatability and reproducibility study.

    PubMed

    Dayan, Michael; Kreutzer, Sylvia; Clark, Chris A

    2015-04-01

    Our main objective was to evaluate the repeatability and reproducibility of optic radiation (OR) reconstruction from diffusion MRI (dMRI) data. 14 adults were scanned twice with the same 60-direction dMRI sequence. Peaks in the diffusion profile were estimated with the single tensor (ST), Q-ball (QSH) and persistent angular structure (PAS) methods. Segmentation of the OR was performed by two experimenters with probabilistic tractography based on a manually drawn region-of-interest (ROI) protocol typically employed for OR segmentation, with both standard and extended sets of ROIs. The repeatability and reproducibility were assessed by calculating the intra-class correlation coefficient (ICC) of intra- and inter-rater experiments, respectively. ICCs were calculated for commonly used dMRI metrics (FA, MD, AD, RD) and anatomical dimensions of the optic radiation (distance from Meyer's loop to the temporal pole, ML-TP), as well as the Dice similarity coefficient (DSC) between the raters' OR segmentation. Bland-Altman plots were also calculated to investigate bias and variability in the reproducibility measurements. The OR was successfully reconstructed in all subjects by both raters. The ICC was found to be in the good to excellent range for both repeatability and reproducibility of the dMRI metrics, DSC and ML-TP distance. The Bland-Altman plots did not show any apparent systematic bias for any quantities. Overall, higher ICC values were found for the multi-fiber methods, QSH and PAS, and for the standard set of ROIs. Considering the good to excellent repeatability and reproducibility of all the quantities investigated, these findings support the use of multi-fiber OR reconstruction with a limited number of manually drawn ROIs in clinical applications utilizing either OR microstructure characterization or OR dimensions, as is the case in neurosurgical planning for temporal lobectomy.
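The ICC values reported above can be reproduced with a generic two-way random-effects formulation. A sketch assuming the single-measure, absolute-agreement form ICC(2,1) of Shrout and Fleiss, which the abstract does not specify:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure, for an (n_subjects, k_raters) array. Computed from the
    two-way ANOVA mean squares."""
    n, k = ratings.shape
    grand = ratings.mean()
    ms_r = k * ((ratings.mean(axis=1) - grand)**2).sum() / (n - 1)  # subjects
    ms_c = n * ((ratings.mean(axis=0) - grand)**2).sum() / (k - 1)  # raters
    resid = (ratings - ratings.mean(axis=1, keepdims=True)
             - ratings.mean(axis=0, keepdims=True) + grand)
    ms_e = (resid**2).sum() / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1)*ms_e + k*(ms_c - ms_e)/n)
```

Because ICC(2,1) measures absolute agreement, a constant offset between raters lowers the score even when their rankings agree perfectly.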

  17. Highly reproducible SERS arrays directly written by inkjet printing

    NASA Astrophysics Data System (ADS)

    Yang, Qiang; Deng, Mengmeng; Li, Huizeng; Li, Mingzhu; Zhang, Cong; Shen, Weizhi; Li, Yanan; Guo, Dan; Song, Yanlin

    2014-12-01

    SERS arrays with uniform gold nanoparticle distribution were fabricated by direct-writing with an inkjet printing method. Quantitative analysis based on Raman detection was achieved with a small standard statistical deviation of less than 4% for the reproducibility and less than 5% for the long-term stability for 12 weeks. Electronic supplementary information (ESI) available: additional information on the experimental details, gold nanoparticle characterization, and theoretical calculation for the diameters of the contact area of droplets on substrates with different contact angles. See DOI: 10.1039/c4nr04656k

  18. Reproducibility of graph metrics in FMRI networks.

    PubMed

    Telesford, Qawi K; Morgan, Ashley R; Hayasaka, Satoru; Simpson, Sean L; Barret, William; Kraft, Robert A; Mozolic, Jennifer L; Laurienti, Paul J

    2010-01-01

    The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency, and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
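The graph metrics named above are straightforward to compute from an adjacency matrix; a small self-contained illustration for unweighted, undirected graphs (not the authors' analysis pipeline):

```python
import numpy as np
from itertools import combinations
from collections import deque

def clustering_coefficient(A):
    """Mean local clustering coefficient of a 0/1 adjacency matrix:
    fraction of a node's neighbor pairs that are themselves connected."""
    n = len(A)
    cc = []
    for i in range(n):
        nbrs = np.flatnonzero(A[i])
        k = len(nbrs)
        if k < 2:
            cc.append(0.0)
            continue
        links = sum(A[u][v] for u, v in combinations(nbrs, 2))
        cc.append(2 * links / (k * (k - 1)))
    return float(np.mean(cc))

def char_path_length(A):
    """Characteristic path length: mean BFS shortest-path length
    over all reachable node pairs."""
    n = len(A)
    total = cnt = 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.flatnonzero(A[u]):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for t, d in dist.items() if t != s)
        cnt += len(dist) - 1
    return total / cnt
```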

  19. Composting in small laboratory pilots: Performance and reproducibility

    SciTech Connect

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-02-15

    Highlights:
    - We design an innovative small-scale composting device including six 4-l reactors.
    - We investigate the performance and reproducibility of composting on a small scale.
    - Thermophilic conditions are established by self-heating in all replicates.
    - Biochemical transformations, organic matter losses and stabilisation are realistic.
    - The organic matter evolution exhibits good reproducibility for all six replicates.

    Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost.

  20. Composting in small laboratory pilots: performance and reproducibility.

    PubMed

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.
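The reproducibility statistic used above, the coefficient of variation across the six replicate reactors, is simple to compute; a sketch (the sample standard deviation, ddof=1, is an assumption):

```python
import numpy as np

def cv_percent(replicates):
    """Coefficient of variation (%) across replicate measurements:
    100 * sample standard deviation / mean."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()
```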

  1. Design Procedure and Fabrication of Reproducible Silicon Vernier Devices for High-Performance Refractive Index Sensing

    PubMed Central

    Troia, Benedetto; Khokhar, Ali Z.; Nedeljkovic, Milos; Reynolds, Scott A.; Hu, Youfang; Mashanovich, Goran Z.; Passaro, Vittorio M. N.

    2015-01-01

    In this paper, we propose a generalized procedure for the design of integrated Vernier devices for high performance chemical and biochemical sensing. In particular, we demonstrate the accurate control of the most critical design and fabrication parameters of silicon-on-insulator cascade-coupled racetrack resonators operating in the second regime of the Vernier effect, around 1.55 μm. The experimental implementation of our design strategies has allowed a rigorous and reliable investigation of the influence of racetrack resonator and directional coupler dimensions as well as of waveguide process variability on the operation of Vernier devices. Figures of merit of our Vernier architectures have been measured experimentally, evidencing a high reproducibility and a very good agreement with the theoretical predictions, as also confirmed by relative errors even lower than 1%. Finally, a Vernier gain as high as 30.3, average insertion loss of 2.1 dB and extinction ratio up to 30 dB have been achieved. PMID:26067193

  2. Design Procedure and Fabrication of Reproducible Silicon Vernier Devices for High-Performance Refractive Index Sensing.

    PubMed

    Troia, Benedetto; Khokhar, Ali Z; Nedeljkovic, Milos; Reynolds, Scott A; Hu, Youfang; Mashanovich, Goran Z; Passaro, Vittorio M N

    2015-06-10

    In this paper, we propose a generalized procedure for the design of integrated Vernier devices for high performance chemical and biochemical sensing. In particular, we demonstrate the accurate control of the most critical design and fabrication parameters of silicon-on-insulator cascade-coupled racetrack resonators operating in the second regime of the Vernier effect, around 1.55 μm. The experimental implementation of our design strategies has allowed a rigorous and reliable investigation of the influence of racetrack resonator and directional coupler dimensions as well as of waveguide process variability on the operation of Vernier devices. Figures of merit of our Vernier architectures have been measured experimentally, evidencing a high reproducibility and a very good agreement with the theoretical predictions, as also confirmed by relative errors even lower than 1%. Finally, a Vernier gain as high as 30.3, average insertion loss of 2.1 dB and extinction ratio up to 30 dB have been achieved.
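The Vernier gain reported above measures how much the cascaded pair amplifies the spectral shift of the sensing resonator relative to a single ring. A sketch of the generic textbook relation G = FSR_ref / |FSR_ref - FSR_filter| (the example FSRs below are invented to match the reported gain of 30.3; they are not the paper's values):

```python
def vernier_gain(fsr_filter, fsr_ref):
    """Vernier gain of two cascaded resonators with slightly
    detuned free spectral ranges (same units for both FSRs)."""
    return fsr_ref / abs(fsr_ref - fsr_filter)
```

The closer the two FSRs, the larger the gain, at the cost of a wider envelope period that must still fit within the measurable spectrum.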

  3. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Ad

  4. Making Early Modern Medicine: Reproducing Swedish Bitters.

    PubMed

    Ahnfelt, Nils-Otto; Fors, Hjalmar

    2016-05-01

    Historians of science and medicine have rarely applied themselves to reproducing the experiments and practices of medicine and pharmacy. This paper delineates our efforts to reproduce "Swedish Bitters," an early modern composite medicine in wide European use from the 1730s to the present. In its original formulation, it was made from seven medicinal simples: aloe, rhubarb, saffron, myrrh, gentian, zedoary and agarikon. These were mixed in alcohol together with some theriac, a composite medicine of classical origin. The paper delineates the compositional history of Swedish Bitters and the medical rationale underlying its composition. It also describes how we go about to reproduce the medicine in a laboratory using early modern pharmaceutical methods, and analyse it using contemporary methods of pharmaceutical chemistry. Our aim is twofold: first, to show how reproducing medicines may provide a path towards a deeper understanding of the role of sensual and practical knowledge in the wider context of early modern medical culture; and second, how it may yield interesting results from the point of view of contemporary pharmaceutical science.

  5. Natural Disasters: Earth Science Readings. Reproducibles.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    Natural Disasters is a reproducible teacher book that explains what scientists believe to be the causes of a variety of natural disasters and suggests steps that teachers and students can take to be better prepared in the event of a natural disaster. It contains both student and teacher sections. Teacher sections include vocabulary, an answer key,…

  6. Europe Today: An Atlas of Reproducible Pages.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    Illustrative black and white maps, tables, and graphs designed for clear reproducibility depict Europe's size, population, resources, commodities, trade, cities, schooling, jobs, energy, industry, demographic statistics, food, and agriculture. Also included are 33 United States Department of State individual country maps. This volume is intended…

  7. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

    Surface color measurement is of importance in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and thus distinguish millions of colors. This 0.5 unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but if we also want to know the exact color coordinate values, accuracy problems arise. The readings of two instruments can be astonishingly different. The accuracy of the instrument used in color measurement may depend on various errors such as photometric non-linearity, wavelength error, integrating sphere dark level error, and integrating sphere error in both specular-included and specular-excluded modes. Thus correction formulas should be used to get more accurate results. Another question is how many channels, i.e. wavelengths, we use to measure a spectrum. It is obvious that the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise between measuring time, conditions and cost. Sometimes we have to use a portable system, or the shape and size of the samples make it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers are compared. In the paper we explain the typical error sources of spectral color measurements and show the accuracy demands a good colorimeter should meet.
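
The 0.5-CIELAB-unit just-noticeable difference cited above is conventionally computed with the CIE76 color-difference formula, a plain Euclidean distance in L*a*b* space. A small sketch (the tile readings below are invented for illustration):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference between two CIELAB triplets (L*, a*, b*)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

JND = 0.5  # approximate just-noticeable difference cited in the abstract

# Hypothetical readings of the same tile on two instruments
tile_instrument_a = (52.3, 10.1, -4.2)
tile_instrument_b = (52.6, 10.4, -4.0)
diff = delta_e_ab(tile_instrument_a, tile_instrument_b)
visible = diff > JND  # would an observer notice the disagreement?
```

Later, more perceptually uniform formulas (CIE94, CIEDE2000) refine this distance, but the 1976 form is the usual baseline for instrument comparisons.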

  8. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  9. Ellipsoidal optical reflectors reproduced by electroforming

    NASA Technical Reports Server (NTRS)

    Hungerford, W. J.; Larmer, J. W.; Levinsohn, M.

    1964-01-01

    An accurately dimensioned convex ellipsoidal surface, which will become a master after polishing, is fabricated from 316L stainless steel. When polishing of the master is completed, it is suspended in a modified Watts bath for the electroforming of nickel reflectors.

  10. Problems in publishing accurate color in IEEE journals.

    PubMed

    Vrhel, Michael J; Trussell, H J

    2002-01-01

    To demonstrate the performance of color image processing algorithms, it is desirable to be able to accurately display color images in archival publications. In poster presentations, the authors have substantial control of the printing process, although little control of the illumination. For journal publication, the authors must rely on professional intermediaries (printers) to accurately reproduce their results. Our previous work describes requirements for accurately rendering images using your own equipment. This paper discusses the problems of dealing with intermediaries and offers suggestions for improved communication and rendering.

  11. Data Identifiers and Citations Enable Reproducible Science

    NASA Astrophysics Data System (ADS)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data don't always reference and cite their data sources adequately enough for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper will describe some of the best practices in data identification, referencing and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it will explore issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It will describe the difference between granule- and collection-level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could have effects on their usage of the data. When researchers cite the data in their publications, others can retrieve the precise data used in the research and reproduce the analyses and experiments to confirm the results. Describing the experiment in sufficient detail to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers depending on that research.
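
Granule-level identifiers of the kind discussed above can be made deterministic, so that identical inputs always map to the same ID while a reprocessing with a new version yields a new one. A sketch using name-based UUIDs; the mission namespace, collection name, and versions below are hypothetical:

```python
import uuid

# Hypothetical namespace for a remote-sensing mission's data products
MISSION_NS = uuid.uuid5(uuid.NAMESPACE_URL, "https://example.org/mission-x")

def granule_id(collection, version, granule_name):
    """Deterministic granule-level identifier: the same inputs always
    yield the same UUID, so a reprocessing shows up as an ID change."""
    return uuid.uuid5(MISSION_NS, f"{collection}/{version}/{granule_name}")

a = granule_id("L2-SST", "v3.1", "2011-06-01T00:00:00Z")
b = granule_id("L2-SST", "v3.1", "2011-06-01T00:00:00Z")  # same product
c = granule_id("L2-SST", "v3.2", "2011-06-01T00:00:00Z")  # reprocessed version
```

In practice such UUIDs would identify individual granules, while a resolvable DOI would be assigned at the collection level for citation.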

  12. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
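
Histogram equalization of the kind used in the IDL enhancements is a fully deterministic mapping, which is exactly what makes it reproducible where interactive Photoshop adjustments are not. A minimal NumPy sketch on a synthetic low-contrast image (this is the textbook global algorithm, not the IDL implementation):

```python
import numpy as np

def equalize(img):
    """Global histogram equalization for an 8-bit grayscale image.
    Deterministic: the same input always produces the same output."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[np.nonzero(cdf)][0]
    # classic CDF remapping to span the full 0..255 range
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(0)
# synthetic low-contrast "latent print": values squeezed into 100..140
low_contrast = rng.integers(100, 141, size=(64, 64), dtype=np.uint8)
enhanced = equalize(low_contrast)
```

Rerunning `equalize` on the same input reproduces the same enhancement bit for bit, so the result can serve as an audit-trail reference image.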

  13. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  14. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3%, against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.

  15. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments

    PubMed Central

    Qi, Yuan; Liu, Xiuping; Liu, Chang-gong; Wang, Bailing; Hess, Kenneth R.; Symmans, W. Fraser; Shi, Weiwei; Pusztai, Lajos

    2015-01-01

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics the reproducibility of SNV calls was around 80% suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826. PMID:26136146
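
Concordance of SNV calls between replicate runs, as reported above, can be expressed as the overlap of call sets keyed by position and allele. A sketch with hypothetical calls (not the study's data):

```python
def snv_concordance(calls_rep1, calls_rep2):
    """Fraction of SNV calls shared by two replicate runs, relative to
    the union of all calls (a simple Jaccard-style concordance)."""
    a, b = set(calls_rep1), set(calls_rep2)
    if not (a | b):
        return 1.0  # no calls in either replicate: vacuously concordant
    return len(a & b) / len(a | b)

# Hypothetical calls keyed by (chromosome, position, alt allele)
rep1 = {("chr1", 115256529, "T"), ("chr7", 55249071, "T"), ("chr17", 7577120, "A")}
rep2 = {("chr1", 115256529, "T"), ("chr7", 55249071, "T"), ("chr10", 89692905, "G")}
concordance = snv_concordance(rep1, rep2)  # 2 shared out of 4 distinct calls
```

In a real pipeline each call would also carry the QC metrics the study identifies as determinants of concordance (coverage, VAC, VAF, call quality), so that concordance can be stratified by stringency.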

  16. Accurate Evaluation of the Dispersion Energy in the Simulation of Gas Adsorption into Porous Zeolites.

    PubMed

    Fraccarollo, Alberto; Canti, Lorenzo; Marchese, Leonardo; Cossi, Maurizio

    2017-03-07

    The force fields used to simulate the gas adsorption in porous materials are strongly dominated by the van der Waals (vdW) terms. Here we discuss the delicate problem of estimating these terms accurately, analyzing the effect of different models. To this end, we simulated the physisorption of CH4, CO2, and Ar into various Al-free microporous zeolites (ITQ-29, SSZ-13, and silicalite-1), comparing the theoretical results with accurate experimental isotherms. The vdW terms in the force fields were parametrized against the free gas densities and high-level quantum mechanical (QM) calculations, comparing different methods to evaluate the dispersion energies. In particular, MP2 and DFT with semiempirical corrections, with suitable basis sets, were chosen to approximate the best QM calculations; either Lennard-Jones or Morse expressions were used to include the vdW terms in the force fields. The comparison of the simulated and experimental isotherms revealed that a strong interplay exists between the definition of the dispersion energies and the functional form used in the force field; these results are fairly general and reproducible, at least for the systems considered here. On this basis, the reliability of different models can be discussed, and a recipe can be provided to obtain accurate simulated adsorption isotherms.
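
The two functional forms compared above for the vdW terms can be written down directly; each has a single well whose depth and position are fitted to the dispersion energies. A sketch with illustrative parameters (the values below are hypothetical, not the paper's fitted force field):

```python
import math

def lennard_jones(r, epsilon, sigma):
    """12-6 Lennard-Jones pair energy; minimum of -epsilon at 2^(1/6)*sigma."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

def morse(r, depth, alpha, r_eq):
    """Morse pair energy; minimum of -depth at r_eq, width set by alpha."""
    x = math.exp(-alpha * (r - r_eq))
    return depth * (x * x - 2.0 * x)

# Hypothetical gas--framework parameters (energies in kJ/mol, distances in A)
eps, sig = 1.2, 3.4
r_min_lj = 2 ** (1 / 6) * sig            # location of the LJ minimum
e_min_lj = lennard_jones(r_min_lj, eps, sig)
e_min_morse = morse(3.8, 1.2, 1.7, 3.8)  # Morse evaluated at its own minimum
```

The Morse form has an extra parameter (alpha), which decouples well width from well depth; the abstract's point is that the choice between these forms interacts strongly with how the dispersion energies themselves are defined.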

  17. A highly accurate ab initio potential energy surface for methane

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2016-09-01

    A new nine-dimensional potential energy surface (PES) for methane has been generated using state-of-the-art ab initio theory. The PES is based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set limit and incorporates a range of higher-level additive energy corrections. These include core-valence electron correlation, higher-order coupled cluster terms beyond perturbative triples, scalar relativistic effects, and the diagonal Born-Oppenheimer correction. Sub-wavenumber accuracy is achieved for the majority of experimentally known vibrational energy levels with the four fundamentals of 12CH4 reproduced with a root-mean-square error of 0.70 cm-1. The computed ab initio equilibrium C-H bond length is in excellent agreement with previous values despite pure rotational energies displaying minor systematic errors as J (rotational excitation) increases. It is shown that these errors can be significantly reduced by adjusting the equilibrium geometry. The PES represents the most accurate ab initio surface to date and will serve as a good starting point for empirical refinement.

  18. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

    Current methodologies for protein quantitation include 2-dimensional gel electrophoresis techniques, metabolic labeling, and stable isotope labeling methods, to name only a few. The current literature illustrates both pros and cons for each of the previously mentioned methodologies. In keeping with the teachings of William of Ockham, "with all things being equal the simplest solution tends to be correct", a simple LC/MS-based methodology is presented that allows relative changes in abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separations system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, the quantitative comparison of tens of thousands of ions emanating from identically prepared control and experimental samples can be made. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI-mass spectrometry, one typically obtains under 5 ppm mass precision and quantitative variations between 10 and 15%. The principal focus of this paper is to demonstrate the quantitative aspects of the methodology, continuing with a discussion of the associated, complementary qualitative capabilities.
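
Matching ions between control and experimental runs "solely by accurate mass and retention time" reduces to tolerance windows on both axes; the sub-5-ppm mass precision quoted above is what makes a tight ppm window workable. A sketch with invented feature values and tolerances:

```python
def ppm_error(observed_mz, reference_mz):
    """Relative mass error in parts per million."""
    return (observed_mz - reference_mz) / reference_mz * 1e6

def match_amrt(ion, reference, ppm_tol=5.0, rt_tol_min=0.25):
    """Match an (m/z, retention time) pair against a reference feature
    using accurate-mass and retention-time windows (both hypothetical)."""
    mz, rt = ion
    ref_mz, ref_rt = reference
    return abs(ppm_error(mz, ref_mz)) <= ppm_tol and abs(rt - ref_rt) <= rt_tol_min

# Hypothetical peptide feature: m/z 785.8421 eluting at 42.10 min
ref = (785.8421, 42.10)
hit = match_amrt((785.8446, 42.05), ref)   # ~3.2 ppm, 0.05 min off
miss = match_amrt((785.8600, 42.05), ref)  # ~22.8 ppm: outside the window
```

Relative quantitation then compares the intensities of features matched this way across the two conditions.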

  19. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    …for inclusion in standard atmospheric and planetary spectroscopic databases. The methods used to compute the ab initio potential energy and dipole moment surfaces included minor corrections to the equilibrium S-O distance, which produced good agreement with experimentally determined rotational energies. However, the purely ab initio method was not able to reproduce an equally spectroscopically accurate representation of vibrational motion. We therefore present an empirical refinement to this original, ab initio potential surface, based on the experimental data available. This will not only be used to reproduce the room-temperature spectrum to a greater degree of accuracy, but is essential in the production of a larger, accurate line list necessary for the simulation of higher temperature spectra: we aim for coverage suitable for T ≤ 800 K. Our preliminary studies on SO3 have also shown it to exhibit an interesting "forbidden" rotational spectrum and "clustering" of rotational states; to our knowledge this phenomenon has not been observed in other examples of trigonal planar molecules and is also an investigative avenue we wish to pursue. Finally, the IR absorption bands for SO2 and SO3 exhibit a strong overlap, and the inclusion of SO2 as a complement to our studies is something that we will be interested in doing in the near future.

  20. Reproducibility and Validity of a Handheld Spirometer

    PubMed Central

    Barr, R Graham; Stemple, Kimberly J.; Mesia-Vela, Sonia; Basner, Robert C.; Derk, Susan; Henneberger, Paul; Milton, Donald K; Taveras, Brenda

    2013-01-01

    Background Handheld spirometers have several advantages over desktop spirometers but worries persist regarding their reproducibility and validity. We undertook an independent examination of an ultrasonic flow-sensing handheld spirometer. Methods Laboratory methods included reproducibility and validity testing using a waveform generator with standard American Thoracic Society (ATS) waveforms, in-line testing, calibration adaptor testing, and compression of the mouthpiece. Clinical testing involved repeated testing of 24 spirometry-naive volunteers and comparison to a volume-sensing dry rolling seal spirometer. Results The EasyOne Diagnostic spirometer exceeded standard thresholds of acceptability for ATS waveforms. In-line testing yielded valid results with relative differences (mean ± SD) between the EasyOne and the reference spirometer for the forced vital capacity (FVC) of 0.03±0.23 L and the forced expiratory volume in one second (FEV1) of −0.06±0.09 L. The calibration adaptor showed no appreciable problems, but extreme compression of the mouthpiece reduced measures. In clinical testing, coefficients of variation and limits of agreement were, respectively: 3.3% and 0.24 L for the FVC; 2.6% and 0.18 L for the FEV1; and 1.9% and 0.05 for the FEV1/FVC ratio. The EasyOne yielded lower values than the reference spirometry (FVC: −0.12 L; FEV1: −0.17 L; FEV1/FVC ratio: −0.02). Limits of agreement were within criteria for FVC but not for the FEV1, possibly due to a training effect. Conclusion The EasyOne spirometer yielded generally reproducible results that were generally valid compared to laboratory-based spirometry. The use of this handheld spirometer in clinical, occupational and research settings seems justified. PMID:18364054
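
The reproducibility and validity statistics reported above, coefficients of variation and Bland-Altman limits of agreement, have compact definitions. A sketch with hypothetical paired FEV1 readings, not the study's measurements:

```python
import statistics

def coefficient_of_variation(values):
    """Coefficient of variation as a percentage: sample SD over mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

def limits_of_agreement(device_a, device_b):
    """Bland-Altman mean bias and 95% limits of agreement
    between paired readings from two devices."""
    diffs = [a - b for a, b in zip(device_a, device_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical FEV1 readings (litres): handheld vs reference spirometer
handheld = [3.10, 3.45, 2.80, 3.90, 3.20]
reference = [3.25, 3.55, 2.95, 4.00, 3.35]
bias, (low, high) = limits_of_agreement(handheld, reference)
```

A negative bias here mirrors the abstract's finding that the handheld device read systematically lower than the laboratory spirometer.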

  1. Queer nuclear families? Reproducing and transgressing heteronormativity.

    PubMed

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves-how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  2. Towards reproducible, scalable lateral molecular electronic devices

    SciTech Connect

    Durkan, Colm Zhang, Qian

    2014-08-25

    An approach to reproducibly fabricate molecular electronic devices is presented. Lateral nanometer-scale gaps with high yield are formed in Au/Pd nanowires by a combination of electromigration and Joule-heating-induced thermomechanical stress. The resulting nanogap devices are used to measure the electrical properties of small numbers of two different molecular species with different end-groups, namely 1,4-butane dithiol and 1,5-diamino-2-methylpentane. Fluctuations in the current reveal that in the case of the dithiol molecule devices, individual molecules conduct intermittently, with the fluctuations becoming more pronounced at larger biases.

  3. Open and reproducible global land use classification

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as geo-enabled OGC Web Processing Service (WPS) processes. The interoperable interface for calling the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and…

  4. Reproducibility and reliability of fetal cardiac time intervals using magnetocardiography.

    PubMed

    van Leeuwen, P; Lange, S; Klein, A; Geue, D; Zhang, Y; Krause, H J; Grönemeyer, D

    2004-04-01

    We investigated several factors which may affect the accuracy of fetal cardiac time intervals (CTI) determined in magnetocardiographic (MCG) recordings: observer differences, the number of available recording sites and the type of sensor used in acquisition. In 253 fetal MCG recordings, acquired using different biomagnetometer devices between the 15th and 42nd weeks of gestation, P-wave, QRS complex and T-wave onsets and ends were identified in signal averaged data sets independently by different observers. Using a defined procedure for setting signal events, interobserver reliability was high. Increasing the number of registration sites led to more accurate identification of the events. The differences in wave morphology between magnetometer and gradiometer configurations led to deviations in timing whereas the differences between low and high temperature devices seemed to be primarily due to noise. Signal-to-noise ratio played an important overall role in the accurate determination of CTI and changes in signal amplitude associated with fetal maturation may largely explain the effects of gestational age on reproducibility. As fetal CTI may be of value in the identification of pathologies such as intrauterine growth retardation or fetal cardiac hypertrophy, their reliable estimation will be enhanced by strategies which take these factors into account.

  5. Reproducibility and reusability of scientific software

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2017-01-01

    Information science and technology has become an integral part of astronomy research, and due to the consistent growth in the size and impact of astronomical databases, that trend is bound to continue. While software is a vital part of information systems and data analysis processes, in many cases the importance of the software, and the standards of reporting on the use of source code, have not yet been elevated in the scientific communication process to the same level as other parts of the research. The purpose of this discussion is to examine the role of software in the scientific communication process in the light of transparency, reproducibility, and reusability of the research, as well as to discuss software in astronomy in comparison to other disciplines.

  6. Is Isolated Nocturnal Hypertension A Reproducible Phenotype?

    PubMed Central

    Goldsmith, Jeff; Muntner, Paul; Diaz, Keith M.; Reynolds, Kristi; Schwartz, Joseph E.; Shimbo, Daichi

    2016-01-01

    BACKGROUND Isolated nocturnal hypertension (INH), defined as nocturnal without daytime hypertension on ambulatory blood pressure (BP) monitoring (ABPM), has been observed to be associated with an increased risk of cardiovascular disease (CVD) events and mortality. The aim of this study was to determine the short-term reproducibility of INH. METHODS The Improving the Detection of Hypertension Study enrolled a community-based sample of adults (N = 282) in upper Manhattan without CVD, renal failure, or treated hypertension. Each participant completed two 24-hour ABPM recordings (ABPM1: first recording and ABPM2: second recording) with a mean ± SD time interval of 33±17 days between recordings. Daytime hypertension was defined as mean awake systolic/diastolic BP ≥ 135/85mm Hg; nocturnal hypertension as mean sleep systolic/diastolic BP ≥ 120/70mm Hg; INH as nocturnal without daytime hypertension; isolated daytime hypertension (IDH) as daytime without nocturnal hypertension; day and night hypertension (DNH) as daytime and nocturnal hypertension, and any ambulatory hypertension as having daytime and/or nocturnal hypertension. RESULTS On ABPM1, 26 (9.2%), 21 (7.4%), and 50 (17.7%) participants had INH, IDH, and DNH, respectively. On ABPM2, 24 (8.5%), 19 (6.7%), and 54 (19.1%) had INH, IDH, and DNH, respectively. The kappa statistics were 0.21 (95% confidence interval (CI) 0.04–0.38), 0.25 (95% CI 0.06–0.44), and 0.65 (95% CI 0.53–0.77) for INH, IDH, and DNH respectively; and 0.72 (95% CI 0.63–0.81) for having any ambulatory hypertension. CONCLUSIONS Our results suggest that INH and IDH are poorly reproducible phenotypes, and that ABPM should be primarily used to identify individuals with daytime hypertension and/or nocturnal hypertension. PMID:25904648
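
The kappa statistics quoted above measure agreement between the two ABPM recordings beyond what chance alone would produce. A minimal sketch of Cohen's kappa with hypothetical INH classifications for ten subjects (not the study's data):

```python
def cohens_kappa(labels1, labels2):
    """Cohen's kappa for two classifications of the same subjects,
    e.g. INH present/absent on ABPM1 vs ABPM2."""
    n = len(labels1)
    categories = set(labels1) | set(labels2)
    p_obs = sum(a == b for a, b in zip(labels1, labels2)) / n
    p_exp = sum(
        (labels1.count(c) / n) * (labels2.count(c) / n) for c in categories
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical INH status (1 = present) on two recordings of ten subjects
abpm1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
abpm2 = [1, 0, 0, 0, 0, 1, 0, 1, 0, 0]
kappa = cohens_kappa(abpm1, abpm2)
```

By the usual reading, values near the study's 0.21 indicate only fair agreement, while 0.65 and above (as for day-and-night hypertension) indicate substantial agreement.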

  7. Is Grannum grading of the placenta reproducible?

    NASA Astrophysics Data System (ADS)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.

  8. Datathons and Software to Promote Reproducible Research

    PubMed Central

    2016-01-01

    Background Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. Objective We report on an open-source software called Chatto that was created by members of our group, in the context of the second international Critical Care Datathon, held in September 2015. Methods Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Results Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes—a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. Conclusions This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility. PMID:27558834

  9. Ranking and averaging independent component analysis by reproducibility (RAICAR).

    PubMed

    Yang, Zhi; LaConte, Stephen; Weng, Xuchu; Hu, Xiaoping

    2008-06-01

    Independent component analysis (ICA) is a data-driven approach that has exhibited great utility for functional magnetic resonance imaging (fMRI). Standard ICA implementations, however, do not provide the number and relative importance of the resulting components. In addition, ICA algorithms utilizing gradient-based optimization give decompositions that are dependent on initialization values, which can lead to dramatically different results. In this work, a new method, RAICAR (Ranking and Averaging Independent Component Analysis by Reproducibility), is introduced to address these issues for spatial ICA applied to fMRI. RAICAR utilizes repeated ICA realizations and relies on the reproducibility between them to rank and select components. Different realizations are aligned based on correlations, leading to aligned components. Each component is ranked and thresholded based on between-realization correlations. Furthermore, different realizations of each aligned component are selectively averaged to generate the final estimate of the given component. Reliability and accuracy of this method are demonstrated with both simulated and experimental fMRI data.
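    The core of the RAICAR idea, ranking components by their between-realization correlations, can be illustrated with a minimal sketch (hypothetical `raicar_rank` helper; the published method also aligns and selectively averages components, which is omitted here):

```python
import numpy as np

def raicar_rank(realizations):
    """Rank components by cross-realization reproducibility.

    realizations: list of (n_components, n_voxels) arrays, one per ICA run.
    Returns component indices of the first realization sorted by the mean
    absolute correlation with their best-matching counterparts in the other
    runs (a simplified sketch of the reproducibility ranking).
    """
    ref = realizations[0]
    scores = []
    for comp in ref:
        # best-match |correlation| of this component in each other realization
        best = [
            max(abs(np.corrcoef(comp, other)[0, 1]) for other in real)
            for real in realizations[1:]
        ]
        scores.append(np.mean(best))
    order = np.argsort(scores)[::-1]  # most reproducible first
    return order, np.array(scores)[order]
```

    A component shared across realizations scores near 1, while run-specific noise components score near 0 and are ranked last.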

  10. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically, bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method with a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and helium methods were attempts to improve fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with that of the conventional water method, 16 cancellous bone specimens were obtained from the femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
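    The arithmetic underlying the buoyancy-based measurement is simple; a minimal sketch (hypothetical helper names, assuming the standard Archimedes workflow of weighing in air and submerged):

```python
def material_volume(weight_air_g, weight_submerged_g, fluid_density_g_per_cm3=1.0):
    """Volume of the solid phase via Archimedes' principle:
    the buoyant mass loss divided by the displacement fluid's density."""
    return (weight_air_g - weight_submerged_g) / fluid_density_g_per_cm3

def volume_fraction(material_vol_cm3, bulk_vol_cm3):
    """Cancellous bone volume fraction: solid volume over total envelope volume."""
    return material_vol_cm3 / bulk_vol_cm3
```

    The reproducibility problem the study addresses lies not in this formula but in how completely the fluid penetrates the pore network, which is why helium outperformed water.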

  11. Reproducibility of neuroimaging analyses across operating systems.

    PubMed

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
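    The Dice coefficients quoted above compare binary segmentation masks produced on different platforms; as a minimal sketch (not the study's pipeline code):

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary label masks:
    2|A ∩ B| / (|A| + |B|), with 1.0 for two empty masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0
```

    Identical classifications give 1.0; the 0.59 reported for some subcortical structures indicates substantial divergence between operating systems.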

  12. REPRODUCIBLE AND SHAREABLE QUANTIFICATIONS OF PATHOGENICITY

    PubMed Central

    Manrai, Arjun K; Wang, Brice L; Patel, Chirag J; Kohane, Isaac S

    2016-01-01

    There are now hundreds of thousands of pathogenicity assertions that relate genetic variation to disease, but most of this clinically utilized variation has no accepted quantitative disease risk estimate. Recent disease-specific studies have used control sequence data to reclassify large amounts of prior pathogenic variation, but there is a critical need to scale up both the pace and feasibility of such pathogenicity reassessments across human disease. In this manuscript we develop a shareable computational framework to quantify pathogenicity assertions. We release a reproducible “digital notebook” that integrates executable code, text annotations, and mathematical expressions in a freely accessible statistical environment. We extend previous disease-specific pathogenicity assessments to over 6,000 diseases and 160,000 assertions in the ClinVar database. Investigators can use this platform to prioritize variants for reassessment and tailor genetic model parameters (such as prevalence and heterogeneity) to expose the uncertainty underlying pathogenicity-based risk assessments. Finally, we release a website that links users to pathogenic variation for a queried disease, supporting literature, and implied disease risk calculations subject to user-defined and disease-specific genetic risk models in order to facilitate variant reassessments. PMID:26776189
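    Implied-risk calculations of the kind described can be framed with Bayes' rule; the following is a generic sketch (hypothetical `implied_risk` helper, not the released notebook's code), assuming carrier frequencies are available for cases and controls:

```python
def implied_risk(prevalence, af_cases, af_controls):
    """Implied disease risk for variant carriers via Bayes' rule:

        P(disease | variant) = P(variant | disease) * P(disease) / P(variant)

    with P(variant) approximated by mixing case and control carrier
    frequencies according to disease prevalence."""
    p_variant = af_cases * prevalence + af_controls * (1.0 - prevalence)
    return af_cases * prevalence / p_variant
```

    Varying `prevalence` in such a calculation is one way to expose the uncertainty that the framework lets investigators explore: when the variant is no more frequent in cases than in controls, the implied risk collapses to the background prevalence.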

  13. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Two important results are that error estimates are provided and that one can extrapolate expectation values, rather than the wavefunctions, to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
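    The extrapolation step itself is compact; a minimal sketch of repeated Richardson extrapolation applied to a step-size-dependent estimate (illustrative only, not the authors' code):

```python
import math

def richardson(f, h, order=2, steps=4):
    """Repeated Richardson extrapolation of an estimate f(h) whose error
    expands in even powers of h (Romberg-style triangular table):
    successive columns cancel the h**order, h**(2*order), ... error terms."""
    T = [[f(h / 2 ** i)] for i in range(steps)]
    for k in range(1, steps):
        factor = 2 ** (order * k)
        for i in range(steps - k):
            T[i].append((factor * T[i + 1][k - 1] - T[i][k - 1]) / (factor - 1))
    return T[0][-1]

# Example: central-difference derivative of sin at x = 1, error ~ h^2, h^4, ...
df = richardson(lambda h: (math.sin(1 + h) - math.sin(1 - h)) / (2 * h), 0.2)
```

    The same idea carries over to expectation values computed on a sequence of refined meshes, which is the paper's point: extrapolate the derived quantity directly rather than the wavefunction.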

  14. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed a Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit; more precisely, the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken to accurately reproduce the geometrical details of the gas chamber and of the activity sources, each of which is different in shape and enclosed in a unique container. Both relative calibration factors and ionization currents obtained with simulations were compared against experimental measurements; further tests were carried out, such as a comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with discrepancies below 4% for all the tested parameters. This shows that accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The resulting Monte Carlo model establishes a powerful tool for first-instance determination of new calibration factors for non-standard radionuclides and for custom containers, when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regard to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the container configuration.

  15. Accurate and Efficient Resolution of Overlapping Isotopic Envelopes in Protein Tandem Mass Spectra

    PubMed Central

    Xiao, Kaijie; Yu, Fan; Fang, Houqin; Xue, Bingbing; Liu, Yan; Tian, Zhixin

    2015-01-01

    It has long been an analytical challenge to accurately and efficiently resolve extremely dense overlapping isotopic envelopes (OIEs) in protein tandem mass spectra to confidently identify proteins. Here, we report a computationally efficient method, called OIE_CARE, to resolve OIEs by calculating the relative deviation between the ideal and observed experimental abundance. In the OIE_CARE method, the ideal experimental abundance of a particular overlapping isotopic peak (OIP) is first calculated for all the OIEs sharing this OIP. The relative deviation (RD) of the overall observed experimental abundance of this OIP relative to the summed ideal value is then calculated. The final individual abundance of the OIP for each OIE is the individual ideal experimental abundance multiplied by 1 + RD. Initial studies were performed using higher-energy collisional dissociation tandem mass spectra on myoglobin (with direct infusion) and the intact E. coli proteome (with liquid chromatographic separation). Comprehensive data at the protein and proteome levels, high confidence and good reproducibility were achieved. The resolving method reported here can, in principle, be extended to resolve any envelope-type overlapping data for which the corresponding theoretical reference values are available. PMID:26439836
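    The abundance-splitting rule described above (individual ideal abundance multiplied by 1 + RD) reduces to a few lines; an illustrative sketch with hypothetical names, not the OIE_CARE implementation:

```python
def resolve_overlap(observed, ideal):
    """Split an observed overlapping-peak abundance among envelopes.

    observed: total measured abundance of the shared peak (OIP).
    ideal: per-envelope ideal (theoretical) abundances of that peak.
    Each envelope receives ideal_i * (1 + RD), where RD is the relative
    deviation of the observed total from the summed ideal abundance."""
    total = sum(ideal)
    rd = (observed - total) / total
    return [x * (1.0 + rd) for x in ideal]
```

    Because every envelope is scaled by the same factor, the resolved abundances always sum back to the observed total while preserving the theoretical ratios.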

  16. Direct, quantitative clinical assessment of hand function: usefulness and reproducibility.

    PubMed

    Goodson, Alexander; McGregor, Alison H; Douglas, Jane; Taylor, Peter

    2007-05-01

    Methods of assessing functional impairment in arthritic hands include pain assessments and disability scoring scales, which are subjective, variable over time and fail to take account of patients' need to adapt to deformities. The aim of this study was to evaluate measures of functional strength and joint motion in the assessment of the rheumatoid (RA) and osteoarthritic (OA) hand. Ten control subjects, ten RA patients and ten OA patients were recruited for the study. All underwent pain and disability scoring and functional assessment of the hand using measures of pinch/grip strength and range of joint motion (ROM). Functional assessments, including ROM analyses at interphalangeal (IP), metacarpophalangeal (MCP) and wrist joints along with pinch/grip strength, clearly discriminated between patient groups (RA vs. OA MCP ROM, P<0.0001), whereas pain and disability scales did not. In the RA group there were demonstrable relationships between ROM measurements and disability (R2=0.31) as well as disease duration (R2=0.37). Intra-patient measures of strength were robust, whereas inter-patient comparisons showed variability. In conclusion, pinch/grip strength and ROM are clinically reproducible assessments that may more accurately reflect functional impairment associated with arthritis.

  17. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)].
The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  18. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    NASA Astrophysics Data System (ADS)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how much of a problem research reproducibility is in geosciences. We administered an online survey to faculty and graduate students in geosciences, and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to reproduce others' work and to make their own work reproducible, and what underlying factors contribute to irreproducibility; and (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on the issue. The survey results indicated that nearly 80% of respondents who had ever tried to reproduce a published study had failed at least once. Only one third of respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors behind unsuccessful replication attempts are insufficiently detailed instructions in the published literature and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in geosciences. Changing the incentive mechanisms in academia, and developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  19. Reproducibility of parameter learning with missing observations in naive Wnt Bayesian network trained on colorectal cancer samples and doxycycline-treated cell lines.

    PubMed

    Sinha, Shriprakash

    2015-07-01

    In this manuscript, the reproducibility of parameter learning with missing observations in a naive Bayesian network, and its effect on prediction results for Wnt signaling activation in colorectal cancer, is tested. The training of the network is carried out separately on doxycycline-treated LS174T cell lines (GSE18560) as well as normal and adenoma samples (GSE8671). A computational framework to test the reproducibility of the parameters is designed in order to check the veracity of the prediction results. Detailed experimental analysis suggests that the prediction results are accurate and reproducible, with negligible deviations. Anomalies in the estimated parameters are attributed to representation issues of the Bayesian network model. High prediction accuracies are reported for normal (N) and colon-related adenoma (AD), colorectal cancer (CRC), carcinoma (C), adenocarcinoma (ADC) and replication-error colorectal cancer (RER CRC) test samples. Test samples from inflammatory bowel diseases (IBD) do not fare well in the prediction test. An interesting case regarding hypothesis testing also arose while assessing the statistical significance of the different design setups of the Bayesian network model: hypothesis testing may not be the correct way to check the significance between design setups, especially when the structure of the model is the same and the model is trained on a single piece of test data. The significance test does have value when the datasets are independent. Finally, in comparison to biologically inspired models, the naive Bayesian model may give accurate results, but this accuracy comes at the cost of a loss of crucial biological knowledge that might help reveal hidden relations among intra-/extracellular factors affecting the Wnt pathway.
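    The handling of missing observations in a naive Bayes model, where absent features are simply marginalized out of the likelihood product, can be sketched as follows (a generic illustration with hypothetical names, not the manuscript's Wnt model):

```python
import math

def naive_bayes_predict(x, priors, likelihoods):
    """Naive Bayes posterior over classes, skipping missing (None) features.

    x: list of observed feature values, with None marking a missing observation.
    priors: {class: P(class)}.
    likelihoods: {class: [ {value: P(value | class)}, ... ]}, one dict per feature.
    """
    log_post = {}
    for c, p in priors.items():
        lp = math.log(p)
        for feat, val in enumerate(x):
            if val is None:          # missing observation: marginalized out
                continue
            lp += math.log(likelihoods[c][feat][val])
        log_post[c] = lp
    # normalize in log space for numerical stability
    z = max(log_post.values())
    norm = sum(math.exp(v - z) for v in log_post.values())
    return {c: math.exp(v - z) / norm for c, v in log_post.items()}
```

    With every observation missing, the posterior falls back to the learned priors, which is one reason parameter reproducibility under missing data matters for the downstream predictions.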

  20. A Mechanical System to Reproduce Cardiovascular Flows

    NASA Astrophysics Data System (ADS)

    Lindsey, Thomas; Valsecchi, Pietro

    2010-11-01

    Within the framework of the "Pumps&Pipes" collaboration between ExxonMobil Upstream Research Company and The DeBakey Heart and Vascular Center in Houston, a hydraulic control system was developed to accurately simulate general cardiovascular flows. The final goal of the development of the apparatus was the reproduction of the periodic flow of blood through the heart cavity with the capability of varying frequency and amplitude, as well as designing the systolic/diastolic volumetric profile over one period. The system consists of a computer-controlled linear actuator that drives hydraulic fluid in a closed loop to a secondary hydraulic cylinder. The test section of the apparatus is located inside a MRI machine, and the closed loop serves to physically separate all metal moving parts (control system and actuator cylinder) from the MRI-compatible pieces. The secondary cylinder is composed of nonmetallic elements and directly drives the test section circulatory flow loop. The circulatory loop consists of nonmetallic parts and several types of Newtonian and non-Newtonian fluids, which model the behavior of blood. This design allows for a periodic flow of blood-like fluid pushed through a modeled heart cavity capable of replicating any healthy heart condition as well as simulating anomalous conditions. The behavior of the flow inside the heart can thus be visualized by MRI techniques.

  1. Automated curve matching techniques for reproducible, high-resolution palaeomagnetic dating

    NASA Astrophysics Data System (ADS)

    Lurcock, Pontus; Channell, James

    2016-04-01

    High-resolution relative palaeointensity (RPI) and palaeosecular variation (PSV) data are increasingly important for accurate dating of sedimentary sequences, often in combination with oxygen isotope (δ18O) measurements. A chronology is established by matching a measured downcore signal to a dated reference curve, but there is no standard methodology for performing this correlation. Traditionally, matching is done by eye, but this becomes difficult when two parameters (e.g. RPI and δ18O) are being matched simultaneously, and cannot be done entirely objectively or repeatably. More recently, various automated techniques have appeared for matching one or more signals. We present Scoter, a user-friendly program for dating by signal matching and for comparing different matching techniques. Scoter is a cross-platform application implemented in Python, and consists of a general-purpose signal processing and correlation library linked to a graphical desktop front-end. RPI, PSV, and other records can be opened, pre-processed, and automatically matched with reference curves. A Scoter project can be exported as a self-contained bundle, encapsulating the input data, pre-processing steps, and correlation parameters, as well as the program itself. The analysis can be automatically replicated by anyone using only the resources in the bundle, ensuring full reproducibility. The current version of Scoter incorporates an experimental signal-matching algorithm based on simulated annealing, as well as an interface to the well-established Match program of Lisiecki and Lisiecki (2002), enabling results of the two approaches to be compared directly.
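    A crude form of automated curve matching, scoring candidate offsets between a measured record and a reference curve by correlation, might look like the sketch below (illustrative only; Scoter's simulated-annealing matcher handles non-uniform stretching and multi-signal matching, which this does not):

```python
import numpy as np

def best_lag(signal, reference, max_lag):
    """Naive signal matching: find the integer lag that maximizes the
    Pearson correlation between a measured record and a reference curve.
    signal[i] is aligned against reference[i - lag]."""
    best, best_r = 0, -np.inf
    n = len(signal)
    for lag in range(-max_lag, max_lag + 1):
        lo, hi = max(0, lag), min(n, len(reference) + lag)
        if hi - lo < 3:              # too little overlap to correlate
            continue
        r = np.corrcoef(signal[lo:hi], reference[lo - lag:hi - lag])[0, 1]
        if r > best_r:
            best, best_r = lag, r
    return best, best_r
```

    Exhaustive lag search like this is objective and repeatable, which is precisely the property that matching "by eye" lacks; simulated annealing generalizes the idea to warped (non-rigid) alignments.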

  2. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
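    Non-linear retention-time correction of the kind TRIC performs can be approximated in its simplest form by piecewise-linear interpolation through shared anchor peptides (a sketch with hypothetical names, not TRIC's actual graph-based algorithm):

```python
import numpy as np

def rt_correct(rt_run, anchors_run, anchors_ref):
    """Map retention times from one LC-MS/MS run onto a reference scale by
    piecewise-linear interpolation through anchor peptides observed in both."""
    order = np.argsort(anchors_run)  # np.interp requires increasing sample points
    return np.interp(rt_run,
                     np.asarray(anchors_run, float)[order],
                     np.asarray(anchors_ref, float)[order])
```

    A piecewise-linear map already captures gradual gradient drift; TRIC's contribution is doing this consistently across many runs at once and using the aligned traces for consistent peak picking.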

  3. The importance of accurate experimental data to marginal field development

    SciTech Connect

    Overa, S.J.; Lingelem, M.N.

    1997-12-31

    Since exploration started in the Norwegian North Sea in 1965, a total of 196 fields have been discovered. Less than one-third of these fields have been developed. The marginal fields cannot be developed economically with current technology, even though some of them have significant reserves. The total cost to develop one of these large installations is estimated to be 2--5 billion US dollars. New technology is therefore needed to lower the design and installation costs of each unit. The need for new physical-property data is shown, and the value of valid operating data from existing units is also pointed out.

  4. Experimental studies of the magnetized friction force

    SciTech Connect

    Fedotov, A. V.; Litvinenko, V. N.; Gaalnander, B.; Lofnes, T.; Ziemann, V.; Sidorin, A.; Smirnov, A.

    2006-06-15

    High-energy electron cooling, presently considered an essential tool for several applications in high-energy and nuclear physics, requires an accurate description of the friction force experienced by ions passing through an electron beam. Present low-energy electron coolers can be used for a detailed study of the friction force. In addition, the parameters of a low-energy cooler can be chosen so as to reproduce regimes expected in future high-energy operation. Here, we report a set of dedicated experiments at CELSIUS aimed at a detailed study of the magnetized friction force. Some results of an accurate comparison of experimental data with friction force formulas are presented.

  5. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia measurement uses a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), based on a quantum cascade laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled; temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. The system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations.

  6. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    NASA Astrophysics Data System (ADS)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights both in the river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regards to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  7. Crowdsourcing reproducible seizure forecasting in human and canine epilepsy

    PubMed Central

    Wagenaar, Joost; Abbot, Drew; Adkins, Phillip; Bosshard, Simone C.; Chen, Min; Tieng, Quang M.; He, Jialune; Muñoz-Almaraz, F. J.; Botella-Rocamora, Paloma; Pardo, Juan; Zamora-Martinez, Francisco; Hills, Michael; Wu, Wei; Korshunova, Iryna; Cukierski, Will; Vite, Charles; Patterson, Edward E.; Litt, Brian; Worrell, Gregory A.

    2016-01-01

    See Mormann and Andrzejak (doi:10.1093/brain/aww091) for a scientific commentary on this article.   Accurate forecasting of epileptic seizures has the potential to transform clinical epilepsy care. However, progress toward reliable seizure forecasting has been hampered by lack of open access to long duration recordings with an adequate number of seizures for investigators to rigorously compare algorithms and results. A seizure forecasting competition was conducted on kaggle.com using open access chronic ambulatory intracranial electroencephalography from five canines with naturally occurring epilepsy and two humans undergoing prolonged wide bandwidth intracranial electroencephalographic monitoring. Data were provided to participants as 10-min interictal and preictal clips, with approximately half of the 60 GB data bundle labelled (interictal/preictal) for algorithm training and half unlabelled for evaluation. The contestants developed custom algorithms and uploaded their classifications (interictal/preictal) for the unknown testing data, and a randomly selected 40% of data segments were scored and results broadcasted on a public leader board. The contest ran from August to November 2014, and 654 participants submitted 17 856 classifications of the unlabelled test data. The top performing entry scored 0.84 area under the classification curve. Following the contest, additional held-out unlabelled data clips were provided to the top 10 participants and they submitted classifications for the new unseen data. The resulting area under the classification curves were well above chance forecasting, but did show a mean 6.54 ± 2.45% (min, max: 0.30, 20.2) decline in performance. The kaggle.com model using open access data and algorithms generated reproducible research that advanced seizure forecasting. The overall performance from multiple contestants on unseen data was better than a random predictor, and demonstrates the feasibility of seizure forecasting in canine and
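    The "area under the classification curve" used to score entries is the ROC AUC; a minimal rank-based sketch (not the contest's evaluation code):

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney formulation: the probability that a
    randomly chosen positive example outscores a randomly chosen negative,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    Under this measure a random predictor scores 0.5, which is the chance baseline the post-contest held-out results were compared against.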

  8. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in the uplink (SC-FDMA in LTE and OFDMA in WiMAX), and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the task of accurately modeling uplink interference both crucial and non-trivial. Traditional methods for modeling it (e.g., via additive white Gaussian noise interference sources) are therefore proving ineffective at realistically representing uplink interference in next-generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers twofold advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for network operators to compare the performance of competing system vendors.
Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  9. Virtual Reference Environments: a simple way to make research reproducible

    PubMed Central

    Hurley, Daniel G.; Budden, David M.

    2015-01-01

    ‘Reproducible research’ has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work. PMID:25433467

  10. Virtual Reference Environments: a simple way to make research reproducible.

    PubMed

    Hurley, Daniel G; Budden, David M; Crampin, Edmund J

    2015-09-01

    'Reproducible research' has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work.

  11. Interlaboratory reproducibility of large-scale human protein-complex analysis by standardized AP-MS.

    PubMed

    Varjosalo, Markku; Sacco, Roberto; Stukalov, Alexey; van Drogen, Audrey; Planyavsky, Melanie; Hauri, Simon; Aebersold, Ruedi; Bennett, Keiryn L; Colinge, Jacques; Gstaiger, Matthias; Superti-Furga, Giulio

    2013-04-01

    The characterization of all protein complexes of human cells under defined physiological conditions using affinity purification-mass spectrometry (AP-MS) is a highly desirable step in the quest to understand the phenotypic effects of genomic information. However, such a challenging goal has not yet been achieved, as it requires reproducibility of the experimental workflow and high data consistency across different studies and laboratories. We systematically investigated the reproducibility of a standardized AP-MS workflow by performing a rigorous interlaboratory comparative analysis of the interactomes of 32 human kinases. We show that it is possible to achieve high interlaboratory reproducibility of this standardized workflow despite differences in mass spectrometry configurations and subtle sample preparation-related variations and that combination of independent data sets improves the approach sensitivity, resulting in even more-detailed networks. Our analysis demonstrates the feasibility of obtaining a high-quality map of the human protein interactome with a multilaboratory project.

  12. Evaluation of reproducibility and reliability of 3D soft tissue analysis using 3D stereophotogrammetry.

    PubMed

    Plooij, J M; Swennen, G R J; Rangel, F A; Maal, T J J; Schutyser, F A C; Bronkhorst, E M; Kuijpers-Jagtman, A M; Bergé, S J

    2009-03-01

    In 3D photographs the bony structures are neither available nor palpable; therefore, bone-related landmarks, such as the soft tissue gonion, need to be redefined. The purpose of this study was to determine the reproducibility and reliability of 49 soft tissue landmarks, including newly defined 3D bone-related soft tissue landmarks, with the use of 3D stereophotogrammetric images. Two observers carried out soft-tissue analysis on 3D photographs twice for 20 patients. A reference frame and 49 landmarks were identified on each 3D photograph. Paired Student's t-test was used to test the reproducibility, and Pearson's correlation coefficient to determine the reliability, of the landmark identification. Intra- and interobserver reproducibility of the landmarks was high. The study showed a high reliability coefficient for intraobserver (0.97 (0.90 - 0.99)) and interobserver reliability (0.94 (0.69 - 0.99)). Identification of the landmarks in the midline was more precise than identification of the paired landmarks. In conclusion, the redefinition of bone-related soft tissue 3D landmarks in combination with the 3D photograph reference system resulted in an accurate and reliable 3D photograph-based soft tissue analysis. This shows that hard tissue data are not needed to perform accurate soft tissue analysis.

  13. Development of hydrophobic surface substrates enabling reproducible drop-and-dry spectroscopic measurements.

    PubMed

    Lee, Jinah; Duy, Pham Khac; Park, Seok Chan; Chung, Hoeil

    2016-06-01

    We investigated several spectroscopic substrates with hydrophobic surfaces that were able to form reproducible droplets of aqueous samples for reliable high throughput drop-and-dry measurements. An amine-coated substrate, a polytetrafluoroethylene (PTFE) disk, and a perfluorooctyltrichlorosilane (FTS) coated substrate were prepared and initially evaluated for use in the determination of fat concentrations in milks using near-infrared (NIR) spectroscopy. Since the dried milk spots were not compositionally uniform due to the localization of components during sample drying, NIR spectra were collected by fully covering each spot to ensure a correct compositional representation of the sample. The amine-coated substrate yielded more reproducible dried milk patterns because its hydrophobicity was optimal for loading an appropriate amount of milk with decreased component localization after drying. The relative standard deviation (RSD) of the absorbance at 4330cm(-1) was 1.0%, thereby resulting in the more accurate determination of fat concentration. In addition, infrared (IR) spectroscopic discrimination between wild and transgenic tobaccos using their extracts was attempted. The extracted metabolites had a low concentration, so an FTS-coated CaF2 substrate that maximized sample loading was used to improve measurement sensitivity and produce reproducible droplets. The RSD of the absorbance at 1070cm(-1) was only 0.8%. Our strategy produced droplets that had consistent sizes and provided reproducible IR spectral features, which enabled the differentiation between wild and transgenic tobacco groups in the principal component (PC) score domain.
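
The relative standard deviation quoted for the absorbance measurements is simply the sample standard deviation of replicate readings expressed as a percentage of their mean. A minimal sketch (the replicate values are hypothetical, not the study's data):

```python
import numpy as np

def relative_std(values):
    """Relative standard deviation (RSD, %) of replicate measurements:
    100 * sample standard deviation / mean."""
    a = np.asarray(values, dtype=float)
    return 100.0 * a.std(ddof=1) / a.mean()

# Hypothetical replicate absorbance readings at a single wavenumber:
replicates = [0.501, 0.498, 0.505, 0.499, 0.503]
rsd = relative_std(replicates)
```

An RSD near 1% across droplet spots, as reported here, indicates that the dried-spot patterns are consistent enough for quantitative work.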

  14. An Open Science and Reproducible Research Primer for Landscape Ecologists

    EPA Science Inventory

    In recent years many funding agencies, some publishers, and even the United States government have enacted policies that encourage open science and strive for reproducibility; however, the knowledge and skills to implement open science and enable reproducible research are not yet...

  15. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Authority to reproduce Restricted Data. 1016.35 Section 1016.35 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Control of... Restricted Data may be reproduced to the minimum extent necessary consistent with efficient operation...

  16. 10 CFR 1016.35 - Authority to reproduce Restricted Data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Authority to reproduce Restricted Data. 1016.35 Section 1016.35 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Control of... Restricted Data may be reproduced to the minimum extent necessary consistent with efficient operation...

  17. Is reproducibility inside the bag? Special issue fundamentals and applications of sonochemistry ESS-15.

    PubMed

    Gomes, Filipe; Thakkar, Harsh; Lähde, Anna; Verhaagen, Bram; Pandit, Aniruddha B; Fernández Rivas, David

    2017-03-24

    In this paper we report our most recent attempts to tackle a notorious problem across several scientific activities from the ultrasonics sonochemical perspective: reproducibility of results. We provide experimental results carried out in three different laboratories, using the same ingredients: ultrasound and a novel cavitation reactor bag. The main difference between the experiments is that they are aimed at different applications: KI liberation and MB degradation, and the exfoliation of two nanomaterials, graphene and molybdenum disulfide. Iodine liberation rates and methylene blue degradation were higher for the cases where a cavitation intensification bag was used. Similarly, improved dispersion and more polydisperse exfoliated layers of nanomaterials were observed in the intensified bags compared to plain ones. The reproducibility of these new experiments is compared to previous experimental results under similar conditions. Our main conclusion is that despite knowing and understanding most physicochemical phenomena related to the origins and effects of cavitation, there is still a long path towards reproducibility, both within one laboratory and across different laboratories. As emphasized in the sonochemical literature, the latter clearly illustrates the complexity of cavitation as a nonlinear phenomenon, whose quantitative estimation represents a challenging aspect. We also provide a list of procedural steps that can help improve reproducibility and scale-up efforts.

  18. Development of accurate force fields for the simulation of biomineralization.

    PubMed

    Raiteri, Paolo; Demichelis, Raffaella; Gale, Julian D

    2013-01-01

    The existence of an accurate force field (FF) model that reproduces the free-energy landscape is a key prerequisite for the simulation of biomineralization. Here, the stages in the development of such a model are discussed including the quality of the water model, the thermodynamics of polymorphism, and the free energies of solvation for the relevant species. The reliability of FFs can then be benchmarked against quantities such as the free energy of ion pairing in solution, the solubility product, and the structure of the mineral-water interface.

  19. An Effective and Reproducible Model of Ventricular Fibrillation in Crossbred Yorkshire Swine (Sus scrofa) for Use in Physiologic Research.

    PubMed

    Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C

    2015-10-01

    Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF.

  20. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  1. On numerically accurate finite element solutions in the fully plastic range

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

    A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily and simply be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems, namely pure bending of a beam, a thick-walled tube under pressure, and a deep double edge cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are also discussed.

  2. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand the roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and by biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  3. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomena of Fresnel diffraction to micron accurate measurement. This report discusses past research on the phenomena and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and equipment requirements for extending the effective range of Fresnel diffraction systems are also described.

  4. Accurate determination of rates from non-uniformly sampled relaxation data.

    PubMed

    Stetz, Matthew A; Wand, A Joshua

    2016-08-01

    The application of non-uniform sampling (NUS) to relaxation experiments traditionally used to characterize the fast internal motion of proteins is quantitatively examined. Experimentally acquired Poisson-gap sampled data reconstructed with iterative soft thresholding are compared to regular sequentially sampled (RSS) data. Using ubiquitin as a model system, it is shown that 25 % sampling is sufficient for the determination of quantitatively accurate relaxation rates. When the sampling density is fixed at 25 %, the accuracy of rates is shown to increase sharply with the total number of sampled points until eventually converging near the inherent reproducibility of the experiment. Perhaps contrary to some expectations, it is found that accurate peak height reconstruction is not required for the determination of accurate rates. Instead, inaccuracies in rates arise from inconsistencies in reconstruction across the relaxation series that primarily manifest as a non-linearity in the recovered peak height. This indicates that the performance of an NUS relaxation experiment cannot be predicted from comparison of peak heights using a single RSS reference spectrum. The generality of these findings was assessed using three alternative reconstruction algorithms, eight different relaxation measurements, and three additional proteins that exhibit varying degrees of spectral complexity. From these data, it is revealed that non-linearity in peak height reconstruction across the relaxation series is strongly correlated with errors in NUS-derived relaxation rates. Importantly, it is shown that this correlation can be exploited to reliably predict the performance of an NUS-relaxation experiment by using three or more RSS reference planes from the relaxation series. The RSS reference time points can also serve to provide estimates of the uncertainty of the sampled intensity, which for a typical relaxation time series incurs no penalty in total acquisition time.
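
The relaxation rates discussed here are conventionally extracted by fitting a mono-exponential decay to peak heights across the relaxation series; non-linearity in the reconstructed heights then shows up directly as rate error. A minimal sketch using a log-linear least-squares fit on synthetic, noise-free data (real analyses typically use nonlinear fitting with uncertainty estimates):

```python
import numpy as np

def fit_rate(delays, heights):
    """Estimate a mono-exponential relaxation rate R from I(t) = I0 * exp(-R t)
    by linear least squares on log(I). Assumes positive, noise-free heights."""
    t = np.asarray(delays, dtype=float)
    y = np.log(np.asarray(heights, dtype=float))
    slope, _ = np.polyfit(t, y, 1)  # log I = log I0 - R t
    return -slope

# Synthetic relaxation series with R = 2.0 s^-1:
t = np.array([0.01, 0.05, 0.1, 0.2, 0.4])
I = 100.0 * np.exp(-2.0 * t)
```

If an NUS reconstruction compresses the recovered heights non-linearly, the log-intensities are no longer linear in t and the fitted rate is systematically biased, which is the failure mode the abstract describes.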

  5. Accurate Cross Sections for Microanalysis

    PubMed Central

    Rez, Peter

    2002-01-01

    To calculate the intensity of x-ray emission in electron beam microanalysis requires a knowledge of the energy distribution of the electrons in the solid, the energy variation of the ionization cross section of the relevant subshell, the fraction of ionization events producing x rays of interest, and the absorption coefficient of the x rays on the path to the detector. The theoretical predictions and experimental data available for ionization cross sections are limited mainly to K shells of a few elements. Results of systematic plane wave Born approximation calculations with exchange for K, L, and M shell ionization cross sections over the range of electron energies used in microanalysis are presented. Comparisons are made with experimental measurements for selected K shells, and it is shown that the plane wave theory is not appropriate for overvoltages less than 2.5. PMID:27446747

  6. A technique to ensure the reproducibility of a cast post and core.

    PubMed

    Naas, Haitem M M; Dashti, Mohammad Hossein; Hashemian, Roxana; Hifeda, Nedda Y

    2014-12-01

    The post-and-core pattern duplication technique is a simple, cost-effective, and accurate method of ensuring the reproducibility of a cast post and core. An acrylic resin pattern is fabricated for an endodontically treated tooth. The post portion of the pattern is duplicated with a polyvinyl siloxane impression material in the lower compartment of a container. The core portion is then duplicated with a polyether impression material in the upper compartment. After the original pattern has been retrieved, the duplicate resin pattern is fabricated in the provided space. This technique will improve efficiency if damage or loss of the pattern or the actual cast post and core occurs.

  7. Reproducibility of BOLD, Perfusion, and CMRO2 Measurements with Calibrated-BOLD fMRI

    PubMed Central

    Leontiev, Oleg; Buxton, Richard B.

    2007-01-01

    The coupling of changes in cerebral blood flow (CBF) and cerebral metabolic rate of oxygen (CMRO2) during brain activation can be characterized by an empirical index, n, defined as the ratio between fractional CBF change and fractional CMRO2 change. The combination of blood oxygenation level dependent (BOLD) imaging with CBF measurements from arterial spin labeling (ASL) provides a potentially powerful experimental approach for measuring n, but the reproducibility of the technique previously has not been assessed. In this study, inter-subject variance and intra-subject reproducibility of the method were determined. Block design %BOLD and %CBF responses to visual stimulation and mild hypercapnia (5% CO2) were measured, and these data were used to compute the BOLD scaling factor M, %CMRO2 change with activation, and the coupling index n. Reproducibility was determined for three approaches to defining regions-of-interest (ROIs): 1) Visual area V1 determined from prior retinotopic maps, 2) BOLD-activated voxels from a separate functional localizer, and 3) CBF–activated voxels from a separate functional localizer. For estimates of %BOLD, %CMRO2 and n, intra-subject reproducibility was found to be best for regions selected according to CBF activation. Among all fMRI measurements, estimates of n were the most robust and were substantially more stable within individual subjects (coefficient of variation, CV=7.4%) than across the subject pool (CV=36.9%). The stability of n across days, despite wider variability of CBF and CMRO2 responses, suggests that the reproducibility of blood flow changes is limited by variation in the oxidative metabolic demand. We conclude that the calibrated BOLD approach provides a highly reproducible measurement of n that can serve as a useful quantitative probe of the coupling of blood flow and energy metabolism in the brain. PMID:17208013

  8. Reproducibility of BOLD, perfusion, and CMRO2 measurements with calibrated-BOLD fMRI.

    PubMed

    Leontiev, Oleg; Buxton, Richard B

    2007-03-01

    The coupling of changes in cerebral blood flow (CBF) and cerebral metabolic rate of oxygen (CMRO(2)) during brain activation can be characterized by an empirical index, n, defined as the ratio between fractional CBF change and fractional CMRO(2) change. The combination of blood oxygenation level dependent (BOLD) imaging with CBF measurements from arterial spin labeling (ASL) provides a potentially powerful experimental approach for measuring n, but the reproducibility of the technique previously has not been assessed. In this study, inter-subject variance and intra-subject reproducibility of the method were determined. Block design %BOLD and %CBF responses to visual stimulation and mild hypercapnia (5% CO(2)) were measured, and these data were used to compute the BOLD scaling factor M, %CMRO(2) change with activation, and the coupling index n. Reproducibility was determined for three approaches to defining regions-of-interest (ROIs): 1) Visual area V1 determined from prior retinotopic maps, 2) BOLD-activated voxels from a separate functional localizer, and 3) CBF-activated voxels from a separate functional localizer. For estimates of %BOLD, %CMRO(2) and n, intra-subject reproducibility was found to be best for regions selected according to CBF activation. Among all fMRI measurements, estimates of n were the most robust and were substantially more stable within individual subjects (coefficient of variation, CV=7.4%) than across the subject pool (CV=36.9%). The stability of n across days, despite wider variability of CBF and CMRO(2) responses, suggests that the reproducibility of blood flow changes is limited by variation in the oxidative metabolic demand. We conclude that the calibrated BOLD approach provides a highly reproducible measurement of n that can serve as a useful quantitative probe of the coupling of blood flow and energy metabolism in the brain.
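
The coupling index n and the coefficient of variation used above to compare its within-subject and across-subject stability are both simple ratios. A minimal sketch (the numeric values are illustrative only, not the study's data):

```python
import numpy as np

def coupling_index(pct_cbf_change, pct_cmro2_change):
    """Flow-metabolism coupling index n = %CBF change / %CMRO2 change."""
    return pct_cbf_change / pct_cmro2_change

def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean, as used to compare
    within-subject versus across-subject stability of an fMRI measurement."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical repeated sessions for one subject: (%CBF, %CMRO2) pairs.
sessions = [(42.0, 21.5), (38.0, 19.0), (45.0, 22.0)]
n_values = [coupling_index(c, m) for c, m in sessions]
within_subject_cv = coefficient_of_variation(n_values)
```

A small CV of n across sessions relative to the CV across subjects, as the study reports (7.4% versus 36.9%), is what supports treating n as a stable per-subject quantity.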

  9. Photographic copy of reproduced photograph dated 1942. Exterior view, west ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photographic copy of reproduced photograph dated 1942. Exterior view, west elevation. Building camouflaged during World War II. - Grand Central Air Terminal, 1310 Air Way, Glendale, Los Angeles County, CA

  10. Reproducibility of computational workflows is automated using continuous analysis.

    PubMed

    Beaulieu-Jones, Brett K; Greene, Casey S

    2017-03-13

    Replication, validation and extension of experiments are crucial for scientific progress. Computational experiments are scriptable and should be easy to reproduce. However, computational analyses are designed and run in a specific computing environment, which may be difficult or impossible to match using written instructions. We report the development of continuous analysis, a workflow that enables reproducible computational analyses. Continuous analysis combines Docker, a container technology akin to virtual machines, with continuous integration, a software development technique, to automatically rerun a computational analysis whenever updates or improvements are made to source code or data. This enables researchers to reproduce results without contacting the study authors. Continuous analysis allows reviewers, editors or readers to verify reproducibility without manually downloading and rerunning code and can provide an audit trail for analyses of data that cannot be shared.
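
Outside any specific CI system, the core idea of continuous analysis (rerun the computation whenever code or data change) can be illustrated by keying reruns to a digest of the inputs. A minimal sketch, not the authors' implementation, with hypothetical file names:

```python
import hashlib
import tempfile
from pathlib import Path

def inputs_digest(paths):
    """SHA-256 digest over the contents of all input files (code + data)."""
    h = hashlib.sha256()
    for p in sorted(paths):
        h.update(Path(p).read_bytes())
    return h.hexdigest()

def rerun_if_changed(paths, state_file, analysis):
    """Rerun `analysis` only when the input digest differs from the last run."""
    digest = inputs_digest(paths)
    state = Path(state_file)
    if state.exists() and state.read_text() == digest:
        return False  # inputs unchanged: previous results still stand
    analysis()
    state.write_text(digest)
    return True

# Demo: the first call runs the analysis; an identical second call is skipped.
with tempfile.TemporaryDirectory() as d:
    data = Path(d) / "input.csv"
    data.write_text("x,y\n1,2\n")
    state = Path(d) / "last_run.sha256"
    runs = []
    first = rerun_if_changed([data], state, lambda: runs.append("ran"))
    second = rerun_if_changed([data], state, lambda: runs.append("ran"))
```

The continuous-analysis approach described above goes further by also pinning the software environment in a Docker container, so the rerun happens in a reproducible context rather than on the developer's machine.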

  11. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput. 2011, 7, 2740].

  12. Comment on "Estimating the reproducibility of psychological science".

    PubMed

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-04

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  13. Reproducibility with repeat CT in radiomics study for rectal cancer

    PubMed Central

    Hu, Panpan; Wang, Jiazhou; Zhong, Haoyu; Zhou, Zhen; Shen, Lijun; Hu, Weigang; Zhang, Zhen

    2016-01-01

    Purpose To evaluate the reproducibility of radiomics features by repeating computed tomographic (CT) scans in rectal cancer, and to choose stable radiomics features for rectal cancer. Results Volume-normalized features are much more reproducible than unnormalized features. The average value of all slices is the most reproducible feature type in rectal cancer. Different filters have little effect on the reproducibility of radiomics features. For the average type features, 496 out of 775 features showed high reproducibility (ICC ≥ 0.8), 225 out of 775 showed medium reproducibility (0.8 > ICC ≥ 0.5) and 54 out of 775 showed low reproducibility (ICC < 0.5). Methods 40 patients with stage II rectal cancer were enrolled in this study, each of whom underwent two CT scans within an average of 8.7 days. 775 radiomics features were defined in this study. For each feature, five different values (value from the largest slice, maximum value, minimum value, average value of all slices and value from the superposed intermediate matrix) were extracted. Meanwhile, a LOG filter with different parameters was applied to these images to find stable filter values. Concordance correlation coefficients (CCC) and intraclass correlation coefficients (ICC) of the two CT scans were calculated to assess reproducibility, based on original features and volume-normalized features. Conclusions Features are recommended to be normalized to volume in radiomics analysis. The average type radiomics features are the most stable features in rectal cancer. Further analysis of these features of rectal cancer is warranted for treatment monitoring and prognosis prediction. PMID:27669756
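
One of the agreement measures used above, the concordance correlation coefficient, has a simple closed form (Lin's CCC), which penalizes both poor correlation and systematic shifts between the two scan sessions. A minimal sketch (not the study's code):

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements,
    e.g. the same radiomics feature from two repeat CT scans:
    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxy = np.cov(x, y)[0, 1]  # sample covariance (ddof=1 by default)
    return 2.0 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)
```

Perfectly repeated values give CCC = 1, while a constant offset between scans lowers CCC even when the Pearson correlation stays at 1, which is why CCC is preferred for test-retest agreement.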

  14. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including the radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry includes iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heat of formation values with better than chemical accuracy, the perturbative quadruple excitations and scalar relativistic corrections were indispensable; their contributions to the heats of formation increase steadily with the number of fluorine atoms in the radical, reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental values and ours, it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied, this work delivers the best available heat of formation as well as entropy data.

  15. On The Reproducibility of Seasonal Land-surface Climate

    SciTech Connect

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded, the former case being pertinent to climate simulation and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also fluctuates markedly in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate are also elaborated.

  16. Triploid planarian reproduces truly bisexually with euploid gametes produced through a different meiotic system between sex.

    PubMed

    Chinone, Ayako; Nodono, Hanae; Matsumoto, Midori

    2014-06-01

    Although polyploids are common among plants and some animals, polyploidization often causes reproductive failure. Triploids, in particular, suffer problems of chromosomal pairing and segregation during meiosis, which may produce aneuploid gametes and result in sterility; thus, they are generally considered to reproduce only asexually. In the platyhelminth Dugesia ryukyuensis, populations with triploid karyotypes occur naturally as both fissiparous and oviparous triploids. Fissiparous triploids can also be experimentally sexualized if they are fed sexual planarians, developing both gonads and other reproductive organs. Fully sexualized worms begin reproducing by copulation rather than fission. In this study, we examined the genotypes of the offspring obtained by breeding sexualized triploids and found that the offspring inherited genes from both parents, i.e., they reproduced truly bisexually. Furthermore, meiotic chromosome behavior in sexualized triploid planarians differed significantly between the male and female germ lines: female germ line cells remained triploid until prophase I, whereas male germ line cells appeared to become diploid before entry into meiosis. Oocytes at the late diplotene stage contained not only paired bivalents but also unpaired univalents, which would yield diploid eggs if they persisted through the subsequent divisions. Triploid planarians may therefore form euploid gametes by different meiotic systems in the female and male germ lines and are thus able to reproduce sexually, in contrast to many other triploid organisms.

  17. Modeling and experimental characterization of stepped and v-shaped (311) defects in silicon

    SciTech Connect

    Marqués, Luis A.; Aboy, María; Dudeck, Karleen J.; Botton, Gianluigi A.; Knights, Andrew P.; Gwilliam, Russell M.

    2014-04-14

    We propose an atomistic model to describe extended (311) defects in silicon. It is based on the combination of interstitial and bond defect chains. The model is able to accurately reproduce not only planar (311) defects but also defect structures that show steps, bends, or both. We use molecular dynamics techniques to show that these interstitial and bond defect chains spontaneously transform into extended (311) defects. Simulations are validated by comparing with precise experimental measurements on actual (311) defects. The excellent agreement between the simulated and experimentally derived structures, regarding individual atomic positions and shape of the distinct structural (311) defect units, provides strong evidence for the robustness of the proposed model.

  18. Short communication: Intraoperator repeatability and interoperator reproducibility of devices measuring teat dimensions in dairy cows.

    PubMed

    Zwertvaegher, I; De Vliegher, S; Baert, J; Van Weyenberg, S

    2013-01-01

    Various methods have been applied to measure teat dimensions, but the accuracy and precision needed to obtain reliable results are often poor or have not yet been investigated. To determine the precision of the ruler (teat length), the caliper (teat diameter), and a recently developed 2-dimensional (2D) vision-based measuring device (both teat length and diameter) under field conditions, 2 experiments were conducted testing the consistency of measurements within operators (repeatability) and between operators (reproducibility). In addition, the agreement of the 2D device with the ruler and the caliper was studied. Although the ruler and the 2D device agreed poorly, both methods measured teat length precisely when the operators had experience working with cows. The caliper was repeatable in measuring teat diameter, but not reproducible. The 2D device was also repeatable in measuring teat diameter, and reproducible when the operators had experience with the device. The methods agreed poorly, most likely because of the operator-dependent pressure applied with the caliper. Because the 2D device has the advantage of measuring both teat length and teat diameter in a single measurement and is accurate and practical, it allows efficient, fast collection of data on a large scale for various applications.

  19. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  20. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    PubMed

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core.

  1. Rapid, reliable, and reproducible molecular sub-grouping of clinical medulloblastoma samples.

    PubMed

    Northcott, Paul A; Shih, David J H; Remke, Marc; Cho, Yoon-Jae; Kool, Marcel; Hawkins, Cynthia; Eberhart, Charles G; Dubuc, Adrian; Guettouche, Toumy; Cardentey, Yoslayma; Bouffet, Eric; Pomeroy, Scott L; Marra, Marco; Malkin, David; Rutka, James T; Korshunov, Andrey; Pfister, Stefan; Taylor, Michael D

    2012-04-01

    The diagnosis of medulloblastoma likely encompasses several distinct entities, with recent evidence for the existence of at least four unique molecular subgroups that exhibit distinct genetic, transcriptional, demographic, and clinical features. Assignment of molecular subgroup through routine profiling of high-quality RNA on expression microarrays is likely impractical in the clinical setting. The planning and execution of medulloblastoma clinical trials that stratify by subgroup, or that are targeted to a specific subgroup, require technologies that can be economically, rapidly, reliably, and reproducibly applied to formalin-fixed, paraffin-embedded (FFPE) specimens. In the current study, we have developed an assay that accurately measures the expression level of 22 medulloblastoma subgroup-specific signature genes (CodeSet) using nanoString nCounter Technology. Comparison of the nanoString assay with Affymetrix expression array data on a training series of 101 medulloblastomas of known subgroup demonstrated high concordance (Pearson correlation r = 0.86). The assay was validated on a second set of 130 non-overlapping medulloblastomas of known subgroup, correctly assigning 98% (127/130) of tumors to the appropriate subgroup. Reproducibility was demonstrated by repeating the assay in three independent laboratories in Canada, the United States, and Switzerland. Finally, the nanoString assay could confidently predict subgroup in 88% of recent FFPE cases, of which 100% had accurate subgroup assignment. We present an assay based on nanoString technology that is capable of rapidly, reliably, and reproducibly assigning clinical FFPE medulloblastoma samples to their molecular subgroup, and which is highly suited for future medulloblastoma clinical trials.
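    One simple way to assign a subgroup from a small signature CodeSet is nearest-centroid classification by correlation; the following is a hypothetical sketch, as the published assay's actual prediction rule is not described in this abstract:

```python
import numpy as np

def assign_subgroup(sample_counts, centroids):
    """Assign a sample to the subgroup whose reference expression centroid
    best correlates with the sample's signature-gene counts.
    `centroids` maps subgroup name -> mean expression vector (hypothetical)."""
    best_name, best_r = None, -2.0
    for name, centroid in centroids.items():
        r = np.corrcoef(sample_counts, centroid)[0, 1]
        if r > best_r:
            best_name, best_r = name, r
    return best_name, best_r
```

    In this scheme a low best correlation could be used to flag samples whose subgroup cannot be confidently predicted, mirroring the 88% confident-call rate reported above.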

  2. Reproducibility of Research Algorithms in GOES-R Operational Software

    NASA Astrophysics Data System (ADS)

    Kennelly, E.; Botos, C.; Snell, H. E.; Steinfelt, E.; Khanna, R.; Zaccheo, T.

    2012-12-01

    The research-to-operations transition for satellite observations is an area of active interest, as identified by the National Research Council Committee on NASA-NOAA Transition from Research to Operations, whose report recommends improved processes for bridging technology from research to operations. Assuring the accuracy of operational algorithm results as compared to research baselines, called reproducibility in this paper, is a critical step in the GOES-R transition process. This paper defines reproducibility methods and measurements for verifying that operationally implemented algorithms conform to research baselines, demonstrated with examples from GOES-R software development. The approach defines reproducibility for implemented algorithms that produce continuous data in terms of a traditional goodness-of-fit measure (i.e., the correlation coefficient), while reproducibility for discrete categorical data is measured using a classification matrix. These reproducibility metrics have been incorporated into a set of Test Tools developed for GOES-R, and the software processes have been developed to include these metrics to validate both the scientific and numerical implementation of the GOES-R algorithms. In this work, we outline the test and validation processes and summarize the current results for the GOES-R Level 2+ algorithms.
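    The two metric families named above (a goodness-of-fit correlation for continuous products, a classification matrix for categorical ones) can be sketched as follows; this is an illustrative Python version, not the GOES-R Test Tools code:

```python
import numpy as np

def continuous_reproducibility(research, operational):
    """Pearson correlation between research-baseline and operational output
    for a continuous product (1.0 = perfectly reproduced)."""
    return np.corrcoef(np.ravel(research), np.ravel(operational))[0, 1]

def classification_matrix(research, operational, n_classes):
    """Contingency table for categorical products; a perfectly reproduced
    algorithm yields a purely diagonal matrix."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, o in zip(np.ravel(research), np.ravel(operational)):
        m[r, o] += 1
    return m
```

    Any off-diagonal mass in the classification matrix points to pixels where the operational implementation disagrees with the research baseline.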

  3. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  4. Reproducibility of the Structural Connectome Reconstruction across Diffusion Methods.

    PubMed

    Prčkovska, Vesna; Rodrigues, Paulo; Puigdellivol Sanchez, Ana; Ramos, Marc; Andorra, Magi; Martinez-Heras, Eloy; Falcon, Carles; Prats-Galino, Albert; Villoslada, Pablo

    2016-01-01

    Analysis of structural connectomes can lead to powerful insights about the brain's organization and damage. However, the accuracy and reproducibility of connectomes constructed with different acquisition and reconstruction techniques are not well defined. In this work, we evaluated the reproducibility of structural connectome techniques by performing test-retest (same day) and longitudinal (after 1 month) studies and analyzing graph-based measures on data acquired from 22 healthy volunteers (6 subjects were used for the longitudinal study). We compared connectivity matrices and tract reconstructions obtained with the acquisition schemes most typical in clinical application: diffusion tensor imaging (DTI), high angular resolution diffusion imaging (HARDI), and diffusion spectrum imaging (DSI). All techniques showed high reproducibility in the test-retest analysis (correlation > .9); however, HARDI was the only technique with low variability (2%) in the longitudinal assessment (1-month interval). The intraclass coefficient analysis showed the highest reproducibility for the DTI connectome, albeit with sparser connections than HARDI and DSI. Qualitative (neuroanatomical) assessment of selected tracts confirmed the quantitative results, showing that HARDI detected most of the analyzed fiber groups and fanning fibers. In conclusion, HARDI acquisition showed the most balanced trade-off between high reproducibility of the connectome, a higher rate of detection of paths and fanning fibers, and intermediate acquisition times (10-15 minutes), although at the cost of more frequent aberrant fibers.

  5. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    PubMed

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-01-05

    The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the two administrations of the C-FFQ were compared, showing strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia in quantifying caffeine consumption.
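    Agreement statistics like the κ = 0.65 reported above are typically computed as Cohen's kappa on binned intakes; a minimal sketch, assuming an unweighted kappa (the paper may have used a weighted variant):

```python
def cohens_kappa(ratings1, ratings2, categories):
    """Unweighted Cohen's kappa between two administrations of a
    questionnaire after binning intakes into categories."""
    n = len(ratings1)
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    # chance agreement from each administration's marginal category frequencies
    expected = sum((ratings1.count(c) / n) * (ratings2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)
```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why a value of 0.65 is read as strong reproducibility.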

  6. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  7. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    PubMed Central

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-01-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research. PMID:27009765
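    Agreement between same-day repeat scans in radiomics is often summarized with Lin's concordance correlation coefficient; a minimal sketch under that assumption, since the abstract does not name the exact agreement metric:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient: equals 1.0 only when the two
    measurement series agree exactly, penalizing both mean and scale shifts."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```

    Unlike plain Pearson correlation, the CCC drops below 1 when one reconstruction setting systematically biases a feature, which is the behavior that makes smooth and sharp kernels non-interchangeable.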

  8. Making neurophysiological data analysis reproducible: why and how?

    PubMed

    Delescluse, Matthieu; Franconville, Romain; Joucla, Sébastien; Lieury, Tiffany; Pouzat, Christophe

    2012-01-01

    Reproducible data analysis is an approach aiming at complementing classical printed scientific articles with everything required to independently reproduce the results they present. "Everything" here covers the data, the computer code, and a precise description of how the code was applied to the data. A brief history of this approach is presented first, starting with what economists have called replication since the early eighties and ending with what is now called reproducible research in computational data-analysis-oriented fields like statistics and signal processing. Since efficient tools are instrumental for routine implementation of these approaches, a description of some of the available ones is presented next. A toy example then demonstrates the use of two open source software programs for reproducible data analysis: the "Sweave family" and the org-mode of emacs. The former is bound to R, while the latter can be used with R, Matlab, Python and many more "generalist" data processing programs. Both solutions can be used with the Unix-like, Windows, and Mac families of operating systems. It is argued that neuroscientists could communicate their results much more efficiently by adopting the reproducible research paradigm all the way from their lab books to their articles, theses, and books.

  9. Reproducibility of radiomics for deciphering tumor phenotype with imaging.

    PubMed

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H

    2016-03-24

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.

  10. Reproducibility of radiomics for deciphering tumor phenotype with imaging

    NASA Astrophysics Data System (ADS)

    Zhao, Binsheng; Tan, Yongqiang; Tsai, Wei-Yann; Qi, Jing; Xie, Chuanmiao; Lu, Lin; Schwartz, Lawrence H.

    2016-03-01

    Radiomics (radiogenomics) characterizes tumor phenotypes based on quantitative image features derived from routine radiologic imaging to improve cancer diagnosis, prognosis, prediction and response to therapy. Although radiomic features must be reproducible to qualify as biomarkers for clinical care, little is known about how routine imaging acquisition techniques/parameters affect reproducibility. To begin to fill this knowledge gap, we assessed the reproducibility of a comprehensive, commonly-used set of radiomic features using a unique, same-day repeat computed tomography data set from lung cancer patients. Each scan was reconstructed at 6 imaging settings, varying slice thicknesses (1.25 mm, 2.5 mm and 5 mm) and reconstruction algorithms (sharp, smooth). Reproducibility was assessed using the repeat scans reconstructed at identical imaging setting (6 settings in total). In separate analyses, we explored differences in radiomic features due to different imaging parameters by assessing the agreement of these radiomic features extracted from the repeat scans reconstructed at the same slice thickness but different algorithms (3 settings in total). Our data suggest that radiomic features are reproducible over a wide range of imaging settings. However, smooth and sharp reconstruction algorithms should not be used interchangeably. These findings will raise awareness of the importance of properly setting imaging acquisition parameters in radiomics/radiogenomics research.

  11. Reproducibility of regional brain metabolic responses to lorazepam

    SciTech Connect

    Wang, G.J.; Volkow, N.D.; Overall, J. |

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent PET scanning with [18F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 µg/kg). The same double-FDG procedure was repeated 6-8 wk later to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those from the first, regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism; the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and replication conditions. Although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible.

  12. Reproducibility of thalamic segmentation based on probabilistic tractography.

    PubMed

    Traynor, Catherine; Heckemann, Rolf A; Hammers, Alexander; O'Muircheartaigh, Jonathan; Crum, William R; Barker, Gareth J; Richardson, Mark P

    2010-08-01

    Reliable identification of thalamic nuclei is required to improve targeting of electrodes used in Deep Brain Stimulation (DBS), and for exploring the role of thalamus in health and disease. A previously described method using probabilistic tractography to segment the thalamus based on connections to cortical target regions was implemented. Both within- and between-subject reproducibility were quantitatively assessed by the overlap of the resulting segmentations; the effect of two different numbers of target regions (6 and 31) on reproducibility of the segmentation results was also investigated. Very high reproducibility was observed when a single dataset was processed multiple times using different starting conditions. Thalamic segmentation was also very reproducible when multiple datasets from the same subject were processed using six cortical target regions. Within-subject reproducibility was reduced when the number of target regions was increased, particularly in medial and posterior regions of the thalamus. A large degree of overlap in segmentation results from different subjects was obtained, particularly in thalamic regions classified as connecting to frontal, parietal, temporal and pre-central cortical target regions.
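    Overlap of repeated segmentations of the kind described above is commonly quantified with the Dice coefficient; a minimal sketch, assuming Dice as the overlap measure (the abstract does not name the statistic used):

```python
def dice_overlap(voxels_a, voxels_b):
    """Dice overlap between two segmentations given as collections of voxel
    indices: 1.0 for identical label sets, 0.0 for disjoint ones."""
    a, b = set(voxels_a), set(voxels_b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))
```

    Computing this per thalamic parcel would show the pattern reported above: high overlap for six cortical targets, lower overlap in medial and posterior regions when 31 targets are used.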

  13. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  14. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  15. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  16. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  17. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  18. On the reproducibility of science: unique identification of research resources in the biomedical literature

    PubMed Central

    Brush, Matthew H.; Paddock, Holly; Ponting, Laura; Tripathy, Shreejoy J.; LaRocca, Gregory M.; Haendel, Melissa A.

    2013-01-01

    Scientific reproducibility has been at the forefront of many news stories and there exist numerous initiatives to help address this problem. We posit that one contributor is simply a lack of the specificity required to enable adequate research reproducibility. In particular, the inability to uniquely identify research resources, such as antibodies and model organisms, makes it difficult or impossible to reproduce experiments even where the science is otherwise sound. In order to better understand the magnitude of this problem, we designed an experiment to ascertain the “identifiability” of research resources in the biomedical literature. We evaluated recent journal articles in the fields of Neuroscience, Developmental Biology, Immunology, Cell and Molecular Biology and General Biology, selected randomly based on a diversity of impact factors for the journals, publishers, and experimental method reporting guidelines. We attempted to uniquely identify model organisms (mouse, rat, zebrafish, worm, fly and yeast), antibodies, knockdown reagents (morpholinos or RNAi), constructs, and cell lines. Specific criteria were developed to determine if a resource was uniquely identifiable, and included examining relevant repositories (such as model organism databases, and the Antibody Registry), as well as vendor sites. The results of this experiment show that 54% of resources are not uniquely identifiable in publications, regardless of domain, journal impact factor, or reporting requirements. For example, in many cases the organism strain in which the experiment was performed or antibody that was used could not be identified. Our results show that identifiability is a serious problem for reproducibility. Based on these results, we provide recommendations to authors, reviewers, journal editors, vendors, and publishers. Scientific efficiency and reproducibility depend upon a research-wide improvement of this substantial problem in science today. PMID:24032093

  19. phyloseq: An R Package for Reproducible Interactive Analysis and Graphics of Microbiome Census Data

    PubMed Central

    McMurdie, Paul J.; Holmes, Susan

    2013-01-01

    Background The analysis of microbial communities through DNA sequencing brings many challenges: the integration of different types of data with methods from ecology, genetics, phylogenetics, multivariate statistics, visualization and testing. With the increased breadth of experimental designs now being pursued, project-specific statistical analyses are often needed, and these analyses are often difficult (or impossible) for peer researchers to independently reproduce. The vast majority of the requisite tools for performing these analyses reproducibly are already implemented in R and its extensions (packages), but with limited support for high throughput microbiome census data. Results Here we describe a software project, phyloseq, dedicated to the object-oriented representation and analysis of microbiome census data in R. It supports importing data from a variety of common formats, as well as many analysis techniques. These include calibration, filtering, subsetting, agglomeration, multi-table comparisons, diversity analysis, parallelized Fast UniFrac, ordination methods, and production of publication-quality graphics; all in a manner that is easy to document, share, and modify. We show how to apply functions from other R packages to phyloseq-represented data, illustrating the availability of a large number of open source analysis techniques. We discuss the use of phyloseq with tools for reproducible research, a practice common in other fields but still rare in the analysis of highly parallel microbiome census data. We have made available all of the materials necessary to completely reproduce the analysis and figures included in this article, an example of best practices for reproducible research. Conclusions The phyloseq project for R is a new open-source software package, freely available on the web from both GitHub and Bioconductor. PMID:23630581

  20. Relevant principal factors affecting the reproducibility of insect primary culture.

    PubMed

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-02-22

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  1. Reproducibility of cephalometric measurements made by three radiology clinics.

    PubMed

    da Silveira, Heraldo Luis Dias; Silveira, Heloisa Emilia Dias

    2006-05-01

    The purpose of this study was to assess reproducibility of cephalometric measurements in cephalograms obtained by three dentomaxillofacial radiology clinics. Forty lateral cephalometric radiographs were selected and sent at different times to three different clinics for cephalometric analyses. Each clinic digitized the radiographs with the same resolution, and landmarks were located with the mouse pointer directly on the digitized radiographic image on the screen. Three cephalograms were obtained from each radiograph, totaling 120 analyses. Data were analyzed with analysis of variance. Of the 32 factors studied, reproducibility of results was satisfactory for only four factors: position of maxilla relative to anterior cranial base, inclination of occlusal plane relative to anterior cranial base, position of lower incisor relative to nasion-pogonion line, and soft-tissue profile of face (P < .05). Differences in cephalometric measurements were present and such differences were significant for most factors analyzed. The different cephalometric measurements obtained by the three dental radiology clinics were not reproducible.

  2. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality assured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with contactless sensory, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains known from signal processing and test its suitability for six different classifiers classifying scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
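The classification step can be sketched as follows; the features used here (mean and standard deviation of scan intensities) and the tolerance are illustrative assumptions, a simple stand-in for the paper's spatial/frequency feature set and trained classifiers:

```python
import math

def features(scan):
    """Simple spatial-domain features: mean and standard deviation of intensities."""
    n = len(scan)
    mean = sum(scan) / n
    var = sum((x - mean) ** 2 for x in scan) / n
    return (mean, math.sqrt(var))

def reproducible(scan_a, scan_b, tol=5.0):
    """Label a scan pair reproducible if their feature-space distance is small."""
    return math.dist(features(scan_a), features(scan_b)) <= tol

# Two consecutive acquisitions of the same printed fingerprint pattern:
a = [120, 130, 125, 128]
b = [121, 129, 126, 127]
print(reproducible(a, b))  # True
```

A trained classifier replaces the fixed tolerance in the actual study, but the input representation (a small feature vector per scan pair) is the same idea.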

  3. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  4. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  5. Wear characteristics of UHMW polyethylene: a method for accurately measuring extremely low wear rates.

    PubMed

    McKellop, H; Clarke, I C; Markolf, K L; Amstutz, H C

    1978-11-01

    The wear of UHMW polyethylene bearing against 316 stainless steel or cobalt chrome alloy was measured using a 12-channel wear tester especially developed for the evaluation of candidate materials for prosthetic joints. The coefficient of friction and wear rate were determined as a function of lubricant, contact stress, and metallic surface roughness in tests lasting two to three million cycles, the equivalent of several years' use of a prosthesis. Wear was determined from the weight loss of the polyethylene specimens corrected for the effect of fluid absorption. The friction and wear processes in blood serum differed markedly from those in saline solution or distilled water. Only serum lubrication produced wear surfaces resembling those observed on removed prostheses. The experimental method provided a very accurate, reproducible measurement of polyethylene wear. The long-term wear rates were proportional to load and sliding distance and were much lower than expected from previously published data. Although the polyethylene wear rate increased with increasing surface roughness, wear was not severe except with very coarse metal surfaces. The data obtained in these studies form a basis for the subsequent comparative evaluation of potentially superior materials for prosthetic joints.

  6. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    SciTech Connect

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.; Ding, Kai; Bayouth, John E.

    2013-12-15

    ... correlated with respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005 respectively). When tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46 respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improved from 57.3% before correction to 66.3% after global normalization, and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, based on the spatial and temporal heterogeneity in the lung expansion rate, a single scaling factor (e.g., global normalization) appears less able to accurately correct the ventilation map when changes in respiratory effort are large.
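Of the three strategies, global normalization is the simplest: scale the entire ventilation map by one factor derived from the tidal volume difference between the two scans. A minimal sketch of that idea (the function name and the values below are illustrative, not taken from the paper):

```python
def global_normalize(ventilation_map, reference_tidal_volume, measured_tidal_volume):
    """Scale a per-voxel ventilation map by a single global factor so that a
    scan acquired at a different respiratory effort (tidal volume) becomes
    comparable to the reference scan."""
    scale = reference_tidal_volume / measured_tidal_volume
    return [v * scale for v in ventilation_map]

# A repeat scan taken at 600 cc tidal volume instead of the reference 500 cc:
repeat = [1.2, 0.9, 1.5]  # per-voxel lung expansion values
corrected = global_normalize(repeat, 500.0, 600.0)
print([round(v, 3) for v in corrected])  # [1.0, 0.75, 1.25]
```

Because every voxel is scaled identically, this cannot capture the spatially heterogeneous effort differences that ELV addresses, which is consistent with the paper's conclusion.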

  7. History and progress on accurate measurements of the Planck constant

    NASA Astrophysics Data System (ADS)

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10^-34 J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10^8 from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the improved

  8. History and progress on accurate measurements of the Planck constant.

    PubMed

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10(-34) J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N(A). As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10(8) from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the

  9. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.

  10. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards of accuracy when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect Depth and RGB information to search for the joint center location that satisfies constraints on body segment length as well as orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results show a 72% reduction in body segment length variance and a 2° improvement in Range of Motion (ROM) angle, enabling more accurate measurements for upper limb exercises.
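One way to impose the body-segment length constraint described above is to project the noisy child-joint estimate back onto a sphere of fixed radius around the parent joint. This is a simplified sketch of that idea only; the joint names and lengths are illustrative, and the paper's full method additionally uses depth/RGB information and an orientation constraint:

```python
import math

def enforce_segment_length(parent, child, target_length):
    """Move the child joint along the parent->child direction so that the
    body segment has the expected anthropometric length."""
    d = [c - p for p, c in zip(parent, child)]
    norm = math.sqrt(sum(x * x for x in d))
    if norm == 0:
        raise ValueError("parent and child joints coincide")
    return [p + x * target_length / norm for p, x in zip(parent, d)]

shoulder = [0.0, 0.0, 0.0]
elbow_noisy = [0.0, 0.0, 0.4]   # Kinect estimate drifts over time
print([round(c, 3) for c in enforce_segment_length(shoulder, elbow_noisy, 0.3)])  # [0.0, 0.0, 0.3]
```

Applying this per frame removes the segment-length variance that the abstract reports, at the cost of trusting the direction estimate.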

  11. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies 1 - 30 MeV is desired while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  12. Must Kohn-Sham oscillator strengths be accurate at threshold?

    SciTech Connect

    Yang Zenghui; Burke, Kieron; Faassen, Meta van

    2009-09-21

    The exact ground-state Kohn-Sham (KS) potential for the helium atom is known from accurate wave function calculations of the ground-state density. The threshold for photoabsorption from this potential matches the physical system exactly. By carefully studying its absorption spectrum, we show the answer to the title question is no. To address this problem in detail, we generate a highly accurate simple fit of a two-electron spectrum near the threshold, and apply the method to both the experimental spectrum and that of the exact ground-state Kohn-Sham potential.

  13. Can quasiclassical trajectory calculations reproduce the extreme kinetic isotope effect observed in the muonic isotopologues of the H + H2 reaction?

    PubMed

    Jambrina, P G; García, Ernesto; Herrero, Víctor J; Sáez-Rábanos, Vicente; Aoiz, F J

    2011-07-21

    Rate coefficients for the mass extreme isotopologues of the H + H(2) reaction, namely, Mu + H(2), where Mu is muonium, and Heμ + H(2), where Heμ is a He atom in which one of the electrons has been replaced by a negative muon, have been calculated in the 200-1000 K temperature range by means of accurate quantum mechanical (QM) and quasiclassical trajectory (QCT) calculations and compared with the experimental and theoretical results recently reported by Fleming et al. [Science 331, 448 (2011)]. The QCT calculations can reproduce the experimental and QM rate coefficients and kinetic isotope effect (KIE), k(Mu)(T)/k(Heμ)(T), if the Gaussian binning procedure (QCT-GB)--weighting the trajectories according to their proximity to the right quantal vibrational action--is applied. The analysis of the results shows that the large zero point energy of the MuH product is the key factor for the large KIE observed.
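The Gaussian binning (GB) weighting can be sketched in a few lines: each trajectory is weighted by a Gaussian in the distance between its final vibrational action and the nearest integer quantum number, so products with "half-quantum" actions contribute almost nothing. The width sigma below is an illustrative choice, not a value from the paper:

```python
import math

def gaussian_binning_weight(action, sigma=0.1):
    """Weight a quasiclassical trajectory by how close its final vibrational
    action lies to the nearest integer quantum number (Gaussian binning)."""
    nearest = round(action)
    return math.exp(-((action - nearest) ** 2) / (2 * sigma ** 2))

print(round(gaussian_binning_weight(0.0), 3))  # 1.0  (exactly quantized product)
print(round(gaussian_binning_weight(0.5), 3))  # 0.0  (half-quantum product is discarded)
```

Rate coefficients are then built from weighted sums over trajectories instead of simple counts, which is how the QCT-GB procedure recovers the quantal zero-point-energy constraint on MuH.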

  14. Dynamic pseudos: How accurate outside their parent case?

    SciTech Connect

    Ekrann, S.; Mykkeltveit, J.

    1995-12-31

    If properly constructed, dynamic pseudos allow the parent solution from which they were derived to be exactly reproduced, in a certain well-defined sense, in a subsequent coarse grid simulation. The paper reports extensive numerical experimentation, in 1D homogeneous and heterogeneous media, to determine the performance of pseudos when used outside their parent case. The authors perturb fluid viscosities and injection rate, as well as realization. Parent solutions are produced analytically, via a generalization of the Buckley-Leverett technique, as are true solutions in off-parent cases. Capillarity is neglected in these experiments, while gravity is sometimes retained in order to force rate sensitivity.

  15. Measurement of Liver Iron Concentration by MRI Is Reproducible

    PubMed Central

    Alústiza, José María; Emparanza, José I.; Castiella, Agustín; Casado, Alfonso; Aldazábal, Pablo; San Vicente, Manuel; Garcia, Nerea; Asensio, Ana Belén; Banales, Jesús; Salvador, Emma; Moyua, Aranzazu; Arozena, Xabier; Zarco, Miguel; Jauregui, Lourdes; Vicente, Ohiana

    2015-01-01

    Purpose. The objectives were (i) construction of a phantom to reproduce the behavior of iron overload in the liver by MRI and (ii) assessment of the variability of a previously validated method to quantify liver iron concentration between different MRI devices using the phantom and patients. Materials and Methods. A phantom reproducing the liver/muscle ratios of two patients with intermediate and high iron overload was constructed. Nine patients with different levels of iron overload were studied in 4 multivendor devices, and 8 of them were studied twice in the machine where the model was developed. The phantom was analysed in the same equipment and 14 times in the reference machine. Results. FeCl3 solutions containing 0.3, 0.5, 0.6, and 1.2 mg Fe/mL were chosen to generate the phantom. The average intramachine variability for patients was 10% and the intermachine variability 8%. For the phantom, the intramachine coefficient of variation was always below 0.1 and the average intermachine variability was 10% for moderate and 5% for high iron overload. Conclusion. The phantom reproduces the behavior of patients with moderate or high iron overload. The proposed method of calculating liver iron concentration is reproducible in several different 1.5 T systems. PMID:25874207
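The intra- and intermachine variability figures above are coefficients of variation; a minimal sketch of that computation (the concentrations below are hypothetical examples, not the paper's data):

```python
import math

def coefficient_of_variation(measurements):
    """CV = sample standard deviation / mean, the usual summary of
    scan-rescan or machine-to-machine variability."""
    n = len(measurements)
    mean = sum(measurements) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in measurements) / (n - 1))
    return sd / mean

# Hypothetical liver iron concentrations of one phantom measured on four devices:
lic = [80.0, 85.0, 78.0, 83.0]
print(f"intermachine CV: {coefficient_of_variation(lic):.2%}")  # intermachine CV: 3.81%
```

A CV below 0.1 (10%), as reported for the phantom, indicates the measurements cluster tightly around their mean across devices.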

  16. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  17. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  18. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  19. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  20. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND... joint tightness, mechanical strength and the nature of the fluid handled....

  1. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    ERIC Educational Resources Information Center

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  2. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    ERIC Educational Resources Information Center

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  3. Reproducibility of heart rate turbulence indexes in heart failure patients.

    PubMed

    D'Addio, Gianni; Cesarelli, Mario; Corbi, Graziamaria; Romano, Maria; Furgi, Giuseppe; Ferrara, Nicola; Rengo, Franco

    2010-01-01

    Cardiovascular oscillations following spontaneous ventricular premature complexes (VPC) are characterized by a short-term heart rate fluctuation known as heart rate turbulence (HRT), described by the so-called turbulence onset (TO) and slope (TS). Despite a recent written consensus on the standard of HRT measurement, reproducibility data are lacking. The aim of this paper was a reproducibility study of HRT indexes in heart failure (HF) patients. Eleven HF patients underwent two 24 h ECG Holter recordings, spaced 7 ± 5 days apart. A paired t test was used to assess the clinical stability of patients during the study period and the number of VPC in the couples of Holter recordings. Both TO and TS indexes were calculated for each isolated VPC and, owing to their skewed distribution, the reproducibility of median and mean TO and TS was studied by the Bland-Altman technique. Results showed that median HRT indexes might be preferred to the commonly suggested mean values and that, although TO showed a lower bias value than TS, TS can be considered much more reproducible than TO when comparing limits of agreement with normal values. These preliminary results suggest the use of median instead of mean HRT index values and a reliability of the turbulence slope greater than that of the turbulence onset index.
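    The Bland-Altman computation referred to above reduces to a mean paired difference (the bias) plus limits of agreement at bias ± 1.96 times the sample SD of the differences. A minimal sketch, using hypothetical turbulence-slope values (not data from the study):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two paired measurement series.

    Returns the bias (mean of the paired differences) and the 95% limits
    of agreement (bias +/- 1.96 * sample SD of the differences).
    """
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical median TS values from two Holter recordings per patient
ts_first = [4.1, 2.8, 5.0, 3.3, 6.2]
ts_second = [3.9, 3.0, 4.7, 3.6, 6.0]
bias, (lower, upper) = bland_altman(ts_first, ts_second)
```

    Narrow limits of agreement relative to the normal range of an index indicate good test-retest reproducibility, which is the comparison the authors make between TO and TS.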

  4. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    PubMed Central

    Al-Hosni, Zainab S; Al-Khabori, Murtadha; Al-Mamari, Sahimah; Al-Qasabi, Jamal; Davis, Hiedi; Al-Lawati, Hatim; Al-Riyami, Arwa Z

    2016-01-01

    Objectives Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 109/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (average of platelet counts in 10 fields using 100× objective multiplied by 20). Data were analyzed using intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results The ICC across the four raters was 0.840, indicating excellent agreement. The median difference of the two most experienced raters was 0 (range: -64 to 78). The level of platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts. PMID:27974955
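    The predefined counting rule quoted above (average platelet count over 10 fields at the 100× objective, multiplied by 20) is simple enough to state as code. A minimal sketch with made-up field counts; the function name and data are illustrative only:

```python
def manual_platelet_estimate(field_counts, factor=20):
    """Manual platelet estimate from a peripheral blood film.

    `field_counts`: platelets counted in each of 10 oil-immersion
    (100x objective) fields. The mean count per field is multiplied by
    the conversion factor (20 in the study above), giving an estimate
    in units of 10^9/L.
    """
    if len(field_counts) != 10:
        raise ValueError("the protocol averages exactly 10 fields")
    return sum(field_counts) / len(field_counts) * factor

# Hypothetical field counts for a thrombocytopenic film
counts = [3, 2, 4, 3, 2, 3, 4, 2, 3, 4]
estimate = manual_platelet_estimate(counts)  # 60 x 10^9/L
```

    Each rater in the study applied this same rule independently, and the agreement between their estimates was then summarized with the intraclass correlation coefficient.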

  5. Reproducibility of anthropometric measurements in children: a longitudinal study.

    PubMed

    Leppik, Aire; Jürimäe, Toivo; Jürimäe, Jaak

    2004-03-01

    The purpose of this study was to establish the reproducibility of a series of anthropometric measures performed twice within one week, repeated over a three-year period, in boys and girls. The subjects of this investigation were 39 children (21 boys and 18 girls), 9-10 years of age at the beginning of the study. Children were measured three times at one-year intervals. Children were classified as Tanner stage 1-2 at the first measurements, stage 1-3 at the second measurements and stage 1-4 at the third measurements. Body height and weight were measured and BMI calculated. All anthropometric parameters were measured according to the protocol recommended by the International Society for the Advancement of Kinanthropometry (Norton & Olds 1996). Nine skinfolds, 13 girths, eight lengths and eight breadths/lengths were measured. The reproducibility of body height (r = 0.995-0.999), body weight (r = 0.990-0.999) and BMI (r = 0.969-0.999) was very high in boys and girls. The intraclass correlations (ICC), technical errors (TE) and coefficients of variation (CV) were quite different depending on the measurement site of the skinfold thickness. It was surprising that the ICCs were highest and the TEs and CVs lowest during the second year of measurement. The computed ICCs were high, and the TE and CV values were quite similar and relatively low, in girth, length and breadth/length measurements. It was concluded that the reproducibility of girths, lengths and breadths/lengths in children is very high and the reproducibility of skinfolds is high. Specifically, the reproducibility is very high immediately before puberty in boys and girls.
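    The technical error of measurement (TE) and coefficient of variation (CV) reported above are standard reproducibility statistics for paired repeat measurements. A minimal sketch, assuming the usual Dahlberg formula for TE; the skinfold data are hypothetical, not values from the study:

```python
import math

def technical_error(x1, x2):
    """Technical error of measurement (TEM) for paired repeat measurements.

    Dahlberg formula: TEM = sqrt(sum(d_i^2) / (2n)), where d_i are the
    differences between the two trials on subject i.
    """
    diffs = [a - b for a, b in zip(x1, x2)]
    return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))

def coefficient_of_variation(x1, x2):
    """Relative TEM (%CV): TEM divided by the grand mean, times 100."""
    grand_mean = (sum(x1) + sum(x2)) / (len(x1) + len(x2))
    return 100 * technical_error(x1, x2) / grand_mean

# Hypothetical triceps skinfold measurements (mm) taken one week apart
trial1 = [8.2, 10.1, 7.4, 9.0]
trial2 = [8.0, 10.5, 7.1, 9.2]
tem = technical_error(trial1, trial2)
cv = coefficient_of_variation(trial1, trial2)
```

    Expressing TE relative to the grand mean is what makes the CV comparable across measurement sites of very different sizes, such as skinfolds versus girths.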

  6. EMG analysis of trapezius and masticatory muscles: experimental protocol and data reproducibility.

    PubMed

    Sforza, C; Rosati, R; De Menezes, M; Musto, F; Toma, M

    2011-09-01

    We aimed to define a standardised protocol for the electromyographic evaluation of trapezius muscle in dentistry and to assess its within- and between-session repeatability. Surface electromyography of trapezius, masseter and temporal muscles was performed in 40 healthy subjects aged 20-35 years during shoulder elevation, and maximum teeth clenching with and without cotton rolls. Two repetitions were made both within (same electrodes) and between sessions (different electrodes). Maximum voluntary clench on cotton rolls was used to standardise the potentials of the six analysed muscles with tooth contact; shoulder elevation was used to standardise the upper trapezius potentials. From the standardised electromyographic potentials, several indices (muscle symmetry; masticatory muscle torque and relative activity; total masticatory muscle activity; trapezius cervical load, percentage co-contraction of trapezius during teeth clenching) were computed; random (technical error of measurement) and systematic (Student's t-test, Analysis of Variance) errors were assessed. For all indices, no systematic errors were found between the two separate data collection sessions. Within session, limited (lower than 8%) technical errors of measurement were found for temporalis and masseter symmetry, torque and activity indices, and the trapezius cervical load. Larger random errors were obtained for trapezius symmetry and total masticatory muscle activity (up to 20%). Between sessions, no significant differences were found for trapezius co-contraction. In conclusion, a protocol for the standardisation of trapezius muscle that may be used within dental clinical applications was defined, and the repeatability of masseter, temporalis and trapezius electromyographic recordings for serial assessments was assessed in healthy subjects.

  7. Experimentally reproduced relict enstatite in porphyritic chondrules of enstatite chondrite composition

    NASA Technical Reports Server (NTRS)

    Lofgren, Gary E.; Dehart, John M.; Dickinson, Tammy L.

    1993-01-01

    Experiments are presented that test a model for the origin of porphyritic pyroxene (PP) chondrules in enstatite chondrites that contain phenocrysts of enstatite with blue cathodoluminescence (CL) set in a matrix of radial, dendritic enstatite with red CL. Established one-atmosphere, gas-mixing techniques were used. Relict enstatite phenocrysts with blue CL in a matrix of coarsely radial to dendritic enstatite with red CL were successfully produced. The relict crystals are preserved in runs with a melt time of 36 minutes or less at 1537 C. The relicts remain angular with smooth crystal/melt interfaces, and thus melting has occurred uniformly. Partial melting does occur along fractures produced when the blue CL enstatite was initially grown and cooled through the proto/ortho enstatite transition with the attendant volume change. There is either reaction with the melt and diffusion of Mn and Cr into the blue CL En, or there is an overgrowth of red CL En along the fractures. The bulk of the relicts remain blue. With decreasing melt time, the melt enclosing the relicts crystallized to a coarsely radial to dendritic to microporphyritic texture composed of enstatite with a bright red CL. The blue CL En has Mn and Cr contents at or below the detection limits of the electron probe, as described in earlier studies and in natural blue CL En. In the red CL En of this study, the Mn, Al2O3, and Cr are at previously observed levels, and the levels change rapidly.

  8. A reproducible approach to high-throughput biological data acquisition and integration.

    PubMed

    Börnigen, Daniela; Moon, Yo Sup; Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A; Miropolsky, Larissa; Sweeney, Christopher; Morgan, Xochitl C; Garrett, Wendy S; Huttenhower, Curtis

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa.

  9. Repeatability and reproducibility of product ion abundances in electron capture dissociation mass spectrometry of peptides.

    PubMed

    Ben Hamidane, Hisham; Vorobyev, Aleksey; Tsybin, Yury O

    2011-01-01

    Site-specific reproducibility and repeatability of electron capture dissociation (ECD) in Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) are of fundamental importance for product ion abundance (PIA)-based peptide and protein structure analysis. However, despite the growing interest in ECD PIA-based applications, these parameters have not yet been investigated in a consistent manner. Here, we first provide a detailed description of the experimental parameters for ECD-based tandem mass spectrometry performed on a hybrid linear ion trap (LTQ) FT-ICR MS. In the following, we describe the evaluation and comparison of ECD and infrared multiphoton dissociation (IRMPD) PIA methodologies upon variation of a number of experimental parameters, for example, cathode potential (electron energy), laser power, electron and photon irradiation periods and pre-irradiation delays, as well as precursor ion number. Ranges of experimental parameters that yielded an average PIA variation below 5% and 15% were determined for ECD and IRMPD, respectively. We report cleavage site-dependent ECD PIA variation below 20% and correlation coefficients between fragmentation patterns above 0.95 for experiments performed on three FT-ICR MS instruments. Overall, the encouraging results obtained for ECD PIA reproducibility and repeatability support the use of ECD PIA as a complementary source of information to m/z data in radical-induced dissociation applied for peptide and protein structure analysis.

  10. Calculation of Accurate Hexagonal Discontinuity Factors for PARCS

    SciTech Connect

    Pounders. J., Bandini, B. R. , Xu, Y, and Downar, T. J.

    2007-11-01

    In this study we derive a methodology for calculating discontinuity factors consistent with the Triangle-based Polynomial Expansion Nodal (TPEN) method implemented in PARCS for hexagonal reactor geometries. The accuracy of coarse-mesh nodal methods is greatly enhanced by permitting flux discontinuities at node boundaries, but the practice of calculating discontinuity factors from infinite-medium (zero-current) single bundle calculations may not be sufficiently accurate for more challenging problems in which there is a large amount of internodal neutron streaming. The authors therefore derive a TPEN-based method for calculating discontinuity factors that are exact with respect to generalized equivalence theory. The method is validated by reproducing the reference solution for a small hexagonal core.

  11. Direct computation of parameters for accurate polarizable force fields

    SciTech Connect

    Verstraelen, Toon Vandenbrande, Steven; Ayers, Paul W.

    2014-11-21

    We present an improved electronic linear response model to incorporate polarization and charge-transfer effects in polarizable force fields. This model is a generalization of the Atom-Condensed Kohn-Sham Density Functional Theory (DFT), approximated to second order (ACKS2): it can now be defined with any underlying variational theory (next to KS-DFT) and it can include atomic multipoles and off-center basis functions. Parameters in this model are computed efficiently as expectation values of an electronic wavefunction, obviating the need for their calibration, regularization, and manual tuning. In the limit of a complete density and potential basis set in the ACKS2 model, the linear response properties of the underlying theory for a given molecular geometry are reproduced exactly. A numerical validation with a test set of 110 molecules shows that very accurate models can already be obtained with fluctuating charges and dipoles. These features greatly facilitate the development of polarizable force fields.

  12. Covalent and ionic nature of the dative bond and account of accurate ammonia borane binding enthalpies.

    PubMed

    Plumley, Joshua A; Evanseck, Jeffrey D

    2007-12-27

    The inherent difficulty in modeling the energetic character of the B-N dative bond has been investigated utilizing density functional theory and ab initio methods. The underlying influence of basis set size and functions, thermal corrections, and basis set superposition error (BSSE) on the predicted binding enthalpy of ammonia borane (H3B-NH3) and four methyl-substituted ammonia trimethylboranes ((CH3)3B-N(CH3)nH3-n; n = 0-3) has been evaluated and compared with experiment. HF, B3LYP, MPW1K, MP2, QCISD, and QCISD(T) have been utilized with a wide range of Pople and correlation-consistent basis sets, totaling 336 levels of theory. MPW1K, B3LYP, and HF result in less BSSE and converge to binding enthalpies with fewer basis functions than post-SCF techniques; however, the methods fail to model experimental binding enthalpies and trends accurately, producing mean absolute deviations (MADs) of 5.1, 10.8, and 16.3 kcal/mol, respectively. Despite slow convergence, MP2, QCISD, and QCISD(T) using the 6-311++G(3df,2p) basis set reproduce the experimental binding enthalpy trend and result in lower MADs of 2.2, 2.6, and 0.5 kcal/mol, respectively, when corrected for BSSE and a residual convergence error of ca. 1.3-1.6 kcal/mol. Accuracy of the predicted binding enthalpy is linked to correct determination of the bond's dative character given by charge-transfer frustration, QCTF = -(Delta QN + Delta QB). Frustration gauges the incompleteness of charge transfer between the donor and the acceptor. The binding enthalpy across ammonia borane and methylated complexes is correlated to its dative character (R2 = 0.91), where a more dative bond (less charge-transfer frustration) results in a weaker binding enthalpy. However, a balance of electronic and steric factors must be considered to explain trends in experimentally reported binding enthalpies. Dative bond descriptors, such as bond ionicity and covalency are important in the accurate characterization of the dative bond. The B

  13. On the importance of recrystallization to reproduce the Taylor impact specimen shape of a pure nickel

    NASA Astrophysics Data System (ADS)

    Couque, Hervé

    2015-09-01

    Taylor tests are a means to investigate the dynamic plastic and failure behaviour of metals under compression. By taking into account the strengthening that occurs at high strain rates, the final diameter of a Taylor specimen of pure nickel impacted at 453 m/s has been numerically reproduced to within 13%. Through post-mortem observations of the specimen impacted at 453 m/s, a recrystallization process was found to occur, resulting in a softening of the pure nickel. Subsequent numerical simulations taking this softening into account were found to reduce the difference between the experimental and numerical diameters to 10%.

  14. Spin-coating process evolution and reproducibility for power-law fluids.

    PubMed

    Jardim, P L G; Michels, A F; Horowitz, F

    2014-03-20

    A distinct development of an exact analytical solution for power-law fluids during the spin-coating process is presented for temporal and spatial thickness evolution, after steady-state conditions are attained. This solution leads to the definition of a characteristic time, related to the memory of the initial thickness profile. Previously obtained experimental data, for several rotation speeds and carboxymethylcellulose concentrations in water, are quantitatively analyzed through the evaluation of their characteristic times and compared with theoretical predictions, thus allowing better understanding of thickness profile evolution and of process reproducibility.

  15. Generation of reproducible turbulent inflows for wind tunnel applications using active grids

    NASA Astrophysics Data System (ADS)

    Kroeger, Lars; Guelker, Gerd; Peinke, Joachim

    2016-11-01

    Turbulent flows are omnipresent in nature. In the case of wind energy applications, reproducible measurements in situ are quite difficult; research in turbulence therefore demands experimental setups with reproducible turbulent flow fields. To simulate the outdoor situation in a wind tunnel, an active grid can be used. It consists of horizontal and vertical rotating axes with attached square flaps that can be moved individually. This dynamically driven setup, and the possibility of repeating the motions of the active grid axes, permits the generation of reproducible, statistically well-defined turbulence with a wide range of statistical behavior. The objective of this work is to create turbulence with two active grids of different dimensions, to establish comparable setups in our available wind tunnel facilities. In this study the wake of the active grids was investigated by high-speed PIV and hotwire measurements. To determine the similarities and limitations between the setups of different dimensions, the hotwire data are compared using higher-order statistics, increment analysis and the power spectra. The PIV data are used to observe spatial correlations and the prevailing length scales in the turbulent wakes. First results of this comparison are shown.

  16. Ab initio molecular dynamics of liquid water using embedded-fragment second-order many-body perturbation theory towards its accurate property prediction

    PubMed Central

    Willow, Soohaeng Yoo; Salim, Michael A.; Kim, Kwang S.; Hirata, So

    2015-01-01

    A direct, simultaneous calculation of properties of a liquid using an ab initio electron-correlated theory has long been unthinkable. Here we present structural, dynamical, and response properties of liquid water calculated by ab initio molecular dynamics using the embedded-fragment spin-component-scaled second-order many-body perturbation method with the aug-cc-pVDZ basis set. This level of theory is chosen as it accurately and inexpensively reproduces the water dimer potential energy surface from the coupled-cluster singles, doubles, and noniterative triples with the aug-cc-pVQZ basis set, which is nearly exact. The calculated radial distribution function, self-diffusion coefficient, coordination number, and dipole moment, as well as the infrared and Raman spectra, are in excellent agreement with experimental results. The shapes and widths of the OH stretching bands in the infrared and Raman spectra and their isotropic-anisotropic Raman noncoincidence, which reflect the diverse local hydrogen-bond environment, are also reproduced computationally. The simulation also reveals intriguing dynamic features of the environment, which are difficult to probe experimentally, such as a surprisingly large fluctuation in the coordination number and the detailed mechanism by which the hydrogen donating water molecules move across the first and second shells, thereby causing this fluctuation. PMID:26400690

  17. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  18. An exploration of graph metric reproducibility in complex brain networks

    PubMed Central

    Telesford, Qawi K.; Burdette, Jonathan H.; Laurienti, Paul J.

    2013-01-01

    The application of graph theory to brain networks has become increasingly popular in the neuroimaging community. These investigations and analyses have led to a greater understanding of the brain's complex organization. More importantly, it has become a useful tool for studying the brain under various states and conditions. With the ever expanding popularity of network science in the neuroimaging community, there is increasing interest to validate the measurements and calculations derived from brain networks. Underpinning these studies is the desire to use brain networks in longitudinal studies or as clinical biomarkers to understand changes in the brain. A highly reproducible tool for brain imaging could potentially prove useful as a clinical tool. In this review, we examine recent studies in network reproducibility and their implications for analysis of brain networks. PMID:23717257

  19. Utility, reliability and reproducibility of immunoassay multiplex kits.

    PubMed

    Tighe, Paddy; Negm, Ola; Todd, Ian; Fairclough, Lucy

    2013-05-15

    Multiplex technologies are becoming increasingly important in biomarker studies as they enable patterns of biomolecules to be examined, which provide a more comprehensive depiction of disease than individual biomarkers. They are crucial in deciphering these patterns, but it is essential that they are endorsed for reliability, reproducibility and precision. Here we outline the theoretical basis of a variety of multiplex technologies: Bead-based multiplex immunoassays (i.e. Cytometric Bead Arrays, Luminex™ and Bio-Plex Pro™), microtitre plate-based arrays (i.e. Mesoscale Discovery (MSD) and Quantsys BioSciences QPlex), Slide-based Arrays (i.e. FastQuant™) and reverse phase protein arrays. Their utility, reliability and reproducibility are discussed.

  20. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    NASA Astrophysics Data System (ADS)

    Gaona, Enrique

    2003-09-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and processor units as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the reproducibility problems of the automatic exposure control (AEC) are smaller than those of the processor units, because almost all processors fall outside the acceptable variation limits, which can affect mammographic image quality and the dose to the breast. Only four mammography units met the minimum score established by the ACR and FDA for the phantom image.

  1. jicbioimage: a tool for automated and reproducible bioimage analysis

    PubMed Central

    Hartley, Matthew

    2016-01-01

    There has been steady improvement in methods for capturing bioimages. However analysing these images still remains a challenge. The Python programming language provides a powerful and flexible environment for scientific computation. It has a wide range of supporting libraries for image processing but lacks native support for common bioimage formats, and requires specific code to be written to ensure that suitable audit trails are generated and analyses are reproducible. Here we describe the development of a Python tool that: (1) allows users to quickly view and explore microscopy data; (2) generate reproducible analyses, encoding a complete history of image transformations from raw data to final result; and (3) scale up analyses from initial exploration to high throughput processing pipelines, with a minimal amount of extra effort. The tool, jicbioimage, is open source and freely available online at http://jicbioimage.readthedocs.io. PMID:27896026

  2. Reproducibility Issues: Avoiding Pitfalls in Animal Inflammation Models.

    PubMed

    Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E

    2017-01-01

    In light of an enhanced awareness of ethical questions and the ever increasing costs of working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite evident advancements in the search for alternatives, that is, replacing, reducing, and refining animal experiments (the three R's of Russell and Burch, 1959), understanding the complex interactions of the cells of the immune system, the nervous system and the affected tissue/organ during inflammation critically relies on in vivo models. Consequently, scientific advancement and ultimately novel therapeutic interventions depend on improving the reproducibility of animal inflammation models. As a prelude to the hands-on protocols described in the remainder of this volume, here we summarize potential pitfalls of preclinical animal research and provide resources and background reading on how to avoid them.

  3. Data Sharing and Reproducible Clinical Genetic Testing: Successes and Challenges

    PubMed Central

    Yang, Shan; Cline, Melissa; Zhang, Can; Paten, Benedict; Lincoln, Stephen E.

    2016-01-01

    Open sharing of clinical genetic data promises to both monitor and eventually improve the reproducibility of variant interpretation among clinical testing laboratories. A significant public data resource has been developed by the NIH ClinVar initiative, which includes submissions from hundreds of laboratories and clinics worldwide. We analyzed a subset of ClinVar data focused on specific clinical areas and we find high reproducibility (>90% concordance) among labs, although challenges for the community are clearly identified in this dataset. We further review results for the commonly tested BRCA1 and BRCA2 genes, which show even higher concordance, although the significant fragmentation of data into different silos presents an ongoing challenge now being addressed by the BRCA Exchange. We encourage all laboratories and clinics to contribute to these important resources. PMID:27896972

  4. jicbioimage: a tool for automated and reproducible bioimage analysis.

    PubMed

    Olsson, Tjelvar S G; Hartley, Matthew

    2016-01-01

    There has been steady improvement in methods for capturing bioimages. However analysing these images still remains a challenge. The Python programming language provides a powerful and flexible environment for scientific computation. It has a wide range of supporting libraries for image processing but lacks native support for common bioimage formats, and requires specific code to be written to ensure that suitable audit trails are generated and analyses are reproducible. Here we describe the development of a Python tool that: (1) allows users to quickly view and explore microscopy data; (2) generate reproducible analyses, encoding a complete history of image transformations from raw data to final result; and (3) scale up analyses from initial exploration to high throughput processing pipelines, with a minimal amount of extra effort. The tool, jicbioimage, is open source and freely available online at http://jicbioimage.readthedocs.io.

  5. Properties of galaxies reproduced by a hydrodynamic simulation.

    PubMed

    Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L

    2014-05-08

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  6. Implementation of a portable and reproducible parallel pseudorandom number generator

    SciTech Connect

    Pryor, D.V.; Cuccaro, S.A.; Mascagni, M.; Robinson, M.L.

    1994-12-31

    The authors describe in detail the parallel implementation of a family of additive lagged-Fibonacci pseudorandom number generators. The theoretical structure of these generators is exploited to preserve their well-known randomness properties and to provide a parallel system of distinct cycles. The algorithm presented here solves the reproducibility problem for a far larger class of parallel Monte Carlo applications than has previously been possible. In particular, Monte Carlo applications that undergo "splitting" can be coded to be reproducible, independent both of the number of processors and of the execution order of the parallel processes. A library of portable C routines (available from the authors) that implements these ideas is also described.
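
    The additive lagged-Fibonacci recurrence the authors parallelize, x_n = (x_{n-r} + x_{n-s}) mod 2^m, can be sketched as follows (the lags, word size, and seeding policy here are illustrative; the paper's library fixes these so that each process gets a distinct full-period cycle):

```python
class AdditiveLFG:
    """Additive lagged-Fibonacci generator:
    x_n = (x_{n-r} + x_{n-s}) mod 2**m, with lags r > s."""

    def __init__(self, seed, r=17, s=5, m=32):
        assert len(seed) == r and any(x % 2 for x in seed), \
            "need r seed words, at least one odd, for a full-period cycle"
        self.r, self.s, self.mask = r, s, (1 << m) - 1
        self.state, self.i = list(seed), 0   # circular buffer, i -> oldest word

    def next(self):
        r, s, i = self.r, self.s, self.i
        # state[i] holds x_{n-r}; the s-th most recent word sits at (i - s) mod r
        value = (self.state[i] + self.state[(i - s) % r]) & self.mask
        self.state[i] = value                # overwrite the oldest word
        self.i = (i + 1) % r
        return value

gen = AdditiveLFG(list(range(1, 18)))        # deterministic toy seed
stream = [gen.next() for _ in range(10)]
```

    Reproducibility is the point: the same seed words always yield the same stream, regardless of when or where the generator runs.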

  7. Empowering Multi-Cohort Gene Expression Analysis to Increase Reproducibility

    PubMed Central

    Haynes, Winston A; Vallania, Francesco; Liu, Charles; Bongen, Erika; Tomczak, Aurelie; Andres-Terrè, Marta; Lofgren, Shane; Tam, Andrew; Deisseroth, Cole A; Li, Matthew D; Sweeney, Timothy E

    2016-01-01

    A major contributor to the scientific reproducibility crisis has been that the results from homogeneous, single-center studies do not generalize to heterogeneous, real world populations. Multi-cohort gene expression analysis has helped to increase reproducibility by aggregating data from diverse populations into a single analysis. To make the multi-cohort analysis process more feasible, we have assembled an analysis pipeline which implements rigorously studied meta-analysis best practices. We have compiled and made publicly available the results of our own multi-cohort gene expression analysis of 103 diseases, spanning 615 studies and 36,915 samples, through a novel and interactive web application. As a result, we have made both the process of and the results from multi-cohort gene expression analysis more approachable for non-technical users. PMID:27896970
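
    The abstract does not name the specific meta-analysis technique, but a standard best practice in this area is inverse-variance pooling of per-study effect sizes; a minimal fixed-effect sketch (the effect sizes and variances below are hypothetical):

```python
def pool_fixed_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: each study is weighted by
    the reciprocal of its variance, so precise studies count for more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)          # always <= the smallest input variance
    return pooled, pooled_var

# Three hypothetical cohorts measuring the same gene's effect size
pooled, var = pool_fixed_effect([0.8, 1.1, 0.9], [0.04, 0.09, 0.01])
```

    Aggregating heterogeneous cohorts this way is what lets a multi-cohort signal generalize where a single-center result would not.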

  8. Pressure Stabilizer for Reproducible Picoinjection in Droplet Microfluidic Systems

    PubMed Central

    Rhee, Minsoung; Light, Yooli K.; Yilmaz, Suzan; Adams, Paul D.; Saxena, Deepak

    2014-01-01

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells. PMID:25270338

  9. Pressure stabilizer for reproducible picoinjection in droplet microfluidic systems.

    PubMed

    Rhee, Minsoung; Light, Yooli K; Yilmaz, Suzan; Adams, Paul D; Saxena, Deepak; Meagher, Robert J; Singh, Anup K

    2014-12-07

    Picoinjection is a promising technique to add reagents into pre-formed emulsion droplets on chip; however, it is sensitive to pressure fluctuation, making stable operation of the picoinjector challenging. We present a chip architecture using a simple pressure stabilizer for consistent and highly reproducible picoinjection in multi-step biochemical assays with droplets. Incorporation of the stabilizer immediately upstream of a picoinjector or a combination of injectors greatly reduces pressure fluctuations enabling reproducible and effective picoinjection in systems where the pressure varies actively during operation. We demonstrate the effectiveness of the pressure stabilizer for an integrated platform for on-demand encapsulation of bacterial cells followed by picoinjection of reagents for lysing the encapsulated cells. The pressure stabilizer was also used for picoinjection of multiple displacement amplification (MDA) reagents to achieve genomic DNA amplification of lysed bacterial cells.

  10. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    ScienceCinema

    None

    2016-07-12

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scientists consonant with longstanding scientific norms.

  11. Accurately measuring dynamic coefficient of friction in ultraform finishing

    NASA Astrophysics Data System (ADS)

    Briggs, Dennis; Echaves, Samantha; Pidgeon, Brendan; Travis, Nathan; Ellis, Jonathan D.

    2013-09-01

    UltraForm Finishing (UFF) is a deterministic sub-aperture computer numerically controlled grinding and polishing platform designed by OptiPro Systems. UFF is used to grind and polish a variety of optics from simple spherical to fully freeform, and numerous materials from glasses to optical ceramics. The UFF system consists of an abrasive belt around a compliant wheel that rotates and contacts the part to remove material. This work aims to accurately measure the dynamic coefficient of friction (μ), how it changes as a function of belt wear, and how this ultimately affects material removal rates. The coefficient of friction has been examined in terms of contact mechanics and Preston's equation to determine accurate material removal rates. By accurately predicting changes in μ, polishing iterations can be more accurately predicted, reducing the total number of iterations required to meet specifications. We have established an experimental apparatus that can accurately measure μ by measuring triaxial forces during translating loading conditions or while manufacturing the removal spots used to calculate material removal rates. Using this system, we will demonstrate μ measurements for UFF belts during different states of their lifecycle and assess the material removal function from spot diagrams as a function of wear. Ultimately, we will use this system for qualifying belt-wheel-material combinations to develop a spot-morphing model to better predict instantaneous material removal functions.
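
    Preston's equation, which the authors combine with contact mechanics, states that removal rate scales with pressure and relative velocity; folding the dynamic coefficient of friction μ into the rate (an assumption consistent with the abstract's framing, not the authors' exact model; all numeric values below are hypothetical) gives a simple sketch:

```python
def removal_rate(k_p, pressure, velocity, mu=1.0):
    """Preston-type material removal rate. The classical form is
    dz/dt = k_p * P * V; scaling by the dynamic coefficient of
    friction mu models how removal drops as belt wear lowers mu."""
    return k_p * mu * pressure * velocity

# Hypothetical numbers: if belt wear drops mu from 0.6 to 0.4,
# the predicted removal rate falls by a third.
fresh = removal_rate(k_p=1e-7, pressure=2.0e4, velocity=1.5, mu=0.6)
worn  = removal_rate(k_p=1e-7, pressure=2.0e4, velocity=1.5, mu=0.4)
```

    Tracking μ over the belt lifecycle then translates directly into updated removal-rate predictions per polishing iteration.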

  12. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinates and replaces them with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations at all times, and can generate smaller ASRs. PMID:24605060

  13. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinates and replaces them with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations at all times, and can generate smaller ASRs.
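
    A minimal sketch of the grid-ID-based cloaking idea (illustrative only; the paper's two algorithms are more sophisticated): grow a block of cells around the user's cell until it covers at least K reported users, so exact coordinates never leave the clients.

```python
from collections import Counter

def cloak(user_cell, reported_cells, k):
    """Grow a square block of grid-cell IDs around the user's cell until
    it contains at least k reported users. Only cell IDs are processed;
    exact coordinates are never revealed to the anonymizer."""
    assert k <= len(reported_cells), "cannot satisfy K-anonymity"
    counts = Counter(reported_cells)
    x, y = user_cell
    radius = 0
    while True:
        region = {(i, j)
                  for i in range(x - radius, x + radius + 1)
                  for j in range(y - radius, y + radius + 1)}
        if sum(counts[c] for c in region) >= k:
            return region
        radius += 1

# Five users report only their grid-cell IDs
cells = [(0, 0), (0, 1), (1, 0), (3, 3), (0, 0)]
asr = cloak((0, 0), cells, k=4)
```

    The returned region plays the role of the ASR: small enough to be useful, large enough that the requester is indistinguishable among at least K users.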

  14. Revised Charge Equilibration Parameters for More Accurate Hydration Free Energies of Alkanes.

    PubMed

    Davis, Joseph E; Patel, Sandeep

    2010-01-01

    We present a refined alkane charge equilibration (CHEQ) force field, improving our previously reported CHEQ alkane force field [1] to better reproduce experimental hydration free energies. Experimental hydration free energies of ethane, propane, butane, pentane, hexane, and heptane are reproduced to within 3.6% on average. We demonstrate that explicit polarization results in a shift in molecular dipole moment for water molecules associated with the alkane molecule. We also show that our new parameters do not have a significant effect on the alkane-water interactions as measured by the radial distribution function (RDF).

  15. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
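
    The Dice coefficient used here for test-retest topological similarity compares two edge sets directly; a minimal sketch (the edge lists are hypothetical stand-ins for constant-density subgraphs from two scans of the same subject):

```python
def dice_similarity(edges_a, edges_b):
    """Dice coefficient between two edge sets: 2|A ∩ B| / (|A| + |B|).
    1.0 means identical topology, 0.0 means no shared edges."""
    a, b = set(edges_a), set(edges_b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Two tractography runs on the same subject, thresholded to equal density
scan1 = [(1, 2), (2, 3), (3, 4), (1, 4)]
scan2 = [(1, 2), (2, 3), (2, 4), (1, 4)]
similarity = dice_similarity(scan1, scan2)
```

    Computed within subjects this measures scan-rescan stability; computed between subjects it gives the baseline similarity against which reproducibility is judged.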

  16. Highly reproducible Bragg grating acousto-ultrasonic contact transducers

    NASA Astrophysics Data System (ADS)

    Saxena, Indu Fiesler; Guzman, Narciso; Lieberman, Robert A.

    2014-09-01

    Fiber optic acousto-ultrasonic transducers offer numerous applications as embedded sensors for impact and damage detection in industrial and aerospace applications as well as non-destructive evaluation. Superficial contact transducers with a sheet of fiber optic Bragg gratings have been demonstrated for guided wave ultrasound based measurements. It is reported here that this method of measurement provides highly reproducible guided ultrasound data for the test composite component, despite the optical fiber transducers not being permanently embedded in it.

  17. How reproducible are the measurements of leaf fluctuating asymmetry?

    PubMed Central

    2015-01-01

    Fluctuating asymmetry (FA) represents small, non-directional deviations from perfect symmetry in morphological characters. FA is generally assumed to increase in response to stress; therefore, FA is frequently used in ecological studies as an index of environmental or genetic stress experienced by an organism. The values of FA are usually small, and therefore the reliable detection of FA requires precise measurements. The reproducibility of fluctuating asymmetry (FA) was explored by comparing the results of measurements of scanned images of 100 leaves of downy birch (Betula pubescens) conducted by 31 volunteer scientists experienced in studying plant FA. The median values of FA varied significantly among the participants, from 0.000 to 0.074, and the coefficients of variation in FA for individual leaves ranged from 25% to 179%. The overall reproducibility of the results among the participants was rather low (0.074). Variation in instruments and methods used by the participants had little effect on the reported FA values, but the reproducibility of the measurements increased by 30% following exclusion of data provided by seven participants who had modified the suggested protocol for leaf measurements. The scientists working with plant FA are advised to pay utmost attention to adequate and detailed description of their data acquisition protocols in their forthcoming publications, because all characteristics of instruments and methods need to be controlled to increase the quality and reproducibility of the data. Whenever possible, the images of all measured objects and the results of primary measurements should be published as electronic appendices to scientific papers. PMID:26157612

  18. How reproducible are the measurements of leaf fluctuating asymmetry?

    PubMed

    Kozlov, Mikhail V

    2015-01-01

    Fluctuating asymmetry (FA) represents small, non-directional deviations from perfect symmetry in morphological characters. FA is generally assumed to increase in response to stress; therefore, FA is frequently used in ecological studies as an index of environmental or genetic stress experienced by an organism. The values of FA are usually small, and therefore the reliable detection of FA requires precise measurements. The reproducibility of fluctuating asymmetry (FA) was explored by comparing the results of measurements of scanned images of 100 leaves of downy birch (Betula pubescens) conducted by 31 volunteer scientists experienced in studying plant FA. The median values of FA varied significantly among the participants, from 0.000 to 0.074, and the coefficients of variation in FA for individual leaves ranged from 25% to 179%. The overall reproducibility of the results among the participants was rather low (0.074). Variation in instruments and methods used by the participants had little effect on the reported FA values, but the reproducibility of the measurements increased by 30% following exclusion of data provided by seven participants who had modified the suggested protocol for leaf measurements. The scientists working with plant FA are advised to pay utmost attention to adequate and detailed description of their data acquisition protocols in their forthcoming publications, because all characteristics of instruments and methods need to be controlled to increase the quality and reproducibility of the data. Whenever possible, the images of all measured objects and the results of primary measurements should be published as electronic appendices to scientific papers.
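
    The paper does not spell out its FA index, but a common size-corrected formulation is |L − R| / ((L + R)/2); a sketch showing how small measurement differences between observers propagate into FA (the leaf measurements are hypothetical):

```python
def fluctuating_asymmetry(left, right):
    """Size-corrected FA index: |L - R| / ((L + R) / 2).
    Because FA values are small, sub-millimetre measurement error in
    L or R translates directly into large relative error in FA."""
    return abs(left - right) / ((left + right) / 2)

# Two observers measure the same leaf's half-widths (mm) slightly differently
obs1 = fluctuating_asymmetry(21.4, 21.0)
obs2 = fluctuating_asymmetry(21.6, 20.9)
```

    A 0.3 mm disagreement between observers nearly doubles the reported FA here, which is exactly the reproducibility problem the study quantifies.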

  19. Icy: an open bioimage informatics platform for extended reproducible research.

    PubMed

    de Chaumont, Fabrice; Dallongeville, Stéphane; Chenouard, Nicolas; Hervé, Nicolas; Pop, Sorin; Provoost, Thomas; Meas-Yedid, Vannary; Pankajakshan, Praveen; Lecomte, Timothée; Le Montagner, Yoann; Lagache, Thibault; Dufour, Alexandre; Olivo-Marin, Jean-Christophe

    2012-06-28

    Current research in biology uses ever more complex computational and imaging tools. Here we describe Icy, a collaborative bioimage informatics platform that combines a community website for contributing and sharing tools and material, and software with a high-end visual programming framework for seamless development of sophisticated imaging workflows. Icy extends the reproducible research principles, by encouraging and facilitating the reusability, modularity, standardization and management of algorithms and protocols. Icy is free, open-source and available at http://icy.bioimageanalysis.org/.

  20. CRKSPH - A Conservative Reproducing Kernel Smoothed Particle Hydrodynamics Scheme

    NASA Astrophysics Data System (ADS)

    Frontiere, Nicholas; Raskin, Cody D.; Owen, J. Michael

    2017-03-01

    We present a formulation of smoothed particle hydrodynamics (SPH) that utilizes a first-order consistent reproducing kernel, a smoothing function that exactly interpolates linear fields with particle tracers. Previous formulations using reproducing kernel (RK) interpolation have had difficulties maintaining conservation of momentum due to the fact that RK kernels are not, in general, spatially symmetric. Here, we utilize a reformulation of the fluid equations such that mass, linear momentum, and energy are all rigorously conserved without any assumption about kernel symmetries, while additionally maintaining approximate angular momentum conservation. Our approach starts from a rigorously consistent interpolation theory, where we derive the evolution equations to enforce the appropriate conservation properties, at the sacrifice of full consistency in the momentum equation. Additionally, by exploiting the increased accuracy of the RK method's gradient, we formulate a simple limiter for the artificial viscosity that reduces the excess diffusion normally incurred by the ordinary SPH artificial viscosity. Collectively, we call our suite of modifications to the traditional SPH scheme Conservative Reproducing Kernel SPH, or CRKSPH. CRKSPH retains many benefits of traditional SPH methods (such as preserving Galilean invariance and manifest conservation of mass, momentum, and energy) while improving on many of the shortcomings of SPH, particularly the overly aggressive artificial viscosity and zeroth-order inaccuracy. We compare CRKSPH to two different modern SPH formulations (pressure-based SPH and compatibly differenced SPH), demonstrating the advantages of our new formulation when modeling fluid mixing, strong shock, and adiabatic phenomena.
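
    The first-order consistent reproducing kernel at the heart of this approach can be illustrated in 1D (a sketch of the RK correction only, not the full conservative CRKSPH scheme): a base kernel W is corrected to W_R = (A + B(x_j − x))·W, with A and B chosen at each evaluation point so that constants and linear fields are interpolated exactly on scattered particles.

```python
def rk_interpolate(x, xs, fs, vols, h):
    """First-order reproducing-kernel interpolation in 1D: correct a base
    kernel W with W_R = (A + B*(xj - x)) * W so that the moment conditions
    sum V_j W_R = 1 and sum V_j (xj - x) W_R = 0 hold, which makes the
    interpolant exact for constant and linear fields."""
    def W(dx):                       # simple hat kernel of support h
        q = abs(dx) / h
        return (1.0 - q) / h if q < 1.0 else 0.0

    # Moments of the uncorrected kernel over the particle distribution
    m0 = sum(v * W(xj - x) for xj, v in zip(xs, vols))
    m1 = sum(v * (xj - x) * W(xj - x) for xj, v in zip(xs, vols))
    m2 = sum(v * (xj - x) ** 2 * W(xj - x) for xj, v in zip(xs, vols))

    det = m0 * m2 - m1 * m1
    A, B = m2 / det, -m1 / det       # solve the two moment conditions
    return sum(f * v * (A + B * (xj - x)) * W(xj - x)
               for xj, f, v in zip(xs, fs, vols))

# Irregularly spaced particles carrying the linear field f(x) = 2x + 1
xs = [0.0, 0.3, 0.7, 1.1, 1.6, 2.0]
vols = [0.3, 0.35, 0.4, 0.45, 0.4, 0.3]
fs = [2 * x + 1 for x in xs]
value = rk_interpolate(0.9, xs, fs, vols, h=0.8)
```

    Plain SPH interpolation with the uncorrected W would miss the linear field on these irregular points; the correction recovers it exactly, which is the "zeroth-order inaccuracy" fix the abstract refers to.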

  1. Dosimetric algorithm to reproduce isodose curves obtained from a LINAC.

    PubMed

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and maximum absorbed dose profiles calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its complete geometry and a water phantom were modelled; the simulations were run with the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose was also reproduced for any voxel size at any point of the irradiated volume, even when the voxels are as small as a pixel. The dosimetric algorithm reproduces the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, whose maximum percentage value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo.

  2. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    PubMed Central

    Estrada Espinosa, Julio Cesar; Martínez Ovalle, Segundo Agustín; Pereira Benavides, Cinthia Kotzian

    2014-01-01

    In this work, isodose curves are obtained with a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and maximum absorbed dose profiles calculated by Monte Carlo for an 18 MV LINAC. The software reproduces the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, an 18 MV LINAC with its complete geometry and a water phantom were modelled; the simulations were run with the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose was also reproduced for any voxel size at any point of the irradiated volume, even when the voxels are as small as a pixel. The dosimetric algorithm reproduces the absorbed dose induced by a radiation beam in a water phantom, considering PDD and profiles, whose maximum percentage value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo. PMID:25045398
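
    One common way to combine PDD and off-axis profile data, which may or may not match the authors' exact algorithm, is the factorization dose(depth, r) ≈ PDD(depth) × off-axis ratio(r); a sketch with toy 18 MV-like numbers (all sample values below are hypothetical, not from the paper):

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation on sorted sample points."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("x outside sampled range")

def dose_percent(depth, off_axis, pdd_depths, pdd_values, oar_radii, oar_values):
    """Percent dose at (depth, off-axis distance) as PDD(depth) times the
    normalized off-axis ratio -- a common factorization, offered here only
    as an illustration of combining PDD and profile data."""
    return interp(depth, pdd_depths, pdd_values) * \
           interp(off_axis, oar_radii, oar_values) / 100.0

# Toy 18 MV-like PDD with build-up to a maximum near 3 cm depth (hypothetical)
pdd_d = [0, 1, 3, 10, 20]        # depth, cm
pdd_v = [35, 80, 100, 78, 50]    # percent of maximum dose
oar_r = [0, 4, 6]                # off-axis distance, cm
oar_v = [100, 95, 40]            # percent of central-axis dose
d = dose_percent(5.0, 2.0, pdd_d, pdd_v, oar_r, oar_v)
```

    Evaluating such a lookup at every voxel takes seconds, which is why a table-driven reconstruction can stand in for a days-long Monte Carlo run once the tables themselves are validated.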

  3. Reproducibility of SELDI Spectra Across Time and Laboratories

    PubMed Central

    Diao, Lixia; Clarke, Charlotte H.; Coombes, Kevin R.; Hamilton, Stanley R.; Roth, Jack; Mao, Li; Czerniak, Bogdan; Baggerly, Keith A.; Morris, Jeffrey S.; Fung, Eric T.; Bast, Robert C.

    2011-01-01

    The reproducibility of mass spectrometry (MS) data collected using surface enhanced laser desorption/ionization-time of flight (SELDI-TOF) has been questioned. This investigation was designed to test the reproducibility of SELDI data collected over time by multiple users and instruments. Five laboratories prepared arrays once every week for six weeks. Spectra were collected on separate instruments in the individual laboratories. Additionally, all of the arrays produced each week were rescanned on a single instrument in one laboratory. Lab-to-lab and array-to-array variability in alignment parameters were larger than the variability attributable to running samples during different weeks. The coefficient of variation (CV) in spectrum intensity ranged from 25% at baseline, to 80% in the matrix noise region, to about 50% during the exponential drop from the maximum matrix noise. Before normalization, the median CV of the peak heights was 72% and reduced to about 20% after normalization. Additionally, for the spectra from a common instrument, the CV ranged from 5% at baseline, to 50% in the matrix noise region, to 20% during the drop from the maximum matrix noise. Normalization reduced the variability in peak heights to about 18%. With proper processing methods, SELDI instruments produce spectra containing large numbers of reproducibly located peaks, with consistent heights. PMID:21552492
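
    The normalization step that cut peak-height CV from about 72% to about 20% can be illustrated with a sketch (total-intensity normalization is one common choice; the abstract does not pin down the exact method, and the spectra below are hypothetical): rescaling each spectrum by its summed intensity removes instrument-to-instrument scale differences.

```python
def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, the spread measure used to
    compare peak heights before and after normalization."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return var ** 0.5 / mean

def normalize(spectrum):
    """Total-intensity normalization: rescale a spectrum so its summed
    intensity is 1, cancelling overall instrument gain."""
    total = sum(spectrum)
    return [v / total for v in spectrum]

# The same three-peak spectrum measured on instruments with different gains
base = [10.0, 40.0, 25.0]
spectra = [[scale * v for v in base] for scale in (0.8, 1.0, 1.3)]
peak_raw = [s[1] for s in spectra]                  # one peak, raw heights
peak_norm = [normalize(s)[1] for s in spectra]      # same peak, normalized
cv_before = coefficient_of_variation(peak_raw)
cv_after = coefficient_of_variation(peak_norm)
```

    In this idealized case the gain difference is purely multiplicative, so normalization removes it entirely; real spectra retain residual variation, hence the reported ~20%.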

  4. A neural mechanism for sensing and reproducing a time interval

    PubMed Central

    Jazayeri, Mehrdad; Shadlen, Michael N.

    2015-01-01

    Timing plays a crucial role in sensorimotor function. The neural mechanisms that enable the brain to flexibly measure and reproduce time intervals are, however, not known. We recorded neural activity in parietal cortex of monkeys in a time reproduction task. Monkeys were trained to measure and immediately afterwards reproduce different sample intervals. While measuring an interval, neural responses had a nonlinear profile that increased with the duration of the sample interval. Activity was reset during the transition from measurement to production, and was followed by a ramping activity whose slope encoded the previously measured sample interval. We found that firing rates at the end of the measurement epoch were correlated with both the slope of the ramp and the monkey’s corresponding production interval on a trial-by-trial basis. Analysis of response dynamics further linked the rate of change of firing rates in the measurement epoch to the slope of the ramp in the production epoch. These observations suggest that, during time reproduction, an interval is measured prospectively in relation to the desired motor plan to reproduce that interval. PMID:26455307

  5. Planar heterojunction perovskite solar cells with superior reproducibility

    PubMed Central

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-01-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method. PMID:25377945

  6. Planar heterojunction perovskite solar cells with superior reproducibility

    NASA Astrophysics Data System (ADS)

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-11-01

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method.

  7. Planar heterojunction perovskite solar cells with superior reproducibility.

    PubMed

    Jeon, Ye-Jin; Lee, Sehyun; Kang, Rira; Kim, Jueng-Eun; Yeo, Jun-Seok; Lee, Seung-Hoon; Kim, Seok-Soon; Yun, Jin-Mun; Kim, Dong-Yu

    2014-11-07

    Perovskite solar cells (PeSCs) have been considered one of the competitive next generation power sources. To date, light-to-electric conversion efficiencies have rapidly increased to over 10%, and further improvements are expected. However, the poor device reproducibility of PeSCs ascribed to their inhomogeneously covered film morphology has hindered their practical application. Here, we demonstrate high-performance PeSCs with superior reproducibility by introducing small amounts of N-cyclohexyl-2-pyrrolidone (CHP) as a morphology controller into N,N-dimethylformamide (DMF). As a result, highly homogeneous film morphology, similar to that achieved by vacuum-deposition methods, as well as a high PCE of 10% and an extremely small performance deviation within 0.14% were achieved. This study represents a method for realizing efficient and reproducible planar heterojunction (PHJ) PeSCs through morphology control, taking a major step forward in the low-cost and rapid production of PeSCs by solving one of the biggest problems of PHJ perovskite photovoltaic technology through a facile method.

  8. Reproducing Kernels in Harmonic Spaces and Their Numerical Implementation

    NASA Astrophysics Data System (ADS)

    Nesvadba, Otakar

    2010-05-01

    In harmonic analysis, such as the modelling of the Earth's gravity field, the importance of Hilbert spaces of harmonic functions with a reproducing kernel is often discussed. Moreover, in the case of an unbounded domain given by the exterior of a sphere or an ellipsoid, the reproducing kernel K(x,y) can be expressed analytically by means of closed formulas or infinite series. Nevertheless, the straightforward numerical implementation of these formulas leads to dozens of problems, mostly connected with floating-point arithmetic and number representation. The contribution discusses numerical instabilities in K(x,y) and gradK(x,y) that can be overcome by employing elementary functions, in particular expm1 and log1p. The suggested evaluation scheme for reproducing kernels offers uniform formulas within the whole solution domain as well as superior speed and near-perfect accuracy (10^-16 for IEC 60559 double-precision numbers) when compared with the straightforward formulas. The formulas can be easily implemented on the majority of computer platforms, especially when the C standard library ISO/IEC 9899:1999 is available.
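
    The cancellation problem that expm1 and log1p solve can be demonstrated directly; Python's math module exposes the same C99 functions, so this sketch mirrors what the C implementation would do. Near zero, exp(x) − 1 and log(1 + x) lose most of their significant digits to rounding, while the dedicated functions evaluate the series directly and stay accurate.

```python
import math

def relative_error(approx, exact):
    return abs(approx - exact) / abs(exact)

x = 1e-12
# Naive forms: 1.0 + x and exp(x) round away the low-order bits of x,
# so the subsequent subtraction / logarithm amplifies the rounding error.
naive_expm1 = math.exp(x) - 1.0
naive_log1p = math.log(1.0 + x)
# Dedicated C99 functions: no cancellation, accurate near zero.
good_expm1 = math.expm1(x)   # = x + x**2/2 + ...
good_log1p = math.log1p(x)   # = x - x**2/2 + ...
```

    The same substitution is what turns an unstable closed-form kernel evaluation into one that is uniformly accurate across the solution domain.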

  9. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae.

    PubMed

    Cabrales-Arellano, Patricia; Islas-Flores, Tania; Thomé, Patricia E; Villanueva, Marco A

    2017-01-01

    Cassiopea xamachana jellyfish are an attractive model system to study metamorphosis and/or cnidarian-dinoflagellate symbiosis due to the ease of cultivation of their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiation into ephyrae. A subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible metamorphosis induction has not been reported. Such controlled metamorphosis induction is necessary to ensure synchronicity and reproducibility of biological, biochemical, and molecular analyses. For this purpose, we tested whether differentiation could be pharmacologically stimulated, as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol's iodine, H2O2, indomethacin, or retinol. We found that strobilation was reproducibly induced by 50 μM indomethacin after six days of exposure, and by 10-25 μM after seven days. Strobilation under optimal conditions reached 80-100% with subsequent ephyrae release after exposure. Thyroxine yielded inconsistent results, as it caused strobilation only occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae.

  10. Endoscopic Evaluation of Adenoids: Reproducibility Analysis of Current Methods

    PubMed Central

    Hermann, Juliana Sato; Sallum, Ana Carolina; Pignatari, Shirley Shizue Nagata

    2013-01-01

    Objectives To investigate the intra- and interexaminer reproducibility of the usual adenoid hypertrophy assessment methods based on nasofiberendoscopic examination. Methods Forty children of both sexes, ages ranging between 4 and 14 years, presenting with nasal obstruction and oral breathing suspected to be caused by adenoid hypertrophy, were enrolled in this study. Patients were evaluated by nasofiberendoscopy, and the records were referred to and evaluated by two experienced otolaryngologists. The examiners analysed the records according to different evaluation methods, i.e., estimated and measured percentage of choanal occlusion, as well as subjective and objective classificatory systems of adenoid hypertrophy. Results The data disclosed excellent intraexaminer reproducibility for both estimated and measured choanal occlusion. Interexaminer analysis revealed lower reproducibility rates for estimated than for measured choanal occlusion. Measured choanal occlusion also demonstrated less agreement between evaluations made through the right and left sides of the nasal cavity. Alternatively, intra- and interexaminer reliability analysis revealed higher agreement for the subjective than for the objective classificatory system. In addition, the subjective method demonstrated higher agreement than the objective classificatory system when opposite sides were compared. Conclusion Our results suggest that the measured percentage of choanal occlusion is superior to the estimated one, particularly if employed bilaterally, diminishing the lack of agreement between sides. When adenoid categorization is used instead, the authors recommend the subjective rather than the objective classificatory system of adenoid hypertrophy. PMID:23526477

  11. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae

    PubMed Central

    Cabrales-Arellano, Patricia; Islas-Flores, Tania; Thomé, Patricia E.

    2017-01-01

    Cassiopea xamachana jellyfish are an attractive model system to study metamorphosis and/or cnidarian–dinoflagellate symbiosis due to the ease of cultivation of their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiation into ephyrae. A subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible metamorphosis induction has not been reported. Such controlled metamorphosis induction is necessary to ensure synchronicity and reproducibility of biological, biochemical, and molecular analyses. For this purpose, we tested whether differentiation could be pharmacologically stimulated, as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol’s iodine, H2O2, indomethacin, or retinol. We found that strobilation was reproducibly induced by 50 μM indomethacin after six days of exposure, and by 10–25 μM after seven days. Strobilation under optimal conditions reached 80–100% with subsequent ephyrae release after exposure. Thyroxine yielded inconsistent results, as it caused strobilation only occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae. PMID:28265497

  12. Reproducibility of LCA models of crude oil production.

    PubMed

    Vafi, Kourosh; Brandt, Adam R

    2014-11-04

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply because of avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We study seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of the results of LCA GHG emissions is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and more reliable understanding of emissions.
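    The RMSE comparison described above can be written down in a few lines. The numbers below are invented for illustration only; they are not values from OPGEE or the studied models.

```python
import math

# Hypothetical WTR GHG estimates (g CO2 eq/MJ LHV) from two models,
# one value per oil field; not data from the paper.
model_a = [6.2, 9.8, 14.1, 7.5, 11.0]
model_b = [5.0, 11.2, 12.3, 8.9, 10.1]

# Root-mean-square error between the two models' per-field estimates.
rmse = math.sqrt(
    sum((a - b) ** 2 for a, b in zip(model_a, model_b)) / len(model_a)
)
```

    With these invented inputs the RMSE lands near 1.4, within the ~1-8 range the study reports for unaligned models.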

  13. Establishing a reproducible protocol for measuring index active extension strength.

    PubMed

    Matter-Parrat, V; Hidalgo Diaz, J J; Collon, S; Salazar Botero, S; Prunières, G; Ichihara, S; Facca, S; Liverneaux, P

    2017-02-01

    The goal of this study was to establish a reproducible protocol for measuring active extension strength in the index finger. The secondary objective was to correlate the force of contraction of the extensor indicis proprius, with the index extended either independently of or together with the other fingers, with hand dominance. The population studied consisted of 24 healthy volunteers, including 19 women and 20 right-handed individuals. The independent and dependent index extension strength of each hand was measured three times with a dynamometer by three examiners at Day 0 and again at Day 7. Intra- and inter-examiner reproducibility were, respectively, >0.90 and >0.75 in all cases. The independent extension strength was lower than the dependent one. There was no difference between the independent index extension strength on the dominant and non-dominant sides. The same was true for the dependent strength. Our results show that our protocol is reproducible in measuring independent and dependent index extension strength. Hand dominance had no influence on the results.

  14. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures are still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamics conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  15. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures are still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamics conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  16. Reproducible and Consistent Quantification of the Saccharomyces cerevisiae Proteome by SWATH-mass spectrometry*

    PubMed Central

    Selevsek, Nathalie; Chang, Ching-Yun; Gillet, Ludovic C.; Navarro, Pedro; Bernhardt, Oliver M.; Reiter, Lukas; Cheng, Lin-Yang; Vitek, Olga; Aebersold, Ruedi

    2015-01-01

    Targeted mass spectrometry by selected reaction monitoring (S/MRM) has proven to be a suitable technique for the consistent and reproducible quantification of proteins across multiple biological samples and a wide dynamic range. This performance profile is an important prerequisite for systems biology and biomedical research. However, the method is limited to the measurements of a few hundred peptides per LC-MS analysis. Recently, we introduced SWATH-MS, a combination of data independent acquisition and targeted data analysis that vastly extends the number of peptides/proteins quantified per sample, while maintaining the favorable performance profile of S/MRM. Here we applied the SWATH-MS technique to quantify changes over time in a large fraction of the proteome expressed in Saccharomyces cerevisiae in response to osmotic stress. We sampled cell cultures in biological triplicates at six time points following the application of osmotic stress and acquired single injection data independent acquisition data sets on a high-resolution 5600 tripleTOF instrument operated in SWATH mode. Proteins were quantified by the targeted extraction and integration of transition signal groups from the SWATH-MS datasets for peptides that are proteotypic for specific yeast proteins. We consistently identified and quantified more than 15,000 peptides and 2500 proteins across the 18 samples. We demonstrate high reproducibility between technical and biological replicates across all time points and protein abundances. In addition, we show that the abundance of hundreds of proteins was significantly regulated upon osmotic shock, and pathway enrichment analysis revealed that the proteins reacting to osmotic shock are mainly involved in the carbohydrate and amino acid metabolism. Overall, this study demonstrates the ability of SWATH-MS to efficiently generate reproducible, consistent, and quantitatively accurate measurements of a large fraction of a proteome across multiple samples. PMID

  17. Individual Drusen Segmentation and Repeatability and Reproducibility of Their Automated Quantification in Optical Coherence Tomography Images

    PubMed Central

    de Sisternes, Luis; Jonna, Gowtham; Greven, Margaret A.; Chen, Qiang; Leng, Theodore; Rubin, Daniel L.

    2017-01-01

    Purpose To introduce a novel method to segment individual drusen in spectral-domain optical coherence tomography (SD-OCT), and to evaluate its accuracy and the repeatability/reproducibility of drusen quantifications extracted from the segmentation results. Methods Our method uses a smooth interpolation of the retinal pigment epithelium (RPE) outer boundary, fitted to candidate locations in proximity to Bruch's Membrane, to identify regions of substantial lifting in the inner-RPE or inner-segment boundaries, and then separates and evaluates individual druse independently. The study included 192 eyes from 129 patients. Accuracy of drusen segmentations was evaluated by measuring the overlap ratio (OR) with manual markings, also comparing the results to a previously proposed method. Repeatability and reproducibility across scanning protocols of automated drusen quantifications were investigated in repeated SD-OCT volume pairs and compared with those measured by a commercial tool (Cirrus HD-OCT). Results Our segmentation method produced higher accuracy than a previously proposed method, showing similar differences to manual markings (0.72 ± 0.09 OR) as the measured intra- and interreader variability (0.78 ± 0.09 and 0.77 ± 0.09, respectively). The automated quantifications displayed high repeatability and reproducibility, showing a more stable behavior across scanning protocols in drusen area and volume measurements than the commercial software. Measurements of drusen slope and mean intensity showed significant differences across protocols. Conclusion Automated drusen outlines produced by our method show promising accurate results that seem relatively stable in repeated scans using the same or different scanning protocols. Translational Relevance The proposed method represents a viable tool to measure and track drusen measurements in early or intermediate age-related macular degeneration patients. PMID:28275527

  18. Submicroscopic malaria parasite carriage: how reproducible are polymerase chain reaction-based methods?

    PubMed

    Costa, Daniela Camargos; Madureira, Ana Paula; Amaral, Lara Cotta; Sanchez, Bruno Antônio Marinho; Gomes, Luciano Teixeira; Fontes, Cor Jésus Fernandes; Limongi, Jean Ezequiel; Brito, Cristiana Ferreira Alves de; Carvalho, Luzia Helena

    2014-02-01

    The polymerase chain reaction (PCR)-based methods for the diagnosis of malaria infection are expected to accurately identify submicroscopic parasite carriers. Although a significant number of PCR protocols have been described, few studies have addressed the performance of PCR amplification in cases of field samples with submicroscopic malaria infection. Here, the reproducibility of two well-established PCR protocols (nested-PCR and real-time PCR for the Plasmodium 18 small subunit rRNA gene) were evaluated in a panel of 34 blood field samples from individuals that are potential reservoirs of malaria infection, but were negative for malaria by optical microscopy. Regardless of the PCR protocol, a large variation between the PCR replicates was observed, leading to alternating positive and negative results in 38% (13 out of 34) of the samples. These findings were quite different from those obtained from the microscopy-positive patients or the unexposed individuals; the diagnosis of these individuals could be confirmed based on the high reproducibility and specificity of the PCR-based protocols. The limitation of PCR amplification was restricted to the field samples with very low levels of parasitaemia because titrations of the DNA templates were able to detect < 3 parasites/µL in the blood. In conclusion, conventional PCR protocols require careful interpretation in cases of submicroscopic malaria infection, as inconsistent and false-negative results can occur.

  19. Better living through transparency: improving the reproducibility of fMRI results through comprehensive methods reporting.

    PubMed

    Carp, Joshua

    2013-09-01

    Recent studies suggest that a greater proportion of published scientific findings than expected cannot be replicated. The field of functional neuroimaging research is no exception to this trend, with estimates of false positive results ranging from 10 % to 40 %. While false positive results in neuroimaging studies stem from a variety of causes, incomplete methodological reporting is perhaps the most obvious: Most published reports of neuroimaging studies provide ambiguous or incomplete descriptions of their methods and results. If neuroimaging researchers do not report methods and results in adequate detail, independent scientists can neither check their work for errors nor accurately replicate their efforts. Thus, I argue that comprehensive methods reporting is essential for reproducible research. I recommend three strategies for improving transparency and reproducibility in neuroimaging research: improving natural language descriptions of research protocols; sharing source code for data collection and analysis; and sharing formal, machine-readable representations of methods and results. Last, I discuss the technological and cultural barriers to implementing these recommendations and suggest steps toward overcoming those barriers.

  20. Reproducibility of transcranial magnetic stimulation metrics in the study of proximal upper limb muscles

    PubMed Central

    Sankarasubramanian, Vishwanath; Roelle, Sarah; Bonnett, Corin E; Janini, Daniel; Varnerin, Nicole; Cunningham, David A; Sharma, Jennifer S; Potter-Baker, Kelsey A; Wang, Xiaofeng; Yue, Guang H; Plow, Ela B

    2015-01-01

    Objective Reproducibility of transcranial magnetic stimulation (TMS) metrics is essential in accurately tracking recovery and disease. However, the majority of evidence pertains to reproducibility of metrics for distal upper limb muscles. We investigate for the first time the reliability of corticospinal physiology for a large proximal muscle, the biceps brachii, and relate how varying statistical analyses can influence interpretations. Methods 14 young right-handed healthy participants completed two sessions assessing resting motor threshold (RMT), motor evoked potentials (MEPs), motor map and intra-cortical inhibition (ICI) from the left biceps brachii. Analyses included paired t-tests, Pearson's, intra-class (ICC) and concordance correlation coefficients (CCC) and Bland-Altman plots. Results Unlike paired t-tests, ICC, CCC and Pearson's were >0.6, indicating good reliability for RMTs, MEP intensities and locations of map; however, values were <0.3 for MEP responses and ICI. Conclusions Corticospinal physiology, defining excitability and output in terms of intensity of the TMS device, and spatial loci are the most reliable metrics for the biceps. MEPs and variables based on MEPs are less reliable, since the biceps receives fewer cortico-motor-neuronal projections. Statistical tests of agreement and association are more powerful reliability indices than inferential tests. Significance Reliable metrics of proximal muscles, when translated to a larger number of participants, would serve to sensitively track and prognosticate function in neurological disorders such as stroke, where proximal recovery precedes distal. PMID:26111434
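    The distinction the abstract draws between association (Pearson's r) and agreement (CCC) can be illustrated with a minimal sketch. The session values below are hypothetical RMT readings, not the study's data; the point is that Lin's CCC penalizes a systematic shift between sessions that Pearson's r ignores.

```python
import statistics

def pearson(x, y):
    """Pearson correlation: association only, blind to constant offsets."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (statistics.pstdev(x) * statistics.pstdev(y))

def ccc(x, y):
    """Lin's concordance correlation: agreement, penalizes offsets."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return 2 * cov / (statistics.pvariance(x)
                      + statistics.pvariance(y) + (mx - my) ** 2)

session1 = [40.0, 45.0, 50.0, 55.0, 60.0]  # hypothetical RMT values, session 1
session2 = [50.0, 55.0, 60.0, 65.0, 70.0]  # same ranking, shifted by +10

# Perfect linear association (r = 1) but imperfect agreement (CCC = 0.5),
# which is why tests of agreement are stricter reliability indices.
```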

  1. Reproducible quantification of cancer-associated proteins in body fluids using targeted proteomics.

    PubMed

    Hüttenhain, Ruth; Soste, Martin; Selevsek, Nathalie; Röst, Hannes; Sethi, Atul; Carapito, Christine; Farrah, Terry; Deutsch, Eric W; Kusebauch, Ulrike; Moritz, Robert L; Niméus-Malmström, Emma; Rinner, Oliver; Aebersold, Ruedi

    2012-07-11

    The rigorous testing of hypotheses on suitable sample cohorts is a major limitation in translational research. This is particularly the case for the validation of protein biomarkers; the lack of accurate, reproducible, and sensitive assays for most proteins has precluded the systematic assessment of hundreds of potential marker proteins described in the literature. Here, we describe a high-throughput method for the development and refinement of selected reaction monitoring (SRM) assays for human proteins. The method was applied to generate such assays for more than 1000 cancer-associated proteins, which are functionally related to candidate cancer driver mutations. We used the assays to determine the detectability of the target proteins in two clinically relevant samples: plasma and urine. One hundred eighty-two proteins were detected in depleted plasma, spanning five orders of magnitude in abundance and reaching below a concentration of 10 ng/ml. The narrower concentration range of proteins in urine allowed the detection of 408 proteins. Moreover, we demonstrate that these SRM assays allow reproducible quantification by monitoring 34 biomarker candidates across 83 patient plasma samples. Through public access to the entire assay library, researchers will be able to target their cancer-associated proteins of interest in any sample type using the detectability information in plasma and urine as a guide. The generated expandable reference map of SRM assays for cancer-associated proteins will be a valuable resource for accelerating and planning biomarker verification studies.

  2. Communication: An accurate full 15 dimensional permutationally invariant potential energy surface for the OH + CH4 → H2O + CH3 reaction

    NASA Astrophysics Data System (ADS)

    Li, Jun; Guo, Hua

    2015-12-01

    A globally accurate full-dimensional potential energy surface (PES) for the OH + CH4 → H2O + CH3 reaction is developed using the permutation invariant polynomial-neural network approach, based on ~135,000 points computed at the coupled cluster singles, doubles, and perturbative triples level with the augmented correlation-consistent polarized valence triple-zeta basis set. The total root mean square fitting error is only 3.9 meV or 0.09 kcal/mol. This PES is shown to reproduce the energies, geometries, and harmonic frequencies of stationary points along the reaction path. Kinetic and dynamical calculations on the PES indicate good agreement with the available experimental data.

  3. Limit analysis assessment of experimental behavior of arches reinforced with GFRP materials

    NASA Astrophysics Data System (ADS)

    Basilio, Ismael; Fedele, Roberto; Lourenço, Paulo B.; Milani, Gabriele

    2014-10-01

    In this paper, a comparison is provided between results furnished by a 3D FE upper bound limit analysis and experimental results for several reinforced masonry arches tested at the University of Minho (Portugal). While delamination from the arch supports can be modelled only in an approximate way within limit analysis, the aim of the paper is to accurately reproduce the change in the failure mechanism observed experimentally, due to the introduction of strengthening elements. Both experimental and numerical results show a clear change in the failure mechanism and in the corresponding ultimate peak load. A set of simulations is also performed on previously damaged reinforced arches, to investigate the role played by the reinforcement within a proper repairing procedure. Good correlation between the experimental work and the numerical simulations is achieved.

  4. Experimental and Theoretical Reduction Potentials of Some Biologically Active ortho-Carbonyl para-Quinones.

    PubMed

    Martínez-Cifuentes, Maximiliano; Salazar, Ricardo; Ramírez-Rodríguez, Oney; Weiss-López, Boris; Araya-Maturana, Ramiro

    2017-04-04

    The rational design of quinones with specific redox properties is an issue of great interest because of their applications in pharmaceutical and material sciences. In this work, the electrochemical behavior of a series of four p-quinones was studied experimentally and theoretically. The first and second one-electron reduction potentials of the quinones were determined using cyclic voltammetry and correlated with those calculated by density functional theory (DFT) using three different functionals, BHandHLYP, M06-2x and PBE0. The differences among the experimental reduction potentials were explained in terms of structural effects on the stabilities of the formed species. DFT calculations accurately reproduced the first one-electron experimental reduction potentials with R² higher than 0.94. The BHandHLYP functional presented the best fit to the experimental values (R² = 0.957), followed by M06-2x (R² = 0.947) and PBE0 (R² = 0.942).

  5. Towards a universal product ion mass spectral library - reproducibility of product ion spectra across eleven different mass spectrometers.

    PubMed

    Hopley, Chris; Bristow, Tony; Lubben, Anneke; Simpson, Alec; Bull, Elaine; Klagkou, Katerina; Herniman, Julie; Langley, John

    2008-06-01

    Product ion spectra produced by collision-induced dissociation (CID) in tandem mass spectrometry experiments can differ markedly between instruments. There have been a number of attempts to standardise the production of product ion spectra; however, a consensus on the most appropriate approach to the reproducible production of spectra has yet to be reached. We have previously reported the comparison of product ion spectra on a number of different types of instruments - a triple quadrupole, two ion traps and a Fourier transform ion cyclotron resonance mass spectrometer (Bristow AWT, Webb KS, Lubben AT, Halket JM. Rapid Commun. Mass Spectrom. 2004; 18: 1). The study showed that a high degree of reproducibility was achievable. The goal of this study was to improve the comparability and reproducibility of CID product ion mass spectra produced in different laboratories and using different instruments. This was carried out experimentally by defining a spectral calibration point on each mass spectrometer for product ion formation. The long-term goal is the development of a universal (instrument independent) product ion mass spectral library for the identification of unknowns. The spectra of 48 compounds have been recorded on eleven mass spectrometers: six ion traps, two triple quadrupoles, a hybrid triple quadrupole, and two quadrupole time-of-flight instruments. Initially, 4371 spectral comparisons were carried out using the data from eleven instruments and the degree of reproducibility was evaluated. A blind trial has also been carried out to assess the reproducibility of spectra obtained during LC/MS/MS. The results suggest a degree of reproducibility across all instrument types using the tuning point technique. The reproducibility of the product ion spectra is increased when comparing the tandem in time type instruments and the tandem in space instruments as two separate groups. This may allow the production of a more limited, yet useful, screening library for LC

  6. An open science resource for establishing reliability and reproducibility in functional connectomics

    PubMed Central

    Zuo, Xi-Nian; Anderson, Jeffrey S; Bellec, Pierre; Birn, Rasmus M; Biswal, Bharat B; Blautzik, Janusch; Breitner, John C.S; Buckner, Randy L; Calhoun, Vince D; Castellanos, F. Xavier; Chen, Antao; Chen, Bing; Chen, Jiangtao; Chen, Xu; Colcombe, Stanley J; Courtney, William; Craddock, R Cameron; Di Martino, Adriana; Dong, Hao-Ming; Fu, Xiaolan; Gong, Qiyong; Gorgolewski, Krzysztof J; Han, Ying; He, Ye; He, Yong; Ho, Erica; Holmes, Avram; Hou, Xiao-Hui; Huckins, Jeremy; Jiang, Tianzi; Jiang, Yi; Kelley, William; Kelly, Clare; King, Margaret; LaConte, Stephen M; Lainhart, Janet E; Lei, Xu; Li, Hui-Jie; Li, Kaiming; Li, Kuncheng; Lin, Qixiang; Liu, Dongqiang; Liu, Jia; Liu, Xun; Liu, Yijun; Lu, Guangming; Lu, Jie; Luna, Beatriz; Luo, Jing; Lurie, Daniel; Mao, Ying; Margulies, Daniel S; Mayer, Andrew R; Meindl, Thomas; Meyerand, Mary E; Nan, Weizhi; Nielsen, Jared A; O’Connor, David; Paulsen, David; Prabhakaran, Vivek; Qi, Zhigang; Qiu, Jiang; Shao, Chunhong; Shehzad, Zarrar; Tang, Weijun; Villringer, Arno; Wang, Huiling; Wang, Kai; Wei, Dongtao; Wei, Gao-Xia; Weng, Xu-Chu; Wu, Xuehai; Xu, Ting; Yang, Ning; Yang, Zhi; Zang, Yu-Feng; Zhang, Lei; Zhang, Qinglin; Zhang, Zhe; Zhang, Zhiqiang; Zhao, Ke; Zhen, Zonglei; Zhou, Yuan; Zhu, Xing-Ting; Milham, Michael P

    2014-01-01

    Efforts to identify meaningful functional imaging-based biomarkers are limited by the ability to reliably characterize inter-individual differences in human brain function. Although a growing number of connectomics-based measures are reported to have moderate to high test-retest reliability, the variability in data acquisition, experimental designs, and analytic methods precludes the ability to generalize results. The Consortium for Reliability and Reproducibility (CoRR) is working to address this challenge and establish test-retest reliability as a minimum standard for methods development in functional connectomics. Specifically, CoRR has aggregated 1,629 typical individuals’ resting state fMRI (rfMRI) data (5,093 rfMRI scans) from 18 international sites, and is openly sharing them via the International Data-sharing Neuroimaging Initiative (INDI). To allow researchers to generate various estimates of reliability and reproducibility, a variety of data acquisition procedures and experimental designs are included. Similarly, to enable users to assess the impact of commonly encountered artifacts (for example, motion) on characterizations of inter-individual variation, datasets of varying quality are included. PMID:25977800

  7. Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

    SciTech Connect

    Jimenez, Ivo; Maltzahn, Carlos; Lofstead, Jay; Moody, Adam; Mohror, Kathryn; Arpaci-Dusseau, Remzi; Arpaci-Dusseau, Andrea

    2015-05-04

    Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining if an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, envisioning techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that an experiment re-executioner goes through. We notice that validating results is done on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment as well as describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.
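    The idea of validating against declared, testable conditions rather than exact numbers can be sketched as follows. The format, metric names, and thresholds here are invented for illustration; the paper's actual specification language differs.

```python
# A hypothetical declarative experiment spec: each condition pairs a metric
# with a generic, testable predicate instead of an exact expected value.
spec = {
    "experiment": "storage-throughput",
    "conditions": [
        ("raw_throughput_MBps", "baseline within 10% of reference",
         lambda m: abs(m - 100.0) / 100.0 <= 0.10),
        ("overhead_pct", "system overhead stays under 15%",
         lambda m: m < 15.0),
    ],
}

def validate(results, spec):
    """Return descriptions of every declared condition that `results` fails."""
    return [desc for metric, desc, check in spec["conditions"]
            if not check(results[metric])]

# A re-execution on different hardware can still validate: the numbers differ
# from the original run, but the declared conditions hold.
rerun = {"raw_throughput_MBps": 96.0, "overhead_pct": 12.5}
failures = validate(rerun, spec)  # empty list means the re-execution validates
```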

  8. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well.

  9. Venusian Polar Vortex reproduced by a general circulation model

    NASA Astrophysics Data System (ADS)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the atmospheres of Earth, Mars, and Titan, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degrees latitude, a unique feature of the Venus atmosphere called the 'cold collar' [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, their formation mechanism is still unknown. In addition, an axi-asymmetric feature is always present in the warm polar vortex. It changes temporally and sometimes shows a hot polar dipole or S-shaped structure, as shown by many infrared measurements [e.g. Garate-Lopez et al. 2013; 2015]. However, its vertical structure has not been investigated. To address these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced structures of the atmosphere and the axi-asymmetric feature are compared with previous observational results. In addition, a quasi-periodic fluctuation of the zonal-mean zonal wind is seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that polar vacillation, similar to that in the Earth's polar atmosphere, might also occur in the Venus atmosphere. We will show some initial results on this point in this presentation.

  10. Reproducibility and Transparency in Ocean-Climate Modeling

    NASA Astrophysics Data System (ADS)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
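One practice described above — recording checksums of experiment output so that solution changes become detectable under version control — can be sketched as follows. This is an illustrative sketch, not the MOM6/SIS2 tooling; the `*.nc` file pattern and file names are assumptions.

```python
# Sketch: checksum experiment output files and write a manifest that can be
# committed alongside code and configuration, so any change in the solution
# (from code updates or input-data changes) shows up as a manifest diff.
import hashlib
from pathlib import Path

def checksum(path: Path, algo: str = "sha256") -> str:
    """Hash an output file in chunks so large model outputs fit in memory."""
    h = hashlib.new(algo)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(output_dir: Path, manifest: Path) -> None:
    """One '<digest>  <name>' line per output file (assumed NetCDF here)."""
    lines = [f"{checksum(p)}  {p.name}" for p in sorted(output_dir.glob("*.nc"))]
    manifest.write_text("\n".join(lines) + "\n")
```

Committing the manifest (rather than the large outputs themselves) mirrors the abstract's approach of versioning checksums instead of distributing raw data.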

  11. Validity and Reproducibility of a Spanish Dietary History

    PubMed Central

    Guallar-Castillón, Pilar; Sagardui-Villamor, Jon; Balboa-Castillo, Teresa; Sala-Vila, Aleix; Ariza Astolfi, Mª José; Sarrión Pelous, Mª Dolores; León-Muñoz, Luz María; Graciani, Auxiliadora; Laclaustra, Martín; Benito, Cristina; Banegas, José Ramón; Artalejo, Fernando Rodríguez

    2014-01-01

    Objective To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. Methods The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E administrations made one year apart. Results The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. Conclusions The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients. PMID:24465878
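The two statistics used above can be illustrated with a small sketch: Pearson's r for validity (instrument vs. reference method) and a one-way intraclass correlation for reproducibility (two administrations of the same instrument). The data values below are invented; the study's actual ICC model is not specified in the abstract, so a simple one-way ICC(1,1) is assumed here.

```python
# Sketch of validity (Pearson r) and reproducibility (one-way ICC) statistics.
import statistics

def pearson(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for n subjects x k repeated measures."""
    n, k = len(ratings), len(ratings[0])
    grand = statistics.fmean(v for row in ratings for v in row)
    ms_between = k * sum((statistics.fmean(row) - grand) ** 2 for row in ratings) / (n - 1)
    ms_within = sum((v - statistics.fmean(row)) ** 2 for row in ratings for v in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Two administrations (dh1, dh2) for five subjects, plus a reference method;
# all numbers are invented for illustration.
dh1 = [10, 20, 30, 25, 15]
dh2 = [11, 19, 33, 24, 16]
reference = [12, 18, 31, 26, 14]

validity = pearson(dh1, reference)
reproducibility = icc_oneway(list(zip(dh1, dh2)))
```

With more heterogeneous real data, values like the abstract's r = 0.4-0.8 and ICC > 0.40 would be typical.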

  12. Efficient and reproducible mammalian cell bioprocesses without probes and controllers?

    PubMed

    Tissot, Stéphanie; Oberbek, Agata; Reclari, Martino; Dreyer, Matthieu; Hacker, David L; Baldi, Lucia; Farhat, Mohamed; Wurm, Florian M

    2011-07-01

    Bioprocesses for recombinant protein production with mammalian cells are typically controlled for several physicochemical parameters including the pH and dissolved oxygen concentration (DO) of the culture medium. Here we studied whether these controls are necessary for efficient and reproducible bioprocesses in an orbitally shaken bioreactor (OSR). Mixing, gas transfer, and volumetric power consumption (P(V)) were determined in both a 5-L OSR and a 3-L stirred-tank bioreactor (STR). The two cultivation systems had a similar mixing intensity, but the STR had a lower volumetric mass transfer coefficient of oxygen (k(L)a) and a higher P(V) than the OSR. Recombinant CHO cell lines expressing either tumor necrosis factor receptor as an Fc fusion protein (TNFR:Fc) or an anti-RhesusD monoclonal antibody were cultivated in the two systems. The 5-L OSR was operated in an incubator shaker with 5% CO(2) in the gas environment but without pH and DO control whereas the STR was operated with or without pH and DO control. Higher cell densities and recombinant protein titers were obtained in the OSR as compared to both the controlled and the non-controlled STRs. To test the reproducibility of a bioprocess in a non-controlled OSR, the two CHO cell lines were each cultivated in parallel in six 5-L OSRs. Similar cell densities, cell viabilities, and recombinant protein titers along with similar pH and DO profiles were achieved in each group of replicates. Our study demonstrated that bioprocesses can be performed in OSRs without pH or DO control in a highly reproducible manner, at least at the scale of operation studied here.

  13. Emergence of reproducible spatiotemporal activity during motor learning.

    PubMed

    Peters, Andrew J; Chen, Simon X; Komiyama, Takaki

    2014-06-12

    The motor cortex is capable of reliably driving complex movements yet exhibits considerable plasticity during motor learning. These observations suggest that the fundamental relationship between motor cortex activity and movement may not be fixed but is instead shaped by learning; however, to what extent and how motor learning shapes this relationship are not fully understood. Here we addressed this issue by using in vivo two-photon calcium imaging to monitor the activity of the same population of hundreds of layer 2/3 neurons while mice learned a forelimb lever-press task over two weeks. Excitatory and inhibitory neurons were identified by transgenic labelling. Inhibitory neuron activity was relatively stable and balanced local excitatory neuron activity on a movement-by-movement basis, whereas excitatory neuron activity showed higher dynamism during the initial phase of learning. The dynamics of excitatory neurons during the initial phase involved the expansion of the movement-related population which explored various activity patterns even during similar movements. This was followed by a refinement into a smaller population exhibiting reproducible spatiotemporal sequences of activity. This pattern of activity associated with the learned movement was unique to expert animals and not observed during similar movements made during the naive phase, and the relationship between neuronal activity and individual movements became more consistent with learning. These changes in population activity coincided with a transient increase in dendritic spine turnover in these neurons. Our results indicate that a novel and reproducible activity-movement relationship develops as a result of motor learning, and we speculate that synaptic plasticity within the motor cortex underlies the emergence of reproducible spatiotemporal activity patterns for learned movements. These results underscore the profound influence of learning on the way that the cortex produces movements.

  14. Psychophysiological responses to pain identify reproducible human clusters.

    PubMed

    Farmer, Adam D; Coen, Steven J; Kano, Michiko; Paine, Peter A; Shwahdi, Mustafa; Jafari, Jafar; Kishor, Jessin; Worthen, Sian F; Rossiter, Holly E; Kumari, Veena; Williams, Steven C R; Brammer, Michael; Giampietro, Vincent P; Droney, Joanne; Riley, Julia; Furlong, Paul L; Knowles, Charles H; Lightman, Stafford L; Aziz, Qasim

    2013-11-01

    Pain is a ubiquitous yet highly variable experience. The psychophysiological and genetic factors responsible for this variability remain unresolved. We hypothesised the existence of distinct human pain clusters (PCs) composed of distinct psychophysiological and genetic profiles coupled with differences in the perception and the brain processing of pain. We studied 120 healthy subjects in whom the baseline personality and anxiety traits and the serotonin transporter-linked polymorphic region (5-HTTLPR) genotype were measured. Real-time autonomic nervous system parameters and serum cortisol were measured at baseline and after standardised visceral and somatic pain stimuli. Brain processing reactions to visceral pain were studied in 29 subjects using functional magnetic resonance imaging (fMRI). The reproducibility of the psychophysiological responses to pain was assessed at 1 year. In group analysis, visceral and somatic pain caused an expected increase in sympathetic and cortisol responses and activated the pain matrix according to fMRI studies. However, using cluster analysis, we found 2 reproducible PCs: at baseline, PC1 had higher neuroticism/anxiety scores (P ≤ 0.01); greater sympathetic tone (P<0.05); and higher cortisol levels (P ≤ 0.001). During pain, less stimulus was tolerated (P ≤ 0.01), and there was an increase in parasympathetic tone (P ≤ 0.05). The 5-HTTLPR short allele was over-represented (P ≤ 0.005). PC2 had the converse profile at baseline and during pain. Brain activity differed (P ≤ 0.001); greater activity occurred in the left frontal cortex in PC1, whereas PC2 showed greater activity in the right medial/frontal cortex and right anterior insula. In health, 2 distinct reproducible PCs exist in humans. In the future, PC characterization may help to identify subjects at risk for developing chronic pain and may reduce variability in brain imaging studies.

  15. On the reproducibility of protein crystal structures: five atomic resolution structures of trypsin

    SciTech Connect

    Liebschner, Dorothee; Dauter, Miroslawa; Brzuszkiewicz, Anna; Dauter, Zbigniew

    2013-08-01

    Details of five very high-resolution accurate structures of bovine trypsin are compared in the context of the reproducibility of models obtained from crystals grown under identical conditions. Structural studies of proteins usually rely on a model obtained from one crystal. By investigating the details of this model, crystallographers seek to obtain insight into the function of the macromolecule. It is therefore important to know which details of a protein structure are reproducible or to what extent they might differ. To address this question, the high-resolution structures of five crystals of bovine trypsin obtained under analogous conditions were compared. Global parameters and structural details were investigated. All of the models were of similar quality and the pairwise merged intensities had large correlation coefficients. The Cα and backbone atoms of the structures superposed very well. The occupancy of ligands in regions of low thermal motion was reproducible, whereas solvent molecules containing heavier atoms (such as sulfur) or those located on the surface could differ significantly. The coordination lengths of the calcium ion were conserved. A large proportion of the multiple conformations refined to similar occupancies and the residues adopted similar orientations. More than three quarters of the water-molecule sites were conserved within 0.5 Å and more than one third were conserved within 0.1 Å. An investigation of the protonation states of histidine residues and carboxylate moieties was consistent for all of the models. Radiation-damage effects to disulfide bridges were observed for the same residues and to similar extents. Main-chain bond lengths and angles averaged to similar values and were in agreement with the Engh and Huber targets. Other features, such as peptide flips and the double conformation of the inhibitor molecule, were also reproducible in all of the trypsin structures. Therefore, many details are similar in models obtained

  16. Brugada phenocopy clinical reproducibility demonstrated by recurrent hypokalemia.

    PubMed

    Genaro, Natalia R; Anselm, Daniel D; Cervino, Nahuel; Estevez, Ariel O; Perona, Carlos; Villamil, Alejandro M; Kervorkian, Ruben; Baranchuk, Adrian

    2014-07-01

    Brugada phenocopies (BrP) are clinical entities that are etiologically distinct from true congenital Brugada syndrome (BrS). BrP are characterized by type 1 or type 2 Brugada electrocardiogram (ECG) patterns in precordial leads V1-V3; however, BrP are elicited by various underlying clinical conditions such as electrolyte disturbances, myocardial ischemia, or poor ECG filters. In this report, we describe the first case of clinically reproducible BrP, which is important to the conceptual evolution of BrP.

  17. Reproducing continuous radio blackout using glow discharge plasma

    SciTech Connect

    Xie, Kai; Li, Xiaoping; Liu, Donglin; Shao, Mingxu; Zhang, Hanlu

    2013-10-15

    A novel plasma generator is described that offers large-scale, continuous, non-magnetized plasma with a 30-cm-diameter hollow structure, which provides a path for an electromagnetic wave. The plasma is excited by a low-pressure glow discharge, with electron densities varying from 10^9 to 2.5 × 10^11 cm^-3. An electromagnetic wave propagation experiment reproduced a continuous radio blackout in the UHF-, L-, and S-bands. The results are consistent with theoretical expectations. The proposed method is suitable for simulating a plasma sheath and for research on communications, navigation, electromagnetic mitigation, and antenna compensation in plasma sheaths.
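A quick worked check (not from the paper) shows why this density range blankets the UHF-, L-, and S-bands: an electromagnetic wave is cut off in an unmagnetized plasma below the electron plasma frequency, f_pe ≈ 8.98 kHz × sqrt(n_e [cm^-3]).

```python
# Electron plasma frequency (cutoff for EM propagation in unmagnetized plasma).
import math

def plasma_frequency_hz(n_e_cm3: float) -> float:
    """f_pe ~ 8980 * sqrt(n_e) Hz, with electron density n_e in cm^-3."""
    return 8980.0 * math.sqrt(n_e_cm3)

for n_e in (1e9, 2.5e11):
    print(f"n_e = {n_e:.1e} cm^-3 -> f_pe = {plasma_frequency_hz(n_e) / 1e9:.2f} GHz")
# Roughly 0.28 GHz at 1e9 cm^-3 (UHF) up to ~4.5 GHz at 2.5e11 cm^-3 (above
# S-band), so sweeping the discharge density sweeps the cutoff across all
# three bands reported in the experiment.
```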

  18. Data quality in predictive toxicology: reproducibility of rodent carcinogenicity experiments.

    PubMed Central

    Gottmann, E; Kramer, S; Pfahringer, B; Helma, C

    2001-01-01

    We compared 121 replicate rodent carcinogenicity assays from the two parts (National Cancer Institute/National Toxicology Program and literature) of the Carcinogenic Potency Database (CPDB) to estimate the reliability of these experiments. We estimated a concordance of 57% between the overall rodent carcinogenicity classifications from both sources. This value did not improve substantially when additional biologic information (species, sex, strain, target organs) was considered. These results indicate that rodent carcinogenicity assays are much less reproducible than previously expected, an effect that should be considered in the development of structure-activity relationship models and the risk assessment process. PMID:11401763
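The concordance measure used above is simply the fraction of chemicals given the same overall carcinogenicity call in both parts of the CPDB. A minimal sketch, with invented classifications chosen so seven replicates yield roughly the reported 57%:

```python
# Sketch of pairwise concordance between two sets of replicate classifications.

def concordance(calls_a, calls_b):
    """Fraction of paired classifications that agree."""
    pairs = list(zip(calls_a, calls_b))
    return sum(a == b for a, b in pairs) / len(pairs)

# Hypothetical overall calls ("+" carcinogenic, "-" not) for 7 replicate assays.
nci_ntp    = ["+", "+", "-", "-", "+", "-", "+"]
literature = ["+", "-", "-", "+", "+", "-", "-"]
print(concordance(nci_ntp, literature))  # 4/7, i.e. ~0.571 for these replicates
```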

  19. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  20. Quantum theory as the most robust description of reproducible experiments

    NASA Astrophysics Data System (ADS)

    De Raedt, Hans; Katsnelson, Mikhail I.; Michielsen, Kristel

    2014-08-01

    suggests that quantum theory is a powerful language to describe a certain class of statistical experiments but remains vague about the properties of the class. Similar views were expressed by other fathers of quantum mechanics, e.g., Max Born and Wolfgang Pauli [50]. They can be summarized as "Quantum theory describes our knowledge of the atomic phenomena rather than the atomic phenomena themselves". Our aim is, in a sense, to replace the philosophical components of these statements by well-defined mathematical concepts and to carefully study their relevance for physical phenomena. Specifically, by applying the general formalism of logical inference to a well-defined class of statistical experiments, the present paper shows that quantum theory is indeed the kind of language envisaged by Bohr. Theories such as Newtonian mechanics, Maxwell's electrodynamics, and Einstein's (general) relativity are deductive in character. Starting from a few axioms, abstracted from experimental observations and additional assumptions about the irrelevance of a large number of factors for the description of the phenomena of interest, deductive reasoning is used to prove or disprove unambiguous statements, propositions, about the mathematical objects which appear in the theory. The method of deductive reasoning conforms to the Boolean algebra of propositions. The deductive, reductionist methodology has the appealing feature that one can be sure that the propositions are either right or wrong, and, disregarding the possibility that some of the premises on which the deduction is built may not apply, there is no doubt that the conclusions are correct. Clearly, these theories successfully describe a wide range of physical phenomena in a manner and language which is unambiguous and independent of the individual. At the same time, the construction of a physical theory, and a scientific theory in general, from "first principles" is, for sure, not something self-evident, and not even safe. Our basic

  1. Laryngeal muscular control of vocal fold posturing: Numerical modeling and experimental validation

    PubMed Central

    Yin, Jun; Zhang, Zhaoyan

    2016-01-01

    A three-dimensional continuum model of vocal fold posturing was developed to investigate laryngeal muscular control of vocal fold geometry, stiffness, and tension, which are difficult to measure in live humans or in vivo models. This model was able to qualitatively reproduce in vivo experimental observations of laryngeal control of vocal fold posturing, despite the many simplifications that were necessary due to the lack of accurate data on laryngeal geometry and material properties. The results present a first comprehensive study of the co-variations among glottal width, vocal fold length, stiffness, and tension under different conditions of individual and combined laryngeal muscle activation. PMID:27914396

  2. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire.

    PubMed

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-09-10

    Low dietary fibre intake has been associated with poorer health outcomes, so being able to quickly assess an individual's dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual's total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool for classifying individuals based on their habitual dietary fibre intakes.

  3. Validity and Reproducibility of a Habitual Dietary Fibre Intake Short Food Frequency Questionnaire

    PubMed Central

    Healey, Genelle; Brough, Louise; Murphy, Rinki; Hedderley, Duncan; Butts, Chrissie; Coad, Jane

    2016-01-01

    Low dietary fibre intake has been associated with poorer health outcomes, so being able to quickly assess an individual’s dietary fibre intake would prove useful in clinical practice and for research purposes. Current dietary assessment methods such as food records and food frequency questionnaires are time-consuming and burdensome, and there are presently no published short dietary fibre intake questionnaires that can quantify an individual’s total habitual dietary fibre intake and classify individuals as low, moderate or high habitual dietary fibre consumers. Therefore, we aimed to develop and validate a habitual dietary fibre intake short food frequency questionnaire (DFI-FFQ) which can quickly and accurately classify individuals based on their habitual dietary fibre intake. In this study the DFI-FFQ was validated against the Monash University comprehensive nutrition assessment questionnaire (CNAQ). Fifty-two healthy, normal weight male (n = 17) and female (n = 35) participants, aged between 21 and 61 years, completed the DFI-FFQ twice and the CNAQ once. All eligible participants completed the study; however, the data from 46% of the participants were excluded from analysis secondary to misreporting. The DFI-FFQ cannot accurately quantify total habitual dietary fibre intakes; however, it is a quick, valid and reproducible tool for classifying individuals based on their habitual dietary fibre intakes. PMID:27626442

  4. Plastic films for reflective surfaces reproduced from masters

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Accurate reproduction in plastic of the surface of an optical master, to which a reflective finish may then be applied, is achieved by using a backing of any suitable material to which the cured plastic will adhere tightly. Plastics used for reflectors should be of the thermosetting or catalytically hardened type.

  5. General theory of experiment containing reproducible data: The reduction to an ideal experiment

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Raoul R.; Zhang, Wei; Striccoli, Domenico

    2015-10-01

    The authors suggest a general theory for the consideration of all experiments associated with measurements of reproducible data in one unified scheme. The suggested algorithm does not contain unjustified suppositions, and the final function that is extracted from these measurements can be compared with the hypothesis suggested by the theory adopted for the explanation of the object/phenomenon studied. This true function is free from the influence of the apparatus (instrumental) function and, when the "best fit", or the most acceptable hypothesis, is absent, can be presented as a segment of the Fourier series. The discrete set of the decomposition coefficients describes the final function quantitatively and can serve as an intermediate model that coincides with the amplitude-frequency response (AFR) of the object studied. It can also be used by theoreticians for comparison of the suggested theory with experimental observations. Two examples (Raman spectra of distilled water and packet exchange between two wireless sensor nodes) confirm the basic elements of this general theory. From this general theory the following important conclusions follow: 1. The Prony decomposition should be used in the detection of quasi-periodic processes and for the quantitative description of reproducible data. 2. A segment of the Fourier series should be used as the fitting function for the description of observable data corresponding to an ideal experiment. The transition from the initial Prony decomposition to the conventional Fourier transform also implies the elimination of the apparatus function, which plays an important role in reproducible data processing. 3. The suggested theory will be helpful for the creation of a unified metrological standard (UMS) that should be used in the comparison of similar data obtained from the same object but in different laboratories with different equipment. 4. Many cases when the conventional theory confirms the experimental
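Point 2 above — describing reproducible data by a segment of the Fourier series, with the decomposition coefficients as a compact quantitative "intermediate model" — can be sketched as follows. This is an illustrative sketch on synthetic data, not the paper's algorithm (which also involves the Prony decomposition and apparatus-function elimination).

```python
# Sketch: fit a segment of the Fourier series to equally spaced samples over
# one period; the (a0, ak, bk) coefficients form a discrete quantitative model.
import math

def fourier_coeffs(y, n_harmonics):
    """Return (a0, [(a1, b1), ..., (aN, bN)]) for samples y over one period."""
    n = len(y)
    a0 = sum(y) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        ak = 2 / n * sum(v * math.cos(2 * math.pi * k * j / n) for j, v in enumerate(y))
        bk = 2 / n * sum(v * math.sin(2 * math.pi * k * j / n) for j, v in enumerate(y))
        coeffs.append((ak, bk))
    return a0, coeffs

def fourier_eval(a0, coeffs, t):
    """Evaluate the fitted segment at t in [0, 1) (position within the period)."""
    return a0 + sum(a * math.cos(2 * math.pi * (k + 1) * t)
                    + b * math.sin(2 * math.pi * (k + 1) * t)
                    for k, (a, b) in enumerate(coeffs))
```

For a pure sinusoid, the fit recovers b1 = 1 and leaves all other coefficients near zero, so the few coefficients fully describe the "final function".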

  6. Percolating silicon nanowire networks with highly reproducible electrical properties.

    PubMed

    Serre, Pauline; Mongillo, Massimo; Periwal, Priyanka; Baron, Thierry; Ternon, Céline

    2015-01-09

    Here, we report the morphological and electrical properties of self-assembled silicon nanowire networks, also called Si nanonets. At the macroscopic scale, the nanonets comprise several million nanowires, so the observed properties result from large-scale statistical averaging, thus minimizing the discrepancies that occur from one nanowire to another. Using a standard filtration procedure, the resulting Si nanonets are highly reproducible in terms of their morphology, with the Si nanowire density precisely controlled during nanonet elaboration. In contrast to individual Si nanowires, the electrical properties of Si nanonets are highly consistent, as demonstrated here by the similar electrical properties obtained in hundreds of Si nanonet-based devices. The evolution of the Si nanonet conductance with Si nanowire density demonstrates that Si nanonets behave like standard percolating media despite the presence of numerous nanowire-nanowire intersecting junctions in the nanonets and the native oxide shell surrounding the Si nanowires. Moreover, when silicon oxidation is prevented or controlled, the electrical properties of Si nanonets are stable over many months. As a consequence, Si nanowire-based nanonets constitute a promising flexible material with stable and reproducible electrical properties at the macroscopic scale while being composed of nanoscale components, which confirms the Si nanonet potential for a wide range of applications including flexible electronics, sensing and photovoltaics.

  7. The flux qubit revisited to enhance coherence and reproducibility

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Gustavsson, Simon; Kamal, Archana; Birenbaum, Jeffrey; Sears, Adam P.; Hover, David; Gudmundsen, Ted J.; Rosenberg, Danna; Samach, Gabriel; Weber, S.; Yoder, Jonilyn L.; Orlando, Terry P.; Clarke, John; Kerman, Andrew J.; Oliver, William D.

    2016-11-01

    The scalable application of quantum information science will stand on reproducible and controllable high-coherence quantum bits (qubits). Here, we revisit the design and fabrication of the superconducting flux qubit, achieving a planar device with broad-frequency tunability, strong anharmonicity, high reproducibility and relaxation times in excess of 40 μs at its flux-insensitive point. Qubit relaxation times T1 across 22 qubits are consistently matched with a single model involving resonator loss, ohmic charge noise and 1/f flux noise, a noise source previously considered primarily in the context of dephasing. We furthermore demonstrate that qubit dephasing at the flux-insensitive point is dominated by residual thermal photons in the readout resonator. The resulting photon shot noise is mitigated using a dynamical decoupling protocol, resulting in T2 ~ 85 μs, approximately the 2T1 limit. In addition to realizing an improved flux qubit, our results uniquely identify photon shot noise as limiting T2 in contemporary qubits based on transverse qubit-resonator interaction.

  8. Reproducibility of the measurement of sweet taste preferences.

    PubMed

    Asao, Keiko; Luo, Wendy; Herman, William H

    2012-12-01

    Developing interventions to prevent and treat obesity is a medical and public health imperative. Taste is a major determinant of food intake, and reliable methods to measure taste preferences need to be established. This study aimed to establish the short-term reproducibility of sweet taste preference measurements using 5-level sucrose concentrations in healthy adult volunteers. We defined sweet taste preference as the geometric mean of the preferred sucrose concentration determined from two series of two-alternative, forced-choice staircase procedures administered 10 min apart on a single day. We repeated the same procedures at a second visit 3-7 days later. Twenty-six adults (13 men and 13 women, age 33.2 ± 12.2 years) completed the measurements. The median number of pairs presented for each series was three (25th and 75th percentiles: 3, 4). The intraclass correlation coefficient between the measurements was 0.82 (95% confidence interval [CI]: 0.63-0.92) within a few days. This study showed high short-term reproducibility of a simple, 5-level procedure for measuring sweet taste preferences. This method may be useful for assessing sweet taste preferences and the risks resulting from those preferences.
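As a hedged illustration of the measurement design, the sketch below implements a generic two-alternative, forced-choice staircase over five levels and combines two series by geometric mean, as the abstract describes. The concentration values, the starting level, and the stopping rule (stop at the first reversal) are all assumptions, not the study's exact protocol.

```python
# Sketch of a 2AFC staircase: the level presented moves up after a "prefer
# sweeter" choice and down after "prefer less sweet"; the preferred level is
# taken where the choices reverse direction.
import math

LEVELS = [0.09, 0.18, 0.35, 0.70, 1.05]  # sucrose levels (hypothetical values)

def staircase(responses):
    """Run one series; responses[i] is True if the sweeter sample was chosen.
    Returns the level at the first reversal (or the last level if none)."""
    i = len(LEVELS) // 2          # start in the middle of the range (assumed)
    direction = None
    for sweeter_chosen in responses:
        step = 1 if sweeter_chosen else -1
        if direction is not None and step != direction:
            return LEVELS[i]       # choice reversed: preference bracketed here
        direction = step
        i = min(max(i + step, 0), len(LEVELS) - 1)
    return LEVELS[i]

# Preference = geometric mean of two series run 10 min apart, as in the study.
series1 = staircase([True, True, False])   # climbs twice, then reverses
series2 = staircase([True, False])
preference = math.sqrt(series1 * series2)
```

Repeating the whole procedure at a second visit and computing an ICC between visits would give the reproducibility estimate reported above.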

  9. Reproducibility of Neonate Ocular Circulation Measurements Using Laser Speckle Flowgraphy

    PubMed Central

    Matsumoto, Tadashi; Itokawa, Takashi; Shiba, Tomoaki; Katayama, Yuji; Arimura, Tetsushi; Mizukaki, Norio; Yoda, Hitoshi; Hori, Yuichi

    2015-01-01

    Measuring the ocular blood flow in neonates may clarify the relationships between eye diseases and ocular circulation abnormalities. However, no method for noninvasively measuring ocular circulation in neonates is established. We used laser speckle flowgraphy (LSFG) modified for neonates to measure their ocular circulation and investigated whether this method is reproducible. During their normal sleep, we studied 16 subjects (adjusted age of 34–48 weeks) whose blood flow could be measured three consecutive times. While the subjects slept in the supine position, three mean blur rate (MBR) values of the optic nerve head (ONH) were obtained: the MBR-A (mean of all values), MBR-V (vessel mean), and MBR-T (tissue mean), and nine blood flow pulse waveform parameters in the ONH were examined. We analyzed the coefficient of variation (COV) and the intraclass correlation coefficient (ICC) for each parameter. The COVs of the MBR values were all ≤10%. The ICCs of the MBR values were all >0.8. Good COVs were observed for the blowout score, blowout time, rising rate, falling rate, and acceleration time index. Although the measurement of ocular circulation in the neonates was difficult, our results exhibited reproducibility, suggesting that this method could be used in clinical research. PMID:26557689

  10. Reproducibility in Nerve Morphometry: Comparison between Methods and among Observers

    PubMed Central

    Bilego Neto, Antônio Paulo da Costa; Silveira, Fernando Braga Cassiano; Rodrigues da Silva, Greice Anne; Sanada, Luciana Sayuri; Fazan, Valéria Paula Sassoli

    2013-01-01

    We investigated the reproducibility of a semiautomated method (computerized with manual intervention) for nerve morphometry (counting and measuring myelinated fibers) between three observers with different levels of expertise and experience with the method. Comparisons between automatic (fully computerized) and semiautomated morphometric methods performed by the same computer software using the same nerve images were also performed. Sural nerves of normal adult rats were used. Automatic and semiautomated morphometry of the myelinated fibers were made through the computer software KS-400. Semiautomated morphometry was conducted by three independent observers on the same images, using the semiautomated method. Automatic morphometry overestimated the myelin sheath area, thus overestimating the myelinated fiber size and underestimating the axon size. Fiber distributions overestimation was of 0.5 μm. For the semiautomated morphometry, no differences were found between observers for myelinated fiber and axon size distributions. Overestimation of the myelin sheath size of normal fibers by the fully automatic method might have an impact when morphometry is used for diagnostic purposes. We suggest that not only semiautomated morphometry results can be compared between different centers in clinical trials but it can also be performed by more than one investigator in one single experiment, being a reliable and reproducible method. PMID:23841086

  11. Assessment of Modeling Capability for Reproducing Storm Impacts on TEC

    NASA Astrophysics Data System (ADS)

    Shim, J. S.; Kuznetsova, M. M.; Rastaetter, L.; Bilitza, D.; Codrescu, M.; Coster, A. J.; Emery, B. A.; Foerster, M.; Foster, B.; Fuller-Rowell, T. J.; Huba, J. D.; Goncharenko, L. P.; Mannucci, A. J.; Namgaladze, A. A.; Pi, X.; Prokhorov, B. E.; Ridley, A. J.; Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Zhu, L.

    2014-12-01

    During a geomagnetic storm, the energy transferred from the solar wind to the magnetosphere-ionosphere system adversely affects communication and navigation systems. Quantifying storm impacts on TEC (Total Electron Content) and assessing modeling capability for reproducing storm impacts on TEC are important for specifying and forecasting space weather. In order to quantify storm impacts on TEC, we considered several parameters: TEC changes compared to quiet time (the day before the storm), TEC difference between 24-hour intervals, and maximum increase/decrease during the storm. We investigated the spatial and temporal variations of the parameters during the 2006 AGU storm event (14-15 Dec. 2006) using ground-based GPS TEC measurements in eight selected 5-degree longitude sectors. The latitudinal variations were also studied in the two of these longitude sectors where data coverage is relatively good. We obtained modeled TEC from various ionosphere/thermosphere (IT) models. The parameters from the models were compared with each other and with the observed values. We quantified the performance of the models in reproducing the TEC variations during the storm using skill scores. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.

  12. Reproducible Research Practices and Transparency across the Biomedical Literature

    PubMed Central

    Khoury, Muin J.; Schully, Sheri D.; Ioannidis, John P. A.

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000–2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. PMID:26726926

  13. [Study of the validity and reproducibility of passive ozone monitors].

    PubMed

    Cortez-Lugo, M; Romieu, I; Palazuelos-Rendón, E; Hernández-Avila, M

    1995-01-01

    The aim of this study was to evaluate the validity and reproducibility of ozone measurements obtained with passive ozone monitors against those registered with a continuous ozone monitor, to determine the applicability of passive monitors in epidemiological research. The study was carried out during November and December 1992. Indoor and outdoor classroom air ozone concentrations were analyzed using 28 passive monitors and a continuous monitor. The correlation between both measurements was highly significant (r = 0.89, p < 0.001), indicating very good validity. Also, the correlation between the measurements obtained with two different passive monitors exposed concurrently was very high (r = 0.97, p < 0.001), indicating good reproducibility of the passive monitors' measurements. The relative error between the concentrations measured by the passive monitors and those from the continuous monitor tended to decrease with increasing ozone concentrations. The results suggest that passive monitors, when used to analyze indoor air, should be reserved for determining cumulative ozone exposures exceeding 100 ppb, corresponding to exposure periods greater than five days.

  14. The flux qubit revisited to enhance coherence and reproducibility

    PubMed Central

    Yan, Fei; Gustavsson, Simon; Kamal, Archana; Birenbaum, Jeffrey; Sears, Adam P; Hover, David; Gudmundsen, Ted J.; Rosenberg, Danna; Samach, Gabriel; Weber, S; Yoder, Jonilyn L.; Orlando, Terry P.; Clarke, John; Kerman, Andrew J.; Oliver, William D.

    2016-01-01

    The scalable application of quantum information science will stand on reproducible and controllable high-coherence quantum bits (qubits). Here, we revisit the design and fabrication of the superconducting flux qubit, achieving a planar device with broad-frequency tunability, strong anharmonicity, high reproducibility and relaxation times in excess of 40 μs at its flux-insensitive point. Qubit relaxation times T1 across 22 qubits are consistently matched with a single model involving resonator loss, ohmic charge noise and 1/f-flux noise, a noise source previously considered primarily in the context of dephasing. We furthermore demonstrate that qubit dephasing at the flux-insensitive point is dominated by residual thermal photons in the readout resonator. The resulting photon shot noise is mitigated using a dynamical decoupling protocol, resulting in T2 ≈ 85 μs, approximately the 2T1 limit. In addition to realizing an improved flux qubit, our results uniquely identify photon shot noise as limiting T2 in contemporary qubits based on transverse qubit–resonator interaction. PMID:27808092

  15. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns

    PubMed Central

    Cruchet, Steeve; Gustafson, Kyle; Benton, Richard; Floreano, Dario

    2015-01-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs—locomotor bouts—matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior. PMID:26600381

  16. A Bayesian Perspective on the Reproducibility Project: Psychology.

    PubMed

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence for a hypothesis as well as for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of the studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
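
    One common way to obtain an approximate Bayes factor from fitted models is the BIC approximation (this is an illustration of the general quantity, not the specific computation used in the paper; the BIC values below are hypothetical):

    ```python
    import math

    def bf10_from_bic(bic_null, bic_alt):
        """Approximate Bayes factor for H1 over H0 via the BIC
        approximation: BF10 ≈ exp((BIC0 - BIC1) / 2)."""
        return math.exp((bic_null - bic_alt) / 2.0)

    # Hypothetical BIC values for null and alternative models fit to one study.
    bf10 = bf10_from_bic(bic_null=210.4, bic_alt=205.8)
    print(round(bf10, 2))  # → 9.97, i.e. just under the BF < 10 "weak evidence" cutoff
    ```

    A BF10 near 1 means the data barely discriminate the hypotheses; values below the paper's threshold of 10 correspond to the "weak evidence" category reported for most studies.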

  17. Data reproducibility of pace strategy in a laboratory test run

    PubMed Central

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-01-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset of Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s), and in the second test the maximum RTI value was the RTI chosen in the first test, or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p > 0.05) between test and retest, and all demonstrated good intraclass correlation. PMID:27081672

  18. Towards reproducible MRM based biomarker discovery using dried blood spots.

    PubMed

    Ozcan, Sureyya; Cooper, Jason D; Lago, Santiago G; Kenny, Diarmuid; Rustogi, Nitin; Stocki, Pawel; Bahn, Sabine

    2017-03-27

    There is an increasing interest in the use of dried blood spot (DBS) sampling and multiple reaction monitoring in proteomics. Although several groups have explored the utility of DBS by focusing on protein detection, the reproducibility of the approach and whether it can be used for biomarker discovery in high throughput studies is yet to be determined. We assessed the reproducibility of multiplexed targeted protein measurements in DBS compared to serum. Eighty-two medium to high abundance proteins were monitored in a number of technical and biological replicates. Importantly, as part of the data analysis, several statistical quality control approaches were evaluated to detect inaccurate transitions. After implementing statistical quality control measures, the median CV on the original scale for all detected peptides was 13.2% in DBS and 8.8% in serum. We also found a strong correlation (r = 0.72) between relative peptide abundance measured in DBS and serum. The combination of minimally invasive sample collection with a highly specific and sensitive mass spectrometry (MS) technique allows for targeted quantification of multiple proteins in a single MS run. This approach has the potential to fundamentally change clinical proteomics and personalized medicine by facilitating large-scale studies.
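
    The reported median CV is computed per peptide across replicates and then summarized over all peptides. A minimal sketch under that assumption (the intensity table below is invented for illustration):

    ```python
    import statistics

    def median_cv(replicate_table):
        """Median coefficient of variation (%) across peptides; each row holds
        replicate intensity measurements for one peptide."""
        cvs = [statistics.stdev(row) / statistics.mean(row) * 100.0
               for row in replicate_table]
        return statistics.median(cvs)

    # Hypothetical peptide intensities across three technical replicates.
    peptides = [
        [1000.0, 1100.0, 1050.0],
        [520.0, 480.0, 505.0],
        [210.0, 260.0, 230.0],
    ]
    print(round(median_cv(peptides), 1))
    ```

    Taking the median rather than the mean keeps a few highly variable peptides from dominating the summary, which matters when comparing matrices such as DBS and serum.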

  19. A Telescope Inventor's Spyglass Possibly Reproduced in a Brueghel's Painting

    NASA Astrophysics Data System (ADS)

    Molaro, P.; Selvelli, P.

    2011-06-01

    Jan Brueghel the Elder depicted spyglasses belonging to the Archduke Albert VII of Habsburg in at least five paintings in the period between 1608 and 1625. Albert VII was fascinated by art and science, and he obtained spyglasses directly from Lipperhey and Sacharias Janssen at approximately the time when the telescope was first shown at The Hague at the end of 1608. In the Extensive Landscape with View of the Castle of Mariemont, dated 1608-1612, the Archduke is looking at his Mariemont castle through an optical tube; this is the first time a spyglass was painted at all. It is quite possible that the painting reproduces one of the first telescopes ever made. Two other of Albert VII's telescopes are prominently reproduced in two Allegories of Sight painted a few years later (1617-1618). They are sophisticated instruments, and their structure, in particular the shape of the eyepiece, suggests that they are composed of two convex lenses in a Keplerian optical configuration, which came into common use only more than two decades later. If this is the case, these paintings are the first available record of a Keplerian telescope.

  20. Reproducibility of UAV-based photogrammetric surface models

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area is comprised of terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused upon variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  1. Towards reproducible MRM based biomarker discovery using dried blood spots

    PubMed Central

    Ozcan, Sureyya; Cooper, Jason D.; Lago, Santiago G.; Kenny, Diarmuid; Rustogi, Nitin; Stocki, Pawel; Bahn, Sabine

    2017-01-01

    There is an increasing interest in the use of dried blood spot (DBS) sampling and multiple reaction monitoring in proteomics. Although several groups have explored the utility of DBS by focusing on protein detection, the reproducibility of the approach and whether it can be used for biomarker discovery in high throughput studies is yet to be determined. We assessed the reproducibility of multiplexed targeted protein measurements in DBS compared to serum. Eighty-two medium to high abundance proteins were monitored in a number of technical and biological replicates. Importantly, as part of the data analysis, several statistical quality control approaches were evaluated to detect inaccurate transitions. After implementing statistical quality control measures, the median CV on the original scale for all detected peptides was 13.2% in DBS and 8.8% in serum. We also found a strong correlation (r = 0.72) between relative peptide abundance measured in DBS and serum. The combination of minimally invasive sample collection with a highly specific and sensitive mass spectrometry (MS) technique allows for targeted quantification of multiple proteins in a single MS run. This approach has the potential to fundamentally change clinical proteomics and personalized medicine by facilitating large-scale studies. PMID:28345601

  2. Reproducible and inexpensive probe preparation for oligonucleotide arrays.

    PubMed

    Zhang, Y; Price, B D; Tetradis, S; Chakrabarti, S; Maulik, G; Makrigiorgos, G M

    2001-07-01

    We present a new protocol for the preparation of nucleic acids for microarray hybridization. DNA is fragmented quantitatively and reproducibly by using a hydroxyl radical-based reaction, which is initiated by hydrogen peroxide, iron(II)-EDTA and ascorbic acid. Following fragmentation, the nucleic acid fragments are densely biotinylated using a biotinylated psoralen analog plus UVA light and hybridized on microarrays. This non-enzymatic protocol circumvents several practical difficulties associated with DNA preparation for microarrays: the lack of reproducible fragmentation patterns associated with enzymatic methods; the large amount of labeled nucleic acids required by some array designs, which is often combined with a limited amount of starting material; and the high cost associated with currently used biotinylation methods. The method is applicable to any form of nucleic acid, but is particularly useful when applying double-stranded DNA on oligonucleotide arrays. Validation of this protocol is demonstrated by hybridizing PCR products with oligonucleotide-coated microspheres and PCR amplified cDNA with Affymetrix Cancer GeneChip microarrays.

  3. Data management routines for reproducible research using the G-Node Python Client library

    PubMed Central

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654

  4. Data management routines for reproducible research using the G-Node Python Client library.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J; Garbers, Christian; Rautenberg, Philipp L; Wachtler, Thomas

    2014-01-01

    Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to a large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow.

  5. Multimodal spatial calibration for accurately registering EEG sensor positions.

    PubMed

    Zhang, Jianhua; Chen, Jian; Chen, Shengyong; Xiao, Gang; Li, Xiaoli

    2014-01-01

    This paper proposes a fast and accurate calibration method for multiple multimodal sensors using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on the human head, and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multi-view calibration process is implemented to obtain the transformations between views. We first develop an efficient local repair algorithm to improve the depth map, and then a special calibration body is designed. Based on these, accurate and robust calibration results can be achieved. We evaluate the proposed method using the corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method achieves good performance and can be further applied to EEG source localization applications on the human brain.

  6. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity using a light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental comparison with existing methods on our light field dataset and the Stanford light field archive verifies the effectiveness of the proposed algorithm. PMID:27253083

  7. Accurate determination of the sedimentation flux of concentrated suspensions

    NASA Astrophysics Data System (ADS)

    Martin, J.; Rakotomalala, N.; Salin, D.

    1995-10-01

    Flow rate jumps are used to generate propagating concentration variations in a counterflow stabilized suspension (a liquid fluidized bed). An acoustic technique is used to measure accurately the resulting concentration profiles through the bed. Depending on the experimental conditions, we have observed self-sharpening and/or self-spreading concentration fronts. Our data are analyzed in the framework of Kynch's theory, providing an accurate determination of the sedimentation flux [CU(C), where U(C) is the hindered sedimentation velocity of the suspension] and its derivatives in the concentration range 30%-60%. In the vicinity of the packing concentration, controlling the flow rate has allowed us to increase the maximum packing up to 60%.

  8. Reproducibility of an aerobic endurance test for nonexpert swimmers

    PubMed Central

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral dos Santos

    2012-01-01

    Background: This study aimed to verify the reproducibility of an aerobic test for determining nonexpert swimmers’ endurance. Methods: The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer’s heart rate was recorded before the test (BHR) and again after the test (AHR). The rate of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro–Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson’s linear correlation (r) and intraclass correlation coefficient (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland–Altman plots. All values had a significance level of P < 0.05. Results: There were significant differences in AHR (P = 0.03) and NLP (P = 0.01) between the 2 days of testing. The obtained values were r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was <10%. Most cases were within the upper and lower limits of the Bland–Altman plots, suggesting correlation of the results. The applicability of NLP showed greater robustness (r and ICC > 0.90; SEM < 1%; CV < 3%), indicating that the other variables can be used to predict incremental changes in the physiological condition
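
    The absolute-reproducibility index used above, the standard error of measurement, is commonly derived from the pooled score distribution and the ICC as SEM = SD × √(1 − ICC). A minimal sketch under that assumption (the lap counts are invented for illustration; 0.90 echoes the ICC reported for NLP):

    ```python
    import statistics

    def sem_from_icc(all_scores, icc):
        """Standard error of measurement: sample SD * sqrt(1 - ICC)."""
        return statistics.stdev(all_scores) * (1.0 - icc) ** 0.5

    # Hypothetical number-of-laps scores pooled over both test days.
    scores = [38, 41, 40, 39, 42, 40, 38, 41]
    print(round(sem_from_icc(scores, 0.90), 2))
    ```

    A small SEM relative to the score range, as reported for NLP, means day-to-day measurement noise is small compared with genuine between-swimmer differences.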

  9. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    NASA Astrophysics Data System (ADS)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes of peculiar rotational patterns in the outermost as well as the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached on either frictional or viscous layers; the latter were reproduced in the models by silicone. The sand forming the models had previously been mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward-propagating belts also produce non-rotational outer curved fronts, whereas in between and inside the obstacles a perfect orocline forms

  10. Building Consensus on Community Standards for Reproducible Science

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    As geochemists, we have traditionally developed standard methods for generating, presenting, and using data through community input, seminal studies, and a variety of authoritative bodies, a process that has required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model no longer satisfies the needs of the community, publishers, or funders. Developing a new mechanism for building consensus raises a number of questions: Which aspects of our data should reproducibility standards focus on? Who sets the standards? How do we subdivide the development of consensus? We propose an open, transparent, and inclusive approach to developing data and reproducibility standards, organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross-section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data-quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, and work through the professional societies.

  11. Reproducibility of fMRI activations associated with auditory sentence comprehension.

    PubMed

    Gonzalez-Castillo, Javier; Talavage, Thomas M

    2011-02-01

    The reproducibility of three different aspects of fMRI activations (namely binary activation maps, effect size, and spatial distribution of local maxima) was evaluated for an auditory sentence comprehension task with high attention demand in a group of 17 subjects who were scanned on five different occasions. While in the scanner, subjects were asked to listen to a series of six short everyday sentences from the CUNY sentence test. Comprehension and attention to the stimuli were monitored after each listen-condition epoch by having subjects answer a series of multiple-choice questions. Statistical maps of activation for the listen condition were computed at three different levels: overall results for all imaging sessions, group-level/single-session results for each of the five imaging occasions, and single-subject/single-session results computed for each subject and each scanning occasion independently. The experimental task recruited a distributed bilateral network with processing nodes located in the lateral temporal cortex, inferior frontal cortex, medial BA6, medial occipital cortex and subcortical structures such as the putamen and the thalamus. Reproducibility of these activations at the group level was high (83.95% of the imaged volume was consistently classified as active/inactive across all five imaging sessions), indicating that sites of neuronal activity associated with auditory comprehension can reliably be detected with fMRI in healthy subjects across repeated measures after group averaging. At the single-subject level reproducibility ranged from moderate to high, although no significant differences were found on behavioral measures across subjects or sessions. This result suggests that contextual differences (i.e., those specific to each imaging session) can modulate our ability to detect fMRI activations associated with speech comprehension in individual subjects.
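
    The group-level figure quoted above is a simple voxel-wise consistency measure. A minimal sketch of that computation on hypothetical binary maps (the session count matches the study, but the maps and the activation rate are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical thresholded activation maps: 5 sessions x 1000 voxels,
# True = voxel classified active. All numbers here are assumptions,
# not the study's data.
maps = rng.random((5, 1000)) < 0.2

# A voxel is "consistently classified" if it carries the same label
# (active or inactive) in every session.
consistent = maps.all(axis=0) | (~maps).all(axis=0)
percent_consistent = 100.0 * consistent.mean()
print(f"{percent_consistent:.2f}% of voxels consistently classified")
```

With independent random maps the figure lands near 33%; real repeated sessions of the same subjects yield much higher consistency, as in the abstract.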

  12. Reproducible, rugged, and inexpensive photocathode x-ray diode

    SciTech Connect

    Idzorek, G. C.; Tierney, T. E.; Lockard, T. E.; Moy, K. J.; Keister, J. W.

    2008-10-15

    The photoemissive cathode type of x-ray diode (XRD) is popular for measuring the time- and spectrally resolved output of pulsed power experiments. The vitreous-carbon XRDs currently used on the Sandia National Laboratories Z-machine were designed in the early 1980s and use materials and processes no longer available. Additionally, cathodes used in the high x-ray flux and dirty vacuum environment of a machine such as Z suffer from response changes requiring recalibration. In searching for a suitable replacement cathode, we discovered that very-high-purity vitreous-carbon planchets are commercially available for use as biological substrates in scanning electron microscope (SEM) work. After simplifying the photocathode mounting to use commercially available components, we constructed a set of 20 XRDs using SEM planchets, which were then calibrated at the National Synchrotron Light Source at Brookhaven National Laboratory. We present comparisons of the reproducibility and absolute calibrations between the current vitreous-carbon XRDs and our new design.

  13. New model for datasets citation and extraction reproducibility in VAMDC

    NASA Astrophysics Data System (ADS)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated with individual data in the datasets, and timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. The new paradigm should enforce the traceability of datasets, favor the reproducibility of dataset extraction, and facilitate the systematic citation of the authors who originally measured and/or calculated the extracted atomic and molecular data.
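
    The traceability idea (origin, version, and extraction timestamp bundled with each extracted dataset) can be sketched as follows. The payload fields and token format are illustrative assumptions, not the actual VAMDC scheme:

```python
import hashlib
import json
from datetime import datetime, timezone

def dataset_identifier(query, node_version):
    """Bundle a query, the data-node version, and an extraction
    timestamp into a stable token, so the same extraction can later
    be recognized and, in principle, re-run. Illustrative only."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    payload = json.dumps(
        {"query": query, "version": node_version, "extracted": stamp},
        sort_keys=True,
    )
    # Short hex digest as a human-quotable dataset token.
    return stamp, hashlib.sha256(payload.encode()).hexdigest()[:16]

stamp, token = dataset_identifier("SELECT * WHERE ...", "12.07")
print(stamp, token)
```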

  14. Reproducible surface-enhanced Raman spectroscopy of small molecular anions

    NASA Astrophysics Data System (ADS)

    Owens, F. J.

    2011-03-01

    A gold-coated silicon substrate having an array of pyramidal holes is shown to provide reproducible surface-enhanced Raman spectra (SERS) for a number of small inorganic anions deposited on the substrate as 10^-3 to 10^-4 molar aqueous solutions of their salts. Of particular interest is the observation of a SERS effect for NO3-, the anion of ammonium nitrate, a commonly used terrorist explosive, suggesting the potential for sensitive detection of this material. An unusual increase in the frequency of the NO2- bending mode is observed in the SERS spectra of KNO2. Density functional theory calculations of the normal-mode frequencies of NO2- bonded to gold predict an upward shift of the frequencies compared with the calculated results for the free anion, suggesting a possible explanation for the shifts.

  15. A reproducible method for determination of nitrocellulose in soil.

    PubMed

    Macmillan, Denise K; Majerus, Chelsea R; Laubscher, Randy D; Shannon, John P

    2008-01-15

    A reproducible analytical method for determination of nitrocellulose in soil is described. The new method provides the precision and accuracy needed for quantitation of nitrocellulose in soils to enable worker safety on contaminated sites. The method utilizes water and ethanol washes to remove co-contaminants, acetone extraction of nitrocellulose, and base hydrolysis of the extract to reduce nitrate groups. The hydrolysate is then neutralized and analyzed by ion chromatography for determination of free nitrate and nitrite. A variety of bases for hydrolysis and acids for neutralization were evaluated, with 5 N sodium hydroxide and carbon dioxide giving the most complete hydrolysis and interference-free neutralization, respectively. The concentration of nitrocellulose in the soil is calculated from the concentrations of nitrate and nitrite and the weight percentage of nitrogen content in nitrocellulose. The laboratory detection limit for the analysis is 10 mg/kg. The method acceptance range for recovery of nitrocellulose from control samples is 78-105%.
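
    The final back-calculation step can be illustrated with a short sketch. The ion concentrations are invented, and the 12.6% nitrogen content is a typical value for nitrocellulose assumed here, not a figure from the paper:

```python
# Illustrative back-calculation of nitrocellulose (NC) from ion
# chromatography results. Molar masses in g/mol.
M_N, M_NO3, M_NO2 = 14.007, 62.004, 46.005

def nitrocellulose_mg_per_kg(no3_mg_kg, no2_mg_kg, pct_n_in_nc=12.6):
    # Convert each measured ion concentration to its nitrogen
    # equivalent...
    n_total = no3_mg_kg * M_N / M_NO3 + no2_mg_kg * M_N / M_NO2
    # ...then scale by the assumed weight fraction of N in NC.
    return n_total / (pct_n_in_nc / 100.0)

# Hypothetical measurements: 25 mg/kg nitrate, 5 mg/kg nitrite.
print(nitrocellulose_mg_per_kg(25.0, 5.0))
```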

  16. GigaDB: promoting data dissemination and reproducibility

    PubMed Central

    Sneddon, Tam P.; Si Zhe, Xiao; Edmunds, Scott C.; Li, Peter; Goodman, Laurie; Hunter, Christopher I.

    2014-01-01

    Often papers are published where the underlying data supporting the research are not made available because of the limitations of making such large data sets publicly and permanently accessible. Even if the raw data are deposited in public archives, the essential analysis intermediaries, scripts or software are frequently not made available, meaning the science is not reproducible. The GigaScience journal is attempting to address this issue with the associated data storage and dissemination portal, the GigaScience database (GigaDB). Here we present the current version of GigaDB and reveal plans for the next generation of improvements. However, most importantly, we are soliciting responses from you, the users, to ensure that future developments are focused on the data storage and dissemination issues that still need resolving. Database URL: http://www.gigadb.org PMID:24622612

  17. On the reproducibility of SSNTD track counting efficiency

    NASA Astrophysics Data System (ADS)

    Guedes O, S.; Hadler N, J. C.; Iunes, P. J.; Paulo, S. R.; Tello S, C. A.

    1998-12-01

    In this work, the influence of track density and chemical etching on the reproducibility of the track counting efficiency, ɛ0, in solid state nuclear track detectors (SSNTDs) is studied. This was performed through the analysis of CR-39 sheets that were attached to a thin film of natural uranium. Keeping the chemical etching parameters constant and varying the exposure time, ɛ0 is observed to be constant for track densities ranging between values approximately equal to the track background and those corresponding to the track-overlap limit, where track counting becomes difficult (~10^5 cm^-2 under our conditions). Conversely, keeping the exposure time constant and varying the etching temperature, a variation in ɛ0 is found if a usual track counting criterion is employed. However, such a variation vanishes statistically when a more rigorous criterion is adopted.

  18. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  19. Reproducibility in density functional theory calculations of solids.

    PubMed

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn; Blaha, Peter; Blügel, Stefan; Blum, Volker; Caliste, Damien; Castelli, Ivano E; Clark, Stewart J; Dal Corso, Andrea; de Gironcoli, Stefano; Deutsch, Thierry; Dewhurst, John Kay; Di Marco, Igor; Draxl, Claudia; Dułak, Marcin; Eriksson, Olle; Flores-Livas, José A; Garrity, Kevin F; Genovese, Luigi; Giannozzi, Paolo; Giantomassi, Matteo; Goedecker, Stefan; Gonze, Xavier; Grånäs, Oscar; Gross, E K U; Gulans, Andris; Gygi, François; Hamann, D R; Hasnip, Phil J; Holzwarth, N A W; Iuşan, Diana; Jochym, Dominik B; Jollet, François; Jones, Daniel; Kresse, Georg; Koepernik, Klaus; Küçükbenli, Emine; Kvashnin, Yaroslav O; Locht, Inka L M; Lubeck, Sven; Marsman, Martijn; Marzari, Nicola; Nitzsche, Ulrike; Nordström, Lars; Ozaki, Taisuke; Paulatto, Lorenzo; Pickard, Chris J; Poelmans, Ward; Probert, Matt I J; Refson, Keith; Richter, Manuel; Rignanese, Gian-Marco; Saha, Santanu; Scheffler, Matthias; Schlipf, Martin; Schwarz, Karlheinz; Sharma, Sangeeta; Tavazza, Francesca; Thunström, Patrik; Tkatchenko, Alexandre; Torrent, Marc; Vanderbilt, David; van Setten, Michiel J; Van Speybroeck, Veronique; Wills, John M; Yates, Jonathan R; Zhang, Guo-Xu; Cottenier, Stefaan

    2016-03-25

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
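
    The equations of state compared in such benchmarks are typically obtained by fitting computed energy-volume points to the third-order Birch-Murnaghan form. A minimal sketch with synthetic data (all parameter values are invented, not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    """Third-order Birch-Murnaghan energy-volume equation of state."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * B0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

# Synthetic E(V) points around a hypothetical equilibrium volume
# (units: eV, Angstrom^3, eV/Angstrom^3).
true = dict(E0=-10.0, V0=16.0, B0=0.6, B0p=4.5)
V = np.linspace(13.0, 19.0, 15)
E = birch_murnaghan(V, **true)

# Least-squares fit recovers the EOS parameters from the points.
popt, _ = curve_fit(birch_murnaghan, V, E, p0=(-9.0, 15.0, 0.5, 4.0))
E0_fit, V0_fit, B0_fit, B0p_fit = popt
print(V0_fit, B0_fit)
```

Comparing fitted (V0, B0, B0') across codes is the essence of such reproducibility metrics.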

  20. Chimie douce preparation of reproducible silver coatings for SERS applications

    NASA Astrophysics Data System (ADS)

    Sidorov, Alexander V.; Grigorieva, Anastasia V.; Goldt, Anastasia E.; Eremina, Olga E.; Veselova, Irina A.; Savilov, Sergey V.; Goodilin, Eugene A.

    2016-12-01

    A new soft chemistry preparation method of submicron — thick porous coatings of metallic silver is suggested for possible surface enhanced Raman spectroscopy (SERS) applications. The method is based on facile deposition of diamminesilver (I) aerosols forming instantly a nanostructured layer by fast decomposition and self-reduction of [Ag(NH3)2]+ aqueous solutions onto surfaces of inorganic substrates under mild conditions of 280-300∘C in air. A strong difference in overall microstructures and related SERS signals of model analytes is found for substrates with different deposition time and in comparison with a standard magnetron deposition technique. It is demonstrated that the suggested method is predominant for formation of robust SERS substrates with a stable and reproducible SERS enhancement.

  1. Using Scaling for accurate stochastic macroweather forecasts (including the "pause")

    NASA Astrophysics Data System (ADS)

    Lovejoy, Shaun; del Rio Amador, Lenin

    2015-04-01

    At scales corresponding to the lifetimes of structures of planetary extent (about 5-10 days), atmospheric processes undergo a drastic "dimensional transition" from high-frequency weather to lower-frequency macroweather processes. While conventional GCMs generally reproduce both the transition and the corresponding (scaling) statistics well, owing to sensitive dependence on initial conditions the role of the weather-scale processes is to provide random perturbations to the macroweather processes. The main problem with GCMs is thus that their long-term (control-run, unforced) statistics converge to the GCM climate, which is somewhat different from the real climate. This motivates building a stochastic model that exploits the empirical scaling properties and past data. It turns out that macroweather intermittency is typically low (the multifractal corrections are small), so that the processes can be approximated by fractional Gaussian noise (fGn), whose memory can be enormous. For example, for annual forecasts, and using the observed global temperature exponent, even 50 years of global temperature data would only allow us to exploit 90% of the available memory (for ocean regions, the figure increases to 600 years). The only complication is that anthropogenic effects dominate the global statistics at time scales beyond about 20 years. However, these are easy to remove using the CO2 forcing as a linear surrogate for all anthropogenic effects. Using this theoretical framework, we show how to make accurate stochastic macroweather forecasts. We illustrate this on monthly- and annual-scale series of global and northern-hemisphere surface temperatures (including nearly perfect hindcasts of the "pause" in the warming since 1998). We obtain forecast skill nearly as high as the theoretical (scaling) predictability limits allow. These scaling hindcasts use a single effective climate sensitivity and a single scaling exponent.
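
    The long memory of fGn that the abstract exploits can be illustrated with a minimal sketch: simulate unit-variance fGn exactly from its autocovariance and form the optimal linear one-step predictor from the past record. The exponent H and the record length are assumed values for illustration, not the paper's estimates:

```python
import numpy as np

def fgn_cov(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise."""
    k = np.abs(k).astype(float)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H)
                  + np.abs(k - 1) ** (2 * H))

H = 0.9          # assumed strongly persistent exponent
n = 200          # length of the "past" record
rng = np.random.default_rng(1)

# Exact simulation via the Cholesky factor of the covariance matrix.
lags = np.arange(n + 1)
R = fgn_cov(lags[:, None] - lags[None, :], H)
x = np.linalg.cholesky(R) @ rng.standard_normal(n + 1)

# Optimal linear one-step predictor: solve R_past a = r, where r holds
# the covariances between the future value and each past value.
a = np.linalg.solve(R[:n, :n], fgn_cov(n - np.arange(n), H))
forecast = a @ x[:n]
print(forecast, x[n])
```

For H near 1 the predictor weights decay very slowly, which is exactly the "enormous memory" the abstract refers to.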

  2. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions, according to the skill scores defined by Perkins et al. (2013) for the evaluation of AR4 climate models. We also investigate techniques for improving the bias correction of values in the tails of the distributions, including binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
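
    KDDM itself maps values through kernel-density-based CDFs; a simplified empirical analogue using quantile matching conveys the idea (the distributions and the bias are invented for illustration):

```python
import numpy as np

def quantile_map(model, obs, values):
    """Map `values` from the model distribution onto the observed one
    via empirical CDF matching: a simplified analogue of KDDM, which
    uses kernel density estimates of the two distributions instead."""
    q = np.linspace(0.01, 0.99, 99)
    model_q = np.quantile(model, q)
    obs_q = np.quantile(obs, q)
    # Locate each value's quantile in the model CDF, then read off the
    # observed value at that same quantile.
    return np.interp(np.interp(values, model_q, q), q, obs_q)

rng = np.random.default_rng(2)
obs = rng.normal(15.0, 3.0, 5000)    # "observed" daily Tmax, deg C
model = rng.normal(12.0, 4.0, 5000)  # cold- and spread-biased model
corrected = quantile_map(model, obs, model)
print(corrected.mean(), corrected.std())
```

Note the clamping at the 1st/99th percentiles: handling values beyond the training range is precisely the tail problem the abstract's generalized Pareto approach targets.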

  3. A workflow for reproducing mean benthic gas fluxes

    NASA Astrophysics Data System (ADS)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  4. Reproducibility of pre-syncopal responses to repeated orthostatic challenge

    NASA Astrophysics Data System (ADS)

    Goswami, Nandu; Grasser, Erik; Roessler, Andreas; Hinghofer-Szalkay, Helmut

    Aims: To study individual patterns of hemodynamic adjustment in subjects reaching orthostatically induced presyncope, and to determine whether these patterns are reproducible across three runs. Procedures and methods: 10 healthy young males were subjected to extreme cardiovascular stress three times: graded orthostatic stress (GOS), consisting of head-up tilt combined with lower body negative pressure, was used to achieve a pre-syncopal end-point. Test runs were separated by two-week intervals. Orthostatic effects on cardiac and vascular function were continuously monitored and standing times noted. Results: Across the group, heart rate (HR) increased by 112 percent, while mean arterial blood pressure dropped by 15 percent, pulse pressure by 36 percent, and stroke volume index by 51 percent on average from supine control to presyncope. Repetition of the orthostatic protocol did not influence standing times from the 1st to the 3rd trial (15 ± 6 to 17 ± 7 min). Some individuals responded with an increase in HR only, while others showed a combined, albeit brief, increase in HR and total peripheral resistance, and this individual-specific pattern was observed across all three runs of combined GOS. Conclusion: Strategies for maintaining blood pressure in response to central hypovolemia induced by orthostatic stress differ between subjects. However, the same individual-specific hemodynamic mechanism is employed each time a subject is reconfronted with this stress. Individual patterns of hemodynamic adjustment to orthostatic stress are highly reproducible across repeated exposures to pre-syncope.

  5. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever.

    PubMed

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Pérez-Girón, José V; Krasemann, Susanne; Günther, Stephan; Muñoz-Fontela, César

    2016-05-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology.

  6. pH Tester Gauge Repeatability and Reproducibility Study for WO3 Nanostructure Hydrothermal Growth Process

    NASA Astrophysics Data System (ADS)

    Abd Rashid, Amirul; Hayati Saad, Nor; Bien Chia Sheng, Daniel; Yee, Lee Wai

    2014-06-01

    pH is one of the important variables in the tungsten trioxide (WO3) nanostructure hydrothermal synthesis process. The morphology of the synthesized nanostructure can be properly controlled by measuring and controlling the pH value of the solution used in this facile synthesis route. It is therefore crucial to ensure that the gauge used for pH measurement is reliable. In this study, the gauge repeatability and reproducibility (GR&R) method was used to assess the repeatability and reproducibility of the pH tester. The design-of-experiment metrics and the experimental results were analyzed by ANOVA using Minitab software. The initial GR&R value for the tester was 17.55%, which is considered acceptable. To further improve the GR&R level, a new pH measuring procedure was introduced. With the new procedure, the GR&R value was reduced to 2.05%, meaning the tester is statistically well suited to measuring the pH of the solution prepared for the WO3 hydrothermal synthesis process.
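
    The ANOVA-based GR&R computation can be sketched as follows for a crossed parts-by-operators design; the simulated readings and variance components are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)
p, o, r = 10, 3, 3  # parts (solutions), operators, repeat readings

# Simulated pH readings: true part values + small operator bias + noise.
part_true = rng.normal(7.0, 0.30, p)[:, None, None]
oper_bias = rng.normal(0.0, 0.02, o)[None, :, None]
y = part_true + oper_bias + rng.normal(0.0, 0.03, (p, o, r))

# ANOVA mean squares for the crossed design (interaction ignored).
grand = y.mean()
ms_oper = p * r * ((y.mean(axis=(0, 2)) - grand) ** 2).sum() / (o - 1)
ms_part = o * r * ((y.mean(axis=(1, 2)) - grand) ** 2).sum() / (p - 1)
resid = y - y.mean(axis=2, keepdims=True)
ms_err = (resid ** 2).sum() / (p * o * (r - 1))

# Variance components: repeatability (equipment) and reproducibility
# (operators), then %GR&R relative to total study variation.
var_repeat = ms_err
var_reprod = max(0.0, (ms_oper - ms_err) / (p * r))
var_part = max(0.0, (ms_part - ms_err) / (o * r))
pct_grr = 100.0 * np.sqrt((var_repeat + var_reprod)
                          / (var_repeat + var_reprod + var_part))
print(f"%GR&R = {pct_grr:.1f}")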

  7. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever

    PubMed Central

    Oestereich, Lisa; Lüdtke, Anja; Ruibal, Paula; Pallasch, Elisa; Kerber, Romy; Rieger, Toni; Wurr, Stephanie; Bockholt, Sabrina; Krasemann, Susanne

    2016-01-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the annual high morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded due to the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans. This included high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology was an important component of LASF pathogenesis that was directly correlated with vascular leakage. Our strategy allows easy generation of a suitable small animal model to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology. PMID:27191716

  8. Fetal Cerebellar Vermis Circumference Measured by 2-Dimensional Ultrasound Scan: Reference Range, Feasibility and Reproducibility

    PubMed Central

    Spinelli, M.; Sica, C.; Meglio, L. D.; Bolla, D.; Raio, L.; Surbek, D.

    2016-01-01

    Purpose: To provide 2-dimensional ultrasonographic (2D-US) normograms of cerebellar vermis biometry, and to evaluate the feasibility and reproducibility of these measurements in clinical practice. Materials and Methods: A prospective cross-sectional study of 328 normal singleton pregnancies between 18 and 33 weeks of gestation. Measurements of the fetal cerebellar vermis circumference (VC) in the mid-sagittal plane were performed by both a senior and a junior operator using 2D-US. VC as a function of gestational age (GA) was expressed by regression equations. In 24 fetuses, 3-dimensional (3D) reconstructed planes were obtained to allow comparison with 2D-US measurements. The agreement between 2D and 3D measurements and the interobserver variability were assessed by intraclass correlation coefficients (ICC). Results: Satisfactory vermis measurements could be obtained in 89.9% of cases. The VC correlated linearly with GA (constant = -12.21; slope = 2.447; r = 0.887, p < 0.0001). A high degree of consistency was observed between 2D and 3D ultrasound measurements (ICC = 0.846, 95% CI 0.679-0.930) as well as between measurements obtained by different examiners (ICC = 0.890, 95% CI 989-0.945). Conclusion: 2-dimensional ultrasonographic measurement of the cerebellar vermis in the mid-sagittal view throughout gestation appears feasible and reproducible enough to be used in clinical practice. Such measurements may supply a tool for accurate identification of posterior fossa anomalies, providing the basis for proper counseling and management of these conditions.
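
    The reported regression can be turned into a small predictive sketch; units of mm (for VC) and weeks (for GA) are assumed from context:

```python
# Expected vermis circumference from the abstract's regression,
# VC = -12.21 + 2.447 * GA.
def expected_vc(ga_weeks):
    return -12.21 + 2.447 * ga_weeks

# Predicted VC across the study's gestational-age range.
for ga in (18, 24, 30):
    print(ga, round(expected_vc(ga), 1))
```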

  9. Novel TPLO Alignment Jig/Saw Guide Reproduces Freehand and Ideal Osteotomy Positions

    PubMed Central

    2016-01-01

    Objectives: To evaluate the ability of an alignment jig/saw guide to reproduce appropriate osteotomy positions in the tibial plateau leveling osteotomy (TPLO) in the dog. Methods: Lateral radiographs of 65 clinical TPLO procedures, using an alignment jig and freehand osteotomy performed by experienced TPLO surgeons with a 24 mm radial saw blade between Dec 2005-Dec 2007 and Nov 2013-Nov 2015, were reviewed. The freehand osteotomy position was compared to potential osteotomy positions using the alignment jig/saw guide. The proximal and distal jig pin holes on postoperative radiographs were used to align the jig to the bone; the saw guide position was selected to most closely match the osteotomy performed. The guide-to-osteotomy fit was categorized by the distance between the actual osteotomy and the proposed saw guide osteotomy at its greatest offset (≤1 mm = excellent; ≤2 mm = good; ≤3 mm = satisfactory; >3 mm = poor). Results: Sixty-four of 65 TPLO osteotomies could be matched satisfactorily by the saw guide. Proximal jig pin placement 3-4 mm from the joint surface and pin location in a craniocaudal plane on the proximal tibia were significantly associated with the guide-to-osteotomy fit (P = 0.021 and P = 0.047, respectively). Clinical Significance: The alignment jig/saw guide can be used to reproduce appropriate freehand osteotomy positions for TPLO. Furthermore, an ideal osteotomy position centered on the tibial intercondylar tubercles also is possible. Accurate placement of the proximal jig pin is a crucial step for correct positioning of the saw guide in either instance.
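
    The offset-based fit categories quoted in the abstract translate directly into a small helper (a sketch; only the thresholds come from the abstract):

```python
def guide_fit(offset_mm):
    """Categorize guide-to-osteotomy fit by the greatest offset (mm)
    between the actual and the proposed saw-guide osteotomy."""
    if offset_mm <= 1:
        return "excellent"
    if offset_mm <= 2:
        return "good"
    if offset_mm <= 3:
        return "satisfactory"
    return "poor"

print(guide_fit(1.5))  # good
```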

  10. Random sampling causes the low reproducibility of rare eukaryotic OTUs in Illumina COI metabarcoding

    PubMed Central

    Knowlton, Nancy

    2017-01-01

    common β descriptors but will exclude positive records of taxa that are functionally important. Our results further reinforce the need for technical replicates (parallel PCR and sequencing from the same sample) in metabarcoding experimental designs. Data reproducibility should be determined empirically as it will depend upon the sequencing depth, the type of sample, the sequence analysis pipeline, and the number of replicates. Moreover, estimating relative biomasses or abundances based on read counts remains elusive at the OTU level. PMID:28348924

  11. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated tungsten is pointed accurately and quickly by using sodium nitrite. The point produced is smooth, and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces the time and cost of preparing tungsten electrodes.

  12. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of meteor velocities. Despite the development of camera networks dedicated to the observation of meteors, an important discrepancy remains between the measured orbits of meteoroids and theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of meteoroid orbits therefore depends largely on the computation of the pre-atmospheric velocities. It is thus imperative to find ways to increase the precision of the velocity measurements. In this work, we analyse the different methods currently used to compute the velocities and trajectories of meteors: the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to compare the performance of these techniques objectively, we simulated realistic meteors ('fakeors') reproducing the measurement errors of many camera networks. Some fakeors were built following the propagation models studied by Gural (2012); others were created by numerical integration using the Borovicka et al. (2007) model. Different optimization techniques were also investigated in order to pick the most suitable one for solving the MPF, and the influence of the geometry of the trajectory on the result is also presented. We present the results of an improved implementation of the multi-parameter fitting that allows an accurate orbit computation of meteors with CABERNET. The comparison of the different velocity computations suggests that, although the MPF is by far the best method for solving the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimation errors for noisy data.
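
    The idea behind fitting a motion model to the observations, rather than differentiating noisy positions, can be sketched in a few lines. This is a toy illustration only, not the CABERNET pipeline: the constant-deceleration model and all numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated along-track positions of a "fakeor": constant-deceleration
# motion s(t) = v0*t - 0.5*a*t**2 plus Gaussian astrometric noise.
# All values are illustrative, not CABERNET measurements.
v0_true, a_true = 40.0, 5.0          # km/s, km/s^2
t = np.linspace(0.0, 0.5, 30)        # seconds of visible flight
s = v0_true * t - 0.5 * a_true * t**2 + rng.normal(0.0, 0.01, t.size)

# Least-squares fit of the quadratic motion model: the linear
# coefficient is the estimated pre-atmospheric velocity.
c2, c1, c0 = np.polyfit(t, s, 2)     # c1 estimates v0
print(f"estimated v0 = {c1:.2f} km/s")
```

    A full multi-parameter fit would instead minimize the angular residuals seen by each camera simultaneously, but the least-squares principle is the same.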

  13. Reproducible mesoscopic superpositions of Bose-Einstein condensates and mean-field chaos

    SciTech Connect

    Gertjerenken, Bettina; Arlinghaus, Stephan; Teichmann, Niklas; Weiss, Christoph

    2010-08-15

    In a parameter regime for which the mean-field (Gross-Pitaevskii) dynamics becomes chaotic, mesoscopic quantum superpositions in phase space can occur in a double-well potential, which is shaken periodically. For experimentally realistic initial states, such as the ground state of some 100 atoms, the emergence of mesoscopic quantum superpositions in phase space is investigated numerically. It is shown to be reproducible, even if the initial conditions change slightly. Although the final state is not a perfect superposition of two distinct phase states, the superposition is reached an order of magnitude faster than in the case of the collapse-and-revival phenomenon. Furthermore, a generator of entanglement is identified.

  14. Models that include supercoiling of topological domains reproduce several known features of interphase chromosomes.

    PubMed

    Benedetti, Fabrizio; Dorier, Julien; Burnier, Yannis; Stasiak, Andrzej

    2014-03-01

    Understanding the structure of interphase chromosomes is essential to elucidate regulatory mechanisms of gene expression. During recent years, high-throughput DNA sequencing expanded the power of chromosome conformation capture (3C) methods, which provide information about the reciprocal spatial proximity of chromosomal loci. Since 2012, it has been known that the entire chromatin in interphase chromosomes is organized into regions with strongly increased frequency of internal contacts. These regions, with an average size of ∼1 Mb, were named topological domains. More recent studies demonstrated the presence of unconstrained supercoiling in interphase chromosomes. Using Brownian dynamics simulations, we show here that by including supercoiling in models of topological domains one can reproduce, and thus provide possible explanations of, several experimentally observed characteristics of interphase chromosomes, such as their complex contact maps.

  15. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    PubMed

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements.
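
    The nested Fourier modeling described above can be sketched as follows. This is a toy illustration, not the authors' code: the 60 s breath-hold period, TR, amplitudes, and noise level are all assumed. Sine-cosine pairs at the task frequency and its harmonics are regressed against the signal, and the adjusted coefficient of determination is compared across model orders.

```python
import numpy as np

def fourier_design(t, period, order):
    """Design matrix: intercept plus sine-cosine pairs at the task
    frequency and its first (order - 1) harmonics."""
    cols = [np.ones_like(t)]
    for k in range(1, order + 1):
        w = 2.0 * np.pi * k / period
        cols += [np.sin(w * t), np.cos(w * t)]
    return np.column_stack(cols)

def adjusted_r2(y, X):
    """Ordinary least squares fit and adjusted R^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - (ss_res / (n - p)) / (ss_tot / (n - 1))

# Synthetic BOLD-like signal: fundamental at a hypothetical 60 s
# breath-hold cycle plus a weaker 2nd harmonic and noise, TR = 2 s.
rng = np.random.default_rng(1)
t = np.arange(0.0, 300.0, 2.0)
y = (np.sin(2 * np.pi * t / 60.0)
     + 0.4 * np.cos(2 * np.pi * 2 * t / 60.0)
     + rng.normal(0.0, 0.3, t.size))

r2 = [adjusted_r2(y, fourier_design(t, 60.0, k)) for k in (1, 2, 3)]
```

    With this synthetic signal the 2nd-order model captures the harmonic that the 1st-order model misses, mirroring the nested-model comparison in the study.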

  16. Developing simulations to reproduce in vivo fluoroscopy kinematics in total knee replacement patients.

    PubMed

    Fitzpatrick, Clare K; Komistek, Richard D; Rullkoetter, Paul J

    2014-07-18

    For clinically predictive testing and design-phase evaluation of prospective total knee replacement (TKR) implants, devices should ideally be evaluated under physiological loading conditions which incorporate population-level variability. A challenge exists for experimental and computational researchers in determining appropriate loading conditions for wear and kinematic knee simulators which reflect in vivo joint loading conditions. There is a great deal of kinematic data available from fluoroscopy studies. The purpose of this work was to develop computational methods to derive anterior-posterior (A-P) and internal-external (I-E) tibiofemoral (TF) joint loading conditions from in vivo kinematic data. Two computational models were developed, a simple TF model, and a more complex lower limb model. These models were driven through external loads applied to the tibia and femur in the TF model, and applied to the hip, ankle and muscles in the lower limb model. A custom feedback controller was integrated with the finite element environment and used to determine the external loads required to reproduce target kinematics at the TF joint. The computational platform was evaluated using in vivo kinematic data from four fluoroscopy patients, and reproduced in vivo A-P and I-E motions and compressive force with a root-mean-square (RMS) accuracy of less than 1 mm, 0.1°, and 40 N in the TF model and in vivo A-P and I-E motions, TF flexion, and compressive loads with a RMS accuracy of less than 1 mm, 0.1°, 1.4°, and 48 N in the lower limb model. The external loading conditions derived from these models can ultimately be used to establish population variability in loading conditions, for eventual use in computational as well as experimental activity simulations.
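
    The principle of a feedback controller that finds the external load reproducing a target motion can be shown on a far simpler system than the authors' finite element models. In this hypothetical sketch, a PI controller drives a one-degree-of-freedom mass-damper "joint" along a prescribed 5 mm sinusoidal A-P trajectory; all gains and parameters are invented for illustration.

```python
import numpy as np

# Toy stand-in for a kinematics-tracking feedback controller (not the
# authors' finite element setup): a PI controller computes the external
# load that drives a 1-DOF mass-damper "joint" along a target 5 mm
# sinusoidal trajectory. All values are illustrative.
m, c = 1.0, 20.0              # kg, N*s/m
kp, ki = 5000.0, 50000.0      # proportional and integral gains
dt = 1e-3
t = np.arange(0.0, 2.0, dt)
target = 5e-3 * np.sin(2.0 * np.pi * 1.0 * t)   # 1 Hz, 5 mm amplitude

x = v = integ = 0.0
errors = []
for xt in target:
    err = xt - x
    integ += err * dt
    force = kp * err + ki * integ     # control load applied to the joint
    v += (force - c * v) / m * dt     # semi-implicit Euler update
    x += v * dt
    errors.append(err)

# Tracking error after the transient has decayed (second half of run):
rms_err = float(np.sqrt(np.mean(np.square(errors[len(errors) // 2:]))))
```

    The same loop structure scales up to the paper's setting, where the "plant" is a finite element simulation and the controller output is a set of external joint and muscle loads.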

  17. Reproducible Data Processing Research for the CABRI R.I.A. experiments Acoustic Emission signal analysis

    SciTech Connect

    Pantera, Laurent

    2015-07-01

    The CABRI facility is an experimental nuclear reactor of the French Atomic Energy Commission (CEA) designed to study the behaviour of fuel rods at high burnup under Reactivity Initiated Accident (R.I.A.) conditions such as the scenario of a control rod ejection. During the experimental phase, the behaviour of the fuel element generates acoustic waves which can be detected by two microphones placed upstream and downstream from the test device. Studies carried out on the last fourteen tests showed the interest in carrying out temporal and spectral analyses on these signals by showing the existence of signatures which can be correlated with physical phenomena. We want presently to return to this rich data in order to have a new point of view by applying modern signal processing methods. Such an antecedent works resumption leads to some difficulties. Although all the raw data are accessible in the form of text files, analyses and graphics representations were not clear in reproducing from the former studies since the people who were in charge of the original work have left the laboratory and it is not easy when time passes, even with our own work, to be able to remember the steps of data manipulations and the exact setup. Thus we decided to consolidate the availability of the data and its manipulation in order to provide a robust data processing workflow to the experimentalists before doing any further investigations. To tackle this issue of strong links between data, treatments and the generation of documents, we adopted a Reproducible Research paradigm. We shall first present the tools chosen in our laboratory to implement this workflow and, then we shall describe the global perception carried out to continue the study of the Acoustic Emission signals recorded by the two microphones during the last fourteen CABRI R.I.A. tests. (authors)

  18. Accurate modelling of unsteady flows in collapsible tubes.

    PubMed

    Marchandise, Emilie; Flaud, Patrice

    2010-01-01

    The context of this paper is the development of a general and efficient numerical haemodynamic tool to help clinicians and researchers understand physiological flow phenomena. We propose an accurate one-dimensional Runge-Kutta discontinuous Galerkin (RK-DG) method coupled with lumped parameter models for the boundary conditions. The suggested model has already been successfully applied to haemodynamics in arteries and is now extended to flow in collapsible tubes such as veins. The main difference from cardiovascular simulations is that the flow may become supercritical and elastic jumps may appear, with the numerical consequence that the scheme may not remain monotone if no limiting procedure is introduced. We show that our second-order RK-DG method, equipped with an approximate Roe's Riemann solver and a slope-limiting procedure, allows us to capture elastic jumps accurately. Moreover, this paper demonstrates that the complex physics associated with such flows is modelled more accurately than with traditional methods such as finite difference methods or finite volumes. We present various benchmark problems that show the flexibility and applicability of the numerical method. Our solutions are compared with analytical solutions when they are available and with solutions obtained using other numerical methods. Finally, to illustrate the clinical interest, we study the emptying process in a calf vein squeezed by contracting skeletal muscle in a normal and a pathological subject. We compare our results with experimental simulations and discuss the sensitivity of our model to its parameters.
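
    The slope-limiting step mentioned above can be illustrated in isolation. A common choice, used here purely as an example (the abstract does not specify which limiter the authors use), is the minmod function, which keeps a cell's reconstructed slope only when it agrees in sign with both neighboring differences of the cell averages:

```python
import numpy as np

def minmod(a, b, c):
    """Elementwise minmod of three slope candidates: the
    smallest-magnitude value when all share a sign, else zero."""
    s = (np.sign(a) + np.sign(b) + np.sign(c)) / 3.0
    keep = np.abs(s) == 1.0                     # all three signs agree
    m = np.minimum(np.abs(a), np.minimum(np.abs(b), np.abs(c)))
    return np.where(keep, np.sign(a) * m, 0.0)

# Limit central slopes against forward/backward differences of the
# cell averages (illustrative data with a sharp jump):
u = np.array([0.0, 1.0, 4.0, 2.0, 2.0])        # cell averages
du = np.gradient(u)                            # unlimited central slopes
fwd = np.append(np.diff(u), 0.0)
bwd = np.insert(np.diff(u), 0, 0.0)
limited = minmod(du, fwd, bwd)
```

    Near the jump the limited slopes collapse to zero, which is exactly what keeps the scheme monotone at an elastic jump.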

  19. Refined Dummy Atom Model of Mg(2+) by Simple Parameter Screening Strategy with Revised Experimental Solvation Free Energy.

    PubMed

    Jiang, Yang; Zhang, Haiyang; Feng, Wei; Tan, Tianwei

    2015-12-28

    Metal ions play an important role in the catalysis of metalloenzymes. To investigate metalloenzymes via molecular modeling, a set of accurate force field parameters for metal ions is highly imperative. To extend its application range and improve the performance, the dummy atom model of metal ions was refined through a simple parameter screening strategy using the Mg(2+) ion as an example. Using the AMBER ff03 force field with the TIP3P model, the refined model accurately reproduced the experimental geometric and thermodynamic properties of Mg(2+). Compared with point charge models and previous dummy atom models, the refined dummy atom model yields an enhanced performance for producing reliable ATP/GTP-Mg(2+)-protein conformations in three metalloenzyme systems with single or double metal centers. Similar to other unbounded models, the refined model failed to reproduce the Mg-Mg distance and favored a monodentate binding of carboxylate groups, and these drawbacks needed to be considered with care. The outperformance of the refined model is mainly attributed to the use of a revised (more accurate) experimental solvation free energy and a suitable free energy correction protocol. This work provides a parameter screening strategy that can be readily applied to refine the dummy atom models for metal ions.

  20. Symphony: a framework for accurate and holistic WSN simulation.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2015-02-25

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles.
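
    The clock emulator with skew models mentioned above can be reduced to a few lines. This sketch uses a hypothetical API, not Symphony's: a node clock with a constant frequency offset in parts per million plus Gaussian read jitter, one of several skew models one might emulate.

```python
import random

class SkewedClock:
    """Toy sensor-node clock: constant skew (ppm) plus Gaussian read
    jitter. Illustrative only; not Symphony's actual clock emulator."""
    def __init__(self, skew_ppm=50.0, jitter_s=1e-6, seed=0):
        self.skew = 1.0 + skew_ppm * 1e-6
        self.jitter = jitter_s
        self.rng = random.Random(seed)

    def read(self, true_time_s):
        """Local time the node would report at a given true time."""
        return true_time_s * self.skew + self.rng.gauss(0.0, self.jitter)

clock = SkewedClock(skew_ppm=50.0)
drift_after_1h = clock.read(3600.0) - 3600.0   # ~0.18 s at 50 ppm
```

    More elaborate models (linear drift of the skew itself, temperature dependence) slot into the same `read` interface.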

  1. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  2. Accurate ab initio vibrational energies of methyl chloride

    SciTech Connect

    Owens, Alec; Yurchenko, Sergei N.; Yachmenev, Andrey; Tennyson, Jonathan; Thiel, Walter

    2015-06-28

    Two new nine-dimensional potential energy surfaces (PESs) have been generated using high-level ab initio theory for the two main isotopologues of methyl chloride, CH₃³⁵Cl and CH₃³⁷Cl. The respective PESs, CBS-35 HL and CBS-37 HL, are based on explicitly correlated coupled cluster calculations with extrapolation to the complete basis set (CBS) limit, and incorporate a range of higher-level (HL) additive energy corrections to account for core-valence electron correlation, higher-order coupled cluster terms, scalar relativistic effects, and diagonal Born-Oppenheimer corrections. Variational calculations of the vibrational energy levels were performed using the computer program TROVE, whose functionality has been extended to handle molecules of the form XY₃Z. Fully converged energies were obtained by means of a complete vibrational basis set extrapolation. The CBS-35 HL and CBS-37 HL PESs reproduce the fundamental term values with root-mean-square errors of 0.75 and 1.00 cm⁻¹, respectively. An analysis of the combined effect of the HL corrections and CBS extrapolation on the vibrational wavenumbers indicates that both are needed to compute accurate theoretical results for methyl chloride. We believe that it would be extremely challenging to go beyond the accuracy currently achieved for CH₃Cl without empirical refinement of the respective PESs.
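
    The abstract does not give the extrapolation formula, but a standard two-point CBS scheme for correlation energies (in the style of Helgaker and co-workers) assumes E(X) = E_CBS + A/X³ for basis cardinal number X, which yields a closed-form limit from two basis sets; the energies below are made up for illustration:

```python
def cbs_two_point(e_x, x, e_y, y):
    """Two-point 1/X^3 extrapolation of correlation energies from
    basis cardinal numbers x < y to the complete basis set limit."""
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

# Illustrative (made-up) correlation energies in hartree for
# triple-zeta (X=3) and quadruple-zeta (X=4) basis sets:
e_cbs = cbs_two_point(-0.3500, 3, -0.3650, 4)
```

    The HL additive corrections in the abstract are then computed separately (often in smaller basis sets) and added on top of the extrapolated energy.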

  3. Modification of Rayleigh-Plesset Theory for Reproducing Dynamics of Cavitation Bubbles in Liquid-Phase Laser Ablation

    NASA Astrophysics Data System (ADS)

    Soliman, Wafaa; Nakano, Tetsutaro; Takada, Noriharu; Sasaki, Koichi

    2010-11-01

    The solution of the conventional Rayleigh-Plesset equation did not agree with the experimental results on the temporal variations of the sizes of cavitation bubbles produced by laser ablation in water. In this work, we modified the conventional Rayleigh-Plesset theory in the following two points to reproduce the experimental observation theoretically. One was to introduce the effect of the contact angle among the water, the cavitation bubble, and the ablation target. The other was to treat the surface tension and the kinematic viscosity coefficient of water as additional adjusting parameters to fit the theoretical result with the experimental observation. The latter modification was effective especially for laser ablation in the pressurized water. Better agreement between the theoretical and the experimental results was realized with the help of these modifications, but anomalous thermodynamic parameters were necessary to obtain the best fitting. We evaluated the pressures and the temperatures inside the cavitation bubbles.
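
    The conventional Rayleigh-Plesset equation that serves as the starting point above can be integrated numerically in a few lines. This sketch uses water-like constants and an adiabatic gas law; the initial radius, pressures, and polytropic exponent are illustrative, not the paper's fitted values, and the contact-angle modification described in the abstract is not included.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Conventional Rayleigh-Plesset equation for bubble radius R(t):
#   R*R'' + 1.5*R'^2 = (p_gas(R) - p_inf - 2*sigma/R)/rho - 4*nu*R'/R
# Water-like parameters (illustrative only).
rho, sigma, nu = 1000.0, 0.072, 1.0e-6      # kg/m^3, N/m, m^2/s
p_inf, p0, R0 = 101325.0, 2.0e5, 100e-6     # Pa, Pa, m (assumed)
kappa = 1.4                                  # polytropic exponent

def rp_rhs(t, y):
    R, Rdot = y
    p_gas = p0 * (R0 / R) ** (3.0 * kappa)   # adiabatic gas pressure
    acc = ((p_gas - p_inf - 2.0 * sigma / R) / rho
           - 4.0 * nu * Rdot / R
           - 1.5 * Rdot**2) / R
    return [Rdot, acc]

sol = solve_ivp(rp_rhs, (0.0, 50e-6), [R0, 0.0],
                rtol=1e-8, atol=1e-12)
R_max = sol.y[0].max()
```

    With the gas initially overpressured the bubble expands and oscillates about its new equilibrium radius; fitting such solutions to observed radius-time curves is what motivates the modifications discussed in the abstract.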

  4. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  5. Perfusion phantom: An efficient and reproducible method to simulate myocardial first-pass perfusion measurements with cardiovascular magnetic resonance.

    PubMed

    Chiribiri, Amedeo; Schuster, Andreas; Ishida, Masaki; Hautvast, Gilion; Zarinabad, Niloufar; Morton, Geraint; Otton, James; Plein, Sven; Breeuwer, Marcel; Batchelor, Philip; Schaeffter, Tobias; Nagel, Eike

    2013-03-01

    The aim of this article is to describe a novel hardware perfusion phantom that simulates myocardial first-pass perfusion allowing comparisons between different MR techniques and validation of the results against a true gold standard. MR perfusion images were acquired at different myocardial perfusion rates and variable doses of gadolinium and cardiac output. The system proved to be sensitive to controlled variations of myocardial perfusion rate, contrast agent dose, and cardiac output. It produced distinct signal intensity curves for perfusion rates ranging from 1 to 10 mL/mL/min. Quantification of myocardial blood flow by signal deconvolution techniques provided accurate measurements of perfusion. The phantom also proved to be very reproducible between different sessions and different operators. This novel hardware perfusion phantom system allows reliable, reproducible, and efficient simulation of myocardial first-pass MR perfusion. Direct comparison between the results of image-based quantification and reference values of flow and myocardial perfusion will allow development and validation of accurate quantification methods.

  6. Diet rapidly and reproducibly alters the human gut microbiome

    PubMed Central

    David, Lawrence A.; Maurice, Corinne F.; Carmody, Rachel N.; Gootenberg, David B.; Button, Julie E.; Wolfe, Benjamin E.; Ling, Alisha V.; Devlin, A. Sloan; Varma, Yug; Fischbach, Michael A.; Biddinger, Sudha B.; Dutton, Rachel J.; Turnbaugh, Peter J.

    2013-01-01

    Long-term diet influences the structure and activity of the trillions of microorganisms residing in the human gut1–5, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here, we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila, and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale, and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals2, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi, and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids, and the outgrowth of microorganisms capable of triggering inflammatory bowel disease6. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles. PMID:24336217

  7. Resting Functional Connectivity of Language Networks: Characterization and Reproducibility

    PubMed Central

    Tomasi, Dardo; Volkow, Nora D.

    2011-01-01

    The neural basis of language comprehension and production has been associated with superior temporal (Wernicke's) and inferior frontal (Broca's) cortical areas, respectively. However, recent resting state functional connectivity (RSFC) and lesion studies implicate a more extended network in language processing. Using a large RSFC dataset from 970 healthy subjects and seed regions in Broca's and Wernicke's areas, we recapitulate this extended network, which includes adjoining prefrontal, temporal and parietal regions but also the bilateral caudate and left putamen/globus pallidus and subthalamic nucleus. We also show that the language network has a predominance of short-range functional connectivity (except the posterior Wernicke's area, which exhibited predominantly long-range connectivity), which is consistent with reliance on local processing. The long-range connectivity was predominantly left lateralized (except the anterior Wernicke's area, which exhibited rightward lateralization). The language network also exhibited anticorrelated activity with the auditory (only for Wernicke's area) and visual cortices, which suggests integrated sequential activity with regions involved in listening or reading words. Assessment of the intra-subject reproducibility of this network and its characterization in individuals with language dysfunction are needed to determine its potential as a biomarker for language disorders. PMID:22212597

  8. Virtual Raters for Reproducible and Objective Assessments in Radiology

    NASA Astrophysics Data System (ADS)

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-04-01

    Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective, we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining the knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Next to gross tumor volume (GTV) we also investigate subcategories like edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed the Pearson correlation, the intra-class correlation coefficient (ICC), and the Dice score. Virtual raters always lead to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters results in a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics.
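
    Of the agreement measures mentioned, the Dice score is the simplest to state: twice the overlap of two binary segmentation masks divided by the sum of their sizes. A minimal implementation (the masks are toy examples):

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary segmentation masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two hypothetical raters' masks for the same lesion:
rater1 = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]])
rater2 = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
score = dice(rater1, rater2)   # 2*3 / (4+3) = 6/7
```

    A Dice score of 1 means the raters' segmentations coincide exactly; comparing such scores with and without the virtual-rater guidance is how the improvement in agreement is quantified.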

  9. Reproducing Natural Spider Silks' Copolymer Behavior in Synthetic Silk Mimics

    SciTech Connect

    An, Bo; Jenkins, Janelle E; Sampath, Sujatha; Holland, Gregory P; Hinman, Mike; Yarger, Jeffery L; Lewis, Randolph

    2012-10-30

    Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test if this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic protein, recombinant MaSp1/MaSp2 mixed fibers as well as chimeric silk fibers from MaSp1 and MaSp2 sequences in a single protein were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results of tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of postspin stretching treatment in helping the fiber to form the proper spatial structure.

  10. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    PubMed Central

    2015-01-01

    The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC–MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation (source code is available from http://homepages.uc.edu/~wang2x7/Research.htm). From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61 to 93% of the time. When comparing across different instruments and quantitative technologies, using multiple replicates, differential genes were reproduced by other data sets from 67 to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation. PMID:26653538

  11. Numerically reproduced internal wave spectra in the deep ocean

    NASA Astrophysics Data System (ADS)

    Sugiyama, Yoshifumi; Niwa, Yoshihiro; Hibiya, Toshiyuki

    2009-04-01

    A vertically two-dimensional internal wave field is forced equally at the near-inertial frequency and the semidiurnal tidal frequency both at the lowest vertical wavenumber. These correspond to wind forcing and internal tide forcing, the main energy sources for the internal wave field. After 5 years of spin-up, a quasi-stationary internal wave field with characteristics of the Garrett-Munk-like spectrum is successfully reproduced. Furthermore, we carry out additional experiments by changing the strength of the semidiurnal tidal forcing relative to the near-inertial forcing. It is demonstrated that the Garrett-Munk-like spectrum is created and maintained only when energy is supplied both from the near-inertial forcing and the semidiurnal tidal forcing. So long as both energy sources are available, nonlinear interactions among internal waves occur such that the resulting internal wave spectrum becomes close to the Garrett-Munk-like spectrum irrespective of the ratio of the near-inertial forcing to the semidiurnal tidal forcing.

  12. Reproducing Natural Spider Silks’ Copolymer Behavior in Synthetic Silk Mimics

    PubMed Central

    An, Bo; Jenkins, Janelle E.; Sampath, Sujatha; Holland, Gregory P.; Hinman, Mike; Yarger, Jeffery L.; Lewis, Randolph

    2012-01-01

Dragline silk from orb-weaving spiders is a copolymer of two large proteins, major ampullate spidroin 1 (MaSp1) and 2 (MaSp2). The ratio of these proteins is known to have a large variation across different species of orb-weaving spiders. NMR results from gland material of two different species of spiders, N. clavipes and A. aurantia, indicate that MaSp1 proteins are more easily formed into β-sheet nanostructures, while MaSp2 proteins form random coil and helical structures. To test whether this behavior of natural silk proteins could be reproduced by recombinantly produced spider silk mimic proteins, recombinant MaSp1/MaSp2 mixed fibers, as well as chimeric silk fibers combining MaSp1 and MaSp2 sequences in a single protein, were produced based on the variable ratio and conserved motifs of MaSp1 and MaSp2 in native silk fiber. Mechanical properties, solid-state NMR, and XRD results for the tested synthetic fibers indicate the differing roles of MaSp1 and MaSp2 in the fiber and verify the importance of post-spin stretching treatment in helping the fiber form the proper spatial structure. PMID:23110450

  13. Repeatability and reproducibility of aquatic testing with zinc dithiophosphate

    SciTech Connect

    Hooter, D.L.; Hoke, D.I.; Kraska, R.C.; Wojewodka, R.A.

    1994-12-31

This testing program was designed to characterize the repeatability and reproducibility of aquatic screening studies with a water-insoluble chemical substance. Zinc dithiophosphate was selected for its limited water solubility and moderate aquatic toxicity. Acute tests were conducted using fathead minnows and Daphnia magna, according to guidelines developed to minimize random sources of non-repeatability. Zinc dithiophosphate was exposed to the organisms in static tests using an oil-water dispersion method for the fathead minnows and a water-accommodated-fraction (WAF) method for the Daphnia magna. Testing was conducted in moderately hard water with predetermined nominal concentrations of 0.1, 1.0, 10.0, 100.0, and 1000.0 ppm or ppm WAF. Twenty-four studies were contracted among three separate commercial contract laboratories. The program results demonstrate the diverse range of intralaboratory and interlaboratory variability based on organism type, and emphasize the need for further study and for caution in the design and implementation of aquatic testing for insoluble materials.

  14. Can a coupled meteorology–chemistry model reproduce the ...

    EPA Pesticide Factsheets

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  15. Highly reproducible thermocontrolled electrospun fiber based organic photovoltaic devices.

    PubMed

    Kim, Taehoon; Yang, Seung Jae; Sung, Sae Jin; Kim, Yern Seung; Chang, Mi Se; Jung, Haesol; Park, Chong Rae

    2015-03-04

In this work, we examined the reasons underlying the humidity-induced morphological changes of electrospun fibers and suggest a method of controlling electrospun fiber morphology under high-humidity conditions. We fabricated OPV devices composed of electrospun fibers, and the performance of these devices depends significantly on the fiber morphology. The evaporation rate of the solvent at various relative humidities was measured to investigate the effects of relative humidity during the electrospinning process. The beaded morphology of the electrospun fibers originated from the slow solvent evaporation rate under high-humidity conditions. To increase the evaporation rate under high-humidity conditions, warm air was applied to the electrospinning system. The beads that would otherwise have formed on the electrospun fibers were completely avoided, and the power conversion efficiencies of OPV devices fabricated under high-humidity conditions could be restored. These results highlight the simplicity and effectiveness of the proposed method for improving the reproducibility of electrospun nanofibers and the performance of devices incorporating them, regardless of the relative humidity.

  16. Virtual Raters for Reproducible and Objective Assessments in Radiology

    PubMed Central

    Kleesiek, Jens; Petersen, Jens; Döring, Markus; Maier-Hein, Klaus; Köthe, Ullrich; Wick, Wolfgang; Hamprecht, Fred A.; Bendszus, Martin; Biller, Armin

    2016-01-01

Volumetric measurements in radiologic images are important for monitoring tumor growth and treatment response. To make these more reproducible and objective we introduce the concept of virtual raters (VRs). A virtual rater is obtained by combining knowledge of machine-learning algorithms trained with past annotations of multiple human raters with the instantaneous rating of one human expert; it is thus virtually guided by several experts. To evaluate the approach we perform experiments with multi-channel magnetic resonance imaging (MRI) data sets. Next to gross tumor volume (GTV) we also investigate subcategories like edema, contrast-enhancing and non-enhancing tumor. The first data set consists of N = 71 longitudinal follow-up scans of 15 patients suffering from glioblastoma (GB). The second data set comprises N = 30 scans of low- and high-grade gliomas. For comparison we computed Pearson correlation, intra-class correlation coefficient (ICC) and Dice score. Virtual raters always lead to an improvement in inter- and intra-rater agreement. Comparing the 2D Response Assessment in Neuro-Oncology (RANO) measurements to the volumetric measurements of the virtual raters yields a deviating rating in one-third of the cases. Hence, we believe that our approach will have an impact on the evaluation of clinical studies as well as on routine imaging diagnostics. PMID:27118379

  17. The inter-observer reproducibility of Shafer's sign.

    PubMed

    Qureshi, F; Goble, R

    2009-03-01

Pigment cells in the anterior vitreous (Shafer's sign) are known to be associated with retinal breaks. We sought to identify the reproducibility of Shafer's sign between different grades of ophthalmic staff. In all, 47 patients were examined for Shafer's sign by a consultant vitreo-retinal surgeon, a senior house officer (SHO) and an optician. Cohen's kappa for consultant vs SHO assessment of Shafer's sign was 0.55, while for consultant vs optician assessment kappa was 0.28. Retinal tears were present in 63.8% of our series. Comparing consultant assessment of Shafer's sign with fundoscopy findings, we found specificity to be 93.5% and sensitivity 93.8%. Kappa for consultant assessment of Shafer's sign vs break presence was 0.86. Consultant and SHO assessment of Shafer's sign is of moderate agreement, while optician assessment is fair. These results suggest a relationship between training and the assessment of Shafer's sign. We feel this study suggests caution against undue reliance on Shafer's sign, particularly for inexperienced members of staff.
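Cohen's kappa, the agreement statistic reported in this study, corrects raw agreement for the agreement expected by chance. A minimal sketch; the ratings below are illustrative, not the study's data:

```python
# Cohen's kappa: agreement between two raters corrected for chance.
# Ratings are illustrative (1 = Shafer's sign present, 0 = absent).
def cohens_kappa(a, b):
    n = len(a)
    labels = sorted(set(a) | set(b))
    p_o = sum(x == y for x, y in zip(a, b)) / n                      # observed agreement
    p_e = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)   # chance agreement
    return (p_o - p_e) / (1 - p_e)

consultant = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
sho        = [1, 1, 0, 0, 1, 1, 0, 1, 1, 0]
print(round(cohens_kappa(consultant, sho), 2))  # 0.58
```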

  18. A silicon retina that reproduces signals in the optic nerve

    NASA Astrophysics Data System (ADS)

    Zaghloul, Kareem A.; Boahen, Kwabena

    2006-12-01

    Prosthetic devices may someday be used to treat lesions of the central nervous system. Similar to neural circuits, these prosthetic devices should adapt their properties over time, independent of external control. Here we describe an artificial retina, constructed in silicon using single-transistor synaptic primitives, with two forms of locally controlled adaptation: luminance adaptation and contrast gain control. Both forms of adaptation rely on local modulation of synaptic strength, thus meeting the criteria of internal control. Our device is the first to reproduce the responses of the four major ganglion cell types that drive visual cortex, producing 3600 spiking outputs in total. We demonstrate how the responses of our device's ganglion cells compare to those measured from the mammalian retina. Replicating the retina's synaptic organization in our chip made it possible to perform these computations using a hundred times less energy than a microprocessor—and to match the mammalian retina in size and weight. With this level of efficiency and autonomy, it is now possible to develop fully implantable intraocular prostheses.

  19. Reproducibility of measurements of trace gas concentrations in expired air.

    PubMed

    Strocchi, A; Ellis, C; Levitt, M D

    1991-07-01

    Measurement of the pulmonary excretion of trace gases has been used as a simple means of assessing metabolic reactions. End alveolar trace gas concentration, rather than excretory rate, is usually measured. However, the reproducibility of this measurement has received little attention. In 17 healthy subjects, duplicate collections of alveolar air were obtained within 1 minute of each other using a commercially available alveolar air sampler. The concentrations of hydrogen, methane, carbon monoxide, and carbon dioxide were measured. When the subject received no instruction on how to expire into the device, a difference of 28% +/- 19% (1SD) was found between duplicate determinations of hydrogen. Instructing the subjects to avoid hyperventilation or to inspire maximally and exhale immediately resulted in only minor reduction in variability. However, a maximal inspiration held for 15 seconds before exhalation reduced the difference to a mean of 9.6% +/- 8.0%, less than half that observed with the other expiratory techniques. Percentage difference of methane measurements with the four different expiratory techniques yielded results comparable to those obtained for hydrogen. In contrast, percentage differences for carbon monoxide measurements were similar for all expiratory techniques. When normalized to a PCO2 of 5%, the variability of hydrogen measurements with the breath-holding technique was reduced to 6.8% +/- 4.7%, a value significantly lower than that obtained with the other expiratory methods. This study suggests that attention to the expiratory technique could improve the accuracy of tests using breath hydrogen measurements.
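The PCO2 normalization mentioned at the end scales each reading to a standard 5% alveolar CO2, correcting for dilution of the sample with dead-space air. A minimal sketch of that correction; the readings below are invented:

```python
# Normalize a breath trace-gas reading to a standard alveolar PCO2 of 5%,
# the correction applied in the study; the sample values are invented.
def normalize_to_pco2(gas_ppm, pco2_percent, reference_pco2=5.0):
    """Scale a concentration as if the sample had the reference PCO2."""
    return gas_ppm * reference_pco2 / pco2_percent

# A dead-space-diluted sample (PCO2 4%) scales up; a concentrated one down.
print(normalize_to_pco2(20.0, 4.0))   # 25.0
print(normalize_to_pco2(22.0, 5.5))   # 20.0
```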

  20. Accuracy and reproducibility of cholesterol assay in the western Cape.

    PubMed

    Berger, G M; Christopher, K; Juritz, J M; Liesegang, F

    1988-11-19

    The accuracy and precision of cholesterol assay in the western Cape region is reported. The survey was carried out over 15 weeks utilising three human EDTA plasma pools with normal, borderline high and high cholesterol levels respectively. All 11 laboratories in the region providing a service to academic, provincial or military hospitals or to the private medical sector were included in the study. Ten of the 11 laboratories utilised automated enzymatic methods of cholesterol assay whereas 1 used a manual procedure based on the Liebermann-Burchard reaction. Methods were standardised by means of a variety of commercial calibrator material in all except 1 laboratory which used reference sera from the Centers for Disease Control, Atlanta. The performance of the 4 best laboratories met the standard of precision recommended for cholesterol assay, viz. total coefficient of variation of less than or equal to 2.5%. However, only 2 of the 11 laboratories achieved the optimum objective of an overall bias of less than 2.0% together with precision of less than or equal to 2.5%. Rational use of cholesterol assay for diagnosis and management will therefore require standardisation of cholesterol assay on a common reference material and greater attention to analytical factors influencing the reproducibility of results. Intrinsic biological variation also contributes uncertainty to the interpretation of a single value. Thus important clinical decisions must be based on two or more assays carried out using appropriate methodology.
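The two performance criteria used in the survey, total coefficient of variation (≤ 2.5%) and overall bias (< 2.0%), can be computed as below; the replicate values and reference target are invented for illustration:

```python
# Total CV and percent bias for a set of repeated cholesterol assays;
# the replicate values below are hypothetical, not the survey's data.
import statistics

def cv_percent(values):
    """Total CV: relative spread of repeated assays, as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def bias_percent(values, target):
    """Mean deviation from the assigned reference value, as a percentage."""
    return 100.0 * (statistics.mean(values) - target) / target

replicates = [5.18, 5.22, 5.25, 5.20, 5.15]  # mmol/L, hypothetical results
target = 5.20                                # assigned reference value
print(round(cv_percent(replicates), 2))      # 0.73, meets CV <= 2.5%
print(abs(bias_percent(replicates, target)) < 2.0)
```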

  1. Reproducibility of Differential Proteomic Technologies in CPTAC Fractionated Xenografts

    SciTech Connect

    Tabb, David L.; Wang, Xia; Carr, Steven A.; Clauser, Karl R.; Mertins, Philipp; Chambers, Matthew C.; Holman, Jerry D.; Wang, Jing; Zhang, Bing; Zimmerman, Lisa J.; Chen, Xian; Gunawardena, Harsha P.; Davies, Sherri R.; Ellis, Matthew J. C.; Li, Shunqiang; Townsend, R. Reid; Boja, Emily S.; Ketchum, Karen A.; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Liu, Tao; Kim, Sangtae; McDermott, Jason E.; Payne, Samuel H.; Petyuk, Vladislav A.; Rodland, Karin D.; Smith, Richard D.; Yang, Feng; Chan, Daniel W.; Zhang, Bai; Zhang, Hui; Zhang, Zhen; Zhou, Jian-Ying; Liebler, Daniel C.

    2016-03-04

The NCI Clinical Proteomic Tumor Analysis Consortium (CPTAC) employed a pair of reference xenograft proteomes for initial platform validation and ongoing quality control of its data collection for The Cancer Genome Atlas (TCGA) tumors. These two xenografts, representing basal and luminal-B human breast cancer, were fractionated and analyzed on six mass spectrometers in a total of 46 replicates divided between iTRAQ and label-free technologies, spanning a total of 1095 LC-MS/MS experiments. These data represent a unique opportunity to evaluate the stability of proteomic differentiation by mass spectrometry over many months of time for individual instruments or across instruments running dissimilar workflows. We evaluated iTRAQ reporter ions, label-free spectral counts, and label-free extracted ion chromatograms as strategies for data interpretation. From these assessments, we found that differential genes from a single replicate were confirmed by other replicates on the same instrument from 61% to 93% of the time. When comparing across different instruments and quantitative technologies, differential genes were reproduced by other data sets from 67% to 99% of the time. Projecting gene differences to biological pathways and networks increased the degree of similarity. These overlaps send an encouraging message about the maturity of technologies for proteomic differentiation.

  2. Histopathologic reproducibility of thyroid disease in an epidemiologic study

    SciTech Connect

    Ron, E.; Griffel, B.; Liban, E.; Modan, B.

    1986-03-01

    An investigation of the long-term effects of childhood scalp irradiation demonstrated a significantly increased risk of thyroid tumors in the irradiated population. Because of the complexity of thyroid cancer diagnosis, a histopathologic slide review of 59 of the 68 patients (irradiated and nonirradiated) with thyroid disease was undertaken. The review revealed 90% agreement (kappa = +0.85, P less than 0.01) between the original and review diagnosis. Four of 27 cases previously diagnosed as malignant were reclassified as benign, yielding a cancer misdiagnosis rate of 14.8%. All four of the misdiagnosed cancers were of follicular or mixed papillary-follicular type. As a result of the histologic review, the ratio of malignant to benign tumors decreased from 2.55 to 1.75. Since disagreement in diagnosis was similar in the irradiated and nonirradiated groups, the relative risk of radiation-associated neoplasms did not change substantially. The histopathologic review shows that although there were some problems in diagnostic reproducibility, they were not statistically significant and did not alter our previous conclusions regarding radiation exposure. However, a 15% reduction in the number of malignancies might affect epidemiologic studies with an external comparison as well as geographic or temporal comparisons.

  3. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder

    PubMed Central

    Richardson, Jason R.; Taylor, Michele M.; Shalat, Stuart L.; Guillot, Thomas S.; Caudle, W. Michael; Hossain, Muhammad M.; Mathews, Tiffany A.; Jones, Sara R.; Cory-Slechta, Deborah A.; Miller, Gary W.

    2015-01-01

    Attention-deficit hyperactivity disorder (ADHD) is estimated to affect 8–12% of school-age children worldwide. ADHD is a complex disorder with significant genetic contributions. However, no single gene has been linked to a significant percentage of cases, suggesting that environmental factors may contribute to ADHD. Here, we used behavioral, molecular, and neurochemical techniques to characterize the effects of developmental exposure to the pyrethroid pesticide deltamethrin. We also used epidemiologic methods to determine whether there is an association between pyrethroid exposure and diagnosis of ADHD. Mice exposed to the pyrethroid pesticide deltamethrin during development exhibit several features reminiscent of ADHD, including elevated dopamine transporter (DAT) levels, hyperactivity, working memory and attention deficits, and impulsive-like behavior. Increased DAT and D1 dopamine receptor levels appear to be responsible for the behavioral deficits. Epidemiologic data reveal that children aged 6–15 with detectable levels of pyrethroid metabolites in their urine were more than twice as likely to be diagnosed with ADHD. Our epidemiologic finding, combined with the recapitulation of ADHD behavior in pesticide-treated mice, provides a mechanistic basis to suggest that developmental pyrethroid exposure is a risk factor for ADHD.—Richardson, J. R., Taylor, M. M., Shalat, S. L., Guillot III, T. S., Caudle, W. M., Hossain, M. M., Mathews, T. A., Jones, S. R., Cory-Slechta, D. A., Miller, G. W. Developmental pesticide exposure reproduces features of attention deficit hyperactivity disorder. PMID:25630971

  4. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081
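Beat-based tuning, the mechanism the study identifies, works because two nearly identical tones sum to a signal whose loudness waxes and wanes at the difference frequency, which can be heard without fine pitch discrimination. A sketch with illustrative frequencies:

```python
# Two nearly matched tones produce an amplitude envelope at |f1 - f2|;
# the player adjusts the string until the beating stops.
import numpy as np

fs = 8000                                   # sample rate, Hz
t = np.arange(0, 4.0, 1 / fs)
f_ref, f_string = 440.0, 440.5              # reference tone vs slightly sharp string
mix = np.sin(2 * np.pi * f_ref * t) + np.sin(2 * np.pi * f_string * t)

beat_hz = abs(f_string - f_ref)
print(beat_hz)  # 0.5 -> one loudness swell every 2 seconds
```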

  5. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E. )

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  6. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within +/-0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.
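A first- or second-order temperature correction of the kind described can be sketched as follows; the coefficients and reference temperature are invented for illustration, not measured LED data:

```python
# Second-order model of an LED channel's relative flux vs junction
# temperature; a feedback controller inverts this to hold chromaticity.
def relative_flux(t_junction_c, a0=1.0, a1=-0.004, a2=-1e-5, t_ref=25.0):
    """Relative output at a junction temperature (coefficients invented)."""
    dt = t_junction_c - t_ref
    return a0 + a1 * dt + a2 * dt * dt

print(round(relative_flux(25.0), 3))  # 1.0 at the reference temperature
print(round(relative_flux(70.0), 3))  # reduced output when hot
```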

  7. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

8. An Accurate, Simplified Model of Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS) we derive an accurate, greatly simplified model of IBS, valid for high energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where η_{x,y}²/β_{x,y} has been replaced by ℋ_{x,y}) asymptotically approaches the result of Bjorken-Mtingwa.
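For reference, the quantity ℋ substituted for η²/β in the modified Piwinski formulation is the standard dispersion invariant of the lattice; this gloss in conventional accelerator notation is our addition, not part of the abstract:

```latex
\mathcal{H}_{x,y} = \frac{1}{\beta_{x,y}}\left[\eta_{x,y}^{2}
  + \left(\beta_{x,y}\,\eta_{x,y}' - \tfrac{1}{2}\,\beta_{x,y}'\,\eta_{x,y}\right)^{2}\right]
```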

  9. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.

  10. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  11. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.
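The claim that the performance metrics are exponential rather than normal can be probed by comparing the log-likelihoods of the two fitted models; the data below are synthetic, not the paper's robot runs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
times = rng.exponential(scale=30.0, size=200)   # simulated task-completion times (s)

# Maximum-likelihood fit of each candidate distribution, then compare
# total log-likelihood: the better model assigns the data higher probability.
loc, scale = stats.expon.fit(times, floc=0)
ll_expon = stats.expon.logpdf(times, loc, scale).sum()
mu, sigma = stats.norm.fit(times)
ll_norm = stats.norm.logpdf(times, mu, sigma).sum()
print(ll_expon > ll_norm)  # True: the exponential model fits these data better
```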

  12. Accurate vessel segmentation with constrained B-snake.

    PubMed

    Yuanzhi Cheng; Xin Hu; Ji Wang; Yadong Wang; Tamura, Shinichi

    2015-08-01

We describe an active contour framework with accurate shape and size constraints on the vessel cross-sectional planes to produce the vessel segmentation. It starts with a multiscale vessel axis tracing in a 3D computed tomography (CT) data, followed by vessel boundary delineation on the cross-sectional planes derived from the extracted axis. The vessel boundary surface is deformed under constrained movements on the cross sections and is voxelized to produce the final vascular segmentation. The novelty of this paper lies in the accurate contour point detection of thin vessels based on the CT scanning model, in the efficient implementation of missing contour points in the problematic regions and in the active contour model with accurate shape and size constraints. The main advantage of our framework is that it avoids disconnected and incomplete segmentation of the vessels in the problematic regions that contain touching vessels (vessels in close proximity to each other), diseased portions (pathologic structure attached to a vessel), and thin vessels. It is particularly suitable for accurate segmentation of thin and low contrast vessels. Our method is evaluated and demonstrated on CT data sets from our partner site, and its results are compared with three related methods. Our method is also tested on two publicly available databases and its results are compared with a recently published method. The applicability of the proposed method to some challenging clinical problems, the segmentation of the vessels in the problematic regions, is demonstrated with good results in both quantitative and qualitative experiments; our segmentation algorithm can delineate vessel boundaries that have a level of variability similar to those obtained manually.

  13. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g. >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to estimate first the univariate distributions of imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
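The marginals-plus-Copula recipe described here can be sketched with a Gaussian copula; the data, dimensions, and sample sizes below are invented for illustration, not the paper's imaging measures:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Sparse "imaging measures": n subjects x d measures (synthetic placeholder data).
X = rng.gamma(shape=2.0, scale=1.5, size=(60, 3))
n, d = X.shape

# Step 1: estimate the univariate marginals via empirical CDFs (ranks).
# Step 2: map to Gaussian scores and estimate the copula correlation.
U = (np.argsort(np.argsort(X, axis=0), axis=0) + 0.5) / n   # empirical CDF values in (0, 1)
Z = stats.norm.ppf(U)
R = np.corrcoef(Z, rowvar=False)                            # Gaussian-copula correlation

# Step 3: sample the copula, then invert the empirical marginals to
# generate a dense synthetic set of measures for classifier training.
m = 1000
Z_new = rng.multivariate_normal(np.zeros(d), R, size=m)
U_new = stats.norm.cdf(Z_new)
X_new = np.empty_like(U_new)
for j in range(d):
    X_new[:, j] = np.quantile(X[:, j], U_new[:, j])         # inverse empirical CDF

print(X_new.shape)  # (1000, 3)
```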

  14. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  15. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions of software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  16. A tailored multi-frequency EPR approach to accurately determine the magnetic resonance parameters of dynamic nuclear polarization agents: application to AMUPol.

    PubMed

    Gast, P; Mance, D; Zurlo, E; Ivanov, K L; Baldus, M; Huber, M

    2017-02-01

    To understand the dynamic nuclear polarization (DNP) enhancements of biradical polarizing agents, the magnetic resonance parameters need to be known. We describe a tailored EPR approach to accurately determine electron spin-spin coupling parameters using a combination of standard (9 GHz), high (95 GHz) and ultra-high (275 GHz) frequency EPR. Comparing liquid- and frozen-solution continuous-wave EPR spectra provides accurate anisotropic dipolar interaction D and isotropic exchange interaction J parameters of the DNP biradical AMUPol. We found that D was larger by as much as 30% compared to earlier estimates, and that J is 43 MHz, whereas before it was considered to be negligible. With the refined data, quantum mechanical calculations confirm that an increase in dipolar electron-electron couplings leads to higher cross-effect DNP efficiencies. Moreover, the DNP calculations qualitatively reproduce the difference of TOTAPOL and AMUPol DNP efficiencies found experimentally and suggest that AMUPol is particularly effective in improving the DNP efficiency at magnetic fields higher than 500 MHz. The multi-frequency EPR approach will aid in predicting the optimal structures for future DNP agents.

  17. Color accuracy and reproducibility in whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Hulsken, Bas

    2014-01-01

    We propose a workflow for color reproduction in whole slide imaging (WSI) scanners such that the colors in the scanned images match the actual slide colors and inter-scanner variation is minimized. We describe a new method of preparing and verifying a color phantom slide, consisting of a standard IT8-target transmissive film, which is used in color calibrating and profiling the WSI scanner. We explore several International Color Consortium (ICC) compliant techniques for color calibration/profiling and rendering intents for translating the scanner-specific colors to the standard display (sRGB) color space. Based on the quality of the color reproduction in histopathology slides, we propose the matrix-based calibration/profiling and absolute colorimetric rendering approach. The main advantages of the proposed workflow are that it is compliant with the ICC standard, applicable to color management systems on different platforms, and involves no external color measurement devices. We quantify color difference using the CIE ΔE2000 metric, where ΔE values below 1 are considered imperceptible. Our evaluation on 14 phantom slides, manufactured according to the proposed method, shows an average inter-slide color difference below 1 ΔE. The proposed workflow is implemented and evaluated in 35 WSI scanners developed at Philips, called the Ultra Fast Scanners (UFS). The color accuracy, measured as ΔE between the scanner-reproduced colors and the reference colorimetric values of the phantom patches, improves on average from 10 ΔE in uncalibrated scanners to 3.5 ΔE in calibrated scanners. The average inter-scanner color difference is found to be 1.2 ΔE. The improvement in color performance upon using the proposed method is apparent in the visual color quality of the tissue scans. PMID:26158041
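
The paper quantifies color difference with the CIE ΔE2000 metric, whose full formula is lengthy. As a hedged illustration of the underlying idea, the older CIE76 ΔE (a plain Euclidean distance in CIELAB, which ΔE2000 refines with weighting terms) can be sketched as:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two (L*, a*, b*)
    triples. Values below ~1 are conventionally considered imperceptible.
    Note: this is the simpler 1976 formula, not the CIEDE2000 metric used
    in the paper, which adds lightness/chroma/hue weighting."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))
```

For example, two colors at the same lightness separated by a 3-4-5 triangle in (a*, b*) differ by exactly 5 ΔE76.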

  18. Development of a Consistent and Reproducible Porcine Scald Burn Model

    PubMed Central

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5-second burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves the consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  19. A reproducible method to determine the meteoroid mass index

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Brown, P. G.

    2016-08-01

    Context. The determination of meteoroid mass indices is central to flux measurements and evolutionary studies of meteoroid populations. However, different authors use different approaches to fit observed data, making results difficult to reproduce and the resulting uncertainties difficult to justify. The real physical uncertainties are usually an order of magnitude higher than the reported values. Aims: We aim to develop a fully automated method that measures meteoroid mass indices and the associated uncertainty. We validate our method on large radar and optical datasets and compare results to obtain a best estimate of the true meteoroid mass index. Methods: Using MultiNest, a Bayesian inference tool that calculates the evidence and explores the parameter space, we search for the best fit of cumulative number vs. mass distributions in a four-dimensional space of variables (a, b, X1, X2). We explore biases in meteor echo distributions using optical meteor data as a calibration dataset to establish the systematic offset in measured mass index values. Results: Our best estimate for the average de-biased mass index for the sporadic meteoroid complex, as measured by radar in the mass range 10^-3 > m > 10^-5 g, was s = -2.10 ± 0.08. Optical data in the 10^-1 > m > 10^-3 g range, with the shower meteors removed, produced s = -2.08 ± 0.08. We find that the mass index used by Grün et al. (1985) is substantially larger than we measure in the 10^-4 < m < 10^-1 g range. Our code, with a simple manual and a sample dataset, can be found at ftp://aquarid.physics.uwo.ca/pub/peter/MassIndexCode/
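
As an illustrative aside (not the authors' MultiNest-based pipeline), the raw ingredient behind mass-index estimates is the slope of log N(>m) versus log m for a set of observed masses, which a simple least-squares fit recovers. The function name is hypothetical, and real analyses must additionally de-bias and propagate uncertainty as the paper describes.

```python
import math

def cumulative_mass_slope(masses):
    """Least-squares slope of log10 N(>m) vs. log10 m for observed masses.
    Conventions for the mass index differ between authors; this returns only
    the raw cumulative slope, with no de-biasing or uncertainty estimate."""
    ms = sorted(masses, reverse=True)
    xs = [math.log10(m) for m in ms]
    ys = [math.log10(rank) for rank in range(1, len(ms) + 1)]  # rank = N(>m)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A population drawn exactly from N(>m) ∝ m^-1 yields a slope of -1.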

  20. Scan-rescan reproducibility of CT densitometric measures of emphysema

    NASA Astrophysics Data System (ADS)

    Chong, D.; van Rikxoort, E. M.; Kim, H. J.; Goldin, J. G.; Brown, M. S.

    2011-03-01

    This study investigated the reproducibility of HRCT densitometric measures of emphysema in patients scanned twice one week apart. 24 emphysema patients from a multicenter study were scanned at full inspiration (TLC) and expiration (RV), then again a week later, for four scans in total. Scans for each patient used the same scanner and protocol, except for tube current in three patients. Lung segmentation with gross airway removal was performed on the scans. Volume, weight, mean lung density (MLD), relative area under -950 HU (RA-950), and 15th percentile (PD-15) were calculated for TLC, and volume and an air-trapping mask (RA-air) between -950 and -850 HU for RV. For each measure, absolute differences were computed for each scan pair, and linear regression was performed against volume difference in a subgroup with volume difference <500 mL. Two TLC scan pairs were excluded due to segmentation failure. The mean lung volumes were 5802 ± 1420 mL for TLC and 3878 ± 1077 mL for RV. The mean absolute differences were 169 mL for TLC volume, 316 mL for RV volume, 14.5 g for weight, 5.0 HU for MLD, 0.66 p.p. for RA-950, 2.4 HU for PD-15, and 3.1 p.p. for RA-air. The <500 mL subgroup had 20 scan pairs for TLC and RV. The R^2 values were 0.8 for weight, 0.60 for MLD, 0.29 for RA-950, 0.31 for PD-15, and 0.64 for RA-air. Our results indicate that considerable variability exists in densitometric measures over one week that cannot be attributed to breath-hold or physiology. This has implications for clinical trials relying on these measures to assess emphysema treatment efficacy.
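
The densitometric measures named above can be sketched for a flat list of lung-voxel HU values. This is a minimal illustration assuming a nearest-rank percentile definition for PD-15, not the study's actual analysis software.

```python
import math

def densitometry(hu_values):
    """Illustrative densitometric measures from lung-voxel HU values:
    MLD  = mean lung density (HU),
    RA-950 = percentage of voxels below -950 HU,
    PD-15  = 15th-percentile HU (nearest-rank definition, an assumption)."""
    n = len(hu_values)
    mld = sum(hu_values) / n
    ra950 = 100.0 * sum(1 for v in hu_values if v < -950) / n
    ranked = sorted(hu_values)
    pd15 = ranked[max(0, math.ceil(0.15 * n) - 1)]
    return mld, ra950, pd15
```

In practice these would be computed over the segmented lung volume only, after airway removal as described in the abstract.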

  1. Reproducing American Sign Language sentences: cognitive scaffolding in working memory

    PubMed Central

    Supalla, Ted; Hauser, Peter C.; Bavelier, Daphne

    2014-01-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and biases across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies when they failed to recall the sentence correctly. A qualitative error analysis allows us to capture generalizations about the relationship between error patterns and the cognitive scaffolding that governs the sentence reproduction process. Highly fluent and less-fluent signers share common chokepoints on particular words in sentences; however, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with meaning equivalent to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less-fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are considered.

  2. [Our concept of defecography. Methods and reproducibility of results].

    PubMed

    Sutorý, M; Brhelová, H; Michek, J; Kubacák, J; Vasícková, J; Stursa, V; Sehnalová, H

    1999-06-01

    Defecography is used in the Czech Republic only exceptionally. Since 1988 the authors have made 402 defecographic examinations. They present a detailed description of the experience assembled so far and their own modification of the examination. As contrast material they currently use Micropaque suspension thickened with wheat bran, administered by means of a modified press for dough preparation. The X-rays are taken on a modified ordinary stool made of soft timber. To screen uncovered places in the visual field they use individually placed copper plates 2 mm thick. For better evaluation of the X-rays the authors place an X-ray contrast net behind the patient during examination. Pictures are taken at rest, during contraction, during a modified Valsalva manoeuvre and during all stages of defecation. The authors mention the most interesting pathological findings they have encountered so far: internal prolapse, levator hernia, rectocele, sphincter defect, and various forms of prolapse and dyskinesis of the pelvic floor. In the authors' opinion the basic quantifiable parameters are the magnitudes of the anorectal angles. They used the assessment method described by Mahieu, as well as the mediorectal angle, which in their opinion reflects the patient's somatotype and levator function. More than the absolute values of the angles, they emphasize the difference between the two angles and its change during contraction and defecation. In their opinion, enlargement of the difference during contraction and its diminution to values close to zero during defecation is normal; converse values are evidence of dyssynergy of the pelvic floor. Independent assessments of the angles and of the magnitude of the lift of the pelvic floor by three observers were subjected to statistical analysis and provide evidence of complete reproducibility of the anorectal angle results according to the authors' definition. The results of assessment can be used to investigate relations with

  3. Accurate Molecular Dimensions from Stearic Acid Monolayers.

    ERIC Educational Resources Information Center

    Lane, Charles A.; And Others

    1984-01-01

    Discusses modifications to the fatty acid monolayer experiment to reduce the inaccurate molecular data students usually obtain. Copies of the experimental procedure used and a Pascal computer program to work up the data are available from the authors. (JN)

  4. A novel design method of anthropomorphic prosthetic hands for reproducing human hand grasping.

    PubMed

    Sun, Baiyang; Xiong, Caihua; Chen, Wenrui; Zhang, Qiaofei; Mao, Liu; Zhang, Qin

    2014-01-01

    Because the hand is so often used for grasping, designing prosthetic hands, particularly light and compact underactuated anthropomorphic transradial prostheses that reproduce the complex grasping of the human hand, is crucial for upper-limb amputees. Obviously, the fewer the actuators, the worse the anthropomorphic motion capability of the prosthetic hand. This paper aims to design a transmission mechanism in which a few motors actuate the fingers, which can serve relatively accurate grasping movements of a human hand and has the potential to be embedded in a palm together with the motors. We start by establishing an index for evaluating the anthropomorphic motion capability of a prosthetic hand. Based on optimization of this index, we determine the number of actuators in the fingers and the transmission relationship between the actuators and the metacarpophalangeal (MCP) joints. Then, a new design method to mechanically implement the transmission relationship, based on a novel decomposition of the transmission matrix, is proposed. Using this method, we obtained the final mechanical structure of a new prosthetic hand.

  5. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  6. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate

    PubMed Central

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-01-01

    Ordered arrays of Ag semishells (AgSS) for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing an Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. High SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the “V”-shaped or “U”-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications. PMID:27586562

  7. Ordered array of Ag semishells on different diameter monolayer polystyrene colloidal crystals: An ultrasensitive and reproducible SERS substrate

    NASA Astrophysics Data System (ADS)

    Yi, Zao; Niu, Gao; Luo, Jiangshan; Kang, Xiaoli; Yao, Weitang; Zhang, Weibin; Yi, Yougen; Yi, Yong; Ye, Xin; Duan, Tao; Tang, Yongjian

    2016-09-01

    Ordered arrays of Ag semishells (AgSS) for surface-enhanced Raman scattering (SERS) spectroscopy have been prepared by depositing an Ag film onto polystyrene colloidal particle (PSCP) monolayer template arrays. The SERS activity of the ordered AgSS arrays depends mainly on the PSCP diameter and the Ag film thickness. High SERS sensitivity and reproducibility are demonstrated by the detection of rhodamine 6G (R6G) and 4-aminothiophenol (4-ATP) molecules. The prominent SERS enhancements arise mainly from the “V”-shaped or “U”-shaped nanogaps on the AgSS, as investigated both experimentally and theoretically. The high SERS activity, stability and reproducibility make the ordered AgSS a promising choice for practical low-concentration SERS detection applications.

  8. Guidelines to improve animal study design and reproducibility for Alzheimer's disease and related dementias: For funders and researchers.

    PubMed

    Snyder, Heather M; Shineman, Diana W; Friedman, Lauren G; Hendrix, James A; Khachaturian, Ara; Le Guillou, Ian; Pickett, James; Refolo, Lorenzo; Sancho, Rosa M; Ridley, Simon H

    2016-11-01

    The reproducibility of laboratory experiments is fundamental to the scientific process. There have been increasing reports regarding challenges in reproducing and translating preclinical experiments in animal models. In Alzheimer's disease and related dementias, there have been similar reports and growing interest from funding organizations, researchers, and the broader scientific community to set parameters around experimental design, statistical power, and reporting requirements. A number of efforts in recent years have attempted to develop standard guidelines; however, these have not yet been widely implemented by researchers or by funding agencies. A workgroup of the International Alzheimer's disease Research Funder Consortium, a group of over 30 research funding agencies from around the world, worked to compile the best practices identified in these prior efforts for preclinical biomedical research. This article represents a consensus of this work group's review and includes recommendations for researchers and funding agencies on designing, performing, reviewing, and funding preclinical research studies.

  9. A stochastic model of kinetochore-microtubule attachment accurately describes fission yeast chromosome segregation.

    PubMed

    Gay, Guillaume; Courtheoux, Thibault; Reyes, Céline; Tournier, Sylvie; Gachet, Yannick

    2012-03-19

    In fission yeast, erroneous attachments of spindle microtubules to kinetochores are frequent in early mitosis. Most are corrected before anaphase onset by a mechanism involving the protein kinase Aurora B, which destabilizes kinetochore microtubules (ktMTs) in the absence of tension between sister chromatids. In this paper, we describe a minimal mathematical model of fission yeast chromosome segregation based on the stochastic attachment and detachment of ktMTs. The model accurately reproduces the timing of correct chromosome biorientation and segregation seen in fission yeast. Prevention of attachment defects requires both appropriate kinetochore orientation and an Aurora B-like activity. The model also reproduces abnormal chromosome segregation behavior (caused by, for example, inhibition of Aurora B). It predicts that, in metaphase, merotelic attachment is prevented by a kinetochore orientation effect and corrected by an Aurora B-like activity, whereas in anaphase, it is corrected through unbalanced forces applied to the kinetochore. These unbalanced forces are sufficient to prevent aneuploidy.
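
The stochastic attachment/detachment process at the heart of such models can be caricatured with a two-state Markov chain. The sketch below uses arbitrary rates and a hypothetical function name; it is not the published fission-yeast parameterization, which couples detachment to tension and Aurora B activity.

```python
import random

def simulate_attachment(k_att, k_det, steps, dt, seed=1):
    """Toy two-state Markov chain for one kinetochore-microtubule site:
    detached -> attached at rate k_att, attached -> detached at rate k_det,
    simulated with fixed time step dt. Returns the fraction of time attached,
    which should approach k_att / (k_att + k_det) at equilibrium."""
    random.seed(seed)
    attached = False
    time_attached = 0
    for _ in range(steps):
        if attached:
            if random.random() < k_det * dt:
                attached = False
        else:
            if random.random() < k_att * dt:
                attached = True
        time_attached += attached
    return time_attached / steps
```

With equal rates the occupancy hovers around one half; biasing k_det by tension is what error-correction mechanisms like Aurora B effectively do in the model.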

  10. Asymptotic expansion based equation of state for hard-disk fluids offering accurate virial coefficients.

    PubMed

    Tian, Jianxiang; Gui, Yuanxing; Mulero, A

    2010-01-01

    Despite the fact that more than 30 analytical expressions for the equation of state of hard-disk fluids have been proposed in the literature, none of them is capable of reproducing the currently accepted numeric or estimated values for the first eighteen virial coefficients. Using the asymptotic expansion method, extended to the first ten virial coefficients for hard-disk fluids, fifty-seven new expressions for the equation of state have been studied. Of these, a new equation of state is selected which accurately reproduces all of the first eighteen virial coefficients. Comparisons of the compressibility factor with computer simulations show that this new equation is as accurate as other similar expressions with the same number of parameters. Finally, the location of the poles of the 57 new equations shows that there are some particular configurations which could give both accurate virial coefficients and the correct closest packing fraction once virial coefficients beyond the tenth are numerically calculated.
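
The truncated virial series underlying such equations of state, Z = 1 + Σ_k B*_k η^(k-1) in the packing fraction η, can be sketched as follows. The coefficients passed in below are placeholders for illustration, not the accepted hard-disk values, which must be taken from the literature.

```python
def compressibility(virial, eta):
    """Truncated virial series for the compressibility factor:
    Z = 1 + sum over k >= 2 of B*_k * eta**(k-1),
    where `virial` lists the reduced coefficients B*_2, B*_3, ...
    (placeholder values here, not the accepted hard-disk coefficients)."""
    z = 1.0
    for k, b in enumerate(virial, start=2):
        z += b * eta ** (k - 1)
    return z
```

An analytical equation of state "reproduces" a coefficient when its own series expansion matches that term; the paper's comparison is exactly of this kind, carried to eighteen coefficients.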

  11. Research Elements: new article types by Elsevier to facilitate reproducibility in science

    NASA Astrophysics Data System (ADS)

    Zudilova-Seinstra, Elena; van Hensbergen, Kitty; Wacek, Bart

    2016-04-01

    When researchers start to make plans for new experiments, this is the beginning of a whole cycle of work: experimental design, tweaking of existing methods, developing protocols, writing code, collecting and processing experimental data, and so on. A large part of this very useful information rarely gets published, which makes experiments difficult to reproduce. The same holds for experimental data, which is not always provided in a reusable format and often lacks descriptive information. Furthermore, many types of data, such as replication data, negative datasets or data from "intermediate experiments", often don't get published because they have no place in a research journal. To address this concern, Elsevier launched a series of peer-reviewed journal titles grouped under the umbrella of Research Elements (https://www.elsevier.com/books-and-journals/research-elements) that allow researchers to publish their data, software, materials and methods and other elements of the research cycle in a brief article format. To facilitate reproducibility, Research Elements have thoroughly thought-out submission templates that include all necessary information and metadata, as well as peer-review criteria defined per article type. Research Elements can be applicable to multiple research areas; for example, a number of multidisciplinary journals (Data in Brief, SoftwareX, MethodsX) welcome submissions from a large number of subject areas. At other times, these elements are better served within a single field; therefore, a number of domain-specific journals (e.g. Genomics Data, Chemical Data Collections, Neurocomputing) also support the new article formats. Upon publication, all Research Elements are assigned persistent identifiers for direct citation and easy discoverability. Persistent identifiers are also used for interlinking Research Elements and relevant research papers published in traditional journals. Some Research Elements also allow post-publication article updates.

  12. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records

    PubMed Central

    Zozus, Meredith N.; Richesson, Rachel L.; Walden, Anita; Tenenbaum, Jessie D.; Hammond, W.E.

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs. PMID:27570682

  13. Research Reproducibility in Longitudinal Multi-Center Studies Using Data from Electronic Health Records.

    PubMed

    Zozus, Meredith N; Richesson, Rachel L; Walden, Anita; Tenenbaum, Jessie D; Hammond, W E

    2016-01-01

    A fundamental premise of scientific research is that it should be reproducible. However, the specific requirements for reproducibility of research using electronic health record (EHR) data have not been sufficiently articulated. There is no guidance for researchers about how to assess a given project and identify provisions for reproducibility. We analyze three different clinical research initiatives that use EHR data in order to define a set of requirements to reproduce the research using the original or other datasets. We identify specific project features that drive these requirements. The resulting framework will support the much-needed discussion of strategies to ensure the reproducibility of research that uses data from EHRs.

  14. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate and can be considered extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope-steepening technique, which has no effect in smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one-intermediate-state model is employed. A modification of this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical in smooth regions, and yield high resolution at discontinuities.
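
The abstract notes that the median function simplifies the coding of the monotonicity constraint. A hedged sketch of that idea: the identity minmod(a, b) = median(a, 0, b), applied to a generic MC-type limited slope. This is an illustration of the median/minmod connection, not Huynh's exact constraint.

```python
def median3(a, b, c):
    """Median of three numbers, written without sorting."""
    return max(min(a, b), min(max(a, b), c))

def minmod(a, b):
    """minmod(a, b): 0 if a and b have opposite signs, otherwise the one of
    smaller magnitude. Equivalent to the median of a, 0 and b."""
    return median3(a, 0.0, b)

def mc_slope(u_left, u_mid, u_right):
    """Monotonicity-limited slope for a piecewise-linear reconstruction:
    the central difference limited by twice the one-sided differences
    (classic MC-type limiter, shown as a generic illustration)."""
    central = 0.5 * (u_right - u_left)
    one_sided = minmod(2.0 * (u_mid - u_left), 2.0 * (u_right - u_mid))
    return minmod(central, one_sided)
```

The slope vanishes at local extrema (preserving monotonicity) and reduces to the central difference in smooth monotone data, where the constraint is indeed redundant.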

  15. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. At the beginning the thermometers are at ambient temperature; they are then immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter of 15 mm. One of them is a K-type industrial thermometer that is widely available commercially. The temperature indicated by this thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with a sheathed thermocouple located at its center. The fluid temperature was determined from measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results than measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model. Comparison of the results demonstrated that the new thermometer yields the fluid temperature much faster and with higher accuracy than the industrial thermometer. Accurate measurements of fast-changing fluid temperature are possible thanks to the low-inertia thermometer and the fast space marching method applied to solve the inverse heat conduction problem.
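
The first-order inertia correction mentioned above amounts to T_fluid ≈ T_indicated + τ·dT_indicated/dt, where τ is the thermometer time constant. A minimal sketch using central finite differences follows; this illustrates the first-order model only, not the paper's inverse space marching method, and the function name is hypothetical.

```python
def corrected_temperature(t, temp, tau):
    """First-order thermometer inertia correction:
    T_fluid(t) ~ T_indicated(t) + tau * dT_indicated/dt,
    with the derivative estimated by central differences
    (one-sided at the ends). tau is the time constant in the
    same units as t."""
    n = len(t)
    out = []
    for i in range(n):
        j0, j1 = max(0, i - 1), min(n - 1, i + 1)
        dT_dt = (temp[j1] - temp[j0]) / (t[j1] - t[j0])
        out.append(temp[i] + tau * dT_dt)
    return out
```

For a thermometer reading that ramps linearly, the correction simply shifts the reading forward by τ times the ramp rate; noisy data would need smoothing before differentiation, which is part of why higher-order and inverse methods are used in practice.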

  16. Numerical approach to reproduce instabilities of partial cavitation in a Venturi 8° geometry

    NASA Astrophysics Data System (ADS)

    Charriere, Boris; Goncalves, Eric

    2016-11-01

    Unsteady partial cavitation mainly takes the form of an attached cavity that exhibits periodic oscillations. Under certain conditions, the instabilities are characterized by the formation of vapour clouds, which are convected downstream of the cavity and collapse in a higher-pressure region. To gain a better understanding of the complex physics involved, many experimental and numerical studies have been carried out. These have identified two main mechanisms responsible for the break-off cycles. The development of a liquid re-entrant jet is the most common type of instability, but more recently the role of pressure waves created by the cloud collapses has been highlighted. This paper presents a one-fluid compressible Reynolds-Averaged Navier-Stokes (RANS) solver closed by two different equations of state (EOS) for the mixture. Based on experimental data, we investigate the ability of our simulations to reproduce the instabilities of a self-sustained oscillating cavitation pocket. Two cavitation models are first compared. The importance of considering a non-equilibrium state for the vapour phase is also exhibited. Finally, the role played by the added transport equation used to compute the void ratio is emphasised. In the case of partially cavitating flows with detached cavitation clouds, the reproduction of convective mechanisms is clearly improved.

  17. An electrostatic mechanism closely reproducing observed behavior in the bacterial flagellar motor.

    PubMed Central

    Walz, D; Caplan, S R

    2000-01-01

    A mechanism coupling the transmembrane flow of protons to the rotation of the bacterial flagellum is studied. The coupling is accomplished by means of an array of tilted rows of positive and negative charges around the circumference of the rotor, which interacts with a linear array of proton binding sites in channels. We present a rigorous treatment of the electrostatic interactions using minimal assumptions. Interactions with the transition states are included, as well as proton-proton interactions in and between channels. In assigning values to the parameters of the model, experimentally determined structural characteristics of the motor have been used. According to the model, switching and pausing occur as a consequence of modest conformational changes in the rotor. In contrast to similar approaches developed earlier, this model closely reproduces a large number of experimental findings from different laboratories, including the nonlinear behavior of the torque-frequency relation in Escherichia coli, the stoichiometry of the system in Streptococcus, and the pH-dependence of swimming speed in Bacillus subtilis. PMID:10653777

  18. Reproducing kernel potential energy surfaces in biomolecular simulations: Nitric oxide binding to myoglobin

    SciTech Connect

    Soloviov, Maksym; Meuwly, Markus

    2015-09-14

    Multidimensional potential energy surfaces based on reproducing kernel interpolation are employed to explore the energetics and dynamics of free and bound nitric oxide in myoglobin (Mb). Combining a force field description for the majority of degrees of freedom with the higher-accuracy representation for the NO ligand and the Fe out-of-plane motion allows for a simulation approach akin to a mixed quantum mechanics/molecular mechanics treatment. However, the kernel representation can be evaluated at conventional force-field speed. With the explicit inclusion of the Fe out-of-plane (Fe-oop) coordinate, the dynamics and structural equilibrium after photodissociation of the ligand are correctly described compared to experiment. Experimentally, the Fe-oop coordinate plays an important role for the ligand dynamics. This is also found here, where the isomerization dynamics between the Fe–ON and Fe–NO states is significantly affected by whether or not this coordinate is explicitly included. Although the Fe–ON conformation is metastable when considering only the bound ²A state, it may disappear once the ⁴A state is included. This explains the absence of the Fe–ON state in previous experimental investigations of MbNO.

  19. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers interesting insight into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature), written between 1349 and 1350.

  20. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually very simple, yet in practice complicated, task. Presently, distances to nearby galaxies are known only to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it because of its morphology, its non-uniform reddening, and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies and thereby help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, using ~100 RR Lyrae found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique to images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields. I finally present photometry for the Wolf-Rayet binary WR 20a.

  1. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring that all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests but opposed by the California Right to Life Education Fund, which believes it discredits abstinence-only material.

  2. Evaluation of mammographic density patterns: reproducibility and concordance among scales

    PubMed Central

    2010-01-01

    this percentage was lower for the quantitative scales (21.89% for BI-RADS and 21.86% for Boyd). Conclusions: Visual scales of mammographic density show high reproducibility when appropriate training is provided. Their ability to distinguish between high and low risk renders them useful for routine use by breast cancer screening programs. Quantitative scales are more specific than pattern-based scales in classifying populations in the high-risk group. PMID:20836850

  3. Assay Reproducibility in Clinical Studies of Plasma miRNA

    PubMed Central

    Rice, Jonathan; Roberts, Henry; Burton, James; Pan, Jianmin; States, Vanessa; Rai, Shesh N.; Galandiuk, Susan

    2015-01-01

    There are increasing reports of plasma miRNAs as biomarkers of human disease but few standards in methodologic reporting, leading to inconsistent data. We systematically reviewed plasma miRNA studies published between July 2013 and June 2014 to assess methodology. Six parameters were investigated: time to plasma extraction, method of RNA extraction, type of miRNA, quantification, cycle threshold (Ct) setting, and method of statistical analysis. We compared these data with a proposed standard methodologic technique. Beginning with initial screening for 380 miRNAs using microfluidic array technology and validation in an additional cohort of patients, we compared 11 miRNAs that exhibited differential expression between 16 patients with benign colorectal neoplasms (advanced adenomas) and 16 patients without any neoplasm (controls). Plasma was isolated immediately, or 12, 24, 48, or 72 h following phlebotomy. miRNA was extracted using two different techniques (Trizol LS with pre-amplification or modified miRNeasy). We performed Taqman-based RT-PCR assays for the 11 miRNAs with subsequent analyses using a variable Ct setting or a fixed Ct set at 0.01, 0.03, 0.05, or 0.5. Assays were performed in duplicate by two different operators. RNU6 was the internal reference. Systematic review yielded 74 manuscripts meeting inclusion criteria. One manuscript (1.4%) documented all 6 methodological parameters, while <5% of studies listed the Ct setting. In our proposed standard technique, plasma extraction ≤12 h after phlebotomy provided consistent ΔCt. miRNeasy extraction yielded higher miRNA concentrations and fewer non-expressed miRNAs compared to Trizol LS (1/704 miRNAs [0.14%] vs 109/704 miRNAs [15%] not expressed, respectively). A fixed Ct bar setting of 0.03 yielded the most reproducible data, provided that <10% of miRNAs were non-expressed. There was no significant intra-operator variability. There was significant inter-operator variation using Trizol LS extraction, while this was negligible using
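
    For context, the ΔCt values referred to above are differences against the internal reference (RNU6 here), and relative expression between groups is conventionally reported as 2^(-ΔΔCt), the Livak method. A minimal sketch with hypothetical Ct values (the numbers are illustrative, not from the study):

```python
def delta_ct(ct_target, ct_reference):
    # normalize the target miRNA Ct against the internal reference (e.g. RNU6)
    return ct_target - ct_reference

def fold_change(dct_case, dct_control):
    # Livak 2^(-ddCt) relative quantification
    return 2.0 ** -(dct_case - dct_control)

# hypothetical Ct values for one miRNA: adenoma patient vs control
fc = fold_change(delta_ct(28.0, 25.0), delta_ct(30.0, 25.0))  # -> 4.0
```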

  4. Methods for Computing Accurate Atomic Spin Moments for Collinear and Noncollinear Magnetism in Periodic and Nonperiodic Materials.

    PubMed

    Manz, Thomas A; Sholl, David S

    2011-12-13

    The partitioning of electron spin density among atoms in a material gives atomic spin moments (ASMs), which are important for understanding magnetic properties. We compare ASMs computed using different population analysis methods and introduce a method for computing density derived electrostatic and chemical (DDEC) ASMs. Bader and DDEC ASMs can be computed for periodic and nonperiodic materials with either collinear or noncollinear magnetism, while natural population analysis (NPA) ASMs can be computed for nonperiodic materials with collinear magnetism. Our results show Bader, DDEC, and (where applicable) NPA methods give similar ASMs, but different net atomic charges. Because they are optimized to reproduce both the magnetic field and the chemical states of atoms in a material, DDEC ASMs are especially suitable for constructing interaction potentials for atomistic simulations. We describe the computation of accurate ASMs for (a) a variety of systems using collinear and noncollinear spin DFT, (b) highly correlated materials (e.g., magnetite) using DFT+U, and (c) various spin states of ozone using coupled cluster expansions. The computed ASMs are in good agreement with available experimental results for a variety of periodic and nonperiodic materials. Examples considered include the antiferromagnetic metal organic framework Cu3(BTC)2, several ozone spin states, mono- and binuclear transition metal complexes, ferri- and ferro-magnetic solids (e.g., Fe3O4, Fe3Si), and simple molecular systems. We briefly discuss the theory of exchange-correlation functionals for studying noncollinear magnetism. A method for finding the ground state of systems with highly noncollinear magnetism is introduced. We use these methods to study the spin-orbit coupling potential energy surface of the single molecule magnet Fe4C40H52N4O12, which has highly noncollinear magnetism, and find that it contains unusual features that give a new interpretation to experimental data.

  5. NEAMS Experimental Support for Code Validation, INL FY2009

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatore; C. Rabiti

    2009-09-01

    The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Whereas the Verification part of the process does not rely on experiment, the Validation part, on the contrary, requires as many relevant and precise experimental data as possible to ensure that the models reproduce reality as closely as possible. Hence, this report presents a limited selection of experimental data that could be used to validate the codes devoted mainly to Fast Neutron Reactor calculations in the US. Emphasis has been put on existing data for thermal-hydraulics, fuel and reactor physics. The principles of a new “smart” experiment that could be used to improve our knowledge of neutron cross-sections are presented as well. In short, it consists of irradiating a few milligrams of actinides and analyzing the results with Accelerator Mass Spectrometry to infer the neutron cross-sections. Finally, the wealth of experimental data relevant to Fast Neutron Reactors in the US should not be taken for granted, and effort should be put into saving these 30- to 40-year-old data and ensuring they are validation-worthy, i.e., that the experimental conditions and uncertainties are well documented.
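
    The "smart" irradiation experiment sketched above amounts to a thin-target activation measurement: at low burnup the transmuted fraction is approximately the cross-section times the fluence, so an atom ratio measured by mass spectrometry yields the cross-section. A back-of-the-envelope sketch with hypothetical numbers (not from the report):

```python
def infer_cross_section_barns(atom_ratio, fluence_per_cm2):
    # thin-target, low-burnup approximation: N_product/N_target ~ sigma * fluence
    sigma_cm2 = atom_ratio / fluence_per_cm2
    return sigma_cm2 / 1.0e-24  # 1 barn = 1e-24 cm^2

# hypothetical: a measured product/target atom ratio of 1e-4 after a
# fluence of 1e21 neutrons/cm^2 implies sigma of about 0.1 barn
sigma = infer_cross_section_barns(1.0e-4, 1.0e21)
```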

  6. Character Reading via Stylus Reproducing Normal Handwriting Motion.

    PubMed

    Hasegawa, Keisuke; Sakurai, Tatsuma; Makino, Yasutoshi; Shinoda, Hiroyuki

    2016-01-13

    In this paper, we report a method of intuitively transmitting symbolic information to untrained users via only their hands, without using any visual or auditory cues. In this simple concept, three-dimensional letter trajectories are presented to the user's hand via a stylus which is mechanically manipulated. In experiments, participants were able to read 14 mm-high lower-case letters displayed at a rate of one letter per second with an accuracy rate of 71.9% in their first trials, which improved to 91.3% after a five-minute training period. These results showed small individual differences among participants (standard deviation of 12.7% in the first trials and 6.7% after training). We also found that this accuracy was still retained to a high level (85.1%, with SD of 8.2%) even when the letters were reduced to a height of 7 mm. Thus, we revealed that sighted adults potentially possess the ability to read small letters accurately at normal writing speed using their hands.

  7. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor in a reference taxonomy of all those sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
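
    The assignment rule described above, mapping a read to the taxonomy node with the best combined precision and recall rather than to the lowest common ancestor, can be illustrated on a toy taxonomy. Node names and the F-measure combination below are illustrative; the paper's contribution is achieving this assignment in linear time via a suffix array, which this brute-force sketch does not attempt.

```python
from itertools import chain

# toy reference taxonomy: internal node -> children; leaves are sequences
TAXONOMY = {"root": ["cladeA", "cladeB"],
            "cladeA": ["s1", "s2", "s3"],
            "cladeB": ["s4", "s5"]}

def leaves(node):
    kids = TAXONOMY.get(node)
    return {node} if kids is None else set(chain.from_iterable(leaves(k) for k in kids))

def assign(matching):
    # precision = matches inside the clade / clade size,
    # recall    = matches inside the clade / all matches;
    # map the read to the node maximizing their harmonic mean (F-measure)
    nodes = set(TAXONOMY) | {l for kids in TAXONOMY.values() for l in kids}
    best, best_f = None, -1.0
    for node in sorted(nodes):
        clade = leaves(node)
        hit = len(clade & matching)
        if hit == 0:
            continue
        p, r = hit / len(clade), hit / len(matching)
        f = 2 * p * r / (p + r)
        if f > best_f:
            best, best_f = node, f
    return best
```

    For a read matching s1 and s2, the best node is cladeA (precision 2/3, recall 1), even though one non-matching sequence sits inside that clade.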

  8. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  9. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims to identify subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate and robust to lighting changes and image degradation.

  10. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

    We investigate the use of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then enforce sparsity by using MAP with a sparsity-inducing prior. By coupling these two methods, we can produce sparse and accurate high resolution images that are conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where the size of the problems is quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. With the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobes of the FFT. With our approach, however, clear edges, boundaries, and textures of the vehicles are obtained.

  11. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012); doi:10.1021/ct300544e] to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  12. Magnetohydrodynamic generator experimental studies

    NASA Technical Reports Server (NTRS)

    Pierson, E. S.

    1972-01-01

    The results for an experimental study of a one wavelength MHD induction generator operating on a liquid flow are presented. First the design philosophy and the experimental generator design are summarized, including a description of the flow loop and instrumentation. Next a Fourier series method of treating the fact that the magnetic flux density produced by the stator is not a pure traveling sinusoid is described and some results summarized. This approach appears to be of interest after revisions are made, but the initial results are not accurate. Finally, some of the experimental data is summarized for various methods of excitation.

  13. Accurate spectral modeling for infrared radiation

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gupta, S. K.

    1977-01-01

    Direct line-by-line integration and quasi-random band model techniques are employed to calculate the spectral transmittance and total band absorptance of 4.7 micron CO, 4.3 micron CO2, 15 micron CO2, and 5.35 micron NO bands. Results are obtained for different pressures, temperatures, and path lengths. These are compared with available theoretical and experimental investigations. For each gas, extensive tabulations of results are presented for comparative purposes. In almost all cases, line-by-line results are found to be in excellent agreement with the experimental values. The range of validity of other models and correlations are discussed.
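
    Line-by-line integration evaluates the Beer-Lambert law on a fine spectral grid, summing the contribution of every line: tau(nu) = exp(-u * sum_i S_i * phi_i(nu - nu_i)). A minimal sketch with Lorentz line shapes; the two-line band and its parameters are made up for illustration, not taken from the CO or CO2 data above.

```python
import numpy as np

def lorentz(nu, nu0, gamma):
    # pressure-broadened Lorentz profile, normalized to unit area
    return (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def transmittance(nu, lines, u):
    # Beer-Lambert: tau(nu) = exp(-u * sum_i S_i * phi_i(nu)); u = absorber amount
    k = sum(S * lorentz(nu, nu0, g) for nu0, S, g in lines)
    return np.exp(-u * k)

# illustrative two-line band (line centers in cm^-1, arbitrary strengths/widths)
nu = np.linspace(2100.0, 2200.0, 2001)
lines = [(2140.0, np.pi, 1.0), (2160.0, 0.5 * np.pi, 1.0)]
tau = transmittance(nu, lines, 1.0)
```

    The total band absorptance quoted in such studies is then the spectral integral of 1 - tau over the band.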

  14. An accurate equation of state for fluids and solids.

    PubMed

    Parsafar, G A; Spohr, H V; Patey, G N

    2009-09-03

    A simple functional form for a general equation of state based on an effective near-neighbor pair interaction of an extended Lennard-Jones (12,6,3) type is given and tested against experimental data for a wide variety of fluids and solids. Computer simulation results for ionic liquids are used for further evaluation. For fluids, there appears to be no upper density limitation on the equation of state. The lower density limit for isotherms near the critical temperature is the critical density. The equation of state gives a good description of all types of fluids, nonpolar (including long-chain hydrocarbons), polar, hydrogen-bonded, and metallic, at temperatures ranging from the triple point to the highest temperature for which there is experimental data. For solids, the equation of state is very accurate for all types considered, including covalent, molecular, metallic, and ionic systems. The experimental pvT data available for solids does not reveal any pressure or temperature limitations. An analysis of the importance and possible underlying physical significance of the terms in the equation of state is given.
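
    The effective near-neighbor interaction underlying this equation of state is a Lennard-Jones form extended by an r^-3 term. A minimal sketch of the (12,6,3) pair energy; the parameter values are illustrative defaults, not fitted constants from the paper.

```python
def elj_12_6_3(r, eps=1.0, sigma=1.0, c3=0.1):
    # extended Lennard-Jones (12,6,3) effective pair energy:
    # the usual 12-6 terms plus an attractive r^-3 tail of strength c3
    sr = sigma / r
    return 4.0 * eps * (sr ** 12 - sr ** 6) - c3 * sr ** 3
```

    At r = sigma the 12-6 part vanishes and only the r^-3 tail remains, which is the term that lets the single functional form reach from fluids to strongly cohesive solids.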

  15. An accurate potential energy curve for helium based on ab initio calculations

    NASA Astrophysics Data System (ADS)

    Janzen, A. R.; Aziz, R. A.

    1997-07-01

    Korona, Williams, Bukowski, Jeziorski, and Szalewicz [J. Chem. Phys. 106, 1 (1997)] constructed a completely ab initio potential for He2 by fitting their calculations using infinite order symmetry adapted perturbation theory at intermediate range, existing Green's function Monte Carlo calculations at short range and accurate dispersion coefficients at long range to a modified Tang-Toennies potential form. The potential with retardation added to the dipole-dipole dispersion is found to predict accurately a large set of microscopic and macroscopic experimental data. The potential with a significantly larger well depth than other recent potentials is judged to be the most accurate characterization of the helium interaction yet proposed.

  16. Accurate bulk density determination of irregularly shaped translucent and opaque aerogels

    NASA Astrophysics Data System (ADS)

    Petkov, M. P.; Jones, S. M.

    2016-05-01

    We present a volumetric method for accurate determination of bulk density of aerogels, calculated from extrapolated weight of the dry pure solid and volume estimates based on the Archimedes' principle of volume displacement, using packed 100 μm-sized monodispersed glass spheres as a "quasi-fluid" media. Hard particle packing theory is invoked to demonstrate the reproducibility of the apparent density of the quasi-fluid. Accuracy rivaling that of the refractive index method is demonstrated for both translucent and opaque aerogels with different absorptive properties, as well as for aerogels with regular and irregular shapes.
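
    The displacement arithmetic behind the method is straightforward: in a container of fixed volume, the sample's volume equals the mass of quasi-fluid it excludes divided by the reproducible packed density of the spheres. A sketch with hypothetical masses (not measurements from the paper):

```python
def aerogel_bulk_density(m_sample, m_spheres_alone, m_spheres_with_sample, rho_packed):
    # volume displaced by the sample in a fixed container of packed microspheres
    v_sample = (m_spheres_alone - m_spheres_with_sample) / rho_packed
    return m_sample / v_sample

# hypothetical run: packed-sphere apparent density 1.5 g/cm^3; the container
# holds 150.0 g of spheres alone but only 135.0 g with the 1.0 g sample inside
rho = aerogel_bulk_density(1.0, 150.0, 135.0, 1.5)  # -> 0.1 g/cm^3
```

    The accuracy of the result rests on the reproducibility of the packed apparent density, which is what the hard-particle packing argument in the paper establishes.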

  17. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    NASA Astrophysics Data System (ADS)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of ability of a state of the art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with

  18. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  19. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field, as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically relevant cases will be presented both in terms of absorbed dose and biological dose calculations, describing the various available features.

  20. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high-pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high-pressure gas to purge the sight tube of airborne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.
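    The patent describes a broadband multicolor radiation sensor; one standard multicolor technique is two-color (ratio) pyrometry under the Wien approximation, in which the ratio of intensities at two wavelengths determines temperature independently of emissivity (if emissivity is equal at both wavelengths). The sketch below illustrates that general principle only, not the patented apparatus; the helper names and chosen wavelengths are assumptions.

    ```python
    import math

    C2 = 1.4388e-2  # second radiation constant, m*K

    def wien_ratio(t_kelvin, lam1_m, lam2_m):
        """Forward model: ratio of Wien-approximation intensities at two wavelengths."""
        return (lam2_m / lam1_m) ** 5 * math.exp(-(C2 / t_kelvin) * (1.0 / lam1_m - 1.0 / lam2_m))

    def two_color_temperature(ratio, lam1_m, lam2_m):
        """Invert the Wien intensity ratio for temperature (kelvin)."""
        return -C2 * (1.0 / lam1_m - 1.0 / lam2_m) / (math.log(ratio) - 5.0 * math.log(lam2_m / lam1_m))
    ```

    For a furnace near 2000 K, measuring the intensity ratio at, say, 0.65 µm and 0.9 µm and inverting the relation recovers the temperature.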

  1. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high-pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high-pressure gas to purge the sight tube of airborne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  2. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
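    The grouping-and-merging loop described above can be sketched as follows: pair up segments that are nearly collinear and spatially close, merge each mergeable pair into the segment spanning their two most distant endpoints, and repeat until nothing changes. This is a minimal illustration of the idea, not the authors' LSM implementation; the tolerance values and helper names are hypothetical.

    ```python
    import math

    def _angle(seg):
        (x1, y1), (x2, y2) = seg
        return math.atan2(y2 - y1, x2 - x1) % math.pi  # undirected orientation

    def merge_pair(a, b, angle_tol=0.1, gap_tol=5.0):
        """Merge two segments if nearly collinear and spatially close; else None."""
        da = abs(_angle(a) - _angle(b))
        da = min(da, math.pi - da)  # handle orientation wrap-around
        if da > angle_tol:
            return None
        gap = min(math.dist(p, q) for p in a for q in b)  # closest endpoint pair
        if gap > gap_tol:
            return None
        pts = list(a) + list(b)
        # merged segment spans the two most distant endpoints
        return max(((p, q) for p in pts for q in pts), key=lambda s: math.dist(*s))

    def merge_segments(segs, **tol):
        """Repeatedly merge mergeable pairs until no more merges are possible."""
        segs = list(segs)
        changed = True
        while changed:
            changed = False
            for i in range(len(segs)):
                for j in range(i + 1, len(segs)):
                    m = merge_pair(segs[i], segs[j], **tol)
                    if m is not None:
                        segs[j] = m
                        del segs[i]
                        changed = True
                        break
                if changed:
                    break
        return segs
    ```

    Two broken collinear pieces of one edge are joined into a single segment, while a perpendicular segment nearby is left untouched.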

  3. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.
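    The coordinate-processing step, converting an encoder reading plus a fixed probe-arm length into cylindrical coordinates of the probe tip, might look like the following simplified sketch. It assumes the arm extends radially from the rotation axis; the patent's actual geometry, calibration, and mark-counting electronics are not reproduced, and all names are hypothetical.

    ```python
    import math

    def probe_tip_cylindrical(encoder_counts, counts_per_rev, arm_length, z_offset=0.0):
        """Convert a revolute-encoder reading and a fixed probe-arm length into
        cylindrical coordinates (r, theta, z) of the probe tip."""
        theta = 2.0 * math.pi * (encoder_counts % counts_per_rev) / counts_per_rev
        return (arm_length, theta, z_offset)

    def cylindrical_to_cartesian(r, theta, z):
        """Standard cylindrical-to-Cartesian conversion."""
        return (r * math.cos(theta), r * math.sin(theta), z)
    ```

    A quarter-revolution reading places a 250 mm arm's tip on the +y axis of the reference frame.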

  4. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  5. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  6. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.
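    As a rough sanity check of the geometry quoted above, the fringe spacing lambda/B for a 200 m baseline at 2.29 GHz can be computed directly. This back-of-the-envelope helper is not part of the paper's astrometric analysis, which achieves far better positional precision than one fringe.

    ```python
    import math

    def fringe_spacing_arcsec(freq_hz, baseline_m):
        """Interferometer fringe spacing (wavelength / baseline) in arcseconds,
        a rough figure of merit for positional precision."""
        c = 299_792_458.0  # speed of light, m/s
        return math.degrees((c / freq_hz) / baseline_m) * 3600.0
    ```

    For the Tidbinbilla parameters this gives a fringe spacing on the order of two arcminutes, consistent with a baseline of roughly 1500 wavelengths.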

  7. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  8. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) in a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error of less than 2.5 meters, which outperforms the other two DR-based methods.
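    A landmark graph of the kind described, with nodes for doors, staircases, and turns and directed edges carrying heading information, can be sketched as below. The class and its heading-matching query are illustrative assumptions, not the authors' data structure or matching algorithm.

    ```python
    class LandmarkGraph:
        """Directed graph of indoor landmarks (nodes) joined by accessible
        paths carrying a heading and a length (edges)."""

        def __init__(self):
            self.edges = {}  # node -> list of (neighbor, heading_deg, length_m)

        def add_path(self, a, b, heading_deg, length_m):
            self.edges.setdefault(a, []).append((b, heading_deg, length_m))

        def candidates(self, node, observed_heading, tol_deg=30.0):
            """Landmarks reachable from `node` whose edge heading matches the
            pedestrian's observed heading within a tolerance."""
            return [b for b, h, _ in self.edges.get(node, [])
                    if abs((h - observed_heading + 180.0) % 360.0 - 180.0) <= tol_deg]
    ```

    When a landmark is detected, the graph constrains which landmark can plausibly come next given the walking direction, which is what makes the approach robust to occasional missed detections.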

  9. Accurate oscillator strengths for interstellar ultraviolet lines of Cl I

    NASA Technical Reports Server (NTRS)

    Schectman, R. M.; Federman, S. R.; Beideck, D. J.; Ellis, D. J.

    1993-01-01

    Analyses of the abundance of interstellar chlorine rely on accurate oscillator strengths for ultraviolet transitions. Beam-foil spectroscopy was used to obtain f-values for the astrophysically important lines of Cl I at 1088, 1097, and 1347 A. In addition, the line at 1363 A was studied. Our f-values for the 1088 and 1097 A lines represent the first laboratory measurements for these lines; the values are f(1088) = 0.081 +/- 0.007 (1 sigma) and f(1097) = 0.0088 +/- 0.0013 (1 sigma). These results resolve the issue regarding the relative strengths of the 1088 and 1097 A lines in favor of those suggested by astronomical measurements. For the other lines, our results of f(1347) = 0.153 +/- 0.011 (1 sigma) and f(1363) = 0.055 +/- 0.004 (1 sigma) are the most precisely measured values available. The f-values are somewhat greater than previous experimental and theoretical determinations.

  10. An Inexpensive and Accurate Tensiometer Using an Electronic Balance

    NASA Astrophysics Data System (ADS)

    Dolz, Manuel; Delegido, Jesús; Hernández, María-Jesús; Pellicer, Julio

    2001-09-01

    A method for measuring surface tension of liquid-air interfaces that consists of a modification of the du Noüy tensiometer is proposed. An electronic balance is used to determine the detachment force with high resolution and the relative displacement ring/plate-liquid surface is carried out by the descent of the liquid-free surface. The procedure familiarizes undergraduate students in applied science and technology with the experimental study of surface tension by means of a simple and accurate method that offers the advantages of sophisticated devices at considerably less cost. The operational aspects that must be taken into account are analyzed: the measuring system and determination of its effective length, measurement of the detachment force, and the relative system-liquid interface displacement rate. To check the accuracy of the proposed tensiometer, measurements of the surface tension of different known liquids have been performed, and good agreement with results reported in the literature was obtained.
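    The measurement chain above converts a balance reading to a detachment force and then to a surface tension. For the classical du Noüy ring the first-order relation is gamma = F / (4*pi*R); the sketch below uses that relation with an optional Harkins-Jordan-style correction factor left at 1.0, and is a simplified illustration rather than the modified ring/plate apparatus the article describes.

    ```python
    import math

    def detach_force_from_balance(mass_reading_kg, g=9.81):
        """The electronic balance registers the detachment pull as a mass;
        convert the reading to newtons."""
        return mass_reading_kg * g

    def surface_tension_ring(detach_force_n, ring_radius_m, correction=1.0):
        """First-order du Nouy relation gamma = F / (4 * pi * R), optionally
        scaled by an empirical correction factor (1.0 ignores it)."""
        return correction * detach_force_n / (4.0 * math.pi * ring_radius_m)
    ```

    A detachment force of about 8.6 mN on a 9.55 mm ring corresponds to roughly 0.072 N/m, the surface tension of clean water at room temperature.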

  11. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
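    The two-step coarse-then-fine idea can be illustrated generically: score every candidate with the cheap model, then re-evaluate only a shortlist with the expensive model. This sketch omits the paper's branch-and-bound machinery and nonlinear circuit models; all names and parameters are hypothetical.

    ```python
    def hierarchical_search(candidates, cheap_score, exact_score, keep=3):
        """Two-step search: rank every candidate with a low-cost surrogate
        model, then re-evaluate only the top `keep` with the expensive model."""
        shortlist = sorted(candidates, key=cheap_score, reverse=True)[:keep]
        return max(shortlist, key=exact_score)
    ```

    The speed-up comes from calling the expensive model only `keep` times instead of once per candidate; the trade-off is that the surrogate must be faithful enough to keep the true optimum inside the shortlist.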

  12. Accurate colon residue detection algorithm with partial volume segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Liang, Zhengrong; Zhang, PengPeng; Kutcher, Gerald J.

    2004-05-01

    Colon cancer is the second leading cause of cancer-related death in the United States. Earlier detection and removal of polyps can dramatically reduce the chance of developing a malignant tumor. Due to some limitations of optical colonoscopy used in the clinic, many researchers have developed virtual colonoscopy as an alternative technique, in which accurate colon segmentation is crucial. However, the partial volume effect and the existence of residue make it very challenging. The electronic colon cleaning technique proposed by Chen et al. is a very attractive method, which is also a kind of hard segmentation method. As mentioned in their paper, some artifacts were produced, which might affect accurate colon reconstruction. In our paper, instead of labeling each voxel with a unique label or tissue type, the percentage of different tissues within each voxel, which we call a mixture, was considered in establishing a maximum a posteriori probability (MAP) image-segmentation framework. A Markov random field (MRF) model was developed to reflect the spatial information for the tissue mixtures. The spatial information based on hard segmentation was used to determine which tissue types are in a specific voxel. Parameters of each tissue class were estimated by the expectation-maximization (EM) algorithm during the MAP tissue-mixture segmentation. Real CT experimental results demonstrated that the partial volume effects between four tissue types have been precisely detected. Meanwhile, the residue has been electronically removed, and a very smooth and clean interface along the colon wall has been obtained.
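    As a toy analogue of the EM parameter-estimation step, the following fits a two-class 1-D Gaussian mixture to intensity samples. The paper's method additionally models per-voxel tissue mixtures with an MRF spatial prior, which is not reproduced here; initialization and names are assumptions.

    ```python
    import math

    def em_two_gaussians(data, iters=50):
        """1-D expectation-maximization for a two-class Gaussian mixture,
        a toy analogue of estimating tissue-class parameters."""
        s = sorted(data)
        mu = [s[len(s) // 4], s[3 * len(s) // 4]]   # crude quartile initialization
        sigma = [1.0, 1.0]
        weight = [0.5, 0.5]
        for _ in range(iters):
            # E-step: responsibility of each class for each sample
            resp = []
            for x in data:
                p = [weight[k] / (sigma[k] * math.sqrt(2.0 * math.pi))
                     * math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) for k in range(2)]
                t = p[0] + p[1]
                resp.append([p[0] / t, p[1] / t])
            # M-step: re-estimate weights, means, and variances
            for k in range(2):
                nk = sum(r[k] for r in resp)
                weight[k] = nk / len(data)
                mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
                var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
                sigma[k] = math.sqrt(max(var, 1e-9))
        return mu, sigma, weight
    ```

    The responsibilities computed in the E-step play the role of the per-voxel mixture percentages: a sample halfway between the two class means gets split fractionally between them rather than hard-labeled.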

  13. Strategy for accurate liver intervention by an optical tracking system

    PubMed Central

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Guan, Peifeng; Xiao, Weihu; Wu, Xiaoming

    2015-01-01

    Image-guided navigation for radiofrequency ablation of liver tumors requires the accurate guidance of needle insertion into a tumor target. The main challenge of image-guided navigation for radiofrequency ablation of liver tumors is the occurrence of liver deformations caused by respiratory motion. This study reports a strategy of real-time automatic registration to track custom fiducial markers glued onto the surface of a patient’s abdomen to find the respiratory phase in which the static preoperative CT is performed. Custom fiducial markers are designed. The real-time automatic registration method consists of the automatic localization of the custom fiducial markers in the patient and image spaces. The fiducial registration error is calculated in real time and indicates whether the current respiratory phase corresponds to the phase of the static preoperative CT. To demonstrate the feasibility of the proposed strategy, a liver simulator is constructed and two volunteers are involved in the preliminary experiments. An ex-vivo porcine liver model is employed to further verify the strategy for liver intervention. Experimental results demonstrate that the real-time automatic registration method is rapid, accurate, and feasible for capturing the respiratory phase from which the static preoperative CT anatomical model is generated by tracking the movement of the skin-adhered custom fiducial markers. PMID:26417501
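    A fiducial registration error (FRE) is an RMS residual between corresponding marker positions after alignment. The sketch below uses translation-only (centroid) alignment for brevity; a full rigid registration, as used in the study, also solves for rotation, so this is illustrative only and the function name is hypothetical.

    ```python
    import math

    def fre_after_translation(fixed, moving):
        """RMS fiducial registration error after translation-only (centroid)
        alignment of `moving` onto `fixed`; both are lists of (x, y, z) tuples."""
        n = len(fixed)
        cf = [sum(p[k] for p in fixed) / n for k in range(3)]
        cm = [sum(p[k] for p in moving) / n for k in range(3)]
        sq = 0.0
        for f, m in zip(fixed, moving):
            # residual after shifting the moving marker by the centroid offset
            sq += sum((f[k] - (m[k] - cm[k] + cf[k])) ** 2 for k in range(3))
        return math.sqrt(sq / n)
    ```

    A pure shift of all markers gives an FRE of zero (the respiratory phases match); displacing any single marker, as breathing deformation would, raises the FRE above zero.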

  14. Accurate measurement of streamwise vortices using dual-plane PIV

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Breuer, Kenneth S.

    2012-11-01

    Low Reynolds number aerodynamic experiments with flapping animals (such as bats and small birds) are of particular interest due to their application to micro air vehicles, which operate in a similar parameter space. Previous PIV wake measurements described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions from the measurements. The highly three-dimensional and unsteady nature of the flows associated with flapping flight is a major challenge for accurate measurements. The challenge of animal flight measurements is finding small flow features in a large field of view at high speed with limited laser energy and camera resolution. Cross-stream measurement is further complicated by the predominately out-of-plane flow that requires thick laser sheets and short inter-frame times, which increase noise and measurement uncertainty. Choosing appropriate experimental parameters requires compromise between the spatial and temporal resolution and the dynamic range of the measurement. To explore these challenges, we present a case study on the wake of a fixed wing. The fixed model simplifies the experiment and allows direct measurements of the aerodynamic forces via load cell. We present a detailed analysis of the wake measurements, discuss the criteria for making accurate measurements, and present a solution for making quantitative aerodynamic load measurements behind free-flyers.

  15. Accurate three-dimensional documentation of distinct sites

    NASA Astrophysics Data System (ADS)

    Singh, Mahesh K.; Dutta, Ashish; Subramanian, Venkatesh K.

    2017-01-01

    One of the most critical aspects of documenting distinct sites is acquiring detailed and accurate range information. Several three-dimensional (3-D) acquisition techniques are available, but each has its own limitations. This paper presents a range data fusion method with the aim of enhancing the descriptive contents of the entire 3-D reconstructed model. A kernel function is introduced for supervised classification of the range data using a kernelized support vector machine. The classification method is based on the local saliency features of the acquired range data. The range data acquired from heterogeneous range sensors are transformed into a defined common reference frame. Based on the segmentation criterion, the fusion of range data is performed by integrating finer regions of range data acquired from a laser range scanner with the coarser regions of the Kinect's range data. After fusion, the Delaunay triangulation algorithm is applied to generate a highly accurate, realistic 3-D model of the scene. Finally, experimental results show the robustness of the proposed approach.

  16. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  17. SU-E-J-236: Audiovisual Biofeedback Improves Breath-Hold Lung Tumor Position Reproducibility Measured with 4D MRI

    SciTech Connect

    Lee, D; Pollock, S; Keall, P; Greer, P; Lapuz, C; Ludbrook, J; Kim, T

    2015-06-15

    Purpose: Audiovisual biofeedback breath-hold (AVBH) was employed to reproduce tumor position on inhale and exhale breath-holds for 4D tumor information. We hypothesize that lung tumor position will be more consistent using AVBH compared with conventional breath-hold (CBH). Methods: Lung tumor positions were determined for seven lung cancer patients (age: 25–74) during two separate 3T MRI sessions. A breath-hold training session was performed prior to the MRI sessions to allow patients to become comfortable with AVBH and their exhale and inhale target positions. CBH and AVBH 4D image datasets were obtained in the first MRI session (pre-treatment) and the second MRI session (mid-treatment) within six weeks of the first session. Audio instruction (MRI: Siemens Skyra) in CBH and verbal instruction (radiographer) in AVBH were used. A radiation oncologist contoured the lung tumor using Eclipse (Varian Medical Systems); tumor position was quantified as the centroid of the contoured tumor after rigid registration based on vertebral anatomy across the two MRI sessions. CBH and AVBH were compared in terms of reproducibility, assessed via (1) the difference between the two exhale positions and between the two inhale positions across the two sessions and (2) the difference in amplitude (exhale to inhale) between the two sessions. Results: Compared to CBH, AVBH improved the reproducibility of the two exhale (or inhale) lung tumor positions relative to each other by 33%, from 6.4±5.3 mm to 4.3±3.0 mm (p=0.005). Compared to CBH, AVBH improved the reproducibility of exhale-to-inhale amplitude by 66%, from 5.6±5.9 mm to 1.9±1.4 mm (p=0.005). Conclusions: This study demonstrated that audiovisual biofeedback can be utilized for improving the reproducibility of breath-hold lung tumor position. These results are advantageous towards achieving more accurate emerging radiation treatment planning methods, in addition to imaging and treatment modalities utilizing breath-hold.

  18. Communication: Improved ab initio molecular dynamics by minimally biasing with experimental data

    NASA Astrophysics Data System (ADS)

    White, Andrew D.; Knight, Chris; Hocky, Glen M.; Voth, Gregory A.

    2017-01-01

    Accounting for electrons and nuclei simultaneously is a powerful capability of ab initio molecular dynamics (AIMD). However, AIMD is often unable to accurately reproduce properties of systems such as water due to inaccuracies in the underlying electronic density functionals. This shortcoming is often addressed by added empirical corrections and/or increasing the simulation temperature. We present here a maximum-entropy approach to directly incorporate limited experimental data via a minimal bias. Biased AIMD simulations of water and an excess proton in water are shown to give significantly improved properties both for observables which were biased to match experimental data and for unbiased observables. This approach also yields new physical insight into inaccuracies in the underlying density functional theory as utilized in the unbiased AIMD.
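    The maximum-entropy idea, the smallest bias that makes a simulated average match an experimental one, reduces for a single observable to finding one Lagrange multiplier. The toy reweighting below illustrates that on a fixed set of samples; the paper biases the AIMD potential on the fly, so this is only a sketch of the underlying principle, and the function name and learning rate are assumptions.

    ```python
    import math

    def maxent_lambda(samples, target_mean, iters=200, lr=0.5):
        """Find the Lagrange multiplier lam such that reweighting fixed samples
        of an observable O by exp(-lam * O) reproduces a target mean of O."""
        lam = 0.0
        for _ in range(iters):
            w = [math.exp(-lam * o) for o in samples]
            mean = sum(wi * o for wi, o in zip(w, samples)) / sum(w)
            # increasing lam lowers the weighted mean, so step lam up
            # while the current mean is still above the target
            lam += lr * (mean - target_mean)
        return lam
    ```

    Because the reweighted distribution is the maximum-entropy distribution consistent with the constraint, no information beyond the single experimental average is injected, which is what "minimal bias" means here.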

  19. The influence of the cage environment on rodent physiology and behavior: Implications for reproducibility of pre-clinical rodent research.

    PubMed

    Toth, Linda A

    2015-08-01

    The reproducibility of pre-clinical research is an important concern that is now being voiced by constituencies that include the National Institutes of Health, the pharmaceutical industry, Congress, the public and the scientific community. An important facet of performing and publishing well-controlled reproducible pre-clinical research is to stabilize and more completely define the environment of the animal subjects. Scientists who use rodents in research generally recognize the importance of maintaining a stable animal environment. However, despite a theoretical and general awareness of these issues, many may lack a true appreciation of how significantly even seemingly minor variations in the environment can affect research outcomes. The purpose of this article is to help investigators gain a more comprehensive and substantiated understanding of the potentially significant impact of even seemingly minor environmental changes on the animals and the data. An important caveat to this article is that the examples presented were selected from a very large literature, admittedly in order to illustrate certain points. The goal of this article is not to provide an overview of the entire literature on how the environment affects rodents but rather to make preclinical scientists more aware of how these factors can potentially influence the experimental data and contribute to poor reproducibility of research.

  20. International prevalidation studies of the EpiDerm 3D human reconstructed skin micronucleus (RSMN) assay: transferability and reproducibility.

    PubMed

    Aardema, Marilyn J; Barnett, Brenda C; Khambatta, Zubin; Reisinger, Kerstin; Ouedraogo-Arras, Gladys; Faquet, Brigitte; Ginestet, Anne-Claire; Mun, Greg C; Dahl, Erica L; Hewitt, Nicola J; Corvi, Raffaella; Curren, Rodger D

    2010-08-30

    Recently, a novel in vitro reconstructed skin micronucleus (RSMN) assay incorporating the EpiDerm 3D human skin model (Curren et al., Mutat. Res. 607 (2006) 192-204; Mun et al., Mutat. Res. 673 (2009) 92-99) has been shown to produce comparable data when utilized in three different laboratories in the United States (Hu et al., Mutat. Res. 673 (2009) 100-108). As part of a project sponsored by the European cosmetics companies trade association (COLIPA), with a contribution from the European Center for the Validation of Alternative Methods (ECVAM), international prevalidation studies of the RSMN assay have been initiated. The assay was transferred and optimized in two laboratories in Europe, where dose-dependent, reproducibly positive results for mitomycin C and vinblastine sulfate were obtained. Further intra- and inter-laboratory reproducibility of the RSMN assay was established by testing three coded chemicals, N-ethyl-N-nitrosourea, cyclohexanone, and mitomycin C. All chemicals were correctly identified by all laboratories as either positive or negative. These results support the international inter-laboratory and inter-experimental reproducibility of the assay and reinforce the conclusion that the RSMN assay in the EpiDerm 3D human skin model is a valuable in vitro method for assessment of genotoxicity of dermally applied chemicals.

  1. Raising the Bar for Reproducible Science at the U.S. Environmental Protection Agency Office of Research and Development

    PubMed Central

    George, Barbara Jane; Sobus, Jon R.; Phelps, Lara P.; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N.

    2015-01-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency’s research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. PMID:25795653

  2. Raising the bar for reproducible science at the U.S. Environmental Protection Agency Office of Research and Development.

    PubMed

    George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N

    2015-05-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science.

  3. How accurately can 21cm tomography constrain cosmology?

    NASA Astrophysics Data System (ADS)

    Mao, Yi; Tegmark, Max; McQuinn, Matthew; Zaldarriaga, Matias; Zahn, Oliver

    2008-07-01

    There is growing interest in using 3-dimensional neutral hydrogen mapping with the redshifted 21 cm line as a cosmological probe. However, its utility depends on many assumptions. To aid experimental planning and design, we quantify how the precision with which cosmological parameters can be measured depends on a broad range of assumptions, focusing on the 21 cm signal from 6 < z < 20. We cover the sensitivity to experimental specifications like array layout and detector noise, to uncertainties in the reionization history, and to the level of contamination from astrophysical foregrounds. We derive simple analytic estimates for how various assumptions affect an experiment’s sensitivity, and we find that the modeling of reionization is the most important, followed by the array layout. We present an accurate yet robust method for measuring cosmological parameters that exploits the fact that the ionization power spectra are rather smooth functions that can be accurately fit by 7 phenomenological parameters. We find that for future experiments, marginalizing over these nuisance parameters may provide constraints almost as tight on the cosmology as if 21 cm tomography measured the matter power spectrum directly. A future square kilometer array optimized for 21 cm tomography could improve the sensitivity to spatial curvature and neutrino masses by up to 2 orders of magnitude, to ΔΩk≈0.0002 and Δmν≈0.007 eV, and give a 4σ detection of the spectral-index running predicted by the simplest inflation models.

  4. Simple Mathematical Models Do Not Accurately Predict Early SIV Dynamics

    PubMed Central

    Noecker, Cecilia; Schaefer, Krista; Zaccheo, Kelly; Yang, Yiding; Day, Judy; Ganusov, Vitaly V.

    2015-01-01

    Upon infection of a new host, human immunodeficiency virus (HIV) replicates in the mucosal tissues and is generally undetectable in circulation for 1–2 weeks post-infection. Several interventions against HIV including vaccines and antiretroviral prophylaxis target virus replication at this earliest stage of infection. Mathematical models have been used to understand how HIV spreads from mucosal tissues systemically and what impact vaccination and/or antiretroviral prophylaxis has on viral eradication. Because predictions of such models have been rarely compared to experimental data, it remains unclear which processes included in these models are critical for predicting early HIV dynamics. Here we modified the “standard” mathematical model of HIV infection to include two populations of infected cells: cells that are actively producing the virus and cells that are transitioning into virus production mode. We evaluated the effects of several poorly known parameters on infection outcomes in this model and compared model predictions to experimental data on infection of non-human primates with variable doses of simian immunodeficiency virus (SIV). First, we found that the mode of virus production by infected cells (budding vs. bursting) has a minimal impact on the early virus dynamics for a wide range of model parameters, as long as the parameters are constrained to provide the observed rate of SIV load increase in the blood of infected animals. Interestingly and in contrast with previous results, we found that the bursting mode of virus production generally results in a higher probability of viral extinction than the budding mode of virus production. Second, this mathematical model was not able to accurately describe the change in experimentally determined probability of host infection with increasing viral doses. Third and finally, the model was also unable to accurately explain the decline in the time to virus detection with increasing viral dose. These results

  5. Reproduction of Hip Offset and Leg Length in Navigated Total Hip Arthroplasty: How Accurate Are We?

    PubMed

    Ellapparadja, Pregash; Mahajan, Vivek; Deakin, Angela H; Deep, Kamal

    2015-06-01

    This study assesses how accurately we can restore hip offset and leg length in navigated total hip arthroplasty (THA). 152 consecutive patients with navigated THA formed the study group; the contra-lateral hip served as the control for measuring hip offset and leg length. All radiological measurements were made using Orthoview digital software. In the normal hip offset group, the mean was 75.73 (SD = 8.61); in the reconstructed hip offset group, the mean was 75.35 (SD = 7.48). 95.39% of patients had a hip offset within 6 mm of the opposite side, while 96.04% had leg length restored to within 6 mm of the contra-lateral side. An equivalence test revealed that the two groups of hip offsets were essentially the same. We conclude that computer navigation can accurately reproduce hip offset and leg length.
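    The equivalence test mentioned above can be illustrated with the common confidence-interval shortcut for two one-sided tests (TOST): the groups are equivalent if the 90% CI for the mean difference lies entirely within the clinical margin. The numbers, the ±6 mm margin applied to the mean difference, and the large-sample normal approximation below are illustrative assumptions, not the paper's actual analysis.

```python
import math

def equivalent(mean_diff, sd_diff, n, margin):
    """Two one-sided tests (TOST) via the 90% CI shortcut:
    equivalence at alpha = 0.05 holds if the 90% CI for the
    mean difference lies entirely within +/- margin.
    Uses the normal quantile as a large-sample approximation."""
    z90 = 1.645  # two-sided 90% CI -> one-sided 5% in each tail
    half_width = z90 * sd_diff / math.sqrt(n)
    lo, hi = mean_diff - half_width, mean_diff + half_width
    return -margin < lo and hi < margin

# Hypothetical offset differences (mm): native minus reconstructed
print(equivalent(mean_diff=0.38, sd_diff=4.0, n=152, margin=6.0))  # True
```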

  6. Development of an XYZ Digital Camera with Embedded Color Calibration System for Accurate Color Acquisition

    NASA Astrophysics Data System (ADS)

    Kretkowski, Maciej; Jablonski, Ryszard; Shimodaira, Yoshifumi

    Acquisition of accurate colors is important in the modern era of widespread exchange of electronic multimedia. The variety of device-dependent color spaces causes trouble with accurate color reproduction. In this paper we present the outline of an accomplished digital camera system whose device-independent output is formed from tristimulus XYZ values. The outstanding accuracy and fidelity of the acquired color are achieved in our system by employing an embedded color calibration system based on an emissive device generating reference calibration colors with user-defined spectral distribution and chromaticity coordinates. The system was tested by calibrating the camera using 24 reference colors spectrally reproduced from the 24 color patches of the Macbeth Chart. The average color difference (CIEDE2000) was found to be ΔE = 0.83, an outstanding result compared to commercially available digital cameras.
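    For illustration, averaging color differences over calibration patches can be sketched as follows. The CIE76 metric used here is a simpler stand-in for the CIEDE2000 formula the paper reports, and the patch values are invented.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.
    (CIEDE2000, reported in the paper, is a refinement of this
    metric; CIE76 is shown because it is compact.)"""
    return math.dist(lab1, lab2)

# Hypothetical measured vs. reference patches from a 24-patch chart
measured  = [(52.1, 10.3, -6.9), (61.8, -32.2, 0.5)]
reference = [(52.0, 10.0, -7.0), (62.0, -32.0, 0.0)]
avg = sum(delta_e_76(m, r) for m, r in zip(measured, reference)) / len(measured)
print(round(avg, 2))
```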

  7. An accurate equation of state for the exponential-6 fluid applied to dense supercritical nitrogen

    NASA Astrophysics Data System (ADS)

    Fried, Laurence E.; Howard, W. Michael

    1998-11-01

    The exponential-6 potential model is widely used in fluid equation of state studies. We have developed an accurate and efficient complete equation of state for the exponential-6 fluid based on HMSA integral equation theory and Monte Carlo calculations. Our equation of state has an average fractional error of 0.2% in pV/NkBT and 0.3% in the excess energy Uex/NkBT. This is a substantial improvement in accuracy over perturbation methods, which are typically used in treatments of dense fluid equations of state. We have applied our equation of state to the problem of dense supercritical N2. We find that we are able to accurately reproduce a wide range of material properties with our model, over a range 0.01⩽P⩽100 GPa and 298⩽T⩽15 000 K.
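    The exponential-6 (Buckingham) pair potential referred to above has a standard closed form; the sketch below evaluates it with illustrative parameter values (the paper's fitted equation of state itself is not reproduced here).

```python
import math

def exp6(r, eps, r_m, alpha):
    """Exponential-6 pair potential (Buckingham exp-6 form):
    U(r) = eps/(alpha-6) * (6*exp(alpha*(1-r/r_m)) - alpha*(r_m/r)**6)
    eps   : well depth
    r_m   : position of the potential minimum
    alpha : stiffness of the repulsive wall."""
    return eps / (alpha - 6.0) * (
        6.0 * math.exp(alpha * (1.0 - r / r_m)) - alpha * (r_m / r) ** 6
    )

# At the minimum r = r_m the potential equals -eps (the well depth):
print(exp6(1.0, eps=1.0, r_m=1.0, alpha=13.0))
```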

  8. Accurate Nanoscale Crystallography in Real-Space Using Scanning Transmission Electron Microscopy.

    PubMed

    Dycus, J Houston; Harris, Joshua S; Sang, Xiahan; Fancher, Chris M; Findlay, Scott D; Oni, Adedapo A; Chan, Tsung-Ta E; Koch, Carl C; Jones, Jacob L; Allen, Leslie J; Irving, Douglas L; LeBeau, James M

    2015-08-01

    Here, we report reproducible and accurate measurement of crystallographic parameters using scanning transmission electron microscopy. This is made possible by removing drift and residual scan distortion. We demonstrate real-space lattice parameter measurements with <0.1% error for complex-layered chalcogenides Bi2Te3, Bi2Se3, and a Bi2Te2.7Se0.3 nanostructured alloy. Pairing the technique with atomic resolution spectroscopy, we connect local structure with chemistry and bonding. Combining these results with density functional theory, we show that the incorporation of Se into Bi2Te3 causes charge redistribution that anomalously increases the van der Waals gap between building blocks of the layered structure. The results show that atomic resolution imaging with electrons can accurately and robustly quantify crystallography at the nanoscale.

  9. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially when it is delayed: travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, reducing capacity, increasing oscillations, and driving the system away from equilibrium. To avoid this negative effect, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality is helpful to improve efficiency in terms of capacity, oscillation, and the gap from system equilibrium.
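    The boundedly rational choice rule described above can be sketched directly; the function below is an illustrative reading of the BR threshold, not the authors' simulation code.

```python
import random

def choose_route(travel_time_a, travel_time_b, br):
    """Boundedly rational route choice: if the reported difference
    between the two routes is below the threshold br, the traveler
    is indifferent and picks either route with equal probability;
    otherwise the route with the smaller travel time is chosen."""
    if abs(travel_time_a - travel_time_b) < br:
        return random.choice(["A", "B"])
    return "A" if travel_time_a < travel_time_b else "B"

random.seed(1)
print(choose_route(10.0, 14.0, br=2.0))  # clear difference -> "A"
print(choose_route(10.0, 11.0, br=2.0))  # within threshold -> random pick
```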

  10. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The Von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burger's equation. For comparison, results were also obtained for Burger's equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.
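    The verification step mentioned above, checking a scheme's formal order against numerically observed truncation error, can be illustrated with a simple grid-refinement test. The example uses a second-order central difference rather than the paper's partial-implicitization scheme.

```python
import math

def central_diff(f, x, h):
    """Second-order central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Observed order of accuracy from two grid spacings h and h/2:
# p = log2(error(h) / error(h/2)), which should approach the
# formal order of the scheme (here 2 for the central difference).
x = 0.7
exact = math.cos(x)
e1 = abs(central_diff(math.sin, x, 1e-2) - exact)
e2 = abs(central_diff(math.sin, x, 5e-3) - exact)
p = math.log2(e1 / e2)
print(round(p, 2))  # close to 2.0
```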

  11. Poor interobserver reproducibility in the diagnosis of high-grade endometrial carcinoma.

    PubMed

    Gilks, C Blake; Oliva, Esther; Soslow, Robert A

    2013-06-01

    disagreement were serous versus clear cell (7 cases) and serous versus grade 3 endometrioid (6 cases). Immunostaining results using the 5-marker immunopanel were then used to adjudicate in the 6 cases in which there was disagreement between reviewers with respect to serous versus endometrioid carcinoma, and these supported a diagnosis of serous carcinoma in 4 of 6 cases and endometrioid carcinoma in 2 of 6 cases. Pairwise comparison between the reviewers for the 20 cases classified as showing major disagreement was as follows: reviewer 1 and reviewer 2 agreed in 5/20 cases, reviewer 1 and reviewer 3 agreed in 7/20 cases, and reviewer 2 and reviewer 3 agreed in 8/20 cases, indicating that disagreements were not because of a single reviewer holding outlier opinions. Diagnostic consensus among 3 reviewers about the exclusive or major subtype of high-grade endometrial carcinoma was reached in only 35/56 (62.5%) cases, and in 4 of these cases there was disagreement about the minor component present. This poor reproducibility did not reflect systematic bias on the part of any 1 reviewer. There is a need for molecular tools to aid in the accurate and reproducible diagnosis of high-grade endometrial carcinoma subtype.
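    The pairwise agreement figures quoted above are simple fractions of concordant cases; the computation can be sketched as follows, with invented diagnoses (S = serous, E = endometrioid, C = clear cell).

```python
from itertools import combinations

def pairwise_agreement(ratings):
    """Fraction of cases on which each pair of reviewers agrees.
    ratings: dict mapping reviewer name -> list of diagnoses
    (one entry per case, in the same case order)."""
    out = {}
    for (r1, d1), (r2, d2) in combinations(ratings.items(), 2):
        agree = sum(a == b for a, b in zip(d1, d2))
        out[(r1, r2)] = agree / len(d1)
    return out

# Hypothetical subtype calls for five cases
ratings = {
    "reviewer1": ["S", "E", "S", "C", "E"],
    "reviewer2": ["S", "S", "S", "C", "E"],
    "reviewer3": ["E", "S", "S", "C", "C"],
}
print(pairwise_agreement(ratings))
```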

  12. [Kansei biosensors to reproduce gustatory and olfactory senses].

    PubMed

    Toko, Kiyoshi

    2014-01-01

    Everyone talks about taste using a different taste scale. Because taste perception differs from person to person, inconsistencies arise when we speak about the taste of food. The present study aims at the development of an electronic tongue (taste sensor) and an electronic nose (odor sensor). The taste sensor has two important properties. First, each sensor electrode (a lipid/polymer membrane) is specific to one taste quality. Second, the sensor can measure aftertaste, such as richness, the aftertaste of umami. The bitterness electrode (BT0), for example, responds well to bitter substances such as quinine, cetirizine, hydroxyzine and bromhexine, while showing no response to other taste qualities. A taste sensor is now sold by Intelligent Sensor Technology, Inc., and is utilized in pharmaceutical and food companies. An electronic nose to detect lingering scent is composed of a surface plasmon resonance (SPR) sensor, a high-sensitivity sensing device, combined with antigen-antibody interaction. A self-assembled monolayer was constructed on the reception surface of the SPR device. Experimental results on benzaldehyde, a typical peach flavor, show a sensor sensitivity of 4 ppb, superior to the human sensitivity of about 350 ppb. Our developed taste sensor and electronic nose play the roles of the gustatory and olfactory senses, respectively.

  13. A New Cecal Slurry Preparation Protocol with Improved Long-Term Reproducibility for Animal Models of Sepsis

    PubMed Central

    Starr, Marlene E.; Steele, Allison M.; Saito, Mizuki; Hacker, Bill J.; Evers, B. Mark; Saito, Hiroshi

    2014-01-01

    Sepsis, a life-threatening systemic inflammatory response syndrome induced by infection, is widely studied using laboratory animal models. While cecal-ligation and puncture (CLP) is considered the gold standard model for sepsis research, it may not be preferable for experiments comparing animals of different size or under different dietary regimens. By comparing cecum size, shape, and cecal content characteristics in mice under different experimental conditions (aging, diabetes, pancreatitis), we show that cecum variability could be problematic for some CLP experiments. The cecal slurry (CS) injection model, in which the cecal contents of a laboratory animal are injected intraperitoneally to other animals, is an alternative method for inducing polymicrobial sepsis; however, the CS must be freshly prepared under conventional protocols, which is a major disadvantage with respect to reproducibility and convenience. The objective of this study was to develop an improved CS preparation protocol that allows for long-term storage of CS with reproducible results. Using our new CS preparation protocol we found that bacterial viability is maintained for at least 6 months when the CS is prepared in 15% glycerol-PBS and stored at -80°C. To test sepsis-inducing efficacy of stored CS stocks, various amounts of CS were injected to young (4–6 months old), middle-aged (12–14 months old), and aged (24–26 months old) male C57BL/6 mice. Dose- and age-dependent mortality was observed with high reproducibility. Circulating bacteria levels strongly correlated with mortality suggesting an infection-mediated death. Further, injection with heat-inactivated CS resulted in acute hypothermia without mortality, indicating that CS-mediated death is not due to endotoxic shock. This new CS preparation protocol results in CS stocks which are durable for freezing preservation without loss of bacterial viability, allowing experiments to be performed more conveniently and with higher

  14. Enrichment of the finite element method with reproducing kernel particle method

    SciTech Connect

    Chen, Y.; Liu, W.K.; Uras, R.A.

    1995-07-01

    Based on the reproducing kernel particle method, an enrichment procedure is introduced to enhance the effectiveness of the finite element method. The basic concepts of the reproducing kernel particle method are briefly reviewed. By adopting the well-known completeness requirements, a generalized form of the reproducing kernel particle method is developed. Through a combination of these two methods, their unique advantages can be utilized. An alternative approach, the multiple field method, is also introduced.

  15. Experimental falsification of Leggett's nonlocal variable model.

    PubMed

    Branciard, Cyril; Ling, Alexander; Gisin, Nicolas; Kurtsiefer, Christian; Lamas-Linares, Antia; Scarani, Valerio

    2007-11-23

    Bell's theorem guarantees that no model based on local variables can reproduce quantum correlations. Also, some models based on nonlocal variables, if subject to apparently "reasonable" constraints, may fail to reproduce quantum physics. In this Letter, we introduce a family of inequalities, which use a finite number of measurement settings, and which therefore allow testing Leggett's nonlocal model versus quantum physics. Our experimental data falsify Leggett's model and are in agreement with quantum predictions.

  16. Repeatability and reproducibility of intracellular molar concentration assessed by synchrotron-based x-ray fluorescence microscopy

    SciTech Connect

    Merolle, L.; Gianoncelli, A.; Malucelli, E.; Cappadone, C.; Farruggia, G.; Sargenti, A.; Procopio, A.; Fratini, M.; Notargiacomo, A.; Lombardo, M.; Lagomarsino, S.; Iotti, S.

    2016-01-28

    Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life, or of trace elements whose absence causes abnormal biological function or development. However, biological systems contain ensembles of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has gained increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements remain among the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained through three distinct accesses to synchrotron radiation light.

  17. Repeatability and reproducibility of intracellular molar concentration assessed by synchrotron-based x-ray fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Merolle, L.; Malucelli, E.; Fratini, M.; Gianoncelli, A.; Notargiacomo, A.; Cappadone, C.; Farruggia, G.; Sargenti, A.; Procopio, A.; Lombardo, M.; Lagomarsino, S.; Iotti, S.

    2016-01-01

    Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life, or of trace elements whose absence causes abnormal biological function or development. However, biological systems contain ensembles of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has gained increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements remain among the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained through three distinct accesses to synchrotron radiation light.

  18. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method.

    PubMed

    Prest, E I; Hammes, F; Kötzsch, S; van Loosdrecht, M C M; Vrouwenvelder, J S

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (or so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result), and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types, unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool for application in the drinking water field, e.g. for rapid screening of the microbial water quality and stability during water treatment and distribution in networks and premise plumbing.
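    The HNA percentage used in the fingerprint above is simply the fraction of cells whose fluorescence lies above a fixed gate; a minimal sketch, with an arbitrary illustrative threshold and invented per-cell intensities (not the paper's instrument settings).

```python
def hna_percentage(fluorescence, threshold):
    """Percentage of cells whose fluorescence intensity lies above
    a fixed gate, i.e. the high nucleic acid (HNA) fraction.
    The gate position is instrument-specific; 'threshold' here is
    an arbitrary illustrative value."""
    hna = sum(1 for f in fluorescence if f > threshold)
    return 100.0 * hna / len(fluorescence)

# Hypothetical per-cell intensities (arbitrary units)
cells = [180, 420, 95, 510, 230, 760, 60, 340]
print(hna_percentage(cells, threshold=300))  # 50.0
```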

  19. Rainfall variability and extremes over southern Africa: assessment of a climate model to reproduce daily extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state of the art climate model to simulate climate at daily timescales is carried out using satellite derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be applied to its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa.
Secondly, the ability of the model to reproduce daily rainfall extremes will

  20. SU-E-T-450: How Important Is a Reproducible Breath Hold for DIBH Breast Radiotherapy?

    SciTech Connect

    Liu, H; Wentworth, S; Sintay, B; Wiant, D

    2015-06-15

    Purpose: Deep inspiration breath hold (DIBH) for left-sided breast cancer has been shown to reduce heart dose. Surface imaging helps to ensure accurate breast positioning, but does not guarantee a reproducible breath hold (BH) at DIBH treatments. We examine the effects of variable BH positions for DIBH treatments. Methods: Twenty-five patients with free breathing (FB) and DIBH scans were reviewed. Four plans were created for each patient: 1) FB, 2) DIBH, 3) FB-DIBH – the DIBH plans were copied to the FB images and recalculated (image registration was based on breast tissue), and 4) P-DIBH – a partial BH with the heart shifted midway between the FB and DIBH positions. The FB-DIBH plans give “worst case” scenarios for surface imaging DIBH, where the breast is aligned by surface imaging but the patient is not holding their breath. Student's t-tests were used to compare dose metrics. Results: The DIBH plans gave lower heart dose and comparable breast coverage versus FB in all cases. The FB-DIBH plans showed no significant difference versus FB plans for breast coverage, mean heart dose, or maximum heart dose (p >= 0.10). The mean heart dose differed between FB-DIBH and FB by < 2 Gy for all cases, the maximum heart dose differed by < 2 Gy for 21 cases. The P-DIBH plans showed significantly lower mean heart dose than FB (p = 0.01). The mean heart doses for the P-DIBH plans were < FB for 22 cases, the maximum dose < FB for 18 cases. Conclusions: A DIBH plan delivered to a FB patient set-up with surface imaging will yield similar dosimetry to a plan created and delivered FB. A DIBH plan delivered with even a partial BH can give reduced heart dose compared to FB techniques when the breast tissue is well aligned.

  1. A System to Simulate and Reproduce Audio-Visual Environments for Spatial Hearing Research

    PubMed Central

    Seeber, Bernhard U.; Kerber, Stefan; Hafter, Ervin R.

    2009-01-01

    The article reports the experience gained from two implementations of the “Simulated Open-Field Environment” (SOFE), a setup that allows sounds to be played at calibrated levels over a wide frequency range from multiple loudspeakers in an anechoic chamber. Playing sounds from loudspeakers in the free field has the advantage that each participant listens with their own ears, and individual characteristics of the ears are captured in the sound they hear. This makes an easy and accurate comparison between various listeners with and without hearing devices possible. The SOFE uses custom calibration software to assure individual equalization of each loudspeaker. Room simulation software creates the spatio-temporal reflection pattern of sound sources in rooms, which is played via the SOFE loudspeakers. The sound playback system is complemented by a video projection facility which can be used to collect or give feedback or to study auditory-visual interaction. The article discusses acoustical and technical requirements for accurate sound playback against the specific needs in hearing research. An introduction is given to software concepts that allow easy, high-level control of the setup and thus fast experimental development, turning the SOFE into a “Swiss army knife” tool for auditory, spatial hearing and audio-visual research. PMID:19909802

  2. "High-precision, reconstructed 3D model" of skull scanned by conebeam CT: Reproducibility verified using CAD/CAM data.

    PubMed

    Katsumura, Seiko; Sato, Keita; Ikawa, Tomoko; Yamamura, Keiko; Ando, Eriko; Shigeta, Yuko; Ogawa, Takumi

    2016-01-01

    Computed tomography (CT) scanning has recently been introduced into forensic medicine and dentistry. However, the presence of metal restorations in the dentition can adversely affect the quality of three-dimensional reconstruction from CT scans. In this study, we aimed to evaluate the reproducibility of a "high-precision, reconstructed 3D model" obtained from a conebeam CT scan of dentition, a method that might be particularly helpful in forensic medicine. We took conebeam CT and helical CT images of three dry skulls marked with 47 measuring points; reconstructed three-dimensional images; and measured the distances between the points in the 3D images with a computer-aided design/computer-aided manufacturing (CAD/CAM) marker. We found that in comparison with the helical CT, conebeam CT is capable of reproducing measurements closer to those obtained from the actual samples. In conclusion, our study indicated that the image-reproduction from a conebeam CT scan was more accurate than that from a helical CT scan. Furthermore, the "high-precision reconstructed 3D model" facilitates reliable visualization of full-sized oral and maxillofacial regions in both helical and conebeam CT scans.

  3. Reproducibility study for free-breathing measurements of pyruvate metabolism using hyperpolarized (13) C in the heart.

    PubMed

    Lau, Angus Z; Chen, Albert P; Barry, Jennifer; Graham, John J; Dominguez-Viqueira, William; Ghugre, Nilesh R; Wright, Graham A; Cunningham, Charles H

    2013-04-01

    Spatially resolved images of hyperpolarized (13) C substrates and their downstream products provide insight into real-time metabolic processes occurring in vivo. Recently, hyperpolarized (13) C pyruvate has been used to characterize in vivo cardiac metabolism in the rat and pig, but accurate and reproducible measurements remain challenging due to the limited period available for imaging as well as physiological motion. In this article, time-resolved cardiac- and respiratory-gated images of [1-(13) C] pyruvate, [1-(13) C] lactate, and (13) C bicarbonate in the heart are acquired without the need for a breathhold. The robustness of these free-breathing measurements is demonstrated using the time-resolved data to produce a normalized metric of pyruvate dehydrogenase and lactate dehydrogenase activity in the heart. The values obtained are reproducible in a controlled metabolic state. In a 60-min ischemia/reperfusion model, significant differences in hyperpolarized bicarbonate and lactate, normalized using the left ventricular pyruvate signal, were detected between scans performed at baseline and 45 min after reperfusion. The sequence is anticipated to improve quantitative measurements of cardiac metabolism, leading to feasible validation studies using fewer subjects, and potentially improved diagnosis, serial monitoring, and treatment of cardiac disease in patients.

  4. Evaluation of the reproducibility of two techniques used to determine and record centric relation in Angle's Class I patients.

    PubMed

    Paixão, Fernanda; Silva, Wilkens Aurélio Buarque e; Silva, Frederico Andrade e; Ramos, Guilherme da Gama; Cruz, Mônica Vieira de Jesus

    2007-08-01

    Centric relation is a mandibular position that establishes a balanced relationship among the temporomandibular joints, the masticatory muscles and the occlusion. This position allows the dentist to plan and execute oral rehabilitation while respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 female, 6 male) with no tooth loss, with occlusal contacts consistent with Angle's Class I and without signs or symptoms of temporomandibular disorders were selected. Each volunteer underwent both Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing (the latter with the aid of an intraoral apparatus) five times, at 1-week intervals and always at the same time of day. The mean standard error of each technique was calculated (Bilateral Manipulation, 0.94; Gothic Arch Tracing, 0.27). The Shapiro-Wilk test was applied, and the results allowed use of Student's t-test (5% significance level). The techniques showed different degrees of variability: Gysi's Gothic Arch Tracing reproduced the centric relation records more accurately than the Bilateral Manipulation.
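
    The variability comparison reported above rests on the standard error of each technique's repeated records. A minimal sketch of that computation, using hypothetical repeated records rather than the study's data:

```python
# Illustrative sketch (not the study's data): standard error of repeated
# centric-relation records per technique, as computed over five sessions.
import statistics

def standard_error(values):
    """Standard error of the mean for one volunteer's repeated records."""
    return statistics.stdev(values) / len(values) ** 0.5

# Hypothetical repeated positional records (mm) for one volunteer.
bilateral_manipulation = [1.2, 2.9, 0.4, 2.1, 1.6]   # more scattered
gothic_arch_tracing    = [1.4, 1.6, 1.5, 1.3, 1.5]   # tightly clustered

se_bm  = standard_error(bilateral_manipulation)
se_gat = standard_error(gothic_arch_tracing)
```

    A smaller standard error for the gothic arch tracing values corresponds to the study's conclusion that the technique reproduces the record more consistently.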

  5. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented that addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask worn over the mouth, used to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Variations of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure were performed, with and without the pneumotach in place, and differences noted. The authors acknowledge support of NIH Grant 2R01DC005642-10A1.
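
    The flow-estimation principle described above (flow inferred from the pressure drop across a known linear resistance) can be sketched as follows; the resistance value and pressure samples are assumptions chosen for illustration, not values from the study:

```python
# Minimal sketch of the pneumotach principle: glottal volume flow is estimated
# from the pressure drop across a known linear resistance, Q = dP / R.
# The resistance and pressure samples below are hypothetical.

def flow_from_pressure(delta_p, resistance):
    """Estimate volume flow (L/s) from pressure drops (cmH2O) across a
    linear resistance (cmH2O per L/s)."""
    return [dp / resistance for dp in delta_p]

R_MASK = 0.5  # cmH2O/(L/s), hypothetical mask resistance
delta_p = [0.0, 0.05, 0.15, 0.10, 0.02]  # sampled pressure differences, cmH2O
flow = flow_from_pressure(delta_p, R_MASK)
```

    The study's concern is precisely that adding R_MASK to the airway changes the system being measured, so the recovered waveform may not match the unmasked voice.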

  6. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(NsNv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using the VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs at comparable accuracy, especially for cases where many incident fields are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometric configuration of the array, the beam direction, and the light wavelength.
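
    The cost argument above, O(NsNv) for the SIE route versus O(Nv) for the VIE route, can be illustrated with a back-of-envelope operation count; the unknown counts below are invented purely for scale:

```python
# Back-of-envelope illustration of the scaling claim: absorbed power computed
# from surface equivalent currents pairs every surface unknown with every
# volumetric observation point, while the VIE route touches each volumetric
# unknown once. Unknown counts are hypothetical.

Ns = 20_000    # surface unknowns (hypothetical)
Nv = 200_000   # volumetric unknowns (hypothetical)

ops_sie = Ns * Nv   # O(Ns * Nv) operations
ops_vie = Nv        # O(Nv) operations

speedup = ops_sie / ops_vie   # factor of Ns for this accounting
```

    Under this simple accounting the asymptotic advantage of the VIE route grows directly with the number of surface unknowns.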

  7. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives predictions with errors below 0.0012 K for light sources with CCTs ranging from 500 K to 10^6 K.
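
    The Newton iteration at the core of the method above can be sketched generically. The paper's real objective is the distance from the source's chromaticity to the Planckian locus, built from CIE 1 nm tristimulus sums; here a simple stand-in objective with a known minimum at 6500 K is used instead, so this is a sketch of the iteration, not of the paper's objective function:

```python
# Newton's method for 1-D minimisation, as used for CCT estimation:
# iterate t <- t - f'(t)/f''(t) until the step is small.
import math

def newton_minimize(f1, f2, t0, tol=1e-6, max_iter=50):
    """Minimise a smooth 1-D function given its first (f1) and second (f2)
    derivatives, starting from t0."""
    t = t0
    for _ in range(max_iter):
        step = f1(t) / f2(t)
        t -= step
        if abs(step) < tol:
            break
    return t

# Stand-in objective g(T) = (ln T - ln 6500)^2, minimised at T = 6500 K.
def g1(T):  # g'(T)
    return 2 * (math.log(T) - math.log(6500)) / T

def g2(T):  # g''(T)
    return (2 - 2 * (math.log(T) - math.log(6500))) / T ** 2

cct = newton_minimize(g1, g2, t0=5000.0)  # converges to ~6500 K
```

    In the paper's setting the starting value t0 would come from Robertson's method, and g would be replaced by the exact chromaticity-distance objective and its derivatives.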

  8. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance are described of an accurate and reliable prototype earth sensor head (ARPESH). The ARPESH employs a detection-logic "locator" concept and horizon-sensor mechanization that should lead to high-accuracy horizon sensing minimally degraded by spatial or temporal variations in sensing attitude from a satellite orbiting the earth at altitudes around 500 km. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; then the performance of the sensor is reported under laboratory conditions, with the sensor installed in a simulator that permits it to scan a blackbody source against a background representing the earth-space interface for various equivalent plant temperatures.

  9. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways, each designed to retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. The EFP method can further reduce the computational effort while retaining accuracy by treating far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including the benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.

  10. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate determination of the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085
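
    The final inference step described above, from a measured Doppler width (thermal velocity dispersion) to Boltzmann's constant, can be sketched by inverting the standard Doppler-width relation; the line, temperature, and width below are illustrative round numbers, not the paper's measured values:

```python
# Sketch of the Doppler-broadening route to kB: the 1/e half-width of a
# Doppler-broadened line satisfies delta_nu_D = nu0 * sqrt(2 kB T / (m c^2)),
# so a measured width plus the temperature and atomic mass fixes kB.
# All numbers are illustrative, not the paper's data.
import math

C = 299_792_458.0                 # speed of light, m/s
M_CS = 132.905 * 1.660_539e-27    # mass of 133Cs, kg

def boltzmann_from_doppler(nu0, delta_nu_d, temperature):
    """Invert the Doppler-width relation to recover kB."""
    return M_CS * C**2 * (delta_nu_d / nu0) ** 2 / (2 * temperature)

# Cs D1 line near 894.6 nm; the width is synthesised from the known kB so the
# sketch round-trips (a real experiment would measure delta_nu instead).
nu0 = C / 894.6e-9                # line-centre frequency, Hz
T = 300.0                         # vapour temperature, K
kB_true = 1.380_649e-23
delta_nu = nu0 * math.sqrt(2 * kB_true * T / (M_CS * C**2))

kB_est = boltzmann_from_doppler(nu0, delta_nu, T)
```

    The paper's contribution is making the width measurement itself accurate enough (via the corrected lineshape model) that this inversion yields kB at the p.p.m. level.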

  11. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.
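
    The bias and precision metrics the review discusses are conventionally computed Bland-Altman style from paired device and laboratory values. A minimal sketch with invented paired readings (not data from the Radical-7 studies):

```python
# Bland-Altman-style bias and 95% limits of agreement for paired device vs.
# laboratory hemoglobin values (g/dL). The paired readings are invented.
import statistics

def bias_and_limits(device, lab):
    """Return (bias, lower, upper): mean difference and 95% limits of
    agreement (bias +/- 1.96 * SD of the differences)."""
    diffs = [d - l for d, l in zip(device, lab)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

device = [10.2, 11.8, 9.5, 13.1, 8.9]   # noninvasive readings, g/dL
lab    = [10.0, 12.3, 9.0, 12.8, 9.6]   # laboratory co-oximeter, g/dL

bias, lo, hi = bias_and_limits(device, lab)
```

    The review's point is that such summary statistics alone are insufficient: an error-grid analysis maps each paired reading onto clinical consequence zones, which bias and limits of agreement do not capture.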

  12. Accurate Control of Josephson Phase Qubits

    DTIC Science & Technology

    2016-04-14

    2003; published 30 December 2003. A quantum bit is a closed two-dimensional Hilbert space, but often experimental systems have three or more energy ... DOI: 10.1103/PhysRevB.68.224518. PACS numbers: 85.25.Cp, 03.67.Lx, 03.65.Xp. I. INTRODUCTION. The remarkable promise of quantum computation has led to the invention of a significant number of proposals for building a practical and scalable quantum computer. Several of these proposals envision the use of two

  13. Toward More Transparent and Reproducible Omics Studies Through a Common Metadata Checklist and Data Publications.

    PubMed

    Kolker, Eugene; Özdemir, Vural; Martens, Lennart; Hancock, William; Anderson, Gordon; Anderson, Nathaniel; Aynacioglu, Sukru; Baranova, Ancha; Campagna, Shawn R; Chen, Rui; Choiniere, John; Dearth, Stephen P; Feng, Wu-Chun; Ferguson, Lynnette; Fox, Geoffrey; Frishman, Dmitrij; Grossman, Robert; Heath,